The best and easiest approach, in my opinion, is to resize the partitions offline using a utility such as GNOME Partition Editor (GParted), which is available as a live CD/USB download.
/usr should never have user files on it. My guess is that a process writing some big log files has filled /usr. A CentOS install does not usually carry 10 GB worth of software, although it is entirely possible.
Regardless of whether you add space, you need to see what is going on with /usr.
What you are asking for is a really bad idea; you should fix /usr instead.
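A quick way to see what is eating the space is du, restricted to the /usr filesystem (a minimal sketch; the size threshold is arbitrary):

```shell
# Summarize space one level down inside /usr; -x stays on this filesystem
du -xh --max-depth=1 /usr | sort -rh | head -n 15

# List unusually large individual files (the 100M threshold is arbitrary)
find /usr -xdev -type f -size +100M -exec ls -lh {} + 2>/dev/null
```

A runaway log file or a forgotten tarball usually shows up right at the top of the du listing.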
However:
1. Let's assume /usr/local holds tons of extra software. Reboot into single-user mode.
2. As root, use mkdir to create a matching directory tree under /home, e.g. /home/usr/local.
3. Use tar to move the files across.
4. Be ABSOLUTELY positive this worked correctly.
5. Only then remove the originals and leave a symbolic link at /usr/local pointing to the new location.
You really should either clean up the junk on /usr, or add another disk or partition and move files over the way I described. Using /home for this will be a mistake. NeutronScott gave you another way to play with the partitions.
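The steps above can be sketched as shell commands. The paths here are throwaway stand-ins under /tmp so the sketch is safe to try; in the real case SRC would be /usr/local and DEST /home/usr/local, run from single-user mode:

```shell
# Stand-in paths; really SRC=/usr/local and DEST=/home/usr/local
SRC=/tmp/demo/usr/local
DEST=/tmp/demo/home/usr/local

mkdir -p "$SRC" "$DEST"
echo "sample" > "$SRC/afile"                 # pretend content

# Step 3: copy with tar so permissions, ownership, and links survive
( cd "$SRC" && tar cf - . ) | ( cd "$DEST" && tar xpf - )

# Step 4: verify the copy before deleting anything
diff -r "$SRC" "$DEST" && echo "copy verified"

# Step 5: only after verifying, swap in a symbolic link
rm -rf "$SRC"
ln -s "$DEST" "$SRC"
```

The symlink is what keeps installed software working after the move, since everything still resolves through the old path.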
Last edited by jim mcnamara; 04-30-2012 at 11:55 AM..
Hi.
For comparison, I listed the sizes of the /usr subdirectories on a CentOS machine I use.
My include and lib64 might be a bit large because I have the boost libraries installed.
In looking at /usr/bin/*, after omitting script, dynamic, and symbolic lines, I find fewer than 20 items, out of more than 2000.
If you have directories that are in addition to these, or some that are far larger, that might be a starting point for jim's suggestion.
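Assuming the classification above came from file(1), the same count can be reproduced with a one-liner (the grep patterns are my guess at the lines being omitted):

```shell
# Count /usr/bin entries that are neither scripts, dynamically linked
# executables, nor symbolic links -- i.e. the remaining oddballs
file /usr/bin/* | grep -Ev 'script|dynamically linked|symbolic link' | wc -l
```

On a healthy system that count stays small, which is the point of the comparison.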
This is really a critical situation. What application does that server run? How big is your /usr/share? Compared with the other directories in /usr, moving /usr/share to another location and creating a symbolic link is the easiest way to avoid serious damage: it holds mostly architecture-independent data, so you can carve out a bit of space that way while keeping downtime at zero.
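A sketch of that move-and-symlink, again on throwaway stand-in paths (really /usr/share and wherever you have spare room):

```shell
SHARE=/tmp/demo2/usr/share           # stands in for /usr/share
TARGET=/tmp/demo2/spare/share        # stands in for a partition with room

mkdir -p "$SHARE" "$(dirname "$TARGET")"
echo "data" > "$SHARE/readme"        # pretend content

cp -a "$SHARE" "$TARGET"             # copy, preserving attributes
rm -rf "$SHARE"                      # remove the original only after checking
ln -s "$TARGET" "$SHARE"             # applications keep finding /usr/share
```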
What version of CentOS are you using? I guess 5.x? For a permanent solution I would go for LVM, but for that you would have to spend a lot of time in single-user mode migrating the data, and in the end things may not turn out as expected.
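For reference, once you are on LVM, growing the filesystem under /usr usually comes down to a couple of commands. The volume-group and logical-volume names below are made up for illustration, and none of this applies until the data has been migrated onto LVM first:

```
vgdisplay vg00                    # check for free physical extents
lvextend -L +5G /dev/vg00/usr     # grow the logical volume holding /usr
resize2fs /dev/vg00/usr           # grow the ext3 filesystem to match
```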