du -sh command taking a long time to calculate large directory sizes

 
Operating Systems > Linux > Red Hat
# 1  
Old 07-17-2014

Hi ,

My Linux server has been taking a long time to calculate the sizes of large directories for a while now.

* I am accessing the server through SSH
* Commands:
Code:
du -sh *
du -sh * | sort -n | grep G

Please guide me to a fast way to find the large directories under the / partition.

Thanks

# 2  
Old 07-17-2014
You are limited by the speed of your disk here. du must read the inode of each and every file to generate a summary.

I'm guessing you have an awful lot of files.

Do you have more than one partition? I'd suggest searching inside a partition instead of the entire disk in general.
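If GNU du is available, its -x (--one-file-system) flag keeps the scan on a single partition instead of crossing into other mounts. A minimal sketch, using a throwaway directory so the command has something local to measure:

```shell
# Build a small local tree to measure (stand-in for a real directory on /).
tmp=$(mktemp -d)
dd if=/dev/zero of="$tmp/big.bin" bs=1024 count=2048 2>/dev/null

# -x / --one-file-system: du will not descend into directories that
# live on a different filesystem, so other mounts are skipped entirely.
du -xsh "$tmp"

rm -rf "$tmp"
```

Run against / itself, `du -xsh /` summarizes only the root filesystem and never touches anything mounted on top of it.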
# 3  
Old 07-17-2014
There are only two partitions, / and /boot; the other directories are mounted over NFS.

Yes, / has many files. I have been waiting for 3 hours with no output yet.
# 4  
Old 07-17-2014
You're sorting it. You're not going to see any output until it's 100% finished. Try leaving out the sort.

I might also try this:

Code:
for DIR in /*
do
        [ "$DIR" = "/proc" ] && continue # Not a real folder
        [ "$DIR" = "/sys" ] && continue # Not a real folder

        echo "Checking $DIR"
        du -hs "$DIR"
done

...that way, you can at least tell what directory it's freezing on and avoid bothering with the system pseudo-folders.

If it's freezing on a particular folder, try starting inside that folder and checking the size of its sub-contents. Trawling an NFS mounted folder could be quite slow indeed. Is there any way you could get onto the server they're hosted by directly instead?

# 5  
Old 07-17-2014
Having nfs mounts directly off the root e.g., /nfsdirectory is a no-no, big time. In your case, du does a stat on every file under /, including files mounted on remote systems.

The remote connections are nowhere near as fast or reliable as locally mounted disks.
df and du can hang for hours due to nfs slowness, remote server timeouts, and so on. This can also break the pwd command.

nfs is almost guaranteed to be your problem.
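To see exactly which paths du would be walking over the network, you can list the NFS mounts first. A minimal sketch, assuming a Linux-style /proc/mounts:

```shell
# Each /proc/mounts line is: device mountpoint fstype options dump pass.
# Print the mount point of every nfs/nfs4 filesystem; anywhere under
# these paths, du has to stat files over the wire.
awk '$3 ~ /^nfs/ {print $2}' /proc/mounts
```

Anything this prints is a directory tree du (and df) can hang on when the remote server is slow or unreachable.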
# 6  
Old 07-17-2014
I have noticed this situation on NFS mount points:

Make a directory on NFS with a couple of files in it.
Copy a big tar file or similar (a couple of GB, so it takes time) into that directory.

Do an ls -lrt under truss (do the same with du to determine where it got stuck).
I noticed that the ls command is sleeping on the lstat call (get file status).

But if you run ls -lrt in a while true loop, it will be slow the first time and fast on every other iteration.
From the command line during the file copy, every ls executed will exhibit the same symptoms.

Can any of you experts explain whether there is some kind of issue in the NFS design in this kind of situation, with the total size in bytes changing and the lstat call on NFS files?
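On Linux, strace is the rough counterpart of Solaris truss for this kind of check. A sketch, assuming strace is installed, and using a throwaway local directory in place of a real NFS path:

```shell
tmp=$(mktemp -d)
touch "$tmp/a" "$tmp/b"

# -e trace=%file limits the trace to file-related syscalls (lstat,
# newfstatat, openat, ...); on a slow NFS mount, the hang shows up as
# a stat-family call that takes a long time to return.
strace -e trace=%file ls -lrt "$tmp" 2>&1 | tail -n 5

rm -rf "$tmp"
```

The same invocation wrapped around du shows which directory entry it is blocked on.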
# 7  
Old 07-17-2014
Hey Jim,

You are right, but I can't remove the NFS share. Is there any other way to see huge files under the / directory?

---------- Post updated at 11:30 PM ---------- Previous update was at 11:17 PM ----------

Code:
Filesystem            Size  Used Avail Use% Mounted on
/dev/mapper/vg00-lvol2
                      512G  485G     0 100% /

This is the local one, and I want to find the large files under the / directory.
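Since df shows / itself is full, one option is to hunt for individual big files rather than directory totals. A sketch with GNU find and sort, where the 500M threshold is just an example value (-xdev keeps find on the local filesystem, and root access may be needed for unreadable directories):

```shell
# List files over 500 MB on the root filesystem only, biggest first.
# -xdev stops find from descending into other mounts (NFS included);
# sort -rh understands the human-readable sizes that du -h prints.
find / -xdev -type f -size +500M -exec du -h {} + 2>/dev/null |
    sort -rh | head -n 20
```

Because -xdev never crosses a mount point, this runs at local-disk speed no matter how slow the NFS shares are.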
