Find large files


 
# 1  
Old 01-30-2014

All,

I'm running a simple find for large files in a bash shell that works fine:

Code:
find <dir> -xdev -ls | awk '{print $7,$8,$9,$10,$11}' | sort -nr | head

It gives me everything I want: file size, time stamp, and file name. However, I'd like the file size in human-readable form.

I've tried:

Code:
find <dir> -xdev -type f -size +50M -exec ls -lh {} \;

and

Code:
find <dir> -xdev -type f -size +50M -exec du -h --time {} \;

and variations on them, but haven't managed to get anything to work.

Does anyone know a way to do this?

TIA,

Herb

Moderator's Comments:
Please use code tags for code: [code] code [/code]
# 2  
Old 01-30-2014
The trouble then is how to sort it afterwards. So I'd modify your first command a little: convert to a friendly unit inside awk, but still print the original byte count for sorting.

Code:
$ find . -xdev -ls | awk 'BEGIN{ split(" K M G T P",U); }
        { T=0; N=$7+0; while(N > 1024) { N/=1024; T++ } print $7,N U[T],$8,$9,$10,$11}' CONVFMT="%.2f" OFS="\t" - |
        sort -nr | head

262144  256K    Jul     13      2009    ./unixcom/file.bin
184762  180.43K Feb     5       2008    ./table.html
6960    6.80K   Jan     22      15:05   ./date.pl
5632    5.50K   Sep     18      11:36   ./excelxml/output.xls
4096    4K      Sep     18      2012    ./makepdf
4096    4K      Sep     18      11:17   ./excelxml
4096    4K      Nov     28      2007    ./tbl
4096    4K      Jun     8       2012    ./median

$

Divide by 1000 if you prefer drivemaker's kilobytes over computer ones.
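If your du and sort are the GNU versions, sort -h understands human-readable suffixes directly, which sidesteps the sorting problem altogether. A self-contained sketch (the scratch directory and file names are made up for the demo; on your system you'd point find at your real directory):

```shell
# Build a throwaway directory so the pipeline can be run as-is.
# GNU extensions assumed: du -h suffixes, sort -h to order them.
dir=$(mktemp -d)
dd if=/dev/zero of="$dir/big.bin"   bs=1024 count=300 2>/dev/null
dd if=/dev/zero of="$dir/small.bin" bs=1024 count=8   2>/dev/null

# Largest files first, sizes already human-readable.
find "$dir" -xdev -type f -exec du -h {} + | sort -rh | head

rm -rf "$dir"
```

With `-exec ... {} +` the files are batched into as few du invocations as possible, which is also noticeably faster than `\;` on big trees.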
# 3  
Old 01-30-2014
Try:
Code:
find <dir> -xdev -ls | awk '{print $7,$8,$9,$10,$11}' | sort -nr | head | nawk '{$1/=(1024*1024);$1=$1" MB"}1'
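The conversion stage at the end can be checked in isolation by feeding it made-up size-and-name pairs on stdin (plain awk runs the same program as nawk here; the sizes and file names are invented for the demo):

```shell
# Stand-in for the find|awk|sort|head output: "size name" pairs.
printf '%s\n' '52428800 big.iso' '1048576 small.bin' |
  sort -nr |
  awk '{ $1 /= (1024*1024); $1 = $1 " MB" } 1'
# Prints:
#   50 MB big.iso
#   1 MB small.bin
```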

# 4  
Old 02-03-2014
Thank you for the replies.

@bartus11 - I need this search to be more portable, and many of the systems won't have nawk.

@Corona688 - Your suggestion works great, thanks for the guidance.

Herb
# 5  
Old 02-03-2014
What systems will not have nawk?

If a system doesn't have nawk, it probably doesn't need it; i.e. its ordinary awk is as good as Solaris' nawk.
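A one-line smoke test for whatever awk a given box ships: POSIX requires awk to honour CONVFMT when a non-integral number is concatenated to a string, which is exactly what the one-liners earlier in the thread rely on, so if this prints 1.50K the system awk should be fine:

```shell
# 1536/1024 = 1.5; concatenating it with "K" goes through CONVFMT.
echo 1536 | awk '{ n = $1 / 1024; print n "K" }' CONVFMT="%.2f" -
```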