Post 302088620 by redlotus72 in UNIX for Dummies Questions & Answers, Wednesday 13th of September 2006, 02:15:56 PM
List files which are more than 300 MB in size

Hi,
I need to produce a list of files that are larger than 300 MB. The script should search all inner directories (subdirectories) too.
How can I proceed? Any ideas?
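
A minimal sketch using find, which recurses on its own (assuming GNU find for the M size suffix; the starting path is illustrative):

    # files strictly larger than 300 MB, in this directory and everything below it
    find /path/to/dir -type f -size +300M -print

    # portable variant: 300 MB = 314572800 bytes / 512 = 614400 blocks
    find /path/to/dir -type f -size +614400 -print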
 

9 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

Command to list Files less than or equal to 2k size

Hi, *.xml files are stored in date-wise folders. I need to extract the *.xml files from all the folders. File size is less than or equal to 2K. Please let me know a command or some shell program to do this job on a Linux machine. This is urgent; thanks in advance - Bache Gowda (3 Replies)
Discussion started by: bache_gowda
3 Replies
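
A hedged sketch for this one, assuming 2K means 2048 bytes; the c suffix makes find count exact bytes rather than rounded 512-byte blocks:

    # *.xml files of at most 2048 bytes, searched through all subfolders
    find . -type f -name '*.xml' -size -2049c -print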

2. Shell Programming and Scripting

Bash script working for small files but not for big files

Hi, I have one file, stat. The stat file's contents are as follows, for example: H50768020040913,00260100,507680,13,0000000643,0000000643,00000,0000 H50769520040808,00260100,507695,13,0000000000,0000000000,00000,0000 H50770620040611,00260100,507706,13,0000000000,0000000000,00000,0000 Now i... (1 Reply)
Discussion started by: davidpreml
1 Replies

3. Shell Programming and Scripting

list the files with size in bytes

Hi all, please help in listing the files with their sizes in bytes. Thanks - Bali (4 Replies)
Discussion started by: balireddy_77
4 Replies
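
ls -l already reports sizes in bytes; a minimal sketch that trims the output to size and name (assumes filenames without spaces):

    # skip the "total" header line, print size and filename
    ls -l | awk 'NR > 1 {print $5, $9}'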

4. Solaris

Calculate total size of files by date (arg list too long)

Hi, I wanted a script to find the total size of files for a particular date. Below is my script: ls -lrt *.req | nawk '$6 == "Aug"' | nawk '$7 == "1"' | awk '{sum = sum + $5} END {print sum}' However, I get the error below: /usr/bin/ls: arg list too long How do I fix that? Many thanks in advance. (2 Replies)
Discussion started by: beginningDBA
2 Replies
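
One way around the error in this thread is to let find build the file list, since find's -exec ... + batches arguments below the kernel limit instead of expanding one huge shell glob (a sketch that keeps the original nawk column logic):

    # sum the sizes of *.req files whose ls -l date reads "Aug 1"
    find . -name '*.req' -exec ls -l {} + |
        nawk '$6 == "Aug" && $7 == "1" {sum += $5} END {print sum}'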

5. UNIX for Dummies Questions & Answers

How to get size of a list of files with specified extension?

Command ls -l *cpp lists all cpp program files in a directory. It shows the size of each file. Using a calculator to work out the total size of the cpp files would be very tedious. Is there a way to get the total size from the command line? (5 Replies)
Discussion started by: resander
5 Replies
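
A one-line sketch that sums the size column of ls -l (where GNU du is available, du -ch *cpp is an alternative):

    # total size, in bytes, of the cpp files in the current directory
    ls -l *cpp | awk '{sum += $5} END {print sum " bytes"}'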

6. UNIX for Dummies Questions & Answers

How to list files which have the same size?

Hi guys, I need to do a 100-file comparison after sorting the files. There is no specific key for sorting, so I plan to arrange the files based on file size. The command that I used to sort the files by size is as per below:- ls -l | sort +4rn | awk '{print $5, $9}' The problem that i... (3 Replies)
Discussion started by: shahril
3 Replies
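
A sketch that groups files by the ls -l size column and prints only sizes shared by more than one file (assumes filenames without spaces):

    ls -l | awk 'NR > 1 {cnt[$5]++; names[$5] = names[$5] " " $9}
                 END {for (s in cnt) if (cnt[s] > 1) print s ":" names[s]}'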

7. Shell Programming and Scripting

Compare lists [names and sizes of files]

Hello, I've downloaded a huge amount of files. I've got a list of files from a remote server. -rw-r--r-- 1 str661 strem 453465260 Dec 16 15:54 SATRYS2V1_20021218_temp_bias.nc -rw-r--r-- 1 str661 strem 17669468 Dec 16 18:01 SATRYS2V1_20021225_hdyn_bias.nc -rw-r--r-- 1... (9 Replies)
Discussion started by: Aswex
9 Replies
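
A hedged sketch: reduce each listing to "size name" pairs, sort them, and let comm show what differs (local.ls and remote.ls are hypothetical files holding the two ls -l listings):

    awk '{print $5, $NF}' local.ls  | sort > local.sorted
    awk '{print $5, $NF}' remote.ls | sort > remote.sorted
    comm -3 local.sorted remote.sorted    # lines unique to one side or the other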

8. UNIX for Dummies Questions & Answers

Determining file size for a list of files with paths

Hello, I have a flat file with a list of files, with the path to each file, and I am attempting to calculate the file size for each one; however, xargs isn't playing nicely, and I am sure there is probably a better way of doing this. What I envisioned is this: cat filename|xargs -i ls -l {} |awk... (4 Replies)
Discussion started by: joe8mofo
4 Replies
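
A sketch that sidesteps xargs by reading the list one path per line (filename is the flat file from the question; the awk step assumes paths without embedded spaces):

    while IFS= read -r f; do
        ls -ld "$f"          # one ls per path avoids quoting surprises
    done < filename | awk '{print $5, $NF}'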

9. Shell Programming and Scripting

List duplicate files based on Name and size

Hello, I have a huge directory (with millions of files) and need to find duplicates based on BOTH file name and file size. I know about fdupes, but it calculates MD5 checksums, which is very time-consuming; it takes forever as I have millions of files. Can anyone please suggest a script or... (7 Replies)
Discussion started by: prvnrk
7 Replies
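
A checksum-free sketch using GNU find's -printf, keying on basename plus size (tab-separated so most filenames survive):

    find . -type f -printf '%f\t%s\t%p\n' |
        awk -F'\t' '{k = $1 FS $2; cnt[k]++; paths[k] = paths[k] "\n  " $3}
                    END {for (k in cnt) if (cnt[k] > 1) print k paths[k]}'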
DH_COMPRESS(1)							     Debhelper							    DH_COMPRESS(1)

NAME
       dh_compress - compress files and fix symlinks in package build directories

SYNOPSIS
       dh_compress [debhelper options] [-Xitem] [-A] [file ...]

DESCRIPTION
       dh_compress is a debhelper program that is responsible for compressing the files in package build directories, and makes sure that any
       symlinks that pointed to the files before they were compressed are updated to point to the new files.

       By default, dh_compress compresses files that Debian policy mandates should be compressed, namely all files in usr/share/info,
       usr/share/man, files in usr/share/doc that are larger than 4k in size (except the copyright file, .html and other web files, image files,
       and files that appear to be already compressed based on their extensions), and all changelog files, plus PCF fonts underneath
       usr/share/fonts/X11/.

FILES
       debian/package.compress
	   These files are deprecated.

           If this file exists, the default files are not compressed. Instead, the file is run as a shell script, and all filenames that the shell
	   script outputs will be compressed. The shell script will be run from inside the package build directory. Note though that using -X is a
	   much better idea in general; you should only use a debian/package.compress file if you really need to.

OPTIONS
       -Xitem, --exclude=item
	   Exclude files that contain item anywhere in their filename from being compressed. For example, -X.tiff will exclude TIFF files from
	   compression.  You may use this option multiple times to build up a list of things to exclude.
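
           For instance, in a dh(1)-style debian/rules file, one might exclude PDF files via an override target (a hedged sketch; the
           excluded pattern is illustrative):

               override_dh_compress:
                       dh_compress -X.pdf

           dh then runs the override target in place of its stock dh_compress call, so the exclusion applies to the whole package.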

       -A, --all
	   Compress all files specified by command line parameters in ALL packages acted on.

       file ...
	   Add these files to the list of files to compress.

CONFORMS TO
       Debian policy, version 3.0

SEE ALSO
       debhelper(7)

       This program is a part of debhelper.

AUTHOR
       Joey Hess <joeyh@debian.org>

11.1.6ubuntu2							    2018-05-10							    DH_COMPRESS(1)