Operating Systems > Linux > Red Hat — Empty directory, large size and performance
Post 302591700 by verdepollo, Friday, January 20, 2012, 11:02 AM
As far as I know, ls -l does not report a directory's overall content size. The size it shows for a directory is the size of the directory entry itself (the structure holding the list of filenames), not the total size of the files inside it.
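A quick way to see the difference is to compare ls -ld with du. This is a minimal sketch assuming an ext-style filesystem; the directory name mydir is arbitrary:

```shell
# Create a sample directory containing a 100 KB file
mkdir -p mydir
dd if=/dev/zero of=mydir/file.dat bs=1024 count=100 2>/dev/null

# ls -ld prints the size of the directory *entry* itself
# (often 4096 bytes), not the total size of its contents
ls -ld mydir

# du -sk sums the actual disk usage of everything inside
du -sk mydir

rm -rf mydir
```

Note that a directory entry can stay large even after its files are deleted, which is why an "empty" directory may still show a big size in ls -l.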

Are you deleting the content of the files or the files themselves?

Inode usage does not affect performance in any way; there is no performance penalty for using them (or not using them).

Disk performance is usually determined by the number of simultaneous reads and writes (i.e., I/O operations), by the physical area of the disk in use (the outer tracks of a platter are faster, since more data passes under the head per revolution), and by the platter's spin speed. None of this applies to solid-state disks, of course.
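As a rough illustration of sequential throughput (not a rigorous benchmark; caches and filesystem effects dominate small tests), dd can time a write. The file name testfile is arbitrary:

```shell
# Write 100 MB of zeroes; dd reports elapsed time and throughput.
# conv=fsync forces the data to disk before dd exits, so the
# number reflects real disk writes rather than just page cache.
dd if=/dev/zero of=testfile bs=1M count=100 conv=fsync
rm -f testfile
```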

As long as you don't run out of inodes, the only problem that may arise is that you run out of disk space.
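Both limits are easy to monitor with df (shown here for the root filesystem):

```shell
# Inode usage: if IUse% reaches 100%, creating new files fails
# with "No space left on device" even when free blocks remain
df -i /

# Block (disk space) usage for comparison
df -k /
```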
 

casparize(1)							  USER COMMANDS 						      casparize(1)

  NAME
      casparize - Set up caspar Makefile in a new directory

  SYNOPSIS
      casparize dir (/path/to/config/dir)

      casparize file (/path/to/config/dir/file)

  DESCRIPTION
      casparize  creates  a  new  configuration  working directory in your current working directory, sets up a Makefile for caspar(7) in this new
      directory, and optionally copies an original configuration file from its original system place to the newly  created  configuration  working
      directory.

  USAGE
      You typically use casparize when you have already created the root configuration working directory with its include directory and install.mk
      Caspar include file. By analysing your current working directory and the configuration directory path you give on the command line, casparize
      can deduce the contents of the Makefile in the newly created configuration working directory. It creates the new directory, creates the
      correct Makefile, and optionally copies the given configuration file into the new directory, ready for its first version commit.

  EXAMPLES
      A typical example:

	$ cd <svn>/etc
	$ casparize /etc/postfix/main.cf

      creates the directory <svn>/etc/postfix, creates <svn>/etc/postfix/Makefile including the proper content,  and  copies  /etc/postfix/main.cf
      into <svn>/etc/postfix/main.cf. You can now directly add and commit the new directory.

  BUGS
      None known at this time.

  AUTHOR
      Jeroen Hoppenbrouwers

  SEE ALSO
      caspar(7) The caspar homepage is at http://mdcc.cx/caspar/ .

  casparize 20120508						      May 8, 2012							casparize(1)