Operating Systems > HP-UX: Processing very big directory
Post 302322793 by reborg, Thursday 4 June 2009, 02:56:39 PM
There is no really easy way to do this other than finding a more efficient data storage structure.

However, if you are using VxFS and are deleting files after an amount of time has elapsed, you could try a directory defragmentation to see whether the directory is bigger than it needs to be, and shrink it if possible. It is not the current number of files that matters, but the highest number of files the directory has ever held.
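A minimal sketch of what that looks like in practice (assuming a VxFS file system mounted at /data, a hypothetical mount point; on HP-UX the VxFS fsadm supports -D to report directory fragmentation and -d to reorganize directories, but check fsadm_vxfs(1M) on your release first):

    # Report directory fragmentation on the VxFS file system
    fsadm -F vxfs -D /data

    # Reorganize (compact) directories to release high-water-mark space
    fsadm -F vxfs -d /data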
 

9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Need script to make big directory structure

Hi, I'm a novice and I'd like to make a directory structure with a hundred or so folders. I've tried mkdir /foo/foo1/etc... mkdir /foo/foo2/etc mkdir /foo/foo3/etc mkdir /foo/foo4/etc ...but it appends '@' to each folder name and then fails on the subdirectories. Is it better to use a... (2 Replies)
Discussion started by: kamur
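For a question like the one above, a minimal sketch in portable shell (assuming the target layout is /foo/foo1/etc through /foo/foo100/etc; mkdir -p creates missing parents, which avoids the failures on subdirectories):

    #!/bin/sh
    # Create /foo/foo1/etc through /foo/foo100/etc
    i=1
    while [ "$i" -le 100 ]; do
        mkdir -p "/foo/foo$i/etc"
        i=`expr $i + 1`
    done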

2. Shell Programming and Scripting

Processing files within a directory one by one

Hi, how do I create a shell script that takes into account all the files present within a directory DIR, one by one? E.g., suppose I have a directory named DIR where there are files with the extension .ABC; I want to create a shell script which processes all these files one by one. ... (1 Reply)
Discussion started by: skyineyes
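A minimal sketch for this one (the echo line is a hypothetical stand-in for the real per-file processing):

    #!/bin/sh
    # Process every .ABC file in DIR, one at a time
    for f in DIR/*.ABC; do
        [ -f "$f" ] || continue   # skip the literal pattern if nothing matched
        echo "processing $f"      # replace with the actual work
    done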

3. Shell Programming and Scripting

how to extract files one by one from a directory and let some processing happen

How do I extract files one by one from a directory and have some processing done on each file? I have a directory by the name INTRN which has files like INTR.0003080248636814 INTR.0003080248636816 INTR.0003080248636818 . . . . and so on, and in a script... (5 Replies)
Discussion started by: saniya
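A find-based sketch for the INTRN case, which keeps working even when the directory holds too many entries for a shell glob to be comfortable (the echo is again a hypothetical processing step):

    #!/bin/sh
    # Hand each INTR.* file in INTRN to a processing step, one by one
    find INTRN -type f -name 'INTR.*' |
    while read f; do
        echo "handling $f"
    done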

4. UNIX for Advanced & Expert Users

mv OR cp many files from 1 big directory with sub folders all to 1 spot?

Hello All. I am trying to do this from a terminal prompt on my Mac.... I have 100 folders, all named different things. Those 100 folders are inside the ~/Desktop/Pictures directory. Each of the 100 folders is uniquely named. The image files inside each folder only have some similarities. ... (1 Reply)
Discussion started by: yoyoyo777
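A minimal sketch, assuming the goal is to flatten everything under ~/Desktop/Pictures into one destination folder (~/Desktop/AllPics here is hypothetical); note that files sharing a name will overwrite one another, so check for collisions first:

    # Collect every regular file from the subfolders into one directory
    mkdir -p ~/Desktop/AllPics
    find ~/Desktop/Pictures -type f -exec mv {} ~/Desktop/AllPics/ \;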

5. AIX

Fastest way to count big amount of files in sub directory

Hi, we want to count all the files in a directory, but inside this directory there are many folders, so it takes a long time to count them. It has already run for a few minutes but is still not done. The command we use to count is find . -type f | wc -l. Just wondering if there is any other... (9 Replies)
Discussion started by: ngaisteve1
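One hedged idea for speeding the count up: run a separate find for each top-level subdirectory in the background and sum the results, so the directory walks overlap instead of running serially (this only helps if the storage can service the reads concurrently):

    #!/bin/sh
    # Count files under each top-level subdirectory in parallel, then sum.
    # Files sitting directly in . are not included in this sketch.
    for d in */; do
        [ -d "$d" ] || continue
        name=`basename "$d"`
        find "$d" -type f | wc -l > "/tmp/count.$$.$name" &
    done
    wait
    cat /tmp/count.$$.* | awk '{ total += $1 } END { print total }'
    rm -f /tmp/count.$$.*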

6. Shell Programming and Scripting

Find which dir is big size in a directory

Hi all, could you please tell me the command which sorts the list of directories in a parent dir by their size. Thanks. (2 Replies)
Discussion started by: firestar
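A minimal sketch with du and sort (sizes are in kilobytes; the largest directory prints last):

    # List each subdirectory of the current directory, sorted by size
    du -sk */ | sort -n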

7. UNIX for Dummies Questions & Answers

Empty Directory, big size

Hello, can somebody please explain why I have EMPTY directories on HP-UX with BIG SIZE? Thank you! Double post, continued here. (0 Replies)
Discussion started by: drbiloukos

8. HP-UX

Empty Directory, big size

Hello, can you please explain why I have various empty directories with large size? OS is B.11.11. (3 Replies)
Discussion started by: drbiloukos
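This is the same high-water-mark behaviour described in the answer at the top: a directory's own size reflects the most entries it has ever held, not what it holds now. A hedged illustration, with a hypothetical path:

    # The directory reports a large size even though it is empty:
    ls -ld /var/spool/bigdir       # size field shows the high-water mark
    ls /var/spool/bigdir | wc -l   # 0 entries

    # Recreating the directory reclaims the space
    # (fsadm -F vxfs -d, shown earlier, is the in-place alternative on VxFS):
    mkdir /var/spool/bigdir.new
    mv /var/spool/bigdir /var/spool/bigdir.old
    mv /var/spool/bigdir.new /var/spool/bigdir
    rmdir /var/spool/bigdir.old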

9. Shell Programming and Scripting

Delete big directory issue

Hello folks, I am deleting a directory with a script; it is taking 11 hours and also increases the I/O on the server. I am using the command below; inside the date directory there are hour directories, which I am deleting after archiving. Archiving does not take long; only "rm -rf" is taking a lot of time with... (21 Replies)
Discussion started by: learnbash
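One workaround often suggested for slow rm -rf on huge trees (a sketch, not a guaranteed win; it assumes rsync is installed, and /path/to/date_dir is a placeholder): sync an empty directory over the target so rsync unlinks the entries, then remove the leftover skeleton:

    #!/bin/sh
    # Empty a huge tree via rsync, then remove the top directory
    mkdir /tmp/empty.$$
    rsync -a --delete /tmp/empty.$$/ /path/to/date_dir/
    rmdir /path/to/date_dir /tmp/empty.$$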
GENBACKUPDATA(1)                General Commands Manual                GENBACKUPDATA(1)

NAME
       genbackupdata - generate backup test data

SYNOPSIS
       genbackupdata [--chunk-size=SIZE] [--config=FILE] [-c=SIZE] [--create=SIZE] [--depth=DEPTH] [--dump-config] [--dump-setting-names]
       [--file-size=SIZE] [--generate-manpage=TEMPLATE] [-h] [--help] [--list-config-files] [--log=FILE] [--log-keep=N] [--log-level=LEVEL]
       [--log-max=SIZE] [--max-files=MAX-FILES] [--no-default-configs] [--output=FILE] [--quiet] [--seed=SEED] [--version]

DESCRIPTION
       genbackupdata generates test data sets for performance testing of backup software. It creates a directory tree filled with files of
       different sizes. The total size and the distribution of sizes between small and big are configurable. The program can also modify an
       existing directory tree by creating new files, and by deleting, renaming, or modifying existing files. This can be used to generate
       test data for successive generations of backups.

       The program is deterministic: with a given set of parameters (and a given pre-existing directory tree), it always creates the same
       output. This way, it is possible to reproduce backup tests exactly, without having to distribute the potentially very large test
       sets.

       The data set consists of plain files and directories. Files are either small text files or big binary files. Text files contain the
       "lorem ipsum" stanza; binary files contain randomly generated byte streams. The percentage of file data that is small text or big
       binary can be set, as can the sizes of the respective file types.

       Files and directories are named "fileXXXX" or "dirXXXX", where "XXXX" is a successive integer, with separate successions for files
       and directories. There is an upper limit to how many files a directory may contain. After the file limit is reached, a new
       sub-directory is created. The first set of files goes into the root directory of the test set.

       You have to give one of the options --create, --delete, --rename, or --modify for the program to do anything. You can, however, give
       more than one of them, if DIR already exists. (Giving the same option more than once means that only the last instance is counted.)
       DIR is created if it doesn't exist already.

OPTIONS
       --chunk-size=SIZE
              generate data in chunks of this size (default: 16384)

       --config=FILE
              add FILE to config files

       -c, --create=SIZE
              how much data to create (default: 0)

       --depth=DEPTH
              depth of directory tree (default: 3)

       --dump-config
              write out the entire current configuration

       --dump-setting-names
              write out all names of settings and quit

       --file-size=SIZE
              size of one file (default: 16384)

       --generate-manpage=TEMPLATE
              fill in manual page TEMPLATE

       -h, --help
              show this help message and exit

       --list-config-files
              list all possible config files

       --log=FILE
              write log entries to FILE

       --log-keep=N
              keep last N logs (default: 10)

       --log-level=LEVEL
              log at LEVEL, one of debug, info, warning, error, critical, fatal (default: debug)

       --log-max=SIZE
              rotate logs larger than SIZE, zero for never (default: 0)

       --max-files=MAX-FILES
              max files/dirs per dir (default: 128)

       --no-default-configs
              clear list of configuration files to read

       --output=FILE
              write output to FILE, instead of standard output

       --quiet
              do not report progress

       --seed=SEED
              seed for random number generator (default: 0)

       --version
              show program's version number and exit

EXAMPLES
       Create data for the first generation of a backup:

              genbackupdata --create=10G testdir

       Modify an existing set of backup data to create a new generation:

              genbackupdata -c 5% -d 2% -m 5% -r 0.5% testdir

       The above command can be run for each new generation.

                                                                       GENBACKUPDATA(1)