Files more than 60K
Operating Systems: Solaris — Post 302394669 by digitalrg, 12 February 2010, 07:57 AM

Hi All,

We have an application which is expected to create 12,000 files per day, and we have to archive files only after 6 days. As far as I know, the maximum number of files/directories that can be created in a directory is 32K. What option/variable in Solaris changes this limit from 32K to 64K or some other value?
Thanks in advance.
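Worth noting: at 12,000 files per day with a 6-day retention window, a single directory would hold roughly 72,000 entries, so even a 64K per-directory limit would be exceeded. A common way to sidestep per-directory limits altogether is to write into one sub-directory per day and archive whole days. A minimal sketch (the paths /data/app and /archive are hypothetical, the date-named layout assumes the application can be pointed at a different directory each day, and it should be run with ksh or bash):

    # one sub-directory per day, e.g. /data/app/2010-02-12
    today=`date +%Y-%m-%d`
    mkdir -p "/data/app/$today"

    # archive day-directories that have not been written to for more than 6 days,
    # then remove them (this relies on the directory mtime, i.e. the last time
    # a file was added to that day's directory)
    find /data/app/* -type d -prune -mtime +6 -print |
    while read dir; do
        name=`basename "$dir"`
        tar cf "/archive/$name.tar" -C /data/app "$name" && rm -rf "$dir"
    done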
 

9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to retrieve all the linked script files/ctl files/sql files?

Hi, I am going to migrate our data warehouse system from HP Tru64 UNIX to Red Hat Linux. The box runs around 40 cron jobs; each cron job calls other shell script files, and those shell scripts may in turn call other shell scripts or ctl files (for... (1 Reply)
Discussion started by: franksubramania
1 Reply

2. Shell Programming and Scripting

How to extract data from indexed files (ISAM files) maintained on a UNIX server.

Hi, could someone please suggest a quick way to extract data from indexed files (ISAM files) maintained on a UNIX (AIX) server? The data needs to be extracted to a flat text file, CSV, or Excel format. Usually we have programs in Micro Focus COBOL to extract data, but would like... (2 Replies)
Discussion started by: devina
2 Replies

3. UNIX for Dummies Questions & Answers

Write a program in C on UNIX that displays the files (including sub-directories and the files within them) in sorted order

The sorting is based on the file name, file size, and modification time stamp of the file. It should display the output in the following format; "." and ".." entries should be ignored. Please give some idea how to do it. (1 Reply)
Discussion started by: pappu kumar jha
1 Reply

4. Shell Programming and Scripting

Need a shell script to extract the files from a source file and check whether those files exist on the server

Hi, I am new to shell scripting. Please help me with this. I am using the Solaris 10 OS, and the shell I am using is sh (# echo $0 gives -sh). My requirement: I have a source file, say a makefile. I need to extract files with extensions (.c | .cxx | .h | .hxx | .sc) from the makefile. After doing so I need to check whether... (13 Replies)
Discussion started by: muraliinfy04
13 Replies
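A minimal sketch for the request above, assuming GNU grep is available (on Solaris 10 it is often installed as ggrep) and that the file names appear literally in the makefile:

    # pull out anything that looks like a .c/.cxx/.h/.hxx/.sc file name,
    # then report whether each one exists relative to the current directory
    ggrep -Eo '[A-Za-z0-9_./-]+\.(c|cxx|h|hxx|sc)' makefile | sort -u |
    while read f; do
        if [ -f "$f" ]; then
            echo "found:   $f"
        else
            echo "missing: $f"
        fi
    done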

5. Shell Programming and Scripting

How to create zip/gz/tar files if the files are older than a particular number of days in UNIX or Linux?

I need a script to back up (zip, tar, or gz) old log files on our UNIX server (they are causing a space problem). Could you please help me create zip or gz files for each log file in the current directory and in sub-directories as well? I found one command which creates a gz file for the... (4 Replies)
Discussion started by: Mallikgm
4 Replies
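A minimal sketch for the request above, assuming *.log files and a 7-day threshold (adjust the name pattern and -mtime to taste); it compresses each old log in place, in the current directory and all sub-directories:

    # gzip every *.log file older than 7 days, here and below
    find . -type f -name '*.log' -mtime +7 -exec gzip {} \;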

6. UNIX for Advanced & Expert Users

Find all files in the current directory excluding hidden files and directories

Find all files in the current directory only, excluding hidden directories and files. The command below does not delete hidden files, but it still traverses the hidden directories and lists their normal files, which should be avoided: `find . \( ! -name ".*" -prune \) -mtime +${n_days}... (7 Replies)
Discussion started by: ksailesh1
7 Replies
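One way to fix the prune expression quoted above is to prune on names that start with a dot; the '.?*' pattern deliberately does not match '.' itself, so the search still starts in the current directory (a sketch, with $n_days assumed to be set as in the original command):

    # skip hidden files and do not descend into hidden directories;
    # print only regular files older than $n_days
    find . \( -name '.?*' -prune \) -o -type f -mtime +"$n_days" -print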

7. Shell Programming and Scripting

Append string to all the files inside a directory excluding subdirectories and .zip files

Hi, could someone help me prepend a string to the start of all the filenames inside a directory, excluding .zip files and subdirectories? E.g. file1: test1.log, file2: test2.log, file3: test.zip. After running the script: file1: string_test1.log, file2: string_test2.log, file3:... (4 Replies)
Discussion started by: Ravi Kishore
4 Replies
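A minimal sketch for the rename request above, assuming it is run inside the directory in question and that the prefix is literally "string_":

    # prepend "string_" to every plain file except *.zip; sub-directories are skipped
    for f in *; do
        [ -f "$f" ] || continue                 # not a regular file: skip
        case "$f" in *.zip) continue ;; esac    # leave zip archives alone
        mv "$f" "string_$f"
    done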

8. Shell Programming and Scripting

Shell script for field-wise record counts from different .csv files

Hi, very good wishes to all! Please help with a shell script that generates field-wise record counts from a .csv file. My question: Source file:
Field1 Field2 Field3
abc    12f    sLm
1234   hjd    12d
Hyd    34     Chn
My target file should be a .csv file with the... (14 Replies)
Discussion started by: Kirands
14 Replies

9. UNIX for Beginners Questions & Answers

Automate splitting of files, scp files as each split completes, and combine files on the target server

I use the split command to split a one-terabyte backup file into 10 chunks of 100 GB each. The files are split one after the other. While the file is being split, I would like to scp the chunks one after another, as soon as the previous one completes, from server A to server B. Then on server B,... (2 Replies)
Discussion started by: malaika
2 Replies
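A minimal sequential sketch for the request above (backup.tar, the /restore path, the user name and serverB are placeholders; GNU split is assumed for the 100G size suffix). It copies each chunk as soon as the previous copy has finished, rather than overlapping the copying with the split itself:

    # on server A: cut the backup into 100 GB pieces
    split -b 100G backup.tar backup.part.

    # copy each piece in turn, removing it locally once the copy succeeds
    for part in backup.part.*; do
        scp "$part" user@serverB:/restore/ && rm -f "$part"
    done

    # on server B: glue the pieces back together in name order
    cat /restore/backup.part.* > /restore/backup.tar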
nfexpire(1)

NAME
       nfexpire - data expiry program

SYNOPSIS
       nfexpire [options]

DESCRIPTION
       nfexpire is used to manage the expiration of old netflow data files created by nfcapd(1) or other data collectors such as sfcapd(1). Data expiration is done either by nfcapd(1) in auto-expiry mode, or by nfexpire, which can be run at any time or at any desired interval from cron. nfexpire can also safely be run while nfcapd auto-expires files, e.g. for cleaning up full disks. nfexpire is aware of sub-directory hierarchies and handles any layout automatically. For fast and efficient expiration, nfexpire creates and maintains a stat file named .nfstat in the data directory. Any directory supplied with the options below corresponds to the data directory supplied to nfcapd(1) using option -l.

OPTIONS
       -l directory
              List current data statistics for the given directory.

       -r directory
              Rescan the specified directory to update the stat file. To be used only when an explicit update is required; usually nfexpire takes care of rescanning itself when needed.

       -e datadir
              Expire files in the specified directory. Expire limits are taken from the stat file (see -u) or from the supplied options -s, -t and -w. Command line options override the stat file values; however, the stat file limits are not changed.

       -s maxsize
              Set the size limit for the directory. The specified limit accepts values such as 100M, 100MB, 1G, 1.5G etc. Accepted size factors are K, KB, M, MB, G, GB and T, TB. If no factor is supplied, bytes (B) are assumed. A value of 0 disables the max size limit.

       -t maxlife_time
              Set the maximum lifetime for files in the directory. The supplied maxlife_time accepts values such as 31d, 240H, 1.5d etc. Accepted time scales are w (weeks), d (days) and H (hours). If no scale is given, H (hours) is assumed. A value of 0 disables the max lifetime limit.

       -u datadir
              Update the max size and lifetime limits specified by -s, -t and -w and store them in the stat file as default values. A running nfcapd(1) process doing auto-expiry will pick up these new values starting with the next expiry cycle. The next nfexpire run doing file expiration will use these new limits unless -s, -t or -w are specified.

       -w watermark
              Set the watermark in % for expiring data. If a limit is hit, files get expired down to this level in % of that limit. If not set, the default is 95%.

       -h     Print help text on stdout with all options and exit.

       -p     Directories specified by -e, -l and -r are interpreted as profile directories. Only NfSen needs this option.

       -Y     Print the result in a parseable format. Only NfSen needs this option.
RETURN VALUE
       0      No error.
       255    Initialization failed.
       250    Internal error.

NOTES
       There are two ways to expire files: nfcapd in auto-expire mode (option -e) and nfexpire run by hand or periodically as a cron job. Both ways synchronize access to the files, therefore both can be run in parallel if required.

       Expiring by nfcapd in auto-expire mode: option -e
              If nfcapd is started with option -e, auto-expire mode is enabled. After each cycle (typically 5 min) nfcapd expires files according to the limits set with nfexpire using options -u, -s, -t and -w. If no limits are set initially, no files get expired.

       Expiring by nfexpire
              nfexpire can be run at any time to expire files. If an nfcapd collector process is running for the directory in question, nfexpire automatically syncs up with the files created by nfcapd since the last expire run and expires the files according to the limits set.

       Limits
              Files are expired according to two limits: the maximum disk space used by all files in the directory and the maximum lifetime of the data files, whichever limit is reached first. If one of the limits is hit, the expire process deletes files down to the watermark of that limit.
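       For example (a minimal sketch; /var/nfcapd/flows is a hypothetical data directory), the limits can be stored once in the stat file and a cron job can then expire against them:

              # store default limits: keep at most 500 GB and at most 90 days of data
              nfexpire -u /var/nfcapd/flows -s 500G -t 90d

              # run periodically (e.g. from cron) to expire files down to the stored limits
              nfexpire -e /var/nfcapd/flows

              # show what is currently kept in the directory
              nfexpire -l /var/nfcapd/flows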
SEE ALSO
       nfcapd(1)

BUGS

2009-09-09                                                          nfexpire(1)