Full Discussion: Extrange Find command Issue
UNIX for Dummies Questions & Answers: Extrange Find command Issue. Post 302342587 by juanklavera on Monday 10th of August 2009, 09:33:29 AM
I'm getting the same error both ways.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

An issue with find command.

Hi all, I have a shell script (ksh) which has the following code. ------------------ cd $mydir for i in `find ./ -type f -mtime +$k` do echo $i done ----------------------- And in $mydir I have some files which have spaces in their names, like "Case att15". The output of the... (6 Replies)
Discussion started by: rajugp1
6 Replies
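The breakage described above comes from the backquoted for loop word-splitting file names on spaces. A minimal sketch of one common fix (not from the thread), assuming ksh as in the post; the directory and the +$k age test are kept from the original:

    cd "$mydir"
    # Pipe find into a read loop so each whole line, spaces included, lands in $i
    find . -type f -mtime +"$k" | while IFS= read -r i
    do
        echo "$i"
    done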

2. UNIX for Dummies Questions & Answers

Issue with find command using links

Hi, Having a simple issue with the find command on Sun. The command works fine if the variable is set to the actual filesystem, but fails when the variable is set to a link pointing to the same filesystem. export DUMPDEST=/oradata1/exports/pbm - Set the variable ... (2 Replies)
Discussion started by: win_vin
2 Replies
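By default, find examines the symbolic link named on the command line rather than the filesystem it points to, which would explain the failure above. A hedged sketch of two common workarounds; the -type f and -mtime tests are illustrative since the original command is truncated, and -H is only usable where find implements that POSIX option:

    export DUMPDEST=/oradata1/exports/pbm     # link pointing at the real filesystem

    # Appending /. makes find start at the link's target instead of the link itself
    find "$DUMPDEST/." -type f -mtime +7 -print

    # POSIX alternative: -H dereferences only links named on the command line
    find -H "$DUMPDEST" -type f -mtime +7 -print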

3. Shell Programming and Scripting

Issue with Find command on Linux

Hi, I am issuing the find command in the below-mentioned ways but it gives different counts. I don't understand the behaviour. Does anyone have a clue? $ find . -mtime -5 -maxdepth 1 -exec ls -lrt {} \; | wc -l 169 $ find . -mtime -5 -maxdepth 1 | wc -l 47 $ find . -mtime -5 -maxdepth 1 | wc -l... (2 Replies)
Discussion started by: siba.s.nayak
2 Replies
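The mismatch above is usually because -exec ls -lrt {} \; lists the contents of every matched directory (plus a "total" line), while the plain find prints exactly one line per match. A sketch of comparing like with like, assuming GNU find as in the post; putting -maxdepth before the other tests also avoids a warning:

    # ls -d (here -lrtd) reports the directory entry itself rather than its
    # contents, so both pipelines now count the same set of matches
    find . -maxdepth 1 -mtime -5 -exec ls -lrtd {} \; | wc -l
    find . -maxdepth 1 -mtime -5 | wc -l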

4. UNIX for Dummies Questions & Answers

Find command issue

I am currently using the below command to get the first three characters of a file (PDM). The issue is, when I use the find command in the root dir, it finds all the files in subdirectories as well. How do I limit the find command search to a given path only (i.e. only find files in the apps/cmplus/datamigration/data path... (3 Replies)
Discussion started by: abhi_n123
3 Replies
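For the depth problem above, a minimal sketch of two common ways to keep the search in one directory; the path is taken from the post and the 'PDM*' prefix is an assumed illustration. -maxdepth is a GNU find extension, the -prune form is portable:

    # GNU find: stay in the named directory, do not descend into subdirectories
    find apps/cmplus/datamigration/data -maxdepth 1 -type f -name 'PDM*'

    # Portable alternative: -prune stops descent into every top-level entry
    find apps/cmplus/datamigration/data/* -prune -type f -name 'PDM*' -print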

5. Shell Programming and Scripting

Issue with Find Command

Hi All, I'm a bit new to the Linux environment, moderately okay when it comes to Unix AIX. I'm facing an issue while trying to run a simple find command: $ for file in `find . -name *.*` > do > ls $file > done This is throwing the following error: Strangely, a few minutes... (4 Replies)
Discussion started by: adi_2_chaos
4 Replies
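The unquoted *.* above is expanded by the shell before find ever runs, which is the usual cause of this kind of intermittent error. A hedged sketch of the quoted, whitespace-safe form:

    # Quote the pattern so find, not the shell, interprets it, and read the
    # results line by line so names with spaces stay whole
    find . -type f -name '*.*' | while IFS= read -r file
    do
        ls -l "$file"
    done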

6. Linux

find command issue

Hi, I am not a root user. I am trying to find the files which contain the pattern "fvsfile" under the root directory. If I run the find command I get permission denied, and all the files are listed, including the pattern files; I can't get just the file names yet. find . print | xargs grep -i "fvsfile" I want... (2 Replies)
Discussion started by: Mani_apr08
2 Replies
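A sketch of one common way to get only the names of matching files while silencing the permission errors a non-root user will hit; the pattern and the root starting point are from the post, the rest is illustrative:

    # -type f skips directories, 2>/dev/null hides "Permission denied" noise,
    # and grep -il prints only the names of files containing the pattern
    find / -type f -print 2>/dev/null | xargs grep -il "fvsfile" 2>/dev/null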

7. Shell Programming and Scripting

Performance issue while using find command

Hi, I have created a shell script for a server log automation process. I have used a find | xargs grep command to search for the string. For example: find -name | xargs grep "816995225" > test.txt . My problem here is, we have a lot of records and we want to grep the string... (4 Replies)
Discussion started by: nanthagopal
4 Replies
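One frequently suggested speed-up for the above (not from the original thread) is to batch the exec and let grep stop at the first match in each file. A minimal sketch; the search string is from the post, while the log directory and the '*.log' pattern are assumed values:

    # -exec ... {} + hands many files to each grep invocation instead of one per
    # file, and -l makes grep stop reading a file at its first match
    find /var/log/app -type f -name '*.log' -exec grep -l "816995225" {} + > test.txt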

8. Shell Programming and Scripting

Find command issue

Guys, Here is my requirement.. Sample.cfg file="*log.gz *txt.gz" sample.sh #!/bin/sh . $HOME/Sample.cfg find . -name "$file" -mtime +20 -exec ls -la {} \; It's not finding the given *log.gz and *txt.gz files. Could anyone please help me? (8 Replies)
Discussion started by: AraR87
8 Replies
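With file="*log.gz *txt.gz", the single -name "$file" test above looks for one literal name containing a space, so nothing matches. A hedged sketch of one way to keep the patterns in Sample.cfg and still test each of them:

    #!/bin/sh
    . $HOME/Sample.cfg                 # sets: file="*log.gz *txt.gz"

    set -f                             # keep the shell from globbing the patterns
    # Run one find per space-separated pattern from the config file
    for pattern in $file
    do
        find . -name "$pattern" -mtime +20 -exec ls -la {} \;
    done
    set +f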

9. Shell Programming and Scripting

Issue in Find and mv command

Hi, I am using the below code to find and mv the files. The files are moved to the target location as expected, but find displays some errors like those below. find ./ -name "Archive*" -mtime +300 -exec mv {} /mnt/X/ARC/ \; find: `./Archive_09-30-12': No such file or directory find:... (6 Replies)
Discussion started by: rakeshkumar
6 Replies
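The errors above appear because find tries to descend into directories that the -exec mv has already moved away. A minimal sketch of the usual -prune remedy, keeping the paths and age test from the post:

    # -prune keeps find from descending into a matched Archive* directory, so it
    # never looks for children that mv has already relocated
    find ./ -name "Archive*" -mtime +300 -prune -exec mv {} /mnt/X/ARC/ \;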

10. Shell Programming and Scripting

Find command issue

Hi Guys, I have a file called error.logs. I am just trying to display the content of the file if it was modified within the last day. I tried the below command but it doesn't give the proper output. find /u/text/vinoth/bin "error.logs" -mtime -1 -exec cat {} \; >> mail.txt Any help is much... (21 Replies)
Discussion started by: Vinoth Kumar G
21 Replies
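As written, the command above has no -name test, so "error.logs" is treated as a second starting path and everything recently modified gets dumped. A sketch of the presumably intended invocation, using the path and file name from the post:

    # -name restricts the match to the log file; -mtime -1 keeps files modified
    # within the last 24 hours; cat appends their contents to mail.txt
    find /u/text/vinoth/bin -name "error.logs" -mtime -1 -exec cat {} \; >> mail.txt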
nfexpire(1)															       nfexpire(1)

NAME
       nfexpire - data expiry program

SYNOPSIS
       nfexpire [options]

DESCRIPTION
       nfexpire is used to manage the expiration of old netflow data files created by nfcapd(1) or other data collectors such as
       sfcapd(1). Data expiration is done either by nfcapd(1) in auto-expiry mode, or by nfexpire, which can be run at any time or
       at any desired interval by cron. nfexpire can also safely be run while nfcapd auto-expires files, for cleaning up full disks
       etc. nfexpire is aware of sub-directory hierarchies and handles any format automatically. For fast and efficient expiration,
       nfexpire creates and maintains a stat file named .nfstat in the data directory. Any directory supplied with the options
       below corresponds to the data directory supplied to nfcapd(1) using option -l.

OPTIONS
       -l directory
              List current data statistics for the directory.

       -r directory
              Rescan the specified directory to update the statfile. To be used only when an explicit update is required. Usually
              nfexpire takes care of rescanning itself when needed.

       -e datadir
              Expire files in the specified directory. Expiry limits are taken from the statfile (see -u) or from the supplied
              options -s, -t and -w. Command line options override the statfile values; however, the statfile limits are not
              changed.

       -s maxsize
              Set the size limit for the directory. The specified limit accepts values such as 100M, 100MB, 1G, 1.5G etc. Accepted
              size factors are K, KB, M, MB, G, GB and T, TB. If no factor is supplied, bytes (B) are assumed. A value of 0
              disables the max size limit.

       -t maxlife_time
              Set the maximum lifetime for files in the directory. The supplied maxlife_time accepts values such as 31d, 240H,
              1.5d etc. Accepted time scales are w (weeks), d (days) and H (hours). If no scale is given, H (hours) is assumed. A
              value of 0 disables the max lifetime limit.

       -u datadir
              Update the max size and lifetime limits specified by -s, -t and -w and store them in the statfile as default values.
              A running nfcapd(1) process doing auto expiry will pick up these new values starting with the next expiry cycle. The
              next nfexpire run doing file expiration will use these new limits unless -s, -t or -w is specified.

       -w watermark
              Set the watermark in % for expiring data. If a limit is hit, files are expired down to this level in % of that
              limit. If not set, the default is 95%.

       -h     Print help text on stdout with all options and exit.

       -p     Directories specified by -e, -l and -r are interpreted as profile directories. Only NfSen needs this option.

       -Y     Print the result in parseable format. Only NfSen needs this option.

RETURN VALUE
       Returns
       0      No error.
       255    Initialization failed.
       250    Internal error.

NOTES
       There are two ways to expire files: nfcapd in auto-expire mode (option -e) and nfexpire run by hand or periodically as a
       cron job. Both ways synchronize access to the files, therefore both can run in parallel if required.

       Expiring by nfcapd in auto-expire mode: option -e
       If nfcapd is started with option -e, auto-expire mode is enabled. After each cycle (typically 5 min) nfcapd expires files
       according to the limits set with nfexpire using options -u, -s, -t and -w. If no limits have been set initially, no files
       are expired.

       Expiring by nfexpire
       nfexpire can be run at any time to expire files. If an nfcapd collector process is running for the directory in question,
       nfexpire automatically syncs up with the files created since the last expire run and expires them according to the limits
       set.

       Limits
       Files are expired according to two limits: the maximum disk space used by all files in the directory and the maximum
       lifetime of the data files, whichever limit is reached first. If one of the limits is hit, the expire process deletes files
       down to the watermark of that limit.
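       A short usage sketch (not part of the original manual page), using only the options documented above; the data directory
       and the limit values are illustrative:

           # Store a 100 GB size limit, 90-day lifetime and 90% watermark in the statfile;
           # a running nfcapd in auto-expire mode picks the new limits up on its next cycle
           nfexpire -u /var/netflow/router1 -s 100G -t 90d -w 90

           # Expire files now against the stored limits, e.g. from cron when a disk fills up
           nfexpire -e /var/netflow/router1

           # Show the current statistics for the data directory
           nfexpire -l /var/netflow/router1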
SEE ALSO
       nfcapd(1)

BUGS
2009-09-09 nfexpire(1)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.