Why do these 2 find commands return different results?
Hi,
I am using the Korn shell on a Solaris box.
Why do the following 2 commands return different results?
This command returns no results (I already used this command to create a list of files which I moved to an archive directory):
Quote:
find ????10??_*.dat -type f -mtime +91
However this command returns results (I expected it to return no results, since the pattern is the same):
Quote:
find . -name '????10??_*.dat' -type f -mtime +91
I know I'm missing something stupid, I just don't know what!
Thanks in advance.
Your first command will not behave in a reliable recursive way.
It will find a file like ????10??_*.dat ONLY if it IS in a directory whose name ALSO matches ????10??_*.dat (i.e. ????10??_*.dat/????10??_*.dat).
Otherwise it won't be found.
The first argument of the find command should basically be a PATH; if it isn't a directory, the search won't be recursive.
Check the man page: the -name option expects a filename pattern, whereas the first argument just after "find" is a path.
In fact your first command says:
find any filename matching ????10??_*.dat in my current directory (not recursively) that matches -mtime +91;
find any filename (modified more than 91 days ago) that is in any directory whose name matches ./????10??_*.dat
(and that's not recursive: if a file is at ./foobar/xxxx10xx_xxx.dat, it won't be matched.)
Whereas the second command says:
Find, recursively from the current directory, any file whose name matches ????10??_*.dat and was modified more than 91 days ago.
(Maybe I'm missing some point about those 2 commands, but that is mainly how I understand it: in fact the first call is syntactically incorrect unless the specified pattern matches directories...)
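A quick way to see the difference in action (a minimal sketch; the tree, directory and file names are made up for illustration, and -mtime +91 is left out because the demo files are freshly created):

```shell
# A throwaway tree: one matching file at top level,
# one hidden in a subdirectory.
demo=$(mktemp -d)
mkdir -p "$demo/archive"
touch "$demo/abcd10ef_top.dat" "$demo/archive/abcd10ef_sub.dat"
cd "$demo"

# First form: the SHELL expands the pattern, so find is handed
# the plain filename "abcd10ef_top.dat" as its path argument.
# No recursion happens; only the top-level file is reported.
find ????10??_*.dat -type f

# Second form: the quoted pattern reaches find untouched, and
# "." is the start path, so find recurses and reports BOTH files.
find . -name '????10??_*.dat' -type f
```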
The above command syntax is unreliable but may sort of work if the files are in the current directory (because the shell expands the filenames).
The following command would have been safer. The start directory field is "." (current directory). The pattern is in single quotes, which stops the shell from expanding it and lets "find" match the filenames itself.
I have just read ctsgnb's post. The first parameter to "find" is a "pathname list". This can be directory names or filenames but the command will fail if the "pathname list" refers to items which do not exist.
I rarely put a filename in the "pathname" list, but it can be quick and useful for deciding whether an individual existing file is older than a given number of days without searching an entire tree.
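For example (a sketch; the filename and timestamp are hypothetical):

```shell
# Prints the filename only if the file was modified more than
# 91 days ago; prints nothing (and still exits 0) otherwise.
touch -t 202001010000 myfile.dat     # fake an old timestamp for the demo
find myfile.dat -mtime +91
```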
The point is: if the pathname passed to find is not a directory, the search will not be recursive; find will just display the matching file as-is, but not files in subdirectories (... except if a subdirectory itself matches the given pattern as well).
Thanks for your time and the excellent explanation. D'oh, I knew it would end up being something simple that I was missing.
There were files in a subdirectory that matched the pattern.
So this only searches for files matching the pattern in the current directory:
Quote:
find ????10??_*.dat -type f -mtime +91
Whereas this searches the current directory and all sub-directories for files matching the pattern:
Quote:
find . -name '????10??_*.dat' -type f -mtime +91
Is there a way to make
Quote:
find . -name '????10??_*.dat' -type f -mtime +91
NOT search any sub-directories - just the current directory? I have looked at the man page and tried -prune but can't get it to work.
If you don't want to search subdirectories, then you could drop the find command and use ls instead.
I have to create a file containing a list of files that I have to perform a few actions on.
Unfortunately ls doesn't give me a way of selecting files that both match a pattern and fall within a pre-defined time period. The find command gives me the -mtime option to select only files modified more than a certain number of days ago.
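For what it's worth, one way to keep find but stop it descending (a sketch: GNU find has -maxdepth 1, but stock Solaris /usr/bin/find does not, so the portable -prune idiom is shown instead):

```shell
# Prune (refuse to descend into) every entry that is not "."
# itself, so only the current directory is examined.
find . \( ! -name . -prune \) -name '????10??_*.dat' -type f -mtime +91
```

The parenthesised part is true for every entry except ".", and -prune stops find from entering any subdirectory it matches, so only depth-1 entries are tested against -name/-type/-mtime.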