Shell Programming and Scripting: Search for patterns in thousands of files
Post 302785689 by PikK45, Tuesday 26th of March 2013, 07:23:47 AM
Are there any files with a .GPX.Z extension in the "/comptel4/elink/backup1/output/vas/NG0/" directory or its sub-directories?
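For example, something along these lines (untested here) should tell you quickly:

# list any .GPX.Z files anywhere under that directory tree
# (case-sensitive match; use -iname instead of -name if the case may vary)
find /comptel4/elink/backup1/output/vas/NG0/ -type f -name '*.GPX.Z' | head

If it prints nothing, there are no such files under that directory.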
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Finding a specific pattern from thousands of files?

Hi All, I want to find a specific pattern in approximately 400000 files on the Solaris platform. It's very heavy for me to grep for that pattern in each file individually. Can anybody suggest a way to search for a specific (alphanumeric) pattern across these files? Please note that... (6 Replies)
Discussion started by: aarora_98
6 Replies
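A sketch of the usual approach to this kind of problem (the directory and pattern below are placeholders): let find hand the files to grep in batches instead of looping over them one at a time.

# print the names of files that contain the pattern; "{} +" batches many
# file names per grep invocation, so grep is not started once per file
find /path/to/files -type f -exec grep -l 'PATTERN' {} +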

2. UNIX Desktop Questions & Answers

how to search files efficiently using patterns

Hi friends, :) if I need to find files with the extensions .c++, .C++, .cpp, .Cpp, .CPp, .cPP, .CpP, .cpP, .c, .C, what is the pattern for finding them? :confused: (2 Replies)
Discussion started by: arunsubbhian
2 Replies
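With a find that supports -iname (GNU find and most modern versions do), one case-insensitive pattern per extension is enough; a sketch:

# -iname matches the extension regardless of case, so '*.cpp' covers
# .cpp, .Cpp, .CPP, ... and '*.c' covers both .c and .C
find . -type f \( -iname '*.cpp' -o -iname '*.c++' -o -iname '*.c' \)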

3. UNIX for Advanced & Expert Users

Copying Thousands of Tiny or Empty Files?

There is a procedure I do here at work where I have to synchronize file systems. The source file system always has three or four directories of hundreds of thousands of tiny (1k or smaller) or empty files. Whenever my rsync command reaches these directories, I'm waiting for hours for those files... (3 Replies)
Discussion started by: deckard
3 Replies
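For the first full copy of directories like these, a tar pipe is a commonly suggested alternative, since it avoids rsync's per-file bookkeeping; the source and destination paths below are placeholders, and later incremental runs can still use rsync.

# stream the whole tree as a single tar archive and unpack it on the other side
tar -C /source/dir -cf - . | tar -C /dest/dir -xf -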

4. UNIX for Advanced & Expert Users

Best way to search for patterns in huge text files

I have the following situation: a text file with 50000 string patterns: abc2344536 gvk6575556 klo6575556 .... and 3 text files each with more than 1 million lines: ... 000000 abc2344536 46575 0000 000000 abc2344536 46575 4444 000000 abc2344555 46575 1234 ... I... (8 Replies)
Discussion started by: andy2000
8 Replies
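The standard tool for this is grep -F -f, which reads the 50000 strings from a file and treats them as fixed (non-regex) patterns; the file names below are placeholders:

# print every line of the big files that contains any of the fixed strings;
# -F skips regex interpretation, which is much faster for literal patterns
grep -F -f patterns.txt bigfile1 bigfile2 bigfile3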

5. Shell Programming and Scripting

help to parallelize work on thousands of files

I need to find a smarter way to process about 60,000 files in a single directory. Every night a script runs on each file, generating an output file in another directory; this used to take 5 hours, but as the data grows it is taking 7 hours. The files are of different sizes, but there are 16 cores... (10 Replies)
Discussion started by: vhope07
10 Replies
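One way to spread the per-file work across the 16 cores is xargs -P (available in GNU findutils and modern BSDs); process_one.sh here stands in for whatever the nightly script does to a single file:

# run at most 16 copies of the per-file job at the same time
find /data/dir -type f -print0 | xargs -0 -n 1 -P 16 ./process_one.sh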

6. UNIX for Dummies Questions & Answers

script to search patterns inside list of files

>testfile
while read x
do
    if
    then
        echo $x >> testfile
    else
    fi
    if
    then
        echo $x >> testfile
    else
    fi
done < list_of_files
(The test conditions were lost in the original post.) Is there any efficient way to search for abc.dml and xyz.dml? (2 Replies)
Discussion started by: dr46014
2 Replies
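Assuming the goal is to collect the names of the listed files that contain either string, and that the file names contain no spaces, a sketch:

# grep -l prints each file name at most once if any of the patterns match
xargs grep -l -e 'abc.dml' -e 'xyz.dml' < list_of_files >> testfile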

7. Shell Programming and Scripting

to read two files, search for patterns and store the output in third file

Hello, I have two files, temp.txt and temp_unique.text; the second file contains the unique fields from the temp.txt file. The strings stored are in the following form: 4,4 17,12 15,65 4,4 14,41 15,65 65,89 1254,1298. I'm able to run the following script to get the total count of a... (3 Replies)
Discussion started by: vaibhavkorde
3 Replies
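If the aim is to count, for each unique pair, how often it occurs as a whole line in temp.txt, something like this sketch would do (counts.txt is an invented name for the third file):

# -F: fixed string, -x: whole-line match, -c: count matching lines
while read pair
do
    printf '%s %s\n' "$pair" "$(grep -cxF "$pair" temp.txt)"
done < temp_unique.text > counts.txt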

8. SuSE

Search all files for a first pattern, then search the matching files for a second pattern

Hello Linux masters, I am not a Linux expert, therefore I need help from the Linux gurus. I have a requirement where I need to search all files for a first pattern, and then search for a second pattern only in the files which I extracted based on the first pattern.... (1 Reply)
Discussion started by: Black-Linux
1 Replies
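The two passes can be chained: the first grep -l produces the names of files that match the first pattern, and only those names are handed to the second grep (patterns and path are placeholders; assumes file names without spaces):

# files containing PATTERN1, then, among those, files also containing PATTERN2
grep -l 'PATTERN1' /some/dir/* | xargs grep -l 'PATTERN2'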

9. Shell Programming and Scripting

Bash-awk to process thousands of files

Hi to all, I have thousands of files in a folder with names in the format "FILE-YYYY-MM-DD-HHMM", on which I want to run the following AWK command: awk '/Code.*/' FILE-2014*. I'd like to separate all files that have the same date into a folder named with the corresponding date. For example, if I... (7 Replies)
Discussion started by: Ophiuchus
7 Replies
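Since the date is a fixed slice of the file name (the ten characters right after "FILE-"), a small bash loop can create the per-date folders and move the files; a sketch:

# ${f:5:10} extracts e.g. "2014-03-26" from "FILE-2014-03-26-1230" (bash syntax)
for f in FILE-*-*-*-*
do
    d=${f:5:10}
    mkdir -p "$d"
    mv "$f" "$d"/
done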

10. Shell Programming and Scripting

Bash - Find files excluding file patterns and subfolder patterns

Hello. For a given folder, I want to select files with find $PATH1 -f \( -name "*" ... \), but omit any files matching patterns like ! -iname "*.jpg" ! -iname "*.xsession*" ..... \) and also omit any subfolders matching patterns like -type d \( -name "/etc/gconf/gconf.*" -o -name "*cache*" -o -name "*Cache*" -o... (2 Replies)
Discussion started by: jcdole
2 Replies
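The usual shape of such a command is to -prune the unwanted directories first and filter the file names afterwards; the patterns here just illustrate the ones quoted in the post:

# skip cache-like directories entirely, then keep only files that are
# neither *.jpg nor *.xsession*
find "$PATH1" \
     \( -type d \( -name '*cache*' -o -name '*Cache*' \) -prune \) -o \
     \( -type f ! -iname '*.jpg' ! -iname '*.xsession*' -print \)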
GENDIFF(1)                    General Commands Manual                    GENDIFF(1)

NAME
       gendiff - utility to aid in error-free diff file generation

SYNOPSIS
       gendiff <directory> <diff-extension>

DESCRIPTION
       gendiff is a rather simple script which aids in generating a diff file from a single directory. It takes a directory name and a "diff-extension" as its only arguments. The diff extension should be a unique sequence of characters added to the end of all original, unmodified files. The output of the program is a diff file which may be applied with the patch program to recreate the changes.

       The usual sequence of events for creating a diff is to create two identical directories, make changes in one directory, and then use the diff utility to create a list of differences between the two. Using gendiff eliminates the need for the extra, original and unmodified directory copy. Instead, only the individual files that are modified need to be saved.

       Before editing a file, copy the file, appending the extension you have chosen to the filename. I.e. if you were going to edit somefile.cpp and have chosen the extension "fix", copy it to somefile.cpp.fix before editing it. Then edit the first copy (somefile.cpp).

       After editing all the files you need to edit in this fashion, enter the directory one level above where your source code resides, and then type

              $ gendiff somedirectory .fix > mydiff-fix.patch

       You should redirect the output to a file (as illustrated) unless you want to see the results on stdout.

SEE ALSO
       diff(1), patch(1)

AUTHOR
       Marc Ewing <marc@redhat.com>

4th Berkeley Distribution              Mon Jan 10 2000                    GENDIFF(1)
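A minimal walk-through of the sequence the page describes, with illustrative file and directory names:

cd somedirectory
cp somefile.cpp somefile.cpp.fix    # keep the unmodified copy under the chosen extension
vi somefile.cpp                     # edit the working copy
cd ..
gendiff somedirectory .fix > mydiff-fix.patch   # diff of each *.fix file against its edited counterpart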