UNIX Desktop Questions & Answers: how to search files efficiently using patterns
Post 302135229 by vino, Friday 7 September 2007, 02:38 AM
If you want to use find, then use the -iname test for a case-insensitive match on the file name:

Code:
find /dir/to/search -iname "*.c*"
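
If you also need to match on file contents rather than just names, find can hand its results straight to grep. This is only a sketch, assuming GNU find and grep; "somepattern" is a placeholder, not something from the original post:

Code:
# List the *.c* files (case-insensitive name match) whose contents contain "somepattern".
find /dir/to/search -iname "*.c*" -exec grep -l "somepattern" {} +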

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

search patterns

Hello, I have an input file of about 5,000,000 lines. A few of its lines are as follows: <CR:0023498789,TPO-14987084;BO=IC&SUB=ALLP <CF:0023498789,CB=YES;BIL&NC=NO <CF:0023498789,CW=NO;NS=NO <GC:0023498789,CG=YES;TPO&NC=YES <CR:0024659841,TPO-14484621;BO=NO&BA=OC&SUB=ALLH... (1 Reply)
Discussion started by: rochitsharma
1 Reply

2. UNIX for Advanced & Expert Users

Best way to search for patterns in huge text files

I have the following situation: a text file with 50,000 string patterns: abc2344536 gvk6575556 klo6575556 .... and 3 text files, each with more than 1 million lines: ... 000000 abc2344536 46575 0000 000000 abc2344536 46575 4444 000000 abc2344555 46575 1234 ... I... (8 Replies) (A grep -F sketch for this kind of fixed-string search appears after this list.)
Discussion started by: andy2000
8 Replies

3. UNIX for Dummies Questions & Answers

script to search patterns inside list of files

>testfile while read x do if then echo $x >> testfile else fi if then echo $x >> testfile else fi done < list_of_files is there any efficient way to search abc.dml and xyz.dml ? (2 Replies)
Discussion started by: dr46014
2 Replies

4. Shell Programming and Scripting

Search multiple patterns in multiple files

Hi, I have to write a script that has to search for a list of numbers in certain zipped files. For example, one file, file1.txt, contains the numbers. File1.txt contains 500,000 numbers, and I have to search for each number in the zipped files (there are around 1,000 zipped files, each 5 MB). I have... (10 Replies) (See the zgrep note after this list.)
Discussion started by: vsachan
10 Replies

5. Shell Programming and Scripting

To read two files, search for patterns and store the output in a third file

Hello, I have two files, temp.txt and temp_unique.text; the second file contains the unique fields from temp.txt. The strings are stored in the following form: 4,4 17,12 15,65 4,4 14,41 15,65 65,89 1254,1298. I'm able to run the following script to get the total count of a... (3 Replies)
Discussion started by: vaibhavkorde
3 Replies

6. SuSE

Search all files based on a first pattern, then search those files for a second pattern

Hello Linux masters, I am not a Linux expert, therefore I need help from the Linux gurus. I have a requirement where I need to search all files based on a first pattern and then, in the files extracted by that first search, search for a second pattern.... (1 Reply)
Discussion started by: Black-Linux
1 Reply

7. Shell Programming and Scripting

Algorithm to load files efficiently without missing or accidentally archiving....

We have a requirement where we get the delta files every hour and we need to load them into an Oracle database every hour using PowerCenter. To do this efficiently we need to build a file management system. Here is our process: we get 6 files for 6 tables with a timestamp appended... (2 Replies)
Discussion started by: okkadu
2 Replies

8. Shell Programming and Scripting

Search for patterns in thousands of files

Hi all, I want to search for a certain string in thousands of files, and these files are distributed over different directories created daily. For that I created a small bash script, but while running it I get the error below: /ms.sh: xrealloc: subst.c:5173: cannot allocate... (17 Replies)
Discussion started by: danish0909
17 Replies

9. Shell Programming and Scripting

Efficiently altering and merging files in Perl

I have two files fileA HEADER LINE A CommentLine A Content A .... .... .... TAILER A fileB HEADER LINE B CommentLine B Content B .... .... .... TAILER B I want to merge these two files as HEADER LINE A CommentLine A Content A (4 Replies)
Discussion started by: sam05121988
4 Replies

10. Shell Programming and Scripting

Bash - Find files excluding file patterns and subfolder patterns

Hello. For a given folder, I want to select any files find $PATH1 -f \( -name "*" but omit any files like pattern name ! -iname "*.jpg" ! -iname "*.xsession*" ..... \) and also omit any subfolder like pattern name -type d \( -name "/etc/gconf/gconf.*" -o -name "*cache*" -o -name "*Cache*" -o... (2 Replies) (A find/-prune sketch for this kind of exclusion appears after this list.)
Discussion started by: jcdole
2 Replies
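
For searches like those in discussions 2 and 4, where a large list of fixed strings has to be found in big files, the usual efficient approach is to hand grep the whole pattern list once instead of looping over patterns in the shell. This is only a sketch, assuming GNU grep and zgrep are available; the file names are placeholders:

Code:
# -F treats the patterns as fixed strings, -f reads them from a file (one per line).
grep -F -f patterns.txt file1.txt file2.txt file3.txt
# zgrep passes the same options through to grep, so the gzipped files in discussion 4
# can be searched the same way without unpacking them first.
zgrep -F -f numbers.txt archive1.gz

Reading the pattern list once keeps the work to a single pass over each data file, which is what makes a 50,000-pattern search tractable.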
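
For the exclusions in discussion 10, the usual idiom is -prune, which stops find from descending into unwanted directories, while the file-name exclusions stay as ! -iname tests. This is only a sketch, assuming GNU find; $PATH1 and the patterns are illustrative:

Code:
# Skip cache-like directories entirely, then print regular files that are not
# JPEGs or .xsession files.
find "$PATH1" \( -type d \( -name "*cache*" -o -name "*Cache*" \) -prune \) -o \
  \( -type f ! -iname "*.jpg" ! -iname "*.xsession*" -print \)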