Speeding up a Shell Script (find, grep and a for loop)


 
# 8  
Old 08-07-2008
Quote:
Originally Posted by Dave Stockdale
Code:
echo "Finding All PDFs..."
ls -R | grep .pdf > /tmp/pdfs/all_pdfs.out
echo "Done."

# Remove rubbish from list

echo "Removing Rubbish From List..."
sed 's|^\./[a-zA-Z0-9_ &./:]*$||g' /tmp/pdfs/all_pdfs.out > /tmp/pdfs/all_pdfs2.out
sed '/^$/d' /tmp/pdfs/all_pdfs2.out > /tmp/pdfs/all_pdfs.out
echo "Done."

You could trim this down to avoid using so many temporary files.

Code:
ls -R | sed -e '/\.pdf$/!d' -e 's|^\./[a-zA-Z0-9_ &./:]*$||g' -e '/^$/d' >/tmp/pdfs/all_pdfs.out

The first sed command is somewhat more specific than just grep .pdf -- because the unescaped . matches any character, grep .pdf accepts any character followed by "pdf" anywhere in the file name, whereas /\.pdf$/ looks specifically for .pdf at the end of the line. Maybe that's not what you want; if so, take out the $.
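To see the difference (the file names below are made up, purely for illustration):

Code:
# The unescaped . in "grep .pdf" matches any character, so any name that
# merely contains "pdf" slips through; the anchored sed keeps only names
# that actually end in .pdf.
printf '%s\n' report.pdf old_pdfs_list.txt notes.pdf.bak | grep .pdf
printf '%s\n' report.pdf old_pdfs_list.txt notes.pdf.bak | sed '/\.pdf$/!d'

The first pipeline prints all three names; the second prints only report.pdf.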

Quote:
Originally Posted by Dave Stockdale
Code:
echo "Finding All PDFs..."
# List all PDFs Linked to

echo "Gathering List of PDF Links..."
find . -name "*.htm*" -exec grep -o "[a-zA-Z0-9_]\{1,\}\.pdf" {} \; > /tmp/pdfs/all_links.out
find . -name "*.php" -exec grep -o "[a-zA-Z0-9_]\{1,\}\.pdf" {} \; >> /tmp/pdfs/all_links.out
echo "Done."

Also, you could run a single find here; that should reduce running time significantly if the directory tree is big.

Code:
 find . \( -name "*.htm*" -o -name "*.php" \) \
  -exec grep -o "[a-zA-Z0-9_]\{1,\}\.pdf" {} \; > /tmp/pdfs/all_links.out

(The wrapping with a backslash is insignificant; I just did that here to avoid getting a very wide forum posting. The parentheses, on the other hand, do matter: without them, -o binds the -exec to the second -name test only, so the *.htm* files would never be grepped.)
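If your find and grep support it, you could also batch the files so grep runs far fewer times. This is only a sketch -- {} + is standard find, but the -h option (which stops grep prefixing each match with the file name once it's given several files) isn't available in every grep, so check yours first.

Code:
# Batch many files into each grep invocation instead of one grep per file.
# -h suppresses the "filename:" prefix grep adds when searching multiple files.
find . \( -name "*.htm*" -o -name "*.php" \) \
  -exec grep -o -h "[a-zA-Z0-9_]\{1,\}\.pdf" {} + > /tmp/pdfs/all_links.out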
# 9  
Old 08-11-2008
Ok, thanks.

I've made these changes. It still takes a little while to complete, but that's due to the number of PDFs that aren't linked to, more than anything else.

After the first time this is run for real, and all of the PDFs not linked to are archived, the process will be much quicker. I'm probably going to add it to cron or something so that it runs once a week automatically, and then I won't have to worry about unused files wasting disk space.
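Something like this weekly crontab line is what I have in mind (the script path is just a placeholder for wherever I end up putting it):

Code:
# Hypothetical crontab line: run the PDF clean-up script at 03:00 every Sunday.
# /usr/local/bin/archive_unlinked_pdfs.sh is a placeholder path.
0 3 * * 0 /usr/local/bin/archive_unlinked_pdfs.sh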

Thanks for all your help!
 
