Top Forums Shell Programming and Scripting Bash Scripting (New); Run multiple greps > multiple files Post 302753375 by LDHB2012 on Tuesday 8th of January 2013 01:13:49 PM
That seems to work like a charm.

So if I wanted to create a separate file that has the contents of EVERYTHING else (not included in the "infile"), how would that else statement look?
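
Without the earlier script from this thread to hand, here is a minimal sketch of one way the catch-all could work, assuming "infile" holds one pattern per line and that the data being split lives in a hypothetical data.txt (both names are assumptions):

#!/bin/bash
# Sketch only: "infile" is assumed to hold one pattern per line, and
# data.txt (a made-up name) is the file being split by the greps.

# One output file per pattern, as in the original approach; using the
# pattern text inside the file name is purely for illustration.
while IFS= read -r pat; do
    grep -e "$pat" data.txt > "matched_${pat}.txt"
done < infile

# The "else" case: -v inverts the match and -f reads every pattern from
# infile, so only lines that match none of the patterns end up here.
grep -v -f infile data.txt > everything_else.txt

If the patterns are fixed strings rather than regular expressions, add -F to both grep calls.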
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

When I try to run rm on multiple files I have a problem deleting files with spaces

Hello, when I try to run rm on multiple files I have a problem deleting files with spaces. I have this command: find . -name "*.cmd" | xargs \rm -f It does the work fine, but when it comes across files with spaces like "my foo file.cmd" it refuses to delete them. Why? (1 Reply)
Discussion started by: umen
1 Replies
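
The usual fix for the spaces problem above is to hand xargs NUL-terminated names, so whitespace never splits them:

# -print0 / -0 pass names separated by NUL bytes, so embedded spaces are safe.
find . -name "*.cmd" -print0 | xargs -0 rm -f

# Equivalent without xargs: let find run rm itself.
find . -name "*.cmd" -exec rm -f {} +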

2. Shell Programming and Scripting

Run Multiple Functions over SSH (BASH)

I am trying to write a script that will ssh into a remote machine and recurse through a specified directory, find mp3 files which may be two or three directories deep (think iTunes: music/artist/album/song.mp3), and scp them back to the machine running the script. The script should also maintain... (3 Replies)
Discussion started by: johnnybg00de
3 Replies
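
One way to tackle the mp3 question above without a per-file scp is rsync over ssh, whose filter rules copy only *.mp3 while recreating the artist/album tree; the host and path names below are placeholders:

# -a keeps the directory structure, -m prunes directories left empty;
# the include/exclude chain keeps directories and *.mp3 and drops the rest.
rsync -a -m \
      --include='*/' --include='*.mp3' --exclude='*' \
      user@remotehost:music/ ./music/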

3. Shell Programming and Scripting

How to run multiple awk files

I'm trying something like this, but it's not working. It worked for bash files. Now I want something like that with multiple input files, redirecting their outputs as inputs to the next command, like below. Could you guys please help me with this? #!/usr/bin/awk -f BEGIN { } script1a.awk... (2 Replies)
Discussion started by: repinementer
2 Replies
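
If the aim in the post above is to feed each awk script's output into the next one, a plain pipeline does it; only script1a.awk comes from the post, the other names are placeholders:

# Each awk program reads the previous one's output on stdin.
awk -f script1a.awk input.txt | awk -f script2.awk | awk -f script3.awk > final.out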

4. Shell Programming and Scripting

Run perl script on files in multiple directories

Hi, I want to run a Perl script on multiple files with the same name ("Data.txt") but in different directories (e.g. 2010_06_09_A/Data.txt, 2010_06_09_B/Data.txt). I know how to run this Perl script on files in the same directory, like: for i in *.txt do perl myscript.pl $i > $i.new... (8 Replies)
Discussion started by: ad23
8 Replies
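
For the Data.txt question above, looping over the directories instead of the files keeps the poster's own pattern; the glob assumes the dated directory names from the post:

# One run per dated directory; each result lands next to its input.
for d in 2010_06_09_*/ ; do
    perl myscript.pl "${d}Data.txt" > "${d}Data.txt.new"
done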

5. UNIX for Dummies Questions & Answers

Run one script on multiple files and print out multiple files.

How can I run the following command on multiple files and print out the corresponding multiple files? perl script.pl genome.gff 1.txt > 1.gff However, there are multiple files like 1.txt, from 1.txt through 100.txt. Thank you so much. No duplicate posting! Continue here. (0 Replies)
Discussion started by: grace_shen
0 Replies

6. Shell Programming and Scripting

Run one script on multiple files and print out multiple files.

How can I run one script on multiple files and print out multiple files? For example, I want to run script.pl on 100 files named 1.txt ... 100.txt in the same directory and print out the corresponding files 1.gff ... 100.gff. Thanks. (4 Replies)
Discussion started by: grace_shen
4 Replies
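
A numeric loop covers the fixed 1..100 range in the two posts above; the command line follows the one quoted in the earlier duplicate thread:

# Process 1.txt through 100.txt, producing 1.gff through 100.gff.
for i in $(seq 1 100); do
    perl script.pl genome.gff "$i.txt" > "$i.gff"
done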

7. Shell Programming and Scripting

Run script on multiple files

I have a script that I need to run on one file at a time. Unfortunately using for i in F* or cat F* is not possible. When I run the script using that, it jumbles the files and they are out of order. Here is the script: gawk '{count++; keyword = $1} END { for (k in count) {if (count == 2)... (18 Replies)
Discussion started by: newbie2010
18 Replies
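
The gawk program above is truncated in the excerpt, but the per-file part is simple: loop over the glob so each file is processed on its own, in the shell's sorted glob order (count_keywords.awk is a stand-in name for the quoted program):

# One gawk run per file; F* expands in sorted order.
for f in F*; do
    gawk -f count_keywords.awk "$f" > "$f.out"
done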

8. UNIX for Dummies Questions & Answers

Run script on multiple files

Hi guys, I've been having a look around to try and understand how I can do the below, however I haven't come across anything that will work. Basically I have a parser script that I need to run across all files in a certain directory. I can do this one by one on the command line, however I... (1 Reply)
Discussion started by: mutley2202
1 Replies
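
Assuming the parser in the post above takes one file name as its argument (the script and directory names are placeholders), a loop over the directory does it:

# Run the parser once per regular file in the directory.
for f in /path/to/dir/*; do
    [ -f "$f" ] && ./parser.sh "$f"
done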

9. Shell Programming and Scripting

Ssh to multiple hosts and then run multiple for loops under remote session

Hello, I am trying to log in to multiple servers and I have to run multiple loops to gather some details. Could you please help me out? I am specifically facing issues while running for loops. I have to run multiple for loops in an else condition, but the below code is giving errors in for... (2 Replies)
Discussion started by: mohit_vardhani
2 Replies
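
For the multi-host question above, quoting the remote commands as a here-document keeps the remote for loops intact; the hosts and the inner commands are placeholders:

# The quoted 'EOF' stops the local shell from expanding anything, so the
# for loop runs entirely on the remote side.
for host in host1 host2 host3; do
    ssh "$host" 'bash -s' <<'EOF'
for fs in / /var /tmp; do
    df -h "$fs"
done
EOF
done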

10. UNIX for Beginners Questions & Answers

Combining multiple greps

I'm trying to learn about regular expressions. Let's say I want to list all the files in /usr/bin beginning with "p", ending with "x", and containing an "a". I know this works: ls | grep ^p | grep x$ | grep a but I'm thinking there must be a way to do it without typing grep three times. Some of my... (9 Replies)
Discussion started by: Xubuntu56
9 Replies
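
The three greps in the last post collapse into a single regular expression: anchor the p and the x and require an a somewhere in between:

# ^p     starts with p
# .*a.*  contains an a between the p and the x
# x$     ends with x
ls | grep '^p.*a.*x$'

A shell glob gives the same list without grep at all: ls p*a*x (provided at least one file matches).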
MAKE_SOCKDFR(8) 					      System Manager's Manual						   MAKE_SOCKDFR(8)

NAME
       make_sockdfr - Generates frozen route file for SOCKS server

SYNOPSIS
       make_sockdfr [infile [outfile]]

DESCRIPTION
       make_sockdfr reads in a plain-text route file for the SOCKS server and produces a frozen route file as the output. Both arguments are optional. The default for infile is /etc/sockd.route; the default for outfile is /etc/sockd.fr. You may specify infile while omitting outfile, but you cannot specify outfile without also specifying infile.

       The contents of the frozen route file are essentially the memory image of the parsed input file. Using the frozen route file can reduce the start-up delay of the SOCKS server program, since it no longer has to parse the file contents.

       When the SOCKS server starts, it always looks for the frozen route file /etc/sockd.fr first. If that file is not found, it then tries to use the plain-text route file /etc/sockd.route. If you use a frozen route file, you must remember to run make_sockdfr every time you modify the plain-text file, or the SOCKS server will continue to use the frozen version of a previous route file.

       To find out the contents of a frozen route file, use dump_sockdfr.

FILES
       /etc/sockd.fr, /etc/sockd.route

SEE ALSO
       dump_sockdfr(8), sockd.fr(5), sockd.route(5)

AUTHOR
       Ying-Da Lee, yingda@esd.sgi.com or yingda@best.com

May 6, 1996                                                    MAKE_SOCKDFR(8)
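
A short usage sketch (the explicit file names simply restate the documented defaults, and dump_sockdfr is assumed to take the frozen file as its argument; see dump_sockdfr(8)):

# Regenerate the frozen route file after editing the plain-text one, so
# sockd does not keep serving routes from a stale /etc/sockd.fr.
vi /etc/sockd.route
make_sockdfr /etc/sockd.route /etc/sockd.fr   # same as running it with no arguments

# Inspect the result.
dump_sockdfr /etc/sockd.fr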