Loop awk command on files in a folder
Post 302959825 by dovah, Friday 6 November 2015, 08:52 AM
Sorry, I should have mentioned: "doesn't work" meant that I got empty files :/

Also, I got it working, thanks to one of your previous replies to another post (How to use a loop for multiple files in a folder to run awk command?)

Code:
# Keep rows whose 10th tab-separated field is "S"; globbing directly avoids parsing ls output
for FN in *.subVCF; do awk -F '\t' 'BEGIN{OFS="\t"} $10=="S"' "$FN" > "${FN/subVCF/txt}"; done
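
If the outputs ever come back empty again, the first thing to check is that the input really is tab-delimited and that column 10 holds the flag; a minimal diagnostic sketch:

Code:
# Print the field count and 10th field of the first rows of each file to
# confirm the delimiter and column position before filtering.
awk -F '\t' 'FNR<=3 { print FILENAME, NF, $10 }' *.subVCF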

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

using mv command for moving multiple files in a folder

Hi, I have a requirement where I need to move a bunch of folders containing multiple files to another archive location. I want to use the mv command. I am wondering: when we use mv to move a directory, does it create the directory first and then move all the files? e.g. source... (4 Replies)
Discussion started by: rkmbcbs
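A minimal sketch of the usual answer, with /data/source and /data/archive as hypothetical stand-in paths: on the same filesystem mv simply renames each directory (no per-file copying), while across filesystems it copies the tree and then removes the source.

Code:
# Move each source folder into the archive location.
for dir in /data/source/*/; do
    mv "$dir" /data/archive/
done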

2. Shell Programming and Scripting

Comparison and editing of files using awk (and also a possible bug in awk for loop?)

I have two files which I would like to compare and then manipulate in a way. File1: pictures.txt 1.1 1.3 dance.txt 1.2 1.4 treehouse.txt 1.3 1.5 File2: pictures.txt 1.5 ref2313 1.4 ref2345 1.3 ref5432 1.2 ref4244 dance.txt 1.6 ref2342 1.5 ref2352 1.4 ref0695 1.3 ref5738 1.2... (1 Reply)
Discussion started by: linuxkid
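The teaser cuts off before the desired manipulation, but the standard two-file awk idiom applies; a minimal sketch, using the File1/File2 names from the thread, that reads File1 first (NR==FNR), remembers its first field, then prints the File2 lines whose first field was seen:

Code:
# First pass: record File1 keys. Second pass: print matching File2 lines.
awk 'NR==FNR { seen[$1]; next } $1 in seen' File1 File2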

3. Shell Programming and Scripting

Loop through text file > Copy Folder > Edit XML files in bulk?

I have a text file which contains lines in this format (it contains 105 lines in total, but I'm just putting 4 here to keep it short): 58571,east_ppl_ppla_por 58788,east_pcy_hd_por 58704,east_pcy_ga_por 58697,east_pcy_pcybs_por; it's called id_key.txt. I have a sample folder called... (9 Replies)
Discussion started by: biscuitcreek
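A minimal sketch under the thread's setup, assuming the template folder is named sample and that PLACEHOLDER is a hypothetical token inside its XML files (GNU sed's -i is assumed for the in-place edit):

Code:
# For each "id,name" line: copy the template folder and substitute the id.
while IFS=, read -r id name; do
    cp -r sample "$name"
    sed -i "s/PLACEHOLDER/$id/g" "$name"/*.xml
done < id_key.txt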

4. Shell Programming and Scripting

loop through files in each folder and perform sed

Dear all, I have a few log folders in a directory (FILE) and I need to perform sed on each file inside each folder. Here is my script; I wish to rename each file back to the same file name after modification. Can anyone guide me on my script below? e.g. folder1 contains files... (2 Replies)
Discussion started by: ymeyaw
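A minimal sketch, with FILE as the parent directory from the thread and s/old/new/ as a placeholder for the real substitution; the temp-file-and-mv step keeps each file under its original name:

Code:
# Edit every regular file one level below each folder under FILE.
for f in FILE/*/*; do
    [ -f "$f" ] || continue
    sed 's/old/new/g' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done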

5. UNIX for Dummies Questions & Answers

how to create a command that would print files only from 1 folder

Hi there, how do I create a command in csh that would print files only from one folder, but takes the home directory as an argument? E.g. in my home directory I have 3 folders (unix, windows, mac) and a lot of files, and I am running the program as ./op ~; this should return the files from the unix folder without... (1 Reply)
Discussion started by: FUTURE_EINSTEIN
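The thread asks for csh; for consistency with the rest of this page, here is a minimal POSIX sh sketch of the same idea, listing only the regular files in the unix subfolder of the directory given as $1:

Code:
#!/bin/sh
# Usage: ./op ~
for f in "$1"/unix/*; do
    [ -f "$f" ] && printf '%s\n' "$f"
done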

6. Shell Programming and Scripting

For loop for number of files in a folder

Hi All, I need a for loop which should run over the files in a folder and pass each file name as a parameter to another shell script on each iteration. Please help me. Thanks. (2 Replies)
Discussion started by: chillblue
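A minimal sketch, with /path/to/folder and other_script.sh as hypothetical names:

Code:
# Call the other script once per regular file in the folder.
for f in /path/to/folder/*; do
    [ -f "$f" ] && sh other_script.sh "$f"
done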

7. Shell Programming and Scripting

How to use a loop for multiple files in a folder to run awk command?

Dear folks, I have two data sets whose names are "final.map" and "1.geno", and they look like this: final.map: gi|358485511|ref|NC_006088.3| 2044 gi|358485511|ref|NC_006088.3| 2048 gi|358485511|ref|NC_006088.3| 2187 gi|358485511|ref|NC_006088.3| 17654 ... (2 Replies)
Discussion started by: sajmar
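The awk program itself is cut off in this teaser, so prog.awk below is a hypothetical stand-in; the loop pattern the thread is after pairs the fixed map file with each numbered .geno file in turn:

Code:
# Run the same awk program against final.map plus each .geno file.
for g in *.geno; do
    awk -f prog.awk final.map "$g" > "${g%.geno}.out"
done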

8. Shell Programming and Scripting

awk error when increasing number of files in folder

I have a folder with several files, and I want to eliminate all of the terms that they have in common using `awk`. Here is the script that I have been using: awk ' FNR==1 { if (seen++) { firstPass = 0 outfile = FILENAME "_new" ... (4 Replies)
Discussion started by: owwow14
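When an awk script that opens one output file per input starts failing as the file count grows, the usual culprit is the per-process open-file limit; a minimal sketch of the fix, with *.txt standing in for the folder's files:

Code:
# Close the previous output before starting the next one to stay under
# the open file descriptor limit.
awk 'FNR==1 { if (out) close(out); out = FILENAME "_new" }
     { print > out }' *.txt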

9. Shell Programming and Scripting

Apply command to all files in folder

Hi all! I have this command: grep -E '^\To: |^\Date: |^\Subject: ' fileA.txt > fileA_1.txt && grep -v '^\To: |^\Date: |^\Subject: ' fileA.txt >> fileA_1.txt && rm fileA.txt && sed -i -e 's/\(Date: \|Subject: \|To: \)//g' fileA_1.txt How do I apply it to all the files in the folder (each file has a... (7 Replies)
Discussion started by: guilliber
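A minimal sketch that wraps the single-file pipeline in a loop, with "$f" replacing fileA.txt; the trailing sed cleanup from the original command is omitted for brevity, and -E is used in both greps so the alternation behaves consistently:

Code:
# Put To/Date/Subject headers first, then the rest, for every .txt file.
for f in *.txt; do
    out="${f%.txt}_1.txt"
    grep -E  '^(To|Date|Subject): ' "$f"  > "$out" &&
    grep -Ev '^(To|Date|Subject): ' "$f" >> "$out" &&
    rm "$f"
done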

10. UNIX for Beginners Questions & Answers

For loop to accept params and delete folder/files

Hi Folks - I'm trying to build a simple for loop to accept params and then delete the folder & files on the path older than 6 days. Here is what I have:

Code:
Purge () {
    for _DIR in "$1"
    do
        find "${_DIR}"/* -mtime +0 -exec rm {} \;
    done
}

I would be passing... (4 Replies)
Discussion started by: SIMMS7400
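A minimal sketch of the usual fixes: loop over "$@" (a loop over "$1" only ever sees the first argument) and use -mtime +6 rather than +0 so only files older than six days match; the example paths are hypothetical.

Code:
# Delete regular files older than 6 days under each path given.
Purge () {
    for _DIR in "$@"; do
        find "${_DIR}" -type f -mtime +6 -exec rm -f {} +
    done
}
# Example: Purge /tmp/app/logs /tmp/app/cache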