08-21-2008
Passing a variable to awk while in a shell for loop
I am a newbie to awk and C programming, but not to Unix. I do need help with a ksh script I am writing, though. It is almost complete; the last step is killing me, and any help would be greatly appreciated. What I am trying to do is read a text file of usernames and then, in a for loop, use awk to pull out the lines of a second file in which each name appears. A username can occur in more than one record, and the records of the second file have a variable number of fields. All I need is the username and the last field of each matching record.
Here are examples of the two files.
users.txt:
user1
user2
user3

secondfile.txt:
user3 user1 user2 /mydir/next/firstdir
user1 /mydir/next/seconddir
user2 user3 /mydir/next/thirdir
Here is my script:
for i in `cat users.txt`
do
awk -v t01="$i" 't01 { print t01 " " $NF }' secondfile.txt >> thirdfile.txt
done
Initially I tried '/t01/ { print t01 " " $NF }' but found that awk does not expand variables inside the / /. The problem with the syntax above is that, for every name in users.txt, it prints the last field of every record in secondfile.txt. Yet when I run awk at the command line with /user1/, I get only the records from secondfile.txt that I expect.
Hope this explanation is clear. Your help is greatly appreciated.
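A minimal sketch of one fix, assuming one output line per matching record is all that is wanted: keep passing the name in with -v, but test it explicitly with ~ (or with a field comparison), because a bare t01 pattern only asks whether the variable is non-empty and so matches every line of secondfile.txt. The file names are the ones from the example above; the while/read form is just one way to drive the loop.

# Sketch only: match the username explicitly instead of using the bare
# variable as a pattern (any non-empty string is "true" in awk).
while read name
do
    awk -v t01="$name" '$0 ~ t01 { print t01 " " $NF }' secondfile.txt
done < users.txt >> thirdfile.txt

# Alternative single pass, assuming exact field matches are wanted
# (so that user1 does not also match user10): load users.txt into an
# array, then test every field of secondfile.txt except the last.
awk 'NR == FNR { users[$1]; next }
     { for (i = 1; i < NF; i++) if ($i in users) print $i, $NF }' \
    users.txt secondfile.txt >> thirdfile.txt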