while loop inside while loop
Post 302156156 by panknil in Shell Programming and Scripting, Monday 7th of January 2008, 11:49 AM

Dear All,

I have an awk script where I'm using a while loop inside another while loop.
Here is the code:

awk -v DATE="$CURRDATE" -F'@' 'BEGIN {

    while (( getline < "Merge_Calldet.txt" ))
    {
        ARR = $5
        LINE = $0
        while (( getline < "Merge_Accessnum.txt" ))
        {
            TESTSIMENTRY = $1
            FILEDATE = $15
            if ( TESTSIMENTRY == ARR )
            {
                FLAG = 1
                break
            }
            else
            {
                FLAG = 0
                continue
            }
        }
        if (FLAG == 1)
        {
            print "Y" "@" FILEDATE "@" LINE
        }
        else
        {
            print "N" "@" DATE "@" LINE
        }
    }

}' >> Join_Calldt_Accnum.txt

I'm trying to compare the 5th field of Merge_Calldet.txt with the first field of Merge_Accessnum.txt.
That is, it should take the 5th field from the first line of Merge_Calldet.txt and compare it against the first field of every line of Merge_Accessnum.txt, then take the 5th field from the second line of Merge_Calldet.txt and do the same, and so on.
The problem is that the flow only gets inside the second while loop once.
Can anyone please point out the mistake I have made?
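
For reference, below is a minimal sketch of one likely fix, assuming the inner loop stalls because awk never rewinds Merge_Accessnum.txt: getline from a file continues from wherever the previous read stopped, so once that file reaches end-of-file the inner loop body never runs again until the file is closed. The sketch adds close() after the inner loop and tests getline's return value so a missing or unreadable file cannot cause an endless loop; file names, field positions and the DATE/CURRDATE variable are taken from the post above.

awk -v DATE="$CURRDATE" -F'@' 'BEGIN {
    # Outer loop: one record per line of Merge_Calldet.txt
    while (( getline < "Merge_Calldet.txt" ) > 0)
    {
        ARR  = $5      # lookup key from the calldet record
        LINE = $0      # keep the whole record; the getline below overwrites $0
        FLAG = 0
        FILEDATE = ""

        # Inner loop: scan Merge_Accessnum.txt for a matching first field
        while (( getline < "Merge_Accessnum.txt" ) > 0)
        {
            if ( $1 == ARR )
            {
                FLAG = 1
                FILEDATE = $15
                break
            }
        }

        # Rewind the lookup file; without this, the next outer iteration
        # resumes at end-of-file and the inner loop never executes again.
        close("Merge_Accessnum.txt")

        if (FLAG == 1)
            print "Y" "@" FILEDATE "@" LINE
        else
            print "N" "@" DATE "@" LINE
    }
    close("Merge_Calldet.txt")
}' >> Join_Calldt_Accnum.txt

In awk, getline from a file returns 1 on success, 0 at end-of-file and -1 on error, so testing the result with > 0 ends both loops cleanly; the bare while (( getline < file )) form would spin forever if a file could not be opened, because -1 is treated as true.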

Thanks,
Regards,
Pankaj
 
