grep a variable from list
Posted by godzilla07 in Shell Programming and Scripting on Wednesday, 16 November 2011, 05:48 PM (Post 302574178)

I have a script that reads each entry of a list, B.txt, into a variable, searches for it in another file, file.txt, and prints the matching line plus the line after it.

Code:
#!/bin/bash
while read a
do
    echo "$a" | grep -A 1 $a file.txt > $a\.txt
done < B.txt

I always get "No such file or directory" for every $a.txt. Any suggestions?
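
Two common causes of this symptom are DOS line endings in B.txt (the files are then created with a carriage return before the .txt, so the expected name never exists) and the unquoted $a, which splits a pattern containing spaces into extra words that grep treats as missing file names. The echo "$a" | part is also ignored, because grep reads file.txt once a file name is supplied. A minimal sketch of a corrected loop, assuming B.txt holds one pattern per line (B.txt and file.txt are the names from the post):

Code:
#!/bin/bash
# For each pattern in B.txt, save the matching line from file.txt plus the
# line after it to <pattern>.txt
while IFS= read -r a
do
    a=${a%$'\r'}                # drop a trailing carriage return, if any
    [ -n "$a" ] || continue     # skip blank lines
    grep -A 1 -- "$a" file.txt > "$a.txt"
done < B.txt

Note that a pattern containing a slash will still make the redirection fail, since the shell treats the slash as a directory separator.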

Moderator's Comments:
Suggestion: watch this video on how to use code tags.

Last edited by pludi; 11-16-2011 at 06:55 PM.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to grep a variable?

Hi, I'd like to grep a variable that I saved in the program. Like grep '0\$variable1' file1 Does someone know what's wrong with this command? Thanks a lot! (2 Replies)
Discussion started by: whatisthis
2 Replies
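
Single quotes keep the shell from expanding the variable, and the backslash escapes the $ as well, so grep searches for the literal text 0$variable1. A sketch of the usual fix, with a made-up value for illustration:

Code:
# Double quotes let the shell expand the variable; the value below is hypothetical
variable1="somepattern"
grep "0${variable1}" file1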

2. Shell Programming and Scripting

grep with variable

Hi, I can't get this script to work (returns 0, should return 3): $ cat A.lst | \ while read LINE do echo "$LINE" grep -c "$LINE" B.tmp done> > > > > Socket 0 $ but in contrast this one works fine (returns 3 as expected): $ LINE=Socket $ grep -c $LINE B.tmp 3 $ (5 Replies)
Discussion started by: ozvena
5 Replies
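
A frequent reason a pattern matches interactively but returns 0 inside such a loop is an invisible trailing character (a carriage return or whitespace) on the lines of A.lst. A sketch that strips a trailing carriage return before searching, assuming A.lst and B.tmp as in the snippet:

Code:
while IFS= read -r LINE
do
    LINE=${LINE%$'\r'}          # remove a trailing carriage return, if present
    grep -c -- "$LINE" B.tmp
done < A.lst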

3. Shell Programming and Scripting

grep a variable

Hi all, I am trying to do a simple thing in my mind. However I am fairly new to bash. What I need to do is create a folder for each partition on each CD, and each partition has a unique name (with spaces in it, do not ask why, it is already done :confused: ) . All CD's will show up... (2 Replies)
Discussion started by: sgstuart
2 Replies
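
The usual trap with names containing spaces is word splitting, so each name has to stay quoted wherever it is used. A tiny sketch with a hypothetical name:

Code:
# Quoting keeps a name with spaces as a single argument to mkdir
name="PARTITION ONE"        # hypothetical example
mkdir -p -- "$name"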

4. Shell Programming and Scripting

grep a variable

can i grep a variable say i have a variable var=`hostname` and I want to make an if statement like if grep "esp-ueh" $var;then...... how can i do this I dont want to store this variable in a file and the grep it because my script will be used at the same time on multiple stations and then that... (9 Replies)
Discussion started by: lassimanji
9 Replies
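
grep expects file names as arguments, so passing $var there makes it try to open a file named after the hostname. Feeding the value on standard input avoids the temporary file; a sketch:

Code:
var=$(hostname)
# Send the variable's value to grep on stdin instead of naming it as a file
if printf '%s\n' "$var" | grep -q "esp-ueh"; then
    echo "hostname matches"
fi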

5. Shell Programming and Scripting

Grep through a variable

I want to search a text in file but that file is pointing to variable. ex: file=there was nothing special grep "there was nothing" $file but its not working . Can u let me know that how we can use variable($file) in grep command. Please use code tags (6 Replies)
Discussion started by: allthanksquery
6 Replies
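
The assignment needs quotes to hold the whole sentence, and the variable's contents can then be searched with a here-string rather than being handed to grep as a file name. A sketch using the poster's names:

Code:
file="there was nothing special"     # quote the assignment
grep "there was nothing" <<< "$file" # here-string feeds the text on stdin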

6. Shell Programming and Scripting

grep using variable

I have a pattern like: column "5" is missing PS: the no is in double quotes. The number usally changes, so we use a loop to grep. grep 'column "$number" is missing' filename.txt But it is not working.... How to solve this? (2 Replies)
Discussion started by: karumudi7
2 Replies
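
Single quotes prevent $number from being expanded. Double quotes allow the expansion, with the inner double quotes escaped; a sketch:

Code:
number=5
grep "column \"$number\" is missing" filename.txt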

7. Shell Programming and Scripting

grep in a variable

Hello, I usually search extensively and have to date found what I've needed. However, this one's got me stumped. I need to create a variable as follow. The issue however is that upon execution, it freezes. $var1 isn't always present in usage.log and this is fine but I'd like it to continue with... (6 Replies)
Discussion started by: shadyuk
6 Replies
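
A grep that is given a pattern but no file name waits on standard input, which can look like a freeze. A hedged sketch that names the file explicitly and carries on when $var1 is not found in usage.log (the goal here is an assumption):

Code:
# Assumed goal: look up $var1 in usage.log and continue whether or not it is there
if matches=$(grep -- "$var1" usage.log); then
    printf '%s\n' "$matches"
else
    echo "no match for $var1, continuing" >&2
fi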

8. Shell Programming and Scripting

grep variable

I've got a file that I'm trying to grep through that looks like this: alpha1 alpha2 alpha3 beta1 beta2 gamma5 gamma6 gamma7 gamma8 gamma9 and I want the output to only contain the line with the highest value for each, so the output I want is: alpha3 beta2 gamma9 I also need... (11 Replies)
Discussion started by: tiberione
11 Replies
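
This is less a job for grep than for sorting and keeping the last entry per prefix. A sketch, assuming GNU sort's -V option (version sort) and a hypothetical input file name:

Code:
# Version-sort so alpha2 < alpha10, then keep the last (highest) line per prefix
sort -V input.txt |
awk '{ p = $0; sub(/[0-9]+$/, "", p); last[p] = $0 }
     END { for (p in last) print last[p] }'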

9. Shell Programming and Scripting

Help grep one variable over other

Hi, I am trying to grep one variable over the other variable Example: i=abc j=ab grep $j $i I am getting this error: The error is due to $i being variable and not file. I know I could do it by putting the value of abc in a file and then greping it. (1 Reply)
Discussion started by: pinnacle
1 Replies
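
grep needs a file or standard input, but for a plain substring test no file is required at all; bash can compare the two variables directly. A sketch:

Code:
i=abc
j=ab
# Pure-shell substring test, no temporary file and no grep needed
if [[ $i == *"$j"* ]]; then
    echo "found $j in $i"
fi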

10. UNIX for Beginners Questions & Answers

How to use $variable in grep?

hi i have a file which contains some messages counters. below is the snippet on the file. 17-05-29::22:36:21|message|231 17-05-29::22:36:31|message|222 17-05-29::22:36:41|message|213 17-05-30::22:36:51|message|221 17-05-30::22:37:01|message|227 17-05-30::22:37:11|message|207... (5 Replies)
Discussion started by: scriptor
5 Replies
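
Putting the date (or any other field) in a variable and expanding it inside double quotes works the same way here; a sketch with a hypothetical file name:

Code:
day="17-05-30"
# Double quotes let $day expand; ^ anchors the match to the start of the line
grep "^$day" counters.txt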