UNIX extracting fields

I have a comma-separated file, A.txt. I want to extract the first 4 fields into an output file, and I also want to add one more column to the output containing the filename (A.txt) for all records. The filename should not be hard-coded, since I do not know the filename in advance; it may be any file.
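
A minimal sketch of one way to do this in awk. FILENAME is awk's built-in variable holding the name of the current input file, so nothing is hard-coded; the output name out.txt is just an illustration, and the sketch assumes plain CSV with no embedded commas or quoting:

    awk -F',' -v OFS=',' '{ print $1, $2, $3, $4, FILENAME }' A.txt > out.txt

Because FILENAME tracks whatever file awk is given on the command line, the same one-liner works unchanged for any input file.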


10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Extracting fields from an output 8-)

I am getting a variable as x=2006/01/18 and now I have to extract each field from it, like x1=2006, x2=01 and x3=18. Any idea how? Thanks a lot for the help. CSaha (6 Replies)
Discussion started by: csaha
6 Replies
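
One way to split that, sketched with bash's IFS and read (the variable names x1..x3 come from the post):

    x=2006/01/18
    IFS=/ read -r x1 x2 x3 <<< "$x"    # split on / into three variables
    echo "$x1 $x2 $x3"                 # prints: 2006 01 18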

2. UNIX for Dummies Questions & Answers

Extracting information from text fields.

Dear friends, I'm a novice Unix user and I'm trying to learn the ropes. I have a big task I have to accomplish and I'm convinced Unix can get the job done, I just haven't figured out how. I recently posted on the topic of cutting text between unique text patterns and somebody helped me a great... (24 Replies)
Discussion started by: spindoctor
24 Replies
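
The post is truncated, but for the general task it names, cutting text between two unique patterns, a common sketch uses sed's range addressing (START and END are placeholder patterns, not from the post):

    sed -n '/START/,/END/p' file.txt    # print everything from START through END, inclusive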

3. Shell Programming and Scripting

extracting fields

Hi, I have a line with several fields (an indefinite number; the count varies) separated by colons. Now, I need to pick each field (except the first one) and have it assigned to a variable within a loop. In other words, in the first iteration of the loop, the variable must be assigned with the 2nd... (2 Replies)
Discussion started by: prvnrk
2 Replies
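
A hedged sketch of that loop in bash, splitting the colon-separated line into an array (line and its contents are placeholders):

    line="first:second:third:fourth"
    IFS=: read -r -a fields <<< "$line"
    for f in "${fields[@]:1}"; do    # slice skips the first field
        var=$f                       # each remaining field in turn
        echo "$var"
    done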

4. Shell Programming and Scripting

Removing LF and extracting two fields

I need some assistance, I am writing a script in bash. I want to do two things: 1/. I want to replace the LF at the end of the RFH  Ø  ¸MQSTR ¸ so I can process the file record by record using a while loop. 2/. I want to extract two fields from each record, they are identified with... (1 Reply)
Discussion started by: gugs
1 Replies

5. Shell Programming and Scripting

Extracting fields from file

I need to extract a number of values from a file, but have no clue how to do this. The file looks like this: # My file Dest=87;CompatibleSystemSoftwareVersion=2.5300-; Dest=87;ImageVersion=000061f3;SystemSoftwareVersion=2.5300;CDN=http://my.backup.com/download.txt;CDN_Timeout=30; I... (3 Replies)
Discussion started by: MagicDude4Eva
3 Replies
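
For key=value pairs packed into semicolon-separated lines like that sample, one awk sketch (the key name ImageVersion is taken from the sample; swap in whichever key you need):

    awk -F';' '{
        for (i = 1; i <= NF; i++)
            if ($i ~ /^ImageVersion=/) {       # field starts with the key
                sub(/^ImageVersion=/, "", $i)  # strip the key, keep the value
                print $i                       # prints: 000061f3
            }
    }' file.txt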

6. Shell Programming and Scripting

need help in writing a script for extracting fields

Hi, I need to extract the last character of a field retrieved from the database using a select command. e.g.: select event,text from event_data; o/p: Event1,text1 But I need to extract only '1' from the fields... similarly '2' from Event2,text2 and '3' from Event3,text3 etc., and need to pass... (6 Replies)
Discussion started by: Rajesh Putnala
6 Replies
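
Grabbing a variable's last character is a one-liner in bash (event is a placeholder holding one field of the query output):

    event="Event1"
    last=${event: -1}    # substring expansion; the space before -1 matters
    echo "$last"         # prints: 1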

7. Shell Programming and Scripting

Help in extracting fields from a file

I have an input file with contents like: 203969 OrdAcctCycChg USAGE_DAEMON1 203970 OrdAcctCycChg USAGE_DAEMON2 203971 OrdAcctCycChg USAGE_DAEMON3 203972 OrdAcctCycChg USAGE_DAEMON4 I need to extract the variables in the first column... (51 Replies)
Discussion started by: Rajesh Putnala
51 Replies
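
For whitespace-separated lines like those, the first column falls out of a minimal awk sketch:

    awk '{ print $1 }' input.txt    # prints 203969, 203970, ...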

8. Shell Programming and Scripting

Pattern Matching and extracting the required fields in Perl

Hi All, I am writing the following Perl script and need your help with pattern matching: I have the following shell script that reads line by line from the file (file_svn) and in turn calls the Perl script: #!/bin/bash perl_path="/home/dev/filter"... (2 Replies)
Discussion started by: filter
2 Replies

9. UNIX for Beginners Questions & Answers

Extracting specific fields from an XML file

Hello All, I have a requirement to split the input.xml file into different files, and I have tried using earlier examples (which I posted in the forum), but still no luck. Here is my input.xml: <jms-system-resource> <name>UMSJMSSystemResource</name> ... (4 Replies)
Discussion started by: Siv51427882
4 Replies
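
XML is fragile to split with line-oriented tools, but if each <jms-system-resource> element starts on its own line, a rough sketch is to start a new output file at every opening tag (the part*.xml names are made up here):

    awk '/<jms-system-resource>/ { n++ }    # new element: bump the file counter
         n { print > ("part" n ".xml") }    # write lines to the current part
    ' input.xml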

10. UNIX for Beginners Questions & Answers

Is there a UNIX command that can compare fields of files with differing number of fields?

Hi, Below are the sample files. x.txt is from an Excel file that is a list of users from Windows, and y.txt is a list of database accounts. $ head -500 x.txt y.txt ==> x.txt <== TEST01 APP_USER_PROFILE USER03 APP_USER_PROFILE TEST02 APP_USER_EXP_PROFILE TEST04 APP_USER_PROFILE USER01 ... (3 Replies)
Discussion started by: newbie_01
3 Replies
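
A usual sketch for comparing such files on a single field, regardless of how many other fields each line carries, is an awk lookup table (keying on the first field is an assumption about the real layouts):

    # remember every first field of y.txt, then print x.txt lines whose
    # first field never appeared in y.txt
    awk 'NR == FNR { seen[$1]; next } !($1 in seen)' y.txt x.txt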
WWW::RobotRules(3)					User Contributed Perl Documentation					WWW::RobotRules(3)

NAME
    WWW::RobotRules - database of robots.txt-derived permissions

SYNOPSIS
    use WWW::RobotRules;
    my $rules = WWW::RobotRules->new('MOMspider/1.0');

    use LWP::Simple qw(get);

    {
        my $url = "http://some.place/robots.txt";
        my $robots_txt = get $url;
        $rules->parse($url, $robots_txt) if defined $robots_txt;
    }

    {
        my $url = "http://some.other.place/robots.txt";
        my $robots_txt = get $url;
        $rules->parse($url, $robots_txt) if defined $robots_txt;
    }

    # Now we can check if a URL is valid for those servers
    # whose "robots.txt" files we've gotten and parsed:
    if ($rules->allowed($url)) {
        $c = get $url;
        ...
    }

DESCRIPTION
    This module parses /robots.txt files as specified in "A Standard for
    Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>.
    Webmasters can use the /robots.txt file to forbid conforming robots
    from accessing parts of their web site.

    The parsed files are kept in a WWW::RobotRules object, and this object
    provides methods to check if access to a given URL is prohibited. The
    same WWW::RobotRules object can be used for one or more parsed
    /robots.txt files on any number of hosts.

    The following methods are provided:

    $rules = WWW::RobotRules->new($robot_name)
        This is the constructor for WWW::RobotRules objects. The first
        argument given to new() is the name of the robot.

    $rules->parse($robot_txt_url, $content, $fresh_until)
        The parse() method takes as arguments the URL that was used to
        retrieve the /robots.txt file, and the contents of the file.

    $rules->allowed($uri)
        Returns TRUE if this robot is allowed to retrieve this URL.

    $rules->agent([$name])
        Get/set the agent name. NOTE: Changing the agent name will clear
        the robots.txt rules and expire times out of the cache.

ROBOTS.TXT
    The format and semantics of the "/robots.txt" file are as follows
    (this is an edited abstract of
    <http://www.robotstxt.org/wc/norobots.html>):

    The file consists of one or more records separated by one or more
    blank lines. Each record contains lines of the form

        <field-name>: <value>

    The field name is case insensitive. Text after the '#' character on a
    line is ignored during parsing. This is used for comments. The
    following <field-names> can be used:

    User-Agent
        The value of this field is the name of the robot the record is
        describing access policy for. If more than one User-Agent field is
        present the record describes an identical access policy for more
        than one robot. At least one field needs to be present per record.
        If the value is '*', the record describes the default access
        policy for any robot that has not matched any of the other
        records.

        The User-Agent fields must occur before the Disallow fields. If a
        record contains a User-Agent field after a Disallow field, that
        constitutes a malformed record. This parser will assume that a
        blank line should have been placed before that User-Agent field,
        and will break the record into two. All the fields before the
        User-Agent field will constitute a record, and the User-Agent
        field will be the first field in a new record.

    Disallow
        The value of this field specifies a partial URL that is not to be
        visited. This can be a full path, or a partial path; any URL that
        starts with this value will not be retrieved.

    Unrecognized records are ignored.

ROBOTS.TXT EXAMPLES
    The following example "/robots.txt" file specifies that no robots
    should visit any URL starting with "/cyberworld/map/" or "/tmp/":

        User-agent: *
        Disallow: /cyberworld/map/ # This is an infinite virtual URL space
        Disallow: /tmp/ # these will soon disappear

    This example "/robots.txt" file specifies that no robots should visit
    any URL starting with "/cyberworld/map/", except the robot called
    "cybermapper":

        User-agent: *
        Disallow: /cyberworld/map/ # This is an infinite virtual URL space

        # Cybermapper knows where to go.
        User-agent: cybermapper
        Disallow:

    This example indicates that no robots should visit this site further:

        # go away
        User-agent: *
        Disallow: /

    This is an example of a malformed robots.txt file.

        # robots.txt for ancientcastle.example.com
        # I've locked myself away.
        User-agent: *
        Disallow: /
        # The castle is your home now, so you can go anywhere you like.
        User-agent: Belle
        Disallow: /west-wing/ # except the west wing!
        # It's good to be the Prince...
        User-agent: Beast
        Disallow:

    This file is missing the required blank lines between records.
    However, the intention is clear.

SEE ALSO
    LWP::RobotUA, WWW::RobotRules::AnyDBM_File

COPYRIGHT
    Copyright 1995-2009, Gisle Aas
    Copyright 1995, Martijn Koster

    This library is free software; you can redistribute it and/or modify
    it under the same terms as Perl itself.

perl v5.18.2                        2012-02-18                WWW::RobotRules(3)