Append value (batch number) to start of records

Hi all,
I am new to UNIX shell scripting, and I am trying to append the batch number that comes in each trailer (TR) record to the detail (LN) records that belong to it.

TR|20080312|22881 |000000005|20080319|2202
LN|20080312|077777722220 |0000100000017|ABS
LN|20080312|000799439326 |0000709943937|AA
TR|20080313|22897 |000000008|20080319|2202
LN|20080313|077777722220 |0000100000017|BCG
LN|20080313|603497005222 |0000600070057|ADXC
TR|20080314|22898 |000000011|20080319|2202
LN|20080314|077777722220 |0000100000017|kKAS
LN|20080314|603497005222 |0000600070057|adfras

This is how my data looks. I want to pull out the batch number, i.e. the 4th field of each record that starts with TR, and prepend it as the first field of every LN record that has the same date (2nd field). Everything is in one single file.
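If I have described that correctly, the expected output for the first group above would be:

000000005|LN|20080312|077777722220 |0000100000017|ABS
000000005|LN|20080312|000799439326 |0000709943937|AA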

Filename=$1
grep '^TR' "$Filename" > A.txt
grep '^LN' "$Filename" > B.txt

# join -t'|' -1 2 -2 2 -o 1.4 A.txt B.txt > C.txt

while read line
do
    var=`echo "$line" | cut -d'|' -f2`      # date field of the LN record
    while read value
    do
        val=`echo "$value" | cut -d'|' -f2`   # date field of the TR record
        val1=`echo "$value" | cut -d'|' -f4`  # batch number
        if [[ $var == $val ]]
        then
            echo "$val1|$line" >> Apple.txt
        fi
    done < A.txt
done < B.txt

I tried the above code and it is very, very slow, since the nested loops re-read A.txt once for every LN record. File sizes range from 2 MB to 60 MB.

Please advise. I think awk would be faster, but I am not sure how to do it in awk.
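My rough guess at an awk version is below. It is only a sketch: it assumes the batch number can be looked up purely by the date in the 2nd field (the same matching my loop above does), and it reuses Apple.txt as the output name. I have not managed to verify it:

Filename=$1
awk -F'|' '
    $1 == "TR" { batch[$2] = $4; next }    # TR: remember the batch number (4th field), keyed by date
    $1 == "LN" { print batch[$2] "|" $0 }  # LN: prepend the batch number saved for that date
' "$Filename" > Apple.txt

If this is right, it reads the file only once instead of rescanning A.txt for every LN line, so it should cope with the 60 MB files.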

Thanks
Kiran
 
