Top Forums > Shell Programming and Scripting
awk - fetch multiple data from huge dump
Post 302884056 by Klashxx, 01-17-2014 08:53 AM
Quote:
Originally Posted by navkanwal
In the dump files the field is not constant, so the value could be at any field position.
Regards
Navkanwal
In that case awk is not going to beat egrep: when the value can sit in any field, a plain scan of the whole line is exactly what grep/egrep is built for, while awk would have to test every field of every record.
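As a rough comparison, here is a minimal sketch of both approaches, assuming the search values live one per line in a hypothetical keys.txt and the dump is plain text:

    # grep -F loads the keys as fixed strings and scans whole lines;
    # for a large dump this is usually the fastest route.
    # (Note it matches a key anywhere in the line, even inside longer tokens.)
    grep -F -f keys.txt dump.txt

    # The awk equivalent has to test every field of every record:
    awk 'NR == FNR { keys[$0]; next }
         { for (i = 1; i <= NF; i++) if ($i in keys) { print; next } }' keys.txt dump.txt

If the keys are patterns rather than literal values, drop -F (or use egrep -f), at some cost in speed.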
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Fetch selected data from webpage

Hi All, can anybody tell me the command used for extracting selected lines from a web page? I guess we'll have to use wget or curl to achieve this.... If anybody has any idea about it, kindly post your reply ASAP. Thanks. (1 Reply)
Discussion started by: sunnydynamic15
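As a rough sketch of the curl route (the URL and the begin/end markers are placeholders, not from the thread):

    # Fetch the page quietly and keep only the lines between two markers.
    curl -s 'http://example.com/page.html' | sed -n '/<!-- BEGIN -->/,/<!-- END -->/p'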

2. Shell Programming and Scripting

Fetch data between two timestamps using a script

Hi Guys, I have data in the below format.
25 Dec 2011 03:00:01 : aaaaaaaaaaaaaaa
25 Dec 2011 04:23:23 : bbbbbbbbbbbbbbb
25 Dec 2011 16:12:45 : ccccccccccccccc
26 Dec 2011 04:45:34 : ddddddddddddddd
26 Dec 2011 17:01:22 : eeeeeeeeeeeeeee
27 Dec 2011 12:33:45 : ffffffffffffffffffffffff
28... (13 Replies)
Discussion started by: jaituteja
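One hedged sketch for that record layout: rebuild each "DD Mon YYYY HH:MM:SS" stamp into a sortable YYYYMMDDHHMMSS number and keep the rows between two bounds (the file name and both bounds are placeholders):

    awk -v from=20111225000000 -v to=20111226235959 '
    BEGIN { n = split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m)
            for (i = 1; i <= n; i++) mon[m[i]] = sprintf("%02d", i) }
    { t = $4; gsub(/:/, "", t)                    # HH:MM:SS -> HHMMSS
      key = $3 mon[$2] sprintf("%02d", $1) t      # YYYYMMDDHHMMSS
      if (key + 0 >= from && key + 0 <= to) print }' data.log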

3. Shell Programming and Scripting

Awk to Count Multiple patterns in a huge file

Hi, I have a file that is 430K lines long. It has records like below:
|site1|MAP
|site2|MAP
|site1|MODAL
|site2|MAP
|site2|MODAL
|site2|LINK
|site1|LINK
My task is to count the number of times MAP, MODAL, and LINK occur for a single site and write new records like below to a new file ... (5 Replies)
Discussion started by: reach.sree@gmai
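A compact one-pass sketch with an awk associative array; with the leading | the site lands in $2 and the pattern in $3 (the output layout is a guess at what the thread wants):

    # Count occurrences of each site/pattern pair, then print totals.
    awk -F'|' '{ count[$2 "|" $3]++ }
               END { for (k in count) print "|" k "|" count[k] }' records.txt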

4. Shell Programming and Scripting

Fetch data from a particular location

I want to fetch a value from a particular location in a file, but in each line of the file it appears at a different position, so I tried using a variable with the cut command, but it is not working properly. The code I have written is:
#!/bin/sh
cat Sri1.log | while read d2
do
grep -w... (9 Replies)
Discussion started by: Prachi Gupta
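When the value sits at a different position on every line, extracting by pattern beats cutting by column; grep -o prints only the matched token (the pattern here is a hypothetical stand-in, Sri1.log comes from the excerpt):

    # Print just the matching token from each line, wherever it appears.
    grep -Eo 'ORD-[0-9]+' Sri1.log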

5. UNIX for Dummies Questions & Answers

How to fetch data in UNIX

Hi All, I have a file with the below data as shown:
A|2|20120430
B|EMP|NAME|DEPT
C|12|SARC|01
C|23||ASDD|02
D|END OF FILE
I want to fetch only the records that contain C|. What is the UNIX command to fetch this data? Thanks (5 Replies)
Discussion started by: halpavan2
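Assuming the C records always start the line, anchoring the pattern avoids accidental matches elsewhere (the file name is a placeholder):

    # Keep only records whose first field is C.
    grep '^C|' data.txt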

6. Shell Programming and Scripting

Help needed to fetch the required data

Hi Guys, I am in need of your help one more time on my real data. I have a file which contains more than a thousand lines of data; live data is shown below for 4 iterations.
--------------------------------------------------------------------------
... (4 Replies)
Discussion started by: rocky2013

7. Shell Programming and Scripting

awk does not work well with huge data?

Dear all, I found that if we work with thousands of lines of data, awk does not work perfectly. It cuts hundreds of lines (the others are deleted) and works only on the remaining data. I used this command: awk '$1==1{$1="Si"}{print>FILENAME}' coba.xyz to change the value of the first column whose value is 1... (4 Replies)
Discussion started by: ariesto
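The likely cause, given that command: print > FILENAME opens the input file for writing while awk is still reading it, so the file gets truncated mid-read. A safe variant writes to a temporary file and renames it afterwards (the temp name is arbitrary):

    # Never redirect output onto the file awk is currently reading.
    awk '$1 == 1 { $1 = "Si" } { print }' coba.xyz > coba.xyz.tmp &&
        mv coba.xyz.tmp coba.xyz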

8. Shell Programming and Scripting

Need to fetch only selected data in CSV

Hi Team, I'm getting my script's command output like given below:
GETA-TILL-INF;
U-UU-YRYT-NOD-6002 2015-05-14 THU 19:44:10
C2221 RETRIEVE TILL INFORMATION : COMPLD
----------------------------------------------------------------------
CONNECT_CARD_ID ... (9 Replies)
Discussion started by: Ganesh Mankar

9. Shell Programming and Scripting

Fetch data from file

Hi, I am new to scripting. I have a log file and need to fetch specific log entries and copy them to another file. A copy of the log looks like this:
===============================================================
= JOB : server123#jobs1.jobstream1
= USER : andyc
= Tue 08/01/17... (3 Replies)
Discussion started by: Prngp
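For block-structured logs like that, awk can switch printing on at a matching header line and off at the next separator (the job name and separator pattern are assumptions based on the excerpt; the log file name is a placeholder):

    # Print each block for the job we want, from its "= JOB :" line
    # up to (but not including) the next ===== separator line.
    awk '/^= JOB : server123#jobs1.jobstream1/ { show = 1 }
         /^=====/                              { show = 0 }
         show' job.log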

10. UNIX for Advanced & Expert Users

Need optimized shell/awk script to aggregate (sum) all the columns of a huge data file

Optimized shell/awk script needed to aggregate (sum) all the columns of a huge data file. The file delimiter is "|". Need the sum of every column, printed with the column number and its aggregation (summation). The file has no header. Like below:
Column 1 : Total
Column 2 : Total
... (2 Replies)
Discussion started by: kartikirans
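A single-pass sketch for the pipe-delimited case (the file name is a placeholder; the widest record seen sets the column count):

    # Keep a running sum per column, then print one total per column.
    awk -F'|' '{ for (i = 1; i <= NF; i++) sum[i] += $i
                 if (NF > max) max = NF }
               END { for (i = 1; i <= max; i++) print "Column " i " : " sum[i] }' data.dat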
JOIN(1)                   BSD General Commands Manual                  JOIN(1)

NAME
     join -- relational database operator

SYNOPSIS
     join [-a file_number | -v file_number] [-e string] [-o list] [-t char]
          [-1 field] [-2 field] file1 file2

DESCRIPTION
     The join utility performs an ``equality join'' on the specified files
     and writes the result to the standard output.

     The ``join field'' is the field in each file by which the files are
     compared. The first field in each line is used by default. There is one
     line in the output for each pair of lines in file1 and file2 which have
     identical join fields. Each output line consists of the join field, the
     remaining fields from file1 and then the remaining fields from file2.

     The default field separators are tab and space characters. In this
     case, multiple tabs and spaces count as a single field separator, and
     leading tabs and spaces are ignored. The default output field separator
     is a single space character.

     Many of the options use file and field numbers. Both file numbers and
     field numbers are 1 based, i.e. the first file on the command line is
     file number 1 and the first field is field number 1.

     The following options are available:

     -a file_number
             In addition to the default output, produce a line for each
             unpairable line in file file_number.

     -e string
             Replace empty output fields with string.

     -o list
             The -o option specifies the fields that will be output from
             each file for each line with matching join fields. Each element
             of list has either the form 'file_number.field', where
             file_number is a file number and field is a field number, or
             the form '0' (zero), representing the join field. The elements
             of list must be either comma (``,'') or whitespace separated.
             (The latter requires quoting to protect it from the shell, or,
             a simpler approach is to use multiple -o options.)

     -t char
             Use character char as a field delimiter for both input and
             output. Every occurrence of char in a line is significant.

     -v file_number
             Do not display the default output, but display a line for each
             unpairable line in file file_number. The options -v 1 and -v 2
             may be specified at the same time.

     -1 field
             Join on the field'th field of file 1.

     -2 field
             Join on the field'th field of file 2.

     When the default field delimiter characters are used, the files to be
     joined should be ordered in the collating sequence of sort(1), using
     the -b option, on the fields on which they are to be joined, otherwise
     join may not report all field matches. When the field delimiter
     characters are specified by the -t option, the collating sequence
     should be the same as sort(1) without the -b option.

     If one of the arguments file1 or file2 is ``-'', the standard input is
     used.

DIAGNOSTICS
     The join utility exits 0 on success, and >0 if an error occurs.

COMPATIBILITY
     For compatibility with historic versions of join, the following options
     are available:

     -a      In addition to the default output, produce a line for each
             unpairable line in both file 1 and file 2.

     -j1 field
             Join on the field'th field of file 1.

     -j2 field
             Join on the field'th field of file 2.

     -j field
             Join on the field'th field of both file 1 and file 2.

     -o list ...
             Historical implementations of join permitted multiple arguments
             to the -o option. These arguments were of the form
             'file_number.field_number' as described for the current -o
             option. This has obvious difficulties in the presence of files
             named '1.2'.

     These options are available only so historic shell scripts don't
     require modification and should not be used.

STANDARDS
     The join command conforms to IEEE Std 1003.1-2001 (``POSIX.1'').

SEE ALSO
     awk(1), comm(1), paste(1), sort(1), uniq(1)

BSD                             April 18, 2002                             BSD
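A quick usage example of the options above, with two invented pipe-delimited files, each sorted on its first field:

    # emp.txt                  dept.txt
    #   1|alice                  1|payroll
    #   2|bob                    2|ops
    join -t '|' -o 0,1.2,2.2 emp.txt dept.txt
    # Output:
    #   1|alice|payroll
    #   2|bob|ops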