KSH to group records in a file and compare it with another file
Posted by Matrix2682 on Wednesday, 19 August 2009, 09:10 AM
malcomex999, thanks for the code... it works, but if the same employee belongs to two departments, that record is missing from the output.
Code:
awk 'NR>2 && NR==FNR { arr[$2]=$1 }
     NR>2 && NR!=FNR { for (i in arr) if (i == $1) print arr[i], i, $2 }' dept emp |
sort -n -k3 |
awk '{ arr[$1]=$0 } END { for (i in arr) print arr[i] }'

I'll give you the actual data:

Dept.csv:

Code:
22405        ,22405        
22979        ,22979        
23193        ,23193        
23193        ,23200        
23193        ,27741        
23193        ,27743        
23193        ,38758        
23193        ,38759        
23194        ,23194        
23194        ,23202        
23194        ,27744        
23194        ,27745        
23200        ,23193        
23200        ,23200        
23200        ,27741        
23200        ,27743        
23200        ,38758        
23200        ,38759

Emp.csv:

Code:
22405        ,3500
22979        ,3800
23193        ,3900
23194        ,3900
23200        ,3900
23202        ,3900
27797        ,3900

Here, I am not getting Emp# 23193 in the output. Please help...
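
For what it's worth, the likely culprit is that `arr[$2]=$1` can hold only one department per employee, so when 23193 appears under both 23193 and 23200 the second mapping overwrites the first, and the final `arr[$1]=$0` pass collapses lines by key the same way. Below is a minimal sketch that instead collects every department an employee belongs to and prints one line per dept/emp pair. It assumes the fields are comma-separated with space padding, exactly as in the samples above, and that the real files have no header rows (the NR>2 guard in the original suggests they might; adjust accordingly):

Code:
# Sketch, not a drop-in fix: field layout assumed from the samples above.
awk -F' *, *' '
    NR==FNR {                           # first file: Dept.csv (dept,emp)
        sub(/ +$/, "", $2)              # strip padding after the last field
        depts[$2] = ($2 in depts) ? depts[$2] " " $1 : $1   # keep EVERY dept
        next
    }
    $1 in depts {                       # second file: Emp.csv (emp,salary)
        sub(/ +$/, "", $2)
        n = split(depts[$1], d, " ")
        for (j = 1; j <= n; j++)
            print d[j], $1, $2          # one output line per dept/emp pair
    }
' Dept.csv Emp.csv | sort -n -k2,2 -k1,1

With this, employee 23193 comes out twice, once under department 23193 and once under 23200; employees absent from Dept.csv (e.g. 27797) are simply skipped.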