Vlookup using Ask from specific column shell script
Post 303008124 by Yoda on Monday 27th of November 2017 12:16:02 PM
For your example:-
Code:
awk 'NR==FNR{A[$1];next}$2 in A{$2="sanpfound," $3}1' file2 file1
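For readability, here is the same one-liner expanded and commented; the replacement string "sanpfound" and the default whitespace field separator are carried over from the post as-is (add -F, if your records are comma-separated):
Code:
awk '
NR == FNR { A[$1]; next }            # first file given (file2): remember column 1 values as array keys
$2 in A   { $2 = "sanpfound," $3 }   # second file (file1): if column 2 was seen in file2, rewrite it
1                                    # always-true pattern: print every (possibly modified) line of file1
' file2 file1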

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Insert a text from a specific row into a specific column using SED or AWK

Hi, I am having trouble converting a text file. I have been working on this the whole day and still couldn't make it. Here is how the text file looks: _______________________________________________________ DEVICE STATUS INFORMATION FOR LOCATION 1: OPER STATES: Disabled E:Enabled ... (5 Replies)
Discussion started by: Issemael
5 Replies

2. Shell Programming and Scripting

Creating Report file with 'vlookup' kind of structure in shell

Hi, I have some files in the following structure. File_a.txt Field_1 Pass Field_2 Pass Field_3 Pass File_b.txt Field_1 Pass Field_2 Fail Field_3 Pass File_c.txt Field_1 Fail Field_2 Pass Field_3 Pass (2 Replies)
Discussion started by: vikaskm
2 Replies
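The desired report layout is cut off in the excerpt above; assuming the goal is one row per Field_N with one Pass/Fail column per input file, a minimal awk sketch (file names taken from the excerpt) could look like this:
Code:
awk '
FNR == 1 { files[++n] = FILENAME }           # remember each file once, in command-line order
{ fields[$1]; result[$1, FILENAME] = $2 }    # store Pass/Fail per field per file
END {
    printf "Field"
    for (i = 1; i <= n; i++) printf "\t%s", files[i]
    print ""
    for (f in fields) {                      # note: for-in order is unspecified; pipe through sort if needed
        printf "%s", f
        for (i = 1; i <= n; i++) printf "\t%s", result[f, files[i]]
        print ""
    }
}' File_a.txt File_b.txt File_c.txt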

3. Shell Programming and Scripting

Assigning a specific format to a specific column in a text file using awk and printf

Hi, I have the following text file: 8 T1mapping_flip02 ok 128 108 30 1 665000-000008-000001.dcm 9 T1mapping_flip05 ok 128 108 30 1 665000-000009-000001.dcm 10 T1mapping_flip10 ok 128 108 30 1 665000-000010-000001.dcm 11 T1mapping_flip15 ok 128 108 30... (2 Replies)
Discussion started by: goodbenito
2 Replies
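The exact target format is truncated in the excerpt above, so purely as an illustration: the sketch below zero-pads the first column to three digits and leaves the rest of each row alone (the file name scans.txt is hypothetical). Note that reassigning a field makes awk rebuild the line with single spaces between columns:
Code:
# Illustration only: reformat column 1 as a zero-padded 3-digit number
awk '{ $1 = sprintf("%03d", $1) } 1' scans.txt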

4. Shell Programming and Scripting

KSH Script to Execute command from specific column in file

Hi Unix Experts/Team, I have a file (say comand_file.prm). The file has a command specified in column 6 (say "./execute_script.sh"). I want to execute this command in my script. I am writing a KSH script to achieve this. Could you please assist me with this? (6 Replies)
Discussion started by: Jeevanm
6 Replies
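One way to approach the task in the previous item, assuming the file is whitespace-delimited and field 6 holds the command by itself (arguments in later columns would need to be gathered too), is a small ksh loop; eval runs whatever text is extracted, so the file must be trusted:
Code:
#!/bin/ksh
# Pull field 6 (the command) from each line of comand_file.prm and run it
awk '{ print $6 }' comand_file.prm | while read -r cmd
do
    [ -n "$cmd" ] && eval "$cmd"    # skip blank fields, then execute the extracted command
done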

5. Shell Programming and Scripting

Shell Scripting Vlookup

I am developing a shell script on Linux and I have two files. Data in each file is like: File1: A B C E F G X Y Z File 2: A C 12 E G 22 X Z 41 I need, if first and third column entries ($1 & $3) of File1 in the same row match with the first & second column... (3 Replies)
Discussion started by: suneet17
3 Replies
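The requirement in the previous item is truncated, but the sample data suggests a lookup of File1's columns 1 and 3 against File2's columns 1 and 2. Assuming the desired output is each matching File1 row with File2's third column appended, a sketch in the same two-pass style as the answer above:
Code:
awk '
NR == FNR       { val[$1, $2] = $3; next }   # pass 1 (File2): key on (col1, col2), remember col3
($1, $3) in val { print $0, val[$1, $3] }    # pass 2 (File1): on a (col1, col3) match, append the stored value
' File2 File1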

6. Shell Programming and Scripting

Overwrite specific column in xml file with the specific column from adjacent line

I have an xml file dumped from an rrd file that I want to "patch" so the xml file doesn't contain any blank holes in the resulting graph of the rrd file. Here is the file. <!-- 2015-10-12 14:00:00 WIB / 1444633200 --> <row><v> 4.0419731265e+07 </v><v> 4.5045912770e+06... (2 Replies)
Discussion started by: rk4k
2 Replies

7. Shell Programming and Scripting

Vlookup using Ask from specific column from two files

File1 alias:server1_00,20:f1:0a:25:b5:03:02:90 alias:server2_00,20:f1:0a:25:b5:03:02:91 alias:server3_00,50:00:09:75:50:0d:bd:da alias:server4_00,20:f1:0a:25:b5:03:02:93 alias:server5_00,21:00:00:24:ff:8b:e1:fe alias:server6_00,50:00:09:75:50:08:54:44... (17 Replies)
Discussion started by: ranjancom2000
17 Replies

8. Shell Programming and Scripting

Need awk or Shell script to compare Column-1 of two different CSV files and print if column-1 matches

Example: I have files in the below format. File 1: zxc,133,joe@example.com cst,222,xyz@example1.com File 2 contains: hxd hcd jws zxc cst File 1 has 50000 lines and file 2 has around 30000 lines. Expected output has to be: hxd hcd jws (5 Replies)
Discussion started by: TestPractice
5 Replies
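Given the sample data and expected output (hxd, hcd, jws) in the previous item, this reads like an anti-join on column 1: print the names in file 2 that never appear as the first comma-separated field of file 1. A minimal sketch under that assumption:
Code:
# Pass 1 (file1): remember every value of column 1
# Pass 2 (file2): print only names that were not seen in file1
awk -F, 'NR == FNR { seen[$1]; next } !($1 in seen)' file1 file2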

9. UNIX for Beginners Questions & Answers

How to insert data into a blank column (second column) in an Excel (.XLSX) file using a shell script?

The source code of the original script is down below; please run the script and try to solve this problem. This is my data and I want it column-wise: 2019-03-20 13:00:00:000 2019-03-20 15:00:00:000 1 Operating System LAB 0 1 1 1 1 1 1 1 1 1 0 1 (5 Replies)
Discussion started by: Shubham1182
5 Replies

10. Shell Programming and Scripting

Using awk to change a specific column and in a specific row

I am trying to change the number in bold to 2400 01,000300032,193631306,190619,0640,1,80,,2/ 02,193631306,000300032,1,190618,0640,CAD,2/ I'm not sure if sed or awk is the answer. I was going to use sed and do a character count up to that point, but that column directly before 0640 might... (8 Replies)
Discussion started by: juggernautjoee
8 Replies
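Which number was bolded in the previous item is lost in this text-only excerpt, but the general pattern is the same either way: with FS and OFS both set to a comma, awk can rewrite one field of one record without counting characters. The record and field numbers below (record 2, field 5, the column directly before 0640) are an assumption for illustration, as is the file name input.txt:
Code:
# Illustration only: set field 5 of record 2 to 2400 and print every line
awk 'BEGIN { FS = OFS = "," } NR == 2 { $5 = 2400 } 1' input.txt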
DIFF(1)                          General Commands Manual                          DIFF(1)

NAME
       diff - print differences between two files

SYNOPSIS
       diff [-c | -e | -C n] [-br] file1 file2

OPTIONS
       -C n   Produce output that contains n lines of context
       -b     Ignore white space when comparing
       -c     Produce output that contains three lines of context
       -e     Produce an ed-script to convert file1 into file2
       -r     Apply diff recursively to files and directories of the same name, when file1 and file2 are both directories

EXAMPLES
       diff file1 file2        # Print differences between 2 files
       diff -C 0 file1 file2   # Same as above
       diff -C 3 file1 file2   # Output three lines of context with every difference encountered
       diff -c file1 file2     # Same as above
       diff /etc /dev          # Compares recursively the directories /etc and /dev
       diff passwd /etc        # Compares ./passwd to /etc/passwd

DESCRIPTION
       Diff compares two files and generates a list of lines telling how the two files differ. Lines may not be longer than 128 characters. If the two arguments on the command line are both directories, diff recursively steps through all subdirectories comparing files of the same name. If a file name is found in only one directory, a diagnostic message is written to stdout. A file that is of either block special, character special or FIFO special type cannot be compared to any other file. On the other hand, if there is one directory and one file given on the command line, diff tries to compare the file with the same name as file in the directory directory.

SEE ALSO
       cdiff(1), cmp(1), comm(1), patch(1).