Full Discussion: How to combine files?
Post 302271338 by jantzen16 on Wednesday 24th of December 2008 10:08:43 PM
Quote:
Originally Posted by cfajohnson
Code:
join -o 1.1,1.2,2.2,1.3,2.3,1.4,2.4 file1 file2 | tr ' ' :


It's working.

What if, sir, the separator in file1 is a <tab> and in file2 it is a colon (:)?
Is the same join command still applicable in that case?

Last edited by jantzen16; 12-24-2008 at 11:20 PM..
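
One way to handle the mixed separators, sketched rather than taken from the thread (it assumes file2 uses ':' as its separator, the key is the first field of both files, and both files are sorted on it; file2.tab is a temporary name invented for this example): normalize file2 to tabs first, join on a tab, then convert the output back to colons.

Code:
# Sketch: bring both files to a common separator before joining.
# Both files must be sorted on the join field.
tr ':' '\t' < file2 > file2.tab
join -t "$(printf '\t')" -o 1.1,1.2,2.2,1.3,2.3,1.4,2.4 file1 file2.tab |
    tr '\t' ':'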
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

FIND/CHMOD combined

I am trying to change permissions for all subdirectories and files inside folder1, so this is what I came up with after many searches on the internet, man find, man chmod, mIRC and a few articles. find .public_html/folder1 -print0 | xargs -0 chmod 777 What's wrong with this command? It is FTP... (33 Replies)
Discussion started by: smoother
33 Replies
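
Not the thread's answer, just a sketch of the usual pattern for this kind of task (the path is the one from the snippet with a ./ prefix, and the directory/file modes here are illustrative, not a recommendation):

Code:
# Sketch: give directories and regular files different modes under folder1.
find ./public_html/folder1 -type d -print0 | xargs -0 chmod 755
find ./public_html/folder1 -type f -print0 | xargs -0 chmod 644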

2. UNIX for Advanced & Expert Users

combined stdout & stderr

Hello Everyone! I'm trying to send the combined output of standard output and standard error to a log file. I was trying to use the tee command, but it turned out that if an error occurred, the error output was sent to the screen only and was not redirected through tee to the log file. Anyone... (11 Replies)
Discussion started by: slavam
11 Replies
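
A minimal sketch of the usual fix (the command and log file names are placeholders): merge stderr into stdout before the pipe, so tee sees both streams.

Code:
# 2>&1 sends stderr to wherever stdout is going, i.e. into the pipe,
# so both streams reach the screen and the log file via tee.
some_command 2>&1 | tee logfile.log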

3. Shell Programming and Scripting

How to combine two files with their values

I have two files and I need to combine their values. Example: in file1
1019 40 50
1119 55 62
and in file2
1019 33 10
1119 12 44
The desired output (file3) should be:
1019:40:33:50:10
1119:55:12:62:44 (5 Replies)
Discussion started by: jantzen16
5 Replies
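
As a worked check against this sample data (a sketch, not the thread's full exchange): with the three-field lines shown above only five output fields are needed, and both files are assumed to be sorted on the first column.

Code:
join -o 1.1,1.2,2.2,1.3,2.3 file1 file2 | tr ' ' :
# expected output:
# 1019:40:33:50:10
# 1119:55:12:62:44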

4. Shell Programming and Scripting

Awk not printing the last combined column

nawk -F "|" 'FNR==NR {a=$2 OFS $3 OFS $4 OFS $5 OFS $6; next} \
{if ($5 in a) print $1,"test",$5,a, $2,$3,$4 OFS OFS OFS OFS OFS OFS OFS OFS $2-$3-$4; \
else print $1,"Database",$5 OFS OFS OFS OFS OFS OFS $2,$3,$4 OFS OFS OFS OFS OFS OFS OFS OFS $2-$3-$4}' OFS="|" \
file1 file2 > file3
This... (5 Replies)
Discussion started by: pinnacle
5 Replies

5. UNIX for Dummies Questions & Answers

Combine many txt files by columns

Hi, I want to combine 500 .txt files (each file has only one line) into a single txt file (A.txt). For example, A1.txt with 12 23 34 45, A2.txt with 56 67 78 89 ...etc, so I have A1.txt, A2.txt..., A499.txt, A500.txt (1 Reply)
Discussion started by: AMBER
1 Replies
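
A sketch of one reading of the request (each one-line file becomes one line of A.txt, in numeric order; if the goal is instead to place all 500 values side by side on a single line, paste with the same list of file names would be the tool):

Code:
# A plain glob such as A*.txt would sort A10.txt before A2.txt,
# so generate the file names in numeric order explicitly.
for i in $(seq 1 500); do
    cat "A$i.txt"
done > A.txt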

6. Shell Programming and Scripting

Merging files with different headers to make a combined-header file

Hi, I have a typical situation. I have 4 files with different headers (the number of headers is variable). I need to make a merged file that has the headers combined from all files (common columns should appear only once). For example - File 1 H1|H2|H3|H4 11|12|13|14 21|22|23|23... (1 Reply)
Discussion started by: marut_ashu
1 Replies

7. Shell Programming and Scripting

Combined Two CSV Lines

I have two CSV lines, i.e.: Line 1 = the,quick,brown,fox, ,jumps, ,the, ,dog Line 2 = the,quick,brown,fox, , ,over, ,lazy,dog Literally, columns missing from line 1 exist in line 2. Any suggestions on quick ways to combine these two lines into one line: New line:... (2 Replies)
Discussion started by: msf004
2 Replies
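
A sketch of one way to merge the two lines field by field (the file name lines.csv is invented; it is assumed to hold exactly the two lines above, with a blank column being a field containing only spaces):

Code:
awk -F, 'NR == 1 { split($0, a, FS); next }
         NR == 2 { for (i = 1; i <= NF; i++) {
                       v = (a[i] ~ /^ *$/) ? $i : a[i]    # prefer the non-blank field
                       printf "%s%s", v, (i < NF ? FS : "\n")
                   } }' lines.csv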

8. UNIX for Dummies Questions & Answers

Grep and cat combined

Hello, I need to search for one word (snp1) in many files and copy the content of the rows for this word into a new file. Example:
file 1:
SNP BP CHR P
snp1 1 3 0.01
snp2 2 2 0.05
. .
file 2:
SNP BP CHR P
snp1 1 3 0.06
snp2 2 2 0.3
output... (6 Replies)
Discussion started by: biopsy
6 Replies
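
A minimal sketch of one reading of the request (file1, file2 and the output name snp1_rows.txt stand in for the real file names; matching on the first field avoids accidental hits on names like snp10):

Code:
# Collect every snp1 row from both files into one output file.
awk '$1 == "snp1"' file1 file2 > snp1_rows.txt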

9. Shell Programming and Scripting

awk combined with an IF

Hi everybody! I try to print out a csv file with the exception of cells $1 and $4. What I tried so far: awk '{for(i = 1; i<=NF; i++);if(i == 1 || i == 4);else print($i)}' file.csv ..any ideas how to make it work and why my example fails? Thanks in advance! IMPe (3 Replies)
Discussion started by: IMPe
3 Replies
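
The semicolons right after for(...) and if(...) turn their bodies into empty statements, which is why the attempt above only prints empty lines. A sketch of a working version (assuming the fields are comma-separated and the surviving fields should stay on one line per record):

Code:
awk -F, '{
    out = ""
    for (i = 1; i <= NF; i++)
        if (i != 1 && i != 4)
            out = (out == "" ? $i : out FS $i)   # rebuild the line without $1 and $4
    print out
}' file.csv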

10. Shell Programming and Scripting

Combined sed+awk for lookup csv file

I have written a combined sed+awk to perform a lookup operation, which works, but I'm looking to enhance it. I'm looking to match a record using any of the comma-separated values and return selected fields from the record - including the field header. So: cat foo make,model,engine,trim,value... (6 Replies)
Discussion started by: jack.bauer
6 Replies
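
Not the poster's script, just a sketch of the matching part in awk alone (the search term "civic" is invented; foo is the file from the snippet, with the header on its first line):

Code:
awk -F, -v key="civic" '
    NR == 1 { print; next }                   # always emit the header line
    { for (i = 1; i <= NF; i++)
          if ($i == key) { print; next } }    # match the key against any field
' foo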
JOIN(1)                    General Commands Manual                    JOIN(1)

NAME
       join - relational database operator

SYNOPSIS
       join [-an] [-e s] [-o list] [-tc] file1 file2

DESCRIPTION
       Join forms, on the standard output, a join of the two relations
       specified by the lines of file1 and file2.  If file1 is `-', the
       standard input is used.

       File1 and file2 must be sorted in increasing ASCII collating sequence
       on the fields on which they are to be joined, normally the first in
       each line.

       There is one line in the output for each pair of lines in file1 and
       file2 that have identical join fields.  The output line normally
       consists of the common field, then the rest of the line from file1,
       then the rest of the line from file2.

       Fields are normally separated by blank, tab or newline.  In this case,
       multiple separators count as one, and leading separators are
       discarded.

       These options are recognized:

       -an     In addition to the normal output, produce a line for each
               unpairable line in file n, where n is 1 or 2.

       -e s    Replace empty output fields by string s.

       -o list Each output line comprises the fields specified in list, each
               element of which has the form n.m, where n is a file number
               and m is a field number.

       -tc     Use character c as a separator (tab character).  Every
               appearance of c in a line is significant.

SEE ALSO
       sort(1), comm(1), awk(1).

BUGS
       With default field separation, the collating sequence is that of
       sort -b; with -t, the sequence is that of a plain sort.

       The conventions of join, sort, comm, uniq, look and awk(1) are wildly
       incongruous.

7th Edition                     April 29, 1985                        JOIN(1)
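
A small illustration of the -a and -e options described in the man page above (the file names and contents are invented here, and the exact handling of -e can vary slightly between join implementations):

Code:
# left.txt contains "1 apple" and "2 banana"; right.txt contains "1 red".
# -a1 also prints lines from file 1 that have no match in file 2;
# -e NA fills the resulting empty output fields with "NA".
join -a1 -e NA -o 1.1,1.2,2.2 left.txt right.txt
# 1 apple red
# 2 banana NA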