Full Discussion: Merging two files
Post 302773543 by pamu on Thursday 28th of February 2013 07:22:40 AM
Code:
awk -F "|" 'NR==FNR{X[$1,++Y[$1]]=$2;next}
    {for(i=1;i<=Y[$1];i++){print $1,$2,X[$1,i]}}' OFS="|" file2 file1
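
In brief: the first block runs only while reading file2 (NR==FNR holds only for the first file named), storing every second field in X keyed by the first field, while Y[$1] counts how many values each key has collected. The second block then runs for each line of file1 and prints the key, file1's second field, and every stored file2 value for that key, pipe-separated. A hypothetical illustration (this sample data is not from the original thread):

Code:
$ cat file1
A|apple
B|banana
$ cat file2
A|1
A|2
B|9
$ awk -F "|" 'NR==FNR{X[$1,++Y[$1]]=$2;next}
    {for(i=1;i<=Y[$1];i++){print $1,$2,X[$1,i]}}' OFS="|" file2 file1
A|apple|1
A|apple|2
B|banana|9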

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

merging files

Thanks in advance. I have 2 files, each having a key field. I would like to join both on the common key. I have used the join command but was not successful. The files are attached here. What I want in the output is based on the key field SLS OFFR. File one ======= SNO ... (6 Replies)
Discussion started by: vakharia Mahesh
6 Replies
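A key-based merge like this usually comes down to join(1), which requires both inputs sorted on the key. A minimal sketch, assuming the key is the first whitespace-separated field (the attached data was not shown in full):

Code:
# join(1) needs both inputs sorted on the join field
sort -k1,1 file1 > file1.sorted
sort -k1,1 file2 > file2.sorted
join file1.sorted file2.sorted    # joins on field 1 of each file by default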

2. Shell Programming and Scripting

Merging 2 files

Hi, I have got two files. 1.txt 1111|apple| 2222|orange| 2.txt 1111|1234|000000000004356| 1111|1234|000000001111| 1111|1234|002000011112| 2222|5678|000000002222| 2222|9102|000000002222| I need to merge these two so that my output looks like below: Search code being used should be... (4 Replies)
Discussion started by: jisha
4 Replies
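This is the same shape of problem as the answer at the top of this page: treat 1.txt as a lookup table keyed on the first field and splice its description into each matching line of 2.txt. A minimal sketch, assuming the desired output simply inserts the fruit name after the key (the expected output in the thread was truncated):

Code:
awk -F "|" 'NR==FNR{name[$1]=$2;next}
    {print $1,name[$1],$2,$3}' OFS="|" 1.txt 2.txt
# 1111|apple|1234|000000000004356
# 1111|apple|1234|000000001111
# ...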

3. Shell Programming and Scripting

Help with merging files

I would like to merge two files that have the same format but contain different data, creating one output file with the information from both original files. (2 Replies)
Discussion started by: joe black
2 Replies
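If every line from both files should simply end up in the output, this is plain concatenation; if both inputs are already sorted, sort -m keeps the result sorted. A minimal sketch:

Code:
cat file1 file2 > merged       # all lines of file1, then all lines of file2
sort -m file1 file2 > merged   # merge two pre-sorted files, keeping sort order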

4. Shell Programming and Scripting

merging two files

Hi everyone, I have two files which will be exactly the same at first. After some time there will be inserts in one file. My problem is how to reflect those changes in the second file as well. I found that any compare-and-merge utility would do the job, like the GNU sdiff command. But the... (14 Replies)
Discussion started by: rameshonline
14 Replies
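Besides interactive tools like sdiff, the difference can be propagated non-interactively with diff and patch. A minimal sketch, assuming file1 is the copy that received the inserts and file2 is the stale copy:

Code:
diff -u file2 file1 > changes.diff   # lines present in file1 but missing from file2
patch file2 changes.diff             # apply those inserts, bringing file2 up to date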

5. Shell Programming and Scripting

merging of files.

Hi, I want to merge the two files on the basis of columns, like... file1: Data Key A 12 B 13 file2: Data Value A A1 A A2 B B1 B B2 (5 Replies)
Discussion started by: clx
5 Replies
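A lookup-table pass in awk fits this layout. A minimal sketch, assuming whitespace-separated columns, a header line in each file, and an output that pairs each file2 value with the key from file1 (the expected output was cut off in the thread):

Code:
# skip each file's header line, load file1 into a map, then join file2 against it
awk 'FNR==1{next} NR==FNR{key[$1]=$2;next} {print $1, key[$1], $2}' file1 file2
# A 12 A1
# A 12 A2
# B 13 B1
# B 13 B2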

6. UNIX for Dummies Questions & Answers

Merging two files

Hi, I have two files, a.txt and b.txt. a.txt 1 2 3 4 b.txt a b c d e I want to generate a file c.txt by merging these two files, and the resultant file would contain c.txt 1 (4 Replies)
Discussion started by: siba.s.nayak
4 Replies
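Since the expected c.txt was truncated, it could be either a side-by-side or an interleaved merge; paste(1) covers both readings. A minimal sketch:

Code:
paste a.txt b.txt > c.txt          # side by side: 1<TAB>a, 2<TAB>b, ...
paste -d '\n' a.txt b.txt > c.txt  # interleaved: 1, a, 2, b, ...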

7. Shell Programming and Scripting

Merging files

I have two files: file1 containing x rows and 1 column, and file2 containing x rows and 1 column. I want to merge both files and add a comma between the two fields, e.g. Please guide. (1 Reply)
Discussion started by: test_user
1 Reply
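This maps directly onto paste with a comma delimiter. A minimal sketch:

Code:
# glue line N of file1 to line N of file2 with a comma between them
paste -d, file1 file2 > merged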

8. Shell Programming and Scripting

Merging two files with same name

Hello all, I have limited experience in shell scripting. Here goes my question: I have two directories that have the same number of files with the same file names, i.e. consider 2 directories A and B. Both directories have files 1.txt, 2.txt... I need to merge the file 1.txt of A with file 1.txt... (5 Replies)
Discussion started by: jaysean
5 Replies
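The usual pattern is a shell loop over one directory, concatenating each file with its same-named counterpart from the other. A minimal sketch (the merged/ output directory is an assumption):

Code:
mkdir -p merged
for f in A/*.txt; do
    name=$(basename "$f")
    cat "$f" "B/$name" > "merged/$name"   # A's 1.txt followed by B's 1.txt, etc.
done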

9. Shell Programming and Scripting

Help with merging 2 files into 1

::::::::: ::FileA:: ::::::::: A1-------A2--------A3---A4---A5-- ================================= AC5VXVLT-XX---------------------- B57E434--XXXX1-----MMMM-ZZZ--111- C325G20--XXXXX3----CCCC------3332 DC35S51--XXXXY1----DDDD------44X- DC35S52--XXXXY2----DDDD------44Y-... (5 Replies)
Discussion started by: lordsmiter
5 Replies

10. Shell Programming and Scripting

Merging two files

Guys, I am having a little problem with getting a daily report! The daily process that I follow is: 1. Unload the header for the report from the systables to one unl file, say Header.unl 2. Unload the data from the required table(s) to another unl file, say Data.unl 3. Send a... (2 Replies)
Discussion started by: PikK45
2 Replies
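Steps 1 and 2 produce two unl files, so the report body is a plain concatenation of header and data. A minimal sketch (the output file name is an assumption):

Code:
cat Header.unl Data.unl > DailyReport.unl   # header rows first, then the data rows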