Full Discussion: Concatenate files
Top Forums > Shell Programming and Scripting > Concatenate files
Post 302601796 by bobby1015, Friday 24 February 2012, 11:36:59 AM
If I need my output to be:

Code:
10000|ABC
20000|DEF
30000|XYZ

How can I achieve this?
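One way to get that, assuming the numbers and the letters sit in two separate input files with matching line counts (file1 and file2 are hypothetical names), is paste with a pipe delimiter:

Code:
$ cat file1
10000
20000
30000
$ cat file2
ABC
DEF
XYZ
$ paste -d'|' file1 file2
10000|ABC
20000|DEF
30000|XYZ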
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

How to concatenate all files.

Hi, I'm totally new to Unix. I'm an MVS mainframer but ran into a situation where a Unix server I have available will help me. I want to be able to remotely connect to another server using FTP, log in and MGET all files from its root or home directory, log out, then log in as a different user and do... (1 Reply)
Discussion started by: s80bob
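A sketch of that kind of scripted FTP session, with a hypothetical host name and credentials; the ftp client's user command re-authenticates within the same session:

Code:
#!/bin/sh
# -i disables mget prompting, -n skips auto-login so we can log in twice
ftp -in remote.example.com <<'EOF'
user first_user first_pass
mget *
user second_user second_pass
mget *
bye
EOF
# Note: both logins download into the same local directory,
# so identically named files from the second login will overwrite the first.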

2. Shell Programming and Scripting

Script to concatenate several files

I need a script to concatenate several files in one step. I have 3 header files, say file.S, file.X and file.R. I need to concatenate these 3 header files to data files, say file1.S, file1.R, file1.X, so that the header file "file.S" will be concatenated to all data files with .S extensions, and so on... (3 Replies)
Discussion started by: docaia
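A minimal sketch under the naming scheme described above (header file.S plus data files file1.S, file2.S, ..., and likewise for .X and .R); it prepends the matching header to each data file in place:

Code:
#!/bin/sh
for ext in S X R; do
    for f in file*."$ext"; do
        [ "$f" = "file.$ext" ] && continue     # skip the header file itself
        # Prepend the header, then replace the data file atomically
        cat "file.$ext" "$f" > "$f.tmp" && mv "$f.tmp" "$f"
    done
done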

3. Shell Programming and Scripting

Concatenate files

I have a directory structure sales_only under which I have multiple directories, one for each dealer, for example: ../../../Sales_Only/xxx_Dealer ../../../Sales_Only/yyy_Dealer ../../../Sales_Only/zzz_Dealer Every day one file is produced under each directory when the process runs. The requirement... (3 Replies)
Discussion started by: mohanmuthu
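Assuming every dealer directory holds one file per day with a predictable name (daily_sales.txt is a hypothetical name), a glob over the dealer directories covers this:

Code:
#!/bin/sh
# Collect today's file from every *_Dealer directory into one output file
cat ../../../Sales_Only/*_Dealer/daily_sales.txt > all_dealers.txt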

4. Shell Programming and Scripting

Concatenate files

Hi, I want to create a batch (bash) script to combine 23 files. These files have the same extension. I want the final file saved to a given folder. Once that is done, it should delete the 23 files. Thanks for the help; I need a script. (6 Replies)
Discussion started by: zhshqzyc
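A minimal sketch, assuming the files share a .dat extension and the destination folder already exists (both are assumptions); writing the result outside the current directory keeps it out of the glob, and the sources are removed only if cat succeeded:

Code:
#!/bin/bash
out=/path/to/output/combined.dat     # hypothetical destination folder
cat ./*.dat > "$out" && rm -f ./*.dat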

5. UNIX for Dummies Questions & Answers

Concatenate more than 200 files

Hi, I have a bunch of files (around 500) with the .log extension, each having a different name. Is there an easy way to concatenate them all into a single file? I've tried: cat *.log > complete.log and it seems that files are overlapping each other... thus the new file is not complete... (2 Replies)
Discussion started by: amarn
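The likely culprit there is that complete.log itself matches *.log, so cat can end up reading its own partially written output. Writing to a name the glob cannot match, then renaming, avoids that:

Code:
# complete.out does not match *.log, so cat never reads its own output
cat ./*.log > complete.out
mv complete.out complete.log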

6. UNIX for Dummies Questions & Answers

Concatenate 560 files in one

Hi all, could you please help me with this? I have 560 files containing the same columns but different rows. I want to concatenate all these files into one big file. I want to keep the header of the first file, then add the data from the other files horizontally. The names of my files contain 2 variables... (7 Replies)
Discussion started by: biopsy
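The post says "horizontally", but keeping one header and adding rows from files with the same columns suggests stacking the data. Assuming that reading, and assuming the 560 files match a hypothetical *.txt glob, a sketch:

Code:
#!/bin/sh
set -- *.txt                          # the 560 input files, in lexical order
head -n 1 "$1" > big_file.out         # header from the first file only
for f in "$@"; do
    tail -n +2 "$f" >> big_file.out   # data rows (everything after line 1)
done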

7. UNIX for Dummies Questions & Answers

Concatenate Several Files to One

Hi All, Need your help. I need to concatenate around 100 files, but at the end of each file I need to insert my name, DIRT1228, before the next file is added, so that I end up with just one file for all 100 files. Appreciate your time. Dirt (6 Replies)
Discussion started by: dirt1228
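A sketch, assuming the 100 files match a hypothetical *.txt glob: each file is emitted followed by the marker line, and everything lands in one output file:

Code:
#!/bin/sh
for f in *.txt; do        # hypothetical glob for the 100 files
    cat "$f"
    echo "DIRT1228"       # marker line after each file's contents
done > all_files.out      # .out so the output never matches the glob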

8. UNIX for Dummies Questions & Answers

Concatenate files

Hi, I am trying to learn Linux step by step, and I am wondering: can I use the cat command to concatenate files, but place the contents of file1 at a specific position in file2, and not at the end as it does by default? Thank you. (3 Replies)
Discussion started by: iliya24
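cat alone always appends, but sed's r command can splice one file into another at a chosen line (line 10 here is just an example position):

Code:
# Insert the whole of file1 after line 10 of file2, writing a merged copy
sed '10r file1' file2 > merged.txt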

9. UNIX for Dummies Questions & Answers

Concatenate files and delete source files. Also have to add a comment.

- Concatenate files and delete source files; also have to add a comment. - I need to concatenate 3 files which have the same characters in the beginning, then remove those files and add a comment at the end. Example: cat REJ_FILE_ABC.txt REJ_FILE_XYZ.txt REJ_FILE_PQR.txt >... (0 Replies)
Discussion started by: eskay
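A sketch completing the truncated command above; the output file name and the comment text are assumptions, and the sources are removed only if both earlier steps succeed:

Code:
#!/bin/sh
# REJ_FILE_ALL.txt and the comment text are hypothetical
cat REJ_FILE_ABC.txt REJ_FILE_XYZ.txt REJ_FILE_PQR.txt > REJ_FILE_ALL.txt &&
echo "# end of rejected records" >> REJ_FILE_ALL.txt &&
rm -f REJ_FILE_ABC.txt REJ_FILE_XYZ.txt REJ_FILE_PQR.txt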

10. UNIX for Dummies Questions & Answers

Concatenate files by pairs

Hi, I would like to batch-concatenate files by pairs. I have quite a few of them, so I would not like to do that pair by pair separately. The names of the files are of the type: file1.fastq newfile1_new.fastq file2.fastq newfile2_new.fastq and so on... I would like to concatenate file1... (2 Replies)
Discussion started by: jawad
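Assuming the numbering scheme shown (fileN.fastq paired with newfileN_new.fastq), shell parameter expansion can recover N and drive the pairing:

Code:
#!/bin/sh
for f in file*.fastq; do
    n=${f#file}                        # strip the leading "file"
    n=${n%.fastq}                      # strip the trailing ".fastq"
    cat "$f" "newfile${n}_new.fastq" > "merged${n}.fastq"
done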
bup-margin(1)                    General Commands Manual                    bup-margin(1)

NAME
       bup-margin - figure out your deduplication safety margin

SYNOPSIS
       bup margin [options...]

DESCRIPTION
       bup margin iterates through all objects in your bup repository, calculating
       the largest number of prefix bits shared between any two entries. This
       number, n, identifies the longest subset of SHA-1 you could use and still
       encounter a collision between your object ids.

       For example, one system that was tested had a collection of 11 million
       objects (70 GB), and bup margin returned 45. That means a 46-bit hash would
       be sufficient to avoid all collisions among that set of objects; each object
       in that repository could be uniquely identified by its first 46 bits.

       The number of bits needed seems to increase by about 1 or 2 for every
       doubling of the number of objects. Since SHA-1 hashes have 160 bits, that
       leaves 115 bits of margin. Of course, because SHA-1 hashes are essentially
       random, it's theoretically possible to use many more bits with far fewer
       objects.

       If you're paranoid about the possibility of SHA-1 collisions, you can
       monitor your repository by running bup margin occasionally to see if you're
       getting dangerously close to 160 bits.

OPTIONS
       --predict
              Guess the offset into each index file where a particular object will
              appear, and report the maximum deviation of the correct answer from
              the guess. This is potentially useful for tuning an interpolation
              search algorithm.

       --ignore-midx
              Don't use .midx files, use only .idx files. This is only really
              useful when used with --predict.

EXAMPLE
       $ bup margin
       Reading indexes: 100.00% (1612581/1612581), done.
       40
       40 matching prefix bits
       1.94 bits per doubling
       120 bits (61.86 doublings) remaining
       4.19338e+18 times larger is possible

       Everyone on earth could have 625878182 data sets like yours, all in one
       repository, and we would expect 1 object collision.

       $ bup margin --predict
       PackIdxList: using 1 index.
       Reading indexes: 100.00% (1612581/1612581), done.
       915 of 1612581 (0.057%)

SEE ALSO
       bup-midx(1), bup-save(1)

BUP
       Part of the bup(1) suite.

AUTHORS
       Avery Pennarun <apenwarr@gmail.com>.