Post 302840335 by sammy777 on Monday 5th of August 2013 09:32:55 AM
Split a text file into multiple text files?

I have a text file with entries like
Code:
 1186
 5556
 90844
 7873

 7722
 12
 7890.6
 78.52
 6679
 
 3455
 9867
 1127
 5642

...and so on; there are many more blocks like this, each separated by a blank line.

I want to split this file into multiple files like cluster1.txt, cluster2.txt, cluster3.txt, ..... clusterN.txt.

cluster1.txt should look like this:
Code:
 1186
 5556
 90844
 7873

cluster2.txt should look like this:
Code:
 7722
 12
 7890.6
 78.52
 6679

... and so on for every block in the file.
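A minimal sketch of one way to do this with awk's paragraph mode (input.txt stands in for the real file name): setting RS to an empty string makes awk treat each blank-line-separated block as a single record, so NR numbers the blocks and each one can be written to its own cluster file.
Code:
awk -v RS= '{ f = "cluster" NR ".txt"; print > f; close(f) }' input.txt

Paragraph mode also collapses runs of consecutive blank lines, so stray extra empty lines between blocks do not create empty cluster files, and the close(f) call keeps awk from holding every output file open when there are many blocks.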

Last edited by Scott; 08-05-2013 at 10:35 AM. Reason: Please use code tags
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

grep multiple text files in folder into 1 text file?

How do I use the grep command to take multiple text files in a folder and make one huge text file out of them? I'm using Mac OS X and cannot find a text tool that does it, so I figured I'd resort to the BSD Unix CLI for a solution... there are 5,300 files that I want to write to one huge file so... (7 Replies)
Discussion started by: coppertone
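A minimal sketch of one way to do the concatenation part (the file and directory names are placeholders; using find sidesteps the argument-length limit that a plain cat *.txt can hit with 5,300 files):
Code:
# write the result outside the directory being searched so it is not swept up by find itself
find . -maxdepth 1 -name '*.txt' -exec cat {} + > /tmp/combined.txt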

2. Shell Programming and Scripting

Create multiple text file from a single text file on AIX

Hi I need to create multiple text files from one text file on AIX. The data in the text file is as below: ********************************************** ********************************************** DBVERIFY: Release 10.2.0.4.0 - Production on Tue Nov 10 13:45:42 2009 Copyright (c) 1982,... (11 Replies)
Discussion started by: lodhi1978
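The sample above is cut off, so this is only a guess at the structure, but if each DBVERIFY block is preceded by lines made up entirely of asterisks, an awk sketch along these lines starts a new output file at each run of separator lines (the dbv_part naming is made up):
Code:
# start a new file at the first asterisk-only line of each separator run; skip the separator lines themselves
awk '/^\*+$/ { if (!sep) { n++; sep = 1 }; next }
     { sep = 0; print > ("dbv_part" n ".txt") }' logfile.txt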

3. Shell Programming and Scripting

[bash help]Adding multiple lines of text into a specific spot into a text file

I am attempting to insert multiple lines of text into a specific place in a text file based on the lines above or below it. For example, here is a portion of a zone file. IN NS ns1.domain.tld. IN NS ns2.domain.tld. IN ... (2 Replies)
Discussion started by: cdn_humbucker
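One common way to handle this kind of targeted insertion (a sketch only; the address and the file names below are made up for illustration) is sed's r command, which appends the contents of a file after every line matching the address:
Code:
# append the lines stored in newrecords.txt after the line that mentions ns2.domain.tld.
sed '/ns2\.domain\.tld\./r newrecords.txt' zonefile.db > zonefile.new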

4. UNIX for Dummies Questions & Answers

Changing text in multiple files, but with different text for each file

Hello, I have a situation where I want to change a line of text in multiple files, but the problem is that I want to change the text to something unique for each file. For example, let's say I have five files named bob.txt, joe.txt, john.txt, tom.txt, and zach.txt. Each of these files has a... (5 Replies)
Discussion started by: Scatterbrain26
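A rough sketch of one approach, assuming the unique text can be derived from each file's name (the Owner: line is purely hypothetical, and -i as written is GNU sed syntax; BSD sed needs -i ''):
Code:
for f in bob.txt joe.txt john.txt tom.txt zach.txt; do
    name=${f%.txt}
    # "Owner:" stands in for whatever line actually has to change in each file
    sed -i "s/^Owner:.*/Owner: $name/" "$f"
done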

5. UNIX for Dummies Questions & Answers

Splitting up a text file into multiple files by columns

Hi, I have a space-delimited text file with 102 columns. I want to break it up into 100 files labelled 1.txt through 100.txt (n.txt). Each text file will contain the first two columns and in addition the nth column (that corresponds to n.txt). The third file will contain the... (1 Reply)
Discussion started by: evelibertine
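A minimal awk sketch of that layout, assuming file n should get the first two columns plus column n+2 (the nth of the remaining 100 columns), and that the awk in use can keep 100 output files open at once, as gawk can:
Code:
awk '{ for (n = 1; n <= 100; n++) print $1, $2, $(n+2) > (n ".txt") }' input.txt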

6. Shell Programming and Scripting

Merge the multiple text files into one file

Hi All, I am trying to merge all the text files into one file using the snippet below: cat /home/Temp/Test/Log/*.txt >> all.txt But it seems it is not working. I have multiple files like Output_ServerName1.txt, Output_ServerName2.txt. I want to merge each file into one single file and... (6 Replies)
Discussion started by: sharsour
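For reference, a sketch of the same merge with the two usual pitfalls avoided: all.txt is written outside the directory being globbed, so it cannot be appended to itself, and > is used so reruns start from an empty file (the Log path follows the post; the destination path is an assumption):
Code:
# merge every .txt in Log into a single file that lives outside Log
cat /home/Temp/Test/Log/*.txt > /home/Temp/Test/all.txt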

7. Shell Programming and Scripting

Split text separated by ; in a column into multiple columns

Hi, I need help to split a long text in a column which is separated by ; and I need to print the pieces out in multiple columns. My input file is tab-delimited and has 11 columns as below: aRg02004 21452 asdfwf 21452 21452 4.6e-29 5e-29 -1 3 50 ffg|GGD|9009 14101.10 High class -node. ; ffg|GGD|969... (3 Replies)
Discussion started by: redse171
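Purely as an illustration of the idea, since the sample above is truncated (the field number and the spacing around the semicolons are guesses): split the ;-separated field into an array and append each piece as its own tab-separated column:
Code:
# illustrative only: assumes the ;-separated text sits in the last tab-delimited field
awk -F'\t' 'BEGIN { OFS = "\t" }
{
    n = split($NF, a, " *; *")                 # split on ";" allowing surrounding spaces
    $NF = a[1]
    for (i = 2; i <= n; i++) $(NF+1) = a[i]    # assigning past NF appends a new field
    print
}' input.txt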

8. UNIX for Dummies Questions & Answers

Pdftotext from multiple pdf files to a single text file

I have a directory containing a number of PDF files. I want to convert all of them to text, stored in a single text file. The following creates multiple text files: ls *.pdf | xargs -n1 pdftotext (1 Reply)
Discussion started by: kristinu
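A sketch of one way to end up with a single file: pdftotext accepts - as the output name to mean standard output, so the whole loop can be redirected once (all_text.txt is just a placeholder name):
Code:
for f in *.pdf; do pdftotext "$f" -; done > all_text.txt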

9. Shell Programming and Scripting

Split a text file into multiple pages based on pattern

Hi, I have a text file (sample attached). I have also attached the way the files need to be split. We get this file, which will either have all 24 Jurisdictions, or will miss some and retain others. Like in the attached sample file, there are only Jurisdictions 03, 11, 14, 15, 20 and 30.... (3 Replies)
Discussion started by: ebsus
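The sample and the split layout are in attachments that are not shown here, so the marker below is only a guess at the structure, but the generic pattern for this kind of split is to start a new output file each time a header line is seen:
Code:
# "Jurisdiction" is a guessed marker; adjust the pattern to whatever actually starts a new section
awk '/Jurisdiction/ { if (out != "") close(out); out = "jurisdiction_" ++n ".txt" }
     out != "" { print > out }' input.txt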

10. Shell Programming and Scripting

Copying a file to multiple other files using a text file as input

Hello, I have a file called COMPLIST as follows that contains 4-digit numbers: 0002 0003 0010 0013 0015 0016 0022 0023 0024 0025 0027 0030 0031 0032 0033 0035 0038 0041 (3 Replies)
Discussion started by: sph90457
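A minimal sketch of the usual pattern for this (template.txt and the file_NNNN.txt naming scheme are assumptions, since the post does not show the source file or target names, and it assumes one number per line in COMPLIST):
Code:
# copy the same source file once for every 4-digit number listed in COMPLIST
while read -r num; do
    cp template.txt "file_${num}.txt"
done < COMPLIST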