Full Discussion: split file problem
Post 302350320 by methyl, Thursday 3rd September 2009, 11:55:43 AM
If you use, say, "-a 4", split can generate up to 26x26x26x26 = 456,976 suffix combinations (aaaa through zzzz), which at 5,000 records per file allows a maximum of 2,284,880,000 input records.

I'm not sure I understand your question. If your "-a" value is high enough, you don't need to know the number of records in advance.

If you are creating large numbers of files, check that you have enough free inodes in the filesystem (df -i).
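A minimal sketch of the idea (the input file name and record count are just for illustration):

  # Split bigfile.dat into 5000-line pieces with 4-character
  # alphabetic suffixes: xaaaa, xaaab, ... up to xzzzz.
  split -a 4 -l 5000 bigfile.dat x

  # Check free inodes on the target filesystem before creating
  # hundreds of thousands of small files.
  df -i .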
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Problem in split command

I want to split a file containing millions of records. I am issuing the command split -l 20000 filename, which splits the file into chunks of 20,000 records each. It works fine except that in some files, data after one particular field is lost (the field with a space). Say the record is ... (4 Replies)
Discussion started by: superprogrammer
4 Replies
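For what it's worth, split -l never inspects fields, so apparent data loss is more likely a display issue. A quick check, assuming the default "x" prefix, two-character suffixes, and no other files matching x??:

  # Reassemble the pieces and compare byte-for-byte with the
  # original; any difference means data really was lost.
  cat x?? | cmp - filename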

2. UNIX for Dummies Questions & Answers

Split a file with no pattern -- Split, Csplit, Awk

I have gone through all the threads in the forum and tested out different things. I am trying to split a 3GB file into multiple files; some files are even larger than this. For example: split -l 3000000 filename.txt This is very slow, and it splits the file with 3 million records in each... (10 Replies)
Discussion started by: madhunk
10 Replies
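If counting lines in advance is part of the slowness, GNU split can divide a file into a fixed number of pieces without breaking lines; a sketch (the chunk count of 10 is arbitrary):

  # GNU coreutils only: split into 10 roughly equal pieces,
  # cutting only at line boundaries, in a single pass.
  split -n l/10 filename.txt part_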

3. Shell Programming and Scripting

problem with awk for file split

Hi all, I have a .ksh script which is, among other stuff, splitting a file and saving the filenames into variables: # file split before ftp and put result filenames into variables if ]; then awk '{close(f);f=$1}{sub("^","");print > f".TXT"}' $_ftpfile set B*.TXT... (0 Replies)
Discussion started by: spidermike
0 Replies
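The forum software appears to have eaten part of the sub() call in the snippet above. One guess at the intended logic, assuming the first field names the output file and should be stripped from each record:

  # Write each record to <first-field>.TXT, closing the previous
  # file whenever the first field changes (input assumed sorted).
  awk '$1 != f { if (f != "") close(f ".TXT"); f = $1 }
       { sub(/^[^ ]+ */, ""); print > (f ".TXT") }' "$_ftpfile"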

4. Shell Programming and Scripting

How to split a data file into separate files with the file names depending upon a column's value?

Hi, I have a data file xyz.dat similar to the one given below:
2345|98|809||x|969|0
2345|98|809||y|0|537
2345|97|809||x|544|0
2345|97|809||y|0|651
9685|98|809||x|321|0
9685|98|809||y|0|357
9685|98|709||x|687|0
9685|98|709||y|0|234
2315|98|809||x|564|0
2315|98|809||y|0|537... (2 Replies)
Discussion started by: nithins007
2 Replies
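For data like the sample above, a one-liner along these lines works, assuming the output files are named after the first field (the naming column was not specified):

  # awk keeps each redirection target open after first use, so
  # every record lands in a file named after its first field.
  awk -F'|' '{ print > ($1 ".dat") }' xyz.dat

With many distinct key values, add close() calls to stay under the per-process open-file limit.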

5. Shell Programming and Scripting

Split File by Pattern with File Names in Source File... Awk?

Hi all, I'm pretty new to shell scripting and I need some help to split a source text file into multiple files. The source has a row with a pattern marking where the file needs to be split, and that pattern row also contains the file name of the destination for that specific piece. Here is an example: ... (2 Replies)
Discussion started by: cul8er
2 Replies
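Since the example was truncated, here is a sketch that assumes marker rows start with ">>" followed by the destination file name (the marker format is made up):

  # A row like ">>out1.txt" switches output to that file; all
  # following rows are written there until the next marker.
  awk '/^>>/ { if (f) close(f); f = substr($0, 3); next }
       f     { print > f }' source.txt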

6. Shell Programming and Scripting

Split a file into multiple files based on first two digits of file.

Hi, I have a fixed-width flat file that holds data for 10 different datasets, each identified by the first two digits of the record: 01 in the first two positions refers to Set A, 02 refers to Set B, and so on. I want to generate 10 different files from my... (6 Replies)
Discussion started by: okkadu
6 Replies
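A sketch for the fixed-width case (the input name and output naming scheme are assumed):

  # Route each record by its first two characters, e.g. records
  # starting "01" go to set_01.dat; at most 10 files stay open.
  awk '{ print > ("set_" substr($0, 1, 2) ".dat") }' flatfile.dat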

7. Shell Programming and Scripting

awk to split one field and print the last two fields within the split part.

Hello; I have a file consisting of 4 columns separated by tabs. The problem is the third field. Some of them are very long but can be split by the vertical bar "|". Also some of them do not contain the string "UniProt", but I can ignore that for the moment and sort the file afterwards. Here is... (5 Replies)
Discussion started by: yifangt
5 Replies
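A sketch of the field-splitting part, assuming tab-separated input (the file name and the columns kept are illustrative):

  # Break the third field on "|" and print its last two
  # components alongside the other columns.
  awk -F'\t' '{ n = split($3, a, "|")
                print $1, $2, a[n-1], a[n], $4 }' OFS='\t' file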

8. Shell Programming and Scripting

Split file based on file size in Korn script

I need to split a file if it is over 2GB in size (or any size), preferably splitting on line boundaries. I have figured out how to get the file size using awk, and I can split the file based on the number of lines (which I got with wc -l), but I can't figure out how to connect them together in the script. ... (6 Replies)
Discussion started by: ssemple2000
6 Replies
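One way to connect the two steps in ksh, using wc rather than awk for the size (the names and the 2GB threshold are illustrative):

  #!/bin/ksh
  # Split $1 into line-aligned pieces no larger than ~2GB.
  file=$1
  limit=$((2 * 1024 * 1024 * 1024))           # 2GB threshold
  size=$(wc -c < "$file")
  if (( size > limit )); then
      lines=$(wc -l < "$file")
      parts=$(( (size + limit - 1) / limit )) # pieces needed
      split -l $(( lines / parts + 1 )) "$file" "${file}.part."
  fi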

9. Shell Programming and Scripting

How to split file into multiple files using awk based on 1 field in the file?

Good day all, I need some help. Say that I have data like below, each field separated by a tab:
DATE       NAME  ADDRESS
15/7/2012  LX    a.b.c
15/7/2012  LX1   a.b.c
16/7/2012  AB    a.b.c
16/7/2012  AB2   a.b.c
15/7/2012  LX2   a.b.c... (2 Replies)
Discussion started by: alexyyw
2 Replies
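A sketch, assuming the file is tab-separated with a header row (the input name is made up). Note that "/" cannot appear in a file name, so the dates need mapping:

  # Skip the header, then write each row to a file named after
  # its DATE field, with "/" rewritten to "-" for the file name.
  awk -F'\t' 'NR > 1 { d = $1; gsub("/", "-", d)
                       print > (d ".txt") }' data.txt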

10. UNIX for Beginners Questions & Answers

sed awk: split a large file to unique file names

Dear Users, I would appreciate your help with splitting a large file (> 1 million lines) with sed or awk. Below is the text in the input file file.txt:
scaffold1 928 929 C/T +
scaffold1 942 943 G/C +
scaffold1 959 960 C/T +... (6 Replies)
Discussion started by: kapr0001
6 Replies
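A sketch keyed on the scaffold name in column 1; calling close() after every write keeps the open-file count low when millions of rows map to many distinct scaffolds:

  # ">>" (append) is needed here because each file is reopened
  # after close(); ">" would truncate it on every record.
  awk '{ out = $1 ".txt"; print >> out; close(out) }' file.txt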