Split large file into smaller files quickly


 
Top Forums UNIX for Dummies Questions & Answers Split large file into smaller files quickly
# 8  
Old 09-22-2014
Quote:
Originally Posted by mechvijays
The awk command is showing this error:

awk: too many output files 10
record number 11
Which command did that? You were actually given two commands and - without having tried it - as far as I can tell about awk, the one Akshay Hegde gave seems to be OK.

While you are at it: some information about your system (like: which awk you use or, alternatively, which OS you use so we can deduce which awk version is involved and which other utilities can be expected as available) would not exactly hurt your cause either.

Further, I'd like to know if this is homework. The file contents look either very simplified - and, given how you come across, I wonder whether you would be able to adapt a working solution to the original file contents - or they are as they are, but then they seem so meaningless that they can only be dummy contents, as in a homework/classwork example.
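For what it is worth: "too many output files 10" is what the classic Solaris /usr/bin/awk prints when more than ten redirected output files are open at once, and any solution that close()s a file as soon as its group is finished stays under that limit. A minimal sketch, assuming the data is sorted on the third column; the sample rows and the data1 filename are stand-ins:

```shell
# Tiny stand-in for the real data (tab-separated, key in column 3):
printf '1\t1111111111111\t108\n1\t1111111111111\t109\n1\t1111111111111\t109\n' > data1

# Close each output file as soon as its (contiguous) group ends, so at
# most one file is ever open -- safe even under the 10-file limit:
awk '
$3 != last { if (last != "") close(last ".txt"); last = $3 }
           { print > (last ".txt") }
' data1

wc -l 108.txt 109.txt
```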

bakunin
# 9  
Old 09-22-2014
Which OS are you using? If SunOS / Solaris, use nawk instead of awk.
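A sketch of the same one-liner with a newer awk. On Solaris 10, /usr/bin/awk is the very old awk with the ~10 open-file limit, while nawk and /usr/xpg4/bin/awk allow far more simultaneously open files. The command below uses plain awk so it runs anywhere, and the input is a made-up stand-in; on Solaris you would type nawk in its place:

```shell
# Stand-in input; like the poster's file, the key is in column 3:
printf 'x y 1\nx y 2\nx y 1\n' > infile

# On Solaris, replace "awk" with "nawk" or /usr/xpg4/bin/awk:
awk '{ print > ($3 ".txt") }' infile

wc -l 1.txt 2.txt
```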
# 10  
Old 09-22-2014
SunOS sasbsd27c1 5.10 Generic_150400-10 sun4u sparc SUNW,SPARC-Enterprise
# 11  
Old 09-22-2014
Hi.

Similar to others, and run on Solaris specifically:
Code:
#!/usr/bin/env bash

# @(#) s1       Demonstrate sifting, collecting lines to files, awk.

# Utility functions: print-as-echo, print-line-with-visual-space, debug.
# export PATH="/usr/local/bin:/usr/bin:/bin"
LC_ALL=C ; LANG=C ; export LC_ALL LANG
pe() { for _i;do printf "%s" "$_i";done; printf "\n"; }
pl() { pe;pe "-----" ;pe "$*"; }
db() { ( printf " db, ";for _i;do printf "%s" "$_i";done;printf "\n" ) >&2 ; }
db() { : ; }
C=$HOME/bin/context && [ -f $C ] && $C awk

FILE=${1-data1}

# Remove debris.
rm -f [0-9]*.txt

pl " Input data file $FILE:"
cat $FILE

pl " Results:"
awk '
BEGIN          { outfile = lastfile = "" }
NR == 1        { lastfile = $3; outfile = $3".txt" ; print > outfile ; next }
$3 == lastfile { print > outfile; next }
               { close (outfile) ; lastfile = $3 ; outfile = $3".txt" ; print > outfile }
' $FILE
wc -l [0-9]*.txt

sample=109.txt
pl " Sample output: file $sample:"
cat $sample

exit 0

producing:
Code:
$ ./s1

Environment: LC_ALL = C, LANG = C
(Versions displayed with local utility "version")
OS, ker|rel, machine: SunOS, 5.10, i86pc
Distribution        : Solaris 10 10/08 s10x_u6wos_07b X86
bash GNU bash 3.00.16
awk - ( local: /usr/xpg4/bin/awk, Oct 10 2007 )

-----
 Input data file data1:
1       1111111111111   108
1       1111111111111   109
1       1111111111111   109
1       1111111111111   110
1       1111111111111   111
1       1111111111111   111
1       1111111111111   111
1       1111111111111   112
1       1111111111111   112
1       1111111111111   112

-----
 Results:
       1 108.txt
       2 109.txt
       1 110.txt
       3 111.txt
       3 112.txt
      10 total

-----
 Sample output: file 109.txt:
1       1111111111111   109
1       1111111111111   109
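
A closing caveat: the script closes a file only when the value in column 3 changes, so it assumes equal keys sit on consecutive lines. If the real data is not grouped that way, sorting on the key first restores that property. A sketch (the data1 name follows the post; the sample rows here are invented):

```shell
# Stand-in for an unsorted data file (key in column 3):
printf '1 a 110\n1 a 108\n1 a 110\n' > data1

# Group equal keys together before feeding the split script:
sort -k3,3 data1 > data1.sorted

head -1 data1.sorted
```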

Best wishes ... cheers, drl
# 12  
Old 09-23-2014
Hi All,

Thank you very much for your response.

None of the scripts is working on my machine. I have attached the file I am using and the error I am getting.

The script is able to create only 10 files; after that it throws an error.

Please test with the file and post a new command.
# 13  
Old 09-23-2014
Did you try post #2?
# 14  
Old 09-23-2014
Yes Akshay, that also threw an error.
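
For the record, the symptom reported above - exactly 10 files created, then an error - matches the open-file limit of the old Solaris awk, and an append-and-close variant sidesteps it without requiring sorted input. A sketch (the input is a made-up stand-in; on Solaris, run it with nawk):

```shell
# Stand-in input with keys in column 3, deliberately not grouped:
printf 'a b 7\na b 8\na b 7\n' > infile

# ">>" plus close() keeps at most one file open at a time, so any number
# of output files can be created; clear leftovers first, since ">>" appends:
rm -f [0-9]*.txt
awk '{ f = $3 ".txt"; print >> f; close(f) }' infile

wc -l 7.txt 8.txt
```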
 