Shell Programming and Scripting: Help with splitting a large text file into smaller ones. Post 302334381 by lord_butler on Wednesday 15th of July 2009 11:39:38 AM
Wow, thanks for a VERY quick response. It worked perfectly the first time, although I needed to use gawk. I was almost certain that the solution lay with awk, but I am surprised at how elegant and concise the code is.

Thanks again

Joe
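
The awk one-liner referred to above is not quoted in this archived post. A minimal sketch of the kind of gawk chunking command typically used, assuming fixed 5,000-line chunks and placeholder names bigfile.txt / chunk_N:

  gawk 'NR % 5000 == 1 { if (out) close(out); i++; out = "chunk_" i }
        { print > out }' bigfile.txt

Closing each finished chunk keeps only one output file open at a time, which matters when the input produces many pieces.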
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Cutting a large log file in to smaller ones

I have a very large (150 megs) IRC log file from 2000-2001 which I want to cut down to individual daily log files. I have a very basic knowledge of the cat, sed and grep commands. The log file is time stamped and each day in the large log file begins with a "Session Start" string like so: ... (11 Replies)
Discussion started by: MrTangent
11 Replies
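
(The linked thread's replies are not reproduced here; a minimal sketch of one common approach, assuming each day begins with a line containing "Session Start" and using placeholder names irclog.txt / day_NNN.log. Lines before the first marker are skipped.)

  awk '/Session Start/ { if (out) close(out); out = sprintf("day_%03d.log", ++n) }
       out { print > out }' irclog.txt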

2. Shell Programming and Scripting

Splitting a Larger File Into Multiple Smaller Ones.

Hello. I am in urgent need of help with the below. I have a data file with 40,567 lines and need to split it into multiple files with a smaller line count. I am aware of the "split" command with the -l option, which allows you to specify the number of lines in the smaller files, with the target file-name pattern... (1 Reply)
Discussion started by: madhubt_1982
1 Replies
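
(A minimal sketch of combining the line count with a file-name prefix; the -d numeric-suffix option is GNU coreutils, and datafile / part_ are placeholders.)

  split -l 5000 -d datafile part_    # produces part_00, part_01, ...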

3. UNIX for Dummies Questions & Answers

splitting the large file into smaller files

Hi all, I'm new to this forum; excuse me if anything is wrong. I have a file containing 600 MB of data. When I parse the data in a Perl program I get an out-of-memory error, so I am planning to split the file into smaller files and process them one by one. Can anyone tell me what the code... (1 Reply)
Discussion started by: vsnreddy
1 Replies
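
(Reading the file line by line in Perl would avoid loading 600 MB at once; if splitting first is preferred, a minimal sketch with placeholder names bigdata.txt, piece_ and process_one.pl.)

  split -l 500000 bigdata.txt piece_    # split on line boundaries
  for f in piece_*; do
      perl process_one.pl "$f"          # process each piece separately
  done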

4. Shell Programming and Scripting

splitting text file into smaller ones

Hello. We have a text file with 400,000 lines and need to split it into multiple files of 5,000 lines each (which will result in 80 files). I had the idea of using head and tail commands in a loop, but that did not look efficient. Please advise a simple yet effective way to do it. TIA... (3 Replies)
Discussion started by: prvnrk
3 Replies
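
(A head/tail loop re-reads the file on every iteration; split does it in one pass. A minimal sketch, by lines per file or by number of files; the -n l/80 form is GNU coreutils, and input.txt / part_ are placeholders.)

  split -l 5000 input.txt part_     # 80 files of 5,000 lines each
  # or, equivalently, ask for exactly 80 pieces (GNU coreutils):
  split -n l/80 input.txt part_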

5. UNIX for Dummies Questions & Answers

multiple smaller files from one large file

I have a file with a simple list of IDs: 750,000 rows. I have to break it down into multiple 50,000-row files to submit in a batch process. Is there an easy script I could write to accomplish this task? (2 Replies)
Discussion started by: rtroscianecki
2 Replies
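
(A minimal sketch, assuming each 50,000-row piece is submitted separately; ids.txt, batch_ and submit_batch are placeholders for the real file and submission step.)

  split -l 50000 ids.txt batch_     # 15 files of 50,000 rows each
  for f in batch_*; do
      submit_batch "$f"             # placeholder for the actual batch submission
  done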

6. Shell Programming and Scripting

splitting a large text file into paragraphs

Hello all, newbie here. I've searched the forum and found many "how to split a text file" topics, but none that are what I'm looking for. I have a large text file, ~15 MB in size. It contains a variable number of "paragraphs" (for lack of a better word), each of variable length. A... (3 Replies)
Discussion started by: lupin..the..3rd
3 Replies
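
(The post is truncated above, so the real paragraph delimiter is unknown; assuming paragraphs are separated by blank lines, awk's paragraph mode handles variable-length blocks. The para_NNNNN.txt names are placeholders.)

  awk 'BEGIN { RS = ""; ORS = "\n\n" }
       { out = sprintf("para_%05d.txt", NR); print > out; close(out) }' input.txt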

7. Shell Programming and Scripting

Splitting a file into several smaller files using perl

Hi, I'm trying to split a large file into several smaller files. The script will take two input arguments: argument1 = filename and argument2 = number of files to split into. In my large input file I have a header followed by 100009 records. The first line is a header; I want this header in all my... (9 Replies)
Discussion started by: ramky79
9 Replies
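
(The thread asked for Perl with a file-count argument; as an awk alternative rather than the poster's script, a minimal sketch that repeats the header at the top of every chunk, parameterised by lines per chunk instead of number of files. input.txt and part_N.txt are placeholders.)

  awk -v lines=25000 '
      NR == 1 { header = $0; next }
      (NR - 2) % lines == 0 { if (out) close(out); i++; out = "part_" i ".txt"; print header > out }
      { print > out }' input.txt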

8. Shell Programming and Scripting

Sed: Splitting A large File into smaller files based on recursive Regular Expression match

I will simplify the explanation a bit: I need to parse through an 87M file. I have a single text file in the form of: <NAME>house........ SOMETEXT SOMETEXT SOMETEXT . . . . </script> MORETEXT MORETEXT . . . (6 Replies)
Discussion started by: sumguy
6 Replies
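
(csplit is often simpler than sed here; a minimal sketch assuming every record begins at a line containing <NAME>. The '{*}' repeat count is GNU csplit, and bigfile.txt / piece_ are placeholders.)

  csplit -z -f piece_ bigfile.txt '/<NAME>/' '{*}'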

9. UNIX for Dummies Questions & Answers

Split a large file into smaller files quickly

Hi, I have a requirement. Input file: 1 1111111111111 108 1 1111111111111 109 1 1111111111111 109 1 1111111111111 110 1 1111111111111 111 1 1111111111111 111 1 1111111111111 111 1 1111111111111 112 1 1111111111111 112 1 1111111111111 112 The output should be... (19 Replies)
Discussion started by: mechvijays
19 Replies
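
(The desired output is cut off in the preview above, so this is only a guess: if the goal is one output file per distinct value of the third column, a minimal sketch would be the following, with key_VALUE.txt as placeholder names.)

  awk '{ print > ("key_" $3 ".txt") }' input.txt    # fine for a modest number of distinct keys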

10. Shell Programming and Scripting

Splitting a text file into smaller files with awk, how to create a different name for each new file

Hello, I have some large text files that look like: putrescine Mrv1583 01041713302D 6 5 0 0 0 0 999 V2000 2.0928 -0.2063 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 5.6650 0.2063 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 3.5217 ... (3 Replies)
Discussion started by: LMHmedchem
3 Replies
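
(The sample looks like SD-file/molfile records; assuming each record ends with a "$$$$" line and its first line is the compound name to use for the file name, which is a guess since the post is truncated, a minimal sketch:)

  awk '
      name == "" { name = $1; out = name ".mol" }   # first line of a record names the output file
      { print > out }
      $0 ~ /^\$\$\$\$/ { close(out); name = "" }    # record terminator: reset for the next record
  ' input.sdf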