Shell Programming and Scripting
splitting text file into smaller ones
Post 302303826 by prvnrk on 04-03-2009, 02:31 PM

Hello

We have a text file with 400,000 lines and need to split it into multiple files of 5,000 lines each (this will result in 80 files).

I had the idea of using the head and tail commands in a loop to do this, but that did not look efficient.

Please advise a simple yet effective way to do it.

TIA
Prvn
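
For a fixed number of lines per piece, the standard split(1) utility does this in one command and avoids the head/tail loop entirely. A minimal sketch, where the input file name and the part_ output prefix are only illustrative:

Code:
# split bigfile.txt into pieces of 5,000 lines each;
# with 400,000 input lines this produces 80 files named part_aa, part_ab, ...
split -l 5000 bigfile.txt part_

# quick check: each piece should have 5,000 lines (only the last may be shorter)
wc -l part_*

split -l is specified by POSIX, so it should behave the same on most Unix systems; with the default two-letter suffixes it can generate up to 676 pieces, which comfortably covers the 80 needed here.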
 
