how to split a huge file by every 100 lines
Post 302550766 by yazu on 27 August 2011 (Shell Programming and Scripting)
Code:
# Split INPUTFILE into 100-line chunks; split names them xaa, xab, xac, ...
split -l 100 INPUTFILE

# Sandwich each chunk between the header and footer fragments,
# producing q1.xml, q2.xml, ... and removing the chunk afterwards
c=0
for f in x??; do
  ((c++))                     # bash/ksh arithmetic; use c=$((c+1)) in plain sh
  cat head.txt "$f" tail.txt > "q$c.xml"
  rm "$f"
done
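One caveat worth adding: split's default two-letter suffixes run from xaa to xzz, so the x?? glob covers at most 676 chunks. For bigger inputs, raise the suffix length (e.g. split -a 3 -l 100 INPUTFILE) and loop over x??? instead.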

===

Hmm... The same solution.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Split a huge line into multiple 120-character lines with sed?

Hello, I'm trying to split a file which contains a single very long line. My aim is to split this single line every 120 characters. I tried with the sed command: `cat ${MYPATH}/${FILE}|sed -e :a -e 's/^.\{1,120\}$/&\n/;ta' >{MYPATH}/${DEST}` but when I wc -l the destination file it is... (2 Replies)
Discussion started by: jerome_1664
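As an aside to that thread: for fixed-width wrapping, fold(1) avoids the sed loop entirely. A minimal sketch, assuming the single long line lives in a hypothetical longline.txt:

Code:
# Break the input into lines of at most 120 characters
fold -w 120 longline.txt > wrapped.txt

# Sanity check: line-length distribution of the result
awk '{ len[length($0)]++ } END { for (l in len) print l, len[l] }' wrapped.txt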

2. Shell Programming and Scripting

Delete lines from huge file

I have to delete the first 7000 lines of a file which is 12 GB large. As it is so large, I can't open it in vi and delete those lines. I found one post here which gave a solution using perl, but I don't have perl installed. Other solutions redirected the output to a different file and renamed it.... (3 Replies)
Discussion started by: rahulrathod
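For reference, a hedged sketch of the two usual approaches (the filename is hypothetical; note that sed -i is GNU-specific and still writes a temporary copy behind the scenes):

Code:
# Stream everything from line 7001 onward into a new file, then swap it in
tail -n +7001 bigfile > bigfile.new && mv bigfile.new bigfile

# GNU sed alternative: "in-place" deletion of lines 1-7000
sed -i '1,7000d' bigfile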

3. UNIX for Dummies Questions & Answers

Copy and paste many lines of a huge file in Linux

Dear All, I am working on a Windows OS but connect remotely to a Linux machine. I wonder how to copy and paste some part of a huge file on the Linux machine. The content of the file looks as follows: ... dump annealling all custom 10 anneal_*.dat id type x y z q timestep 0.02 run 200000 Memory... (2 Replies)
Discussion started by: ariesto
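A minimal sketch for pulling a line range out of a huge file without loading it into an editor; the range 100000-100200 is purely illustrative:

Code:
# Print only lines 100000 through 100200, quitting early so the
# rest of the huge file is never read
sed -n '100000,100200p; 100200q' bigfile > excerpt.txt

# awk equivalent
awk 'NR >= 100000 && NR <= 100200; NR > 100200 { exit }' bigfile > excerpt.txt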

4. Shell Programming and Scripting

NAWK array to store lines from huge file

Hi, I would like clarification about using a NAWK array to store multiple lines from a huge file. The file has a unique REF.NO, and I want to store the lines (possibly 100+ of them) until I find the next REF.NO. How can I apply NAWK arrays to the above? Rgds, sharif. (1 Reply)
Discussion started by: sharif
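One possible shape for this, hedged: it assumes the REF.NO is the first field and that lines for one reference number are contiguous, so the array only ever holds a single group rather than the whole file:

Code:
nawk '
$1 != prev {                        # reference number changed: flush the buffered group
    if (n) {
        for (i = 1; i <= n; i++) print buf[i] > ("group_" prev ".txt")
        close("group_" prev ".txt")
    }
    n = 0
    prev = $1
}
{ buf[++n] = $0 }                   # buffer the lines of the current REF.NO
END {                               # flush the final group
    for (i = 1; i <= n; i++) print buf[i] > ("group_" prev ".txt")
}' input.txt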

5. Shell Programming and Scripting

Help: counting delimiters in a huge file and splitting data into 2 files

I'm new to Linux scripting and not sure how to filter out bad records from huge flat files (over 1.3 GB each). The delimiter is a semicolon ";". Here is a sample of lines from the file: Name1;phone1;address1;city1;state1;zipcode1 Name2;phone2;address2;city2;state2;zipcode2;comment... (7 Replies)
Discussion started by: lv99
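A hedged sketch, assuming a valid record has exactly six semicolon-separated fields (the good.txt / bad.txt names are illustrative):

Code:
# Route records by field count: 6 fields = good, anything else = bad
awk -F';' 'NF == 6 { print > "good.txt"; next }
                   { print > "bad.txt" }' input.txt

# Diagnostic: tally the field counts that actually occur in the file
awk -F';' '{ n[NF]++ } END { for (f in n) print f " fields: " n[f] " lines" }' input.txt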

6. UNIX for Advanced & Expert Users

In a huge file, Delete duplicate lines leaving unique lines

Hi All, I have a very huge file (4 GB) which has duplicate lines. I want to delete the duplicate lines, leaving only unique lines. Sort, uniq, and awk '!x++' are not working as they run out of buffer space. I don't know if this works: I want to read each line of the file in a for loop, and want to... (16 Replies)
Discussion started by: krishnix
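When awk '!x++' exhausts memory, sort(1) is the usual fallback: GNU sort spills to temporary files and merges, so a 4 GB input is no problem. A sketch; note that sort -u reorders the lines, and that -s (stable sort) is a GNU/BSD extension:

Code:
# Deduplicate via external merge sort; -T points temp files at a roomy filesystem
sort -u -T /var/tmp file > unique.txt

# If the original line order must be kept: number the lines, keep the first
# occurrence of each distinct line, then restore the numbering order
nl -ba file | sort -s -u -k2 | sort -n | cut -f2- > unique_in_order.txt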

7. UNIX for Dummies Questions & Answers

How to split a huge file into small pieces (per 2000 columns)?

Dear all, I have a big file: 2879 (rows) x 400,170 (columns). I'd like to split the file into small pieces: 2879 (rows) x 2000 (columns) per file (the last small piece will be 2879 x 170). So far, I only know how to create one small piece at a time, but I need to repeat this work... (6 Replies)
Discussion started by: forevertl
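A hedged sketch using cut(1), assuming the columns are tab-separated (cut's default; add -d',' for CSV). The 400,170 and 2000 figures come from the question:

Code:
#!/bin/sh
total=400170          # total number of columns
per=2000              # columns per output file
start=1
part=1
while [ "$start" -le "$total" ]; do
    end=$((start + per - 1))
    cut -f"${start}-${end}" input.txt > "part_${part}.txt"
    start=$((end + 1))
    part=$((part + 1))
done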

8. UNIX for Dummies Questions & Answers

Split a huge 7 GB File Based on Pattern into 4 files

Hi, I have a huge 7 GB file which has around 1 million records; I want to split this file into 4 files containing around 250k messages each. Please help me, as the split command cannot work here because it might cut across tags. The format of the file is as below: <!--###### ###### START-->... (6 Replies)
Discussion started by: KishM
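A hedged awk sketch: count the START markers and route every line to one of four parts, so no message is ever cut mid-tag. The 250k-per-part figure is from the question; the exact marker regex is an assumption based on the excerpt:

Code:
awk '
/<!--.*START-->/ { msg++ }              # a new message begins at each START marker
{
    part = int((msg - 1) / 250000) + 1  # 250k messages per output file
    print > ("part" part ".xml")
}' huge.xml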

9. UNIX for Advanced & Expert Users

How to split a large file with the first 100 lines of each condition?

I have a huge file with the following input: Case1 Specific_Info Specific_Info Case1 Specific_Info Specific_Info Case3 Specific_Info Specific_Info Case4 Specific_Info Specific_Info Case1 Specific_Info Specific_Info Case2 Specific_Info Specific_Info Case2 Specific_Info Specific_Info... (2 Replies)
Discussion started by: laurigo
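A compact awk sketch, assuming the condition label (Case1, Case2, ...) is the first field on each line:

Code:
# Per-condition counter: print a line only while its condition
# has been seen fewer than 100 times
awk 'count[$1]++ < 100' input.txt > first100_per_case.txt

# Variant: send the first 100 lines of each condition to its own file
awk 'count[$1]++ < 100 { print > ($1 ".txt") }' input.txt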

10. Solaris

Split huge File System

Gents, I have a huge NAS file system mounted as /sys, 10 TB in size, and I want to split it into separate 1 TB file systems to be mounted on the server. How can I do that without changing anything in the source? Please advise. (1 Reply)
Discussion started by: AbuAliiiiiiiiii