Top Forums > Shell Programming and Scripting > how to split a huge file by every 100 lines
Post 302550766 by yazu, Saturday 27 August 2011, 04:04:53 AM
Code:
# split INPUTFILE into 100-line chunks (default output names: xaa, xab, ...)
split -l 100 INPUTFILE

# wrap each chunk between head.txt and tail.txt to build q1.xml, q2.xml, ...
c=0
for f in x??; do
  ((c++))
  cat head.txt "$f" tail.txt > "q$c.xml"
  rm "$f"
done

===

Hmm... The same solution.
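For anyone who would rather avoid the intermediate chunk files, a single awk pass can do the same split-and-wrap in one go. This is only a sketch, assuming the same hypothetical head.txt/tail.txt wrapper files, q1.xml-style output names, and 100-line chunks:

Code:
awk -v head=head.txt -v tail=tail.txt '
  function copy(f,   line) { while ((getline line < f) > 0) print line > out; close(f) }
  (NR - 1) % 100 == 0 {
    if (out) { copy(tail); close(out) }      # finish the previous chunk
    out = sprintf("q%d.xml", ++n)
    copy(head)                               # start the next chunk with the header
  }
  { print > out }
  END { if (out) copy(tail) }                # finish the last chunk
' INPUTFILE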
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Split a huge line into multiple 120-character lines with sed?

Hello, I'm trying to split a file which contains a single very long line. My aim is to break this single line every 120 characters. I tried the sed command: `cat ${MYPATH}/${FILE}|sed -e :a -e 's/^.\{1,120\}$/&\n/;ta' > ${MYPATH}/${DEST}` but when I wc -l the destination file it is... (2 Replies)
Discussion started by: jerome_1664
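If the goal is simply fixed-width 120-character lines, fold does this directly; a minimal sketch, reusing the variable names from the question:

Code:
fold -w 120 "${MYPATH}/${FILE}" > "${MYPATH}/${DEST}"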

2. Shell Programming and Scripting

Delete lines from huge file

I have to delete the first 7000 lines of a file which is 12 GB large. As it is so large, I can't open it in vi and delete these lines. I found one post here which gave a solution using Perl, but I don't have Perl installed. Some solutions were also redirecting the output to a different file and renaming it.... (3 Replies)
Discussion started by: rahulrathod
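A minimal sketch of the usual non-Perl approach; "bigfile" is a hypothetical name, and a temporary copy is hard to avoid when trimming the front of a file:

Code:
# keep everything from line 7001 onwards, then replace the original
tail -n +7001 bigfile > bigfile.tmp && mv bigfile.tmp bigfile
# GNU sed can express the same as: sed -i '1,7000d' bigfile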

3. UNIX for Dummies Questions & Answers

Copy and paste certain lines of a huge file in Linux

Dear all, I am working on a Windows OS but log in remotely to a Linux machine. I am wondering how to copy and paste part of a huge file on the Linux machine. The content of the file looks like the following: ... dump annealling all custom 10 anneal_*.dat id type x y z q timestep 0.02 run 200000 Memory... (2 Replies)
Discussion started by: ariesto
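A sketch of pulling a line range out of a large file without opening it in an editor; the range 1000-2000 and the file names are hypothetical:

Code:
# the trailing '2000q' stops sed at the last wanted line, which matters for huge files
sed -n '1000,2000p;2000q' hugefile.dat > excerpt.dat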

4. Shell Programming and Scripting

NAWK array to store lines from huge file

Hi, I would like to clarify how to use NAWK arrays to store multiple lines from a huge file. The file has a unique REF.NO, and I want to store the lines (it may be 100+ lines) until I find the next REF.NO. How can I apply NAWK arrays to this? Rgds, sharif. (1 Reply)
Discussion started by: sharif
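A sketch of the buffering pattern, assuming (hypothetically) that each record starts with a line beginning with "REF.NO"; the process() function is a placeholder for whatever should happen to each buffered block:

Code:
nawk '
  /^REF\.NO/ {                   # a new reference number starts a new record
    if (n) process()             # handle the lines buffered for the previous record
    n = 0
  }
  { buf[++n] = $0 }              # buffer every line of the current record
  END { if (n) process() }       # handle the final record
  function process(   i) {
    for (i = 1; i <= n; i++) print buf[i]   # placeholder: just print the block
  }
' hugefile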

5. Shell Programming and Scripting

Help - counting delimiters in a huge file and splitting data into 2 files

I'm new to Linux scripting and not sure how to filter out bad records from huge flat files (over 1.3 GB each). The delimiter is a semicolon ";". Here is a sample of 5 lines in the file: Name1;phone1;address1;city1;state1;zipcode1 Name2;phone2;address2;city2;state2;zipcode2;comment... (7 Replies)
Discussion started by: lv99
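A sketch of separating good and bad records by field count, assuming a good record has exactly six semicolon-separated fields (the file and output names are hypothetical):

Code:
awk -F';' 'NF == 6 { print > "good.txt"; next } { print > "bad.txt" }' hugefile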

6. UNIX for Advanced & Expert Users

In a huge file, Delete duplicate lines leaving unique lines

Hi all, I have a very huge file (4 GB) which has duplicate lines. I want to delete the duplicate lines, leaving only unique lines. sort, uniq, and awk '!x++' are not working as they run out of buffer space. I don't know if this works: I want to read each line of the file in a for loop, and want to... (16 Replies)
Discussion started by: krishnix
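When the in-memory approaches fail, GNU sort can be pointed at a scratch directory with enough disk space; a minimal sketch (the path and buffer size are placeholders, and note the output is sorted, so the original line order is not preserved):

Code:
sort -u -T /path/with/space -S 512M hugefile > unique.txt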

7. UNIX for Dummies Questions & Answers

How to split a huge file into small pieces (per 2000 columns)?

Dear all, I have a big file: 2879 (rows) x 400,170 (columns), like below. I'd like to split the file into small pieces of 2879 (rows) x 2000 (columns) per file (the last small piece will be 2879 x 170). So far, I only know how to create one small piece at a time, but I need to repeat this work... (6 Replies)
Discussion started by: forevertl
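A sketch of a loop around cut, assuming tab-separated columns and the 400,170-column width mentioned above; it writes part_1.txt, part_2.txt, ... with 2000 columns each (file names are hypothetical):

Code:
total=400170
start=1
i=1
while [ "$start" -le "$total" ]; do
  end=$((start + 1999))
  [ "$end" -gt "$total" ] && end=$total
  cut -f"${start}-${end}" bigmatrix.txt > "part_${i}.txt"
  start=$((end + 1))
  i=$((i + 1))
done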

8. UNIX for Dummies Questions & Answers

Split a huge 7 GB File Based on Pattern into 4 files

Hi, I have a huge 7 GB file which has around 1 million records, and I want to split this file into 4 files containing around 250k messages each. Please help me, as the split command cannot work here because it might break the tags. The format of the file is as below: <!--###### ###### START-->... (6 Replies)
Discussion started by: KishM
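A sketch of splitting on the record-start marker rather than a raw line count, so no record is cut in half; it assumes (hypothetically) that every record begins with a line matching "START", as in the comment tag shown above, and that roughly 250,000 records go to each output file:

Code:
awk 'BEGIN { part = 1; out = "part_1.xml" }
     /START/ { if (++rec > 250000) { rec = 1; close(out); out = "part_" ++part ".xml" } }
     { print > out }' hugefile.xml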

9. UNIX for Advanced & Expert Users

How to split a large file with the first 100 lines of each condition?

I have a huge file with the following input: Case1 Specific_Info Specific_Info Case1 Specific_Info Specific_Info Case3 Specific_Info Specific_Info Case4 Specific_Info Specific_Info Case1 Specific_Info Specific_Info Case2 Specific_Info Specific_Info Case2 Specific_Info Specific_Info... (2 Replies)
Discussion started by: laurigo
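A sketch of keeping only the first 100 lines for each value in the first column; awk keeps one counter per Case value (file names are hypothetical):

Code:
awk '++seen[$1] <= 100' hugefile > first100.txt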

10. Solaris

Split huge File System

Gents, I have a huge NAS file system mounted as /sys with a size of 10 TB, and I want to split it into separate 1 TB file systems to be mounted on the server. How can I do that without changing anything in the source? Please support. (1 Reply)
Discussion started by: AbuAliiiiiiiiii