Split large file into smaller files without disturbing the entry chunks
Post 303016510 by Kamesh G on Saturday 28th of April 2018 12:59:37 AM
@vgersh99, it worked, thank you for the help.

The problem is that the file-split part is okay, but the contents are not as expected. Can you help with a command that creates output like the sample_file1 and sample_file2 examples?

Right now my files are getting split, but the expected result is that each file should begin at the start of a chunk, e.g. a line starting with dn: should be the first line of every smaller file.
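In case it helps anyone who lands on this thread, here is a minimal awk sketch that starts a new output file at every line beginning with dn:, so an entry is never cut in half. The input name big.ldif, the chunk_ prefix and the 1000-entries-per-file limit are assumptions for illustration, not values from this thread.

    awk '/^dn:/ {
             if (++entries % 1000 == 1) {      # open a new chunk every 1000 entries
                 if (out) close(out)
                 out = sprintf("chunk_%04d.ldif", ++filenum)
             }
         }
         { print > out }                       # assumes the file starts with a dn: line
    ' big.ldif

Because output files only switch on a dn: boundary, every chunk_*.ldif begins with a dn: line, which matches the requirement above.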

 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

splitting the large file into smaller files

Hi all, I'm new to this forum, excuse me if anything is wrong. I have a file containing 600 MB of data. When I parse the data in a Perl program I get an out-of-memory error, so I am planning to split the file into smaller files and process them one by one. Can anyone tell me what the code... (1 Reply)
Discussion started by: vsnreddy
1 Replies
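For a plain split-by-line-count requirement like this one, the stock split utility is usually enough; a small sketch (the 10000-line count, the input name bigfile.txt and the part_ prefix are assumptions):

    split -l 10000 bigfile.txt part_
    # -> part_aa, part_ab, ... each with at most 10000 lines,
    #    small enough to be parsed one at a time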

2. UNIX for Dummies Questions & Answers

multiple smaller files from one large file

I have a file with a simple list of ids, 750,000 rows. I have to break it down into multiple 50,000-row files to submit in a batch process. Is there an easy script I could write to accomplish this task? (2 Replies)
Discussion started by: rtroscianecki
2 Replies
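A hedged sketch for the 50,000-row batch case above, using GNU split's numeric suffixes so the pieces sort naturally (the input name ids.txt and the batch_ prefix are assumptions):

    split -l 50000 -d ids.txt batch_
    # 750,000 rows -> batch_00 ... batch_14, 50,000 rows each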

3. Shell Programming and Scripting

Split large file into smaller file

Hi guys, I need some help here. I have a file which has > 800,000 lines in it. I need to split this file into smaller files with 25,000 lines each. Please help, thanks. (1 Reply)
Discussion started by: sitaldip
1 Replies
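An awk alternative for the 25,000-line case, in case the local split lacks the desired options; the input name bigfile and the piece_ prefix are assumptions:

    awk 'NR % 25000 == 1 { if (out) close(out); out = sprintf("piece_%d", ++n) }
         { print > out }' bigfile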

4. Shell Programming and Scripting

How to split a file into smaller files

Hi, I have a big text file with m columns and n rows. The format is like: STF123450001000200030004STF123450005000600070008STF123450009001000110012 STF234560345002208330154STF234590705620600070080STF234567804094562357688 STF356780001000200030004STF356780005000600070080STF356780800094562657687... (2 Replies)
Discussion started by: wintersnow2011
2 Replies
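The records above look like fixed 24-character chunks beginning with STF, several per physical line. If that reading is right, one hedged approach is to put one record per line first and then split by count (the 24-character width, the name records.txt, the 50000 count and the stf_ prefix are all assumptions):

    fold -w 24 records.txt > one_per_line.txt   # wrap at every 24 characters
    split -l 50000 one_per_line.txt stf_        # then split by record count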

5. Shell Programming and Scripting

Help needed - Split large file into smaller files based on pattern match

Help needed urgently please. I have a large file - a few hundred thousand lines. Sample CP START ACCOUNT 1234556 name 1 CP END ACCOUNT CP START ACCOUNT 2224444 name 1 CP END ACCOUNT CP START ACCOUNT 333344444 name 1 CP END ACCOUNT I need to split this file each time "CP START... (7 Replies)
Discussion started by: frustrated1
7 Replies
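For a marker-based split like the "CP START ACCOUNT" case, GNU csplit handles it directly; a sketch in which accounts.txt and the acct_ prefix are assumptions (-z and '{*}' are GNU extensions):

    csplit -z -f acct_ accounts.txt '/^CP START ACCOUNT/' '{*}'
    # acct_00, acct_01, ... each piece begins with a CP START ACCOUNT line;
    # -z drops the empty piece before the first match, {*} repeats to end of file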

6. Shell Programming and Scripting

Split a large array into small chunks

Hi, I need to split a large array "@sharedArray" into 10 small arrays. The arrays should be like @sharedArray1, @sharedArray2, @sharedArray3 and so on. Can anyone help me with the logic to do so? (6 Replies)
Discussion started by: rkrish
6 Replies

7. Shell Programming and Scripting

Sed: Splitting A large File into smaller files based on recursive Regular Expression match

I will simplify the explanation a bit. I need to parse through an 87M file - I have a single text file in the form of: <NAME>house........ SOMETEXT SOMETEXT SOMETEXT . . . . </script> MORETEXT MORETEXT . . . (6 Replies)
Discussion started by: sumguy
6 Replies
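For the regular-expression case above, a rough awk sketch that opens a new output file at every line beginning with <NAME> (input.txt and the block_ prefix are assumptions, not details from that thread):

    awk '/^<NAME>/ { if (out) close(out); out = sprintf("block_%d.txt", ++n) }
         out       { print > out }' input.txt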

8. UNIX for Dummies Questions & Answers

Split large file to smaller fastly

Hi, I have a requirement. Input file: 1 1111111111111 108 1 1111111111111 109 1 1111111111111 109 1 1111111111111 110 1 1111111111111 111 1 1111111111111 111 1 1111111111111 111 1 1111111111111 112 1 1111111111111 112 1 1111111111111 112 The output should be, (19 Replies)
Discussion started by: mechvijays
19 Replies
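The expected output is cut off in the preview above, so this is only a guess: if the goal is one output file per distinct value of the third column, a hedged awk sketch (input.txt and the key_ prefix are assumptions):

    awk '{ print > ("key_" $3 ".txt") }' input.txt
    # key_108.txt, key_109.txt, ... one file per distinct third field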

9. UNIX for Dummies Questions & Answers

Split files into smaller ones with 1000 hierarchies in a single file.

input file: AD,00,--,---,---,---,---,---,---,--,--,--- AM,000,---,---,---,---,---,--- AR, ,---,--,---,--- AA,---,---,---,--- AT,--- AU,---,---,--- AS,---,--- AP,---,---,--- AI,--- AD,00,---,---,---, ,---,---,---,---,---,--- AM,000,---,---,--- AR,... (6 Replies)
Discussion started by: kcdg859
6 Replies
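Reading the sample above as one hierarchy per AD record, a hedged awk sketch that starts a new output file every 1000 hierarchies (input.csv and the part_ prefix are assumptions):

    awk '/^AD,/ && h++ % 1000 == 0 { if (out) close(out); out = sprintf("part_%d", ++f) }
         { print > out }' input.csv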

10. Shell Programming and Scripting

Modification of Perl script to split a large file into chunks of 5000 characters

I have a Perl script which splits a large file into chunks. The script is given below: use strict; use warnings; open (FH, "<monolingual.txt") or die "Could not open source file. $!"; my $i = 0; while (1) { my $chunk; print "process part $i\n"; open(OUT, ">part$i.log") or die "Could... (4 Replies)
Discussion started by: gimley
4 Replies
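If the point is simply fixed-size chunks and the text is in a single-byte encoding, the same thing can be done without touching the Perl script; monolingual.txt comes from the post, the part_ prefix is an assumption:

    split -b 5000 monolingual.txt part_    # 5000-byte pieces (may cut mid-line)
    # GNU split -C 5000 keeps lines whole while staying under 5000 bytes per piece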
TRACE-CMD-SPLIT(1)														TRACE-CMD-SPLIT(1)

NAME
       trace-cmd-split - split a trace.dat file into smaller files

SYNOPSIS
       trace-cmd split [OPTIONS] [start-time [end-time]]

DESCRIPTION
       trace-cmd(1) split is used to break up a trace.dat file into smaller
       files. The start-time specifies where the new file will start. A time
       stamp copied from trace-cmd-report(1) output for a particular event can
       be used as either start-time or end-time. The split will stop creating
       files when it reaches an event after end-time. If only the end-time is
       needed, use 0.0 as the start-time. If start-time is left out, the split
       starts at the beginning of the file. If end-time is left out, the split
       continues to the end unless it meets one of the requirements specified
       by the options.
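For context, a typical invocation of the behaviour described above might look like the following; the time stamps are hypothetical values of the kind trace-cmd-report prints, not taken from this page:

    trace-cmd split 35250.123456 35300.654321
    # per the DESCRIPTION, this should write trace.dat.1 containing only the
    # events between those two time stamps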
OPTIONS
       -i file
           If this option is not specified, the split command looks for the
           file named trace.dat. This option allows reading a file other than
           trace.dat.

       -o file
           By default, the split command uses the input file name as the base
           name for the files it writes; the output files are the input file
           name with '.#' appended: trace.dat.1, trace.dat.2, etc. This option
           changes the base name: -o file will create file.1, file.2, etc.

       -s seconds
           How many seconds should be recorded before the new file stops.

       -m milliseconds
           How many milliseconds should be recorded before the new file stops.

       -u microseconds
           How many microseconds should be recorded before the new file stops.

       -e events
           How many events should be recorded before the new file stops.

       -p pages
           The number of pages that should be recorded before the new file
           stops.

       Note: only one of -p, -e, -u, -m, -s may be specified at a time. If -p
       is specified, then -c is automatically set.

       -r
           This option causes the break-up to repeat until end-time is reached
           (or the end of the input if end-time is not specified).

               trace-cmd split -r -e 10000

           This will break up trace.dat into several smaller files, each with
           at most 10,000 events in it.

       -c
           This option causes the above break-up to be per CPU.

               trace-cmd split -c -p 10

           This will create a file that has 10 pages per CPU from the input.
SEE ALSO
       trace-cmd(1), trace-cmd-record(1), trace-cmd-report(1),
       trace-cmd-start(1), trace-cmd-stop(1), trace-cmd-extract(1),
       trace-cmd-reset(1), trace-cmd-list(1), trace-cmd-listen(1)

AUTHOR
       Written by Steven Rostedt, <rostedt@goodmis.org[1]>

RESOURCES
       git://git.kernel.org/pub/scm/linux/kernel/git/rostedt/trace-cmd.git

COPYING
       Copyright (C) 2010 Red Hat, Inc. Free use of this software is granted
       under the terms of the GNU Public License (GPL).

NOTES
       1. rostedt@goodmis.org
          mailto:rostedt@goodmis.org

06/11/2014                                                  TRACE-CMD-SPLIT(1)