Split large file into smaller files without disturbing the entry chunks


 
# 1  
Old 04-27-2018

Dears,

I need your help with the file manipulation below. I want to split the file into 8 smaller files without cutting/disturbing the entries (meaning every small file should start with an entry and end with an empty line). It would be helpful if you could provide a one-liner command for this and also help me understand the command you give. I'm not very good at this.

Input file (with more than 10 million such chunks, each separated by an empty line):
Code:
dn: MSISDN=asdasdasdasd,dc=msisdn,ou=identities,dc=ppp
objectClass: alias
objectClass: MSISDN
MSISDN: asdasdasdasd
aliasedObjectName: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp

dn: IMSI=uuuuuuuuuu,dc=imsi,ou=identities,dc=ppp
objectClass: alias
objectClass: IMSI
IMSI: uuuuuuuuuuu
aliasedObjectName: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp

dn: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
mscId: asdasdasdasdaaaaaaa4
DSUnitGroup: 1
objectClass: CUDBMultiServiceConsumer

dn: serv=AAA,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
objectClass: CUDBService
serv: AAA

dn: serv=Auth,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
objectClass: CUDBService
serv: Auth

dn: IMSI=426019001711473,serv=Auth,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
AMFVALUE: 15
EKI::
objectClass: 
BNS: 15
GAPSIGN: 15
MIGRATIONEXPDATE:: 
SQNPS: 
KIND: 
MIGRATIONSTEP: 
FSETIND: 
AKAALGIND: 
GAP:: 
CDC: 0
SQN: 15
IMSI: 
A3A8IND: 0
SQNCS: 15
AKATYPE: 0
VNUMBER: 0
SQNIMS: 15
A4IND: 0


small_file1:
Code:
dn: MSISDN=asdasdasdasd,dc=msisdn,ou=identities,dc=ppp
objectClass: alias
objectClass: MSISDN
MSISDN: asdasdasdasd
aliasedObjectName: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp

dn: IMSI=uuuuuuuuuu,dc=imsi,ou=identities,dc=ppp
objectClass: alias
objectClass: IMSI
IMSI: uuuuuuuuuuu
aliasedObjectName: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp

dn: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
mscId: asdasdasdasdaaaaaaa4
DSUnitGroup: 1
objectClass: CUDBMultiServiceConsumer

dn: serv=AAA,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
objectClass: CUDBService
serv: AAA

small_file2:

Code:
dn: serv=Auth,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
objectClass: CUDBService
serv: Auth

dn: IMSI=426019001711473,serv=Auth,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
AMFVALUE: 15
EKI::
objectClass: 
BNS: 15
GAPSIGN: 15
MIGRATIONEXPDATE:: 
SQNPS: 
KIND: 
MIGRATIONSTEP: 
FSETIND: 
AKAALGIND: 
GAP:: 
CDC: 0
SQN: 15
IMSI: 
A3A8IND: 0
SQNCS: 15
AKATYPE: 0
VNUMBER: 0
SQNIMS: 15
A4IND: 0

Thank you in advance.
Kamesh G

Last edited by RudiC; 04-28-2018 at 02:45 AM.. Reason: Added CODE tags for file data
# 2  
Old 04-27-2018
Code:
awk -v c=8 '{if(f) close(f);f=(FILENAME "_" FNR%c);print $0 ORS >>f}' RS= myFile
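Since the request in post #1 also asked for an explanation of the command, here is the same one-liner spread out with comments. This is just my reading of it; myFile stands for the input file name, as above.

Code:
# Setting RS to the empty string ("paragraph mode") makes awk treat each
# blank-line-separated block as one record.
awk -v c=8 '
{
    if (f) close(f)               # close the previously used output file to limit open descriptors
    f = FILENAME "_" FNR % c      # record number modulo c picks one of myFile_0 .. myFile_7
    print $0 ORS >> f             # append the record plus the blank separator line
}' RS= myFile

Note that this distributes records round-robin, so every output file starts with a dn: record, but consecutive records from the input end up in different files.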


Last edited by vgersh99; 04-27-2018 at 11:28 AM..
# 3  
Old 04-28-2018
@vgersh99, it worked, thank you for the help.

The file split part is okay, but the contents are not as expected. Can you help with a command that creates output like the small_file1 and small_file2 examples?

Right now my files are getting split, but the expected result is that each file should begin at the start of a chunk, e.g. a line starting with dn: should be the first line of each smaller file.

Last edited by Kamesh G; 04-28-2018 at 02:07 AM.. Reason: I posted wrongly.
# 4  
Old 04-28-2018
You now see how important it is to exercise due care when
- phrasing your request
- formatting your post

vgersh99's proposal fulfills your request in post #1 (given that the split into file1 and file2 was hidden somewhere in your text and overlooked by the mod who inserted the necessary code tags for you), and it even satisfies your post #3, as every file produced starts with dn:. So how do you define the optimal position of a file break?

Would this paraphrase of the specification come close to what you need:
Split the file into n contiguous chunks on record boundaries while keeping the original order and togetherness of the records?
# 5  
Old 04-28-2018
@RudiC, sorry that my post was misleading. You are right: the input file should be split into 8 smaller files while maintaining the integrity of each chunk of lines in the input file.

-> Each of the 8 smaller output files should start with dn: and end with an empty line.
-> The optimal position of a file break should be the empty line that separates the input file chunks. The expectation is to have the input file split into 8 smaller files without disturbing the order/integrity of the chunks.
# 6  
Old 04-28-2018
Quote:
Originally Posted by Kamesh G
@RudiC, sorry that my post was misleading. You are right: the input file should be split into 8 smaller files while maintaining the integrity of each chunk of lines in the input file.

-> Each of the 8 smaller output files should start with dn: and end with an empty line.
-> The optimal position of a file break should be the empty line that separates the input file chunks. The expectation is to have the input file split into 8 smaller files without disturbing the order/integrity of the chunks.
Hmmm... I guess we'd need a descriptive definition of how the file should be split.
Each record/block starts with dn:
What logic did you use to split your sample file into the 2 files you provided?
What made you decide to start outputting chunks into small_file2?
# 7  
Old 04-28-2018
The sample I posted above was just an example I did manually in a text editor. I'm at a very amateur level; I was trying to split the file based on the number of lines with the split command, but that doesn't produce the expected output of each small file starting with dn: and ending with an empty line. I was hoping the forum could help.
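For completeness, one possible way to get the contiguous split described in post #5 is a two-pass approach: count the records first, then write them out in their original order, starting a new output file every ceil(total/8) records. This is an untested sketch, not a solution given in the thread; myFile is the input file, and the output names myFile_1 .. myFile_8 are an arbitrary choice.

Code:
# Pass 1: count the blank-line-separated records (paragraph mode via RS="").
n=$(awk 'END { print NR }' RS= myFile)

# Pass 2: write records in their original order, moving to the next output
# file after every ceil(n/8) records, so each file starts with a dn: record
# and ends with an empty line.
awk -v n="$n" -v c=8 '
BEGIN { per = int((n + c - 1) / c) }     # records per output file, rounded up
{
    if ((NR - 1) % per == 0) {           # time to start the next chunk
        if (f) close(f)
        f = FILENAME "_" (++i)           # myFile_1 .. myFile_8
    }
    print $0 ORS >> f                    # record plus the blank separator line
}' RS= myFile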