Split large file into smaller files without disturbing the entry chunks


# 1  
Old 04-27-2018

Dears,

Need your help with the file manipulation below. I want to split the file into 8 smaller files without cutting/disturbing the entries (meaning every small file should start with an entry and end with an empty line). It would be helpful if you could provide a one-liner command for this and also help me understand the command you give. I'm not very good at this.

Input_file (with more than 10 million such chunks, separated by empty lines):
Code:
dn: MSISDN=asdasdasdasd,dc=msisdn,ou=identities,dc=ppp
objectClass: alias
objectClass: MSISDN
MSISDN: asdasdasdasd
aliasedObjectName: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp

dn: IMSI=uuuuuuuuuu,dc=imsi,ou=identities,dc=ppp
objectClass: alias
objectClass: IMSI
IMSI: uuuuuuuuuuu
aliasedObjectName: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp

dn: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
mscId: asdasdasdasdaaaaaaa4
DSUnitGroup: 1
objectClass: CUDBMultiServiceConsumer

dn: serv=AAA,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
objectClass: CUDBService
serv: AAA

dn: serv=Auth,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
objectClass: CUDBService
serv: Auth

dn: IMSI=426019001711473,serv=Auth,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
AMFVALUE: 15
EKI::
objectClass: 
BNS: 15
GAPSIGN: 15
MIGRATIONEXPDATE:: 
SQNPS: 
KIND: 
MIGRATIONSTEP: 
FSETIND: 
AKAALGIND: 
GAP:: 
CDC: 0
SQN: 15
IMSI: 
A3A8IND: 0
SQNCS: 15
AKATYPE: 0
VNUMBER: 0
SQNIMS: 15
A4IND: 0


small_file1:
Code:
dn: MSISDN=asdasdasdasd,dc=msisdn,ou=identities,dc=ppp
objectClass: alias
objectClass: MSISDN
MSISDN: asdasdasdasd
aliasedObjectName: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp

dn: IMSI=uuuuuuuuuu,dc=imsi,ou=identities,dc=ppp
objectClass: alias
objectClass: IMSI
IMSI: uuuuuuuuuuu
aliasedObjectName: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp

dn: mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
mscId: asdasdasdasdaaaaaaa4
DSUnitGroup: 1
objectClass: CUDBMultiServiceConsumer

dn: serv=AAA,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
objectClass: CUDBService
serv: AAA

small_file2:

Code:
dn: serv=Auth,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
objectClass: CUDBService
serv: Auth

dn: IMSI=426019001711473,serv=Auth,mscId=asdasdasdasdaaaaaaa4,ou=multiSCs,dc=ppp
AMFVALUE: 15
EKI::
objectClass: 
BNS: 15
GAPSIGN: 15
MIGRATIONEXPDATE:: 
SQNPS: 
KIND: 
MIGRATIONSTEP: 
FSETIND: 
AKAALGIND: 
GAP:: 
CDC: 0
SQN: 15
IMSI: 
A3A8IND: 0
SQNCS: 15
AKATYPE: 0
VNUMBER: 0
SQNIMS: 15
A4IND: 0

Thank you in advance.
Kamesh G

# 2  
Old 04-27-2018
Code:
awk -v c=8 '{if(f) close(f);f=(FILENAME "_" FNR%c);print $0 ORS >>f}' RS= myFile
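
The same command spread over several lines, with comments describing one reading of it (this assumes an awk where an empty RS enables paragraph mode, i.e. any POSIX awk):

Code:
awk -v c=8 '                  # c = number of output files
{
    if (f) close(f)           # close the file used for the previous record
    f = FILENAME "_" FNR % c  # round-robin target: myFile_0 .. myFile_7
    print $0 ORS >> f         # append the record plus a blank-line separator
}' RS= myFile                 # RS= sets RS to "", so records are blank-line separated

With this invocation the output files are myFile_0 through myFile_7, and successive records are distributed across them in rotation.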


# 3  
Old 04-28-2018
@vgersh99, it worked, thank you for the help.

The problem is that the split itself is okay, but the contents are not as expected. Can you help with a command that creates output like the small_file1 and small_file2 examples?

Right now my files are getting split, but the expectation is that each file begins at the start of a chunk, i.e. a line starting with dn: should be the first line of each smaller file.

# 4  
Old 04-28-2018
You now see how important it is to exercise due care when
- phrasing your request
- formatting your post

vgersh99's proposal fulfills your request in post #1 (given that the split into file1 and file2 was hidden somewhere in your text and overlooked by the mod who inserted the necessary code tags for you), and it even does so for your post #3, as every file produced starts with dn:. How do you define the optimal position of a file break?

Would this paraphrase of the specification come close to what you need:
Split the file into n contiguous chunks on record boundaries while keeping the original order and togetherness of the records?
# 5  
Old 04-28-2018
@RudiC, sorry that my post was misleading. You are right: the input file should be split into 8 smaller files while maintaining the integrity of each chunk of lines in the input file.

-> Each of the 8 smaller output files should start with dn: and end with an empty line.
-> The optimal position of a file break is the empty line separating the chunks in the input file. The expectation is to have the input file split into 8 smaller files without disturbing the order/integrity of the chunks.
# 6  
Old 04-28-2018
Quote:
Originally Posted by Kamesh G
@RudiC, sorry that my post was misleading. You are right: the input file should be split into 8 smaller files while maintaining the integrity of each chunk of lines in the input file.

-> Each of the 8 smaller output files should start with dn: and end with an empty line.
-> The optimal position of a file break is the empty line separating the chunks in the input file. The expectation is to have the input file split into 8 smaller files without disturbing the order/integrity of the chunks.
Hmmm... I guess we'd need a descriptive definition of how the file should be split.
Each record/block starts with dn:
What logic did you use to split your sample file into the 2 files you provided?
What made you decide to start outputting chunks into small_file2?
# 7  
Old 04-28-2018
The sample I posted above was just an example I did manually in a text editor. I'm at a very amateur level; I was trying to split the file based on the number of lines with the split command, but that doesn't give the expected output of each small file starting with dn: and ending with an empty line. I was hoping the forum could help.
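
For reference, a minimal sketch of one way to produce the contiguous split spelled out in post #5 (untested; it assumes blank-line-separated records, a POSIX awk, and the illustrative names Input_file and small_file1 .. small_file8):

Code:
# Pass 1: count the blank-line-separated records.
total=$(awk 'BEGIN { RS = "" } END { print NR }' Input_file)

# Pass 2: write the records back out in their original order,
# ceil(total/8) consecutive records per output file.
awk -v total="$total" -v parts=8 '
BEGIN { RS = ""; ORS = "\n\n"; per = int((total + parts - 1) / parts) }
{
    out = "small_file" (int((NR - 1) / per) + 1)   # small_file1 .. small_file8
    print > out                                    # record plus a trailing blank line
}' Input_file

The first pass only counts records; the second writes them out in consecutive blocks, so each output file starts with a dn: line and ends with an empty line. With ceil(total/8) records per file the last file may be shorter, and for very small inputs fewer than 8 files may be produced.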
