Help with Splitting a Large XML file based on size AND tags


 
# 1  
Old 07-02-2014

Hi All,

This is my first post here. I hope to share and gain knowledge on this great forum!

I've scanned this forum before posting my problem here, but I'm afraid I couldn't find any thread that addresses this exact problem.

I'm trying to split a large XML file (with multiple tag sets) into smaller files of equal size, in such a way that the split never falls between tags, i.e. each smaller file holds only complete tag sets. The size limit of the smaller files is specified in a parameter file. For example, if the size limit is 100 KB and the large file is 440 KB, I should get five smaller files of sizes 100 KB, 100 KB, 100 KB, 100 KB and 40 KB.

My initial approach was to create the large file with each complete tag set on a single line, and then use split with the size limit. However, the complete tag sets are not fitting on single lines, since the XML documents are themselves huge. So I am now thinking of splitting the large file on tag boundaries while staying within the size limit.

Below is what I tried to do so far

Code:
#!/bin/bash
export ORACLE_HOME=.........
export ORACLE_SID=...........
export PATH=........
. ./params       # contains the parameter sizelimit
FILE="datafile.txt"
sqlplus -s userid/password@DB <<EOF
SET HEADING OFF
SET PAGESIZE 0
SET LINESIZE 32000
SET LONG 32000
SET NEWPAGE NONE
SET FEEDBACK OFF
SET TRIMSPOOL ON
SET DEFINE ON     
SET VERIFY OFF
SET SERVEROUTPUT OFF
SPOOL $FILE
[....query to create the master file...]
SPOOL OFF
EXIT
EOF
filesize=$(ls -l "$FILE" | awk '{print $5}')
#echo $filesize
#echo $sizelimit
if [ "$filesize" -gt "$sizelimit" ]
then split -b "$sizelimit" "$FILE" part
else echo "less than the limit"
fi

This was my first attempt at using split. However, I don't think it can be used, given my criterion that tag sets must stay intact. Assuming the tag sets are like <URL>...</URL>, can anyone suggest another way?

Thanks a lot,

- Avik

Last edited by Scrutinizer; 07-02-2014 at 08:53 AM.. Reason: Changed ICODE to CODE tags
# 2  
Old 07-02-2014
Welcome Aviktheory11,

Please change your post to wrap your code/output in [CODE] & [/CODE] rather than [ICODE] & [/ICODE]. It makes it far easier to read.

Thanks for clearly putting in some effort before posting and for posting a good amount of information. Just a few questions:-
  • What OS and version are you running?
  • What are your preferred tools to work in? e.g. just ksh/bash, awk, etc.
  • Can you post a small sample of input and the expected output, or perhaps (with a width of 32000) just a representative sample with fewer 'columns'?
  • How close have you got with your requirement?
  • Is the data you want to split just a single column for each record? If not, then perhaps a tweak to your SELECT statement may be enough.



Thanks again,
Robin
This User Gave Thanks to rbatte1 For This Post:
# 3  
Old 07-02-2014
how about this:

Code:
#!/bin/bash
export ORACLE_HOME=.........
export ORACLE_SID=...........
export PATH=........
. ./params        # contains the parameter sizelimit
...

if [ $(stat -c%s $FILE) -gt $sizelimit ]
then
    awk -v limit=$sizelimit '
        BEGIN { num=1 }
        {
          if ((bytes+=length)>limit) {
             close(FILENAME "." num)
             num++
          }
          printf "%s%s",$0,RS > FILENAME "." num
        } ' RS="</URL>" $FILE
else
   echo "$FILE: already less than the limit of $sizelimit"
fi

Just be careful: awk and many other Unix utilities have limits on the length of a single line, so you may be better off putting a newline character after each </URL>.
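A minimal sketch of that newline-then-split approach, assuming GNU sed (for \n in the replacement) and GNU coreutils split (-C fills each piece with whole lines up to a byte limit); the file names and the 20-byte limit are only illustrative:

```shell
# sample data: three 15-byte records on one physical line
printf '<URL>aaaa</URL><URL>bbbb</URL><URL>cccc</URL>' > datafile.txt

# 1. put a newline after each closing tag so records align with lines
sed 's|</URL>|</URL>\n|g' datafile.txt > oneperline.txt

# 2. split on line boundaries, at most 20 bytes per piece:
#    each 16-byte line (record + newline) lands in its own part file
split -C 20 oneperline.txt part.
ls part.*
```

Reassembling the pieces (cat part.* ) gives back oneperline.txt byte for byte, so no record is ever cut in half.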

---------- Post updated at 10:17 AM ---------- Previous update was at 10:06 AM ----------


Depending on your OS, the stat command I used above may not be available. A much more portable (but possibly less efficient) version would be:

Code:
if [ $(wc -c < $FILE) -gt $sizelimit ]


Last edited by Chubler_XL; 07-02-2014 at 09:19 PM.. Reason: close previous file to ensure awk openfile limit is not exceeded
# 4  
Old 07-03-2014
Hi Robin,

Thanks for the reply. Below is the information you asked for:

I'm running Linux kernel 2.6.39.

The script I'm trying to write is for bash. I'm comfortable working with Awk/sed etc.

The actual table that I'm querying is something like below
Code:
ID (PK, NUMBER)      URL (the queried column, XMLTYPE)

1		     <URL><A>v1</A><B>v2</B><C>v3</C></URL>
2		     <URL><A>x1</A><B>x2</B><C>x3</C></URL>
3		     <URL><A>y1</A><B>y2</B><C>y3</C></URL>
4		     <URL><A>z1</A><B>z2</B><C>z3</C></URL>

There are about 10K rows in this table, and only the URL column needs to be dumped into a file and sent to an FTP server. There should be a check whether the file is more than 10 MB in size, and if it is, the file needs to be split into smaller files, each 10 MB or smaller.
The structure of the output file will be just something simple like this:
Code:
<URL><A>v1</A><B>v2</B><C>v3</C></URL>
<URL><A>x1</A><B>x2</B><C>x3</C></URL>
<URL><A>y1</A><B>y2</B><C>y3</C></URL>
<URL><A>z1</A><B>z2</B><C>z3</C></URL>

If the size exceeds 10 MB, splitting this file would be easy with split, based on the size limit. But the problem is that each URL XML is extremely large, and the spooled output has a limit on the length of a single line, so the output file is being generated like this:
Code:
<URL><A>v1</A><B>v2</B>
<C>v3</C></URL>
<URL><A>x1</A><B>x2</B>
<C>x3</C></URL>
<URL><A>y1</A><B>y2</B>
<C>y3</C></URL>
<URL><A>z1</A><B>z2</B>
<C>z3</C></URL>

I found it tricky to split this file, since the SPLIT function won't understand XML tags.
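One way to undo that wrapping before splitting is to buffer physical lines until the record closes: a sketch, assuming every record really ends with </URL> and nothing else follows it on that line (file names are illustrative):

```shell
# hypothetical wrapped dump: two records, each broken across two lines
printf '<URL><A>v1</A><B>v2</B>\n<C>v3</C></URL>\n' >  output
printf '<URL><A>x1</A><B>x2</B>\n<C>x3</C></URL>\n' >> output

# accumulate lines into buf; emit one complete record per output line
awk '{ buf = buf $0 } /<\/URL>$/ { print buf; buf = "" }' output > output.joined
cat output.joined
```

With one complete record per line, a plain line-oriented split can no longer cut inside a tag set.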

I have been successfully splitting files built from sample data in my table, where the XML length is much smaller. With data like the above example, my script works perfectly. However, it's the Production data that's causing the problem.


Quote:
Originally Posted by Chubler_XL
how about this: [script quoted in full in post # 3 above]

Thanks a lot Chubler_XL. I'll surely try out your idea. It looks good to me. I've found that in my version of Linux, the STAT command is available. I'll try this out and let you know.

Thanks again,

- Avik
# 5  
Old 07-03-2014
Sorry, I should have tried my code on more than one large URL: I forgot to reset the bytes variable. Please accept this updated version:

Code:
#!/bin/bash
export ORACLE_HOME=.........
export ORACLE_SID=...........
export PATH=........
. ./params        # contains the parameter sizelimit
...

if [ $(stat -c%s $FILE) -gt $sizelimit ]
then
    awk -v limit=$sizelimit '
        BEGIN { num=1 }
        {
          if ((bytes+=length)>limit) {
             close(FILENAME "." num)
             bytes=length
             num++
          }
          printf "%s%s",$0,RS > FILENAME "." num
        } ' RS="</URL>" $FILE
else
   echo "$FILE: already less than the limit of $sizelimit"
fi

This User Gave Thanks to Chubler_XL For This Post:
# 6  
Old 07-03-2014
Quote:
Originally Posted by Chubler_XL
Sorry, I should have tried my code on more than one large URL... [updated script quoted in full in post # 5 above]
Hi Chubler_XL,

I tried out your code snippet. There're a couple of observations that I made.

Firstly, split files are being generated, but the size limit being honoured is not the one in the parameter file; it appears to be the size of the initial file itself. For example, the initial file ("output") was created with a size of 324010 bytes, whereas the parameter file specified a size limit of 10240 bytes. The script produced two files: "output" (the initial file) at 324010 bytes, and "output.2" at 324017 bytes, both with the same data.

I guess there might be something amiss with the bytes variable assignment, but I'm not sure.

Secondly, just for my knowledge: is your script supposed to append </URL> to the end of every file that gets generated?



Thanks,

- Avik
# 7  
Old 07-03-2014
This is the exact symptom you would see if your file doesn't contain </URL> at all.

Note: awk is case-sensitive; could it be that the file actually contains </url>? If that is the issue, you can put IGNORECASE=1 just before the RS="</URL>" (a gawk extension).

No: if your file ends in </URL>, as it should from your definition, then nothing should change.
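A quick way to check that diagnosis before re-running the script (grep -o is a GNU/BSD extension; the lower-case sample file is hypothetical):

```shell
# hypothetical dump where the spool produced lower-case tags
printf '<url>a</url><url>b</url>' > datafile.txt

grep -o '</URL>' datafile.txt | wc -l   # 0: RS="</URL>" would never match
grep -io '</url>' datafile.txt | wc -l  # 2: the records exist, just lower-case
```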