Clarification

Now I have a total of 101 files, but I want this to be done for n number of graphs, so please don't restrict the number of files generated from a particular big file (graph).
So Question 1 is cleared.

Is this OK? The number of files varies from graph to graph, so I need generalised code that performs my analysis graph by graph.

So Question 2 is cleared.
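Something like this is the kind of generalised split I am after (only a rough sketch; I am assuming every component record starts at a line beginning with "{2010", as in the sample below, and Graph_File / Input_File_N are just placeholder names):
Code:

awk '
  # start a new output file at every line that begins with "{2010"
  substr($0, 1, 5) == "{2010" { if (out) close(out); out = "Input_File_" ++n }
  # copy every line (including the record-start line) into the current piece
  out { print > out }
' Graph_File

This way the number of Input_File_N pieces is not fixed, so the same command works whether a graph gives 101 pieces or any other number.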

This is the Input_File I have given you:
Code:
{2010503005|XXGfvertex|1259|0|3869|0|{|{30100001|XXparameter_set|@@@@{{30001002|XXparameter|Layout|$[[record kind 85 subkind 0 parts [vector _interp_("mfile:$\{INF_ENTRPRSDWUNFYRETLCRED_MFS\}/m_cdp2_uedw_t_arnge_to_arnge_uld.dat", "dollar_substitution")]]]|3|9||@{0|}}
{30001002|XXparameter|read_metadata|$\{INF_ENTRPRSDWUNFYRETLCRED_DML\}/cdp2_uedw_t_arnge_to_arnge.dml|3|2|f$|@{0|}}
{30001002|XXparameter|!prototype_path|C:\\Program Files\\Ab Initio\\Ab Initio GDE 1_14_5\\Components\\Datasets\\Input_File.mdc|3|2|Pw$|@{0|}}
{30001002|XXparameter|eme_dataset_location|$\{_Projects_SunTrust_inf_inf_EntrprsDwUnfyRetlCred\}/data/mfs/main/m_cdp2_uedw_t_arnge_to_arnge_uld.dat|3|9||@{0|}}
}}@1|@151424|2797588|173000|2818000|56000|50000|39502|m_cdp2_uedw_t_arnge_to_arnge_uld.dat|M K Muralidhar||1|100|-1||6||33537|-1|-1|}}
{2010203004|XXGoport|1260|0|3871|0|{@{}@228000|2839000|11000|11000|read|0.0|@@@2160|0|}}
{2010503005|XXGfvertex|1261|0|3874|0|{Represents one file, many files, or a multifile as an input to your graph.|{30100001|XXparameter_set|@@@@{{30001002|XXparameter|protection|0666|12|2|RF$||{0|}}
{30001002|XXparameter|mode|0x0001|1|2|FH$|modes of access|{0|}}
{30001002|XXparameter|Layout|@28|2|RF$||{0|}}
{30001002|XXparameter|read_metadata||7|1|RFl||{0|}}
{30001002|XXparameter|mpcmodtime|1138303912|1|1|Hl|The last modification time of this component's template|{0|}}
{30001002|XXparameter|eme_dataset_location|@3|9|F|Place in the EME to create a dataset corresponding to this file.|{0|}}
}}@0|@0|0|0|0|0|0|0|@@@1|10|-1|@6|@1|-1|-1|}}
{2010203004|XXGoport|1262|0|3876|0|{@{30100001|XXparameter_set|@@@@{{30001002|XXparameter|metadata||7|8|RF=||{0|}}
}}@0|0|0|0|read|0.0|@@@2160|0|}}
{2010501005|XXGpvertex|1263|0|3885|0|{|{30100001|XXparameter_set|@@@@{{30001002|XXparameter|transform0|$AI_XFR/cdp2_rdm_dt_core_cnsum_arnge_dim_xfm_rfmt_orig_fico_scor_val.xfr|3|2|f$|@{0|}}
{30001002|XXparameter|out0_metadata|$AI_DML/cdp2_rdm_dt_core_cnsum_arnge_dim_xfm_rfmt_orig_fico_scor_val.dml|3|2|f$|@{0|}}
{30001002|XXparameter|error0_metadata|string('\\n')|3|1|l|@{0|}}
{30001002|XXparameter|log_metadata|record string("\|") node, timestamp, component, subcomponent, event_type; string("\|\\n") event_text; end|3|1|l|@{0|}}



From the above, did you find this line?
Code:
{2010203004|XXGoport|1260|0|3871|0|{@{}@228000|2839000|11000|11000|read|0.0|@@@2160|0|}}



Now from the above line, did you get 2839000?
In every Input_File we can find the same line, but with different numbers. That is:

Code:
{2010203004|XXGoport|1260|0|3871|0|



We can use this one as a keyword. Now from every Input_File we will get one number, then add 5000 to it:
2839000+5000 = 2844000
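Something like this sketch is what I mean for this step (assuming the wanted XXGoport line is the one carrying the "{@{}@" marker and that the number is its 8th pipe-separated field, as in the sample line above):
Code:

# pick the number out of the XXGoport line, then add 5000 to it
num=$(awk -F'|' '/XXGoport/ && index($0, "{@{}@") { print $8; exit }' Input_File)
echo $(( num + 5000 ))    # with the sample above: 2839000 + 5000 = 2844000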

So Question 3 is cleared.

As I already mentioned, I have one file with me named XXGFlow; it contains the flow of components from one component to another (one file to another). See my previous post for the XXGFlow file.

So Question 4 is cleared.
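For a new graph, the flow records could probably be pulled out the same way as the rest (this is only a guess that the flow lines carry the literal record type XXGFlow, like the XXGfvertex and XXGoport records above; Graph_File is again a placeholder name):
Code:

# collect all flow records from the graph dump into one file
grep 'XXGFlow' Graph_File > XXGFlow_records.txt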

My main goal at this stage is to get the flow, component by component (file by file), starting from Input_File to Output_File, and I need the respective flow in the respective Input_File'number'_f (I mean the respective file name_f).
 
