Shell Programming and Scripting: awk - printing new lines based on 2 dates
Post 302969291 by Ads89 on Monday 21st of March 2016 07:29:07 AM
Hi Don,

Thanks a lot for the above - I have tested it and it works really nicely for what I need. There is just one scenario which I don't think I explained too well before.

There could be a scenario where we don't have the very first record, i.e. the reporting-day record, so we would have to create the earlier records based on a Reporting date parameter.

For example

Input record - note that the Reporting date is 2015-12-01:
Code:
COL1,COL2,COL3,COL4,COL5,COL6,COL7,COL8,COL9,COL10,COL11,COL12,COL13,COL14,COL15,COL16,COL17,COL18
C,1234,TEST,1,AA,AAAA,2019-12-01,2020-11-30,190425.71,AAA,P,4305590.22,NULL,NULL,NULL,NULL,NULL,2020-11-30

As we don't have the reporting-date record, we would need to create the missing records as below, right up until the Maturity date, or in this case, the final record.

What the example below says is that, up until the final record, no units were used in the previous years. We therefore need to create records that reflect this: units used (COL9) will be 0, as nothing has been used, and the outstanding balance should be calculated by adding the units (COL9) to the outstanding balance (COL12) of the record we know.

Note:
COL9 - Units used that year
COL12 - Outstanding Balance i.e. Units remaining

Desired output:
Code:
COL1,COL2,COL3,COL4,COL5,COL6,COL7,COL8,COL9,COL10,COL11,COL12,COL13,COL14,COL15,COL16,COL17,COL18
C,1234,TEST,1,AA,AAAA,2015-12-01,2016-11-30,0,AAA,P,4496015.93,NULL,NULL,NULL,NULL,NULL,2020-11-30
C,1234,TEST,1,AA,AAAA,2016-12-01,2017-11-30,0,AAA,P,4496015.93,NULL,NULL,NULL,NULL,NULL,2020-11-30 
C,1234,TEST,1,AA,AAAA,2017-12-01,2018-11-30,0,AAA,P,4496015.93,NULL,NULL,NULL,NULL,NULL,2020-11-30 
C,1234,TEST,1,AA,AAAA,2018-12-01,2019-11-30,0,AAA,P,4496015.93,NULL,NULL,NULL,NULL,NULL,2020-11-30
C,1234,TEST,1,AA,AAAA,2019-12-01,2020-11-30,190425.71,AAA,P,4305590.22,NULL,NULL,NULL,NULL,NULL,2020-11-30  -- Original input record
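Just to sanity-check the balance arithmetic here: the generated rows carry COL12 = known COL9 + known COL12, so with the values from the record above:

```shell
# 190425.71 units later used + 4305590.22 still outstanding
# = the balance before anything was used
awk 'BEGIN { printf "%.2f\n", 190425.71 + 4305590.22 }'
# prints 4496015.93
```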




Similarly, we could also have a scenario where x units were used halfway through the term, and then nothing again right up until the maturity date.

For example

Input record - note again that the reporting date is 2015-12-01:
Code:
COL1,COL2,COL3,COL4,COL5,COL6,COL7,COL8,COL9,COL10,COL11,COL12,COL13,COL14,COL15,COL16,COL17,COL18
C,1234,TEST,1,AA,AAAA,2017-12-01,2018-11-30,190425.71,AAA,P,4305590.22,NULL,NULL,NULL,NULL,NULL,2020-11-30

In this scenario we would have to add records before the known one (starting from the reporting date), again using the same logic as above (COL9 set to 0 and COL12 calculated as COL9+COL12). We would also have to add records after the record we know of, up until the Maturity date. In this case, as you kindly handled in your previous bit of code, COL9 would be set to 0 and COL12 would take the value of the previous record.

Desired output:
Code:
COL1,COL2,COL3,COL4,COL5,COL6,COL7,COL8,COL9,COL10,COL11,COL12,COL13,COL14,COL15,COL16,COL17,COL18
C,1234,TEST,1,AA,AAAA,2015-12-01,2016-11-30,0,AAA,P,4496015.93,NULL,NULL,NULL,NULL,NULL,2020-11-30
C,1234,TEST,1,AA,AAAA,2016-12-01,2017-11-30,0,AAA,P,4496015.93,NULL,NULL,NULL,NULL,NULL,2020-11-30 
C,1234,TEST,1,AA,AAAA,2017-12-01,2018-11-30,190425.71,AAA,P,4305590.22,NULL,NULL,NULL,NULL,NULL,2020-11-30 -- Original input record
C,1234,TEST,1,AA,AAAA,2018-12-01,2019-11-30,0,AAA,P,4305590.22,NULL,NULL,NULL,NULL,NULL,2020-11-30
C,1234,TEST,1,AA,AAAA,2019-12-01,2020-11-30,0,AAA,P,4305590.22,NULL,NULL,NULL,NULL,NULL,2020-11-30

In a nutshell, what we're trying to work out here is, for each reporting year up until the maturity date, how many units have been used and how many remain.
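To make the intent concrete, here's a rough awk sketch of the logic I have in mind - purely illustrative, not a request to use this exact approach. It assumes the column meanings described above (COL7/COL8 are the period start/end, COL9 units used, COL12 outstanding balance, COL18 maturity) and that all dates are YYYY-MM-DD:

```shell
# Illustrative sketch only: backfill zero-usage years from the
# reporting date, print the known record, then carry the balance
# forward to maturity. Uses the second example's input.
cat > input.csv <<'EOF'
COL1,COL2,COL3,COL4,COL5,COL6,COL7,COL8,COL9,COL10,COL11,COL12,COL13,COL14,COL15,COL16,COL17,COL18
C,1234,TEST,1,AA,AAAA,2017-12-01,2018-11-30,190425.71,AAA,P,4305590.22,NULL,NULL,NULL,NULL,NULL,2020-11-30
EOF

awk -F, -v OFS=, -v rep="2015-12-01" '
NR == 1 { print; next }                        # header passes through
{
    split(rep, r, "-"); split($7, s, "-"); split($18, m, "-")
    o7 = $7; o8 = $8; o9 = $9; o12 = $12       # remember the known record

    # backfill: nothing used before the known record, so every
    # earlier year carries balance = known COL9 + known COL12
    $9 = 0; $12 = sprintf("%.2f", o9 + o12)
    for (y = r[1] + 0; y < s[1] + 0; y++) {
        $7 = y "-" substr(rep, 6)
        $8 = (y + 1) "-" substr(o8, 6)
        print
    }

    # the record we actually have
    $7 = o7; $8 = o8; $9 = o9; $12 = o12; print

    # forward fill to maturity: no further usage, balance carried over
    $9 = 0
    for (y = s[1] + 1; y < m[1] + 0; y++) {
        $7 = y "-" substr(o7, 6)
        $8 = (y + 1) "-" substr(o8, 6)
        print
    }
}' input.csv > filled.csv

cat filled.csv
```

Running that against the second example produces the five data rows shown in the desired output above: 4496015.93 for the two backfilled years, then 4305590.22 carried forward after the known record.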

Really appreciate your help on this one, Don - it's certainly a lot more complicated than we first envisaged.
 
