awk remove/grab lines from file with pattern from other file


 
# 15  
Old 02-11-2016
Quote:
Originally Posted by RudiC
This might solve your new problem:
Code:
awk -F ';' '
NR==FNR         {id[$0]
                 next
                }
                {for (SP in id) if ($0 ~ SP)    {print > "Positive"    
                                                 next
                                                }
                }
                {print > "Negative"    
                }
' file1 file2
cf *ive
Negative:
37760   Haushalt & Küche > SodaStream & Wassermaxx > Sirup      SodaStream Orange Sirup 500ml   3.1682242990654 EUR     7.00    SodaStream      7290002793335   1020103490      >10     4.99    0.699   0       Haushalt & Kueche > SodaStream & Wassermaxx > Sirup
37761   Haushalt & Küche > SodaStream & Wassermaxx > Sirup      SodaStream Zitrone-Limette Sirup 500ml  2.5046728971963 EUR     7.00    SodaStream      7290002793328   1020110490      0       4.99    0.600   0       Haushalt & Kueche > SodaStream & Wassermaxx > Sirup
37762   Haushalt & Küche > SodaStream & Wassermaxx > Sirup      SodaStream Apfel-Mix Sirup 500ml        3.5046728971963 EUR     7.00    SodaStream      7290002793229   1020108491      3       4.99    0.600   0       Haushalt & Kueche > SodaStream & Wassermaxx > Sirup
37765   Haushalt & Küche > SodaStream & Wassermaxx > Sirup      SodaStream Isotonic Sirup 375ml 3.7289719626168 EUR     7.00    SodaStream      7290010498574   5140013 >10     4.99    0.400   0       Haushalt & Kueche > SodaStream & Wassermaxx > Sirup
37773   Haushalt & Küche > Elektro Kleingeräte > Eierkocher     Gastroback 42801 Design Eierkocher Silber       33      EUR     19.00   Gastroback      4016432428011   0580178 1       0       1.000   0       Haushalt & Kueche > Elektro Kleingeraete > Eierkocher
54164           Logitech R400 Wireless Presenter        29.327731092437 EUR     19.00   Logitech        5099206018129   910-001357      2       4.99    0.210   0
Positive:
68132   Computer & Zubehör > Eingabegeräte > Mäuse      Logitech MK710 Wireless Desktop 64.621848739496 EUR     19.00   Logitech        5099206020948   920-002420      10      4.99    1.390   0       Computer & Zubehoer > Eingabegeraete > Maeuse
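For reference, the quoted approach reads every line of the first file as a pattern, then tests each data line against all of them. A minimal self-contained run of the same idiom, with made-up two-line inputs (file names are illustrative), looks like this:

```shell
#!/bin/sh
# Tiny stand-ins for the pattern file and the data file
printf 'Sirup\nSoftware\n' > patterns.txt
printf '37760 SodaStream Orange Sirup\n54164 Logitech Presenter\n' > data.txt

# First file: collect patterns. Second file: a line matching any
# pattern goes to "Positive"; everything else falls through to "Negative".
awk '
NR == FNR { id[$0]; next }
{ for (SP in id) if ($0 ~ SP) { print > "Positive"; next } }
{ print > "Negative" }
' patterns.txt data.txt
```

NR==FNR is only true while awk reads the first file, which is what lets a single program treat the two input files differently.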

I am not sure what I am doing wrong with your code, but it does not seem to work like your example.

Code:
awk -F '\t' '
NR==FNR         {id[$0]
                 next
                }
                {for (SP in id) if ($0 ~ SP)
                  {print > "'"$PAD/removed_woord.csv"'" #positive
                    next
                  }
                }
                {print > "'"$PAD/raw1.csv"'" #negative
                }
' filters/woord_COM.csv $PAD/raw.csv

I get the positive file completely filled with all lines, even though it should only contain a few, while the negative file does not even get a single line. I added the negative and positive comments to the above code to show which is which.

I just noticed I had 2 small errors in my copied code. I removed them, but the same problem is still present: the positive file with all the matches contains all lines, and the negative one is not even created.

---------- Post updated at 04:32 PM ---------- Previous update was at 12:59 PM ----------

I did some more fiddling, and I think I know where the error is and how to solve it.

I took the original code:
Code:
awk -F ';' '
NR==FNR         {id[$0]
                 next
                }
                {for (SP in id) if ($0 ~ SP)    {print > "Positive"
                                                 next
                                                }
                }
                {print > "Negative"
                }
' file1 file2

The only thing I changed was the names of file1 and file2. To be sure, I used the example data I provided earlier. When I executed it, the code worked fine and created both files.

Then I changed file1 to be the original filter file. Now it did not work anymore; I only received the Positive file, containing all lines. To be sure, I also tested it the other way around, and that worked fine. So the problem was in the filter file. After some thinking, I remembered that I had copied the data straight from an Excel file; when I passed it through Notepad as an intermediate step, the problem was solved.

So with a new filter file it works fine. Thank you for helping.
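For anyone hitting the same problem: a filter file exported from Excel usually carries Windows CRLF line endings, so each stored pattern ends in an invisible carriage return that never matches. A quick sketch (file names are made up) to spot and strip them:

```shell
#!/bin/sh
# Simulate a filter file saved with Windows line endings
printf 'Sirup\r\nSoftware\r\n' > filter_crlf.txt

# od -c makes the hidden \r visible at the end of each line
od -c filter_crlf.txt

# Strip the carriage returns before handing the file to awk
tr -d '\r' < filter_crlf.txt > filter_clean.txt
```

Where it is installed, dos2unix does the same job in one step.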

Last edited by SDohmen; 02-11-2016 at 10:41 AM..
# 16  
Old 02-11-2016
The desired two-filter filtering can be done in one go. Does your data file have space field separators or e.g. <TAB> separators?
# 17  
Old 02-11-2016
Quote:
Originally Posted by RudiC
The desired two-filter filtering can be done in one go. Does your data file have space field separators or e.g. <TAB> separators?
That is the annoying part: some files actually have TABs, and some have semicolons, etc. What is strange, though, is that for another file I was testing, I tried setting the delimiter to ; but the code just ignored everything and threw it all into the positive file. After I changed the file to TAB and also made the delimiter TAB, it worked just fine.

It is a bit annoying, but I can work around it, at least until I actually start to understand the code some more. I would even go as far as saying that the current code could be done in about half the lines, but at least I understand what it does at the moment.

I am always open to updated code if you happen to know it, but it is no biggie to do it in 2 runs.
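When it is unclear whether a supplier file uses TABs or semicolons, dumping the header line with od -c shows the raw separator characters, and awk can count candidate separators. A small sketch with made-up sample files:

```shell
#!/bin/sh
# Two sample files: one TAB-separated, one semicolon-separated
printf 'sku\ttitle\tprice\n' > tabs.csv
printf 'sku;title;price\n'  > semis.csv

# od -c prints TABs literally as \t, so the separator is obvious
od -c tabs.csv

# Counting separators in the header line: NF-1 per candidate delimiter
for f in tabs.csv semis.csv; do
    tabs=$(head -1 "$f"  | awk -F'\t' '{print NF-1}')
    semis=$(head -1 "$f" | awk -F';'  '{print NF-1}')
    echo "$f: $tabs tab(s), $semis semicolon(s)"
done
```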
# 18  
Old 02-11-2016
If you provide samples that match the requirements (word filter, EAN filter, and data) and post the expected results, we could give it a go.
# 19  
Old 02-11-2016
Quote:
Originally Posted by RudiC
If you provide samples that match the requirements (word filter, EAN filter, and data) and post the expected results, we could give it a go.
Well, to make it actually understandable, I will post the whole file, although it has been shortened a bit.

Code:
## Download the new list
curl -o $PAD/pricing.csv http://website/pricing.csv

## Backup original file
cp $PAD/pricing.csv $ARCHIEF/origineel.$TIJDDATUM.csv

## Remove all lines with 7% in column 6
awk -F"\t" '$6 != "7.00"' $PAD/pricing.csv > $PAD/raw.csv

## Put all lines matching the word_filter_file into a separate file and the rest into raw1 for further processing
awk -F"\t" '
NR==FNR         {id[$0]
                 next
                }
                {for (SP in id) if ($0 ~ SP)    {print > "'"$PAD/removed_woord.csv"'"
                                                 next
                                                }
                }
                {print > "'"$PAD/raw1.csv"'"
                }
' $VOEG/filters/woord_COM.csv $PAD/raw.csv

## Remove all columns except the ones we need.
cut -f1,4,7-10 $PAD/raw1.csv > $PAD/raw2.csv

## Put all columns in the correct order and use TAB as delimiter
awk 'BEGIN { FS="\t"; OFS="\t"; } {print $4,$1,$5,$6,$2,$3}' $PAD/raw2.csv > $PAD/raw3.csv

## Remove all lines with a blank first column
awk '!/^\t/' $PAD/raw3.csv > $PAD/raw4.csv

## Remove all > in the complete file
sed 's/>//g' $PAD/raw4.csv > $PAD/raw5.csv

## Remove the first line
awk '{if (NR!=1) {print}}' $PAD/raw5.csv > $PAD/raw6.csv

## Limit the digits after the decimal point (only needed for column 5)
awk 'BEGIN { FS="\t"; OFS="\t"; } {printf("%s\t%s\t%s\t%s\t%.2f\t%s\n",$1,$2,$3,$4,$5,$6)}' $PAD/raw6.csv > $PAD/raw7.csv

## Make the delimiter ;
awk 'BEGIN { FS="\t"; OFS=";"; } {print $1,$2,$3,$4,$5,$6}' $PAD/raw7.csv > $PAD/raw8.csv

## Put all lines matching the ean_filter_file into a separate file and the rest into raw9 for further processing
awk 'FNR==NR{A[$1]=$1;next} ($1 in A){print >> "'"$PAD/removed_EAN.csv"'"} !($1 in A){print >> "'"$PAD/raw9.csv"'"}' $VOEG/filters/niet_gebruiken_ean.csv FS=";" $PAD/raw8.csv

## Remove line if column 5 contains -
awk -F';' '!($5 ~ "-")' $PAD/raw9.csv > $PAD/raw10.csv

## Add code as last column
awk '$0=$0" ;CMT"' $PAD/raw10.csv > $PAD/clean.csv

## Create backup of the cleaned file
cp $PAD/clean.csv $ARCHIEF/clean.$TIJDDATUM.csv

As mentioned before, the code works fine, but it is probably not the best code.

Main sample data (the original file is far bigger)
Code:
sku     category        title   price   currency        tax     manufacturer    EAN     suppliernumber  instock shipp_moneyorder        weight  Sped    catSort EEK
37760   Haushalt & Küche > SodaStream & Wassermaxx > Sirup      SodaStream Orange Sirup 500ml   3.1682242990654 EUR     7.00    SodaStream      7290002793335   1020103490      0       4.99    0.699   0       Haushalt & Kueche > SodaStream & Wassermaxx > Sirup
37761   Haushalt & Küche > SodaStream & Wassermaxx > Sirup      SodaStream Zitrone-Limette Sirup 500ml  2.5046728971963 EUR     7.00    SodaStream      7290002793328   1020110490      >10     4.99    0.600   0       Haushalt & Kueche > SodaStream & Wassermaxx > Sirup
37762   Haushalt & Küche > SodaStream & Wassermaxx > Sirup      SodaStream Apfel-Mix Sirup 500ml        3.5046728971963 EUR     7.00    SodaStream      7290002793229   1020108491      3       4.99    0.600   0       Haushalt & Kueche > SodaStream & Wassermaxx > Sirup
37765   Haushalt & Küche > SodaStream & Wassermaxx > Sirup      SodaStream Isotonic Sirup 375ml 3.7289719626168 EUR     7.00    SodaStream      7290010498574   5140013 >10     4.99    0.400   0       Haushalt & Kueche > SodaStream & Wassermaxx > Sirup
37773   Haushalt & Küche > Elektro Kleingeräte > Eierkocher     Gastroback 42801 Design Eierkocher Silber       29.327731092437 EUR     19.00   Gastroback      4016432428011   0580178 1       0       1.000   0       Haushalt & Kueche > Elektro Kleingeraete > Eierkocher
37816   Haushalt & Küche > SodaStream & Wassermaxx > Ersatzflaschen     SodaStream PET Flasche 1 ltr. Boden Edelstahl   7.2521008403361 EUR     19.00   SodaStream      7290006780829   1041191490      0       3.99    1.000   0       Haushalt & Kueche > SodaStream & Wassermaxx > Ersatzflaschen
37898   Haushalt & Küche > Kaffee & Tee > Kaffeebohnen  Melitta Bella Crema LaCrema Kaffeebohnen 1kg    11.598130841121 EUR     7.00    Melitta 4002720008102   008102  8       4.99    1.000   0       Haushalt & Kueche > Kaffee & Tee > Kaffeebohnen
38616   Receiver & SAT-Anlagen > SAT Zubehör > SAT LNB  Kathrein UAS 584 Quatro LNB     75.756302521008 EUR     19.00   Kathrein        4021121468858   20110019        4       4.99    0.300   0       Receiver & SAT-Anlagen > SAT Zubehoer > SAT LNB
38909   Gesundheit & Wellness > Körperpflege > Rasierer Zubehör Panasonic WES 035 K503 Reinigungskartusche      7.9831932773109 EUR     19.00   Panasonic       5025232434084   WES035K503      5       3.99    0.100   0       Gesundheit & Wellness > Koerperpflege > Rasierer Zubehoer
39101   Receiver & SAT-Anlagen > SAT Zubehör > SAT Stecker & Adapter    Kathrein ESD 84 Antennendose    5.9579831932773 EUR     19.00   Kathrein        4021121338540   274425  5       4.99    0.200   0       Receiver & SAT-Anlagen > SAT Zubehoer > SAT Stecker & Adapter
39417   Receiver & SAT-Anlagen > SAT Zubehör > SAT Stecker & Adapter    Kathrein EBC 10 Zweifachverteiler SAT-Verteiler 6.6386554621849 EUR     19.00   Kathrein        4021121435638   272859  6       4.99    0.100   0       Receiver & SAT-Anlagen > SAT Zubehoer > SAT Stecker & Adapter
39837   Haushalt & Küche > Elektro Kleingeräte > Folienschweißgerät     Rommelsbacher VRS 2060 Vakuumier Rollen 20x600cm 2er Set        8.3193277310924 EUR     19.00   Rommelsbacher   8018294008659   8018294008659   1       3.99    0.500   0       Haushalt & Kueche > Elektro Kleingeraete > Folienschweissgeraet
39840   Computer & Zubehör > Monitore > Monitor-Zubehör Dell AX510 Lautsprecher für UltraSharp  20.084033613445 EUR     19.00   Dell    0000000039840   520-10703       >10     4.99    1.000   0       Computer & Zubehoer > Monitore > Monitor-Zubehoer
39843   Haushalt & Küche > Elektro Kleingeräte > Zitruspresse   Gastroback 41138 Home Culture Zitruspresse      39      EUR     19.00   Gastroback      4016432411389   41138   3       0       1.000   0       Haushalt & Kueche > Elektro Kleingeraete > Zitruspresse
39934   Haushalt & Küche > Elektro Kleingeräte > Folienschweißgerät     Rommelsbacher VRS 3060 Vakuumier Rollen 30x600cm 2er Set        11.680672268908 EUR     19.00   Rommelsbacher   4001797824004   VRS3060 7       4.99    0.500   0       Haushalt & Kueche > Elektro Kleingeraete > Folienschweissgeraet

Word filter file
Code:
Schweissgeraete
Sharp Fernseher
Sirup
Software
Solo-Mikrowelle
Sony Fernseher
Standherde
Stromerzeuger

EAN filter file
Code:
0085126300272
4960759025241
0018208021444
0182080214444
4960759023858
0024066553065
0240665530652
4001797824004

All files are snippets of the full files, as they are too big to post here. With the code above, it should be understandable what happens.

Each filter file gives a separate output file, plus a raw one which continues through the processing. At the end I have a tar command which packs those files together (not the raw ones), sends them by email, and also uploads them to the correct server.

The only real disadvantage I have is that every supplier has its own layout, but the above is one of the longest scripts for this. Most others are smaller, but no less important. I understand most of the code I copied here, so I should be able to make it work for those files if needed.
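One fragile spot in the script above is the `"'"$PAD/..."'"` quoting dance used to splice shell variables into the awk program; passing them in with awk -v is usually easier to read and less error-prone. A sketch of the same output redirection using -v (directory and file names are hypothetical):

```shell
#!/bin/sh
PAD=.   # hypothetical output directory, just for the demonstration
printf 'keep me\ndrop me\n' > input.txt

# -v hands the shell variable to awk; no quote-juggling required
awk -v out="$PAD/kept.txt" '
/keep/ { print > out; next }          # matching lines
       { print > (out ".rejects") }   # everything else
' input.txt
```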
# 20  
Old 02-11-2016
Wellllll, not sure if I met ALL of your requirements (which I tried to infer from your code), but this might be at least a starting point:
Code:
awk -F"\t" '
FNR == 1        {FC++
                }
FC == 1         {FILTWORD[$0]
                 next
                }
FC == 2         {FILTEAN[$0]
                 next
                }

FNR == 1 || 
$6 == 7         {next
                }

                {for (SP in FILTWORD) if ($0 ~ SP)      {print > "removed_woord"
                                                         next
                                                        }
                 gsub (/>/, "")
                }

!$4  ||
$9 ~ "-"        {next
                }

$8 in FILTEAN   {print $8, $1, $9, $10, $4+0, $7  > "removed_EAN" 
                 next
                }

                {print $8, $1, $9, $10, $4+0, $7, "CMT"  > "clean"
                }
' OFS=";" OFMT="%.2f" file2 file3 file1
cf removed_EAN clean 
removed_EAN:
4001797824004;39934;VRS3060;7;11.68;Rommelsbacher
clean:
4016432428011;37773;0580178;1;29.33;Gastroback;CMT
7290006780829;37816;1041191490;0;7.25;SodaStream;CMT
4021121468858;38616;20110019;4;75.76;Kathrein;CMT
5025232434084;38909;WES035K503;5;7.98;Panasonic;CMT
4021121338540;39101;274425;5;5.96;Kathrein;CMT
4021121435638;39417;272859;6;6.64;Kathrein;CMT
8018294008659;39837;8018294008659;1;8.32;Rommelsbacher;CMT
4016432411389;39843;41138;3;39;Gastroback;CMT

Not sure if the fields selected and their order are correctly inferred; an output sample to compare against is missing.
removed_woord is empty because all eligible lines were already discarded by the 7.00% VAT criterion.

Last edited by RudiC; 02-11-2016 at 06:29 PM..
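The FC counter in the script above generalizes the NR==FNR trick to more than two input files: FNR restarts at 1 for each file, so FC goes up by one per file, and the program can treat the first two files as lookup sets and the third as data. A stripped-down sketch of the same pattern with throwaway input files:

```shell
#!/bin/sh
# Three tiny input files to demonstrate the per-file counter
printf 'alpha\n'        > f1.txt
printf 'beta\n'         > f2.txt
printf 'alpha\ngamma\n' > f3.txt

# FNR==1 fires on the first line of every file, so FC counts files;
# FC==1 and FC==2 load lookup sets, and FC==3 lines are data.
awk '
FNR == 1 { FC++ }
FC == 1  { set1[$0]; next }
FC == 2  { set2[$0]; next }
         { print $0, ($0 in set1 ? "in-f1" : "not-in-f1") }
' f1.txt f2.txt f3.txt
```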
# 21  
Old 02-12-2016
Holy cr*p.

I added the main code so you could see the output of the double filters, but you just butchered my code (in a good way, that is).

Even though I am extremely grateful for the complete rewrite of the code, I am not sure if I will/can use it, since I cannot read it completely yet. The main purpose was mainly to combine both filters and to create 3 files (2 filtered and 1 raw).

I will definitely check it out some more and see if I can understand more of it, so I can adapt the code to suit the other files as well.