Read column from file and delete rows with some condition..


 
# 1  
Old 12-06-2012
Read column from file and delete rows with some condition..

Hi,

I need a script that deletes a row whenever a condition is true.

Code:
2.16    (3)    [00]    1    3    9999    0    (1)    (0)    [00]
34.42    (4)    [00]    1    3    9999    37    (2)    (3)    [00]
34.38    (4)    [00]    1    3    9999    64    (2)    (3)    [00]
34.4    (4)    [00]    1    3    1    110    (3)    (3)    [00]
34.38    (4)    [00]    1    3    12    165    (3)    (3)    [00]
34.42    (4)    [00]    1    3    13    220    (3)    (3)    [00]
34.4    (4)    [00]    1    3    9999    274    (3)    (3)    [00]
34.38    (4)    [00]    1    3    9999    348    (3)    (3)    [00]

The 6th column holds various values; whenever it is 9999, that row should be deleted.

Also, some columns have values wrapped in brackets, both '()' and '[]'. The brackets should be removed, while the values inside them are kept as they are.


I have attached my sample input file; please have a look.

Thanks in advance.
# 2  
Old 12-06-2012
Is this what you want?

Code:
$ cat file
2.16    (3)     [00]    1       3       9999    0       (1)     (0)     [00]
34.42   (4)     [00]    1       3       9999    37      (2)     (3)     [00]
34.38   (4)     [00]    1       3       9999    64      (2)     (3)     [00]
34.4    (4)     [00]    1       3       1       110     (3)     (3)     [00]
34.38   (4)     [00]    1       3       12      165     (3)     (3)     [00]
34.42   (4)     [00]    1       3       13      220     (3)     (3)     [00]
34.4    (4)     [00]    1       3       9999    274     (3)     (3)     [00]
34.38   (4)     [00]    1       3       9999    348     (3)     (3)     [00]

$ awk '$6 != 9999 { gsub(/[][()]/, ""); print }' file
34.4    4       00      1       3       1       110     3       3       00
34.38   4       00      1       3       12      165     3       3       00
34.42   4       00      1       3       13      220     3       3       00

# 3  
Old 12-06-2012
An alternate awk:
Code:
$ uname -rs
SunOS 5.10
$ nawk '$6!="9999"{gsub(/[)(\]\[]/,"",$0);print}' inputfile

# 4  
Old 12-06-2012
Code:
 
$ perl -lane 's/[()\]\[]//g;print $_ if $F[5]!=9999' input.txt
34.4    4     00    1       3       1       110     3     3     00
34.38   4     00    1       3       12      165     3     3     00
34.42   4     00    1       3       13      220     3     3     00

# 5  
Old 12-06-2012
I have one more problem of the same kind. Here is that file format:

Code:
9.983    68.033    1    28.25    36.42
9.983    68.033    5    28.26    36.42
9.983    68.033    10    28.23    36.43
9.983    68.033    15    28.22    36.43
9.983    68.033    20    28.2    36.42
9.983    68.033    25    28.19    36.43
9.983    68.033    30    28.18    36.43
9.983    68.033    35    28.18    36.43
9.983    68.033    40    28.18    36.44
9.983    68.033    45    28.19    36.45
9.983    68.033    50    28.19    36.44
9.983    68.033    55    28.2    36.45
9.983    68.033    60    28.2    36.469
9.983    68.033    5    28.26    36.42
9.983    68.033    10    28.23    36.43
9.983    68.033    15    28.22    36.43
9.983    68.033    20    28.2    36.42
9.983    68.033    25    28.19    36.43
9.983    68.033    30    28.18    36.43
9.983    68.033    35    28.18    36.43
9.983    68.033    40    28.18    36.44
9.983    68.033    45    28.19    36.45
9.983    68.033    50    28.19    36.44
9.983    68.033    55    28.2    36.45
9.983    68.033    60    28.2    36.46
9.983    68.033    65    28.21    36.48
9.983    68.033    70    28.22    36.47
9.983    68.033    65    28.21    36.48
9.983    68.033    70    28.22    36.47

I need the data up to 50 in the third column; everything after 50 should be ignored. I also want to print only the 2nd, 3rd, and 5th columns in the output.
# 6  
Old 12-06-2012
I'm assuming you want the rows whose 3rd column value is less than or equal to 50.

Code:
 $ cat file
9.983    68.033    1    28.25    36.42
9.983    68.033    5    28.26    36.42
9.983    68.033    10    28.23    36.43
9.983    68.033    15    28.22    36.43
9.983    68.033    20    28.2    36.42
9.983    68.033    25    28.19    36.43
9.983    68.033    30    28.18    36.43
9.983    68.033    35    28.18    36.43
9.983    68.033    40    28.18    36.44
9.983    68.033    45    28.19    36.45
9.983    68.033    50    28.19    36.44
9.983    68.033    55    28.2    36.45
9.983    68.033    60    28.2    36.469
9.983    68.033    5    28.26    36.42
9.983    68.033    10    28.23    36.43
9.983    68.033    15    28.22    36.43
9.983    68.033    20    28.2    36.42
9.983    68.033    25    28.19    36.43
9.983    68.033    30    28.18    36.43
9.983    68.033    35    28.18    36.43
9.983    68.033    40    28.18    36.44
9.983    68.033    45    28.19    36.45
9.983    68.033    50    28.19    36.44
9.983    68.033    55    28.2    36.45
9.983    68.033    60    28.2    36.46
9.983    68.033    65    28.21    36.48
9.983    68.033    70    28.22    36.47
9.983    68.033    65    28.21    36.48
9.983    68.033    70    28.22    36.47

$ awk '$3 <= 50{print $2,$3,$5}' file
68.033 1 36.42
68.033 5 36.42
68.033 10 36.43
68.033 15 36.43
68.033 20 36.42
68.033 25 36.43
68.033 30 36.43
68.033 35 36.43
68.033 40 36.44
68.033 45 36.45
68.033 50 36.44
68.033 5 36.42
68.033 10 36.43
68.033 15 36.43
68.033 20 36.42
68.033 25 36.43
68.033 30 36.43
68.033 35 36.43
68.033 40 36.44
68.033 45 36.45
68.033 50 36.44
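If, instead, "data till 50" means everything after the first row above 50 should be dropped (so the repeated second block is ignored too), a small variant that exits at the first such row might be closer. A sketch, assuming the same whitespace-separated format; `depth.txt` is a made-up file name:

```shell
# Hypothetical sample in the same format as the posted file.
cat > depth.txt <<'EOF'
9.983  68.033  45  28.19  36.45
9.983  68.033  50  28.19  36.44
9.983  68.033  55  28.2   36.45
9.983  68.033  5   28.26  36.42
EOF

# Stop reading at the first row whose 3rd column exceeds 50,
# printing only columns 2, 3 and 5 of the rows before it.
awk '$3 > 50 { exit } { print $2, $3, $5 }' depth.txt
```

Unlike the `$3 <= 50` filter, `exit` also discards the rows after the cut-off even if their 3rd column dips back under 50.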

# 7  
Old 12-06-2012
If the values are like this, how do I filter the data?

Pamu, if the data file looks like the one below, how can I keep rows only up to the first column's maximum value? Say:

Code:
1  25   35 
2  25    32 
3   25    32 
4   24    35
5   23    38 
6   17   15 
4   58    35 
3   15    36 
2   25    33 
1   25    35 
0   25   38

If I filter up to 6, the output looks like this:

Code:
1  25   35 
2  25    32 
3   25    32 
4   24    35
5   23    38 
6   17   15

Here the first column's maximum is 6; in some other file it may be 60, 70, or 200. I need the data up to the first column's first peak (its maximum); the descending values after it (4, 3, 2, ...) should be ignored.
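No reply to this follow-up appears in the thread, but a minimal awk sketch could work: print rows while column 1 keeps rising, and stop at the first row where it drops below the previous value. `profile.txt` is a hypothetical file name holding the sample above:

```shell
# Hypothetical sample file matching the posted data.
cat > profile.txt <<'EOF'
1  25  35
2  25  32
3  25  32
4  24  35
5  23  38
6  17  15
4  58  35
3  15  36
2  25  33
1  25  35
0  25  38
EOF

# Print rows while column 1 is non-decreasing; exit at the first drop,
# so only the ascending part of the first profile is kept.
awk 'NR > 1 && $1 < prev { exit } { print; prev = $1 }' profile.txt
```

Because it compares against the previous row rather than a hard-coded limit, the same command handles files whose peak is 60, 70, or 200.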
