Read column from file and delete rows with some condition..


 
# 8  
Old 12-06-2012
Please use code tags.

Assuming your input is sorted, try:
Code:
$ cat file1
1 25 35
2 25 32
3 25 32
4 24 35
5 23 38
6 17 15
4 58 35
3 15 36
2 25 33
1 25 35
0 25 38

$ awk '$1 > s || NR==1{print}{s=$1}' file1
1 25 35
2 25 32
3 25 32
4 24 35
5 23 38
6 17 15

$ awk '$1 > s || NR==1{s=$1;print}' file1
1 25 35
2 25 32
3 25 32
4 24 35
5 23 38
6 17 15
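The two one-liners differ only in when s is updated: the first updates it on every line, the second only when it prints a line. For this input both give the same result. If everything after the first non-increasing value should be dropped, you can also stop reading there instead of scanning the whole file (a small variant, not from the original post):

Code:
$ awk 'NR==1 || $1 > s { print; s = $1; next } { exit }' file1
1 25 35
2 25 32
3 25 32
4 24 35
5 23 38
6 17 15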

# 9  
Old 12-06-2012
Thank you Pamu, I am having a problem saving the file.
I tried this command:

Code:
for file in *.txt; do
    awk '{print $8 "\t" $3 "\t" $7}' "$file" > "out_${file}.csv"
done

In the CSV the output looks like this:

Code:
1


28.1027 33.7323 2


28.1055 33.731 3
# 10  
Old 12-06-2012
Quote:
Originally Posted by nex_asp
Thank you Pamu, I am having a problem saving the file.
I tried this command:

Code:
for file in *.txt; do
    awk '{print $8 "\t" $3 "\t" $7}' "$file" > "out_${file}.csv"
done

CSV?

Do you want a comma-separated file, or a tab-separated one?
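Either way, setting awk's output field separator makes the intent explicit. For a comma-separated file, for example, you could reuse your loop like this (a sketch, using the same column numbers as your command):

Code:
for file in *.txt; do
    awk 'BEGIN { OFS = "," } { print $8, $3, $7 }' "$file" > "out_${file}.csv"
done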
# 11  
Old 12-06-2012
I tried both and neither worked. Sorry, I pasted the command with tabs. When I save the output, the first column does not come out properly.

---------- Post updated at 06:20 AM ---------- Previous update was at 06:13 AM ----------

Hi Pamu,

Why is that? If I print to the command prompt it shows up properly tab-separated, but when I save it, it does not.

Did you find the reason?
# 12  
Old 12-06-2012
Quote:
Originally Posted by nex_asp
I tried both and neither worked. Sorry, I pasted the command with tabs. When I save the output, the first column does not come out properly.
And please use code tags for your code and data samples.

I am not sure what the problem could be.

You may try it like this:

Code:
$ cat file
      1.006   5.324079    27.6452    2.40651     2.8315     0.4706    33.1640      1.000
      2.012   5.323260    27.4376    2.88395     2.9420     0.5726    33.3054      2.000
      3.018   5.319734    27.3193    3.17664     3.0671     0.7445    33.3646      3.000
      4.024   5.320370    27.3121    3.55961     2.8734     0.7843    33.3740      4.000
      5.029   5.321427    27.2701    3.27116     2.7069     0.7734    33.4111      5.000
      6.035   5.317643    27.2201    2.99257     2.5828     0.8503    33.4199      6.000
      7.041   5.307164    27.1758    4.18136     3.0051     0.9228    33.3773      7.000
      8.047   5.305160    27.1626    4.47475     3.4154     0.8651    33.3724      8.000

Code:
$ awk '{print $8,$3,$7}' OFS="\t" file > out_file.csv

$ cat out_file.csv
1.000   27.6452 33.1640
2.000   27.4376 33.3054
3.000   27.3193 33.3646
4.000   27.3121 33.3740
5.000   27.2701 33.4111
6.000   27.2201 33.4199
7.000   27.1758 33.3773
8.000   27.1626 33.3724

Code:
$ awk '{print $NF,$(NF-5),$(NF-1)}' OFS="\t" file > out_file.csv

$ cat out_file.csv
1.000   27.6452 33.1640
2.000   27.4376 33.3054
3.000   27.3193 33.3646
4.000   27.3121 33.3740
5.000   27.2701 33.4111
6.000   27.2201 33.4199
7.000   27.1758 33.3773
8.000   27.1626 33.3724

Hope this helps you.

EDIT: about tab-separated files: the columns may appear to have uneven spacing, but they are still tab-separated. Don't judge by the visual spacing alone.
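If you want to confirm that the separators really are tabs, you can make them visible, for example with GNU cat's -A option (or cat -et on BSD systems); tabs show as ^I and line ends as $:

Code:
$ cat -A out_file.csv | head -3
1.000^I27.6452^I33.1640$
2.000^I27.4376^I33.3054$
3.000^I27.3193^I33.3646$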

pamu
# 13  
Old 12-06-2012
No, it is still not working. I have attached the file here (with the extension changed).
# 14  
Old 12-06-2012
Quote:
Originally Posted by nex_asp
No, it is still not working. I have attached the file here (with the extension changed).
You may want to define the spacing yourself:

Code:
$ cat file
      1.006   5.324079    27.6452    2.40651     2.8315     0.4706    33.1640      1.000
      2.012   5.323260    27.4376    2.88395     2.9420     0.5726    33.3054      2.000
      3.018   5.319734    27.3193    3.17664     3.0671     0.7445    33.3646      3.000
      4.024   5.320370    27.3121    3.55961     2.8734     0.7843    33.3740      4.000
      5.029   5.321427    27.2701    3.27116     2.7069     0.7734    33.4111      5.000
      6.035   5.317643    27.2201    2.99257     2.5828     0.8503    33.4199      6.000
      7.041   5.307164    27.1758    4.18136     3.0051     0.9228    33.3773      7.000
      8.047   5.305160    27.1626    4.47475     3.4154     0.8651    33.3724      8.000

$ awk '{printf "%-10s%-12s%-10s\n", $8,$3,$7}' file
1.000     27.6452     33.1640
2.000     27.4376     33.3054
3.000     27.3193     33.3646
4.000     27.3121     33.3740
5.000     27.2701     33.4111
6.000     27.2201     33.4199
7.000     27.1758     33.3773
8.000     27.1626     33.3724
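To apply the same formatting to every file at once, the printf line can go back into the loop from earlier (a sketch; the output name and the widths 10, 12, and 10 are just examples you can adjust):

Code:
for file in *.txt; do
    awk '{ printf "%-10s%-12s%-10s\n", $8, $3, $7 }' "$file" > "out_${file}"
done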
