Delete repeated rows from a file


 
# 1  
Old 09-24-2007

Hi everybody:
Could anybody tell me how I can delete repeated rows from a file? That is, for example, I have a file like this:

0.490 958.73 281.85 6.67985 0.002481
0.490 954.833 283.991 8.73019 0.002471
0.590 950.504 286.241 6.61451 0.002461
0.690 939.323 286.112 6.16451 0.00246
0.790 928.17 285.71 5.87057 0.002451
0.890 917.196 285.503 5.6777 0.002441
0.990 906.277 284.498 5.46275 0.00244
1.090 895.529 283.818 5.43785 0.002431
1.190 884.757 283.098 5.36579 0.002421
1.290 874.22 282.2 5.33933 0.00242
1.390 863.667 281.35 5.01376 0.002411
1.490 853.3 280.55 4.61738 0.00241
1.590 842.962 279.95 4.27487 0.002401
1.690 832.775 279.362 3.77744 0.002391
1.790 822.634 278.532 3.78002 0.00239
1.890 812.608 277.625 3.98339 0.002381
1.990 802.735 276.995 4.17061 0.00238
2.090 792.845 276.65 4.77151 0.002389
..
..

In this case I would like to keep only this:

0.490 958.73 281.85 6.67985 0.002481
0.590 950.504 286.241 6.61451 0.002461
0.690 939.323 286.112 6.16451 0.00246
0.790 928.17 285.71 5.87057 0.002451
0.890 917.196 285.503 5.6777 0.002441
0.990 906.277 284.498 5.46275 0.00244
1.090 895.529 283.818 5.43785 0.002431
1.190 884.757 283.098 5.36579 0.002421
1.290 874.22 282.2 5.33933 0.00242
1.390 863.667 281.35 5.01376 0.002411
1.490 853.3 280.55 4.61738 0.00241
1.590 842.962 279.95 4.27487 0.002401
1.690 832.775 279.362 3.77744 0.002391
1.790 822.634 278.532 3.78002 0.00239
1.890 812.608 277.625 3.98339 0.002381
1.990 802.735 276.995 4.17061 0.00238
2.090 792.845 276.65 4.77151 0.002389
..
..

Note that the repeated pattern is in $1, and I would like to keep the first line in which each value appears.

Thanks a lot and cheers.
tonet
# 2  
Old 09-24-2007
Code:
[jsaikia] ~/prac/ $ cat file | awk '{print $1}' | sort | uniq | while read firstf; do awk '$1=="'"$firstf"'"' file | sed '1!d' ; done

0.490 958.73 281.85 6.67985 0.002481
0.590 950.504 286.241 6.61451 0.002461
0.690 939.323 286.112 6.16451 0.00246
0.790 928.17 285.71 5.87057 0.002451
0.890 917.196 285.503 5.6777 0.002441
0.990 906.277 284.498 5.46275 0.00244
1.090 895.529 283.818 5.43785 0.002431
1.190 884.757 283.098 5.36579 0.002421
1.290 874.22 282.2 5.33933 0.00242
1.390 863.667 281.35 5.01376 0.002411
1.490 853.3 280.55 4.61738 0.00241
1.590 842.962 279.95 4.27487 0.002401
1.690 832.775 279.362 3.77744 0.002391
1.790 822.634 278.532 3.78002 0.00239
1.890 812.608 277.625 3.98339 0.002381
1.990 802.735 276.995 4.17061 0.00238
2.090 792.845 276.65 4.77151 0.002389
# 3  
Old 09-24-2007
Code:
cat Input_file|sort -ruk 1n
# 4  
Old 09-24-2007
Quote:
Originally Posted by pbsrinivas
cat Input_file|sort -ruk 1n
Ya this is really a good one, I never knew about this, thanks :-)
# 5  
Old 09-24-2007
GNU awk:
Code:
awk '{ a[$1] = $0 }
END {
        n = asort(a)
        for (i = 1; i <= n; i++) print a[i]
}' "file"

Quote:
Originally Posted by pbsrinivas
cat Input_file|sort -ruk 1n
no need for cat

Quote:
Originally Posted by jaduks
Code:
cat file | awk '{print $1}' | sort | uniq | while read firstf; do awk '$1=="'"$firstf"'"' file | sed '1!d' ; done

There's no need to go to such lengths.
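For the original question (keep the first line for each distinct value of field 1), a single pass with plain POSIX awk is enough; this is a sketch using a made-up file name sample.txt:

```shell
# Sample data: two rows share the same first field.
printf '%s\n' '0.490 958.73' '0.490 954.833' '0.590 950.504' > sample.txt

# seen[$1]++ evaluates to 0 (false) the first time a key appears, so
# !seen[$1]++ is true exactly once per key and the default action
# (print the line) fires only for the first occurrence.
awk '!seen[$1]++' sample.txt
# -> 0.490 958.73
#    0.590 950.504
```

Unlike the sort-based answers, this preserves the original line order and does not re-read the file once per key.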
# 6  
Old 04-08-2008
Hi all,
My script is OK now, but I have a problem.
When I run the sed command:
sed -f /tmp/delete EVDO_A12.users

everything that changed is only displayed on the screen; the input file EVDO_A12.users didn't change at all. What is the problem?
# 7  
Old 04-08-2008
sed writes to standard output, so redirect it to another file, then move that file on top of the old one. Read a basic Unix book.

Code:
command oldfile > newfile; mv newfile oldfile
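Applied to the files from post #6, the redirect-and-move idiom looks like this; the contents of /tmp/delete and EVDO_A12.users were not shown in the thread, so the two setup lines below are stand-ins for illustration only:

```shell
# Stand-ins for the poster's files (real contents not shown in the thread):
printf '2d\n' > /tmp/delete            # a sed script that deletes line 2
printf '%s\n' a b c > EVDO_A12.users   # the file to be edited

# sed only writes the result to stdout; capture it in a scratch file,
# then move that over the original. && ensures the original is replaced
# only if sed succeeded.
sed -f /tmp/delete EVDO_A12.users > EVDO_A12.users.tmp &&
mv EVDO_A12.users.tmp EVDO_A12.users

# GNU sed can do this in one step with -i (BSD/macOS sed needs -i ''):
#   sed -i -f /tmp/delete EVDO_A12.users
```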
