Find line number of bad data in large file


 
# 1  
Old 01-14-2011
Find line number of bad data in large file

Hi Forum.

I tried searching the forum for the following scenario but was not able to find it.

Let's say that I have a very large file that has some bad data in it (for example, 0.0015 in the 12th column) and I would like to find the line number and remove that particular line.

What's the easiest way to do so?

For a smaller file, I could use the "vi editor", edit the file, search for the bad data in the specific column and then just delete the row.

But for a larger file, using the vi editor is out of the question.

I cannot really use grep -v "0.0015" since "0.0015" could be a valid value in other rows, in columns other than the 12th.

I do not know the line number where the bad data resides.

Thanks.
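
If it helps, one quick way to locate the offending line is an awk one-liner that prints the line number along with the line whenever the 12th (whitespace-separated) column matches the bad value; the file name myFile below is just a placeholder:

Code:
awk '$12 == 0.0015 {print NR ": " $0}' myFile
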
# 2  
Old 01-14-2011
See a previous posting

Take a look at this similar request:
https://www.unix.com/shell-programmin...ect-lines.html
# 3  
Old 01-14-2011
Assuming your columns are space/tab separated
Code:
nawk '$12 != 0.0015' myFile

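Note that this just prints every line except the bad one to standard output; to end up with a cleaned file you would redirect to a new name (myFile.clean here is only an example) and swap it in afterwards if the result looks right:

Code:
nawk '$12 != 0.0015' myFile > myFile.clean
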
# 4  
Old 01-14-2011
Code:
awk '$12 != 0.0015' inputFile

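If you prefer the two-step approach from the original question (find the line number first, then delete that exact line), something along these lines should work; the line number 1234 is purely illustrative:

Code:
awk '$12 == 0.0015 {print NR}' inputFile   # report the offending line number(s)
sed '1234d' inputFile > inputFile.fixed    # then delete that line by its number
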