Shell Programming and Scripting: Print ALL lines except if field is 999
Post 302923986 by RudiC, 11-05-2014, 02:31 PM
@Aia: That would suppress lines containing 1999 or 99921234 as well - at least the first, unedited version of your post would.
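For comparison, an exact field test avoids that pitfall entirely. A minimal sketch, assuming the value to exclude sits in the first field (the column number is not shown in this excerpt):

# print every line except those whose first field is exactly 999
awk '$1 != 999' file

# a regex test such as /999/ (or $1 ~ /999/) would also discard lines with
# 1999 or 99921234, which is exactly the problem noted above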
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Print lines where there's no indent on the first field

Hi All, I need a code to print those lines where there's NO indent on the 1st field. Example shown below. I tried the code below but I am not able to see the expected result. Can any expert give any advice? My code: cat filename | awk '$1 ~ /^+$/ {print $0}' Input: 1199 ... (7 Replies)
Discussion started by: Raynon
7 Replies
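A minimal sketch for that request, assuming "no indent" simply means the line does not begin with a space or a tab (the input sample in the thread is truncated, so this reading is an assumption):

# print only lines that do not start with whitespace
awk '$0 !~ /^[[:space:]]/' filename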

2. Shell Programming and Scripting

print running field average for a set of lines

Hi everyone, I have a program that generates logs that contain sections like this:
IMAGE INPUT
81 0 0.995 2449470 0 1726 368 1 0.0635 0.3291
82 0 1.001 2448013 0 1666 365 1 0.0649 0.3235
83 0 1.009 2444822 0 1697 371 1 ... (3 Replies)
Discussion started by: euval
3 Replies
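A sketch of a running average for such a section, assuming the value of interest is the third field of each numeric data line (the column choice is a guess; the log excerpt is truncated):

# accumulate field 3 on data lines only (lines starting with a number) and
# append the average seen so far to each line
awk '$1 ~ /^[0-9]+$/ { sum += $3; n++; printf "%s  avg_so_far=%.4f\n", $0, sum/n }' logfile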

3. Shell Programming and Scripting

Compare Tab Separated Field with AWK to all and print lines of unique fields.

Hi. I have a tab separated file that has a couple of nearly identical lines. When doing: sort file | uniq > file.new it passes through the nearly identical lines because, well, they still are unique. a) I want to look only at field x for uniqueness and if the content in field x is the... (1 Reply)
Discussion started by: rocket_dog
1 Replies
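A two-pass sketch for keeping only lines whose key field occurs once, assuming tab-separated input and that "field x" is field 2 (both assumptions; the thread does not show the data):

# first pass counts occurrences of field 2, second pass prints lines whose
# field 2 value appears exactly once
awk -F'\t' 'NR == FNR { count[$2]++; next } count[$2] == 1' file file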

4. Shell Programming and Scripting

Using awk, print all the lines where field 8 is equal to x

Using awk, print all the lines where field 8 is equal to x. I really did try, but this awk thing is really hard to figure out. file1.txt:
"Georgia","Atlanta","2011-11-02","x","","","",""
"California","Los Angeles","2011-11-03","x","","","",""... (2 Replies)
Discussion started by: charles33
2 Replies
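A sketch using a plain comma as the field separator. Note that each field then keeps its surrounding double quotes, so the comparison is against the quoted literal; the column index follows the thread title (in the truncated sample the "x" actually appears to be the fourth field, so $8 is an assumption):

# print lines whose eighth comma-separated field is the quoted value "x"
awk -F',' '$8 == "\"x\""' file1.txt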

5. Shell Programming and Scripting

Awk: print lines with one of multiple pattern in the same field (column)

Hi all, I am new to using awk and am quickly discovering what a powerful pattern-recognition tool it is. However, I have what seems like a fairly basic task that I just can't figure out how to perform in one line. I want awk to find and print all the lines in which one of multiple patterns (e.g.... (8 Replies)
Discussion started by: elgo4
8 Replies
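A minimal sketch for matching any of several patterns in one field; the column number and the patterns are placeholders, since the thread's example is cut off:

# print lines whose third field matches either alternative
awk '$3 ~ /pattern1|pattern2/' file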

6. Shell Programming and Scripting

Command/script to match a field and print the next field of each line in a file.

Hello, I have a text file in the below format:
Source    Destination    State    Lag    Status
CQA02W2K12pl:D:\CAQA ... (10 Replies)
Discussion started by: pocodot
10 Replies
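A sketch of the "match a field, print the next field" idea on whitespace-separated lines; the target string is a placeholder, since the thread's file is only partially shown:

# scan every field of each line; when one equals the target, print the field after it
awk -v target="PATTERN" '{ for (i = 1; i < NF; i++) if ($i == target) print $(i + 1) }' file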

7. Shell Programming and Scripting

Print field after pattern in all lines

data: hello--hello1--hello2--#growncars#vello--hello3--hello4--jello#growncars#dello--gello--gelloA--gelloB#growncars# I want to be able to print all the values that are found between the patterns "#growncars#" and the next "#growncars#" on the same line. so the output should be: ... (8 Replies)
Discussion started by: SkySmart
8 Replies
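A sketch that splits each line on the literal delimiter and prints the pieces after the first one; which segments the poster wants is partly an assumption, since the expected output in the excerpt is truncated:

# use the marker itself as the field separator and print every non-empty
# segment that follows the first occurrence
awk -F'#growncars#' '{ for (i = 2; i <= NF; i++) if ($i != "") print $i }' file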

8. Shell Programming and Scripting

awk to print lines based on text in field and value in two additional fields

In the awk below I am trying to print the entire line, along with the header row, if $2 is SNV or MNV or INDEL. If that condition is met or is true, and $3 is less than or equal to 0.05, then in $7 the sub pattern :GMAF= is found and the value after the = sign is checked. If that value is less than... (0 Replies)
Discussion started by: cmccabe
0 Replies
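A sketch of that filter; the GMAF cutoff is a placeholder (the thread's value is cut off in the excerpt), and the field layout is assumed to be whitespace-separated:

# keep the header, then lines where $2 is SNV/MNV/INDEL, $3 <= 0.05, and the
# number after ":GMAF=" in $7 is below a cutoff (0.01 here is only a placeholder)
awk 'NR == 1 { print; next }
     $2 ~ /^(SNV|MNV|INDEL)$/ && $3 <= 0.05 && match($7, /:GMAF=[0-9.eE+-]+/) {
         gmaf = substr($7, RSTART + 6, RLENGTH - 6)
         if (gmaf + 0 < 0.01) print
     }' file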

9. UNIX for Beginners Questions & Answers

Print lines based upon unique values in Nth field

For some reason I am having difficulty performing what should be a fairly easy task. I would like to print lines of a file that have a unique value in the first field. For example, I have a large data-set with the following excerpt:
PS003,001 MZMWR/ L-DWD// *
PS003,001... (4 Replies)
Discussion started by: jvoot
4 Replies
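If "unique" means the first line seen for each distinct value of field 1, the classic awk idiom applies; the field separator is assumed to be whitespace here:

# print only the first occurrence of each distinct value in field 1
awk '!seen[$1]++' file

# if "unique" instead means values that occur exactly once, a two-pass version works:
# awk 'NR == FNR { c[$1]++; next } c[$1] == 1' file file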

10. UNIX for Beginners Questions & Answers

awk - If field value of consecutive records are the identical print portion of lines

I have some data that looks like this:
PXD= ZW< 1,6
QR> QRJ== 1,2(5)
QR> QRJ== 4,1(2)
QR> QRJ== 4,2
QRB= QRB 4,2
QWM QWM 6,2
R<C ZW< 11,2
R<H= R<J= 6,1
R>H XZJ= 1,2(2)
R>H XZJ= 2,6(2)
R>H XZJ= 4,1(2)
R>H XZJ= 6,2
RDP RDP 1,2
What I would like to do is if fields $1 and $2 are... (5 Replies)
Discussion started by: jvoot
5 Replies
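The request is cut off, so this is only a guess at what "print portion of lines" means: when a record's first two fields repeat the previous record's, print just the remaining field instead of the whole line.

# compare $1 and $2 against the previous record; on a repeat, print only field 3
# (the choice of which portion to keep is an assumption)
awk '{ if ($1 == p1 && $2 == p2) print $3; else print; p1 = $1; p2 = $2 }' file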
COMM(1) 							   User Commands							   COMM(1)

NAME
       comm - compare two sorted files line by line

SYNOPSIS
       comm [OPTION]... FILE1 FILE2

DESCRIPTION
       Compare sorted files FILE1 and FILE2 line by line.

       With no options, produce three-column output. Column one contains lines unique to FILE1,
       column two contains lines unique to FILE2, and column three contains lines common to both files.

       -1     suppress lines unique to FILE1
       -2     suppress lines unique to FILE2
       -3     suppress lines that appear in both files
       --check-order
              check that the input is correctly sorted, even if all input lines are pairable
       --nocheck-order
              do not check that the input is correctly sorted
       --output-delimiter=STR
              separate columns with STR
       --help display this help and exit
       --version
              output version information and exit

AUTHOR
       Written by Richard M. Stallman and David MacKenzie.

REPORTING BUGS
       Report comm bugs to bug-coreutils@gnu.org
       GNU coreutils home page: <http://www.gnu.org/software/coreutils/>
       General help using GNU software: <http://www.gnu.org/gethelp/>

COPYRIGHT
       Copyright (C) 2009 Free Software Foundation, Inc. License GPLv3+: GNU GPL version 3 or later
       <http://gnu.org/licenses/gpl.html>. This is free software: you are free to change and
       redistribute it. There is NO WARRANTY, to the extent permitted by law.

SEE ALSO
       The full documentation for comm is maintained as a Texinfo manual. If the info and comm
       programs are properly installed at your site, the command info coreutils 'comm invocation'
       should give you access to the complete manual.

GNU coreutils 7.1                          July 2010                                       COMM(1)
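Since the man page lists the options without an example, a minimal illustration; the file names are placeholders:

# comm needs sorted input
sort file1 > file1.sorted
sort file2 > file2.sorted

# lines common to both files (columns 1 and 2 suppressed)
comm -12 file1.sorted file2.sorted

# lines that appear only in file1 (columns 2 and 3 suppressed)
comm -23 file1.sorted file2.sorted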