Held req: Awk - remove non-alpha line
Post 302456844 by Grueben on Sunday 26th of September 2010, 10:44:56 AM
Cheers Scottn. The file looks like:
Code:
18:02:00 JOB02084722
18:06:00
18:09:00 2010120942
18:12:04 JOB02084723
18:34:16 20100709

Basically all I want from it are the lines where the 2nd field starts with 'JOB'; all other lines can be deleted. Appreciate your help.
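
A minimal sketch of what you're after, assuming the fields are whitespace-separated as in the sample above (the input file name jobs.log is just a placeholder):

Code:
# print only lines whose 2nd field starts with "JOB"; all other lines are dropped
awk '$2 ~ /^JOB/' jobs.log

With the sample data this keeps only the 18:02:00 and 18:12:04 lines, since they are the only ones whose second field begins with JOB.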
