Shell Programming and Scripting: Extract lines that appear twice

Post 302891956 by LeftoverStew on Monday 10th of March 2014 04:49:44 AM
The 000* series.
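
The thread asks for the lines that occur exactly twice; a minimal awk sketch, assuming the inputs are the 000* files mentioned above and that each qualifying line should be reported once, in order of first appearance:

    # count every line across the input files, then print the ones seen exactly twice
    awk '{count[$0]++; line[NR] = $0}
         END {for (i = 1; i <= NR; i++)
                  if (count[line[i]] == 2 && !done[line[i]]++) print line[i]}' 000*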
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Extract known lines

Hi all, I have a text file of 143 lines. I don't want all the lines, but I do want to retain the line format. How can I extract lines 34, 65, 68, 70 (plus 7 others) easily? I have found some example head/tail n lines and some sed -n examples that have been shown for single line or mass consecutive... (2 Replies)
Discussion started by: nhatch
2 Replies
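
A hedged sketch for this kind of request, assuming the goal is simply to print a fixed set of line numbers (input.txt is a placeholder name):

    # sed -n suppresses default output; each Np prints just that line number
    sed -n '34p; 65p; 68p; 70p' input.txt

    # the same selection in awk, driven by a space-separated list of line numbers
    awk 'BEGIN {split("34 65 68 70", w); for (i in w) keep[w[i]]} FNR in keep' input.txt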

2. Shell Programming and Scripting

extract the lines

Hi, I have a text file with 15 columns and I want to extract those lines whose 7th column is ABCD. I think we can do this using awk but could not frame the command. Please help. TIA Prvn (2 Replies)
Discussion started by: prvnrk
2 Replies
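
A one-line awk sketch, assuming whitespace-separated columns and that ABCD is the literal value to match:

    # print only the lines whose 7th field is exactly ABCD
    awk '$7 == "ABCD"' file.txt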

3. UNIX for Dummies Questions & Answers

extract lines from data

I have a file that looks like this:

    >cel-miR-35 MIMAT6 C eles le
    UCACCGGGUGGAAACUAGCAGU
    >hsa-let-7a MI062 H sa
    UGAGGUAGUAGGUUGUAUAGUU
    >cel-miR-36 M007 Ca ele
    UCACCGGGUGAAAAUUCGCAUG
    >hsa-let-7b MI63 Hmo sns le
    UGAGGUAGUAGGUUGUGUGGUU

I would like to extract all the lines that start with... (7 Replies)
Discussion started by: jdhahbi
7 Replies
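
The question text is truncated, so the exact prefix is unknown; a hedged sketch, assuming the goal is to pull entries whose header line starts with a given prefix such as >cel:

    # header lines only
    grep '^>cel' file.fa

    # header lines together with the sequence line(s) that follow them
    awk '/^>/ {keep = /^>cel/} keep' file.fa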

4. Shell Programming and Scripting

Script to extract certain lines

Hi, I have a text file with the following information:

    # List 1      (first header)
    test 1
    test 2
    test 3
    ...
    # Trials      (second header)
    round 1
    run 5
    ...

and so on. I want to create a script which, based on some criteria, will return only the list of lines between the headers. I... (9 Replies)
Discussion started by: nimo
9 Replies
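
A hedged awk sketch, assuming the headers are the literal comment lines shown and that the goal is the block of lines between them, excluding the headers themselves:

    # start printing after the first header, stop at the second
    awk '/^# List 1/ {show = 1; next}
         /^# Trials/ {show = 0}
         show' input.txt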

5. Shell Programming and Scripting

Extract some lines from one file and add those lines to current file

Hi, I have two files.

    file1.sh:
    echo "unix"
    echo "linux"

    file2.sh:
    echo "unix linux forums"

Now the output I need is:

    $ ./file2.sh
    unix linux forums

(3 Replies)
Discussion started by: snreddy_gopu
3 Replies

6. UNIX for Dummies Questions & Answers

Extract lines with specific words with addition 2 lines before and after

Dear all, greetings. I would like to ask for your help to extract lines containing specific words, plus the 2 lines before and after those lines, using awk or sed. For example, the input file is:

    1 ak1 abc1.0
    1 ak2 abc1.0
    1 ak3 abc1.0
    1 ak4 abc1.0
    1 ak5 abc1.1
    1 ak6 abc1.1
    1 ak7... (7 Replies)
Discussion started by: Amanda Low
7 Replies
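
A hedged sketch, assuming GNU or BSD grep is available and using abc1.1 as a stand-in for the word of interest (the real word is not given in the excerpt):

    # -F matches a fixed string; -B2/-A2 print two lines of context before and after each match
    grep -F -B2 -A2 'abc1.1' input.txt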

7. Shell Programming and Scripting

Extract the lines between two dates

Dear experts, I am a newbie to scripting... Please help me get the lines between two strings. I am getting the Oracle SQL output as below, and would like to take the time from the system date, compare it with the date in the SQL output file, and extract the matching lines. If the system time is 2012-07-01 19:15:00, get... (1 Reply)
Discussion started by: nmadhuhb
1 Reply
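
A hedged awk sketch, assuming the timestamps use the sortable YYYY-MM-DD HH:MM:SS form shown above, so plain string comparison orders them correctly; the timestamp column position and the end of the window are illustrative assumptions:

    # print rows whose first two fields (date and time) fall inside the window
    awk -v start='2012-07-01 19:15:00' -v end='2012-07-01 20:15:00' \
        '{ts = $1 " " $2} ts >= start && ts <= end' sql_output.txt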

8. Shell Programming and Scripting

Search for a pattern,extract value(s) from next line, extract lines having those extracted value(s)

I have hundreds of files to process. In each file I need to (1) look for a pattern, (2) extract value(s) from the next line, and (3) search for the value(s) selected in step (2) elsewhere in the same file, at a specific position.

    HEADER    ELECTRON TRANSPORT    18-MAR-98   1A7V
    TITLE     CYTOCHROME... (7 Replies)
Discussion started by: AshwaniSharma09
7 Replies
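
A hedged two-pass sketch that reads each file twice with awk: the first pass records the first field of the line that follows the pattern, the second pass prints every line that begins with a recorded value. The HEADER pattern, the field positions, and the *.pdb glob are all assumptions, since the excerpt is truncated:

    for f in *.pdb; do
        awk 'NR == FNR {if (grab) {val[$1]; grab = 0}   # first pass: remember the value from the line after the match
                        if (/^HEADER/) grab = 1
                        next}
             $1 in val' "$f" "$f"                       # second pass: print lines starting with a remembered value
    done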

9. Shell Programming and Scripting

Extract particular lines from a file

Hi all, I have a file with many records with information as given below:

    ID   A16L2_HUMAN             Reviewed;         619 AA.
    AC   Q8NAA4; A5PL30; B2RPK5; Q658V4; Q6PID3; Q8NBG0;
    DT   20-MAY-2008, integrated into UniProtKB/Swiss-Prot.
    DT   20-MAY-2008, sequence version 2.
    DT   ... (1 Reply)
Discussion started by: kaav06
1 Reply

10. Shell Programming and Scripting

ksh sed - Extract specific lines with mulitple occurance of interesting lines

Data file example. I look for primary and * to isolate the interesting slot number.

    slot=`sed '/^primary$/,/\*/!d' filename | tail -1 | sed s'/*//' | awk '{print $1" "$2}'`

Now I want to get the Touch line for only the associated slot number, in this case, because the asterisk... (2 Replies)
Discussion started by: popeye
2 Replies
funtbl(1)							SAORD Documentation							 funtbl(1)

NAME
funtbl - extract a table from Funtools ASCII output

SYNOPSIS
funtbl [-c cols] [-h] [-n table] [-p prog] [-s sep] <iname>

DESCRIPTION
[NB: This program has been deprecated in favor of the ASCII text processing support in funtools. You can now perform fundisp on funtools ASCII output files (specifying the table using bracket notation) to extract tables and columns.]

The funtbl script extracts a specified table (without the header and comments) from a funtools ASCII output file and writes the result to the standard output.

The first non-switch argument is the ASCII input file name (i.e. the saved output from funcnts, fundisp, funhist, etc.). If no filename is specified, stdin is read.

The -n switch specifies which table (starting from 1) to extract. The default is to extract the first table.

The -c switch is a space-delimited list of column numbers to output, e.g. -c "1 3 5" will extract the first three odd-numbered columns. The default is to extract all columns.

The -s switch specifies the separator string to put between columns. The default is a single space.

The -h switch specifies that column names should be added in a header line before the data is output. Without the switch, no header is prepended.

The -p program switch allows you to specify an awk-like program to run instead of the default (which is host-specific and is determined at build time).

The -T switch will output the data in rdb format (i.e., with a 2-row header of column names and dashes, and with data columns separated by tabs).

The -help switch will print out a message describing program usage.

For example, consider the output from the following funcnts command:

  [sh] funcnts -sr snr.ev "ann 512 512 0 9 n=3"
  # source
  #   data file:        /proj/rd/data/snr.ev
  #   arcsec/pixel:     8
  # background
  #   constant value:   0.000000
  # column units
  #   area:             arcsec**2
  #   surf_bri:         cnts/arcsec**2
  #   surf_err:         cnts/arcsec**2

  # summed background-subtracted results
  upto   net_counts      error   background     berror       area   surf_bri   surf_err
  ----  ------------  ---------  ------------  ---------  ---------  ---------  ---------
     1       147.000     12.124         0.000      0.000    1600.00      0.092      0.008
     2       625.000     25.000         0.000      0.000    6976.00      0.090      0.004
     3      1442.000     37.974         0.000      0.000   15936.00      0.090      0.002

  # background-subtracted results
  reg    net_counts      error   background     berror       area   surf_bri   surf_err
  ----  ------------  ---------  ------------  ---------  ---------  ---------  ---------
     1       147.000     12.124         0.000      0.000    1600.00      0.092      0.008
     2       478.000     21.863         0.000      0.000    5376.00      0.089      0.004
     3       817.000     28.583         0.000      0.000    8960.00      0.091      0.003

  # the following source and background components were used:
  source_region(s)
  ----------------
  ann 512 512 0 9 n=3

  reg        counts     pixels       sumcnts     sumpix
  ----  ------------  ---------  ------------  ---------
     1       147.000         25       147.000         25
     2       478.000         84       625.000        109
     3       817.000        140      1442.000        249

There are four tables in this output. To extract the last one, you can execute:

  [sh] funcnts -s snr.ev "ann 512 512 0 9 n=3" | funtbl -n 4
  1 147.000 25 147.000 25
  2 478.000 84 625.000 109
  3 817.000 140 1442.000 249

Note that the output has been re-formatted so that only a single space separates each column, with no extraneous header or comment information.
To extract only columns 1, 2, and 4 from the last example (but with a header prepended and tabs between columns), you can execute:

  [sh] funcnts -s snr.ev "ann 512 512 0 9 n=3" | funtbl -c "1 2 4" -h -n 4 -s "	"
  #reg	counts	sumcnts
  1	147.000	147.000
  2	478.000	625.000
  3	817.000	1442.000

Of course, if the output has previously been saved in a file named foo.out, the same result can be obtained by executing:

  [sh] funtbl -c "1 2 4" -h -n 4 -s "	" foo.out
  #reg	counts	sumcnts
  1	147.000	147.000
  2	478.000	625.000
  3	817.000	1442.000
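The table extraction itself is plain text processing. A rough awk sketch of the same idea follows; this is not the actual funtbl implementation, and the table-detection rule (a table starts at a line of dashes and ends at the next blank or comment line) is an assumption based on the output shown above:

  # print the n-th table from funtools ASCII output, squeezing columns to single spaces
  awk -v n=4 '
      /^[- ]+$/ && NF  { t++; intab = (t == n); next }   # dashed separator: a new table begins
      /^#/ || NF == 0  { intab = 0; next }                # comment or blank line ends the table
      intab            { $1 = $1; print }                 # rebuild the record with single-space separators
  ' foo.out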
SEE ALSO

See funtools(7) for a list of Funtools help pages.

version 1.4.2                        January 2, 2008                        funtbl(1)