Shell Programming and Scripting: using awk to get specific section of lines in logs
Post 302672945 by CarloM, Tuesday 17th of July 2012, 09:46:58 AM
What do you want to do with it? Output just the date/time, or check for lines that contain a specified date?
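For example, depending on which of the two you mean (the log file name and the date format below are only placeholders):

    # print just the date/time field from every line (assuming it is field 1)
    awk '{ print $1 }' app.log

    # print only the lines that contain a given date string anywhere on the line
    awk -v d="2012-07-17" 'index($0, d)' app.log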
 

9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Printing lines with specific awk NF

I have these files: ./frm/lf_mt1_cd.Ic_cell_template.attr ./die/addgen_tb_pumd.Ic_cell_template.attr ./min_m1_n.Ic_cell_template.attr When I use awk -F\/ '{print NF}' it results in: 3 3 2 I would like to list the files with 3 fields in them. Any suggestions? (1 Reply)
Discussion started by: jehrome_rando
1 Reply
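An untested sketch for the question above, assuming the list of paths is in a file (filelist.txt is a placeholder name):

    # keep only the paths that split into exactly 3 '/'-separated fields
    awk -F'/' 'NF == 3' filelist.txt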

2. Shell Programming and Scripting

How to print specific lines with awk

Hi! How can I print out a specific range of rows, like "cat file | awk NR==5,NR==9", but in the END-statement? I have a small awk-script that finds specific rows in a file and saves the line number in an array, like this: awk ' BEGIN { count=0} /ZZZZ/ { list=NR ... (10 Replies)
Discussion started by: Bugenhagen
10 Replies
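One possible approach for the question above: buffer the lines in an array and print the wanted slice in the END block (the 5-9 range and file name are placeholders):

    awk '{ line[NR] = $0 } END { for (i = 5; i <= 9 && i <= NR; i++) print line[i] }' file.txt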

3. Shell Programming and Scripting

How to read a specific section and modify within

Hi, I am n00b to shell scripting and I am learning Ksh, sed and awk. I have a requirement and need your help. 1) How to read a specific section of a file. I have a file and I want to read the contents between say "Page Number:1" to "End of Page 1" 2) Within the section of the file that was... (2 Replies)
Discussion started by: kn.naresh
2 Replies
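A minimal sketch for part 1) of the question above, assuming the two marker strings appear literally in the file (report.txt is a placeholder):

    # print everything between the two marker lines, markers included
    sed -n '/Page Number:1/,/End of Page 1/p' report.txt

    # the same range in awk
    awk '/Page Number:1/,/End of Page 1/' report.txt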

4. Shell Programming and Scripting

Sed or Awk to remove specific lines

I have searched the forum for this - forgive me if I missed a previous post. I have the following file: blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah alter table "informix".esc_acct add constraint (foreign key (fi_id) references "informix".fi ... (5 Replies)
Discussion started by: Shoeless_Mike
5 Replies
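One hedged possibility for the question above, assuming each unwanted statement starts on an "add constraint" line and ends on a later line containing the closing ';' (schema.sql is a placeholder):

    # delete each block from the 'add constraint' line through the next line with ';'
    sed '/add constraint (foreign key/,/;/d' schema.sql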

5. Shell Programming and Scripting

awk : deleting specific incorrect lines

Hello friends, I searched the forums for similar threads, but what I want is a single awk command to perform the following. I have a big log file that goes like this; ... 7450494 1724465 -47 003A98B710C0 7450492 1724461 -69 003A98B710C0 7450488 1724459 001DA1915B70 trafo_14:3 7450482... (5 Replies)
Discussion started by: enes71
5 Replies
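An untested sketch for the question above, assuming the bad lines are the ones whose third column is not a plain (possibly negative) integer like -47 (big.log is a placeholder):

    # keep only the lines whose 3rd field is an integer such as -47
    awk '$3 ~ /^-?[0-9]+$/' big.log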

6. Shell Programming and Scripting

Summing over specific lines and replacing the lines with the sum using sed, awk

Hi friends, This is a sed & awk type question. I have a text file which has numbers spread all over the file. I want to sum each series of numbers whenever I find one and produce an output file with the sum. For example ###start of input text file #### abc def ghi 1 2 3 4 kjld random... (3 Replies)
Discussion started by: kaaliakahn
3 Replies
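A heavily hedged sketch for one reading of the question above: assume every run of numeric-only lines should be collapsed into a single line holding its sum (numbers.txt is a placeholder):

    awk '
        /^[[:blank:]0-9.]+$/ && NF > 0 {   # numeric-only line: accumulate its fields
            for (i = 1; i <= NF; i++) sum += $i
            inrun = 1
            next
        }
        {                                  # first other line after a run: emit the sum
            if (inrun) { print sum; sum = 0; inrun = 0 }
            print
        }
        END { if (inrun) print sum }       # file ended inside a numeric run
    ' numbers.txt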

7. Shell Programming and Scripting

Adding a lines to specific section of the file.

Hello, I have to add 2 lines to the /etc/sudoers file under the section below; can someone please suggest a script to add these two lines when executed remotely on multiple servers. before ## Allow root to run any commands anywhere root ALL=(ALL) ALL After ## Allow root... (2 Replies)
Discussion started by: bobby320
2 Replies
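A cautious sketch for the question above. GNU sed is assumed, the user1/user2 rules are placeholders, and editing sudoers in place is risky, so back it up and validate afterwards:

    # back up first, then append two placeholder rules right after the root rule (GNU sed)
    cp /etc/sudoers /etc/sudoers.bak
    sed -i -e '/^root[[:space:]].*ALL=(ALL)[[:space:]]*ALL/a user1   ALL=(ALL)       ALL' \
           -e '/^root[[:space:]].*ALL=(ALL)[[:space:]]*ALL/a user2   ALL=(ALL)       ALL' /etc/sudoers
    visudo -c   # check the syntax before relying on the edited file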

8. Shell Programming and Scripting

How to grep logs for errors and receive specific additional lines?

Hi there, I have a script that I've used to find errors in my Minecraft Server logs. But I'd like to refine that script to be more useful. Here is the script: grep -n "SEVERE" /minecraft/server.log | awk -F":" '{print $1-2 "," $1+10 "p"}' | xargs -t -i sed -n {} /minecraft/server.log >>... (15 Replies)
Discussion started by: nbsparks
15 Replies
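For what it's worth, GNU grep can produce that kind of context without the awk/sed plumbing; -B and -A give lines before and after each match (the output file name is a placeholder):

    grep -B2 -A10 "SEVERE" /minecraft/server.log >> severe_report.txt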

9. Shell Programming and Scripting

awk to add value and text to specific lines

In awk, I have a very large tab-delimited file from which I am trying to extract the DP= value, put it in $16, and add specific text to $16 with . (dot) in $11-$15 and $18. Only the lines (there are several) that have the formatting below in the file will have an empty $16. Other lines will be in a... (6 Replies)
Discussion started by: cmccabe
6 Replies
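A heavily hedged sketch for the question above; it only shows the DP= extraction, assuming DP=<number> appears somewhere on each tab-delimited line, and appends it as a new last field (variants.txt is a placeholder, and the $11-$18 dot handling is not reproduced here):

    awk 'BEGIN { FS = OFS = "\t" }
         match($0, /DP=[0-9]+/) { $(NF + 1) = substr($0, RSTART + 3, RLENGTH - 3) }
         { print }' variants.txt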
SDIFF(1)                         User Commands                        SDIFF(1)

NAME
       sdiff - side-by-side merge of file differences

SYNOPSIS
       sdiff [OPTION]... FILE1 FILE2

DESCRIPTION
       Side-by-side merge of file differences.

       -o FILE  --output=FILE
              Operate interactively, sending output to FILE.

       -i  --ignore-case
              Consider upper- and lower-case to be the same.

       -E  --ignore-tab-expansion
              Ignore changes due to tab expansion.

       -b  --ignore-space-change
              Ignore changes in the amount of white space.

       -W  --ignore-all-space
              Ignore all white space.

       -B  --ignore-blank-lines
              Ignore changes whose lines are all blank.

       -I RE  --ignore-matching-lines=RE
              Ignore changes whose lines all match RE.

       --strip-trailing-cr
              Strip trailing carriage return on input.

       -a  --text
              Treat all files as text.

       -w NUM  --width=NUM
              Output at most NUM (default 130) columns per line.

       -l  --left-column
              Output only the left column of common lines.

       -s  --suppress-common-lines
              Do not output common lines.

       -t  --expand-tabs
              Expand tabs to spaces in output.

       -d  --minimal
              Try hard to find a smaller set of changes.

       -H  --speed-large-files
              Assume large files and many scattered small changes.

       --diff-program=PROGRAM
              Use PROGRAM to compare files.

       -v  --version
              Output version info.

       --help Output this help.

       If a FILE is `-', read standard input.

AUTHOR
       Written by Thomas Lord.

REPORTING BUGS
       Report bugs to <bug-gnu-utils@gnu.org>.

COPYRIGHT
       Copyright (C) 2002 Free Software Foundation, Inc.  This program comes
       with NO WARRANTY, to the extent permitted by law.  You may redistribute
       copies of this program under the terms of the GNU General Public
       License.  For more information about these matters, see the file named
       COPYING.

SEE ALSO
       The full documentation for sdiff is maintained as a Texinfo manual.  If
       the info and sdiff programs are properly installed at your site, the
       command info diff should give you access to the complete manual.

diffutils 2.8.1                   April 2002                          SDIFF(1)
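Two typical invocations, with placeholder file names:

    # non-interactive: show only the lines that differ, at most 100 columns wide
    sdiff -s -w 100 config.old config.new

    # interactive merge: pick lines from either file, writing the result to merged.conf
    sdiff -o merged.conf config.old config.new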