Full Discussion: Problem with an awk Script
Post 302418735 by panyam on Wednesday 5th of May 2010 08:17:58 AM
I am not sure of the requirement, but based on the input and output you are providing, I suggest:

Code:
sed 's/RECORD/GRID/g' input_file > output_file
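If you specifically want awk (this is an awk thread, after all), an equivalent one-liner is below; a minimal sketch, assuming the same literal RECORD-to-GRID replacement on every line:

Code:
awk '{ gsub(/RECORD/, "GRID") } 1' input_file > output_file

Here gsub() replaces every occurrence of RECORD on the line, and the trailing 1 prints each (possibly modified) line.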

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

awk script Problem

I wrote an awk script but it doesn't work as expected. The input file is attached. My awk script: /^.......*EXEC CICS /,/END-EXEC/ { if ( $0 ~ / LINK / ) { tsflag=1 } if ( $0 ~ /EXEC CICS/ && tsflag == 1 ) ... (6 Replies)
Discussion started by: pbsrinivas

2. Shell Programming and Scripting

Problem with one awk script

Hi, I have a file with contents like this: file1 ##################### kite kshitij jolly admire in the wing and tell me the secret behind opus 123 and the right of the track ######################### I have to write an awk script to substitute some values with other... (6 Replies)
Discussion started by: kshitij

3. Shell Programming and Scripting

Problem with awk script

Hi, can anyone help me with this problem? File1 ######################### HOLI 123 AND ONE TWO THREE AMITABH SAMSUNG POLI AND TWO SENSE CRYING WING PPIN TBFLAG I B AND OROLE TB_HOT=" DCT" TB_CAT=" CAT" TC_NOT=" AND" +PIN TB=" HOT" TB_GATE=" KOT" TB_LATE=" MAT" TC=LOT MAT DAT SAT... (5 Replies)
Discussion started by: kshitij

4. Shell Programming and Scripting

Problem with an AWK Script

Hi, I have contents in my file like this: file1 ########################## pin (PIN1) { direction : input ; capacitance : 121 ; max_transition : 231 ; } pin (PIN2) { direction : input ; capacitance : 124 ; max_transition : 421 ;... (8 Replies)
Discussion started by: kshitij

5. Shell Programming and Scripting

awk script problem

Hi all, I have the following input data, which I'd like to look like this ($2 is the column I'd like it to appear in), where the entries are grouped by date. The code I have at present is: awk 'BEGIN {} { dt = $1 if (dt == dt_prev) { pp = $3 ... (7 Replies)
Discussion started by: pondlife

6. Shell Programming and Scripting

Problem with awk script

Hi, I have one csv file with 3 fields, like tmp1.csv: 2079|2010Aug|cardilogy 2349|2010Aug|numerology 2213|2010Aug|immunlogy and another csv file with codes for those specialities, spec.csv: cardiology|CRD numerology|NMY immunology|IMY I want to replace the contents of file 1 with the codes... (2 Replies)
Discussion started by: Man83Nagesh
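A common awk approach to this kind of lookup-and-replace is to load the code table first and then rewrite the third field. A minimal sketch, assuming the pipe-delimited layouts quoted above and that the speciality spellings actually match between the two files (the quoted sample has mismatches such as cardilogy vs. cardiology, which exact matching would miss):

Code:
# Build code[speciality] from spec.csv, then map field 3 of tmp1.csv
awk -F'|' 'NR==FNR { code[$1] = $2; next }
           $3 in code { $3 = code[$3] } 1' OFS='|' spec.csv tmp1.csv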

7. Shell Programming and Scripting

Awk script Problem

Hi, I have two files, FILE1 and FILE2, as shown below. I need to search for each and every element of Column1 of FILE1 in FILE2 and globally replace it with the corresponding element of Column2 of FILE2. For example, and1, which is the first element of Col 1 of FILE1, should be... (4 Replies)
Discussion started by: jaita

8. Shell Programming and Scripting

problem with awk script

Hi, I have two files. file1: val="10" port="localhost:8080" httpadd="http:\\192.168.0.239" file2: val=${val} val="pdssx" port=${port} port="1324" httpadd=${httpadd} httpadd="raamraav" fileloc=${fileloc} file3 (or file2) should have the following... (1 Reply)
Discussion started by: nitin.pathak

9. Shell Programming and Scripting

Awk Script Problem

Can someone please explain to me what is wrong with this awk script? echo 74 85 | awk '{ if ( $1 > $2 ) PRESULTS = ( $1 - $2 ); print $0,"=>","P"PRESULTS ; else if ( $1 > $2 ) NRESULTS = ( $2 - $1... (3 Replies)
Discussion started by: SkySmart
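For the record, the usual culprit in a construct like that is that an if or else branch containing more than one statement needs braces (otherwise the print terminates the if, and the following else is a syntax error), and both branches test $1 > $2. A corrected sketch of the likely intent, assuming P and N are meant to prefix positive and negative differences:

Code:
echo 74 85 | awk '{
    if ($1 > $2)      { PRESULTS = $1 - $2; print $0, "=>", "P" PRESULTS }
    else if ($1 < $2) { NRESULTS = $2 - $1; print $0, "=>", "N" NRESULTS }
}'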

10. Shell Programming and Scripting

awk script problem

Hello guys, I have the following problem: I'm trying to copy the content of one file and paste it into all .txt files in a directory, but at line 15. My script copies the content at the first line, not line 15. I'm confused about how to do this. Thank you in advance for your help! This is my script: ARGS=2 ... (9 Replies)
Discussion started by: r00ty
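One way to splice a file's content in at line 15 of every .txt file is sed's r command, which appends a file's contents after a matching line. A minimal sketch, assuming a hypothetical content.txt, GNU sed (BSD sed wants -i '' instead of -i), and target files with at least 14 lines:

Code:
for f in *.txt; do
    sed -i '14r content.txt' "$f"    # append content.txt after line 14, i.e. at line 15
done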
UNIQ(1) 						    BSD General Commands Manual 						   UNIQ(1)

NAME
     uniq -- report or filter out repeated lines in a file

SYNOPSIS
     uniq [-c | -d | -u] [-i] [-f num] [-s chars] [input_file [output_file]]

DESCRIPTION
     The uniq utility reads the specified input_file comparing adjacent
     lines, and writes a copy of each unique input line to the output_file.
     If input_file is a single dash ('-') or absent, the standard input is
     read.  If output_file is absent, standard output is used for output.
     The second and succeeding copies of identical adjacent input lines are
     not written.  Repeated lines in the input will not be detected if they
     are not adjacent, so it may be necessary to sort the files first.

     The following options are available:

     -c        Precede each output line with the count of the number of
               times the line occurred in the input, followed by a single
               space.

     -d        Only output lines that are repeated in the input.

     -f num    Ignore the first num fields in each input line when doing
               comparisons.  A field is a string of non-blank characters
               separated from adjacent fields by blanks.  Field numbers are
               one based, i.e. the first field is field one.

     -s chars  Ignore the first chars characters in each input line when
               doing comparisons.  If specified in conjunction with the -f
               option, the first chars characters after the first num fields
               will be ignored.  Character numbers are one based, i.e. the
               first character is character one.

     -u        Only output lines that are not repeated in the input.

     -i        Case insensitive comparison of lines.

DIAGNOSTICS
     The uniq utility exits 0 on success, and >0 if an error occurs.

COMPATIBILITY
     The historic +number and -number options have been deprecated but are
     still supported in this implementation.

SEE ALSO
     sort(1)

STANDARDS
     The uniq utility is expected to be IEEE Std 1003.2 (``POSIX.2'')
     compatible.

HISTORY
     A uniq command appeared in Version 3 AT&T UNIX.

BSD                              June 6, 1993                              BSD
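A short usage sketch (names.txt is a hypothetical input; it is sorted first because uniq only compares adjacent lines):

Code:
sort names.txt | uniq -c | sort -rn    # count each distinct line, most frequent first
sort names.txt | uniq -d               # print only the lines that occur more than once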