I have a grep command like this
where $y refers to values in the params variable and $pf contains the values assigned from the params variable.
The output of that command is
Because of this it returns prmODBCDataSource as well as prmODBCDataSource_ATM, whereas I want it to fetch only the exact name.
I also tried anchoring the pattern with a ^ sign, something like this
But that skips some other values, so neither command helps me.
Can anyone please help me with a proper grep command?
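A common fix (a sketch, assuming the parameter names sit one per line) is grep's -w or -x option; since the underscore counts as a word character, -w will not match the _ATM variant:

```shell
# sample parameter list (assumed layout: one name per line)
printf '%s\n' prmODBCDataSource prmODBCDataSource_ATM > params.txt

# -w: match whole words only; '_' is a word character, so
# prmODBCDataSource_ATM is not matched
grep -w 'prmODBCDataSource' params.txt

# -x: the entire line must match the pattern
grep -x 'prmODBCDataSource' params.txt
```

Both commands print only prmODBCDataSource here; -x is the stricter choice when each line holds exactly one name.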
Hi all,
I have a file, 10.txt. Two words, Active and Inactive, appear in it, repeated over 7000 lines, and I want to extract just those two unique words from the 7000 lines.
Thanks
Mahalakshmi.A (3 Replies)
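A minimal sketch (assuming the words sit one per line in 10.txt): sort -u gives the unique set, or awk if first-seen order should be kept:

```shell
# small sample standing in for the 7000-line file
printf 'Active\nInactive\nActive\nInactive\n' > 10.txt

sort -u 10.txt            # unique lines, sorted
awk '!seen[$0]++' 10.txt  # unique lines, first-seen order
```

Either way the 7000 lines collapse to the two distinct words.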
Hi,
I have a file like this:
Some_String_Here 123 123 123 321 321 321 3432 3221 557 886 321 321
I would like to find only the unique values in the file and get the following output:
Some_String_Here 123 321 3432 3221 557 886
I am trying to get this done using awk. Can someone please... (5 Replies)
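One way to sketch this in awk (assuming each line starts with a label followed by the values): track which fields have already been seen on the current line and keep only first occurrences:

```shell
echo 'Some_String_Here 123 123 123 321 321 321 3432 3221 557 886 321 321' |
awk '{
    split("", seen)              # reset the seen set for each line
    out = $1                     # keep the leading label
    for (i = 2; i <= NF; i++)
        if (!seen[$i]++)
            out = out " " $i     # append first occurrences only
    print out
}'
# → Some_String_Here 123 321 3432 3221 557 886
```

split("", seen) is the portable way to empty an array; gawk also accepts delete seen.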
Hi All,
I have a file which is having 3 columns as (string string integer)
a b 1
x y 2
p k 5
y y 4
.....
.....
Question:
I want to get the unique values of column 2, sorted on column 2, along with the sum of the 3rd column of the corresponding rows, e.g. the above file should return the... (6 Replies)
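A sketch with awk plus sort (assuming whitespace-separated columns as shown): accumulate column 3 keyed on column 2, then sort the summary:

```shell
# rebuild the sample rows from the post
printf '%s\n' 'a b 1' 'x y 2' 'p k 5' 'y y 4' > data.txt

# sum column 3 per distinct column-2 value, sorted on column 2
awk '{ sum[$2] += $3 } END { for (k in sum) print k, sum[k] }' data.txt | sort
# → b 1
#   k 5
#   y 6
```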
Hi All,
I need to delete duplicate processes which are running on the same device type (column 1) and port ID (column 2). Here is the sample data:
p1sc1m1 15517 11325 0 01:00:24 ? 0:00 scagntclsx25octtcp 2967 in3v mvmp01 0 8000 N S 969 750@751@752@
p1sc1m1 15519 11325 0 01:00:24 ? ... (5 Replies)
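If the goal is to keep the first process per (column 1, column 2) pair and pick out the rest, a minimal awk sketch (ps.txt is a hypothetical capture of a listing like the one above, trimmed to short rows):

```shell
# assumed layout: device in column 1, port ID in column 2
printf '%s\n' 'p1sc1m1 15517 a' 'p1sc1m1 15517 b' 'p1sc1m1 15519 c' > ps.txt

awk '!seen[$1,$2]++' ps.txt   # first occurrence per (col1,col2) pair
awk 'seen[$1,$2]++' ps.txt    # the duplicate rows, e.g. to feed a kill loop
```

The second form prints only the repeats, which is the set you would act on.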
hi,
#cat /u01/file|grep -v "^#"|cut -f2 -d: -s
I want to avoid repeated lines in the output of the above command.
Do we have anything like unique in shell scripting?
thx (4 Replies)
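Yes; appending sort -u to the pipeline, or awk to keep the original order, is the usual sketch (file below stands in for /u01/file):

```shell
# sample colon-delimited file with a comment line
printf '%s\n' '# comment' 'a:keep:1' 'b:keep:2' 'c:other:3' > file

grep -v '^#' file | cut -f2 -d: -s | sort -u                # unique, sorted
grep -v '^#' file | cut -f2 -d: -s | awk '!seen[$0]++'      # unique, original order
```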
Hi All,
bash-3.00$ gzgrep -i '\ ExecuteThread:' /******/******/******/******/stdout.log.txt.gz
<Jan 7, 2012 5:54:55 PM UTC> <Error> <WebLogicServer> <BEA-000337> < ExecuteThread: '414' for queue: 'weblogic.kernel.Default (self-tuning)' has been busy for "696" seconds working on the request... (4 Replies)
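gzgrep (zgrep on most systems) just decompresses and hands off to grep, so the same result can be had explicitly with gzip -dc; e.g. to count the stuck-thread lines (a sketch on a made-up log):

```shell
# build a tiny compressed log for illustration
printf '%s\n' '< ExecuteThread: stuck >' 'other line' > stdout.log.txt
gzip -f stdout.log.txt

gzip -dc stdout.log.txt.gz | grep -c 'ExecuteThread:'
```

The explicit pipeline is handy when gzgrep is not installed or when extra grep options are needed.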
How can I select a unique string from a field? Actually I want to print records whose 2nd field has exactly one character, and it must be "P".
awk '$2~"" {print $0}' in > out
But this code prints all records whose 2nd field starts with "P" and may contain other characters! (1 Reply)
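To require the 2nd field to be exactly P, compare with == (or anchor the regex with ^ and $) instead of a bare ~ match; a sketch on sample rows:

```shell
printf '%s\n' 'a P 1' 'b Pqr 2' 'c P 3' > in.txt

awk '$2 == "P"' in.txt
# equivalently: awk '$2 ~ /^P$/' in.txt
```

Only the rows whose 2nd field is the single character P are printed; "Pqr" no longer slips through.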
Calculate the total size of the files recursively from the current directory. Hard-linked files are to be considered only once.
Please use awk also. (3 Replies)
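A sketch using GNU find's -printf (an assumption; -printf is not in POSIX find) to emit each file's inode and size, with awk counting every inode only once so hard links are not double-counted:

```shell
find . -type f -printf '%i %s\n' |
awk '!seen[$1]++ { total += $2 }     # count each inode the first time only
     END { print total + 0 }'
```

du -s does the same deduplication internally, but the awk form makes the inode bookkeeping explicit.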
Hi All,
I have multiple files and I need to segregate unique and duplicate records into separate files.
Eg: /source/ -- path
abc_12092016.csv
abc_11092016.csv
abc_12092016.csv
ID,NAME,NUMBER
1,XYZ,1234
2,SDF,3456
1,XYZ,1234
abc_11092016.csv
4,RTY,7890
6,WER,5678
8,YUI,0987
6,WER,5678
in the... (1 Reply)
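A sketch for one file (assuming the first line is a header, as in the sample): first-seen rows go to unique.csv, repeated copies to dups.csv; the file names are illustrative:

```shell
# rebuild the first sample file from the post
printf '%s\n' 'ID,NAME,NUMBER' '1,XYZ,1234' '2,SDF,3456' '1,XYZ,1234' > abc_12092016.csv

awk 'FNR == 1 { next }                           # skip the header row
     !seen[$0]++ { print > "unique.csv"; next }  # first occurrence
     { print > "dups.csv" }' abc_12092016.csv    # repeated copies
```

Passing several files to the same awk command deduplicates across all of them, since seen[] persists between files.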
Hello All,
I am trying to write a script which returns the clientID, programId, and userID indicated in bold from the below log files. The log file has many such entries; I am just presenting a sample.
Sample Log file.
hostname 1525867288264 UA:MP:EP491418 http-nio-8080-exec-11 ERROR Get Price... (13 Replies)
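Since the log line is truncated here, this can only be a hypothetical sketch: if the three IDs are colon-joined as in the UA:MP:EP491418 fragment, grep -o plus awk can pull them apart (the field labels below are assumptions, not from the post):

```shell
# one sample line modelled on the fragment above
echo 'hostname 1525867288264 UA:MP:EP491418 http-nio-8080-exec-11 ERROR' > sample.log

grep -oE '[A-Z]+:[A-Z]+:[A-Z]+[0-9]+' sample.log |
awk -F: '{ print "clientID=" $1, "programId=" $2, "userID=" $3 }'
```

Adjust the regular expression to whatever shape the real token has.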
Discussion started by: nextStep
LEARN ABOUT ULTRIX
zgrep
ZGREP(1)                  General Commands Manual                  ZGREP(1)
NAME
zgrep - search possibly compressed files for a regular expression
SYNOPSIS
zgrep [ grep_options ] [ -e ] pattern filename...
DESCRIPTION
Zgrep invokes grep on compressed or gzipped files. These grep options will cause zgrep to terminate with an error code:
(-[drRzZ]|--di*|--exc*|--inc*|--rec*|--nu*). All other options specified are passed directly to grep. If no file is specified, then the
standard input is decompressed if necessary and fed to grep. Otherwise the given files are uncompressed if necessary and fed to grep.
If the GREP environment variable is set, zgrep uses it as the grep program to be invoked.
EXIT CODE
2 - An option that is not supported was specified.
AUTHOR
Charles Levert (charles@comm.polymtl.ca)
SEE ALSO
grep(1), gzexe(1), gzip(1), zdiff(1), zforce(1), zmore(1), znew(1)
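For illustration, a minimal run (a sketch; zgrep on today's systems behaves as the page describes, searching compressed files directly):

```shell
# create and compress a small file
printf 'alpha\nbeta\n' > notes.txt
gzip -f notes.txt                 # produces notes.txt.gz

zgrep 'beta' notes.txt.gz         # searches without manual decompression
```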