After googling I found a script for removing repeated lines in a file.
Basically, regular expressions are used to remove the repeated lines. The point is, I am not getting a clear idea of how they work...
The first three lines disregard (and do **not** print) blank lines and trim non-blank lines.
The fourth line is just another way of writing Klashxx's one-liner.
Try both of them with sample data and it should make things clearer.
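Since the original script is not shown, here is a common shortcut for the same job (an assumption about what the found script does, not the poster's exact code): the classic awk idiom that keeps only the first occurrence of each line, without needing the input to be sorted.

```shell
# Keep the first occurrence of every line; drop later repeats.
# seen[$0]++ is 0 (false) the first time a line appears, so the
# default action (print) fires only for unseen lines.
printf 'a\nb\na\nc\nb\n' | awk '!seen[$0]++'
```

Unlike `uniq`, this works even when the repeated lines are not adjacent.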
Hi,
could someone help me with this? I want to remove a line from /etc/vfstab on the system. How do I do that?
Right now it looks like this:
/dev/vx/dsk/appdg1/mytestvol /dev/vx/rdsk/appdg1/mytestvol /mytest vxfs 3 no largefiles
/dev/vx/dsk/appdg1/mytestvol1 ... (2 Replies)
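One way to sketch this (shown here against a stand-in file, since editing /etc/vfstab in place on a live system without a backup is risky) is to filter out the entry by its mount point with `grep -v`:

```shell
# Build a small stand-in for /etc/vfstab (sample data, not the real file).
cat > vfstab.sample <<'EOF'
/dev/vx/dsk/appdg1/mytestvol /dev/vx/rdsk/appdg1/mytestvol /mytest vxfs 3 no largefiles
/dev/vx/dsk/appdg1/othervol /dev/vx/rdsk/appdg1/othervol /other vxfs 3 no largefiles
EOF

cp vfstab.sample vfstab.sample.bak          # always keep a backup first
# Drop the line whose mount point field is /mytest; the surrounding
# spaces keep the pattern from matching /mytest2 and the like.
grep -v ' /mytest ' vfstab.sample.bak > vfstab.sample
```

For the real file you would copy `/etc/vfstab` to a backup, filter the backup, and only then replace the original.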
HI,
Can anyone help me with a script.
i/p
calc 1 2 3 4 5 6 7 8
calc 4 5 6 calc 7 8 9
o/p
calc 1 2 3 4 5 6 7 8
calc 4 5 6
i.e. remove everything from the point where the string calc appears a second time on the line.
thanks (3 Replies)
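One way to sketch this in awk (assuming "calc" always appears as a whole field): walk the fields, and stop copying as soon as "calc" is seen a second time.

```shell
printf 'calc 1 2 3 4 5 6 7 8\ncalc 4 5 6 calc 7 8 9\n' |
awk '{ out = ""; seen = 0
       for (i = 1; i <= NF; i++) {
         # Stop at the second occurrence of "calc"; keep everything before it.
         if ($i == "calc" && ++seen == 2) break
         out = out (out == "" ? "" : " ") $i
       }
       print out }'
```

The first line has only one "calc", so it passes through unchanged; the second is cut just before the second "calc".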
Hi all,
I want to remove the bracket sign ( ) and put it in a separate column. I also want to remove repeated entries; for example, in the first row of the input below, (PA156) is repeated.
ESR1 (PA156) leflunomide (PA450192) (PA156) leflunomide (PA450192)
CHST3 (PA26503) docetaxel... (2 Replies)
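The full requirement (moving the bracketed IDs to a separate column) is not entirely clear from the snippet, but the duplicate-removal half can be sketched with awk: drop any whitespace-separated token that has already appeared on the same line.

```shell
echo 'ESR1 (PA156) leflunomide (PA450192) (PA156) leflunomide (PA450192)' |
awk '{ split("", seen); out = ""
       # Keep each token only the first time it appears on this line.
       for (i = 1; i <= NF; i++)
         if (!seen[$i]++) out = out (out == "" ? "" : " ") $i
       print out }'
```

This collapses the repeated `(PA156)` and the repeated `leflunomide (PA450192)` pair, leaving one copy of each token per row.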
Hello, ksh on Sun 5.8 here. I have a pipe-delimited, variable-length record file with sub-segments identified by a tilde, which we receive from a source outside of our control. The records are huge, and Perl seems to be the only tool that can handle such long lines. I am new to Perl, and am... (8 Replies)
Hi, I need to read a line from a file and count the number of times it appears, then continue with the second line in the same way. So when I have counted a line, I have to remove all its duplicates from the file so it is not counted again.
while read line
do
n=$(grep -c "$line" File)
print "$line... (5 Replies)
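Re-scanning the file with `grep -c` for every line is slow and fragile (unquoted lines are treated as regexes, and partial matches count too). A simpler sketch of the same result: `sort | uniq -c` counts every distinct line in one pass.

```shell
# Count how many times each distinct line occurs.
# sort groups identical lines together so uniq -c can count them.
printf 'x\ny\nx\nx\n' | sort | uniq -c
```

Each output line is a count followed by the line itself, so no manual duplicate removal is needed.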
Hi
I have stored a command's output in an array like below:
@a = `xyz`;
The xyz command actually gives output like this:
tracker
date
xxxxxxx
xxxxxxx
---------------------
1 a
2 b
----------------------
i have stored the "xyz" output to an... (3 Replies)
Hi, below is the input file. I need to find the repeated words and sum up their values, which is the second field after each repeated word. I'm trying but getting nowhere close to it. Kindly give me a hint on how to go about it.
Input
fruits,apple,20,fruits,mango,20,veg,carrot,12,veg,raddish,30... (11 Replies)
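Assuming the record is a flat comma-separated list of `category,item,value` triples (as the sample suggests), awk can step through the fields three at a time and accumulate a sum per category:

```shell
echo 'fruits,apple,20,fruits,mango,20,veg,carrot,12,veg,raddish,30' |
awk -F, '{ # Fields come in triples: category, item, value.
           for (i = 1; i + 2 <= NF; i += 3) sum[$i] += $(i + 2)
           for (k in sum) print k, sum[k] }'
```

For the sample line this yields `fruits 40` and `veg 42` (the `for (k in sum)` order is unspecified, so pipe through `sort` if order matters).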
Hi,
I have this text file with these words, and I need help removing the words with repeated letters from these lines.
1 ama
5 bib
29 bob
2 bub
5 civic
2 dad
10 deed
1 denned
335 did
1 eeee
1 eeeee
2 eke
8... (4 Replies)
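Assuming each line is a count followed by a word (as in the sample), one sketch is to check only the word field for a repeated letter and print the line only when every letter is unique:

```shell
printf '1 ama\n3 cat\n5 bib\n42 fox\n' |
awk '{ w = $2; dup = 0; split("", seen)
       # Scan the word one character at a time; flag any repeat.
       for (i = 1; i <= length(w); i++)
         if (seen[substr(w, i, 1)]++) { dup = 1; break }
       if (!dup) print }'
```

Checking only `$2` matters: a plain `grep -v '\(.\).*\1'` over the whole line would also drop lines whose *count* happens to contain a repeated digit.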
Remove duplicate lines which have been repeated 4 times in the attached test.txt.
I tried the command below but am not getting the expected output.
for i in $(uniq test.txt)
do
    num=$(grep -c "$i" test.txt)
    echo "$i $num"
done
test.txt
... (17 Replies)
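The attached test.txt is not available here, so the sketch below uses sample data and one reading of the request (drop lines that occur exactly 4 times, keep the rest). A two-pass awk avoids the shell loop entirely: the first pass counts every line, the second prints only lines whose count is not 4.

```shell
# Sample stand-in for the attached test.txt.
cat > test.sample <<'EOF'
aaa
bbb
aaa
aaa
ccc
aaa
EOF

# Pass 1 (NR==FNR): count each line. Pass 2: print lines not seen 4 times.
awk 'NR == FNR { cnt[$0]++; next } cnt[$0] != 4' test.sample test.sample
```

With the sample above, `aaa` (repeated 4 times) is removed and `bbb` and `ccc` survive. Change `!= 4` to `cnt[$0] < 4` if "repeated 4 times" means 4 or more.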
Discussion started by: Kalia
17 Replies
LEARN ABOUT BSD
uniq
UNIQ(1)                   General Commands Manual                      UNIQ(1)

NAME
uniq - report repeated lines in a file
SYNOPSIS
uniq [ -udc [ +n ] [ -n ] ] [ input [ output ] ]
DESCRIPTION
Uniq reads the input file comparing adjacent lines. In the normal case, the second and succeeding copies of repeated lines are removed;
the remainder is written on the output file. Note that repeated lines must be adjacent in order to be found; see sort(1). If the -u flag
is used, just the lines that are not repeated in the original file are output. The -d option specifies that one copy of just the repeated
lines is to be written. The normal mode output is the union of the -u and -d mode outputs.
The -c option supersedes -u and -d and generates an output report in default style but with each line preceded by a count of the number of
times it occurred.
The n arguments specify skipping an initial portion of each line in the comparison:
-n The first n fields together with any blanks before each are ignored. A field is defined as a string of non-space, non-tab charac-
ters separated by tabs and spaces from its neighbors.
+n The first n characters are ignored. Fields are skipped before characters.
SEE ALSO
       sort(1), comm(1)

7th Edition                     April 29, 1985                         UNIQ(1)