Remove the duplicate content in a file - Post 302843251 by ashokvpp on Monday 12th of August 2013, 12:51:05 PM
The blocks occur a random number of times; all I need is to print just the first occurrence of each start-pattern match.

---------- Post updated at 11:51 AM ---------- Previous update was at 11:42 AM ----------

Code:
awk '!A[$0]++' RS="\n\n\n" test.txt    # works great
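One thing worth noting: a multi-character RS such as "\n\n\n" is a gawk extension (a strictly POSIX awk only honours the first character of RS), and with the default ORS the blank lines between the blocks are dropped from the output. A rough alternative sketch, assuming the blocks in test.txt are separated by one or more blank lines, is to use paragraph mode instead:

Code:
# RS=""  -> paragraph mode: each record is a block delimited by blank lines
# ORS="\n\n" -> put one blank line back between the blocks that are kept
# !A[$0]++ -> true only the first time a given block is seen, so duplicates are skipped
awk 'BEGIN{RS=""; ORS="\n\n"} !A[$0]++' test.txt

Paragraph mode treats any run of blank lines as a single separator, so it is a little more forgiving than requiring exactly two empty lines between blocks.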

Have a good day :)

Last edited by Scott; 08-12-2013 at 01:52 PM. Reason: Code tags
 
