UNIX for Beginners Questions & Answers - Advice to print lines before and after pattern match and checking and removing duplicate files
Posted by newbie_01 on Sunday 26th of April 2020 10:58:51 PM
Hi Jim,


The grep works in Linux but not in Solaris. Sorry, I forgot to mention that the OS is SunOS <hostname> 5.11 11.3 sun4v sparc sun4v.
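
If the problem is just that the stock Solaris /usr/bin/grep has no -A/-B context options (that is only my assumption here), I am thinking of falling back to nawk to print a couple of lines before and after each match, roughly like the sketch below. The 2-lines-before / 2-lines-after counts and the file name are placeholders, and it does not merge overlapping context blocks the way GNU grep does.

Code:
# rough sketch: print 2 lines before and 2 lines after each match
# of "CORRUPTION DETECTED" (B and A are assumed counts, adjust as needed)
nawk -v B=2 -v A=2 '
{
    hit = ($0 ~ /CORRUPTION DETECTED/)
    if (hit) {
        # print up to B buffered lines that came before the match
        for (i = NR - B; i < NR; i++)
            if (i > 0) print buf[i % B]
        print
        after = A
    } else if (after > 0) {
        # still inside the after-context of an earlier match
        print
        after--
    }
    buf[NR % B] = $0     # remember every line for a possible later match
}' /path/to/logs/logfile.log     # placeholder file name
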


Yeah, the code below works and files.tmp does have the list of files with their checksums; I only need to retain one of the files. Trying to work out how to sort the output AND retain just the lowest-numbered file.



Code:
cd /path/to/logs

grep -l "CORRUPTION DETECTED" *.log  |
while read fname
do
   cksum "$fname"
done | sort -n -k1 > files.tmp
# files.tmp has a sorted list of files - by checksum




Code:
$: cat files.tmp
1237008222      10664   log.10
1237008222      10664   log.12
1237008222      10664   log.14
1237008222      10664   log.16
1237008222      10664   log.18
1237008222      10664   log.2
1237008222      10664   log.4
1237008222      10664   log.6
1237008222      10664   log.8
2296620157      10696   log.1
2296620157      10696   log.11
2296620157      10696   log.13
2296620157      10696   log.15
2296620157      10696   log.17
2296620157      10696   log.3
2296620157      10696   log.5
2296620157      10696   log.7
2296620157      10696   log.9


So from the list above, I only want to retain log.1 and log.2, so kind of like grouping the output list above by checksum and retaining the lowest-numbered file in each group. Googling at the moment to see if there is an easier way of deleting from the files.tmp list than how I am doing it below:


Code:
#!/bin/ksh
#

# list of unique checksums
awk '{ print $1 }' files.tmp | sort -u > tmp.00

# for each checksum, order its files numerically by the suffix after the dot
# and remove all but the first (lowest-numbered) one
while read checksum
do
   grep "^$checksum" files.tmp | sort -n -t. -k2 | awk 'NR>1 { print $3 }' | xargs rm
done < tmp.00
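

For the record, this is the shorter version I am experimenting with instead of going through tmp.00: re-sort files.tmp so that, within each checksum, the lowest-numbered log comes first, then print every file except the first one per checksum. Only a sketch so far, it assumes all the file names look like log.N, and I have not run it on the Solaris box yet.

Code:
#!/bin/ksh
# sketch: group files.tmp by checksum and keep the lowest-numbered log per group
# sort on '.'-separated fields: key 1 = "checksum size log", key 2 = numeric suffix
sort -t'.' -k1,1 -k2,2n files.tmp |
awk 'seen[$1]++ { print $3 }' |
while read fname
do
   # swap the echo for: rm "$fname"   once the output looks right
   echo "duplicate: $fname"
done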


BTW, what is the code below? I think there is something missing here. Is oldfile supposed to be the script that does the checksums, and do I then run the code below?



Code:
oldsum=0
oldfile
ls logfile* | 
while read sum size name
do
   if [  "$sum" -eq $oldsum ] ; then
      echo "$oldname and $name are duplicates"
      # put a rm command here after you see this work correctly for you
      # assuming you delete the second file name
      continue
   fi
   oldname=$name
   oldsum=$sum
done
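
Just to check my understanding, my guess is that it was meant to look something like the below, with oldfile actually being a truncated oldname= assignment and the loop reading from the sorted files.tmp rather than an ls listing (otherwise read sum size name would not get a checksum). That is only my assumption though, please correct me if I am wrong.

Code:
#!/bin/ksh
# my guess at the intended snippet - an assumption, not the original code
oldsum=0
oldname=""

# files.tmp is sorted by checksum, so duplicate files end up on adjacent lines
while read sum size name
do
   if [ "$sum" -eq "$oldsum" ] ; then
      echo "$oldname and $name are duplicates"
      # put a rm command here after you see this work correctly for you,
      # assuming you delete the second file name, e.g.:  rm "$name"
      continue
   fi
   oldname=$name
   oldsum=$sum
done < files.tmp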

 
