Shell Programming and Scripting: Remove all but newest two files (Not a duplicate post)
Post 303022069 by drew77 on Thursday 23rd of August 2018, 11:30:46 AM
Quote:
Originally Posted by vgersh99
try this:
Code:
REGEX='[0-9]{4}-[0-9]{2}-[0-9]{2}-[0-9]{2}-[0-9]{2}[.]zip'

This isn't deleting any files.

Code:
#!/bin/bash
TARGET_DIR='/media/andy/MAXTOR_SDB1/Ubuntu_Mate_18.04/Script_Backups/'
REGEX='[0-9]{4}-[0-9]{2}-[0-9]{2}-[0-9]{2}-[0-9]{2}[.]zip'
LATEST_FILE="$(ls "$TARGET_DIR" | egrep "^${REGEX}$" | tail -1)"

find "$TARGET_DIR" ! -name "$LATEST_FILE" -type f -regextype egrep -regex ".*/${REGEX}$" -exec rm -f {} +


Last edited by vgersh99; 08-23-2018 at 12:50 PM.. Reason: code tags, please!
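
If nothing is being deleted, the first thing to check is whether the regex actually matches the file names: ls "$TARGET_DIR" | egrep "^${REGEX}$" should list the backups. If the backups are named like Ubuntu_Documents.zip_09Aug2018_12_00, as in the earlier "Delete all but 3 newest files" thread, they will not match the YYYY-MM-DD-HH-MM pattern and find will have nothing to remove. Also note that tail -1 keeps only the single newest file, while the thread title asks for the newest two. Below is a minimal, untested sketch that keeps the two newest matching files instead; it assumes GNU head (for head -n -2) and that the names really do follow the YYYY-MM-DD-HH-MM.zip pattern, which sorts chronologically as plain text.
Code:
#!/bin/bash
# Sketch only: keep the two newest matching backups, delete the rest.
# Assumes GNU head and names in the YYYY-MM-DD-HH-MM.zip format.
TARGET_DIR='/media/andy/MAXTOR_SDB1/Ubuntu_Mate_18.04/Script_Backups/'
REGEX='[0-9]{4}-[0-9]{2}-[0-9]{2}-[0-9]{2}-[0-9]{2}[.]zip'

cd "$TARGET_DIR" || exit 1
ls | egrep "^${REGEX}$" | sort | head -n -2 | while IFS= read -r f; do
    rm -f -- "$f"
done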
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

remove duplicate files in a directory

Hi ppl. I have to check for duplicate files in a directory. The directory has the following files: /the/folder /containing/the/file a1.yyyymmddhhmmss a1.yyyyMMddhhmmss b1.yyyymmddhhmmss b2.yyyymmddhhmmss c.yyyymmddhhmmss d.yyyymmddhhmmss d.yyyymmddhhmmss, where the date-time stamp can be... (1 Reply)
Discussion started by: asinha63
1 Reply
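
For a case like this, a minimal sketch, assuming "duplicate" means two files sharing the same base name before the timestamp suffix (my reading of the example), could be:
Code:
#!/bin/bash
# Sketch: report base names (the part before the first dot) that occur
# more than once in DIR -- i.e. candidate duplicate files.
DIR=${1:-.}
ls "$DIR" | awk -F. '{print $1}' | sort | uniq -d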

2. Shell Programming and Scripting

remove all duplicate lines from all files in one folder

Hi, is it possible to remove all duplicate lines from all txt files in a specific folder? This is too hard for me; maybe someone could help. Let's say we have some number of text files, 1 or 2 or 3 or... maximum 50. Each text file has lines with text. I want all lines of all textfiles... (8 Replies)
Discussion started by: lowmaster
8 Replies
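
A minimal sketch for this one, assuming "duplicate" means repeated whole lines within each file and that rewriting the files in place is acceptable:
Code:
#!/bin/bash
# Sketch: drop repeated lines in every .txt file in FOLDER, keeping the
# first occurrence and the original order (awk '!seen[$0]++').
FOLDER=${1:-.}
for f in "$FOLDER"/*.txt; do
    [ -f "$f" ] || continue
    awk '!seen[$0]++' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done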

3. Shell Programming and Scripting

Remove duplicate files based on text string?

Hi, I have been struggling with a script for removing duplicate messages from a shared mailbox. I would like to search for duplicate messages based on the “Message-ID” string within the message files. I have managed to find the duplicate “Message-ID” strings and (if I would like) delete... (1 Reply)
Discussion started by: spangberg
1 Reply
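
One possible starting point for finding the duplicated Message-ID headers, assuming one message per file under a mail directory:
Code:
#!/bin/bash
# Sketch: list Message-ID values that occur more than once under MAILDIR.
# -r recurse, -i ignore case, -h suppress file names.
MAILDIR=${1:-.}
grep -rih '^Message-ID:' "$MAILDIR" | sort -f | uniq -di

Mapping a duplicated ID back to the files that contain it (e.g. with grep -rl) would then tell you which copies are candidates for deletion.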

4. Shell Programming and Scripting

Remove duplicate files in same directory

Hi all. I am doing continuous backup of mailboxes using rsync, so whenever a new mail arrives it is automatically copied to the backup server. When a new mail arrives it is named xyz:2; when it is read by the email client an S is appended, giving xyz:2,S. Eventually, 2 copies of the same file exist on... (7 Replies)
Discussion started by: coolatt
7 Replies
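
A sketch for this Maildir case, assuming the duplicate is always an unflagged xyz:2 sitting next to a flagged xyz:2,S in the backup directory:
Code:
#!/bin/bash
# Sketch: if both "name:2" and "name:2,S" exist in BACKUP_DIR, remove the
# unflagged copy and keep the one marked as seen.
BACKUP_DIR=${1:-.}
for f in "$BACKUP_DIR"/*:2; do
    [ -e "$f" ] || continue        # glob may not match anything
    [ -e "${f},S" ] && rm -f -- "$f"
done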

5. Shell Programming and Scripting

Remove Duplicate Files On Remote Servers

Hello, I wrote a basic script that works; however, I was wondering if it could be sped up. I am comparing files over ssh to remove the file from the source server directory if a match occurs. Please advise me on my mistakes. #!/bin/bash for file in `ls /export/home/podcast2/"$1" ` ; do ... (5 Replies)
Discussion started by: jaysunn
5 Replies
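
The usual way to speed something like this up is to avoid one ssh connection per file. A rough sketch (REMOTE and REMOTE_DIR are placeholders, not from the original post) that pulls all remote checksums in a single ssh call:
Code:
#!/bin/bash
# Sketch: fetch every remote md5sum once, then delete local files whose
# checksum appears in that list. Assumes file names without spaces.
REMOTE='user@backuphost'            # placeholder
REMOTE_DIR='/export/home/podcast2'  # placeholder
LOCAL_DIR="/export/home/podcast2/$1"

remote_sums=$(ssh "$REMOTE" "cd '$REMOTE_DIR' && md5sum *")

for file in "$LOCAL_DIR"/*; do
    [ -f "$file" ] || continue
    sum=$(md5sum "$file" | awk '{print $1}')
    printf '%s\n' "$remote_sums" | grep -q "^$sum " && rm -f -- "$file"
done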

6. Shell Programming and Scripting

perl/shell need help to remove duplicate lines from files

Dear All, I have multiple files with a number of records, consisting of more than 10 columns. Some column values are duplicated and I want to remove these duplicate values from these files. Duplicate values may come in different files... all files lying in a single directory. Need help to... (3 Replies)
Discussion started by: arvindng
3 Replies
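
Assuming "duplicate" here means a whole line that has already appeared in an earlier file (the post is ambiguous about columns vs. lines), a compact awk sketch:
Code:
#!/bin/bash
# Sketch: print each distinct line only the first time it is seen across
# all files in DIR, writing the result to FILE.dedup next to each input.
DIR=${1:-.}
awk '!seen[$0]++ { print > (FILENAME ".dedup") }' "$DIR"/*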

7. Shell Programming and Scripting

[uniq + awk?] How to remove duplicate blocks of lines in files?

Hello again, I want to remove all duplicate blocks of XML code in a file. This is an example. Input: <string-array name="threeItems"> <item>item1</item> <item>item2</item> <item>item3</item> </string-array> <string-array name="twoItems"> <item>item1</item> <item>item2</item>... (19 Replies)
Discussion started by: raidzero
19 Replies
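
For whole-block deduplication like the string-array example, one sketch, assuming the blocks are not nested, the tags sit on their own lines, and duplicates are byte-for-byte identical:
Code:
#!/bin/bash
# Sketch: collect each <string-array>...</string-array> block as one unit
# and print it only the first time that exact block is seen.
awk '
/<string-array/    { inblock = 1; block = "" }
inblock            { block = block $0 ORS }
/<\/string-array>/ {
    inblock = 0
    if (!seen[block]++) printf "%s", block
    next
}
!inblock           { print }   # lines outside the blocks pass through
' input.xml > output.xml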

8. Shell Programming and Scripting

Remove duplicate files

Hi, in a directory, e.g. ~/corpus, there are a lot of files and subdirectories. Some of the files are named: 12345___PP___0902___AA.txt 12346___PP___0902___AA.txt 12347___PP___0902___AA.txt The number of files varies. I need to keep the highest (12347___PP___0902___AA.txt) and remove... (5 Replies)
Discussion started by: corfuitl
5 Replies
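
A sketch for the "keep the highest" case, assuming the files of interest match the NNNNN___PP___0902___AA.txt naming shown and sit in a single directory (subdirectories would need find instead), with GNU coreutils available for head -n -1:
Code:
#!/bin/bash
# Sketch: among files matching the pattern, sort numerically by the leading
# number and delete everything except the highest one.
DIR=${1:-.}
cd "$DIR" || exit 1
ls | grep -E '^[0-9]+___PP___0902___AA\.txt$' | sort -n | head -n -1 | xargs -r rm -f --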

9. Windows & DOS: Issues & Discussions

Remove duplicate lines from text files.

So, I have two text files, one "fail.txt" and one "color.txt". I now want to use a command line (DOS) to remove ANY line that is PRESENT IN BOTH from each text file. Afterwards there shall be no duplicate lines. (1 Reply)
Discussion started by: pasc
1 Reply
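
I only know the Unix-side answer here; assuming a Unix-like environment is available (e.g. WSL or Cygwin rather than plain DOS), removing from each file the lines present in the other could look like:
Code:
#!/bin/bash
# Sketch: grep -Fvxf drops from the first file any whole line that also
# appears in the pattern file; both results are built before overwriting.
grep -Fvxf color.txt fail.txt  > fail.clean
grep -Fvxf fail.txt  color.txt > color.clean
mv fail.clean fail.txt
mv color.clean color.txt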

10. Shell Programming and Scripting

Delete all but 3 newest files

This is related to my post on backing up files. I really appreciate all the help too. :-) I would like to delete all but the 3 newest files in my backup directory, /media/andy/MAXTOR_SDB1/Ubuntu_Mate_18.04/. For example, Ubuntu_Documents.zip_09Aug2018_12_00... (2 Replies)
Discussion started by: drew77
2 Replies
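
Since the current thread grew out of that one, a minimal sketch for "all but the 3 newest" by modification time, assuming GNU tools and no newlines in the file names:
Code:
#!/bin/bash
# Sketch: list files newest-first, skip the first three, delete the rest.
DIR='/media/andy/MAXTOR_SDB1/Ubuntu_Mate_18.04/'
cd "$DIR" || exit 1
ls -t | tail -n +4 | while IFS= read -r f; do
    [ -f "$f" ] && rm -f -- "$f"
done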