Shell Programming and Scripting: How to bypass rm command when there are no files to delete?
Post 303011369 by Scrutinizer on Wednesday 17th of January 2018 11:05:12 AM
Quote:
Originally Posted by mohtashims
Hi,

The command below works fine when there are other files apart from hello.txt:
Code:
ls | ggrep -v hello* | xargs rm -rf

But if hello.txt is the only file, rm does not find anything to delete and the script hangs.

There are trivial ways to handle this, e.g. an if condition that triggers the rm command only when files other than hello* exist, but I was looking for a better solution if anyone can suggest one.
In
Code:
ls | ggrep -v hello* | xargs rm -rf

the ggrep command has incorrect syntax: hello* is an unquoted glob, so the shell expands it to matching filenames before ggrep ever runs, and the pattern ggrep receives depends on whatever happens to be in the directory. I suggest you try a quoted, anchored pattern instead:
Code:
ggrep -v '^hello'

Also, it is better to use plain rm rather than rm -rf here: you are removing regular files, and -rf would silently descend into directories and suppress errors.
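For the underlying problem (nothing left for rm to delete once hello* is filtered out), a minimal sketch along these lines avoids running rm on empty input. It assumes GNU xargs for the -r (--no-run-if-empty) option and GNU find for -maxdepth, and that the filenames contain no spaces or newlines:
Code:
# Delete everything in the current directory except names starting with "hello".
# With -r, GNU xargs simply skips the rm invocation when it receives no input.
ls | ggrep -v '^hello' | xargs -r rm --

# Alternative that does not parse ls output (assumes GNU find):
find . -maxdepth 1 -type f ! -name 'hello*' -exec rm -- {} +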
 

10 More Discussions You Might Find Interesting

1. Solaris

How to delete Directory and inside files using Find command

I am using the following command to delete a directory with its contents, but it is deleting only the files inside, not the directories. Is there any change needed in my command? find -type f -mtime +3 -exec rm -r {} \; Thanks (3 Replies)
Discussion started by: bmkreddy
3 Replies
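For that kind of cleanup, the -type f test is what keeps directories from ever reaching rm. A minimal sketch of the usual fix, with an illustrative path (the original post does not give one):
Code:
# Without -type f, directories older than 3 days match as well; -depth processes
# a directory's contents before the directory itself.
find /some/dir -depth -mtime +3 -exec rm -rf {} \;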

2. Solaris

Not able to delete the files through find command; need expert advice

Hi Gurus, I am facing a problem: there is a folder called /a containing lots of files that occupy anywhere between 30 GB and 100 GB. I am not able to check the space occupied by that folder with the "du -sh /a" command, as I don't see any output after more than 1 hour of... (4 Replies)
Discussion started by: amity
4 Replies

3. Shell Programming and Scripting

Need a command to delete all files apart from last file generated

Hi, I have a directory named /output. This directory has files in the formats below: abc.*, xyz*, djj*, iwe*, weewe*, rier*, 3948903ddfgf*. These files are generated at random. What I need to do is delete all the files of every kind but keep only the last generated file of... (5 Replies)
Discussion started by: dazdseg
5 Replies
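One common way to keep only the newest file is to sort by modification time and remove everything after the first entry. A minimal sketch, assuming GNU tools and filenames without spaces or newlines; the directory is the one named in the post:
Code:
# List newest first, skip the first (most recent) file, delete the rest.
cd /output && ls -t | tail -n +2 | xargs -r rm --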

4. Solaris

Need to know command to delete more than 3 million files from /var/spool/clientmqueue

Hi, I need to delete more than 3 million files from /var/spool/clientmqueue. When I give the following command to delete the files, I get an error: # pwd /var/spool/clientmqueue # rm -f * /usr/bin/rm: arg list too long Please tell me how I can delete the files. (5 Replies)
Discussion started by: sb200
5 Replies
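The "arg list too long" error comes from the shell expanding * past the kernel's argument-size limit. A sketch of the usual workaround, assuming a find that supports the POSIX -exec ... {} + form, which batches the names itself:
Code:
cd /var/spool/clientmqueue
# find hands the filenames to rm in batches, so ARG_MAX is never exceeded.
find . -type f -exec rm -f {} +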

5. UNIX for Dummies Questions & Answers

Find command to delete old files

Hi, I want to delete all the log files that were created in the year 2008. My command is not working. Any idea? find . -name '*.log' -mtime 1460 -exec ls -lt {} \; Thank you. (2 Replies)
Discussion started by: samnyc
2 Replies
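A single -mtime value cannot express "during 2008"; one portable approach is to bracket the year with two reference timestamps. A minimal sketch, with illustrative reference-file paths:
Code:
# Reference files marking the start and end of 2008.
touch -t 200801010000 /tmp/ref_2008_start
touch -t 200901010000 /tmp/ref_2009_start

# List *.log files last modified within that window (swap ls -lt for rm once verified).
find . -name '*.log' -newer /tmp/ref_2008_start ! -newer /tmp/ref_2009_start -exec ls -lt {} \;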

6. Shell Programming and Scripting

Find command to delete the files

Hi All, I've created 2 files: touch -t 201309101234 aa10 and touch -t 201309111234 aa11. Exactly 60 days before today's date is Sept 12th. As per the following command, since I gave +60, the files which were created before Sept 12th should be deleted: find /etc/logs/*aa* -type f -atime +60... (5 Replies)
Discussion started by: smile689
5 Replies
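Two details usually trip this up: -atime tests access time rather than modification time, and +60 means strictly more than 60 whole 24-hour periods. A hedged sketch using -mtime and letting find do the name matching instead of the shell glob:
Code:
# Remove files under /etc/logs whose names contain "aa" and that were last
# modified more than 60 full days ago (+60 excludes day 60 itself).
find /etc/logs -type f -name '*aa*' -mtime +60 -exec rm -- {} \;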

7. AIX

Bypass Read Line in profile through AIX command

Hi All, I have a complicated requirement wherein I have a "root" user and a user named "xeadmin". I want to switch to "xeadmin" with the command sudo su - xeadmin. Later I need to hit 2 enter keys, as there are 2 read line commands inserted in the profile of "xeadmin", before I reach the command prompt; I need... (1 Reply)
Discussion started by: hiteshsathawane
1 Replies

8. Shell Programming and Scripting

Command to delete half of files in directory.

Hello Friends, I have a directory called /tmp which stores the log files. Whenever it becomes full, I want to delete half of the log files. Even after deleting those files, if space usage is still more than 90%, it should delete the remaining half. While deleting files, older files... (7 Replies)
Discussion started by: Nakul_sh
7 Replies
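A minimal sketch of the "delete the older half first" idea, assuming GNU tools and log filenames without spaces or newlines; the directory comes from the post:
Code:
# Count the files, then delete the oldest half (ls -t lists newest first,
# so the tail of that listing is the oldest portion).
cd /tmp && total=$(ls | wc -l) && ls -t | tail -n "$((total / 2))" | xargs -r rm --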

9. Shell Programming and Scripting

Command to delete a word in all files

Can anyone tell me what the command is to delete the particular word below in all the files located in one particular file path? files/ll> grep "/ftp/" test.kell ftp -m uskmc -d /ftp/ -i filename.zip Output should be: ftp -m uskmc -d -i filename.zip (4 Replies)
Discussion started by: ramkumar15
4 Replies
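This is a straightforward substitution job for sed. A sketch assuming GNU sed (the in-place -i option is not POSIX); the directory path is illustrative:
Code:
# Remove the literal string "/ftp/ " wherever it appears in one file.
sed -i 's|/ftp/ ||g' test.kell

# Same edit across every regular file in a directory:
for f in /some/path/*; do [ -f "$f" ] && sed -i 's|/ftp/ ||g' "$f"; done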

10. Shell Programming and Scripting

Please - Looking for an online command to delete files

I have a list of files like the one below. I want to delete files older than 2 days, except S0000000.LOG. I have the command find /export/home/X_GZPQJK/out/file* -mtime +1 -exec rm {} \; but it is deleting S0000000.LOG as well. Can you please help me modify the command so that it deletes everything except S0000000.LOG? $ ls -ltr... (2 Replies)
Discussion started by: prince1987
2 Replies
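The usual fix is to add a negated -name test so find never offers that file to rm. A minimal sketch, reusing the path from the post:
Code:
# Delete files older than 2 days under the output directory, but never S0000000.LOG.
find /export/home/X_GZPQJK/out -type f -mtime +1 ! -name 'S0000000.LOG' -exec rm -- {} \;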