Post by ST2000 on Tuesday, 27 August 2002, 10:49 AM
remove some files on a condition..

Hi.. when I do an ls -lt, I get a listing of about 200 files. These are trace files, and some of them I might not need.

To be clear, say in a given week I might not need the files that were traced between 11:00 and 11:30 am on a particular day. How can I delete files based on this condition?

Thanks, ST2000
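
One way to approach this (a minimal sketch, not from the thread: the trace directory, the *.trc name pattern, and the 27 Aug 2002 date are placeholders to adjust): build two reference timestamps with touch -t, then let find select everything modified after the first and not after the second.

#!/bin/sh
# Delete trace files modified between 11:00 and 11:30 on a given day.
# touch -t takes CCYYMMDDhhmm; create two marker files bracketing the window.

touch -t 200208271100 /tmp/win_start    # window start: 11:00
touch -t 200208271130 /tmp/win_end      # window end:   11:30

# Preview the candidates first
find /path/to/traces -type f -name '*.trc' \
    -newer /tmp/win_start ! -newer /tmp/win_end -print

# Once the list looks right, swap -print for the delete:
# find /path/to/traces -type f -name '*.trc' \
#     -newer /tmp/win_start ! -newer /tmp/win_end -exec rm {} \;

rm -f /tmp/win_start /tmp/win_end

On a system with GNU find, the same window can be expressed without the marker files, e.g. -newermt '2002-08-27 11:00' ! -newermt '2002-08-27 11:30'. Either way, run with -print before committing to the rm.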
 
