Full Discussion: moving files prior to 2 days
Post 302073049 by roderashe on Tuesday 9th of May 2006 10:38:52 PM
When I use this in my script I get a bunch of errors. I thought I could use xargs, but I get this:

usage: mv [-f | -i | -n] [-v] source target
mv [-f | -i | -n] [-v] source ... directory

Here's my code:

find /home/pavi/logs/ -mtime +21 -exec ls {} \; | xargs mv /home/pavi

Any thoughts? Thanks.
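One likely cause: xargs appends the file names it reads from the pipe after the arguments already on the command line, so mv gets invoked as mv /home/pavi file1 file2 ... with the operands in the wrong order, and if nothing comes through the pipe at all, mv is run with /home/pavi as its only operand, which is exactly what triggers the usage message shown above. The -exec ls {} \; is also redundant, since find prints matching paths by itself. A minimal sketch of two common fixes, reusing the paths and the +21-day cutoff from the post (-type f is an addition to skip directories; xargs -I is POSIX):

# let find do the move itself, one file per mv call
find /home/pavi/logs/ -type f -mtime +21 -exec mv {} /home/pavi \;

# or use xargs -I to put each file name where mv expects it
find /home/pavi/logs/ -type f -mtime +21 | xargs -I {} mv {} /home/pavi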
 

9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

ls latest 4 days or specify days of files in the directory

Hi, I would like to list the latest 2 days, 3 days or 4 days, etc. of files in the directory... how? Is it done with ls? (3 Replies)
Discussion started by: happyv
3 Replies
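ls by itself cannot filter on age, but find can; a minimal sketch, assuming the files sit directly in the directory of interest and that "latest 2 days" means modified within the last 2 days (-maxdepth is a GNU/BSD extension, drop it to recurse):

# files modified within the last 2 days; change -2 to -3, -4, ... as needed
find /path/to/dir -maxdepth 1 -type f -mtime -2 -exec ls -l {} +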

2. Shell Programming and Scripting

List files created before Noon 2 days prior

Our nightly updates run in the evening and finish around 8am. My boss wants the current log files kept on the server for 2 days, but wants anything created before noon, 2 days prior archived. I was thinking of using touch to set a temporary file with a date of today-2 and a time of noon, then... (3 Replies)
Discussion started by: prismtx
3 Replies
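The reference-file idea described in the post can be sketched as follows; the log and archive paths are placeholders, and date -d is a GNU extension, so the noon-two-days-ago timestamp would need to be computed differently on other systems:

# stamp a reference file at noon, two days ago (GNU date)
touch -t "$(date -d '2 days ago' +%Y%m%d)1200" /tmp/ref_noon

# archive anything not newer than the reference
find /path/to/logs -type f ! -newer /tmp/ref_noon -exec mv {} /path/to/archive \;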

3. UNIX for Advanced & Expert Users

File disk utilization for 10 days prior

Hi, I have a requirement to list the files, and the total disk utilization they take up, that are 10 days prior to the current date. I tried a couple of options combining find -mtime/-ctime with du -m, but no luck. Could you please help me with this? (2 Replies)
Discussion started by: videsh77
2 Replies
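A minimal sketch, assuming "disk utilization" means the size in megabytes of files last modified more than 10 days ago; the directory is a placeholder, and du -m is a GNU/BSD option (POSIX only guarantees -k):

# per-file usage plus a grand total, for files older than 10 days
find /path/to/dir -type f -mtime +10 -exec du -m {} + | awk '{sum += $1; print} END {print "TOTAL MB:", sum}'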

4. Shell Programming and Scripting

Shell Script for moving 3 days old file to Archive Folder

Hi Experts, I have a "Source" folder which may contain some files. I need a shell script that moves all files older than 3 days to an "Archive" folder. Thanks in Advance... (4 Replies)
Discussion started by: phani333
4 Replies
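A minimal sketch of such a script; the Source and Archive paths are placeholders:

#!/bin/sh
# move files older than 3 days from Source to Archive
find /path/to/Source -type f -mtime +3 -exec mv {} /path/to/Archive/ \;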

5. UNIX for Dummies Questions & Answers

Moving Multiple files to destination files

I am running code like this:
foreach list ($tmp)
mv *_${list}.txt ${chart}_${list}.txt   # mv: when moving multiple files, last argument must be a directory
mv *_${list}.doc ${chart}_${list}.doc   # mv: when moving multiple files, last argument must be a... (3 Replies)
Discussion started by: animesharma
3 Replies
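The error means the glob expands to several files while the target is a single name, so mv refuses; renaming has to happen one file at a time. A rough sh equivalent of the csh loop above (the variable names come from the snippet, everything else is an assumption):

for list in $tmp; do
    for f in *_"${list}".txt; do
        [ -e "$f" ] || continue
        # one source and one target per mv call avoids the error, but if
        # several files match the same pattern they would still collide on
        # a single target name, so the naming scheme itself matters here
        mv "$f" "${chart}_${list}.txt"
    done
done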

6. UNIX for Dummies Questions & Answers

Need Help in reading N days files from a Directory & combining the files

Hi All, Request your expertise in tackling one requirement in my project (I don't have much expertise in shell scripting). The requirement is as below: 1) We store the last run date of a process in a file. When the batch runs the next time, it should read this file, get the last run date from... (1 Reply)
Discussion started by: dsfreddie
1 Replies
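Only the first step is quoted, so this is a loose sketch of the idea as described: keep a marker file whose modification time records the previous run, let find pick up anything newer, and combine it. The file names and paths are assumptions:

#!/bin/sh
# pick up files changed since the last run and combine them
find /path/to/incoming -type f -newer /path/to/last_run -exec cat {} + > combined.out

# stamp the marker for the next run
touch /path/to/last_run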

7. Shell Programming and Scripting

Finding files with wc -l results = 1 then moving the files to another folder

Hi guys, can you please help me with a script to find files with one row/1 line of content and then move them to another directory? My script below runs but nothing happens to the files.... Alternatively, can I get a script to find the *.csv files with "wc -l" results = 1 and then create a list of those... (5 Replies)
Discussion started by: Dj Moi
5 Replies
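A minimal sketch, assuming "one row" means wc -l reports exactly 1 and that the destination directory already exists (both paths are placeholders):

#!/bin/sh
# move *.csv files that contain exactly one line
for f in /path/to/src/*.csv; do
    [ -f "$f" ] || continue
    if [ "$(wc -l < "$f")" -eq 1 ]; then
        mv "$f" /path/to/dest/
    fi
done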

8. AIX

Moving Hidden files to normal files

I have a bunch of hidden files in a directory in AIX. I would like to move these hidden files as regular files to another directory. Say I have the following files in directory /x: .test~1234~567 .report~5678~123 .find~9876~576 I would like to move them to directory /y as test~1234~567... (10 Replies)
Discussion started by: umesh.narain
10 Replies
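A minimal sketch of the move-and-rename, assuming a POSIX shell and that only the leading dot should be dropped; the /x and /y directories and the ~-style names come from the post:

#!/bin/sh
# move dot files from /x to /y, dropping the leading dot
for f in /x/.*~*; do
    [ -f "$f" ] || continue
    base=$(basename "$f")
    mv "$f" "/y/${base#.}"    # ${base#.} strips the leading dot
done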

9. Shell Programming and Scripting

How to create zip/gz/tar files if the files are older than a particular number of days in UNIX or Linux?

I need a script to back up (zip, tar, or gz) old log files on our UNIX server (they are causing a space problem). Could you please help me create the zip or gz files for each log file in the current directory and its sub-directories also? I found one command which is to create a gz file for the... (4 Replies)
Discussion started by: Mallikgm
4 Replies
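A minimal sketch that compresses each old log file in place with gzip; the '*.log' pattern and the 30-day cutoff are assumptions:

#!/bin/sh
# gzip every *.log file older than 30 days, in this directory and below
find . -type f -name '*.log' -mtime +30 -exec gzip {} \;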