Full Discussion: moving files prior to 2 days
UNIX for Dummies Questions & Answers. Post 302073149 by gthokala on Wednesday 10th of May 2006, 10:48:36 AM
Please post exactly what you are typing to move the files.

Copy the command below, change the paths to match your system, and run it; it should move the files that are more than 2 days old.

mv `find /home/pavi/logs/*.* -mtime +2 -exec ls {} \;` /home/pavi
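If that backquoted form gives trouble (command substitution breaks on filenames containing spaces and errors out when find matches nothing), a more robust sketch using the same paths is:

# Move regular files older than 2 days out of the log directory.
# -type f skips directories; -mtime +2 means "last modified more than 2 days ago".
find /home/pavi/logs -type f -mtime +2 -exec mv {} /home/pavi \;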
 

9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

ls latest 4 days or specify days of files in the directory

Hi, I would like to list the latest 2 days, 3 days, or 4 days, etc., of files in a directory... how? Is it done with ls? (3 Replies)
Discussion started by: happyv
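A sketch of one way to do this with find rather than ls (GNU find's -maxdepth keeps it to the current directory; change -4 to whatever day count you need):

# Long-list files modified within the last 4 days.
find . -maxdepth 1 -type f -mtime -4 -exec ls -l {} +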

2. Shell Programming and Scripting

List files created before Noon 2 days prior

Our nightly updates run in the evening and finish around 8am. My boss wants the current log files kept on the server for 2 days, but wants anything created before noon two days prior to be archived. I was thinking of using touch to set a temporary file with a date of today-2 and a time of noon, then... (3 Replies)
Discussion started by: prismtx
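A sketch of the reference-file idea described in that excerpt, assuming GNU date/touch and illustrative paths:

# Stamp a marker file with 12:00 noon two days ago.
touch -t "$(date -d '2 days ago' +%Y%m%d)1200" /tmp/noon_marker
# Archive everything at least that old, i.e. not newer than the marker.
find /var/log/app -type f ! -newer /tmp/noon_marker -exec mv {} /var/log/app/archive \;
rm -f /tmp/noon_marker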

3. UNIX for Advanced & Expert Users

File disk utilization for 10 days prior

Hi, I have a requirement to list the files, and the total disk utilization they take, that are 10 days prior to the current date. I tried a couple of options combining find -mtime and -ctime with du -m, but no luck. Could you please help me with this? (2 Replies)
Discussion started by: videsh77
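One way to approach that request (a sketch, assuming GNU find/du and an illustrative starting directory):

# Per-file size in MB for everything modified more than 10 days ago, plus a grand total.
find /data -type f -mtime +10 -exec du -m {} + | awk '{sum += $1; print} END {print "Total MB:", sum}'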

4. Shell Programming and Scripting

Shell Script for moving 3 days old file to Archive Folder

Hi Experts, I have a "Source" folder which may contain some files. I need a shell script that moves all files older than 3 days to an "Archive" folder. Thanks in advance... (4 Replies)
Discussion started by: phani333
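This is the same find pattern as the answer at the top of the page; a sketch with an explicit archive directory (both locations are placeholders, and GNU find's -maxdepth keeps it from descending into subfolders):

mkdir -p /path/to/Archive
find /path/to/Source -maxdepth 1 -type f -mtime +3 -exec mv {} /path/to/Archive/ \;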

5. UNIX for Dummies Questions & Answers

Moving Multiple files to destination files

I am running code like this: foreach list ($tmp) mv *_${list}.txt ${chart}_${list}.txt #mv: when moving multiple files, last argument must be a directory mv *_${list}.doc ${chart}_${list}.doc #mv: when moving multiple files, last argument must be a... (3 Replies)
Discussion started by: animesharma
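mv accepts several source files only when the last argument is a directory, so many-to-many renames have to be done one file per call. A Bourne-shell sketch of one way around the collision (the excerpt is csh; $tmp and $chart are the poster's variables and are assumed to be set, and the "renamed" directory and naming scheme are illustrative):

mkdir -p renamed
for list in $tmp; do
    for f in *_"${list}".txt *_"${list}".doc; do
        [ -e "$f" ] || continue              # the pattern matched nothing; skip it
        mv "$f" "renamed/${chart}_${f}"      # one source per call, original names kept unique
    done
done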

6. UNIX for Dummies Questions & Answers

Need Help in reading N days files from a Directory & combining the files

Hi all, I request your expertise in tackling one requirement in my project (I don't have much expertise in shell scripting). The requirement is as below: 1) We store the last run date of a process in a file. When the batch runs the next time, it should read this file, get the last run date from... (1 Reply)
Discussion started by: dsfreddie
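A sketch of the last-run-date pattern described there (file locations and the date format are assumptions; GNU date/touch syntax shown):

last_run=$(cat /var/batch/last_run_date)                # e.g. 2024-06-01
touch -d "$last_run" /tmp/last_run_marker               # marker stamped with the last run date
find /data/incoming -type f -newer /tmp/last_run_marker -exec cat {} + > combined.out
date +%F > /var/batch/last_run_date                     # record this run for next time
rm -f /tmp/last_run_marker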

7. Shell Programming and Scripting

Finding files with wc -l results = 1 then moving the files to another folder

Hi guys, can you please help me with a script to find files with one row/one line of content and then move those files to another directory? My script below runs, but nothing happens to the files.... Alternatively, can I get a script to find the *.csv files with "wc -l" results = 1 and then create a list of those... (5 Replies)
Discussion started by: Dj Moi
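A sketch of the one-line test (the destination directory is a placeholder; note that wc -l counts newlines, so a one-line file without a trailing newline reports 0):

for f in ./*.csv; do
    [ -e "$f" ] || continue                  # no .csv files matched; skip
    if [ "$(wc -l < "$f")" -eq 1 ]; then
        mv "$f" /path/to/singles/            # file has exactly one line
    fi
done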

8. AIX

Moving Hidden files to normal files

I have a bunch of hidden files in a directory in AIX. I would like to move these hidden files to another directory as regular files. Say I have the following files in directory /x: .test~1234~567 .report~5678~123 .find~9876~576. I would like to move them to directory /y as test~1234~567... (10 Replies)
Discussion started by: umesh.narain
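A sketch that strips the leading dot while moving (paths mirror the excerpt; adjust as needed):

for f in /x/.*; do
    name=$(basename "$f")
    case "$name" in
        .|..) continue ;;                    # skip the special directory entries
    esac
    [ -f "$f" ] || continue                  # regular files only
    mv "$f" "/y/${name#.}"                   # ${name#.} drops one leading dot
done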

9. Shell Programming and Scripting

How to create zip/gz/tar files if the files are older than a particular number of days in UNIX or Linux?

I need a script to back up (zip, tar, or gz) the old log files on our UNIX server, which are causing a space problem. Could you please help me create zip or gz files for each log file in the current directory and its sub-directories as well? I found one command which creates a gz file for the... (4 Replies)
Discussion started by: Mallikgm
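A sketch of the per-file compression step (the path, the name pattern, and the 30-day threshold are placeholders; gzip leaves one .gz where each log was):

# Compress .log files older than 30 days in this directory tree.
find /var/log/myapp -type f -name '*.log' -mtime +30 -exec gzip {} \;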
SUMMAIN(1)						      General Commands Manual							SUMMAIN(1)

NAME
summain - gather file checksums and metadata

SYNOPSIS
summain [-c=CHECKSUM] [--checksum=CHECKSUM] [--config=FILE] [--dump-config] [--dump-memory-profile=METHOD] [--dump-setting-names] [--exclude=FIELD] [--generate-manpage=TEMPLATE] [-h] [--help] [--list-config-files] [--log=FILE] [--log-keep=N] [--log-level=LEVEL] [--log-max=SIZE] [--log-mode=MODE] [-m] [--mangle-paths] [--no-default-configs] [--output=FILE] [-f=OUTPUT-FORMAT] [--output-format=OUTPUT-FORMAT] [-r] [--relative-paths] [--secret=SECRET] [--version] [FILE]...

DESCRIPTION
summain gathers metadata about files, and computes their checksums. It is intended to create a manifest of the files. The manifest can be used to see if something has changed: a new manifest can be created and compared with the old one with diff(1). The manifest looks like this:

    Name: foo/bar/foobar
    SHA1: 1234123413241324
    Mtime: 2010-01-01 02:08:00.127651 +0000
    Mode: 1755

The filename is URL-encoded to ensure it is purely ASCII. Mode is in octal. Only some inode fields are included. It does not make sense to compare, for example, the access time, so that is not included. Time stamps are given using microsecond precision, for the benefit of those filesystems that can support precise timestamps. (They should be nanosecond, but Python returns timestamps as floating point, and nanosecond precision is too much for the floating point type.)

The inode and device number fields will not be reported accurately. Instead, they are normalized so that manifests are useful after the files have been restored from backups. Accurate numbers would mean everything seems to have changed; normalized numbers mean that there will be no differences. The numbers are reported so that hard links can be checked.

Directories named on the command line will be recursed automatically.

OPTIONS
-c, --checksum=CHECKSUM
    which checksums to compute: MD5, SHA1, SHA224, SHA256, SHA384, SHA512; use once per checksum type (default is SHA1)

--config=FILE
    add FILE to config files

--dump-config
    write out the entire current configuration

--dump-memory-profile=METHOD
    make memory profiling dumps using METHOD, which is one of: none, simple, meliae, or heapy (default: simple)

--dump-setting-names
    write out all names of settings and quit

--exclude=FIELD
    do not output or compute FIELD

--generate-manpage=TEMPLATE
    fill in manual page TEMPLATE

-h, --help
    show this help message and exit

--list-config-files
    list all possible config files

--log=FILE
    write log entries to FILE (default is to not write log files at all); use "syslog" to log to the system log, or "none" to disable logging

--log-keep=N
    keep last N logs (10)

--log-level=LEVEL
    log at LEVEL, one of debug, info, warning, error, critical, fatal (default: debug)

--log-max=SIZE
    rotate logs larger than SIZE, zero for never (default: 0)

--log-mode=MODE
    set permissions of new log files to MODE (octal; default 0600)

-m, --mangle-paths
    mangle (obfuscate) paths

--no-default-configs
    clear list of configuration files to read

--output=FILE
    write output to FILE, instead of standard output

-f, --output-format=OUTPUT-FORMAT
    choose output format (rfc822, csv, json)

-r, --relative-paths
    print paths relative to arguments

--secret=SECRET
    use SECRET to make mangled paths unguessable

--version
    show program's version number and exit

SUMMAIN(1)
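A typical manifest workflow, sketched from the options above (the file and directory names are illustrative):

# Build a manifest of /etc with SHA256 checksums and relative paths, then compare
# it against a manifest taken earlier to see what changed.
summain --checksum=SHA256 -r /etc --output=manifest.new
diff manifest.old manifest.new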