03-25-2009
Find new files and compress them
Hi!
First off, I'd like to stress that I'm a true dummy.
I have a website with SSH access, and since it has user-generated content I want to back up my website every day and send the backup through FTP to a different server. I got it working for my MySQL database, so the only thing remaining is the files. I was thinking of doing the following:
Since the total size of all the files is too much to back up every day, I want to search for all files added in the past 24 hours (so since the last backup; I know this doesn't account for files that have been removed). I believe the following line does just that:
find domain.tld -mtime -1
Then I want to compress all these files (keeping the directory structure, of course) into one file. I only know how to compress an entire folder:
tar -cvzf domain.tld.tar.gz domain.tld
So the question is, how do I combine these two commands?
Cheers & Thanks a bunch!
- Mark
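For reference, one common way to combine the two is to pipe the file list from find into tar's -T option. This is a sketch, assuming GNU find and GNU tar (the -print0/--null pairing is a GNU feature that keeps filenames with spaces or newlines intact):

```shell
# Find regular files modified in the last 24 hours and pack them into a
# single archive, preserving the directory structure.
# -type f   : match files only, so empty directories are not re-archived
# -print0   : separate results with NUL bytes (safe for odd filenames)
# --null -T -: tell tar to read a NUL-separated file list from stdin
find domain.tld -type f -mtime -1 -print0 \
  | tar -czvf "backup-$(date +%F).tar.gz" --null -T -
```

Restoring the archive with `tar -xzf` recreates the same `domain.tld/...` paths, so repeated daily archives can be unpacked over one another to rebuild the tree.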