Full Discussion: Wget
Posted by popeye in Operating Systems > Linux > Red Hat on Wednesday, 19 June 2013 at 11:46 PM
Wget

If I run the following command:


wget -r --no-parent --reject "index.html*" 10.11.12.13/backups/

wget creates a local directory named 10.11.12.13/backups containing the downloaded site content.

What I want instead is for the data to be placed in a local directory called $HOME/backups.


Thanks for the help!
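One way to do this (a sketch, assuming GNU Wget and that the server is reachable over plain HTTP): -P / --directory-prefix sets the directory the download tree is created under, and -nH / --no-host-directories stops wget from creating the 10.11.12.13 host directory, so the remote /backups/ path is recreated directly under $HOME:

wget -r --no-parent -nH -P "$HOME" --reject "index.html*" http://10.11.12.13/backups/

That should leave the files in $HOME/backups/. If you also wanted to drop the backups component from the path, adding --cut-dirs=1 would strip one leading directory from the remote path.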
 

EXPIRE_BACKUPS(1)                          S3QL                          EXPIRE_BACKUPS(1)

NAME
       expire_backups - Intelligently expire old backups

SYNOPSIS
       expire_backups [options] <age> [<age> ...]

DESCRIPTION
       The expire_backups command intelligently removes old backups that are no longer
       needed. To define which backups you want to keep for how long, you define a number
       of age ranges. expire_backups ensures that you will have at least one backup in
       each age range at all times. It will keep exactly as many backups as are required
       for that and delete any backups that have become redundant.

       Age ranges are specified by giving a list of range boundaries in terms of backup
       cycles. Every time you create a new backup, the existing backups age by one cycle.

       Example: when expire_backups is called with the age range definition 1 3 7 14 31,
       it will guarantee that you always have the following backups available:

       1. A backup that is 0 to 1 cycles old (i.e., the most recent backup)
       2. A backup that is 1 to 3 cycles old
       3. A backup that is 3 to 7 cycles old
       4. A backup that is 7 to 14 cycles old
       5. A backup that is 14 to 31 cycles old

       Note: If you do backups at fixed intervals, then one cycle is equivalent to the
       backup interval.

       The advantage of specifying the age ranges in terms of backup cycles rather than
       days or weeks is that it lets you handle irregular backup intervals gracefully.
       Imagine that for some reason you do not turn on your computer for one month. Now
       all your backups are at least a month old, and if you had specified the above
       backup strategy in terms of absolute ages, they would all be deleted! Specifying
       age ranges in terms of backup cycles avoids this sort of problem.

       expire_backups usage is simple. It requires backups to have names of the form
       year-month-day_hour:minute:seconds (YYYY-MM-DD_HH:mm:ss) and works on all backups
       in the current directory. So for the above backup strategy, the correct invocation
       would be:

              expire_backups.py 1 3 7 14 31

       When storing your backups on an S3QL file system, you probably want to specify the
       --use-s3qlrm option as well. This tells expire_backups to use the s3qlrm command
       to delete directories.

       expire_backups uses a "state file" to keep track of which backups are how many
       cycles old (since this cannot be inferred from the dates contained in the
       directory names). The standard name for this state file is .expire_backups.dat. If
       this file gets damaged or deleted, expire_backups no longer knows the ages of the
       backups and refuses to work. In this case you can use the --reconstruct-state
       option to try to reconstruct the state from the backup dates. However, the
       accuracy of this reconstruction depends strongly on how rigorous you have been
       with making backups (it is only completely correct if the time between subsequent
       backups has always been exactly the same), so it is generally a good idea not to
       tamper with the state file.

OPTIONS
       The expire_backups command accepts the following options:

       --quiet
              be really quiet

       --debug
              activate debugging output

       --version
              just print program version and exit

       --state <file>
              File to save state information in (default: ".expire_backups.dat")

       -n     Dry run. Just show which backups would be deleted.

       --reconstruct-state
              Try to reconstruct a missing state file from backup dates.

       --use-s3qlrm
              Use s3qlrm command to delete backups.

EXIT STATUS
       expire_backups returns exit code 0 if the operation succeeded and 1 if some error
       occurred.

SEE ALSO
       expire_backups is shipped as part of S3QL, http://code.google.com/p/s3ql/.

COPYRIGHT
       2008-2011, Nikolaus Rath

1.11.1                              August 27, 2014                       EXPIRE_BACKUPS(1)
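Putting the options above together, a typical session might look like the sketch below (it assumes the backup directories are named YYYY-MM-DD_HH:mm:ss, sit in the current directory, and live on an S3QL file system):

# Dry run: show which backups would be deleted, changing nothing
expire_backups.py -n 1 3 7 14 31

# Real run: expire the redundant backups, deleting them with s3qlrm
expire_backups.py --use-s3qlrm 1 3 7 14 31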