Crontab Wget, downloading a file to a FTP - Post 302468783 by zYx, Wednesday 3rd of November 2010, 07:49:36 PM (UNIX for Dummies Questions & Answers)
Quote:
Originally Posted by DGPickett
Maybe you have to wget it to a temp dir and then wput it, or wget|wput with stdout/stdin options. (It might be faster with the pipe, too!) What is your O/S?
My OS is Windows 7, but cPanel is installed on my hosting account, so I take it the server is Linux?
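A minimal sketch of DGPickett's "wget to a temp dir, then wput it" suggestion, suitable for running from cron on the hosting server. This assumes GNU wget and wput are both installed there; every host, credential, and path below is a placeholder, not a real endpoint.

```shell
#!/bin/sh
# fetch-and-upload.sh: sketch of the "wget to a temp dir, then wput" approach.
# All URLs, credentials, and paths are placeholders.

SRC_URL="http://example.com/data/report.csv"          # file to fetch (placeholder)
FTP_URL="ftp://user:password@ftp.example.com/inbox/"  # FTP destination (placeholder)

TMP_DIR=$(mktemp -d) || exit 1

# Download into the temp dir (-t 1: one try, -T 10: 10-second timeout),
# then upload the result with wput only if the download succeeded.
wget -q -t 1 -T 10 -P "$TMP_DIR" "$SRC_URL" &&
    wput -q "$TMP_DIR/report.csv" "$FTP_URL"

rm -rf "$TMP_DIR"
```

A crontab entry such as `0 3 * * * /path/to/fetch-and-upload.sh` would run this nightly at 03:00. For the piped variant DGPickett mentions, curl can upload from stdin (`wget -q -O - "$SRC_URL" | curl -sT - "$FTP_URL"report.csv`), which avoids the temp file, if curl is available on the host.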
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

downloading folders in ftp

Can you download folders when in ftp, or is the only way to download more than one file by mget? (6 Replies)

Discussion started by: eloquent99

2. UNIX for Dummies Questions & Answers

Downloading whole directory with FTP

Is it possible using FTP to download a whole directory and all subdirectories at once without having to go through every single file? Thanks, Tom (4 Replies)
Discussion started by: Pokeyzx

3. UNIX for Advanced & Expert Users

downloading through ftp

I have been busy getting accustomed to ssh and ftp and have a remote account that I am trying to comprehend. My question is: when I use ftp, ssh and remote accounts, where do I download packages to, i.e. which directory? I have a cooledit package that is tarred and gzipped which I ncftpd from... (6 Replies)

Discussion started by: moxxx68

4. Shell Programming and Scripting

Problem in Downloading one day old files from FTP site

Hi, I'm downloading one-day-old files from an ftp site. Below is my script ---------------------------- printf "open $HOST \n" > ftp.cmd printf "user $USER $PASSWD\n" >> ftp.cmd printf "bin\n" >> ftp.cmd #printf "cd /Models/\n" >> ftp.cmd printf "prompt\n" >> ftp.cmd printf "for oldfile... (4 Replies)

Discussion started by: shekhar_v4

5. Solaris

HTTP error while downloading solaris patches using wget

Hello, I am getting an HTTP error while downloading Solaris patches using wget. 'Downloading unsigned patch 113096-03. --2010-06-18 03:51:15-- http://sunsolve.sun.com/pdownload.pl?target=113096-03&method=h Resolving sunsolve.sun.com (sunsolve.sun.com)... 192.18.108.40 Connecting to... (5 Replies)

Discussion started by: sunny_a_j

6. UNIX for Dummies Questions & Answers

wget pdf downloading problem

Hi. I am trying to make a mirror of this free online journal: http://www.informaworld.com/smpp/title~content=t716100758~db=all Under the individual issues, the link location for the "Full Text PDF" does not have ".pdf" as an extension -- so when I use wget it misses the file. However clicking... (5 Replies)
Discussion started by: obo1234

7. Shell Programming and Scripting

Downloading with Wget

Hello everyone. I'm new both to the forum and to unix scripting, and this website has been very useful in putting together a script I am working on. However, I have run into a bit of a snag, which is why I have come here seeking help. First I will say what I am trying to do, and then what I have... (2 Replies)
Discussion started by: keltonhalbert

8. Shell Programming and Scripting

Downloading FTP Files

Hi everyone, I have a requirement to download files from FTP and move those files to a Unix box. After copying the files, I need to remove them from the FTP server. I'm a newbie in Unix scripting. Can you please suggest a script for this? Thanks in advance. (2 Replies)

Discussion started by: Murali4u

9. Shell Programming and Scripting

Wget error while downloading from https website

Hi, I would like to download a file from a https website. I don't have the file name, as it changes every day. I am using the following command: wget --no-check-certificate -r -np --user=ABC --password=DEF -O temp.txt https://<website/directory> I am getting the following error in my... (9 Replies)

Discussion started by: pinnacle

10. Shell Programming and Scripting

Wget for downloading a public file (stream) as mp4

I need a hint for using wget to get free content from a TV station that streams its material for a while before it appears on any video platform. That means no illegal methods, because it is on air, recently published and available. But reading the manual for wget I tried the... (5 Replies)

Discussion started by: 1in10
JIGDO-LITE(1)

NAME
       jigdo-lite - Download jigdo files using wget

SYNOPSIS
       jigdo-lite [ URL ]

DESCRIPTION
       See jigdo-file(1) for an introduction to Jigsaw Download. Given the URL of a `.jigdo' file, jigdo-lite downloads the large file (e.g. a CD image) that has been made available through that URL. wget(1) is used to download the necessary pieces of administrative data (contained in the `.jigdo' file and a corresponding `.template' file) as well as the many pieces that the large file is made from. The jigdo-file(1) utility is used to reconstruct the large file from the pieces.

       `.jigdo' files that contain references to Debian mirrors are treated specially: when such a file is recognized, you are asked to select one mirror out of a list of all Debian mirrors.

       If URL is not given on the command line, the script prompts for a location to download the `.jigdo' file from.

       The following command line options are recognized:

       -h --help
              Output short summary of command syntax.

       -v --version
              Output version number.

       --scan FILES
              Do not ask for "Files to scan"; use this path.

       --noask
              Do not ask any questions; instead behave as if the user had pressed Return at all prompts. This can be useful when running jigdo-lite from cron jobs or in other non-interactive environments.

SEE ALSO
       jigdo-file(1), jigdo-mirror(1), wget(1) (or `info wget')

       CD images for Debian Linux can be downloaded with jigdo <URL:http://www.debian.org/CD/jigdo-cd/>.

AUTHOR
       Jigsaw Download <URL:http://atterer.net/jigdo/> was written by Richard Atterer <jigdo atterer.net>, to make downloading of CD-ROM images for the Debian Linux distribution more convenient.

19 May 2006
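As a concrete illustration of the cron use the man page mentions, a crontab fragment invoking jigdo-lite non-interactively might look like this. The schedule, working directory, and the `.jigdo' URL are all placeholders.

```shell
# Crontab sketch (placeholder paths and URL): fetch a CD image weekly at 04:00
# on Sunday. --noask behaves as if Return were pressed at every prompt, which
# the man page recommends for cron and other non-interactive use.
0 4 * * 0  cd /var/tmp/images && jigdo-lite --noask http://cdimage.debian.org/path/to/image.jigdo
```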
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.