How to download file without curl and wget (Post 302627685 by haczyk, 04-21-2012 08:59 AM)
I guess you are not able to start a service like ssh (for scp) or ftp on the destination server?

If not, I think you have to use an external package like "axel", "aria2" or "aget".
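For example, with one of those installed the download could look like this (the server URL and output name below are only placeholders):

    # axel: 4 parallel connections, explicit output file
    axel -n 4 -o backup.tar.gz http://server.example.com/files/backup.tar.gz

    # aria2: same idea, -x sets the number of connections per server
    aria2c -x 4 -o backup.tar.gz http://server.example.com/files/backup.tar.gz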

Kind regards
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Script to download file using wget

Hi I need a Shell script that will download a text file every second from a http server using wget. Can anyone provide me any pointers or sample scripts that will help me go about this task ??? regards techie (1 Reply)
Discussion started by: techie82
1 Reply
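A minimal sketch of that polling loop (the URL and local filename are placeholders):

    #!/bin/sh
    # Fetch the same file once per second; -q keeps wget quiet,
    # -O overwrites the previous local copy each time
    while true
    do
        wget -q -O data.txt "http://server.example.com/data.txt"
        sleep 1
    done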

2. UNIX for Dummies Questions & Answers

Using wget to download a file

Hello Everyone, I'm trying to use wget recursively to download a file. Only html files are being downloaded, instead of the target file. I'm trying this for the first time, here's what I've tried: wget -r -O jdk.bin... (4 Replies)
Discussion started by: thoughts
4 Replies
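The usual cause there is combining -r with -O; a hedged sketch of two alternatives, using a hypothetical host and path:

    # Fetch the file directly instead of recursing
    wget -O jdk.bin "http://host.example.com/downloads/jdk.bin"

    # Or, if recursion is really needed, accept only the wanted names
    wget -r -np -nd -A 'jdk*.bin' "http://host.example.com/downloads/"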

3. Shell Programming and Scripting

download a particular file using wget

Hi All I want to download srs8.3.0.1.standard.linux24_EM64T.tar.gz file from the following website : http://downloads.biowisdomsrs.com/srs83_dist/ But this website contains lots of zipped files I want to download the above file only discarding other zipped files. When I am trying the... (1 Reply)
Discussion started by: alphasahoo
1 Reply
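One way to pick out just that archive is wget's accept filter; this sketch assumes the files sit directly under the listed directory:

    # Recurse one level into the listing but keep only the named archive
    wget -r -np -nd -l 1 -A 'srs8.3.0.1.standard.linux24_EM64T.tar.gz' \
         "http://downloads.biowisdomsrs.com/srs83_dist/"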

4. UNIX and Linux Applications

download file using wget

I need to download the following srs8.3.0.1.standard.linux26_32.tar.gz file from the following website: http://downloads.biowisdomsrs.com/srs83_dist There are many gzip files along with the above one in the above site but I want to download the srs8.3.0.1.standard.linux26_32.tar.gz only from... (1 Reply)
Discussion started by: alphasahoo
1 Reply

5. Shell Programming and Scripting

How to download to a file using wget in perl?

Hi, I want to download some online data using wget command and write the contents to a file. For example this is the URL i want to download and store it in a file called "results.txt". #This is the URL. $url="http://www.example.com"; #retrieve data and store in a file results.txt ... (3 Replies)
Discussion started by: vanitham
3 Replies
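The wget part of that is a one-liner which a Perl script could run via system(); the URL is the example one from the post:

    # Save the page body to results.txt; -q suppresses progress output
    wget -q -O results.txt "http://www.example.com"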

6. Ubuntu

wget don't download complete file

I am using ubuntu 10.04 LTS I tried to download the file using wget , the file size is 105.00 MB, but wget downloads only around 44K. may be I am using wget in wrong way, any suggestions please? Below is the command I used and the response from system. wget --tries=10 -nd -nH --use=user... (10 Replies)
Discussion started by: LinuxLearner
10 Replies
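A 44K result usually means the server sent back an error or login page rather than the file; a hedged retry that resumes partial data and passes credentials explicitly (host, user and filename are placeholders, and --user/--password stand in for the truncated --use=user above):

    # -c resumes a partial download, --tries retries on failure
    wget -c --tries=10 --user=myuser --password=mypass \
         -O big-file.iso "http://host.example.com/big-file.iso"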

7. UNIX for Dummies Questions & Answers

How to download files matching pattern from FTP using CURL or WGET?

Hi, For an order I requested, the provider has uploaded a tar file in public FTP site which internally has tons of files (compressed) and I need to download files that follows particular pattern which would be few hundreds. Note: The order can't be requested for files that follows the... (7 Replies)
Discussion started by: Amalan
7 Replies
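If the matching files sit individually in an FTP directory (rather than inside the single tar), wget's FTP globbing can fetch just those names; the server and pattern here are placeholders:

    # Quote the URL so the shell does not expand the wildcard locally
    wget 'ftp://ftp.example.com/pub/orders/sample_*.tar.gz'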

8. Shell Programming and Scripting

Wget download file ( do not overwrite )

Hello all, I want to write auto update script for my embedded device, which can check and download newer version of my program and extract the files on the device. The download center is hosted on remote web server . Script checks the hosted file on web site and if the new version is there... (8 Replies)
Discussion started by: stefki
8 Replies
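Two wget behaviours fit an update check like that; the download URL is a placeholder:

    # -N downloads only when the remote copy is newer than the local one
    wget -N "http://updates.example.com/firmware/latest.tar.gz"

    # -nc (no-clobber) never overwrites an existing local file
    wget -nc "http://updates.example.com/firmware/latest.tar.gz"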

9. Shell Programming and Scripting

Wget download file content in unicode

Hi All, I am trying to download a XML from a URL through wget and successful in that but the problem is that I have to check for some special characters inside that XML. But when I download through wget it transfers the content of the XML in plain text and I'm not able to search for those... (2 Replies)
Discussion started by: dips_ag
2 Replies

10. Shell Programming and Scripting

Curl parallel download file list

Hello guys, first post sorry if I did some mess here =) Using Ubuntu 14.04lts 64bits server version. I have a list (url.list) with only URLs to download, one per line, that looks like this: http://domain.com/teste.php?a=2&b=3&name=1 http://domain.com/teste.php?a=2&b=3&name=2 ...... (6 Replies)
Discussion started by: tonispa
6 Replies
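A common pattern for such a list is xargs driving several curl processes in parallel (GNU xargs assumed; the concurrency level of 4 is arbitrary):

    # Run up to 4 curls at once, one URL per invocation;
    # -O names each file after the last path component of its URL
    xargs -n 1 -P 4 curl -s -O < url.list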

AXEL(1)                          General Commands Manual                          AXEL(1)

NAME
       Axel - A light download accelerator for Linux.

SYNOPSIS
       axel [OPTIONS] url1 [url2] [url...]

DESCRIPTION
       Axel is a program that downloads a file from a FTP or HTTP server
       through multiple connections; each connection downloads its own part
       of the file.

       Unlike most other programs, Axel downloads all the data directly to
       the destination file, using one single thread. It just saves some time
       at the end because the program doesn't have to concatenate all the
       downloaded parts.

OPTIONS
       One argument is required, the URL to the file you want to download.
       When downloading from FTP, the filename may contain wildcards and the
       program will try to resolve the full filename. Multiple URLs can be
       specified as well and the program will use all those URLs for the
       download. Please note that the program does not check whether the
       files are equal.

       Other options:

       --max-speed=x, -s x
              You can specify a speed (bytes per second) here and Axel will
              try to keep the average speed around this speed. Useful if you
              don't want the program to suck up all of your bandwidth.

       --num-connections=x, -n x
              You can specify an alternative number of connections here.

       --output=x, -o x
              Downloaded data will be put in a local file with the same name,
              unless you specify a different name using this option. You can
              specify a directory as well; the program will append the
              filename.

       --search[=x], -S[x]
              Axel can do a search for mirrors using the filesearching.com
              search engine. This search will be done if you use this option.
              You can specify how many different mirrors should be used for
              the download as well. The search for mirrors can be
              time-consuming because the program tests every server's speed,
              and it checks whether the file's still available.

       --no-proxy, -N
              Don't use any proxy server to download the file. Not possible
              when a transparent proxy is active somewhere, of course.

       --verbose
              If you want to see more status messages, you can use this
              option. Use it more than once if you want to see more.

       --quiet, -q
              No output to stdout.

       --alternate, -a
              This will show an alternate progress indicator. A bar displays
              the progress and status of the different threads, along with
              current speed and an estimate for the remaining download time.

       --header=x, -H x
              Add an additional HTTP header. This option should be in the
              form "Header: Value". See RFC 2616 section 4.2 and 14 for
              details on the format and standardized headers.

       --user-agent=x, -U x
              Set the HTTP user agent to use. Some websites serve different
              content based upon this parameter. The default value will
              include "Axel", its version and the platform.

       --help, -h
              A brief summary of all the options.

       --version, -V
              Get version information.

NOTE
       Long (double dash) options are supported only if your platform knows
       about the getopt_long call. If it does not (like *BSD), only the short
       options can be used.

RETURN VALUE
       The program returns 0 when the download was successful, 1 if something
       really went wrong and 2 if the download was interrupted. If something
       else comes back, it must be a bug.

EXAMPLES
       axel ftp://ftp.{be,nl,uk,de}.kernel.org/pub/linux/kernel/v2.4/linux-2.4.17.tar.bz2

       This will use the Belgian, Dutch, English and German kernel.org
       mirrors to download a Linux 2.4.17 kernel image.

       axel -S4 ftp://ftp.kernel.org/pub/linux/kernel/v2.4/linux-2.4.17.tar.bz2

       This will do a search for the linux-2.4.17.tar.bz2 file on
       filesearching.com and it'll use the four (if possible) fastest mirrors
       for the download. (Possibly including ftp.kernel.org)

       (Of course, the commands are a single line, but they're too long to
       fit on one line in this page.)

FILES
       /etc/axelrc
              System-wide configuration file. Note that development versions
              place this file in /usr/local/etc.

       ~/.axelrc
              Personal configuration file.

       These files are not documented in a man-page, but the example file
       which comes with the program contains enough information, I hope. The
       position of the system-wide configuration file might be different.

COPYRIGHT
       Axel is Copyright 2001-2002 Wilmer van der Gaast.

BUGS
       Please report bugs at
       https://alioth.debian.org/tracker/?group_id=100070&atid=413085.

AUTHORS
       Wilmer van der Gaast. <wilmer@gaast.net>

                                                                                   AXEL(1)
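Tying the man page back to the question at the top, a small wrapper using the documented options and return codes might look like this (URL and output path are placeholders):

    #!/bin/sh
    # 4 connections (-n), alternate progress bar (-a), explicit output file (-o)
    url="http://server.example.com/files/backup.tar.gz"
    axel -n 4 -a -o /tmp/backup.tar.gz "$url"

    case $? in
        0) echo "download finished" ;;
        2) echo "download was interrupted, rerun to investigate" ;;  # see RETURN VALUE
        *) echo "download failed" ;;
    esac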