04-21-2012
How to download a file without curl or wget
Hi,
I need a shell script that will download a zip file every second from an HTTP server, but I can't use curl or wget.
Can anyone help me go about this task?
Thanks!
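Without curl or wget, bash can still open a TCP connection through its built-in /dev/tcp pseudo-device. The sketch below is a minimal, hedged example: the hostname and path are placeholders, it speaks plain HTTP on port 80 only, and the saved response still begins with the HTTP headers (everything up to the first blank line), which must be stripped before the zip is usable. Polling a server once per second is also quite aggressive, so check with the server's owner first.

```shell
#!/bin/bash
# Download over plain HTTP using bash's built-in /dev/tcp -- no curl, no wget.
# server.example.com and /file.zip are placeholders, not real endpoints.

# Print a minimal HTTP/1.0 request for host $1 and path $2.
build_request() {
    printf 'GET %s HTTP/1.0\r\nHost: %s\r\nConnection: close\r\n\r\n' "$2" "$1"
}

# fetch HOST PATH OUTFILE: open a socket on fd 3, send the request,
# and save the raw response (headers + body) to OUTFILE.
fetch() {
    exec 3<>"/dev/tcp/$1/80"
    build_request "$1" "$2" >&3
    cat <&3 > "$3"
    exec 3>&-
}

# Poll once per second until interrupted:
# while :; do fetch server.example.com /file.zip response.raw; sleep 1; done
# Then strip the headers (everything through the first blank line), e.g.:
# sed '1,/^\r\{0,1\}$/d' response.raw > file.zip
```

Note that /dev/tcp is a bash feature, not POSIX sh, so the script must run under bash; the redirection simply fails on shells without it.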
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi
I need a shell script that will download a text file every second from an HTTP server using wget.
Can anyone provide me any pointers or sample scripts that will help me go about this task ???
regards
techie (1 Reply)
Discussion started by: techie82
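A hedged sketch of one way to do this: the URL and local filename below are placeholders, and the loop runs until interrupted.

```shell
#!/bin/sh
# Re-download a file once per second with wget; URL and filename are placeholders.
poll() {
    while :; do
        # -q: no progress output; -O: overwrite the same local copy each pass
        wget -q -O data.txt "$1" || echo "download failed" >&2
        sleep 1
    done
}

# Usage (runs until Ctrl-C):
# poll "http://server.example.com/data.txt"
```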
2. UNIX for Dummies Questions & Answers
Hello Everyone,
I'm trying to use wget recursively to download a file.
Only HTML files are being downloaded, instead of the target file.
I'm trying this for the first time, here's what I've tried:
wget -r -O jdk.bin... (4 Replies)
Discussion started by: thoughts
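One likely cause, offered as a guess from the truncated command above: combining -r with -O makes wget write everything it retrieves into a single file. Two hedged alternatives, with placeholder URLs (only the jdk.bin name is taken from the post):

```shell
#!/bin/sh
# Placeholder URLs; jdk.bin is the target name from the post above.
fetch_direct() {
    # If the exact file URL is known, skip recursion entirely.
    wget -O jdk.bin "$1"
}
fetch_recursive() {
    # -np: never ascend to the parent directory
    # -nd: don't recreate the remote directory tree locally
    # -A:  accept (keep) only files matching the pattern
    wget -r -np -nd -A 'jdk*.bin' "$1"
}

# fetch_direct "http://example.com/downloads/jdk.bin"
# fetch_recursive "http://example.com/downloads/"
```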
3. Shell Programming and Scripting
Hi All
I want to download srs8.3.0.1.standard.linux24_EM64T.tar.gz file from the following website :
http://downloads.biowisdomsrs.com/srs83_dist/
This website contains lots of zipped files, but I want to download only the file above, discarding the others.
When I am trying the... (1 Reply)
Discussion started by: alphasahoo
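A hedged sketch: when the exact file name is known, appending it to the directory URL from the post and fetching that single URL avoids pulling the other archives at all.

```shell
#!/bin/sh
# Base URL and filename copied from the post above.
base="http://downloads.biowisdomsrs.com/srs83_dist"
file="srs8.3.0.1.standard.linux24_EM64T.tar.gz"

# Fetch just the one archive:
# wget "$base/$file"
echo "$base/$file"   # the single URL to fetch
```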
4. UNIX and Linux Applications
I need to download the srs8.3.0.1.standard.linux26_32.tar.gz file from the following website:
http://downloads.biowisdomsrs.com/srs83_dist
There are many gzip files along with the above one in the above site but I want to download the srs8.3.0.1.standard.linux26_32.tar.gz only from... (1 Reply)
Discussion started by: alphasahoo
5. Shell Programming and Scripting
Hi,
I want to download some online data using wget command and write the contents to a file.
For example, this is the URL I want to download and store in a file called "results.txt".
#This is the URL.
$url="http://www.example.com";
#retrieve data and store in a file results.txt
... (3 Replies)
Discussion started by: vanitham
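The snippet above is Perl-flavoured; in a shell script the same idea is a single wget call (the example.com URL is kept from the post):

```shell
#!/bin/sh
# URL taken from the post above.
url="http://www.example.com"

# -q: suppress progress output; -O results.txt: write the page body there
# wget -q -O results.txt "$url"
```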
6. Ubuntu
I am using Ubuntu 10.04 LTS.
I tried to download a file using wget; the file size is 105.00 MB, but wget downloads only around 44K.
Maybe I am using wget the wrong way; any suggestions, please?
Below is the command I used and the response from the system.
wget --tries=10 -nd -nH --use=user... (10 Replies)
Discussion started by: LinuxLearner
7. UNIX for Dummies Questions & Answers
Hi,
For an order I requested, the provider has uploaded a tar file to a public FTP site which internally contains tons of (compressed) files, and I need to download the files that follow a particular pattern, which would be a few hundred.
Note: The order can't be requested for files that follow the... (7 Replies)
Discussion started by: Amalan
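Once the tar file itself is downloaded, GNU tar can extract only the members matching a pattern, so the few hundred wanted files never have to be picked out by hand. A self-contained demo with a throwaway archive (for the real order, point tar at the downloaded file; note that --wildcards is a GNU tar extension, not POSIX):

```shell
#!/bin/sh
# Build a throwaway archive so the pattern extraction can be demonstrated.
tmp=$(mktemp -d)
cd "$tmp" || exit 1
touch keep_a.txt keep_b.txt skip_c.txt
tar -czf order.tar.gz keep_a.txt keep_b.txt skip_c.txt

# Extract only members matching the pattern (GNU tar's --wildcards).
mkdir extract
cd extract || exit 1
tar -xzf ../order.tar.gz --wildcards 'keep_*'
ls    # keep_a.txt and keep_b.txt only; skip_c.txt stays in the archive
```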
8. Shell Programming and Scripting
Hello all,
I want to write an auto-update script for my embedded device that can check for and download a newer version of my program and extract the files on the device.
The download center is hosted on a remote web server.
The script checks the hosted file on the web site, and if the new version is there... (8 Replies)
Discussion started by: stefki
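A hedged outline of the check itself, with every name invented for illustration (the VERSION file, the package naming scheme, the install path): fetch a small version file, compare it with what is installed, and download the package only on a mismatch.

```shell
#!/bin/sh
# All names below (remote_version, VERSION, myprog) are illustrative only.
installed="1.0.0"

# Stand-in for: wget -q -O remote_version "http://updates.example.com/VERSION"
echo "1.0.1" > remote_version

remote=$(cat remote_version)
if [ "$remote" != "$installed" ]; then
    echo "update available: $remote"
    # wget -q -O update.tar.gz "http://updates.example.com/myprog-$remote.tar.gz"
    # tar -xzf update.tar.gz -C /opt/myprog
fi
```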
9. Shell Programming and Scripting
Hi All,
I am trying to download an XML file from a URL through wget, and I am successful in that, but the problem is that I have to check for some special characters inside that XML. When I download it through wget, it transfers the content of the XML as plain text, and I'm not able to search for those... (2 Replies)
Discussion started by: dips_ag
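wget saves the response bytes verbatim, so the characters are in the file; the search just needs a byte-level pattern. A sketch with a demo file (sample.xml is invented here; with the real download, point grep at that file instead):

```shell
#!/bin/sh
# Demo file containing one non-ASCII character (a UTF-8 e-acute).
printf '<tag>caf\303\251</tag>\n' > sample.xml

# LC_ALL=C makes grep match raw bytes; [^ -~] is anything outside
# printable ASCII (space through tilde).
if LC_ALL=C grep -q '[^ -~]' sample.xml; then
    echo "special characters found"
fi
```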
10. Shell Programming and Scripting
Hello guys, first post, sorry if I make a mess here =)
I am using the Ubuntu 14.04 LTS 64-bit server version.
I have a list (url.list) with only URLs to download, one per line, that looks like this:
http://domain.com/teste.php?a=2&b=3&name=1
http://domain.com/teste.php?a=2&b=3&name=2
...... (6 Replies)
Discussion started by: tonispa
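A hedged sketch: wget's -i flag reads URLs from a file, where the '&' characters are harmless (on a command line they would need quoting). The two demo URLs are copied from the post; note that without -O or server-supplied names the local files would be called teste.php?a=2&b=3&name=1 and so on, so renaming per URL is worth considering.

```shell
#!/bin/sh
# Recreate a two-line url.list like the one described above.
printf '%s\n' \
    'http://domain.com/teste.php?a=2&b=3&name=1' \
    'http://domain.com/teste.php?a=2&b=3&name=2' > url.list

# -i: read one URL per line from the file
# wget -i url.list
wc -l < url.list   # count of URLs queued
```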