I am trying to download an XML file from a URL through wget, and the download itself succeeds, but I have to check for some special characters inside that XML. When I download it through wget, the content of the XML is transferred as plain text and I'm not able to search for those characters, although if I open the original file in any plain text editor I can see them.
See the attachment: the character shown there becomes something different in the downloaded file.
Can you please tell me how to download the file through wget while preserving the character set? It's UTF-8, by the way.
On one website I saw the option -r --remote-encoding=UTF-8 suggested, but it's not working.
Error:
wget version -
-dips
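For what it's worth, wget saves the HTTP response byte-for-byte, so the UTF-8 characters normally survive the download unchanged; what usually hides them is the locale of the terminal or the search tool. A minimal sketch of downloading and then searching under a UTF-8 locale (the URL, output name, and pattern are all placeholders):

```shell
#!/bin/sh
# Hedged sketch: wget copies the body verbatim, so the multibyte
# characters are already in the file; search with a UTF-8 locale so
# they match. URL, pattern, and file name below are placeholders.

fetch_and_grep() {
    url=$1 pattern=$2 out=$3
    wget -q -O "$out" "$url" || return 1
    # count matches under a UTF-8 locale so multibyte characters match
    LC_ALL=C.UTF-8 grep -c "$pattern" "$out"
}

# usage: fetch_and_grep http://example.com/feed.xml 'é' feed.xml
```

If grep finds nothing, inspecting the raw bytes (e.g. with `od -c`) will show whether the characters really changed or are simply being rendered differently.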
Hi
I need a shell script that will download a text file every second from an HTTP server using wget.
Can anyone provide pointers or a sample script that will help me go about this task?
regards
techie (1 Reply)
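A poll loop like the one asked for might look like the sketch below; the URL and snapshot naming are assumptions, and a real script would probably want to cap the number of iterations or run under a supervisor:

```shell
#!/bin/sh
# Hedged sketch: fetch the file once per second for a fixed number of
# iterations; each snapshot is kept under its own name. The URL and
# file names are placeholders.

poll() {
    url=$1 count=$2
    i=0
    while [ "$i" -lt "$count" ]; do
        # -q: quiet; -O: write to a numbered snapshot file
        wget -q -O "snapshot_$i.txt" "$url" || echo "fetch $i failed" >&2
        i=$((i + 1))
        if [ "$i" -lt "$count" ]; then
            sleep 1
        fi
    done
}

# usage: poll http://server/file.txt 5   # five snapshots, one per second
```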
Hello Everyone,
I'm trying to use wget recursively to download a file.
Only html files are being downloaded, instead of the target file.
I'm trying this for the first time, here's what I've tried:
wget -r -O jdk.bin... (4 Replies)
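The likely culprit here is combining -r with -O: with recursion, -O funnels every fetched page into that one file, so jdk.bin ends up holding HTML. If only the directory is known, a recursive fetch with an accept filter keeps just the target; a sketch (URL and pattern are placeholders):

```shell
#!/bin/sh
# Hedged sketch: recurse but keep only the file we want.
# -np: never ascend to the parent directory
# -nd: don't recreate the remote directory tree locally
# -A:  accept only names matching the pattern; other files are deleted

fetch_one() {
    url=$1 pattern=$2
    wget -r -np -nd -A "$pattern" "$url"
}

# usage: fetch_one http://example.com/java/ 'jdk*.bin'
```

When the file's exact URL is known, a plain `wget <url>` with no -r at all is simpler still.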
Hi All
I want to download the srs8.3.0.1.standard.linux24_EM64T.tar.gz file from the following website:
http://downloads.biowisdomsrs.com/srs83_dist/
But this website contains lots of zipped files
I want to download only the above file, discarding the other zipped files.
When I am trying the... (1 Reply)
I need to download the following srs8.3.0.1.standard.linux26_32.tar.gz file from the following website:
http://downloads.biowisdomsrs.com/srs83_dist
There are many gzip files along with the above one on that site, but I want to download only srs8.3.0.1.standard.linux26_32.tar.gz from... (1 Reply)
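For both of these requests, the simplest approach is to skip the listing page entirely and request the one archive by its full URL; none of the other files on the page are ever touched. A sketch, using the directory URL from the post (the helper name is an assumption):

```shell
#!/bin/sh
# Hedged sketch: join the directory URL and the file name into one
# direct request, so the rest of the listing is never downloaded.

fetch_exact() {
    dir=$1 name=$2
    # ${dir%/} strips a trailing slash so we don't emit "//" in the URL
    wget "${dir%/}/$name"
}

# usage:
# fetch_exact http://downloads.biowisdomsrs.com/srs83_dist srs8.3.0.1.standard.linux26_32.tar.gz
```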
Hi,
I want to download some online data using wget command and write the contents to a file.
For example, this is the URL I want to download and store in a file called "results.txt".
#This is the URL.
$url="http://www.example.com";
#retrieve data and store in a file results.txt
... (3 Replies)
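Saving a page to a named file is exactly what wget's -O option does; a minimal sketch, reusing the example.com URL from the post:

```shell
#!/bin/sh
# Hedged sketch: -q keeps wget quiet, -O writes the response body to
# results.txt instead of a server-derived file name.

save_page() {
    wget -q -O results.txt "$1"
}

# usage: save_page "http://www.example.com"
```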
Ok, this is quite weird.
wget -r mysite.com/mylink/
should get all the files recursively from the 'mylink' folder.
The problem is that wget saves an index.html file!
When I open this index.html with my browser I realize that it shows all the files in the current folder (plus an option to move... (3 Replies)
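That index.html is expected: a URL ending in / returns the server's auto-generated listing page, which wget saves and then crawls. Rejecting the listing pages keeps only the real files; a sketch (the URL is a placeholder):

```shell
#!/bin/sh
# Hedged sketch: mirror a directory while discarding the generated
# listing pages.
# -R 'index.html*': reject the auto-generated index pages
# -np: stay inside the given folder, -nH: drop the hostname directory

mirror_dir() {
    wget -r -np -nH -R 'index.html*' "$1"
}

# usage: mirror_dir http://mysite.com/mylink/
```

Note that wget still has to download each index page briefly to discover the links; -R only stops it from keeping them.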
I am using ubuntu 10.04 LTS
I tried to download a file using wget; the file size is 105.00 MB, but wget downloads only around 44K.
Maybe I am using wget the wrong way; any suggestions, please?
Below is the command I used and the response from system.
wget --tries=10 -nd -nH --use=user... (10 Replies)
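A 44K result for a 105 MB file usually means either the transfer was cut short or the server answered with an error page instead of the file (worth checking with `head` on the partial download). wget's -c flag resumes a partial transfer; a sketch with a placeholder URL:

```shell
#!/bin/sh
# Hedged sketch: -c continues from where a partial copy stopped, and
# --tries retries on transient failures. The URL is a placeholder.

resume_fetch() {
    wget -c --tries=10 "$1"
}

# usage: resume_fetch http://example.com/big-file.tar.gz
```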
Hi
I need a shell script that will download a zip file every second from an HTTP server, but I can't use either curl or wget.
Can anyone help me go about this task?
Thanks!! (1 Reply)
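With curl and wget ruled out, one possibility is bash's built-in /dev/tcp redirection. The sketch below speaks minimal HTTP/1.0 and strips the headers with sed; the host and path are placeholders, and anything needing HTTPS, redirects, or chunked encoding would require a real client:

```shell
#!/bin/bash
# Hedged sketch using bash's /dev/tcp pseudo-device (bash-only, plain
# HTTP on port 80). Host, path, and output names are placeholders.

http_get() {
    host=$1 path=$2 out=$3
    exec 3<>"/dev/tcp/$host/80" || return 1
    printf 'GET %s HTTP/1.0\r\nHost: %s\r\n\r\n' "$path" "$host" >&3
    # delete everything up to and including the first blank (or CR-only)
    # line, i.e. the header block, keeping only the body
    sed '1,/^\r\{0,1\}$/d' <&3 > "$out"
    exec 3<&- 3>&-
}

# poll once per second:
# while :; do http_get server.example /file.zip file.zip; sleep 1; done
```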
Hi
I am trying to download a file using the wget command, but the password for user xyz was created as pwd$$. When I give the command as below, it does not download the file. Is the $$ in the password causing this issue?
wget... (0 Replies)
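Almost certainly yes: unquoted (or double-quoted), $$ is expanded by the shell to its own process ID before wget ever sees the password. Single quotes pass the characters through literally. A sketch (host and user are placeholders):

```shell
#!/bin/sh
# Hedged sketch of the quoting difference; the wget lines are shown as
# comments since the host is a placeholder.

# wrong: the shell turns pwd$$ into e.g. pwd13842 (its PID)
# wget --user=xyz --password=pwd$$ http://example.com/file

# right: single quotes keep $$ literal
# wget --user=xyz --password='pwd$$' http://example.com/file

literal='pwd$$'     # single quotes: no expansion, stays pwd$$
expanded="pwd$$"    # double quotes: $$ replaced by the shell's PID
```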
Hello all,
I want to write an auto-update script for my embedded device, which can check for and download a newer version of my program and extract the files on the device.
The download center is hosted on a remote web server.
The script checks the hosted file on the web site, and if the new version is there... (8 Replies)
Discussion started by: stefki
8 Replies
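The check-then-download flow described above can be sketched as below; the server URL, version-file name, and archive naming scheme are all assumptions for illustration:

```shell
#!/bin/sh
# Hedged sketch of an update check: compare a small version file on the
# server with the local copy, and fetch + extract the archive only when
# they differ. All URLs and file names are placeholders.

BASE=http://updates.example.com/myprog

check_and_update() {
    wget -q -O remote.ver "$BASE/latest.ver" || return 1
    if [ ! -f local.ver ] || ! cmp -s local.ver remote.ver; then
        # new version: download the matching archive and unpack it
        wget -q -O myprog.tar.gz "$BASE/myprog-$(cat remote.ver).tar.gz" &&
        tar -xzf myprog.tar.gz &&
        mv remote.ver local.ver   # record the installed version
    fi
}
```

A production version would also want to verify a checksum or signature on the archive before extracting it on the device.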
LEARN ABOUT CENTOS
spectool
SPECTOOL(1) User Commands SPECTOOL(1)
NAME
spectool - manual page for spectool v1.0.10rpmdev1
SYNOPSIS
spectool [<options>] <specfile>
DESCRIPTION
Spectool is a tool to expand and download sources and patches from specfiles.
If you experience problems with specific specfiles, try to run
rpmbuild --nobuild --nodeps <specfile>
on the file, which might give a clue as to why spectool fails on it (ignore anything about missing sources or patches). The plan is to catch
errors like this in spectool itself and warn the user about them in the future.
OPTIONS
Operating mode:
-l, --lf, --list-files
lists the expanded sources/patches (default)
-g, --gf, --get-files
gets the sources/patches that are listed with a URL
-h, --help
display this help screen
Files on which to operate:
-A, --all
all files, sources and patches (default)
-S, --sources
all sources
-P, --patches
all patches
-s, --source x[,y[,...]]
specified sources
-p, --patch a[,b[,...]]
specified patches
Miscellaneous:
-d, --define 'macro value'
defines RPM macro 'macro' to be 'value'
-C, --directory dir
download into specified directory (default '.')
-R, --sourcedir
download into rpm's %{_sourcedir}
-n, --dryrun, --dry-run
don't download anything, just show what would be done
-f, --force
try to unlink and download if target files exist
-D, --debug
output debug info, don't clean up when done
FILES
/etc/rpmdevtools/curlrc
optional curl(1) configuration
spectool v1.0.10rpmdev1 June 2014 SPECTOOL(1)