Wget download file content in unicode
Post 302935039 by DGPickett on 02-12-2015 at 12:27 PM
If it is UTF-8, what is a "special" character? A single character may be many bytes, and anything past the ASCII range is not going to display correctly in any application that does not handle UTF-8.
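
A minimal sketch of one way to check what the thread is asking about: download the file with wget and then test whether the bytes are actually valid UTF-8. The URL, file names, and the ISO-8859-1 fallback are illustrative assumptions, not taken from the thread.

Code:
#!/bin/sh
# Sketch only: download a page and test whether it is valid UTF-8.
# The URL and output name below are placeholders.
url="http://www.example.com/data.txt"
out="data.txt"

wget -q -O "$out" "$url" || exit 1

# Ask file(1) what encoding it guesses (GNU file uses -i, BSD file uses -I).
file -i "$out"

# iconv doubles as a validator: it fails on byte sequences that are not UTF-8.
if iconv -f UTF-8 -t UTF-8 "$out" >/dev/null 2>&1; then
    echo "$out looks like valid UTF-8"
else
    # Assumption: treat anything else as ISO-8859-1 and convert it.
    iconv -f ISO-8859-1 -t UTF-8 "$out" > "$out.utf8"
    echo "converted to $out.utf8"
fi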
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Script to download file using wget

Hi, I need a shell script that will download a text file every second from an HTTP server using wget. Can anyone provide me with pointers or sample scripts that will help me go about this task? Regards, techie (1 Reply)
Discussion started by: techie82
1 Reply
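
For the thread above, a minimal sketch of the usual approach: loop forever, fetch, and sleep one second. The URL and output name are placeholders, and a production script would add error handling and probably a longer interval.

Code:
#!/bin/sh
# Sketch: fetch a text file once per second (placeholder URL).
url="http://server.example.com/status.txt"

while true
do
    # -N re-downloads only if the remote copy is newer; -q keeps the loop quiet.
    wget -q -N "$url"
    sleep 1
done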

2. UNIX for Dummies Questions & Answers

Using wget to download a file

Hello Everyone, I'm trying to use wget recursively to download a file. Only html files are being downloaded, instead of the target file. I'm trying this for the first time, here's what I've tried: wget -r -O jdk.bin... (4 Replies)
Discussion started by: thoughts
4 Replies
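
For the thread above: -r with -O usually isn't what is wanted, because recursive mode follows HTML pages and -O concatenates everything it fetches into one file. If the direct URL of the binary is known, a plain non-recursive fetch is simpler. The URL and file name below are placeholders, not from the thread.

Code:
#!/bin/sh
# Sketch: fetch one known file directly instead of crawling recursively.
wget -O jdk.bin "http://download.example.com/path/jdk-installer-linux-x64.bin"

# If only the containing directory is known, restrict a recursive crawl
# to .bin files and flatten the directory structure:
wget -r -np -nd -A '*.bin' "http://download.example.com/path/"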

3. Shell Programming and Scripting

download a particular file using wget

Hi all, I want to download the file srs8.3.0.1.standard.linux24_EM64T.tar.gz from the following website: http://downloads.biowisdomsrs.com/srs83_dist/ This website contains lots of zipped files, but I want to download only the above file, discarding the other zipped files. When I am trying the... (1 Reply)
Discussion started by: alphasahoo
1 Reply
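
For the thread above, a sketch assuming the listing is served over plain HTTP: either request the file by its full URL, or use wget's accept list so a recursive fetch keeps only that one archive.

Code:
#!/bin/sh
# Sketch: fetch exactly one archive from a directory full of tarballs.
base="http://downloads.biowisdomsrs.com/srs83_dist"
file="srs8.3.0.1.standard.linux24_EM64T.tar.gz"

# Simplest: name the file directly.
wget "$base/$file"

# Alternative: recursive fetch restricted to that single name.
# wget -r -np -nd -A "$file" "$base/"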

4. UNIX and Linux Applications

download file using wget

I need to download the file srs8.3.0.1.standard.linux26_32.tar.gz from the following website: http://downloads.biowisdomsrs.com/srs83_dist There are many gzip files alongside it on that site, but I want to download only srs8.3.0.1.standard.linux26_32.tar.gz from... (1 Reply)
Discussion started by: alphasahoo
1 Reply

5. Shell Programming and Scripting

How to download to a file using wget in perl?

Hi, I want to download some online data using the wget command and write the contents to a file. For example, this is the URL I want to download and store in a file called "results.txt". #This is the URL. $url="http://www.example.com"; #retrieve data and store in a file results.txt ... (3 Replies)
Discussion started by: vanitham
3 Replies
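
The thread above asks how to do this from Perl; the wget invocation itself is the same either way, so here is the shell form as a sketch (a Perl script would typically wrap it in system() or use LWP instead). The URL is the example.com placeholder from the post.

Code:
#!/bin/sh
# Sketch: fetch a URL and write the response body to results.txt.
url="http://www.example.com"
wget -q -O results.txt "$url"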

6. Shell Programming and Scripting

Wget, download file from site's folder.

Ok, this is quite weird. wget -r mysite.com/mylink/ should get all the files recursively from the 'mylink' folder. The problem is that wget saves an index.html file! When I open this index.html with my browser I realize that it shows all the files in the current folder (plus an option to move... (3 Replies)
Discussion started by: hakermania
3 Replies
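
For the thread above: with a bare directory URL the server returns its generated listing page, which wget saves as index.html. A sketch of the usual workaround is to crawl that listing but reject the index pages and drop the directory nesting.

Code:
#!/bin/sh
# Sketch: grab the files linked from a directory listing, not the listing itself.
# -r  recurse from the listing page
# -np do not ascend to the parent directory
# -nd do not recreate the remote directory tree locally
# -R  throw away the generated index.html* pages after they are parsed
wget -r -np -nd -R 'index.html*' "http://mysite.com/mylink/"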

7. Ubuntu

wget doesn't download the complete file

I am using Ubuntu 10.04 LTS. I tried to download a file using wget; the file size is 105.00 MB, but wget downloads only around 44K. Maybe I am using wget in the wrong way; any suggestions, please? Below is the command I used and the response from the system. wget --tries=10 -nd -nH --use=user... (10 Replies)
Discussion started by: LinuxLearner
10 Replies
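
For the thread above, a 44K result usually means wget saved an HTML login or error page rather than the 105 MB file. A rough sketch of the usual checks follows: pass credentials explicitly, resume with -c, and inspect the saved file to see whether it is HTML. The URL, file name, and account name are placeholders.

Code:
#!/bin/sh
# Sketch: retry a large authenticated download and resume partial transfers.
url="http://server.example.com/big/archive.tar.gz"

wget --tries=10 -c --user=user --ask-password "$url"

# If the result is tiny, check whether the server actually sent an error page.
file archive.tar.gz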

8. Shell Programming and Scripting

How to download file without curl and wget

Hi, I need a shell script that will download a zip file every second from an HTTP server, but I can't use curl or wget. Can anyone help me go about this task? Thanks!! (1 Reply)
Discussion started by: rubber08
1 Reply
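
For the thread above, when neither curl nor wget is available, bash itself can open a TCP connection through its /dev/tcp feature. A rough sketch of a plain HTTP/1.0 GET follows; it assumes bash (not a POSIX-only shell) and an unencrypted HTTP server, and it keeps the response headers in the output, which a real script would strip. Host, port, and path are placeholders.

Code:
#!/bin/bash
# Sketch: fetch a file over plain HTTP using bash's /dev/tcp, no curl or wget.
host="server.example.com"
port=80
path="/files/data.zip"

# Open a read/write TCP connection on file descriptor 3.
exec 3<>"/dev/tcp/$host/$port"

# Send a minimal HTTP/1.0 request (1.0 so the server closes the connection).
printf 'GET %s HTTP/1.0\r\nHost: %s\r\nConnection: close\r\n\r\n' "$path" "$host" >&3

# Save the raw response; note this still includes the HTTP headers.
cat <&3 > response.raw
exec 3>&-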

9. Shell Programming and Scripting

[Solved] Wget command to download file

Hi, I am trying to download a file using the wget command, but the password for the user xyz was created as pwd$$. When I give the command as below, it does not download the file. Could the $$ in the password be causing this issue? wget... (0 Replies)
Discussion started by: ksmbabu
0 Replies
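
For the thread above: yes, $$ is the likely problem when the password is unquoted or in double quotes, because the shell expands $$ to its own process ID before wget ever sees it. A sketch using single quotes (or --ask-password) keeps the characters literal; the host and URL are placeholders.

Code:
#!/bin/sh
# Sketch: a password containing $$ must be protected from shell expansion.

# Wrong: the shell replaces $$ with its process ID.
# wget --user=xyz --password="pwd$$" "http://server.example.com/file.zip"

# Right: single quotes pass the characters through literally.
wget --user=xyz --password='pwd$$' "http://server.example.com/file.zip"

# Alternative: let wget prompt, so the password never hits the command line.
# wget --user=xyz --ask-password "http://server.example.com/file.zip"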

10. Shell Programming and Scripting

Wget download file ( do not overwrite )

Hello all, I want to write an auto-update script for my embedded device, which can check for and download a newer version of my program and extract the files on the device. The download center is hosted on a remote web server. The script checks the hosted file on the web site and, if a new version is there... (8 Replies)
Discussion started by: stefki
8 Replies
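
For the thread above, a rough sketch of an update check: let wget's timestamping decide whether the hosted archive is newer than the local copy, and only unpack when a new file actually arrived. The URL, archive name, and install directory are assumptions, not from the thread.

Code:
#!/bin/sh
# Sketch: download a new release only if the server copy is newer, then unpack it.
url="http://updates.example.com/myprog/myprog-latest.tar.gz"
archive="myprog-latest.tar.gz"
dest="/opt/myprog"

# Remember the old listing (empty if the file does not exist yet).
before=$(ls -l "$archive" 2>/dev/null)

# -N compares timestamps and skips the download if the local file is current.
wget -N "$url" || exit 1

after=$(ls -l "$archive" 2>/dev/null)

# Unpack only when the archive actually changed.
if [ "$before" != "$after" ]; then
    mkdir -p "$dest"
    tar -xzf "$archive" -C "$dest"
    echo "updated $dest"
else
    echo "already up to date"
fi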
PARSER(1)                 User Contributed Perl Documentation                PARSER(1)

NAME
    XML::DOM::Parser - An XML::Parser that builds XML::DOM document structures

SYNOPSIS
    use XML::DOM;

    my $parser = new XML::DOM::Parser;
    my $doc = $parser->parsefile ("file.xml");

DESCRIPTION
    XML::DOM::Parser extends XML::Parser

    The XML::Parser module was written by Clark Cooper and is built on top of XML::Parser::Expat, which is a lower level interface to James Clark's expat library.

    XML::DOM::Parser parses XML strings or files and builds a data structure that conforms to the API of the Document Object Model as described at <http://www.w3.org/TR/REC-DOM-Level-1>. See the XML::Parser manpage for other additional properties of the XML::DOM::Parser class. Note that the 'Style' property should not be used (it is set internally.)

    The XML::Parser NoExpand option is more or less supported, in that it will generate EntityReference objects whenever an entity reference is encountered in character data. I'm not sure how useful this is. Any comments are welcome.

    As described in the synopsis, when you create an XML::DOM::Parser object, the parse and parsefile methods create an XML::DOM::Document object from the specified input. This Document object can then be examined, modified and written back out to a file or converted to a string.

    When using XML::DOM with XML::Parser version 2.19 and up, setting the XML::DOM::Parser option KeepCDATA to 1 will store CDATASections in CDATASection nodes, instead of converting them to Text nodes. Subsequent CDATASection nodes will be merged into one. Let me know if this is a problem.

  Using LWP to parse URLs
    The parsefile() method now also supports URLs, e.g. http://www.erols.com/enno/xsa.xml. It uses LWP to download the file and then calls parse() on the resulting string. By default it will use a LWP::UserAgent that is created as follows:

        use LWP::UserAgent;
        $LWP_USER_AGENT = LWP::UserAgent->new;
        $LWP_USER_AGENT->env_proxy;

    Note that env_proxy reads proxy settings from environment variables, which is what I need to do to get thru our firewall. If you want to use a different LWP::UserAgent, you can either set it globally with:

        XML::DOM::Parser::set_LWP_UserAgent ($my_agent);

    or, you can specify it for a specific XML::DOM::Parser by passing it to the constructor:

        my $parser = new XML::DOM::Parser (LWP_UserAgent => $my_agent);

    Currently, LWP is used when the filename (passed to parsefile) starts with one of the following URL schemes: http, https, ftp, wais, gopher, or file (followed by a colon.) If I missed one, please let me know.

perl v5.8.0                          2000-01-31                            PARSER(1)