02-14-2014
Check the wget utility; I guess it does what you need, without having to reinvent the wheel.
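For reference, a couple of common wget invocations (the URL below is a placeholder, not from the thread):

```shell
# Download a file into the current directory (placeholder URL):
url="http://example.com/file.txt"
wget "$url"
# Save under the URL's own file name, quietly, with retries;
# ${url##*/} strips everything up to the last slash:
wget -q --tries=3 -O "${url##*/}" "$url"
```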
10 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
Hi,
I have a file with a URL text written in it within double quotes e.g.
"http://abcd.xyz.com/mno/somefile.dtd"
I want the above text to get replaced by a single space character.
I tried
cat File1.txt | sed -e 's/("http)*(dtd")/ /g' > File2.txt
But it didn't work out. Can someone... (5 Replies)
Discussion started by: dsrookie
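For what it's worth, sed's default (basic) regular expressions don't use bare `(`/`)` for grouping, and the whole quoted URL can be matched with one pattern; a sketch, assuming every such URL ends in `.dtd`:

```shell
# Sample input line containing a quoted URL (for illustration):
printf '%s\n' 'before "http://abcd.xyz.com/mno/somefile.dtd" after' > File1.txt
# Replace any double-quoted URL ending in .dtd with a single space;
# [^"]* matches everything between the quotes, and sed reads the
# file directly, so no cat is needed:
sed 's/"http[^"]*\.dtd"/ /g' File1.txt > File2.txt
```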
2. UNIX for Advanced & Expert Users
Hi,
Can you please help me with how to post an XML file from Unix to a URL?
Basically, I want to put the contents of my file at a URL.
Regards
Pooja (1 Reply)
Discussion started by: PoojaM
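Assuming the endpoint accepts HTTP POST, curl can send the file body directly (the URL and file name below are placeholders):

```shell
# A minimal XML payload, just for illustration:
printf '<?xml version="1.0"?><doc>hello</doc>\n' > payload.xml
# POST it; --data-binary sends the file unmodified, and the header
# tells the server the body is XML:
curl -s -X POST -H "Content-Type: text/xml" \
     --data-binary @payload.xml "http://example.com/endpoint"
```

wget can do the same with `--post-file=payload.xml` if curl is not available.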
3. UNIX for Advanced & Expert Users
Hi all,
I need to write a Unix script that calls a URL.
Then I need to pass parameters to that URL.
Please help.
Regards,
gander_ss (1 Reply)
Discussion started by: gander_ss
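Assuming the parameters go in the query string, a sketch (the host and parameter names here are made up):

```shell
user="jsmith"
id="42"
# Quote the URL so the shell does not treat '&' as a background operator:
curl -s "http://example.com/report?user=${user}&id=${id}"
# For values containing spaces or special characters, let curl do
# the URL-encoding:
curl -s -G --data-urlencode "name=John Smith" "http://example.com/report"
```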
4. Shell Programming and Scripting
Hi all,
I need to write a Unix script that calls a URL.
Then I need to pass parameters to that URL.
Please help.
Regards,
gander_ss (1 Reply)
Discussion started by: gander_ss
5. UNIX for Dummies Questions & Answers
Hello,
I need to redirect an existing URL; how can I do that?
There's a current web address to a GUI that I have to redirect to another web address. Does anyone know how to do this?
This is on Unix/Linux boxes.
Example:
https://m45.testing.address.net/host.php
make it so the... (3 Replies)
Discussion started by: SkySmart
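A redirect like this is usually configured in the web server that serves the old address rather than in Unix itself. Assuming Apache, a sketch (the target URL is a placeholder):

```
# In the virtual host or .htaccess for m45.testing.address.net:
Redirect permanent /host.php https://new.address.example.net/host.php
```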
6. Shell Programming and Scripting
Hi :)
How do I use dump in lynx?
$ lynx -dump http://www.google.com
So, this is an example of a lynx dump:
txt1 blabla Other txt
some text
1. http://url_of_txt1
2. http://url_of_blabla
3. http://url_of_Other_txt
4. http://url_of_some_text
...
How can I obtain this output?
... (12 Replies)
Discussion started by: aspire
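The numbered URL list is what lynx prints in the References section at the end of a dump. Two ways to get just the links (the second strips the leading numbers with sed):

```shell
# References list only, no page text:
lynx -dump -listonly "http://www.google.com"
# Full dump, then keep only lines of the form '  1. http://...':
lynx -dump "http://www.google.com" | sed -n 's/^ *[0-9][0-9]*\. //p'
```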
7. Web Development
I am trying to find a way to test some code, but I need to rewrite a specific URL only from a specific HTTP_HOST.
The call goes out to
http://SUB.DOMAIN.COM/showAssignment/7bde10b45efdd7a97629ef2fe01f7303/jsmodule/Nevow.Athena
The ID in the middle is always random due to the cookie.
I... (5 Replies)
Discussion started by: EXT3FSCK
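Assuming Apache with mod_rewrite, matching on the host and on the 32-character hex ID might look like this (the target path is made up):

```
RewriteEngine On
# Only rewrite requests arriving for this host:
RewriteCond %{HTTP_HOST} ^sub\.domain\.com$ [NC]
# [0-9a-f]{32} matches the random ID in the middle of the path:
RewriteRule ^showAssignment/[0-9a-f]{32}/jsmodule/Nevow\.Athena$ /test/handler [L]
```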
8. Shell Programming and Scripting
Hi,
I have a URL that points to a file:
LINK= "http://www.webpage.org/project/team2/file.tar"
However when I try to use wget on this variable I receive the following error.
wget $LINK
line 4: http://www.webpage.org/project/team2/file.tar: No such file or directory
wget:... (1 Reply)
Discussion started by: bashnewbee
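The error comes from the space after `=`: in the shell, `LINK= "http://…"` sets LINK to the empty string for that one command and then tries to execute the URL itself as a program. Removing the space (and quoting the expansion) fixes it:

```shell
# No space around '='; quote the variable when using it:
LINK="http://www.webpage.org/project/team2/file.tar"
wget "$LINK"
```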
9. UNIX for Dummies Questions & Answers
Here is what I have so far:
find . -name "*php*" -or -name "*htm*" | xargs grep -i iframe | awk -F'"' '/<iframe*/{gsub(/.\*iframe>/,"\"");print $2}'
Here is an example content of a PHP or HTM(HTML) file:
<iframe src="http://ADDRESS_1/?click=5BBB08\" width=1 height=1... (18 Replies)
Discussion started by: striker4o
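One way to simplify this is grep -o, which prints only the part of each line that matches, so the awk/gsub step is not needed; a sketch:

```shell
# List the src of every <iframe> in PHP/HTML files below the current
# directory; sed then strips everything except the URL itself:
find . -name "*php*" -o -name "*htm*" \
  | xargs grep -hio '<iframe[^>]*src="[^"]*"' \
  | sed 's/.*src="//; s/"$//'
```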
10. Shell Programming and Scripting
Hello Everyone,
I am trying to write a shell script (or Perl script) that would do the following:
I have a file that contains the following lines:
File:
https://ims-svnus.com/dev/DB/trunk/feeds/templates/shell_script.txt -r860... (5 Replies)
Discussion started by: filter
WWW::Mechanize::GZip(3pm) User Contributed Perl Documentation WWW::Mechanize::GZip(3pm)
NAME
WWW::Mechanize::GZip - tries to fetch webpages with gzip-compression
VERSION
Version 0.10
SYNOPSIS
use WWW::Mechanize::GZip;
my $mech = WWW::Mechanize::GZip->new();
my $response = $mech->get( $url );
print "x-content-length (before unzip) = ", $response->header('x-content-length');
print "content-length (after unzip) = ", $response->header('content-length');
DESCRIPTION
The WWW::Mechanize::GZip module tries to fetch a URL by requesting gzip-compression from the webserver.
If the response contains a header with 'Content-Encoding: gzip', it decompresses the response in order to get the original (uncompressed)
content.
This module helps reduce the bandwidth used to fetch webpages, if supported by the webserver. If the webserver does not support
gzip-compression, no decompression will be made.
This module is a direct subclass of WWW::Mechanize and will therefore support any methods provided by WWW::Mechanize.
The decompression is handled by Compress::Zlib::memGunzip.
There is a small web form where you can instantly test whether a webserver supports gzip-compression on a particular URL:
<http://www.computerhandlung.de/www-mechanize-gzip.htm>
METHODS
prepare_request
Adds 'Accept-Encoding' => 'gzip' to outgoing HTTP-headers before sending.
send_request
Unzips response-body if 'content-encoding' is 'gzip' and corrects 'content-length' to unzipped content-length.
SEE ALSO
WWW::Mechanize
Compress::Zlib
AUTHOR
Peter Giessner "cardb@planet-elektronik.de"
LICENCE AND COPYRIGHT
Copyright (c) 2007, Peter Giessner "cardb@planet-elektronik.de". All rights reserved.
This module is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
perl v5.10.0 2009-06-24 WWW::Mechanize::GZip(3pm)