UNIX for Dummies Questions & Answers: cant make a http get request using wget
Post 302289802 by Ikon, Friday 20th of February 2009, 01:49:59 PM
The wget request IS working.

The "500 Internal Server Error" is the response from the web server. Look in the logs on the web server to find out what is wrong with the request.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

wget using wildcards using http

Hi there, probably a really simple question, but I want to download all .rpm files from a web repository which happens to be HTTP and not FTP. I've tried using wget, but as far as I can see it doesn't allow for wildcards (i.e. wget http://address/*.rpm). Does anybody know how I can get all these files in... (2 Replies)
Discussion started by: hcclnoodles
2 Replies
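 
wget cannot expand wildcards over HTTP, but its recursive accept-list options get the same effect; a minimal sketch, with the repository URL as a placeholder:

    # Fetch every .rpm linked from the index page, one level deep, without
    # recreating the remote directory structure or climbing to the parent.
    wget -r -l1 -nd -np -A rpm http://address/repo/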

2. UNIX for Dummies Questions & Answers

unix script http request

Hi everybody, I have a *.vbs file which I want to run automatically. I want to know if there is any way to implement the given example, e.g. "http://255.255.255.55/script.vbs". What I mean is: does anyone know how to make an HTTP request from a Unix script? Thanks in advance! (1 Reply)
Discussion started by: arksal
1 Replies
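 
The simplest way to fire an HTTP request from a Unix script is to let wget or curl do it; a sketch, reusing the example address from the question:

    # Request the URL and discard the response body; either tool works.
    wget -q -O /dev/null "http://255.255.255.55/script.vbs"
    curl -s -o /dev/null "http://255.255.255.55/script.vbs"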

3. Shell Programming and Scripting

Http request in Linux

Hi, I need a guide on how to write a script with which I can make an HTTP request. Let's say the request looks like this: http://www.test.com?txid=1&type=service&server=linux. I have a list of "txid" values (in *.txt) and need to run all of them accordingly. So that means, for every transaction I have to refer to "txid"... (7 Replies)
Discussion started by: malaysoul
7 Replies
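 
One common pattern for that kind of job is a read loop over the id list, substituting each value into the URL; a sketch, assuming the ids sit one per line in a file called txid.txt:

    # For every transaction id, issue the request and save the response
    # to its own file.
    while read -r txid; do
        wget -q -O "response_${txid}.html" \
            "http://www.test.com?txid=${txid}&type=service&server=linux"
    done < txid.txt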

4. UNIX for Dummies Questions & Answers

http request forward

Hi, Maybe it's a stupid question, anyway here goes... I have an Apache web server on a Solaris box, let's say A, with a public IP, and a web application on a Linux box, B, on a private LAN with a private IP. I want people from outside to connect to the app, but it's inside the LAN,... (4 Replies)
Discussion started by: piltrafa
4 Replies
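 
The usual answer here is a reverse proxy on box A: Apache (with mod_proxy loaded) relays requests for the app to B's private address. A sketch of the relevant httpd.conf fragment; the path and the private IP/port are placeholders:

    # On box A (public): forward /app to the web application on box B (private).
    ProxyRequests Off
    ProxyPass        /app http://192.168.1.20:8080/app
    ProxyPassReverse /app http://192.168.1.20:8080/app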

5. UNIX for Dummies Questions & Answers

HTTP request

Can anybody explain how HTTP requests can be made and processed in a shell script? (3 Replies)
Discussion started by: noufal
3 Replies

6. Web Development

Copy and forward apache http request

Hello, I am using Apache 2.2 and I need certain HTTP requests (those including example.com, for instance) to be executed normally and also forwarded to another server. With mod_rewrite, I could easily forward, but then the incoming request would not be executed on my server. Right? Am I... (1 Reply)
Discussion started by: JCR
1 Replies

7. Shell Programming and Scripting

Formatting wget request within script

When using a browser and calling this URL, the data returned covers the proper range of information: ichart dot finance dot yahoo dot com/table.csv?s=YAHOO&a=3&b=14&c=2012&d=03&e=20&f=2012&g=d&ignore.csv (geez, the forum won't let me post URLs, sorry). However, in my script the formatting is messing up on... (4 Replies)
Discussion started by: harte
4 Replies
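 
The usual culprit when a URL like that breaks inside a script is an unquoted "&", which the shell treats as a background operator; quoting the whole URL keeps the query string intact. A sketch, reproducing the query string from the question:

    # Quote the URL so '&' and '?' reach wget instead of the shell.
    wget -O table.csv \
        "http://ichart.finance.yahoo.com/table.csv?s=YAHOO&a=3&b=14&c=2012&d=03&e=20&f=2012&g=d&ignore.csv"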

8. Programming

C++ http GET request using sockets

Hello, I am trying to communicate with a server that is ready to accept HTTP GET requests and send back data per the request. However, I have very little experience in socket programming and I don't really know how to debug this. Googling on the web hasn't yielded much, except people saying I... (2 Replies)
Discussion started by: flagman5
2 Replies
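 
Before debugging the C++ side, it can help to see what a bare-bones GET looks like on the wire; the exact bytes the socket code has to send can be tested from the shell (the host name is a placeholder):

    # Send a minimal HTTP/1.1 GET by hand and print whatever comes back.
    # Note the blank line (\r\n\r\n) that terminates the request headers.
    printf 'GET /index.html HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n' \
        | nc example.com 80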

9. Shell Programming and Scripting

http request

I am running a website, but I still have problems with the "service temporarily unavailable" error. I want to make a simple check whether the website is up and running. Does anybody have an idea how to do it? (The site is password protected, so you have to supply a user and password before logging in.) ... (2 Replies)
Discussion started by: jurgen
2 Replies
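 
A simple availability check is to request the page and test the HTTP status code; a sketch using curl with basic auth, where the user, password, and URL are placeholders:

    # Prints only the HTTP status code; 200 means the site answered
    # normally, 503 is the "service temporarily unavailable" case.
    status=$(curl -s -o /dev/null -w '%{http_code}' -u user:password "http://www.example.com/")
    if [ "$status" = "200" ]; then
        echo "site is up"
    else
        echo "site returned HTTP $status"
    fi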

10. Shell Programming and Scripting

Parsing the http post request

Hi, I am trying to write a shell script to parse the POST request data that it receives into an XML file. Below is the POST request data that the script is receiving: -----------------------------7dd2339190c8e Content-Disposition: form-data; name="param1" 1... (2 Replies)
Discussion started by: jdp
2 Replies
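 
For a quick-and-dirty extraction of a single field from a multipart body like that, awk can grab the lines between the field's Content-Disposition header and the next boundary; a sketch, assuming the raw body was saved to request.txt and the field of interest is param1:

    # Print the value of form field "param1": skip its header line and the
    # blank line after it, stop at the next boundary marker.
    awk '/name="param1"/ { found = 1; next }
         found && /^--/  { exit }
         found && NF     { print }' request.txt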
URIFIND(1p)                User Contributed Perl Documentation               URIFIND(1p)

NAME
       urifind - find URIs in a document and dump them to STDOUT.

SYNOPSIS
       $ urifind file

DESCRIPTION
       urifind is a simple script that finds URIs in one or more files (using
       "URI::Find"), and outputs them to STDOUT. That's it.

       To find all the URIs in file1, use:

           $ urifind file1

       To find the URIs in multiple files, simply list them as arguments:

           $ urifind file1 file2 file3

       urifind will read from "STDIN" if no files are given or if a filename
       of "-" is specified:

           $ wget http://www.boston.com/ -O - | urifind

       When multiple files are listed, urifind prefixes each found URI with
       the file from which it came:

           $ urifind file1 file2
           file1: http://www.boston.com/index.html
           file2: http://use.perl.org/

       This can be turned on for single files with the "-p" ("prefix") switch:

           $ urifind -p file3
           file3: http://fsck.com/rt/

       It can also be turned off for multiple files with the "-n" ("no
       prefix") switch:

           $ urifind -n file1 file2
           http://www.boston.com/index.html
           http://use.perl.org/

       By default, URIs will be displayed in the order found; to sort them
       ascii-betically, use the "-s" ("sort") option. To reverse sort them,
       use the "-r" ("reverse") flag ("-r" implies "-s").

           $ urifind -s file1 file2
           http://use.perl.org/
           http://www.boston.com/index.html
           mailto:webmaster@boston.com

           $ urifind -r file1 file2
           mailto:webmaster@boston.com
           http://www.boston.com/index.html
           http://use.perl.org/

       Finally, urifind supports limiting the returned URIs by scheme or by
       arbitrary pattern, using the "-S" option (for schemes) and the "-P"
       option. Both "-S" and "-P" can be specified multiple times:

           $ urifind -S mailto file1
           mailto:webmaster@boston.com

           $ urifind -S mailto -S http file1
           mailto:webmaster@boston.com
           http://www.boston.com/index.html

       "-P" takes an arbitrary Perl regex. It might need to be protected from
       the shell:

           $ urifind -P 's?html?' file1
           http://www.boston.com/index.html

           $ urifind -P '.org' -S http file4
           http://www.gnu.org/software/wget/wget.html

       Add a "-d" to have urifind dump the regexes generated from "-S" and
       "-P" to "STDERR". "-D" does the same but exits immediately:

           $ urifind -P '.org' -S http -D
           $scheme = '^(http):'
           @pats = ('^(http):', '.org')

       To remove duplicates from the results, use the "-u" ("unique") switch.

OPTION SUMMARY
       -s          Sort results.
       -r          Reverse sort results (implies -s).
       -u          Return unique results only.
       -n          Don't include filename in output.
       -p          Include filename in output (0 by default, but 1 if multiple
                   files are included on the command line).
       -P $re      Print only lines matching regex '$re' (may be specified
                   multiple times).
       -S $scheme  Only this scheme (may be specified multiple times).
       -h          Help summary.
       -v          Display version and exit.
       -d          Dump compiled regexes for "-S" and "-P" to "STDERR".
       -D          Same as "-d", but exit after dumping.

AUTHOR
       darren chamberlain <darren@cpan.org>

COPYRIGHT
       (C) 2003 darren chamberlain

       This library is free software; you may distribute it and/or modify it
       under the same terms as Perl itself.

SEE ALSO
       URI::Find

perl v5.14.2                         2012-04-08                              URIFIND(1p)