Full Discussion: Wget -i URLs.txt problem
Top Forums > UNIX for Dummies Questions & Answers > Wget -i URLs.txt problem
Post 302732633 by Corona688 on Sunday 18th of November 2012, 09:28:19 PM
You can't just dump raw POST data into a page that's not expecting it; it won't work. You need to look at the page and see what it's actually doing.

In this case, the login page is (I think) http://resell-rights-weekly.com/members/members.php, which expects POST data like what you've been given.

To save the cookies, you need to tell wget to save them, including session cookies (see --save-cookies and --keep-session-cookies in man wget). So you can call wget once to log in and save the cookies, then call it again with --load-cookies to download the rest.
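A minimal sketch of that two-step approach. The "username" and "password" field names and the members.php URL are assumptions here; check the actual <form> markup in the page source for the real field names and action URL. The commands are printed rather than executed so the sketch is safe to run as-is; remove the printf wrapper to run them for real.

```shell
#!/bin/sh
# Sketch: log in once, saving all cookies, then reuse them for downloads.
LOGIN_URL="http://resell-rights-weekly.com/members/members.php"
JAR="cookies.txt"

# Step 1: POST the credentials. --keep-session-cookies matters because
# wget normally discards cookies that have no expiry time.
LOGIN_CMD="wget --save-cookies $JAR --keep-session-cookies --post-data 'username=USER&password=PASS' -O /dev/null $LOGIN_URL"

# Step 2: load the saved cookies for the protected downloads.
FETCH_CMD="wget --load-cookies $JAR -i URLs.txt"

# Printed, not executed, so this sketch runs without hitting the site.
printf '%s\n%s\n' "$LOGIN_CMD" "$FETCH_CMD"
```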
9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Sorting problem "sort -k 16,29 sample.txt > output.txt"

Hi all, I am trying to sort the contents of the file based on character positions. Example: $cat sample.txt 0101020060731 ## Header record 1c1 Berger Awc ANP20070201301 4000.50 1c2 Bose W G ANP20070201609 6000.70 1c2 Andy CK ANP20070201230 28000.00... (3 Replies)
Discussion started by: ganapati
3 Replies

2. UNIX for Advanced & Expert Users

Wget FTP problem!

Hi, I've tried to download from FTP sites with wget but it fails with "Service unavailable", yet when I use sftp in binary mode and the "get" command it works perfectly. What's the problem? BTW: I tried both passive and active mode in wget. Thanks for your help. (9 Replies)
Discussion started by: mjdousti
9 Replies

3. Shell Programming and Scripting

Problem with wget

Hi, I want to download some patches from SUN using a script, and I am using "wget" as the utility for this. The download site has "https:" in its URL, as below https://sunsolve.sun.com/private-cgi/pdownload.pl?target=${line}&method=h and on running wget as below wget... (1 Reply)
Discussion started by: max29583
1 Replies

4. Shell Programming and Scripting

Extract urls from index.html downloaded using wget

Hi, I basically need to get a list of all the tarballs located at a URI. I am currently doing a wget on the URI to get the index.html page. Now this index page contains the list of URIs that I want to use in my bash script. Can someone please guide me? I am new to Linux and shell scripting. ... (5 Replies)
Discussion started by: mnanavati
5 Replies
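One way to pull tarball links out of a fetched index.html is a small grep/sed pipeline. This is a sketch; the sample page below stands in for the real index.html, and the .tar.gz pattern is an assumption about how the links look.

```shell
#!/bin/sh
# Sketch: extract tarball URLs from a wget-fetched index.html.
# A tiny sample page stands in for the real file.
cat > index.html <<'EOF'
<a href="foo-1.0.tar.gz">foo</a>
<a href="notes.html">notes</a>
<a href="bar-2.1.tar.gz">bar</a>
EOF

# grep -o keeps only the matching attribute; sed strips the href="..." wrapper.
urls=$(grep -o 'href="[^"]*\.tar\.gz"' index.html | sed 's/^href="//;s/"$//')
printf '%s\n' "$urls"
```

The resulting list can be fed back to wget with -i, or looped over in the script.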

5. UNIX for Dummies Questions & Answers

Problem with wget no check certificate.

Hi, I'm trying to install some libraries. When running the makefile I get an error from the wget --no-check-certificate option. I had a look at the help and the option wasn't listed. Anyone know what I'm missing? (0 Replies)
Discussion started by: davcra
0 Replies

6. UNIX for Dummies Questions & Answers

find lines in file1.txt not found in file2.txt memory problem

I have a diff command that does what I want, but when comparing large text/log files it uses up all the memory I have (sometimes over 8 GiB): diff file1.txt file2.txt | grep '^<'| awk '{$1="";print $0}' | sed 's/^ *//' Is there a better, more efficient way to find the lines in one file... (5 Replies)
Discussion started by: raptor25
5 Replies
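For large files, sorting first and using comm keeps memory use small, since comm streams through both inputs line by line instead of holding them in memory the way diff does. A sketch with inline sample data:

```shell
#!/bin/sh
# Sketch: lines in file1.txt that are not in file2.txt, via sorted
# input and comm, which streams with a small constant memory footprint.
printf 'apple\nbanana\ncherry\n' > file1.txt
printf 'banana\ndate\n'          > file2.txt

sort file1.txt > f1.sorted
sort file2.txt > f2.sorted

# comm -23 suppresses column 2 (lines only in file2) and column 3
# (lines in both), leaving only the lines unique to file1.
only_in_1=$(comm -23 f1.sorted f2.sorted)
printf '%s\n' "$only_in_1"
```

Note this reports set membership, not positional differences like diff does, which is usually what's wanted for log comparison.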

7. Shell Programming and Scripting

Problem with wget and cookie

Dear people, I have a problem with a script using wget to download PDF files from a website which uses session cookies. Background: for university it's quite tedious to check weekly which new homework, papers, etc. are available on the various sites of the university's chairs. So I wanted a... (1 Reply)
Discussion started by: jackomo
1 Replies

8. Shell Programming and Scripting

Download pdf's using wget convert to txt

wget -i genedx.txt The code above will download multiple PDF files from a site, but how can I download them and convert them to .txt? I have attached the master list (genedx.txt, which contains the URLs and file names) as well as the two PDFs that are downloaded. I am trying to have those... (7 Replies)
Discussion started by: cmccabe
7 Replies
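One common approach is to convert each downloaded PDF with pdftotext (from the poppler-utils package; that it is installed is an assumption). The sketch below only prints the commands it would run, so it works without sample PDFs; drop the printf wrapper to convert for real.

```shell
#!/bin/sh
# Sketch: after "wget -i genedx.txt" fetches the PDFs, convert each one.
# ${f%.pdf} strips the .pdf suffix so the output name becomes name.txt.
cmds=$(for f in file1.pdf file2.pdf; do
    printf 'pdftotext "%s" "%s"\n' "$f" "${f%.pdf}.txt"
done)
printf '%s\n' "$cmds"
```

In a real run the loop would be `for f in *.pdf`, globbing whatever wget downloaded.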

9. Proxy Server

Problem with wget

I cannot download anything using wget in CentOS 6.5 and 7, but I can update with yum etc. # wget https://wordpress.org/latest.tar.gz --2014-10-23 13:50:23-- https://wordpress.org/latest.tar.gz Resolving wordpress.org... 66.155.40.249, 66.155.40.250 Connecting to wordpress.org|66.155.40.249|:443...... (3 Replies)
Discussion started by: nirosha
3 Replies
Gedcom::WebServices(3pm)				User Contributed Perl Documentation				  Gedcom::WebServices(3pm)

NAME
    Gedcom::WebServices - Basic web service routines for Gedcom.pm

    Version 1.16 - 24th April 2009

SYNOPSIS
    wget -qO - http://www.example.com/ws/plain/my_family/i9/name

DESCRIPTION
    This module provides web service access to a GEDCOM file in conjunction
    with mod_perl. Using it, a request for information can be made in the
    form of a URL specifying the GEDCOM file to be used, which information
    is required and the format in which the information is to be delivered.
    The information is then returned in the specified format.

    There are currently three supported formats:

    o plain - no markup
    o XML
    o JSON

  URLs
    The format of the URLs used to access the web services is:

        $BASEURL/$FORMAT/$GEDCOM/$XREF/requested/information
        $BASEURL/$FORMAT/$GEDCOM?search=search_criteria

    BASEURL
        The base URL to access the web services.

    FORMAT
        The format in which to return the results.

    GEDCOM
        The name of the GEDCOM file to use (the extension .ged is assumed).

    XREF
        The xref of the record about which information is required. XREFs
        can be obtained initially from a search, and subsequently from
        certain queries.

    requested/information
        The information requested. This is in the same format as that taken
        by the get_value method.

    search_criteria
        An individual to search for. This is in the same format as that
        taken by the get_individual method.

EXAMPLES
    $ wget -qO - 'http://pjcj.sytes.net:8585/ws/plain/royal92?search=elizabeth_ii'
    /ws/plain/royal92/I52

    $ wget -qO - http://pjcj.sytes.net:8585/ws/plain/royal92/I52
    0 @I52@ INDI
    1 NAME Elizabeth_II Alexandra Mary/Windsor/
    1 TITL Queen of England
    1 SEX F
    1 BIRT
    2 DATE 21 APR 1926
    2 PLAC 17 Bruton St.,London,W1,England
    1 FAMS @F14@
    1 FAMC @F12@

    $ wget -qO - http://pjcj.sytes.net:8585/ws/plain/royal92/I52/name
    Elizabeth_II Alexandra Mary /Windsor/

    $ wget -qO - http://pjcj.sytes.net:8585/ws/plain/royal92/I52/birth/date
    21 APR 1926

    $ wget -qO - http://pjcj.sytes.net:8585/ws/plain/royal92/I52/children
    /ws/plain/royal92/I58
    /ws/plain/royal92/I59
    /ws/plain/royal92/I60
    /ws/plain/royal92/I61

    $ wget -qO - http://pjcj.sytes.net:8585/ws/json/royal92/I52/name
    {"name":"Elizabeth_II Alexandra Mary /Windsor/"}

    $ wget -qO - http://pjcj.sytes.net:8585/ws/xml/royal92/I52/name
    <NAME>Elizabeth_II Alexandra Mary /Windsor/</NAME>

    $ wget -qO - http://pjcj.sytes.net:8585/ws/xml/royal92/I52
    <INDI ID="I52">
      <NAME>Elizabeth_II Alexandra Mary/Windsor/</NAME>
      <TITL>Queen of England</TITL>
      <SEX>F</SEX>
      <BIRT>
        <DATE>21 APR 1926</DATE>
        <PLAC>17 Bruton St.,London,W1,England</PLAC>
      </BIRT>
      <FAMS REF="F14"/>
      <FAMC REF="F12"/>
    </INDI>

CONFIGURATION
    Add a section similar to the following to your mod_perl config:

        PerlWarn On
        PerlTaintCheck On
        PerlPassEnv GEDCOM_TEST
        <IfDefine GEDCOM_TEST>
            <Perl>
                $Gedcom::TEST = 1;
            </Perl>
        </IfDefine>
        <Perl>
            use Apache::Status;
            $ENV{PATH} = "/bin:/usr/bin";
            delete @ENV{"IFS", "CDPATH", "ENV", "BASH_ENV"};
            $Gedcom::DATA = $Gedcom::ROOT;  # location of data stored on server
            use lib "$Gedcom::ROOT/blib/lib";
            use Gedcom::WebServices;
            my $handlers = [ qw( plain xml json ) ];
            eval Gedcom::WebServices::_set_handlers($handlers);
            # use Apache::PerlSections; print STDERR Apache::PerlSections->dump;
        </Perl>
        PerlTransHandler Gedcom::WebServices::_parse_uri

BUGS
    Very probably. See the BUGS file. And the TODO file.

VERSION
    Version 1.16 - 24th April 2009

LICENCE
    Copyright 2005-2009, Paul Johnson (paul@pjcj.net)

    This software is free. It is licensed under the same terms as Perl
    itself. The latest version of this software should be available from
    my homepage: http://www.pjcj.net
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.