Full Discussion: wget & rapidshare
Shell Programming and Scripting | Post 302165507 by trolis | 02-08-2008, 03:46 AM
wget & rapidshare

Hello people,

I have a question concerning wget and RapidShare. How can I download from RapidShare (I have a premium account) using the command-line tool wget? It seems pretty easy, but I always get only a very small file (about 5 KB), something like an HTML page. Please correct me if I am doing something wrong:
1. I save a cookie by typing the following in Terminal:
wget \
--save-cookies ~/.cookies/rapidshare \
--post-data "login=USERNAME&password=PASSWORD" \
-O - \
https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi \
> /dev/null
2. I try to download by typing this:
wget -c --load-cookies ~/.cookies/rapidshare http://www.rapidshare.com/files/blablabla.xyz
The result is:
11:45:21 (273.06 KB/s) - `blablabla.xyz' saved [5224/5224]
while the original size of the file is, let's say, 95 MB.
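
Two quick sanity checks at this point (sketches, reusing the cookie path and the example filename from the steps above):

# 1) Did the login actually store a RapidShare cookie? The Netscape-format
#    cookies file wget writes should contain a rapidshare.com entry.
grep -i rapidshare ~/.cookies/rapidshare || echo "no cookie saved - login likely failed"

# 2) Is the 5 KB download the real file or an HTML page? If the login or
#    redirect failed, file(1) will typically report an HTML document.
file blablabla.xyz
head -c 300 blablabla.xyz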

It seems the main link is redirected further and I get just a shortcut (a small HTML page) instead of the file. So the question is: how do I obtain the original file? Any ideas are highly appreciated.
Thanks for your support!
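
A commonly cited cause, for what it's worth: RapidShare reportedly served files directly only to premium accounts with the "Direct downloads" option enabled in the account settings; with it disabled, even an authenticated request got back the HTML download page, which would explain the 5 KB result. Assuming that option is enabled, the two steps above combine into this sketch:

# Sketch: log in once to save the session cookie, then fetch with it.
# Assumes the premium account's "Direct downloads" setting is enabled;
# otherwise the server answers with an HTML page instead of the file.
wget --save-cookies ~/.cookies/rapidshare \
     --post-data "login=USERNAME&password=PASSWORD" \
     -O /dev/null \
     https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi
wget -c --load-cookies ~/.cookies/rapidshare \
     http://www.rapidshare.com/files/blablabla.xyz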
httpindex(1)                General Commands Manual                httpindex(1)

NAME
       httpindex - HTTP front-end for SWISH++ indexer

SYNOPSIS
       wget [ options ] URL... 2>&1 | httpindex [ options ]

DESCRIPTION
       httpindex is a front-end for index++(1) to index files copied from
       remote servers using wget(1). The files (in a copy of the remote
       directory structure) can be kept, deleted, or replaced with their
       descriptions after indexing.

OPTIONS
   wget Options
       The wget(1) options that are required are: -A, -nv, -r, and -x; the
       ones that are highly recommended are: -l, -nh, -t, and -w. (See the
       EXAMPLE.)

   httpindex Options
       httpindex accepts the same short options as index++(1) except for -H,
       -I, -l, -r, -S, and -V. The following options are unique to httpindex:

       -d     Replace the text of local copies of retrieved files with their
              descriptions after they have been indexed. This is useful to
              display file descriptions in search results without having to
              keep complete copies of the remote files, thus saving
              filesystem space. (See the extract_description() function in
              WWW(3) for details about how descriptions are extracted.)

       -D     Delete the local copies of retrieved files after they have been
              indexed. This prevents your local filesystem from filling up
              with copies of remote files.
EXAMPLE
       To index all HTML and text files on a remote web server, keeping
       descriptions locally:

              wget -A html,txt -linf -t2 -rxnv -nh -w2 http://www.foo.com 2>&1 | httpindex -d -e'html:*.html,text:*.txt'

       Note that you need to redirect wget(1)'s output from standard error
       to standard output in order to pipe it to httpindex.
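       For illustration, the same pipeline with -D instead of -d should
       delete each local copy once it has been indexed, rather than keeping
       a description in its place:

              wget -A html,txt -linf -t2 -rxnv -nh -w2 http://www.foo.com 2>&1 | httpindex -D -e'html:*.html,text:*.txt'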
EXIT STATUS
       Exits with a value of zero only if indexing completed successfully;
       non-zero otherwise.

CAVEATS
       In addition to those for index++(1), httpindex does not correctly
       handle the use of multiple -e, -E, -m, or -M options (because the
       Perl script uses the standard GetOpt::Std package for processing
       command-line options, which doesn't). The last of any of those
       options "wins." The work-around is to pass multiple values, separated
       by commas, to a single one of those options. For example, instead of:

              httpindex -e'html:*.html' -e'text:*.txt'

       do this:

              httpindex -e'html:*.html,text:*.txt'
SEE ALSO
       index++(1), wget(1), WWW(3)

AUTHOR
       Paul J. Lucas <pauljlucas@mac.com>

SWISH++                        August 2, 2005                      httpindex(1)