It looks like it's because they're embedded in javascript/flash. I hate when sites do that. Try these three commands:
Explanation:
After wget gets the page, the links in the raw HTML look like this:
I got the awk line from someone here and it's very useful. When it sees "theFile=", it extracts everything until it runs into "&". There is some extra gibberish for some reason, but piping it to "grep mp3" gets rid of it, and every mp3 link is duplicated, so piping to "uniq" takes care of that. The output goes to list, and "wget -i list" fetches the links in the list file.
That gets 59 links, I hope that's all of them. I didn't see any wav files so I just did it for mp3s.
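The three commands themselves didn't survive the paste, but a sketch of the pipeline described above would look like this. The page URL and the page.html filename are assumptions; substitute the real ones.

```shell
# Fetch the page that embeds the players (URL is a placeholder).
wget -O page.html 'http://example.com/sounds.html'

# Pull out everything between "theFile=" and the next "&", keep only
# the mp3 hits, and collapse the duplicated adjacent links with uniq.
awk -F 'theFile=' '{for (i = 2; i <= NF; i++) {split($i, a, "&"); print a[1]}}' \
    page.html | grep mp3 | uniq > list

# Fetch every link in the list file.
wget -i list
```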
Thanks. I figured it was due to the embedded nature. Thanks for the awk line. That is very helpful.
Hello,
I have a server that I have to FTP files off, and they all start with SGRD followed by 6 numbers.
SGRD000001
SGRD000002
SGRD000003
The script I have will run every 10 minutes to pick up files, as new ones will be coming in all the time, and what I want to do is delete the files I have... (7 Replies)
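The post's script isn't shown, but a minimal sketch of one approach with plain ftp: fetch the SGRD files, then delete on the server only the ones that actually landed locally, so the next 10-minute run sees just new arrivals. The host and credentials are placeholders.

```shell
#!/bin/sh
# Placeholder connection details -- substitute the real ones.
HOST=ftp.example.com
USER=myuser
PASS=mypass

# Session 1: fetch everything matching SGRD plus six digits.
{
    echo "user $USER $PASS"
    echo "prompt off"
    echo "mget SGRD[0-9][0-9][0-9][0-9][0-9][0-9]"
} | ftp -inv "$HOST"

# Session 2: delete on the server only the files we received locally.
{
    echo "user $USER $PASS"
    for f in SGRD[0-9][0-9][0-9][0-9][0-9][0-9]; do
        [ -f "$f" ] && echo "delete $f"
    done
} | ftp -inv "$HOST"
```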
Need assistance in writing a for loop script or any looping method. Below is the code where I can get all the files from the URL. There are about 80 files at the URL. Every day the files get updated. The script I want is a loop that must keep running until it gets 80 files. It matches the count... (5 Replies)
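A sketch of the retry loop described above; the URL, destination directory, and the wget mirroring flags are assumptions, and the expected count of 80 comes from the post.

```shell
#!/bin/sh
URL=http://example.com/files/   # placeholder
DEST=./downloads                # placeholder
EXPECTED=80

mkdir -p "$DEST"
while :; do
    # Count what we have so far; stop once all 80 are present.
    count=$(ls "$DEST" | wc -l)
    [ "$count" -ge "$EXPECTED" ] && break
    # Re-run the download; already-fetched files are simply refreshed.
    wget -q -r -np -nd -P "$DEST" "$URL"
    sleep 60    # wait a bit before checking the count again
done
```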
Hello.
How can I detect within a script that the downloaded file does not have the correct size?
linux:~ # wget --limit-rate=20k --ignore-length -O /Software_Downloaded/MULTIMEDIA_ADDON/skype-4.1.0.20-suse.i586.rpm ... (6 Replies)
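One way to detect a short download, sketched below, is to compare the local size against the Content-Length the server reports. Note that the original command used --ignore-length precisely because that header can be unreliable, so treat this as a heuristic; the URL here is a placeholder.

```shell
#!/bin/sh
URL=http://example.com/skype-4.1.0.20-suse.i586.rpm           # placeholder
FILE=/Software_Downloaded/MULTIMEDIA_ADDON/skype-4.1.0.20-suse.i586.rpm

# Ask the server for its headers without downloading the body.
expected=$(wget --spider --server-response "$URL" 2>&1 |
           awk 'tolower($1) == "content-length:" {print $2}' | tail -1)
actual=$(wc -c < "$FILE")

if [ -n "$expected" ] && [ "$actual" -ne "$expected" ]; then
    echo "download incomplete: got $actual of $expected bytes" >&2
    exit 1
fi
```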
Hi,
I need to implement below logic to download files daily from a URL.
* Need to check if it is yesterday's file (YYYY-MM-DD.dat)
* If present then download from URL (sample_url/2013-01-28.dat)
* Need to implement wait logic if not present
* If it is still not able to find the file... (1 Reply)
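The steps above can be sketched like this, assuming GNU date (for "yesterday") and the sample_url base from the post; the retry count and 5-minute wait are arbitrary choices.

```shell
#!/bin/sh
BASE=http://sample_url                       # placeholder base URL
FNAME=$(date -d yesterday +%Y-%m-%d).dat     # e.g. 2013-01-28.dat

tries=0
# Keep trying until the file appears, with a bounded number of attempts.
until wget -q "$BASE/$FNAME"; do
    tries=$((tries + 1))
    if [ "$tries" -ge 12 ]; then
        echo "$FNAME still not available, giving up" >&2
        exit 1
    fi
    sleep 300   # wait 5 minutes before retrying
done
```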
Hello All,
I have gone through Google and came to know that we can download images from a site using wget.
Now I have been asked to check whether an image is populated in a site or not. If yes, please send that image to an address as an attachment.
Say for example, the site is Wiki -... (6 Replies)
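A sketch of the check-and-mail idea: probe the URL without downloading, and only fetch and mail it if the probe succeeds. The image URL, the recipient, and the use of mutt for the attachment are all assumptions.

```shell
#!/bin/sh
IMG=http://example.org/images/logo.png   # placeholder image URL
TO=someone@example.com                   # placeholder recipient

# --spider checks existence without downloading the body.
if wget -q --spider "$IMG"; then
    wget -q -O /tmp/image.png "$IMG"
    echo "Image found, attached." |
        mutt -s "Image check" -a /tmp/image.png -- "$TO"
else
    echo "Image not present" >&2
fi
```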
Hi,
I need to basically get a list of all the tarballs located at uri.
I am currently doing a wget on uri to get the index.html page.
Now this index page contains the list of URIs that I want to use in my bash script.
Can someone please guide me?
I am new to Linux and shell scripting.
... (5 Replies)
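A sketch of pulling the tarball names out of the fetched index page; the URI and the .tar.gz extension are assumptions.

```shell
#!/bin/sh
URI=http://example.com/packages/   # placeholder
wget -q -O index.html "$URI"

# Extract href="...tar.gz" values, one bare filename per line.
grep -o 'href="[^"]*\.tar\.gz"' index.html |
    sed 's/^href="//; s/"$//' > tarballs.txt

# The list can then drive the rest of the script.
while read -r tb; do
    echo "would fetch: $URI$tb"
done < tarballs.txt
```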
Hello,
I have set up a Cherokee web server and PHP 5.2 in an OpenSolaris zone. The problem is that all .php files are downloaded from the web server instead of being served when I use the IP address rather than the DNS name in the web browser.
Example: test.mydomain.com <-- php works
192.168.0.10/index.php <--... (3 Replies)
Hi everybody, I would greatly appreciate some expertise in this matter. I am trying to find an efficient way to batch download files from a website and rename each file with the URL it originated from (from the CLI). (i.e. instead of xyz.zip, the output file would be http://www.abc.com/xyz.zip) A... (10 Replies)
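Since "/" cannot appear literally in a filename, one workable variant of the idea maps slashes to underscores when building the output name. The urls.txt input file (one URL per line) and the underscore mapping are assumptions.

```shell
#!/bin/sh
# For each URL in urls.txt, save the download under a name derived
# from the full URL, with "/" replaced by "_".
while read -r url; do
    out=$(printf '%s' "$url" | sed 's|/|_|g')
    wget -q -O "$out" "$url"
done < urls.txt
```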
I am doing an FTP of around 1010 files and I am using mput for this. For some reason it's only transferring 10 or 20 files and the rest are not getting transferred. There is some socket error in the log. Is there an issue if we have more than 50 or so files for mput?
here is the o/p in the log... (2 Replies)
I need to download some files from a remote server using ftp. I have ftp'd into the site. I then do an mget * to retrieve all of the data files. Everything seems to proceed normally and I am given feedback that the files were downloaded. Now if I go into the DOS Shell or Windows explorer, it list... (5 Replies)