It would help if you shared the code you're using, along with a description of how it fails and the desired result (which I assume is to have the URL watermarked onto an image). Don't assume that we are familiar with the tools you're using. However, even without specific knowledge of the tools involved, if there is a shortcoming in your shell script, we may be able to assist.
If you want to watermark an image every time wget fetches one, you have to call wget separately for each URL in a shell-script loop. Then, every time wget successfully downloads an image, the script calls another tool to add the watermark. The tricky part is how you save the images into your local directories.
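As a minimal sketch of that loop (the `urls.txt` list, the `images/` directory, and the ImageMagick `mogrify` watermark command are all my assumptions, not anything from your post):

```shell
#!/bin/sh
# Fetch each URL on its own so a watermark step can run after every
# successful download. urls.txt (one URL per line) is a placeholder.

local_name() {
    # Derive a local filename from a URL: keep everything after the last '/'.
    printf 'images/%s\n' "${1##*/}"
}

mkdir -p images
[ -f urls.txt ] && while IFS= read -r url; do
    file=$(local_name "$url")
    if wget -q -O "$file" "$url"; then
        # Stamp the source URL along the bottom edge (ImageMagick).
        mogrify -gravity south -pointsize 14 -fill white \
                -annotate +0+5 "$url" "$file"
    fi
done < urls.txt
```

Saving each file under the basename of its URL keeps the URL-to-file mapping recoverable later.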
Here's what I have so far:
Now I just need to be able to string these together and add the URL-to-watermark feature.
I also have a piece of code that isolates the filename from the URL:
Theoretically, one could compare the filename to the addresses in "picasalist", then pass the corresponding URL off to the mogrify command and presto, mission accomplished! I just wish my technical ability was on par with my aspirations, lol. I have to say, though, the kind people on these forums have always helped me in the right direction.
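A hypothetical sketch of that matching step, assuming "picasalist" is a plain-text file with one image URL per line (the `images/` directory and `.jpg` extension are my guesses):

```shell
#!/bin/sh
# For each downloaded image, find the URL in picasalist whose basename
# matches the file, then hand that URL to mogrify as the watermark text.

url_for_file() {
    # $1 = bare filename, $2 = path to the URL list; prints the first
    # URL whose last path component equals $1.
    while IFS= read -r u; do
        [ "${u##*/}" = "$1" ] && { printf '%s\n' "$u"; return 0; }
    done < "$2"
    return 1
}

for img in images/*.jpg; do
    [ -f "$img" ] || continue
    url=$(url_for_file "${img##*/}" picasalist) || continue
    mogrify -gravity south -fill white -annotate +0+5 "$url" "$img"
done
```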
Hello,
I have a server that I have to ftp files off, and they all start with SGRD followed by 6 numbers.
SGRD000001
SGRD000002
SGRD000003
The script I have will run every 10 minutes to pick up files, as new ones will be coming in all the time, and what I want to do is delete the files I have... (7 Replies)
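One hedged way to handle the delete-after-fetch part, assuming `lftp` is available (host, credentials, and remote directory below are placeholders): `mget -E` removes each remote file once it has transferred cleanly, so the next 10-minute run only sees files that are genuinely new.

```shell
#!/bin/sh
# Match names like SGRD000001: "SGRD" followed by exactly six digits.
is_sgrd() {
    case $1 in
        SGRD[0-9][0-9][0-9][0-9][0-9][0-9]) return 0 ;;
        *) return 1 ;;
    esac
}

# The actual transfer; -E deletes each remote file after a good download.
# lftp -u myuser,mypass ftp://ftp.example.com \
#      -e 'cd /incoming; mget -E SGRD*; quit'
```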
Hi,
In an sftp script to get files, I have to rename all the files that I am picking up. The rename command does not work here. Is there any way to do this?
I am using #!/bin/ksh
For eg: sftp user@host <<EOF
cd /path
get *.txt
rename *.txt *.txt.done
... (7 Replies)
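sftp's `rename` takes a single source and a single target, so a wildcard like `*.txt` won't expand there. One workaround (a sketch; `user@host` and `/path` are from the example above, the rest is assumed) is to build a batch file with one `rename` per file and feed it to `sftp -b`:

```shell
#!/bin/ksh
# Turn a list of remote paths (stdin, one per line) into sftp rename
# commands that append ".done" to each name.
build_renames() {
    while IFS= read -r f; do
        printf 'rename %s %s.done\n' "$f" "$f"
    done
}

# Get the remote listing, build the batch file, then run it:
# echo "ls -1 /path/*.txt" | sftp -q user@host | grep '\.txt$' |
#     build_renames > batch.txt
# sftp -b batch.txt user@host
```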
Need assistance in writing a for loop, or any looping method. Below is the code where I can get all the files from the URL. There are about 80 files at the URL, and they get updated every day. The script I want must keep looping until it gets all 80 files. It matches the count... (5 Replies)
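Without seeing the code, here is a hedged outline of such a loop: count what has landed locally and retry the fetch until the expected 80 files are present. The URL, directory name, and sleep interval are placeholders; only the count of 80 comes from the question.

```shell
#!/bin/sh
# Count the files currently in a directory (0 if it doesn't exist yet).
count_files() {
    ls "$1" 2>/dev/null | wc -l | tr -d ' '
}

want=80
# mkdir -p downloads
# while [ "$(count_files downloads)" -lt "$want" ]; do
#     wget -q -r -l1 -nd -np -P downloads "http://example.com/daily/"
#     sleep 300
# done
```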
hello.
How can I detect within the script that the downloaded file did not have the correct size?
linux:~ # wget --limit-rate=20k --ignore-length -O /Software_Downloaded/MULTIMEDIA_ADDON/skype-4.1.0.20-suse.i586.rpm ... (6 Replies)
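One hedged approach: ask the server for the expected length with a spider probe, then compare it to what is on disk. Since `--ignore-length` is being used, the Content-Length header is already suspect here, so treat this only as a sanity check; the awk probe is my assumption about what the server reports.

```shell
#!/bin/sh
# True if the file's byte count equals the expected size.
size_ok() {
    # $1 = local file, $2 = expected size in bytes
    [ $(wc -c < "$1") -eq "$2" ]
}

# expected=$(wget --spider --server-response "$url" 2>&1 |
#            awk '/Content-Length/ {len=$2} END {print len}')
# size_ok skype-4.1.0.20-suse.i586.rpm "$expected" || echo "short download" >&2
```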
Hello All,
I have searched Google and learned that we can download images from a site using wget.
Now I have been asked to check whether an image is populated on a site or not. If it is, please send that image to an address as an attachment.
Say for example, the site is Wiki -... (6 Replies)
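A sketch under heavy assumptions (the URL, the recipient address, and the availability of uuencode/mailx are all guesses):

```shell
#!/bin/sh
# True if a downloaded file exists and is not empty, i.e. the image
# was actually populated on the page.
have_image() {
    [ -s "$1" ]
}

# wget -q -O /tmp/check.png "http://example.com/images/logo.png"
# if have_image /tmp/check.png; then
#     uuencode /tmp/check.png logo.png | mailx -s "image present" you@example.com
# fi
```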
Hi All
I have a folder that contains hundreds of file with a names
3.msa
4.msa
21.msa
6.msa
345.msa
456.msa
98.msa
...
...
...
I need to rename each of these files by adding "core_" to the beginning of each name, such as
core_3.msa
core_4.msa
core_21.msa (4 Replies)
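For the thread above, a minimal rename loop (pass the folder in as shown, or use `.` from inside it):

```shell
#!/bin/sh
# Prefix every .msa file in the given directory with "core_".
prefix_msa() {
    for f in "$1"/*.msa; do
        [ -f "$f" ] || continue
        mv -- "$f" "$1/core_${f##*/}"
    done
}

# prefix_msa .
```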
Hi,
I basically need to get a list of all the tarballs located at the uri.
I am currently doing a wget on the uri to get the index.html page.
Now this index page contains the list of uris that I want to use in my bash script.
Can someone please guide me?
I am new to Linux and shell scripting.
... (5 Replies)
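A hedged sketch of the scraping step: pull the index page and grep the tarball names out of the href attributes. The URI below and the `.tar.gz` extension are assumptions; adjust the pattern to whatever the index actually links to.

```shell
#!/bin/sh
# Read HTML on stdin and print one .tar.gz link target per line.
list_tarballs() {
    grep -o 'href="[^"]*\.tar\.gz"' | sed 's/^href="//; s/"$//'
}

# wget -q -O - "http://example.com/releases/" | list_tarballs
```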
Hello,
I have set up a Cherokee web server and PHP 5.2 in an OpenSolaris zone. The problem is that all .php files are downloaded from the web server rather than served when I use the IP address instead of the DNS name in the web browser.
Example: test.mydomain.com <-- php works
192.168.0.10/index.php <--... (3 Replies)
I am trying to download a page and retrieve only the wav and mp3 files via wget.
The website is:
Alarm Sounds | Free Sound Effects | Alarm Sound Clips | Sound Bites
My command is:
wget -rl 2 -e robots=off -A wav,mp3 http://soundbible.com/tags-alarm.html
When not using the -A wav,mp3... (2 Replies)
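If the accept list keeps missing files, one hedged alternative is to scrape the audio URLs out of the page yourself and fetch them directly. The grep pattern below is a guess at how the links appear in the HTML, so verify it against the actual page source.

```shell
#!/bin/sh
# Read HTML on stdin and print absolute .mp3/.wav URLs, one per line.
audio_urls() {
    grep -Eo 'http[^" ]+\.(mp3|wav)'
}

# wget -q -O - http://soundbible.com/tags-alarm.html | audio_urls |
#     while IFS= read -r u; do wget -q "$u"; done
```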
I need to download some files from a remote server using ftp. I have ftp'd into the site. I then do an mget * to retrieve all of the data files. Everything seems to proceed normally, and I am given feedback that the files were downloaded. But if I go into the DOS shell or Windows Explorer, it list... (5 Replies)
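A common cause: mget saves into whatever local directory the ftp client was started from, so the files may simply be somewhere unexpected. A sketch that pins both directories explicitly before the transfer, plus binary mode for data files (host, user, and paths below are placeholders):

```shell
#!/bin/sh
# Emit an ftp command script that sets the local and remote directories
# before the mget, so the destination is unambiguous.
make_ftp_script() {
    # $1 = local directory, $2 = remote directory
    printf 'binary\nlcd %s\ncd %s\nprompt off\nmget *\nbye\n' "$1" "$2"
}

# make_ftp_script "C:/downloads" /remote/data | ftp -i -v ftp.example.com
```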