Shell Programming and Scripting > Help with WGET and renaming downloaded files :( (Post 302446828 by konsolebox, Thursday 19th of August 2010, 10:25:08 PM)
If you want to watermark an image every time wget fetches it, you have to separate each wget call per URL in a loop in a shell script. Then, every time wget successfully downloads an image, the script calls another tool to add the watermark to it. The remaining question is how you want to save the images into your local directories.
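
A minimal sketch of that loop, assuming the image URLs are listed one per line in a file called urls.txt, the watermark is in watermark.png, and ImageMagick's composite utility is installed (the file names and the images/ directory are placeholders):

Code:
#!/bin/sh
# Download each image, then stamp it with ImageMagick's composite.
mkdir -p images
while read -r url; do
    file="images/$(basename "$url")"
    # -O still creates an empty file if the download fails; this is only a sketch.
    if wget -O "$file" "$url"; then
        # Overlay watermark.png in the bottom-right corner, writing back in place.
        composite -gravity southeast watermark.png "$file" "$file"
    fi
done < urls.txt

Keeping one wget call per URL is what lets the script react to each download individually, instead of a single recursive wget that only reports when everything is finished.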

As for the watermarking itself, there are plenty of tutorials on the web, for example these pages:
Resize and Watermark Images in Linux | SavvyAdmin.com
Batch Watermark Images in Linux | Tux Tweaks
 

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

question regarding ftp. Files downloaded are of size Zero.

I need to download some files from a remote server using ftp. I have ftp'd into the site. I then do an mget * to retrieve all of the data files. Everything seems to proceed normally and I am given feedback that the files were downloaded. Now if I go into the DOS Shell or Windows explorer, it list... (5 Replies)
Discussion started by: ralphisnow
5 Replies

2. Shell Programming and Scripting

how to limit files downloaded by wget

I am trying to download a page and retrieve only wav and mp3 files via wget. The website is: Alarm Sounds | Free Sound Effects | Alarm Sound Clips | Sound Bites. My command is: wget -rl 2 -e robots=off -A wav,mp3 http://soundbible.com/tags-alarm.html When not using the -A wav,mp3... (2 Replies)
Discussion started by: Narnie
2 Replies

3. Web Development

php files are downloaded

Hello, I have set up the Cherokee web server and PHP 5.2 in an OpenSolaris zone. The problem is that all .php files are downloaded from the web server instead of being served when I use the IP address rather than the DNS name in the web browser. Example: test.mydomain.com <-- php works 192.168.0.10/index.php <--... (3 Replies)
Discussion started by: kreno
3 Replies

4. Shell Programming and Scripting

Extract urls from index.html downloaded using wget

Hi, I need to get a list of all the tarballs located at a URI. I am currently doing a wget on the URI to get the index.html page. Now this index page contains the list of URIs that I want to use in my bash script. Can someone please guide me? I am new to Linux and shell scripting. ... (5 Replies)
Discussion started by: mnanavati
5 Replies
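
For what it's worth, a rough sketch of that kind of extraction, assuming the listing has already been saved as index.html and the links are absolute URLs ending in .tar.gz (both assumptions):

Code:
#!/bin/sh
# Pull the href targets that look like tarballs out of the downloaded listing.
grep -o 'href="[^"]*\.tar\.gz"' index.html |
    sed 's/^href="//; s/"$//' > tarballs.txt

# Fetch each extracted URL.
while read -r url; do
    wget "$url"
done < tarballs.txt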

5. Shell Programming and Scripting

renaming files or adding a name in the beginning of all files in a folder

Hi All, I have a folder that contains hundreds of files with names like 3.msa 4.msa 21.msa 6.msa 345.msa 456.msa 98.msa ... ... ... I need to rename each of these files by adding "core_" at the beginning of the name, such as core_3.msa core_4.msa core_21.msa (4 Replies)
Discussion started by: Lucky Ali
4 Replies
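
A minimal sketch of that rename, assuming the files sit in the current directory:

Code:
#!/bin/sh
# Prepend "core_" to every .msa file in the current directory.
for f in *.msa; do
    mv "$f" "core_$f"
done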

6. Shell Programming and Scripting

Specific image to be downloaded with wget

Hello All, I have gone through Google and came to know that we can download images from a site using wget. Now I have been asked to check whether an image is present on a site or not. If yes, please send that image to an address as an attachment. Say for example, the site is Wiki -... (6 Replies)
Discussion started by: sathyaonnuix
6 Replies
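
A rough sketch of that kind of check, where the page URL, image name, and recipient address are all placeholders, and uuencode plus mail are assumed to be available for the attachment:

Code:
#!/bin/sh
page="http://example.com/somepage.html"    # placeholder page URL
image="logo.png"                           # placeholder image file name
rcpt="someone@example.com"                 # placeholder recipient

# Fetch the page and see whether it references the image.
if wget -qO- "$page" | grep -q "$image"; then
    # It does: pull the image down and mail it as a uuencoded attachment.
    wget -q "http://example.com/$image"
    uuencode "$image" "$image" | mail -s "Image found on page" "$rcpt"
fi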

7. Shell Programming and Scripting

BASH scripting - Preventing wget messed downloaded files

hello. How can I detect, within a script, that a downloaded file does not have the correct size? linux:~ # wget --limit-rate=20k --ignore-length -O /Software_Downloaded/MULTIMEDIA_ADDON/skype-4.1.0.20-suse.i586.rpm ... (6 Replies)
Discussion started by: jcdole
6 Replies
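
One way to detect a short download after the fact is to compare the local size with the Content-Length the server advertises; a sketch assuming GNU stat and a server that actually sends Content-Length (the URL is a placeholder):

Code:
#!/bin/sh
url="http://example.com/skype-4.1.0.20-suse.i586.rpm"                       # placeholder URL
file="/Software_Downloaded/MULTIMEDIA_ADDON/skype-4.1.0.20-suse.i586.rpm"

# Ask the server for the headers only and extract Content-Length.
expected=$(wget --spider --server-response "$url" 2>&1 |
           tr -d '\r' | awk '/Content-Length:/ {print $2; exit}')

actual=$(stat -c %s "$file")    # GNU stat; use `wc -c < "$file"` on other systems

if [ "$actual" -ne "$expected" ]; then
    echo "Download incomplete: got $actual of $expected bytes" >&2
    exit 1
fi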

8. Shell Programming and Scripting

For loop till the files downloaded

Need assistance in writing a for loop script or any other looping method. Below is the code where I can get all the files from the URL. There are about 80 files at the URL, and every day the files get updated. The script I want needs the loop to keep running until it gets all 80 files. It matches the count... (5 Replies)
Discussion started by: ajayram_arya
5 Replies
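
A minimal sketch of such a loop, assuming the files land in a local downloads/ directory, the expected count is 80, and the URL is a placeholder:

Code:
#!/bin/sh
expected=80
mkdir -p downloads

# Keep re-running the fetch until all expected files have arrived.
until [ "$(ls downloads | wc -l)" -ge "$expected" ]; do
    wget -r -np -nd -nc -P downloads "http://example.com/daily/"   # placeholder URL
    sleep 60    # pause before checking and retrying
done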

9. Shell Programming and Scripting

Renaming multiple files in sftp server in a get files script

Hi, In sftp script to get files, I have to rename all the files which I am picking. Rename command does not work here. Is there any way to do this? I am using #!/bin/ksh For eg: sftp user@host <<EOF cd /path get *.txt rename *.txt *.txt.done ... (7 Replies)
Discussion started by: jhilmil
7 Replies
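
Since sftp's rename takes exactly one source and one target, a wildcard rename has to be expanded into individual commands. A rough sketch, assuming key-based authentication, that the host and path from the post are placeholders, and that the get has already brought the files into the current directory:

Code:
#!/bin/ksh
# First session: pull the files down.
sftp -b - user@host <<'EOF'
cd /path
get *.txt
EOF

# Second session: rename, on the server, each file that arrived locally.
{
    echo "cd /path"
    for f in *.txt; do
        echo "rename $f $f.done"
    done
} | sftp -b - user@host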

10. Shell Programming and Scripting

Deleting multiple files off an ftp server once they have been downloaded

Hello, I have a server that I have to ftp files off, and they all start with SGRD followed by 6 numbers. SGRD000001 SGRD000002 SGRD000003 The script I have will run every 10 mins to pick up files as new ones will be coming in all the time, and what I want to do is delete the files I have... (7 Replies)
Discussion started by: sph90457
7 Replies
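
A sketch of that cleanup in plain ftp, which has no conditional logic of its own, so the delete commands are generated from the files that actually made it to the local directory (host, user, password, and remote path are all placeholders):

Code:
#!/bin/sh
host="ftp.example.com"      # placeholder host
user="myuser"               # placeholder user
pass="mypass"               # placeholder password

# First session: fetch everything matching SGRD*.
ftp -inv "$host" <<EOF
user $user $pass
cd /outgoing
mget SGRD*
bye
EOF

# Second session: delete only the files that now exist locally.
{
    echo "user $user $pass"
    echo "cd /outgoing"
    for f in SGRD*; do
        [ -f "$f" ] && echo "delete $f"
    done
    echo "bye"
} | ftp -inv "$host"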
WMGRABIMGAE(1)                    General Commands Manual                    WMGRABIMGAE(1)

NAME
       WMGRABIMGAE - Dockable WWW Image monitor.

SYNOPSIS
       wmGrabImage [-h] [-display <Display>] -url <Image URL> [-http <URL>] [-c] [-delay <Time>]

DESCRIPTION
       wmGrabImage is a WindowMaker DockApp that maintains a small thumbnail copy of your
       favorite image from the WWW. The image to monitor is specified via the "-url <Image URL>"
       command-line option and it gets updated approximately every 5 minutes. The update
       interval can be overridden via the "-delay <Time>" command-line option (Time is in
       seconds).

       Each of the three mouse buttons can be double clicked with the following effects:

       Left Mouse:    Brings up the full-sized image in xv.
       Middle Mouse:  Sends a URL (specified via the -http <URL> command-line option) to an
                      already running netscape process, or to a new netscape process if there
                      aren't any running.
       Right Mouse:   Updates the image immediately.

OPTIONS
       -h                   Display list of command-line options.
       -display <Display>   Use an alternate X Display.
       -url <Image URL>     The URL of the WWW image to monitor.
       -http <URL>          The URL to send to netscape via a Middle double click.
       -c                   Center the image vertically within the icon.
       -delay <Time>        The time between updates. The default is about 5 minutes.

FILES
       The original sized image and the thumbnail XPM image are both stored in ~/.wmGrabImage/,
       which gets created if it doesn't already exist.

SEE ALSO
       wget and the ImageMagick convert utility.

BUGS
       Who knows? -- it's still Beta though. (Let me know if you find any.) Oldish versions of
       the ImageMagick convert utility have a memory leak -- if you have that problem, upgrade
       to the latest version.

AUTHOR
       Michael G. Henderson <mghenderson@lanl.gov>

16 December 1998                                                              WMGRABIMGAE(1)