I am sorry if I am in the wrong place!
I have been looking for a way to isolate, and FTP off the server, hundreds of images which are no longer doing anything there, that is, which are not linked from any page.
The only thing I found (free) was the following script. I am useless at html/css, so this script is Chinese to me!
Could you tell me if it would work, what I have to change, and where to place it, please?
Please use CODE tags, not ICODE.
Please convert the output file from Windows to Linux format; the current output is invalid, as it is missing line breaks.
You could use Notepad++ or any other TEXT editor (= NOT MS Word!) that lets you save/convert files to Linux line endings with UTF-8 without BOM, then copy-paste again.
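If you'd rather do the conversion from a command line than from an editor, a minimal sketch (the function name is mine, not from the script; it only strips carriage returns, it does not touch any BOM):

```shell
#!/bin/sh
# Minimal CRLF-to-LF conversion: delete the carriage-return characters.
# Writes a converted copy instead of editing the file in place.
crlf_to_lf() {
    tr -d '\r' < "$1" > "$2"
}
```

For example, `crlf_to_lf files_to_remove.txt files_to_remove.unix.txt`. Dedicated tools like dos2unix do the same job, if installed.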
If you have a bash environment on Windows, I'd put the script in $HOME/bin or /bin.
Then you could just type remover.sh in the shell and pass the values; otherwise, you'd need to change to the dir and type ./remover.sh, or just type the /full/path/to/remover.sh.
/path/to/clear represents the full path to the local files; it has to be quoted if it contains spaces.
BASEURL should be a quoted URL, like "http://www.pintotours.com"
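A hypothetical invocation (the directory is an example, not a real path), plus a quick demonstration of why the path must be quoted when it contains spaces:

```shell
#!/bin/sh
# Hypothetical call -- substitute your own directory and site URL:
#   remover.sh "/home/user/my images" "http://www.pintotours.com"
# Why the quotes matter: an unquoted variable undergoes word splitting.
dir="/home/user/my images"   # example path containing a space
set -- $dir                  # unquoted: the shell splits it into two arguments
unquoted_count=$#
set -- "$dir"                # quoted: it stays one argument
quoted_count=$#
echo "unquoted=$unquoted_count quoted=$quoted_count"   # prints: unquoted=2 quoted=1
```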
The images are on the server, not on my computer. The important thing to start with is to separate the images, i.e. the ones not being used could be put in another folder.
After that I could FTP them to my computer. So, I suppose we can forget about my Windows machine, unless you are telling me that the commands have to come from there.
I'm sorry, but I really don't understand this code.
Does the script need altering in any way? I just added bits and pieces without knowing what I was doing!
It should save a new file called files_to_remove.txt with a list of all unused images.
I highly recommend getting such a list first, before (re)moving anything.
Assuming that the list contains only the filenames, and that the images are in a single folder, you could then try something like:
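If the script you found turns out not to work, the core idea can be sketched in a few lines of shell. This is my own sketch, not the script from the thread: it assumes all images sit in one folder, the pages are *.html/*.php files under the web root, and GNU grep is available (for --include). Treat the result as candidates, not certainties, since a page could build an image URL dynamically.

```shell
#!/bin/sh
# Sketch: write the names of images that no .html/.php page mentions
# into files_to_remove.txt. Layout is assumed -- adjust to your server.
find_unused() {
    webroot=$1   # directory holding the html/php pages
    imgdir=$2    # directory holding the images
    : > files_to_remove.txt
    for img in "$imgdir"/*; do
        [ -f "$img" ] || continue
        name=$(basename "$img")
        # GNU grep: search only the pages, skip the images themselves
        if ! grep -rqF --include='*.html' --include='*.php' "$name" "$webroot"; then
            printf '%s\n' "$name" >> files_to_remove.txt
        fi
    done
}
```

For example, `find_unused /var/www/pintotours /var/www/pintotours/images` (hypothetical paths) would leave the list in files_to_remove.txt in the current directory.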
Code:
put files_to_remove.txt                 # in the FTP session: upload the list
mkdir /bkp_imgs                         # then, in a shell on the server:
cd www/pintotours/images
while read img ; do mv "$img" /bkp_imgs ; done < files_to_remove.txt
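Before running that loop on the server, it may be worth a dry run that only prints what would happen. A sketch, assuming the list holds one bare filename per line (the function name and paths are mine, not from the thread):

```shell
#!/bin/sh
# Dry run: report what the move loop would do, without moving anything.
dry_run() {
    imgdir=$1   # folder the images currently live in
    list=$2     # the files_to_remove.txt list
    while IFS= read -r img; do
        if [ -f "$imgdir/$img" ]; then
            printf 'would move: %s/%s -> /bkp_imgs/\n' "$imgdir" "$img"
        else
            printf 'missing: %s/%s\n' "$imgdir" "$img"
        fi
    done < "$list"
}
```

Any "missing" lines point at list entries that don't match a file, which usually means the list contains paths rather than bare filenames.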
The files (images) are on a server running Apache. All I need really is to separate the images there from where they are (doing nothing, as they are not linked to any html or php file) and place their names in a new file ON THE SERVER called, say, files_to_remove.txt, as you said.
Then the question of dealing with them is not a problem for me, and I do not need any scripts.
So will this script do the job on the server, and how would I start? I take it that I would have to upload it and then somehow get it to do the job.