Shell Programming and Scripting: shellscript for download files
Post 98291 by narsing on Tuesday 7th of February 2006, 07:16 AM
Please confirm whether the given script works for you.
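The script narsing refers to is not included in this excerpt. As a rough sketch of the kind of download helper this thread discusses (the function name, flags, and fallback logic here are illustrative, not from the original post):

```shell
# fetch_file URL [OUTDIR] -- download one file into OUTDIR (default: .),
# preferring wget and falling back to curl if wget is not installed.
fetch_file() {
    url=$1
    outdir=${2:-.}
    # Refuse to run without a URL argument.
    [ -n "$url" ] || { echo "usage: fetch_file URL [OUTDIR]" >&2; return 1; }
    mkdir -p "$outdir" || return 1
    if command -v wget >/dev/null 2>&1; then
        # -P sets wget's download directory.
        wget -q -P "$outdir" "$url"
    elif command -v curl >/dev/null 2>&1; then
        # curl -O writes to the current directory, so cd in a subshell.
        ( cd "$outdir" && curl -fsS -O "$url" )
    else
        echo "fetch_file: neither wget nor curl found" >&2
        return 1
    fi
}
```

Called, for example, as `fetch_file http://example.com/data.dat /tmp/downloads`; both URL and directory are placeholders.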
 

ps2frag(1)                      General Commands Manual                      ps2frag(1)

NAME
       ps2frag - obsolete shell script for the PSfrag system.

IMPORTANT NOTICE
       The new PSfrag system no longer requires the ps2frag script; instead, it
       handles the processing entirely within TeX/LaTeX and DVIPS. I'm sure you
       will agree that never needing to run ps2frag again is a nice convenience!
       However, there are two significant differences in the way this new
       version of PSfrag works. Please make yourself aware of them:

       1) XDvi is no longer able to determine where your PSfrag replacements
          should go, so instead it lines them up in a vertical list to the left
          of the figure. This at least allows you to confirm that they have
          been typeset properly. However, to confirm that PSfrag positions your
          replacements properly, you will have to view the PostScript version
          of your file with a viewer like GhostView, or print it out. This
          seems to be the only disadvantage to the elimination of the
          pre-processing step.

       2) If you embed `\tex' commands inside your figures, you now need to
          explicitly _tell_ PSfrag to process these commands. To do so, use
          \usepackage[scanall]{psfrag} instead of \usepackage{psfrag} at the
          beginning of your LaTeX file. If you only use `\tex' commands in a
          small number of figures, a more efficient approach might be to turn
          on `\tex'-scanning only for those figures. To do that, add the
          command \psfragscanon immediately before each relevant
          \includegraphics or \epsfbox command.

NOTES
       See the PSfrag documentation for further information.

SEE ALSO
       dvips(1), gs(1), ghostview(1), latex(1)

AUTHORS
       psfrag@rascals.stanford.edu
              The PSfrag maintainer's mailing list.

TeXware                                  Feb 95                              ps2frag(1)
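The per-figure workaround described under point 2 can be sketched as a minimal LaTeX document (the figure file name is a placeholder):

```latex
\documentclass{article}
\usepackage{psfrag}   % or \usepackage[scanall]{psfrag} to scan every figure
\usepackage{graphicx}
\begin{document}
% Turn on \tex-scanning for this one figure only:
\psfragscanon
\includegraphics{figure1.eps}  % placeholder file name
\end{document}
```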
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.