Top Forums Shell Programming and Scripting For loop till the files downloaded Post 302852475 by Corona688 on Wednesday 11th of September 2013 04:01:37 PM
Why 80? 80 of what? You're downloading recursively; do sub-downloads count towards the 80?

wget has retry options of its own which may make things a great deal simpler.
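A minimal sketch of both approaches (the URL and filenames are placeholders, not from the original thread): wget's own --tries, --waitretry, and --retry-connrefused options usually remove the need for a hand-written loop, but a generic retry wrapper is useful when the success test is something other than wget's exit status.

```shell
# 1) Let wget retry on its own: up to 20 attempts, pausing between
#    retries, and also retrying when the connection is refused:
#
#      wget --tries=20 --waitretry=5 --retry-connrefused "$url"
#
# 2) A generic retry wrapper: re-run any command until it succeeds
#    or MAX_TRIES attempts have been made.
retry() {
    max=${MAX_TRIES:-20} n=0
    until "$@"; do
        n=$((n + 1))
        [ "$n" -ge "$max" ] && return 1   # give up after $max attempts
        sleep "${RETRY_DELAY:-5}"
    done
}
```

Used as `retry wget -q -O out.html "$url"`, the loop condition becomes "the download succeeded" rather than a fixed count of files.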
 

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

question regarding ftp. Files downloaded are of size Zero.

I need to download some files from a remote server using ftp. I have ftp'd into the site. I then do an mget * to retrieve all of the data files. Everything seems to proceed normally and I am given feedback that the files were downloaded. Now if I go into the DOS Shell or Windows explorer, it list... (5 Replies)
Discussion started by: ralphisnow

2. UNIX for Dummies Questions & Answers

Loop till you find a string in a file <-- Need Help, New to Unix Scripting

Guys - I am new to Unix scripting and am in need of a script that does the following. I have bits and pieces created and tested, but I am just having a little difficulty getting it all together. - Loop through till it finds a string in a specific file. Any help is greatly appreciated. ... (1 Reply)
Discussion started by: mrehman

3. Shell Programming and Scripting

How to print lines till a pattern is matched in a loop

Dear All I have a file like this 112534554 446538656 444695656 225696966 226569744 228787874 113536566 443533535 222564552 115464656 225445345 225533234 I want to cut the file into different parts where the first two columns are '11' . The first two columns will be either... (3 Replies)
Discussion started by: anoopvraj

4. Shell Programming and Scripting

how to limit files downloaded by wget

I am trying to download a page and retrieve only wav and mp3 files via wget. the website is: Alarm Sounds | Free Sound Effects | Alarm Sound Clips | Sound Bites my command is : wget -rl 2 -e robots=off -A wav,mp3 http://soundbible.com/tags-alarm.html When not using the -A wav,mp3... (2 Replies)
Discussion started by: Narnie

5. Shell Programming and Scripting

Help with WGET and renaming downloaded files :(

Hi everybody, I would greatly appreciate some expertise in this matter. I am trying to find an efficient way to batch download files from a website and rename each file with the URL it originated from (from the CLI). (ie. Instead of xyz.zip, the output file would be http://www.abc.com/xyz.zip) A... (10 Replies)
Discussion started by: o0110o

6. Web Development

php files are downloaded

Hello, I have set up the Cherokee web server and PHP 5.2 in an OpenSolaris zone. The problem is that all .php files are downloaded from the web server rather than served when I use the IP address instead of the DNS name in the web browser. Example: test.mydomain.com <-- php works 192.168.0.10/index.php <--... (3 Replies)
Discussion started by: kreno

7. Shell Programming and Scripting

BASH scripting - Detecting corrupted wget downloads

hello. How can I detect within a script that the downloaded file does not have the correct size? linux:~ # wget --limit-rate=20k --ignore-length -O /Software_Downloaded/MULTIMEDIA_ADDON/skype-4.1.0.20-suse.i586.rpm ... (6 Replies)
Discussion started by: jcdole

8. Shell Programming and Scripting

Grep with loop till search is done

I need help to put a script where it runs the svn command grep'ing for the ticket# in the comments to see if the ticket was used in the latest commit. so on command line: ./test.sh ticket-1 ticket-2 ticket-3 It should be able to check if ticket-1 is used first and if not then check if... (2 Replies)
Discussion started by: iaav

9. Shell Programming and Scripting

While loop till length of line is great enough

I have the following code: # Get the line of stations_info.txt starting with "${xstation1} " and copy it to file temp.txt grep "^${xstation1} " stations_info.txt > temp.txt # Get lat and long of station nl=0 ... (2 Replies)
Discussion started by: claire.a

10. Shell Programming and Scripting

Deleting multiple files off an ftp server once they have been downloaded

Hello, I have a server that I have to ftp files off and they all start SGRD and are followed by 6 numbers. SGRD000001 SGRD000002 SGRD000003 The script I have will run every 10 mins to pick up files as new ones will be coming in all the time and what I want to do is delete the files I have... (7 Replies)
Discussion started by: sph90457
GNUNET-DOWNLOAD(1)					      General Commands Manual						GNUNET-DOWNLOAD(1)

NAME
       gnunet-download - a command line interface for downloading files from GNUnet

SYNOPSIS
       gnunet-download [OPTIONS] -- GNUNET_URI

DESCRIPTION
       Download files from GNUnet.

       -a LEVEL, --anonymity=LEVEL
              Set the desired level of receiver anonymity. Default is 1.

       -c FILENAME, --config=FILENAME
              Use config file (default: ~/.gnunet/gnunet.conf).

       -D, --delete-incomplete
              Causes gnunet-download to delete incomplete downloads when aborted with CTRL-C. Note that complete files that are
              part of an incomplete recursive download will not be deleted even with this option. Without this option, terminating
              gnunet-download with a signal will cause incomplete downloads to stay on disk. If gnunet-download runs to (normal)
              completion, finishing the download, this option has no effect.

       -h, --help
              Print the help page.

       -L LOGLEVEL, --loglevel=LOGLEVEL
              Change the loglevel. Possible values for LOGLEVEL are ERROR, WARNING, INFO and DEBUG.

       -n, --no-network
              Only search locally; do not forward requests to other peers.

       -o FILENAME, --output=FILENAME
              Write the file to FILENAME. Hint: when recursively downloading a directory, append a '/' to the end of FILENAME to
              create a directory of that name. If no FILENAME is specified, gnunet-download constructs a temporary ID from the URI
              of the file. The final filename is constructed based on meta-data extracted using libextractor (if available).

       -p DOWNLOADS, --parallelism=DOWNLOADS
              Set the maximum number of parallel downloads that is allowed. More parallel downloads can, to some extent, improve
              the overall time to download content. However, parallel downloads also take more memory (see also option -r, which
              can be used to limit memory utilization) and more sockets. This option is used to limit the number of files that are
              downloaded in parallel (-r can be used to limit the number of blocks that are concurrently requested). As a result,
              the value only matters for recursive downloads. The default value is 32.

       -r REQUESTS, --request-parallelism=REQUESTS
              Set the maximum number of parallel requests that is allowed. If multiple files are downloaded, gnunet-download will
              not run them in parallel if this would cause the number of pending requests to possibly exceed the given value. This
              is useful since, for example, downloading dozens of multi-gigabyte files in parallel could exhaust memory resources
              and would hardly improve performance. Note that the limit only applies to this specific process and that other
              download activity by other processes is not included in this limit. Consider raising this limit for large recursive
              downloads with many large files if memory and network bandwidth are not fully utilized and if the parallelism limit
              (-p option) is not reached. This option also only matters for recursive downloads. The default value is 4092.

       -R, --recursive
              Download directories recursively (and in parallel); note that the URI must belong to a GNUnet directory and that the
              filename given must end with a '/' -- otherwise, only the file corresponding to the URI will be downloaded. Note that
              in addition to using '-R', you must also specify a filename ending in '.gnd' so that the code realizes that the
              top-level file is a directory (since we have no meta data).

       -v, --version
              Print the version number.

       -V, --verbose
              Print progress information.

NOTES
       The GNUNET_URI is typically obtained from gnunet-search. gnunet-fs-gtk can also be used instead of gnunet-download. If you
       ever have to abort a download, you can continue it at any time by re-issuing gnunet-download with the same filename. In that
       case GNUnet will not download again blocks that are already present. GNUnet's file-encoding will ensure file integrity,
       even if the existing file was not downloaded from GNUnet in the first place. Temporary information will be appended to the
       target file until the download is completed.

SETTING ANONYMITY LEVEL
       The -a option can be used to specify additional anonymity constraints. If set to 0, GNUnet will try to download the file as
       fast as possible, including using non-anonymous methods. If you set it to 1 (the default), you use the standard anonymous
       routing algorithm (which does not explicitly leak your identity). However, a powerful adversary may still be able to perform
       traffic analysis (statistics) to infer data about your identity over time. You can gain better privacy by specifying a
       higher level of anonymity, which increases the amount of cover traffic your own traffic will get, at the expense of
       performance. Note that your download performance is determined not only by your own anonymity level but also by the
       anonymity level of the peers publishing the file. So even if you download with anonymity level 0, the peers publishing the
       data might be sharing with a higher anonymity level, which in this case will determine performance. Also, peers that cache
       content in the network always use anonymity level 1.

       This option can be used to limit requests further than that. In particular, you can require GNUnet to receive certain
       amounts of traffic from other peers before sending your queries. This way, you can gain very high levels of anonymity - at
       the expense of much more traffic and much higher latency. So set it only if you really believe you need it.

       The definition of ANONYMITY-RECEIVE is the following: 0 means no anonymity is required. Otherwise a value of 'v' means that
       1 out of v bytes of "anonymous" traffic can be from the local user, leaving 'v-1' bytes of cover traffic per byte on the
       wire. Thus, if GNUnet routes n bytes of messages from foreign peers (using anonymous routing), it may originate n/(v-1)
       bytes of queries in the same time period. The time period is twice the average delay by which GNUnet defers forwarded
       queries. The default is 1, and this should be fine for most users.
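The n/(v-1) arithmetic above can be checked with a small sketch (the function name is mine, not part of GNUnet; the v <= 1 branch reflects the reading that levels 0 and 1 impose no cover-traffic requirement):

```shell
# own_query_budget FOREIGN_BYTES LEVEL
# At anonymity level v > 1, 1 out of every v bytes on the wire may be
# the local user's, so n bytes of forwarded foreign traffic permit
# n/(v-1) bytes of own queries per time period.
own_query_budget() {
    foreign=$1 v=$2
    if [ "$v" -le 1 ]; then
        echo "unconstrained"   # 0 = no anonymity; 1 = no cover requirement
        return
    fi
    echo $(( foreign / (v - 1) ))
}
```

For example, at level 3 with 1000 foreign bytes routed, `own_query_budget 1000 3` yields 500: every byte of your own queries must be covered by two bytes of forwarded traffic.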
       Also notice that if you choose very large values, you may end up with no throughput at all, especially if many of your
       fellow GNUnet peers all do the same.

FILES
       ~/.gnunet/gnunet.conf
              GNUnet configuration file

REPORTING BUGS
       Report bugs to <https://gnunet.org/bugs/> or by sending electronic mail to <gnunet-developers@gnu.org>

SEE ALSO
       gnunet-fs-gtk(1), gnunet-publish(1), gnunet-search(1), gnunet.conf(5), gnunet-service-fs(1)

GNUnet                                                   25 Feb 2012                                          GNUNET-DOWNLOAD(1)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.