Shell Programming and Scripting: Post 302945535 by cmccabe, Saturday 30th of May 2015, 11:06 AM
Bash to download specific files and save in two folders

I am trying to download all files with a particular extension (.bam) from a password-protected https site that requires user authentication. The files are ~20GB each and I am not sure if the script below is the best way to do it. I am also not sure how to direct the downloaded files both to a local folder and to an external drive. Thank you.

Code:
#!/bin/bash
printf "Files downloading, please wait\n"
# wget only honors the last --directory-prefix given, so a single run
# cannot write to two locations; download to one directory first.
# The path is quoted and written Cygwin-style, since backslashes are
# escape characters in bash.
wget --http-user=cmccabe --http-password=xxxx*** \
     -r -A "*.bam" -q --show-progress \
     --directory-prefix="/cygdrive/c/Users/cmccabe/Desktop/Example" \
     https://test.xxxx.com/xxxxx.aspx
printf "Files downloaded\n"
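One way to get the files onto both the local folder and the external drive is to download once and then mirror. A minimal sketch, assuming Cygwin-style mount points for the C: and E: drives (adjust the paths for your environment):

Code:
#!/bin/bash
# Sketch only: the /cygdrive paths assume a Cygwin/Git Bash environment.
src="/cygdrive/c/Users/cmccabe/Desktop/Example"
dst="/cygdrive/e/WD/files"

mkdir -p "$dst"
# Mirror the downloaded .bam files to the external drive; rsync -a
# preserves attributes and skips files already copied on a re-run.
rsync -av --include='*/' --include='*.bam' --exclude='*' "$src"/ "$dst"/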


Last edited by cmccabe; 05-30-2015 at 12:10 PM. Reason: added line of code
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

decompressed files to specific folders

Hi all, I have a script below which is meant to decompress *.tar.gz archives into allocated folders, e.g. hello.tar.gz should be decompressed into a subdirectory called hello, with all of its files inside that subdirectory. How do I do it? Am I missing anything? Condition: there are existing tar.gz in the... (3 Replies)
Discussion started by: c00kie88
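A minimal sketch for the archive case above, assuming GNU tar and that the archives sit in the current directory:

Code:
# Extract each archive into a subdirectory named after it
for f in *.tar.gz
do
    dir="${f%.tar.gz}"          # strip the .tar.gz suffix
    mkdir -p "$dir"
    tar -xzf "$f" -C "$dir"     # -C extracts into the target directory
done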

2. Shell Programming and Scripting

Copying specific files from remote m/c to specific folders

Hi All, I am trying to rsync some of the latest files from a remote machine to my local Linux box. The folder structure on the remote machine looks like this: /pub/Nightly/Package/ROLL/WIN /pub/Nightly/Package/SOLL/sol /pub/Nightly/Package/SOLL/linux. Each of the folders contains gzip files which on a daily... (0 Replies)
Discussion started by: jhoomsharabi
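A minimal sketch of the pull for the folders above; "user@remotehost" is a placeholder for the actual account and machine:

Code:
# Pull each remote package directory into a matching local folder
for d in ROLL/WIN SOLL/sol SOLL/linux
do
    rsync -avz "user@remotehost:/pub/Nightly/Package/$d/" "$HOME/packages/$d/"
done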

3. UNIX for Dummies Questions & Answers

Monitoring specific files and folders

I want a mechanism to monitor a folder full of sensitive files. I want to log all accesses, modifications and changes to any file within the folder in a log file which should give me the access/modify/change times, the user id of the process involved, and its pid. Even some idea of what to... (1 Reply)
Discussion started by: Vivek788
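On Linux, the audit subsystem covers exactly this kind of logging. A minimal sketch, assuming auditd is installed and running (the watched path is a placeholder):

Code:
# Watch a sensitive directory for reads, writes and attribute changes
auditctl -w /path/to/sensitive -p rwa -k sensitive-files

# Later, list the recorded events (includes uid, pid and syscall)
ausearch -k sensitive-files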

4. Shell Programming and Scripting

FAQ how to download multiple specific files via command line

Hi, this evening I would like to download multiple pcap capture files from the Wireshark wiki site. My aim is to download the capture files (.pcap, .cap, etc.) from SampleCaptures - The Wireshark Wiki. I have already tried wget, lynx and htget but still have problems downloading... it seems... (1 Reply)
Discussion started by: jao_madn
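With a reasonably recent wget, a one-level recursive fetch filtered by extension is a minimal sketch for this:

Code:
# Fetch only .pcap and .cap files linked one level below the page
wget -r -l1 -np -A '*.pcap,*.cap' https://wiki.wireshark.org/SampleCaptures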

5. Shell Programming and Scripting

How to delete all the files and folders inside all the directories except some specific directory?

Hi, I have a requirement to delete all the files from all the directories except some specific directories like Archive and Log. For example: there are directories such as A B C D Archive E Log F which contain some subdirectories and files. The requirement is to delete all the... (7 Replies)
Discussion started by: Little
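A minimal sketch with GNU find, keeping Archive and Log at the top level (test with echo or ls in place of rm -rf before running it for real):

Code:
# Remove everything in the current directory except Archive and Log
find . -mindepth 1 -maxdepth 1 ! -name Archive ! -name Log -exec rm -rf {} +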

6. Shell Programming and Scripting

Bash to tell download where specific files are stored

The bash below will download all the files in download to /home/Desktop/folder. That works great, but within /home/Desktop/folder there are several folders: bam, other, and vcf. Is there a way to specify, by extension, where each downloaded file should go? For example, all .pdf and .zip... (2 Replies)
Discussion started by: cmccabe
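A minimal sketch of a sort-by-extension step after the download, using the folder names mentioned above:

Code:
cd /home/Desktop/folder || exit 1
mkdir -p bam other vcf
for f in *
do
    [ -f "$f" ] || continue        # skip the subfolders themselves
    case "$f" in
        *.bam) mv "$f" bam/ ;;
        *.vcf) mv "$f" vcf/ ;;
        *)     mv "$f" other/ ;;
    esac
done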

7. Shell Programming and Scripting

Bash to move specific files from folders in find file

I have a directory /home/cmccabe/nfs/exportedReports that contains multiple folders. The find writes the name of each folder to out.txt. A new directory is then created in a new location, /home/cmccabe/Desktop/NGS/API, named with the date. What I am trying to do, unsuccessfully at the moment,... (7 Replies)
Discussion started by: cmccabe
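A minimal sketch of that flow, assuming out.txt holds one folder name per line (the paths are taken from the thread):

Code:
# Create a dated target directory and move each listed folder's files into it
target="/home/cmccabe/Desktop/NGS/API/$(date +%Y-%m-%d)"
mkdir -p "$target"
while IFS= read -r dir
do
    mv "/home/cmccabe/nfs/exportedReports/$dir"/* "$target"/
done < out.txt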

8. Shell Programming and Scripting

Bash to list all folders in a specific directory

The bash below is trying to list the folders in a specific directory. It seems close, but it adds the path to each folder name, which basename could strip off I think, though I am not sure why it writes the text file it creates that way. This list of folders will be used later, but needs to only be the... (5 Replies)
Discussion started by: cmccabe
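A minimal sketch with GNU find's -printf, which emits just the basename (the directory path is a placeholder):

Code:
# Write only the folder names, without the leading path, to folders.txt
find /path/to/dir -mindepth 1 -maxdepth 1 -type d -printf '%f\n' > folders.txt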

9. Shell Programming and Scripting

Bash directory loop, but only choose those folders with specific word in it

Hello, how can I loop over directories in bash but only choose folders with a specific word in the name, so it only echoes those? #!/bin/bash for filename in /home/test/* do if ; then echo $filename; fi done Thx! (4 Replies)
Discussion started by: ZerO13
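A minimal sketch of the filtered loop ("word" is a placeholder for the string to match):

Code:
#!/bin/bash
# Echo only the directories under /home/test whose name contains "word"
for dir in /home/test/*/
do
    case "$dir" in
        *word*) echo "$dir" ;;
    esac
done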

10. UNIX for Advanced & Expert Users

Find files in specific folders

Hi Team, I am new to Linux commands and I really need help. I would be really thankful if I could get some input. I have the below folders in the path "/home/temp": 20170428 20170427 20170429 changes temp. I need to get the files generated in the last 15 mins in all the above folders... (4 Replies)
Discussion started by: JackJinu
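A minimal sketch with find's -mmin, which matches modification times in minutes:

Code:
# List files under /home/temp modified in the last 15 minutes
find /home/temp -type f -mmin -15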
LWP-RGET(1)                User Contributed Perl Documentation                LWP-RGET(1)

NAME
       lwp-rget - Retrieve web documents recursively

SYNOPSIS
       lwp-rget [--verbose] [--auth=USER:PASS] [--depth=N] [--hier] [--iis]
                [--keepext=mime/type[,mime/type]] [--limit=N] [--nospace]
                [--prefix=URL] [--referer=URL] [--sleep=N] [--tolower] <URL>

       lwp-rget --version

DESCRIPTION
       This program will retrieve a document and store it in a local file. It
       will follow any links found in the document and store these documents
       as well, patching links so that they refer to the local copies. This
       process continues until there are no more unvisited links or the
       process is stopped by one or more of the limits which can be
       controlled by the command line arguments.

       This program is useful if you want to make a local copy of a
       collection of documents or want to do web reading off-line.

       All documents are stored as plain files in the current directory. The
       file names chosen are derived from the last component of URL paths.

       The options are:

       --auth=USER:PASS
              Set the authentication credentials to user "USER" and password
              "PASS" if any restricted parts of the web site are hit. If
              there are restricted parts of the web site and authentication
              credentials are not available, those pages will not be
              downloaded.

       --depth=n
              Limit the recursive level. Embedded images are always loaded,
              even if they fall outside the --depth. This means that one can
              use --depth=0 to fetch a single document together with all
              inline graphics. The default depth is 5.

       --hier Download files into a hierarchy that mimics the web site
              structure. The default is to put all files in the current
              directory.

       --referer=URI
              Set the value of the Referer header for the initial request.
              The special value "NONE" can be used to suppress the Referer
              header in any of the subsequent requests. The Referer header
              will always be suppressed in all normal "http" requests if the
              referring page was transmitted over "https", as recommended in
              RFC 2616.

       --iis  Sends an "Accept: */*" on all URL requests as a workaround for
              a bug in IIS 2.0. If no Accept MIME header is present, IIS 2.0
              returns with a "406 No acceptable objects were found" error.
              Also converts any back slashes (\) in URLs to forward slashes
              (/).

       --keepext=mime/type[,mime/type]
              Keeps the current extension for the listed MIME types. Useful
              when downloading text/plain documents that shouldn't all be
              translated to *.txt files.

       --limit=n
              Limit the number of documents to get. The default limit is 50.

       --nospace
              Changes spaces in all URLs to underscore characters (_). Useful
              when downloading files from sites serving URLs with spaces in
              them. Does not remove spaces from fragments, e.g.,
              "file.html#somewhere in here".

       --prefix=url_prefix
              Limit the links to follow. Only URLs that start with the prefix
              string are followed. The default prefix is set to the
              "directory" of the initial URL. For instance, if we start
              lwp-rget with the URL "http://www.sn.no/foo/bar.html", then the
              prefix will be set to "http://www.sn.no/foo/". Use
              "--prefix=''" if you don't want the fetching to be limited by
              any prefix.

       --sleep=n
              Sleep n seconds before retrieving each document. This option
              allows you to go slowly and avoid overloading the server you
              are visiting.

       --tolower
              Translates all links to lowercase. Useful when downloading
              files from IIS since it does not serve files in a
              case-sensitive manner.

       --verbose
              Make more noise while running.

       --quiet
              Don't make any noise.

       --version
              Print program version number and quit.

       --help Print the usage message and quit.

       Before the program exits, the name of the file where the initial URL
       is stored is printed on stdout. All used filenames are also printed on
       stderr as they are loaded. This printing can be suppressed with the
       --quiet option.

SEE ALSO
       lwp-request, LWP

AUTHOR
       Gisle Aas <aas@sn.no>

perl v5.12.1                         2009-06-15                          LWP-RGET(1)