How to download files matching pattern from FTP using CURL or WGET?

# 1  
Old 09-11-2013


For an order I placed, the provider has uploaded a tar file to a public FTP site. It internally contains tons of (compressed) files, and I need to download only the files that follow a particular pattern, which would be a few hundred of them.

Note: The order can't be re-requested for just the files that follow the pattern.

Can this be done using the CURL or WGET commands? Please advise.

Amalan J
# 2  
Old 09-12-2013
Provided the pattern you need is relatively simple (i.e. file globbing rather than full regex), you can pass wildcards via the FTP protocol.

For a vanilla command-line ftp, you'd want to turn off prompting (type 'prompt' until you see it's set to off); while you are at it, turn on hash marks ('hash') and check that your transfer mode is correct for you ('bin' or 'asc', depending on whether you want it to convert carriage returns/line feeds). Then use mget to specify your pattern.
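Putting those steps together, a minimal batch session might look like this; the host, local directory, and file pattern are all placeholders, so substitute your own:

```shell
# Batch of ftp commands matching the steps above (pattern and paths
# here are hypothetical -- adjust to your order)
cat > ftp_batch.txt <<'EOF'
prompt
hash
binary
lcd /tmp/downloads
mget order_*.tar.gz
bye
EOF
# Run it non-interactively against your FTP site, e.g.:
# ftp -i ftp.example.com < ftp_batch.txt
wc -l < ftp_batch.txt   # 6 command lines written
```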

mget is nifty in that it will do multiple gets, but it can't rename or relocate files on the fly, so make sure you are sitting in the right local directory ('lcd' before you run it).

curl and wget aren't really right for the task, but you could possibly use a tool like that to first fetch the directory listing, then grep the resulting index for your files and feed that into a loop... but... yuck :/

That said, I've heard talk on the curl dev lists of adding support for FTP wildcards, so recent versions may let you do something like curl -O "ftp://host/path/pattern_*.gz" directly (man curl and see how you go).
# 3  
Old 09-13-2013
Thanks for the reply.
But my need is to get multiple files from within a single tar file available on the FTP site. Is there a way to extract only the files I need from the tar instead of downloading the entire tar file?
# 4  
Old 09-13-2013
Originally Posted by Amalan
But my need is to get multiple files within a single tar file available in ftp.
Download via FTP is file-based, so you can only download a file or not download it; you cannot download selected parts of it. You will have to download the tar file, unpack it locally and then throw away all the files you do not need.
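A quick sketch of that workflow; the archive and filenames here are invented for illustration, and --wildcards is a GNU tar option:

```shell
# Simulate the downloaded archive locally (in reality you would fetch
# order.tar.gz from the FTP site first, e.g. with wget or curl)
mkdir -p order_src extracted
touch order_src/keep_001.dat order_src/keep_002.dat order_src/other.log
tar -czf order.tar.gz -C order_src .
# Unpack only the members matching the pattern; GNU tar needs
# --wildcards to enable shell-style patterns on member names
tar -xzf order.tar.gz -C extracted --wildcards './keep_*.dat'
ls extracted   # keep_001.dat  keep_002.dat
```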

I hope this helps.

# 5  
Old 09-16-2013
Weeeellll, with a bit of bodging you can sort of do this.
Tar allows you to operate on an incoming stream (it was originally designed for this) and stop when you've got all you want. You still need to read through the file until you've found all the files you want:
wget -q -O - "<url of tarball>" | gunzip -c | tar -xf - "<filename you want to extract>"

This will read through the tarball as it streams off the FTP server and pop out the file you want as it passes. You can specify multiple files on the one command line, but that only works if you know the filenames (although you could try passing an escaped wildcard and see how you go; you'll have to abort the extract manually once you think you have all the files you want).
You can also pass the "--files-from=<file containing the filenames you want>" flag to tar if you prefer.
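A local sketch of the --files-from variant; here cat stands in for the wget stream, and the archive and filenames are made up:

```shell
# Build a throwaway archive to stand in for the one on the FTP site
mkdir -p src out
echo a > src/a.txt; echo b > src/b.txt; echo c > src/c.txt
tar -czf order2.tar.gz -C src a.txt b.txt c.txt
# List the members you want, one per line
printf 'a.txt\nc.txt\n' > wanted.txt
# Real case: wget -q -O - "<url of tarball>" | gunzip -c | tar -xf - -C out --files-from wanted.txt
cat order2.tar.gz | gunzip -c | tar -xf - -C out --files-from wanted.txt
ls out   # a.txt  c.txt
```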

Note that this won't necessarily save you loads of time, especially if the files you want are at the end of the archive, but it will help if they are near the start of the file, and it's an interesting exercise in streams.
# 6  
Old 09-16-2013
This should help me a lot as it might save 50% of my time and space. Thanks for your help.
# 7  
Old 09-16-2013
Originally Posted by Amalan
This should help me a lot as it might save 50% of my time and space. Thanks for your help.
Maybe only 50% of your space. I am not sure, but "tar" might read the file to the end unless you manually intervene.

Still, this is a very clever idea of Smiling Dragon. I filed it away immediately in my "all-things-interesting-to-know"-notes.
