Hello everyone, this is my first post, so sorry if I make a mess here =)
I'm using Ubuntu 14.04 LTS, 64-bit server version.
I have a list (url.list) with only URLs to download, one per line, that looks like this:
As you can see, there are many lines in the file (in this case 30000). Because of that, I'm using a trick to download many URLs simultaneously with this:
The problem is that I'd like to name each output file after the value of the name field (1.html, 2.html, ..., 30000.html, etc.) and use curl to limit each file to 50KB. So the curl command should be something like the following. How can I get this done?
I can parse the output of the pipe with echo $URL | sed -n -e 's/^.*name=//p', but I don't know how to use this on the same line, capturing the output of a pipe into two variables ($URL and $filename).
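Something like this sketch is what I have in mind (just an illustration of what I'm after, assuming every URL ends in name=&lt;number&gt; as in my list; 51200 bytes = 50KB):

```shell
#!/bin/sh
# Read url.list one line at a time, derive the output name from the
# name= field, and cap each transfer at 50KB with --max-filesize.
# Note: --max-filesize only aborts early when the server reports the
# file size up front.
while IFS= read -r URL; do
    filename="${URL##*name=}"      # strip everything up to the last 'name='
    curl -s --max-filesize 51200 -o "${filename}.html" "$URL"
done < url.list
```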
I tried this with no success:
Thank you in advance,
tonispa
Did you try reading that file:
The Xes are dummy variables. Instead of the echo, put in your magic command. There have been threads on "parallel" execution with some tricks; use the search function here.
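For illustration, the read-loop idea with the dummy X variables might look like this (a sketch; splitting on '=' assumes name= is the final parameter of every URL, as in the list above):

```shell
#!/bin/sh
# Split each line of url.list on '='; with a single name= parameter
# at the end of the URL, the second field holds the value of name.
while IFS='=' read -r X1 X2; do
    echo "would fetch $X1=$X2 into $X2.html"   # replace the echo with the real command
done < url.list
```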
Thank you so much for your help, @RudiC; I'll try your tips tonight and report back. I made some progress yesterday with this code and figured out how to use xargs to "parallelize" the jobs for curl:
But the problem is that I can't rename the files as I want. So what I did was cd to my destination directory and then run the code above. But I noticed that if different URLs produce the same filename, the first file is overwritten by the last one. Because of that, if I want to keep a single destination directory, being able to rename the output is mandatory.
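Ideally I'd want each parallel job to pick its own output name, something like this sketch (untested; the name= extraction and -P 10 are my guesses):

```shell
#!/bin/sh
# One xargs slot per URL; each job derives its output name from the
# name= field, so duplicate basenames from different URLs no longer
# clobber each other. -P 10 runs up to ten curl jobs at once.
xargs -n 1 -P 10 sh -c '
    url=$1
    curl -s --max-filesize 51200 -o "${url##*name=}.html" "$url"
' _ < url.list
```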
Thank you for your reply! I could not make this work. I tried writing a script just for this function and calling it, and tried putting the command inline inside a screen session, but it always fails with the same error:
xargs: fetch_urlxargs: fetch_urlxargs: fetch_url: No such file or directory: No such file or directory
export -f is a bash feature, and I use it here to ensure the internal function fetch_url is exported to subshells. This is needed because xargs is a command external to the shell and runs the assembled commands in new shells.
I assumed, since you were using GNU xargs (the -P feature is a GNU extension), that you were also using the bash shell. I've updated my original post to specify the required shell, and this may be all you need to do to get your version working.
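In other words, the bash version looks something like this (the body of fetch_url is a sketch based on the thread; the point here is export -f plus calling bash from xargs):

```shell
#!/bin/bash
# Define the download function, export it, then let xargs start bash
# subshells that can see it. Without export -f, the new bash has no
# idea what fetch_url is, which produces the "No such file or
# directory" error quoted above.
fetch_url() {
    curl -s --max-filesize 51200 -o "${1##*name=}.html" "$1"
}
export -f fetch_url

xargs -n 1 -P 10 bash -c 'fetch_url "$1"' _ < url.list
```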
However, if you do not wish to use bash, you can put your function in an external script so that it can be called from xargs. For example:
$HOME/bin/fetch_url:
And from another script (or the command line) you can call this with:
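The original code boxes did not survive here, so this is a plausible reconstruction of the external script, with the curl options carried over from earlier in the thread:

```shell
#!/bin/sh
# $HOME/bin/fetch_url -- standalone version of the function.
# Takes one URL as $1, names the output after its name= field,
# and caps the transfer at 50KB (51200 bytes).
url=$1
curl -s --max-filesize 51200 -o "${url##*name=}.html" "$url"
```

Make it executable with `chmod +x $HOME/bin/fetch_url`, then drive it from any POSIX shell with `xargs -n 1 -P 10 "$HOME/bin/fetch_url" < url.list`.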
Last edited by Chubler_XL; 05-27-2017 at 02:54 PM..