I am attempting to download a URL in CSV format. When I download this URL in a browser, Excel opens and automatically populates with the comma-separated values.
When I try to use curl or wget, I get nothing or garbage.
I want to automatically download a CSV file daily which can be found here:
http://www.londonstockexchange.com/en-gb/pricesnews/prices/coveredwarrants/search.htm
and the link is named "Click to download covered warrants (100Kb)" on the right-hand side.
What commands can I use to invoke clicking... (1 Reply)
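The browser "click" is just an HTTP GET to the link's target address, so curl can do it directly. A minimal daily-download sketch, assuming a placeholder URL (copy the real address from the page by right-clicking the "Click to download" link):

```shell
#!/bin/sh
# Placeholder URL -- replace with the real link target copied from the page.
url="http://www.londonstockexchange.com/path/to/coveredwarrants.csv"
out="warrants_$(date +%Y%m%d).csv"   # date-stamped filename, one per day

download() {
    # -L follows redirects; -A sends a browser-like User-Agent, which some
    # sites require before serving the file; -f fails cleanly on HTTP errors.
    curl -fsSL -A "Mozilla/5.0" -o "$out" "$url"
}

# Run daily from cron, e.g.:  0 6 * * * /path/to/this/script.sh
# download
```

Scheduling it with cron covers the "automatically download daily" part without any clicking.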
Hi Guys,
I need to write a script that exports the "moz_places" table of the "places.sqlite" file (Firefox browser history) into a CSV file. That part works. After the export, my CSV looks like this:
...
4429;http://www.sqlite.org/sqlite.html;"Command Line Shell For... (11 Replies)
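Rather than post-processing a ";"-separated dump, sqlite3's own CSV mode handles the quoting. A sketch, assuming the sqlite3 CLI is installed and Firefox is closed (it locks places.sqlite while running); the profile directory name varies per machine:

```shell
#!/bin/sh
# Export moz_places as proper CSV. -csv quotes any field that contains
# the separator or quote characters, so page titles like the
# "Command Line Shell For..." one survive the export intact.
export_history() {
    sqlite3 -csv "$1" "SELECT id, url, title FROM moz_places;"
}

# export_history "$HOME/.mozilla/firefox/xxxxxxxx.default/places.sqlite" > history.csv
```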
Hi All,
I have two .csv's
input.csv has 7 columns, with values like:
ABC,A19907103,ABC DEV YUNG,2.17,1000,2157,07/07/2006
XYZ,H00213850,MM TRUP HILL,38.38,580,23308,31/08/2010
output.csv has 25 columns:
A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y... (4 Replies)
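The post is truncated before it says which of the 7 input columns land in which of the 25 output columns, so the positions below are made up; the point is the awk idiom of filling a fixed-width record:

```shell
#!/bin/sh
# Emit each 7-field input row as a 25-field row, placing input fields
# at hypothetical output positions (adjust the mapping to the real layout).
widen() {
    awk -F, -v OFS=, '{
        for (i = 1; i <= 25; i++) out[i] = ""   # start with 25 empty fields
        out[1] = $1; out[3] = $2; out[7] = $4   # hypothetical positions
        line = out[1]
        for (i = 2; i <= 25; i++) line = line OFS out[i]
        print line
    }'
}
# widen < input.csv > output.csv
```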
Hi guys,
I'm fairly new to Unix and bash scripts, so your help would really be appreciated.
I need to write a bash script that takes a CSV file, reorders the data, and outputs it to another CSV file.
The source csv file will look something like this:
HEAD,671061,Add,SS... (3 Replies)
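The post is cut off before the target layout, so the order below (field 3 first, then 1, 2, 4) is only an illustration of the reordering idiom:

```shell
#!/bin/sh
# Reorder comma-separated columns by printing the fields in the new order.
# Adjust the field list to the layout the target file actually needs.
reorder() {
    awk -F, -v OFS=, '{ print $3, $1, $2, $4 }'
}
# reorder < source.csv > target.csv
```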
Hi All,
Hope you are all doing well! Need your help. I have an XML file which needs to be converted to a CSV file. I am not an expert with awk/sed, so your help is highly appreciated!!
XML file looks like this:
<l:event dateTime="2013-03-13 07:15:54.713" layerName="OSB" processName="ABC"... (2 Replies)
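A sketch that pulls a few named attributes out of each `<l:event ...>` line with awk; the attribute list matches the sample line, but for multi-line or nested XML a real parser (e.g. xmlstarlet) is the safer choice:

```shell
#!/bin/sh
# For every <l:event ...> line, extract the listed attributes and print
# their values as one CSV row (empty string if an attribute is missing).
events_to_csv() {
    awk '
    /<l:event/ {
        n = split("dateTime layerName processName", keys, " ")
        out = ""
        for (i = 1; i <= n; i++) {
            val = ""
            if (match($0, keys[i] "=\"[^\"]*\"")) {
                val = substr($0, RSTART, RLENGTH)
                sub(/^[^"]*"/, "", val)   # drop leading  attr="
                sub(/"$/, "", val)        # drop trailing quote
            }
            out = out (i > 1 ? "," : "") val
        }
        print out
    }'
}
# events_to_csv < events.xml > events.csv
```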
Hello Everyone,
I am trying to find a way to take a .csv file with 7 columns and a ton of rows (over 600,000) and remove the entire row if the cell in the fourth column is blank.
Just to give you a little background on why I am doing this (just in case there is an easier way), I am pulling... (3 Replies)
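This is a one-line filter in awk: print a row only when its fourth field is non-empty. awk streams the file line by line, so 600,000 rows are no problem. It does assume no commas inside quoted fields; those would need a CSV-aware tool instead:

```shell
#!/bin/sh
# Keep only rows whose fourth comma-separated field is non-empty.
drop_blank_col4() {
    awk -F, '$4 != ""'
}
# drop_blank_col4 < input.csv > cleaned.csv
```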
I am trying to download all files with a particular extension (.bam) from a password-protected HTTPS site that requires user authentication. The files are ~20GB each and I am not sure if the below is the best way to do it. I am also not sure how to direct the downloaded files to a folder as well as external... (7 Replies)
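One common approach is wget's recursive mode with an accept filter. A sketch with placeholder host and credentials: -A restricts the fetch to *.bam, -P picks the target directory, and -c lets an interrupted 20 GB transfer resume instead of restarting:

```shell
#!/bin/sh
# Recursively fetch only *.bam files into a chosen directory.
# -r recursive, -np don't climb above the start path, -nd flatten
# the directory structure, -c resume partial downloads.
download_bams() {
    # $1 = user, $2 = password, $3 = target directory
    wget -r -np -nd -c -A '*.bam' \
         --user "$1" --password "$2" \
         -P "$3" \
         'https://example.com/data/'
}
# download_bams myuser mypass /path/to/bam_folder
```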
Hi guys,
I recently managed to write up my working script, but now I have a problem.
If the file isn't there on the remote server, my current script skips it and all is OK, but I need something like this:
Search for the file -> if it's there, download it -> if not, download the next file in the list.
Any... (7 Replies)
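Assuming the remote server speaks HTTP (the post doesn't say), one way to sketch this is to probe each name with a HEAD request (curl -I) before fetching; -f makes curl fail quietly on a 404 so the loop moves on to the next file. Base URL and list file below are placeholders:

```shell
#!/bin/sh
# For each filename in a list, download it only if the server has it;
# otherwise report it and continue with the next name.
fetch_list() {
    # $1 = file with one filename per line, $2 = base URL
    while IFS= read -r name; do
        if curl -fsI "$2/$name" >/dev/null 2>&1; then
            curl -fsS -o "$name" "$2/$name"
        else
            printf 'missing on server, skipping: %s\n' "$name" >&2
        fi
    done < "$1"
}
# fetch_list filelist.txt 'https://example.com/files'
```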
The bash script below downloads all the files listed in download to /home/Desktop/folder. That works great, but within /home/Desktop/folder there are several folders (bam, other, and vcf); is there a way to specify, by extension, in the download file where each file should be downloaded to?
For example, all .pdf and .zip... (2 Replies)
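A small helper can pick the destination subfolder from the file's extension; the folder names come from the post, but .pdf and .zip fall through to "other" here since the post is cut off before saying where they should go:

```shell
#!/bin/sh
# Map a filename to its destination directory by extension.
dest_for() {
    case "$1" in
        *.bam) echo /home/Desktop/folder/bam ;;
        *.vcf) echo /home/Desktop/folder/vcf ;;
        *)     echo /home/Desktop/folder/other ;;
    esac
}
# After (or instead of) downloading straight to one folder:
# mv "$f" "$(dest_for "$f")/"
```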
Hello,
My question is about curl command. (ubuntu14.04)
In terminal, I am able to download my mainfile with:
curl -u user1:pass1 http://11.22.33.44/*******
When I convert it into bash script like this:
#!/bin/bash
cd /root/scripts
computer_ip=11.22.33.44
curl -u $1:$2... (8 Replies)
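The usual difference between "works in the terminal" and "fails in the script" here is quoting: an unquoted `$1:$2` breaks if the password contains spaces or shell metacharacters. A sketch, with the URL path a placeholder for the one elided in the post:

```shell
#!/bin/sh
# Fetch the main file with credentials passed as positional arguments.
# Quoting "$user:$pass" keeps special characters in the password intact.
fetch_main() {
    user=$1; pass=$2; ip=${3:-11.22.33.44}
    curl -fsS -u "$user:$pass" "http://$ip/path/to/mainfile" -o mainfile
}
# fetch_main user1 pass1
```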
Discussion started by: baris35
LEARN ABOUT CENTOS
lwp-download
LWP-DOWNLOAD(1)        User Contributed Perl Documentation        LWP-DOWNLOAD(1)
NAME
lwp-download - Fetch large files from the web
SYNOPSIS
lwp-download [-a] [-s] <url> [<local path>]
DESCRIPTION
The lwp-download program will save the file at url to a local file.
If local path is not specified, then the current directory is assumed.
If local path is a directory, then the last segment of the path of the url is appended to form a local filename. If the url path ends with
a slash, the name "index" is used. With the -s option, the last segment of the filename is picked up from server-provided sources like the
Content-Disposition header or any redirect URLs. A file extension to match the server-reported Content-Type might also be appended. If a file
with the produced filename already exists, then lwp-download will prompt before it overwrites and will fail if its standard input is not a
terminal. This form of invocation will also fail if no acceptable filename can be derived from the sources mentioned above.
If local path is not a directory, then it is simply used as the path to save into. If the file already exists it's overwritten.
The lwp-download program is implemented using the libwww-perl library. It is better suited to download big files than the lwp-request
program because it does not store the file in memory. Another benefit is that it will keep you updated about its progress, and you
don't have many options to worry about.
Use the "-a" option to save the file in text (ASCII) mode. This might make a difference on DOS-ish systems.
EXAMPLE
Fetch the newest and greatest perl version:
$ lwp-download http://www.perl.com/CPAN/src/latest.tar.gz
Saving to 'latest.tar.gz'...
11.4 MB received in 8 seconds (1.43 MB/sec)
AUTHOR
Gisle Aas <gisle@aas.no>
perl v5.16.3 2012-01-14 LWP-DOWNLOAD(1)