I have the following simplified script that consists of a download (wget), a step that keeps only the lines containing a comma (grep), and a rename of the file to indicate its date of origin (mv):
wget http://www.example.com/datafile.csv
grep ',' datafile.csv
mv datafile.csv datafile.`date... (2 Replies)
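The last line is cut off, but it is presumably stamping the file with the current date via command substitution. A minimal sketch of the whole pipeline, with the download simulated by a here-doc so it runs end to end (the real script would use the wget line above, and `date +%Y%m%d` is just one plausible completion of the truncated backtick expression):

```shell
# In the real script this file would come from:
#   wget -q http://www.example.com/datafile.csv
cat > datafile.csv <<'EOF'
a,b,c
line without commas
d,e,f
EOF

# Keep only the lines containing a comma, writing straight to the
# date-stamped name (this also replaces the separate mv step)
grep ',' datafile.csv > "datafile.$(date +%Y%m%d).csv"
rm -f datafile.csv
```

Writing grep's output directly to the date-stamped name folds the grep and mv steps into one.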
Well, that's what I'd do in bash :) Here's what I have so far:
import urllib2
from BeautifulSoup import BeautifulStoneSoup
xml = urllib2.urlopen('http://weatherlink.com/xml.php?user=blah&pass=blah')
soup = BeautifulStoneSoup(xml)
print soup.prettify()
but all it does is grab the html... (0 Replies)
I'd like to convert a date string of the form Sun Aug 19 09:03:10 EDT 2012 to a Unix timestamp using awk.
I tried
This is what each line of the file looks like, with a different date and time in this format:
Sun Aug 19 08:33:45 EDT 2012, user1(108.6.217.236) all: test on the 17th
... (2 Replies)
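A sketch of one way to do it, assuming GNU awk (whose mktime() builds the timestamp from a "YYYY MM DD HH MM SS" string). The time zone is supplied via TZ because mktime() ignores the EDT field in the line and uses the local zone:

```shell
# $2=month name, $3=day, $4=HH:MM:SS, $6=year ($6+0 strips the trailing comma).
# A small lookup table maps month names to numbers for mktime().
TZ=America/New_York awk '
BEGIN {
    split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m, " ")
    for (i in m) mon[m[i]] = i
}
{
    split($4, t, ":")
    print mktime($6+0 " " mon[$2] " " $3 " " t[1] " " t[2] " " t[3])
}' <<'EOF'
Sun Aug 19 08:33:45 EDT 2012, user1(108.6.217.236) all: test on the 17th
EOF
```

With the sample line above this prints 1345379625 (08:33:45 EDT is 12:33:45 UTC on 2012-08-19).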
Hello friends,
I've been working on a Solaris server.
I need to test the responses of a web service using the wget command: whether the response is successful, how quick it is, etc.
I have a script like this, which I modified; I try to redirect the output of the time command to total.txt but I couldn't manage, I... (4 Replies)
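The usual stumbling block here is that `time` is a shell keyword that reports on stderr after the whole command finishes, so a plain `2>` on the wget command itself doesn't capture it; wrapping the command in braces and redirecting the group does. A sketch (the URL is a placeholder, and the --timeout/--tries flags are additions to keep the run bounded):

```shell
url='http://www.example.com/'   # placeholder for the web service URL
# The braces make the 2> apply to the time keyword's report, not just wget
{ time wget -q -O /dev/null --timeout=5 --tries=1 "$url" ; } 2> total.txt
cat total.txt                   # real/user/sys timings for the request
```

The exit status of wget is still available as `$?` right after the group, so success and speed can be checked in one pass.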
Hi,
I wish to check the return value for wget $url.
However, some urls are designed to take 45 minutes or more to return.
All I need is to check whether the URL can be reached or not using wget.
How can I get wget to return a value within a few seconds? (8 Replies)
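A sketch using wget's own knobs: --spider asks for the resource without downloading the body, and --timeout/--tries bound how long wget waits, so the exit status comes back quickly even for a slow URL (the URL and flag values are placeholders):

```shell
url='http://www.example.com/'   # placeholder
if wget -q --spider --timeout=5 --tries=1 "$url"; then
    echo "reachable"
else
    echo "unreachable (wget exit $?)"   # nonzero exit = not reachable in time
fi
```

Note that --timeout applies to each phase (DNS lookup, connect, read) separately, so the worst case is a small multiple of the value given.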
Dear all,
I am kindly seeking assistance on the following issue.
I am working with data that is sampled every 0.05 hours (that is, 3-minute intervals). Here is some sample data from the file:
5.00000 15.5030
5.05000 15.6680
5.10000 16.0100
5.15000 16.3450
5.20000 16.7120
5.25000... (4 Replies)
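The thread is truncated, so the actual task is unclear; purely as an illustration of reading such a file, the 0.05-hour first column can be converted to minutes with awk (`%.0f` rounds, avoiding the truncation that `%d` would apply to values like 5.05 × 60):

```shell
# Column 1 is time in hours, column 2 the sampled value
awk '{ printf "%.0f min  %s\n", $1 * 60, $2 }' <<'EOF'
5.00000 15.5030
5.05000 15.6680
5.10000 16.0100
EOF
```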
In the bash script below, when the program is opened the download function runs, downloads the getCSV file, and "Downloading getCSV.csv:%" is displayed on the screen; when it completes, the menu function is called. However, as of now the script opens and closes after a few seconds and I'm not sure... (4 Replies)
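The post is cut off, but a script that "opens and closes after a few seconds" usually means the menu function either errors out or never loops. A minimal sketch of the download-then-menu pattern with a loop that survives bad input; the function and file names are taken from the post, the URL is a placeholder:

```shell
download() {
    echo "Downloading getCSV.csv:"
    # --show-progress with -q gives just the percentage display described
    wget -q --show-progress -O getCSV.csv "$url" || echo "download failed" >&2
}

menu() {
    while true; do                # keep the menu alive until the user quits
        printf 'd) download again  q) quit\n> '
        read -r choice || break   # EOF: leave cleanly instead of erroring out
        case $choice in
            d) download ;;
            q) break ;;
            *) echo "unknown choice: $choice" ;;
        esac
    done
}

download
menu
```

Without the `while` loop (or with `set -e` and a failing wget) the script falls off the end of menu and exits, which matches the symptom described.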
Hi,
I need to download a zip file from the US government link below.
https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP
I only have wget utility installed on the server.
When I use the command below, I am getting error 403... (2 Replies)
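A 403 from wget on a URL that works in a browser is often the server rejecting wget's default User-Agent string. Sending a browser-like User-Agent is a common workaround; this is an assumption about the cause here, and the URL is the one from the post:

```shell
# Present a browser-style User-Agent instead of wget's default
wget --user-agent='Mozilla/5.0' \
     'https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP'
```

If the 403 persists, the server may also require cookies or a referer, which wget can supply with --load-cookies and --referer.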
Hi, I need to join these statements for efficiency, and without having to make a new directory for each batch. I'm annotating commands below.
wget -q -r -l1 URL
^^ can't use -O - here and pipe | to grep because of -r
grep -hrio "\b\+@\+\.\{2,4\}\+\b" * > first.txt
^^ Need to grep the output... (14 Replies)
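Since -r refuses to write to stdout, one workaround (a sketch, not necessarily the thread's final answer) is to point -P at a throwaway directory, grep it, and remove it, which avoids making a permanent directory per batch. The bracket expressions in the post's regex were evidently eaten in transit, so a common e-mail pattern is substituted here:

```shell
tmpdir=$(mktemp -d)                  # throwaway download area
wget -q -r -l1 -P "$tmpdir" "$url"   # $url is a placeholder
grep -hrio '\b[A-Za-z0-9._%+-]\+@[A-Za-z0-9.-]\+\.[A-Za-z]\{2,4\}\b' "$tmpdir" > first.txt
rm -rf "$tmpdir"                     # nothing left behind after the batch
```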
Discussion started by: p1ne
LEARN ABOUT DEBIAN
bti-shrink-urls
BTI-SHRINK-URLS(1) bti-shrink-urls BTI-SHRINK-URLS(1)

NAME
bti-shrink-urls - convert URLs to a shorter form using a web service
SYNOPSIS
bti [--escaped] [--help] [URL]
DESCRIPTION
bti-shrink-urls converts URLs to a shorter form using a web service.
Currently http://2tu.us/ (default) and http://bit.ly / http://j.mp are supported.
OPTIONS
--escaped
Don't escape special characters in the URL; they are already percent-encoded.
--help
Print help text.
URL
Specify the URL to be converted. If no URL is given bti-shrink-urls waits for input on stdin.
CONFIGURATION
bti-shrink-urls is configured by setting some values in ~/.bti:
shrink_host
Possible values: 2tu.us (default), bit.ly, j.mp
shrink_bitly_login
API login for bit.ly, j.mp, required if shrink_host is set to bit.ly or j.mp. See
https://code.google.com/p/bitly-api/wiki/ApiDocumentation
shrink_bitly_key
API key for bit.ly, j.mp, required if shrink_host is set to bit.ly or j.mp. See
https://code.google.com/p/bitly-api/wiki/ApiDocumentation
AUTHOR
Written by Bart Trojanowski <bart@jukie.net>.
COPYRIGHT AND LICENSE
Copyright (C) 2009 Bart Trojanowski <bart@jukie.net>.
This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by
the Free Software Foundation version 2 of the License.
bti-shrink-urls March 2009 BTI-SHRINK-URLS(1)