TV_GRAB_IT(1p) User Contributed Perl Documentation TV_GRAB_IT(1p)
NAME
tv_grab_it - Grab TV listings for Italy.
SYNOPSIS
tv_grab_it --help
tv_grab_it [--config-file FILE] --configure
tv_grab_it [--config-file FILE] [--output FILE]
[--days N] [--offset N] [--quiet]
[--slow] [--verbose] [--errors-in-xml]
[--backend SITE1[,SITE2[,SITE3]]] [--cache-slow]
DESCRIPTION
Output TV listings for several channels available in Italy. The grabber relies on parsing HTML so it might stop working at any time. The
data comes from different backends. This is to minimize blackouts in case of site changes but also to extend the number of channels. If
the grabber can't find the data with the first backend it will try the second one, and so on. You can specify your order of preference
using the --backend option.
Currently configured backends are (in default order):
skylife - grabs data from www.skylife.it
mtvit - grabs data from www.mtv.it
boingtv - grabs data from www.boingtv.it
sitcom1 - grabs data from www.sitcom1.it
raisat - grabs data from www.raisat.it
raiit - grabs data from www.rai.it
iris - grabs data from www.iris.it
mediasetpremium - grabs data from www.mediasetpremium.it
First run tv_grab_it --configure to choose which channels you want to download. Then running tv_grab_it with no arguments will output
listings in XML format to standard output.
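For instance, a first-time setup and grab might look like the following (the output path is an illustrative choice, not a default of the grabber):

```shell
# One-time setup: interactively choose which channels to download;
# this writes ~/.xmltv/tv_grab_it.conf by default
tv_grab_it --configure

# Afterwards, grab the default 7 days of listings and write them
# to a file instead of standard output
tv_grab_it --output ~/tv_listings.xml
```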
--configure Prompts for which channels to download and writes the configuration file.
--config-file FILE Set the name of the configuration file, the default is ~/.xmltv/tv_grab_it.conf. This is the file written by
--configure and read when grabbing.
--gui OPTION Use this option to enable a graphical interface. OPTION may be 'Tk', or left blank for the best available choice.
Additional allowed values of OPTION are 'Term' for normal terminal output (default) and 'TermNoProgressBar' to disable the use of
XMLTV::ProgressBar.
--output FILE Write to FILE rather than standard output.
--days N Grab N days. The default is 7.
--offset N Start N days in the future. The default is to start from today.
--quiet Suppress the progress messages normally written to standard error.
--slow Downloads more details (descriptions, actors...). This means downloading a new file for each programme, so it's off by default to
save time.
--cache-slow If you use the --cache option to speed things up when you grab data several times a week, this option caches only
the --slow data, so you won't miss changes in schedules.
--verbose Prints out verbose information useful for debugging.
--errors-in-xml Outputs warnings as programmes in the XML file, so that you can see errors in your favorite frontend in addition to the
default STDERR.
--backend Set the backend (or backends) to use. See the examples.
--version Show the version of the grabber.
--help Print a help message and exit.
CAVEATS
If you use --quiet you should also use --errors-in-xml, or you won't be warned about errors. Note also that, unlike previous
versions, this grabber doesn't die if it cannot find any data; instead it returns an empty XML file (or one containing just
warnings).
The backends' data quality differs a lot. For example, mytv was very basic, yet complete, and used the least amount of bandwidth. Skytv has a
lot of channels, but unless you use it with the --slow option the data is not very good (and in that case it would be VERY slow). wfactory
is a good overall site if you don't need the whole Sky package.
EXAMPLES
tv_grab_it --backend mtvit --configure
configures tv_grab_it using only the backend mtvit
tv_grab_it --backend skylife,wfactory --days 1
grabs one day of data overriding the default order (this could also be written --backend skylife --backend wfactory)
tv_grab_it --cache --slow --days 3
grabs the full data for the next three days using the default backend order and using a disk cache.
RECOMMENDED USAGE
tv_grab_it --cache --slow --cache-slow --errors-in-xml
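For unattended use, the recommended invocation can be scheduled from cron. The schedule, output path, and log file below are illustrative assumptions, not defaults of the grabber:

```shell
# Illustrative crontab entry: grab listings nightly at 02:30 with the
# recommended flags, writing the XML to a file and appending any
# diagnostic output to a log
30 2 * * * tv_grab_it --cache --slow --cache-slow --errors-in-xml --quiet --output $HOME/.xmltv/listings.xml 2>>$HOME/.xmltv/tv_grab_it.log
```

Because --quiet is combined with --errors-in-xml, warnings still appear inside the XML output itself, as noted in CAVEATS above.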
SEE ALSO
xmltv.
AUTHOR
Davide Chiarini, davide.chiarini@gmail.com
You can find some more help at http://www.htpcpoint.it/forum/
perl v5.14.2 2012-06-30 TV_GRAB_IT(1p)