Linux and UNIX Man Pages


bti-shrink-urls(1) [debian man page]

BTI-SHRINK-URLS(1)						  bti-shrink-urls						BTI-SHRINK-URLS(1)

NAME
       bti-shrink-urls - convert URLs to a shorter form using a web service

SYNOPSIS
       bti-shrink-urls [--escaped] [--help] [URL]

DESCRIPTION
       bti-shrink-urls converts URLs to a shorter form using a web service. Currently http://2tu.us/ (default) and http://bit.ly / http://j.mp are supported.

OPTIONS
       --escaped
              Don't escape special characters in the URL; they are already percent-encoded.

       --help Print help text.

       URL    Specify the URL to be converted. If no URL is given, bti-shrink-urls waits for input on stdin.

CONFIGURATION
       bti-shrink-urls is configured by setting some values in ~/.bti:

       shrink_host
              Possible values: 2tu.us (default), bit.ly, j.mp

       shrink_bitly_login
              API login for bit.ly / j.mp; required if shrink_host is set to bit.ly or j.mp. See https://code.google.com/p/bitly-api/wiki/ApiDocumentation

       shrink_bitly_key
              API key for bit.ly / j.mp; required if shrink_host is set to bit.ly or j.mp. See https://code.google.com/p/bitly-api/wiki/ApiDocumentation

AUTHOR
       Written by Bart Trojanowski <bart@jukie.net>.

COPYRIGHT AND LICENSE
       Copyright (C) 2009 Bart Trojanowski <bart@jukie.net>. This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, version 2 of the License.

bti-shrink-urls							   March 2009						       BTI-SHRINK-URLS(1)
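To make the CONFIGURATION section concrete, here is a minimal ~/.bti fragment that selects bit.ly as the shortener. The login and key values below are placeholders for illustration, not real credentials:

```
# ~/.bti (excerpt) -- hypothetical values for illustration
shrink_host=bit.ly
shrink_bitly_login=myusername
shrink_bitly_key=R_0123456789abcdef
```

With no URL argument, the tool reads from stdin, so a long URL can also be piped in, e.g. `echo http://example.com/some/long/path | bti-shrink-urls`.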


PMDISCOVERSERVICES(3)					     Library Functions Manual					     PMDISCOVERSERVICES(3)

NAME
       pmDiscoverServices - discover PCP services on the network

C SYNOPSIS
       #include <pcp/pmapi.h>

       int pmDiscoverServices(const char *service, const char *mechanism, char ***urls);

       cc ... -lpcp

DESCRIPTION
       Given a PCP service name, as identified by service, and using the type of discovery optionally specified in mechanism, pmDiscoverServices returns, via urls, a list of URLs representing the services discovered on the network.

       service specifies the PCP service to be discovered. Currently, only PM_SERVER_SERVICE_SPEC is supported, which searches for pmcd(1) servers.

       mechanism specifies the style of discovery to be used. Currently, only "avahi" is supported, which searches for services that are broadcasting using mDNS via avahi-daemon(8). mechanism may also be NULL, which means to use all available discovery mechanisms.

       Normally, pmDiscoverServices returns the number of services discovered; a value less than zero indicates an error, and zero indicates that no services were discovered.

       The resulting list of pointers, urls, and the values (the URLs) that the pointers reference are allocated by pmDiscoverServices with a single call to malloc(3C), and it is the responsibility of the caller to call free(urls) to release the space when it is no longer required. When an error occurs, or no services are discovered, urls is undefined (no space has been allocated, so calling free(3C) is a singularly bad idea).

PCP ENVIRONMENT
       Environment variables with the prefix PCP_ are used to parameterize the file and directory names used by PCP. On each installation, the file /etc/pcp.conf contains the local values for these variables. The $PCP_CONF variable may be used to specify an alternative configuration file, as described in pcp.conf(5). Values for these variables may be obtained programmatically using the pmGetConfig(3) function.

SEE ALSO
       PMAPI(3), pmcd(1), pmfind(1), pmGetConfig(3), pcp.conf(5), pcp.env(5) and avahi-daemon(8).

DIAGNOSTICS
       EOPNOTSUPP
              The specified mechanism is not supported.

Performance Co-Pilot							  PCP						  PMDISCOVERSERVICES(3)
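The calling and memory-management convention described above can be sketched as follows. This is a minimal example, assuming a system with the PCP development headers and libpcp installed (build with `cc discover.c -lpcp`); error handling is deliberately brief:

```c
/* discover.c -- sketch of pmDiscoverServices() usage */
#include <stdio.h>
#include <stdlib.h>
#include <pcp/pmapi.h>

int main(void)
{
    char **urls;
    /* NULL mechanism => try all available discovery mechanisms */
    int n = pmDiscoverServices(PM_SERVER_SERVICE_SPEC, NULL, &urls);

    if (n < 0) {
        fprintf(stderr, "discovery failed: %s\n", pmErrStr(n));
        return 1;
    }
    if (n == 0) {
        /* urls is undefined here -- per the man page, do NOT free it */
        printf("no pmcd servers discovered\n");
        return 0;
    }
    for (int i = 0; i < n; i++)
        printf("%s\n", urls[i]);
    free(urls);   /* one free() releases the whole result (single malloc) */
    return 0;
}
```

Note the asymmetry the DIAGNOSTICS and DESCRIPTION sections warn about: free(urls) is required only on a strictly positive return value.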

14 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Selecting information from several web pages...

Hi All! Is this possible? I know of several hundred URLs linking to similar-looking HP-UX man pages, like these. In these URLs only the last words separated by / change in numbering, so we can generate these... http://docs.hp.com/hpux/onlinedocs/B3921-90010/00/00/31-con.html... (2 Replies)
Discussion started by: Vishnu
2 Replies

2. Shell Programming and Scripting

script

Hello, I have been searching for a way to extract URLs from Google cache URL search results. I have a file with a list of URLs like this ""http://64.233.167.104/search?q=cache:ts2G04wctD0J:www.worldwidewords.org/qa/qa-shi3.htm+%22shit%22&hl=en&ct=clnk&cd=12&gl=ca&ie=UTF-8"" What I need to... (6 Replies)
Discussion started by: mike171562
6 Replies

3. Web Development

wiki -- heard about them, tell me more

I have heard about companies setting up wiki sites to allow user groups to share information via the web. When I said something about this to someone, I was told it was a lot of work to set up. Anyone care to comment on what is truly needed? The materials needed, effort required, whether it... (4 Replies)
Discussion started by: joeyg
4 Replies

4. Shell Programming and Scripting

[lynx dump] Order (by name/URL)

Hi :) How do I use dump in lynx? $ lynx -dump http://www.google.com So, this is an example of a lynx dump: txt1 blabla Other txt some text 1. http://url_of_txt1 2. http://url_of_blabla 3. http://url_of_Other_txt 4. http://url_of_some_text ... How can I obtain this output? ... (12 Replies)
Discussion started by: aspire
12 Replies

5. UNIX for Advanced & Expert Users

Parsing a file which contains urls from different sites

Hi, I have a file which has millions of URLs from different sites. The line count is 4000000. http://www.chipchick.com/2009/09/usb_hand_grenade.html http://www.engadget.com/page/5 http://www.mp3raid.com/search/download-mp3/20173/michael_jackson_fall_again_instrumental.html... (2 Replies)
Discussion started by: solitare123
2 Replies

6. Shell Programming and Scripting

Extract URLs from HTML code using sed

Hello, I am trying to extract URLs from Google search results, but I have a problem with sed filtering of the HTML code. What I want is just a list of the URLs that appear between ........<p><a href=" and the next following " in the HTML code. Here is my code; I use wget and pipelines for filtering. wget works, but... (13 Replies)
Discussion started by: L0rd
13 Replies

7. Web Development

Tricky mod_rewrite for clean urls problems when fetching external sources

Hi, I have problems with mod_rewrite. I will try to describe... I want clean URLs but fail to make them work properly. Maybe I have problems because the content displayed is fetched from my other site... There is a lot of stuff I have already read about this, but somehow I can not find a solution... (2 Replies)
Discussion started by: lowmaster
2 Replies

8. Web Development

Apache Virtual URL

Hi All, I'm facing a problem with URLs that don't have a file structure under DocumentRoot. A URL like http://mydomain.com/applicationrew/Structure1/Structure2/some?parameter=key&parameter1=key1 should be rewritten to something else. Now I defined a Location like <Location ~... (3 Replies)
Discussion started by: wuschelz
3 Replies

9. Shell Programming and Scripting

Execution problems with sed

Hi, I am confused about how to use sed to deal with a big file. Example: the big file has some different URLs, each with a filename. How can I use sed to fetch the URL apart from the filename and replace it with other URLs, keeping the filename? Thanks!!! (11 Replies)
Discussion started by: hshzh359
11 Replies

10. Shell Programming and Scripting

How to remove urls from html files

Does anybody know how to remove all URLs from HTML files? All URLs are links with anchor texts in the form of <a href="http://www.anydomain.com">ANCHOR</a>; they may start with www or not. The goal is to delete all URLs and keep the ANCHOR text, and if possible to change the tags around the anchor to... (2 Replies)
Discussion started by: georgi58
2 Replies

11. Shell Programming and Scripting

"Command not found" doing a while loop in bash/shell

i=0 numberofproducts=${#urls} #gets number of entries in array called "urls" numberofproductsminusone=`expr $numberofproducts - 1` #-subtract by one while do wget ${urls} i=$(( $i + 1 )) sleep 10 done I'm getting an error ./scrape: line 22: [0: command not found that... (3 Replies)
Discussion started by: phpchick
3 Replies

12. Shell Programming and Scripting

Need help with TCL code to find IP address from a URL

Need help with a a tcl code. Need to find out the ip address from a URL if it is present to do some activity. The URLs will be of the form <domain>?a=12345&d=somestring1(Note: c not present) <domain>?c=10.10.10.100&d=somestring1 <domain>?a=12345&b=somestring1&c=10.1.2.4&d=somestring2... (1 Reply)
Discussion started by: ampak
1 Reply

13. Shell Programming and Scripting

Replacing urls from file

Hi ALL, I have a file A which contains A=www.google.com B=www.abcd.com C=www.nick.com D=567 and a file B which contains A=www.google1234.com B=www.bacd.com C=www.mick.com D=789 I want a script which can replace file A's contents with file B's contents (5 Replies)
Discussion started by: nikhil jain
5 Replies

14. Linux

IBM Code Page 437...

Hi all... Reference this URL: https://www.unix.com/unix-for-beginners-questions-and-answers/282400-lower-ascii-characters.html#post303037272 Researching in google for an answer showed the numerous times this has been asked for in various guises. So it inspired me to create a pseudo-IBM... (2 Replies)
Discussion started by: wisecracker
2 Replies