After Caldera.com's Robots.txt is Removed, Some Evidence Surfaces

 
# 1  05-06-2010

Now that SCO has sold off the caldera.com domain name, their previous robots.txt file no longer blocks access to the legacy Caldera web pages on Internet Archive. And what has popped up?
Some evidence that I believe proves that the SCOsource licensing program in 2004 was a right to use SVRX code, which would make it code for which SCO must pass along royalties received to Novell, and which it must ask Novell's permission to license, under the terms of the APA. That is not what SCOfolk testified to at the bench trial in 2008, where they presented the SCOsource license as protection from litigation or, in one case, as shared libraries. So this is a new piece of evidence to add to the pile showing that at least some of the SCOsource licenses were indeed right-to-use licenses for SVRX code.
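
For anyone curious about the mechanics: the Internet Archive honoured the site's current robots.txt when deciding whether to show the archived pages, so a blanket-deny file hid the old copies. Below is a minimal Perl sketch of that check, using the WWW::RobotRules module from the LWP distribution; the file contents and the "ia_archiver" crawler name are illustrative assumptions, not Caldera's actual file:

    use strict;
    use warnings;
    use WWW::RobotRules;

    # An illustrative blanket-deny robots.txt (not the real Caldera file):
    my $robots_txt = "User-agent: *\nDisallow: /\n";

    # "ia_archiver" is the user-agent name commonly associated with the
    # Internet Archive's crawler.
    my $rules = WWW::RobotRules->new('ia_archiver');
    $rules->parse('http://www.caldera.com/robots.txt', $robots_txt);

    # With the deny-all file in place, every page on the host is off limits:
    print $rules->allowed('http://www.caldera.com/')
        ? "allowed\n"
        : "blocked by robots.txt\n";    # prints "blocked by robots.txt"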

More...

3 More Discussions You Might Find Interesting

1. Solaris

Netbackup robots not working

Hi All, I am facing an issue with robtest not working on NetBackup 7.1 on Solaris 10. I can see the robots and drives are detected by the OS, but I am not sure why robtest is not working. Below are the outputs of a few commands. $PWD>cfgadm -al -o show_FCP_dev Ap_Id Type ... (0 Replies)
Discussion started by: sahil_shine

2. Web Development

robots.txt usage

Dear all, I want to use robots.txt to control the "spider". Can I specify an IP address to ALLOW the website to be accessed by the "spider"? Thank you. Rick (4 Replies)
Discussion started by: rickhlwong

3. UNIX for Dummies Questions & Answers

echo "ABC" > file1.txt file2.txt file3.txt

Hi Gurus, I need to create 3 files with the contents "ABC" using a single command. I am using: echo "ABC" > file1.txt file2.txt file3.txt The above command is not working. Please help me... With Regards / Ganapati (4 Replies)
Discussion started by: ganapati
LWP::RobotUA(3)              User Contributed Perl Documentation             LWP::RobotUA(3)

NAME
       LWP::RobotUA - a class for well-behaved Web robots

SYNOPSIS
       use LWP::RobotUA;

       my $ua = LWP::RobotUA->new('my-robot/0.1', 'me@foo.com');
       $ua->delay(10);  # be very nice -- max one hit every ten minutes!
       ...

       # Then just use it just like a normal LWP::UserAgent:
       my $response = $ua->get('http://whatever.int/...');
       ...

DESCRIPTION
       This class implements a user agent that is suitable for robot
       applications.  Robots should be nice to the servers they visit.  They
       should consult the /robots.txt file to ensure that they are welcomed
       and they should not make requests too frequently.

       But before you consider writing a robot, take a look at
       <URL:http://www.robotstxt.org/>.

       When you use a LWP::RobotUA object as your user agent, then you do not
       really have to think about these things yourself; "robots.txt" files
       are automatically consulted and obeyed, the server isn't queried too
       rapidly, and so on.  Just send requests as you do when you are using a
       normal LWP::UserAgent object (using "$ua->get(...)", "$ua->head(...)",
       "$ua->request(...)", etc.), and this special agent will make sure you
       are nice.

METHODS
       The LWP::RobotUA is a sub-class of LWP::UserAgent and implements the
       same methods.  In addition the following methods are provided:

       $ua = LWP::RobotUA->new( %options )
       $ua = LWP::RobotUA->new( $agent, $from )
       $ua = LWP::RobotUA->new( $agent, $from, $rules )
           The LWP::UserAgent options "agent" and "from" are mandatory.  The
           options "delay", "use_sleep" and "rules" initialize attributes
           private to the RobotUA.  If "rules" are not provided, then
           "WWW::RobotRules" is instantiated providing an internal database
           of robots.txt.

           It is also possible to just pass the value of "agent", "from" and
           optionally "rules" as plain positional arguments.

       $ua->delay
       $ua->delay( $minutes )
           Get/set the minimum delay between requests to the same server, in
           minutes.  The default is 1 minute.  Note that this number doesn't
           have to be an integer; for example, this sets the delay to 10
           seconds:

               $ua->delay(10/60);

       $ua->use_sleep
       $ua->use_sleep( $boolean )
           Get/set a value indicating whether the UA should sleep() if
           requests arrive too fast, defined as $ua->delay minutes not passed
           since last request to the given server.  The default is TRUE.  If
           this value is FALSE then an internal SERVICE_UNAVAILABLE response
           will be generated.  It will have a Retry-After header that
           indicates when it is OK to send another request to this server.

       $ua->rules
       $ua->rules( $rules )
           Set/get which WWW::RobotRules object to use.

       $ua->no_visits( $netloc )
           Returns the number of documents fetched from this server host.
           Yeah, I know, this method should probably have been named
           num_visits() or something like that. :-(

       $ua->host_wait( $netloc )
           Returns the number of seconds (from now) you must wait before you
           can make a new request to this host.

       $ua->as_string
           Returns a string that describes the state of the UA.  Mainly
           useful for debugging.

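       As a rough illustration, the sketch below combines the methods above
       into a small, polite fetch loop.  The robot name, contact address,
       URLs, and the "host:port" netloc strings are placeholder assumptions,
       not values required by the module:

           #!/usr/bin/perl
           use strict;
           use warnings;
           use LWP::RobotUA;

           # Identify the robot and its operator; both arguments are required.
           my $ua = LWP::RobotUA->new('example-crawler/0.1',
                                      'webmaster@example.com');

           $ua->delay(0.5);    # at most one request every 30 seconds per host
           $ua->use_sleep(1);  # sleep rather than fail when requests come too fast

           for my $url ('http://example.com/', 'http://example.com/docs/') {
               my $response = $ua->get($url);   # robots.txt is consulted for us
               printf "%s -> %s\n", $url, $response->status_line;
           }

           # Bookkeeping the agent keeps per server (netloc assumed "host:port"):
           printf "visits so far: %d\n",      $ua->no_visits('example.com:80');
           printf "wait %d more second(s)\n", $ua->host_wait('example.com:80');
           print  $ua->as_string, "\n";

       Because use_sleep is left TRUE, the second request in the loop simply
       sleeps until the half-minute delay has elapsed instead of coming back
       as SERVICE_UNAVAILABLE.
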
SEE ALSO
       LWP::UserAgent, WWW::RobotRules

COPYRIGHT
       Copyright 1996-2004 Gisle Aas.

       This library is free software; you can redistribute it and/or modify
       it under the same terms as Perl itself.

perl v5.18.2                          2012-02-11                      LWP::RobotUA(3)