CRAWL(6) BSD Games Manual CRAWL(6)
NAME
crawl -- the roguelike game of Crawl
SYNOPSIS
crawl [-vscores n] [-tscores n] [-test] [-sprint-map map] [-sprint] [-zotdef] [-species species] [-seed num] [-script file] [-scores n]
[-scorefile path] [-rcdir path] [-rc path] [-plain] [-name name] [-morgue path] [-macro path] [-help] [-extra-opt-last optname=optval]
[-extra-opt-first optname=optval] [-dir path] [-builddb] [-background background] [-arena ["monsters v monsters [arena:map]"]]
DESCRIPTION
crawl is a fun game in the grand tradition of games like Rogue, Hack, and Moria. Your objective is to travel deep into a subterranean cave
complex and retrieve the Orb of Zot, which is guarded by many horrible and hideous creatures.
ENVIRONMENT
CRAWL_NAME The character name to use.
CRAWL_DIR The directory to store information in.
CRAWL_RC The configuration file to use.
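For example (the name and directory below are only illustrative), the variables can be set for a single run on the command line:

     CRAWL_NAME=Ada CRAWL_DIR=$HOME/.crawl crawl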
SEE ALSO
The in-game help menu, which can be accessed by pressing ?.
AUTHORS
Copyright 1997, 1998, 1999 Linley Henzell
BSD January 20, 2010 BSD
DM(8) BSD System Manager's Manual DM(8)
NAME
dm -- dungeon master
SYNOPSIS
ln -s dm game
DESCRIPTION
dm is a program used to regulate game playing. dm expects to be invoked with the name of a game that a user wishes to play. This is done by
creating symbolic links to dm, in the directory /usr/games for all of the regulated games. The actual binaries for these games should be
placed in a ``hidden'' directory, /usr/lib/games/dm, that may only be accessed by the dm program. dm determines if the requested game is
available and, if so, runs it. The file /etc/dm.conf controls the conditions under which games may be run.
The file /etc/nogames may be used to ``turn off'' game playing. If the file exists, no game playing is allowed; the contents of the file
will be displayed to any user requesting a game.
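For example, for a game named rogue (the name is used here purely as an illustration), the pieces would be arranged roughly as follows:

     # the real binary goes into the hidden directory
     mv /usr/games/rogue /usr/lib/games/dm/rogue
     # the public name becomes a symbolic link to dm (run from /usr/games)
     cd /usr/games && ln -s dm rogue

Whether and when rogue may then be played is governed by the entries in /etc/dm.conf.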
FILES
/etc/dm.conf configuration file
/etc/nogames turns off game playing
/usr/lib/games/dm directory of ``real'' binaries
/var/games/games.log game logging file
SEE ALSO
dm.conf(5)
HISTORY
The dm command appeared in 4.3BSD-Tahoe.
SECURITY CONSIDERATIONS
Two issues result from dm running the games setgid ``games''. First, all games that allow users to run UNIX commands should carefully set
both the real and effective group ids immediately before executing those commands. Probably more important is that dm never be setgid
anything but ``games'', so that compromising a game will result only in the user's ability to play games at will. Second, games which
previously had no reason to run setgid and which accessed user files may have to be modified.
BSD May 31, 1993 BSD
Hey guys. I have a final project due at the end of this month. I have to create a webcrawler in Unix. I did some research and I know what a webcrawler is, but I can't find any code or examples of how people build a webcrawler in Unix; most of what I find is in Java. I know some of the commands... (1 Reply)
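One common starting point, sketched below, is to let existing tools do the heavy lifting: curl fetches a page and grep/sed pull out the links. The sketch assumes absolute http:// links inside double-quoted href attributes; everything else a real crawler needs (a queue of URLs, a visited list, politeness delays) would be built on top of this.

     #!/bin/sh
     # print the absolute links found on a single page
     url="$1"
     curl -s "$url" \
         | grep -o 'href="http[^"]*"' \
         | sed 's/^href="//; s/"$//'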
Hi everyone,
How are you all doing? I'm planning to write a script that will crawl an MS Word document
and pull the values out of it. Is that possible at all? I'm not a scripting guru, I just want to know your thoughts.
I'm planning to do something like this:
the Microsoft document has:
Servername:... (1 Reply)
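If the document can first be converted to plain text (catdoc is one converter for .doc files; a .docx is a zip archive whose word/document.xml can be extracted instead), pulling a labelled value out is a one-liner. The file name below is only a placeholder:

     # convert the Word document to text, then print whatever follows "Servername:"
     catdoc server_info.doc | awk -F': *' '/^Servername:/ { print $2 }'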
I am using wget to crawl a website using the following command:
wget --wait=20 --limit-rate=20K -r -p -U Mozilla http://www.stupidsite.com
What I have found is that after two days of crawling some links are still not downloaded. For example, if some page has 10 links in it as anchor texts... (1 Reply)
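If the missing pages sit several levels deep, the usual suspects are wget's default recursion depth of 5 and the site's robots.txt. One possibility (not a guaranteed fix, since the site may also generate links with JavaScript that wget cannot follow) is to lift both limits:

     wget --wait=20 --limit-rate=20K -r -l inf -p -e robots=off \
          -U Mozilla http://www.stupidsite.com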
Hi,
I want to build a crawler that searches for a keyword on certain websites.
This is what the URLs look like:
website.com/xxxxAA11xxxx
I want the crawler to change the letters and digits automatically, and if a certain keyword is found, the URL should be logged.
But... (12 Replies)
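Assuming only the AA11 part varies (two letters followed by two digits), and treating website.com, the surrounding xxxx text, and the keyword below as placeholders, a brute-force loop could look like this sketch. Note that it issues 67,600 requests, so it is slow and should be throttled:

     #!/bin/bash
     # try every two-letter / two-digit combination and log pages containing the keyword
     keyword="example-keyword"
     for letters in {A..Z}{A..Z}; do
         for digits in {0..9}{0..9}; do
             url="http://website.com/xxxx${letters}${digits}xxxx"
             if curl -s "$url" | grep -q "$keyword"; then
                 echo "$url" >> hits.log
             fi
             sleep 1   # be polite to the server
         done
     done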
Hello,
I'm doing an assignment in bash, building a crawler.
I downloaded a URL into a file and I need to check whether the page is in HTML format, and also to save all of the HTML's links in a different file.
Please help me.
Thanks. (1 Reply)
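A rough sketch, assuming the page was saved as page.html (the file name is a placeholder): the first grep checks for an <html> tag, and the second one writes every href target to links.txt. Running file page.html, which typically reports "HTML document" for HTML input, is another way to do the first check.

     #!/bin/bash
     # page.html is the file the URL was downloaded into (e.g. via wget -O or curl -o)
     if grep -qi '<html' page.html; then
         echo "page.html looks like HTML"
         # collect every href target into a separate file
         grep -oi 'href="[^"]*"' page.html \
             | sed 's/^[hH][rR][eE][fF]="//; s/"$//' > links.txt
     else
         echo "page.html does not look like HTML"
     fi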
OK HPUX masters, I need help. I have an HPUX 11.23 server that I am using as an Ignite server that services two HPUX servers for backups only. The other day I noticed that our Make Net Recovery jobs were failing. Upon attempting to log in to the server over SSH and failing, I then tried to ping and could... (6 Replies)
All,
I'm trying to learn about scrapers, webcrawlers, search engines, and curl. I've chosen to interrogate the following sites:
Manta,
SuperPages,
Yellow Book,
Yellow Pages.
These show organizations/businesses by search type/category, so they are effective for finding potential clients.
... (3 Replies)
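Whatever the site, a reasonable first step is to save one listing page locally and experiment on it offline; the URL and the markup pattern below are hypothetical and have to be adjusted after inspecting the real pages:

     # fetch one category page with a browser-like user agent and keep it locally
     curl -s -A "Mozilla/5.0" -o listing.html "https://www.example.com/search/plumbers"
     # once the markup is known, names can often be pulled out with grep and sed;
     # the class name here is made up and must match the site's actual HTML
     grep -o '<h2 class="business-name">[^<]*</h2>' listing.html | sed 's/<[^>]*>//g'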
I am writing a bash shell script for GarazLab's "WP EMAIL CRAWLER - AUTO SCRAPER & REAL TIME EXTRACTOR". It contains some commands, and I want to stop the shell execution as soon as it encounters an error. How do I do that? (8 Replies)
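The usual mechanism is the shell's errexit option, set near the top of the script; a minimal sketch:

     #!/bin/bash
     set -e           # exit as soon as any command returns a non-zero status
     set -u           # treat references to unset variables as errors
     set -o pipefail  # a failure anywhere in a pipeline also fails the pipeline

Individual commands that are allowed to fail can be exempted with something like: command || true.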