crawl(6) [debian man page]

CRAWL(6)							 BSD Games Manual							  CRAWL(6)

NAME
     crawl -- the roguelike game of Crawl

SYNOPSIS
     crawl [-vscores n] [-tscores n] [-test] [-sprint-map map] [-sprint] [-zotdef] [-species species] [-seed num] [-script file]
           [-scores n] [-scorefile path] [-rcdir path] [-rc path] [-plain] [-name name] [-morgue path] [-macro path] [-help]
           [-extra-opt-last optname=optval] [-extra-opt-first optname=optval] [-dir path] [-builddb] [-background background]
           [-arena ["monsters v monsters [arena:map]"]]

DESCRIPTION
     crawl is a fun game in the grand tradition of games like Rogue, Hack, and Moria. Your objective is to travel deep into a
     subterranean cave complex and retrieve the Orb of Zot, which is guarded by many horrible and hideous creatures.

ENVIRONMENT
     CRAWL_NAME   The character name to use.
     CRAWL_DIR    The directory to store information in.
     CRAWL_RC     The configuration file to use.

SEE ALSO
     The in-game help menu, which can be accessed by pressing ?.

AUTHORS
     Copyright 1997, 1998, 1999 Linley Henzell

BSD								 January 20, 2010							    BSD
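
As a quick illustration of the ENVIRONMENT variables and a couple of the SYNOPSIS options above, here is a hedged shell snippet; the character name, directory, and rc path are placeholder values, not defaults taken from the game.

     # Placeholder values; adjust to taste.
     export CRAWL_NAME=adventurer        # character name to use
     export CRAWL_DIR=~/.crawl/          # directory to store information in
     export CRAWL_RC=~/.crawlrc          # configuration file to use

     # The same name and rc file can also be passed explicitly on the command
     # line, using the -name and -rc options from the SYNOPSIS.
     crawl -name "$CRAWL_NAME" -rc "$CRAWL_RC"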

Check Out this Related Man Page

renpy(6)							       Games								  renpy(6)

NAME
     renpy - engine for creating visual novels

SYNOPSIS
     renpy [path to the script directory]

DESCRIPTION
     Ren'Py is a programming language and runtime intended to ease the creation of visual-novel type games. It contains features
     that make it easy to display thoughts, dialogue, and menus; to display images to the user; to write game logic; and to support
     the saving and loading of games. Ren'Py tries to be like an executable script, allowing you to get a working game without much
     more effort than is required to type the game script into the computer.

     Ren'Py is implemented on top of Python, and that Python heritage shows through in many places. Many Ren'Py statements allow
     Python expressions to be used, and there are also Ren'Py statements that allow for the execution of arbitrary Python code. Many
     of the less-used features of Ren'Py are exposed to the user by way of Python. By only requiring use of the simplest features of
     Python, it is hoped that Ren'Py will be usable by all game authors.

USAGE
     If you run the program without any arguments, you will get an interactive launcher from which you can select, run, and work on
     different projects. To run a script, give the full path to the directory that contains the game you want to play. For example:

           renpy /usr/share/games/renpy/demo/

     To learn how to use the game interface, you should install and play renpy-demo.

FILES
     The game data for each user is stored in the ~/.renpy/ directory. Scripts can be installed system-wide by placing them under
     /usr/share/games/renpy/, but you can run a script in an arbitrary directory just by passing its path as the parameter to the
     game.

SEE ALSO
     You can find more information at http://www.renpy.org/

							      May 2007								  renpy(6)

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

creating a webcrawler in UNIX

Hey guys, I have a final project due at the end of this month: I have to create a web crawler in UNIX. I did some research and I know what a web crawler is, but I can't find any code or examples of how people build a web crawler in UNIX; most of what I find is in Java. I know some of the commands... (1 Reply)
Discussion started by: bmwm3guy
1 Reply
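
The usual answer for a project like this is a small loop over curl (or wget) plus grep. A minimal, purely illustrative sketch, assuming curl, grep, and sed are available; the start URL is a placeholder.

     #!/bin/bash
     # One-level crawler sketch: fetch a start page, pull out its absolute
     # links, then fetch each linked page once.
     start_url="https://example.com/"           # placeholder start page

     curl -s "$start_url" -o page.html

     # Extract href="..." targets that look like absolute http(s) URLs.
     grep -o 'href="http[^"]*"' page.html | sed 's/^href="//; s/"$//' > links.txt

     # Fetch every extracted link, pausing between requests to stay polite.
     n=0
     while read -r url; do
         n=$((n + 1))
         curl -s -o "page_$n.html" "$url"
         sleep 1
     done < links.txt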

2. Web Development

sitemap.xml from crawler or CMS?

Hello, I have a question: is it better to generate sitemap.xml with a site crawler or with scripts from the CMS/blog? (2 Replies)
Discussion started by: gstoychev
2 Replies

3. UNIX for Advanced & Expert Users

Crawling an MS-Doc using scripting

Hi everyone, how are you all doing? I'm planning to write a script that will crawl an MS Word document and pull the values out of it. Is that possible at all? I'm not a scripting guru; I just want to know your thoughts. I'm planning to do something like this: the Microsoft document has: Servername:... (1 Reply)
Discussion started by: coolkid
1 Reply
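
A .doc file is binary, so a script normally converts it to plain text first and then picks out the "Key: value" lines. A hedged sketch of that idea, assuming the catdoc converter (antiword works similarly) is installed and that the document really does contain lines such as "Servername: ..."; the file name is a placeholder.

     #!/bin/bash
     doc="servers.doc"                          # placeholder document name

     # Convert the Word document to plain text.
     catdoc "$doc" > doc.txt

     # Print the value that follows "Servername:", ignoring extra spaces.
     awk -F': *' '/^[Ss]ervername:/ {print $2}' doc.txt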

4. Shell Programming and Scripting

Script for "Crawling a doc"

Hi everyone, how are you all doing? I'm planning to write a script that will crawl an MS Word document and pull the values out of it. Is that possible at all? I'm not a scripting guru; I just want to know your thoughts. I'm planning to do something like this: the Microsoft document has: Servername:... (6 Replies)
Discussion started by: coolkid
6 Replies

5. Shell Programming and Scripting

wget crawl website by extracting links

I am using wget to crawl a website with the following command: wget --wait=20 --limit-rate=20K -r -p -U Mozilla http://www.stupidsite.com What I have found is that after two days of crawling, some links are still not downloaded. For example, if some page has 10 links in it as anchor texts... (1 Reply)
Discussion started by: shoaibjameel123
1 Reply
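
One common workaround when wget's recursion misses anchors is to extract the links yourself and hand them back to wget as an explicit list via -i. A rough sketch using the poster's own rate-limiting options; treat it as illustrative only.

     # Download just the index page first.
     wget -q -O index.html http://www.stupidsite.com/

     # Pull every absolute href out of it into a URL list.
     grep -o 'href="http[^"]*"' index.html | sed 's/^href="//; s/"$//' > urls.txt

     # Let wget fetch the explicit list (-i) instead of relying on recursion.
     wget --wait=20 --limit-rate=20K -U Mozilla -i urls.txt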

6. Shell Programming and Scripting

Website crawler

Hi, I want to build a crawler that searches for a keyword on certain websites. This is what the URL looks like: website.com/xxxxAA11xxxx I want the crawler to step the letters through alphanumeric combinations automatically, and if a certain keyword is found, the URL should be logged. But... (12 Replies)
Discussion started by: yaylol
12 Replies
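
If the variable part of the URL really is two letters followed by two digits, bash brace expansion can enumerate the combinations directly. A sketch of that idea; the keyword, URL pattern, and log file are placeholders, and the loop issues tens of thousands of requests, so it should be throttled in practice.

     #!/bin/bash
     keyword="jackpot"                          # placeholder keyword
     logfile="hits.log"

     # Enumerate AA00 .. ZZ99 and test each generated URL for the keyword.
     for code in {A..Z}{A..Z}{0..9}{0..9}; do
         url="http://website.com/xxxx${code}xxxx"
         if curl -s "$url" | grep -q "$keyword"; then
             echo "$url" >> "$logfile"          # keyword found: log the URL
         fi
         sleep 1                                # avoid hammering the server
     done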

7. Homework & Coursework Questions

crawler in bash

Hello, I'm doing an assignment in bash, building a crawler. I downloaded a URL into a file and I need to check whether the URL is in HTML format, and also to save all of the HTML's links in a different file. Please help me, thanks. (1 Reply)
Discussion started by: yanivlug
1 Reply
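
Two small pieces cover what the assignment asks for: a test that the downloaded file looks like HTML, and a pass that writes every link to a second file. A hedged sketch, assuming the page was already saved as page.html.

     #!/bin/bash
     page="page.html"                           # file the URL was downloaded into

     # Crude HTML check: does the file contain an <html> tag (case-insensitive)?
     if grep -qi '<html' "$page"; then
         echo "$page looks like HTML"
     else
         echo "$page does not look like HTML" >&2
         exit 1
     fi

     # Save every href target into a separate file, one link per line.
     grep -o 'href="[^"]*"' "$page" | sed 's/^href="//; s/"$//' > links.txt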

8. HP-UX

HPUX 11.23 - server is crawling

OK, HP-UX masters, I need help. I have an HP-UX 11.23 server that I am using as an Ignite server, serving two HP-UX servers for backups only. The other day I noticed that our Make Net Recovery jobs were failing. Upon attempting to log in to the server over SSH I could not; I then tried to ping and could... (6 Replies)
Discussion started by: waytec
6 Replies

9. Linux

Learning scrapers, webcrawlers, search engines and CURL

All, I'm trying to learn scrapers, web crawlers, search engines, and cURL. I've chosen to interrogate the following sites: Manta, SuperPages, Yellow Book, and Yellow Pages. These show organizations/businesses by search type/category, so they are effective in finding potential clients. ... (3 Replies)
Discussion started by: TBotNik
3 Replies
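
A reasonable first step with cURL is simply fetching one category page and studying its markup before writing any parsing. A purely illustrative sketch against a placeholder URL and query string, since each directory site lays out its search parameters (and terms of use) differently.

     # -A sets a browser-like User-Agent, -L follows redirects, -s is silent.
     curl -sL -A "Mozilla/5.0" \
          "https://www.example.com/search?category=plumbers&location=Dallas" \
          -o results.html

     # Inspect the structure by hand; the fields to scrape (name, phone,
     # address) will each sit inside a recognisable tag or class.
     less results.html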

10. UNIX for Beginners Questions & Answers

How to stop a shell script if it encounters an error?

I am writing a bash shell script for GarazLab's "WP EMAIL CRAWLER - AUTO SCRAPER & REAL TIME EXTRACTOR". It contains some commands, and I want to stop the shell execution as soon as it encounters an error. How do I do that? (8 Replies)
Discussion started by: tahsin352
8 Replies
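
The standard answer is bash's set -e, often combined with -u and pipefail: the shell then exits as soon as any command returns a non-zero status. A small hedged illustration with placeholder commands.

     #!/bin/bash
     # Exit on the first failing command (-e), treat unset variables as
     # errors (-u), and make a pipeline fail if any stage fails (pipefail).
     set -euo pipefail

     echo "step 1: downloading"
     curl -s -o data.html "https://example.com/"   # placeholder command

     false                                          # this command fails...
     echo "never reached"                           # ...so this line never runs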