Top Forums UNIX for Dummies Questions & Answers Any URL'S for free Unix Download Post 30480 by yelamarthi on Tuesday 22nd of October 2002 06:19:15 PM
Hi norsk hedensk,
Thanks for the links.
They are really informative for me.
Kumar.
 

9 More Discussions You Might Find Interesting

1. Programming

I don't have a C compile environment. I can download one, but it ends in *.gz, so I can't

I need help. (1 Reply)
Discussion started by: dsun5
1 Reply
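A *.gz download is just a gzip-compressed archive; it is normally unpacked with gunzip and tar before anything can be compiled. A minimal sketch of that step (the archive name is a placeholder, and a sample archive is built first so the sketch is self-contained):

```shell
#!/bin/sh
# Placeholder archive name; substitute the file actually downloaded.
archive="gcc-src.tar.gz"

# Build a sample archive so the sketch runs on its own.
mkdir -p src && echo 'int main(void){return 0;}' > src/main.c
tar -czf "$archive" src

# Extract: strip the .gz layer with gunzip, then untar
# (equivalently, 'tar -xzf' does both in one step).
gunzip -c "$archive" | tar -xf -

ls src/main.c   # the extracted source is now available to compile
```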

2. Shell Programming and Scripting

Clarify what the 'WHAT' column means in the 'w' command output

I wonder how I should read the result below, especially the 'what' column. The result was shown when I entered 'w'. E.g. what is TOP? What is gosh (what does selmgr mean?)? login@ idle JCPU PCPU what 6:15am 7:04 39 39 TOP 6:34am 6:45 45 45 TOP 6:41am ... (1 Reply)
Discussion started by: Aelgen
1 Reply
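The WHAT column of w(1) is simply the command each login session is currently running (TOP in the quoted output is the top utility). A small sketch that pulls that column out with awk, using sample text modeled on the poster's output (the times and entries are placeholders):

```shell
#!/bin/sh
# Sample 'w' output resembling the poster's (placeholder values).
sample='login@  idle  JCPU  PCPU  what
6:15am  7:04   39    39    TOP
6:34am  6:45   45    45    TOP'

# The last field of each data line is the WHAT column:
# the command that session is executing right now.
echo "$sample" | awk 'NR > 1 { print $NF }'   # prints TOP for each session
```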

3. UNIX for Dummies Questions & Answers

Unix ISOs for FTP, I've searched the other posts

Where do I download Unix ISOs for free? I have searched this database for other related posts, but to no avail. All I need is this info, and I don't want Linux; just a Unix site. Please and thank you for your help. (3 Replies)
Discussion started by: killrazor
3 Replies

4. UNIX for Advanced & Expert Users

Memory free-up using 'find'

Hi, I am facing an interesting aspect of the find command... to be clear, we are running a small web server with an Oracle 8i database and Oracle9iAS on a Sun E250 with Solaris 2.6. Over a period of time, the free memory (displayed in the 'top' utility) drops down.. we could relate this to dedicated... (6 Replies)
Discussion started by: shibz
6 Replies

5. UNIX for Dummies Questions & Answers

Is it free?

Is Solaris free, or Lindows, and where do I download it? I have an x86 P500 (3 Replies)
Discussion started by: amicrawler2000
3 Replies

6. UNIX for Advanced & Expert Users

Command to download from HTTP (URL)

Hi, What is the UNIX command to download a file or data from an HTTP location? curl (Linux) did not work. Thank you. (4 Replies)
Discussion started by: skm123
4 Replies
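The two standard answers to the question above are curl and wget. A minimal sketch of a portable fetch helper that tries curl first and falls back to wget (the function name and the example URL are placeholders):

```shell
#!/bin/sh
# Sketch: download a file over HTTP with curl, falling back to wget.
fetch() {
    url=$1
    dest=$2
    if command -v curl >/dev/null 2>&1; then
        # -f: fail on HTTP errors, -sS: quiet but show errors, -L: follow redirects
        curl -fsSL -o "$dest" "$url"
    else
        wget -q -O "$dest" "$url"
    fi
}

# Example (placeholder URL):
# fetch "http://example.com/file.dat" "file.dat"
```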

7. UNIX for Dummies Questions & Answers

Where to download this free Unix database

Hi Folks, I was looking around some web sites and found out that there is/was this free database for Unix called RDB by Walter Hobbs, which uses ASCII text files for its databases and uses Unix commands to manipulate them. Unfortunately, I can no longer access the ftp site mentioned on the web... (0 Replies)
Discussion started by: rooseter
0 Replies

8. Shell Programming and Scripting

URL download checking

Hi all, I have a URL and I am using wget to load files from it, but my requirement is to validate whether the download completed properly. If there is any exception I need to send mail; otherwise I will proceed. Please help me with this. Thanks, Baski (1 Reply)
Discussion started by: baskivs
1 Reply
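The usual way to validate a wget download is its exit status, optionally combined with a check that a non-empty file actually landed; on failure, a message can be piped to a mailer. A hedged sketch (the URL, output path, and mail command/address are placeholders, and the mail line is left commented since mailer setup varies):

```shell
#!/bin/sh
# Sketch: validate a wget download and report success or failure.
url="http://example.com/data.csv"   # placeholder URL
out="/tmp/data.csv"

# wget exits non-zero on any failure; also confirm a non-empty file landed.
if wget -q -O "$out" "$url" && [ -s "$out" ]; then
    status="ok"
else
    status="failed"
    # On failure, a mail could be sent (placeholder address and subject):
    # echo "download of $url failed" | mailx -s "download alert" admin@example.com
fi
echo "download status: $status"
```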

9. Shell Programming and Scripting

Download and Untar any URL

Hi, I am trying to make a flexible bash script which does the following: Downloads a URL from a variable; Unzips it; Deletes the original archive. The problem is, the format could be .tar, .tar.gz etc., it won't be constant. This is what I have currently: #!/bin/bash dl_dir="/opt" ... (1 Reply)
Discussion started by: Spadez
1 Reply
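The varying-format extract step in the post above is commonly handled with a case statement that picks tar flags from the file extension and deletes the archive only when extraction succeeds. A self-contained sketch (the directory, archive name, and payload are placeholders; a sample .tar.gz is built in place of the real download):

```shell
#!/bin/sh
# Sketch of the flexible extract step, following the poster's dl_dir idea.
dl_dir="/tmp/dl_demo"   # placeholder directory
mkdir -p "$dl_dir" && cd "$dl_dir" || exit 1

# Build a sample .tar.gz so the sketch is self-contained; in the real
# script this file would come from the downloaded URL.
mkdir -p payload && echo data > payload/file.txt
tar -czf archive.tar.gz payload

extract() {
    case $1 in
        *.tar.gz|*.tgz)  tar -xzf "$1" ;;
        *.tar.bz2)       tar -xjf "$1" ;;
        *.tar)           tar -xf  "$1" ;;
        *.gz)            gunzip   "$1" ;;
        *) echo "unknown archive type: $1" >&2; return 1 ;;
    esac && rm -f "$1"   # delete the original archive only on success
}

extract archive.tar.gz
ls payload/file.txt   # the unpacked contents remain; the archive is gone
```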
WEB2DISK(1)							      calibre							       WEB2DISK(1)

NAME
       web2disk - part of calibre

SYNOPSIS
       web2disk URL

DESCRIPTION
       Where URL is for example http://google.com

       Whenever you pass arguments to web2disk that have spaces in them, enclose the arguments in quotation marks.

OPTIONS
       --version
              show program's version number and exit

       -h, --help
              show this help message and exit

       -d, --base-dir
              Base directory into which URL is saved. Default is .

       -t, --timeout
              Timeout in seconds to wait for a response from the server. Default: 10.0 s

       -r, --max-recursions
              Maximum number of levels to recurse i.e. depth of links to follow. Default 1

       -n, --max-files
              The maximum number of files to download. This only applies to files from <a href> tags. Default is 2147483647

       --delay
              Minimum interval in seconds between consecutive fetches. Default is 0 s

       --encoding
              The character encoding for the websites you are trying to download. The default is to try and guess the encoding.

       --match-regexp
              Only links that match this regular expression will be followed. This option can be specified multiple times, in which case as long as a link matches any one regexp, it will be followed. By default all links are followed.

       --filter-regexp
              Any link that matches this regular expression will be ignored. This option can be specified multiple times, in which case as long as any regexp matches a link, it will be ignored. By default, no links are ignored. If both filter regexp and match regexp are specified, then filter regexp is applied first.

       --dont-download-stylesheets
              Do not download CSS stylesheets.

       --verbose
              Show detailed output information. Useful for debugging

SEE ALSO
       The User Manual is available at http://manual.calibre-ebook.com

       Created by Kovid Goyal <kovid@kovidgoyal.net>

web2disk (calibre 0.8.51)                January 2013                WEB2DISK(1)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.