Full Discussion: Wget and single page
Posted by silver18 on Thursday 27th of September 2012, 01:17 PM
Wget and single page

Good evening to all!!

I'm trying to become familiar with wget.
I would like to download a page from Wikipedia with all of its images and CSS, but without following the links on the page. The saved page should be named index.html.
I would also like to save it into a new folder under /mnt/us.

This is what I'm using now, but it also downloads every page linked from the one I want.

Code:
PAGES="/mnt/us/"
webpage="http://it.wikipedia.org/wiki/Robot"

wget -e robots=off --quiet --mirror --page-requisites --no-parent --convert-links --adjust-extension -P "$PAGES" -U Mozilla "$webpage"

Thanks a lot to everyone!!
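
Dropping --mirror is likely the fix here: --mirror implies recursive retrieval (-r with infinite depth), which is what pulls in every linked page, while --page-requisites by itself fetches just the one page plus its inline images and CSS. A sketch of that variant, reusing the variables above (--no-parent is dropped too, since it only matters when recursing):

Code:
PAGES="/mnt/us/"
webpage="http://it.wikipedia.org/wiki/Robot"

# Without --mirror, only this page and its requisites (images, CSS)
# are fetched. --adjust-extension saves the page as Robot.html;
# rename it afterwards if index.html is required.
wget -e robots=off --quiet --page-requisites --convert-links \
     --adjust-extension -P "$PAGES" -U Mozilla "$webpage"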
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to get the page size (of a url) using wget

Hi, I am trying to get the page size of a URL (e.g., www.example.com) using the wget command. Any thoughts on which parameters I need to pass to wget to get the size alone? Regards, Raj (1 Reply)
Discussion started by: rajbal
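
One approach, sketched below: wget's --spider mode with --server-response prints the HTTP headers without downloading the body, and the Content-Length header (when the server sends one) holds the size in bytes. The URL is the example from the post:

Code:
# Print only the Content-Length value; wget logs headers to stderr,
# hence the 2>&1. Produces nothing if the server omits the header.
wget --spider --server-response "http://www.example.com" 2>&1 |
    awk '/Content-Length:/ { print $2 }'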

2. UNIX for Dummies Questions & Answers

wget with semicolon in page name

So, I'd like to wget a webpage, as it's not going to stick around forever - but the problem is the webpage has a semicolon in it. wget http://example.com/stuff/asdf;asdf obviously doesn't get the right webpage. Any good way around this? (2 Replies)
Discussion started by: Julolidine
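
The semicolon is the culprit: unquoted, the shell treats it as a command separator, so wget only ever sees http://example.com/stuff/asdf. Quoting the URL keeps it intact:

Code:
# Single quotes stop the shell from splitting the command at ';'.
wget 'http://example.com/stuff/asdf;asdf'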

3. Shell Programming and Scripting

WGET cycling on an updating page

Hello, I am experiencing an issue while downloading a few pages using wget. All of them work without a problem except one, which is a page that does a tail on a log and as a result is constantly being updated. wget here seems to run endlessly and needs to be killed manually. I wanted to... (0 Replies)
Discussion started by: prafulnama
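
Because the page streams continuously, wget's --timeout (a read timeout) never triggers; an external watchdog is one way to bound the run. A sketch, assuming GNU coreutils timeout is available (the URL is hypothetical):

Code:
# Kill wget after 30 seconds; whatever has arrived by then is kept
# in page.html.
timeout 30 wget -q -O page.html "http://example.com/live-log"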

4. Shell Programming and Scripting

cgi script to print all .png files on a single page

Hi guys, I'm relatively new to Perl and have not touched HTML before. I'm trying to write a CGI script that prints all images on a single HTML page from a given directory. I'm using Perl to gather stats and rrdtool to update and create graphs; now I just need to print these graphs all onto one index.cgi... (3 Replies)
Discussion started by: jeffersno1
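
The poster is working in Perl, but the idea is language-independent: a CGI script emits a Content-type header and then one <img> tag per file. A minimal shell sketch, in which the graphs directory and its URL path are assumptions:

Code:
#!/bin/sh
# Assumes the PNGs live in /var/www/html/graphs, served at /graphs/.
printf 'Content-type: text/html\n\n'
echo '<html><body>'
for f in /var/www/html/graphs/*.png; do
    name=$(basename "$f")
    echo "<img src=\"/graphs/$name\" alt=\"$name\">"
done
echo '</body></html>'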

5. UNIX for Dummies Questions & Answers

display command output page per page

Good afternoon, I wonder how I could use Unix commands to ease the reading of long command output, like "php -i" or any other command that returns a long answer. I could not find the right terms to Google it or search the forum, therefore I bother you with this question. ... (3 Replies)
Discussion started by: Mat_k
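
The standard answer is a pager: pipe the long output into less (or more), which shows it one screenful at a time. For example:

Code:
# Space advances a page, 'b' goes back, 'q' quits.
php -i | less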

6. Web Development

Page load time- local page

Hi, is there a way to calculate the page load time? I am trying to calculate the load time of a page locally. I found tools to do this over http or https, but none that work locally. Any ideas? Thanks. (4 Replies)
Discussion started by: jamie_123
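
With shell tools alone this can only time the fetch, not the browser's rendering; with that caveat, curl accepts file:// URLs and can report the elapsed time. A sketch (the path is an example):

Code:
# Print the total time, in seconds, taken to read the local page.
curl -s -o /dev/null -w '%{time_total}\n' 'file:///var/www/html/index.html'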

7. Shell Programming and Scripting

Print multiple copies page by page using lp command

Hi, I have a PDF file that is being generated using the rwrun command in a shell script. The script then uses the lp command to print the same PDF file. Suppose there are 4 pages in the PDF file; I need to print 2 copies of the first page, 2 copies of the second page, then 2... (7 Replies)
Discussion started by: megha2525
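
Uncollated printing produces exactly that order (1,1,2,2,...), so one candidate is asking CUPS for uncollated copies; whether the option is honored depends on the printer and driver. The file name below is an example:

Code:
# Two uncollated copies: all copies of page 1, then page 2, and so on.
lp -n 2 -o collate=false report.pdf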

8. Shell Programming and Scripting

script for adding page number before page breaks

Hi, if there is an expert who can help: I have many txt files produced by pdftotext that include page breaks; the page breaks seem to be Unix style, hex 0C. I want to add page numbers before each page break, as in: Page XXXX Regards antman (9 Replies)
Discussion started by: antman
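
Treating the form feed (hex 0C, \f) as the record separator makes this a short awk job; a sketch, assuming pages are separated by single \f characters as pdftotext emits them (file names are examples):

Code:
# Emit "Page N" just before each page break.
awk 'BEGIN { RS = "\f"; ORS = "" }
     NR > 1 { printf "Page %d\n\f", NR - 1 }
     { print }' input.txt > numbered.txt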

9. Shell Programming and Scripting

Random web page download wget script

Hi, I've been attempting to create a script that downloads web pages at random intervals to mimic typical user usage. However, I'm struggling to link $url to the URL list, and thus wget complains of a missing URL. Any ideas? Thanks #!/bin/sh #URL List url1="http://www.bbc.co.uk"... (14 Replies)
Discussion started by: shadyuk
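
Variables named url1, url2, ... cannot be reached through a single $url without awkward indirection in plain sh; a bash array sidesteps the problem. A sketch, assuming bash is acceptable (the URL list and timing are placeholders):

Code:
#!/bin/bash
# URL list as an array; a random index picks one entry per iteration.
urls=("http://www.bbc.co.uk" "http://www.example.com")
while true; do
    url=${urls[RANDOM % ${#urls[@]}]}
    wget -q -O /dev/null "$url"
    sleep $((RANDOM % 60 + 1))   # random pause of up to a minute
done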

10. Shell Programming and Scripting

Wget - working in browser but cannot download from wget

Hi, I need to download a zip file from the US government link below. https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP I only have the wget utility installed on the server. When I use the below command, I am getting error 403... (2 Replies)
Discussion started by: Prasannag87
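
A 403 that appears only outside the browser is often the server rejecting wget's default User-Agent, so presenting a browser-like one is worth a try; no guarantee the server accepts it. Quoting the URL matters because it contains & and =:

Code:
wget -U "Mozilla/5.0" -O SAM_PUBLIC_MONTHLY_20160207.ZIP \
    'https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP'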
SMBGET(1)                                                            SMBGET(1)

NAME
       smbget - wget-like utility for downloading files over SMB

SYNOPSIS
       smbget [-a,--guest] [-r,--resume] [-R,--recursive]
              [-u,--username=STRING] [-p,--password=STRING]
              [-w,--workgroup=STRING] [-n,--nonprompt] [-d,--debuglevel=INT]
              [-D,--dots] [-P,--keep-permissions] [-o,--outputfile]
              [-f,--rcfile] [-q,--quiet] [-v,--verbose] [-b,--blocksize]
              [-?,--help] [--usage]
              {smb://host/share/path/to/file} [smb://url2/] [...]

DESCRIPTION
       This tool is part of the samba(7) suite.

       smbget is a simple utility with wget-like semantics that can download
       files from SMB servers. You can specify the files you would like to
       download on the command line. The files should be given as standard
       SMB URLs, e.g. use smb://host/share/file for the UNC path
       \\HOST\SHARE\file.

OPTIONS
       -a, --guest             Work as user guest
       -r, --resume            Automatically resume aborted files
       -R, --recursive         Recursively download files
       -u, --username=STRING   Username to use
       -p, --password=STRING   Password to use
       -w, --workgroup=STRING  Workgroup to use (optional)
       -n, --nonprompt         Don't ask anything (non-interactive)
       -d, --debuglevel=INT    Debug level to use
       -D, --dots              Show dots as progress indication
       -P, --keep-permissions  Set the same permissions on the local file as
                               are set on the remote file
       -o, --outputfile        Write the file being downloaded to the
                               specified file; cannot be used together with -R
       -f, --rcfile            Use the specified rcfile. It is loaded in the
                               order it was specified, e.g. options given
                               before it may be overridden by its contents
       -q, --quiet             Be quiet
       -v, --verbose           Be verbose
       -b, --blocksize         Number of bytes to download in a block;
                               defaults to 64000
       -?, --help              Show help message
       --usage                 Display brief usage message

SMB URLS
       SMB URLs should be specified in the following format:

       smb://[[[domain;]user[:password@]]server[/share[/path[/file]]]]

       smb:// means all the workgroups. smb://name/ means, if name is a
       workgroup, all the servers in that workgroup, or, if name is a server,
       all the shares on that server.

EXAMPLES
       # Recursively download the 'src' directory
       smbget -R smb://rhonwyn/jelmer/src

       # Download a FreeBSD ISO and enable resuming
       smbget -r smb://rhonwyn/isos/FreeBSD5.1.iso

       # Recursively download all ISOs
       smbget -Rr smb://rhonwyn/isos

       # Back up my data on rhonwyn
       smbget -Rr smb://rhonwyn/

BUGS
       Permission denied is returned in some cases where the cause of the
       error is unknown (such as an illegally formatted smb:// URL, or trying
       to get a directory without -R turned on).

VERSION
       This man page is correct for version 3.0 of the Samba suite.

AUTHOR
       The original Samba software and related utilities were created by
       Andrew Tridgell. Samba is now developed by the Samba Team as an Open
       Source project, similar to the way the Linux kernel is developed. The
       smbget manpage was written by Jelmer Vernooij.

                                                                     SMBGET(1)