Shell Programming and Scripting: Help with using lynx/wget/curl when a link has an ampersand
Post 302492422 by Corona688, 01-31-2011 at 10:41 AM
I suppose you're running a new instance of wget to download every single web page? You don't have to run wget 900 times to download 900 pages; try:

Code:
cat <<EOF | wget -i -
http://url1.com/path/to/whatever
http://url1.com/path/to/whatever2
http://url1.com/path/to/whatever3
http://url1.com/path/to/whatever4
EOF

Much faster, especially when the URLs are from the same site, which lets wget reuse the connection.
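Since the thread is about links that contain an ampersand, it's worth noting that feeding URLs through -i (from a file or stdin) also sidesteps the shell-quoting problem: on the command line an unquoted & would background the command, but lines read by wget need no quoting at all. A minimal sketch with made-up URLs:

Code:
# Passed on the command line, a URL with '&' must be quoted or the shell
# treats everything after the '&' as a separate background job:
wget 'http://url1.com/script?a=1&b=2'

# Read via -i from stdin (or a file), the same URLs need no quoting, and
# wget can still reuse the connection for all of them:
wget -i - <<EOF
http://url1.com/script?a=1&b=2
http://url1.com/script?a=3&b=4
EOF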
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Help needed in Curl & Wget

We are trying to invoke an HTTPS service from our UNIX script using the curl command. The service is not getting invoked because it is SSL configured. Bypassing certificate checks (using curl -k) does not work. curl -k https://site curl -k -x IP:Port https://site curl -k -x IP:443 https://id:pwd@site ... (0 Replies)
Discussion started by: dineshbabu01
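For the situation in #1, a rough sketch (proxy host, port, and credentials are placeholders) of the usual way to combine an HTTP proxy with disabled certificate verification in curl; adding -v shows where the connection actually fails:

Code:
# -k/--insecure skips certificate verification, -x/--proxy routes the request
# through an HTTP proxy, -v prints the handshake and headers for debugging.
curl -v -k -x http://proxyhost:8080 'https://site.example/service'

# If the proxy itself requires credentials (placeholder values):
curl -v -k -x http://proxyhost:8080 -U proxyuser:proxypass 'https://site.example/service'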

2. Shell Programming and Scripting

Proxy with curl/wget support

I need a proxy that would enable me to use CLI curl/wget with another IP address. How do I find a paid proxy server that supports curl/wget? (1 Reply)
Discussion started by: locoroco
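Assuming a proxy has been found, pointing curl and wget at it looks roughly like this (placeholder host and port); curl takes the proxy on the command line, while wget reads the standard environment variables:

Code:
# curl: -x/--proxy takes the proxy URL directly
curl -x http://proxyhost:3128 'http://example.com/'

# wget: honours http_proxy/https_proxy from the environment
export http_proxy=http://proxyhost:3128
export https_proxy=http://proxyhost:3128
wget 'http://example.com/'

# Sanity check that the outgoing address really changed (any "what is my IP"
# service will do; ifconfig.me is just one example)
curl -x http://proxyhost:3128 http://ifconfig.me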

3. Shell Programming and Scripting

Specifying IP address with curl/wget

Hello, I am wondering whether anyone knows of a method, using curl/wget or another tool, whereby I could specify the IP address of the server I wish to query for a website. Something similar to editing /etc/hosts, but that can be done directly from the command line. I have looked through the man pages... (4 Replies)
Discussion started by: colinireland
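For #3, newer curl releases (7.21.3 and later) have a --resolve option that pins a hostname to a chosen IP without touching /etc/hosts; wget has no direct equivalent, but a Host header gets close for plain HTTP. A sketch with placeholder addresses:

Code:
# Make curl connect to 203.0.113.7 whenever it would normally resolve
# www.example.com on port 443:
curl --resolve www.example.com:443:203.0.113.7 'https://www.example.com/'

# wget workaround: request the IP directly and supply the Host header yourself
# (plain HTTP only; certificate checks would fail against a bare IP):
wget --header='Host: www.example.com' 'http://203.0.113.7/'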

4. Shell Programming and Scripting

How to download file without curl and wget

Hi, I need a shell script that will download a zip file every second from an HTTP server, but I can't use either curl or wget. Can anyone help me go about this task? Thanks! (1 Reply)
Discussion started by: rubber08
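Without curl or wget, bash can open the TCP connection itself through its /dev/tcp virtual paths. A rough sketch for #4 with placeholder host and path; it assumes plain HTTP, no redirects, and it leaves the HTTP headers in the saved response, so they would still need to be stripped before the zip is usable:

Code:
#!/bin/bash
host=server.example
path=/files/data.zip

while :; do
    # Open a bidirectional connection on file descriptor 3 and send a raw GET.
    exec 3<>"/dev/tcp/$host/80"
    printf 'GET %s HTTP/1.0\r\nHost: %s\r\nConnection: close\r\n\r\n' "$path" "$host" >&3
    cat <&3 > "response_$(date +%s).raw"
    exec 3<&-
    sleep 1
done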

5. Shell Programming and Scripting

Encapsulating output of CURL and/or WGET

I use curl and wget quite often. I set up alarms on their output. For instance, I would run a "wget" on a URL and then search for certain strings within the output given by the "wget". The problem is, I can't get the entire output or response of my wget/curl command to show up correctly in... (3 Replies)
Discussion started by: SkySmart
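A sketch of capturing everything wget or curl prints, including the status messages they write to stderr, into a single variable so it can be searched (placeholder URL and search string):

Code:
url='http://example.com/page'

# 2>&1 folds stderr (where wget writes its status output) into the captured
# text; -O- sends the downloaded document to stdout as well.
output=$(wget -O- "$url" 2>&1)

if printf '%s\n' "$output" | grep -q 'some expected string'; then
    echo "OK"
else
    echo "ALARM: expected string not found" >&2
fi

# curl equivalent: -s silences the progress meter, -S keeps real error messages.
output=$(curl -sS "$url" 2>&1)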

6. Shell Programming and Scripting

Wget vs Curl - Proxy issue

Hi, my script needs to crawl the data from a third-party site. Currently it is written with wget. The third-party site has a shared interface with different IP addresses. My wget works with all of the IP addresses except one, whereas curl is able to hit that IP address and comes out... (2 Replies)
Discussion started by: sathyaonnuix
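When wget and curl disagree about a single IP address, comparing their verbose traces side by side is usually the quickest way to see where they diverge (proxy handling, DNS, or headers). A sketch with a placeholder address:

Code:
# wget: -d prints debug output, including the request headers it actually sends
wget -d -O /dev/null 'http://203.0.113.7/' 2>&1 | head -n 40

# curl: -v shows the connection attempt and headers for comparison
curl -v -o /dev/null 'http://203.0.113.7/' 2>&1 | head -n 40

# Both tools pick up proxy settings from the environment, so check those too
env | grep -i proxy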

7. Shell Programming and Scripting

Wget/curl credentials validation

Experts, I log in to a 3rd-party site and pull some valuable information with my credentials. I pass my credentials via --post-data in wget. Now my account is locked. I want my wget to alert me that the account is locked. How can I achieve this? My idea is to get the source page HTML from the... (2 Replies)
Discussion started by: sathyaonnuix
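Following the poster's own idea of grabbing the response HTML and looking for the lock message, a sketch with placeholder URL, form fields, and message text:

Code:
# Placeholder form fields and lock message; adjust them to the site's actual HTML.
response=$(wget -O- --post-data='user=myuser&pass=mypass' 'https://site.example/login' 2>&1)

if printf '%s\n' "$response" | grep -qi 'account is locked'; then
    echo "ALERT: account appears to be locked" >&2
    exit 1
fi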

8. UNIX for Dummies Questions & Answers

Read URL data from UNIX without wget,curl,lynx,w3m.

Hi Experts, problem statement: we have a URL whose data we need to read and parse inside a shell script. My AIX box has only a very limited Perl installation, and I can't install any utilities either. Specifically, wget, cURL, Lynx, w3m and LWP can't be used, as I only found these utilities when I googled... (0 Replies)
Discussion started by: scott_cog
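For the AIX case in #8, ksh93 (shipped with recent AIX releases, typically as /usr/bin/ksh93) understands the same /dev/tcp/host/port virtual paths as bash, which allows a raw HTTP GET with no external tools at all. A sketch with placeholder host and path, plain HTTP only:

Code:
#!/usr/bin/ksh93
host=host.example
path=/data/feed.xml

# Open the connection on fd 3, send a minimal HTTP/1.0 request, then strip
# the response headers (everything up to the first blank line) into page.xml.
exec 3<>"/dev/tcp/$host/80"
print -u3 "GET $path HTTP/1.0\r\nHost: $host\r\n\r\n"
cat <&3 | sed '1,/^\r\{0,1\}$/d' > page.xml
exec 3<&-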

9. Shell Programming and Scripting

How to get content of a webpage Curl vs Wget?

Hello, what I am trying to do is get the HTML data of a website automatically. First I decided to do it manually, and in a terminal I entered the code below: $ wget http://www.***.*** -q -O code.html Unfortunately the code.html file was empty. When I entered the code below, it gave error 303-304: $... (1 Reply)
Discussion started by: baris35
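An empty file together with a 3xx status usually means the site answered with a redirect, or objected to the default client string. A sketch of the usual knobs to try, with a placeholder URL:

Code:
url='http://www.example.com/'

# curl only follows redirects when told to (-L); -A sets a browser-like
# User-Agent in case the server rejects the default one.
curl -L -A 'Mozilla/5.0' -o code.html "$url"

# wget follows redirects by default, so print the server's responses (-S)
# to see what is actually coming back.
wget -S --user-agent='Mozilla/5.0' -O code.html "$url"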

10. Web Development

Wget/curl and javascript

What can I use instead of wget/curl when I need to log into websites that use JavaScript? Wget and curl don't handle JavaScript. (6 Replies)
Discussion started by: locoroco
SMBGET(1)                          User Commands                          SMBGET(1)

NAME
       smbget - wget-like utility for downloading files over SMB

SYNOPSIS
       smbget [-a, --guest] [-r, --resume] [-R, --recursive] [-u, --username=STRING]
              [-p, --password=STRING] [-w, --workgroup=STRING] [-n, --nonprompt]
              [-d, --debuglevel=INT] [-D, --dots] [-P, --keep-permissions]
              [-o, --outputfile] [-f, --rcfile] [-q, --quiet] [-v, --verbose]
              [-b, --blocksize] [-O, --stdout] [-?, --help] [--usage]
              {smb://host/share/path/to/file} [smb://url2/] [...]

DESCRIPTION
       This tool is part of the samba(7) suite.

       smbget is a simple utility with wget-like semantics that can download files
       from SMB servers. You can specify the files you would like to download on
       the command line. The files should be given in the smb-URL standard, e.g.
       use smb://host/share/file for the UNC path \\HOST\SHARE\file.

OPTIONS
       -a, --guest               Work as user guest
       -r, --resume              Automatically resume aborted files
       -R, --recursive           Recursively download files
       -u, --username=STRING     Username to use
       -p, --password=STRING     Password to use
       -w, --workgroup=STRING    Workgroup to use (optional)
       -n, --nonprompt           Don't ask anything (non-interactive)
       -d, --debuglevel=INT      Debug level to use
       -D, --dots                Show dots as progress indication
       -P, --keep-permissions    Set the same permissions on the local file as are
                                 set on the remote file
       -o, --outputfile          Write the file that is being downloaded to the
                                 specified file. Cannot be used together with -R.
       -O, --stdout              Write the file that is being downloaded to
                                 standard output
       -f, --rcfile              Use the specified rcfile. It is loaded in the
                                 order it was specified; options given before it
                                 may be overridden by the contents of the rcfile.
       -q, --quiet               Be quiet
       -v, --verbose             Be verbose
       -b, --blocksize           Number of bytes to download in a block.
                                 Defaults to 64000.
       -?, --help                Show help message
       --usage                   Display brief usage message

SMB URLS
       SMB URLs should be specified in the following format:

              smb://[[[domain;]user[:password@]]server[/share[/path[/file]]]]

       smb:// means all the workgroups. smb://name/ means, if name is a workgroup,
       all the servers in this workgroup, or, if name is a server, all the shares
       on this server.

EXAMPLES
       # Recursively download 'src' directory
       smbget -R smb://rhonwyn/jelmer/src

       # Download FreeBSD ISO and enable resuming
       smbget -r smb://rhonwyn/isos/FreeBSD5.1.iso

       # Recursively download all ISOs
       smbget -Rr smb://rhonwyn/isos

       # Backup my data on rhonwyn
       smbget -Rr smb://rhonwyn/

BUGS
       Permission denied is returned in some cases where the cause of the error is
       unknown (such as an illegally formatted smb:// URL, or trying to get a
       directory without -R turned on).

VERSION
       This man page is correct for version 3 of the Samba suite.

AUTHOR
       The original Samba software and related utilities were created by Andrew
       Tridgell. Samba is now developed by the Samba Team as an Open Source project
       similar to the way the Linux kernel is developed. The smbget manpage was
       written by Jelmer Vernooij.

Samba 3.5                            06/18/2010                           SMBGET(1)