Full Discussion: Wget Cronjob, Godaddy.
Post 302979726 by embus in UNIX for Beginners Questions & Answers, Thursday 18 August 2016, 10:36 AM
Thanks for that, I see a little more clearly how the q works now. This is the response I get, suggesting it can't read the URL. I can't post URLs here yet, but I've put it in as lynora.co.uk (with http:// in front).

Code:
wget: missing URL
Usage: wget [OPTION]... [URL]...

Try ‘wget --help’ for more options.
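
For context, "wget: missing URL" means the URL argument never reached wget at all. In a crontab this is commonly caused by an unescaped % (cron converts a bare % into a newline and feeds everything after it to the command's standard input) or by the entry wrapping onto a second line. Below is a minimal sketch of a working entry using the URL named above; the wget path and log file are assumptions, not taken from the thread.

Code:
# Hourly fetch. Quote the URL, use the full path to wget (cron's PATH
# is sparse), and avoid bare % characters in the command.
0 * * * * /usr/bin/wget -q -O /dev/null "http://lynora.co.uk" >/tmp/wget-cron.log 2>&1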


8 More Discussions You Might Find Interesting

1. HP-UX

CronJob

Dear Guru, I have submitted some cronjobs that have been running for quite some time. However, today I found that some cronjobs did not run. Can you please explain what causes this to happen? Is there any system setting that limits the number of cronjobs started per minute, etc.? Thanks. Kelly (1 Reply)
Discussion started by: hcng08

2. Solaris

at vs cronjob

Hi, what is the difference between an at job and a cron job? Thanks in advance. (1 Reply)
Discussion started by: mokkan
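
In short, at runs a command once at a given time, while cron runs it on a repeating schedule. A quick sketch of the same task both ways (the script path is a placeholder):

Code:
# at: one-shot job, runs a single time at 02:30 tomorrow
echo "/home/user/backup.sh" | at 02:30 tomorrow

# cron: recurring job, runs every day at 02:30 (added via crontab -e)
30 2 * * * /home/user/backup.sh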

3. UNIX for Dummies Questions & Answers

How to cancel a cronjob if the cronjob is still running

Hi everyone, I'm a newbie in this forum, hope I can get some help here :) I have a command in crontab that is executed every minute. Sometimes this command needs more than a minute to finish. The problem is, crontab executes the command again even though the previous run has not finished yet, causing the system... (7 Replies)
Discussion started by: 2j4h
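
A common way to stop cron from starting a second copy while the first is still running (a sketch, not taken from that thread; flock(1) ships with util-linux) is to serialize the job with a lock file:

Code:
# Every minute, but -n makes flock give up immediately if the previous
# run still holds /tmp/task.lock, so runs never overlap.
# /path/to/long-task.sh is a hypothetical placeholder.
* * * * * /usr/bin/flock -n /tmp/task.lock /path/to/long-task.sh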

4. UNIX for Dummies Questions & Answers

Cronjob help

Hi, I am very new to Linux. I want to run a cronjob every 15 minutes that checks a directory for files. If the directory contains more than ten files, I want it to send an email to me. All I have is this... */15 * * * * ls -l | wc -l | | mail -s "This is just a test" I would... (2 Replies)
Discussion started by: LinuxNewb
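
A hedged sketch of what that entry could look like (the directory and address are placeholders): plain ls is used instead of ls -l so the "total" line doesn't inflate the count, and the mail only fires when the test succeeds.

Code:
# Every 15 minutes: mail an alert if /path/to/dir holds more than ten
# entries. The path and address are placeholders.
*/15 * * * * [ $(ls /path/to/dir | wc -l) -gt 10 ] && echo "Directory has more than 10 files" | mail -s "File count alert" user@example.com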

5. UNIX for Dummies Questions & Answers

Postfix Dovecot Roundcube Godaddy

Long story short, I have everything working in my SOHO, which includes Postfix, Dovecot (IMAP), Roundcube, and BIND, and it is ready to receive email from the outside, using this tutorial: https://workaround.org/ispmail/wheezy I also set up my internal DNS server using: https://wiki.debian.org/Bind9... (2 Replies)
Discussion started by: metallica1973

6. Shell Programming and Scripting

Running a KSH file from VPS on Godaddy

I have looked for a week and tried a few things, but nothing seems to work, so I joined here. I have a GoDaddy account and also a VPS in Germany. On my VPS, I run a script (we will call it codegen). When I run ./codegen, it asks me how many codes I want to make. With my answer... (9 Replies)
Discussion started by: uksatman
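
If codegen reads its answers from standard input (an assumption; this won't work if the script reads from the terminal directly), the prompts can be answered non-interactively:

Code:
# Single prompt: pipe the answer in
echo "100" | ./codegen

# Multiple prompts: answer each on its own line with a here-document
./codegen <<EOF
100
yes
EOF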

7. Shell Programming and Scripting

How to download images and a JSON file from a server (GoDaddy) to a local machine (Ubuntu 14.04)?

Hi guys, just entering the Linux world, so I need help writing a script on my local machine (Ubuntu 14.04) that continuously checks a particular folder (containing images) and a JSON file on the server, and downloads whenever new images are added to that folder and whenever there is a change in the... (10 Replies)
Discussion started by: g4v1n
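
A minimal polling sketch (assumptions: the folder is reachable over HTTP with directory listing enabled, and the URLs, paths, and interval are placeholders). wget's -N flag re-downloads a file only when the server copy is newer than the local one:

Code:
#!/bin/sh
# Every 5 minutes, fetch new/changed images and the JSON file.
# -N: timestamp check; -r -np: recurse but stay inside the folder;
# -nH --cut-dirs=1: drop the host and leading path from local names.
while true; do
    wget -q -N -r -np -nH --cut-dirs=1 -P /home/user/images "http://example.com/images/"
    wget -q -N -P /home/user/data "http://example.com/data/file.json"
    sleep 300
done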

8. Shell Programming and Scripting

Wget - working in browser but cannot download with wget

Hi, I need to download a zip file from the US govt link below. https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP I only have the wget utility installed on the server. When I use the below command, I am getting error 403... (2 Replies)
Discussion started by: Prasannag87
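
A 403 for a URL that works in a browser often means the server rejects wget's default User-Agent; one workaround to try (not confirmed as the fix in that thread) is a browser-style agent string:

Code:
# Present a browser-like User-Agent; the string is illustrative only.
wget --user-agent="Mozilla/5.0 (X11; Linux x86_64)" "https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP"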
httpindex(1)                General Commands Manual                httpindex(1)

NAME
       httpindex - HTTP front-end for SWISH++ indexer

SYNOPSIS
       wget [ options ] URL... 2>&1 | httpindex [ options ]

DESCRIPTION
       httpindex is a front-end for index++(1) to index files copied from
       remote servers using wget(1). The files (in a copy of the remote
       directory structure) can be kept, deleted, or replaced with their
       descriptions after indexing.

OPTIONS
   wget Options
       The wget(1) options that are required are: -A, -nv, -r, and -x; the
       ones that are highly recommended are: -l, -nh, -t, and -w. (See the
       EXAMPLE.)

   httpindex Options
       httpindex accepts the same short options as index++(1) except for -H,
       -I, -l, -r, -S, and -V. The following options are unique to httpindex:

       -d     Replace the text of local copies of retrieved files with their
              descriptions after they have been indexed. This is useful for
              displaying file descriptions in search results without having
              to keep complete copies of the remote files, thus saving
              filesystem space. (See the extract_description() function in
              WWW(3) for details about how descriptions are extracted.)

       -D     Delete the local copies of retrieved files after they have
              been indexed. This prevents your local filesystem from filling
              up with copies of remote files.

EXAMPLE
       To index all HTML and text files on a remote web server, keeping
       descriptions locally:

           wget -A html,txt -linf -t2 -rxnv -nh -w2 http://www.foo.com 2>&1 |
           httpindex -d -e'html:*.html,text:*.txt'

       Note that you need to redirect wget(1)'s output from standard error
       to standard output in order to pipe it to httpindex.

EXIT STATUS
       Exits with a value of zero only if indexing completed successfully;
       non-zero otherwise.

CAVEATS
       In addition to those for index++(1), httpindex does not correctly
       handle the use of multiple -e, -E, -m, or -M options (because the
       Perl script uses the standard GetOpt::Std package for processing
       command-line options, which doesn't). The last of any of those
       options "wins." The work-around is to give multiple values, separated
       by commas, to a single one of those options. For example, instead of:

           httpindex -e'html:*.html' -e'text:*.txt'

       do this:

           httpindex -e'html:*.html,text:*.txt'

SEE ALSO
       index++(1), wget(1), WWW(3)

AUTHOR
       Paul J. Lucas <pauljlucas@mac.com>

SWISH++                        August 2, 2005                      httpindex(1)