Full Discussion: wget output file names
Post 302673007 by jacobs.smith, Tuesday 17 July 2012, 10:39 AM
Alister,

You missed my point.

I did check my output files after editing input.txt to the following:

Code:
index.html?acc=OSR765454&file=filename1.gz -O filename1.gz
index.html?acc=OBR765454&file=filename111.gz -O filename111.gz
.....

This input.txt is then given to the wget -i command.

I did that, and all I see is a single output file instead of five. If the five files together total 10GB, I end up with a single file, filename1.gz, that is 10GB.

Hope you got my point.
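
For what it's worth, wget -i treats the input file as a list of URLs, not command lines, so the "-O filename" text on each line is not parsed as an option; and when a single -O does apply, wget writes every download into that one file, which would explain the lone 10GB filename1.gz. A minimal workaround sketch, assuming input.txt is reworked into a hypothetical two-column format (the full URL, then the desired output name, with the literal -O dropped):

Code:
#!/bin/bash
# Workaround sketch (hypothetical reworked format): each line of input.txt
# holds "<full-URL> <output-name>", with the literal -O removed.
while read -r url name; do
    [ -z "$url" ] && continue        # skip blank lines
    wget -O "$name" "$url"           # one wget run per URL -> one file per download
done < input.txt

Running wget once per URL is slower than a single -i batch, but it is the simplest way to control each output name.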
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

wget output question

Hello there, I want to ask a very simple question. I want to read wget's output messages in the terminal and also put them into a text file. I know that with the -o flag I can log the messages to a text file, but then I won't be able to see them on the terminal. I'd appreciate any help... (1 Reply)
Discussion started by: sertansenturk
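A common way to get both, since wget writes its progress messages to standard error: redirect stderr into tee, which prints to the terminal and writes to a file at the same time (the URL and log name below are only placeholders):

Code:
# Show wget's messages on the terminal and keep a copy in wget.log.
wget http://example.com/file.tar.gz 2>&1 | tee -a wget.log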

2. Shell Programming and Scripting

download a particular file using wget

Hi all, I want to download the file srs8.3.0.1.standard.linux24_EM64T.tar.gz from the following website: http://downloads.biowisdomsrs.com/srs83_dist/ But this website contains lots of zipped files, and I want to download only the above file, discarding the other zipped files. When I am trying the... (1 Reply)
Discussion started by: alphasahoo
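Two hedged possibilities, depending on whether the full path to the archive is already known; the second relies on wget's standard -r/-A/-nd options to restrict a recursive fetch to just that file name:

Code:
# If the direct URL works, fetch only that file:
wget http://downloads.biowisdomsrs.com/srs83_dist/srs8.3.0.1.standard.linux24_EM64T.tar.gz

# Otherwise, recurse one level but accept only that name (-nd keeps it out of subdirectories):
wget -r -l1 -nd -A 'srs8.3.0.1.standard.linux24_EM64T.tar.gz' http://downloads.biowisdomsrs.com/srs83_dist/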

3. Shell Programming and Scripting

Searching for file names in a directory while ignoring certain file names

Sun Solaris UNIX question. I haven't been able to find any solution for this situation. Let's just say the file names listed below exist in a directory. I want the find command to find all files in this directory, but at the same time I want to eliminate certain file names or files with certain... (2 Replies)
Discussion started by: 2reperry
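On Solaris, find can negate name tests directly with "!". A sketch using purely hypothetical patterns for the names to skip:

Code:
# List every regular file except *.log files and anything beginning with "core".
find /some/dir -type f ! -name '*.log' ! -name 'core*' -print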

4. Shell Programming and Scripting

Very weird wget/curl output - what should I do?

Hi, I'm trying to write a script to download RedHat's errata digest. It comes in txt.gz format, and I can get it easily with Firefox. HOWEVER: the output is VERY strange when downloading it in a script. It seems I'm getting a file of the same size - but partially text and partly binary! It... (5 Replies)
Discussion started by: jstilby
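One way to narrow this down (a diagnostic sketch only, with a placeholder URL): keep the document and wget's own messages in separate files, then ask gunzip whether the download is really a clean gzip archive. If the test fails, the "binary" part is likely the compressed payload and the "text" part something else mixed into the same stream.

Code:
# Keep the payload (-O) and wget's log (-o) apart, then verify the archive.
wget -O errata.txt.gz -o wget.log 'http://example.com/path/errata.txt.gz'
gunzip -t errata.txt.gz && echo "gzip archive is intact" || echo "download is not a clean gzip file"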

5. Shell Programming and Scripting

Encapsulating output of CURL and/or WGET

I use curl and wget quite often. I set up alarms on their output. For instance, I would run a "wget" on a URL and then search for certain strings within the output given by the "wget". The problem is, I can't get the entire output or response of my wget/curl command to show up correctly in... (3 Replies)
Discussion started by: SkySmart
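The usual stumbling block is that curl sends the body to stdout while wget sends its log to stderr, so capturing "the whole response" means merging the two streams into a command substitution. The URL and match string below are placeholders:

Code:
# Capture body plus any messages from curl:
response=$(curl -sS 'http://example.com/status' 2>&1)

# Or with wget, writing the document to stdout via "-O -":
response=$(wget -O - 'http://example.com/status' 2>&1)

# Simple alarm check on the captured text:
case $response in
    *ERROR*) echo "alarm: ERROR string found" ;;
esac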

6. Shell Programming and Scripting

Exclude certain file names while selecting data files

Data files come in under different names, listed in a file called process.txt: 1. shipments_yyyymmdd.gz 2. Order_yyyymmdd.gz 3. Invoice_yyyymmdd.gz 4. globalorder_yyyymmdd.gz The process needs to discard the rest and only process two of the 4 file names available ... (1 Reply)
Discussion started by: dsravanam
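A sketch only: the post does not say which two names should survive, so assume (purely for illustration) it is the shipments and Order files. A grep over process.txt then keeps just those patterns:

Code:
# Keep only shipments_YYYYMMDD.gz and Order_YYYYMMDD.gz entries (illustrative choice).
grep -E '^(shipments|Order)_[0-9]{8}\.gz$' process.txt > files_to_process.txt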

7. Shell Programming and Scripting

Custom wget output

The below hides the messy commands of wget #!/bin/bash cd 'C:\Users\cmccabe\Desktop\wget' wget -O getCSV.txt http://172.24.188.113/data/getCSV.csv progressfilt () { local flag=false c count cr=$'\r' nl=$'\n' while IFS='' read -d '' -rn 1 c do if $flag ... (5 Replies)
Discussion started by: cmccabe
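If the installed wget is recent enough (the --show-progress flag appeared around wget 1.16), much of that filtering can be avoided; on older builds, dot-style progress at least keeps the log readable. Both lines below reuse the URL from the post and are only a possible simplification, not a drop-in replacement for the progressfilt function:

Code:
# Newer wget: quiet output except a single progress bar.
wget -q --show-progress -O getCSV.txt http://172.24.188.113/data/getCSV.csv

# Older wget: dot-style progress avoids carriage-return clutter in logs.
wget --progress=dot:mega -O getCSV.txt http://172.24.188.113/data/getCSV.csv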

8. Shell Programming and Scripting

Wget - working in browser but cannot download from wget

Hi, I need to download a zip file from the US govt link below. https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP I only have the wget utility installed on the server. When I use the below command, I am getting error 403... (2 Replies)
Discussion started by: Prasannag87
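A 403 that only the browser avoids often means the server is filtering on request headers (typically User-Agent) or expects cookies from a prior page. Sending a browser-like User-Agent is the usual first thing to try, though it may not be the whole story for this site:

Code:
# Present a browser-like User-Agent; quote the URL so the shell leaves the & alone.
wget --user-agent='Mozilla/5.0 (X11; Linux x86_64)' \
     'https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP'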

9. Shell Programming and Scripting

Print the output with different file names

I have a python script that produces an output file called test.png. Using the following command, I run the script every 2 seconds. What is the easiest way to save the output as follows (test.png (1st output), test1.png (second output), test2.png ....)? The command I use is: while sleep 2; do python... (1 Reply)
Discussion started by: quincyjones
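One simple approach is to keep a counter in the loop and copy the script's fixed output name to a numbered name after each run (the script name below is a placeholder for whatever is actually being run):

Code:
# test.png stays as written by the script; numbered copies pile up alongside it.
i=1
while sleep 2; do
    python myscript.py                # still writes test.png each time
    cp test.png "test$i.png"          # keep this run as test1.png, test2.png, ...
    i=$((i + 1))
done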

10. Shell Programming and Scripting

How to list file names and sizes in a directory and output the result to a file?

Hi, I'm trying to list the files and write the output to a file. But when I execute the command, the output file itself is listed. How do I exclude it? /tmp file1.txt file2.txt ls -ltr |grep -v '-' | awk print {$9, $5} > output.txt cat output.txt file1.txt file2.txt output.txt (8 Replies)
Discussion started by: etldeveloper
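The report file shows up because the shell creates output.txt in the listed directory before ls runs (and note that awk needs its action inside braces, as in '{print $9, $5}'). Two hedged options, with paths only as examples: write the report somewhere outside the directory, or filter the name out explicitly:

Code:
# Option 1: send the report outside the directory being listed.
ls -l /tmp | awk 'NR > 1 {print $9, $5}' > /var/tmp/output.txt

# Option 2: skip the report file by name while writing into the same directory.
ls -l | awk '$9 != "" && $9 != "output.txt" {print $9, $5}' > output.txt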
httpindex(1)						      General Commands Manual						      httpindex(1)

NAME
       httpindex - HTTP front-end for SWISH++ indexer

SYNOPSIS
       wget [ options ] URL... 2>&1 | httpindex [ options ]

DESCRIPTION
       httpindex is a front-end for index++(1) to index files copied from remote servers using wget(1). The files (in a copy of the remote
       directory structure) can be kept, deleted, or replaced with their descriptions after indexing.

OPTIONS
   wget Options
       The wget(1) options that are required are: -A, -nv, -r, and -x; the ones that are highly recommended are: -l, -nh, -t, and -w.
       (See the EXAMPLE.)

   httpindex Options
       httpindex accepts the same short options as index++(1) except for -H, -I, -l, -r, -S, and -V. The following options are unique to
       httpindex:

       -d     Replace the text of local copies of retrieved files with their descriptions after they have been indexed. This is useful to
              display file descriptions in search results without having to keep complete copies of the remote files, thus saving
              filesystem space. (See the extract_description() function in WWW(3) for details about how descriptions are extracted.)

       -D     Delete the local copies of retrieved files after they have been indexed. This prevents your local filesystem from filling
              up with copies of remote files.

EXAMPLE
       To index all HTML and text files on a remote web server, keeping descriptions locally:

              wget -A html,txt -linf -t2 -rxnv -nh -w2 http://www.foo.com 2>&1 | httpindex -d -e'html:*.html,text:*.txt'

       Note that you need to redirect wget(1)'s output from standard error to standard output in order to pipe it to httpindex.

EXIT STATUS
       Exits with a value of zero only if indexing completed successfully; non-zero otherwise.

CAVEATS
       In addition to those for index++(1), httpindex does not correctly handle the use of multiple -e, -E, -m, or -M options (because the
       Perl script uses the standard Getopt::Std package for processing command-line options, which doesn't). The last of any of those
       options "wins." The work-around is to give multiple values, separated by commas, to a single one of those options. For example,
       instead of:

              httpindex -e'html:*.html' -e'text:*.txt'

       do this:

              httpindex -e'html:*.html,text:*.txt'

SEE ALSO
       index++(1), wget(1), WWW(3)

AUTHOR
       Paul J. Lucas <pauljlucas@mac.com>

SWISH++                                                      August 2, 2005                                                      httpindex(1)