UNIX for Dummies Questions & Answers: Selecting information from several web pages... (Post 31359 by Vishnu, Wednesday 6th of November 2002, 04:04:30 PM)
Yes, Perderabo, you are right about that..

and in fact I used a small command which filtered the user, system admin, etc. commands into consolidated files (which of course had much less utility than the man pages themselves!)

I used something like this to consolidate the user command man pages...

man `man -k ' ' | grep '(1)' | awk -F'(' '{ print $1 }' | tr '\n' ' '` | col -b > consolidatedman
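
For what it's worth, a slightly more defensive version of the same pipeline (a sketch assuming a GNU-ish userland; apropos output formats vary between systems) would be:

# Keep only section 1 (user commands), strip everything after "(",
# de-duplicate the names, format each page, then strip overstrikes.
man -k ' ' | grep '(1)' \
    | awk -F'(' '{ print $1 }' \
    | sort -u \
    | xargs man 2>/dev/null \
    | col -b > consolidatedman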

My purpose in asking this question was part of a general wish to be able to crawl through web pages that follow a certain pattern in their naming...
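
For example, if the pages follow a simple numeric naming scheme, a loop like this would walk them (the URL is a hypothetical placeholder; curl -f stops the loop at the first missing page):

#!/bin/sh
# Fetch pages whose names follow a numeric pattern, stopping at the
# first one that does not exist. The URL scheme is hypothetical.
n=1
while [ $n -le 20 ]; do
    curl -f -s "http://example.com/doc/page${n}.html" -o "page${n}.html" || break
    n=$((n + 1))
done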

Cheers!
Vishnu.
 

9 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Dynamic web pages for Unix Web Server

Hi, my company is considering a new development of our web site, which has been running on Apache over Solaris. The company that is going to do this for us only knows how to develop it in ASP. I guess this means we'll have to have another IIS server on NT for these dynamic pages :( What are... (5 Replies)
Discussion started by: me2unix

2. Shell Programming and Scripting

Count links in all of my web pages

Counts the number of hyperlinks in all web pages in the current directory and all of its sub-directories. Count in all files of type "*htm" and "*html" . i want the output to look something like this: Total number of web pages: (number) Total number of links: (number) Average number of links... (1 Reply)
Discussion started by: phillip

3. UNIX for Dummies Questions & Answers

Browse Web pages through command line

Is there any way to browse web pages while on the command line? I know wget can download pages, but I was wondering if there was an option other than that. (2 Replies)
Discussion started by: vroomlicious

4. Shell Programming and Scripting

Investigating web pages in awk

Hello. I want to make an awk script to search an HTML file and output all the links (e.g. .html, .htm, .jpg, .doc, .pdf, etc.) inside it. Also, I want the links that are output to be split into 3 groups (separated by an empty line), the first group with links to other webpages (.html .htm etc),... (1 Reply)
Discussion started by: adpe

5. UNIX for Dummies Questions & Answers

curl command with web pages

I can't quite seem to understand what the curl command does with a web address. I tried this: curl www.oreilly.com but I just got the first few lines of a web page, and it's nowhere on my machine. Can someone elaborate? (2 Replies)
Discussion started by: Straitsfan

6. UNIX for Dummies Questions & Answers

Forcing web pages to anti-alias

Here is an observation that has started to riddle me and perhaps someone can enlighten me. When a web page (or desktop page for that matter) uses the standard font, it is not anti-aliased, unless the user opts in to do so via the desktop settings. It appears however that fonts are not... (0 Replies)
Discussion started by: figaro

7. Shell Programming and Scripting

Checking Web Pages?

Hey guys, Unfortunately, I cannot use wget on our systems... I am looking for another way for a UNIX script to test web pages and let me know if they are up or down for some of our applications. Has anyone seen this before? Thanks, Ryan (2 Replies)
Discussion started by: rwcolb90
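
For a question like that last one, a minimal up/down check can be done with curl instead of wget (a sketch only; the URL list is hypothetical, and -f makes curl treat HTTP errors as failures):

#!/bin/sh
# Report up/down for each URL without using wget.
# The URLs are hypothetical placeholders.
for url in http://host1/app/status http://host2/app/status; do
    if curl -f -s -o /dev/null "$url"; then
        echo "UP   $url"
    else
        echo "DOWN $url"
    fi
done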

8. Shell Programming and Scripting

Get web pages and compare

Hello, I'm writing a shell script to wget the content of web pages from multiple servers into a variable and compare them, returning 0 if they match and 2 otherwise: #!/bin/bash # Cluster 1 CLUSTER1_SERVERS="srv1 srv2 srv3 srv4" CLUSTER1_APPLIS="test/version.html test2.version.jsp" # List of... (4 Replies)
Discussion started by: gtam

9. Shell Programming and Scripting

Get web pages and compare

Hello, I'm writing a script to get the content of web pages on different machines and compare them using their md5 hash; here is my code: #!/bin/bash # Cluster 1 CLUSTER1_SERVERS="srv01:7051 srv02:7052 srv03:7053 srv04:7054" CLUSTER1_APPLIS="test/version.html test2/version.html... (2 Replies)
Discussion started by: gtam
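
Since that last snippet is cut off, here is a minimal self-contained sketch of the md5-comparison idea (the server list, ports, and page path are hypothetical stand-ins for the thread's values; md5sum is the GNU name, BSD systems call it md5):

#!/bin/bash
# Compare one page served by several machines by its md5 hash.
# SERVERS and PAGE are hypothetical stand-ins.
SERVERS="srv01:7051 srv02:7052 srv03:7053 srv04:7054"
PAGE="test/version.html"
ref=""
for s in $SERVERS; do
    sum=$(curl -s "http://$s/$PAGE" | md5sum | awk '{ print $1 }')
    if [ -z "$ref" ]; then
        ref="$sum"                  # first server sets the reference hash
    elif [ "$sum" != "$ref" ]; then
        echo "mismatch on $s"
        exit 2                      # return 2 on mismatch, as the thread asks
    fi
done
exit 0                              # all pages matched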