01-12-2007
10 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
I built my website based on Dreamweaver, on Windows platform. My server uses Unix, and the page doesn't look too good. Is there any way to solve this problem without too much of a headache? (1 Reply)
Discussion started by: PCL
2. UNIX for Dummies Questions & Answers
Hello,
I am new to Unix, but I want to know how we can fetch data from a web page (i.e., an HTML page). My requirement is to read an HTML page and create a flat file (text file) from its contents.
Thanks
Imtiaz (3 Replies)
Discussion started by: Imtiaz
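A minimal sketch of the HTML-to-flat-file conversion asked about above. The sample page and file names are made up for illustration; a live page could be fetched first with `curl -s http://example.com/ -o page.html`.

```shell
# Create a sample HTML page (stand-in for a downloaded one)
cat > page.html <<'EOF'
<html><body>
<h1>Title</h1>
<p>First line</p>
<p>Second line</p>
</body></html>
EOF

# Strip the tags, trim leading whitespace, and drop empty lines
sed -e 's/<[^>]*>//g' page.html | sed -e 's/^[[:space:]]*//' -e '/^$/d' > page.txt
cat page.txt
```

This tag-stripping `sed` is a rough heuristic; for real pages a renderer such as `lynx -dump` gives cleaner text.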
3. UNIX for Dummies Questions & Answers
Hey, this is my first post and I was wondering:
how do I make a web page for a small business or something?
Anything will help.
Thanks (3 Replies)
Discussion started by: Neil Peart
4. UNIX for Dummies Questions & Answers
I'm 13 years of age and I am into computers. I am trying to learn how to make a webpage.
I could use the help and would greatly appreciate it. (1 Reply)
Discussion started by: lydia98
5. Programming
Hello,
I'm a total newbie to HTTP commands, so I'm not sure how to do this. What I'd like is to write a C program that fetches the contents of an HTML page at a given address.
Could someone help with this?
Thanks in advance! (4 Replies)
Discussion started by: rayne
6. Shell Programming and Scripting
Hi all,
I have an XML file.
As per the requirement, I need to map fields of this file to various fields of a web page.
How can I use the wput command for this?
Regards,
gander_ss (3 Replies)
Discussion started by: gander_ss
7. Shell Programming and Scripting
Hi
I have a file which looks like this
name: Sally group: Group4
name: Tim group: Group1
name: Dan group: Group2
name: Chris group: Group3
name: Peter group:
name: Fred group:
name: Mary group: Group2
Well I want to get rid of the... (4 Replies)
Discussion started by: bombcan
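The question above is cut off, so the goal is an assumption: this sketch drops the records whose "group:" field is empty, which is one common cleanup for data shaped like that file. The sample repeats a few lines from the post.

```shell
# Sample data (subset of the file shown in the post)
cat > members.txt <<'EOF'
name: Sally group: Group4
name: Peter group:
name: Fred group:
name: Mary group: Group2
EOF

# Keep only lines where a value follows "group:",
# i.e. where the last field is not the literal "group:"
awk '$NF != "group:"' members.txt
```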
8. UNIX Desktop Questions & Answers
Is there a way we can get a web page through CLI on a unix machine? Please help! (3 Replies)
Discussion started by: Pouchie1
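A few common ways to retrieve a web page from the CLI, as asked above. The URLs are placeholders; the runnable part below uses a `file://` URL so it works offline.

```shell
# Typical fetchers (URLs are placeholders):
#   curl -s http://example.com/ -o page.html    # save the raw HTML to a file
#   wget -q -O page.html http://example.com/    # the same, with wget
#   lynx -dump http://example.com/              # rendered text, if lynx is installed

# curl also understands file:// URLs, which is handy for a quick offline check:
printf '<p>hello</p>\n' > sample.html
curl -s "file://$PWD/sample.html" -o fetched.html
cat fetched.html
```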
9. UNIX for Dummies Questions & Answers
Hi,
I have a project for school using wget and egrep to locate pattern locations on a web page.
One of the things we have to do is handle an "access denied" exception.
Here is the problem: I cannot think of or find any web pages that give me an access denied error to play with. Can anyone suggest... (1 Reply)
Discussion started by: njmiano
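One way to handle the "access denied" case above without hunting for a page that returns it: branch on the HTTP status code rather than on the page text. curl reports the code via `-w '%{http_code}'`, so the classification logic can be tested with made-up codes. This is a hypothetical sketch; the function name is invented.

```shell
# Map an HTTP status code to a short outcome message
classify_status() {
    case $1 in
        200) echo "ok" ;;
        401|403) echo "access denied" ;;
        *) echo "error: HTTP $1" ;;
    esac
}

# Against a live URL this would be driven by curl:
#   status=$(curl -s -o page.html -w '%{http_code}' "$url")
#   classify_status "$status"
classify_status 403   # prints "access denied"
```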
10. Web Development
I am just wondering: why do programmers use this when programming for the web? When you are making Joomla templates and your main focus is targeting the search engines, is Java very important, or is it better not to use it? (2 Replies)
Discussion started by: Anna Hussie
LEARN ABOUT DEBIAN
hxcopy
HXCOPY(1) HTML-XML-utils HXCOPY(1)
NAME
hxcopy - copy an HTML file and update its relative links
SYNOPSIS
hxcopy [ -i old-URL ] [ -o new-URL ] [ file-or-URL [ file-or-URL ] ]
DESCRIPTION
The hxcopy command copies its first argument to its second argument, while updating relative links. The input is assumed to be HTML or
XHTML and may be slightly reformatted in the process.
If the second argument is omitted, hxcopy writes to standard output. In this case the option -o is required. If the first argument is also
omitted, hxcopy reads from standard input. In this case the option -i is required.
OPTIONS
The following options are supported:
-i old-URL
For the purposes of updating relative links, act as if old-URL is the location from which the input is copied. If this option is
omitted, the actual location of the first argument is used for calculating relative links.
-o new-URL
For the purposes of updating relative links, act as if new-URL is the location to which the input is copied. If this option is
omitted, the actual location of the second argument is used for calculating relative links.
ENVIRONMENT
To use a proxy to retrieve remote files, set the environment variables http_proxy and ftp_proxy. E.g., http_proxy="http://localhost:8080/"
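A sketch of the proxy setup described above, using the placeholder address from the man page (the hxcopy invocation is shown but commented out, since it needs a reachable server):

```shell
# Route hxcopy's remote fetches through a local proxy
export http_proxy="http://localhost:8080/"
export ftp_proxy="http://localhost:8080/"
# hxcopy -o http://example.org/foo.html http://example.org/dir/foo.html local.html
echo "$http_proxy"
```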
BUGS
Unlike the last argument of cp(1), the last argument of hxcopy must be a file, not a directory.
The second argument must be a local file. Writing to a URL is not yet implemented. To work around this, replace hxcopy file.html
http://example.org/file.html by hxcopy -o http://example.org/file.html file.html tmp.html and then upload tmp.html to the given URL with
some other command, such as curl(1). The first argument, however, may be a URL. hxcopy will download the given file. (Currently only HTTP
is supported.)
EXAMPLE
Assume the HTML file foo.html contains a relative link to "../bar.html". Here are some examples of commands:
hxcopy foo.html bar/foo.html
The file foo.html is copied to bar/foo.html and the relative link to "../bar.html" becomes "../../bar.html".
hxcopy foo.html ../foo.html
The file foo.html is copied to ../foo.html and the relative link to "../bar.html" is rewritten as "bar.html".
hxcopy -i http://my.org/dir1/foo.html -o http://my.org/foo.html file1.html file2.html
The file file1.html is copied to file2.html and the relative link to "../bar.html" is rewritten as "bar.html". A command like this
may be useful to update files that are later uploaded to a server.
SEE ALSO
cp(1), curl(1), hxwls(1)
6.x 9 Dec 2008 HXCOPY(1)