Top Forums > Shell Programming and Scripting > help pulling ${VARS} out of a web page using curl
Post 302683919 by agama on Wednesday 8th of August 2012, 11:32:18 PM
I assume you wanted the values in shell variables, not awk variables, since you used a dollar-sign prefix in your example.

The first example works only in ksh, and it is the cleaner of the two if you're willing to use ksh: ksh runs the last command of a pipeline in the current shell, so the trailing read can set variables that the rest of the script sees. The second example works in bash, where that read would run in a subshell, so it needs a small hack: awk writes the variable assignments to a temporary file that the script then sources. I don't regularly use bash, so maybe there is a better/easier way, but I'm not finding it in my brain tonight.
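
A tiny way to see the difference for yourself (just an illustration, not part of the scripts below):

Code:
echo hello | read x
echo "x=$x"     # ksh prints x=hello; bash prints x= because the read ran in a subshell

With that in mind, the ksh version: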

Code:
#!/usr/bin/env ksh

url="http://www.everymac.com/systems/apple/macbook_pro/specs/macbook-pro-core-2-duo-2.8-aluminum-17-mid-2009-unibody-specs.html"

# strip the HTML tags and carriage returns so each text fragment lands on its own line,
# then grab the first non-blank line that follows each label of interest
curl --silent "$url" | awk '{ gsub( "<[^>]*>", "\n" ); gsub( "\r", "" ); print; }' | awk '
    /Standard RAM/      { snarf = 1; what = "sram"; next; }
    /Maximum RAM:/      { snarf = 1; what = "mram"; next; }
    /Apple Model No:/   { snarf = 1; what = "modeln"; next; }
    /Model ID:/         { snarf = 1; what = "modeli"; next; }

    # the next non-blank line after a label is that label's value
    NF > 0 && snarf { v[what] = $0; snarf = 0; next; }

    # print the four values as one comma-separated record; in ksh the read at the end
    # of the pipeline runs in the current shell, so the variables stick around
    END { printf( "%s,%s,%s,%s\n", v["sram"], v["mram"], v["modeln"], v["modeli"] ); } ' | IFS=, read var1 var2 var3 var4

echo "v1=$var1  v2=$var2  v3=$var3  v4=$var4"

Code:
#!/usr/bin/env bash
url="http://www.everymac.com/systems/apple/macbook_pro/specs/macbook-pro-core-2-duo-2.8-aluminum-17-mid-2009-unibody-specs.html"

# not as clean, but it works: have awk emit shell variable assignments,
# then source the generated file to pull them into this shell
curl --silent "$url" | awk '{ gsub( "<[^>]*>", "\n" ); gsub( "\r", "" ); print; }' | awk '
    /Standard RAM/      { snarf = 1; what = "sram"; next; }
    /Maximum RAM:/      { snarf = 1; what = "mram"; next; }
    /Apple Model No:/   { snarf = 1; what = "modeln"; next; }
    /Model ID:/         { snarf = 1; what = "modeli"; next; }

    NF > 0 && snarf { v[what] = $0; snarf = 0; next; }

    END { printf( "var1=\"%s\"\nvar2=\"%s\"\nvar3=\"%s\"\nvar4=\"%s\"\n", v["sram"], v["mram"], v["modeln"], v["modeli"] ); } ' >/tmp/hack.$$

. /tmp/hack.$$      # source the assignments awk just wrote
rm /tmp/hack.$$

echo "v1=$var1  v2=$var2  v3=$var3  v4=$var4"

Hope one of these helps.
 
