How to use cURL to download a web page with authentication (form)?
Post 302978816 by bakunin on Thursday 4th of August 2016, 11:15:10 AM
I haven't tried that with cURL, but I have used wget (a similar tool) to achieve roughly the same thing.

Here is a test script I once wrote for a bot that edits wiki pages. It is not exactly a solution for your problem, but it shows how things work and how you can create and reuse persistent login data (session cookies and tokens) across several calls of wget:

Code:
#! /bin/ksh93

typeset WGET=$(which wget)
typeset chWikiInst="my_wiki"
typeset chWikiURL="http://my_system/wiki/api.php"
typeset fWorkDir="/home/bakunin/projects/wiki/work"
typeset fOut="$fWorkDir/outfile"
typeset fTok="$fWorkDir/token"
typeset chUser="BotUser"
typeset chPwd="UserBot"
typeset chToken=""
typeset chSessionID=""
typeset chEditToken=""

rm "${fOut}*"
rm "${fTok}*"

# ----------------------- login1 --------------------
$WGET --post-data "action=login&lgname=${chUser}&lgpassword=${chPwd}&format=xml" \
      --save-cookies="${fTok}.login1" \
      --output-document="${fOut}.login1" \
      --keep-session-cookies \
      -q \
      "$chWikiURL"

                                                       # extract info
chToken="$(sed 's/.*\ token="\([^"]*\)".*/\1/' "${fOut}.login1")"
chSessionID="$(sed 's/.*\ sessionid="\([^"]*\)".*/\1/' "${fOut}.login1")"

print - "sessionID: $chSessionID \t Token: $chToken"

# ----------------------- confirm token --------------------
$WGET --post-data "action=login&lgname=${chUser}&lgpassword=${chPwd}&lgtoken=${chToken}&format=xml" \
      --load-cookies="${fTok}.login1" \
      --save-cookies="${fTok}.login2" \
      --output-document="${fOut}.login2" \
      --keep-session-cookies \
      -q \
      "$chWikiURL"

# ----------------------- get edit token --------------------
$WGET --post-data "action=tokens&type=edit&format=xml" \
      --load-cookies="${fTok}.login2" \
      --save-cookies="${fTok}.edit" \
      --output-document="${fOut}.edit" \
      --keep-session-cookies \
      -q \
      "$chWikiURL"

                                                       # extract info
chEditToken="$(sed 's/.*\ edittoken="\([^"]*\)+\\".*/\1/' "${fOut}.edit")"
                                                       # pseudo-URL-encode trailing "+\"
chEditToken="${chEditToken}%2B%5C"
print - "sessionID: $chSessionID\nToken....: $chToken\nEditToken: $chEditToken"

# ----------------------- create new page --------------------
$WGET --post-data "action=edit&title=MyTestPage&contentformat=text/x-wiki&format=xml&text='Hello-World'&token=${chEditToken}" \
      --load-cookies="${fTok}.edit" \
      --save-cookies="${fTok}.create" \
      --output-document="${fOut}.create" \
      --keep-session-cookies \
      -q \
      "$chWikiURL"

# ----------------------- logout -------------------
$WGET --post-data "action=logout&format=xml" \
      --load-cookies="${fTok}.edit" \
      --save-cookies="${fTok}.sessionend" \
      --output-document="${fOut}.sessionend" \
      "$chWikiURL"
exit 0
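
For the cURL side of your question the same idea carries over almost one-to-one: curl's -c/--cookie-jar and -b/--cookie options correspond to wget's --save-cookies and --load-cookies, and --data/--data-urlencode corresponds to --post-data. Below is a minimal, untested sketch of a form login followed by the download of a protected page. The URLs and the form field names ("user", "password") are placeholders; take the real ones from the login form's HTML (the <form action=...> and <input name=...> attributes):

Code:
#! /bin/ksh93
# minimal sketch only: URLs and form field names are placeholders

typeset CURL=$(which curl)
typeset chLoginURL="https://my_system/login.php"       # the form's action URL (placeholder)
typeset chPageURL="https://my_system/protected/page"   # the page you actually want (placeholder)
typeset fCookie="/home/bakunin/projects/wiki/work/cookies.txt"
typeset chUser="BotUser"
typeset chPwd="UserBot"

# 1) POST the login form; -c writes the session cookie(s) to a cookie jar,
#    -L follows the redirect most login forms answer with
$CURL -s -L -o /dev/null \
      -c "$fCookie" \
      --data-urlencode "user=${chUser}" \
      --data-urlencode "password=${chPwd}" \
      "$chLoginURL"

# 2) re-use the cookie jar (-b) for every subsequent request
$CURL -s -L -b "$fCookie" -c "$fCookie" -o "page.html" "$chPageURL"

exit 0

If the login form also contains a hidden token field (as the wiki API above does), GET the login page first, extract the token with sed and send it along as one more --data-urlencode field.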

I hope this helps.

bakunin