05-03-2013
Scrape 10 million pages and save the raw HTML in a MySQL database
I have a list of 10 million page URLs. I want those pages scraped and saved in a MySQL database as raw HTML.
I own a Linux VPS server with 1GB RAM and WHM/cPanel.
I would like to scrape at least 100,000 URLs in 24 hours.
So can anyone give me some sample shell scripting code?
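As a starting point, here is a minimal sketch of such a script. The urls.txt input file, the `scrape` database, and the `pages(url VARCHAR(767) PRIMARY KEY, html MEDIUMBLOB)` table are illustrative assumptions, not from the post. Note that 100,000 URLs in 24 hours is only about 1.2 fetches per second, so even modest parallelism leaves headroom on a 1GB VPS:

```shell
#!/bin/bash
# Sketch: fetch each URL with wget and store the raw HTML in MySQL.
# Assumed (not from the post): urls.txt with one URL per line, and a
# table pages(url VARCHAR(767) PRIMARY KEY, html MEDIUMBLOB) in db `scrape`.

html_to_insert() {
  # reads raw HTML on stdin; hex-encoding sidesteps SQL quoting of arbitrary bytes
  hex=$(od -An -tx1 | tr -d ' \n')
  printf "INSERT INTO pages (url, html) VALUES ('%s', UNHEX('%s'));\n" "$1" "$hex"
}

fetch_one() {
  wget -q --timeout=30 --tries=2 -O - "$1" | html_to_insert "$1"
}

if [ -f urls.txt ]; then
  export -f html_to_insert fetch_one
  # 20 parallel fetchers (xargs -P), all feeding a single mysql client session;
  # many more wget processes than this risks swapping on a 1GB box
  xargs -n 1 -P 20 -I{} bash -c 'fetch_one "$1"' _ {} < urls.txt | mysql scrape
fi
```

The hex/UNHEX round-trip is a deliberate choice: it avoids escaping quotes inside HTML at the cost of roughly doubling the bytes sent to MySQL.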
8 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi,
Need to get help from you guys about this issue.
I need to insert data into a MySQL database from a text file located on another server.
The text file is something look like below:
Date | SubscriberNo | Call Duration
20/7/07 | 123456788 | 20
20/7/07 | 123412344 | 30
The... (4 Replies)
Discussion started by: shirleyeow
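For a pipe-delimited file like the sample above, one sketch is to turn each data line into an INSERT statement with awk and pipe the result to the mysql client. The `calls` table and `cdr` database names are illustrative assumptions:

```shell
# Sketch, assuming the "Date | SubscriberNo | Call Duration" layout shown above;
# the calls(call_date, subscriber, duration) table and `cdr` database are invented names.
to_sql() {
  # NR > 1 skips the header row; gsub trims the spaces around each field
  awk -F'|' 'NR > 1 {
    for (i = 1; i <= NF; i++) gsub(/^ +| +$/, "", $i)
    printf "INSERT INTO calls VALUES (STR_TO_DATE(\047%s\047, \047%%d/%%m/%%y\047), \047%s\047, %s);\n", $1, $2, $3
  }' "$1"
}

# to_sql calls.txt | mysql cdr   # run where the file and the database are reachable
```

STR_TO_DATE converts the `20/7/07` day/month/year form into a proper DATE column; for very large files, MySQL's LOAD DATA INFILE would be faster than row-by-row INSERTs.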
2. Shell Programming and Scripting
Hi,
I have a GPS receiver; using gpsd I can read the GPS log data into my database (MySQL).
Steps:
1. telnet localhost 2947 > gps.txt (press Enter)
2. r (press Enter) // then I get data like the following in the gps.txt file
Trying 127.0.0.1...
Connected to localhost.... (1 Reply)
Discussion started by: gudivada213
3. Web Development
Hi all,
I was wondering if anyone knew a good/safe way to update a single column in a table that could contain up to 8 million rows...
simple command like:
UPDATE <table> SET blah=foo WHERE bar=XXX;
I will be running this on tables being written to and tables that have already been created.
... (3 Replies)
Discussion started by: muay_tb
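One common approach for multi-million-row updates is to work in bounded primary-key ranges so each statement touches a small chunk and never holds a long transaction. This is a sketch assuming an integer primary key `id` running to about 8 million; `mytable` and the `bar=XXX` condition are placeholders echoing the pseudo-SQL in the question:

```shell
# Sketch: emit one bounded UPDATE per id range instead of a single 8M-row statement.
# `mytable`, `blah=foo`, and `bar=XXX` are placeholders from the question above;
# the integer `id` primary key is an assumption.
gen_updates() {
  chunk=10000   # rows per statement; tune against lock time and replication lag
  max=8000000   # assumed highest id
  start=1
  while [ "$start" -le "$max" ]; do
    end=$((start + chunk - 1))
    printf 'UPDATE mytable SET blah=foo WHERE bar=XXX AND id BETWEEN %d AND %d;\n' "$start" "$end"
    start=$((end + 1))
  done
}

# gen_updates | mysql mydb   # uncomment the pipe on the real server
```

Because each UPDATE commits separately, concurrent writers on the same table are blocked only for the duration of one small chunk rather than the whole scan.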
4. Shell Programming and Scripting
Can I download an HTML file via wget and pass it to MySQL to update a database field? (8 Replies)
Discussion started by: mapasainfo
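A sketch of the wget-to-MySQL step: hex-encode the downloaded page so arbitrary HTML bytes never need SQL escaping, then wrap the result in an UPDATE. The `site_cache` table and `webdb` database names are assumptions for illustration:

```shell
# Sketch, assuming a table site_cache(id INT PRIMARY KEY, html MEDIUMBLOB)
# in database `webdb`; both names are invented for this example.
html_to_update() {
  # reads raw HTML on stdin, emits an UPDATE for the given row id ($1)
  hex=$(od -An -tx1 | tr -d ' \n')
  printf "UPDATE site_cache SET html=UNHEX('%s') WHERE id=%d;\n" "$hex" "$1"
}

# wget -q -O - 'http://example.com/' | html_to_update 1 | mysql webdb
```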
5. Shell Programming and Scripting
Have to delete this long post. Seems nobody would spend time on it. (0 Replies)
Discussion started by: yifangt
6. Shell Programming and Scripting
Hi All
I would like to get a cars DB from this web site (link removed), and I am trying to write a script to parse the web page and save the data for each car (type of car, make, model) into a file. But when I look at the source page I find that the first car type is preselected and the... (3 Replies)
Discussion started by: molwiko
7. Shell Programming and Scripting
Sorry to disturb you; I would like to seek help on inserting data into my phpMyAdmin MySQL database from my shell script whenever the switch is on or off. I'm using a Raspberry Pi as my hardware and I have followed this link: instructables.com/id/Web-Control-of-Raspberry-Pi-GPIO/?ALLSTEPS to create my... (4 Replies)
Discussion started by: aoiregion
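A sketch of the insert step for a GPIO switch, assuming the sysfs GPIO interface with pin 17 already exported; the `sensor_log(ts DATETIME, state TINYINT)` table and `sensordb` database are illustrative names, not from the thread:

```shell
# Sketch: log the current state of GPIO pin 17 into MySQL.
# Pin number, table, and database names are assumptions for illustration.
log_state() {
  # $1 is the pin state (0 or 1) as read from sysfs
  printf 'INSERT INTO sensor_log VALUES (NOW(), %d);\n' "$1"
}

if [ -r /sys/class/gpio/gpio17/value ]; then
  log_state "$(cat /sys/class/gpio/gpio17/value)" | mysql sensordb
fi
```

Run from cron (or a loop with sleep), this records one row per sample; the same `log_state` output could equally be fed to PHP's mysqli layer from control.php.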
8. Programming
Hi guys, I would like to seek help on inserting data into my sensor MySQL database in phpMyAdmin from my control.php whenever the switch is on or off. I'm using a Raspberry Pi as my hardware and followed a few tutorials to create my own web control interface; it works perfectly without the insert method.... (1 Reply)
Discussion started by: aoiregion
LEARN ABOUT DEBIAN
scrapy
SCRAPY(1) General Commands Manual SCRAPY(1)
NAME
scrapy - the Scrapy command-line tool
SYNOPSIS
scrapy [command] [OPTIONS] ...
DESCRIPTION
Scrapy is controlled through the scrapy command-line tool. The script provides several commands, for different purposes. Each command supports its own particular syntax. In other words, each command supports a different set of arguments and options.
OPTIONS
fetch [OPTION] URL
Fetch a URL using the Scrapy downloader
--headers
Print response HTTP headers instead of body
runspider [OPTION] spiderfile
Run a spider
--output=FILE
Store scraped items to FILE in XML format
settings [OPTION]
Query Scrapy settings
--get=SETTING
Print raw setting value
--getbool=SETTING
Print setting value, interpreted as a boolean
--getint=SETTING
Print setting value, interpreted as an integer
--getfloat=SETTING
Print setting value, interpreted as a float
--getlist=SETTING
Print setting value, interpreted as a list
--init Print initial setting value (before loading extensions and spiders)
shell URL | file
Launch the interactive scraping console
startproject projectname
Create new project with an initial project template
--help, -h
Print command help and options
--logfile=FILE
Log file. If omitted, stderr will be used
--loglevel=LEVEL, -L LEVEL
Log level (default: None)
--nolog
Disable logging completely
--spider=SPIDER
Always use this spider when arguments are URLs
--profile=FILE
Write python cProfile stats to FILE
--lsprof=FILE
Write lsprof profiling stats to FILE
--pidfile=FILE
Write process ID to FILE
--set=NAME=VALUE, -s NAME=VALUE
Set/override setting (may be repeated)
AUTHOR
Scrapy was written by the Scrapy Developers <scrapy-developers@googlegroups.com>.
This manual page was written by Ignace Mouzannar <mouzannar@gmail.com>, for the Debian project (but may be used by others).
October 17, 2009 SCRAPY(1)