The Lounge > What is on Your Mind? > Baiduspider and Forum Performance Issues
Post 302993358 by Neo, Thursday 9 March 2017, 07:04:20 AM
Sadly, I have had to block a number of Baidu spider networks due to repeated problems.
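For anyone curious how that kind of block is typically done: a spider can be refused by user agent in robots.txt, but a bot that ignores it has to be dropped at the firewall. A minimal sketch; the CIDR ranges below are RFC 5737 documentation placeholders, not actual Baidu networks:

    # Drop traffic from specific spider networks at the firewall
    # (example ranges only -- substitute the networks seen in your access logs)
    iptables -A INPUT -s 203.0.113.0/24 -j DROP
    iptables -A INPUT -s 198.51.100.0/24 -j DROP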
9 More Discussions You Might Find Interesting

1. Post Here to Contact Site Administrators and Moderators

DOS issues forum.

Didn't we use to have a forum for unix/microsoft interoperability issues? Was it dropped intentionally? Or did it get nailed by the same bug that got the shell forum a little while ago? (0 Replies)
Discussion started by: Perderabo
0 Replies

2. Shell Programming and Scripting

Shell script performance issues -- urgent

I need help with awk, please help immediately. The function below is taking a lot of time; please help me fine-tune it so that it runs faster. The file has around 3 million records. # Process Body processbody() { #set -x while read line do ... (18 Replies)
Discussion started by: icefish
18 Replies
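A common cure for this kind of slowdown is to drop the per-line while read loop and let awk make a single pass over the file. A minimal sketch under that assumption; the field handling and file name are placeholders, since the original processbody() is truncated above:

    # Slow pattern: the shell forks work for each of the ~3 million lines
    processbody() {
        while read line
        do
            echo "$line" | awk -F'|' '{ print $1 }'
        done < input.txt
    }

    # Faster pattern: one awk process reads the whole file
    awk -F'|' '{ print $1 }' input.txt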

3. Solaris

raidctl performance issues

A mirror was created using raidctl on the two internal drives on hundreds of our servers. Sometimes when one drive fails we don't face any issue and we replace the drive without any problem, but sometimes when one drive fails the system becomes unresponsive and does not allow us to log in; the only way to... (1 Reply)
Discussion started by: skamal4u
1 Replies
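For context, the mirror in that thread is a hardware RAID 1 volume managed through raidctl; a minimal sketch of the usual commands, with example disk targets:

    # Create a hardware RAID 1 volume from the two internal disks (example targets)
    raidctl -c c0t0d0 c0t1d0

    # List RAID volumes and check the state of the mirror after a drive failure
    raidctl -l
    raidctl -l c0t0d0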

4. UNIX for Dummies Questions & Answers

Awk Performance Issues

Hi all, I'm facing an issue in my awk script. The script processes a large text file containing the details of a number of persons, each person's details being written in tags 100 to 250 as given below: 100 START| 101klklk| ... 245 opr| 246 55| 250 END| 100 START| ... 245 pp| 246... (4 Replies)
Discussion started by: pgp_acc1
4 Replies
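Records framed by "100 START|" and "250 END|" markers, as in the sample above, are usually collected with a small state machine in awk and processed when the end tag arrives. A minimal sketch under that assumption; the input file name is a placeholder and the action at the end marker is only a placeholder count:

    awk '
        /^100 START/ { n = 0 }      # a new person record begins
        { rec[++n] = $0 }           # collect each tag line of the record
        /^250 END/ {                # record complete: process it here
            printf "record with %d tags\n", n
        }
    ' persons.txt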

5. Programming

Performance issues of calling a function in an if condition

Hi, I have written a program in C and have to test the return value of the functions. The normal way of doing this would be: int rc; rc = myfunction(input); if (rc == TRUE) { } else { } But instead of doing this I have called the function in the if() condition. Does this have any... (2 Replies)
Discussion started by: sidmania
2 Replies

6. AIX

Performance issues for LPAR with GPFS 3.4

Hi, we have GPFS 3.4 installed on two AIX 6.1 nodes. We have three GPFS mount points: /abc01 4 TB (14 x 300 GB disks from XIV SAN), /abc02 4 TB (14 x 300 GB disks from XIV SAN), /abc03 1 TB (multiple 300 GB disks from XIV SAN). Now these 40... (1 Reply)
Discussion started by: aixromeo
1 Replies
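When chasing GPFS performance on a layout like that, the usual first step is to dump the file system and cluster parameters so block size, pagepool and the NSD layout can be checked against the workload. A minimal sketch with standard GPFS commands, assuming the device names match the mount points in the thread:

    # File system parameters (block size, replication, ...) for one file system
    mmlsfs abc01

    # Cluster-wide tuning parameters such as pagepool
    mmlsconfig

    # Which NSDs (the XIV SAN disks) back each file system
    mmlsnsd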

7. Solaris

zfs send/receive performance issues

I'm trying to clone a ZFS file system pool/u01 to a new file system called newpool/u01 using the following commands: zfs list; zfs snapshot pool/u01@new; zfs send pool/u01@new | zfs -F receive newpool/u01. It's a 100G file system snapshot, copied to the same server on a different pool, and... (9 Replies)
Discussion started by: fugitive
9 Replies
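One detail worth noting in the commands quoted above: the -F flag is given to zfs itself, whereas the documented place for it is after the receive subcommand. A minimal sketch of how the clone is normally written, keeping the pool and snapshot names from the thread:

    zfs snapshot pool/u01@new
    zfs send pool/u01@new | zfs receive -F newpool/u01

    # Watch pool throughput while the stream runs, to see whether the send
    # side or the receive side is the bottleneck
    zpool iostat 5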

8. Solaris

Getcwd performance issues

Hello everyone, recently we have been experiencing performance issues with chmod. We managed to narrow it down to getcwd. The following folder exists: /Folder1/subfol1/subfol2/subfol3 cd /Folder1/subfol1/subfol2/subfol3 truss -D pwd 2>&1 | grep getcwd 0.0001... (4 Replies)
Discussion started by: KotekBury
4 Replies
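The truss invocation in that thread is the standard way to localise this: -D prefixes each traced call with a time delta, so a slow getcwd stands out immediately. A minimal sketch of the comparison, using the paths from the thread:

    # Time the system calls made by pwd inside the deep directory
    cd /Folder1/subfol1/subfol2/subfol3
    truss -D pwd 2>&1 | grep getcwd

    # Repeat from a shallow directory to confirm the depth (or a slow
    # directory lookup on the way up) is the factor
    cd /
    truss -D pwd 2>&1 | grep getcwd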

9. AIX

AIX 6.1 Memory Performance issues

Good day everyone, has anyone encountered AIX 6.1 memory performance issues? In my current scenario we have 3 DataStage servers (segregating server and EE jobs, for those who know the DataStage architecture) and 2 DB servers (running HA to load-balance 4 node partitions for... (3 Replies)
Discussion started by: ckwan
3 Replies
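For AIX 6.1 memory questions like that one, the usual starting point is the VMM statistics rather than the application layer. A minimal sketch of the commonly used commands:

    # Virtual memory manager statistics, including file-cache and paging activity
    vmstat -v

    # Global memory snapshot: working vs. persistent vs. client pages
    svmon -G

    # Paging space utilisation
    lsps -a
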
SCRAPY(1)                          General Commands Manual                          SCRAPY(1)

NAME
       scrapy - the Scrapy command-line tool

SYNOPSIS
       scrapy [command] [OPTIONS] ...

DESCRIPTION
       Scrapy is controlled through the scrapy command-line tool. The script provides
       several commands, for different purposes. Each command supports its own particular
       syntax. In other words, each command supports a different set of arguments and
       options.

OPTIONS
       fetch [OPTION] URL
              Fetch a URL using the Scrapy downloader
              --headers              Print response HTTP headers instead of body

       runspider [OPTION] spiderfile
              Run a spider
              --output=FILE          Store scraped items to FILE in XML format

       settings [OPTION]
              Query Scrapy settings
              --get=SETTING          Print raw setting value
              --getbool=SETTING      Print setting value, interpreted as a boolean
              --getint=SETTING       Print setting value, interpreted as an integer
              --getfloat=SETTING     Print setting value, interpreted as a float
              --getlist=SETTING      Print setting value, interpreted as a list
              --init                 Print initial setting value (before loading
                                     extensions and spiders)

       shell URL | file
              Launch the interactive scraping console

       startproject projectname
              Create new project with an initial project template

       --help, -h
              Print command help and options

       --logfile=FILE
              Log file. If omitted, stderr will be used

       --loglevel=LEVEL, -L LEVEL
              Log level (default: None)

       --nolog
              Disable logging completely

       --spider=SPIDER
              Always use this spider when arguments are URLs

       --profile=FILE
              Write Python cProfile stats to FILE

       --lsprof=FILE
              Write lsprof profiling stats to FILE

       --pidfile=FILE
              Write process ID to FILE

       --set=NAME=VALUE, -s NAME=VALUE
              Set/override setting (may be repeated)

AUTHOR
       Scrapy was written by the Scrapy Developers <scrapy-developers@googlegroups.com>.
       This manual page was written by Ignace Mouzannar <mouzannar@gmail.com>, for the
       Debian project (but may be used by others).

October 17, 2009                                                                     SCRAPY(1)
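A few usage examples for the commands documented above, as typed at a shell prompt; the URL, spider file, setting name and project name are placeholders:

    # Fetch a page with the Scrapy downloader and print only the response headers
    scrapy fetch --headers http://example.com/

    # Run a standalone spider and store the scraped items in XML
    scrapy runspider myspider.py --output=items.xml

    # Query a setting and create a new project skeleton
    scrapy settings --get=BOT_NAME
    scrapy startproject myproject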
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.