Please tell me how to stop or limit an IP that downloads files more than 10,000 times


# 1  
Question: Please tell me how to stop or limit an IP that downloads files more than 10,000 times

Please tell me how to stop or limit an IP that downloads .rm and .mp3 files more than 10,000 times ...
I have two cases:
1. HTTP status code 206 (Partial Content), up to 20 requests/second, transferring 5 GB or more
2. HTTP status code 206, up to 20 requests/second, but transferring less than 0.5 GB

So far I have used DDoS protection together with mod_evasive (mod_evasive20.so):
Code:
<IfModule mod_evasive20.c>
    # size of the per-child hash table used to track clients
    DOSHashTableSize 3097
    # max requests for the same URI per DOSPageInterval (seconds)
    DOSPageCount 2
    DOSPageInterval 10
    # max requests for any object on the site per DOSSiteInterval (seconds)
    DOSSiteCount 4
    DOSSiteInterval 10
    # seconds an offending IP stays blocked (it receives 403s)
    DOSBlockingPeriod 300
</IfModule>
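
The mod_evasive thresholds above count requests per interval; they do not look at transfer size or at which file types are requested. For capping simultaneous .rm/.mp3 downloads per client IP, one option (not in the original post) is the third-party mod_limitipconn module. A minimal sketch, assuming that module is installed and that mod_status with ExtendedStatus is enabled as its documentation requires; the MIME-type prefixes are illustrative and would need to match what the server actually sends for .rm and .mp3:
Code:
# sketch only: requires the third-party mod_limitipconn module
ExtendedStatus On
<IfModule mod_limitipconn.c>
    <Location />
        # at most 2 simultaneous connections per client IP ...
        MaxConnPerIP 2
        # ... counted only for responses whose Content-Type starts with
        # one of these prefixes (adjust to the types served for .rm/.mp3)
        OnlyIPLimit audio video
    </Location>
</IfModule>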

Note: I am running Linux 2.6.24-26-server on x86_64.
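
Independently of Apache, a per-IP cap on simultaneous port-80 connections can also be enforced at the packet filter. A minimal sketch using the iptables connlimit match, assuming that match module is available for this 2.6.24 kernel; the limit of 20 mirrors the figure above, but note that connlimit counts concurrent connections, not requests per second or bytes transferred:
Code:
# reject new HTTP connections from any source IP that already has more
# than 20 connections open to port 80
iptables -I INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 20 -j REJECT --reject-with tcp-reset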
