Tim Bass
11-25-2008 01:02 PM
Just as I was starting to worry that the complex event processing community had been captured by RDBMS pirates off the coast of Somalia, I discovered a new core blackboard architecture component: Hadoop.
Hadoop is a framework for building applications on large commodity clusters while transparently providing applications with both reliability and data motion. Hadoop implements Map/Reduce, where an application is divided into many small units of work, each of which may be executed or re-executed on any node in the cluster.
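To make the model concrete, here is a minimal, single-machine sketch of the Map/Reduce idea described above (a toy word count, not Hadoop itself); the function names are illustrative:

```python
from collections import defaultdict

# Toy illustration of the Map/Reduce model: work is split into small
# map tasks whose outputs are grouped by key ("shuffled") and handed to
# reduce tasks. On Hadoop, each task could run -- or re-run after a
# failure -- on any node in the cluster.

def map_task(line):
    # Emit (key, value) pairs; here, one count per word.
    return [(word, 1) for word in line.split()]

def reduce_task(key, values):
    # Combine all values emitted for a single key.
    return (key, sum(values))

def map_reduce(lines):
    # Shuffle phase: group mapped values by key.
    grouped = defaultdict(list)
    for line in lines:
        for key, value in map_task(line):
            grouped[key].append(value)
    # Reduce phase: one reduce call per key.
    return dict(reduce_task(k, v) for k, v in grouped.items())

counts = map_reduce(["hadoop runs map reduce", "map reduce scales"])
print(counts)  # word counts merged across both "map tasks"
```

Because each `map_task` call is independent, the work can be farmed out across a cluster, which is exactly the property Hadoop exploits.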
There are a number of great articles on implementing Hadoop in the Amazon Elastic Compute Cloud (EC2), including this one: Running Hadoop MapReduce on Amazon EC2 and Amazon S3. Hadoop provides the core component that permits a distributed agent-based architecture to become a manageable, simple-to-use service. This, in turn, provides a framework, as a service, for solving complex distributed computing problems.
Another good article to read is Taking Massive Distributed Computing to the Common Man - Hadoop on Amazon EC2/S3. There is also a nice article on Amazon EC2 on the Hadoop Wiki.
It is interesting to note that if you Google around, you will find that the same RDBMS folks who have pirated the term “complex event processing” are some of the most vocal Hadoop critics. On further reading, however, you will see that most of the critical comments by the RDBMS crowd have been answered. It is very interesting to see the same debate playing out in the MapReduce community as in the CEP community; the difference, of course, is that the MapReduce community is much larger.
However, there should be no doubt in anyone's mind that MapReduce, and the Hadoop implementation of it, provide a way to accomplish CEP. It is very refreshing to see this emerging CEP architecture on the rise.
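One way to picture MapReduce doing CEP-style work is event aggregation and pattern detection: key each event by type and time window, then reduce by summing. The event fields and the failure threshold below are purely illustrative assumptions, not taken from any particular CEP product:

```python
from collections import defaultdict

# Hedged sketch: a CEP-style aggregation (events per type, per minute)
# expressed in the MapReduce style. Event schema and the threshold of 2
# are invented for illustration.

events = [
    {"type": "login_failed", "minute": 0},
    {"type": "login_failed", "minute": 0},
    {"type": "login_ok",     "minute": 0},
    {"type": "login_failed", "minute": 1},
]

def map_event(event):
    # Key each event by (type, time window) so counts group naturally.
    return ((event["type"], event["minute"]), 1)

grouped = defaultdict(int)
for e in events:
    key, value = map_event(e)
    grouped[key] += value  # reduce step: sum counts per key

# Detect a simple pattern: repeated failures inside one window.
alerts = [k for k, n in grouped.items()
          if k[0] == "login_failed" and n >= 2]
print(alerts)  # [('login_failed', 0)]
```

The map and reduce steps are embarrassingly parallel, so the same logic scales out over an event stream partitioned across a Hadoop cluster.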
Stay tuned for much more information related to MapReduce and CEP.