On research and practice in event processing



2008-08-29T10:40:00.005+03:00

Triggered by a question from Hans Glide on a previous posting, today's topic is the relationship between research and practice in event processing. I will not go back to the ancient history of the event processing area and its ancestors, such as simulation and active databases, but start from the mid-to-late 1990s, when the idea of building generic languages, tools, and engines for event processing emerged. This area originated in the research community: David Luckham and his team at Stanford did the Rapide project; Mani Chandy and his team at Caltech did the Infospheres project; John Bates was a faculty member at Cambridge University, and Apama was a continuation of his academic work; my own contribution was establishing the AMIT project in the IBM Haifa Research Lab, which is also part of the research community (kind of).

On the "stream processing" front there were various academic projects as well: the STREAM project at Stanford and the Aurora project at Brown/MIT/Brandeis are just examples, and there were more. The interesting observation is that the research projects were there before the commercial implementations; furthermore, many of the commercial implementations were descendants of academic projects. For example: iSpheres was a descendant of Infospheres, Apama was a descendant of John Bates' work, StreamBase was a descendant of Aurora, and Coral8 was a descendant of the Stanford STREAM project; there are probably more.

However, once commercial products are introduced, the world changes, and there is a danger of a disconnect between the research community and the commercial world. Products have a life of their own and are developed in various directions, while people in the research community, in many cases, continue by inertia to work on topics that may not be consistent with the problems that vendors and customers see.

While blue-sky research is essential to breakthroughs, reality provides plenty of research topics that were not anticipated in the lab, and some synchronization is needed to keep research relevant.

The Dagstuhl seminar in May 2007, where people from academia and industry met for five days and discussed this issue, was one step. My friend Rainer von Ammon organizes periodic meetings on these issues, and a European project may spin off from them. We shall also discuss this topic in the EPTS symposium: more than 20 of our members are part of the research community, and many of them will participate in the meeting.



Bottom line: the life cycle is --



1. Ideas start in the research community.

2. At some point the commercial world catches up.

3. Parallel directions: research continues, while commercial products evolve in their own way.

4. Synchronization: exchange of knowledge, with ideas flowing in both directions -- this needs guidance.



More - later.


