Thu, 25 Oct 2007 12:17:34 +0000
Some presentations I attended on the 1st day of BRForum prompted some interesting thoughts on the advantages of an EDA (event-driven architecture) approach to executing rules (aka production rules representing business rules).
First: an interesting presentation by AT&T on their (now) rule-driven, mainframe-based collections system for outstanding bill payments, with the minor caveat that the rules were split across a rule engine and a workflow engine. The benefit to companies like AT&T is that the rules (representing score models in this case) used to run the decision process can be more easily updated when they are externalized - this much is Comp Sci 101 (basic common sense) [*1]. The EDA takeaway from this session was that with a transaction-based or batch-based architecture, previous decisions cannot be updated automatically when the rules change. So if a score model is updated in a way that affects, say, 1% of your customers, you either need to find that 1% and resubmit them for rescoring, or resubmit the whole customer base for rescoring - assuming it is important for the rule change to be applied. An EDA approach using a stateful rules engine could compute the new scores automatically, and make the appropriate new decisions automatically.
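To make the contrast concrete, here is a minimal sketch (all names invented for illustration, not AT&T's actual system) of a stateful rules engine that retains customer facts in working memory, so deploying a new score model rescores every affected customer in place - no finding-and-resubmitting batch job:

```python
# Hypothetical sketch of a stateful scoring engine: facts (customers) stay
# resident, so a rule change automatically re-evaluates all of them.

class StatefulScoringEngine:
    def __init__(self, score_rule):
        self.score_rule = score_rule   # callable: customer attrs -> score
        self.customers = {}            # retained facts: id -> attributes
        self.scores = {}               # current decisions

    def assert_customer(self, cust_id, attrs):
        """Add or update a customer fact and score it immediately."""
        self.customers[cust_id] = attrs
        self.scores[cust_id] = self.score_rule(attrs)

    def update_rule(self, new_rule):
        """Deploy a new score model; rescore all retained facts in place."""
        self.score_rule = new_rule
        for cust_id, attrs in self.customers.items():
            self.scores[cust_id] = new_rule(attrs)

engine = StatefulScoringEngine(lambda c: 700 - 50 * c["missed_payments"])
engine.assert_customer("A1", {"missed_payments": 2})
print(engine.scores["A1"])  # 600 under the old model

engine.update_rule(lambda c: 700 - 80 * c["missed_payments"])
print(engine.scores["A1"])  # 540 - new model applied automatically
```

In a batch architecture, the `update_rule` step would instead require resubmitting customers through the scoring pipeline; here the retained state makes the rescoring a side effect of the rule change itself.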
Second, there was a presentation on BI 2.0. Although the Wikipedia definition of BI 2.0 seems to talk about analysis of real-time data, the definition here was about SOA “enabling” intelligence to be added to processes, which seems a bit vague [*2]. Nonetheless, the idea of operational BI being about making smart decisions from business events makes perfect sense, and is indeed very familiar to us at TIBCO. The differentiation given between BI 2.0 and CEP was that BI 2.0 is for the business user whereas CEP is for the IT developer, and that BI 2.0 is metrics-based - however, there are certainly business-user interfaces for aspects of CEP, and there are certainly metrics you can model in CEP. So maybe a better way of thinking about this is that some CEP tools can be used to build BI 2.0 applications. In any case, the CEP takeaway from this presentation was that processes can only be smart if they know when to invoke their decision rules - a problem that goes away in an EDA / CEP environment.
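The "knowing when to invoke" point can be sketched in a few lines (a hypothetical illustration, not any particular product's API): in an event-driven setup, decision rules are bound to event subscriptions, so the arrival of the event is what invokes the rule - the process never has to poll or schedule the decision:

```python
# Illustrative sketch: decision rules subscribed to event types.
# Publishing an event runs every rule bound to it.

from collections import defaultdict

class EventBus:
    def __init__(self):
        self.rules = defaultdict(list)   # event type -> decision rules

    def on(self, event_type, rule):
        """Bind a decision rule to an event type."""
        self.rules[event_type].append(rule)

    def publish(self, event_type, payload):
        """Deliver an event; rules fire as a consequence of arrival."""
        return [rule(payload) for rule in self.rules[event_type]]

bus = EventBus()
bus.on("order_placed", lambda e: "review" if e["amount"] > 1000 else "approve")

decisions = bus.publish("order_placed", {"amount": 1500})
print(decisions)  # ['review']
```

The design choice is that the "when" question is answered structurally by the subscription, rather than procedurally by the process logic.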
Third was the presentation on Predictive Models and Rules Management in Insurance, from Deloitte Consulting. This was an interesting introduction to why and how the insurance business uses IT to differentiate itself, with conventional analytics creating score models for insurance underwriting. The CEP takeaway here was the mention of the future being self-correcting models / rules, which is certainly an application for event-driven rule engines.
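A minimal sketch of what "self-correcting" could mean here (entirely hypothetical - the presentation did not describe an implementation): an underwriting rule whose threshold is nudged by outcome events, so the rule adjusts itself from the event stream rather than waiting for a manual re-fit of the score model:

```python
# Hypothetical self-correcting underwriting rule: claim-outcome events
# feed back into the rule's threshold.

class SelfCorrectingRule:
    def __init__(self, threshold=0.5, learning_rate=0.05):
        self.threshold = threshold
        self.learning_rate = learning_rate

    def decide(self, risk_score):
        return "decline" if risk_score > self.threshold else "accept"

    def feedback(self, risk_score, claimed):
        """Outcome event: tighten the threshold after a missed claim,
        relax it after declining a risk that never claimed."""
        if claimed and self.decide(risk_score) == "accept":
            self.threshold -= self.learning_rate   # rule was too lenient
        elif not claimed and self.decide(risk_score) == "decline":
            self.threshold += self.learning_rate   # rule was too strict

rule = SelfCorrectingRule()
rule.feedback(0.6, claimed=False)   # declined at 0.6, but no claim arrived
print(round(rule.threshold, 2))     # 0.55 - the rule relaxed itself
```

A real system would use a proper statistical update rather than this fixed nudge, but the event-driven shape - decisions and outcomes flowing through the same engine - is the point.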
- - -
[*1] Not sure if this is yet taught in Comp Sci classes, but it should be. The term “101” implies something like “the 1st class in year 1” in US school nomenclature.
[*2] SOA enables applications to be componentized to encourage re-use of said components. So this allows “intelligence” to be a service, perhaps. But it's not a requirement. For example, a CEP system can embed business rules / “intelligence” without being part of an SOA system.