Having been VoltDB partners for several years, we at Full 360 have sometimes found it difficult to articulate the value of the product to customers and prospects. The performance benefit of an in-memory SQL database is pretty easy to understand because the result is fast query times. This definitely has value, but it only scratches the surface.
In a discussion with one of the regional reps for VoltDB… I had the a-ha moment that I needed. It comes down to the value of a single point of data.
At the time a single point of data is generated, it has immense value because there is the potential for a real-time reaction to the event. Perhaps the event is a credit card transaction that needs to be reviewed for fraud potential. The faster the turnaround on a potential fraud event, the better the likelihood of a happy outcome for the cardholder. As time goes on… the value of that single point of data diminishes. Fraud analysis that comes 3 hours after the event is still valuable… but not as valuable as catching it in the moment as the card could have been abused many times by that point.
Of course that is not to say that the value of a single point of data exists only at the time it is created. What makes for long-term value isn’t the individual event… but the collection and aggregation of data points in your data warehouse. Large amounts of data allow you to perform after-the-fact analysis to glean wisdom and insight about your enterprise. This is the realm of big data, analytics, and business intelligence.
The above picture sums it up. Single points of data are most useful right after they are created, and also as they aggregate into your data warehouse.
You need the wisdom of your analytics to guide the real-time reactions to incoming events. Knowing what to look for is the most important factor in the fraud detection scenario outlined above, and this comes from your analytics. A tool like VoltDB allows you to codify what you have learned from your data, and run the analysis in real-time, at scale.
I believe that an analytics-first approach is necessary for a VoltDB implementation that goes deeper than the “fast SQL” aspect. If you don’t know the wisdom in your data, you aren’t going to be able to apply it in real time. Understanding this is the first step. Once you know what you are looking for, based on your analytics, VoltDB enables you to react to events with Java stored procedures executed within the context of the in-memory data store, allowing your business logic to be applied at scale to incoming events.
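To make the fraud scenario concrete, here is a minimal sketch of the kind of rule you might codify once your analytics reveal a pattern. The velocity threshold and field names are hypothetical, chosen for illustration; in an actual VoltDB deployment this logic would live inside a Java stored procedure (a class extending `VoltProcedure`) so that it executes next to the in-memory data rather than in a separate application tier.

```java
// Hypothetical velocity rule derived from after-the-fact analytics:
// flag a card if it records `maxTxns` or more transactions inside a
// `windowMillis` sliding window. In VoltDB, this check would sit inside
// a stored procedure's run() method, fed by a SQLStmt against recent rows.
public class FraudRule {

    /**
     * @param txnTimestamps transaction times for one card, in epoch millis,
     *                      sorted ascending; the last entry is the new event
     * @param windowMillis  lookback window learned from analytics (hypothetical)
     * @param maxTxns       transaction-count threshold (hypothetical)
     * @return true if the newest transaction should be flagged for review
     */
    public static boolean isSuspicious(long[] txnTimestamps,
                                       long windowMillis,
                                       int maxTxns) {
        long latest = txnTimestamps[txnTimestamps.length - 1];
        int count = 0;
        for (long t : txnTimestamps) {
            // count every transaction that falls within the window
            // ending at the newest event
            if (latest - t <= windowMillis) {
                count++;
            }
        }
        return count >= maxTxns;
    }
}
```

The point is not the rule itself… it’s that the threshold comes from your warehouse analytics, and the in-memory execution is what lets you apply it to every incoming event in the moment rather than three hours later.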