If you’ve done any cooking, you’ve probably experienced what it’s like to forget an ingredient or, worse, add the wrong one. It’s a disappointing experience because there isn’t much you can do other than live with the bad-tasting food or throw it away.
Well, it turns out that precalculated Business Intelligence works the same way. When we commit data to a finalized calculation in a cube, we are effectively “baking” that data into a final form. This carries both good and bad consequences. On the good side, precalculated BI generates results that are “ready to eat,” meaning they appear nearly instantaneously. Another advantage of pre-calculation is the ability to create interactivity that doesn’t require refreshes.
On the bad side, pre-calculation typically requires data to be processed on a schedule, in advance. This means that if the correct ingredients are not included, users are basically in the same boat as our “bad food” example. This can be frustrating for business analysts who want to explore other perspectives and queries, pushing them to start their own rogue data-gathering efforts.
Perhaps more vexing is the fact that pre-calculation typically requires you to move the data. That isn’t a big issue when you’re dealing with small data sets. With large data sets, however, moving data around is something you want to minimize.
There’s a good compromise that balances the need for analytical flexibility without tying you down to a precalculated cube: a Metadata Layer, also called a Semantic Layer. This is essentially a SQL generator: it translates the data elements users pick into the SQL required to retrieve those elements from the database. The metadata layer doesn’t magically generate itself; rather, it gets modeled against a database ahead of time. So the body of queries the metadata layer can generate will only run as fast as the database it’s pointing to.

That gets to the second important component of this architecture: a data warehouse. The data warehouse is designed to make queries simple and to scalably deliver mass quantities of data. Advances in data warehousing technologies and techniques have made it possible to deliver consistently fast queries without having to pre-calculate the analytics ahead of time. Additionally, the latest cloud architectures enable us to leave the data in cheap storage while delivering massive speed and delivery enhancements. This means queries can be cooked on the fly, like cooking with a microwave, whenever users want to investigate data. This powerful combination leaves the door open for a very broad landscape of queries while still being highly performant.
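To make the “SQL generator” idea concrete, here is a minimal sketch of how a semantic layer can work. The table names, column names, and join keys below are entirely hypothetical, and real products (Looker, Business Objects, AtScale, etc.) are far more sophisticated; the point is simply that business-friendly field names are modeled against the warehouse schema ahead of time, and SQL is generated from whichever fields a user picks.

```python
# Toy semantic-layer sketch: business terms are mapped to warehouse
# tables/columns in a metadata model built ahead of time, then SQL is
# generated on the fly from the user's selections.
# All schema names here are hypothetical.

METADATA = {
    "Customer Name": {"table": "dim_customer", "column": "customer_name"},
    "Order Date":    {"table": "fact_orders",  "column": "order_date"},
    "Revenue":       {"table": "fact_orders",  "column": "revenue",
                      "aggregate": "SUM"},
}

# Join conditions modeled between warehouse tables (hypothetical keys).
JOINS = {
    ("fact_orders", "dim_customer"):
        "fact_orders.customer_id = dim_customer.customer_id",
}

def generate_sql(fields):
    """Translate the data elements a user picked into a SQL query."""
    select_parts, group_parts, tables = [], [], []
    has_aggregate = False
    for field in fields:
        meta = METADATA[field]
        expr = f"{meta['table']}.{meta['column']}"
        if meta["table"] not in tables:
            tables.append(meta["table"])
        if "aggregate" in meta:
            has_aggregate = True
            select_parts.append(f'{meta["aggregate"]}({expr}) AS "{field}"')
        else:
            select_parts.append(f'{expr} AS "{field}"')
            group_parts.append(expr)      # non-aggregated fields group the result
    sql = "SELECT " + ", ".join(select_parts) + " FROM " + tables[0]
    for table in tables[1:]:
        # Look up the modeled join condition in either key order.
        cond = JOINS.get((tables[0], table)) or JOINS.get((table, tables[0]))
        sql += f" JOIN {table} ON {cond}"
    if has_aggregate and group_parts:
        sql += " GROUP BY " + ", ".join(group_parts)
    return sql
```

Calling `generate_sql(["Customer Name", "Revenue"])` would produce a `SELECT` with a `SUM` on revenue, the modeled join, and a `GROUP BY` on the customer name, all without the user writing any SQL. This is why the layer is only as fast as the warehouse behind it: every user selection becomes a live query against the database.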
To help companies transition to this new paradigm, Intricity has put together a low risk engagement which helps you draw out your current landscape, then bridge it to the future cloud landscape. Then it spells out the steps to making this move in a Strategic Roadmap. I’ve written a short engagement synopsis that you can take a look at if you click on the link here. I’ve also included the link in the description below. And of course, you can always visit our website and register to talk with a specialist.