This is the second of a two-part series on the Oracle in-memory database option in Database 12c. This installment looks at potential licensing and hardware costs with the in-memory option. The first part introduced the feature and considerations for implementing it.
In-memory columnar technology is promising, so promising in fact that companies such as SAP, IBM, MemSQL and Microsoft have already implemented some form of it. How Oracle's implementation will compare remains to be seen. So does what the new feature will cost. The fact that it's referred to as an "option" is enough to suggest that some sort of fee will be attached. What that fee will be is the million-dollar question.
When Oracle introduced its online analytical processing (OLAP) option, for example, it came with a hefty price tag on top of already steep licensing fees for the database system. Yet the Oracle folks are well aware that SAP's in-memory platform, HANA, has been out for a couple of years and that Oracle's pricing could play a critical role in keeping its customers from jumping ship. Even so, all we have at this point is conjecture; there's no way of knowing what Oracle will do until the company tells us.
Planning the hardware for a BI platform
What we do know is that there will be costs involved, and not just from licensing. When the new option is enabled, the database engine copies all or some of the data to memory. Although memory prices have dropped substantially over the last decade, memory still isn't free, and unless you're running a system that already has the necessary capacity, you'll have to come up with additional funds. How much is yet to be seen, but current estimates for a terabyte (TB) of memory run between $5,000 and $9,200. To complicate matters, a fire this past September at a plant in China owned by SK Hynix, the world's second-largest producer of memory chips, disrupted inventories and drove prices up globally.
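To put those per-terabyte estimates in perspective, a quick back-of-the-envelope query (the $5,000 and $9,200 figures are the rough market estimates cited above, not Oracle pricing) shows how quickly the memory bill alone grows at in-memory scale:

```sql
-- Rough memory cost range at three capacities, using the
-- $5,000-$9,200 per TB market estimates cited in the text.
SELECT tb,
       tb * 5000 AS low_cost_usd,
       tb * 9200 AS high_cost_usd
  FROM (SELECT 1  AS tb FROM dual
        UNION ALL SELECT 8  FROM dual
        UNION ALL SELECT 32 FROM dual);
```

At the top end, 32 TB of memory by itself lands somewhere between $160,000 and $294,400, before you've bought a single server to put it in.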
But memory isn't the only consideration. Your servers must be able to handle the additional memory and to process both OLTP and analytical loads. Much, of course, will depend on how the In-Memory Option is actually implemented and what sort of BI workloads you try to run alongside your OLTP operations. Oracle CEO Larry Ellison suggests that the new option will incur little overhead in maintaining the columnstore data, but it's not clear whether he's referring to OLTP databases already supporting analytics. If you're adding analytics on top of an OLTP database, your system must be able to handle the additional processing overhead and throughput requirements, as well as the extra memory.
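If the option follows Oracle's usual configuration pattern of an instance-level memory parameter plus a per-table clause, enabling it might look something like the sketch below. This is speculative syntax for illustration only: Oracle had not published the actual DDL at the time of writing, so the parameter and clause names here are assumptions.

```sql
-- Hypothetical sketch: reserve a slice of server memory for the
-- columnar store (parameter name is an assumption, not published syntax).
ALTER SYSTEM SET inmemory_size = 100G SCOPE=SPFILE;

-- Hypothetical sketch: flag a hot fact table for in-memory population.
ALTER TABLE sales INMEMORY;

-- Tables with no analytical value would stay on the row store,
-- keeping the memory bill down.
ALTER TABLE audit_log NO INMEMORY;
```

However the final syntax shakes out, the sizing question is the same: every table you flag for the column store claims real memory, which is exactly why the hardware costs discussed here matter.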
When Ellison unveiled the In-Memory Option, he also showcased the Big Memory Machine, Oracle's new M6-32 server, whose SPARC M6 processors each provide 12 cores and 96 threads for parallel processing, with support for up to 32 TB of memory -- the perfect machine to take advantage of the new in-memory database option. The cost? A cool $3 million. But Ellison was quick to compare Oracle's server to IBM's Power 795, whose price tag runs more along the lines of $9.6 million. The point is, in-memory databases require big computing and memory resources, so those costs must be taken into account if and when the time comes to plan for the In-Memory Option.
A cheaper option might be to run Oracle Database distributed across multiple commodity servers, but whether that will be possible with the In-Memory Option is yet to be seen. So far, Oracle is pointing primarily to its own Big Memory Machine. Yet even if commodity distribution is possible, costs are associated with those systems as well, and you still need the power necessary to run your BI and OLTP operations if you want to avoid contention- and memory-related issues.
Waiting for the In-Memory Option
When the time comes that you're able to fully assess the costs associated with the In-Memory Option, you must take into account all the resources you'll need to implement and support your changes, in addition to any licensing fees. Part of your assessment, however, should include your actual business requirements for real-time analytics. Often the data you're after encompasses a relatively wide time frame, and daily feeds are more than adequate to meet your BI needs. Undoubtedly, the real-time, ad hoc capabilities offered by the In-Memory Option, along with the ability to eliminate a separate data warehouse, could prove a great fit in some circumstances. But be certain to take into account all the potential expenses: not only hardware costs, but also the resources needed to plan, develop, test, implement and maintain a new system.
About the author:
Robert Sheldon is a technical consultant and the author of numerous books, articles and training material related to Microsoft Windows, various relational database management systems and business intelligence design and implementation.
This was first published in January 2014