How to step up performance with Oracle in-memory database options
This is the first of a two-part series on the Oracle in-memory database option in Database 12c. This installment introduces the option and considerations for implementing it. The second part looks at potential licensing and hardware costs associated with adopting the in-memory option.
The Oracle Database 12c In-Memory Option, announced last fall and slated for release sometime this year, brings in-memory columnar storage to Oracle databases. It promises to substantially increase the performance of analytical queries and, to a lesser degree, accelerate online transactional processing (OLTP).
Specifics about the new feature are in short supply, especially in terms of costs for licensing the new option as well as for the hardware and resources needed to implement it.
Traditionally, data in an Oracle database has been stored in a row-wise format on disk. However, the Oracle in-memory database option enables 12c to store data in a column-wise fashion in memory. The database still maintains the row-wise data for OLTP operations, but copies the data to the in-memory store for read-only analytical queries. This dual format provides for real-time, ad hoc analytics on transactional data. In addition, if the OLTP database already supports business intelligence (BI) operations, the In-Memory Option makes analytical indexes unnecessary, thus improving OLTP operations as well.
During his September presentation, Oracle CEO Larry Ellison announced that data-driven applications will not need to be modified and that administrators will be able to implement the new option with a simple flick of a switch. After data is copied into memory, analytical queries are automatically directed to the analytic engine, which uses the columnar data to achieve scan rates of billions of rows per second. Because the data is stored in memory, queries don't incur the typical I/O costs associated with data stored on disk. At the same time, OLTP queries -- those that modify data -- are directed to the OLTP engine and use the on-disk data. The database engine keeps the two data stores transactionally consistent and simultaneously active.
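Oracle hasn't published implementation details, but the principle behind the dual format can be illustrated with a simplified sketch. The structures and names below are stand-ins for illustration only, not Oracle's actual storage formats: a row-wise store keeps each record together, which suits OLTP, while a column-wise store keeps each column together, so an analytical query touches only the columns it needs.

```python
# Illustrative sketch of row-wise vs. column-wise storage for analytics.
# These structures are simplified stand-ins, not Oracle's actual formats.

# Row-wise store: each record is kept together -- efficient for OLTP,
# where a transaction reads or updates a whole row at once.
row_store = [
    {"order_id": 1, "region": "east", "amount": 250},
    {"order_id": 2, "region": "west", "amount": 100},
    {"order_id": 3, "region": "east", "amount": 175},
]

# Column-wise store: each column is kept together -- efficient for
# analytics, where a query scans a column or two across many rows.
column_store = {
    "order_id": [1, 2, 3],
    "region": ["east", "west", "east"],
    "amount": [250, 100, 175],
}

def total_sales_row_wise(rows):
    # Must walk every row and pick the field out of each record.
    return sum(r["amount"] for r in rows)

def total_sales_column_wise(cols):
    # Scans only the contiguous 'amount' column; other columns
    # are never touched, which is what makes columnar scans fast.
    return sum(cols["amount"])

# Both stores hold the same data, so both answers agree.
assert total_sales_row_wise(row_store) == 525
assert total_sales_column_wise(column_store) == 525
```

In a real columnar engine the per-column layout also enables compression and vectorized (SIMD) scans, which is where the claimed scan rates of billions of rows per second would come from; the sketch shows only the access-pattern difference.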
Implementing the In-Memory Option
Whenever you implement or modify a system, you'll likely incur costs associated with development, deployment and quality assurance, no matter what has been promised as out-of-the-box capabilities. No organization should put a new technology into production without fully testing it.
For example, suppose you're planning to implement a BI system based on data supplied by an OLTP database. You might decide to take a more traditional approach: build a data warehouse, set up the extract, transform and load operations and implement a system to provide ongoing maintenance and support. A BI platform of this nature not only requires the resources necessary to plan, develop, test and implement the system, but also the physical infrastructure needed to host the software and data and facilitate network communications.
However, the Oracle in-memory database option lets you skip the warehouse approach altogether. Instead of building a separate infrastructure, you scale up the existing OLTP infrastructure. Certainly, this platform requires an initial outlay for licensing, implementation and the infrastructure, but you could still save on hardware, maintenance and other costs. Even if you require proprietary hardware to run the In-Memory Option, you might still see benefits in total cost of ownership. Of course, until Oracle releases specifics about licensing costs and hardware requirements, all of this remains conjecture.
On the other hand, if you've already implemented a separate data warehouse, the resource investment necessary for the In-Memory Option might not be worth the tradeoff, especially if your system already operates smoothly and you've made your initial investment. You'll need to evaluate whether the real-time benefit of the In-Memory Option offsets the costs associated with planning, developing and testing the system -- along with the hardware and licensing costs themselves -- before making the switch.
Whether or not your warehouse already exists, you have a couple other cost-related issues to consider. First, you'll have to decide what to do about historical data. One reason a data warehouse is treated as a separate operation is that it serves as a large storehouse for the historical data no longer relevant to day-to-day transactional processing. A warehouse provides the larger scope needed for analyses that require a broader perspective over time. If you leverage your OLTP operation for analytics, you must account for the accumulated data and ensure you have the systems and processes in place to handle it. Large sets of historical data could translate into extremely large in-memory data sets and drag down performance. Not all BI platforms can easily leverage an OLTP environment.
In addition, when BI is treated as a separate operation, issues that arise in the OLTP system have minimal impact on the data warehouse platform, and vice versa. If one goes down, the other keeps running. This makes troubleshooting and maintenance easier, and fewer users are affected when a problem occurs. A processing-intense analytical query, for example, won't affect all the users currently running OLTP operations. If you plan to integrate your OLTP and BI operations into a single system, you must plan for this contingency and the potential costs involved.
Another scenario to consider is one in which you're already using your OLTP database for analytics, in which case you might have implemented analytical indexes to improve performance. Unfortunately, those indexes can severely impact your OLTP operations. In these circumstances, the In-Memory Option might be worth the investment, once you better understand the specifics of the licensing and system requirements. This approach might be particularly useful for exploratory (investigative) analytics and for operational analytics, in which analytical capabilities are built into operational applications. If you do decide on this approach, once again keep in mind that, despite promises of having only to flip a switch, any update of this magnitude requires the resources necessary to implement and support the changes, in addition to the costs for additional hardware.
About the author:
Robert Sheldon is a technical consultant and the author of numerous books, articles and training material related to Microsoft Windows, various relational database management systems and business intelligence design and implementation.