Every time I find myself talking or presenting about HANA, I can barely contain my excitement. I have been in the IT industry and the SAP space for a long time and have only on rare occasion come across technology that fundamentally shifts everything. Some of those shifts were IP networks and the resulting Internet in general (I know I am dating myself by calling this a memorable shift), SAN fabrics and systems, data deduplication, data compression and Continuous Data Protection (CDP), and more recently virtualization and cloud computing. And then there is SAP HANA!
I couldn’t agree more with Jeffrey Word’s choice of the “no speed limit” sign found on the German Autobahn for his “SAP HANA Essentials” book. Only those who have actually experienced the excitement, exhilaration, and incredible focus of driving the right car (important little detail!) down the Autobahn past this sign at 160 mph understand what I am talking about. I really like this comparison to HANA, because when I finally understood how powerful and impactful HANA really is, I felt precisely this feeling. WOW – how cool is this! Call me a geek for feeling this way about technology – then I guess I am proud to be a geek.
HANA is so cool because it fascinates business and technology folks equally but for very different reasons.
Business people are excited because many realize that HANA’s in-memory computing finally allows companies to achieve things that were impossible in the past. In-memory computing, until HANA limited to those three- or four-letter organizations in the US that could afford large and expensive HPC systems, is now available to any enterprise! Once a company understands how to wield this powerful tool, it can catapult the business into completely new areas.
Rukhshaan Omar (SAP Mentor) manages the use case section of the www.saphana.com website. You can find HANA use-cases organized by Industry just to get a sense of what HANA could do for your business. There are many ways to discover that BIG IDEA (or amazing number of use-cases) in your company that will make your career and earn your company a competitive advantage.
- Internal brain storming workshops
Such workshops require the suspension of existing processes and beliefs and a very creative, playful attitude. The mindset of an excited startup company is a useful guide. At a minimum, you should include resources from five areas:
• Business representation (folks who run the business),
• Data owners or custodians (folks who manage or know the data and the data flow in your company),
• IT architects with a software development background (folks who can imagine what is possible from an interface perspective or know the middleware tools),
• IT architects with a focus on IT infrastructure (folks who constantly scan the horizon for innovative technology; they may already be aware of HANA),
• An independent referee who is empowered and experienced in orchestrating such a “thought shop”.
- SAP’s “Value Discovery Workshop”
The workshop focuses on identifying HANA use cases with high impact and high value. You will work on identifying business value and cost savings, prioritizing HANA use cases with the technical landscape taken into account, and beginning to develop a business-value-driven HANA roadmap and strategy. Some of the artifacts of the workshop are a Value/Feasibility Matrix, a Roadmap, a Business Case, and a ROM. For more information contact Helen Sunderland at email@example.com
- Hiring one of your trusted and preferred Integration Partners
like Accenture, Deloitte, Capgemini, or others, who are keenly aware of the incredible possibilities with HANA. All major system integrators offer specialized services designed to help your business find these game-changing use-cases. These folks have been in the SAP space for a long time, have helped shape thousands of companies’ success around SAP in the past, and are very well equipped to help you in this case too. There are too many to describe, so feel free to contact me and I am happy to refer you to the right resources.
Each option offers different pros and cons and certainly varies in price and value. I would probably start with an internal workshop to collect initial thoughts and ideas, get a better understanding of at least the possible business areas, identify the right resources to involve, and surface any obstacles that may be in the way. Then I would hire a specialist who does this all day long, has seen many scenarios, knows what works and what doesn’t, and knows how to overcome the obstacles you may have identified, to take it to the next level and WOW me.
Technical folks are excited because HANA is simply cool from a technology perspective and approaches things differently. This became very clear to me after reading an Architecture Overview by Franz Färber (SAP HANA’s Chief Architect) and team. No other single document has made the power and brilliance of the HANA DB so clear to me! The document describes extremely well, in short, precise terms:
- The history of the HANA development and how components like the SAP TREX text engine, the SAP BI Accelerator, P*Time, and SAP’s MaxDB have provided the base for the HANA DB.
- The HANA DB architecture (above): how various query languages are realized and supported, how full transactional behavior is included by design, and the special emphasis on parallelism from the thread and core level up to highly distributed setups across multiple machines. This made me realize two things. a) The HANA DB makes use of specific Intel E7 chipset features (for all scale-out models), which is why the HANA Product Availability Matrix (PAM) is so precisely defined, down to the CPU core architecture. It also tells me that future HANA versions will probably be tied to future Intel chipset versions, at least for scale-out deployments. b) The HANA DB takes ownership of parallelism across hosts/blades, and therefore also of High Availability (HA). Aside from the existing ABAP and Java SCS (SAP Central Services), this is the first time in history that SAP takes ownership of the HA function (for scale-out HANA deployments), so there is no more need for specialized, OS-dependent HA solutions like MC/ServiceGuard, PowerHA (formerly HACMP), Veritas Cluster, and others. The heart of the design are the in-memory processing engines: column-based and row-based data tables can be converted from one layout to the other to allow optimal query expressions in either layout; data structures are optimized for cache efficiency instead of for traditional disk-based organization; data is compressed using a variety of compression schemes; application-relevant data can be unloaded from and reloaded into main memory for the most efficient memory use; and data access plan generation is open and flexible, supporting a variety of query languages today. I smell a brilliant design here, where future query languages can be adopted relatively easily by adding specific Calculation Engine logic, with no need to change the Optimizer, Plan Generator, or Execution Engine. My mantra has always been “Flexibility is the mother of success”.
This is an excellent example of an engine designed with the foresight to stay flexible for future developer preferences, new languages, and other features.
- How Analytical Applications are supported by executing business and application logic inside the database.
- Advantages of HANA for Analytical Query Processing by storing most data in a columnar data store, utilizing on-Chip Vector Processing Units (another reason for the precise choice of certain Intel chipsets), innovative ways to handle table updates, and including parallelism inside query execution plans across cores and nodes.
- Advantages of HANA for Transactional Query Processing. It was quite surprising and compelling to me how the document describes advantages for OLTP workloads on columnar stored data, without glossing over the reality that optimization challenges remain.
- And additionally, the document references a wealth of further information, so geeks like me can dig deeper into the whole thought process behind how HANA really works.
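To make the column-store ideas above concrete, here is a minimal Python sketch, purely illustrative and in no way HANA’s actual implementation: a dictionary-compressed column stores each distinct value once, the column itself becomes a vector of small integer value IDs, and an aggregate query touches only the columns it actually needs. The table, column names, and class here are all hypothetical.

```python
# Toy illustration (NOT HANA internals) of dictionary compression in a
# column store and of a scan that only reads the columns a query needs.

class DictColumn:
    """A column stored as a small dictionary plus integer value IDs."""

    def __init__(self, values):
        # Dictionary: each distinct value is stored exactly once.
        self.dictionary = sorted(set(values))
        index = {v: i for i, v in enumerate(self.dictionary)}
        # The column itself is just a vector of small integers.
        self.value_ids = [index[v] for v in values]

    def scan_equals(self, value):
        """Return row positions matching `value`, comparing cheap integer IDs."""
        if value not in self.dictionary:
            return []
        vid = self.dictionary.index(value)
        return [row for row, v in enumerate(self.value_ids) if v == vid]

# Columnar layout: one column object per attribute (hypothetical sales data).
region = DictColumn(["EMEA", "APJ", "EMEA", "AMER", "EMEA"])
revenue = [100, 250, 75, 300, 50]

# An aggregate reads only the two columns involved, never full rows.
emea_rows = region.scan_equals("EMEA")
emea_total = sum(revenue[r] for r in emea_rows)
print(emea_total)  # 225
```

The point of the sketch is the shape of the data, not the algorithm: once values are dictionary-encoded, scans become integer comparisons over a dense vector, which is exactly the kind of layout that caches and vector units like.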
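The parallel query execution emphasized above can also be sketched in a few lines, again as a toy illustration rather than anything resembling HANA’s engine: a column is split into partitions, each worker computes a partial aggregate independently (in HANA this happens across cores and, for scale-out, across nodes), and a final merge step combines the partial results.

```python
# Toy sketch (NOT HANA code) of parallelism inside a query execution plan:
# partition a column, aggregate each partition in parallel, then merge.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(partition):
    # Each worker aggregates only its own slice of the column.
    return sum(partition)

revenue = list(range(1, 1001))  # a single column of 1000 values
n_parts = 4
chunk = len(revenue) // n_parts
partitions = [revenue[i * chunk:(i + 1) * chunk] for i in range(n_parts)]

with ThreadPoolExecutor(max_workers=n_parts) as pool:
    partials = list(pool.map(partial_sum, partitions))

total = sum(partials)  # the merge step of the plan
print(total)  # 500500
```

The same split/aggregate/merge shape is what lets a plan scale from a few threads on one host to many hosts, which is why the engine, not the OS cluster software, ends up owning parallelism and HA.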
Lastly, I need to make the obvious point that the infrastructure you choose for your HANA deployment is extremely important. I would make sure to choose a platform that is as innovative as HANA itself, with rock-solid features that:
• Enable SAP’s HA design (rather than redesigning SAP’s logic),
• Provide best-of-breed backup/recovery and data protection capability,
• Are in the sweet spot of development focus and strategic alignment,
• Include the proven data management, protection, performance, and uptime elements you already demand for your existing SAP applications,
• Utilize base building blocks already established and proven in your data center for SAP and other mission-critical applications,
• And enable solid Disaster Recovery functionality.
I work for EMC because I truly believe that at the intersection of EMC, VMware, and VCE, we have the best infrastructure for SAP. And as such, teamed with Cisco, we naturally offer a world-class HANA system – the Cisco/EMC HANA appliance.