
BI and the Mainframe

Mainframe keeps up with BI needs; IBMers debunk the top three myths surrounding BI implementation on System z


Note: This article is part one of a three-part series.

Claims that the mainframe is a near-death technology in the mission-critical world of today’s robust business intelligence (BI) applications are exaggerated. Conventional wisdom says the mainframe—the powerhouse of corporate computing—is simply too costly, too complex and incapable of supporting a comprehensive BI system. Not so.

A critical look at the true cost, complexity and capability of the mainframe, which is home to 70 percent of the world’s critical transactional data, reveals a competitive BI platform. Countless innovations make the IBM System z* server a reliable, cost-effective, large-scale platform capable of satisfying core applications and BI needs. This system is proving to be a valuable asset to any company wanting to implement BI and analytic applications, particularly when integrating these into current operational processes.

Yet, myths persist about the platform’s BI capability. The intent of this three-part series is to debunk the top 10 mainframe myths and illustrate why current BI solutions can be successfully deployed on the platform. Part one explores the first three myths. After all, when choosing where to place BI applications, companies should weigh all of their options and consider application requirements two or three years down the road. If they make decisions based on misconceptions about the relative cost, complexity and capability of this unique platform, they might overlook the solution that best fits their needs.

BI has come a long way from the early days of simple reports and back-office statisticians. Today’s BI is in the boardroom and spread throughout the organization as every part of a corporation demands decision support. This new BI has ramifications not only for BI architectures but also for the technologies used to support these environments. The BI environment is under tremendous pressure: terabytes of data must be accessible, response times must rival those of operational systems, and globally dispersed users—sophisticated and novice alike—submit everything from simple queries to complex models.

Beginning in the 1990s, distributed servers were used increasingly throughout the enterprise to handle departmental and other workloads. Distributed systems were particularly attractive for BI workloads, as the server architectures matured and the software environment developed, including improved operating systems, database management systems, and data access and delivery tools. In time, server technology evolved across all platforms. As distributed systems continue to emulate more of the mainframe’s unique functions, such as partitioning capabilities, virtualization technologies and workload-management controls, the platform also has matured, supporting more of the software vendors and offerings that drive the BI and analytics markets today.

Why does the reliable IBM System z server have such a bad reputation for BI capability? Let’s investigate.

Myth One: Mainframe Total Cost of Ownership is Too High

Deploying any new environment can be expensive, regardless of platform choice. Hardware and software costs are only part of a solution’s overall costs. When using price as a criterion for selecting a BI platform, a true comparison requires considering the total cost of ownership (TCO). For any application environment, total cost has many components, including labor, hardware, software and electricity. The most expensive component of any solution is the staff required to support it.

In a distributed server environment, costs go up linearly with additional workload. Adding capacity means adding servers. Each additional server increases the human resources needed to manage and maintain the environment. In the mixed-workload System z environment, initial hardware costs are higher, but the per-unit cost of incremental capacity actually decreases as the total workload grows. With this server, incremental capacity frequently can be added without increased staff to manage and maintain the environment. In an existing environment, many of the initial costs for deploying a new solution already have been paid. This makes incremental costs associated with adding BI capabilities much lower than those for a new environment. Creating a data warehouse from data that already may be housed on the System z server and adding a user solution can give users immediate access to valuable BI capabilities.
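
To make the scaling argument concrete, here is a minimal back-of-envelope sketch in Python. Every figure in it (server prices, staffing cost, the rate at which per-unit mainframe cost declines) is a hypothetical assumption chosen only to illustrate the shape of the two cost curves described above, not actual IBM or market pricing.

```python
# Illustrative-only TCO sketch: all figures are hypothetical assumptions,
# not IBM pricing. It contrasts the two cost curves the article describes:
# distributed capacity that scales linearly (a new server, plus admin
# effort, per increment) versus a mainframe whose per-unit cost of
# incremental capacity declines as total workload grows.

def distributed_tco(units: int,
                    server_cost: float = 50_000,
                    admin_cost_per_server: float = 20_000) -> float:
    """Cost rises linearly: every unit of capacity adds a server
    and the staff effort to manage it."""
    return units * (server_cost + admin_cost_per_server)

def mainframe_tco(units: int,
                  base_cost: float = 400_000,
                  first_unit_cost: float = 30_000,
                  discount: float = 0.85) -> float:
    """High fixed entry cost, but each added unit of capacity is
    cheaper than the last (a geometric decline, chosen only to
    illustrate the 'decreasing per-unit cost' claim)."""
    incremental = sum(first_unit_cost * discount**i for i in range(units))
    return base_cost + incremental

for units in (5, 20, 50):
    d, m = distributed_tco(units), mainframe_tco(units)
    print(f"{units:>3} capacity units: distributed ${d:>12,.0f}  "
          f"mainframe ${m:>12,.0f}")
```

Under these made-up numbers, the linear distributed curve overtakes the mainframe’s fixed entry cost once the workload grows past a handful of capacity units; where that crossover falls in practice depends entirely on real prices and staffing ratios.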

Further, IBM has recognized the need to target capacity at specific workloads by offering dedicated specialty processors for the System z server. These processors provide a high-speed engine that reduces overall processing costs when data is centralized on the mainframe. The economy of this solution helps break down the walls between transactional data stores and BI, ERP and customer-relationship management applications. It also minimizes the need to maintain duplicate copies of data across a pool of discrete systems, while providing high levels of security for critical corporate data. With the need for multiple databases reduced and applications consolidated onto the central mainframe, the platform’s inherent strengths are leveraged to manage the concurrent sharing of data by batch, online transaction processing and online advanced analytics processing applications.
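
The offload argument can be sketched the same way. In the toy model below, the eligibility fraction and per-unit prices are invented for illustration; actual savings depend on how much of a given workload qualifies for a specialty engine (such as the zIIP for eligible DB2 work) and on real pricing, since specialty-engine capacity is typically priced well below general-purpose capacity.

```python
# Back-of-envelope model of specialty-engine offload. All numbers are
# illustrative assumptions, not IBM pricing. The idea: work eligible
# for a specialty engine runs on cheaper capacity instead of the
# general-purpose processors.

def annual_processing_cost(workload_units: float,
                           eligible_fraction: float,
                           gp_cost_per_unit: float = 1_000.0,
                           specialty_cost_per_unit: float = 250.0) -> float:
    """Blend general-purpose and specialty-engine costs for a workload."""
    eligible = workload_units * eligible_fraction
    general = workload_units - eligible
    return general * gp_cost_per_unit + eligible * specialty_cost_per_unit

base = annual_processing_cost(1_000, eligible_fraction=0.0)
offloaded = annual_processing_cost(1_000, eligible_fraction=0.4)
print(f"no offload:  ${base:,.0f}")
print(f"40% offload: ${offloaded:,.0f}  "
      f"({(base - offloaded) / base:.0%} lower)")
```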

Sound IT architectures involve separate environments for development, quality assurance and production workloads. In a distributed environment, this is typically accomplished with separate servers, each requiring a redundant copy of the operating system, database management system, application software, data and utilities. Substantial cost is associated with each copy, and the copies must be kept synchronized. Production problems often arise from differences between the production environment and the development or quality-assurance environments.

The physical movement of data is another costly challenge. With approximately 70 percent of the world’s critical transactional data already residing on the mainframe, moving that data off the centralized server incurs substantial costs compared with leveraging a single copy in place. Using a central server such as the mainframe eliminates the need to transport data to another platform, helping reduce audit and control complexities and—more importantly—security costs. Eliminating an extra server for housing the data warehouse further reduces costs by avoiding duplicate copies of the operating system, database management system and other software. For companies with significant workloads, the mainframe can offer a lower cost of ownership than a distributed environment.

The IBM System z server is unique in the market, as it's designed to provide a balanced system optimized for a mixed workload.

Claudia M. Imhoff, Ph.D., is president of Intelligent Solutions Inc. Claudia can be reached at isiclaudia@aol.com.

