Best Practices for Cloud Workload Management

Return of batch jobs

One of the biggest issues for today's enterprises is how to measure the computing and processing workloads needed to run their business, and then how to optimize them.

A workload is the amount of work assigned to, or done by, a client, workgroup, server, or internetwork in a given time period. In a manufacturing organization, for example, the overall workload can be a combination of:

  • Interactive or Network-Intensive Workloads: Online entry of sales orders, program planning, warranty claims referred to a help desk, and similar interactive applications.
  • Content or Storage-Intensive Workloads: Large content management systems that store terabytes of data, especially engineering drawings and CAD/CAM files.
  • In-Memory / CPU-Intensive or Calculation-Related Workloads: Most of the advanced algorithms in a typical product design, such as calculating a product's mass, width, breadth, and power consumption, are highly resource intensive and are typically proprietary scientific calculations specific to the industry.
  • Batch Workloads: These workloads may use a combination of processor and storage, but they are not as calculation intensive; rather, they perform repetitive tasks over a large volume of records. Generating a compliance document for the federal government covering all products manufactured in the last quarter, or running a billing batch job, falls into this category (a minimal sketch follows this list).
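
As a hedged illustration of such a batch workload, the minimal Python sketch below generates the quarter-end compliance list described above; every name in it (compliance_report, manufactured_on, the sample records) is hypothetical rather than taken from any real system:

```python
# Hypothetical batch workload: list every product manufactured in the
# last calendar quarter, for a compliance report.
from datetime import date

def last_quarter_bounds(today):
    q_start_month = 3 * ((today.month - 1) // 3) + 1
    this_q_start = date(today.year, q_start_month, 1)
    if q_start_month == 1:
        return date(today.year - 1, 10, 1), this_q_start
    return date(today.year, q_start_month - 3, 1), this_q_start

def compliance_report(products, today):
    start, end = last_quarter_bounds(today)
    # Repetitive work over a large record volume: the defining batch trait.
    return [p["serial"] for p in products
            if start <= p["manufactured_on"] < end]

products = [{"serial": "A-100", "manufactured_on": date(2011, 4, 12)},
            {"serial": "A-101", "manufactured_on": date(2011, 7, 2)}]
print(compliance_report(products, date(2011, 8, 15)))  # -> ['A-100']
```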

As data storage grows into the petabytes, and the associated processing grows with it, the biggest challenge enterprises will face in the near future is building an optimal computing environment for these workloads: one that finishes them within the latency the business requests, with computing power that is optimal yet dynamic and scalable enough to absorb increased demand.

Workload Optimization and Challenges
The first challenge most enterprises face is how to measure their workload size. Unlike sizing parameters such as Function Points (which define the size of an application) or LOC (the size of the raw code), there are few good industry-standard measures that indicate the size of a workload.

In today's world, the complexity of an IT organization is described by the dollar value of its IT budget ("ours is a $5 billion IT shop"), but not really by statements like "we process XXXX workloads in a month." MIPS (millions of instructions per second) is one of the few measures available for characterizing the workload of an enterprise.
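
As a back-of-the-envelope illustration of the MIPS arithmetic (the instruction count and run time below are made-up numbers, not measurements from any system):

```python
# MIPS = instructions executed / (elapsed seconds * 1,000,000)
instructions = 4.2e12  # hypothetical: 4.2 trillion instructions in one batch run
elapsed_s = 3_600      # hypothetical: the run took one hour

mips = instructions / (elapsed_s * 1e6)
print(f"Sustained rate: {mips:,.0f} MIPS")  # -> Sustained rate: 1,167 MIPS
```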

Other issues in today's enterprise workload processing include:

  • Most workloads are written for specific hardware and/or software environments, making it difficult for enterprises to dynamically allocate them to the available compute and storage capacity.
  • Newer developers do not carry the same depth of business logic as their legacy-era counterparts, so critical workloads end up written in a serial, single-threaded manner, and scaling them is difficult even on a Cloud infrastructure (a sketch of this anti-pattern follows the list).
  • Batch jobs are used, but their ability to "divide and rule" the processing is limited.
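
As a hedged sketch of the single-threaded anti-pattern called out above (all names are hypothetical), consider a billing routine that walks every record in one sequential loop; however many virtual machines the Cloud offers, the run occupies one core on one server:

```python
# Hypothetical single-threaded billing run: every record is processed
# in one sequential loop, so adding servers or cores does not help.
def run_billing(records):
    total = 0.0
    for record in records:         # one thread, one core, one server
        total += record["amount"]  # stand-in for the real billing logic
    return total

# Even on a 64-core Cloud VM this uses a single core end to end.
invoices = [{"amount": 19.99}] * 1_000_000
print(run_billing(invoices))
```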

Due to these application characteristics, most organizations are unable to optimize their workloads: the workloads contend for the same resources, resulting in deadlocks among them. Operational and capital expenses therefore remain the same even after a move to a dynamic infrastructure environment like the Cloud.

Best Practices from Batch Jobs of the Legacy Era

  • Most batch jobs are written with parallel processing and workload scalability in mind. It is rare to see a batch program that does not partition its work on an organizational parameter such as Division, State, or Country, which lets instances run in parallel on multiple servers at the same time (see the sketch after this list).
  • Within a single instance of a batch job, the concept of 'divide and rule' is employed well enough to scale across multiple virtual servers in a Cloud world. For example, most batch jobs split their work into job steps, and resource-intensive operations like SORT, MERGE, and TRANSFORM run independently, so that multiple parallel fine-grained resources can be put to the task.
  • Most batch jobs have restart logic, so they can pick up right where they left off. On a Cloud infrastructure, where the workload can be moved internally to the available virtual machines, this characteristic is highly desirable for workload optimization: no processing power is wasted even by a failed batch job, because processing can always continue from the last checkpoint.
  • Good monitoring and instrumentation: tracking the flow of batch jobs has always been given the utmost importance, making their progress easy to follow even when the workload moves to different servers.
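
A minimal sketch of these practices, under stated assumptions (CHECKPOINT_FILE, process_partition, and the sample data are all illustrative, not a real batch framework's API): the job partitions records on an organizational parameter (State), processes each partition on its own worker, and checkpoints each completed partition so a restart skips finished work.

```python
# Hypothetical sketch: partitioned, restartable batch processing.
import json
import os
from collections import defaultdict
from concurrent.futures import ProcessPoolExecutor

CHECKPOINT_FILE = "checkpoints.json"  # illustrative checkpoint store

def load_checkpoints():
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return set(json.load(f))
    return set()

def save_checkpoint(done):
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump(sorted(done), f)

def process_partition(state, records):
    # Stand-in for the real job steps (SORT, MERGE, TRANSFORM, ...).
    return state, sum(r["amount"] for r in records)

def run_job(records):
    # 'Divide and rule': partition on an organizational parameter.
    partitions = defaultdict(list)
    for r in records:
        partitions[r["state"]].append(r)

    done = load_checkpoints()  # restart logic: skip finished partitions
    pending = {s: rs for s, rs in partitions.items() if s not in done}

    with ProcessPoolExecutor() as pool:  # one worker per partition
        results = pool.map(process_partition,
                           pending.keys(), pending.values())
        for state, total in results:
            print(f"{state}: {total:.2f}")
            done.add(state)
            save_checkpoint(done)  # survive a mid-run failure

if __name__ == "__main__":
    demo = [{"state": "CA", "amount": 10.0},
            {"state": "NY", "amount": 7.5},
            {"state": "CA", "amount": 2.5}]
    run_job(demo)  # prints CA: 12.50 and NY: 7.50 on the first run
```

On a rerun after a crash, any state already recorded in checkpoints.json is skipped, which is exactly the "pick up where it left off" behavior the old batch jobs relied on.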

Consider the above characteristics of the older batch jobs against the monolithic stored procedures or business components that perform most of their processing in a single thread: even if a dynamic computing facility is available, they will not scale up much.

Summary
In today's down economy, optimizing the workload is the highest priority for enterprises. While the cloud computing platform definitely provides a foundation for this, it is also up to the application characteristics within the enterprise to utilize that foundation well.

There needs to be a shift from single-threaded, CPU-intensive applications to batch applications designed in alignment with the way the underlying business is naturally divided among organizational divisions. This will keep the application workload optimized for cloud and virtualized platforms.

More Stories By Srinivasan Sundara Rajan

Highly passionate about utilizing Digital Technologies to enable the next-generation enterprise. Believes in enterprise transformation through the Natives (Cloud Native & Mobile Native).
