Understanding DevOps

DevOps is a set of principles & practices that enables an organization to make its delivery of applications ‘lean’ & efficient

October 28, 2014

A simple description of DevOps is this:

‘An approach to Application Delivery that applies Lean principles to accelerate feedback and improve time to market’

What does this mean? In a nutshell, it means that DevOps is a set of principles and practices that enables an organization to make its delivery of applications ‘lean’ and efficient, while leveraging feedback from customers and users to continuously improve.

What do you ‘continuously improve’? Three things:

  1. The application being delivered
  2. The Environment of the application being delivered
  3. The process by which the application (and its environment) is delivered

The ‘continuous improvement’ of the application and its environment comes from the feedback mechanism. As the application is continuously delivered, customers or customer surrogates (if a newly delivered feature cannot yet be made available to customers) use the application and provide feedback on its functionality and behavior. This feedback can then be used in the next iteration to improve both the application itself and the environment it is delivered on. The application’s features can be enhanced, added to, or removed, based on the feedback. The environment can be enhanced or re-configured if it is not enabling the application to perform as expected, or is unable to deliver the performance Service Level Agreements (SLAs) agreed upon.
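
To make the loop concrete, here is a toy sketch in Python of triaging each piece of feedback to the backlog it should improve: the application, its environment, or the delivery process itself. The feedback items and routing below are invented, purely for illustration.

```python
# Toy triage of iteration feedback -- the feedback items and the three
# improvement targets below are invented, purely for illustration.

feedback = [
    {"note": "Checkout flow is confusing",        "target": "application"},
    {"note": "Response times degrade under load", "target": "environment"},
    {"note": "The fix took six weeks to ship",    "target": "process"},
]

# One backlog per thing being continuously improved.
backlogs = {"application": [], "environment": [], "process": []}
for item in feedback:
    backlogs[item["target"]].append(item["note"])

for target, notes in backlogs.items():
    print(f"{target} backlog for next iteration: {notes}")
```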

The third area of improvement – improving the process of delivering the application – is where the crux of DevOps lies. How does one continually make the process of delivering the application leaner and more efficient; that is, how does one continuously improve it?

Looking at delivery processes to continuously improve them is not a new approach. Lean Manufacturing and the Japanese manufacturing approach called Kaizen have been applied to improving factory processes for decades. DevOps is now taking these Lean approaches and applying them to Application Delivery. Agile development practices applied some of these principles to development and testing. DevOps applies them to end-to-end application delivery – from ideation to production.

Continuous Improvement – where to begin?

To begin applying Lean principles to application delivery processes, one first needs to identify the ‘fat’ that can be reduced or eliminated entirely. Lean thinking leverages a technique known as ‘Value Stream Mapping’ to identify these areas of ‘fat’, or inefficiency. While one can carry out an extensive ‘Value Stream Mapping’ exercise to analyze one’s application delivery processes in detail over a multi-week engagement with experts in the space, a simpler and quicker approach is to take some time to map your delivery pipeline and look for ‘bottlenecks’ in how it operates. New requirements, enhancement requests and bugs to be fixed go in one end of the delivery pipeline; code running in production comes out the other. How efficiently does this pipeline operate? What bottlenecks are there which can be eliminated, or at least minimized? Where is the ‘waste’ that can be reduced?
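
Here is a minimal sketch, in Python, of what such a quick mapping exercise produces. The stage names and the process/wait times are hypothetical numbers you would replace with measurements from your own pipeline.

```python
# A quick-and-dirty value stream map of a delivery pipeline.
# Stage names and timings below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    process_hours: float  # time spent actively working on the item
    wait_hours: float     # time the item sits idle before this stage

# Hypothetical pipeline: requirements in one end, running code out the other.
pipeline = [
    Stage("Analyze requirement",  process_hours=4,  wait_hours=40),
    Stage("Develop",              process_hours=16, wait_hours=8),
    Stage("Test",                 process_hours=8,  wait_hours=72),
    Stage("Deploy to production", process_hours=2,  wait_hours=120),
]

total_process = sum(s.process_hours for s in pipeline)
total_lead = sum(s.process_hours + s.wait_hours for s in pipeline)

# Process efficiency: the fraction of total lead time spent adding value.
efficiency = total_process / total_lead
print(f"Lead time: {total_lead:.0f}h, value-adding time: {total_process:.0f}h "
      f"({efficiency:.0%} efficient)")

# The biggest bottleneck is the stage where work waits the longest.
bottleneck = max(pipeline, key=lambda s: s.wait_hours)
print(f"Biggest bottleneck: waiting before '{bottleneck.name}' "
      f"({bottleneck.wait_hours:.0f}h idle)")
```

Even a rough map like this tends to show that wait times dwarf process times; in Lean terms, those long waits between stages are the ‘waste’ to attack first.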

This value stream mapping identifies bottlenecks in the delivery pipeline. These bottlenecks are typically just symptoms of underlying ‘fat’ in the system, and need to be analyzed to identify the root causes of the inefficiencies. The list of root causes then needs to be prioritized, and the top three to five selected for a mitigation plan. DevOps capabilities can now be applied to address them. An adoption roadmap for these capabilities and the associated practices can then be developed and put in motion.
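
A companion sketch, again with made-up descriptions and scores, shows one simple way to do that prioritization: rank root causes by impact relative to mitigation effort and keep the top few.

```python
# Rank hypothetical root causes by impact relative to mitigation effort,
# then keep the top three to five for the mitigation plan.

root_causes = [
    # (description, impact 1-10, effort 1-10) -- illustrative values only
    ("Manual environment provisioning",  9, 4),
    ("No automated regression tests",    8, 6),
    ("Quarterly release approval board", 7, 3),
    ("Hand-edited configuration files",  6, 2),
    ("Unclear requirement hand-offs",    5, 5),
    ("Ad hoc production monitoring",     4, 4),
]

ranked = sorted(root_causes, key=lambda rc: rc[1] / rc[2], reverse=True)

print("Top root causes to address first:")
for description, impact, effort in ranked[:5]:
    print(f"  {description} (impact {impact}, effort {effort})")
```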

It's Continuous:

The key word in all of this is ‘continuous’.

Adopting DevOps is not a step, but a journey of continuously ‘deploying improvement’ and continuously improving one’s practices and culture.

Learn more:

  1. Describing how to map out your delivery pipeline
  2. A video of a ‘mock’ value stream mapping exercise

Related Posts:

  1. Understanding DevOps
  2. Adopting DevOps

About the Author

Sanjeev is a 20-year veteran of the software industry. For the past 18 years he has been a solution architect with Rational Software, an IBM brand. His areas of expertise include DevOps, Mobile Development and UX, Lean and Agile Transformation, Application Lifecycle Management and Software Supply Chains. He is a DevOps Thought Leader at IBM and currently leads IBM’s Worldwide Technical Sales team for DevOps. He speaks regularly at conferences and has written several papers. He is also the author of the DevOps For Dummies book.

Sanjeev has an Electrical Engineering degree from The National Institute of Technology, Kurukshetra, India and a Master’s in Computer Science from Villanova University, United States. He is passionate about his family, travel, reading, Science Fiction movies and Airline Miles. He blogs about DevOps at http://bit.ly/sdarchitect and tweets as @sd_architect.
