
Reporting for the Cloud

Auditing is essential for your cloud environments

Normally when you read or hear someone talk about application environments running on cloud platforms, much of the focus is on provisioning and elasticity. The claims are mainly that you should be able to provision full application environments on the cloud platform very quickly, and that those environments should grow and shrink based on the demands placed on the system.

I certainly don't dispute that those capabilities are important for a cloud platform, but based on recent conversations with several different users, I'm beginning to think we aren't talking enough about another important feature of these solutions: the degree to which the application environments running on cloud platforms can be audited.

Auditing of the application environments running in a cloud has many facets, but let's just consider it as it relates to one of the basic tenets of cloud computing: utility-based pricing. If you are thinking of either using or constructing a cloud application platform within your enterprise, chances are utility-based pricing is important to you. You either expect to be charged for only the resources you use, or you want to allocate costs back to teams or users based on usage of your platform.

 

Your usage charges may be based on the number of hours an environment runs in the cloud (the Amazon EC2 pricing model), or on something even finer-grained, such as the utilization of compute resources like CPU, memory, and storage. However you account for usage, the implication is that the cloud platform, or its associated tooling, enables you in some way to extract and filter the relevant data about the environments running on that platform in order to produce meaningful reports.
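
To make the difference between those two charging models concrete, here is a rough back-of-the-envelope sketch in Python. The record fields and the rates are invented purely for illustration; they don't reflect any particular provider's pricing.

    # Hypothetical usage records; field names and rates are invented for illustration.
    records = [
        {"team": "web",   "hours": 120, "cpu_core_hours": 300, "gb_ram_hours": 480, "gb_storage_hours": 2400},
        {"team": "batch", "hours": 80,  "cpu_core_hours": 640, "gb_ram_hours": 320, "gb_storage_hours": 800},
    ]

    # Model 1: flat charge per environment-hour (the EC2-style model).
    HOURLY_RATE = 0.12

    # Model 2: finer-grained charges per unit of resource actually consumed.
    RESOURCE_RATES = {"cpu_core_hours": 0.05, "gb_ram_hours": 0.01, "gb_storage_hours": 0.001}

    for record in records:
        flat_charge = record["hours"] * HOURLY_RATE
        metered_charge = sum(record[key] * rate for key, rate in RESOURCE_RATES.items())
        print(f"{record['team']}: per-hour model ${flat_charge:.2f}, metered model ${metered_charge:.2f}")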

The best-case scenario is that the cloud platform directly provides you with your reports. For instance, if you decided to charge for use of your cloud platform solely based on usage hours, then the cloud platform solution might provide you with a report that showed the breakdown of usage hours for each user or team over a specified period of time. In this case, you don't have to do anything to extract, filter, or otherwise massage the data. You tell the solution you want a usage report, and it gives you just what you need.
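
For a sense of what such a report boils down to, the roll-up is just a grouping of environment run times by team over the reporting period. Here is a minimal sketch of that aggregation, using invented start/stop events in place of whatever the platform would actually record.

    from collections import defaultdict
    from datetime import datetime

    # Invented start/stop events for environment runs; a real platform would supply these.
    runs = [
        ("web",   datetime(2010, 6, 1, 8), datetime(2010, 6, 1, 20)),
        ("web",   datetime(2010, 6, 2, 9), datetime(2010, 6, 2, 17)),
        ("batch", datetime(2010, 6, 1, 0), datetime(2010, 6, 3, 0)),
    ]

    # Roll the runs up into total usage hours per team for the reporting period.
    usage_hours = defaultdict(float)
    for team, start, stop in runs:
        usage_hours[team] += (stop - start).total_seconds() / 3600

    for team, hours in sorted(usage_hours.items()):
        print(f"{team}: {hours:.1f} environment-hours")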

Obviously, this turnkey reporting capability is not likely to cover everything you need, because no solution provider can anticipate and deliver every meaningful report for every user right out of the box. So what do you do when you encounter a solution you otherwise like, but that doesn't give you the reporting you need? Simple: you ensure that the cloud platform provides a mechanism by which you can extract audit data to support your own custom reporting.

The solution may enable you to extract data by providing a robust CLI, REST interface, web services interface, or query-building capability. Regardless of the data extraction interface, if custom reporting is important to you, you'll want to ensure not only that you can extract the data, but also that you can filter and shape it to fit your needs. If you can't filter the data, you're left with little more than a pile of meaningless bits and bytes. Just as importantly, if filtering the data looks too hard, you'll want to consider whether it's worth the investment to write the necessary data-mining logic.
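
To make that "extract, then filter and shape" workflow concrete, here is a sketch against a hypothetical REST audit endpoint. The URL, query parameters, and response fields are assumptions made for the sake of the example, not any particular vendor's API.

    import requests  # common third-party HTTP client

    # Hypothetical audit endpoint; the URL, parameters, and response fields are assumptions.
    AUDIT_URL = "https://cloud.example.com/api/audit/usage"

    def team_usage_report(team, start, end, token):
        """Pull raw audit records, then filter and shape them into a simple per-team summary."""
        response = requests.get(
            AUDIT_URL,
            params={"from": start, "to": end},
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        response.raise_for_status()
        records = response.json()

        # Filter: keep only the records that belong to the team in question.
        team_records = [r for r in records if r.get("team") == team]

        # Shape: reduce the raw records to the numbers the report actually needs.
        return {
            "team": team,
            "hours": sum(r.get("hours", 0) for r in team_records),
            "cost": sum(r.get("cost", 0.0) for r in team_records),
        }

The specific interface matters less than the shape of the workflow: pull the raw data, narrow it to what matters, and reduce it to the figures the report needs.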

The bottom line is that when you look at a cloud platform solution, be sure to look past its approaches to provisioning and elasticity. Take a hard look at how it helps you audit the application environments it will support. It's better to understand and accept those capabilities up front than to be disappointed once you put the solution in place.

More Stories By Dustin Amrhein

Dustin Amrhein joined IBM as a member of the development team for WebSphere Application Server. While in that position, he worked on the development of Web services infrastructure and Web services programming models. In his current role, Dustin is a technical specialist for cloud, mobile, and data grid technology in IBM's WebSphere portfolio. He blogs at http://dustinamrhein.ulitzer.com. You can follow him on Twitter at http://twitter.com/damrhein.


