In-Stream Processing | @CloudExpo @robinAKAroblimo #BigData #AI #BI #DX

We are still getting sales reports and other information we need to run our business long after the fact

Most of us have moved our web and e-commerce operations to the cloud, but we are still getting sales reports and other information we need to run our business long after the fact. We sell a hamburger on Tuesday, you might say, but don't know whether we made money selling it until Friday. That's because we still rely on batch processing, where we generate orders, reports, and other management-useful data when it's most convenient for the IT department to process them, rather than in real time. That was fine when horse-drawn wagons made our deliveries, but it is far too slow for today's world, where stock prices and other bits of information circle the world (literally) at the speed of light. It's time to move to In-Stream Processing. You can't - and shouldn't - keep putting it off.

[Figure 1, courtesy of the Grid Dynamics Blog]

This diagram may look complicated at first, but if you trace the lines it will soon become clear. The only thing that might throw some managers for a loop is the "Data Science" box at the bottom. The term may seem intimidating, but in real life it's just a method of deciding what data is most important to extract from the data stream as it flows by, and how that data should best be displayed to the business people who are its end users. Some say, "Data science is just a sexed-up term for statistics." Perhaps. But, title aside, displaying the dynamic information needed to run a business - and only that information - in real time is what In-Stream Processing is all about.
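
To make the idea concrete, here is a minimal sketch of that extract-and-display loop in Python. Everything in it - the event fields, the function names, the print-as-dashboard stand-in - is a hypothetical illustration, not part of any particular stream-processing product:

```python
from typing import Iterator

def raw_events() -> Iterator[dict]:
    """Stand-in for a live event source such as a message queue."""
    yield {"store": 12, "sku": "A-100", "amount": 4.99, "ts": 1700000000}
    yield {"store": 12, "sku": "B-205", "amount": 12.50, "ts": 1700000003}

def business_view(events: Iterator[dict]) -> Iterator[dict]:
    """Keep only the fields the business users actually watch,
    as each event flows by."""
    for event in events:
        yield {"store": event["store"], "amount": event["amount"]}

for summary in business_view(raw_events()):
    print(summary)  # in production this would feed a live dashboard
```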

Data Nuggets in the Stream
A retailer may do 100,000 POS credit card transactions per day. That's nice to know, and it's nice to have a dashboard-type display that shows them in real time, and shows upward and downward trends minute by minute so that cashiers can be assigned with maximum efficiency. In-Stream Processing can do this, and with the right output devices can even trigger a storewide PA announcement that automatically says, "All cashiers to the front, please." Opening more registers as soon as checkout volume starts to trend upwards makes customers happy. Even the announcement makes them happy, because it shows that management cares about them.
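
Here is one way that checkout-volume trigger might look in code - a minimal sketch, assuming a stream of timestamped transactions; the window size, surge factor, and announce() hook are all made-up illustration values:

```python
from collections import deque

WINDOW_SECONDS = 60
SURGE_FACTOR = 1.5  # last minute 50% busier than the minute before

recent: deque = deque()  # timestamps of recent transactions

def announce(message: str) -> None:
    print(f"PA: {message}")  # hypothetical hook into the store PA system

def on_transaction(ts: float) -> None:
    recent.append(ts)
    # Keep only the last two minutes of activity in memory.
    while recent and recent[0] < ts - 2 * WINDOW_SECONDS:
        recent.popleft()
    last_minute = sum(1 for t in recent if t >= ts - WINDOW_SECONDS)
    minute_before = len(recent) - last_minute
    # A real system would debounce so the alert doesn't repeat on
    # every checkout for as long as the surge lasts.
    if minute_before and last_minute > minute_before * SURGE_FACTOR:
        announce("All cashiers to the front, please.")

# Simulate: 10 checkouts in the first minute, then 30 in the next.
for t in [i * 6.0 for i in range(10)] + [60 + i * 2.0 for i in range(30)]:
    on_transaction(t)
```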

And buried in those 100,000 credit card transactions, there is one that is an attempt to use a stolen card. Obviously, that transaction will be declined. But then what? In the era of batch processing, an automated alert might get sent to management or loss control hours or days after the miscreant was gone. Perhaps the cashier called security, perhaps he didn't. After all, he had other customers in line, and when he told the person with the hot credit card that it was declined, the criminal probably bolted, and the cashier didn't give chase because it was probably against store policy - and he had other things to worry about, anyway.

Now let's fast-forward to tomorrow, when we have In-Stream Processing up and running. The second - literally the second - the hot credit card transaction takes place, both the store's loss control people and management at HQ learn about it. Store security can head for the appropriate POS before the bogus customer is informed that "his" credit card is no good. Indeed, notifying the cashier, and therefore the bogus customer, that the card has been declined can be automatically delayed until store security personnel acknowledge receipt of the alert. That gives them a far greater chance to detain the criminal than they had in the bad old days, before In-Stream Processing became an anti-theft weapon.
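
A sketch of that delayed-decline flow, assuming hypothetical notify-and-acknowledge plumbing between the POS system and security's console; the timeout keeps an honest customer from being stranded if no one responds:

```python
import threading

ACK_TIMEOUT_SECONDS = 30  # don't hold the register forever

def notify_security(register: int, last4: str) -> threading.Event:
    """Push the alert to security's console; they set the event
    when they acknowledge it."""
    ack = threading.Event()
    print(f"ALERT: stolen card ending {last4} in use at register {register}")
    # Simulate security acknowledging two seconds later.
    threading.Timer(2.0, ack.set).start()
    return ack

def handle_hot_card(register: int, last4: str) -> str:
    ack = notify_security(register, last4)
    # Hold the decline until security acknowledges, or we time out.
    ack.wait(timeout=ACK_TIMEOUT_SECONDS)
    return "Card declined."  # only now do the cashier and customer see it

print(handle_hot_card(register=7, last4="4242"))
```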

Predicting production failures with In-Stream Processing
Now, let's turn our sights from the POS system to the conveyors in our packing and shipping facility. Whoops! It seems the drive motor for belt number five is suddenly running at 40°C instead of its typical 30-32°C. If we don't learn about this for a day or two, chances are we'll have a burnt-out motor to replace and a packing line that's been down for hours.

With In-Stream Processing, we can see temperature sensor output changes instantly, and even have alerts set to go off if readings move more than X degrees outside their normal operating range. We can do the same thing with strain gauges and many other measurement devices. We can even check the number of boxes getting packed on each line, with alerts set for changes beyond our expectations, so that a manager can check that line to make sure everything is okay both with the workers and with their equipment.
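
In code, such a sensor alert can be as simple as comparing each reading against the device's normal band - a minimal sketch, with made-up band values and an alert() hook standing in for whatever paging system the facility actually uses:

```python
NORMAL_RANGE = {"belt-5-motor": (30.0, 32.0)}  # degrees C, per device
MAX_DRIFT = 3.0  # alert if more than 3 degrees outside the normal band

def alert(sensor: str, reading: float) -> None:
    print(f"ALERT: {sensor} reading {reading}C is outside its normal range")

def on_reading(sensor: str, reading: float) -> None:
    """Check each reading the moment it arrives in the stream."""
    low, high = NORMAL_RANGE[sensor]
    if reading < low - MAX_DRIFT or reading > high + MAX_DRIFT:
        alert(sensor, reading)

on_reading("belt-5-motor", 40.0)  # fires: 40C is well above 32C + 3C
```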

The sooner a potential problem is detected, the better our chance of solving it before it becomes a major, costly problem. This applies to almost every aspect of our business, including a sudden spate of customer calls about problems with a particular product.

Keeping irritated customers from becoming angry customers
An exciting recent development in call center management - one that couldn't have happened before big data and the cloud made immense data processing power and data storage available at low (and ever-dropping) prices - is Emotion Analysis. It's no great trick for a human to tell whether a caller is happy, inquisitive, upset or downright angry. But it's a new trick for computers, and one that is just now starting to become a practical business application.

The idea is that as soon as the system detects unhappiness beyond a preset level in a customer's voice, the call is automatically diverted to a supervisor or second-level support person. In theory, a first-tier support person should be able to detect that unhappiness and call for help, but as you know from your own experience calling businesses for help, people in the first support tier may not recognize unhappiness even if - it often seems - you bluntly say, "I'm unhappy. Please get your supervisor."
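
A sketch of that escalation rule, assuming the emotion scores arrive already computed on a 0-to-1 unhappiness scale (the scoring model itself is the hard part and is out of scope here); the threshold and routing hook are illustrative assumptions:

```python
ESCALATION_THRESHOLD = 0.7  # assumed cutoff on a 0 (calm) to 1 (furious) scale

def route_to_supervisor(call_id: str) -> None:
    print(f"Call {call_id}: transferring to second-level support")

def on_emotion_score(call_id: str, score: float, escalated: set) -> None:
    """React to each new emotion score as it arrives in the stream."""
    if score >= ESCALATION_THRESHOLD and call_id not in escalated:
        escalated.add(call_id)  # escalate each call only once
        route_to_supervisor(call_id)

escalated: set = set()
for score in (0.2, 0.5, 0.8):  # scores streaming in during a single call
    on_emotion_score("call-42", score, escalated)
```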

Think what a pleasant surprise it would be if a fresh voice came on the line and said, "This is Ron, in support management. It sounds like you're upset. I don't blame you. Let's see what we can do to solve your problem!" We're not using a chatbot in this scenario - yet. It may not be long until we have good enough AI (or at least pseudo-AI), and good enough voice recognition to replace human customer service workers, but right now nothing beats a knowledgeable employee with the authority to actually make things right for a customer who has gotten a defective product or poor service of some sort.

But In-Stream Processing, running Emotion Analysis, can certainly help us hook our unhappy customer up with the person who can make her problems go away - and do it right now, not a week from Sunday.

New uses for In-Stream Processing are cropping up all the time
Indeed, Emotion Analysis and dozens of other applications that are new or still being developed all depend on In-Stream Processing, because they all rely on real-time or near-real-time processing, not processing that happens someday. Program trading is a great example. If you're doing high-frequency stock buying and selling, making hundreds or thousands of trades per minute (or even per second in some cases), you must be able to process data and make decisions - or have a program that makes them - blindingly fast. At that speed, even the physical length of your connection to the stock exchange can make a noticeable difference in your profits.

-------------------------

With a little thought, you can almost certainly think of at least a few ways In-Stream Processing can benefit your business. If not, throw the idea out to your fellow executives. The chances are 100%, more or less, that within a week they'll think of at least a few ways In-Stream Processing can increase your profits.

More Stories By Robin Miller

Robin “Roblimo” Miller is a long-time IT journalist known for his work on Slashdot, Linux.com, and other sites covering software so new that its edges haven’t even started to bleed. Nowadays, he writes for FOSSForce.com and works as an editorial consultant and blog editor for Grid Dynamics, “the engineering IT services company known for transformative, mission-critical cloud solutions for retail, finance and technology sectors.”
