Guest Column | April 18, 2019

Retail IT: Solutions To Handling The Data Tsunami

By James D’Arezzo, Condusiv Technologies


It is well understood by now that retail is undergoing a sea change as technology leads the way toward instant gratification for consumers, whether online or in-store. The problem is that the tsunami of data retailers must process threatens to swamp even the most well-funded and savvy among them.

It’s estimated that by 2025, the amount of data most enterprises need to manage will have increased more than 50-fold. With that increase, of course, also will come a massive shift in how retailers use that data, as we are now seeing many enterprises transition from “Big Data” to “fast data.” In other words, the focus shifts from having tons of data available to being able to use that data rapidly, accurately and for a host of applications in daily operations.

A decade ago, omni-channel retail - providing a seamless customer shopping experience both in-store and online - was practically unheard of, if not impossible. Now it's a necessity for almost any retailer; omni-channel customers may account for only 7 percent of customers in the US, but they make 27 percent of retail purchases. Keeping them satisfied requires a complex network of fast, accurate data: retailers need to know exactly where their products are at any given time while integrating data from various sources virtually instantly. And while the average accuracy of retail inventory is only 65 percent, omni-channel fulfillment requires closer to 95 percent accuracy for a seamless experience.

Returns alone are enough to bog down virtually any retailer: the return rate for online purchases nears 30 percent, and processing those returns demands serious amounts of data and computing power as retailers use fast data to integrate e-commerce systems with behind-the-scenes logistics. This is in addition to the huge profit loss that returns traditionally impose on businesses, which often operate on razor-thin margins.

It’s clear that the need for high-performing Big Data has never been bigger, but how do retailers cope with the computing infrastructure required? And where does that leave smaller businesses and those without the computing power to handle such operations?

As more and more businesses make the shift to fast data, they need to upgrade their IT infrastructure to accommodate both the data volume and the processing demands. But it's also becoming clear that spending money on hardware upgrades isn't always the answer; it often ends up wasting money and threatening the profits retailers have managed to make in spite of high return rates and increasing fraud and chargebacks.

Instead, it's evident that software solutions are necessary. Increasing data speeds via software saves retailers the time, energy and precious dollars that would otherwise be spent on hardware upgrades. Speeding up processing performance also means less waiting for customers, resulting in fewer complaints and happier shoppers, as well as less downtime for retailers, which means greater profits.

How To Improve Storage Performance With Software

So how do you deal with these performance issues and get systems to handle massive amounts of data rapidly? You're probably asking why you can't just buy and install newer, faster storage and compute power. If you have the money to spend, this can be a good temporary solution. But you'll still have network pipeline challenges, as there will simply be too much data for your system to handle. It's no different from being stuck in traffic because the road you're on wasn't built to handle the number of cars using it.

Three main issues cause I/O bottlenecks. (Each read from storage, computation, presentation and write back to storage is an I/O, or input/output operation.) These issues are your data pipelines, as mentioned, but also the sheer amount of non-application I/O overhead and file system overhead. Each of these alone can degrade data performance by 30 to 50 percent; combined, they spell trouble for virtually any hardware setup. The fact is, much of the I/O overhead is caused by the software running your systems and applications.
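Per-operation overhead is easy to demonstrate. The sketch below - illustrative Python only, not any vendor's tool - writes the same payload twice: once as thousands of small unbuffered I/Os and once as a single consolidated write. On most systems the small-write path is markedly slower, even though the bytes on disk end up identical; that gap is the kind of overhead software-level consolidation targets.

```python
import os
import tempfile
import time

PAYLOAD = b"x" * (4 * 1024 * 1024)  # 4 MB total payload
CHUNK = 512                          # small I/O size in bytes

def timed_write(path, chunks):
    """Write the given chunks to path and return elapsed seconds."""
    start = time.perf_counter()
    # buffering=0 forces each write() to be its own OS-level I/O
    with open(path, "wb", buffering=0) as f:
        for chunk in chunks:
            f.write(chunk)
        os.fsync(f.fileno())  # make sure the data actually hits storage
    return time.perf_counter() - start

with tempfile.TemporaryDirectory() as d:
    small = [PAYLOAD[i:i + CHUNK] for i in range(0, len(PAYLOAD), CHUNK)]
    t_small = timed_write(os.path.join(d, "small.bin"), small)
    t_big = timed_write(os.path.join(d, "big.bin"), [PAYLOAD])
    print(f"{len(small)} small writes: {t_small:.3f}s")
    print(f"1 large write: {t_big:.3f}s")
```

The absolute numbers will vary by machine and storage medium; the point is the ratio between the two runs, which reflects pure I/O overhead rather than data volume.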

So, what do you do to deal with the oncoming flood of data as an IT department head or system admin? As software is causing much of the problem, look at software to fix it before rushing out to purchase new hardware. There are solutions that can deal with storage performance at the operating system, file system and application levels.

Certainly, new hardware can buy improved performance for a time. In fact, most hardware manufacturers tout the number of IOPS (I/Os per second) they can handle. The problem is that denser, faster data and more demanding applications will soon swamp any fixed IOPS ceiling, as has been shown to happen time and time again. When that happens, having the right software solutions at your disposal will get you back up and running at much better speeds.
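A back-of-envelope calculation shows why a raw IOPS figure alone can mislead: useful throughput is IOPS multiplied by I/O size, so the same rated device delivers far less effective bandwidth when overhead fragments work into small I/Os. The numbers below are hypothetical, chosen only to illustrate the arithmetic.

```python
def throughput_mb_s(iops, io_size_bytes):
    """Effective throughput in MB/s for a given IOPS rate and I/O size."""
    return iops * io_size_bytes / 1_000_000

# Hypothetical device rated at 100,000 IOPS:
print(throughput_mb_s(100_000, 4 * 1024))    # 4 KB I/Os  -> 409.6 MB/s
print(throughput_mb_s(100_000, 64 * 1024))   # 64 KB I/Os -> 6553.6 MB/s
```

Same rated IOPS, a 16x difference in delivered bandwidth - which is why reducing fragmented, small I/Os in software can matter more than buying a faster device.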

About The Author

James D’Arezzo is CEO of Condusiv Technologies, the world leader in software-only storage performance solutions for virtual and physical server environments. His distinguished career in technology includes senior executive positions at IBM, Compaq, Autodesk and Radiant Logic.