In “What the storage industry’s inevitable transition to the cloud means for your business,” I cited the explosive growth of data as one of the primary reasons for a shift in storage architectures from local hardware to the cloud.
Let’s take a closer look at this phenomenon and how it impacts your approach to storage, as well as what architecture is the right one for your current and future needs.
And what a phenomenon it is! IDC and EMC project that data will grow to 40 zettabytes by 2020, a 50-fold increase from the beginning of 2010. And data's importance extends beyond any single business: in a recent white paper for storage firm Seagate, research firm IDC estimates that by 2025, nearly 20 percent of the data in existence will be critical to our everyday lives, with almost 10 percent of that being "hypercritical."
The takeaway is that there is strong industry consensus on two things: data will continue to grow exponentially, and that data is increasingly important to your business's ability to compete.
The sources of this data growth are well understood. Big data and the analytics used to mine it for insights are transforming businesses across all industries, be it manufacturing, logistics, healthcare, agriculture or transportation; the list goes on, and the trend has been examined thoughtfully by too many others to revisit here.
It's likely this is impacting your business now, but even if you are not yet affected, it is important to plan for the inevitable moment when even the most reluctant businesses are dragged into the world of analytics. Best to be prepared, as that time will come sooner than you think, if it hasn't already.
Hand in hand with big data is the Internet of Things (IoT), Industrial or otherwise. The devices, sensors and other IP-enabled objects that form the connective tissue of this trend are another reason to design your storage resources to cope with the IoT today.
And finally, and perhaps most importantly, there is the explosive growth of so-called "unstructured data." NetworkWorld's sister publication Computerworld estimates that 70 to 80 percent of business data resides in an unstructured format, and that share is only growing.
Data growth’s impact on your architecture
We all know that many IT organizations tend to move somewhat slowly unless presented with an immediate “hair on fire” problem, so let’s outline the impact this data growth can have on your organization:
Poor performance or even gridlock
Way back in 2012, the average company with about 1,000 employees was spending more than $5 million a year just finding information on its servers. The coming data tsunami will easily swamp your existing architecture if steps are not taken in advance.
Drawing the wrong conclusions
For businesses leveraging analytics, poor access to the available information will make even the best machine learning algorithms fail. Even companies currently uninterested in big data still need to get the correct information into the hands of their employees. According to IBM, 42 percent of managers report using the wrong information at least once a week, and data overload is a direct cause of this problem.
Too much money spent
If your internal architecture is not cloud-ready, you are going to spend a lot of budget coping with the data growth in your environment. And by a lot, we're most likely talking seven figures for a good-sized business.
All these potential issues add up to make this data tsunami a serious competitive threat to your business if your storage architecture isn't ready. Of course, no one is more aware of this than the storage industry, which offers myriad solutions to choose from. Some are more effective than others, but each has a role to play in helping businesses solve this issue.
Some of these solutions include:
Object-based storage (OBS)
OBS is certainly a solution many of us have heard of, and its very real benefits should be evaluated carefully. According to IBM, almost 83 percent of companies are using or evaluating OBS as a solution.
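What distinguishes object storage from a traditional file system is its access model: a flat key namespace with per-object metadata rather than a hierarchy of directories. The following is a minimal, hypothetical in-memory sketch of that model, in the spirit of APIs such as Amazon S3; the class and method names here are illustrative, not any vendor's actual API.

```python
# Illustrative sketch of object storage semantics: each object lives under a
# flat key and carries its own metadata, rather than sitting at a path in a
# hierarchical file system. (In-memory stand-in, not a real storage client.)

class ObjectStore:
    def __init__(self):
        self._objects = {}

    def put(self, key, data, metadata=None):
        """Store a blob under a flat key, with optional key/value metadata."""
        self._objects[key] = {"data": data, "metadata": metadata or {}}

    def get(self, key):
        """Return the blob and its metadata for a key."""
        obj = self._objects[key]
        return obj["data"], obj["metadata"]

store = ObjectStore()
# Slashes in the key are just characters, not directories.
store.put("reports/2020/q1.csv", b"id,total\n1,42\n",
          metadata={"content-type": "text/csv", "owner": "finance"})
data, meta = store.get("reports/2020/q1.csv")
```

Because there is no directory tree to traverse or lock, this flat addressing is what lets object stores scale out to billions of objects.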
Network-attached storage (NAS)
NAS solutions continue to evolve to better meet customer needs, from enterprise offerings such as NetApp's down to the lower end of the small- and medium-sized business (SMB) market. These solutions often incorporate cloud capabilities, but over time they will probably prove more expensive because of the amount of onsite hardware they require.
Hybrid cloud architectures
A hybrid cloud storage model combines onsite resources of some kind (cache drives, hard drives, NAS, etc.) with cloud storage services. It is an increasingly popular approach, as it mitigates the sprawl of hardware to buy and manage onsite while retaining the flexibility and scalability of the cloud. Examples include solutions from companies such as Panzura, Nasuni, and my own company, Morro Data.
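The core mechanism behind these hybrid products is a read-through cache: hot data is served from a small, fast local tier, and misses fall back to the effectively unlimited cloud tier and are cached locally for next time. A simplified sketch, using plain dictionaries as stand-ins for the local and cloud tiers (the names and eviction policy here are assumptions for illustration, not any vendor's design):

```python
# Read-through cache sketch: the essence of hybrid cloud storage.
# Hot data is served from a small local tier; a miss fetches from the
# (slower, but effectively unlimited) cloud tier and caches the result.

class HybridStore:
    def __init__(self, cloud, local_capacity=2):
        self.cloud = cloud        # dict standing in for a cloud object store
        self.local = {}           # local cache tier (e.g. NAS or SSD cache)
        self.capacity = local_capacity

    def read(self, key):
        if key in self.local:     # cache hit: fast local path
            return self.local[key], "local"
        data = self.cloud[key]    # cache miss: fetch from the cloud tier
        if len(self.local) >= self.capacity:
            # Naive eviction: drop the oldest entry (dicts keep insert order).
            self.local.pop(next(iter(self.local)))
        self.local[key] = data
        return data, "cloud"

cloud = {"a": b"alpha", "b": b"beta", "c": b"gamma"}
store = HybridStore(cloud)
first = store.read("a")   # first read is served from the cloud tier
second = store.read("a")  # repeat read is served from the local cache
```

Real products layer on write-back policies, global file locking, and smarter eviction, but the local-tier-plus-cloud-tier split above is the architectural idea.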
So, which of these solutions is best for you? The truth is that every environment has its own needs. However, for most businesses whose data is core to how they compete, it's safe to say that some type of hybrid cloud architecture will serve best.
If cloud performance improves dramatically in the months to come, that calculus may change, but cloud performance will likely continue to lag the growth of data. For now, only onsite hardware delivers the performance that many mission-critical applications require; in my mind, that is the only thing keeping businesses from a wholesale shift to the cloud.
This article is published as part of the IDG Contributor Network.