Accelerating Storage Innovation in the Next Data Decade
A sales and marketing professional with proven expertise in formulating and developing business strategies, Amit also brings deep knowledge of datacenter technologies such as virtualization, cloud storage and networking platforms.
Over the past decade, technology has transformed nearly every business into an IT-driven business. From farming to pharmaceuticals, these information technology developments have led organizations to reimagine how they operate, compete, and serve customers. Data is at the heart of these changes and will continue its transformative trajectory as organizations navigate the waves of technological progress in the next Data Decade.
To have strategic value in the enterprise, storage innovation must cross the capabilities chasm from simply storing and moving bits to holistic data management. This year, there are three key areas that we believe will be difference-makers for organizations pushing the limits of current storage and IT approaches.
Trend #1: Machine learning and CPU Performance unlock new storage and data management approaches
Customers in industries such as manufacturing, cybersecurity, autonomous vehicles, public safety and healthcare want to build applications that treat data as streams instead of breaking it up into separate files or objects.
Ingesting and processing stream data presents unique challenges that limit traditional IT and storage systems. Since streaming workloads often change throughout the day, storage capacity and compute power must be elastic to accommodate them. By treating everything as a data stream, event data can be replayed in the same way we watch a live sporting event on a DVR-enabled TV, where the program can be paused, rewound and replayed instantly.
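The DVR-style replay idea above can be sketched as a toy append-only event log. This is a minimal illustration, not any particular streaming product's API; all names here are hypothetical:

```python
from typing import Any, Iterator, List


class EventStream:
    """Toy append-only event log with DVR-style replay from any offset."""

    def __init__(self) -> None:
        self._events: List[Any] = []

    def append(self, event: Any) -> int:
        # Events are only ever appended, never modified or deleted.
        self._events.append(event)
        return len(self._events) - 1  # offset of the new event

    def replay(self, from_offset: int = 0) -> Iterator[Any]:
        # Reading never consumes events, so any reader can "rewind"
        # simply by replaying again from an earlier offset.
        yield from self._events[from_offset:]
```

Because reads are non-destructive, many consumers can process the same stream independently, each keeping its own offset, which is the property that makes pause/rewind/replay cheap.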
In the realm of data management, 2020 will usher in new approaches for organizations wishing to better manage the data that is distributed across many silos of on-prem and cloud data stores. Data growth has been outstripping the growth of IT budgets for years, making it difficult for organizations not only to keep and store all their data, but also to manage, monetize, secure and make it useful for end users.
Moreover, Dataset Management, an evolving discipline that uses various approaches and technologies to help organizations better use and manage data throughout its lifecycle, will rise to prominence in 2020. It offers organizations a bridge from the old world of directories and files to the new world of data and metadata. It will be useful for industries (e.g., media & entertainment, healthcare, insurance) that frequently have data stored across different storage systems and platforms (e.g., from device- or instrument-generated raw data to derivative data at a project level).
Trend #2: Storage will be architected and consumed as software-defined
We can expect to see new storage designs in 2020 that will further blur the line between storage and compute.
With deeper integration of virtualization technologies on the storage array, apps can be run directly on the same system and managed with standard tools. This could be suitable for data-centric applications that require storage- and data-intensive operations. Software-defined infrastructure (SDI) is also becoming a greater consideration in enterprise data centers to augment traditional SAN and HCI deployments. Long the realm of hyperscalers, SDI is now ready for adoption by traditional enterprises looking to redeploy certain workloads whose capacity and compute requirements differ from what traditional three-layer SANs can provide.
This solution fits customers that need to consolidate multiple high-performance (e.g., database) or general-purpose workloads. As enterprises consider consolidation strategies, they will bump up against the limits of traditional SANs and the unpredictable performance, costs and lock-in of cloud services. This is where SDI becomes a very viable alternative to traditional SANs and HCI for certain workloads.
Trend #3: High-performance Object storage enters the mainstream
As object storage moves from cheap-and-deep cold storage or archive to a modern cloud-native storage platform, performance is on many people's minds. One reason we see this solution rising in 2020 is demand from application developers. Analytics is also driving a lot of demand, and we expect to see companies in different verticals moving in this direction.
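A common way to balance flash performance against disk economics in object storage is an automated tiering policy: promote objects to flash when they are accessed, demote them to a cold or archive tier after they go idle. The sketch below is purely illustrative (all class and parameter names are hypothetical, not a real product's API):

```python
import time
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class ObjectMeta:
    """Per-object metadata tracked by the tiering policy."""
    key: str
    tier: str = "disk"  # one of "flash", "disk", "archive"
    last_access: float = field(default_factory=time.time)


class TieringPolicy:
    """Toy policy: promote on access, demote after an idle period."""

    def __init__(self, demote_after_s: float = 3600.0) -> None:
        self.demote_after_s = demote_after_s
        self.objects: Dict[str, ObjectMeta] = {}

    def access(self, key: str) -> ObjectMeta:
        # Hot data moves up to the flash tier on access.
        meta = self.objects.setdefault(key, ObjectMeta(key))
        meta.tier = "flash"
        meta.last_access = time.time()
        return meta

    def sweep(self, now: Optional[float] = None) -> None:
        # Periodic background pass: cold data moves down to archive.
        now = time.time() if now is None else now
        for meta in self.objects.values():
            if now - meta.last_access > self.demote_after_s:
                meta.tier = "archive"
```

Real systems add capacity limits, access-frequency heuristics and asynchronous data movement, but the core promote/demote loop is the same idea.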
In turn, the added performance of flash and NVMe is creating tremendous opportunity for object-based platforms to support workloads that require speed and near-limitless scale. Flash-based object storage with automated tiering to disk offers a cost-effective solution, particularly at hundreds-of-petabytes or exabyte scale. It allows you to move the data you need up to the flash tier to run analytics and high-performance applications, then move it down to a cold or archive tier when you are done with it.

As the pace of technology innovation accelerates, so too will the possibilities in storage and data management. We are standing with our customers at the dawn of the Data Decade.