Today’s organizations are under immense pressure to embrace digital transformation and succeed at it to enable future growth. “Leap now or get left behind” is a common warning from digital transformation advocates. This is why many companies scramble to adapt to disruptive changes in the modern workplace, such as the cloud, mobility, and the Internet of Things (IoT). Yet despite undertaking so many changes and implementing an array of new digital strategies, 84% of companies still don’t get it right. And for some, one of the greatest barriers on the road to digital transformation is a business’s inability to leverage its most important asset: data.
Delphix, one of the fastest-growing data virtualization companies, is dedicated to transforming how enterprises manage, consume, and access data. The company believes that to achieve success in the digital economy, businesses must rethink how they leverage data. Through a dynamic-data platform, Delphix helps enterprises address their most pressing data challenges to accelerate innovation.
In this article, we spoke to Adam Bowen, Strategic Advisor to the office of the Chief Technology Officer (CTO) at Delphix, to discuss what digital transformation entails, the issues that inhibit companies from achieving true innovation, and how companies can overcome them.
Q: Tell us a bit more about Delphix. When was the company established, what industries do you serve, and what problems do your solutions help businesses solve?
A: Delphix was founded in 2008 and has now grown to almost 400 employees. We enable 30% of the Fortune 500 to bring together data operators and data users through our dynamic-data platform, which allows companies to provision lightweight, compressed copies of production data in minutes and secure that data to ensure regulatory compliance.
Q: Digital Transformation is a popular (and often overused) phrase when it comes to business evolution…but what exactly does it mean? And what do you believe it requires in order to be successful?
A: I agree that Digital Transformation is overused, but I also believe that its meaning has changed over time. Originally, the term meant the process of converting physical assets and systems into digital and electronic mediums. Today, the term has been expanded to include refactoring and continuous improvement of digital assets, as well.
DevOps has played a big role in shaping that definition. But, with 84% of companies failing at Digital Transformation, I believe we have now entered into significant Digital Transformation fatigue. With the gap between legacy companies and “digitally transformed” companies growing larger and more apparent every day, companies cannot afford to succumb to fatigue. We see this evidenced in the decline of Sears, taxi companies, and legacy insurers and banks; and in the rise of Amazon, Uber, Esurance, and Metro Bank.
As to why companies are failing at Digital Transformation, I agree with the recently shared sentiment of Ken Piddington, CIO of MRE Consulting, “Most businesses have not achieved Digital Transformation, but incremental improvement.”
I believe that successful Digital Transformation does not begin with technology. It begins with defining the value-based outcomes, creating cross-functional teams aligned to delivering that value, and then trusting those teams to deliver. Doing this requires a high-trust culture, which is hard for most legacy corporations: they are typically command-and-control structures, where trust exists only in small pockets of direct relationships. These teams must then be empowered to quickly explore and test solutions that best deliver the stated value to the business, whether that means eliminating or replacing redundant policies or implementing new technologies. Otherwise, if you simply forge ahead with technology, your organization may see incremental improvements, but it will still be the same dinosaur it was before, never undergoing a transformation of any sort.
Q: Data is a key factor that can help fuel digital transformation, but it’s no secret that data is growing in size and complexity by the day. What challenges do today’s enterprises face because of this rapid change in data? And what role does DevOps play in helping to solve these challenges?
A: The core challenges have remained the same for hundreds of years: Corporate Espionage, Information Theft, Competitive Differentiation, Market Disruption, Agility, etc. But, since the “Data Boom,” the challenges have grown in complex ways. The amount of data has grown, but so has the number of locations where the data exists (on-premise, offshore, multiple clouds) and the number of people that need access to that data.
Another complicating factor is that the value of data has also continued to grow, and anytime something becomes more valuable, it attracts more crime. So, enterprises face increasing demand to make data available to more people (“Data Consumers”) in more places, countered by a finite number of people (“Data Operators”) and resources available to secure, deliver, and ensure data is always available. We call the opposing force between these two groups “Data Friction.”
Data Friction is what kills innovation in companies, preventing enterprises from delivering application projects on time, reacting swiftly to market forces, and running a truly data-driven business.
I say this as a DevOps practitioner and evangelist: DevOps specifically addresses the problem of streamlined development, deployment, and operation of applications and services. It’s not really focused on access to data. What DevOps has done is eliminate the constraints that used to hide the elephant in the room: Data. After all, what did it matter if it took 14 days to refresh the data in a QA environment when it would take just as long, if not longer, to re-baseline the operating systems and applications?
Additionally, while DevOps has helped turn our attention to the process of managing data, the associated challenges extend far beyond the application development and testing lifecycle. This problem is also experienced in other mission-critical systems like business analysis, financials (GL reconciliation), logistics, and regulatory auditing.
Q: In your words, what is DataOps? What does it involve? How is it different from DevOps?
A: DataOps is the alignment of people, process, and technology to enable the rapid, automated, and secure management of data. Its goal is to improve outcomes by bringing together those that need data with those that provide it, eliminating friction throughout the data lifecycle.
DataOps entails identifying and eliminating constraints that create friction in five key areas within the data lifecycle: governance, operations, delivery, transformation, and versioning. This is done by aligning people into value-oriented teams, refactoring or eliminating processes, and employing technologies wherever possible to eliminate the need for manual human intervention.
A high-level compare & contrast of DevOps and DataOps:
- Both are a commitment to people, process, and technology
- Both transition from a reactive to a proactive, self-service model
- DevOps solves the “throw the code over the wall” problem
- DataOps solves the “submit a ticket request for data” problem
- Both require that operators evolve their roles into platform engineers
- DevOps introduces a new role, the Site Reliability Engineer (“SRE”), who manages deployment tools
- DataOps introduces a new classification of “Data Operators” who are aligned to implement and oversee data policies
- Both are primed for liftoff due to new enabling technologies
- DevOps is growing thanks to cloud, containers, Kubernetes, etc.
- DataOps is now possible thanks to advancements in copy-on-write filesystems (COWFS), the abundance of APIs, and the proliferation of cloud
Q: What types of businesses can benefit most from adopting a DataOps strategy?
A: Any business that depends on data to survive and succeed: Financial Services, Insurance, Retail, Travel, even Agriculture and Waste Management. DataOps is about reducing the coefficient of data friction so that data can flow freely to the people that need it, when they need it, where they need it. In today’s economy, I cannot think of a business that does not depend on data; certainly, businesses that expect to be around in the future.
Q: Tell us a bit more about the Delphix Dynamic Data Platform. What are its key features, and how does it aim to help modern enterprises achieve their digital initiatives?
A: The Delphix Dynamic Data Platform (“DDP”) enables companies to accelerate application and data projects, enable cloud migration and adoption, and decrease the surface area of data risk. The platform exists on-premise or in the cloud and non-disruptively connects to the sources specified by Data Operators to perform a one-time copy of the data on the source. The DDP then continuously captures incremental changes on the source to record a complete application data history. All of this source data is filtered, deduped, and compressed inside the DDP. From that compressed data footprint, the DDP virtualizes the data and allows operators to create lightweight, full read/write virtual data copies in minutes.
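The idea of creating lightweight, full read/write copies from a shared, compressed data footprint rests on copy-on-write block sharing. As a rough illustration (a toy model of the general technique, not Delphix’s actual implementation), each virtual copy reads from the shared base image and captures only its own writes in a private overlay:

```python
# Toy model of copy-on-write block sharing, the general technique
# behind lightweight virtual data copies. This is an illustration,
# not the DDP's actual implementation.

class BlockStore:
    """Holds the single physical copy of each data block."""
    def __init__(self, blocks):
        self.blocks = list(blocks)  # shared, read-only base image

class VirtualCopy:
    """A virtual copy: reads fall through to the shared store,
    writes are captured in a private overlay (copy-on-write)."""
    def __init__(self, store):
        self.store = store
        self.overlay = {}  # block index -> privately written block

    def read(self, i):
        return self.overlay.get(i, self.store.blocks[i])

    def write(self, i, data):
        self.overlay[i] = data  # only changed blocks consume new space

store = BlockStore(["blk0", "blk1", "blk2"])
dev = VirtualCopy(store)  # "provisioning" copies no data at all
qa = VirtualCopy(store)

dev.write(1, "blk1-modified")
assert dev.read(1) == "blk1-modified"  # dev sees its own change
assert qa.read(1) == "blk1"            # qa still shares the base block
assert len(dev.overlay) == 1           # storage cost ~ changed blocks only
```

Because a new copy is just an empty overlay over shared blocks, provisioning takes minutes rather than hours, and storage consumption grows only with the blocks each copy actually changes.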
The DDP leverages block sharing technology to quickly create and deliver the data copies, while reducing storage consumption by around 90%. The DDP continuously protects sensitive information with integrated data masking by replacing those sensitive values with fictitious, yet realistic, equivalents. The DDP automatically identifies sensitive values then applies custom or predefined masking algorithms. By seamlessly integrating data masking and provisioning into a single platform, the DDP ensures that secure data delivery is effortless and repeatable.
All of these features combined into a single platform provide Data Operators with a single point of control to govern, audit, and monitor data, and to provision it in minutes. The DDP exposes a full set of APIs, so any interaction that can be performed via the UI or CLI can be performed just as easily via popular orchestration tools like Jenkins. The DDP also provides an easy-to-use self-service interface with dynamic data controls that let Data Consumers manipulate the data provided to them, at will. Without involving a database or storage admin, a Data Consumer can easily rewind their data to any previous state to undo changes, or refresh their data to the latest version available to them.
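To make the API-driven workflow concrete, here is a hedged sketch of what calling a data platform’s REST API from an orchestration pipeline might look like. The endpoint paths, payloads, and class names below are hypothetical illustrations, not Delphix’s actual API; the client only composes the requests so the example runs offline:

```python
# Hypothetical sketch of driving a data platform's REST API from a
# CI pipeline (e.g., a Jenkins stage). Endpoint paths and payloads
# are illustrative assumptions, not Delphix's actual API.

import json

class DataPlatformClient:
    """Minimal client that composes API calls; a real pipeline would
    send these with an HTTP library instead of returning dicts."""
    def __init__(self, base_url, token):
        self.base_url = base_url.rstrip("/")
        self.token = token

    def _request(self, method, path, body=None):
        # Returned as a dict so the example is runnable offline;
        # a real client would perform the HTTP call here.
        return {
            "method": method,
            "url": f"{self.base_url}{path}",
            "headers": {"Authorization": f"Bearer {self.token}"},
            "body": json.dumps(body) if body is not None else None,
        }

    def refresh(self, dataset):
        """Refresh a virtual copy to the latest available data."""
        return self._request("POST", f"/datasets/{dataset}/refresh")

    def rewind(self, dataset, snapshot):
        """Rewind a virtual copy to a previous state."""
        return self._request("POST", f"/datasets/{dataset}/rewind",
                             {"snapshot": snapshot})

client = DataPlatformClient("https://ddp.example.com/api", "s3cret")
call = client.refresh("qa-db")
assert call["url"] == "https://ddp.example.com/api/datasets/qa-db/refresh"
```

In a pipeline, a stage like this would refresh test data to the latest version before the test suite runs, with no ticket to a database or storage admin in the loop.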
Q: How can organizations make a seamless transition from DevOps to DataOps?
A: While it has been my experience that adopters of DevOps are often also ready adopters of DataOps, DevOps is certainly not a requirement. DevOps and DataOps are two wings of the same bird, seeking to help their companies climb higher and soar at the top. No matter where an enterprise is on their journey, they can begin to incorporate DataOps into their ethos.
DataOps is really about treating your most strategic asset, data, strategically. For too long, enterprises have managed data tactically by throwing specialized teams and products at specialized datasets (e.g., Oracle DBAs and tools for Oracle databases), which has created a Frankenstein enterprise of tightly coupled, antiquated systems that are fragile and cannot scale. Much like DevOps ultimately forged the Site Reliability Engineer role to apply software programming practices to building reliable and scalable systems, successful DataOps enterprises employ an “Enterprise Data Reliability Engineer” who applies aspects of data engineering, information security, and software programming to data operations to ensure that data flows freely, is secure, and is always available throughout the enterprise.
If enterprises start embracing DataOps as a strategy from the beginning, with proper executive support and resourcing, then their likelihood of success greatly increases. If DataOps is merely a tooling/technology strategy, then it is doomed to fail.
Q: What are some of the best use cases for the DDP?
A: One of the great things about the DDP is that customers come to us with new use cases all the time. But some of our most significant use cases have been in Testing, Cloud Migration, and Data Protection.
Two quick proof points:
StubHub deployed Delphix to help with its digital transformation strategy, and is now able to deliver more frequent, higher quality releases faster by automating the process of provisioning dev and test environments. StubHub developers are now able to work in parallel on more projects with less wait time.
Molina Healthcare chose Delphix to help accelerate application development. With dramatically faster data delivery, releases that once took 6 months are now completed in 3 months. Molina also implemented data masking to protect patient information from breaches and ensure compliance with HIPAA.
Q: Please share the steps and measures that Delphix has established to ensure that critical customer data is protected.
A: In short, the DDP drastically reduces our customers’ surface area of data risk by automatically masking sensitive information. In most cases, the only part of the enterprise that truly requires sensitive customer information is the production system. Most enterprises understand the risks posed to production and secure that environment to the best of their ability. Yet the overwhelming majority of their data resides in non-production copies.
The DDP gives enterprises the capability to easily define masking policies according to their needs, and then enforce those policies via role-based access control. They can easily specify that all developer and tester requests are fulfilled with masked data, while business analyst requests are filled with real data. All masked data is realistic, deterministic, and non-invertible. With the DDP, enterprises have all the benefits of realistic data without the security concerns.
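To illustrate what “deterministic and non-invertible” means for masked data, here is a minimal sketch of one common technique (a keyed hash selecting a fictitious-but-realistic replacement). This is an illustration of the properties, not Delphix’s masking algorithm; the name list and key are made up:

```python
# Sketch of deterministic, non-invertible masking: a keyed hash picks
# a fictitious-but-realistic replacement. The same real value always
# masks to the same fake value (deterministic), and the hash cannot be
# reversed to recover the original (non-invertible). Illustrative
# technique only -- not Delphix's actual masking algorithm.

import hashlib
import hmac

# Hypothetical pool of realistic replacement values.
FIRST_NAMES = ["Alice", "Bruno", "Chen", "Dara", "Elif", "Femi"]

def mask_name(real_name: str, key: bytes) -> str:
    """Map a real name to a fictitious one via a keyed hash (HMAC)."""
    digest = hmac.new(key, real_name.encode(), hashlib.sha256).digest()
    index = int.from_bytes(digest[:4], "big") % len(FIRST_NAMES)
    return FIRST_NAMES[index]

key = b"masking-key"
# Determinism: masking the same value twice yields the same result,
# which preserves referential integrity across masked datasets.
assert mask_name("Margaret", key) == mask_name("Margaret", key)
```

Determinism matters because the same customer appearing in two non-production datasets masks to the same fake value, so joins and tests still behave realistically.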
Additionally, the DDP supports commonly used encryption methods for data sources (e.g., Oracle TDE), and it resides within a customer’s network boundaries so that data remains under the customer’s network and system protections.
Q: What trends do you think will shape the future of DataOps? What else can we expect from Delphix in the future?
A: I think Machine Learning will be a major force that will both drive the need for DataOps and shape the future of DataOps. As ML becomes more advanced, the amount of data and the power of data will grow by orders of magnitude. ML will pave the future for true AI and all of the technological advances that will entail. We have already seen a small glimpse of this with Oracle’s announcement of the Autonomous Database.
When the Autonomous Database is proven out, this will fundamentally change the landscape and data labor force. Subscribing to Jed Yueh’s “Tree of Innovation” from Disrupt or Die: What the World Needs to Learn from Silicon Valley to Survive the Digital Era, I believe we will see many Technology Transfer apps that will pop up to offer “autonomous” innovations for other data platforms and orthogonal technologies. This will fundamentally change the shape and nature of roles and responsibilities, between consumers and operators, man and machine, across the entire data lifecycle.
The other major contributing factor, I believe, will be the advancement of a truly connected global community. I believe within the next 20 years we will see broadband speeds in excess of 20 Gbps available wirelessly to all people, dramatically increasing the demand for data and opening previously unavailable markets.
As for what to expect next, our mission is to be Fast, Secure, and Everywhere. In addition to expanding our coverage of cloud platforms and data types to meet the needs of our customers, Delphix is always evaluating ways in which we can eliminate more friction from the data lifecycle so that our customers can unlock innovation in their enterprise. We have some exciting things coming up soon, but you will have to stay tuned for specifics. Delphix is also taking a leadership role in driving the awareness around DataOps with other DataOps technology vendors and some of the world’s F100 companies. DataOps is much bigger than any one technology or vendor, and we realize that together we can usher in the same types of rapid advancements that DevOps brought to the SDLC.
About Delphix
Founded in 2008, Delphix is on a mission to help companies break free from the data friction that hinders growth and innovation. Its Dynamic Data Platform helps companies connect, virtualize, and secure their most critical data assets and make them flow freely within the business, enhancing agility and competitiveness. Delphix is trusted by the world’s leading businesses, including Facebook and Walmart, and was recently ranked the 210th fastest-growing company in North America on Deloitte’s Technology Fast 500™.