DataOps is the newest name on the tech-term block, but the name is a bit of a misnomer. The connotation is that, because it’s a mashup of DevOps and data, it’s a term similar to DevOps. But that is not the reality on the ground. In fact, DataOps specialists are not strictly technology focused — they are outcome focused. They are focused on what value can be extracted from an organization’s vast stores of data. At least, in an ideal scenario.
Ideally, organizations won’t let the name DataOps throw them off with regard to the real power this role can bring to the modern enterprise. Organizations should not be asking their DataOps specialists to merely create spreadsheets of data that live in an analytics department. They should be tasking them with the goal of making better use of their data — of extracting specific value from it to solve problems. To achieve this primary objective, DataOps practitioners need more than just leading-edge technologies. They also need the right people who can leverage those technologies in innovative ways, and they need robust processes that go beyond traditional data processing methodologies.
We invited Nihar Gupta, General Manager, Data Services, and Nirmal Rangathan, Principal Architect, from Rackspace Technology to the Cloud Talk podcast. They chatted with technology evangelist and host Jeff DeVerter to explore the emergence, expectations and future of the DataOps role in the tech industry.
Tune in to hear the range of topics they discussed, including:
- Bringing together the right technologies, people and processes to extract value from data
- Creating an easier pathway for delivering that value into the right hands
- Determining the right time for building a DataOps operation within an organization
- Deciding on the qualifications to look for in ideal DataOps candidates
“The title DataOps is much more than just DevOps for data,” said Nirmal. “That’s part of it. But the role is much bigger than that. It actually pulls from multiple disciplines and converges them into one. The DevOps component brings in the automation tools we need. Lean manufacturing is where quality processes come into play. And data governance is a third important factor, ensuring the quality and security of your data. These three disciplines combine to form what a DataOps specialist should be.”
That trifecta of capabilities is what enables DataOps teams to gather data, clean it up, feed it into processes, extract value and deliver that value to the appropriate departments in the enterprise. Outcomes can include, for example, understanding customers better, finding greater efficiencies in the supply chain or gaining insights to create the next innovative product.
The whole purpose of data today is to extract value so you can answer a wide range of very difficult questions facing the enterprise, said Nihar. “Everything should be in service of that goal. The faster and more efficiently you can do that, the more effective you’ll be at achieving important goals, like differentiating yourself in the marketplace. So, the issue becomes: how do you maximize the value of your data and move it from its source to wherever it will deliver value, whether that’s a dashboard or a machine-learning model? To achieve this outcome, organizations must undergo a mindset shift about their data.”
Today, the DataOps role sits at one of two ends of a spectrum. Either teams are primarily focused on a traditional data governance role, where they act as data stewards and analysts, or they are moving toward a modern approach in which extracting value is the primary goal of the DataOps team. What corporations need to ask themselves is this: Where do we want to live on this spectrum?