CIO


May 24, 2023

By Thor Olavsrud

What is dataops? Collaborative, cross-functional analytics

Dataops (data operations) brings together devops teams with data engineers and data scientists to provide the tools, processes, and skills to enable the data-driven enterprise.


What is dataops?


Dataops (data operations) is an agile, process-oriented methodology for developing and delivering analytics. It brings together devops teams with data engineers and data scientists to provide the tools, processes, and organizational structures to support the data-focused enterprise. Research firm Gartner further describes the methodology as one focused on “improving the communication, integration, and automation of data flows between data managers and data consumers across an organization.”


Dataops goals


According to Dataversity, the goal of dataops is to streamline the design, development, and maintenance of applications based on data and data analytics. It seeks to improve the way data are managed and products are created, and to coordinate these improvements with the goals of the business. According to Gartner, dataops also aims “to deliver value faster by creating predictable delivery and change management of data, data models, and related artifacts.”


Dataops vs. devops


Devops is a software development methodology that brings continuous delivery to the systems development lifecycle by combining development teams and operations teams into a single unit responsible for a product or service. Dataops builds on that concept by adding data specialists — data analysts, data developers, data engineers, and/or data scientists — to focus on the collaborative development of data flows and the continuous use of data across the organization.

DataKitchen, which specializes in dataops observability and automation software, maintains that dataops is not simply “devops for data.” While both practices aim to accelerate the development of software (software that leverages analytics in the case of dataops), dataops has to simultaneously manage data operations. 
 

Dataops principles


Like devops, dataops takes its cues from the agile methodology. The approach values continuous delivery of analytic insights with the primary goal of satisfying the customer.

According to the Dataops Manifesto, dataops teams value analytics that work, measuring the performance of data analytics by the insights they deliver. Dataops teams also embrace change and seek to constantly understand evolving customer needs. They self-organize around goals and seek to reduce “heroism” in favor of sustainable and scalable teams and processes.

Dataops teams also seek to orchestrate data, tools, code, and environments from beginning to end, with the aim of providing reproducible results. Such teams tend to view analytic pipelines as analogous to lean manufacturing lines and regularly reflect on feedback provided by customers, team members, and operational statistics.
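That end-to-end orchestration with reproducible results can be sketched in code. The following is a minimal illustration (not from the article): each pipeline stage is a deterministic function, and an automated quality check runs between stages so bad records are caught early, like inspection stations on a lean manufacturing line. All stage names and checks here are hypothetical.

```python
def extract():
    # Stand-in for reading from a source system.
    return [{"user_id": 1, "amount": 120.0}, {"user_id": 2, "amount": -5.0}]

def check_quality(rows):
    # Automated test between stages: reject bad records rather than
    # letting them flow downstream to analytics consumers.
    return [r for r in rows if r["amount"] >= 0]

def transform(rows):
    # Deterministic transformation: the same input data, code, and
    # environment always yield the same output (reproducible results).
    return [{"user_id": r["user_id"], "amount_cents": int(r["amount"] * 100)}
            for r in rows]

def run_pipeline():
    rows = extract()
    rows = check_quality(rows)
    return transform(rows)

print(run_pipeline())  # [{'user_id': 1, 'amount_cents': 12000}]
```

In a real dataops pipeline each stage would be a separate, monitored job in an orchestrator, but the principle is the same: checks live between stages, and every stage is repeatable.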

Where dataops fits


Enterprises today are increasingly injecting machine learning into a vast array of products and services, and dataops is an approach geared toward supporting the end-to-end needs of machine learning.

“For example, this style makes it more feasible for data scientists to have the support of software engineering to provide what is needed when models are handed over to operations during deployment,” Ted Dunning and Ellen Friedman write in their book, Machine Learning Logistics.

“The dataops approach is not limited to machine learning,” they add. “This style of organization is useful for any data-oriented work, making it easier to take advantage of the benefits offered by building a global data fabric.”

They also note dataops fits well with microservices architectures.


Dataops in practice


To make the most of dataops, enterprises must evolve their data management strategies to deal with data at scale and in response to real-world events as they happen, according to Dunning and Friedman.

Because dataops builds on devops, cross-functional teams that cut across “skill guilds” such as operations, software engineering, architecture and planning, product management, data analysis, data development, and data engineering are essential. Dataops teams should be managed in ways that increase collaboration and communication among developers, operations professionals, and data experts.

Data scientists may also be included as key members of dataops teams, according to Dunning. “I think the most important thing to do here is to not stick with the more traditional Ivory Tower organization where data scientists live apart from dev teams,” he says. “The most important step you can take is to actually embed data scientists in a devops team. When they live in the same room, eat the same meals, hear the same complaints, they will naturally gain alignment.”

But Dunning also notes that data scientists may not need to be permanently embedded in a dataops team.

“Typically, there’s a data scientist embedded in the team for a time,” Dunning says. “Their capabilities and sensibilities begin to rub off. Someone on the team then takes on the role of data engineer and kind of a low-budget data scientist. The actual data scientist embedded in the team then moves along. It’s a fluid situation.”


How to build a dataops team


Most devops-based enterprises already have the nucleus of a dataops team on hand. Once they have identified projects that need data-intensive development, they need only add someone with data training to the team. Often that person is a data engineer rather than a data scientist. DataKitchen suggests organizations seek out dataops engineers who specialize in creating and implementing the processes that enable teamwork within data organizations. These individuals design the orchestrations that allow work to flow from development to production and ensure that hardware, software, data, and other resources are available on demand.

Many teams are built of individuals with overlapping skill sets, or individuals may take on multiple roles within a dataops team, depending on expertise.
 

According to Michele Goetz, vice president and principal analyst at Forrester, some of the key areas of expertise on dataops teams include:

  • Databases
  • Integration
  • Data-to-process orchestration
  • Data policy deployment
  • Data and model integration
  • Data security and privacy controls

Regardless of makeup, dataops teams must share a common goal: the data-driven needs of the services they support.


Dataops roles


According to Goetz, dataops team members include:
 
  • Data specialists, who support the data landscape and development best practices
  • Data engineers, who provide ad hoc and system support to BI, analytics, and business applications
  • Principal data engineers, who are developers working on product and customer-facing deliverables


Dataops salaries


Here are some of the most popular job titles related to dataops and the average salary for each position, according to data from PayScale:
 


Dataops tools


The following are some of the most popular dataops tools:
 
  • Census: An operational analytics platform specializing in reverse ETL, the process of syncing data from a source of truth (such as a data warehouse) to frontline systems such as CRM and advertising platforms
  • Databricks Lakehouse Platform: A data management platform that unifies data warehousing and AI use cases
  • Datafold: A data quality platform for detecting and fixing data quality issues
  • DataKitchen: A data observability and automation platform that orchestrates end-to-end multi-tool, multi-environment data pipelines
  • dbt: A data transformation tool for creating data pipelines
  • Tengu: A dataops orchestration platform for data and pipeline management
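The reverse-ETL pattern mentioned above (which Census implements) can be illustrated with a short sketch. Everything here is a stand-in — the warehouse query and CRM client are hypothetical, not any vendor's real API — but it shows the core idea: read rows from the source of truth and upsert them into a frontline system.

```python
def fetch_from_warehouse():
    # Stand-in for a SQL query against the data warehouse
    # (the source of truth).
    return [{"email": "a@example.com", "lifetime_value": 310}]

class FakeCRM:
    """Hypothetical frontline system, e.g. a CRM."""
    def __init__(self):
        self.contacts = {}

    def upsert(self, key, fields):
        # Upsert keeps the sync idempotent: re-running it converges on
        # the same CRM state instead of creating duplicate records.
        self.contacts.setdefault(key, {}).update(fields)

def reverse_etl(crm):
    # Push warehouse-computed attributes out to the frontline tool.
    for row in fetch_from_warehouse():
        crm.upsert(row["email"], {"lifetime_value": row["lifetime_value"]})
    return crm

crm = reverse_etl(FakeCRM())
print(crm.contacts)  # {'a@example.com': {'lifetime_value': 310}}
```

Idempotent upserts are the key design choice: reverse-ETL jobs typically run on a schedule, so each run must be safe to repeat.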

© 2023 CIO
140 Kendrick Street,Building B
Needham, MA 02494
United States