CIO

Sponsored by

HPE

FutureIT

Reinventing IT to support today’s data-driven business. A special series from the editors of CIO First Look.

September 11, 2023

6 barriers to becoming a data-driven company

Companies that embrace data-driven approaches stand to perform much better than those that don’t, yet they’re still in the minority. What’s standing in the way?

It’s no surprise that becoming a data-driven company is at the top of the corporate agenda. A recent IDC whitepaper found that data-savvy companies were three times more likely to report improved revenue, nearly three times more likely to report reduced time to market for new products and services, and more than twice as likely to report improved customer satisfaction, profits, and operational efficiency.

But according to a January survey of data and information executives from NewVantage Partners, merely a quarter of companies describe themselves as data-driven, and only 21% say they have a data culture in their organizations.

Several key factors help explain this disconnect. Cultural issues were cited by 80% of respondents as the biggest factor keeping them from getting value from their data investments, while only 20% pointed to technology limitations. And according to experts who have surmounted these roadblocks firsthand, other barriers remain as well.

Recognizing bad data


Even the best of analytics strategies can be derailed if the underlying data is bad. But solving data quality problems requires a deep understanding of what the data means and how it’s collected. Resolving duplicate data is one issue, but when the data is just wrong, that’s much harder to fix, says Uyi Stewart, chief data and technology officer at Data.org, a nonprofit backed by the Mastercard Center for Inclusive Growth and the Rockefeller Foundation.
 
“The challenge of veracity is much more difficult and takes more time,” he says. “This is where you require domain expertise to allow you to separate fact from fiction.”
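The distinction Stewart draws can be illustrated with a small sketch, using hypothetical vendor records: near-duplicates can be collapsed mechanically by normalizing a key, but a value that is simply wrong passes every mechanical check and needs a domain expert to spot.

```python
# Hypothetical vendor records. Deduplication is mechanical; spotting the
# wrong value in the last record requires domain knowledge.

def normalize(name: str) -> str:
    """Collapse case and punctuation so near-duplicate names share one key."""
    return " ".join(name.lower().replace(".", "").replace(",", "").split())

def dedupe(records: list[dict]) -> list[dict]:
    """Keep the first record seen for each normalized vendor name."""
    seen = {}
    for rec in records:
        seen.setdefault(normalize(rec["vendor"]), rec)
    return list(seen.values())

records = [
    {"vendor": "Acme Dairy B.V.", "country": "NL"},
    {"vendor": "acme dairy bv", "country": "NL"},  # duplicate: resolvable mechanically
    {"vendor": "Polder Foods", "country": "XX"},   # wrong code: only an expert notices
]
print(len(dedupe(records)))  # 2
```

The dedupe step needs no business knowledge, but nothing in the code can tell that “XX” is not a real country; that is the fact-versus-fiction judgment Stewart describes.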

Simple technical skills are not enough. That’s what Lenno Maris found out when he joined FrieslandCampina, a multinational dairy cooperative, in 2017, just as the company was embarking on a strategic plan to become data-driven.

It was a big challenge. The company has over 21,000 employees in 31 countries, and has customers in over 100 countries. It quickly became clear that data quality was going to be a big hurdle.

For example, inventory was reported based on the number of pallets, but orders were based on unit numbers, says Maris, the company’s senior global director for enterprise data and authorizations. This meant that people had to do manual conversions to ensure the right quantities were delivered at the right price.
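Encoding such a conversion once, in a shared reference table rather than in each person’s head, is the kind of rule a data team can systematize. A minimal sketch, with invented product names and pallet sizes:

```python
# Hypothetical sketch of the pallets-versus-units mismatch: inventory is
# counted in pallets, orders are placed in units, and a units-per-pallet
# reference table reconciles the two. All figures below are invented.

UNITS_PER_PALLET = {
    "whole-milk-1l": 720,   # assumed: 720 cartons per pallet
    "butter-250g": 4800,    # assumed: 4,800 packs per pallet
}

def available_units(product: str, pallets: int) -> int:
    """Convert a pallet count into the units that orders are placed in."""
    return pallets * UNITS_PER_PALLET[product]

def can_fulfill(product: str, pallets_on_hand: int, units_ordered: int) -> bool:
    """Check whether on-hand inventory covers an order, in matching units."""
    return available_units(product, pallets_on_hand) >= units_ordered

print(can_fulfill("whole-milk-1l", 3, 2000))  # 3 pallets = 2,160 units -> True
```

With the table in one place, every system answers the quantity question the same way instead of relying on manual conversions.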
 


Or take commodity codes. Each plant put in the commodity code that best fit the product, with different plants using different codes that were then used to reclaim import and export taxes. “But tax reporting is performed at the corporate level, so consistency is needed,” says Maris.

To fix the data issues, FrieslandCampina had to evolve its data organization. At the start of the project, the team focused mostly on the technical details of data entry. But that changed quickly.

“We’ve been able to retrain our team to become process experts, data quality experts, and domain experts,” Maris says. “That allows us to transition to proactive data support and become advisors to our business peers.”

Similarly, the technology platform chosen to help the company improve its data quality, Syniti, had to adapt as well.

“The platform is good but highly technical,” Maris says. “So we had some challenges with our business user adoption. We’ve challenged Syniti to provide a business-relevant user interface.”

In 2018, the tier-one master data objects were in place: vendors, materials, customers, and finance. The following year, this expanded to tier-two data objects, including contracts, bills of materials, rebates, and pricing. By the end of 2022, the company had finished orchestrating the logical business flows and the project was fully deployed. The result was a 95% improvement in data quality and a 108% improvement in productivity.

“Prior to implementation of the foundational data platform, we had over 10,000 hours of rework on our master data on an annual basis,” he says. “Today, this has been reduced to almost zero.”

Data quality was also an issue at Aflac, says Aflac CIO Shelia Anderson. When Aflac began its journey toward becoming a data-driven company, there were different business operations across Aflac’s various books of business, she says.

“There were multiple systems of data intake, which presented inconsistencies in data quality,” she says. That made it difficult to get useful insights from the data. To solve the problem, Aflac moved to a digital-first, customer-centric approach. This required data consolidation across various ecosystems, and as a result, the customer experience has improved and the company has been able to increase automation in its business processes and reduce error rates.

“A significant benefit is that it frees bandwidth for customer service agents, enabling them to focus on higher complexity claims that require a more personal touch,” she says.

Seeing data consolidation as a technology problem


One of Randy Sykes’ previous employers spent eight years building a data warehouse without success.

“That’s because we tried to apply standard system development techniques without making sure that the business was with us in lockstep,” he says.

Today, Sykes is IT director of data services at Hastings Mutual Insurance Co. This time, he took a different approach to consolidating the organization’s data.

Ten years ago, the company decided to bring everything together into a data warehouse. At the time, reports took 45 days to produce and business users didn’t have the information they needed to make business decisions.

First, data would be collected in a landing area via nightly batch imports from legacy systems. It would then move into a staging area, where business rules would be applied to consolidate and reconcile data from different systems. This required a deep understanding of how the company operates and what the data means. But this time, the project was successful because there were subject matter experts on the team.

“We had a couple of business folks who’d been with the company a long time and had a lot of knowledge of the organization,” he says. “You actually have to have a cross-functional team to be successful.”

For example, different insurance policy systems might have different terms, and different coverage areas and risks. In order to consolidate all this information, the data team needs to have a good understanding of the business language and the rules needed to transform the raw data into a universal format.
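The landing-and-staging flow Sykes describes can be sketched minimally: raw rows land unchanged from each legacy system, and per-source business rules, written with subject-matter experts, reconcile them into one common format. The system names, fields, and rules below are invented for illustration.

```python
# Two legacy policy systems land rows as-is; per-source business rules
# translate each into a single staging schema. All names are hypothetical.

landing = [
    {"source": "policy_sys_a", "prem": "1200.50", "state": "MI"},
    {"source": "policy_sys_b", "premium_cents": 98075, "st": "OH"},
]

def to_staging(row: dict) -> dict:
    """Apply per-source business rules to produce a common record."""
    if row["source"] == "policy_sys_a":
        # System A stores premiums as dollar strings.
        return {"premium": float(row["prem"]), "state": row["state"]}
    if row["source"] == "policy_sys_b":
        # System B stores premiums as integer cents.
        return {"premium": row["premium_cents"] / 100, "state": row["st"]}
    raise ValueError(f"unknown source: {row['source']}")

staging = [to_staging(r) for r in landing]
print(staging[1])  # {'premium': 980.75, 'state': 'OH'}
```

The code is trivial; knowing that system B stores cents while system A stores dollar strings is exactly the business knowledge the cross-functional team supplies.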

“That’s the biggest challenge that companies run into,” he says. “They try to get the data and technically put it together and forget the business story behind the information. A lot of times, these types of projects fail.”

Today, a report that used to take 45 days can be turned around in 24 hours, he says. Then, as databases continue to get modernized and become event-driven, the information will become available in real time.

No short-term business benefits


Once Hastings started getting its data together, the project began producing value for the company within a year, even though the data warehouse itself, begun in 2014, wasn’t delivered until 2017.

That’s because the landing and staging areas were already providing value in terms of gathering and processing the data.

Data projects have to deliver business value all throughout the process, Sykes says. “Nobody is going to wait forever.”

A similar “quick win” helped lead to the success of a major data project for Denise Allec, principal consultant at NTT Americas, back when she was the director of corporate IT at a major corporation.

A six-week proof-of-concept project showed that the project had value, she says, and helped overcome challenges such as business units’ unwillingness to give up their silos of data.

“Giving up ownership of data represents a loss of control to many,” she says. “Information is power.”

This kind of data hoarding isn’t limited to senior executives, though.

“Employees tend not to trust others’ data,” she says.

They want to validate and scrub their own sources, massage the data, and create their own reporting tools that work for their unique needs.

“We’ve all seen the numerous duplicative databases that exist throughout a company and the challenges that arise from such a situation,” she says.

Choosing data projects that don’t have immediate benefits is a major roadblock to successful data initiatives, confirms Sanjay Srivastava, chief digital strategist at Genpact.

“Until you do this, it’s all a theoretical discussion.”

The flip side is choosing projects that don’t have any ability to scale—another major barrier.

Without the ability to scale, a data project won’t have meaningful long-term impact, instead using up resources for a small or idiosyncratic use case.

“The key is how you deliver business value in chunks, in a time frame that keeps people’s attention, and that is scalable,” he says.

Not giving end users the self-service tools they need


Putting the business users first means giving people the data they need in the form they need it. Sometimes, that means Excel spreadsheets. At Hastings, for example, staff would historically copy-and-paste data into Excel in order to work with it.

“Everybody uses Excel,” says Hastings’ Sykes. “Now we say, ‘Why don’t we just give you the data so you don’t have to copy-and-paste it anymore.’”

But the company has also been creating dashboards. Today, about a quarter of the company’s 420 employees use the dashboards, as do outside agencies.

“They can now help agents cross-sell our products,” he says. “We didn’t have that before.”

But providing people with the self-serve analytics tools they need is a challenge. “We’re still behind the eight ball a little bit,” he says. But with 200 business-focused dashboards already in place, the process is well under way.

Another organization that recently began the process of democratizing access to data is the Dayton Children’s Hospital in Dayton, Ohio.

“We weren’t doing that well five years ago,” says CIO J.D. Whitlock. “There were still a lot of spreadsheets. Now we’re using the Microsoft data stack, like a lot of people are doing. So as long as someone knows a little bit about how to use Power BI, we’re serving up the appropriate data, in the appropriate format, with appropriate security.”

In addition, data analysts have also been decentralized, so people don’t have to go to a single team with their data questions.

“Say you want to know how many of procedure X doctor Y did last year,” says Whitlock. “It’s a relatively simple query. But if you don’t give people the tools to do that themselves, then you’ve got a thousand requests.”
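The query Whitlock describes really is a one-liner once the data is served up in a usable form. A sketch over invented records, where each row is (doctor, procedure, year):

```python
# Invented procedure records: (doctor, procedure, year).
rows = [
    ("Y", "X", 2024),
    ("Y", "X", 2024),
    ("Z", "X", 2024),
    ("Y", "Q", 2024),
    ("Y", "X", 2023),
]

# How many of procedure X did doctor Y do in 2024?
count = sum(1 for doctor, proc, year in rows
            if doctor == "Y" and proc == "X" and year == 2024)
print(count)  # 2
```

When users can express questions this simple themselves, the central team is freed from fielding a thousand one-off requests.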

Putting self-serve data tools in place has helped the company move toward being a data-driven organization, he says. “With the caveat that it’s always a journey and you never declare victory.”

Not including end users in your development process


Ignoring user needs is nearly always a recipe for disaster. For example, Nick Kramer, leader of applied solutions at SSA & Company, a global consulting firm, recently worked with a national restaurant services company that was growing rapidly while its service levels were dropping.

“Everybody was pointing fingers at each other,” he says. “But the CIO had no dashboards or reports—just anecdotes and opinions.”

One of the problems was that the central installation system was widely ignored. Employees updated records, but after the fact. The system had been imposed on them and was hard to use.

“People in the order department, in sales, legal, and on the installation side—every office had their own spreadsheets they ran their schedules on,” Kramer says. “None of the communication was happening and the data wasn’t flowing. So you had to go office by office to find out who was doing what and how well, and which delays were unsolvable and which ones could be addressed.”

The solution was to get close to the business users, to understand how the data was used.

Joshua Swartz, partner at Kearney, had a similar experience recently when he was working on a consulting project with a US food company with several billion in annual revenues.

The company wanted to enable production managers to make better decisions about what to produce based on real data.

“For example, there’s a production line in a certain production site and it can make either tortilla chips or pita bread,” says Swartz. “If there’s a switchover, you have to stop and clean and change the ingredients.”

The old way, say, was to run four hours of tortilla chips and four hours of pita bread, while the data showed you should run two hours of tortilla chips, and tomorrow it might be the opposite. And since food products are perishable, getting production wrong means some product has to be thrown away. But when the company first designed its solution, the production workers weren’t involved, says Swartz. “They were too busy producing food and didn’t have time to stop and attend meetings.”

This wasn’t expected to be a problem because the company’s culture was hierarchical. “When the CEO says something and pounds their fist on the table, everyone has to follow suit,” he says.

But the new system was used for only a couple of weeks in the pilot site and then the employees found that the system didn’t really work for them and went back to doing things the old way. Also, it didn’t help that the company’s data czar was located a couple of layers down in the company’s technology organization, rather than closer to top management or to the business units.

Fixing the problem required bringing the actual employees to the design suite, even though it required adding capacity to the production lines to free up workers.

“Food companies with very thin margins weren’t comfortable making that investment,” Swartz says. But once the production workers became part of the process, they were able to contribute to the solution, and today a third to a half of the facilities are using the new technology.

Swartz also recommends that the chief data officer be located closer to the company’s most valuable data.

“If data is a strategic asset of the business, I would place the CDO closer to the part of the business that has ownership of the data,” he says. “If the organization is focused on using data for operational efficiency, then under the COO might be the right place.”

A sales-driven company might want to put the CDO under the sales officer, however, and a product company, under the marketing officer, he says. One consumer packaged goods company he worked with actually had the CDO report directly to the CEO.

“If you think of data as a technology problem, you’re going to keep running into challenges of how much value you are actually getting from data and analytics,” says Swartz.

A lack of trust


The responsible use of data is important for the success of data initiatives, and nowhere more so than in finance.

“Trust is of utmost importance in the banking sector,” says Sameer Gupta, chief analytics officer at DBS Bank. “It’s crucial to use data and models responsibly, and ethical considerations must be upheld while using data.”

Data use should be purposeful, respectful, and explainable, he says, and should never come as a surprise. “Data use should be expected by individuals and corporates,” he says.

By focusing on trust, he adds, the bank has been able to deploy AI and data use cases across the enterprise—260 at the last count—ranging from customer-facing businesses like consumer and small and medium enterprise banking, to support functions like compliance, marketing, and HR.

“In 2022, the revenue uplift from our AI and machine learning initiatives was about SGD 150 million [US $112 million], more than double that from the previous year,” he says. “We aspire to achieve SGD 1 billion in the next five years.”

Earning trust takes time and commitment. Becoming a data-driven company is all but impossible without it. But once trust is gained, it begins a virtuous cycle. According to a Capgemini change management study released in January, in organizations with strong data analytics, employees are 18% more likely to trust the company. And when those companies need to evolve further, the probability of successful change is 23 to 27% higher than at other organizations.

“Many people, including data experts, think most issues while transitioning toward becoming a data-driven company are technology-related,” says Eugenio Zuccarelli, a data scientist at a global retailer and former AI research scientist at MIT.

But the real barriers are personal, he says, as people have to learn to understand the value of making data-based decisions.

“While doing research at MIT, I often saw experts and leaders of organizations struggle with their transition toward becoming a more data-driven organization,” he says. “The main issues were usually cultural, such as a belief that technology would have overtaken their decision-making, rather than empowering them, and a general tendency to take decisions based on experience and gut feelings.”

People need to understand that their expertise is still vital, he adds, and that the data is there to provide additional input. 

Companies also need to stop thinking of becoming data-driven as a technology problem.

“All our clients are talking about becoming more data driven, and none of them know what it means,” says Donncha Carroll, partner in the revenue growth practice and head of the data science team at Lotis Blue Consulting. They focus on their technology capabilities, he says, not what people will be able to do with the data they get.

“They don’t put the user of the solution in the frame,” he says. “Lots of data analytics teams provide data dashboards that provide information that is neither useful nor actionable. And it dies on the vine.”

 
