
Teradata reports strong cloud earnings

Teradata’s cloud-first strategy is starting to pay off, with the company reporting its strongest cloud earnings to date in the fourth quarter of last year

Teradata reported its strongest cloud earnings to date in the fourth quarter of 2022, with annual recurring revenue from public cloud growing 81% year on year in constant currency to hit $357m, underscoring the traction it has been gaining in cloud-based data warehouses and analytics software.

The company attributed the growth to customer demand for its differentiated platform, resulting in new, incremental workloads that drove healthy migrations and expansions.

“Just over two short years ago, we declared that Teradata would be cloud-first, and the entire team stepped up and executed with determination and consistency. Once we set our sights on our cloud-first future, we have delivered more than a sixfold growth in cloud; just remarkable results with growth well ahead of the market,” Teradata CEO Steve McMillan said during a recent earnings call.

In August 2022, at the New York Stock Exchange, Teradata launched VantageCloud Lake, a cloud-native version of its data and analytics platform, and ClearScape Analytics, an expanded version of its business intelligence suite.

McMillan said there had been “incredibly positive market response to the launch of these powerful capabilities”, adding that Teradata customers were already leveraging the Cloud Lake product to get meaningful business value.

Cloud-first strategy

Speaking to Computer Weekly in Singapore, Richard Petley, executive vice-president of Teradata’s international business, pointed out two aspects of the company’s cloud-first strategy.

“One is helping our customers modernise and migrate to a cloud platform. And there are lots of different motivations for why customers would want to do that, whether it’s around agility, cost or flexibility,” he said.

Petley said cloud-based delivery also creates new opportunities for Teradata customers to solve new and old problems by leveraging technologies such as artificial intelligence and machine learning.

“That’s not to say we are cloud-only,” he added. “One of the strong things about the Teradata business model is that we support a variety of deployment methods, and we have customers that are clear about staying on-premise and we’re delighted to continue to support them.”

Used by some of the world’s largest organisations to run mission-critical, high-performance data workloads, Teradata is not short of capabilities compared with the likes of Snowflake, Databricks, Amazon, Microsoft and Google.

But when asked about how Teradata stacks up against its cloud-native rivals, Petley flipped the question around: “If you think about what we’ve been famous for – solving at scale, complex data and analytical needs for our customers in a variety of environments – how do some of these cloud-native competitors compare to us?

“In those areas, whether it’s a technology or cost per query comparison, we think we’re very favourably positioned. And with Cloud Lake, which is the best of Teradata that people have traditionally loved but packaged in a cloud-native environment, we have the ability to take all of the attributes of Teradata and apply them into the cloud space.”

Fending off perceptions

Still, Teradata has had to fend off the perception that its systems are expensive, a view that Martin Willcox, Teradata’s international vice-president for analytics and architecture, said is mistaken.

“The TCO [total cost of ownership] of large enterprise platforms is brutally hard to measure accurately in large and complex organisations, and most don’t measure it properly,” Willcox said. “They take shortcuts, and what they typically look at is acquisition costs, rather than TCO and lifetime costs.”

Willcox said the cost of storage tends to dominate TCO calculations for data warehouses and analytics. He argued, however, that fixating on that metric reflects a view of data as an asset to be accumulated rather than a product that drives business outcomes.

“If the thinking is that we’re going to collect as much data as possible without really thinking about how we’re going to use it and how it’s going to be consumed, then it’s a cost of doing business.”

A more meaningful measurement, Willcox suggested, is cost per query in analytics environments, which tend to be read intensive. “How often are we reading the data and how much does it cost us each time we read the data?”

To that, he claimed that the cost per query for Teradata is very competitive: “If you look at two platforms, say, with a 1.5 times difference in cost to acquire, and if one of those platforms is supporting 10 or 100 times more queries than the other, then the cost per query is completely different.”
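
Willcox’s arithmetic is easy to reproduce. The sketch below is a hypothetical illustration of that cost-per-query comparison; the figures are invented for the example and are not Teradata’s (or any competitor’s) actual pricing.

```python
# Hypothetical cost-per-query comparison; all figures are invented for
# illustration and do not reflect any vendor's pricing.

def cost_per_query(annual_platform_cost: float, queries_per_year: int) -> float:
    """Unit cost of the platform per query served."""
    return annual_platform_cost / queries_per_year

# Platform A costs 1.5 times more to acquire than platform B,
# but supports 100 times the query volume.
platform_a = cost_per_query(annual_platform_cost=1_500_000, queries_per_year=100_000_000)
platform_b = cost_per_query(annual_platform_cost=1_000_000, queries_per_year=1_000_000)

print(f"Platform A: ${platform_a:.4f} per query")  # $0.0150 per query
print(f"Platform B: ${platform_b:.4f} per query")  # $1.0000 per query
```

On these made-up numbers, the nominally dearer platform works out roughly 67 times cheaper per query, which is the point Willcox is making about acquisition cost versus lifetime cost.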

Another way to keep costs in check is to use different storage tiers. “The broader point going forward around TCO comes back to this idea of flexibility in the storage tier that we choose to maintain data on, so we can choose lower-cost storage tiers for data products with lower value, but also flexibility in compute resources.

“Do I have to acquire compute resources ahead of time on a capacity-based model? Can I flex them up to meet peaks and troughs in demand? Both of those things are baked into the Cloud Lake architecture,” Willcox said.
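
As a rough sketch of the two levers Willcox describes, tiering storage by how heavily a data product is read and flexing compute with demand, the snippet below uses hypothetical tier names, prices and thresholds rather than anything from VantageCloud Lake’s actual configuration.

```python
# Hypothetical tiering/elasticity sketch; tier names, prices and thresholds
# are invented for illustration, not VantageCloud Lake settings.

STORAGE_TIERS = {  # $ per GB per month (invented figures)
    "hot": 0.10,
    "warm": 0.023,
    "archive": 0.004,
}

def pick_tier(reads_per_month: int) -> str:
    """Keep frequently read data products on faster, pricier storage."""
    if reads_per_month > 10_000:
        return "hot"
    if reads_per_month > 100:
        return "warm"
    return "archive"

def compute_units(concurrent_queries: int, queries_per_unit: int = 50) -> int:
    """Scale compute to current demand instead of buying for the peak."""
    return max(1, -(-concurrent_queries // queries_per_unit))  # ceiling division

tier = pick_tier(reads_per_month=400)       # -> "warm"
monthly_cost = 5_000 * STORAGE_TIERS[tier]  # 5 TB on the warm tier
print(f"{tier}: ${monthly_cost:.2f}/month, {compute_units(120)} compute units at peak")
```

The design point is simply that low-value, rarely read data products need not sit on the most expensive tier, and compute need not be sized for the peak all year round.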

Cost controls are also built into Teradata’s cloud offerings to help organisations avoid bill shocks, along with more efficient use of compute resources to reduce TCO and contribute to environmental sustainability.

Willcox said: “The global power consumption of datacentres is getting higher, and the industry needs to have a serious conversation about efficiency. We’re using computing resources efficiently and not just throwing hardware at a software problem, which is an approach that a lot of our competitors take.”
