ThoughtSpot ramps up analytics code deployment with dbt

Systems need data, organisations need data and, above all, people need data.

But basic truisms aside, systems, organisations and people now need that data to be created, controlled, managed and processed at a far higher rate of throughput on its journey towards analytics and the insight derived from that intelligence process.

In the modern age of digital business, we don’t need to be told that data is the lifeblood of commerce; that part is understood. We don’t even need to be told that the mechanics are in place to deliver personalised, actionable insights for data-driven decisions.

What we need to know is how we can get data analytics, where we can channel it into live workflows and currently deployed applications and services… and how fast we can use it. The end game here is all about giving users self-service live analytics.

But before these users even touch a self-service live analytics platform, the data must be appropriately modelled by analytics engineers. To keep up with intensifying pressure, many analytics engineering teams are throwing old notions of rigid data warehouse models out the window.

Build software, build data

Instead, they’re managing complex data work with the same principles used to build software – prioritising extensibility, programmability and collaboration to scale up quickly.

Known for its prowess in the live data analytics space, ThoughtSpot says it is now making it easier to achieve these aims with a new integration with dbt – a transformation workflow that lets teams collaboratively deploy analytics code following software engineering best practices like modularity, portability, Continuous Integration & Continuous Deployment (CI/CD) and documentation.

The integration is designed to let analytics engineers bring dbt models into ThoughtSpot’s analytics layer seamlessly.

The T in ELT

The company suggests that, today, dbt has become synonymous with the “T” in ELT (Extract, Load, Transform), i.e. the process in which raw data is extracted from source systems, loaded into a target such as a data warehouse or data lake and then transformed in place into a state where it is useful and functional for downstream users.

In terms of use, dbt lets analysts and engineers collaborate on transformation workflows using just their shared knowledge of SQL.
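
To make the idea concrete, here is a minimal sketch of how a transformation run might be triggered programmatically rather than from the command line. It assumes dbt Core 1.5 or later (which exposes a programmatic dbtRunner), a dbt project and profile already configured in the working directory, and a hypothetical model named orders_enriched – it is an illustration, not ThoughtSpot’s integration code.

```python
# Minimal sketch: running the "T" step of ELT programmatically with dbt Core.
# Assumes dbt Core 1.5+ (programmatic invocations), a configured profiles.yml
# and a dbt project in the current directory. "orders_enriched" is a
# hypothetical model name used purely for illustration.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# Equivalent to `dbt run --select orders_enriched` on the command line
res: dbtRunnerResult = dbt.invoke(["run", "--select", "orders_enriched"])

if res.success:
    # Each entry reports one model and its run status (success, error, skipped)
    for r in res.result:
        print(f"{r.node.name}: {r.status}")
else:
    raise SystemExit(f"dbt run failed: {res.exception}")
```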

By bringing this framework to ThoughtSpot, any data analyst or engineer can now work faster and more efficiently by detecting table relationships from existing dbt models; generating scriptable ThoughtSpot Modeling Language (TML) representations of dbt models; automatically generating worksheet metadata models in ThoughtSpot; importing documented column and model descriptions to help users better understand their data; and enabling users to start searching and building Liveboards on their dbt-modelled data.

“Together, ThoughtSpot and dbt deliver increased analytics engineering productivity and accelerated timelines to launch strategic analytics use cases to the rest of your business. If you’re ready to start seamlessly integrating your dbt models with ThoughtSpot and opening up a whole new world of data to more business users, you can get started with a free trial today,” said the company, in a press statement.

ThoughtSpot customers on dbt Cloud or open source dbt can now access the integration directly from the newly enhanced Data Workspace, a ThoughtSpot service billed as a mission control centre for all things live analytics. From there, users can connect to a cloud data platform, create searchable views with custom SQL commands, then ship an analytics-ready dbt data model to business users throughout the organisation.

It’s a completely scriptable and API-friendly experience fit for data and analytics pros.
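
As a rough illustration of what that scriptability could look like, the sketch below pushes a generated TML worksheet into a ThoughtSpot instance over REST. The host name, file path, endpoint, payload fields and token-based authentication are assumptions made for the example rather than details taken from ThoughtSpot’s documentation, so check them against the API reference for your release.

```python
# Rough sketch: importing a dbt-generated TML worksheet into ThoughtSpot via REST.
# The endpoint path, payload field names and bearer-token auth below are
# assumptions for illustration only; verify against your ThoughtSpot version's
# REST API reference before use.
import requests

TS_HOST = "https://my-thoughtspot.example.com"   # hypothetical instance URL
TOKEN = "REPLACE_WITH_BEARER_TOKEN"              # obtained via a separate auth call

# Hypothetical TML file produced from a dbt model
with open("tml/orders_worksheet.tml", encoding="utf-8") as f:
    tml_text = f.read()

resp = requests.post(
    f"{TS_HOST}/api/rest/2.0/metadata/tml/import",   # assumed import endpoint
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"metadata_tmls": [tml_text], "import_policy": "ALL_OR_NONE"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())   # per-object import status
```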

Image credits: ThoughtSpot
