Unlocking the Power of Data with Google Cloud

Big data offers tremendous potential, but tapping its full value depends on how quickly and efficiently you can transform that data into insights that deliver a competitive advantage. Exploring ways to achieve that goal was the focus of many of the data analytics-focused breakout sessions at Google Cloud Next ’20.

One of those sessions, “Don’t Sweat the Big Stuff: Make It Google’s Problem,” featured a deep-dive discussion by Daniel Lewis, senior data scientist at Geotab. Lewis described how Google Cloud’s rich ecosystem of tools was a major contributor to Geotab’s rapid growth and ongoing success.

Geotab’s telematics solutions produce a variety of aggregate datasets gathered from commercial vehicle telematics devices. Insights from that data enable companies to improve fleet safety, reduce fuel consumption, and optimize operations.

By helping to remove many of the time-intensive processing and orchestration tasks inherent in big data analytics, Google BigQuery allows Geotab to focus on what it does best: building models, driving insights, and delivering value.

Geotab leverages Google BigQuery on multiple levels to accelerate analytical insights, streaming more than three billion raw data records into BigQuery every day and storing more than a petabyte of data in the BigQuery data warehouse platform. As part of his breakout session, Lewis highlighted several examples of how Google Cloud’s suite of tools is helping the company solve some of its most difficult big data challenges, including those listed below.

Processing Data at Scale

Billions of data records pass through Geotab’s servers every day, including GPS data, engine data, and accelerometer data. The speed at which Geotab’s scientists can access that volume of data gives the company an important competitive advantage.

Thanks to high-efficiency, high-throughput database services integrated into Google Cloud Platform, only five to ten seconds elapse between the time a sensor collects data and the time it’s available for analysis in BigQuery. The platform’s self-scaling features allow Geotab to deliver high-speed analysis of large datasets (even at petabyte scales) without the need to invest in onsite infrastructure.
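
To make that ingestion pattern concrete, here is a minimal sketch of streaming rows into BigQuery with the google-cloud-bigquery Python client. The project, dataset, table, and field names are hypothetical placeholders rather than Geotab’s actual schema; the point is simply that streamed rows become queryable within seconds.

```python
# Minimal sketch of low-latency streaming ingestion into BigQuery.
# The table and field names below are hypothetical placeholders.
from datetime import datetime, timezone

from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials
table_id = "my-project.telematics.raw_records"  # hypothetical table

rows = [
    {
        "device_id": "G9-000123",
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        "latitude": 43.6532,
        "longitude": -79.3832,
        "speed_kph": 61.4,
    }
]

# insert_rows_json streams rows into the table; they typically become
# queryable within seconds, which is what makes a five-to-ten-second
# sensor-to-analysis latency achievable.
errors = client.insert_rows_json(table_id, rows)
if errors:
    raise RuntimeError(f"Streaming insert failed: {errors}")
```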

Our Key Takeaway

As data warehouses grow in size, the ability to efficiently process and aggregate data will become increasingly important. While separating compute from storage is the right approach, the ability to move data across services is especially critical. This is exactly what Google Cloud has mastered: the ability to take data from BigQuery, move it into Dataflow for processing, and then rapidly save the results back to BigQuery.

Geotab’s experience with data processing proves this point well: the company can run DBSCAN clustering across every street in the world in less than ten minutes. Kin + Carta can help companies like Geotab by liberating the power of data to drive growth through improved competitiveness and operational agility.
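
As a rough illustration of that read-process-write pattern, the sketch below uses the Apache Beam Python SDK (the programming model that runs on Dataflow) to pull records out of BigQuery, apply a toy transform, and write the results back. The project, table, and column names are placeholders, and the transform stands in for real work such as clustering; this is not Geotab’s production pipeline.

```python
# Illustrative BigQuery -> Dataflow -> BigQuery pipeline using Apache Beam.
# Project, table, and column names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def to_output_row(row):
    """Toy transform: bucket each GPS point into a coarse spatial cell."""
    return {
        "cell": f"{round(row['latitude'], 3)}:{round(row['longitude'], 3)}",
        "speed_kph": row["speed_kph"],
    }


options = PipelineOptions()  # pass --runner=DataflowRunner etc. to run on Dataflow

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromBigQuery" >> beam.io.ReadFromBigQuery(
            query="SELECT latitude, longitude, speed_kph "
                  "FROM `my-project.telematics.raw_records`",
            use_standard_sql=True,
        )
        | "Transform" >> beam.Map(to_output_row)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:telematics.speed_by_cell",
            schema="cell:STRING,speed_kph:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```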

More Than a Data Warehouse

Some of the most exciting discoveries from Geotab datasets come from BigQuery’s geospatial functions. Thanks to the BigQuery Geographic Information Systems (GIS) feature, Geotab data scientists can power through enormous geographic datasets in near real time, which opens the possibility of rich insights ranging from traffic analysis to infrastructure planning.

Geotab has more than two million GPS devices in operation, tracking more than 85 million daily data points. By tracking a truck’s position and movement, Geotab can predict road congestion and intersection wait times for particular routes during certain times of the day, helping to optimize deliveries, vehicles, and drivers’ schedules.
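
To illustrate the kind of geospatial question this enables, the hypothetical query below uses BigQuery GIS functions (ST_GEOGPOINT and ST_DWITHIN) to count GPS points and average speeds within 50 meters of a single intersection, broken out by hour of day, as a crude congestion signal. The table, columns, and coordinates are placeholders, not Geotab data.

```python
# Illustrative BigQuery GIS query: activity within 50 meters of one
# intersection, by hour of day, as a rough congestion signal.
# Table, column names, and coordinates are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT
  EXTRACT(HOUR FROM recorded_at) AS hour_of_day,
  COUNT(*) AS gps_points,
  AVG(speed_kph) AS avg_speed_kph
FROM `my-project.telematics.raw_records`
WHERE ST_DWITHIN(
        ST_GEOGPOINT(longitude, latitude),
        ST_GEOGPOINT(-79.3832, 43.6532),  -- hypothetical intersection
        50)                               -- search radius in meters
GROUP BY hour_of_day
ORDER BY hour_of_day
"""

for row in client.query(query).result():
    print(row.hour_of_day, row.gps_points, round(row.avg_speed_kph, 1))
```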

Our Key Takeaway

Adopting BigQuery is not just a lift-and-shift of a traditional data warehouse; it is a chance to reimagine what the data warehouse can do and a commitment to innovation and creativity.

That’s also an area where Kin + Carta can add substantial value. In addition to our expertise in migrating legacy data warehouses to the cloud, we can help clients unlock the full value of their data by leveraging BigQuery’s built-in ML and GIS features to foster meaningful insights. Kin + Carta is committed to helping discover value in data, while BigQuery provides a platform for continued and scalable success.

Although the cost savings and the added peace of mind that come with advanced data governance and security protocols are reason enough to switch to the cloud, we believe they are only the first step in the journey. Whether your goal is to leverage BigQuery ML to perform cluster analysis or to use BigQuery GIS to find trends in customer data, the opportunities have never been so close to where the data resides.

Simplifying the Complex

Another tool that is paying big dividends for Geotab is BigQuery ML, which brings the model to the data. That reduces complexity and the number of steps required to get started and, once again, improves speed and efficiency.
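
As a sketch of what bringing the model to the data can look like, the example below trains a k-means clustering model directly inside BigQuery with BigQuery ML and then scores records with ML.PREDICT, so the raw data never leaves the warehouse. The dataset, table, and column names are hypothetical, and the clustering task is illustrative rather than Geotab’s actual workload.

```python
# Sketch of "bringing the model to the data" with BigQuery ML: a k-means
# model is trained and applied with SQL, so no data is exported.
# Dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

create_model = """
CREATE OR REPLACE MODEL `my-project.telematics.stop_clusters`
OPTIONS (model_type = 'kmeans', num_clusters = 8) AS
SELECT latitude, longitude
FROM `my-project.telematics.raw_records`
WHERE speed_kph < 5  -- treat near-zero speeds as candidate stop locations
"""
client.query(create_model).result()  # training runs entirely inside BigQuery

predict = """
SELECT centroid_id, COUNT(*) AS records
FROM ML.PREDICT(
  MODEL `my-project.telematics.stop_clusters`,
  (SELECT latitude, longitude FROM `my-project.telematics.raw_records`))
GROUP BY centroid_id
ORDER BY centroid_id
"""
for row in client.query(predict).result():
    print(row.centroid_id, row.records)
```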

BigQuery identifies the resources each query needs to finish quickly and efficiently and provisions them to meet the demand. Once the workload has been completed, BigQuery reallocates those resources to other projects and other users.

With BigQuery’s serverless model, processing is automatically distributed over many machines working in parallel. The result: data engineers spend less time on infrastructure and provisioning servers and more time gaining insights from data.

Our Key Takeaway

Managing a massive ecosystem like Geotab’s outside of the cloud would exhaust resources and leave little capacity to analyze the data and gain insights from it. As Lewis noted in the session, traditional data warehouses often require extensive resource provisioning, custom monitoring to track reliability, and complex ETL pipelines.

Rather than focusing on growing resources or data-loading efforts in the environment, our goal at Kin + Carta is to help our clients focus on growing their businesses by using advanced analytics and adopting managed services in the cloud. Relieving engineers of infrastructure management allows more emphasis on enhancing existing features and developing new components.

Easing the “Big Stuff” Burden

For companies like Geotab, driving insights from data is their lifeblood. Extracting those insights efficiently requires a fast, reliable, scalable database infrastructure, and a partner to make data “work.”

At Kin + Carta, we recognize that many organizations acknowledge data as an asset but don’t yet have the tools or expertise to realize its full potential. That’s an area where we can provide the most significant acceleration for our clients. With our competency in data analytics and our partnership with Google Cloud, we continue to raise the bar on decision intelligence, giving users the fast insights they want without having to worry about the complexities of big data management, storage, and orchestration.

Our experts at Kin + Carta took a deep dive into today’s data challenges and highlighted many of the new capabilities available in next-generation business intelligence tools and systems.

Download our Whitepaper “From Storage to Story: Delivering New Value by Unlocking the Power of Data”

Article Author:

Ryan Fitch
