At Kin + Carta, we deliver solutions that enable the creation of data products and intelligent experiences. As a certified Google Cloud Partner, we use Google’s suite of Data Analytics and AI/ML solutions to build systems that realize the full value of the data lifecycle for the widest range of stakeholders.
We recently held a broad industry roundtable with IT executives examining the status of data-driven transformation. Four key themes emerged, each centered on a need to:
- Reduce Barriers for Adoption
- Enable Data-as-a-Product
- Build with Governance
- Sustain Momentum
Each of these themes points to core opportunities to use Google Cloud to accelerate the transformation of data into insights.
Reduce Barriers for Adoption
From engineering to data science, bringing familiar toolchains and practices closer to the center of data gravity can give users greater willingness to experiment with, unlock, and realize value from data in the cloud.
Foundational to Google Cloud’s Data Cloud is BigQuery, the serverless, fully managed data warehouse with low-latency, petabyte-scale querying power for data analysis. Because its architecture decouples storage and compute, BigQuery serves as a landing zone for processing and analysis. A robust ecosystem supports integration patterns, including Data Fusion, a low/no-code ETL solution with a vast range of transformation plug-ins to simplify building data pipelines. The BigQuery Data Transfer Service brings SaaS and enterprise data stores into BigQuery quickly and reliably without writing a single line of code, and Datastream provides continuous, real-time replication for CDC-based workloads suitable for downstream analysis and reporting in BigQuery.
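As a minimal sketch of working against such a landing zone, the BigQuery Python client can run familiar SQL over data landed by the Data Transfer Service or Datastream; the project, dataset, and table names below are placeholders.

```python
# pip install google-cloud-bigquery
from google.cloud import bigquery

# Assumes Application Default Credentials and a placeholder project ID.
client = bigquery.Client(project="my-analytics-project")

# Illustrative aggregation over a table landed in a BigQuery dataset.
sql = """
    SELECT order_date, SUM(order_total) AS daily_revenue
    FROM `my-analytics-project.sales_landing.orders`
    GROUP BY order_date
    ORDER BY order_date DESC
    LIMIT 30
"""

for row in client.query(sql).result():
    print(row.order_date, row.daily_revenue)
```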
For baseline model development, BigQuery ML lets data analysts use native SQL skills to create and evaluate ML models directly from BigQuery datasets, reducing data movement and security risk by managing training and prediction jobs in a single interface.
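For example, a hedged sketch of this workflow, training and evaluating a classifier entirely inside BigQuery via the Python client; the dataset, table, and column names are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

# Train a logistic regression model directly over a BigQuery table (names are placeholders).
client.query("""
    CREATE OR REPLACE MODEL `my-analytics-project.marketing.churn_model`
    OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
    SELECT tenure_months, monthly_spend, support_tickets, churned
    FROM `my-analytics-project.marketing.customer_features`
""").result()

# Evaluate the model without moving data out of BigQuery.
for row in client.query("""
    SELECT * FROM ML.EVALUATE(MODEL `my-analytics-project.marketing.churn_model`)
""").result():
    print(dict(row))
```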
Accelerating data science time-to-value is supported by Google Cloud AutoML (integrated with the Vertex AI platform), as well as by Kin + Carta’s Octain solution. Both let you rapidly develop models with relatively small datasets to prove use-case impact, then iterate against larger datasets to support deployment in production. Google Cloud’s integrated AI/ML products are built atop Google’s world-class AI capabilities, offering pretrained models for common use cases that you can rapidly customize to your goals.
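As one possible starting point, the Vertex AI SDK can launch an AutoML tabular training job from a BigQuery source; the project, region, dataset, target column, and budget below are illustrative assumptions, not a prescribed configuration.

```python
# pip install google-cloud-aiplatform
from google.cloud import aiplatform

# Placeholder project and region.
aiplatform.init(project="my-analytics-project", location="us-central1")

# Create a tabular dataset from a BigQuery table used to prove the use case.
dataset = aiplatform.TabularDataset.create(
    display_name="customer-churn",
    bq_source="bq://my-analytics-project.marketing.customer_features",
)

# Launch an AutoML training job; the budget and target column are illustrative.
job = aiplatform.AutoMLTabularTrainingJob(
    display_name="churn-automl",
    optimization_prediction_type="classification",
)
model = job.run(
    dataset=dataset,
    target_column="churned",
    budget_milli_node_hours=1000,
)
print(model.resource_name)
```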
For data activation and reporting, Looker’s self-service BI platform serves the full spectrum of data analysts and business users. Its powerful modeling language removes data movement concerns, ensuring quality controls and simplifying adoption. Along with a marketplace of prebuilt data connectors and plugins, Looker is designed to readily spin up visualizations and dashboards for common use cases and is accessible from nearly any environment.
Enable Data-as-a-Product
Data-led transformation means rethinking the stakeholder relationship to the data lifecycle. The Data Mesh architecture pattern is a response to new challenges of scale, quality, access, and activation with four core principles: domain-driven ownership of data, data-as-a-product, self-service data platform, and federated computational governance.
When thinking about data-as-a-product, it’s important to extend quality controls and flexibility across the boundaries of data owners. Domain-driven stewardship provides the confidence, flexibility and velocity to generate new use cases for activating data at scale.
Looker’s modern analytics capabilities support data mesh principles and data-product thinking through user-driven data activation and federated access patterns. It uses a modeling layer that abstracts underlying data connections from querying and business logic, along with robust governance controls. It’s designed to empower self-service analysis and experimentation across varied data sources, with guardrails to ensure a single version of the truth across data platforms. Looker further supports data products through its API and extension framework: embedding visualizations directly into applications, triggering business workflows, and exposing insights at the endpoints where consumers can best be reached.
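A brief sketch of that API-driven pattern, assuming the Looker Python SDK and a saved Look whose ID is a placeholder:

```python
# pip install looker-sdk
import looker_sdk

# Credentials are read from a looker.ini file or environment variables
# (LOOKERSDK_BASE_URL, LOOKERSDK_CLIENT_ID, LOOKERSDK_CLIENT_SECRET).
sdk = looker_sdk.init40()

# Run a governed, modeled query (a saved Look) and pull its results
# into an application or downstream business workflow.
results = sdk.run_look(look_id="42", result_format="json")
print(results)
```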
To further enable data-as-a-product, you can tap into Analytics Hub to securely exchange data assets across your organization while meeting quality, security and cost controls. The service lets you curate libraries of internal and external assets to deliver data at scale to precise audiences. Because it is built on BigQuery, with storage and compute decoupled, publishers can share datasets with multiple subscribers without duplication, securely federating access to trusted datasets and APIs so domain owners can capitalize on data marketplaces.
Build with Governance
Google Cloud services such as Data Fusion, BigQuery, Analytics Hub and Dataplex all readily support best practices for designing a data mesh architecture. Foundational concerns, including governance, IAM, security and auditing, are supported at every level of implementation, with all data handled in Google Cloud encrypted by default, both in transit and at rest.
The Dataplex suite helps organizations discover, manage, monitor, and govern data across disparate data stores with consistent controls, providing access to trusted data and powering analytics at scale. Dataplex lets you automatically capture data lineage to understand and trace dependencies and effectively troubleshoot data issues. Dataplex also integrates with BigLake to simplify asset discovery and domain policy enforcement when querying with OSS engines or BigQuery.
Dataplex is also integrated with Data Catalog for technical and business metadata management and leverages Data Loss Prevention to automatically detect, obfuscate and de-identify sensitive information in your data, minimizing the risk of compliance violations.
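As an illustrative sketch of that de-identification step, the Cloud DLP Python client can replace detected sensitive values with their info types before data is shared more broadly; the project ID and sample text are placeholders.

```python
# pip install google-cloud-dlp
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-analytics-project"  # placeholder project

# Detect emails and names, then replace each match with its info type label.
item = {"value": "Contact Jane Doe at jane.doe@example.com"}
inspect_config = {"info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PERSON_NAME"}]}
deidentify_config = {
    "info_type_transformations": {
        "transformations": [
            {"primitive_transformation": {"replace_with_info_type_config": {}}}
        ]
    }
}

response = client.deidentify_content(
    request={
        "parent": parent,
        "deidentify_config": deidentify_config,
        "inspect_config": inspect_config,
        "item": item,
    }
)
print(response.item.value)  # e.g. "Contact [PERSON_NAME] at [EMAIL_ADDRESS]"
```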
BigQuery provides control over data access and analysis through features such as authorized views and column-level and row-level access policies, giving key stakeholders secure, granular access and provenance controls.
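For instance, a minimal sketch of a row-level access policy applied through the BigQuery client; the table, policy, group, and filter column are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

# Restrict rows in a shared table to a single region for a given analyst group.
# The table, policy, group, and column names are placeholders.
client.query("""
    CREATE OR REPLACE ROW ACCESS POLICY us_analysts_only
    ON `my-analytics-project.sales_landing.orders`
    GRANT TO ('group:us-analysts@example.com')
    FILTER USING (region = 'US')
""").result()
```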
Data Fusion’s integrations include robust auditing and metadata as standard, supporting governance and compliance alongside the partner ecosystem. Data Fusion extends data lineage controls across on-premises and hybrid cloud environments, and can improve workload migration velocity through improved visibility and standardization.
Sustain Momentum
Data-led transformation requires the tools, people and processes to manage change at the rate it can be metabolized. This requires strong cloud foundations to support workload migrations, enable automation and increase the scale and velocity of change.
Kin + Carta’s Data Launchpad leverages the Google Cloud Adoption Framework to guide enterprise cloud readiness and evaluate opportunities for database, analytics and data application workload migration. We use Google’s Cloud Foundation Toolkit with Terraform to provision infrastructure as code (IaC) for repeatable pipeline deployments, managing the velocity of data-driven applications at scale. This approach matches the right workloads with the right migration patterns, lowering the risk of configuration changes and increasing the rate of return when ultimately implementing a data migration program on Google Cloud.
The combined abilities of Google's Data Cloud create an interoperable and secure framework of open source standards and extensible APIs that can harness data in hybrid-cloud environments and accelerate the momentum to turn data into insights at varying stages of data-led transformation initiatives.
Ready to make data work harder for your organization?
Kin + Carta’s solutions leveraging Google Cloud across the data value lifecycle can smooth the cloud adoption curve and accelerate time to value by building data products and intelligent experiences at scale.
Learn how Kin + Carta can help you harness the power of data.
Get in touch