BigQuery ML and Vertex AI Pipeline

This blueprint provides the necessary infrastructure to create a complete development environment for building and deploying machine learning models using BigQuery ML and Vertex AI. With this blueprint, you can deploy your models to a Vertex AI endpoint or use them within BigQuery ML.

This is the high-level diagram:

High-level diagram

It also includes the IAM wiring needed to make such scenarios work. Regional resources are used in this example, but the same logic applies to dual-regional, multi-regional, or global resources.

The example is designed to match real-world use cases with a minimal set of resources and to serve as a starting point for your own scenario.

Managed resources and services

This sample creates several distinct groups of resources:

  • Networking
    • VPC network
    • Subnet
    • Firewall rules for SSH access via IAP and open communication within the VPC
    • Cloud NAT
  • IAM
    • Vertex AI Workbench service account
    • Vertex AI pipeline service account
  • Storage
    • GCS bucket
    • BigQuery dataset

Customization

Virtual Private Cloud (VPC) design

As is often the case in real-world configurations, this blueprint accepts an existing Shared VPC as input via the vpc_config variable, as sketched below.
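For example, wiring in an existing Shared VPC might look like the following sketch. The attribute names inside vpc_config (host_project, network_self_link, subnet_self_link) are assumptions for illustration only and should be checked against the object type defined in variables.tf:

module "bq_ml" {
  source     = "./fabric/blueprints/data-solutions/bq-ml/"
  project_id = "project-1"
  prefix     = "prefix"
  # Attribute names below are illustrative; verify the exact object shape in variables.tf.
  vpc_config = {
    host_project      = "shared-vpc-host"
    network_self_link = "https://www.googleapis.com/compute/v1/projects/shared-vpc-host/global/networks/prod-vpc"
    subnet_self_link  = "https://www.googleapis.com/compute/v1/projects/shared-vpc-host/regions/us-central1/subnetworks/prod-subnet"
  }
}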

Customer Managed Encryption Keys

As is often the case in real-world configurations, this blueprint accepts existing Cloud KMS keys as input via the service_encryption_keys variable to encrypt resources, as sketched below.
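A minimal sketch, assuming per-service key attributes such as bq, compute, and storage; the actual attribute names are defined by the object type in variables.tf, and each key location should match the corresponding service region:

module "bq_ml" {
  source     = "./fabric/blueprints/data-solutions/bq-ml/"
  project_id = "project-1"
  prefix     = "prefix"
  # Assumed per-service key attributes; verify against variables.tf.
  service_encryption_keys = {
    bq      = "projects/kms-prj/locations/us-central1/keyRings/my-keyring/cryptoKeys/bq-key"
    compute = "projects/kms-prj/locations/us-central1/keyRings/my-keyring/cryptoKeys/compute-key"
    storage = "projects/kms-prj/locations/us-central1/keyRings/my-keyring/cryptoKeys/storage-key"
  }
}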

Demo

In the demo folder, you can find an example of creating a Vertex AI pipeline from a publicly available dataset and deploying the model to be used from a Vertex AI managed endpoint or from within BigQuery.

To run the demo:

  • Connect to the Vertex AI workbench instance
  • Clone this repository
  • Run the demo/bmql_pipeline.ipynb Jupyter Notebook

Files

| name | description | modules | resources |
|---|---|---|---|
| datastorage.tf | Datastorage resources. | bigquery-dataset · gcs | |
| main.tf | Core resources. | project | |
| outputs.tf | Output variables. | | |
| variables.tf | Terraform variables. | | |
| versions.tf | Version pins. | | |
| vertex.tf | Vertex resources. | iam-service-account | google_notebooks_instance · google_vertex_ai_metadata_store |
| vpc.tf | VPC resources. | net-cloudnat · net-vpc · net-vpc-firewall | google_project_iam_member |

Variables

| name | description | type | required | default |
|---|---|---|---|---|
| prefix | Prefix used for resource names. | string | ✓ | |
| project_id | Project id; references an existing project if project_create is null. | string | ✓ | |
| location | The location where resources will be deployed. | string | | "US" |
| project_create | Provide values if project creation is needed; use an existing project if null. Parent format: folders/folder_id or organizations/org_id. | object({…}) | | null |
| region | The region where resources will be deployed. | string | | "us-central1" |
| service_encryption_keys | Cloud KMS keys used to encrypt the different services. The key location should match the service region. | object({…}) | | null |
| vpc_config | Shared VPC network configuration to use. If null, networks will be created in the project with pre-configured values. | object({…}) | | null |

Outputs

| name | description | sensitive |
|---|---|---|
| bucket | GCS Bucket URL. | |
| dataset | BigQuery dataset ID. | |
| notebook | Vertex AI notebook details. | |
| project | Project id. | |
| service-account-vertex | Service account to be used for Vertex AI pipelines. | |
| vertex-ai-metadata-store | Vertex AI Metadata Store ID. | |
| vpc | VPC Network. | |

Test

module "test" {
  source = "./fabric/blueprints/data-solutions/bq-ml/"
  project_create = {
    billing_account_id = "123456-123456-123456"
    parent             = "folders/12345678"
  }
  project_id = "project-1"
  prefix     = "prefix"
}

# tftest modules=9 resources=50