Insurance Lakehouse using Databricks & AWS

VerticalServe Blogs
3 min read · Jan 14, 2025


Designing a Databricks Lakehouse on AWS for a Property and Casualty Insurance Company

A Databricks Lakehouse on AWS provides a powerful platform for Property and Casualty (P&C) insurance companies to manage their data, analytics, and AI workloads. This design blog post outlines a comprehensive approach to setting up a Lakehouse architecture using Databricks on AWS, focusing on key aspects such as workspace environments, data integration, data layout, workflow creation, security, governance, and observability.

Workspace Environments

  • Development (DEV): For experimentation, building notebooks, and testing integrations.
  • User Acceptance Testing (UAT): To validate integrations and workflows before production.
  • Production (PROD): For running production workloads and accessing enterprise dashboards.

Each workspace is isolated with its own compute clusters, job workflows, and permissions. AWS account separation or resource tagging can be used for additional governance.

Each workspace will be configured using Databricks on AWS, leveraging the pay-as-you-go model for cost-effectiveness.

This setup allows for seamless progression of code and data pipelines from development to production.
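As a minimal sketch of this promotion model, environment-specific settings (catalog names, cluster sizing, AWS tags) can be centralized in a small configuration map so the same pipeline code runs unchanged in each workspace. All names below (catalogs, tag keys, worker counts) are hypothetical illustrations, not prescribed values:

```python
# Hypothetical per-environment configuration for DEV, UAT, and PROD
# Databricks workspaces. The AWS tags support the governance approach
# described above (cost attribution and workspace separation).
WORKSPACE_CONFIG = {
    "dev": {
        "catalog": "pnc_dev",
        "cluster_workers": 2,
        "tags": {"env": "dev", "cost-center": "data-eng"},
    },
    "uat": {
        "catalog": "pnc_uat",
        "cluster_workers": 4,
        "tags": {"env": "uat", "cost-center": "data-eng"},
    },
    "prod": {
        "catalog": "pnc_prod",
        "cluster_workers": 8,
        "tags": {"env": "prod", "cost-center": "data-eng"},
    },
}


def get_config(env: str) -> dict:
    """Return the settings for one environment, failing fast on typos."""
    if env not in WORKSPACE_CONFIG:
        raise ValueError(f"Unknown environment: {env}")
    return WORKSPACE_CONFIG[env]
```

A pipeline would call `get_config("dev")` in development and `get_config("prod")` in production, keeping the transformation logic identical across workspaces while only the configuration changes.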

Data Integration

Sources of Data:

  • Policy Subsystem: Policy issuance, endorsements, renewals…
