Daco

A code-first data catalog

Data products defined in code, stored in your repo, and synced to your catalog on every push.

Open source

Get started with the Daco CLI

Define and validate data products from your terminal. Everything lives as YAML in your repo, following the open OpenDPI spec.

Read the tutorial →
macOS/Linux
$ brew install dacolabs/tap/daco

Define once, use everywhere

Write your schema once, then generate code for PySpark, dbt, Pydantic, Go, and more. Run it locally, in CI, or on deploy. Your definitions stay in sync across your entire stack.
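As a sketch of what one Python-side target could look like, here is a hand-written stand-in for a generated typed model covering a port with customer, date, and order fields (the class and field names are illustrative, and plain dataclasses are used in place of an actual Pydantic target — this is not real Daco output):

```python
# Illustrative only: a stand-in for what a generated Python model
# (e.g. a Pydantic target) might look like. Not actual Daco output.
from dataclasses import dataclass
from datetime import date


@dataclass
class DailyMetrics:
    customer_id: str
    date: date          # a YAML "string" with format: date
    total_orders: int
    revenue: float
```

The point is that the types in the generated model track the port schema, so a change to the YAML flows into every target on the next generation run.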

$ daco ports translate --format pyspark
Input: dataproduct.yaml
ports:
  daily_metrics:
    schema:
      type: object
      properties:
        customer_id:
          type: string
        date:
          type: string
          format: date
        total_orders:
          type: integer
        revenue:
          type: number
Output: schema.py
from pyspark.sql.types import *

daily_metrics_schema = StructType([
    StructField("customer_id", StringType()),
    StructField("date", DateType()),
    StructField("total_orders", IntegerType()),
    StructField("revenue", DoubleType()),
])
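At its core, a translation like this amounts to a small mapping from JSON-Schema-style types to PySpark types. A minimal sketch of that mapping, covering only the subset of types used above (function names are hypothetical, not Daco's actual implementation):

```python
# Illustrative sketch of a JSON Schema -> PySpark type mapping,
# covering only the types used in the daily_metrics example.
# Function names are hypothetical; this is not Daco's implementation.

def pyspark_type(prop: dict) -> str:
    """Map one JSON Schema property to a PySpark type expression."""
    t = prop.get("type")
    if t == "string":
        # A date-formatted string becomes a proper DateType column.
        return "DateType()" if prop.get("format") == "date" else "StringType()"
    if t == "integer":
        return "IntegerType()"
    if t == "number":
        return "DoubleType()"
    raise ValueError(f"unsupported type: {t!r}")


def to_struct_type(name: str, properties: dict) -> str:
    """Render a StructType definition for a port's properties."""
    fields = ",\n".join(
        f'    StructField("{col}", {pyspark_type(p)})'
        for col, p in properties.items()
    )
    return f"{name}_schema = StructType([\n{fields},\n])"


properties = {
    "customer_id": {"type": "string"},
    "date": {"type": "string", "format": "date"},
    "total_orders": {"type": "integer"},
    "revenue": {"type": "number"},
}
# Prints a StructType definition matching the schema.py output above.
print(to_struct_type("daily_metrics", properties))
```

Because the mapping is deterministic, regenerating after a schema change keeps every downstream target consistent with the YAML source of truth.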

Simple, predictable pricing

No surprises. Pick a plan that fits your team and scale when you need to.

Free

€0

For individuals exploring Daco.

  • 2 users
  • 3 data products
  • 100 API requests / day
Get started
Most popular

Pro

€20 / seat / month

For teams managing data products together.

  • Per-seat pricing
  • 25 data products
  • 1,000 API requests / day
Get started

Enterprise

Custom

For organizations that need full control.

  • Unlimited seats
  • Unlimited data products
  • Unlimited API requests
Contact us

Latest from the Blog

Tutorials, updates, and thoughts on data cataloging.

Stay up to date

Get notified when we publish new blog posts.

Join the Community

Ask questions, share ideas, or just follow along.