The Mission: Architect the Ground Truth
At Hilbert’s AI, we turn months into minutes. We build the AI Growth Engine that Fortune 500 enterprises and iconic B2C brands use to understand and act on their revenue drivers in real time.
We are expanding our deployment model. Whether we are ingesting data into our managed, high-performance ClickHouse warehouse or implementing Warehouse-Native logic directly within a customer’s Snowflake or BigQuery, the goal is the same: giving our AI Agents a perfect, high-fidelity view of the business. We need a Forward Deployed Data Engineer who can architect these bridges in any environment.
The Hard Problem: Architectural Hybridity & Semantic Clarity
The challenge isn't just moving rows; it’s ensuring "Reasoning Consistency." You are responsible for making sure our AI produces the same high-quality insights regardless of where the data resides.
Our Current Hurdles:
The Dual-Track Challenge: We are building a system that can either pipe data into our ClickHouse stack using Dagster and Airbyte, or execute "Zero-Extract" transformations natively in the customer’s warehouse. You’ll need to master both worlds.
The AI Semantic Layer: For customers with custom datasets that don't fit our canonical models, you will build the "Translator." You’ll define the metadata and business logic so our agentic flows can "understand" custom columns without hallucination.
Agent-Assisted Integration: We are building a Data Discovery Agent to automate the "messy" part of your job. You will prompt and partner with this agent to analyze customer data, suggest mapping alternatives, and generate the first version of pipelines automatically, slashing integration time by 50%.
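To make the "AI Semantic Layer" idea above concrete, here is a minimal sketch (all class, field, and column names are hypothetical, not Hilbert’s actual schema) of the kind of metadata a custom customer column might carry so an agentic flow can answer from defined meaning rather than guessing:

```python
from dataclasses import dataclass

@dataclass
class ColumnSemantics:
    """Business meaning attached to a raw warehouse column (hypothetical schema)."""
    name: str                      # physical column name in the customer warehouse
    canonical_field: str           # canonical model field it maps to
    description: str               # plain-language meaning the agent can cite
    unit: str = ""                 # e.g. "USD", "count"
    allowed_aggregations: tuple = ("sum",)

def to_agent_context(cols: list[ColumnSemantics]) -> str:
    """Render column semantics as grounded context for an agentic flow."""
    lines = []
    for c in cols:
        aggs = ", ".join(c.allowed_aggregations)
        unit = f" ({c.unit})" if c.unit else ""
        lines.append(f"- {c.name} -> {c.canonical_field}{unit}: "
                     f"{c.description} [aggregations: {aggs}]")
    return "\n".join(lines)

cols = [
    ColumnSemantics("net_rev_am", "net_revenue",
                    "Order revenue after discounts and refunds",
                    unit="USD", allowed_aggregations=("sum", "avg")),
]
print(to_agent_context(cols))
```

The point of the sketch is the shape of the problem, not the implementation: each custom column gets an explicit mapping, a human-readable definition, and constraints on valid use, which together form the "Translator" between customer data and the reasoning engine.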
Who You Are (The Profile)
The Pragmatic Polyglot: You are equally comfortable optimizing a ClickHouse query, writing Snowpark (Snowflake) code, or tuning BigQuery SQL.
The Data Ontologist: You understand that an AI needs more than just a table; it needs context. You have the discipline to define the "meaning" behind the data.
The "Translator": You can earn the technical trust of a customer’s Data Architect in minutes, extracting the logic behind their custom tables and mapping it to Hilbert’s reasoning engine.
The "Pilot" Mindset: You are excited to use and improve AI agents that handle the heavy lifting of data discovery and pipeline scaffolding.
What You’ll Own
Hybrid Integration: Own the technical lifecycle of new customers, choosing and implementing the best deployment path (managed ClickHouse vs. Warehouse-Native).
Accelerated Onboarding: Use Hilbert's internal Discovery Agent to create reports and suggest mappings, moving from raw data to a working v1 pipeline in record time.
AI Semantic Layer Construction: Architect the semantic definitions for custom enterprise data, ensuring our agentic conversation engine has the "Ground Truth" for every query.
Canonical Modeling: Transform diverse source data into Hilbert’s unified growth models to power our generic ML systems.
Enterprise Deployment: Act as the lead technical resource for high-stakes enterprise implementations, ensuring our stack is "packaged" and performant in their specific infra.
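As a toy illustration of the canonical modeling responsibility above (source names and field mappings are invented for the example, not Hilbert’s real schema), the core move is projecting differently-shaped source records onto one unified growth model:

```python
# Hypothetical sketch: map two differently-shaped source order records
# into one canonical order model that downstream ML systems expect.

CANONICAL_FIELDS = ("order_id", "customer_id", "revenue_usd", "ordered_at")

# Per-source field mappings (illustrative; in practice these could be
# drafted by a Discovery Agent and reviewed by the engineer).
SOURCE_MAPPINGS = {
    "shop_a": {"order_id": "id", "customer_id": "buyer",
               "revenue_usd": "total", "ordered_at": "created"},
    "shop_b": {"order_id": "order_no", "customer_id": "cust",
               "revenue_usd": "amount_usd", "ordered_at": "ts"},
}

def to_canonical(source: str, row: dict) -> dict:
    """Project a raw source row onto the canonical order schema."""
    mapping = SOURCE_MAPPINGS[source]
    return {field: row[mapping[field]] for field in CANONICAL_FIELDS}

a = to_canonical("shop_a", {"id": 1, "buyer": "c9", "total": 42.0,
                            "created": "2024-01-01"})
b = to_canonical("shop_b", {"order_no": 2, "cust": "c7", "amount_usd": 9.5,
                            "ts": "2024-01-02"})
```

Whether this projection runs in a managed pipeline or as Warehouse-Native SQL, the output contract is the same: every source lands in the same canonical shape.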
Bonus Points
Deep experience with dbt for warehouse-native modeling.
Experience with Semantic Layer frameworks (Cube, Metriql, etc.).
Background in E-commerce/Retail (understanding LTV, CAC, and Attribution).
Having built an agentic workflow before.
Location
San Francisco, with occasional travel for team meets, off-sites, or customer engagements.
Compensation
Competitive salary + equity package, commensurate with experience.
Performance-based bonuses tied to project milestones and customer impact.
The Hiring Journey
Short form → Intro Call → Technical working session → Team conversations → Offer
Fast, human, no bureaucracy.