By: Filips Nastins, Developer at Funnel
Every growing product reaches an inflection point - when the thing that once powered your success starts to slow you down. For Funnel, that moment came with our data connectors.
In the early days, our mission was clear: help marketers collect and prepare data from external platforms for analysis. We called it the Funnel Data Hub, a single place to unify marketing data. We started small, with just six API integrations. But as digital marketing exploded, so did our connector library - soon exceeding 600 integrations and becoming one of Funnel’s biggest assets.
As our connector library grew, so did the support and maintenance needs. Each new connector meant more APIs to monitor, more fields to update, and more maintenance to keep things running. We found ourselves spending more time keeping connectors alive than pushing the product forward.
At the same time, Funnel was expanding beyond the Data Hub into a Marketing Intelligence suite - adding products for Measurement, Reporting, Planning, and Activation. These new products demanded richer, more contextual data, and new teams wanted to build their own connectors for specialized use cases.
It was clear we needed a new approach. One that could scale with the company, empower more teams to contribute, and turn connector development into a shared, flexible capability rather than a bottleneck.
That realization led to the creation of the Connector Platform and, with it, the Declarative Connectors framework, a low-code and no-code foundation designed to make connector development faster, easier, and collaborative across teams.
At Funnel, teams have the freedom to choose the tools and approaches that best fit their problems. Early on, our connector design was intentionally flexible and unopinionated. That autonomy allowed small teams to move fast, experiment, and adapt as we learned the marketing data domain, fueling the rapid growth of our connector library and helping Funnel become a leading provider of marketing data integrations.
Each connector is a standalone Python app with a few entry points for tasks like validating credentials, fetching account details, and downloading data. The key entry point is download, which retrieves and transforms data from external APIs and stores it in S3. The implementation of the download entry point varies across connectors, ranging from a small, standalone Python function to more modular designs.
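The entry-point structure described above could be sketched roughly as follows. All names, signatures, and return values here are illustrative assumptions, not Funnel's actual interface:

```python
# Illustrative sketch of a connector app's entry points - names and
# signatures are assumptions, not Funnel's real interface.

def validate_credentials(credentials: dict) -> bool:
    """Check that the supplied API credentials look usable."""
    return bool(credentials.get("api_key"))

def fetch_account_details(credentials: dict) -> dict:
    """Return metadata about the connected account (id, name, ...)."""
    return {"account_id": "123", "name": "Example Account"}  # placeholder data

def download(credentials: dict, date_range: tuple) -> str:
    """Fetch data from the external API, transform it, and store it in S3.

    Returns the location of the uploaded object.
    """
    rows = [{"date": date_range[0], "clicks": 42}]  # would come from the API
    # ... transform `rows` and upload the result to S3 here ...
    return "s3://bucket/connector/output.json"
```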
Simple Connector Architecture
As the connector library grew, the lack of constraints started to show. With multiple teams building connectors independently, patterns and tools were often reinvented. Each connector became a snowflake - flexible, but built in its own way. Inconsistencies between connectors made maintenance harder and slowed down feature development. Shared improvements couldn’t be rolled out across all connectors, and developers faced high cognitive load when switching between different implementations.
To scale development, we created a Connector Platform team that provides reusable integration building blocks. On this foundation, the Declarative Connector framework was built. Instead of repetitive Python code, developers and domain experts now define what data a connector outputs in a YAML file, and the framework handles how to fetch and transform it.
This declarative model enables more Funnel teams, particularly those closest to new Marketing Intelligence use cases, to self-serve their integration needs, removing bottlenecks and ensuring consistency. Unlike shared code libraries, the Declarative framework offers built-in orchestration and a standardized data flow, allowing domain experts to focus purely on data description. This approach could later enable customer-created connectors via a UI.
The declarative approach is well suited to generative AI thanks to its explicit schema and clear component boundaries: an AI assistant can bootstrap the connector YAML from API documentation and act as a co-pilot during development.
At Funnel, our strategy is to start simple, deploy often, and iterate based on real-world feedback rather than seeking a perfect upfront solution.
We began developing the Declarative framework in early 2025 by converting one simple connector, then moved to progressively more complex ones. So far, we have converted over 80 connectors and shipped more than 180 framework releases - almost one per workday.
To avoid dead ends, we leveraged our existing connector knowledge and codebase, continuously testing the framework's limits. We iterated daily in small, safe increments, distilling common patterns into a coherent declarative structure built on established open-source libraries.
Connector YAML is defined with Pydantic models. Pydantic provides input data validation and generates JSON schema from Python classes. The JSON schema enables code completion, error highlighting, and documentation in the IDE using a YAML language server.
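As a minimal illustration of that mechanism, a Pydantic model can emit the JSON schema that a YAML language server consumes. The model below is hypothetical, not Funnel's actual connector schema, and assumes the Pydantic v2 API:

```python
from pydantic import BaseModel, Field

class Client(BaseModel):
    """Client settings for the external API (hypothetical example model)."""
    base_url: str = Field(description="Root URL of the external API.")
    max_retries: int = Field(default=3, description="Retry budget for failed requests.")

# Pydantic v2 generates a JSON schema from the class definition; the YAML
# language server surfaces its types and descriptions as completion,
# validation, and hover documentation in the IDE.
schema = Client.model_json_schema()
```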
The connector YAML file configures three main components - client, datasets, and tables - and supports running custom Python code to handle platform-specific edge cases. We chose YAML for its readability and its rich editor and JSON schema support.
High-Level Declarative Connector Components
Client: Defines interaction with the external API, including authentication, error handling, rate limiting, and retries.
Datasets: Collections of data from API endpoints, each moving through a three-step lifecycle.
Tables: The final data products, built from one or more datasets. They define the data schema for downstream Funnel applications and are represented as Polars DataFrames for easy joining and transformation.
Declarative Connector Building Blocks
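Put together, a connector definition might look something like the sketch below. The field names and structure are illustrative assumptions, not the framework's actual schema:

```yaml
# Hypothetical connector definition - field names are illustrative only.
client:
  base_url: https://api.example.com/v2
  auth:
    type: oauth2
  rate_limit:
    requests_per_minute: 60

datasets:
  campaigns:
    endpoint: /campaigns
    pagination: cursor

tables:
  campaign_performance:
    from: [campaigns]
    schema:
      - {name: date, type: date}
      - {name: clicks, type: int}
```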
The Declarative Connector framework balances no-code and low-code approaches. When the declarative building blocks don’t fully cover a platform’s use cases, they can be replaced with custom Python implementations, as long as they follow the same interface. For additional flexibility, the connector YAML also supports embedding small Python code expressions directly in the template using Jinja.
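The "same interface" escape hatch can be illustrated with a small sketch. The paginator interface below is hypothetical, not the framework's real one; it only shows how a built-in building block and a custom Python replacement stay interchangeable:

```python
from typing import Iterator, Protocol

class Paginator(Protocol):
    """Hypothetical interface a pagination strategy must satisfy."""
    def pages(self, first_url: str) -> Iterator[str]: ...

class CursorPaginator:
    """Stand-in for a built-in declarative building block (simplified)."""
    def pages(self, first_url: str) -> Iterator[str]:
        yield first_url  # a real implementation would follow cursor tokens

class QuirkyPlatformPaginator:
    """Custom Python replacement for a platform that needs special handling."""
    def pages(self, first_url: str) -> Iterator[str]:
        # e.g. this platform requires an explicit page index query parameter
        for page in range(1, 4):
            yield f"{first_url}?page={page}"

def collect(paginator: Paginator, url: str) -> list[str]:
    """The framework depends only on the interface, not the implementation."""
    return list(paginator.pages(url))
```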
Programming language and framework choices come down not only to syntax and features but also to the developer community, surrounding tools, and developer experience. Our goal is a connector development experience that is approachable, productive, and reliable.
Getting started involves installing the Python library for the connector runtime and CLI tools, plus a custom VSCode extension that configures YAML support and IDE autocompletion.
The CLI tools enable formatting, linting, and validation of YAML, as well as running connector parts in isolation (making HTTP requests, loading datasets, or executing the download entry point) without deploying to a staging environment. An MCP server exposes these tools for AI agent-enhanced development.
For documentation, we adopted the systematic Diátaxis approach (tutorials, how-to guides, explanations, and references), storing it with source code and publishing via Material for MkDocs. The core documentation is a YAML reference automatically generated from Pydantic model docstrings.
To manage frequent framework updates and breaking changes, we developed a YAML migration framework, similar to a database schema migration tool. Every breaking change comes with a migration script that is automatically applied when a connector upgrades to a new version, turning upgrades from a manual, error-prone task into a deterministic automated process.
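A migration script in such a framework might, for example, rename a field across every connector definition. The sketch below is illustrative only and not Funnel's actual migration API:

```python
# Illustrative sketch of a YAML migration framework, in the spirit of database
# schema migrations: each breaking framework change ships one migration script.

def migrate_v3_to_v4(config: dict) -> dict:
    """Rename client.retries -> client.max_retries (hypothetical breaking change)."""
    client = config.get("client", {})
    if "retries" in client:
        client["max_retries"] = client.pop("retries")
    config["framework_version"] = 4
    return config

MIGRATIONS = {3: migrate_v3_to_v4}  # maps from-version -> migration function

def upgrade(config: dict, target_version: int) -> dict:
    """Apply migrations one version at a time until the target is reached."""
    while config["framework_version"] < target_version:
        step = MIGRATIONS[config["framework_version"]]
        config = step(config)
    return config
```

Because each step is a deterministic function of the parsed YAML, upgrading a connector becomes an automated, repeatable operation rather than a hand edit.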
For connector testing, we focus on validating the correctness of API requests and data output. The framework provides a snapshot testing tool based on VCR.py. Tests run in “record” mode to capture and anonymize real HTTP interactions and data output, and “replay” mode to compare subsequent runs against stored snapshots. Snapshot tests serve as regression checks, data examples for documentation, and a safety net for the framework itself.
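The record/replay idea can be sketched in a few lines. Funnel's actual tool builds on VCR.py and writes anonymized HTTP interactions and data output to disk; this simplified stand-in keeps snapshots in memory purely to show the two modes:

```python
import json

class SnapshotRecorder:
    """Minimal record/replay snapshot helper - an illustration of the idea,
    not the VCR.py-based tool described in the text."""

    def __init__(self, mode: str = "record"):
        self.mode = mode
        self.snapshots: dict[str, str] = {}

    def check(self, name: str, value) -> bool:
        serialized = json.dumps(value, sort_keys=True)
        if self.mode == "record":
            self.snapshots[name] = serialized  # capture the current output
            return True
        # replay: compare against the stored snapshot (a regression check)
        return self.snapshots.get(name) == serialized
```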
Data integration will always be at the core of Funnel, not just as a standalone product, but as the backbone that powers our entire Marketing Intelligence suite. Connector development isn’t going away either. It’s evolving into a more accessible, data-centric discipline that emphasizes quality, consistency, and semantic richness over repetitive coding. This shift frees developers to focus on what really matters: designing meaningful datasets, defining relationships, and shaping data models that unlock new marketing insights.
To make connector development accessible to more people across Funnel, we’re continuing to invest heavily in developer experience. That means better tooling, stronger observability and monitoring, and new ways to collaborate on data. We’re also exploring a Connector Builder UI - a step toward empowering customers to create and manage their own custom connectors with minimal technical effort.
Looking further ahead, our vision is to make connector data products even smarter, enriching them with deeper semantics so that data isn’t just collected and transformed, but also understood by the systems and applications that depend on it.