**Overview**

We are looking for a senior-level Data Platform Engineer who will own the entire data lifecycle, from raw ingestion through clean, standardized tables to production-grade KPI calculation and API exposure. This is a true "full-stack data" role: you will deeply understand every data source, build and maintain the PostgreSQL core, implement all business-critical metrics and calculations, and expose them reliably via FastAPI (or GraphQL) endpoints consumed by our applications and dashboards. This is a high-ownership, hands-on engineering position with end-to-end responsibility for data correctness, performance, and usability.

**Key Responsibilities**

**Multi-Source Data Ingestion & Standardization**
- Proactively request, receive, and integrate data from numerous external and internal sources in varied formats (CSV, JSON, Excel, APIs, database dumps, etc.).
- Build robust, automated Python-based ingestion pipelines that parse, validate, cleanse, enrich, and standardize incoming data.
- Transform heterogeneous sources into a single, consistent schema in PostgreSQL with full auditability and error handling.
- Ensure compliance with FAIR(T) principles; familiarity with the ISO 8000 data-quality standard is expected.

**Deep Data Ownership & PostgreSQL Mastery**
- Design, evolve, and performance-tune PostgreSQL schemas, tables, indexes, materialized views, and complex SQL functions.
- Maintain referential integrity, slowly changing dimensions, and reference/lookup tables.
- Develop intimate, expert-level knowledge of every field, entity, and business meaning in the database.
- Own database migrations, versioning, documentation, and query optimization.

**KPI & Analytical Metrics Engine**
- Translate business requirements into precise, reproducible KPI calculations (revenue, utilization, compliance ratios, performance scores, etc.).
- Implement all KPIs as reusable, version-controlled logic in both SQL (stored functions/views) and Python modules.
- Guarantee 100% consistency between metrics shown in dashboards, APIs, and reports.
- Build a central, well-documented library of metrics that the entire company trusts as the single source of truth.

**API-First Data Exposure (FastAPI / GraphQL)**
- Design and maintain a clean, performant API layer using FastAPI (or GraphQL) that exposes KPIs, aggregated datasets, and granular queries to frontend applications, dashboards, and third-party consumers.
- Create parameterized, secure, and cache-friendly endpoints with clear contracts and OpenAPI/schema documentation.
- Optimize query performance for high-concurrency API traffic and large reporting workloads.

**Data Modeling & Long-Term Integrity**
- Define and enforce canonical data models for core entities (assets, sites, transactions, customers, events, classifications, etc.).
- Keep the data warehouse clean, de-duplicated, and future-proof as new sources and features are added.
- Collaborate closely with engineering, product, and analytics teams to evolve models without breaking existing consumers.

**Required Skills & Experience**
- Bachelor's or Master's degree in a related field.
- 7+ years of hands-on data engineering or platform engineering experience.
- Expert-level Python (pandas, pydantic, clean modular design, testing).
- Expert-level PostgreSQL (complex schema design, query optimization, procedural SQL, performance tuning).
- Proven track record of ingesting and harmonizing messy, multi-format data from many sources.
- Experience building and maintaining a company-wide KPI/metrics framework that is treated as the source of truth.
- Strong experience exposing data via production APIs with FastAPI or GraphQL.
- Mastery of relational and dimensional modeling concepts.
- Ability to work autonomously, communicate complex data concepts clearly, and drive projects to completion.

**Nice-to-Have (Preferred)**
- Previous work in operational analytics, compliance, fintech, or SaaS environments.
- Experience integrating data layers directly into web applications.
- Familiarity with BI tools (Tableau, Power BI, Qlik, etc.).
- Exposure to multi-tenant data architectures.

**What You'll Get**
- Full ownership of the data platform that powers every analytical and product decision in the company.
- Direct, measurable impact on product performance, client reporting, and business outcomes.
- A role that combines deep data engineering, metrics definition, and API development with no silos.
- High visibility and influence on architecture and strategy as the company scales.

Job Type: Full-time
Pay: $100,000.00 - $130,000.00 per year
Benefits:
* 401(k)
* 401(k) matching
* Health insurance
* Paid time off
Work Location: Hybrid remote in Bridgewater, NJ 08807
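To give candidates a concrete flavor of the "parse, validate, cleanse, standardize" ingestion work described above, here is a minimal, stdlib-only Python sketch. The record schema, field names, and CSV feed are hypothetical; a production pipeline would typically use pandas and pydantic as noted in the requirements.

```python
import csv
import io
from dataclasses import dataclass
from datetime import date

# Hypothetical canonical record for one standardized reading.
@dataclass(frozen=True)
class AssetReading:
    asset_id: str
    reading_date: date
    value: float

def ingest_csv(raw: str) -> tuple[list[AssetReading], list[str]]:
    """Parse, validate, and standardize a raw CSV feed.

    Returns (clean_records, error_messages): bad rows are quarantined
    with their line number for auditing instead of silently dropped.
    """
    records, errors = [], []
    for lineno, row in enumerate(csv.DictReader(io.StringIO(raw)), start=2):
        try:
            records.append(AssetReading(
                asset_id=row["AssetID"].strip().upper(),   # normalize IDs
                reading_date=date.fromisoformat(row["Date"]),
                value=float(row["Value"]),
            ))
        except (KeyError, ValueError) as exc:
            errors.append(f"row {lineno}: {exc!r}")
    return records, errors

raw_feed = "AssetID,Date,Value\n a-101 ,2024-03-01,42.5\nA-102,not-a-date,7\n"
clean, bad = ingest_csv(raw_feed)
print(len(clean), len(bad))   # 1 clean row, 1 quarantined row
print(clean[0].asset_id)      # "A-101" after whitespace/case standardization
```

The quarantine list is what makes the pipeline auditable: every rejected row is traceable back to its source line rather than disappearing during cleansing.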
**Job Title**
Data Platform Engineer (ETL, Data Modeling, KPI & API Layer)
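The KPI-engine responsibilities above call for metrics implemented as version-controlled logic that stays consistent between SQL and Python. As a minimal, stdlib-only sketch of the Python side (the metric, its id, and the versioning scheme are hypothetical):

```python
from dataclasses import dataclass

# Hypothetical metric registration: each KPI carries an id, a version,
# and a description, so dashboards, APIs, and reports can verify they
# are all reading the same definition.
@dataclass(frozen=True)
class KpiDef:
    kpi_id: str
    version: str
    description: str

UTILIZATION = KpiDef("utilization", "1.2.0", "active hours / available hours")

def utilization(active_hours: float, available_hours: float) -> float:
    """Utilization ratio, rounded to 4 places so the Python module and
    the corresponding SQL stored function return identical values."""
    if available_hours <= 0:
        raise ValueError("available_hours must be positive")
    return round(active_hours / available_hours, 4)

print(UTILIZATION.version, utilization(6.0, 8.0))  # 1.2.0 0.75
```

Pinning rounding and edge-case behavior in one place is what makes the "100% consistency across dashboards, APIs, and reports" guarantee testable rather than aspirational.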