hackajob is partnering with Oritain to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.
Principal Data Engineer
We're building a modern technology platform that applies data engineering and AI to a problem with real global consequence: proving trust and transparency in supply chains. Our work helps protect people from exploitation, gives brands and producers confidence in what they sell, and supports more sustainable choices for the planet.
From a tech perspective, this is greenfield: designing scalable, cloud-native systems and AI-driven products that turn complex data into evidence customers can stand behind. You'll work on hard, meaningful problems with real ownership, close collaboration with product and leadership, and the opportunity to shape both the technology and the impact it delivers as we scale.
We're a global team across London, Auckland, Singapore, and Washington, D.C.
Key Responsibilities
Define and own the technical strategy and architecture for our entire data platform: ingestion, storage, processing, governance, and consumption
Design and implement scalable, reliable ETL/ELT pipelines handling complex scientific datasets, supply chain inputs, and business data
Lead the design of canonical data models for our data warehouse and operational stores, ensuring quality, consistency, and integrity
Define and maintain a single source of truth for clients, suppliers, and transactions across systems (Salesforce, NetSuite, internal databases)
Implement data governance, security policies, automated quality checks, and robust monitoring across all pipelines
Work with the infrastructure team to provision Azure data resources using Terraform or equivalent IaC tooling
Partner with our Science teams to ensure accurate ingestion and translation of raw scientific data
Mentor engineers across the team on best practices for building and consuming data services
Skills & Experience
Essential
7+ years in data engineering, with significant time in a Principal, Lead, or Architect role defining data strategy
Deep, practical experience with Databricks — architecture and implementation
Strong hands-on experience across the Azure data stack: Data Factory, Data Lake, Synapse Analytics, Azure SQL/Cosmos DB
Expert-level Python and SQL, with a strong focus on clean, tested, performant data processing code
Proven track record designing and implementing scalable data warehouses and data marts
Experience with workflow orchestration, CI/CD for data pipelines, and IaC (Terraform)
Desirable
Experience with scientific, geospatial, or time-series data
Background in governance or compliance environments
Familiarity with streaming data technologies
The Recruitment Process
We like to keep our recruitment process smooth and efficient, giving you the opportunity to showcase your skill set in a comfortable environment. The process for this position is a two-stage interview and a short take-home task.
Company Benefits
Paid Leave: 35 days (inclusive of public holidays) + your birthday off
Volunteering Leave Allowance
Enhanced Parental Leave
Life Insurance
Healthcare Cash Plan
Employee Assistance Programme (EAP)
Pension
Monthly Wellbeing Allowance
Breakfast, Snacks, Friday Lunch & Barista Coffee Machine in the office
Learning Portal with over 100,000 assets available to support professional development