
Senior Data Engineer

Version 1
Remote
Data Engineer
Actively hiring
hackajob is partnering with Version 1 to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.

 

This is an exciting opportunity for an experienced developer of large-scale data solutions. You will join a team delivering a transformative cloud-hosted data platform for a key Version 1 customer.

The ideal candidate will have a proven track record as a senior, self-starting data engineer implementing data ingestion and transformation pipelines for large-scale organisations. We are seeking someone with deep technical skills across a range of technologies, specifically Snowflake, dbt and Databricks, to play an important role in developing and delivering early proofs of concept and production implementations.

You will ideally have experience building solutions using a variety of open-source tools and Microsoft Azure services, and a proven track record of delivering high-quality work to tight deadlines.

Your main responsibilities will be:

  • Designing and developing robust ingestion and transformation pipelines using Snowpark, dbt, SQL, and orchestration tools (e.g., ADF/Airflow).
  • Implementing Zero-Copy Cloning, Time Travel, Materialized Views, and Tasks/Streams for reliable data flows (see the Streams/Tasks sketch after this list).
  • Embedding data quality checks and lineage.
  • Tuning Virtual Warehouses, caching, micro-partitioning, and query plans.
  • Applying FinOps practices: right-sizing compute, implementing auto-suspend/auto-resume, usage dashboards, and resource monitors (see the warehouse sketch below).
  • Configuring roles, RBAC, masking policies, row-level access, and tag-based governance (see the governance sketch below).
  • Operationalising Data Contracts and collaborating with platform/security teams on compliance.
  • Developing scalable and re-usable frameworks for ingestion and transformation of large data sets.
  • Working with other members of the project team to support delivery of additional project components (reporting tools, API interfaces, search).
  • Working within an Agile delivery / DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints.
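
For illustration only (not part of the role description): a minimal Snowflake sketch of the Streams/Tasks pattern mentioned above, assuming hypothetical raw_orders and curated_orders tables and a transform_wh warehouse.

    -- Capture changes landing in a (hypothetical) raw table.
    CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

    -- Run the transformation only when the stream has new rows.
    CREATE OR REPLACE TASK transform_orders_task
      WAREHOUSE = transform_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO curated_orders
      SELECT order_id, customer_id, TRY_TO_NUMBER(amount) AS amount
      FROM raw_orders_stream
      WHERE METADATA$ACTION = 'INSERT';

    ALTER TASK transform_orders_task RESUME;  -- tasks are created suspended

    -- Zero-Copy Cloning and Time Travel for safe testing and recovery:
    CREATE TABLE curated_orders_dev CLONE curated_orders;
    SELECT * FROM curated_orders AT(OFFSET => -3600);  -- state one hour ago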
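Likewise, a hedged sketch of the FinOps practices listed above, reusing the same hypothetical transform_wh warehouse; quota and thresholds are illustrative.

    -- Right-size compute; suspend after 60 idle seconds, resume on demand.
    ALTER WAREHOUSE transform_wh SET
      WAREHOUSE_SIZE = 'SMALL'
      AUTO_SUSPEND = 60
      AUTO_RESUME = TRUE;

    -- Cap monthly credit usage, with a notification before the hard stop.
    CREATE OR REPLACE RESOURCE MONITOR transform_wh_monitor
      WITH CREDIT_QUOTA = 100
           FREQUENCY = MONTHLY
           START_TIMESTAMP = IMMEDIATELY
           TRIGGERS ON 80 PERCENT DO NOTIFY
                    ON 100 PERCENT DO SUSPEND;

    ALTER WAREHOUSE transform_wh SET RESOURCE_MONITOR = transform_wh_monitor;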
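And a governance sketch covering tag-based masking plus row-level access, assuming a hypothetical DATA_ADMIN role, a role_region_map mapping table, and customer_email/region columns on the curated table.

    -- Tag-based masking: one policy on the tag masks every tagged column.
    CREATE TAG IF NOT EXISTS pii;
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() = 'DATA_ADMIN' THEN val ELSE '***MASKED***' END;
    ALTER TAG pii SET MASKING POLICY email_mask;
    ALTER TABLE curated_orders MODIFY COLUMN customer_email SET TAG pii = 'email';

    -- Row-level access driven by a (hypothetical) role-to-region mapping table.
    CREATE OR REPLACE ROW ACCESS POLICY region_rls AS (region STRING) RETURNS BOOLEAN ->
      CURRENT_ROLE() = 'DATA_ADMIN'
      OR EXISTS (SELECT 1 FROM role_region_map m
                 WHERE m.role_name = CURRENT_ROLE() AND m.region = region);
    ALTER TABLE curated_orders ADD ROW ACCESS POLICY region_rls ON (region);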

Qualifications

  • Direct experience of building data pipelines using Snowpark, dbt, SQL, and orchestration tools (e.g., ADF/Airflow).
  • SnowPro Core, SnowPro Advanced Architect/Data Engineer, or relevant cloud certifications (Azure/AWS).
  • Hands-on experience designing and delivering data solutions on the Azure and AWS cloud platforms.
  • Experience building data warehouse solutions using ETL/ELT tools such as Databricks and Teradata.
  • Comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines that use typical data-quality functions: standardisation, transformation, rationalisation, linking and matching.

 


 
