hackajob is partnering with Virgin Media O2 X giffgaff to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.
Key responsibilities:
● Pipeline Design & Build (ETL/ELT): Design and implement robust, high-volume ETL/ELT data pipelines to ingest data from various sources (including high-velocity Kafka streams and batch feeds) into our Snowflake enterprise data platform.
● Orchestration & Automation: Build, automate, and manage the lifecycle of complex data workflows using cloud-native orchestration tools like Argo Workflows running on Kubernetes.
● CI/CD Implementation: Own and maintain the Continuous Integration/Continuous Delivery (CI/CD) pipelines for data applications using tools like GitHub Actions.
● Observability & Reliability: Implement comprehensive monitoring and observability solutions, leveraging tools like Prometheus and Grafana, to track pipeline performance, detect anomalies, and ensure minimal downtime and data integrity.
● Data Quality & Governance: Establish and automate data quality checks, validation rules, and governance procedures at scale, ensuring high data reliability and strict GDPR and security compliance.
● Data Modeling: Design and implement scalable, auditable data models using modern techniques (e.g., Kimball, Data Vault) within dbt.
● Collaboration: Work closely with Data Scientists, Business Intelligence Analysts, and other product stakeholders to translate complex business requirements into high-performing, reliable data solutions.
Essential Expertise (Must-haves):
● Education: Bachelor's degree in computer science, engineering, mathematics, or a related field.
● Data Engineering Experience: 3+ years of experience in data engineering, with a proven focus on platform reliability and automation.
● Core Languages: Deep proficiency in Python for pipeline development, and expert command of SQL for complex querying and data modeling.
● Data Warehouse: Extensive, hands-on experience with modern cloud data warehouses, specifically Snowflake.
● Data Transformation: Expert proficiency in using dbt (data build tool) for managing the transformation layer and implementing data models (e.g., dimensional modeling).
● Containerization & Orchestration: Proven experience with Docker and Kubernetes for deploying and scaling data microservices and applications.
● Workflow Management: Direct experience orchestrating complex data pipelines using cloud-native tools like Argo Workflows, Apache Airflow, or Prefect/Dagster.
● Cloud Platform: Experience deploying and managing data infrastructure on major cloud platforms (AWS, GCP, or Azure).