
Data Engineer

Paddington, London, UK
Data Engineer · Python Developer · Full Stack Python Developer · SQL Developer
Virgin Media O2 X giffgaff


Data Engineer – Data Platforms (Comms Platform)

Virgin Media O2 are on a mission to become the most recommended brand by our customers. At the heart of this strategy are digital innovation and our significant data resources.

That's the mission you'll be part of, but what about us?

Well, we're super proud of our history, helping communities to stay connected with oodles of top-notch products and services. We offer the full kit and caboodle – broadband, TV, mobile and landline – kitting our customers out with the very latest tech.

But it's not just what we do, but why we do it that really matters.

Our mission is to become the most recommended brand, by our people and our customers. A massive part of that journey is about how we ensure that our brilliant people have a working environment in which they can truly belong and thrive. For us, it's critical that every single person can bring, and be, their whole selves at work, and we're working hard every day to achieve this.

As a Data Engineer, you will…

·       Use cutting-edge technology on Google Cloud Platform (GCP) to develop exceptional data pipelines, encompassing both batch and real-time processing capabilities (see the pipeline sketch after this list).

·       Build and deploy complex microservices that form the foundation of a unified Comms Platform, catering to service and marketing communication channels for VMO2 customers.

·       Provide ongoing support for existing and future Comms Platform systems, ensuring they adhere to service level agreements (SLAs).

·       Troubleshoot and resolve issues related to the Comms Platform, providing timely and effective solutions to maintain system stability and availability.

·       Collaborate with a team of skilled data engineers to implement agreed-upon Objectives and Key Results (OKRs), taking projects from development to a production-level service.

·       Work closely with stakeholders from various departments to gather requirements and ensure the successful delivery of solutions that precisely meet their needs.

·       Utilise Infrastructure as Code (IaC) principles to deploy cloud infrastructure effectively and efficiently using tools like Terraform.

·       Stay up-to-date with the latest industry trends, technologies, and best practices, continuously improving your skills and applying them to enhance the organisation's technical capabilities.

·       Actively participate in code reviews, promoting a culture of high-quality code and sharing knowledge with other team members.

·       Play an instrumental role in identifying areas for improvement, proposing innovative ideas, and contributing to the overall growth and success of the team and the organisation.
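
To give a flavour of the batch and real-time pipeline work described above, here is a minimal, illustrative Apache Beam sketch in Python. It is only a sketch: the project, Pub/Sub topic and BigQuery table names are made-up placeholders, not references to VMO2's actual systems.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode; a batch variant would read from Cloud Storage instead of Pub/Sub.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/comms-events"  # placeholder topic
            )
            | "ParseJson" >> beam.Map(lambda message: json.loads(message.decode("utf-8")))
            | "KeepValid" >> beam.Filter(lambda event: "customer_id" in event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:comms.events",  # placeholder table, assumed to already exist
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()

The same sketch runs locally on the DirectRunner for testing, or on Cloud Dataflow by passing the usual --runner=DataflowRunner, --project and --region pipeline options.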

 

 

We’d love to hear from you, if you…

·       Are business value and delivery focused

·       Have an “I can solve it” attitude

·       Are a team player with excellent communication skills

·       Have strong proficiency in Python, with proven data engineering experience

·       Have strong SQL skills, including analytic functions (see the small example after this list)

·       Understand DevOps best practices, including IaC and CI/CD

·       Have production experience working in a cloud environment, preferably GCP

·       Have production experience with a data warehouse engine such as BigQuery or Redshift

·       Have production experience with a key/value store such as Datastore or DynamoDB
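
By analytic functions we mean SQL window functions (ROW_NUMBER, LAG, SUM OVER and friends). Below is a small, purely illustrative example using the BigQuery Python client; the project, dataset, table and column names are invented for the example.

from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default GCP credentials are configured

# Rank each customer's messages by send time using an analytic (window) function.
# `example-project.comms.messages` and its columns are hypothetical.
query = """
    SELECT
      customer_id,
      message_id,
      ROW_NUMBER() OVER (
        PARTITION BY customer_id
        ORDER BY sent_at DESC
      ) AS recency_rank
    FROM `example-project.comms.messages`
"""

for row in client.query(query).result():
    print(row.customer_id, row.message_id, row.recency_rank)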

 

It would be nice if you…

·       Have worked in an Agile environment

·       Have come across SOLID principles and are interested in applying them

·       Have practised TDD

·       Have used Apache Airflow and maintained pipelines in production

·       Have experience with Apache Beam or Apache Spark

·       Have experience with Kubernetes

·       Have experience using Terraform

 

What we use…

·       Apache Airflow on Cloud Composer (a minimal DAG sketch follows this list)

·       Apache Beam on Cloud Dataflow

·       Cloud Run, Google Kubernetes Engine (GKE), Firestore in Datastore mode, Pub/Sub, BigQuery, Cloud Storage, Terraform, GitLab and other tech if it makes sense
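
As a pointer to how Airflow on Cloud Composer is typically used, here is a minimal, hypothetical DAG sketch in Python. The DAG id, schedule and task callables are placeholders rather than anything from the real Comms Platform.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from a source system")  # placeholder task body


def load():
    print("load the results into BigQuery")  # placeholder task body


with DAG(
    dag_id="comms_daily_batch",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # run extract before load

Cloud Composer picks up DAG files like this from the dags/ folder of its environment's Cloud Storage bucket.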

 

 
