hackajob is partnering with CGI to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.
At CGI, we help organisations turn complex data into powerful, trusted platforms that enable smarter decisions and lasting impact. As a Senior Data Engineer, you’ll be instrumental in designing, building, and evolving a large-scale data platform that underpins critical business outcomes. Working within a collaborative DevOps environment, you’ll deliver resilient, scalable data solutions while shaping best practice and driving continuous improvement. You’ll be encouraged to take ownership of your work, explore creative approaches to data engineering challenges, and grow your career within a supportive culture that values collaboration, innovation, and real-world impact.
CGI was recognised in the Sunday Times Best Places to Work List 2025 and has been named a UK ‘Best Employer’ by the Financial Times. We offer a competitive salary, an excellent pension, private healthcare, and a share scheme (3.5% + 3.5% matching) which makes you a CGI Partner, not just an employee. We are committed to inclusivity, building a genuinely diverse community of tech talent and inspiring everyone, including members of the Armed Forces, to pursue a career in our sector. We are proud to hold a Gold Award in recognition of our support for the Armed Forces Corporate Covenant. Join us and you’ll be part of an open, friendly community of experts. We’ll train and support you in taking your career wherever you want it to go.
Due to the secure nature of the programme, you will need to hold UK Security Clearance or be eligible to obtain it. This is a hybrid role offering the flexibility to balance on-site collaboration and remote working. You’ll primarily work from home or your local CGI office, with occasional travel to client workshops or team sessions at key locations such as Birmingham, London, Manchester, or Leeds.
Your future duties and responsibilities
In this role, you will play a key part in building and maintaining robust data pipelines that support a modern, enterprise-scale data platform. You’ll work closely with architects, analysts, and client stakeholders to ensure data is reliable, accessible, and aligned to architectural standards. You’ll be trusted to take ownership of engineering outcomes, contribute ideas to improve platform performance and resilience, and help shape a positive, collaborative engineering culture.
You will also support and guide less experienced colleagues, sharing knowledge and encouraging high standards across the team, while continuously improving ways of working within an agile DevOps environment.
Key responsibilities include:
• Build & Deliver: Design, develop, and maintain scalable data pipelines using Databricks, Azure Data Factory, and Python.
• Ensure Quality: Maintain data quality, consistency, and lineage across ingestion, transformation, and delivery layers.
• Orchestrate & Monitor: Implement orchestration, scheduling, and monitoring to ensure reliable data operations.
• Collaborate & Align: Work with Data Architects and Analysts to align pipelines with data models and target architecture.
• Troubleshoot & Optimise: Resolve data issues across development and production environments to maintain platform stability.
• Document & Share: Maintain clear technical documentation and contribute to shared engineering knowledge.
• Support & Mentor: Coach and support team members, helping to raise capability across the data engineering function.
Required qualifications to be successful in this role
To be successful, you will bring strong hands-on data engineering experience, a proactive mindset, and the ability to work collaboratively within agile teams. You should be comfortable taking ownership of data solutions, continuously improving them, and supporting others to succeed.
Essential qualifications:
• Strong experience with Databricks, SQL, Azure Data Factory, and Python.
• Experience working within Azure-based data platforms.
• Proven background in building and maintaining scalable data pipelines.
• Experience with Azure DevOps and agile delivery practices.
• Solid understanding of data modelling and data architecture principles.
• Strong problem-solving, communication, and collaboration skills.
• Experience with CI/CD, version control, and DevOps practices for data.
Desirable qualifications:
• Azure Synapse, Blob Storage, data quality/testing tooling, Git, or triplestore technologies.