hackajob is partnering with Domestic & General to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.
Job summary:
D&G is transforming into a technology-powered product business serving customers around the world. Our products and services rely heavily on compelling digital experiences and data-led journeys for our B2B and B2C clients and customers.
This is a key lead engineering role within D&G's technology team. It presents a challenging and exciting opportunity that will require real enthusiasm and modern data engineering experience to stabilise, enhance and transform D&G's operational customer databases as they move from legacy systems to new, scalable cloud solutions across the UK, EU and US. The role requires an experienced data engineer with good knowledge of IBM DataStage and DB2, plus AWS and Databricks pipelines, who can excel in challenging environments and has the confidence to help the teams steer the right course in the development of the data platform, alongside supporting any required tooling decisions.
The role will enable D&G to deliver a modern data services layer, delivered as a product, which can be consumed by key service channels and stakeholders on demand.
Strategic Impact:
Quality customer data is the lifeblood of D&G's operations, allowing us to serve our customers with outstanding propositions and outcomes. This role will be integral to supporting this through the following areas of delivery:
On-Prem Customer Data Platform Stabilisation
This role will initially help stabilise the existing on-prem customer data platforms to serve our customers and protect the one-billion-pound revenue across the UK and EU. Targets will be to reduce the merge and compliance incident backlog, promote more automation, and support the onboarding of a third party to provide a managed break/fix service.
Support Data Growth in UK, EU and US Markets
Supporting further growth in the UK and EU markets by enhancing the on-prem IBM customer platforms to ensure they remain available, robust and secure for growing data demands, whilst leading the delivery of cloud-based solutions for the US pipelines and data platform.
Knowledge, Expertise, Complexity and Scope:
Knowledge in the following areas is essential:
Data Engineering Experience:
Ensure all developments are tested and deployed within the automated CI/CD pipeline where appropriate.
Version and store all development artefacts in the agreed repository.
Ensure all data are catalogued and that appropriate documentation is created and maintained for all ETL code and associated NFRs.
Collaborate with the product owner (Data) & business stakeholders to understand the requirements and capabilities.
Collaborate with the lead architect and CCoE to align with the best-practice delivery strategy.
Participate in the team's agile planning and delivery process to ensure work is delivered in line with the Product Owner's priorities.
Create low-level designs for Epics and Stories and, where required, support the lead architect in creating the designs that enable the realisation of the Data Lake, operational Customer DB, warehouse and marts, while ensuring scalability, security by design, ease of use, and high availability and reliability.
Identify the key capabilities needed for success, along with the technology choices, coding standards, testing techniques and delivery approach required to provide reliable data services.
Learn emerging technologies to keep abreast of new or better ways of delivering the data pipeline.
Welcome a challenge as an opportunity to learn new things and make new friends, whilst always thinking of better techniques to solve problems.