Why we need this role
The Data Engineer will be a key contributor to building, optimising, and maintaining Colt DCS’s modern data environment. Working within the Data Integrations & BI team, you will design and support data pipelines, integrations, Azure data structures, and Power BI‑ready datasets across our Microsoft-centric ecosystem.
This role ensures the organisation has high‑quality, reliable, and well‑structured data for analytics, reporting, and operational decision‑making. You will be responsible for the technical development of integration workflows, ETL/ELT processes, Azure data structures, and supporting Power BI data models used across the organisation.
What we're looking for
Key Responsibilities
Data Engineering & Development
- Design, build, and maintain robust data pipelines using Azure Data Factory, Logic Apps, and related technologies.
- Develop and optimise ELT/ETL workflows for ingestion, transformation, and delivery into the Azure SQL data warehouse.
- Implement efficient data models and structures to support analytics and reporting requirements.
- Build and maintain integration solutions between business systems (e.g., Dynamics 365, SugarCRM, Workday, operational platforms).
- Ensure data pipelines are scalable, secure, and aligned with architectural standards.
Data Quality, Reliability & Operations
- Monitor, troubleshoot, and resolve issues in data workflows and integrations.
- Maintain BAU operations such as daily data refreshes, pipeline monitoring, and user support.
- Implement data quality measures, validation rules, and reconciliation checks.
- Contribute to data governance practices, including metadata, lineage, and documentation.
Collaboration & Stakeholder Engagement
- Work closely with BI developers, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Collaborate with external partners delivering development or support services.
- Support business teams in improving the use and understanding of enterprise data assets.
Continuous Improvement & Innovation
- Identify opportunities to automate processes, improve performance, and enhance data reliability.
- Research and evaluate emerging tools, features, and best practices within the Azure and data engineering landscape.
- Contribute to the long‑term data platform roadmap and architectural decisions.
Skills and Experience
- 3–5+ years’ experience as a Data Engineer or in a similar role.
- Strong experience designing, building, and supporting data pipelines using Azure Data Factory, Azure Logic Apps, or similar tools.
- Hands‑on experience with Azure SQL Database, including relational modelling and performance optimisation; strong SQL proficiency for data transformation and analysis.
- Experience developing ETL/ELT pipelines and working with structured/unstructured data sources.
- Understanding of data warehousing concepts, star schemas, and modern analytics patterns.
- Familiarity with the Microsoft ecosystem (Azure, Power BI, Power Platform).
- Experience with DevOps practices such as version control (Git), deployment automation, or CI/CD.
- Excellent analytical and problem-solving skills, with strong business acumen.
- Experience integrating with business systems such as Dynamics 365, Salesforce, or similar platforms.
- Exposure to Python or other scripting languages used in data engineering.
- Ability to translate complex data challenges into clear business outcomes and priorities.
- Exceptional communication and interpersonal skills, with the ability to influence and engage stakeholders at all levels.
hackajob is partnering with Colt Technology Services to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.