
Senior Data Engineer

Kainos
Remote
Actively hiring

hackajob is partnering with Kainos to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.

 

As a Senior Data Engineer (Senior Associate) at Kainos, you will be responsible for designing and developing data processing and data persistence software components for solutions that handle data at scale. Working within a multi-skilled agile team, you will provide strong development leadership, take responsibility for significant technical components of data systems, and design and develop large-scale data processing software to meet user needs in demanding production environments.

Your responsibilities will include:

  • Working to develop data processing software primarily for deployment in Big Data technologies. The role encompasses the full software lifecycle including design, code, test and defect resolution.

  • Working with Architects and Lead Engineers to ensure the software supports non-functional needs.

  • Collaborating with colleagues to resolve implementation challenges and to ensure code quality and maintainability remain high, leading by example on code quality.

  • Working with operations teams to ensure operational readiness.

  • Advising customers and managers on the estimated effort and technical implications of user stories and user journeys.

  • Coaching and mentoring team members.

 

MINIMUM (ESSENTIAL) REQUIREMENTS:

  • Strong software development experience in one of Java, Scala, or Python

  • Software development experience with data-processing platforms from vendors such as AWS, Azure, GCP, or Databricks

  • Experience of developing substantial components for large-scale data processing solutions and deploying them into a production environment

  • Proficient in SQL and SQL extensions for analytical queries

  • Solid understanding of ETL/ELT data processing pipelines and design patterns

  • Aware of key features and pitfalls of distributed data processing frameworks, data stores and data serialisation formats

  • Able to write quality, testable code and has experience of automated testing

  • Experience with Continuous Integration and Continuous Deployment techniques

  • A keen interest in AI technologies

 

 

DESIRABLE REQUIREMENTS:

  • Experience of performance tuning

  • Experience of data visualisation and complex data transformations

  • Experience with streaming and event-processing architectures, including technologies such as Kafka and change-data-capture (CDC) products

  • Expertise in continuous improvement and sharing input on data best practice

  • Practical experience with AI technologies, tools, processes and delivery

 


 
