hackajob is partnering with Barclays to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.
At Barclays you will spearhead the transformation of our data landscape, driving innovation and excellence across enterprise data platforms. As a Senior Data Lead, you will leverage cutting-edge AWS technologies and Databricks to modernize and migrate SQL Server workloads into scalable, secure, and compliant cloud-native solutions. With a primary focus on AWS services including S3, Glue, Redshift, Athena, and Iceberg, alongside Python, SQL, and orchestration tools like Airflow, you will design and deliver robust data pipelines, establish strong data governance frameworks including MNPI controls, and enable high-quality, insight-driven decision-making.
To be successful in this role you should possess:
Proficiency in designing and implementing scalable data platforms on AWS using services such as S3, Glue (ETL jobs, Data Catalog), Redshift, Athena, and Iceberg, with strong hands-on experience in Databricks for building notebooks and ELT pipelines and for working with distributed data processing frameworks such as Apache Spark.
Expertise in migrating large-scale SQL Server workloads to AWS, including defining source-to-target mappings and ensuring performance and data integrity.
Proficiency in Python and SQL for developing data pipelines, transformations, and analytical workflows, along with experience building and orchestrating ETL/ELT pipelines using AWS Glue, DBT, and tools such as Airflow/Astronomer.
Ability to define data contracts, design Iceberg table schemas, and implement partitioning, metadata management, and Glue Catalog integration.
Strong understanding of data governance, including implementation of Lake Formation policies for row- and column-level security.
Experience embedding MNPI controls, including data classification, access restrictions, and working with Compliance/CISO stakeholders.
Hands-on experience with CI/CD pipelines, DevOps practices, and Agile delivery methodologies.
Ability to design and optimize data lake and warehouse solutions for structured and unstructured data across AWS platforms.
Some other highly valued skills may include:
Strong analytical skills to analyze large datasets, identify trends, and provide actionable insights.
Experience leveraging GenAI tools (GitLab Duo, Copilot, Claude) to accelerate development and code generation.
Familiarity with BI tools such as Power BI or Tableau for data visualization and reporting, exposure to legacy tools such as SSIS, and experience with version control systems like Git, GitHub, or GitLab.
Ability to document data architecture, pipelines, and governance processes for maintainability and knowledge sharing.
Strong communication and stakeholder management skills to collaborate effectively across business, technology, and compliance teams.
You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills.
The location of the role is Pune, IN.
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes to ensure that all data is accurate, accessible, and secure.
Accountabilities
Vice President Expectations
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.