GCP Data Engineer / Consultant Specialist

HSBC
Pune, Maharashtra, India

hackajob is partnering with HSBC to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.

 

Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

Department: CTO Data - Data Tech IWPB & UK - Data Tech IWPB

In this role, you will:

  • Design and deliver batch and streaming ETL/ELT pipelines to ingest data from multiple sources into GCP.
  • Gather requirements with stakeholders and translate business needs into scalable data solutions.
  • Build Apache Beam pipelines on GCP Dataflow for extraction, transformation, and analytical preparation.
  • Integrate data from databases, APIs, and flat files while enforcing data quality, consistency, and governance.
  • Manage and optimise GCP storage/warehouse layers (BigQuery, Cloud Storage) for analytics and reporting.
  • Orchestrate and automate workflows using Airflow/Cloud Composer; reduce manual operations through automation.
  • Implement observability (monitoring/alerting), troubleshoot bottlenecks, and ensure SLA/SLO adherence.
  • Lead SRE practices: reliability strategy, incident response, post-mortems, documentation, security/compliance, and continuous improvement.
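To give a flavour of the batch work described above: in the role itself this would be an Apache Beam pipeline running on GCP Dataflow, loading into BigQuery. The sketch below is a deliberately minimal, stdlib-only Python illustration of the same extract/transform/load shape with a data-quality gate; the record fields and the quality rule are hypothetical, not taken from the posting.

```python
# Toy batch ETL sketch (plain Python, stdlib only). The real pipeline
# would be Apache Beam on Dataflow; fields and rules here are invented
# purely to illustrate the extract -> transform -> load pattern.
import csv
import io
import json

# Hypothetical flat-file source (one of the ingestion sources mentioned).
RAW_CSV = """id,amount,currency
1,100.50,GBP
2,,USD
3,75.00,INR
"""

def extract(source: str):
    """Extract: parse CSV rows from a flat-file source."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    """Transform: enforce a simple data-quality rule (amount must be
    present) and normalise types before loading."""
    clean = []
    for row in rows:
        if not row["amount"]:  # quality gate: drop incomplete records
            continue
        clean.append({
            "id": int(row["id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"],
        })
    return clean

def load(rows):
    """Load: serialise to newline-delimited JSON, the format BigQuery
    accepts for batch loads."""
    return "\n".join(json.dumps(r) for r in rows)

if __name__ == "__main__":
    print(load(transform(extract(RAW_CSV))))
```

In a Beam pipeline the same three stages would map onto a read transform, a `ParDo`/`Map` step, and a BigQuery sink, with Cloud Composer (Airflow) scheduling the run.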

To be successful in this role, you should meet the following requirements:

  • Bachelor’s degree in Computer Science/IT (or equivalent experience).
  • Proven ETL/ELT experience: data modelling, data warehousing concepts, and strong SQL.
  • Strong GCP experience (certification preferred), especially BigQuery/Cloud Storage/Dataflow.
  • Big data batch + streaming expertise (e.g., Kafka, Spark/Flink, Hadoop/Hive/HBase, dbt).
  • Proficiency in Java and/or Python for data engineering and data manipulation.
  • Proven SRE/DevOps experience: architecture, reliability, automation, and operational excellence.
  • Incident management capability: on-call/response, RCA, and post-mortem execution.
  • Strong communication and problem-solving; experience with AI/LLM (preferably financial services).

You’ll achieve more when you join HSBC.
www.hsbc.com/careers 

HSBC is committed to building a culture where all employees are valued and respected and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India


 
