JOB DESCRIPTION
AWS Data Engineer (AI, Data Lake, Snowflake, Python, Spark, Copilot & Claude)
Job Summary
As a Software Engineer II at JPMorgan Chase within Consumer & Community Banking (CCB), you will design, build, and optimize scalable data pipelines and architectures on AWS, leveraging Data Lake, Snowflake, and distributed processing technologies.
Job Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes on AWS (S3, Glue, Lambda, Redshift, etc.) using Python and Spark.
- Architect and implement Data Lake solutions, ensuring efficient data ingestion, storage, and retrieval.
- Integrate and manage Snowflake environments for data warehousing and analytics.
- Develop and optimize distributed data processing workflows using Apache Spark (PySpark).
- Collaborate with AI/ML teams to enable data-driven models and solutions, supporting feature engineering and model deployment.
- Leverage AI coding assistants such as Copilot and Claude to accelerate development, improve code quality, and automate repetitive tasks.
- Optimize data workflows for performance, reliability, and cost efficiency.
- Ensure data quality, governance, and security across all platforms.
- Automate data processing tasks using Python, Spark, and AWS-native tools.
- Monitor, troubleshoot, and resolve issues in data pipelines and infrastructure.
- Document technical solutions and provide knowledge transfer to team members.
Required qualifications, capabilities, and skills
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with hands-on AWS experience.
- Strong proficiency in AWS services: S3, Glue, Lambda, Redshift, IAM, etc.
- Experience with Data Lake architecture and implementation.
- Expertise in Snowflake data warehousing, including schema design, performance tuning, and security.
- Advanced programming skills in Python and SQL.
- Hands-on experience with Apache Spark (preferably PySpark) for large-scale data processing.
- Familiarity with AI/ML concepts and workflows; experience supporting data science teams.
- Experience using AI coding assistants such as GitHub Copilot and Claude to enhance productivity and code quality.
- Knowledge of data governance, security, and compliance best practices.
- Excellent problem-solving and communication skills.
Preferred qualifications, skills, and capabilities
- Experience with workflow/orchestration tools such as Apache Airflow.
- Exposure to DevOps practices and CI/CD pipelines.
- AWS certification (e.g., AWS Certified Data Analytics, Solutions Architect).
- Experience with real-time data processing and streaming (e.g., Kinesis, Kafka).
- Familiarity with BI tools (e.g., Tableau, Power BI).
ABOUT US
hackajob is partnering with JPMorganChase to fill this position. Create a profile to be automatically considered for this role, and for others that match your experience.