hackajob is partnering with CGI to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.
Position Description
We are seeking an experienced Big Data Engineer to join our team and play a crucial role in architecting, developing, and optimizing complex data pipelines on Google Cloud Platform (GCP). The ideal candidate will have a strong background in big data technologies, deep hands-on experience with GCP, and a proven track record of managing large-scale data environments and ingesting high volumes of data seamlessly.
This position is located in our Fairfax, VA office; however, a hybrid working model is acceptable.
Your future duties and responsibilities
• Design, build, and maintain scalable big data solutions on GCP.
• Develop and optimize ETL processes, ensuring data integrity and quality across the data lifecycle.
• Implement efficient data ingestion strategies to handle large volumes of data from various sources.
• Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
• Implement data security best practices and ensure compliance with internal and external regulations.
• Monitor and troubleshoot data infrastructure, ensuring optimal performance and uptime.
• Produce and maintain technical documentation for data architecture and processes.
• Stay up-to-date with the latest trends and technologies in big data and cloud computing, and share insights with the team.
Required qualifications to be successful in this role
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 5+ years of experience in big data engineering.
• Strong proficiency with GCP services, including BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
• Proven experience in ingesting, processing, and managing large volumes of data.
• Experience with ETL tools and methodologies.
• Skilled in programming languages such as Python, Java, or Scala.
• Familiarity with data modeling, data warehousing, and relational databases, with strong SQL skills.
• Solid understanding of data security practices.
• Excellent problem-solving skills and attention to detail.
• Strong communication skills and ability to collaborate effectively with team members and stakeholders.
Desired qualifications
• Certifications in Google Cloud Platform.
• Experience with orchestration tools like Apache Airflow.
• Knowledge of machine learning and AI tools on GCP.