
Data Engineer

Lafayette, LA, USA
CGI
Actively hiring

Sign up for the chance to get matched to this role, and similar opportunities.

As a Data Engineer with Databricks, you will be responsible for designing, developing, and maintaining the infrastructure and systems necessary for data storage, processing, and analysis. You will play a pivotal role in constructing and managing data pipelines that ensure efficient and reliable data integration, transformation, and delivery across the enterprise.

Location: Lafayette, LA (hybrid).

Your future duties and responsibilities
• Design, develop, and maintain robust data pipelines using Databricks and Apache Spark that extract data from various sources, transform it into the desired format, and load it into the appropriate data storage systems.
• Implement and manage advanced data products in our medallion architecture.
• Implement and manage advanced data models, including dimensional, relational, and Data Vault modeling techniques.
• Integrate data from different sources, including databases, data warehouses, APIs, and external systems.
• Ensure data consistency and integrity during integration, performing data validation and cleaning as needed.
• Transform raw data into a usable format by applying data cleansing, aggregation, filtering, and enrichment techniques.
• Optimize data pipelines and data processing workflows for performance, scalability, and efficiency.
• Monitor and tune data systems, identify and resolve performance bottlenecks, and implement caching and indexing strategies to enhance query performance.
• Implement data quality checks and validations within data pipelines to ensure the accuracy, consistency, and completeness of data.
• Manage Databricks Unity Catalog including administrative activities (policies, security, access control, cost control).
• Optimize and administer Databricks environments to ensure high performance and reliability.
• Configure clusters and manage cluster policies in Databricks.
• Build and maintain CI/CD pipelines (DevOps) in GitHub or Azure DevOps.
• Collaborate with cross-functional teams to deliver comprehensive data solutions.
• Ensure data quality, security, and governance across all data processes.
• Communicate effectively with stakeholders to understand requirements and deliver actionable insights.
• Operate within agile teams, contributing to continuous improvement in data engineering practices and processes.
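The pipeline responsibilities above follow a classic extract-transform-load pattern. As an illustrative sketch only (plain Python stands in for PySpark here, and the records, field names, and validation rules are hypothetical):

```python
# Minimal ETL sketch: extract raw records, validate/clean them, then
# aggregate -- a simplified stand-in for the Databricks/Spark pipelines
# described above. All field names and rules are hypothetical examples.

from collections import defaultdict

def extract():
    # A real pipeline would read from databases, APIs, or files.
    return [
        {"region": "LA", "amount": "120.50"},
        {"region": "LA", "amount": "not-a-number"},   # fails validation
        {"region": "TX", "amount": "80.00"},
        {"region": None, "amount": "10.00"},          # fails validation
    ]

def transform(rows):
    # Validation and cleansing: drop rows with a missing region or a
    # non-numeric amount, and cast amount to a float.
    clean = []
    for row in rows:
        if not row.get("region"):
            continue
        try:
            amount = float(row["amount"])
        except (TypeError, ValueError):
            continue
        clean.append({"region": row["region"], "amount": amount})
    return clean

def load(rows):
    # Aggregation: total amount per region -- the kind of curated result
    # a "gold" layer in a medallion-style architecture would hold.
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

totals = load(transform(extract()))
print(totals)  # {'LA': 120.5, 'TX': 80.0}
```

In a Databricks pipeline the same extract/validate/aggregate steps would typically be expressed as PySpark DataFrame transformations across bronze, silver, and gold tables.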


Required qualifications to be successful in this role
• Databricks Certifications (Associate, Professional, Administration).
• Minimum of 8 years of experience in data engineering, with at least 4 years of hands-on experience with Databricks.
• At least 8 years of work experience in data management disciplines, including data integration, modeling, optimization and data quality, or other areas directly relevant to data engineering responsibilities and tasks.
• Proficiency in modern data modeling techniques and data administration.
• Strong knowledge of SQL, Python, PySpark.
• Experience with cloud platforms such as AWS and Azure, including strong familiarity with AWS IAM and Azure AD security mechanisms.
• Expert problem-solving skills, including debugging skills, allowing the determination of sources of issues in unfamiliar code or systems, and the ability to recognize and solve repetitive problems.
• Strong communication skills and a proactive, “getting things done” mindset.
• Experience working in agile teams and familiarity with agile methodologies.
• Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data to support AI, ML, and BI.
• Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP) and modern data warehouse tools (Snowflake, Databricks).
• Experience with database technologies such as SQL, NoSQL, Oracle, Hadoop, or Teradata.
• Ability to collaborate within and across teams of different technical knowledge to support delivery and educate end users on data products.
• Excellent business acumen and interpersonal skills; able to work across business lines at a senior level to influence and effect change to achieve common goals.
• Ability to describe business use cases/outcomes, data sources and management concepts, and analytical approaches/options.
• Ability to translate among the languages used by executive, business, IT, and quant stakeholders.


• Competitive compensation
• Comprehensive insurance options
• Matching contributions through the 401(k) plan and the share purchase plan
• Paid time off for vacation, holidays, and sick time
• Paid parental leave
• Learning opportunities and tuition assistance
• Wellness and well-being programs

