hackajob is partnering with Verisk Analytics to fill this position. Create a profile to be automatically considered for this role—and others that match your experience.
As a technical lead, you will design, build, and operate scalable data platforms in AWS while also guiding engineers through best
practices, code quality, and delivery excellence. You will work closely with product owners, architects, and stakeholders to translate
business requirements into robust, production-grade data solutions.
• Accountable for the delivery of data solutions for top insurance and financial services clients worldwide.
• Design, build, and optimize cloud-native data platforms on AWS, including data lakes, data warehouses, and analytics
stores.
• Collaborate with BI and analytics platform vendors and internal teams to support production issues, upgrades, and platform
enhancements.
• Document data sources on client projects, perform gap analysis, identify inflows of bad data into the cloud data warehouse, and develop recommendations.
• Develop database solutions by designing the proposed system and defining the database's physical structure, functional capabilities, security, backup, and recovery specifications for internal and external stakeholders.
• Work closely with the ETL/ELT team to implement data transformation requirements in the cloud data warehouse.
• Lead, mentor, and coach a team of data engineers, fostering technical excellence and continuous improvement.
• Support onboarding and skill development of team members through guidance, pairing, and code reviews.
• Plan and track delivery commitments, ensuring predictable and high-quality outcomes.
• Partner with product owners, architects, and stakeholders to prioritize work and manage expectations.
• Contribute to hiring by participating in interviews and technical assessments.
• 5-8+ years of experience in Python programming, specifically in complex data loading.
• 5-8+ years of strong hands-on ETL experience is a must, along with prior experience in an RDBMS (PostgreSQL preferred).
• 2+ years of mandatory hands-on experience in AWS cloud technologies, CI/CD (Git/Bitbucket), and managing data in AWS.
• 2+ years of hands-on experience in Snowflake is required.
• Data mapping expertise: the ability to understand business requirements and derive data dictionaries and data asset requirements for ETL/ELT and DBA groups.
• Excellent written and verbal communication skills, and attention to detail.
• Strong analytical skills to understand requirements and complete tasks as an individual contributor.
• Ability to collaborate with peers and develop good working relationships.
• Knowledge of data warehousing on the AWS platform is required.
• AWS certification, especially Cloud Practitioner, is desired.
• Hands-on experience with Snowflake is an asset.
• Hands-on experience with Linux shell scripting is preferred.
Level up the hackajob way. Verify your skills, learn brand new ones and test your ability with Pathways, our learning and development platform.