Typical Day in Role:
• Design, develop and maintain robust data pipelines for ingestion, transformation, and distribution of large datasets.
• Utilize services and tools to automate data workflows and streamline the data engineering process.
• Collaborate with stakeholders and product managers to analyze requirements and build data mappings, data models, and reports.
• Monitor application and pipeline performance.
• Conduct data quality checks.
Candidate Requirements/Must Have Skills:
• 10+ years of experience with Data Warehouse / Data Platforms
• 5+ years of experience building ELT data pipelines from scratch using SQL, working with structured, semi-structured, and unstructured data.
• 2+ years of experience configuring and using data ingestion tools such as Fivetran, Qlik, Airbyte, or similar.
• 5+ years of experience with cloud platforms, specifically GCP.
• 5+ years of experience working as a data developer or data engineer, including programming and building ETL/ELT processes for data integration.
• 5+ years of experience with continuous integration and continuous deployment (CI/CD) pipelines, source control systems such as GitHub or Bitbucket, and infrastructure-as-code tools such as Terraform.
Nice-To-Have Skills:
• Experience in data modelling, manipulating large data sets, writing raw SQL, and applying other data cleaning techniques.
• Python
• dbt
Soft Skills Required:
• Expert problem-solving skills.
• Experience collaborating with DevOps and Scrum teams.
• Demonstrated team player with strong communication skills and a track record of successful product delivery.
• Ability to collaborate across organizational boundaries, build relationships, and achieve broader organizational goals.
• Ability to manage multiple conflicting deadlines, grasp new concepts quickly, and work independently with minimal supervision.
Education & Certificates:
Bachelor’s degree in a technical field such as computer science, computer engineering, or a related discipline, or equivalent experience.