Typical Day in Role
• Designing, building, and operationalizing the Wealth Data Hub (WDH) using Google Cloud Platform (GCP) data services such as Dataproc, Dataflow, Cloud SQL, BigQuery, Cloud Spanner, Cloud Pub/Sub, Cloud Storage, and Cloud Functions, in combination with technologies such as Spark, Apache Beam/Composer, DBT, Confluent Kafka, and GitHub
• Designing and implementing data ingestion patterns that support batch, streaming, and API interfaces on both ingress and egress.
• Guide a team of data engineers and work hands-on to develop frameworks and custom code, using best practices, that meet demanding performance requirements
• Take the lead in designing and building production data pipelines, from data ingestion to consumption, using GCP services, Java, Python, Scala, BigQuery, DBT, SQL, etc.
• Use Cloud Dataflow with Java/Python to deploy streaming jobs in GCP, as well as batch jobs that read text/JSON files and write them to BigQuery
• Building and managing data pipelines with a deep understanding of workflow orchestration, task scheduling and dependency management
• Conduct proofs of technology using GCP services, working with data architects and solution architects to achieve the desired results and performance.
• Provide end-to-end technical guidance and expertise on how to use Google Cloud effectively to build solutions, creatively applying cloud infrastructure and platform services to solve business problems, and communicating these approaches to different business users
• Provide guidance on implementing application logging, notifications, job monitoring, and performance monitoring
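As a concrete illustration of the batch path described above (reading text/JSON files and writing them to BigQuery), the core of such a job is a per-record parse-and-shape step. The sketch below shows that transform logic in plain Python; the field names (account_id, amount, currency) and the default currency are purely illustrative, not part of the actual WDH schema. In a Dataflow pipeline this function would be applied per element, e.g. `lines | beam.Map(parse_record) | beam.io.WriteToBigQuery(...)`:

```python
import json


def parse_record(line: str) -> dict:
    """Parse one newline-delimited JSON record into a BigQuery-ready row.

    Field names here are hypothetical examples only. In a Dataflow job,
    this function would run once per element of the input PCollection.
    """
    record = json.loads(line)
    return {
        "account_id": str(record["account_id"]),    # STRING column
        "amount": float(record["amount"]),          # FLOAT64 column
        "currency": record.get("currency", "CAD"),  # default if absent
    }


# Example input line, as it might appear in a text/JSON file in Cloud Storage.
sample = '{"account_id": 42, "amount": "1250.75"}'
row = parse_record(sample)
```

Keeping the parse logic in a small pure function like this makes it unit-testable outside the pipeline, which matters for the framework-and-best-practices work the role describes.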
Candidate Requirements/Must Have Skills:
• 8-10 years of experience in data engineering and performance optimization for large OLTP applications, with a minimum of 3 years of working experience as a Google Cloud Platform (GCP) developer
• 5+ years of experience working with relational/NoSQL databases
• 2-3 years of experience with the primary managed data services within GCP, including Dataproc, Dataflow, BigQuery/DBT, Cloud Spanner, Cloud SQL, Cloud Pub/Sub, etc.
• 2-3 years of experience with Google Cloud Platform databases (Cloud SQL, Cloud Spanner, PostgreSQL)
• 1-2 years of experience with data streaming and technologies such as Kafka, Spark Streaming, etc.
Nice-To-Have Skills:
• Working knowledge of developing and scaling Java REST services, using frameworks such as Spring
• Understanding of the Wealth business line and the various data domains required for building an end-to-end solution
• Experience with Infrastructure as Code (IaC) practices and frameworks like Terraform
• Knowledge of Java microservices and Spring Boot
• Strong architecture knowledge with experience in providing technical solutions for cloud infrastructure.
• Active Google Cloud Data Engineer certification or Google Professional Cloud Architect certification preferred
Education:
• Degree in Computer Science or related field