Job Title: Technology & Transformation: EAD: GCP Data Engineer
Job Description:
1. Strong hands-on experience with GCP services, including BigQuery, Cloud Storage, Dataflow, Cloud Dataproc, Cloud Composer/Airflow, and IAM.
2. Must have proficient experience with GCP databases: Bigtable, Spanner, Cloud SQL, and AlloyDB.
3. Proficiency in at least one of SQL, Python, Java, or Scala for data processing and scripting.
4. Experience with development and test automation through CI/CD pipelines (Git, Jenkins, SonarQube, Artifactory, Docker containers).
5. Experience orchestrating data processing tasks using tools like Cloud Composer or Apache Airflow (a minimal DAG sketch follows this list).
6. Strong understanding of data modeling, data warehousing, and big data processing concepts. Solid understanding of and experience with relational database concepts and technologies such as SQL, MySQL, PostgreSQL, or Oracle. Ability to design and implement data migration strategies for various database types (PostgreSQL, Oracle, AlloyDB, etc.). Deep understanding of at least one database type, with the ability to write complex SQL (see the windowed-query sketch after this list). Experience with NoSQL databases such as MongoDB, Scylla, Cassandra, or DynamoDB is a plus.
7. Optimize data pipelines for performance and cost-efficiency, adhering to GCP best practices.
8. Implement data quality checks, data validation, and monitoring mechanisms to ensure data accuracy and integrity (see the validation sketch after this list).
9. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
10. Ability to work independently and manage multiple priorities effectively.
11. Expertise in end-to-end data warehouse (DW) implementation is preferred.
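
Illustrative sketch for item 5: a minimal Airflow DAG of the kind run on Cloud Composer, assuming Airflow 2.4+ with the Google provider installed. The DAG name, bucket, dataset, tables, and stored procedure are hypothetical placeholders, not part of any specific stack.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_load",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ syntax
    catchup=False,
) as dag:
    # Load the day's CSV export from Cloud Storage into a staging table.
    load_to_staging = GCSToBigQueryOperator(
        task_id="load_to_staging",
        bucket="example-landing-bucket",                           # assumed bucket
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="analytics.stg_sales",  # assumed table
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staging rows into the reporting table.
    transform = BigQueryInsertJobOperator(
        task_id="transform_sales",
        configuration={
            "query": {
                # Assumed stored procedure; any transform query works here.
                "query": "CALL analytics.sp_build_sales_report('{{ ds }}')",
                "useLegacySql": False,
            }
        },
    )

    load_to_staging >> transform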
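Illustrative sketch for item 6: the kind of "complex SQL" referenced there, here a windowed analytical query executed through the google-cloud-bigquery Python client. The analytics.orders table and its columns are hypothetical.

from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT
  customer_id,
  order_date,
  amount,
  -- Running total per customer, ordered by date.
  SUM(amount) OVER (
    PARTITION BY customer_id ORDER BY order_date
  ) AS running_total,
  -- Rank each order within its customer by amount.
  RANK() OVER (
    PARTITION BY customer_id ORDER BY amount DESC
  ) AS amount_rank
FROM `analytics.orders`
WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
"""

for row in client.query(query).result():
    print(row.customer_id, row.order_date, row.running_total, row.amount_rank)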
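Illustrative sketch for item 8: simple data quality checks (row count and null rate) run against BigQuery, designed to fail loudly so an orchestrator marks the task as failed. Table, columns, and thresholds are assumptions for illustration.

from google.cloud import bigquery

client = bigquery.Client()

# Each check is a query that returns a single boolean: TRUE means "failed".
checks = {
    # Fail if no rows landed for yesterday's data.
    "rows_loaded": """
        SELECT COUNT(*) = 0
        FROM `analytics.orders`
        WHERE order_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    """,
    # Fail if more than 1% of customer_id values are NULL.
    "null_customer_ids": """
        SELECT SAFE_DIVIDE(COUNTIF(customer_id IS NULL), COUNT(*)) > 0.01
        FROM `analytics.orders`
    """,
}

failures = []
for name, sql in checks.items():
    # Each query returns exactly one row with one boolean field.
    failed = next(iter(client.query(sql).result()))[0]
    if failed:
        failures.append(name)

if failures:
    raise RuntimeError(f"Data quality checks failed: {failures}")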