Role: GCP Data Engineers
Job Description
Proficiency in Google Cloud Platform (GCP) services such as BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage, and Pub/Sub.
Strong expertise in SQL for querying, transforming, and managing data within relational databases and BigQuery.
Hands-on experience with orchestration tools like Apache Airflow or GCP Cloud Composer for automating and scheduling data workflows (a minimal DAG sketch is included after this list for illustration).
Solid knowledge of Data Warehousing (DW) concepts, including dimensional modelling, star schema, and snowflake schema design.
Understanding of Slowly Changing Dimensions (SCD), especially SCD Type 2, for managing historical data in data warehousing environments (an illustrative SCD Type 2 sketch also follows this list).
Strong grasp of ETL/ELT concepts, with experience in building scalable and efficient data pipelines.
Experience using version control systems such as Git and implementing CI/CD practices for data engineering workflows.
Ability to work with large datasets, optimize performance, and ensure data quality and reliability.
Strong analytical and problem-solving skills with the ability to work in a collaborative environment.
Good communication skills to interact with cross-functional teams and stakeholders.
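For illustration of the orchestration requirement above, here is a minimal sketch of an Airflow / Cloud Composer DAG that schedules a daily ELT-style transformation in BigQuery. The DAG id, project, dataset, and table names are hypothetical placeholders, and the example assumes an Airflow 2.4+ environment with the apache-airflow-providers-google package installed (as in typical Cloud Composer setups).

```python
# Minimal Airflow DAG sketch: schedule a daily ELT-style transformation in BigQuery.
# All dataset and table names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_rollup",      # hypothetical pipeline name
    schedule="@daily",                # Airflow 2.4+ keyword; older versions use schedule_interval
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["gcp", "bigquery", "elt"],
) as dag:
    # Run the transformation entirely inside BigQuery (ELT): aggregate raw events
    # into a reporting table that downstream dashboards can query.
    rollup_daily_sales = BigQueryInsertJobOperator(
        task_id="rollup_daily_sales",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE reporting.daily_sales AS
                    SELECT DATE(order_ts) AS order_date,
                           SUM(amount)    AS total_amount
                    FROM raw.sales_events
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
        location="US",
    )
```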

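To make the dimensional-modelling and SCD Type 2 requirements concrete, the following is a minimal sketch, assuming a hypothetical star-schema dimension dw.dim_customer (with valid_from, valid_to, and is_current columns) loaded from a staging.customers table via the google-cloud-bigquery client. The table and column names are illustrative only, not part of the actual stack for this role.

```python
# SCD Type 2 sketch against BigQuery: close out changed rows, then insert new versions.
# Table and column names (dw.dim_customer, staging.customers, email) are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # picks up the project from the environment's credentials

# Step 1: expire the current version of any customer whose tracked attribute changed.
close_out_sql = """
UPDATE dw.dim_customer d
SET is_current = FALSE,
    valid_to = CURRENT_DATE()
FROM staging.customers s
WHERE d.customer_id = s.customer_id
  AND d.is_current = TRUE
  AND d.email != s.email
"""

# Step 2: insert a new current version for changed customers and brand-new customers
# (both now lack a row with is_current = TRUE).
insert_sql = """
INSERT INTO dw.dim_customer (customer_id, email, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, CURRENT_DATE(), DATE '9999-12-31', TRUE
FROM staging.customers s
LEFT JOIN dw.dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE d.customer_id IS NULL
"""

for sql in (close_out_sql, insert_sql):
    client.query(sql).result()  # .result() blocks until each DML job finishes
```

In practice the two statements would usually run inside a single BigQuery multi-statement transaction, or be rewritten as a MERGE, so the dimension is never left half-updated between steps.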
NEURONIMBUS SOFTWARE SERVICES PRIVATE LIMITED
Neuronimbus is a 15-year-old digital solutions company that believes jargon and complex terminology are best left to boardrooms. We are a zealous bunch who go after the web and mobile solutions that excite us and always look ...