Job Description: Senior Data Engineer - GCP
Functional Title: GCP Senior Data Engineer
Start Date: ASAP
Job Type: Full Time
Experience: 7+ Years
Job Location: Bangalore

Responsibilities:
- Clean, prepare, and optimize data at scale for ingestion and consumption by machine learning models
- Drive the implementation of new data management projects and the restructuring of the current data architecture
- Implement complex automated workflows and routines using workflow scheduling tools
- Build continuous integration, test-driven development, and production deployment frameworks
- Drive collaborative reviews of designs, code, test plans, and dataset implementations performed by other data engineers to maintain data engineering standards
- Anticipate, identify, and resolve data management issues to improve data quality
- Design and build reusable components, frameworks, and libraries at scale to support machine learning products
- Design and implement product features in collaboration with business and technology stakeholders
- Analyze and profile data in order to design scalable solutions
- Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational issues
- Mentor and develop other data engineers in adopting best practices
- Influence and communicate effectively, both verbally and in writing, with team members and business stakeholders

Qualifications:
- 4+ years of experience developing scalable PySpark applications or solutions on distributed platforms
- Experience with Google Cloud Platform (GCP); familiarity with other cloud platforms is a plus
- Experience working with data warehousing tools, including DynamoDB, SQL, and Snowflake
- Experience architecting data products on streaming, serverless, and microservices architectures and platforms
- Experience with PySpark, Spark (Scala/Python/Java), and Kafka
- Work experience with Databricks (Data Engineering and Delta Lake components)
- Experience working with big data platforms, including Dataproc, Databricks, etc.
- Experience working with distributed technology tools, including Spark, Presto, Databricks, and Airflow
- Working knowledge of data warehousing and data modeling
- Experience working in Agile and Scrum development processes
- Bachelor's degree in Computer Science, Information Systems, Business, or another relevant subject area

If interested, please share your resume at hidden_email
Employment Category:
Employment Type: Full time
Industry: IT Services & Consulting
Role Category: Not Specified
Functional Area: Not Specified
Role/Responsibilities: GCP Big Data Engineer Job in KogniVera at