
GCP Big Data Engineer Job in KogniVera @ Kpr sugar apperals


Job Description

    Position: Senior Data Engineer – GCP
    Functional Title: GCP Senior Data Engineer
    Start Date: ASAP
    Job Type: Full Time
    Experience: 7+ Years
    Job Location: Bangalore

    Responsibilities:
      • Clean, prepare and optimize data at scale for ingestion and consumption by machine learning models
      • Drive the implementation of new data management projects and the restructuring of the current data architecture
      • Implement complex automated workflows and routines using workflow scheduling tools
      • Build continuous integration, test-driven development and production deployment frameworks
      • Drive collaborative reviews of design, code, test plans and dataset implementations performed by other data engineers, in support of maintaining data engineering standards
      • Anticipate, identify and solve issues concerning data management to improve data quality
      • Design and build reusable components, frameworks and libraries at scale to support machine learning products
      • Design and implement product features in collaboration with business and technology stakeholders
      • Analyze and profile data for the purpose of designing scalable solutions
      • Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational issues
      • Mentor and develop other data engineers in adopting best practices
      • Influence and communicate effectively, both verbally and in writing, with team members and business stakeholders

    Qualifications:
      • 4+ years of experience developing scalable PySpark applications or solutions on distributed platforms
      • Experience with Google Cloud Platform (GCP); experience with other cloud platforms is a plus
      • Experience working with data warehousing tools, including DynamoDB, SQL, and Snowflake
      • Experience architecting data products on streaming, serverless and microservices architectures and platforms
      • Experience with PySpark, Spark (Scala/Python/Java) and Kafka
      • Work experience with Databricks (Data Engineering and Delta Lake components)
      • Experience working with big data platforms, including Dataproc, Databricks, etc.
      • Experience working with distributed technology tools including Spark, Presto, Databricks and Airflow
      • Working knowledge of data warehousing and data modeling
      • Experience working in Agile and Scrum development processes
      • Bachelor's degree in Computer Science, Information Systems, Business, or another relevant subject area

    If interested, please share your resume at hidden_email

    #GCP #GoogleCloud #DataEngineer #CloudEngineer #BigData #SeniorDataEngineer #CloudComputing #GoogleCloudPlatform #DataEngineering #TechJobs #CloudArchitecture #DataAnalytics #GCPJobs #Hadoop #MachineLearning #AI #TechHiring

Employment Category:

Employment Type: Full time
Industry: IT Services & Consulting
Role Category: Not Specified
Functional Area: Not Specified
Role/Responsibilities: GCP Big Data Engineer Job in KogniVera

Contact Details:

Company: KogniVera
Location(s): Other Karnataka



Key skills:   Data warehousing, SQL, Snowflake, Streaming, Spark, Scala, Python, Java, Kafka, Airflow, Presto, Agile, Scrum, Data modeling

 This job posting appears to be old and may have expired.

₹ Not Specified

Similar positions

Business Development Assistant Manager

  • Bajaj Capital
  • 5 to 10 Yrs
  • 8 days ago
₹ Not Disclosed

QA Analyst (Manual Testing)

  • Unnati
  • 2 to 6 Yrs
  • 14 days ago
₹ Not Disclosed

AI Agents & Workflow Integration

  • Bajaj Capital
  • 6 to 10 Yrs
  • 24 days ago
₹ Not Disclosed

Business Advisory - Gen Ai (bfsi Domain)

  • LTIMindtree
  • 10 to 14 Yrs
  • All India
  • 1 month ago
₹ Not Disclosed
