
Officer - Data Engineer - C11 - Hybrid @ Worksoft


Job Description

This is a data engineer position responsible for the design, development, implementation, and maintenance of data flow channels and data processing systems that support the collection, storage, batch and real-time processing, and analysis of information in a scalable, repeatable, and secure manner, in coordination with the Data & Analytics team. The overall objective is to define optimal solutions for data collection, processing, and warehousing. You must have expertise in Spark Java development for big data processing, as well as proficiency in Python and Apache Spark, particularly within the banking & finance domain. The role involves designing, coding, and testing data systems and implementing them into the internal infrastructure.

Responsibilities:

- Ensure high-quality software development with complete documentation and traceability
- Develop and optimize scalable Spark Java-based data pipelines for processing and analyzing large-scale financial data
- Design and implement distributed computing solutions for risk modeling, pricing, and regulatory compliance
- Ensure efficient data storage and retrieval using Big Data technologies
- Implement best practices for Spark performance tuning, including partitioning, caching, and memory management (see the sketch following this description)
- Maintain high code quality through testing, CI/CD pipelines, and version control (Git, Jenkins)
- Work on batch processing frameworks for market risk analytics
- Promote unit/functional testing and code inspection processes
- Collaborate with business stakeholders and Business Analysts to understand requirements
- Work with data scientists to understand and interpret complex datasets

Qualifications:

- 5-8 years of experience working in data ecosystems
- 4-5 years of hands-on experience with Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting, and other big data frameworks
- 3+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
- Strong proficiency in Python and Spark Java, with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL
- Data integration, migration, and large-scale ETL experience
- Data modeling experience
- Experience working with large and multiple datasets and data warehouses
- Experience building and optimizing big data pipelines, architectures, and datasets
- Strong analytic skills and experience working with unstructured datasets
- Experience with Confluent Kafka, Red Hat jBPM, and CI/CD build pipelines and toolchains
- Experience with external cloud platforms such as OpenShift, AWS, and GCP
- Experience with container technologies and supporting frameworks
- Experience integrating search solutions with middleware and distributed messaging (Kafka)
- Excellent interpersonal and communication skills with technical and non-technical stakeholders
- Experience with the software development life cycle and good problem-solving skills
- Strong mathematical and analytical mindset
- Ability to work in a fast-paced financial environment

Education:

- Bachelor's/University degree or equivalent experience in computer science, engineering, or a similar domain

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
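The responsibilities above call for Spark Java data pipelines and performance tuning through partitioning, caching, and memory management. The following is a minimal sketch of what such a pipeline might look like; the class name, input path, and column names (desk, trade_date, notional) are illustrative assumptions, not details from the posting.

// Illustrative only: class name, paths, and columns are assumptions.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.storage.StorageLevel;

public class TradePipelineSketch {
    public static void main(String[] args) {
        // Local session for illustration; a real deployment would size and
        // configure the cluster through spark-submit.
        SparkSession spark = SparkSession.builder()
                .appName("trade-pipeline-sketch")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical trade data with columns desk, trade_date, notional.
        Dataset<Row> trades = spark.read().parquet("/data/trades.parquet");

        // Partitioning: repartition by the aggregation key to limit shuffle skew.
        // Caching: persist because the dataset is reused downstream.
        Dataset<Row> byDesk = trades.repartition(trades.col("desk"))
                .persist(StorageLevel.MEMORY_AND_DISK());

        // Example aggregate: total notional per desk per day.
        Dataset<Row> dailyNotional = byDesk.groupBy("desk", "trade_date")
                .sum("notional");

        dailyNotional.write().mode("overwrite").parquet("/data/daily_notional.parquet");

        // Memory management: release the cached data once it is no longer needed.
        byDesk.unpersist();
        spark.stop();
    }
}

In practice a job like this would be submitted with spark-submit and read from a distributed store rather than a local path; the sketch only illustrates the partitioning and caching practices named in the responsibilities.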

Employment Category:

Employment Type: Full time
Industry: BFSI
Role Category: Not Specified
Functional Area: Not Specified
Role/Responsibilities: Officer - Data Engineer - C11 - Hybrid

Contact Details:

Company: Early Career
Location(s): Chennai



Keyskills: Python, Apache Spark, Hadoop, Scala, Java, Hive, Kafka, Impala, Unix Scripting, SQL, NoSQL, Oracle, MongoDB, HBase, ETL, Data Modeling, Data Integration, Data Migration, Data Warehousing, Data Analysis, Analytical Skills, Git, Jenkins, OpenShift, AWS, GCP, Docker, Kubernetes, Middleware


Salary: ₹ Not Disclosed

Similar positions

AI Engineer

  • The Professionals
  • 8 to 12 Yrs
  • Kolkata
  • 27 days ago
₹ Not Disclosed

Java Big Data Engineer

  • Capgemini
  • 4 to 8 Yrs
  • Karnataka
  • 1 month ago
₹ Not Disclosed

Senior Software Engineer - Test Automation

  • Mediaocean
  • 4 to 8 Yrs
  • 2 mths ago
₹ Not Disclosed

Associate - Project Cost Analyst

  • Genpact
  • 2 to 6 Yrs
  • 2 mths ago
₹ Not Disclosed

Worksoft

Worksoft empowers business and IT to deliver flawless applications faster and more efficiently with the ability to discover, document, test, and automate end-to-end business processes in pre- production and production environments. Our solutions ensure business process quality and resilience by aut...