
AWS Cloud BigData Engineer @ Schneider Electric


Job Description


Job Role: AWS Cloud BigData Engineering Support

Essential skills: Very strong technical skills, process compliance, and professional English communication skills. AWS experience. Exposure to international customers and willingness to work in shifts (24x7).

Experience/Education: Around 3-5 years in a similar position

SUMMARY OF JOB:

- Should be an expert in data lake technical components (e.g., data modeling, ETL, and reporting).

- Should have hands-on experience with AWS-based solutions.

- Should have a deep understanding of the architecture of enterprise-level data warehouse solutions.

- Should be passionate about working with huge data sets and love bringing datasets together to answer business questions and drive growth.

RESPONSIBILITIES:

Analyze and solve problems at their root, stepping back to understand the broader context.

Interface with customers, understanding their requirements and delivering complete data solutions.

Provide access to large datasets.

Learn and understand a broad range of Amazon's data resources and know when, how, and which to use (and which not to).

Tune application and query performance using profiling tools and SQL.

Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.

Model data and metadata to support ad hoc and pre-built reporting.
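As a rough illustration of the query-tuning responsibility above, performance work typically starts by inspecting the query plan before running the query. The sketch below uses Python's built-in sqlite3 as a stand-in for Redshift or Presto (whose EXPLAIN output differs); the table, index, and data are hypothetical:

```python
import sqlite3

# In-memory database standing in for a warehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.execute("CREATE INDEX idx_orders_region ON orders (region)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("EMEA", 120.0), ("APAC", 75.5), ("EMEA", 310.25)],
)

# Inspect the plan first: an indexed search here, rather than a full
# table scan, is the kind of outcome tuning aims for.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE region = ?", ("EMEA",)
).fetchall()
for row in plan:
    print(row[-1])  # plan detail text, e.g. a SEARCH using the index

# Then run the query itself.
total = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE region = ?", ("EMEA",)
).fetchone()[0]
print(total)  # 430.25
```

On Redshift or Presto the same workflow applies via their own EXPLAIN statements and system profiling views, though the plan format and tuning levers (distribution keys, sort keys, partitioning) are engine-specific.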

BASIC QUALIFICATIONS

Bachelor's degree in Computer Science, Information Systems, or a related field

3+ years of SQL development experience

 


3+ years of experience in data modeling, ETL, and Data Warehousing

3+ years of experience architecting, designing, developing and implementing cloud solutions on AWS platforms

Demonstrated experience designing and implementing solutions using the AWS platform and tools, including EC2, AWS console, CloudWatch, S3, Presto, Lambda functions, Python programming, Redshift SQL, Linux, DynamoDB, CloudFormation, RDS, VPC, IAM, and security.

Experience with Big Data technologies such as Hive/Spark.

Server management and administration, including basic Linux scripting

An ability to work in a fast-paced environment where continuous innovation is occurring and ambiguity is the norm.

SQL scripting on Presto and Redshift.

Strong business communication skills

3+ years of Python scripting (or other platform-agnostic language)

AWS Certifications (such as AWS solutions architect or other specialty certifications) are a plus

BI reporting tools such as Tableau are a plus.


Redshift administration skills are a plus.

Exposure to Informatica Cloud or Informatica PowerCenter is a plus.

Strong organizational and multi-tasking skills with ability to balance competing priorities.
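To illustrate the Python scripting side of the qualifications above, the sketch below joins two small in-memory datasets and aggregates revenue per customer segment; this is the same shape of work (extract, join, aggregate) the role performs at warehouse scale. All names and records are hypothetical:

```python
from collections import defaultdict

# Hypothetical extracts from two source systems.
orders = [
    {"order_id": 1, "customer_id": 10, "amount": 250.0},
    {"order_id": 2, "customer_id": 11, "amount": 90.0},
    {"order_id": 3, "customer_id": 10, "amount": 60.0},
]
customers = [
    {"customer_id": 10, "segment": "Industrial"},
    {"customer_id": 11, "segment": "Residential"},
]

# Transform: index customers by key, then join and aggregate.
segment_by_customer = {c["customer_id"]: c["segment"] for c in customers}
revenue_per_segment = defaultdict(float)
for order in orders:
    segment = segment_by_customer.get(order["customer_id"], "Unknown")
    revenue_per_segment[segment] += order["amount"]

print(dict(revenue_per_segment))  # {'Industrial': 310.0, 'Residential': 90.0}
```

In practice the same logic would run against S3 extracts or Redshift tables (e.g., via a Lambda function or a SQL join) rather than in-memory lists, but the join-then-aggregate pattern is identical.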


 

Job Classification

Industry: Electricals, Switchgears
Functional Area: IT Software - Application Programming, Maintenance,
Role Category: Admin/Maintenance/Security/Datawarehousing
Role: Admin/Maintenance/Security/Datawarehousing
Employment Type: Full time

Education

Under Graduation: Any Graduate in Any Specialization
Post Graduation: Post Graduation Not Required
Doctorate: Doctorate Not Required

Contact Details:

Company: Schneider Electric
Location(s): Bengaluru



Keyskills: Computer Science, Business Communication, Metadata, Linux, Data Modeling, Informatica, AWS, Reporting Tools, SQL, Python


Salary: ₹ Not Disclosed
