
Data Architect @ MSPC Services


 Data Architect

Job Description

 

Data Build Tool, Snowflake, Spark/PySpark, Java/Python, Scala, Cloud

 

Role and Responsibilities

Hands-on working knowledge of Snowflake architecture (access control, provisioning, etc.); a brief provisioning sketch appears after this list.
Strong data transformation and processing skills using Data Build Tool (dbt).
SnowPro Data Engineering certification is a plus.
Teradata and Snowflake experience.
Professional experience with source control, merging strategies and coding standards, specifically Bitbucket/Git and deployment through Jenkins pipelines.
Demonstrated experience developing in a continuous integration/continuous delivery (CI/CD) environment using tools such as Jenkins and CircleCI.
Demonstrated ability to maintain the build and deployment process through the use of build integration tools
Experience designing instrumentation into code and integrating with software and log-analysis tools such as log4Python, New Relic, SignalFx, and/or Splunk.
Conduct knowledge-sharing sessions and publish case studies. Take accountability for maintaining program or project documents in a knowledge base repository.
Identify accelerators and innovations. Understand complex interdependencies to identify the right team composition for delivery.
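
For illustration only, a minimal sketch (in Python, using the snowflake-connector-python package) of the kind of Snowflake provisioning and access-control task described in this list; every account, role, warehouse, and user name below is a hypothetical placeholder rather than a detail from this posting.

# Minimal Snowflake provisioning/access-control sketch; all identifiers and
# credentials are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",    # hypothetical account identifier
    user="platform_admin",   # hypothetical admin user
    password="***",
    role="ACCOUNTADMIN",     # placeholder; use least-privilege roles in practice
)

try:
    cur = conn.cursor()
    # Provision a small warehouse and a functional role, then grant usage.
    cur.execute("CREATE WAREHOUSE IF NOT EXISTS ANALYTICS_WH WITH WAREHOUSE_SIZE = 'XSMALL'")
    cur.execute("CREATE ROLE IF NOT EXISTS ANALYST_ROLE")
    cur.execute("GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYST_ROLE")
    cur.execute("GRANT ROLE ANALYST_ROLE TO USER some_analyst")  # hypothetical user
finally:
    conn.close()
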
Required Skills

Working experience communicating with business stakeholders and architects

  • Industry experience developing big data/ETL data warehouses and building cloud-native data pipelines
  • Experience in Python, PySpark, Scala, Java, and SQL; strong object-oriented and functional programming experience in Python
  • Experience working with REST and SOAP-based APIs to extract data for data pipelines
  • Extensive experience working with Hadoop and related processing frameworks such as Spark, Hive, Sqoop, etc. (a minimal PySpark sketch follows this list)
  • Experience working in a public cloud environment; AWS experience is mandatory
  • Ability to implement solutions with AWS Virtual Private Cloud (VPC), EC2, AWS Data Pipeline, AWS CloudFormation, Auto Scaling, Amazon S3, EMR, Hive, Athena, and other AWS products
  • Experience working with real-time data streams and the Kafka platform
  • Working knowledge of workflow orchestration tools such as Apache Airflow, including designing and deploying DAGs (a sample DAG sketch appears below)
  • Hands-on experience with performance and scalability tuning
  • Professional experience in Agile/Scrum application development using JIRA
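
As noted in the Spark/Hadoop bullet above, a minimal PySpark sketch of a cloud-native batch transformation; the S3 paths, column names, and app name are hypothetical placeholders that only illustrate the shape of such a pipeline.

# Minimal PySpark sketch: read raw events from S3, aggregate, write Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

# Hypothetical input location; in practice this would come from job config.
events = spark.read.parquet("s3://example-bucket/raw/events/")

daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Write the rollup back to a curated zone, partitioned by date.
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/"
)

spark.stop()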

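The orchestration bullet above mentions designing and deploying Airflow DAGs; below is a minimal Airflow 2.x DAG sketch, with the dag_id, schedule, and task callables chosen arbitrarily for illustration.

# Minimal Apache Airflow DAG sketch; dag_id, schedule, and tasks are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull data from a source API or database.
    print("extracting")


def load():
    # Placeholder: load transformed data into the warehouse.
    print("loading")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
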
Employment Category:

Employment Type: Full time
Industry: IT Services & Consulting
Role Category: Application Programming / Maintenance
Functional Area: Not Applicable
Role/Responsibilities: Data Architect



Keyskills: Python, SQL, Snowflake, Azure


₹ Not Specified

Similar positions

Java Architect

  • Ultrafly Solutions
  • 3 to 10 Yrs
  • Hyderabad
  • 1 month ago
₹ Not Specified

Mirketa - Salesforce Technical Architect

  • Talent Corner
  • 5 to 8 Yrs
  • All India
  • 1 month ago
₹ Not Specified

Data Analyst Intern

  • Consulttrinity
  • 0 to 4 Yrs
  • Kochi + 1 other location in Kerala
  • 1 month ago
₹ Not Specified

Azure Data Factory, Databricks

  • Best infosystems
  • 3 to 8 Yrs
  • Multi-City, India
  • 1 month ago
₹ Not Specified

MSPC Services

MSPC Services Pvt Ltd, formerly known as SPC Services, is the most trusted and fastest-growing company in the software industry. The intent is simple: to offer each client services that have a positive impact. We help our clients distinguish their business by driving more traffic and rai...