Job Description
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: Microsoft Azure Databricks
Good-to-have skills: NA
A minimum of 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary:
As a Data Architect, a typical day involves defining the data requirements and designing the structure necessary for the application. This role includes modeling the data architecture, planning how data will be stored efficiently, and ensuring seamless integration across components. The position calls for a thoughtful approach to organizing data to support application functionality and scalability, collaborating with stakeholders to align data strategies with project goals, and continuously refining data models to meet evolving needs.

Key Responsibilities:
A. Function as the Lead Data Architect for a small, simple project/proposal, or as a team lead for a medium/large-sized project or proposal
B. Discuss specific Big Data architecture and related issues with the client architect/team (in the area of expertise)
C. Hands-on implementation experience with Databricks GenAI/Agentic AI use cases
D. Knowledge of LLMs, prompt engineering, and AI Foundry
E. Experience implementing data governance solutions
F. Analyze and assess the impact of the requirements on the data and its lifecycle
G. Lead Big Data architecture and design medium-to-large cloud-based data and analytics solutions using the Lambda architecture
H. Breadth of experience across various client scenarios and situations
I. Experience in Big Data architecture-based sales and delivery
J. Thought leadership and innovation
K. Lead the creation of new data asset offerings
L. Experience handling OLTP and OLAP data workloads
Technical Experience:
A. Experience working with the Medallion architecture and Delta Lakehouse principles
B. Expert-level experience designing and architecting solutions on Azure Databricks and Azure Data Lake, including Delta Lake implementations
C. Experience with Databricks GenAI implementations
D. Experience with Azure Purview/Profisee/Unity Catalog
E. Well versed in real-time and batch streaming concepts, with hands-on implementation experience
F. Expert-level experience with Azure cloud technologies such as Databricks, along with PySpark, Python, Scala, and SQL
G. Experience with one or more real-time/batch ingestion mechanisms, including Delta Live Tables and Auto Loader
H. Experience handling medium-to-large Big Data implementations
I. Strong understanding of data strategy, data quality, and Delta Lake components
J. The candidate must have 10-12 years of IT experience, including around 5 years of extensive Big Data (design + build) experience in Databricks
K. Architect for a medium-sized client delivery project

Professional Experience:
A. Should be able to drive technology design meetings and propose technology designs and architecture
B. Should have excellent client communication skills
C. Should have good analytical and problem-solving skills
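For context on the Medallion (bronze/silver/gold) pattern listed under Technical Experience, here is a minimal plain-Python sketch of the refinement flow. Lists of dicts stand in for Delta tables, and all field names are hypothetical; in a real Azure Databricks pipeline each layer would be a Delta table populated by PySpark jobs, with Auto Loader typically feeding the bronze layer:

```python
# Sketch of the Medallion (bronze -> silver -> gold) refinement pattern.
# Plain Python stands in for Delta tables; field names are hypothetical.

def to_silver(bronze_rows):
    """Clean and conform raw (bronze) records: drop malformed rows,
    normalize types and casing."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None:  # basic data-quality gate
            continue
        silver.append({
            "order_id": int(row["order_id"]),
            "region": str(row.get("region", "unknown")).lower(),
            "amount": float(row.get("amount", 0.0)),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate conformed (silver) records into a business-level (gold)
    summary: total amount per region."""
    gold = {}
    for row in silver_rows:
        gold[row["region"]] = gold.get(row["region"], 0.0) + row["amount"]
    return gold

bronze = [
    {"order_id": "1", "region": "EU", "amount": "10.5"},
    {"order_id": "2", "region": "US", "amount": "4.0"},
    {"order_id": None, "region": "EU", "amount": "9.9"},  # malformed row
    {"order_id": "3", "region": "eu", "amount": "2.5"},
]

gold = to_gold(to_silver(bronze))
print(gold)  # {'eu': 13.0, 'us': 4.0}
```

The key design idea the role presumes: raw data lands untouched in bronze, data-quality rules are applied once on the way to silver, and gold holds only business-ready aggregates.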
Educational Qualification:
A. Must have: BE/BTech/MCA
B. Good to have: ME/MTech
Job Classification
Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data warehousing
Role: Data warehouse Architect / Consultant
Employment Type: Full time
Contact Details:
Company: Accenture
Location(s): Hyderabad
Keyskills:
data architect
python
scala
ai
databricks
microsoft azure
pyspark
data architecture
llm
cloud technologies
sales
sql
cloud
data quality
lambda
data governance
big data
data lake
architecture
azure
communication skills