
Data Architect @ Accenture


 Data Architect

Job Description


 About The Role  

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must have skills: Databricks Unified Data Analytics Platform

Good to have skills: NA
Minimum 12 year(s) of experience is required

Educational Qualification: 15 years full time education
Summary:
As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various teams to ensure that the data architecture is robust, scalable, and efficient, while addressing any challenges that arise during development. Your role will be pivotal in guiding the team towards optimal data solutions that support the organization's goals.

Responsibilities:
- Proven experience as a Data Architect and Data Engineer leading enterprise-scale Lakehouse initiatives.
- Expert-level understanding of modern Data & Analytics architecture patterns, including Data Mesh, Data Products, and Lakehouse Architecture.
- Excellent programming and debugging skills in Python.
- Strong experience with PySpark for building scalable and modular ETL/ELT pipelines.
- Design and build complex pipelines using Delta Lake, Auto Loader, Delta Live Tables (DLT), and deployment using Asset Bundles.
- Architect data ingestion and transformation using DLT Expectations, modular Databricks Functions, and reusable pipeline components.
- Hands-on expertise in at least one major cloud platform: AWS, GCP, or Azure.
- Lead implementation of Unity Catalog: create catalogs, schemas, role-based access policies, lineage visibility, and data classification tagging (PII, PHI, etc.).
- Guide organization-wide governance via Unity Catalog setup: workspace linkage, SSO, audit logging, external locations, and Volume access.
- Enable cross-platform data access using Lakehouse Federation, querying live from externally hosted databases.
- Leverage and integrate Databricks Marketplace to consume high-quality third-party data and publish internal data assets securely.
- Experience with cloud-based services relevant to data engineering, data storage, data processing, data warehousing, real-time streaming, and serverless computing.
- Govern and manage Delta Sharing for securely sharing datasets with external partners or across tenants.
- Design and maintain PII anonymization, tokenization, and masking strategies using Databricks functions and Unity Catalog policies to meet GDPR/HIPAA compliance.
- Architect Power BI, Tableau, and Looker integration with Databricks for live reporting and visualization over governed datasets.
- Build Databricks SQL dashboards to give stakeholders real-time insights, KPI tracking, and alerts.
- Hands-on experience applying performance optimization techniques.
- Lead cross-functional initiatives across data science, analytics, and platform teams to deliver secure, scalable, and value-aligned data products.
- Provide thought leadership on adopting advanced features such as Mosaic AI, Vector Search, Model Serving, and Databricks Marketplace publishing.
- Working knowledge of dbt (Data Build Tool) is a plus.
- Strong background in data modeling and data warehousing concepts is required.

Nice to Have:
1. Certifications: Databricks Certified Professional or similar certifications.
2. Machine Learning: knowledge of machine learning concepts and experience with popular ML libraries.
3. Knowledge of big data processing (e.g., Spark, Hadoop, Hive, Kafka).
4. Data Orchestration: Apache Airflow.
5. Knowledge of CI/CD pipelines and DevOps practices in a cloud environment.
6. Experience with ETL tools such as Informatica, Talend, Matillion, or Fivetran.
7. Familiarity with dbt (Data Build Tool).

Additional Information:
- The candidate should have a minimum of 12 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
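To illustrate the PII anonymization and tokenization responsibility described above, here is a minimal sketch in plain Python. On the platform itself this logic would typically live inside a Databricks UDF governed by Unity Catalog masking policies; the key, function names, and column choices here are illustrative assumptions, not the employer's actual implementation.

```python
import hashlib
import hmac

# Assumption: in production this key would come from a secrets manager
# (e.g., Databricks secret scopes), never a source file.
SECRET_KEY = b"rotate-me-via-a-secrets-manager"


def tokenize_pii(value: str) -> str:
    """Replace a PII value with a stable, non-reversible token.

    Keyed HMAC-SHA256 keeps tokens deterministic (the same email always
    maps to the same token, so joins across tables still work) while
    resisting dictionary attacks against a plain unsalted hash.
    """
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return "tok_" + digest.hexdigest()[:16]


def mask_email(email: str) -> str:
    """Partially mask an email for display: keep first character and domain."""
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"
```

Usage: `tokenize_pii("alice@example.com")` returns the same token on every call, while `mask_email("alice@example.com")` yields `a***@example.com` for dashboards that need a readable but de-identified value.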

Job Classification

Industry: IT Services & Consulting
Functional Area / Department: Engineering - Software & QA
Role Category: DBA / Data warehousing
Role: Data warehouse Architect / Consultant
Employment Type: Full time

Contact Details:

Company: Accenture
Location(s): Bengaluru



Keyskills: Python, data analytics, PySpark, data modeling, debugging, Hive, continuous integration, Airflow, Talend, CI/CD, Microsoft Azure, data warehousing, machine learning, cloud platforms, Spark, GCP, DevOps, Kafka, data warehousing concepts, Hadoop, AWS, ETL, Informatica


Salary: Not Disclosed

