
Data Architect (Databricks + AWS)

  • Permanent
  • Full time
  • 560016, Bangalore, Karnataka, India

Job Title: Data Architect (Databricks + AWS)

Location: Bangalore, Hyderabad, or Pune
Experience: 8+ years
Employment Type: Full-Time


About the Role

We are seeking an experienced Data Architect with expertise in Databricks and AWS Cloud Services to design and implement scalable data solutions for enterprise environments. This role requires a visionary leader with a strong technical background in big data technologies, cloud architecture, and data engineering practices.


Key Responsibilities

  • Design and implement modern data architecture solutions leveraging Databricks, AWS, and big data ecosystems.
  • Develop and optimize data pipelines, ETL/ELT workflows, and data lakes/warehouses on cloud platforms.
  • Collaborate with stakeholders to understand business requirements and translate them into scalable data solutions.
  • Define data governance, security, and compliance best practices in alignment with organizational policies.
  • Implement data modeling, data integration, and data quality frameworks for structured and unstructured datasets.
  • Drive performance tuning and cost optimization for cloud-based data platforms.
  • Provide technical leadership to data engineers and analysts within cross-functional teams.

Key Skills & Qualifications

  • Strong expertise in Databricks (Delta Lake, Spark, MLflow, DBX)
  • Proficiency in AWS cloud services: S3, Redshift, Glue, Lambda, EMR, Athena
  • Programming skills: Python, SQL, Scala (preferred)
  • Deep understanding of data lakehouse architecture and big data frameworks
  • Experience with CI/CD for data pipelines and infrastructure as code (Terraform, CloudFormation)
  • Knowledge of data security, compliance (GDPR, HIPAA), and governance frameworks
  • Strong analytical, problem-solving, and communication skills


Preferred Qualifications

  • AWS Certified Solutions Architect or AWS Certified Data Analytics certification
  • Experience in streaming technologies (Kafka, Kinesis)
  • Exposure to BI tools (Tableau, Power BI, QuickSight)