Data Architect
Bengaluru, Karnataka, India
Candidates from Bangalore/Chennai/Pune/Gurgaon/Kolkata can apply
Roles and Responsibilities:
● Develop modern data warehouse solutions using Databricks and the AWS/Azure stack.
● Provide forward-thinking solutions in the data engineering and analytics space.
● Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
● Triage issues to identify gaps in existing pipelines and fix them.
● Work with business stakeholders to understand reporting-layer needs and develop data models that fulfill them.
● Help junior team members resolve issues and technical challenges.
● Drive technical discussions with client architects and team members.
● Orchestrate data pipelines via the Airflow scheduler.
Skills and Qualifications:
● Bachelor's and/or master's degree in computer science, or equivalent experience.
● Must have 6+ years of total IT experience, including 3+ years in data warehouse/ETL projects.
● Deep understanding of Star and Snowflake dimensional modelling.
● Strong knowledge of Data Management principles
● Good understanding of Databricks Data & AI platform and Databricks Delta Lake Architecture
● Should have hands-on experience in SQL, Python and Spark (PySpark)
● Candidate must have experience in AWS/ Azure stack
● Experience with batch and streaming ETL (e.g., Kinesis) is desirable.
● Experience in building ETL / data warehouse transformation processes
● Experience with Apache Kafka for use with streaming data / event-based data
● Experience with other open-source big data products such as Hadoop (incl. Hive, Pig, Impala)
● Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
● Experience working with structured and unstructured data including imaging & geospatial data.
● Experience working in a Dev/Ops environment with tools such as Terraform, CircleCI, GIT.
● Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
● Databricks Certified Data Engineer Associate/Professional Certification (Desirable).
● Should have experience working in Agile methodology.
Job Description:
• 15+ years of experience in Data & Analytics with good communication and presentation skills.
• At least 2 years of experience in Databricks implementations, including 2 large-scale end-to-end data warehouse implementations.
• Must be a Databricks Certified Architect.
• Proficiency in SQL and experience with scripting languages (e.g., Python, Spark, PySpark) for data manipulation and automation.
• Solid understanding of cloud platforms (AWS, Azure, GCP) and their integration with Databricks.
• Familiarity with data governance and data management practices. Exposure to data sharing, Unity Catalog, dbt, replication tools, and performance tuning is an added advantage.
About Tredence:
Tredence focuses on last-mile delivery of powerful insights into profitable actions by uniting its strengths in business
analytics, data science and software engineering. The largest companies across industries are engaging with us and
deploying their prediction and optimization solutions at scale. Headquartered in the San Francisco Bay Area, we serve
clients in the US, Canada, Europe, and Southeast Asia.
Tredence is an equal opportunity employer. We celebrate and support diversity and are committed to creating an inclusive
environment for all employees.
Visit our website for more details: https://www.tredence.com