Job Description
Role Purpose
The purpose of this role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialist team.
Primary Skill: Snowflake
Secondary Skills: Azure, Cosmos DB
Role Summary
We are looking for a Level‑2 Data Engineer with strong hands‑on experience in Snowflake along with working knowledge of Azure services and Cosmos DB. The candidate will be responsible for building, optimizing, and maintaining data pipelines, Snowflake objects, and integrations within Azure cloud ecosystems.
Responsibilities
- Develop and optimize ELT/ETL pipelines with Snowflake as the central data platform (see the illustrative sketch after this list).
- Design and implement Snowflake objects—tables, views, stages, file formats, streams, tasks, and resource monitors.
- Perform query optimization, warehouse tuning, clustering, micro‑partitioning, and cost governance.
- Integrate Snowflake with Azure Data Factory, ADLS, and Cosmos DB.
- Build secure data flows using RBAC, masking policies, secure views, and other Snowflake governance features.
- Work with Cosmos DB for ingestion, change feed, RU management, and query performance.
- Participate in requirement analysis, estimations, and Agile ceremonies.
- Prepare technical documentation, version control, and deployment support using CI/CD tools.
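To make the pipeline work above concrete, here is a minimal sketch of the kind of Snowflake CDC flow this role covers: an external stage over ADLS, a stream for change capture, and a scheduled MERGE task, driven through the Python connector. The account identifier, credentials, and all object names (orders_stage, orders_stream, merge_orders, raw_orders, curated.orders) are hypothetical placeholders, not details from this posting.

```python
# Sketch of a Snowflake CDC flow: ADLS stage -> stream -> scheduled MERGE task.
# All names and credentials below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",   # hypothetical account identifier
    user="ETL_USER",
    password="***",              # in practice, pulled from a secrets store
    warehouse="ETL_WH",
    database="SALES",
    schema="RAW",
)

statements = [
    # External stage over an ADLS/Blob container (SAS token elided).
    """CREATE STAGE IF NOT EXISTS orders_stage
         URL = 'azure://myaccount.blob.core.windows.net/orders'
         CREDENTIALS = (AZURE_SAS_TOKEN = '...')
         FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)""",
    # Stream captures inserts/updates/deletes on the raw table (CDC);
    # assumes raw_orders and curated.orders already exist.
    "CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders",
    # Task wakes every 5 minutes but only runs when the stream has data.
    """CREATE TASK IF NOT EXISTS merge_orders
         WAREHOUSE = ETL_WH
         SCHEDULE = '5 MINUTE'
       WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
       AS
         MERGE INTO curated.orders t
         USING orders_stream s ON t.order_id = s.order_id
         WHEN MATCHED THEN UPDATE SET t.amount = s.amount
         WHEN NOT MATCHED THEN INSERT (order_id, amount)
           VALUES (s.order_id, s.amount)""",
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK merge_orders RESUME",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```

In a real deployment the password and SAS token would come from a secrets store such as Azure Key Vault rather than being inlined (see the Key Vault sketch under the secondary skills below).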
Required Skills
Snowflake (Primary)
- 3+ years of hands-on Snowflake development experience.
- Strong SQL and Snowflake performance tuning.
- Experience with Streams & Tasks (CDC), warehouse sizing, cost optimization.
- Knowledge of data modeling—Star/Snowflake schemas, SCD, incremental loads.
- Security experience with roles, masking policies, secure views.
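As a concrete illustration of the security items above, the following sketch applies a column masking policy, wraps the table in a secure view, and grants access through a role. It assumes an open snowflake-connector-python connection `conn` as in the earlier sketch; the roles (PII_READER, ANALYST) and objects are hypothetical.

```python
# Column-level masking plus a secure view, using Snowflake governance features.
# Assumes `conn` is an open snowflake.connector connection (earlier sketch);
# role and object names are hypothetical.
governance_sql = [
    # Unmask email only for a privileged role; everyone else sees a redaction.
    """CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
         RETURNS STRING ->
         CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
              ELSE '*** masked ***' END""",
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
    # Secure views hide the view definition from non-owners.
    """CREATE SECURE VIEW IF NOT EXISTS customers_v AS
         SELECT customer_id, region, email FROM customers""",
    # RBAC: grant read access on the view, not the base table.
    "GRANT SELECT ON VIEW customers_v TO ROLE ANALYST",
]
with conn.cursor() as cur:
    for stmt in governance_sql:
        cur.execute(stmt)
```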
Azure & Cosmos DB (Secondary)
- Experience with Azure Data Factory (pipelines, triggers, linked services).
- Knowledge of Azure Storage/ADLS, Key Vault, and basic networking (managed identities, private endpoints).
- Cosmos DB: partition key design, RU provisioning, ingestion patterns, change feed basics.
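The sketch below ties these secondary skills together: it resolves a Cosmos DB key from Key Vault via DefaultAzureCredential (which picks up a managed identity when running in Azure), creates a container with an explicit partition key and provisioned RU budget, and reads the change feed. The vault, account, secret, and container names are hypothetical.

```python
# Hedged sketch: Key Vault secret via managed identity, then Cosmos DB
# container creation (partition key + RUs) and a change feed read.
# Vault/account/secret/container names are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from azure.cosmos import CosmosClient, PartitionKey

# DefaultAzureCredential uses a managed identity when running in Azure.
credential = DefaultAzureCredential()
secrets = SecretClient(vault_url="https://my-vault.vault.azure.net",
                       credential=credential)
cosmos_key = secrets.get_secret("cosmos-primary-key").value

client = CosmosClient("https://my-account.documents.azure.com",
                      credential=cosmos_key)
db = client.create_database_if_not_exists("sales")

# Partition key choice drives RU distribution; /customerId assumes reads
# are customer-scoped. 400 RU/s is the minimum provisioned throughput.
orders = db.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/customerId"),
    offer_throughput=400,
)

# Change feed: documents in insert/update order (deletes are not included).
for doc in orders.query_items_change_feed(is_start_from_beginning=True):
    print(doc["id"], doc.get("customerId"))
```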
Nice to Have
- Python or PySpark for transformations.
- Experience with Databricks or dbt.
- Familiarity with CI/CD (Azure DevOps/GitHub).
- Knowledge of Power BI/Tableau.
Deliver
| No | Performance Parameter | Measure |
| --- | --- | --- |
| 1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT |
| 2 | Team Management | Productivity, efficiency, absenteeism |
| 3 | Capability Development | Triages completed, technical test performance |
Experience: 5-8 Years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention.