Job Description
Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, and to ensure they meet 100% of quality assurance parameters.
Job Description:-
Cloud certification as an Azure Data Engineer
Azure Data Factory, Azure Databricks Spark (PySpark or Scala), SQL, Data Ingestion, Curation
Semantic modeling and optimization of the data model for Rahona
Experience with Azure ingestion from on-prem sources such as mainframe, SQL Server, and Oracle (see the ingestion sketch after this list)
Experience in Sqoop/Hadoop
Proficiency in Microsoft Excel for metadata files
Certification in Azure/AWS/GCP with hands-on data engineering experience in the cloud
Strong programming skills in Python, Scala, or Java
Proficiency in SQL (T-SQL or PL/SQL)
Data file movement via mailbox
Source-code versioning/promotion tools like Git/Jenkins
Orchestration tools like Autosys, Oozie
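As an illustration only, the following is a minimal PySpark sketch of the kind of ingestion work described above: pulling a table from an on-prem SQL Server over JDBC and landing it in an ADLS Gen2 raw zone. Every name in it (server, database, table, storage account, paths, credentials) is a hypothetical placeholder, not something specified by this role.

```python
# Minimal PySpark ingestion sketch. All connection details, paths, and
# column names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("onprem-sqlserver-ingestion").getOrCreate()

# Hypothetical on-prem SQL Server source.
jdbc_url = "jdbc:sqlserver://onprem-sql.example.local:1433;databaseName=sales"
connection_props = {
    "user": "svc_ingest",             # placeholder service account
    "password": "<read from Azure Key Vault in practice>",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# Read the source table over JDBC and stamp each row with its load date.
orders_df = (spark.read
    .jdbc(url=jdbc_url, table="dbo.orders", properties=connection_props)
    .withColumn("load_date", F.current_date()))

# Land the data unchanged in a raw (bronze) zone on ADLS Gen2 as Parquet,
# partitioned by load date so downstream curation can process increments.
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/orders"
orders_df.write.mode("append").partitionBy("load_date").parquet(raw_path)
```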
Skills:-
Azure Data Factory (primary)
Azure Databricks Spark (PySpark, SQL), as illustrated in the curation sketch below
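For illustration, a minimal sketch of a Databricks curation step under the same assumptions as the ingestion sketch above: reading the raw Parquet, applying light standardisation and deduplication, and writing a curated Delta table. All paths and column names are hypothetical.

```python
# Minimal PySpark curation sketch. Paths, table names, and columns are
# hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/orders"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/sales/orders"

# Read the raw Parquet landed by the ingestion step.
raw_df = spark.read.parquet(raw_path)

# Light curation: standardise types and drop duplicate business keys so the
# curated table is analytics-ready for a downstream semantic model.
curated_df = (raw_df
    .withColumn("order_amount", F.col("order_amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"]))

# Write as a Delta table (native on Databricks) for the curated (silver) zone.
curated_df.write.format("delta").mode("overwrite").save(curated_path)
```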
Nice to Have:-
Experience with mainframe files
Experience working in an Agile environment and familiarity with JIRA/Confluence
Deliver
| No. | Performance Parameter | Measure |
| --- | --- | --- |
| 1. | Continuous Integration, Deployment & Monitoring of Software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan |
| 2. | Quality & CSAT | On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation |
| 3. | MIS & Reporting | 100% on-time MIS & report generation |