Job Description: ETRM Data Engineer
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETRM systems.
- Work on data integration projects within the Energy Trading and Risk Management (ETRM) domain.
- Collaborate with cross-functional teams to integrate data from ETRM trading systems like Allegro, RightAngle, and Endur.
- Optimize and manage data storage solutions in Data Lake and Snowflake.
- Develop and maintain ETL processes using Azure Data Factory and Databricks.
- Write efficient and maintainable code in Python for data processing and analysis.
- Ensure data quality, accuracy, integrity, and availability across data sources, platforms, and trading systems.
- Collaborate with traders, analysts, and IT teams to understand data requirements and deliver robust solutions.
- Optimize and enhance data architecture for performance and scalability.
Mandatory Skills:
- Python / PySpark
- FastAPI
- Pydantic
- SQLAlchemy
- Snowflake or SQL
- Data Lake
- Azure Data Factory (ADF)
- CI/CD, Azure fundamentals, Git
- Integration of data solutions with ETRM trading systems (Allegro, RightAngle, Endur)
Good to Have:
- Databricks
- Streamlit
- Kafka
- Power BI
- Kubernetes
- FastStream