Role Responsibilities:
- Build solutions using Azure big data services (Data Lake, Data Factory) to ingest data from business operations for use by data scientists and software developers.
- Design, build, operationalise, secure and monitor data pipelines and data stores.
Role Requirements:
- Good knowledge of Azure cloud services
- Experience with any of the Azure data solutions (Data Lake, Data Factory, Synapse Analytics, etc.)
- Experience with relational or non-relational database solutions
- Programming skills (Python, C#, etc.)
- Understanding of data architecture patterns, including data mesh, microservices, etc.
- Familiarity with Agile ways of working
- Experience building data pipelines in Azure Data Factory
- Experience with Databricks/Spark
- Relevant Microsoft certifications
- Experience in designing and implementing distributed systems
- Experience with alternative cloud providers
- Experience with ETL, SSIS and SSAS may be useful; however, we require someone who has already built solutions using the Azure big data services. Experience with data presentation (e.g. Power BI, dashboards) is not particularly relevant.
What’s in it for you:
- Competitive hourly rate
- Stable contract duration with the possibility of extension