A Data Engineer is required for a London-based client. Must have experience working with Azure Data Factory, Azure Integration Runtime, Python, Cosmos DB, data workflow design [data ingestion and data processing], Azure Delta Lake [Databricks], SQL querying, MS SQL Server databases, and an understanding of data models, data structures and file formats [Avro, Parquet and JSON]. Knowledge of Azure data streaming, DevOps concepts, time-series data models and Azure Synapse is also required.

Experience required:

  • A strong understanding of Azure data platform principles (Azure Data Factory, Databricks, Data Lake, Logic Apps and Function Apps).
  • SQL and Python
  • Azure Data Streaming
  • Be the technical expert on the latest technologies and make recommendations to the wider business on how to keep up to date with the best data solutions.
  • Working with and modelling data warehouses

- Develop Data Integration jobs in Azure Data Factory in support of applications and analytics.

- Troubleshoot integration packages when there are failures, not only to remediate the existing issue but also to identify opportunities to improve the overall solution and avoid future failures.

- Manage database assets and integration workflow scheduling

- Imagine and develop new analytic opportunities with Intalere data for both internal and external customers

- Demonstrable experience working with huge data sets

- A scientific and mathematical education background (Master's and/or PhD welcome)

 

Platform and Tools:

  • Azure PaaS
  • Azure SQL Server, Synapse  
  • Power BI/Tableau
  • Jupyter Notebook

If you love working with large data sets, cloud computing, data processing and analytics techniques, then this rare opportunity to join an exceptionally strong team is for you! You will be expected to work on many levels - from mining massive datasets to identify opportunities, through to designing and implementing your solutions in our offline and live production systems.
