15 Oct
Asterix Communications
Abu Dhabi
Please apply only if you hold your own UAE residency visa and are not currently employed. We're looking for a Data Engineer for an on-site project in Abu Dhabi lasting up to 6 months.
Important Skills:
1. Programming skills: Python and SQL
2. Strong API development from scratch
3. Strong skills in managing cloud data warehouses
4. Experience creating/debugging stored procedures, SQL statements, views, triggers, and functions in Azure SQL/MS SQL Server
5. Experience with Database administration: Azure/MS SQL Server and MS Azure Cloud
6. Strong familiarity with Visual Studio and Team Foundation Server/GitHub
7. Data modeling expertise with tools such as Erwin
8. Strong skills in deploying ML model solutions in the cloud
9. Knowledge of MDM solutions and customer profiling
Skills considered as a PLUS:
- Experience with machine learning algorithms (Python, Databricks and/or Azure ML)
Important Experience:
- At least 5 years of experience with:
- ETL development using Talend / Airflow
- Developing SQL queries, stored procedures, and views
- At least 3 years of experience with:
- Developing/coding in Python
- Database administration working with any cloud solutions
Expertise as a PLUS:
- At least 3 years of experience with Azure Databricks or any cloud data warehouse such as Snowflake or Redshift
- At least 2 years of experience with Azure Data Factory
Responsibilities:
- Thoroughly understand ETL methodologies, Big Data, and Data Warehousing principles, approaches, technologies, and architectures, including the concepts, design, and usage of data warehouses
- Work with teams to understand, document, design, develop, code, and test ETL processes; write, implement, and maintain those processes
- Apply ETL design options to improve load and extract performance
- Translate source-to-target mapping documents into ETL processes
- Build and integrate APIs
- Deploy ML solutions in the cloud
- Design, develop, test, optimize, and deploy ETL code and stored procedures for all ETL-related functions
- Work in an Agile environment, following CI/CD and DevOps best practices
- Design Big Data models
- Design, implement, and manage ETL processes using Talend and Azure Data Factory; administer and maintain the Azure/MS SQL Server databases and the Microsoft enterprise data warehouse
- Develop Spark jobs in Scala or Java
- Design, code (in Python and SQL), orchestrate, and monitor jobs in Azure Databricks, Snowflake, or any other cloud data warehouse
Qualifications:
- Bachelor's Degree in Computer Science, Engineering, or a related field
As a PLUS:
- Advanced certification in Azure (Azure DF, Azure ML)
- Certification in Azure Databricks or any cloud warehouse
- Any certification in data science is a big plus.