Middle Data Engineer

EU

Remote

We are looking for an experienced data engineer to join our team of experts. You’ll collaborate with top-tier clients globally, utilizing advanced data engineering and AI technologies.

Necessary skills:

  • You possess a degree in Computer Science, Applied Mathematics, Engineering or a related field.
  • You have at least 2 years of experience, ideally within a Data Engineer role.
  • Strong understanding of data modeling, data warehousing concepts, and ETL processes.
  • Experience with Cloud technologies (Microsoft Azure).
  • Solid grasp of distributed computing principles and key architectures, with broad experience across a set of data stores (Azure Data Lake Store, Azure Synapse Analytics, Apache Spark, Azure Data Factory).
  • Understanding of landing, staging area, data cleansing, data profiling, data security and data architecture concepts (DWH, Data Lake, Delta Lake/Lakehouse, Datamart).
  • Strong SQL skills.
  • Strong communication and interpersonal skills, with the ability to effectively collaborate with clients and team members at all levels.
  • English: B1 or higher.
  • Experience working in an Agile development environment with tools such as Jira.

Will be beneficial: a relevant data engineering certification.

The company offers:

  • Opportunities for professional growth and international certification.
  • Free technical and business training, plus top bootcamps worldwide (including courses at Microsoft HQ in Redmond).
  • Innovative data and analytics projects and practical experience with cutting-edge Microsoft technologies.
  • Long-term employment (not a temporary project).
  • Individual development plan.
  • English-speaking club to improve your language skills.

Responsibilities:

  • The Data Engineer helps select, deploy, and manage the systems and infrastructure required for a data processing pipeline that supports customer requirements.
  • Primary responsibilities revolve around DevOps and include implementing ETL pipelines, monitoring and maintaining data pipeline performance, and optimizing models.
  • You will work with cutting-edge cloud technologies, including Azure Fabric, Apache Spark, Data Lake, Databricks, Data Factory, Cosmos DB, HDInsight, Stream Analytics, and Event Grid, on implementation projects for corporate clients worldwide.