Experience: 0–2 years

Bucharest, full-time, hybrid

Requirements:

  • Intermediate proficiency in French (B1 minimum).
  • Bachelor’s or master’s degree in Computer Science, Engineering, or a related field, or equivalent professional experience.
  • Proven experience as a Data Engineer or in a similar role.
  • Strong programming skills in Python and/or Java.
  • Solid understanding of ETL concepts and hands-on experience with ETL tools and frameworks.
  • Experience with Apache Spark for distributed data processing.
  • Proficiency in working with various database technologies (both SQL and NoSQL).

Responsibilities:

  • Design, develop, and maintain efficient ETL pipelines for ingesting, processing, and transforming large-scale data from multiple sources.
  • Develop solutions using distributed computing frameworks such as Apache Spark.
  • Work with both relational and NoSQL databases to ensure data integrity, optimization, and accessibility.
  • Implement streaming and messaging solutions using Apache Kafka to support real-time data processing.
  • Build and deploy data engineering solutions within containerized environments using Kubernetes.
  • Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
  • Monitor and troubleshoot data pipelines to ensure their stability and performance.
  • Document technical designs, architecture, and operational procedures.