Contract Type: Full-time
Workplace Type: On-site
Location: Jeddah
Job Summary:
We are looking for a Mid-Level Data Engineer with solid fundamentals and hands-on experience building and supporting data pipelines. This role suits someone who has moved beyond entry level and can be relied on to execute independently while still growing under the guidance of senior engineers.

Key Responsibilities:
  • Build and maintain ETL/ELT data pipelines under defined architectures.
  • Ingest data from databases, APIs, and flat files.
  • Develop and maintain data warehouse tables and basic dimensional models.
  • Write and optimize SQL queries for analytics and reporting (see the SQL sketch after this list).
  • Implement data validation and quality checks.
  • Monitor pipelines, troubleshoot issues, and resolve common failures.
  • Support batch and scheduled data processing jobs.
  • Document data pipelines, schemas, and data flows.
  • Collaborate with Data Analysts, Integration, and Business teams.
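
To make the SQL expectations concrete, here is a minimal, self-contained sketch using Python's built-in sqlite3 module, exercising the joins, aggregations, and subqueries called out below. The tables and values are invented for illustration and are not part of the role.

```python
# A minimal, self-contained illustration using Python's built-in sqlite3;
# the tables and values are invented for this sketch, not part of the role.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'West'), (2, 'East');
    INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 200.0);
""")

# One query exercising a join, an aggregation, and a subquery:
# revenue per region, counting only orders above the overall average.
query = """
    SELECT c.region, COUNT(*) AS num_orders, SUM(o.amount) AS revenue
    FROM orders AS o
    JOIN customers AS c ON c.id = o.customer_id
    WHERE o.amount > (SELECT AVG(amount) FROM orders)
    GROUP BY c.region
"""
for row in conn.execute(query):
    print(row)  # ('East', 1, 200.0): only order 3 beats the 133.33 average
```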

Required Skills & Experience:
  • 2–4 years of hands-on experience in Data Engineering or related roles.
  • Strong SQL skills (joins, aggregations, subqueries).
  • Good working knowledge of Python.
  • Practical experience with ETL/ELT tools (Airflow, NiFi, or similar); a minimal Airflow-style sketch follows this list.
  • Understanding of data warehousing and dimensional modeling basics.
  • Experience supporting production data pipelines.
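
As a rough illustration of the pipeline work described above, the sketch below shows a small scheduled ETL DAG, assuming a recent Apache Airflow (2.x), one of the tools the ad names. The DAG id, schedule, and task bodies are hypothetical.

```python
# A minimal sketch of a scheduled ETL pipeline, assuming a recent Apache
# Airflow (2.x); the DAG id, schedule, and task bodies are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull raw rows from a source system: a database, an API, or a flat file.
    print("extracting raw rows")


def validate():
    # Basic data-quality checks, e.g. row counts and null rates.
    print("validating batch")


def load():
    # Write the validated batch into a warehouse table.
    print("loading into warehouse")


with DAG(
    dag_id="daily_sales_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # batch, scheduled processing
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_validate >> t_load  # fail fast if validation breaks
```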

Education:
Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or related field.


Similar Jobs

Data Engineer
Barakah (Full-time)

Join Barakah as a Senior Data Engineer!

At Barakah, we are committed to reducing food waste through technology and social awareness. As a Senior Data Engineer, you will be instrumental in enhancing our data platform, ensuring it is reliable, scalable, and high-performing. This position entails:
  • Designing, building, and maintaining data pipelines across various use cases.
  • Optimizing the data warehouse to ensure efficiency.
  • Collaborating closely with analytics, product, and engineering teams to maintain data integrity.

Key Responsibilities:
  • Manage and optimize a ClickHouse data warehouse.
  • Build robust data pipelines using dbt and Kafka (a minimal sketch follows this list).
  • Implement monitoring, alerting, and observability.
  • Design backup and recovery systems.
  • Support analytics teams with reliable data delivery.
  • Establish data engineering best practices.
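
As a rough sketch of the Kafka-to-ClickHouse pipeline work this role describes, the example below consumes a topic and writes batched rows into the warehouse, assuming the kafka-python and clickhouse-driver client libraries. The topic, table, and host names are hypothetical.

```python
# A minimal sketch, assuming the kafka-python and clickhouse-driver client
# libraries; the topic, table, and host names are hypothetical.
import json

from kafka import KafkaConsumer
from clickhouse_driver import Client

consumer = KafkaConsumer(
    "orders",                                   # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw),
)
clickhouse = Client(host="localhost")

batch = []
for message in consumer:
    batch.append((message.value["order_id"], message.value["amount"]))
    if len(batch) >= 1000:
        # ClickHouse favors large, infrequent inserts over row-by-row writes.
        clickhouse.execute("INSERT INTO orders_raw (order_id, amount) VALUES", batch)
        batch.clear()
```

Batching the inserts is the usual design choice here: ClickHouse is optimized for large block writes, so inserting row by row is an anti-pattern.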

Required Skills:
  • Strong SQL and database experience (ClickHouse and/or PostgreSQL is a plus).
  • Experience in building and managing production data pipelines.
  • Familiarity with orchestration tools and batch processing.
  • Stream processing knowledge with Kafka.
  • API integration with resilience patterns (see the retry sketch after this list).
  • Proficient in Python for data processing.
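
"Resilience patterns" typically means things like retries with exponential backoff. Below is a minimal sketch of that one pattern, assuming the requests library; the endpoint URL is hypothetical.

```python
# A minimal sketch of one resilience pattern (retry with exponential backoff),
# assuming the requests library; the endpoint URL is hypothetical.
import time

import requests


def fetch_with_retry(url: str, max_attempts: int = 5) -> dict:
    """GET a JSON payload, backing off exponentially on transient failures."""
    for attempt in range(max_attempts):
        try:
            response = requests.get(url, timeout=10)
            response.raise_for_status()
            return response.json()
        except (requests.ConnectionError, requests.Timeout, requests.HTTPError):
            if attempt == max_attempts - 1:
                raise                 # retries exhausted; surface the error
            time.sleep(2 ** attempt)  # back off: 1s, 2s, 4s, 8s
    raise RuntimeError("unreachable")


# Usage: data = fetch_with_retry("https://api.example.com/v1/orders")
```

In production this is often paired with jitter and a circuit breaker so that many clients do not retry in lockstep against a failing API.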

Preferred Skills:
  • Experience with Kubernetes, Docker, and DataOps.
  • Prior experience in the GCC region.
  • Arabic language fluency.

What We Offer:
  • Opportunity to make a meaningful impact on food sustainability.
  • A supportive environment for learning and growth.
  • Career advancement opportunities within a growing company.

Experience: 2–5 years
Location: Jeddah
Posted 17 days ago