Contract Type: Full-time
Workplace Type: Remote
Location: Makkah
Join Our Team as a Senior Data Engineer!
We are seeking a skilled Senior Data Engineer with deep expertise in ClickHouse and streaming data, and a passion for building scalable real-time analytics solutions. In this role, you will design, develop, and optimize our data pipelines and analytics infrastructure, empowering our teams to harness real-time insights that enhance customer experience and drive business growth.

Key Responsibilities:
  • Design, implement, maintain, and document highly scalable data pipelines for real-time and batch processing.
  • Build and optimize data systems to support accurate, low-latency analytics and reporting use cases.
  • Develop and maintain solutions for streaming and serverless data processing.
  • Collaborate with cross-functional teams to implement and support end-to-end analytics workflows.
  • Ensure data quality, reliability, and performance across the platform.
  • Monitor, troubleshoot, and optimize data infrastructure to maintain high availability.
  • Mentor junior engineers and contribute to the continuous improvement of engineering practices.

Requirements:
  • 5+ years of experience in data engineering or related fields.
  • Strong expertise in ClickHouse, including designing schemas and jobs, optimizing data ingestion and queries, and managing clusters.
  • Proven experience in real-time data processing and enrichment using tools such as Apache Kafka, Apache Flink, or Apache Spark Streaming, and serverless technologies such as AWS Lambda or GCP Cloud Functions.
  • Deep understanding of distributed systems architecture and design, with a focus on scalability and resilience, particularly for data processing workloads.
  • Proficiency in programming languages such as Python or Java.
  • Hands-on experience with cloud platforms (e.g., AWS, GCP, or Azure).
  • Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
  • Strong problem-solving skills and ability to work in a fast-paced environment.
  • Excellent communication and collaboration skills, with a demonstrated ability to work effectively as part of a team.

Preferred Qualifications:
  • Experience in the e-commerce industry or similar high-traffic, data-intensive environments.
  • Knowledge of ETL/ELT tools like Airflow, dbt, or equivalent.
  • Familiarity with monitoring and observability tools for data systems (e.g., Prometheus, Grafana).

Requirements

  • Requires 2-5 years of experience

Company: Salla




Posted 8 days ago