Contract Type: Full-time
Workplace Type: On-site
Location: Makkah
About Us:
Abdul Latif Jameel United Finance Company is a closed joint stock company licensed by the Saudi Central Bank (SAMA) to operate in financial leasing, productive asset financing, consumer products financing, and real estate financing. We provide multiple financing options for individuals and SMEs, including but not limited to cash financing for all kinds of cars, heavy equipment, household and electronic appliances, and real estate, through financial solutions approved by our Shariah committee.

Job Purpose:
The Data Engineer in the Data Management Office (DMO) is responsible for designing, building, and optimizing systems for data collection, storage, access, and analytics. You will build data pipelines that transform raw data into usable formats for data scientists, analysts, and decision-makers within the organization.

Key Accountabilities:
  • Analyze data integration requirements and design solutions that align with ALJUF objectives.
  • Map data fields from source to target systems and design data transformation rules.
  • Implement and manage data integration workflows and processes.
  • Monitor data quality and recommend improvements.
  • Collaborate with IT and business units to implement data integration solutions.

Qualifications:
Bachelor’s degree in Computer Science, Information Technology, or a related field.

Skills:
  • Experience in data analysis or a similar data-centric role.
  • Knowledge of ETL processes, data modeling, and integration tools.
  • Proficiency in SQL and experience with programming/scripting languages (e.g., Python).

Requirements:

  • 2-5 years of experience required.

Similar Jobs

Data Engineer

Salla E-Commerce Platform

Full-time
Join Our Team as a Senior Data Engineer!
At Salla E-Commerce Platform, we are looking for a highly skilled Senior Data Engineer with deep expertise in ClickHouse and streaming data. If you are passionate about building scalable real-time analytics solutions, this role is for you!

Key Responsibilities:
  • Design, implement, maintain, and document highly scalable data pipelines for real-time and batch processing.
  • Build and optimize data systems to support accurate, low-latency analytics and reporting use cases.
  • Develop and maintain solutions for streaming and serverless data processing.
  • Collaborate with cross-functional teams to implement and support end-to-end analytics workflows.
  • Ensure data quality, reliability, and performance across the platform.
  • Monitor, troubleshoot, and optimize data infrastructure to maintain high availability.
  • Mentor junior engineers and contribute to the continuous improvement of engineering practices.

Requirements:
  • 5+ years of experience in data engineering or related fields.
  • Strong expertise in ClickHouse, including schema design, query optimization, and cluster management.
  • Proven experience in real-time data processing using tools like Apache Kafka and Apache Spark Streaming.
  • Deep understanding of distributed systems, with a focus on scalability and resilience.
  • Proficiency in programming languages like Python or Java.
  • Hands-on experience with cloud platforms like AWS, GCP, or Azure.
  • Familiarity with Docker and Kubernetes.

Preferred Qualifications:
  • Experience in the e-commerce industry or similar high-traffic environments.
  • Knowledge of ETL/ELT tools like Airflow or dbt.
  • Familiarity with monitoring tools like Prometheus or Grafana.

Experience: 2-5 years

Location: Makkah

Posted 2 days ago