Contract Type: Full-time
Workplace Type: On-site
Location: Jeddah
Join RiDiK as a Data Engineer!
We are seeking a seasoned Data Professional to drive the design and implementation of a modern Data Lakehouse for a major financial services program.

Role Objective:
The ideal candidate will be an expert in the Teradata Financial Services Logical Data Model (FSLDM) and have deep expertise in Informatica for complex ETL/ELT orchestration. You will transform raw financial data into a structured, high-performance Lakehouse architecture that supports both BI and advanced analytics.

Key Responsibilities:
  • Data Modeling: Lead the implementation and customization of the Teradata FSLDM.
  • Architecture Design: Design and maintain Data Lakehouse layers that support financial data at massive scale.
  • ETL/ELT Development: Architect and develop robust data pipelines using Informatica.
  • Performance Tuning: Optimize SQL and mappings for high-volume data processing.
  • Data Governance: Implement data lineage, quality checks, and metadata management (a minimal quality-gate sketch follows this list).
  • Stakeholder Collaboration: Work closely with Business Analysts and Data Scientists.
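
In practice, controls like these are built inside Informatica's own tooling; purely as an illustration of the kind of post-load quality gate the Data Governance bullet describes, here is a minimal Python sketch using the open-source teradatasql driver. The table, checks, thresholds, and connection details are all hypothetical.

```python
# Minimal post-load data quality gate (illustrative only; table names,
# checks, and connection details are placeholders, not a real schema).
import teradatasql

QUALITY_CHECKS = [
    # (check name, SQL returning a single count, max allowed offenders)
    ("null_account_ids",
     "SELECT COUNT(*) FROM stg_transactions WHERE account_id IS NULL", 0),
    ("future_dated_rows",
     "SELECT COUNT(*) FROM stg_transactions WHERE txn_date > CURRENT_DATE", 0),
]

def run_quality_gate(con) -> bool:
    """Run each check; return False if any threshold is breached."""
    ok = True
    cur = con.cursor()
    for name, sql, max_allowed in QUALITY_CHECKS:
        cur.execute(sql)
        (count,) = cur.fetchone()
        if count > max_allowed:
            print(f"FAIL {name}: {count} offending rows (max {max_allowed})")
            ok = False
        else:
            print(f"PASS {name}")
    return ok

if __name__ == "__main__":
    # Placeholder credentials; a real pipeline would pull these from a vault.
    with teradatasql.connect(host="td.example.com",
                             user="etl_user", password="***") as con:
        if not run_quality_gate(con):
            raise SystemExit("Quality gate failed; batch not promoted.")
```

A gate like this would run as a step after each load, with a breach blocking promotion of the batch into the curated Lakehouse layers.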

Technical Requirements:
  • Expert-level knowledge of FSLDM is mandatory.
  • Extensive experience with Teradata.
  • Advanced proficiency in Informatica.
  • Proven experience in building Data Lakehouse architectures.
  • Strong understanding of Banking/Financial Services domains.
  • Expert in writing and optimizing complex analytical SQL queries (see the example below).
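
As a concrete (and entirely hypothetical) example of the analytical SQL this last requirement refers to: Teradata's QUALIFY clause filters directly on window-function results, which avoids wrapping a deduplication query in a derived table. Schema, host, and credentials below are placeholders.

```python
# Hypothetical example: latest balance per account, deduplicated with
# QUALIFY + ROW_NUMBER() rather than a nested subquery.
import teradatasql

LATEST_BALANCE_SQL = """
SELECT  account_id,
        customer_id,
        balance,
        as_of_date
FROM    fin_core.account_balances
QUALIFY ROW_NUMBER() OVER (PARTITION BY account_id
                           ORDER BY as_of_date DESC) = 1
"""

with teradatasql.connect(host="td.example.com",
                         user="analyst", password="***") as con:
    cur = con.cursor()
    cur.execute(LATEST_BALANCE_SQL)
    for row in cur.fetchmany(10):
        print(row)
```

On Teradata, tuning queries like this typically also means keeping statistics current on the partition and ordering columns (via COLLECT STATISTICS) so the optimizer chooses sensible plans at volume.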

Requirements

  • Requires 2-5 years of experience

Similar Jobs

Data Engineer
Company: Barakah
Contract Type: Full-time

Join Barakah as a Senior Data Engineer!

At Barakah, we are committed to reducing food waste through technology and social awareness. As a Senior Data Engineer, you will be instrumental in enhancing our data platform, ensuring it is reliable, scalable, and high-performance. This position entails:
  • Designing, building, and maintaining data pipelines across various use cases.
  • Optimizing the data warehouse for efficiency and performance.
  • Collaborating closely with analytics, product, and engineering teams to maintain data integrity.

Key Responsibilities:
  • Manage and optimize a ClickHouse data warehouse (see the sketch after this list).
  • Build robust data pipelines using dbt and Kafka.
  • Implement monitoring, alerting, and observability.
  • Design backup and recovery systems.
  • Support analytics teams with reliable data delivery.
  • Establish data engineering best practices.
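
To make the first two bullets concrete, here is a small sketch using the open-source clickhouse-connect client; the table layout, retention policy, and host are assumptions for illustration, not Barakah's actual warehouse.

```python
# Illustrative ClickHouse warehouse task: create a MergeTree table with
# partitioning and TTL, then run an aggregate query. Names are placeholders.
import clickhouse_connect

client = clickhouse_connect.get_client(host="ch.example.internal",
                                       username="default", password="")

# Monthly partitions bound merge/scan work; the 90-day TTL caps storage;
# ORDER BY should match the dominant query pattern.
client.command("""
CREATE TABLE IF NOT EXISTS events
(
    event_time DateTime,
    user_id    UInt64,
    action     LowCardinality(String)
)
ENGINE = MergeTree
PARTITION BY toYYYYMM(event_time)
ORDER BY (user_id, event_time)
TTL event_time + INTERVAL 90 DAY
""")

result = client.query("""
SELECT action, count() AS n
FROM events
WHERE event_time >= now() - INTERVAL 7 DAY
GROUP BY action
ORDER BY n DESC
""")
for action, n in result.result_rows:
    print(action, n)
```

dbt models would then build curated marts on top of tables like this, with Kafka feeding the raw events in.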

Required Skills:
  • Strong SQL and database experience (ClickHouse and/or PostgreSQL a plus).
  • Experience in building and managing production data pipelines.
  • Familiarity with orchestration tools and batch processing.
  • Stream processing knowledge with Kafka.
  • API integration with resilience patterns (see the sketch after this list).
  • Proficient in Python for data processing.
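
As a sketch of how the Kafka and resilience bullets combine in practice, the snippet below consumes a stream with kafka-python and forwards each record to an API with exponential-backoff retries via tenacity. The topic, broker, and endpoint are placeholders.

```python
# Consume a Kafka topic and forward each record to an API, retrying
# transient failures with exponential backoff. Names are placeholders.
import json

import requests
from kafka import KafkaConsumer          # kafka-python
from tenacity import retry, stop_after_attempt, wait_exponential

ENRICH_URL = "https://api.example.internal/v1/enrich"  # hypothetical endpoint

@retry(stop=stop_after_attempt(5),
       wait=wait_exponential(multiplier=1, max=30))
def forward(record: dict) -> None:
    """POST one record; tenacity retries with exponential backoff."""
    resp = requests.post(ENRICH_URL, json=record, timeout=5)
    resp.raise_for_status()

consumer = KafkaConsumer(
    "orders",                                  # placeholder topic
    bootstrap_servers="kafka.example.internal:9092",
    group_id="enrichment-service",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    enable_auto_commit=False,                  # commit only after a successful POST
)

for msg in consumer:
    forward(msg.value)
    consumer.commit()  # at-least-once delivery: commit after the side effect
```

Disabling auto-commit and committing only after a successful POST gives at-least-once delivery, so downstream consumers need to tolerate occasional duplicates.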

Preferred Skills:
  • Experience with Kubernetes, Docker, and DataOps.
  • Prior experience in the GCC region.
  • Arabic language fluency.

What We Offer:
  • Opportunity to make a meaningful impact on food sustainability.
  • A supportive environment for learning and growth.
  • Career advancement opportunities within a growing company.

Experience: 2-5 years
Location: Jeddah
Posted: 17 days ago