Contract Type: Full-time
Workplace Type: On-site
Location: Riyadh
Join Future Look ITC as a Data Engineer!

We are seeking a skilled Data Engineer to design and implement scalable data pipelines that will elevate our data handling capabilities. At Future Look ITC, we partner with leading, innovative organizations to drive their success with cutting-edge solutions aligned with Saudi Arabia's Vision 2030.

Key Responsibilities:
  • Design and implement scalable batch and streaming data pipelines.
  • Build and maintain data ingestion, transformation, and storage systems.
  • Ensure data quality, consistency, lineage, and reliability across pipelines.
  • Collaborate with ML and backend engineers to provide clean, usable datasets for training and inference.
  • Optimize data storage and query performance for analytical and operational workloads.
  • Implement data validation, monitoring, and alerting mechanisms.
  • Manage schema evolution, versioning, and backward compatibility.
  • Document data models, pipelines, and operational procedures.

Core Deliverables:
  • Production-grade data pipelines and ingestion workflows.
  • Curated datasets and data models for analytics and ML.
  • Data quality checks and monitoring dashboards.
  • Reliable data storage and access patterns.
  • Documentation for data architecture and pipelines.

Required Qualifications:
  • 3+ years of experience in data engineering or related roles.
  • Strong experience with Python and SQL.
  • Experience with data processing frameworks and tools.
  • Familiarity with relational and analytical databases.
  • Experience building and operating data pipelines in production environments.

Preferred Qualifications:
  • Experience with streaming platforms and real-time data processing.
  • Familiarity with data lakes, warehouses, and columnar storage formats.
  • Experience supporting ML or analytics platforms.
  • Exposure to cloud or hybrid infrastructure environments.

Success Indicators:
  • Data pipelines are stable, accurate, and scalable.
  • High data availability and low pipeline failure rates.
  • Data consumers can easily discover and trust datasets.
  • Reduced data-related incidents and manual interventions.

Working Style:
  • Strong ownership of data reliability.
  • Attention to detail and data correctness.
  • Clear documentation and communication.

Requirements

  • Requires 2–5 years of experience.

Similar Jobs

Data Engineer


Node Technologies

Full-time
Join Node Technologies as a Data Engineer!
We are a leading technology company dedicated to supporting the digital transformation of government entities and major corporations through innovative solutions. In this role, you will design, build, and maintain secure, scalable data pipelines and analytics platforms, leveraging your expertise in data engineering, AI integration, and analytics enablement.

Key Responsibilities:
  • Data Engineering & Ingestion:
    Design, build, and maintain secure data ingestion pipelines from multiple internal and external data sources, and implement ETL/ELT processes to move and transform data into Google BigQuery.
  • Databases & Data Warehousing:
    Design and manage relational databases using PostgreSQL or MySQL, and develop and optimize BigQuery datasets.
  • Data Analytics & Reporting:
    Enable and support analytics and reporting using Power BI Pro and Looker Enterprise.
  • Data Visualization:
    Develop custom data visualizations using libraries such as ***** and ***.
  • AI & Machine Learning Integration:
    Integrate external AI services and develop data processing pipelines using Python.
  • Security, Quality & Operations:
    Implement secure data access, monitor data pipelines, and document data architectures.

Technical Skills Required:
  • Minimum 3 years of experience as a Data Engineer.
  • Strong experience with PostgreSQL, MySQL, and Google BigQuery.
  • Proficient in Python, data modeling, and BI tools.
  • Knowledge of data visualization libraries and API-based data ingestion.

Soft Skills:
Strong analytical skills, communication abilities, and the capacity to work in agile, cross-functional teams are essential.

If you are eager to contribute to exciting projects and grow your career in a dynamic setting, we would love to hear from you!

Experience: 2–5 years

Location: Riyadh

Posted 24 days ago

Data Engineer


AiElements

Full-time
Job Summary
We are seeking a motivated and detail-oriented Data Engineer with at least 2 years of experience in the Informatica ecosystem to design, develop, and maintain ETL processes and data pipelines. The ideal candidate will have hands-on experience with Informatica PowerCenter and strong knowledge of data integration, transformation, and optimization techniques.

Key Responsibilities
  • Design, develop, and maintain ETL workflows using Informatica PowerCenter.
  • Build and optimize data pipelines for structured and semi-structured data.
  • Perform data extraction, transformation, and loading (ETL) from multiple data sources.
  • Ensure data quality, accuracy, and consistency across systems.
  • Troubleshoot and resolve ETL and data integration issues.
  • Collaborate with data analysts, BI teams, and stakeholders to understand data requirements.
  • Optimize ETL performance and improve existing data workflows.
  • Document ETL processes, mappings, and technical designs.
  • Support deployment, monitoring, and maintenance of data integration solutions.

Required Qualifications
  • Bachelor’s degree in Computer Science, Information Systems, or a related field.
  • Minimum 2 years of hands-on experience in Informatica tools, especially PowerCenter.
  • Strong understanding of ETL concepts, data warehousing, and data modeling.
  • Experience working with relational databases (Oracle, SQL Server, MySQL, etc.).
  • Strong SQL skills for data validation and troubleshooting.
  • Familiarity with data quality and performance tuning techniques.
  • Ability to analyze and solve data-related issues effectively.

Preferred Skills (Nice to Have)
  • Experience with other Informatica products (IDMC, Informatica Cloud).
  • Knowledge of data warehousing concepts (Star/Snowflake schema).
  • Experience with large-scale or enterprise data environments.
  • Exposure to cloud platforms (Azure, AWS, GCP).
  • Basic knowledge of scripting (Python, Shell).

Skills & Competencies
  • ETL Development
  • Informatica PowerCenter
  • Data Integration & Transformation
  • SQL & Databases
  • Problem-solving & Troubleshooting
  • Attention to detail

Experience: 2–5 years

Location: Riyadh

Posted 14 days ago