Contract Type: Full-time
Workplace Type: On-site
Location: Riyadh
Join Quant as a Senior Data Engineer and be part of a pioneering team at the forefront of data science and analytics.

As a Senior Data Engineer at Quant, you will play a critical role in preparing data for modeling and predictive analytics. Your responsibilities will encompass integrating new data sources and performing essential pre-processing tasks such as data cleansing and feature engineering. You will also manage various ETL-related operations and leverage your skills in software development to optimize data processes.

Key Responsibilities:
  • Design and implement scalable data pipelines utilizing SQL, Airflow, Python, Alteryx, and cloud technologies (an illustrative Airflow sketch follows this list).
  • Lead data source integration and design data models that align with complex business requirements.
  • Ensure data quality by identifying and addressing issues in collaboration with data source owners.
  • Deploy and optimize machine learning models, statistical methods, and analytics programs.
  • Research novel data acquisition methods to enhance business value.
  • Integrate advanced data management tools and software engineering practices for enhanced system performance.
  • Build and maintain high-quality datasets that support decision-making processes.
  • Develop custom software components and analytics applications to meet business needs.
  • Implement strategies to improve data reliability, efficiency, and scalability.
  • Drive innovation through research and development initiatives.
  • Maintain clear and comprehensive technical documentation of ongoing projects.
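
The pipeline work above is typically organized as orchestrated DAGs. Below is a minimal, illustrative sketch of such a pipeline in Airflow (assuming Airflow 2.4+; the DAG name, tasks, and callables are hypothetical placeholders, not an actual Quant pipeline):

```python
# Minimal illustrative DAG: extract -> cleanse -> load, scheduled daily.
# Assumes Airflow 2.4+; all task logic here is a placeholder.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system.
    print("extracting source data")


def cleanse():
    # Placeholder: data cleansing and feature engineering.
    print("cleansing and engineering features")


def load():
    # Placeholder: write curated data to the warehouse.
    print("loading curated data")


with DAG(
    dag_id="example_ingest_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_cleanse = PythonOperator(task_id="cleanse", python_callable=cleanse)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_cleanse >> t_load
```

In practice, each callable would be replaced by the real extraction, cleansing/feature-engineering, and load logic, with the DAG scheduled and monitored through Airflow.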

Requirements:
  • Proficiency in Python with hands-on experience in data analysis and automation.
  • Experience with Dataiku for advanced data workflows and machine learning.
  • Familiarity with AWS or Azure cloud services.
  • Strong background in data visualization and statistical analysis for actionable insights.
  • Experience with machine learning models and big data projects.
  • Solid understanding of programming concepts and data architecture.
  • Ability to define and track meaningful metrics for analytics initiatives.
  • Excellent analytical skills with a balance of quantitative and qualitative insights.
  • Proficiency in Microsoft Office and Google Workspace for effective documentation and collaboration.
  • Bachelor’s and/or Master’s degree in Computer Science or a related field.
  • Agility, innovation, and ownership in problem-solving.
  • Collaborative team player in multicultural environments.

Similar Jobs

Data Engineer

swatX Solutions

Full-time
Job Summary: The Informatica Engineer will be responsible for designing, developing, and implementing ETL processes using Informatica tools within the telecommunications sector. This role involves ensuring data quality, integrity, and efficient data flow across systems.

Key Responsibilities:
  • Design, develop, and implement ETL processes using Informatica PowerCenter and Informatica Data Engineering.
  • Maintain and optimize existing ETL workflows and mappings.
  • Ensure data quality and integrity across all ETL processes.
  • Collaborate with data analysts and other stakeholders to understand data requirements.
  • Troubleshoot and resolve data-related issues.
  • Manage and integrate geographical data using Esri and ArcGIS Data Store.
  • Perform data modeling, control data quality, and follow best practices in data management.
  • Perform real-time ingestion using CDC from different data stores, including MS SQL and MariaDB (see the illustrative sketch after this list).
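
For the CDC ingestion mentioned above, one common approach with MS SQL is to read the change functions that SQL Server exposes once CDC is enabled. A minimal sketch follows; the table, capture instance "dbo_orders", and connection details are hypothetical placeholders:

```python
# Minimal illustrative poll of SQL Server CDC change tables via pyodbc.
# Assumes CDC is enabled on a hypothetical dbo.orders table (capture instance
# "dbo_orders"); connection details are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=localhost;DATABASE=sales;UID=etl_user;PWD=change_me;"
    "TrustServerCertificate=yes;"
)
cursor = conn.cursor()

# Read every change between the oldest and newest available LSNs for the
# capture instance; __$operation encodes 1=delete, 2=insert, 3/4=update.
cursor.execute(
    """
    SET NOCOUNT ON;
    DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_orders');
    DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
    SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_orders(@from_lsn, @to_lsn, N'all');
    """
)
for row in cursor.fetchall():
    print(row)
```

A production pipeline would track the last processed LSN between runs rather than re-reading the full change window, or use a log-based tool instead of polling.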

Qualifications:
  • Bachelor’s degree in Computer Science, Information Systems, or a related field.
  • 3+ years of experience with Informatica PowerCenter or similar ETL tools.
  • Experience with SQL and relational databases.
  • Experience with Informatica Cloud or on-premises versions.

Skills:
  • Strong understanding of ETL concepts and best practices, CDC, and Spark.
  • Proficiency in Informatica PowerCenter, Informatica Data Engineering, and SQL.
  • Knowledge of Esri and ArcGIS Data Store.
  • Excellent problem-solving and analytical skills.
  • Experience with Denodo is preferred.

Preferred Qualifications:
  • Experience in data warehousing and big data technologies.
  • Knowledge of data governance and quality management.

Experience: 0-1 years

Location: Riyadh

Posted about 17 hours ago

Data Engineer

GitMax

Full-time
Join GitMax as a Lead Data Engineer!

Our client, a leading organization in the financial services sector, is seeking an experienced Lead Data Engineer to drive strategic data platform initiatives. This role is ideal for professionals with a strong background in Data Warehousing (DWH), ETL, and large-scale data migrations, especially in the fintech or IT consulting domain. You'll be working on mission-critical data platform modernization programs for Tier-1 clients in the Middle East.

Key Responsibilities:
  • Lead end-to-end design and execution of enterprise data warehouse and reporting solutions.
  • Oversee the migration of legacy data environments to a modern unified platform based on Greenplum.
  • Drive data integration using Debezium and RabbitMQ for Change Data Capture (CDC) and streaming (an illustrative consumer sketch follows this list).
  • Collaborate with business stakeholders and engineering teams to define technical requirements and delivery roadmaps.
  • Guide and mentor a team of 5+ data engineers, ensuring timely delivery and quality standards.
  • Work closely with BI developers and analysts to enable seamless access to reporting data.
  • Ensure alignment with best practices in data governance, architecture, and compliance.
  • Act as a key contributor in client-facing discussions and workshops.
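
For the Debezium/RabbitMQ CDC work referenced above, the consuming side often looks like a small worker that reads change events from a queue. Below is a minimal sketch, assuming Debezium Server (or an equivalent producer) publishes JSON change events to a RabbitMQ queue; the queue name and connection details are hypothetical:

```python
# Minimal illustrative consumer for CDC change events delivered via RabbitMQ.
# Queue name and connection details are hypothetical placeholders.
import json

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="cdc.events", durable=True)


def handle_change(ch, method, properties, body):
    # Debezium change events carry "op" (c/u/d/r) and "before"/"after" row images.
    event = json.loads(body)
    payload = event.get("payload", event)
    print(payload.get("op"), payload.get("after"))
    ch.basic_ack(delivery_tag=method.delivery_tag)


channel.basic_consume(queue="cdc.events", on_message_callback=handle_change)
channel.start_consuming()
```

In a real deployment, the handler would apply each change to the target warehouse (here, the Greenplum-based platform) instead of printing it.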

Tech Stack & Requirements:
Must-Have Skills:
  • Proven experience with Greenplum, Teradata, and advanced SQL.
  • Hands-on experience with enterprise ETL tools (Informatica PowerCenter or similar).
  • 5+ years working on Data Warehousing projects within financial or enterprise environments.
  • Team leadership experience, including ownership of deliverables and mentoring.

Nice-to-Have Skills:
  • Debezium, Kafka, RabbitMQ, or other streaming/CDC tools.
  • Exposure to BI tools like MicroStrategy, Power BI.
  • Understanding of Java and microservices-based architecture is a plus.

Other Requirements:
  • Strong communication and stakeholder management skills.
  • Fluency in English (both written and verbal).
  • Willingness to relocate to Saudi Arabia or commit to frequent business travel (minimum 70% on-site).

What’s in it for You:
  • Competitive salary (based on experience & seniority).
  • Comprehensive relocation support: visa sponsorship, accommodation, airfare reimbursement, and local transport.
  • Medical insurance included.
  • Dynamic, multicultural work environment and high-impact projects in the Middle East fintech ecosystem.

Experience: 0-1 years

Location: Riyadh

Remote Job
Posted about 17 hours ago