About the Role
Join Capgemini, a market leader in the data, platform, and analytics sectors, as a Data Engineer. In this role, you will drive data-driven decision-making across the organization, leveraging your expertise in cloud technologies and data architecture.
Your Responsibilities
- Design, develop, and maintain scalable ETL (Extract, Transform, Load) pipelines using cloud-native tools.
- Implement best practices for data ingestion, transformation, and loading processes across multiple sources.
- Automate workflows and develop data pipelines using tools such as Apache Airflow, Glue, Databricks, or Dataflow.
- Architect and maintain cloud-based data solutions using major platforms such as AWS, Azure, or Google Cloud Platform.
- Optimize cloud infrastructure for cost, performance, and security.
- Implement monitoring, logging, and alerting solutions for data pipelines and cloud infrastructure.
Your Skills and Experience
- Bachelor's degree in Data Science, Analytics, Business, or a related field. Advanced degrees or certifications are a plus.
- Proficiency with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Strong programming skills in Python, Java, Scala, or SQL.
- Experience with big data tools such as Hadoop, Spark, and Kafka.
- Knowledge of relational and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra).
Why Capgemini?
By becoming part of our diverse team, you will leverage technology in innovative ways to help our clients grow and thrive sustainably. Through collaboration and learning, you will enhance your skills and contribute to our clients' digital and sustainable transitions.