Job Description
Location: Hybrid/GTA
Job Type: Full-Time
We are seeking a highly skilled and experienced Cloud Data Engineer to join our dynamic team. In this role, you will be instrumental in designing and implementing cutting-edge data solutions using a variety of technologies and platforms.
Insights you can act on
While technology is at the heart of our clients’ digital transformation, we understand that people are at the heart of business success.
When you join CGI, you become a trusted advisor, collaborating with colleagues and clients to bring forward actionable insights that deliver meaningful and sustainable outcomes. We call our employees "members" because they are CGI shareholders and owners who enjoy working and growing together to build a company we are proud of. This has been our Dream since 1976, and it has brought us to where we are today — one of the world’s largest independent providers of IT and business consulting services.
At CGI, we recognize the richness that diversity brings. We strive to create a work culture where all belong and collaborate with clients in building more inclusive communities. As an equal-opportunity employer, we want to empower all our members to succeed and grow. If you require an accommodation at any point during the recruitment process, please let us know. We will be happy to assist.
Ready to become part of our success story? Join CGI — where your ideas and actions make a difference.
Your future duties and responsibilities
- Design, develop, and maintain scalable and efficient data solutions on the cloud platform.
- Implement and optimize data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Data Migration tools.
- Build and maintain data warehouses for efficient data storage, retrieval, and analysis.
- Troubleshoot and debug issues related to data pipelines, ensuring smooth and reliable data flow.
- Collaborate with the Data and Analytics team to design and implement data-driven solutions.
- Utilize ELK, Grafana, and other monitoring tools to ensure optimal performance and observability of data systems.
- Work with Looker to support large-scale reporting and analytics requirements.
- Apply Site Reliability Engineering (SRE) principles to enhance system reliability, incident management, and operations.
- Develop and maintain a robust observability framework for monitoring, alerting, and tracking data system performance.
- Collaborate with cross-functional teams to integrate data systems via APIs and ensure seamless data integration.
- Draw on familiarity with common data pipeline issues and their solutions to diagnose and resolve problems quickly.
Required qualifications to be successful in this role
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Proven experience working with GCP services including BigQuery, Dataflow, Pub/Sub, and Data Migration tools.
- Strong proficiency in data warehouse design and implementation.
- Proficiency in troubleshooting and debugging data pipeline issues.
- Solid understanding of data and analytics principles.
- Experience with ELK stack and Grafana for monitoring and observability.
- Familiarity with Hadoop, Looker, and other relevant data processing and reporting tools.
- Knowledge of Site Reliability Engineering (SRE) principles and incident management.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
- Experience with Kafka and API integration is a plus.
Join our team of cloud professionals and contribute to the success of our cutting-edge projects. We offer a competitive salary, comprehensive benefits package, and a stimulating work environment.