Location: Toronto, Canada

Thales people architect solutions that support 85 million mainline and suburban passenger journeys worldwide, every day. Our Rail Signalling and Communication systems are used on metro lines across major cities; 72,000 km of route and 52,000 trains per day in 16 countries are controlled by our Traffic Management Systems. Together we deployed the first-ever nationwide ticketing system, which processes over 50 million ticketing transactions in 100 cities daily. Thales provides world-leading Communications-Based Train Control (CBTC) and interlocking for mass rail transit applications globally.

The Centre of Competence for Urban Rail Signalling is located right here in midtown Toronto. As the largest Thales office in Canada, we house a department for every step of the Software Development Lifecycle (SDLC), including Software, Hardware, Systems Design, Verification & Validation, and Operations. The supporting corporate shared-services teams in Finance, Human Resources, and IT are also located at the office. Our office space was designed to provide a sustainable, healthy workplace that expresses TRSS’s brand while increasing collaboration. Features include Greenguard furniture, EnergyStar appliances, and low-emitting adhesives and sealants, while sliding glass doors on internal offices provide daylight and views to the outside. Come join the big Transport family, here in Toronto!
We are looking for a Full Stack Intern to join us for an 8-month internship at our Toronto office.
As a Software Developer within the GTS-URS Data Competency Centre (DCC), you will strive to deliver Digital Products and Services, hosted both on-premises and on cloud platforms, that delight customers and users. You will be flexible and proactive, and comfortable learning a new domain, new concepts, and new techniques. You will work with Agile methodologies, where knowledge of cloud technologies, data analytics, site reliability, and DevSecOps principles will be an asset to the team.
Focused primarily on our Digital Products and Data Analytics, you will work collaboratively in an international context to ensure the development and support of the solution. From UX design, clean and secure coding, QA, deployment on core or cloud platforms, service delivery, and proper documentation write-up, you will contribute directly to all aspects of the definition and delivery of core new products for the business. The Data Competency Centre’s platform leverages the abundance of data generated during day-to-day passenger operation of urban railways on every continent, and it has a direct impact on commuter experience and project execution through improvements such as better passenger travel times, faster system reliability growth, and fewer passenger service disruptions.
Key Accountabilities:
Development, testing, validation, demonstration, and operation of software solutions/services, including provisioning, packaging, deployment, administration, and documentation
Delivery of high-quality, best-practice, clean and secure code via CI/CD pipelines
Contribute to solving complex technical integration problems and work collaboratively with the rest of the team to build consensus on a shared approach
Utilization of on-premises and cloud (Azure) platforms to meet project requirements, including identification of cybersecurity implications
Work in an Agile, cross-functional, multinational team, actively engaging to support the success of the team
Required Skills and Experience:
The successful candidate should be working towards a Bachelor’s degree from an accredited university or college in Software Engineering, Computer Science, or Information Technology, or equivalent
Java (Spring Boot, Hibernate, Maven, JUnit, Mockito)
SQL (PostgreSQL)
Web-development technologies including HTML, CSS, JavaScript/TypeScript (Stencil.JS/React or similar front-end frameworks)
Object storage (Azure Data Lake/AWS S3 or similar)
DevOps skills and technologies: version control with Git (GitLab/Bitbucket or similar), Docker, Artifactory, CI/CD (Jenkins, GitLab CI/CD or similar)
Knowledge of cloud and containerization, particularly Azure; clustered deployment orchestration (Kubernetes), containers (Docker), and Pivotal Cloud Foundry
Experience managing competing priorities simultaneously and driving projects to completion
Fluent in English both in written and oral communication skills
Able to work full-time hours (40 hours per week) for the agreed start date and work term
Preferred Skills and Experience:
Python development skills and experience with data science libraries such as pandas and NumPy
Experience with big data, analytics platforms, and visualization tools, e.g. Kibana, Elasticsearch/ELK, Grafana, Superset, Apache Spark, Azure Data Lake, Databricks
Understanding and previous use of common integration patterns for cloud, web services, and public API gateway deployments (ingress web server, web proxy, load balancer, firewall, databases, etc.)
Experience with Artificial Intelligence (AI) and Machine Learning (ML)
Final year of a Bachelor’s degree in Software Engineering, Computer Science, or Information Technology, or an equivalent Master’s degree
Certifications in cloud and related technologies
#LI-Hybrid
#LI-NS1
Thales is an equal opportunity employer which values diversity and inclusivity in the workplace. Thales is committed to providing accommodations in all parts of the interview process. Applicants selected for an interview who require accommodation are asked to advise accordingly upon the invitation for an interview. We will work with you to meet your needs. All accommodation information provided will be treated as confidential and used only for the purpose of providing an accessible candidate experience.