Current job openings

Japan (1)
Kraków (25)
London (4)
Malta (1)
Singapore (3)
Toronto (6)
Warsaw, Masovian Voivodeship, Poland (1)

Data Engineering Intern, Toronto

We have come a long way since publishing our first currency feed 23 years ago: from a tech start-up to an award-winning global company offering leading currency solutions for both retail and corporate clients. Founded in 1996, we became the first company to share exchange rate information on the internet free of charge, and in 2001 we launched a trading platform that helped pioneer online trading around the world, giving forex and CFD investors the ability to trade the financial markets. Our vision is to transform how our clients meet all of their currency needs with innovative, award-winning solutions. Under new ownership with significant ambitions to grow the business on the global stage, we are looking for highly motivated, passionate individuals who want to make their mark in a dynamic environment.

We are looking for a co-op data engineer to help us build our next-generation data platform on Google Cloud. In this exciting role you will work alongside experienced data engineers and data scientists on the platform that will power our business intelligence and analytics.

This is an 8-month co-op contract.

During a typical week you will ...

  • Develop ETL tools for both batch and real-time processing.
  • Discuss and design the complete life cycle of information flow, from the data source to the end business user.
  • Use state-of-the-art cloud tools such as Google Cloud Storage, BigQuery, Apache Beam, Pub/Sub, Bigtable and Airflow.
  • Pilot and assess the latest data platform technologies, and recommend to management which would best fit our company-wide data architecture.
  • Work with internal tools to set up networking and the right connections between systems.
  • Pull data from third-party APIs such as Salesforce and Marketo and help prepare the information for reporting.
  • Investigate the cause of failed or missing customer records.
  • Learn first-hand why monitoring, verifying and securing the flow of financial information is critical.
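To give a feel for the ETL work described above, here is a minimal batch pipeline sketch using only the Python standard library. It is purely illustrative: the field names, record shape and CSV target are hypothetical assumptions, not this team's actual schema or tooling.

```python
import csv
import io
import json

# Illustrative only: "customer_id", "currency" and "amount" are
# hypothetical field names, not the team's real schema.

def extract(raw_json: str) -> list[dict]:
    """Parse a raw API payload (e.g. a CRM export) into records."""
    return json.loads(raw_json)

def transform(records: list[dict]) -> list[dict]:
    """Normalize fields and drop records missing a customer id."""
    cleaned = []
    for rec in records:
        if not rec.get("customer_id"):
            continue  # a missing id is the kind of record you would investigate
        cleaned.append({
            "customer_id": rec["customer_id"],
            "currency": rec.get("currency", "").upper(),
            "amount": round(float(rec.get("amount", 0)), 2),
        })
    return cleaned

def load(records: list[dict]) -> str:
    """Write the cleaned records as CSV, ready for a reporting table."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["customer_id", "currency", "amount"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

raw = ('[{"customer_id": "C1", "currency": "usd", "amount": "10.5"},'
       ' {"currency": "eur", "amount": "3"}]')
print(load(transform(extract(raw))))
```

In production the same extract/transform/load shape would typically run inside an Apache Beam pipeline or an Airflow task, with BigQuery rather than CSV as the load target.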

We are looking for ...

  • Experience with object-oriented programming, ideally in Python, Java or Scala.
  • A quick learner who is eager to be the first on the team to try out new tools.
  • Exposure to, or an interest in, big data technologies such as BigQuery, Kafka, Airflow, Apache NiFi or Spark.
  • An excellent team player with strong verbal and written communication skills.
  • Enthusiasm for collaborative problem solving.
  • Motivation to take on challenges and open-ended problems.

Bonus points if you have ...

  • Experience with SQL and with tools for visualizing or monitoring data.

Apply