WHO WE ARE
BUX makes it easy and affordable for Europeans to do more with their money. Since launching in 2014, BUX has made the markets accessible to more than 3 million users across 9 countries in Europe. BUX currently offers 3 apps that allow users to explore the financial markets, including BUX Zero, the flagship platform that makes commission-free investing possible, allowing users to invest in the brands and companies they care about. BUX Zero is currently available in the Netherlands, Germany, Austria, France and Belgium. BUX X, launched in 2014, offers short-term, leveraged trading, all powered by a vibrant in-app community. BUX Crypto, launched in 2020, offers an easy and affordable way to invest in Bitcoin and other digital currencies. Headquartered in Amsterdam, the Netherlands, the company is backed by Holtzbrinck Ventures, Velocity Capital, Orange Growth Capital and Initial Capital.
At BUX we prefer to base our decisions on insights derived from our data, and as such we have invested a lot of time and effort in our data platform. We are currently rebuilding our data platform in Google Cloud (GCP), and we want to make it better than it already is.
We are looking for a senior data engineer who is up for a challenge. You will lead this project and guide us through the migration. You will have the opportunity to lay the foundation for our new data platform and, over time, grow a team of data engineers who will continuously extend and improve it. As a senior data engineer you will be part of the BI team and report to the Head of BI.
WHAT IS THE CHERRY ON TOP?
- Conference & training budget to boost your professional development;
- The possibility to work from abroad for 1 month every 12 months;
- Cool office in the heart of Amsterdam (10 minutes from Central Station);
- Bi-weekly Hackathons, weekly drinks, fun company events;
- Margy’s famous lunches;
- Competitive salary and Employee Option Plan.
WHAT ARE WE LOOKING FOR?
- A proactive, self-starting, can-do mentality and a team-player attitude;
- Experience with SQL, relational databases, and one or more programming languages (preferably Python);
- Good understanding of data modeling techniques (e.g. Data Vault and Kimball) and data warehousing best practices;
- Experience in delivering big data and machine learning solutions in a cloud environment (preferably GCP);
- Experience with building data pipelines in distributed environments with technologies such as Kafka and Spark;
- Experience with containerization and automation solutions in cloud environments (e.g. Docker/Kubernetes, Airflow, etc.).
NICE TO HAVE
- Working experience with dbt and Snowflake/BigQuery