WOO operates a centralised exchange, WOO X, and a decentralised platform, WOOFi, democratising access to top-tier liquidity and exceptional trading execution while keeping costs competitive. Our native token, $WOO, offers token holders a unique position to participate, engage, and maximise the benefits of both the DeFi and CeFi ecosystems. Our team of 170 carefully selected employees is located across 12 cities worldwide.
Our vision is to inspire confidence, higher performance, and joy in every user. We have a mission to provide the best liquidity on the best terms. We compete not just on price execution but also on integrity, user experience, innovative tools, and global opportunities.
About the opportunity:
We are looking for a Data Warehouse Engineer who can join us on this mission and vision. You'll become an integral part of the Data team, which is building best-in-class analytical capabilities (insights and data products) to empower WOO with the right information at the right time to make data-driven decisions. Interested? Keep on reading!
What you’ll be working on:
- Collaborating with the analytics and development teams on data modelling, driving data-driven operations across business departments.
- Working closely with cross-functional teams on the design and development of data application projects, ensuring alignment with organisational goals.
Why work with us:
Join us in realising our vision of advancing decentralisation and leading innovation in CeFi and DeFi. Enjoy work flexibility, a supportive team, and an environment that nurtures your ideas. Plus, expect a performance-based annual bonus for all contributors at WOO.
About you:
- A minimum of 4 years of experience in data warehousing, including expertise in dimensional modelling design methods and hands-on experience processing massive datasets (ETL).
- In-depth understanding of data warehouse concepts and practices.
- A solid foundation in data modelling and data architecture, along with familiarity with big data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Flink, Presto, and ClickHouse.
- Expertise in SQL, including SQL performance tuning, and proficiency in Hive SQL development (see the illustrative sketch after this list).
- Proficiency in at least one of the following languages: Java, Python, Shell, or Scala.
- Exceptional communication skills.
- Fluent English.
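
To give a flavour of the work behind these requirements, here is a minimal Hive SQL sketch of a star-schema query with partition pruning; every table, column, and partition name is hypothetical and purely illustrative, not WOO's actual schema.

```sql
-- Illustrative only: hypothetical fact/dimension tables in a star schema.
-- Typical Hive SQL pattern: join a partitioned fact table to a dimension
-- and prune partitions by date to keep the scan small.
SELECT
    d.asset_symbol,
    SUM(f.trade_volume_usd) AS total_volume_usd
FROM fact_trades f                                  -- fact table, partitioned by dt
JOIN dim_asset d
    ON f.asset_id = d.asset_id                      -- surrogate key from the dimension
WHERE f.dt BETWEEN '2024-01-01' AND '2024-01-31'    -- partition pruning on dt
GROUP BY d.asset_symbol
ORDER BY total_volume_usd DESC
LIMIT 20;
```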
Getting the job:
We're actively seeking talented individuals to join our team outside of our typical hiring schedule. This proactive approach allows us to connect with exceptional candidates like you even before specific positions become available.
On average, successful candidates go through five rounds of interviews and tests. Our hiring process begins with a meeting with our People Team, who help place you in your new role. You can expect to share your experience and ideas in online video interviews with our hiring team, made up of management and potential new colleagues.
Submitting your resume now ensures that you're first in line when new opportunities arise. By doing so, you'll have a head start in the selection process and get a chance to showcase your skills and experience.
Get started on your application here!