Our partner CKW is a leading service provider for energy, data, and infrastructure in Switzerland, with the experience and know-how to tackle the challenges of a new energy world. CKW focuses on creating simple and inspiring solutions that give customers access to modern uses of energy and digital infrastructure. By reducing the variety and complexity of the changing energy world, they give customers orientation and answers, helping them make the right decisions.
Together with our partner, we are expanding the team and looking for a:
Senior Data Engineer
to join the cross-functional team and work on the existing data platform (CKW Insights) and the internal data products of CKW, a Swiss-based energy utility company.
Your responsibilities:
- Develop and integrate data products within CKW Insights, including event-driven, batch, and micro-batch data sources.
- Establish a Data Mesh and manage cloud resources for data orchestration, transformation, and storage using Infrastructure as Code (IaC).
- Implement and maintain storage solutions for both structured and unstructured data, including Data Warehouse and Lakehouse systems.
- Design and develop ETL and ELT pipelines using SQL-based and OOP languages.
- Ensure data processing resilience by producing to and consuming from event queues.
- Conduct data warehousing, create optimized data models, and manage large datasets (big data).
- Generate reports and maintain operational and data quality through rigorous testing, validation, and verification.
- Apply machine learning techniques, integrate IoT data, and analyze end-to-end solutions.
- Work within Agile frameworks, emphasizing process-oriented management.
Your profile:
- Over 8 years of experience in data analysis and solution implementation.
- Expertise in Python, particularly with PySpark and pandas, and experience with Azure Event Hubs (Kafka).
- Proficient in SQL, T-SQL, Databricks SQL, and Azure-based coding and resource management via Portal/CLI and Terraform.
- Familiar with Azure Functions, APIs, and scheduling functions.
- Understanding of traditional and modern data warehousing methods, including Kimball, Inmon, and Data Vault 2.0.
- Experience with ELT & ETL techniques, Databricks notebooks, and reporting tools such as Power BI.
- Knowledge of data quality testing, functional testing, and Azure DevOps would be considered a plus.
- Background in Hadoop, the broader Big Data ecosystem, Azure DevOps pipelines (YAML), and artifacts would be considered a plus.
- Data solutions, machine learning, IoT, and Data Mesh experience would be considered a plus.
- Bachelor’s degree in Engineering, specializing in Business Intelligence or related fields.
Our offer:
- We value work-life balance and have flexible working hours
- We offer you the chance to attend conferences and training, as we are dedicated to helping all our team members further enhance their knowledge
- Lots of team activities and perks: yearly retreats, workshops, hackathons
- Work with an international team of world-class engineers
- A modern, sunny, and open workspace with a positive and fun atmosphere
- A stimulating work experience that will allow you to grow both professionally and personally
- Know someone who would be a perfect fit? Let them know, and after their successful probation period, come and collect a well-deserved referral fee!