MTX Metisox Cell Networks is a leading provider of specialized biomedical databases and analytical software for life science research companies. With teams in Cambridge, Heidelberg, Novi Sad and Rijeka, we provide scientific information on molecular and biological processes used in safety assessment, predictive toxicology and scientific services. As a rapidly growing high-tech company, we are currently looking to recruit skilled, enthusiastic people for our international, highly interdisciplinary team. MTX is a place where teams of pharmacists, chemists, molecular biologists and technologists use innovative, intelligent technologies every day to tackle cutting-edge scientific problems in pharmaceutical drug discovery and in the chemical industry.
Metisox is a sister company; successful candidates will be employed by the domestic company CCNet Scientific d.o.o. Novi Sad.
We are currently hiring a Data Engineer to join our engineering team to design and build solutions including data pipelines, data integration, and data mining applications that integrate various software components.
Responsibilities:
- Writing effective and scalable Java code
- Designing, implementing and maintaining robust data pipelines
- Orchestration and automation of data flows/pipelines
- Data modelling and relational database (RDBMS) design
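To give candidates a flavor of the day-to-day work, the sketch below shows a minimal in-memory pipeline stage in plain Java: parsing raw records, filtering out malformed rows, and aggregating the result. It is an illustrative example only, not MTX production code; the record format and the `countByCompound` helper are hypothetical.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

/** Illustrative sketch of a pipeline stage: parse, validate, aggregate. */
public class PipelineSketch {

    /** Counts well-formed "compound,value" records per compound, dropping malformed lines. */
    static Map<String, Long> countByCompound(List<String> csvLines) {
        return csvLines.stream()
                .map(line -> line.split(","))
                // keep only rows with exactly two non-blank fields
                .filter(cols -> cols.length == 2 && !cols[1].isBlank())
                .collect(Collectors.groupingBy(cols -> cols[0], Collectors.counting()));
    }

    public static void main(String[] args) {
        List<String> raw = List.of("aspirin,10", "aspirin,12", "ibuprofen,7", "bad_row");
        System.out.println(countByCompound(raw));
    }
}
```

In a real pipeline the same validate-transform-aggregate shape would typically be expressed with a framework such as Apache Beam or Spark, as listed in the requirements below.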
Key Requirements:
- Degree in Computer Science, Engineering, or a related field
- Very good knowledge of Java programming language
- Good knowledge of Python programming language
- At least 2 years of experience as a data engineer
- Extensive experience using Java, Python, and SQL
- Strong experience with data processing frameworks such as Apache Beam and/or Apache Spark
- Strong experience with Linux
- Full product development life-cycle from requirements to testing
- Experience with the Git version control system
- Familiarity with various testing frameworks and tools
Nice-to-haves:
- Working experience with Elasticsearch
- Design and implementation of REST API
- Familiarity with Agile development methodologies
- Familiarity with OLAP/Data Warehousing
- Experience with building CI/CD pipelines
- Familiarity with data pipeline orchestration tools such as Apache Airflow
- Experience with Big Data/Cloud-based platforms (e.g. GCP, Hadoop)