EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
We believe that the team you build is the company you build. Our offices are digital laboratories. Our clients are major global brands. We’re always looking for talented teammates. Think you’ve got what it takes?
DESCRIPTION
We are currently looking for a self-motivated, enthusiastic Senior Data Software Engineer to join our team in Serbia.
RESPONSIBILITIES
- Create and maintain optimal on-prem and cloud-based data pipeline architectures capable of ingesting structured and unstructured data using both batch and streaming methods
- Assemble large, complex data sets that meet functional and technical requirements
- Identify, design, and implement internal process improvements by automating manual processes, optimizing data delivery and re-designing infrastructure for greater scalability
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Develop data flows that can leverage both on-premises and cloud architectures
- Work with stakeholders, including the business analytics and IS architecture teams, to assist with data-related technical issues and support their data infrastructure needs
REQUIREMENTS
- Extensive experience in core data software engineering
- Experience building Data Management solutions on Azure or AWS
- Experience working with Databricks
- Advanced working knowledge of cloud-based data pipeline development
- Extensive experience with an object-oriented or functional scripting language such as Python, Java, or Node.js
- Experience building and optimizing data pipelines for the acquisition and management of data, including data security, metadata capture, and data cataloging and classification
- Working knowledge of message queuing, stream processing and Change Data Capture mechanisms
NICE TO HAVE
- Solid Cloud experience with Azure: Storage, Compute, Networking, Identity and Security
- Experience with queues and stream processing, serverless computing, and data analysis and visualization
- ML as a Service
- Experience with Spark Streaming
WE OFFER
- Dynamic, entrepreneurial, high-speed, high-growth corporate environment
- Diverse multicultural, multi-functional, and multilingual work environment
- Opportunities for personal and career growth in a progressive industry
- Global scope, international projects
- Widespread training and development opportunities
- Unlimited access to LinkedIn learning solutions
- Competitive salary and various benefits
- Support for sports and social teams, a recreation area, and advanced CSR programs