Archived offer

Scala Software Engineer

Ref. no.: 237/5/2025/AD/91151
Lead consultant: Aleksandra Durej
Kraków (małopolskie)
26 May 2025

At Antal, we have specialised in recruitment for over 20 years. Operating across 10 specialised divisions, we stay on top of current industry trends. We precisely define the specifics of each position, identifying the key skills and required qualifications. Our mission is not only to find a candidate whose competencies match the requirements of a given advertisement, but above all to find a position that meets the candidate's expectations. Employment agency register number: 496.

Scala Software Engineer – Scala + Java + Spark
📍 Location: Poland (2 days per month in the Kraków office)

Are you passionate about big data, building core components, and developing complex business logic with modern technologies like Spark and cloud platforms?

We’re looking for a Scala Software Engineer to join a global team that’s shaping a next-generation analytics and surveillance platform within a highly regulated environment. If you enjoy hands-on development, end-to-end ownership, and working with both Scala and Java, this is the role for you.


🔍 Key Responsibilities:

  • Design and implement core components for large-scale data processing using Scala and Java

  • Build and optimize data pipelines to consolidate and transform data from multiple sources into meaningful business datasets

  • Leverage Apache Spark (in both Scala and Java) to develop scalable processing solutions (see the illustrative sketch after this list)

  • Onboard new data sources and develop ingestion and transformation logic

  • Collaborate with Data Scientists to productionize analytics models

  • Ensure development aligns with CI/CD and TDD best practices

  • Cooperate with architects and business stakeholders to ensure scalable, high-quality solutions

  • Participate in end-to-end delivery including deployment and monitoring in production
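
To give a concrete flavour of the pipeline work described above, here is a minimal, purely illustrative Spark-on-Scala sketch of a consolidation step. The job name, HDFS paths, dataset names (trades, accounts) and columns (account_id, quantity, price, trade_date, notional) are hypothetical assumptions, not details of the actual platform.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object TradeEnrichmentJob {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("trade-enrichment")   // hypothetical job name
      .getOrCreate()

    // Hypothetical source paths; in practice these would come from configuration.
    val trades   = spark.read.parquet("hdfs:///data/raw/trades")
    val accounts = spark.read.parquet("hdfs:///data/raw/accounts")

    val enriched = enrich(trades, accounts)

    enriched.write.mode("overwrite").parquet("hdfs:///data/curated/trades_enriched")
    spark.stop()
  }

  // Consolidate two raw sources into a single business-level dataset:
  // join reference data onto the trade feed, derive a notional amount,
  // and drop records without a trade date.
  def enrich(trades: DataFrame, accounts: DataFrame): DataFrame =
    trades
      .join(accounts, Seq("account_id"), "left")
      .withColumn("notional", col("quantity") * col("price"))
      .filter(col("trade_date").isNotNull)
}
```

In a production pipeline the schemas, paths and write modes would be driven by configuration, with the transformation logic kept in small, testable functions like enrich.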


Requirements:

  • Hands-on experience with Apache Spark using both Scala and Java (not PySpark)

  • Strong background in building core applications and implementing complex data processing logic

  • Solid understanding of HDFS and YARN – including their purpose and real-world usage

  • Broad experience with ETL processes and Big Data technologies

  • Knowledge of version control and CI/CD tools (Git, Bitbucket, Jenkins, Maven) – see the test sketch after this list

  • Comfortable working in distributed, international teams; strong communication in English (min. B2)
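
Because the role stresses TDD and CI/CD, the sketch below shows one way the enrich step above could be unit-tested with ScalaTest against a local SparkSession, so it can run in a Jenkins build without a YARN cluster. TradeEnrichmentJob and all column names are the same illustrative assumptions as in the earlier sketch.

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical unit test for the illustrative enrichment step above.
class TradeEnrichmentJobSpec extends AnyFunSuite {

  private lazy val spark = SparkSession.builder()
    .appName("trade-enrichment-test")
    .master("local[2]")   // local mode: no cluster needed on a CI agent
    .getOrCreate()

  test("enrich joins account data and derives the notional amount") {
    import spark.implicits._

    val trades = Seq(("T1", "A1", 10L, 2.5, "2025-05-26"))
      .toDF("trade_id", "account_id", "quantity", "price", "trade_date")
    val accounts = Seq(("A1", "Desk-1")).toDF("account_id", "desk")

    val result = TradeEnrichmentJob.enrich(trades, accounts).collect().head

    assert(result.getAs[String]("desk") == "Desk-1")
    assert(result.getAs[Double]("notional") == 25.0)
  }
}
```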


🌟 Nice to Have:

  • Experience with Google Cloud Platform (GCP) for deploying and managing data services

  • Hands-on familiarity with Kubernetes and Airflow

  • Exposure to the ELK stack

  • Working knowledge of Linux, Bash scripting, and basic Python

  • Familiarity with the Spring framework


💼 What We Offer:

  • A key role in a globally important initiative, focused on compliance and surveillance

  • Modern tech stack with real ownership of your code in production

  • A collaborative, DevOps-oriented culture

  • The opportunity to shape technical decisions early in a long-term project

  • Work with highly skilled professionals in Poland and globally