Big Data Engineer

Ref. no.: 185/6/2025/DT/91539
Lead consultant: Diana Tkachova
Kraków (małopolskie)
17 June 2025

At Antal, we have been recruiting for over 20 years. Operating across 10 specialized divisions, we stay closely attuned to current industry trends. We precisely define the specifics of each role, identifying the key skills and essential qualifications. Our mission is not only to find a candidate whose competencies match the requirements of a given posting, but above all to find a position that meets the candidate's expectations. Employment agency registry number: 496.

Job Title: Full Stack Engineer (Big Data)
Location: Kraków

About the Role

You will work as part of a newly established engineering team in Kraków, responsible for the development, enhancement, and support of high-volume data processing systems and OLAP solutions used in global traded risk management.

Key Responsibilities

  • Design, develop, test, and deploy scalable IT systems to meet business objectives

  • Build data processing and calculation services integrated with risk analytics components

  • Collaborate with BAs, business users, vendors, and IT teams across regions

  • Integrate with analytical libraries and contribute to overall architecture decisions

  • Apply DevOps and Agile methodologies, focusing on test-driven development

  • Provide production support, manage incidents, and ensure platform stability

  • Contribute to both functional and non-functional aspects of delivery

Required Qualifications & Skills

  • Degree in Computer Science, IT, or a related field

  • Fluent in English, with strong communication and problem-solving skills

  • Hands-on experience with big data solutions and distributed systems (e.g., Apache Spark)

  • Strong backend development using Java 11+, Python, and Groovy

  • Experience building REST APIs and microservices, and integrating with API gateways

  • Exposure to public cloud platforms, especially GCP or AWS

  • Familiarity with Spring (Boot, Batch, Cloud), Git, Maven, Unix/Linux

  • Experience with RDBMS (e.g., PostgreSQL) and data orchestration tools (e.g., Apache Airflow)

  • Solid understanding of test automation tools such as JUnit, Cucumber, Karate, and REST Assured

Desirable Skills

  • Knowledge of financial or traded risk systems

  • Experience with UI/BI tools and streaming solutions

  • OLAP and distributed computation platforms such as ClickHouse, Druid, or Pinot

  • Familiarity with data lakehouse technologies (e.g., Dremio, Trino, Delta Lake, Iceberg)

  • Exposure to technologies like Apache Flink, Beam, Samza, Redis, Hazelcast

  • Containerization and orchestration tools: Docker, Kubernetes

  • Certifications: Scrum Master, PMP, FRM, or CFA

  • Knowledge of RPC frameworks (e.g., gRPC)