Job description:
Our team is based in Warsaw. As part of the Data & AI area, we deliver projects that put data science and artificial intelligence into practice at a scale unprecedented in Poland. We are looking for Big Data engineers who want to build highly scalable, fault-tolerant data ingestion for millions of Allegro customers.
Requirements:
- Programming in languages such as Scala, Java, or Python
- Strong understanding of distributed systems, data storage, and processing frameworks such as dbt, Spark, or Apache Beam (see the sketch after this list)
- Knowledge of GCP (especially Dataflow and Composer) or other public cloud environments like Azure or AWS
- Use of good practices (clean code, code review, TDD, CI/CD)
- Efficient navigation within Unix/Linux systems
- Positive attitude and team-working skills
- Desire for personal development and keeping knowledge up to date
- English at B2 level
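To illustrate the kind of ingestion work this stack is typically used for, here is a minimal sketch in Scala using Spark: it reads raw JSON events and writes them out as date-partitioned Parquet. The bucket paths, column names, and job name are illustrative assumptions, not details taken from this offer.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, to_date}

// Minimal, illustrative ingestion job: read raw JSON events and write them
// out as date-partitioned Parquet. Paths and column names are hypothetical.
object EventIngestion {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("event-ingestion-sketch")
      .getOrCreate()

    // Hypothetical input location with raw JSON event files
    val events = spark.read.json("gs://example-bucket/raw/events/*.json")

    events
      .withColumn("event_date", to_date(col("event_timestamp"))) // assumes an event_timestamp column
      .write
      .mode("append")
      .partitionBy("event_date")
      .parquet("gs://example-bucket/curated/events") // hypothetical output location

    spark.stop()
  }
}
```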
Perks and benefits:
- Opportunity to learn and work with backend (Spring, Kotlin) and AI technologies within the team
- Well-located offices with fully equipped kitchens and bicycle parking facilities
- Excellent working tools such as height-adjustable desks and interactive conference rooms
- Wide selection of varied benefits in a cafeteria plan
- English classes covered by the company
- Provided work equipment such as a MacBook Pro / Air or a Dell with Windows
- Working in a team of top-class specialists
- A high degree of autonomy in organizing your work, with continuous development encouraged
- Hackathons, team tourism, a training budget, and an internal educational platform