Offer description:
GCP Data Platform Engineer position in the Automation & Innovation Department in Warsaw. The role involves developing reusable frameworks for data processing, building data ingestion pipelines, implementing automated tests, collaborating with analysts and data scientists, and monitoring and optimizing data pipelines in GCP.
Requirements:
- 3+ years of experience as a Data Engineer
- Experience in large-scale data migration or cloud transformation projects
- Experience with GCP data services (BigQuery, Cloud Storage, Pub/Sub, Dataflow/Dataproc, Composer, Looker, Vertex AI)
- Hands-on experience with Infrastructure-as-Code (IaC) tools like Terraform
- Strong SQL skills and experience with large-scale data processing
- Proficiency in Python and/or Scala or Java
- Experience with Linux, Docker/Kubernetes and CI/CD pipelines
- Good command of English
- Strong communication skills
Perks and benefits:
- Sports packages, medical care, and life insurance on preferential terms
- Opportunity to take part in development webinars and Festiwal Rozwoju, plus access to learning platforms
- Free parking spot, gym at the office, work equipment, and a phone with an unlimited plan
- Special offers and discounts
- Company events, internal contests, well-being and volunteering programs