Job description:
GCP Data Platform Engineer
Location: Warszawa, ul. Marynarska 12
Contract: B2B contract
Working mode: Hybrid (2-3 days a week in the office)
- Develop reusable frameworks for data processing and testing on GCP (e.g. BigQuery, Dataflow/Dataproc, Composer)
- Support building and maintaining batch and streaming data ingestion pipelines from various sources (databases, Kafka/MQ, APIs, files) into GCP
- Implement automated tests and data quality checks for pipelines
- Collaborate with analysts and data scientists to deliver reliable, well-documented datasets
- Monitor, optimize, and secure data pipelines in line with data governance and compliance standards
What skills will be appreciated?
- 3+ years of experience as a Data Engineer in a data-driven environment
- Experience in large-scale data migration or cloud transformation projects
- Experience with modern data platform patterns, including data lakehouse architectures on GCP (Cloud Storage + BigQuery)
- Hands-on experience with GCP data services (BigQuery, Cloud Storage, Pub/Sub, Dataflow/Dataproc, Composer, Looker, Vertex AI)
- Hands-on experience with infrastructure as code (IaC), e.g. Terraform
- Strong SQL skills and experience with large-scale data processing (Spark is a must; batch and streaming)
- Proficiency in Python, Scala, and/or Java
- Experience with Linux, Docker/Kubernetes, and CI/CD pipelines
- Very good command of English (spoken and written)
- Strong communication skills with the ability to explain complex technical concepts to business stakeholders
Nice to have:
- Degree in Computer Science, Data Science or related field
- Experience with data governance, metadata and data quality tools
- Experience collaborating with business stakeholders
Perks and benefits:
- Sports packages, medical care, life insurance on preferential conditions
- Possibility to participate in development webinars, Festiwal Rozwoju, access to learning platforms
- Free parking spot and gym at the office (Warsaw, Marynarska), work equipment and a phone with an unlimited plan, special offers on T-Mobile products, and discounts from partners
- Company events, internal contests, well-being and volunteering programs, DE&I employee resource groups