As a GCP Data Engineer at Caserta, you will work in teams to deliver innovative solutions on Google Cloud using cloud data warehouse tools, Spark, event-streaming platforms, and other big data technologies. In addition to building the next generation of data platforms, you will work with some of the most forward-thinking organizations in data and analytics.
- Build and deploy data pipelines on Google Cloud to enable AI and ML capabilities.
- Drive the development of cloud-based and hybrid data warehouses & business intelligence platforms.
- Build data pipelines to ingest structured and unstructured data.
- Gain hands-on experience with new data platforms and programming languages.
- 5+ years of experience consulting in Data Engineering or Data Warehousing.
- Hands-on experience with Google Cloud Platform.
- Experience leading data warehousing, data ingestion, and data profiling activities.
- Advanced SQL & Python skills.
- Hands-on experience with Google Cloud Platform technologies: Pub/Sub, Cloud Functions, Dataflow, Dataproc (Hadoop, Spark, Hive), Cloud Machine Learning Engine, Cloud Datastore, Bigtable, BigQuery, Datalab, and Data Studio.
- Experience migrating data pipelines to Google Cloud Platform (GCP).
- Strong aptitude for learning new technologies and analytics techniques.
- Highly self-motivated and able to work independently as well as in a team environment.