The GCP Data Engineer will create, deliver, and support custom data products, as well as enhance and expand team capabilities. They will analyze and manipulate large enterprise datasets, activating data assets to support Enabling Platforms and analytics. Google Cloud Data Engineers will be responsible for designing data transformation and modernization on Google Cloud Platform.
Responsibilities:
- Build data systems and pipelines on cloud providers (GCP preferable);
- Build algorithms and prototypes (geospatial models are a plus);
- Implement tasks for Apache Airflow;
- Support and organize data in a data warehouse (Snowflake/BigQuery);
- Develop efficient ETL/ELT pipelines.
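The pipeline work above can be sketched as a minimal extract/transform/load flow in plain Python (all function and field names here are illustrative, not part of the job description; in the role, steps like these would typically run as Airflow tasks loading into BigQuery or Snowflake):

```python
# Minimal illustrative ETL sketch (hypothetical names, not from the posting).

def extract(rows):
    # Extract: keep only records that carry the fields we need downstream.
    return [r for r in rows if "id" in r and "value" in r]

def transform(rows):
    # Transform: normalize values and drop obviously bad records.
    return [{"id": r["id"], "value": float(r["value"])}
            for r in rows if r["value"] is not None]

def load(rows, sink):
    # Load: append cleaned rows to the destination (a list stands in
    # for a warehouse table here).
    sink.extend(rows)
    return len(rows)

raw = [{"id": 1, "value": "3.5"}, {"id": 2, "value": None}, {"other": "x"}]
table = []
loaded = load(transform(extract(raw)), table)
```

In practice each step would be a separate, idempotent Airflow task so failed runs can be retried without duplicating loads.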
Experience required:
- 5+ years of application development experience required;
- 1-2 years of GCP experience, including GCP-based big data deployments (batch/real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, and Airflow;
- 2+ years of coding experience in Java/Python;
- Experience working with data teams to analyze data, build models, and integrate massive datasets from multiple sources for data modeling;
- Experience extracting, loading, transforming, cleaning, and validating data, as well as designing pipelines and architectures for data processing;
- Experience architecting and implementing next-generation data and analytics platforms on GCP;
- Experience in working with Agile and Lean methodologies;
- Experience working with either a MapReduce or an MPP system at any scale;
- Experience with Geographic Information Systems (GIS), geoanalytics, geospatial analysis, ArcGIS, Carto, Unfolded, or H3 is a plus.
Work Conditions:
- Remote work;
- Official employment;
- Official salary paid on time twice a month;
- Career growth;
- A nice bonus: a thirteenth salary;
- Paid vacation (27 calendar days), sick leave;
- A birthday gift from the company;
- Company-paid corporate events and team-building activities;
- Work in a friendly, young team with interesting and ambitious tasks.