Fusion HCR is Hiring! We are looking for Data Engineers or Data Scientists who have hands-on experience with BigQuery and/or Google Cloud Platform (GCP). This is a full-time role within a Fortune 500 business offering a leading 401(k) match, a profit-sharing retirement program, PTO, and medical/dental/vision coverage.
Key Responsibilities
Design, develop, and optimize robust data pipelines in BigQuery and Google Cloud Platform (GCP)
Extract and transform data from ERP systems (SAP)
Implement ELT/ETL workflows using Dataflow, dbt, Airflow, or similar tools
Perform data modeling and schema design optimized for reporting and self-service analytics
Ensure quality, consistency, and security of SAP and non-SAP data across the pipeline
Partner with analysts and business users to make SAP data more accessible and actionable
Monitor and optimize BigQuery performance (storage, compute, cost)
Create and maintain technical documentation and support data governance initiatives
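To give a flavor of the pipeline work above, here is a minimal, illustrative sketch of one ELT transform step: normalizing raw SAP-style records before loading them into BigQuery. The field names (MATNR, WERKS, ERSDA, NTGEW) and the function itself are hypothetical examples for this posting, not part of any specific codebase.

```python
# Illustrative transform step for an ELT pipeline: map raw ERP column names
# to analytics-friendly ones and coerce types. Hypothetical example only.
from datetime import date


def normalize_material_record(raw: dict) -> dict:
    """Normalize one raw SAP-style material record for loading."""
    return {
        "material_id": raw["MATNR"].lstrip("0"),         # strip SAP zero-padding
        "plant": raw["WERKS"],
        "created_on": date.fromisoformat(raw["ERSDA"]),  # ISO date string -> date
        "net_weight_kg": float(raw["NTGEW"]),            # numeric string -> float
    }


record = normalize_material_record({
    "MATNR": "000000000000012345",
    "WERKS": "1000",
    "ERSDA": "2024-05-01",
    "NTGEW": "12.5",
})
print(record["material_id"])  # "12345"
```

In practice a step like this would run inside a Dataflow/Beam transform or an Airflow task, with the normalized rows written to a partitioned BigQuery table.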
Required Qualifications
4–7 years of professional experience in Data Engineering, with recent work in BigQuery and GCP
Hands-on experience integrating data from ERP environments
Advanced SQL skills and experience writing complex queries in BigQuery
Proficiency in Python or Java for data processing and scripting
Experience with ETL/ELT tools such as Apache Beam, dbt, Airflow, or Dataflow
Strong understanding of data warehousing, data structures, and dimensional modeling
Ability to collaborate effectively with both technical and non-technical teams
Preferred Qualifications
Knowledge of SAP (ECC, S/4HANA, BW, SAP Data Services)