
GCP Data Engineer

V2Soft Inc

Dearborn, MI
Full Time
Senior

Job Description

About the Role

V2Soft is a global leader in IT services and business solutions, delivering innovative and cost-effective technology solutions worldwide since 1998. We are headquartered in Bloomfield Hills, MI, with 16 offices spread across six countries. We partner with Fortune 500 companies to address complex business challenges. Our services span AI, IT staffing, cloud computing, engineering, mobility, testing, and more. Certified at CMMI Level 3 and to ISO standards, V2Soft is committed to quality and security. Beyond our work, we actively support local communities and non-profits, reflecting our core values. Join us to be part of a dynamic and impactful global company!

Key Responsibilities

  • Work in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
  • Implement methods for automation of all parts of the pipeline to minimize labor in development and production.
  • Analyze complex data, organize raw data, and integrate massive datasets from multiple data sources to build subject areas and reusable data products.
  • Collaborate with architects to evaluate and productionize appropriate GCP tools for data ingestion, integration, presentation, and reporting.
  • Work with stakeholders to formulate business problems as technical data requirements, identify and implement technical solutions, and ensure key business drivers are captured.
  • Design and deploy pipelines with automated data lineage, and develop, evaluate, and summarize Proof of Concepts to validate solutions.
  • Test and compare competing solutions and report on the best options.
  • Design and build production data engineering solutions to deliver pipeline patterns using GCP services such as BigQuery, DataFlow, Pub/Sub, BigTable, Data Fusion, DataProc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.

Requirements

  • Experience working in an implementation team from concept to operations, providing deep technical expertise.
  • Experience in analyzing complex data, organizing raw data, and integrating datasets from multiple sources.
  • Experience working with architects to evaluate and productionize GCP tools for data ingestion, integration, and reporting.
  • Experience in formulating business problems as technical data requirements and implementing solutions.
  • Experience designing and deploying data pipelines with automated data lineage.
  • Proficiency with GCP services such as BigQuery, DataFlow, Pub/Sub, BigTable, Data Fusion, DataProc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
  • In-depth understanding of Google's product technology or other cloud platforms.
  • 5+ years of analytics application development experience.
  • 5+ years of SQL development experience.
  • 3+ years of Cloud experience (GCP preferred) with solutions designed and implemented at production scale.
  • 2+ years of professional development experience in Java or Python, and Apache Beam.
  • Experience with extracting, loading, transforming, cleaning, and validating data.
  • Experience designing data processing pipelines and architectures.
  • 1+ year of designing and building CI/CD pipelines.

Nice to Have

  • Experience building Machine Learning solutions using TensorFlow, BigQueryML, AutoML, Vertex AI.
  • Experience in building solution architecture, provisioning infrastructure, and securing data-centric services in GCP.
  • Experience with DataPlex or Informatica EDC.
  • Experience with development ecosystems such as Git, Jenkins, and CI/CD.
  • Experience working with DBT/Dataform.
  • Experience working with Agile and Lean methodologies.
  • Performance tuning experience.
  • GCP Professional Data Engineer Certification.
  • Master's degree in computer science or related field.
  • 2+ years of experience mentoring engineers.

Qualifications

  • Bachelor's degree in computer science or related scientific field.
  • IT or related certifications covering topics such as data architecture, data centers, data integrity, data management, data science, data warehousing, SQL, Sybase, and Teradata.

Working at V2Soft Inc

V2Soft actively supports local communities and non-profits, reflecting its core values. The company emphasizes quality, security, innovation, and a collaborative work environment, encouraging continuous learning and professional growth.


Job Details

Posted At: Jun 29, 2025
Job Category: Data Engineering
Salary: Competitive salary
Job Type: Full Time
Experience: Senior


About V2Soft Inc

Website: v2soft.com
Company Size: 1001-5000 employees
Location: Dearborn, MI
Industry: Computer Systems Design and Related Services
