
Databricks Developer

Tritech Enterprise Systems

Lanham, MD
Full Time
Senior

Job Description

About the Role

TriTech Enterprise Systems is seeking an experienced Databricks Developer for a long-term, fully remote position. The role involves designing, developing, and optimizing big data pipelines and analytics solutions on the Databricks platform, with a focus on IRS data systems such as IRMF, BMF, and IMF. Candidates must have deep expertise in Java and Apache Spark, hands-on experience with IRS data systems, and an active IRS MBI clearance.

Key Responsibilities

  • Design, develop, and maintain scalable data pipelines using Apache Spark on Databricks (a brief illustrative sketch follows this list)
  • Implement data processing logic in Java 8+, leveraging functional programming and OOP best practices
  • Integrate with IRS data systems including IRMF, BMF, or IMF
  • Optimize Spark jobs for performance, reliability, and cost-efficiency
  • Collaborate with cross-functional teams to gather requirements and deliver data solutions
  • Ensure compliance with data security, privacy, and governance standards
  • Troubleshoot and debug production issues in distributed data environments
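For illustration only, here is a minimal sketch of the kind of batch pipeline described above, written against the Spark Java Dataset/DataFrame API. The input and output paths, column names, and application name are hypothetical placeholders, not details of this role.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;

// Minimal sketch: read, filter, aggregate, and write Parquet data.
// All paths and column names below are invented for illustration.
public class ExamplePipeline {
    public static void main(String[] args) {
        // On Databricks, getOrCreate() picks up the cluster's existing session.
        SparkSession spark = SparkSession.builder()
                .appName("ExamplePipeline")
                .getOrCreate();

        // Read a hypothetical Parquet input dataset.
        Dataset<Row> raw = spark.read().parquet("/mnt/example/input");

        // Simple filter and aggregation using the DataFrame API.
        Dataset<Row> summary = raw
                .filter(col("tax_year").equalTo(2024))
                .groupBy(col("filing_status"))
                .count();

        // Write the result back out as Parquet.
        summary.write().mode("overwrite").parquet("/mnt/example/output");

        spark.stop();
    }
}

On Databricks, a class like this would typically be packaged as a JAR and attached to a job or cluster; the exact deployment and scheduling approach depends on the team's setup.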

Requirements

  • Active IRS MBI clearance required; an IRS-issued laptop is strongly preferred
  • Bachelor's degree in Computer Science, Information Systems, or a related field
  • 8+ years of professional experience demonstrating expertise with IRS Data Systems
  • Hands-on experience working with IRS IRMF, BMF, or IMF datasets
  • Strong expertise in Java 8 or higher
  • Experience with functional programming (Streams API, Lambdas)
  • Familiarity with object-oriented design patterns and best practices
  • Proficient in Spark Core, Spark SQL, and DataFrame/Dataset APIs
  • Understanding of RDDs and when to use them
  • Experience with Spark Streaming or Structured Streaming
  • Familiarity with HDFS, Hive, or HBase
  • Experience integrating with Kafka, S3, or Azure Data Lake
  • Knowledge of Parquet, Avro, or ORC file formats
  • Experience building ETL pipelines with Spark
  • Experience with YARN, Kubernetes, or EMR for Spark deployment
  • Familiarity with CI/CD tools like Jenkins or GitHub Actions
  • Proficiency in Git, Maven, or Gradle
  • Experience with unit testing frameworks like JUnit or TestNG (see the test sketch after this list)
  • Strong documentation skills and ability to troubleshoot production issues
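As a generic illustration of the Java 8 Streams, lambda, and JUnit items above, the following self-contained sketch pairs a small functional-style transformation with a JUnit 5 test. The class, method, and sample values are invented for this example and are not taken from the posting.

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Illustrative only: a tiny Streams/lambda transformation plus a unit test.
class RecordNormalizerTest {

    // Trim whitespace, drop blank entries, and upper-case the remaining codes.
    static List<String> normalizeCodes(List<String> codes) {
        return codes.stream()
                .map(String::trim)
                .filter(s -> !s.isEmpty())
                .map(String::toUpperCase)
                .collect(Collectors.toList());
    }

    @Test
    void normalizesAndFiltersCodes() {
        List<String> input = Arrays.asList(" bmf ", "", "imf");
        assertEquals(Arrays.asList("BMF", "IMF"), normalizeCodes(input));
    }
}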

Nice to Have

  • Experience with Scala or Python in Spark environments
  • Familiarity with Databricks or Google DataProc
  • Knowledge of Delta Lake or Apache Iceberg (see the sketch after this list)
  • Data modeling and performance design for big data systems
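For the Delta Lake item above, a minimal sketch, assuming a Spark cluster with the delta-spark package available and using made-up table paths, might look like the following.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Sketch only: Delta Lake is exposed to Java as another DataFrame source/sink format.
// Paths are placeholders; delta-spark must be on the cluster's classpath.
public class DeltaExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("DeltaExample")
                .getOrCreate();

        Dataset<Row> df = spark.read().parquet("/mnt/example/input");

        // Write the data as a Delta table, then read it back.
        df.write().format("delta").mode("overwrite").save("/mnt/example/delta-table");

        Dataset<Row> reloaded = spark.read().format("delta").load("/mnt/example/delta-table");
        reloaded.show(5);
    }
}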

Job Details

Posted At: Jul 16, 2025
Job Category: Data Engineering
Salary: Competitive salary
Job Type: Full Time
Work Mode: Remote
Experience: Senior

About Tritech Enterprise Systems

Website: tritechenterprise.com
Location: Lanham, MD
Industry: Computer Systems Design Services
