Data Engineer - Databricks in Heredia at TradeStation

Date Posted: 5/7/2022

Job Description

Data Engineer - Databricks
Heredia, Costa Rica
TradeStation is an online brokerage firm seeking to level the playing field for self-directed investors and traders, empowering them to claim their individual financial edge. At TradeStation, we're continuously pushing the boundaries of what's possible, encouraging out-of-the-box thinking and a relentless pursuit of innovation. We offer a collaborative work environment, competitive salaries, comprehensive benefits, and a generous PTO policy.
The Enterprise Analytics team is building unique AI- and ML-powered solutions to augment our Equity & Crypto Trading platform, empowering institutional and retail investors to gain real-time insights that multiply trading gains while managing risk and improving compliance. We are looking for a Databricks Engineer with a passion for working with big and fast data. You will work in a dynamic, highly challenging, and constantly evolving environment, help evolve our data lakehouse as part of the Enterprise Analytics team, and act as a subject matter expert for workspace administration and PySpark development.
If you are an enthusiastic learner who can think creatively, ask questions, and research solutions, you can bring your passion and skills to bear on interesting problems and have an immediate impact. Join us!

***This position can be fully remote - work from home!
ESSENTIAL JOB FUNCTIONS:
  • Write Spark code to process, transform, and validate the quality of many datasets (see the sketch after this list)
  • Build & maintain Databricks workspace infrastructure on AWS
  • Apply test-driven development to big data processing (illustrated in the test sketch below)
  • Build CI/CD pipelines with GitLab and the Databricks CLI
  • Consume data from streaming as well as batch sources
  • Build necessary guardrails to keep services operational and secure
  • Build templates and tools to accelerate development
  • Work in a DevOps environment, where development teams own both the development and operational responsibilities
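
To give a flavor of the first function above, here is a minimal PySpark sketch of a transform-and-validate job. It is illustrative only: the table names (bronze.trades_raw, silver.trades_clean) and columns (symbol, price, qty, trade_ts) are hypothetical, not TradeStation's actual schema.

```python
# Hypothetical transform-and-validate job; schema and table names are illustrative.
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trades-quality-sketch").getOrCreate()

def clean_trades(raw: DataFrame) -> DataFrame:
    """Normalize a raw trades feed and drop rows that fail basic quality rules."""
    return (
        raw
        .withColumn("symbol", F.upper(F.trim(F.col("symbol"))))
        .withColumn("trade_ts", F.to_timestamp("trade_ts"))
        # Quality guardrails: positive price and quantity, no missing keys.
        .filter(
            (F.col("price") > 0)
            & (F.col("qty") > 0)
            & F.col("symbol").isNotNull()
            & F.col("trade_ts").isNotNull()
        )
    )

# Medallion-style flow: read a bronze Delta table, write the cleaned silver table.
raw = spark.read.table("bronze.trades_raw")
clean_trades(raw).write.format("delta").mode("overwrite").saveAsTable("silver.trades_clean")
```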
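And a matching test-driven development sketch: a pytest test that pins down the behavior of clean_trades before it is wired into a pipeline. The module name trades_quality is hypothetical, and the local SparkSession lets the test run without a cluster.

```python
# TDD sketch for the transform above; the module name is hypothetical.
import pytest
from pyspark.sql import SparkSession

from trades_quality import clean_trades  # hypothetical module holding the sketch above

@pytest.fixture(scope="session")
def spark():
    # Small local session so the suite runs on a laptop or in CI.
    return (
        SparkSession.builder
        .master("local[2]")
        .appName("clean-trades-tests")
        .getOrCreate()
    )

def test_clean_trades_drops_bad_rows(spark):
    raw = spark.createDataFrame(
        [
            (" msft ", 310.25, 100, "2022-05-06 14:30:00"),  # valid, needs trimming
            ("AAPL", -1.0, 50, "2022-05-06 14:31:00"),       # negative price: dropped
            (None, 42.0, 10, "2022-05-06 14:32:00"),         # missing symbol: dropped
        ],
        ["symbol", "price", "qty", "trade_ts"],
    )
    rows = clean_trades(raw).collect()
    assert len(rows) == 1
    assert rows[0]["symbol"] == "MSFT"
```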
KNOWLEDGE, SKILLS & CORE TECHNOLOGIES:
Required:
  • Code development for data workflows on Spark (Python, Scala, and/or R)
  • Big data infrastructure on Apache Spark (e.g. Delta Lake, Databricks, data lakes, data warehouses, data lakehouses)
  • Bash scripts and command-line interfaces (CLIs) for Databricks and AWS
  • Agile environment with DevOps, utilizing CI/CD tools (e.g. GitLab CI, Azure DevOps, Jenkins)
  • Understanding of the Agile SDLC and change management
  • Good oral and written communication skills in English 
Preferred:
  • Brokerage/trading domain knowledge and experience
  • AWS Certified Solutions Architect – Associate
  • Knowledge of and experience with current DevOps trends for “data as a product” (e.g. continuous learning, medallion architecture)
  • Configuration management and deployment automation tools (e.g. CI/CD Pipelines, Octopus, Ansible, Puppet, Chef)
EDUCATION & EXPERIENCE:
  • Bachelor’s Degree in Computer Science/Engineering or equivalent work experience
  • 2+ years of experience as a Data Engineer