Sr. Data Engineer in Heredia at TradeStation

Date Posted: 6/25/2024

Job Description

Who We Are:
TradeStation is an online brokerage firm seeking to level the playing field for self-directed investors and traders, empowering them to claim their individual financial edge.  At TradeStation, we're continuously pushing the boundaries of what's possible, encouraging out-of-the-box thinking and a relentless search for innovation.

What We Are Looking For:
We are seeking a Senior Data Engineer to join the Enterprise Analytics team. The team helps TradeStation extract valuable business insights from raw data scattered across dozens of silos and teams.  Our Data Engineers ensure that our data is fresh, accurate, and meaningful to our business stakeholders.

The ideal candidate is a lifelong learner and self-starter who is constantly looking for new ways to solve old problems. Working with big data, you will find creative solutions that improve our efficiency and speed, reduce our costs, and can be delivered one Agile sprint at a time. This position will be instrumental in migrating from legacy data pipelines to our new lakehouse architecture.

What You'll Be Doing:
  • Help us build a modern data lakehouse
  • Create data pipelines using DevOps practices to ensure code is constantly tested, performant, and delivered
  • Migrate old SSIS/SSRS-based ETL jobs to our new Databricks/ADF-based ELT architecture
  • Work independently and with business stakeholders to understand their requirements and drive business outcomes
  • Write Python and SQL solutions for data transformation, applying test-driven development principles and automated data quality controls
  • Process structured and unstructured data from SQL databases; JSON, Parquet, and CSV files; or REST APIs
  • Provide engineering solutions, design, and build data pipelines while considering scalability and efficiency
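The day-to-day work above pairs Python-based transformation with automated data quality controls. As a minimal illustrative sketch only (hypothetical field names, plain-Python stand-in for the PySpark/Databricks tooling the role actually uses), a quality gate on an ingested CSV might look like:

```python
import csv
import io

# Hypothetical sample data standing in for one of many source silos.
RAW_CSV = """trade_id,symbol,qty
1,AAPL,100
2,MSFT,
3,GOOG,50
"""

def load_trades(text):
    """Parse CSV rows into dicts, coercing qty to int where present."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        row["qty"] = int(row["qty"]) if row["qty"] else None
        rows.append(row)
    return rows

def quality_check(rows):
    """Automated data quality control: pass rows with a positive
    quantity, flag the rest for review instead of loading them."""
    good = [r for r in rows if r["qty"] is not None and r["qty"] > 0]
    bad = [r for r in rows if r not in good]
    return good, bad

good, bad = quality_check(load_trades(RAW_CSV))
print(len(good), len(bad))  # 2 valid rows, 1 flagged
```

In production such checks would typically run inside a tested pipeline stage (e.g. a PySpark job in Databricks) rather than plain Python, so that failing rows are quarantined before they reach business stakeholders.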
The Skills You Bring:
  • Excellent communication skills with English at B2+ level
  • Experience with Python, pandas, PySpark, MLflow, and Delta Lake
  • Experience with Amazon Web Services and/or Azure clouds
  • Experience with SSIS, SSRS, and Azure Data Factory
  • Experience with Databricks and Synapse
  • Experience with Git, Azure DevOps/Jenkins/Gitlab, Kubernetes
  • Ability to design efficient and scalable data processing systems and pipelines on Databricks, APIs, and AWS services
  • Ability to build the infrastructure required for optimal extraction, transformation, and data loading from various sources
Minimum Qualifications:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 2+ years of experience in data engineering, big data, and data warehousing
  • Experience coordinating across multiple roles, such as architects, business analysts, Scrum Masters, and developers, to reach the technical clarity needed to design, develop, and implement business solutions
Desired Qualifications:
  • Proficient in Python and SQL
  • Strong experience with cloud computing platforms such as AWS and Azure
  • Knowledge of data storage and data processing
  • Experience with data pipeline and workflow management tools
  • AWS Certified Cloud Practitioner
  • Databricks Data Engineering Professional
What We Offer:
  • Collaborative work environment
  • Competitive Salaries
  • Yearly bonus
  • Comprehensive benefits for you and your family starting Day 1
  • Unlimited Paid Time Off
  • Flexible working environment
  • TradeStation Account employee benefits, as well as full access to trading education materials
