Employment Type: Full-Time
Location: Work From Home, Heredia Province (Remote)

Sr. Data Engineer
Who We Are:
TradeStation is the home of those born to trade. As an online brokerage firm and trading ecosystem, we are focused on delivering the ultimate trading experience for active traders and institutions. We continuously push the boundaries of what's possible, encourage out-of-the-box thinking, and relentlessly search for like-minded innovators.
At TradeStation, we are building an AI-First culture. We expect team members to embrace AI as a core part of their daily workflow, whether that’s using AI to accelerate development, enhance decision-making, improve client outcomes, or streamline internal processes. We hire, grow, and promote people who can harness AI responsibly and creatively. We treat AI as a partner in problem-solving, not just a tool, and we follow our governance standards to ensure AI is used ethically, securely, and transparently. If you join us, you’re joining a culture where AI is how we work.
Are you ready to make yourself at home?
What We Are Looking For:
We are seeking a Senior Data Engineer to join the Enterprise Analytics team. The team helps TradeStation extract valuable business insights from raw data scattered across dozens of silos and teams. Our Data Engineers ensure that our data is fresh, accurate, and meaningful to our business stakeholders.
The ideal candidate is a lifelong learner and self-starter who is constantly looking for new ways to solve old problems. Working with big data, you will find creative solutions that improve our efficiency, speed, and cost, and help us deliver them one Agile sprint at a time. This position will be instrumental in migrating our legacy data pipelines to our new lakehouse architecture.
What You’ll Be Doing:
- Help build and maintain a modern data lakehouse
- Design and manage the build of scalable data pipelines using Databricks (Delta Live Tables, PySpark, SQL, Spark Streaming)
- Manage the build of robust ETL/ELT workflows to ingest data from diverse sources (APIs, event streams, databases, SaaS systems)
- Create data pipelines using DevOps practices to ensure code is continuously tested, performant, and delivered
- Write Python, Spark, and SQL solutions for data transformation, applying test-driven development principles and automated data quality controls
- Process structured and semi-structured data from SQL databases, JSON, Parquet, and CSV files, and REST APIs (see the ingestion sketch after this list)
- Design and build data pipelines and engineering solutions with scalability and efficiency in mind
- Collaborate on data-driven initiatives, ensuring strong data pipelines and integration for future AI opportunities
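For a concrete picture of the day-to-day, here is a minimal sketch of a batch ingest-and-transform step, assuming a Databricks-style environment with PySpark and Delta Lake available; the landing path and the bronze.trades table name are hypothetical:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trades-ingest").getOrCreate()

# Ingest semi-structured JSON from a landing zone (hypothetical path).
raw = spark.read.json("s3://example-bucket/landing/trades/")

# Automated data quality controls: drop rows missing the key,
# then stamp each record with an ingestion timestamp.
clean = (
    raw.dropna(subset=["trade_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Append to a Delta table so downstream consumers see fresh, ACID-safe data.
clean.write.format("delta").mode("append").saveAsTable("bronze.trades")

In production, a step like this would typically run as a scheduled or streaming Databricks job and be promoted through the CI/CD practices described above.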
The Skills You Bring:
- Excellent communication skills with English at B2+ level
- Experience with Python, Pandas, PySpark, MLflow, Delta Lake, and SQL
- Experience with Amazon Web Services and/or Azure clouds
- Experience with Databricks (see the Delta Live Tables sketch after this list)
- Experience with CI/CD, Git, Azure DevOps/Jenkins/GitLab, and Kubernetes
- Ability to design efficient and scalable data processing systems and pipelines on Databricks, APIs, and AWS services
- Ability to work with AI assistants such as Copilot and the Databricks Assistant
- Ability to build the infrastructure required for optimal extraction, transformation, and data loading from various sources
- Experience coordinating between multiple teams, such as Architects, Business Analysts, Scrum Masters, and Developers, to get technical clarity leading to the design, development, and implementation of business solutions
- Exposure to agentic workflow tools for orchestrating AI-driven processes preferred
- Strong experience with cloud computing platforms such as AWS, Azure, or GCP preferred
- Experience with Unity Catalog, cluster policies, and advanced security configurations preferred
- Familiarity with AI concepts and their application in data engineering, such as enabling intelligent data processing and preparing data for future AI-driven analytics preferred
- Experience with data pipeline and workflow management tools preferred
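As one illustration of the Databricks skills above, a declarative Delta Live Tables pipeline might look like the following sketch; the table names and landing path are hypothetical, and the dlt module and spark session are supplied by the Databricks DLT runtime:

import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw trade events loaded as-is from the landing zone.")
def bronze_trades():
    # Hypothetical landing path; 'spark' is provided by the DLT runtime.
    return spark.read.json("/mnt/landing/trades/")

@dlt.table(comment="Silver: validated trades with an audit timestamp.")
@dlt.expect_or_drop("valid_trade_id", "trade_id IS NOT NULL")
def silver_trades():
    return dlt.read("bronze_trades").withColumn(
        "processed_at", F.current_timestamp()
    )

The expectation decorator enforces a data quality rule declaratively, dropping records that fail it, which is the kind of automated quality control this role calls for.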
Minimum Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- 5+ years of experience in data engineering, big data, and data warehousing
Desired Qualifications:
- AWS Certified
What We Offer:
- Collaborative work environment
- Competitive Salaries
- Yearly bonus
- Comprehensive benefits for you and your family starting Day 1
- Unlimited Paid Time Off
- Flexible working environment
- TradeStation Account employee benefits, as well as full access to trading education materials
#LI-Remote