At Voloridge Investment Management, our quantitative systems depend on vast quantities of data. The Analytic Engineer must understand the many different and evolving use cases for data at Voloridge and design systems that supply high-quality data sets for advanced analytics.
Summary of Job Functions
- Collaborate effectively with stakeholders, including Traders, Project Managers, Software Engineers, Data Architects, Data Analysts, QA Analysts, Database Administrators, and Data Engineers.
- Build and maintain data pipelines based on functional and architectural specifications.
- Ensure that data pipelines incorporate best practices for high performance, fault tolerance, instrumentation, logging, and data-driven functionality.
- Ensure that data pipelines are scalable, maintainable, and not over-engineered.
- Produce and maintain engineering and operational documentation.
- Analyze complex data problems and engineer elegant solutions.
- Manage data sets in Data Marts for Business Unit analysts.
- Work in an Agile environment.
- Contribute to the evolution of Voloridge’s engineering standards.
- Participate in an on-call rotation with other Engineers.
- Lead investigations to troubleshoot data issues that arise along the data pipelines.
Required Skills and Previous Experience
- 0–5 years of experience building ETL/ELT pipelines using transactional databases, external data sources, and data warehouses with large data volumes.
- Experience building ETL/ELT pipelines against data warehouse entities such as slowly changing dimensions (SCDs) and fact tables.
- Experience with SQL Server 2016+ and SSMS to create and maintain SQL Server tables, views, functions, and stored procedures.
- Experience with T-SQL performance tuning, execution plan analysis, blocking/deadlock analysis, and index optimization.
- Strong initiative, collaboration, accountability, impartiality, and communication.
- Strong analytical skills, a real passion for working with data, and a strong interest in solving problems.
- Experience building automated processes.
- Experience with Tableau and/or other data visualization tools (Looker, Periscope, Power BI).
- Bachelor’s degree in Computer Science, Information Systems, Mathematics, Statistics or related disciplines, or equivalent experience.
- Ability to work daily onsite in our Jupiter, FL office.
Preferred Skills and Previous Experience
- Python programming using the standard library and libraries such as pandas/NumPy, SQLAlchemy, and scikit-learn.
- Experience with AWS.
- Experience working with REST and RPC APIs, including helping to design and/or build APIs, or the desire to learn and use these technologies.
- Skilled in developing automated testing, code quality, and engineering best practices for data services.
- Experience with exploratory data analysis.
- Experience working within segregated DEV, QA, UAT and Production SDLC stages.
- Understanding of traditional Agile SDLC best practices and experience working on an Agile team.
- Experience working with trading/financial/investment/accounting data; CFA, FRM, or CAIA a major plus.
- Experience with tools such as RedGate, Grafana, Prometheus, Loki, and OpsGenie.
- MS in Computer Science, Information Systems, Mathematics, Statistics or related disciplines.
Compensation and Benefits
- Relocation assistance available for the right candidate
- Highly competitive base salary
- Profit sharing bonus
- Health, dental, vision, life, and disability insurance