Numeus is a diversified digital asset investment firm built to the highest institutional standards, combining synergistic businesses across Alpha Strategies, Trading, and Asset Management.
Numeus was founded by successful executives with decades of experience across the finance, blockchain and technology industries, with a shared passion for digital assets. Our values are grounded in an open approach based on connectivity, collaboration, and partnerships across the digital asset ecosystem. People and technology are at the core of everything we do.
We are seeking a skilled and experienced Data Engineer. This is a high-impact role with the opportunity to contribute meaningfully to the development and management of our data platform. The position requires strong technical capabilities and a proven ability to collaborate effectively with quantitative researchers to process, analyze, and manage the large and diverse corpus of datasets that enables our quantitative investment process.
Key Responsibilities:
- Work closely with quantitative researchers to understand their data requirements.
- Develop high-throughput, near real-time, event-driven datasets using live trade and historical market data.
- Integrate with external APIs and alternative data vendors to ingest structured and unstructured data.
- Develop data solutions that enable the efficient and reliable extraction of insights from captured data.
- Design, develop, and optimize robust data pipelines, enabling ETL processes that support quantitative research, analysis, alpha forecasting, and execution.
- Implement rigorous data quality assurance measures to ensure the accuracy, reliability, and consistency of data inputs and outputs.
Skillset and Qualifications:
- 3+ years' experience in a related field or role. Previous experience at a quantitative hedge fund is highly desirable.
- Bachelor’s degree in Computer Science or a related field
- Proficiency in programming languages such as Python and Rust, along with experience in big data technologies.
- Demonstrated expertise in designing and optimizing data pipelines, data modeling, ETL processes, data warehousing, and data governance
- Strong experience in SQL and modeling relational data. Proficiency using relational and non-relational databases as well as tick data stores.
- Working experience with REST/WS backends and APIs
- Experience building event-driven applications using technologies such as Protobuf, Kafka, and Schema Registry.
- Experience with AWS and its data offerings
- Experience with Linux and Docker
- Proven ability to collaborate effectively with quantitative researchers and other stakeholders, translating their requirements into technical solutions
- Strong analytical and problem-solving skills, capable of analyzing complex datasets, identifying patterns, and extracting meaningful insights
- Highly detail-oriented and meticulous
- Ability to thrive in a fast-paced, dynamic environment, adapting to changing priorities and emerging technologies
- Strong verbal and written communication skills
- Ability to travel periodically between our other offices in NYC and Zug, Switzerland
Are you keen to work in a well-resourced startup environment, where your ideas, experience, and drive to find creative solutions make a difference? We’d like to hear from you.