Data Tooling Developer

Chicago

About Akuna: 

Akuna Capital is an innovative trading firm with a strong focus on collaboration, cutting-edge technology, data-driven solutions, and automation. We specialize in providing liquidity as an options market-maker, meaning we are committed to providing competitive quotes at which we are willing to both buy and sell. To do this successfully, we design and implement our own low-latency technologies, trading strategies, and mathematical models. 

Our Founding Partners first conceptualized Akuna in their hometown of Sydney. They opened the firm’s first office in 2011 in the heart of the derivatives industry and the options capital of the world – Chicago. Today, Akuna is proud to operate from additional offices in Sydney, Shanghai, London, Boston, and Austin. 

What you’ll do as a Data Tooling Developer on the Data Engineering team at Akuna:

We are a data-driven organization and are seeking Developers to take our data platform to the next level. At Akuna, we believe that our data provides a key competitive advantage and is critical to the success of our business. The goal of the Data Team is to “provide simple access to clean data,” and the Data Tooling team is primarily focused on the “simple access” portion. Our platform starts with gathering data from disparate sources across the globe and ends with our users in trading, research, and development accessing complex datasets for a wide range of streaming and batch use cases. In this role, you will:

  • Work with developers, researchers, and traders to design and develop tooling that enables users to answer key research questions and solve data problems
  • Collaborate with many teams on data-intensive projects, whether gathering requirements from stakeholders for a new tool or platform, advising and assisting users in integrating existing tooling and patterns into their applications or research, or consulting with a trader or researcher to help them build a new dataset or pipeline
  • Work with the Data Infrastructure team to provide users with simple access to the data platform

Qualities that make great candidates:

  • Strong knowledge of computer science fundamentals
  • Strong understanding of both object-oriented and functional programming paradigms
  • Familiarity with Python
  • Expertise with common data tools and platforms such as Kafka, Spark, Flink, and a variety of databases
  • Strong system design skills
  • Experience building ETL pipelines and familiarity with both streaming and batch approaches
  • Experience doing user research, understanding stakeholder needs, and building products
  • Capable of doing thorough research and analysis of a problem space to help the team make informed design decisions
  • Comfortable mentoring more junior team members