Software Engineer Intern - Data Engineering, Summer 2024
Akuna Capital is an innovative trading firm with a strong focus on collaboration, cutting-edge technology, data-driven solutions and automation. We specialize in providing liquidity as an options market-maker – meaning we are committed to providing competitive quotes at which we are willing to both buy and sell. To do this successfully, we design and implement our own low-latency technologies, trading strategies and mathematical models.
Our Founding Partners first conceptualized Akuna in their hometown of Sydney. They opened the firm’s first office in 2011 in the heart of the derivatives industry and the options capital of the world – Chicago. Today, Akuna is proud to operate from additional offices in Sydney, Shanghai and London.
What you’ll do as a Software Engineer Intern on the Data Platform Team at Akuna:
We are seeking Software Engineer Interns to join our innovative and growing technology team for our 10-week internship, Akunacademy, which will take place in our Chicago headquarters from June - August 2024. In this role you will work alongside our trading and software teams to take our data platform to the next level. At Akuna, we believe that our data provides a key competitive advantage and is a critical part of the success of our business.
Our Data team is composed of world-class talent and has been entrusted with the responsibility of building and maintaining our data pipelines as well as “providing simple access to clean data”. Our platform starts with gathering data from disparate sources across the globe and ends with our users across Trading, Research and Tech accessing complex datasets for a wide range of streaming and batch use cases. Along the way, we build tools to access, validate and monitor the data in efficient and intuitive ways. At each step of our pipeline, we choose the best tools and technologies available. In this role you will:
- Work closely with Engineers, Quants, Traders and Support personnel to create and maintain datasets that play a key role in our business
- Design and build Scala/Python applications that provide performant access to our data
- Monitor and validate our data pipelines by building monitoring and analysis tools
- Gain exposure to the financial markets and trading through the development and use of our datasets
- Evaluate and select open source or proprietary tools required to meet our data requirements
- Design and develop tooling that enables users to answer key research questions and solve data problems
- Collaborate with many teams on data intensive projects such as gathering requirements from stakeholders for a new tool or platform, advising and assisting users in integrating existing tooling and patterns into their applications or research, or consulting with a trader or researcher to help them build a new dataset or pipeline
Qualities that make great candidates:
- Pursuing a Bachelor's, Master's or PhD in a technical field – Computer Science/Engineering, Math, Physics or equivalent
- Graduating by August 2025
- Strong knowledge of computer science fundamentals
- Clear understanding of both object-oriented and functional programming paradigms
- Ability to learn about technical systems and concepts quickly
- Willingness to ask good questions, identify black boxes, and learn about new topics
- Desire to take ownership of small-scale projects
- Familiarity with Docker, Kubernetes, Kafka, Flink/Spark/Kafka-Streams, and/or AWS Cloud technologies is a plus
- Ability to communicate complex technical topics in a clear and concise way
- Passionate, pragmatic problem solver able to independently pursue solutions to complex problems
- Understanding of the core concepts of distributed computing, database design, and data storage
- Legal authorization to work in the U.S. is required on the first day of employment, including F-1 students using CPT, OPT or STEM OPT
*Please note: If you have applied for multiple roles, you will be asked to complete multiple coding challenges and interviews.
**Resumes must be submitted in PDF format.