About Pathway
Deeptech start-up, founded in March 2020.
- Our primary developer offering is an ultra-performant Data Processing Framework (unified streaming + batch) with a Python API, distributed Rust engine, and capabilities for data source integration & transformation at scale (Kafka, S3, databases/CDC,…).
- The single-machine version is provided on a free-to-use license (`pip install pathway`).
- Major data use cases are around event-stream data (including real-world data such as IoT), and graph data that changes over time.
- Our enterprise offering is currently used by leaders of the logistics industry, such as DB Schenker or La Poste, and is being tested across multiple industries. Pathway has been featured in Gartner’s Market Guide for Event Stream Processing.
- Learn more at http://pathway.com/ and https://github.com/pathwaycom/.
Pathway is VC-funded, backed by outstanding business angels from the AI space and industry. We have operations across Europe and in the US. We are headquartered in Paris, with significant support from the French ecosystem (BPI, Agoranov, WILCO,…).
The Team
Pathway is built by and for overachievers. Its co-founders and employees have worked in the best AI labs in the world (Microsoft Research, Google Brain, ETH Zurich), worked at Google, and graduated from top universities (Polytechnique, ENSAE, Sciences Po, HEC Paris, a PhD obtained at the age of 20, etc.). Pathway’s CTO is a co-author with Geoff Hinton and Yoshua Bengio. The management team also includes the co-founder of Spoj.com (1M+ developer users) and NK.pl (13.5M+ users), and an experienced growth leader who has scaled companies through multiple exits.
The Opportunity
We are currently searching for 1 or 2 R&D Engineers with a strong track record in designing efficient data storage systems and/or algorithms, to help develop and improve the core data processing components of our solution.
TL;DR: If you like Rust and hearing “distributed Van Emde Boas tree” would put a smile on your face, this is the job for you.
You Will
- create code to optimize index-like data structures used in our core data processing and data storage components.
- help to design algorithms and data structures which work in a dynamic distributed manner.
- contribute to other aspects of the system, for example those related to persistence.
The results of your work will play a crucial role in building and optimizing both the foundations of our data processing product, and its core algorithms library.
Requirements
Cover letter
It’s always a pleasure to say hi! If you could leave us 2-3 lines, we’d really appreciate that.
To ensure coherence of our team, you are expected to meet at least 2 of the 6 criteria below:
- You got mostly A-grades in a university Computer Science Bachelor/Master program which included: Algorithms, Formal Methods, Graph Theory, Calculus, Probability, and an introduction to Distributed Systems and Parallel Programming.
- You spent at least 6 months working for a FAANG company (Facebook, Amazon, Apple, Netflix, Google).
- You previously exited from a start-up very, very successfully.
- You were an ICPC World Finalist or in a top-10 team in your Region.
- You were an IOI, IMO, or IPhO medalist in High School.
- You were a top-10 finisher in a major Kaggle contest.
You Are
- Ready for hands-on contribution to the product.
- Curious at heart and thrilled to work on data processing challenges encountered by developers in different organizations.
- Proficient in Rust.
- Familiar with the intricacies of multi-threaded and distributed systems.
- Deeply knowledgeable about graph algorithms.
- Proficient in Python.
- Familiar with SQL.
- Experienced in software development (at least 2 years, either in industry or as a contributor to major open-source or research projects).
- Comfortable with basic statistical concepts.
- At least somewhat familiar with git, build systems, and CI/CD.
- Respectful of others.
- Fluent in English.
Bonus Points
- Industry experience in data store / DBMS optimization or designing distributed algorithms.
- Successful track record in algorithms contests.
- A portfolio to show: code on GitHub, a research paper in Algorithms or the foundations of Machine Learning,…
- Familiarity with the fundamentals of stream processing (as in: knowing what HyperLogLog is).
- Some knowledge of Machine Learning approaches on graphs.
- Some knowledge of French, Polish, or German.
Why You Should Apply
- Join an intellectually stimulating work environment.
- Be a pioneer: you get to work with a new type of data processing.
- Be part of one of the hottest early stage data/AI startups on the European scene.
- Uncover exciting career prospects.
- Make significant contribution to our success.
- Join & co-create an inclusive workplace culture.
Benefits
- Type of contract: Permanent employment contract
- Preferable joining date: early 2023. The positions are open until filled – please apply early.
- Compensation: annual salary of €70K–€100K (mid – senior – lead), exceptionally €100K–€140K+ (unique track record or “L7+” experience in FAANG) + Employee Stock Option Plan.
- Location: Remote work from home.
Possibility to work or meet with other team members in one of our offices: Paris (Agoranov near Saint-Placide), Palaiseau (Ecole Polytechnique area), Wroclaw (University area).
As a general rule, permanent residence will be required in the EU, UK, Canada, US, or Brazil.
If you meet our broad requirements but are missing some experience, don’t hesitate to reach out to us.