
I'm looking for a challenging problem to solve.
My goal is to be a unicorn, a "Jack of all trades, master of all".
I prioritize breadth of knowledge and am ready to go deep when necessary.
I like to explore uncharted waters and push myself beyond my limits.
I love solving new problems and automating old solutions.
Specialties: Quantitative Finance, Statistics, Machine Learning, Deep Learning
DATA SCIENCE MANAGER Descartes Underwriting (France) 01/2023 – PRESENT
✓ Design a cloud-first pricing framework that enables easy reproducibility, review, and upgrades for hundreds of deals.
✓ Design a risk framework that can calculate portfolio risk across different models and perils within minutes rather than days.
DATA SCIENTIST 07/2021 – 12/2022
✓ Automate the entire pricing pipeline, shortening quoting time from weeks to hours and increasing annual premium by ~50%.
✓ Design and implement a fire-monitoring tool based on satellite imagery that detects wildfires in near real time.
✓ Develop and maintain 10+ autonomous data pipelines with Airflow, saving hundreds of hours compared to the previous process.
MACHINE LEARNING ENGINEER Saegus (France) 01/2020 – 06/2021
Project: Near Duplicate Video Detection (L'Oréal) (Detect highly similar marketing videos derived from the same shot)
✓ Return twice as many relevant results by improving the main clustering algorithm.
✓ Reduce processing time 5x using caching and parallel computing.
✓ Scale the system to search across tens of thousands of videos.
Project: One Shot Learning with Siamese Networks (BIC) (Identify a lighter model among hundreds of candidates from only a few images)
✓ Reduce model training time ~10-15x using various optimization and parallel-processing techniques.
✓ Increase model accuracy from 50% to 90%.
✓ Handle high load with an asynchronous API.