parth sareen

working on ollama, the easiest way to run LLMs with your favorite tools

my work at ollama bounces around the entire stack, from inference to the API. most of my time these days goes to ai agents through ollama launch. i've also worked on model capabilities, function/tool calling, structured outputs, and general systems engineering.
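a minimal sketch of what structured outputs look like from the user's side, assuming the ollama python client and a local server; the model name, prompt, and schema here are just placeholders:

```python
# structured outputs: you pass a json schema via the `format` parameter
# and the model's reply is constrained to match that schema.
# this schema is a made-up example for illustration.
country_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "capital": {"type": "string"},
        "languages": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["name", "capital", "languages"],
}

# with an ollama server running locally, the call looks roughly like:
# from ollama import chat
# import json
# response = chat(
#     model="llama3.2",  # placeholder model name
#     messages=[{"role": "user", "content": "tell me about canada"}],
#     format=country_schema,  # constrains the reply to the schema above
# )
# data = json.loads(response.message.content)
```

the nice part is the reply parses as json every time, so you can feed it straight into downstream code instead of regexing model output.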

if you find issues with anything ollama-related, please reach out to me on x or linkedin

previously ran a startup called extensible ai, where i worked on ai agent reliability, extensitrace (a thread-safe tracing library for agents), DAGent (agents as directed acyclic graphs), and online tool use for agents

used to work on distributed systems at tesla and autodesk with scala, go, and python. built on-device ml pipelines in c++ at apple. did some pm too at some point.

regularly make latte art, often do muay thai, and like to get good at new things

recently i've also been making a ton of random but useful personal tools with ollama, like watchy (a background task manager with a TUI and a mini agent loop), ducky (natural language to bash), and zuko (prevents destructive actions by asking for Touch ID)

writings

my work at ollama

hobbies