How do you make AI inference affordable enough to deliver real ROI in the enterprise?
In Season 2, Episode 1 of The Deep View: Conversations, we talk with Rob May, founder and CEO of Neurometric AI, to break down one of the most urgent challenges in AI today: the soaring cost of inference, and how to bring it down without sacrificing performance.
Today’s AI is increasingly powerful, but it’s also expensive. For enterprises to see real returns, inference costs have to drop dramatically. Neurometric believes the answer lies in "thinking algorithms" paired with small, specialized models and workload-specific optimizations. This approach can significantly reduce costs while often improving accuracy and efficiency. Rob walks through how this works in practice and why it matters as AI moves from experimentation to scaled deployment.
We also talk about:
- Why the current AI boom pulled Rob back into operating a startup after multiple exits and a move into investing
- How founders should think about AI infrastructure, efficiency, and long-term economics
- What startup leaders can do to get journalists to pay attention — and a pivotal early-career conversation that led to coverage which changed the trajectory of one of Rob’s companies
If you’re building, deploying, or investing in AI and wrestling with the economics of inference, this conversation offers a clear, practical perspective on what comes next.
