
Technology | Conversational AI | YC W26 | Valuation: Undisclosed

Last Updated: March 24, 2026

Builds Fern, a socially aware, real-time AI meeting assistant that leverages multimodal LLMs and proprietary social intelligence to join professional conversations as an invisible copilot or an active participant, providing expert opinions, coaching, and task delegation.
Ishiki Labs has publicly announced Fern, its real-time AI meeting assistant, featuring dual-mode operation (Full Presence and Shadow Mode), real-time expert opinions, and task delegation, with response speeds comparable to ChatGPT Voice and Gemini Live. The roadmap points to sales coaching, deeper meeting-platform integrations, and socially aware AI that knows when to speak and when to stay silent.
The founders' deep backgrounds at Meta Reality Labs and Meta AI Research in multimodal LLMs, smart glasses, and AR/VR signal a likely push toward embodied AI and wearable integrations. The IshikiLabs Capture app, which performs structured audio/visual data collection, suggests the company is building proprietary training datasets. GitHub repositories and research publications point to work on variational autoencoders and disentangled representation learning.
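For context on the disentangled-representation work mentioned above: a common formulation is the beta-VAE objective, which weights the KL divergence term of a standard VAE by a factor beta > 1 to pressure the latent code toward factorized, interpretable dimensions. The sketch below is a generic illustration of that loss, not a description of Ishiki Labs' actual code; all names and shapes are assumptions.

```python
import numpy as np

def beta_vae_loss(x, x_recon, mu, log_var, beta=4.0):
    """Beta-VAE objective (generic sketch, not Ishiki Labs' implementation).

    Combines per-sample reconstruction error with a beta-weighted KL
    divergence between the approximate posterior N(mu, diag(exp(log_var)))
    and a standard normal prior. Setting beta > 1 encourages the encoder
    to learn disentangled (factorized) latent representations.
    """
    # Mean squared reconstruction error, summed over features, averaged over batch.
    recon = np.mean(np.sum((x - x_recon) ** 2, axis=1))
    # Closed-form KL divergence for diagonal Gaussians vs. N(0, I).
    kl = np.mean(0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var, axis=1))
    return recon + beta * kl, recon, kl
```

With a perfect reconstruction and a posterior equal to the prior (mu = 0, log_var = 0), both terms vanish and the loss is zero, which is a quick sanity check for the formula.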
Fern acts as an invisible, socially aware AI copilot in live meetings, understanding conversational dynamics to deliver contextually appropriate interventions without interrupting natural flow.
It's like having a brilliant colleague who sits in your meeting, reads the room perfectly, and whispers exactly the right insight at exactly the right moment.
It's like having a world-class executive assistant who can read the room better than most humans but never steals the spotlight.
Fern provides live, context-aware sales coaching during calls by analyzing prospect sentiment, objection patterns, and conversational momentum to deliver real-time tactical recommendations to sales reps.
It's like having your company's top sales closer whispering winning tactics in your ear during every deal call.
It's like having a GPS for sales calls—instead of telling you where you went wrong after you're lost, it reroutes you in real time before you miss the turn.
The IshikiLabs Capture app collects structured, high-quality audio and visual data from real-world interactions with real-time quality checks, building proprietary training datasets for socially aware AI models.
It's like building a massive library of "how humans actually talk to each other" so the AI can learn social skills the way people do—by watching and listening.
It's like building a flight simulator for social skills—you need thousands of hours of real flight data before the simulator can teach pilots anything useful.
Ishiki Labs combines rare expertise from Meta's most advanced AI and AR/VR programs with real-time systems engineering experience from Citadel Securities. This pairing lets the team build socially intelligent AI that understands human conversational dynamics at a depth competitors focused purely on transcription or summarization cannot match.