How Is Jinba Using AI?

Jinba lets enterprises in regulated industries build secure AI workflows through chat interfaces.

It does so through natural language workflow generation, compliance-aware RAG validation for regulated outputs, and multi-agent orchestration across 100+ connectors.

Company Overview

Jinba builds a secure, AI-powered enterprise workflow automation platform that lets users create, deploy, and manage complex workflows through natural language chat interfaces, visual editors, and API/MCP deployment, targeting Fortune 500 clients in regulated industries such as banking, insurance, manufacturing, and healthcare.

Product Roadmap & Public Announcements

Jinba has publicly launched Jinba Flow (a chat-to-flow workflow builder) and Jinba App (an execution environment for business users), with 100+ pre-built connectors, SOC 2 compliance, on-prem/private-cloud hosting, SSO, RBAC, audit logging, and MCP server deployment. Its pricing tiers (Free through Enterprise) and dedicated-engineer support model are live. Public job postings confirm expansion into Tokyo for APAC sales and engineering.

Signals & Private Analysis

Job postings for "Forward Deployed Engineers" in both SF and Tokyo signal active enterprise onboarding with hands-on implementation, a classic Palantir-style land-and-expand motion. Hiring for a "Founding Design Engineer" with canvas editor and drag-and-drop expertise suggests a major visual workflow builder overhaul. Growth Ops and Chief of Staff roles indicate preparation for rapid scaling. GitHub and product signals point toward automated remediation workflows, marketplace/template libraries for prebuilt flows, and deeper compliance certifications (HIPAA, GDPR). The provider-agnostic LLM architecture (OpenAI, Anthropic, Azure AI, private models) positions them to upsell model hosting to security-conscious enterprises. Tokyo expansion hints at early traction with Japanese financial institutions and manufacturers.

Jinba

Machine Learning Use Cases

Natural Language Workflow Generation
For: Cost Reduction (Product)

Natural language chat interface that converts plain English descriptions into production-ready, enterprise-grade automation workflows with integrated connectors and compliance controls.

Layman's Explanation

You describe what you want automated in plain English, and the AI builds the entire workflow for you — like telling a really smart intern exactly what to do and having it done instantly.

Use Case Details

Jinba's chat-to-flow system uses large language models (OpenAI, Anthropic, or customer-hosted models) combined with retrieval-augmented generation (RAG) to interpret natural language workflow descriptions and generate structured, executable workflow definitions. The system maps user intent to available connectors (100+ integrations including Slack, Salesforce, Gmail, HubSpot, and internal databases), resolves dependencies, and produces a visual workflow graph that users can refine in a drag-and-drop editor or export as YAML. The RAG pipeline grounds the LLM's output in the customer's specific connector catalog, data schemas, and compliance policies, ensuring generated workflows are not only functional but also adhere to enterprise governance requirements. This dramatically lowers the barrier to automation for operations, HR, finance, and sales teams who previously relied on engineering backlogs to build integrations.
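The chat-to-flow pipeline described above can be sketched in a few lines. This is a hypothetical illustration, not Jinba's actual implementation: the connector catalog, prompt format, and step schema are all assumptions, and the LLM reply is stubbed.

```python
# Hypothetical sketch of chat-to-flow generation: ground the LLM prompt in the
# customer's connector catalog (the RAG step), then parse the model's reply
# into a structured, validated workflow definition.

CONNECTOR_CATALOG = {
    "slack": {"actions": ["send_message"]},
    "salesforce": {"actions": ["create_lead", "update_record"]},
    "gmail": {"actions": ["send_email", "watch_inbox"]},
}

def build_prompt(user_request: str, catalog: dict) -> str:
    """Ground the user's request in the tenant's available connectors."""
    tools = ", ".join(f"{name}: {'/'.join(c['actions'])}" for name, c in catalog.items())
    return (f"Available connectors: {tools}\n"
            f"Request: {user_request}\n"
            "Reply with one connector.action step per line, in order.")

def parse_steps(llm_reply: str, catalog: dict) -> list[dict]:
    """Turn 'connector.action' lines into steps, dropping anything not in the catalog."""
    steps = []
    for line in llm_reply.strip().splitlines():
        connector, _, action = line.strip().partition(".")
        if connector in catalog and action in catalog[connector]["actions"]:
            steps.append({"connector": connector, "action": action})
    return steps

# A stubbed model reply; in production this would come from the hosted LLM.
reply = "gmail.watch_inbox\nsalesforce.create_lead\nslack.send_message"
workflow = {"name": "inbound-lead-router", "steps": parse_steps(reply, CONNECTOR_CATALOG)}
```

Grounding the parser in the catalog is what keeps generated workflows executable: any hallucinated connector or action is simply dropped rather than deployed.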

Analogy

It's like having a universal TV remote that you program by just saying "record all my shows on Tuesday" instead of reading a 50-page manual.

Compliance-Aware RAG Validation
For: Risk Reduction (IT-Security)

RAG-powered compliance layer that automatically validates AI-generated workflows against enterprise security policies, regulatory requirements, and data governance rules before deployment.

Layman's Explanation

Before any AI-built workflow goes live, a compliance AI double-checks it against your company's security rules — like a spell-checker, but for regulatory violations.

Use Case Details

Jinba's architecture separates workflow creation (Jinba Flow) from workflow execution (Jinba App), creating a natural governance checkpoint. At the core of this checkpoint is a retrieval-augmented generation pipeline that indexes enterprise compliance policies, RBAC configurations, data classification rules, and regulatory frameworks (SOC 2, with likely expansion to HIPAA and GDPR). When a workflow is generated or modified, the system retrieves relevant policy documents and uses an LLM to evaluate whether the workflow's data flows, connector permissions, and action sequences comply with organizational and regulatory requirements. Non-compliant steps are flagged with specific policy citations and remediation suggestions. This approach is particularly valuable for Fortune 500 clients in banking, insurance, and healthcare, where a single non-compliant automation could trigger regulatory penalties. The provider-agnostic model hosting ensures that sensitive compliance data never leaves the customer's infrastructure.
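The flag-with-citation behavior can be sketched as follows. This is a minimal illustration under assumed policy and step schemas, with a rule-based stand-in for the retrieval step (which in production would be vector search over indexed policy documents plus an LLM judgment).

```python
# Illustrative compliance checkpoint: retrieve policies relevant to each
# workflow step, then emit violations with policy citations and a remediation
# hint. Policy contents and the step schema are assumptions for this sketch.

POLICIES = [
    {"id": "DG-12", "text": "PII may not leave approved internal systems."},
    {"id": "SEC-04", "text": "Record deletion requires an admin-approved workflow."},
]

def retrieve_policies(step: dict) -> list[dict]:
    """Stand-in for RAG retrieval: match policies to a step's data flow."""
    hits = []
    if step.get("data_class") == "pii" and step.get("destination") == "external":
        hits.append(POLICIES[0])
    if step.get("action") == "delete" and not step.get("admin_approved"):
        hits.append(POLICIES[1])
    return hits

def validate_workflow(steps: list[dict]) -> list[dict]:
    """Return violations, each citing the policy it breaks."""
    findings = []
    for i, step in enumerate(steps):
        for policy in retrieve_policies(step):
            findings.append({"step": i, "policy": policy["id"],
                             "remediation": f"Revise step {i}: {policy['text']}"})
    return findings

violations = validate_workflow([
    {"action": "export", "data_class": "pii", "destination": "external"},
    {"action": "send_message", "data_class": "public", "destination": "internal"},
])
```

Because validation runs between Flow (build) and App (run), a non-empty findings list can simply block promotion to the execution environment.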

Analogy

It's like having a building inspector review your renovation plans before you knock down any walls — except this inspector reads every regulation ever written in milliseconds.

Multi-Agent Workflow Orchestration
For: Product Differentiation (Engineering)

Publishes approved workflows as Model Context Protocol (MCP) servers, enabling AI agents and LLMs across the enterprise to discover, invoke, and chain workflows as contextual tools — creating a self-service automation mesh.

Layman's Explanation

Your approved workflows become tools that any AI assistant in the company can automatically find and use — like giving every department access to the same well-organized toolbox.

Use Case Details

Jinba's MCP (Model Context Protocol) deployment capability represents a forward-looking architectural decision that positions workflows as first-class tools in the emerging agentic AI ecosystem. When a workflow is approved in Jinba Flow, it can be published as an MCP server with a structured schema describing its inputs, outputs, capabilities, and permissions. Any MCP-compatible AI agent or LLM — whether an internal copilot, a customer-facing chatbot, or a third-party agentic framework — can discover and invoke these workflows as contextual tools. This creates a composable automation mesh where workflows built by one team become reusable building blocks for AI agents across the organization. The system handles authentication, rate limiting, and audit logging at the MCP layer, ensuring enterprise governance is maintained even as workflows are consumed by autonomous agents. This is a significant differentiator over competitors like Zapier or Make, which treat workflows as isolated automations rather than discoverable, agent-invocable services.

Analogy

It's like turning every recipe your company has ever perfected into a menu item that any chef in any kitchen can order and serve — without having to learn how to cook it themselves.

Key Technical Team Members

  • Shoya Matsumori, Founder & CEO
  • Patricia Sugiarto, Early Team Member

Jinba's separation of "build" (Jinba Flow) and "run" (Jinba App) environments creates an enterprise governance layer that competitors like Zapier and Make lack, letting compliance teams approve workflows before business users execute them via chat. Combined with provider-agnostic LLM hosting, on-prem deployment, and MCP server publishing, Jinba offers a uniquely secure AI automation stack for regulated industries that can't send data to third-party clouds.


Funding History

  • 2025 | Shoya Matsumori founds Carnot Inc (Jinba).
  • 2026 | Accepted into Y Combinator Winter 2026 batch.
  • 2026 | $500K Seed round from Y Combinator.
  • 2026 | Launches Jinba Flow and Jinba App with 100+ connectors.
  • 2026 | Expands hiring to Tokyo for APAC go-to-market.
  • 2026 | ~$500K raised to date.


Competitors

  • No-Code/Low-Code Automation: Zapier, Make (Integromat), n8n, Tray.io
  • Enterprise Workflow: ServiceNow, Microsoft Power Automate, UiPath
  • AI-Native Workflow: Relevance AI, Respell, Lindy.ai, Cassidy
  • Chat-Based Automation: Moveworks, Aisera (IT-focused conversational AI)