
Mastra

Mastra is a TypeScript AI framework from the Gatsby team that includes agents, workflows, RAG, and a new state-of-the-art (SOTA) memory system called Observational Memory.

Key takeaways

  • Observational Memory achieves 94.9% on LongMemEval with gpt-5-mini — highest score ever recorded
  • Built by the Gatsby team (Y Combinator W25), designed for TypeScript-first AI development
  • All-in-one framework: agents, workflows, RAG, memory, MCP, evals in one package

FAQ

What is Mastra?

Mastra is a TypeScript framework for building AI agents and applications, featuring agents, workflows, RAG, and a novel memory system.

Who created Mastra?

The team behind Gatsby, including Tyler Barnes (ex-Netlify, ex-Gatsby staff engineer), backed by Y Combinator W25.

What is Observational Memory?

A memory system that uses background agents to compress conversations into dense observations, achieving SOTA benchmark scores without RAG or graphs.

How does Mastra compare to LangChain?

Mastra is TypeScript-native and batteries-included; LangChain is Python-first with more modularity but higher integration complexity.

Is Mastra open source?

Yes, Mastra is fully open source under the Apache 2.0 license.

Executive Summary

Mastra is an open-source TypeScript framework for building AI-powered applications and agents. Created by the team behind Gatsby and backed by Y Combinator (W25), it provides a batteries-included approach: agents, workflows, RAG, memory, MCP servers, and evals in one cohesive package. The standout feature is Observational Memory, which achieves state-of-the-art benchmark scores without RAG or knowledge graphs.

| Attribute | Value |
| --- | --- |
| Company | Mastra (Gatsby team) |
| Founded | 2024 |
| Funding | Y Combinator W25 |
| Employees | ~10 |
| Headquarters | San Francisco, CA |

Product Overview

Mastra is an open-source TypeScript framework for building AI-powered applications and agents.[1] Created by the team behind Gatsby, Mastra is designed around the principle that "Python trains, TypeScript ships."[2]

The framework supports 40+ model providers through a unified interface — OpenAI, Anthropic, Google, DeepSeek, and more.
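
The unified-interface idea can be sketched in a few lines of TypeScript. Everything below (the `ModelProvider` type and the stub providers) is illustrative, not Mastra's actual API:

```typescript
// Illustrative sketch of a unified model-provider interface.
// These types and stubs are hypothetical, not Mastra's real API.
interface ModelProvider {
  name: string;
  generate(prompt: string): Promise<string>;
}

// Two stub providers behind the same interface. A real framework
// would wrap the OpenAI / Anthropic SDKs here.
const openai: ModelProvider = {
  name: "openai",
  generate: async (prompt) => `[openai] reply to: ${prompt}`,
};

const anthropic: ModelProvider = {
  name: "anthropic",
  generate: async (prompt) => `[anthropic] reply to: ${prompt}`,
};

// Application code depends only on the interface, so swapping
// providers is a one-line change at the call site.
async function ask(provider: ModelProvider, prompt: string): Promise<string> {
  return provider.generate(prompt);
}
```

The value of the abstraction is that agent and workflow code never mentions a concrete vendor SDK, which is what makes "40+ providers" feasible behind one surface.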

Key Capabilities

| Capability | Description |
| --- | --- |
| Agents | Autonomous agents that use LLMs and tools for open-ended tasks |
| Workflows | Graph-based orchestration with `.then()`, `.branch()`, `.parallel()` |
| Observational Memory | SOTA memory system (94.9% on LongMemEval) |
| RAG | Built-in retrieval from APIs, databases, and files |
| MCP | Model Context Protocol server authoring |
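
The fluent chaining style in the workflows row can be illustrated with a toy builder. This is a conceptual sketch of graph-style composition, not Mastra's actual workflow API:

```typescript
// Toy fluent workflow builder illustrating .then()/.parallel()
// chaining. Conceptual sketch only, not Mastra's real API.
type Step<I, O> = (input: I) => Promise<O>;

class Workflow<I, O> {
  constructor(private run: Step<I, O>) {}

  // Run another step after this one completes.
  then<N>(step: Step<O, N>): Workflow<I, N> {
    return new Workflow(async (input) => step(await this.run(input)));
  }

  // Fan the current output out to several steps at once.
  parallel<N>(steps: Step<O, N>[]): Workflow<I, N[]> {
    return new Workflow(async (input) => {
      const out = await this.run(input);
      return Promise.all(steps.map((s) => s(out)));
    });
  }

  execute(input: I): Promise<O> {
    return this.run(input);
  }
}

// Example: normalize text, then run two analyses in parallel.
const pipeline = new Workflow<string, string>(async (s) =>
  s.trim().toLowerCase()
).parallel([
  async (s) => s.length,            // character count
  async (s) => s.split(" ").length, // rough word count
]);
```

Each combinator returns a new `Workflow`, so chains stay immutable and composable, which is the property a graph-based orchestrator needs for branching and retries.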

Product Surfaces / Editions

| Surface | Description | Availability |
| --- | --- | --- |
| Framework | TypeScript npm package | GA |
| CLI | `npm create mastra@latest` | GA |
| Enterprise | Hosted services and support | Contact sales |

Technical Architecture

Language: TypeScript (Node.js)

Storage adapters: PostgreSQL, LibSQL, MongoDB

Key Technical Details

| Aspect | Detail |
| --- | --- |
| Deployment | Self-hosted (Node.js) |
| Model(s) | 40+ providers (OpenAI, Anthropic, Google, etc.) |
| Integrations | Vercel AI SDK, CopilotKit, React, Next.js |
| Open Source | Yes (Apache 2.0) |

Installation:

```shell
npm create mastra@latest
```

Observational Memory (SOTA)

Mastra's standout feature is Observational Memory (OM), achieving state-of-the-art on LongMemEval:[3][4][5]

| System | Model | LongMemEval |
| --- | --- | --- |
| Mastra OM | gpt-5-mini | 94.87% |
| Mastra OM | gemini-3-pro-preview | 93.27% |
| Supermemory | gpt-4o | 81.60% |
| Full context | gpt-4o | 60.20% |

OM uses background agents to compress conversations into dense observations, achieving 5-40× compression without RAG or graphs.[6]
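
The compression loop can be caricatured in plain TypeScript to make the shape concrete. Mastra's real observer is a background LLM agent; the first-sentence heuristic below is a hypothetical stand-in for it:

```typescript
// Toy illustration of the observational-memory idea: compress a chat
// transcript into short, dense observations and measure compression.
// The real observer is a background LLM agent; a trivial
// first-sentence heuristic stands in for it here.
interface Message {
  role: "user" | "assistant";
  content: string;
}

// Stand-in observer: keep the first sentence of each user turn.
function observe(history: Message[]): string[] {
  return history
    .filter((m) => m.role === "user")
    .map((m) => m.content.split(".")[0].trim())
    .filter((s) => s.length > 0);
}

// Characters before vs. after compression.
function compressionRatio(history: Message[], observations: string[]): number {
  const before = history.reduce((n, m) => n + m.content.length, 0);
  const after = observations.join("\n").length;
  return before / after;
}

const history: Message[] = [
  { role: "user", content: "My name is Ada. I work on compilers and live in Berlin." },
  { role: "assistant", content: "Nice to meet you, Ada! Compilers are a deep field." },
  { role: "user", content: "I prefer TypeScript over Python. It catches my bugs earlier." },
  { role: "assistant", content: "Static types do surface errors sooner." },
];
const observations = observe(history);
```

Because the agent sees a short, stable list of observations instead of the full transcript, the context window stays small and largely unchanged between turns, which is also what makes prompt caching effective.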


Strengths

  • SOTA memory — Observational Memory tops all published LongMemEval scores
  • TypeScript-native — First-class TS experience, not a Python port
  • Batteries included — One framework for agents, workflows, RAG, memory, evals
  • Prompt cache friendly — Stable context windows enable cost savings
  • Gatsby pedigree — Team has shipped widely-adopted open source before
  • YC backing — Credibility and resources for long-term development
  • Human-in-the-loop — Built-in suspend/resume for approval workflows
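
The suspend/resume pattern in the last bullet can be sketched with a generator: the workflow pauses at the approval point, and the host resumes it with the human decision. This is a conceptual sketch, not Mastra's actual API:

```typescript
// Toy suspend/resume workflow: the generator yields when it needs
// human approval and resumes with the reviewer's answer.
// Conceptual sketch only; Mastra's real suspend/resume API differs.
type Approval = { approved: boolean };

function* refundWorkflow(amount: number): Generator<string, string, Approval> {
  if (amount > 100) {
    // Suspend: hand control back to the host until a human decides.
    const decision = yield `approval needed for $${amount}`;
    if (!decision.approved) return "refund rejected";
  }
  return `refunded $${amount}`;
}

// Host side: run until the workflow suspends (or finishes); a real
// system would persist the state here and resume much later.
function start(amount: number) {
  const wf = refundWorkflow(amount);
  const first = wf.next();
  return { wf, suspended: !first.done, prompt: first.value };
}
```

Small refunds run straight through, while large ones suspend with a prompt and later resume via `wf.next({ approved: true })`.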

Cautions

  • TypeScript only — Python developers have to switch stacks
  • Young project — Limited production track record vs. established frameworks
  • OM sync limitation — Observation currently blocks conversation (async mode shipping soon)
  • Gatsby ghost — Gatsby's decline after Next.js dominance raises questions about staying power
  • Memory vendor lock-in — OM requires Mastra's storage adapters; not portable

Pricing & Licensing

| Tier | Price | Includes |
| --- | --- | --- |
| Open Source | Free | Full framework (Apache 2.0) |
| Enterprise | Custom | Hosted services, support |

Licensing model: Open source (Apache 2.0) + enterprise services

Hidden costs: Infrastructure costs for self-hosting; enterprise support pricing requires contacting sales[2]


Competitive Positioning

Direct Competitors

| Competitor | Differentiation |
| --- | --- |
| LangChain | LangChain is Python-first; Mastra is TypeScript-native |
| CrewAI | CrewAI focuses on agent teams; Mastra is a full framework |
| Vercel AI SDK | Vercel's SDK is lower-level; Mastra uses it internally |
| Supermemory | Mastra OM outperforms Supermemory on LongMemEval |

When to Choose Mastra Over Alternatives

  • Choose Mastra when: You want a TypeScript-native, batteries-included framework with SOTA memory
  • Choose LangChain when: You're Python-first or need maximum modularity
  • Choose CrewAI when: You specifically need multi-agent team orchestration
  • Choose Vercel AI SDK when: You want lower-level primitives rather than a full framework

Ideal Customer Profile

Best fit:

  • TypeScript developers building AI applications
  • Teams wanting all-in-one framework without stitching libraries
  • Projects needing production-ready memory that scales with conversation
  • React/Next.js developers wanting familiar patterns
  • Open source advocates wanting commercial backing

Poor fit:

  • Python-first teams
  • Organizations needing proven production track record
  • Teams wanting to avoid any vendor lock-in
  • Projects requiring immediate async memory operations

Viability Assessment

| Factor | Assessment |
| --- | --- |
| Financial Health | Moderate — YC backing, early stage |
| Market Position | Challenger — new but differentiated |
| Innovation Pace | Rapid — SOTA memory, active development |
| Community/Ecosystem | Growing — Gatsby team reputation |
| Long-term Outlook | Positive — strong technical differentiation |

Mastra has YC backing and technical differentiation with Observational Memory. The Gatsby team has proven open-source execution. Risk is whether they can maintain momentum and avoid Gatsby's fate of being eclipsed by competitors.


Bottom Line

Mastra is the most complete TypeScript AI framework available. The Observational Memory system is genuinely novel — achieving SOTA without RAG or graphs is impressive, and the prompt-caching benefits have real cost implications at scale.

The Gatsby team knows how to build developer tools that get adopted. Whether Mastra becomes the "Next.js of AI frameworks" depends on TypeScript developer adoption.

Recommended for: TypeScript developers wanting a batteries-included AI framework with SOTA memory and React/Next.js integration.

Not recommended for: Python-first teams, organizations needing proven production track record, or projects avoiding any framework lock-in.

Outlook: If Observational Memory proves out in production, Mastra could become the default TypeScript AI framework. Watch for async OM mode and enterprise adoption signals.


Research by Ry Walker Research • methodology