
PicoClaw

PicoClaw is an ultra-lightweight AI assistant from Sipeed that runs on $10 hardware with under 10MB RAM. Written in Go, it brings OpenClaw-like capabilities to embedded devices and low-end hardware.

Key takeaways

  • Ultra-lightweight: <10MB RAM, 1-second boot, single binary — 99% smaller than OpenClaw
  • Runs on $10 hardware including RISC-V boards, old Android phones, and Raspberry Pi
  • AI-bootstrapped: 95% of the codebase was generated by AI agents during development
  • Vertical integration: Built by Sipeed, who also manufactures the hardware it runs on
  • Viral growth: 17K GitHub stars in 11 days, fastest-growing OpenClaw alternative

FAQ

What is PicoClaw?

An ultra-lightweight personal AI assistant that runs on extremely resource-constrained hardware. It's like OpenClaw but optimized for embedded devices.

How much RAM does PicoClaw need?

Under 10MB in its core configuration. This is 99% less than OpenClaw (>1GB) and 90% less than NanoBot (>100MB).

What hardware can run PicoClaw?

Any Linux device: $10 RISC-V boards (LicheeRV-Nano), Raspberry Pi, old Android phones via Termux, OpenWrt routers, and more.

Who makes PicoClaw?

Sipeed (Shenzhen Sipeed Tech Ltd), a Chinese RISC-V hardware company known for affordable development boards and AI cameras.

Is PicoClaw production-ready?

No. The project explicitly warns it's in early development with unresolved network security issues. Not recommended for production before v1.0.

Project Overview

PicoClaw is an ultra-lightweight personal AI assistant designed to run on hardware costing as little as $10.[1] Created by Sipeed, a Shenzhen-based RISC-V hardware company, PicoClaw brings OpenClaw-like capabilities to embedded devices, old phones, and resource-constrained systems that would struggle to run traditional AI assistant frameworks.

The project launched on February 9, 2026, and immediately went viral, accumulating over 17,000 GitHub stars in just 11 days — making it one of the fastest-growing projects in the AI agent space.[1]

PicoClaw was inspired by NanoBot, a Python-based lightweight assistant, but was "refactored from the ground up in Go through a self-bootstrapping process, where the AI agent itself drove the entire architectural migration and code optimization."[2] This AI-bootstrapped development approach resulted in a codebase that is 95% agent-generated.

What It Does

PicoClaw acts as a lightweight agent client that delegates reasoning to cloud LLM APIs (Claude, GPT, Gemini, DeepSeek, Zhipu) while keeping orchestration local.[3]
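
To make that split concrete, the round-trip below is roughly what such a client wraps: conversation state and tool execution (shell, files) stay on the device, and only the prompt is shipped to an OpenAI-compatible chat endpoint. This is an illustrative sketch, not PicoClaw's internal code; the endpoint, model name, and key are placeholders for whichever provider you configure.

# Illustrative only, not PicoClaw internals: a single "reasoning" round-trip to a
# generic OpenAI-compatible endpoint. The local agent wraps calls like this and
# executes any resulting tool actions (shell, file I/O) on the device itself.
curl -s https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "What is 2+2?"}]
      }'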

Core capabilities:

  • Shell execution — Run commands on the host device
  • File operations — Read, write, and manage files
  • Web search — Integrated Brave Search and DuckDuckGo
  • Speech-to-text — Voice input support
  • Multi-channel chat — Telegram, Discord, QQ, DingTalk, LINE, WeCom
  • Memory and planning — Persistent memory and task logging
  • Person detection — AI camera integration for smart monitoring

Resource Comparison

The efficiency gains over existing solutions are dramatic:[2]

Metric            | OpenClaw          | NanoBot           | PicoClaw
Language          | TypeScript        | Python            | Go
RAM               | 1GB+              | 100MB+            | Under 10MB
Startup (0.8GHz)  | 500s+             | 30s+              | Under 1s
Min Hardware Cost | ~$599 (Mac mini)  | ~$50 (Linux SBC)  | ~$10 (RISC-V board)

The 400x faster startup and 99% memory reduction come from Go's native compilation and zero runtime dependencies — no Node.js, no Python interpreter, just a single self-contained binary.[3]
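
That single-binary property is standard Go tooling rather than anything PicoClaw-specific. As a rough sketch (the project's own Makefile may use different flags or targets), cross-compiling a Go program for a RISC-V or ARM64 board from a desktop looks like this:

# Sketch of a standard Go cross-compile; flags and output names are illustrative.
# CGO_ENABLED=0 removes any libc dependency, so the result is a static binary;
# -ldflags="-s -w" strips symbol and debug tables to shrink it further.
CGO_ENABLED=0 GOOS=linux GOARCH=riscv64 go build -ldflags="-s -w" -o picoclaw-linux-riscv64 .
CGO_ENABLED=0 GOOS=linux GOARCH=arm64 go build -ldflags="-s -w" -o picoclaw-linux-arm64 .

# Copy the resulting file to the board and run it; there is no interpreter or runtime to install.
file picoclaw-linux-riscv64   # should report a statically linked RISC-V ELF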

Target Hardware

PicoClaw supports RISC-V, ARM, and x86 architectures. Recommended deployment targets include:[1]

  • $9.90 LicheeRV-Nano — Sipeed's own RISC-V board with Ethernet or WiFi
  • $30-100 NanoKVM — For automated server maintenance
  • $50-100 MaixCAM — AI camera for smart monitoring
  • Old Android phones — Via Termux (give decade-old phones a second life; see the sketch after this list)
  • Raspberry Pi 3B+ — Classic hobbyist board
  • OpenWrt routers — Edge deployment on network hardware
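
For the old-phone route, one plausible Termux setup reuses the release binary shown in the Quick Start below. This is a sketch that assumes an arm64 phone and the v0.1.1 release; the project's own documentation may recommend a different procedure.

# Inside Termux on Android; assumes an arm64 phone and the v0.1.1 release binary.
pkg install wget
wget https://github.com/sipeed/picoclaw/releases/download/v0.1.1/picoclaw-linux-arm64
chmod +x picoclaw-linux-arm64
./picoclaw-linux-arm64 agent -m "What is 2+2?"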

Vertical Integration Story

PicoClaw represents a notable pattern emerging from Shenzhen: the company that writes the software also manufactures the hardware it runs on.

Sipeed (Shenzhen Sipeed Tech Ltd) is one of China's most active RISC-V hardware companies and has a partnership with Canonical to bring Ubuntu to its boards.[4] CEO Ji Yaping has positioned the company as a provider of "powerful and affordable RISC-V SBC, and even pad, phone, and laptop."

By building PicoClaw to run optimally on their own $9.90 LicheeRV-Nano boards, Sipeed creates a complete hardware-software stack that Western incumbents struggle to match on price. The same vertical integration pattern that made Chinese hardware dominant in other categories is now appearing in the AI agent space.

Quick Start

Installation is straightforward:[1]

# Download binary for your platform
wget https://github.com/sipeed/picoclaw/releases/download/v0.1.1/picoclaw-linux-arm64
chmod +x picoclaw-linux-arm64

# Or build from source
git clone https://github.com/sipeed/picoclaw.git
cd picoclaw
make deps
make build

Configuration requires an API key for your LLM provider:

{
  "model_list": [
    {
      "model_name": "gpt4",
      "model": "openai/gpt-5.2",
      "api_key": "your-api-key"
    }
  ]
}

Then start chatting:

picoclaw agent -m "What is 2+2?"
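
Because the shell and file tools run locally, prompts that need the device itself work the same way. The example below uses the same documented `agent -m` flag with a prompt that exercises local tool use; the exact behavior will depend on your model and configuration.

# The agent can act on the host it runs on, e.g. inspect disk usage or read logs.
picoclaw agent -m "How full is the root filesystem? Summarize df -h for me."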

Chat Integrations

PicoClaw supports multiple messaging platforms:[1]

Channel  | Setup Difficulty
Telegram | Easy (just a token)
Discord  | Easy (bot token + intents)
QQ       | Easy (AppID + AppSecret)
DingTalk | Medium (app credentials)
LINE     | Medium (credentials + webhook)
WeCom    | Medium (CorpID + webhook)

Notably, the platform integrations skew toward apps popular in Asia (QQ, DingTalk, WeCom, LINE), reflecting Sipeed's home market. Western users will likely use Telegram or Discord.

Strengths

  • Extreme efficiency — Opens AI agents to hardware that was previously impractical
  • True portability — Single binary, no dependencies, cross-architecture
  • Hardware synergy — Optimized for affordable Sipeed boards
  • Rapid community growth — 17K stars signals strong product-market fit
  • Open source — Full transparency into the codebase
  • Novel development approach — AI-bootstrapped architecture is an interesting precedent

Weaknesses / Risks

  • Early development — Project explicitly warns of "unresolved network security issues"[1]
  • Not production-ready — Recommended to wait for v1.0 before serious deployment
  • Limited testing — Rapid growth may outpace stability
  • Feature gaps — Some users report issues with multi-step tasks and skill execution
  • Security surface — Same prompt injection concerns as other AI assistants
  • China-origin concerns — Some users express hesitancy about software from Chinese companies
  • Scam risk — Project warns of fake crypto tokens claiming PicoClaw affiliation

Critical Perspectives

The project's own documentation is refreshingly honest about limitations:

"Warning: picoclaw is in early development now and may have unresolved network security issues. Do not deploy to production environments before the v1.0 release."[1]

Reddit users report mixed results:

"Managed to get picoclaw to work on my raspberry pi 3a+. But its kinda shit, cant even register to moltbook since it doesn't execute in steps."

The rapid pace of PRs (the team notes high volume during Chinese New Year holidays) suggests active development but also potential instability as features are merged faster than they can be tested.

Competitive Landscape

PicoClaw occupies a unique niche — the "microkernel" approach to AI agents:

vs. OpenClaw: OpenClaw is feature-rich but heavy (~1GB RAM, Node.js runtime). PicoClaw sacrifices some capabilities for extreme efficiency. Different tools for different hardware budgets.

vs. NanoBot: NanoBot (Python) was PicoClaw's inspiration. PicoClaw's Go rewrite delivers 10x better memory efficiency and 30x faster startup. NanoBot remains more accessible for Python developers.

vs. Local LLM solutions: Projects like Ollama run models locally. PicoClaw delegates reasoning to cloud APIs, keeping the local footprint tiny. Better for internet-connected devices with limited compute.

vs. Edge AI deployments: Traditional edge AI requires substantial local hardware. PicoClaw shows you can get useful AI assistance by combining a tiny orchestration layer with cloud inference.

Ideal User

  • Embedded enthusiasts running AI on RISC-V or low-power ARM
  • Home lab tinkerers wanting AI agents on Raspberry Pi or old hardware
  • Cost-conscious experimenters who can't justify a Mac mini for AI
  • Asian market users who want native QQ/DingTalk/LINE integration
  • Hardware hobbyists looking for interesting projects for cheap dev boards

Bottom Line

PicoClaw is the "OpenClaw for the rest of us" — a proof of concept that useful AI assistance doesn't require expensive hardware. The 99% memory reduction compared to OpenClaw opens deployment scenarios that were previously impractical.

The project's viral growth (17K stars in 11 days) validates demand for lightweight AI agents. However, the early development status and explicit security warnings mean this is currently a project for experimenters, not production deployments.

The vertical integration angle is the sleeper story here. Sipeed building software optimized for their own $10 hardware represents a pattern that could reshape the AI agent market — especially if Western incumbents continue optimizing for expensive Mac minis and cloud servers while Shenzhen optimizes for global affordability.

Watch this space. PicoClaw may be rough around the edges today, but the fundamental insight — that AI orchestration can be incredibly lightweight — has legs.