US deep-tech AI startup Tiiny AI has just unveiled the Tiiny AI Pocket Lab, the world's smallest personal AI supercomputer, officially verified by Guinness World Records as "The Smallest MiniPC (100B LLM Locally)".

This is the first global unveiling of the new Tiiny AI Pocket Lab, which fits in your hands -- or your pocket, duh -- and is capable of running up to a full 120-billion-parameter LLM (Large Language Model) entirely on-device, without the need for cloud connectivity, servers, or high-end GPUs.
Tiiny built its super-small AI supercomputer for energy-efficient personal intelligence: the Pocket Lab runs within a 65W power envelope, enabling massive AI model performance at a fraction of the energy and carbon footprint of traditional GPU-based systems.
Tiiny AI Pocket Lab is designed to support nearly all major personal AI use cases, serving developers, researchers, creators, professionals, and students. It enables multi-step reasoning, deep context understanding, agent workflows, content generation, and secure processing of sensitive information - even without internet access.
The tiny AI supercomputer also provides true long-term personal memory by storing user data, preferences, and documents locally with bank-level encryption, offering a level of privacy and persistence that cloud-based AI systems cannot provide.
Samar Bhoj, GTM Director of Tiiny AI, said: "Cloud AI has brought remarkable progress, but it also created dependency, vulnerability, and sustainability challenges. With Tiiny AI Pocket Lab, we believe intelligence shouldn't belong to data centers, but to people. This is the first step toward making advanced AI truly accessible, private, and personal, by bringing the power of large models from the cloud to every individual device."

Tiiny AI supercomputer key specifications:
- Processor: ARMv9.2 12-core CPU
- AI compute power: Custom heterogeneous module (SoC + dNPU), delivering ~190 TOPS
- RAM + SSD: 80GB LPDDR5X + 1TB SSD
- Model capacity: Runs up to 120B-parameter LLMs fully on-device
- Power efficiency: 30W TDP + 65W typical system power
- Dimensions + weight: 14.2 x 8 x 2.53cm + 300g + pocket-sized
- Ecosystem: One-click deployment of dozens of open-source LLMs + agent frameworks
- Connectivity: Works fully offline, no internet or cloud required
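As a back-of-the-envelope sanity check (our math, not Tiiny's spec sheet), the 80GB of LPDDR5X squares with the 120B-parameter claim only under aggressive weight quantization. This sketch estimates weight storage alone, ignoring KV cache and runtime overhead:

```python
# Rough memory footprint of an N-billion-parameter LLM at a given
# weight precision. Assumption (not from Tiiny's materials): model
# weights dominate memory; KV cache and runtime overhead are ignored.

def model_weight_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate weight storage in decimal GB for a quantized model."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 1e9

for bits in (16, 8, 4):
    print(f"120B @ {bits}-bit ≈ {model_weight_gb(120, bits):.0f} GB")
# 16-bit ≈ 240 GB and 8-bit ≈ 120 GB exceed 80GB of RAM;
# 4-bit ≈ 60 GB fits within the 80GB LPDDR5X with headroom
```

In other words, a 120B model would need roughly 4-bit (or lower) quantization to run fully in-memory on this hardware, which is consistent with how local-LLM stacks typically deploy models of that size.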
