https://huggingface.co/litert-community
AI for Mobile, Web, and Embedded
https://developers.googleblog.com/unlocking-peak-performance-on-qualcomm-npu-with-litert/
Unlock NPU power for on-device GenAI with the new LiteRT Qualcomm AI Engine Direct accelerator: a unified workflow and up to 100x speedup over CPU.
https://developers.googleblog.com/mediatek-npu-and-litert-powering-the-next-generation-of-on-device-ai/
Use LiteRT for streamlined on-device NPU acceleration.
https://developers.googleblog.com/on-device-genai-in-chrome-chromebook-plus-and-pixel-watch-with-litert-lm/
Explore LiteRT-LM, an open-source C++ framework for on-device language models, to efficiently run large language models like Gemma and Gemini Nano across a...
https://developers.googleblog.com/litert-the-universal-framework-for-on-device-ai/
LiteRT is the universal framework for on-device AI. The production stack delivers 1.4x faster cross-platform GPU performance, streamlined NPU acceleration, and...