Large language models locally on your phone
MLC Chat lets users chat with open language models locally on iPads and iPhones. After a model is downloaded to the app, everything runs locally without server support, it works without an internet connection, and the app does not record any user information.
Because the models run locally, the app only works on devices with sufficient memory (VRAM) for the model being used.
MLC Chat is part of the open-source project MLC LLM, which allows any language model to be deployed natively on a diverse set of hardware backends and native applications. MLC Chat is the runtime that runs these open model architectures on your phone. The app is intended for non-commercial purposes and lets you run open language models downloaded from the internet. Each model is subject to its respective license.
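To give a sense of what the underlying MLC LLM runtime does outside the iOS app, here is a minimal sketch using the project's Python package on a desktop. The model identifier and the exact API surface (MLCEngine with an OpenAI-style chat endpoint) are taken as assumptions from MLC LLM's documentation and may differ across versions.

```python
# Minimal sketch: chatting with an open model through the MLC LLM Python
# package. The model ID and API details are assumptions and may differ
# across MLC LLM versions.
from mlc_llm import MLCEngine

model = "HF://mlc-ai/Llama-3-8B-Instruct-q4f16_1-MLC"  # example prebuilt model
engine = MLCEngine(model)

# OpenAI-style chat completion, streamed token by token.
for response in engine.chat.completions.create(
    messages=[{"role": "user", "content": "Explain MLC LLM in one sentence."}],
    model=model,
    stream=True,
):
    for choice in response.choices:
        print(choice.delta.content or "", end="", flush=True)
print()

engine.terminate()  # release the engine's resources
```

The phone app wraps this same runtime behind a chat UI, with models compiled ahead of time for the device's hardware backend.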