by Artem Savkin
LLMFarm is a mobile app that lets you load and experiment with various large language models (LLMs) directly on your iOS and macOS devices. It offers diverse inference and sampling methods, plus Metal support for optimized performance.
Great app for experimenting with different LLMs on the go!
Perfect for: developers and enthusiasts eager to explore LLMs on their mobile devices.
7/19/2025
Great, but…
Overall an amazing app. I really support allowing anybody to grab models off Hugging Face and use them locally, but there are still a few issues:
1. Low-RAM devices. I suspect certain models are being masked based on memory, but I don't see any newer models. I've made and run some Mamba/DeepSeek/Qwen 3B-parameter models locally with 4 GB, so I know it's possible.
2. Audio models (speech-to-text, text-to-speech, and audio-to-audio) are possible to run at low sample rates. The only other app I use for local language models is specifically because of this.
3. This is more long term, but I would like to see support for loading regular PyTorch models (YAML/JSON for architecture, .pt/.pth for weights).
All in all, this is a great app, even without any changes. Thank you!!
So far best app for LLM
The iOS equivalent of Draw Things, but for LLMs instead of Stable Diffusion. Would love more settings, though, and integration with the Neural Engine, etc. And get the actual training to work.
The best local LLM tool
It's an excellent app. I hope it can add support for the Qwen large model.
Phenomenal
This is great! Lots of options for running custom models. Would love the ability to upload images for vision models. Allowing models to perform web searches would be handy too!
YES! YES!
Finally a locally run LLM app that lets you import different models. The interface is nice and simple. The app runs well: 3B models are quick on the iPad mini 7, and it will even run 7B models, albeit slowly. Kudos to the developer.