Article Preview
Buy Now
FEATURE
AI on Pi
Running your own AI on a Raspberry Pi
Issue: 23.1 (January/February 2025)
Author: Eugene Dakin
Author Bio: Eugene works as a Senior Oilfield Technical Specialist. He holds university degrees in Engineering, Chemistry, Biology, Business, and a Ph.D. in Chemical Engineering. He is the author of dozens of books on Xojo available on the xdevlibrary.com website.
Article Description: No description available.
Article Length (in bytes): 13,191
Starting Page Number: 18
Article Number: 23103
Related Web Link(s):
http://127.0.0.1
http://192.168.86.39
Excerpt of article text...
Raspberry Pi Free AI
It's common to see and hear many people discussing artificial intelligence (AI) and large language models (LLMs). Some AI and LLM products can be expensive, especially when you are first learning how to work with them. This article shows how to install the free Large Language Model Meta AI (LLaMA), developed by Meta, and run it on a budget-friendly Raspberry Pi 5.
There are no monthly fees and no sharing of information with Big Brother: everything you add stays on your own computer. The AI can also run in remote locations where the internet is unavailable or unwanted. A long list of helpful models and libraries can be installed to assist with your questions, such as those related to computer programming. I personally use this as a glorified search engine with my Xojo books, helping me retrieve code and provide day-to-day assistance for my clients. Before we get to the installation section, here is some background to help you understand the types of models available and why we should use them.
Installation Specifics
The latest version of LLaMA is 3.2 as of November 9, 2024, and this is the one we will install in this article. There are many different versions of the Raspberry Pi; I will be using the Raspberry Pi 5 with 8 GB of RAM and 128 GB of storage, running Raspberry Pi OS. Installing the model locally is convenient, and I will also show you how to install Open WebUI, which lets anyone in my family connected to my local intranet run queries against LLaMA without being limited by internet bandwidth and without a costly monthly subscription. I recommend installing a cooling fan on the Raspberry Pi: it is economical, and running the AI model will increase the CPU temperature.
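The full installation steps are in the complete article, but to give a flavor of what a locally hosted model looks like once it is running: a common way to serve LLaMA on a Raspberry Pi is with Ollama, which exposes a small REST API on the machine itself (by default at port 11434). The sketch below, in Python rather than Xojo, builds a request for Ollama's documented /api/generate endpoint and sends it to the local server; the model name "llama3.2" and the local URL are assumptions based on this setup, not taken from the excerpt.

```python
import json
from urllib import request

# Assumption: LLaMA 3.2 is served locally by Ollama, whose REST API
# listens on port 11434 by default. On an intranet, 127.0.0.1 could be
# replaced with the Pi's LAN address (e.g., 192.168.x.x).
OLLAMA_URL = "http://127.0.0.1:11434/api/generate"


def build_query(prompt, model="llama3.2"):
    """Build a non-streaming request payload for the local model."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask(prompt):
    """Send a prompt to the locally running model and return its reply."""
    data = json.dumps(build_query(prompt)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        # Ollama returns the generated text in the "response" field.
        return json.loads(resp.read())["response"]


# Example usage (requires Ollama running locally):
#   print(ask("In one sentence, what is a Raspberry Pi?"))
```

Because everything runs on the Pi, the prompt and the answer never leave the local network, which is the privacy point the article makes.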
...End of Excerpt. Please purchase the magazine to read the full article.