Prompts and Modelfiles
By Anatoly Mironov
I read about Gemma this morning. It is Google's family of open models, built from the same research as Gemini. So I started trying it out locally on my Raspberry Pi, where I already have Ollama and ollama-webui.
Then I discovered the prompts and modelfiles which I thought were worth a blog post.
Modelfile
Let’s start with Modelfiles. You can create your own or browse OllamaHub.
It’s genius. It resembles a Dockerfile, starting with a FROM
statement. It contains the system prompt and parameters.
In Ollama and the web UI, they are listed as models of their own.
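To make this concrete, here is a minimal Modelfile sketch. The model name and system prompt are just illustrations (not from OllamaHub); FROM, PARAMETER and SYSTEM are the standard Modelfile directives.

```
# base model to build on
FROM gemma
# sampling parameter: lower temperature gives more deterministic answers
PARAMETER temperature 0.7
# the system prompt baked into the custom model
SYSTEM You are a concise assistant that proofreads e-mails.
```

You would then register it with `ollama create my-proofreader -f Modelfile` and chat with it via `ollama run my-proofreader`, where it shows up alongside the base models.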
Prompt
Prompts are system prompts that you can download from OllamaHub (the community gallery) or create yourself. The prompts you add are available as commands (type /
followed by the command name).
Prompt and Modelfile gallery
Copilots and GPTs are ways of customizing AI applications. I find Prompts and Modelfiles appealing, especially how they are made available on OllamaHub, which is a kind of gallery or app store. This might be a good way to democratize AI, even for enterprise AI applications.
Local LLMs
Off topic, but one thought I have had: aren’t local Large Language Models (LLMs) the future? I mean for day-to-day AI tasks: generating e-mails, proofreading messages, drafting text. Many of these can run locally on a laptop or a phone. Many people have underused computers and phones, so why not put them to better use? There will always be complex scenarios that require more computational power and have to run on servers, but I am sure many use cases can easily be solved with local LLMs.