
How to use Ollama to run large language models locally


Ollama, an open-source platform for running large language models locally, has introduced several new features and updates since its initial release in October 2023. Highlights include:

  • Python and JavaScript libraries for Ollama, which simplify writing scripts for various tasks without relying on external tools such as LangChain or LlamaIndex.
  • Integration of vision models, with support for command-line and API usage, enabling tasks such as automated image description and text recognition within images.
  • OpenAI compatibility, allowing users to access Ollama models through the OpenAI library format and easing the transition from OpenAI models to local execution with Ollama.
  • The ability to save and load sessions with models, improving the workflow for users who want to preserve their work and experiment with different prompts.
  • Improvements to CPU support and user interface commands, giving better accessibility and control over model parameters and system prompts.

One of the most notable updates is the introduction of specialized libraries for Python and JavaScript. This is a major step forward for developers, as it allows for the direct creation of scripts for Ollama without the need for intermediary tools. These libraries provide a straightforward link to Ollama’s features, making it easier to automate tasks or integrate language models into your applications.
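
As a minimal sketch of the Python library (installed with pip install ollama), the snippet below sends a single chat message to a locally pulled model; the model name is only an example, and any model you have already downloaded will work:

  # Minimal chat call with the official ollama Python package.
  # Assumes the Ollama server is running locally and the "llama2" model has been pulled.
  import ollama

  response = ollama.chat(
      model="llama2",
      messages=[{"role": "user", "content": "Summarize what Ollama does in one sentence."}],
  )
  print(response["message"]["content"])

The JavaScript library exposes a broadly similar chat call, so the same pattern carries over between the two.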

Another exciting development is the incorporation of vision models into Ollama’s capabilities. This addition enables developers to automate the description of images and recognize text within images, which can be done through both command-line and API interfaces. By combining visual and linguistic processing, Ollama opens up new possibilities for creating innovative applications.
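
For example, once a vision model such as LLaVA has been pulled, the command line accepts an image path directly inside the prompt (the file name here is just a placeholder):

  ollama run llava "Describe the objects and any visible text in this image: ./screenshot.png"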


Using Ollama to run AI models locally

Sam Witteveen has created a great overview video explaining more about the new features, libraries, vision support, and other updates made to Ollama, which make it a fantastic choice if you would like to run artificial intelligence models on your local network or PC.


Ollama Vision

The LLaVA (Large Language-and-Vision Assistant) model collection has been updated to version 1.6, which supports:

  • Higher image resolution: support for up to 4x more pixels, allowing the model to grasp more details.
  • Improved text recognition and reasoning capabilities: trained on additional document, chart and diagram data sets.
  • More permissive licenses: distributed via the Apache 2.0 license or the LLaMA 2 Community License.

These models are available in three parameter sizes, 7B, 13B, and a new 34B, and can be run with the commands below (a short Python example using the library follows the list):

  • ollama run llava:7b
  • ollama run llava:13b
  • ollama run llava:34b
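
The same models can be called from the Python library by attaching an image to a chat message; this is a sketch rather than a full application, and the file path is a placeholder:

  # Describing an image with LLaVA through the ollama Python package.
  # The "images" field accepts local file paths; "./chart.png" is a placeholder.
  import ollama

  response = ollama.chat(
      model="llava:7b",
      messages=[{
          "role": "user",
          "content": "What text appears in this image?",
          "images": ["./chart.png"],
      }],
  )
  print(response["message"]["content"])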

For those who have been working with OpenAI models, Ollama now offers compatibility with the OpenAI library format. This ensures a seamless transition for developers looking to switch to Ollama, allowing the use of familiar methods and reducing the time it takes to adapt to a new platform.
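
As a rough sketch, the standard OpenAI Python client only needs its base URL pointed at the local Ollama server; the API key is required by the client but is not checked by Ollama, and the model name is an example:

  # Calling a local Ollama model through the OpenAI Python client.
  # Ollama serves an OpenAI-compatible API at http://localhost:11434/v1 by default.
  from openai import OpenAI

  client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

  completion = client.chat.completions.create(
      model="llama2",  # any locally pulled Ollama model
      messages=[{"role": "user", "content": "Hello from the OpenAI client!"}],
  )
  print(completion.choices[0].message.content)

Because only the base URL and model name change, existing OpenAI-based code can usually be pointed at Ollama with a one-line edit.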

The platform has also improved session management, which is a significant benefit for developers involved in multiple projects or long-term work. The ability to save and load sessions with models means that you can pick up right where you left off, without losing progress. This feature saves valuable time and effort.
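
Inside an interactive ollama run session, saving and restoring work is handled with slash commands: /save stores the current conversation under a reusable name and /load restores it later. The session name below is just an example:

  ollama run llama2
  >>> /save my-experiment
  >>> /load my-experiment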

In addition to these updates, Ollama has made strides in enhancing CPU support and refining user interface commands. These improvements provide developers with more control over model parameters and system prompts, accommodating a broader range of hardware capabilities and user preferences.
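
These controls are also exposed as slash commands in the interactive prompt: the first command below adjusts a sampling parameter, the second replaces the system prompt, and the third prints the current parameter values. The specific values are purely illustrative:

  >>> /set parameter temperature 0.3
  >>> /set system "You are a concise technical assistant."
  >>> /show parameters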


Ollama’s recent updates are focused on refining the development process and expanding the platform’s functionality. With the new Python and JavaScript libraries, the integration of vision models, OpenAI compatibility, and improved session management and CPU support, Ollama is enhancing its position as a user-friendly and versatile tool for developers. These improvements are set to enrich the experience of using one of the leading open-source language model platforms in the industry.
