this post was submitted on 30 Mar 2025
33 points (90.2% liked)

Open Source

Camel Chat

Camel Chat is a feature-rich Flutter application designed to provide a seamless interface for communicating with large language models (LLMs) served via an Ollama server. It offers a user-friendly way to interact with open-source AI models on your own hardware.

Features

  • Connect to Ollama Servers: Easily connect to any Ollama server with optional basic HTTP authentication.
  • Multiple Model Support: Chat with any model available on your Ollama server.
  • Complete Chat History: View and manage your conversation history.
  • Dark Mode Support: Switch between light and dark themes for comfortable viewing.
  • Custom System Prompts: Define system prompts to set the AI's behaviour and context.
  • Export Conversations: Export your chats as markdown files for sharing or archiving.
  • Chat Organisation: Auto-generated meaningful titles for your conversations.
  • Responsive UI: Works seamlessly on both mobile and desktop devices.
  • Code Formatting: Proper rendering and formatting of code blocks in responses.
  • Local Storage: All your conversations are stored locally for privacy.
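
The first feature above (connecting with optional basic HTTP authentication) maps onto a small amount of request plumbing. As a hedged sketch, not the app's actual Dart code: a client like Camel Chat would send a JSON body to Ollama's `/api/chat` endpoint, prepending the custom system prompt as a `system`-role message and attaching a standard `Authorization: Basic` header when credentials are set. The helper name here is hypothetical.

```python
import base64
import json

# Hypothetical helper (not from the app's source): build the headers and
# JSON body for a request to Ollama's /api/chat endpoint, with an optional
# HTTP basic-auth header and an optional custom system prompt.
def build_chat_request(model, messages, system=None, username=None, password=None):
    headers = {"Content-Type": "application/json"}
    if username is not None:
        # Standard basic auth: base64("user:password")
        token = base64.b64encode(f"{username}:{password}".encode()).decode()
        headers["Authorization"] = f"Basic {token}"
    body = {"model": model, "messages": list(messages), "stream": False}
    if system:
        # A custom system prompt travels as the first message in the list.
        body["messages"] = [{"role": "system", "content": system}] + body["messages"]
    return headers, json.dumps(body)

headers, body = build_chat_request(
    "gemma3",
    [{"role": "user", "content": "Hello!"}],
    system="You are concise.",
    username="user",
    password="secret",
)
```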

Getting Started

Prerequisites

  • A running Ollama server (local or remote).

Installation

Android

Download and install the APK from the releases page.

Linux

Choose one of the following packages from the releases page:

  • Debian/Ubuntu: Download and install the .deb package.
  • Fedora/RHEL: Download and install the .rpm package.
  • Arch: Download and install the .zst package.
  • Other distributions: Download the AppImage, make it executable and run it.

Setting Up Your Ollama Server

  1. Install Ollama from https://ollama.com/.
  2. Pull the models you want to use (e.g., ollama pull gemma3).
  3. Run the Ollama server.
  4. Connect Camel Chat to your server by entering the URL (e.g., http://localhost:11434/).
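
Step 4 accepts a URL in a few shapes (with or without a scheme, trailing slash, or explicit port). As an illustration of what a client has to do with that input, here is a hedged sketch of URL normalization; the function is hypothetical, not taken from the app, and only the default port 11434 comes from Ollama itself.

```python
from urllib.parse import urlparse

# Hypothetical sketch (not the app's actual code): normalize a user-entered
# Ollama server address into a clean base URL that API paths such as
# /api/chat can be appended to.
def normalize_server_url(raw):
    raw = raw.strip()
    if "://" not in raw:
        raw = "http://" + raw          # assume plain HTTP when no scheme is given
    parsed = urlparse(raw)
    host = parsed.hostname or "localhost"
    port = parsed.port or 11434        # Ollama's default port
    return f"{parsed.scheme}://{host}:{port}"

print(normalize_server_url("localhost"))               # http://localhost:11434
print(normalize_server_url("http://192.168.1.5:11434/"))
```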

Roadmap

Here are some features and improvements planned for future releases:

  • Stream Responses: Implement streaming responses for more interactive conversations.
  • File Attachments: Upload and process files during conversations.
  • Chat Statistics: View usage statistics and performance metrics.
  • Release on Flathub
  • Windows & macOS Support
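
On the streaming roadmap item: Ollama's `/api/chat` endpoint streams responses as newline-delimited JSON, one chunk per line, each carrying a fragment of the assistant message and a `done` flag. A minimal sketch of how a client could accumulate those fragments (the function and sample data below are illustrative, not the planned implementation):

```python
import json

# Sketch: consume Ollama-style streaming chat output. Each line is a JSON
# object with a partial "message"; the client appends the content fragments
# until a chunk reports "done": true.
def accumulate_stream(lines):
    parts = []
    for line in lines:
        if not line.strip():
            continue  # skip keep-alive blank lines
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Simulated stream in the shape Ollama emits:
sample = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo!"}, "done": false}',
    '{"message": {"role": "assistant", "content": ""}, "done": true}',
]
print(accumulate_stream(sample))  # Hello!
```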
top 7 comments
[–] piefood@piefed.social 8 points 2 days ago (1 children)

How does this compare to something like openwebui https://docs.openwebui.com/ ?

[–] nutbutter@discuss.tchncs.de 7 points 2 days ago

My app is nothing compared to the features of Open WebUI. I just wanted to make a simple native app. Honestly, I made this just because I wanted to see if I could make something like that.

Also, Open WebUI is slightly complex for someone who is not into self-hosting. My app is for someone who just installs Ollama on their laptop or any computer and has exposed it on the local network.
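
For context on "exposed it on the local network": by default Ollama only listens on 127.0.0.1, so other devices on the LAN can't reach it. One way to expose it is to bind the server to all interfaces via the `OLLAMA_HOST` environment variable (a config fragment, shown here for a manual `ollama serve`; systemd setups set the variable in the service instead):

```shell
# Ollama binds to 127.0.0.1 by default. To make it reachable from other
# devices on the local network, bind to all interfaces before starting:
OLLAMA_HOST=0.0.0.0 ollama serve
```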

[–] smee@poeng.link 1 points 2 days ago (1 children)

Looks interesting but can't get it to run on Debian 12, neither the .deb nor the appimage.

[–] nutbutter@discuss.tchncs.de 2 points 2 days ago (2 children)

That's odd. I did test it on Mint.

Can you run it via a terminal and tell me if it shows any errors?

[–] smee@poeng.link 2 points 11 hours ago (1 children)

.deb

$ /opt/camelchat/camelchat

opt/camelchat/camelchat: symbol lookup error: /opt/camelchat/camelchat: undefined symbol: g_once_init_enter_pointer

Appimage

Set as executable.

$ ./Camel-Chat-0.2.0-x86_64.AppImage

tmp/.mount_Camel-7OCAAq/camelchat: symbol lookup error: /tmp/.mount_Camel-7OCAAq/camelchat: undefined symbol: g_once_init_enter_pointer


From a similar issue for a different app, it seems to be a GLib issue: the binary requires GLib 2.80+ (which introduced g_once_init_enter_pointer), while Debian 12 ships 2.74.6-2+deb12u5.


Android

Works perfectly!

[–] nutbutter@discuss.tchncs.de 1 points 6 hours ago

Thanks! I will definitely look into it.

[–] smee@poeng.link 2 points 2 days ago

I did run the AppImage in a terminal and got an error about something missing. The AppImage comes with everything bundled, right? Might be an issue on my end, I suppose.

I'll redo it and paste the error message tomorrow.