Rawer

Under Construction

fixing entry permalinks is top of the to-dos

LibreChat Notes 2024-05-30

LibreChat is a front end for conversational AI services. Its design is not dissimilar to OpenAI's ChatGPT web interface, but it supports a wide range of services. It's open source and can be run locally or wherever. They recommend using Docker, which is what I did. I'm on Ubuntu here; they support all the usual desktop platforms.

LibreChat is in active development and likely in flux, so note the date above.

For starters I wanted to use models from OpenAI and Groq. API keys are needed - OpenAI's is pay-as-you-go, Groq's is free but with capped usage (and very fast).

Following the Local Installation of LibreChat with Docker guide will get the thing up and basically running with no fuss. But out of the box there are a few things that might not be immediately obvious.

For the OpenAI API key, first copy yours to the clipboard, then click the dropdown labelled OpenAI at the top-left of the screen and click the gear icon to the right of OpenAI in the list. That's it done.

For Groq it's a little more involved.

I was going around in circles for quite a while until I saw that the instructions for YAML Setup appear after those for Custom AI Endpoints. The former tells you a docker-compose.override.yml is needed (a new one built from the code on that page worked for me), which points Docker at a librechat.yaml file. Conveniently (for me), their librechat.example.yaml has an entry for Groq, so I just copied and renamed the whole file to librechat.yaml. It contains:

  custom:
    # Groq Example
    - name: 'groq'
      apiKey: '${GROQ_API_KEY}'
      baseURL: 'https://api.groq.com/openai/v1/'
      ...
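For reference, the override file itself is tiny - it just bind-mounts librechat.yaml into the api container. Mine looked something like this (from memory, so check the YAML Setup page for the current version):

  services:
    api:
      volumes:
        - type: bind
          source: ./librechat.yaml
          target: /app/librechat.yaml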

I'm no doubt missing something here, but it didn't pick up the environment variable I'd set with:

export GROQ_API_KEY=...

Probably not best practice, but this is a local install, so I simply pasted my key in place of the ${GROQ_API_KEY} bit above. It works!!!
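In hindsight, I suspect the variable never makes it inside the container - the api service gets its environment from the .env file in the project directory, not from my shell. So the cleaner fix is probably a line in LibreChat's .env, keeping the ${GROQ_API_KEY} reference in librechat.yaml (an educated guess on my part, not retested):

GROQ_API_KEY=...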

I intend to use this quite a lot, so I created a service file for systemd, in /etc/systemd/system/librechat.service:

[Unit]
Description=LibreChat
Requires=docker.service
After=docker.service

[Service]
WorkingDirectory=/home/danny/AI/LibreChat
ExecStart=/usr/bin/docker-compose up -d
ExecStop=/usr/bin/docker-compose down
Restart=on-failure

[Install]
WantedBy=multi-user.target

Adjust paths to taste. Note the -d on docker-compose up - without it the unit failed for me (details in the Calm entry below).

I always get confused about the order of these; it's something like:

sudo systemctl daemon-reload
sudo systemctl enable librechat
sudo systemctl start librechat
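
And to check it's actually running (or see why it isn't):

sudo systemctl status librechat
sudo journalctl -u librechat -f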

If you run into probs, try all the usual update procedures for Docker, npm, etc. before tearing your hair out.

I'll be trying some other services soon (I see they have support for Ollama, so that's definitely in my queue). If there's anything useful to share I'll add it here.

2024-05-30

Good Things Come in Threes

I had to be up unnaturally early (for me) this morning, Mari giving me a lift for a blood test. So I didn't have a coffee until after. Wouldn't usually have been a good start to the day, but the clean-enough jeans I found to wear had €30 in the pocket. No idea when from. Then when she dropped me back I noticed some new flowers on the balcony (photo here if I get around to it). Two good things. I totally reject the stupid supernatural nonsense about bad things coming in threes, but today I'm open to the notion of a third good thing. Even a 4th or 5th. Bring it on.

So many distractions, my head is a bit scrambled. But fortunately it's pretty clear what the next few things I need to do are, and all should be straightforward, procedural.

Plan

  1. Server rebuild bits

  2. #FOAFRetrospective stuff

  3. Bookmark store

  4. Write about aligning projects

  5. No.7 stuff

  6. #Groq on #LibreChat

  7. Family email

I've been procrastinating about server admin because I had the tedious job of decommissioning the old server to do before anything else. But now that's done, I can look forward. 1-3 above are very intertwingled, and 4 is about jotting down thoughts on how best to manage tasks when there are common demands across projects. 5 is actually the highest priority, but I'll be better able to deal with those things after a bit of dopamine from 1-4.

A meta-project in all this is facilitating the capture of any useful information. This blog setup was must-have numero uno, a consistent place to capture notes, under my control, with maximal potential for reuse. The next thing I need is a place to store links. I've been here before, several times. It seems a no-brainer to use a SPARQL store.

Oops, I forgot 6. That shouldn't take long, worth getting out of the way, it will hopefully help with the other things.

I've got LibreChat running locally with Docker Compose, and it's connecting to OpenAI's GPT4o API fine. But that was ready to go out of the box (just add an API key). Groq appears in LibreChat's list of potential services, but isn't yet showing up in the UI.

This is exactly what I don't like about Docker!

Ok, I can't contradict any of the very strong arguments in favour of using containers and suchlike. But there's no denying it adds an additional component to the stack/system/process. I'm inexperienced with it - for example, right now I'm not clear about where and when LibreChat's config & build files are consumed. There's a .env and a docker-compose.override.yml, plus the librechat.yaml the override points at.
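
My current understanding of who reads what - quite possibly wrong in places - is:

.env                        - read by docker-compose for variable substitution, and passed into the api container
docker-compose.yml          - the base service definitions
docker-compose.override.yml - local tweaks, merged in automatically by docker-compose
librechat.yaml              - read by the LibreChat app itself at startup, inside the container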

2024-05-30

Calm

I woke up calm and rested after a good night's sleep. Had about an hour doing odds & ends. All good. Then a video call from my mother; generally all ok, but she promptly stressed me out. Headache, and the No.7 insurance thing has changed in my head from being a straightforward process to a Big Difficult Thing (nothing has changed in reality).

I needed to get data from the old server so I could kill it off before the end of the month (so as not to pay for it again). Done.

I also wanted to have something in place as a default AI assistant for all the things. I was using ChatGPT4, but hadn't subscribed for a couple of months ($ flow). Groq was a very good stand-in, but their free offerings didn't seem any use for managing per-task/project conversations. I started setting up LibreChat the other day, the plan being to use it as a front end to whatever.

So many distractions, and I wanted to try out ChatGPT4o anyway, so I signed up there again.

Just now I had another quick look at #LibreChat - the Docker-based local install had worked nicely; I just had to add my API key for OpenAI to get that working.

Might as well have that starting on boot on this machine. With a little help from GitHub #Copilot (I have a sub there) I got a systemd service file set up.

[Unit]
Description=LibreChat
Requires=docker.service
After=docker.service

[Service]
WorkingDirectory=/home/danny/AI/LibreChat
ExecStart=/usr/bin/docker-compose up
ExecStop=/usr/bin/docker-compose down
Restart=on-failure

[Install]
WantedBy=multi-user.target

Hmm. It didn't work. Copilot to the rescue:

sudo systemctl enable librechat

Further probs. I did a load of updating things, still no joy. Then I wondered about a flag Copilot had given me: ExecStart=/usr/bin/docker-compose up -d. That appears to have worked! Educated guesswork - I should have started distrusting the AI sooner.
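My guess at why (unverified): without -d, docker-compose stays attached in the foreground and systemd's tracking of it goes wrong somewhere; with -d it returns cleanly once the containers are up. If I revisit this, the tidier shape is probably a oneshot unit that treats "containers started" as success - a sketch, untested, rest of the unit as above:

[Service]
Type=oneshot
RemainAfterExit=yes
WorkingDirectory=/home/danny/AI/LibreChat
ExecStart=/usr/bin/docker-compose up -d
ExecStop=/usr/bin/docker-compose down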

Plan

carried over:

  1. Server decommissioning
  2. #FOAFRetrospective stuff
  3. No.7 garden
  4. AI assistant - good enough for now

  • No.7 ballcock
  • Nigel pressie - little placeholder ordered
  • draft contract for No.7
  • check insurance for No.7
  • Groq on LibreChat
  • remove _.md
  • articles for #Transmissions
  • Service.execute(message)
  • project alignment Article


FOAF Retrospective

danbri's last message about it:

I think for foaf collecting raw materials is key. Like the most interesting or impactful things that ever used it. Or ongoing use in linked data datasets. SKOS is also kind of needing the same, but is probably another story.

2024-05-30

Cough to Sneeze

I seem to have got over the cough/cold I had. Head still a bit fuzzy, but after one good cough when I woke up, all seems fine. Now snuffling a bit with hay fever...

Plan

  • Decommission server

The Cranberries' Zombie

Ok, if I tweak find-big-files a little, I should be able to save some time:

find ./ -type f -printf '%s %p\n' | sort -nr | head -30 | awk '{print "rm " $2}' > bigfiles.sh

I'll manually check the script before running it in case it's picking up anything I want to keep.
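
One caveat (not a problem here): awk's $2 only captures up to the first whitespace, so any paths with spaces in them would produce broken rm lines. Once the script looks sane, running it is just:

sh bigfiles.sh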

Ok, sweet.

Now:

tar -czvf server-decommission-2024-05_root.tar.gz /root
chmod 777 server-decommission-2024-05_root.tar.gz

and locally:

time scp danny@178.79.136.35:/zips/server-decommission-2024-05_root.tar.gz ./

oops wrong dir - move to /zips and try again

hah, that was tiny.

Now, /home - doesn't have many big files, and most of those are in .git dirs. I'll just delete those.

find ./ -type d -name '.git' | awk '{print "rm -rf " $0}' > gits.sh  # same trick as bigfiles.sh
...
tar -czvf server-decommission-2024-05_home.tar.gz /home
time scp danny@178.79.136.35:/zips/server-decommission-2024-05_home.tar.gz ./
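
In retrospect, tar could have skipped the .git dirs directly rather than deleting them first - I didn't try it here, but --exclude is standard GNU tar:

tar --exclude='.git' -czvf server-decommission-2024-05_home.tar.gz /home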

/var - virtually all the big files were under log, lib or cache. I reckon I'm safe just deleting those.

tar -czvf server-decommission-2024-05_var.tar.gz /var

Hmm. There was a lot under /var/www/html, as expected. What I'd forgotten about is all the WebDAV stuff for Joplin.

That's a pain: I want to reuse that, but I seem to remember Joplin using a weird format in its .md files.

time scp danny@178.79.136.35:/zips/server-decommission-2024-05_var.tar.gz ./

2024-05-30

Slow Sunday

This post started as Tired Saturday:

Very late night last night (trying to fix a tv), very late start today.

I've developed a cough. Sean, Mari's Irish friend, brought it; Mari got it in the week, now me. Annoying.

Sunday, so I won't feel bad if I don't get much done today.

Plan

  1. Server decommissioning
  2. #FOAFRetrospective stuff
  3. No.7 garden
  4. AI assistant

FOAF Retrospective

danbri's last message about it:

I think for foaf collecting raw materials is key. Like the most interesting or impactful things that ever used it. Or ongoing use in linked data datasets. SKOS is also kind of needing the same, but is probably another story.


Server Decommissioning

My previous virtual server on Linode got compromised. I need to download copies of anything that might potentially be useful.

But the server is in a state of disarray; I can't remember what's where. I'll start by looking at the root directory:

danny@localhost:~$ ls /
bin                boot  etc   lib    lib64              libx32      media  opt   root  sbin                snap  sys  usr
bin.usr-is-merged  dev   home  lib32  lib.usr-is-merged  lost+found  mnt    proc  run   sbin.usr-is-merged  srv   tmp  var

Ok, I guess I should get (most of) these - sizes from du -sh:

  • /etc - 2.4G
  • /root - 1.9G
  • /home - 3.5G
  • /var - 12G

Not as big as I expected, but still a pain on this slow connection.

Find my find big files script...

#!/bin/bash
# list the 30 largest files under the current dir - size in bytes, then path
find ./ -type f -printf '%s %p\n' | sort -nr | head -30

Ah. Now I see why /etc is that size. Lots of big files for #Fuseki DBs. I'd better get that whole dir.

/root has a lot of big junk. I'll get rid of some of that before getting the dir.

/home - similar. A lot under .git subdirs that should be safe to ditch - similar stuff will already be local and on GitHub.

/var - hah, of course - lots of big log files

Now what's the best tar/gzip command...ask ChatGPT...

tar -czvf root_backup.tar.gz /root
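
(For my own memory, the flags are standard tar: -c create an archive, -z gzip it, -v verbose, -f write to the named file.)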

(It also had suggestions including scp, but today I'll do it step-by-step)

Ok, that actually looks familiar. So right away I'll do :

mkdir /zips
cd /zips
time tar -czvf server-decommission-2024-05_etc.tar.gz /etc
...
real	1m54.447s
user	1m36.847s
sys	0m12.939s

File size is 364M, that's grand.

Now to copy it down. scp, but I think it'll have to be as a non-root user. Ok,

chmod -R 777 ./

Overkill, who cares. Locally,

mkdir server-decommission_2024-05
cd server-decommission_2024-05

time scp danny@178.79.136.35:/zips/server-decommission-2024-05_etc.tar.gz ./

Looks to be working. Dogwalk time.

server-decommission-2024-05_etc.tar.gz        100%  363MB 841.7KB/s   07:22

real	7m41.482s
user	0m4.618s
sys	0m7.892s

Good-good. Next: check that it's got reasonable contents, then clean those other dirs a bit and repeat for them.
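
A quick way to eyeball an archive's contents without unpacking it:

tar -tzvf server-decommission-2024-05_etc.tar.gz | head -20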

2024-05-30