Cannoli allows you to build and run no-code LLM scripts in Obsidian

by JieJieon 8/22/2024, 7:22 AM, with 1 comment

by JieJieon 8/22/2024, 7:26 AM

I'm having fun with this visual editor for LLM scripts. It's almost like HyperCard for LLMs.

On my 16GB MacBook Air, I did not have to set the OLLAMA_ORIGINS env variable. Maybe I did that a long time ago, as I have a previous Ollama install. This is the first really fun toy/tool I've found that does something interesting with local LLMs (it also supports foundation model APIs).
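For anyone who does need it: Ollama rejects browser-style requests from unlisted origins by default, so plugins embedded in another app may need `OLLAMA_ORIGINS` set before the server starts. A minimal sketch (the `app://obsidian.md` origin is an assumption about what Obsidian sends; `"*"` is the permissive fallback):

```shell
# Run the server once with the origin allowed (assumed Obsidian origin):
OLLAMA_ORIGINS="app://obsidian.md" ollama serve

# Or, more permissively, allow any origin:
OLLAMA_ORIGINS="*" ollama serve

# On macOS, to make it persist for the Ollama.app background server:
launchctl setenv OLLAMA_ORIGINS "*"   # then restart Ollama
```

If a previous install already set this (e.g. via `launchctl setenv`), it would persist across upgrades, which may explain not needing to set it again.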

I'm having a ball!