LLM Chat via SSH

by wey-gu on 6/14/2025, 4:43 AM with 24 comments

by demosthanos on 6/16/2025, 2:08 PM

Skimming the source code I got really confused to see TSX files. I'd never seen Ink (React for CLIs) before, and I like it!

Previous discussions of Ink:

July 2017 (129 points, 42 comments): https://news.ycombinator.com/item?id=14831961

May 2023 (588 points, 178 comments): https://news.ycombinator.com/item?id=35863837

Nov 2024 (164 points, 106 comments): https://news.ycombinator.com/item?id=42016639
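For anyone who hasn't seen it, Ink lets you describe terminal output as React components. A minimal sketch (assumes the third-party `ink` and `react` npm packages are installed; the `App` component name is illustrative):

```tsx
import React from 'react';
import {render, Box, Text} from 'ink';

// A plain React component; Ink renders it to the terminal
// instead of the DOM, using flexbox-style layout via <Box>.
const App = () => (
  <Box borderStyle="round" padding={1}>
    <Text color="green">Hello from a React-rendered CLI</Text>
  </Box>
);

render(<App />);
```

State and hooks work as usual, which is what makes it attractive for interactive TUIs like a chat client.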

by amelius on 6/16/2025, 12:50 PM

I'd rather apt-get install something.

But that no longer seems possible with modern software distribution, especially for GPU-dependent stuff like LLMs.

So yeah, I get why this exists.

by gsibble on 6/16/2025, 1:23 PM

We made this a while ago on the web:

https://terminal.odai.chat

by gbacon on 6/16/2025, 4:27 PM

Wow, that produced a flashback to using TinyFugue in the 90s.

https://tinyfugue.sourceforge.net/

https://en.wikipedia.org/wiki/List_of_MUD_clients

by dncornholio on 6/16/2025, 1:10 PM

Using React to render a CLI tool is something. I'm not sure how I feel about that. It feels like 90% of the code is handling rendering issues.

by xigoi on 6/16/2025, 9:31 PM

It’s not clear from the README what providers it uses and why it needs your GitHub username.

by gclawes on 6/16/2025, 1:55 PM

Is this doing local inference? If so, what inference engine is it using?

by ryancnelson on 6/16/2025, 4:19 PM

This is neat... whose Anthropic credits am I using, though? Sonnet 4 isn't cheap! Would I hit a rate limit if I used this for daily work?

by ccbikai on 6/14/2025, 10:08 AM

I am the author, thank you for your support.

You're welcome to help me maintain it.

by kimjune01 on 6/14/2025, 5:43 AM

Hey, I just tried it. It's cool! I wish it were more self-aware.

by t0ny1 on 6/16/2025, 1:38 PM

Does this project send requests to LLM providers?

by eisbaw on 6/16/2025, 1:20 PM

Why not telnet?