Found this extension a while back [0] after someone suggested it to me when I was complaining about the bait/slop on my feed. I added Firefox support and I've really enjoyed using it; it makes the feed much higher signal-to-noise. Great job by the original author.
It needs an API key from Groq (for low-latency LLM inference). Eventually I imagine we'll use local models for this.
I'd love to hear what people think and get the project more users (and thus contributors).
[0]: https://x.com/ErikBjare/status/1882833358847356962