OFFLINE MODE for simplified AI

With a 7B model being only ~5 GB and requiring little RAM, there should be a simplified model the rabbit can fall back on when it isn't connected to a network.
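For context on the ~5 GB figure: a minimal back-of-the-envelope sketch, assuming the file size is dominated by the quantized weights (the function name and the exact bits-per-weight values are illustrative; real files add some overhead for metadata and mixed-precision layers):

```python
# Rough size estimate for an N-billion-parameter model at a given
# quantization level. Decimal GB (1e9 bytes), ignoring file overhead.
def model_size_gb(params_billions: float, bits_per_weight: float) -> float:
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# fp16 weights: 7e9 params * 2 bytes each = 14 GB -- far too big for edge use.
print(round(model_size_gb(7, 16), 1))   # 14.0
# Around 5-6 bits per weight, a 7B model lands near the 5 GB mentioned above.
print(round(model_size_gb(7, 5.5), 1))  # 4.8
```

This is why quantization (rather than a smaller raw parameter count alone) is usually what makes local inference on constrained hardware plausible.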


Local inference might be a big challenge for the rabbit hardware, but any form of it would be interesting for simple system commands and for results that don’t need to be available instantly.


It would be great to have Whisper or another STT model running on the R1.
