Run a voice agent on your computer without Internet
This guide builds on the StreamingConversation quickstart. That example uses Deepgram for transcription, ChatGPT for the LLM, and Azure for synthesis; we'll be replacing each piece with a corresponding open-source model.
First, we'll replace Deepgram with whisper.cpp. For example, if whisper.cpp is cloned and built in a directory called /whisper.cpp, the paths from the previous example would be:
/whisper.cpp/libwhisper.so
/whisper.cpp/models/ggml-tiny.bin
Use WhisperCPPTranscriber in StreamingConversation as follows.
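A sketch of what this could look like, assuming vocode exposes WhisperCPPTranscriber alongside a WhisperCPPTranscriberConfig whose libname and fname_model fields point at the shared library and model built above (the import paths, the from_input_device helper, and the field names are assumptions here):

```python
from vocode.streaming.streaming_conversation import StreamingConversation
from vocode.streaming.transcriber.whisper_cpp_transcriber import WhisperCPPTranscriber
from vocode.streaming.models.transcriber import WhisperCPPTranscriberConfig

# microphone_input, speaker_output, agent, and synthesizer are set up exactly as
# in the quickstart; only the transcriber changes.
conversation = StreamingConversation(
    output_device=speaker_output,
    transcriber=WhisperCPPTranscriber(
        WhisperCPPTranscriberConfig.from_input_device(
            microphone_input,
            buffer_size_seconds=1,
            libname="/whisper.cpp/libwhisper.so",             # compiled shared library
            fname_model="/whisper.cpp/models/ggml-tiny.bin",  # downloaded whisper model
        )
    ),
    agent=agent,
    synthesizer=synthesizer,
)
```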
Next, we'll replace ChatGPT with a local model. To use GPT4All, install the pygpt4all package by running `pip install pygpt4all`.
Then plug a GPT4All-based agent into StreamingConversation as follows.
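A sketch of what this could look like, assuming vocode wraps pygpt4all in a GPT4AllAgent with a GPT4AllAgentConfig that takes a local model path (the class names, import paths, config fields, and model filename are assumptions here):

```python
from vocode.streaming.agent.gpt4all_agent import GPT4AllAgent
from vocode.streaming.models.agent import GPT4AllAgentConfig
from vocode.streaming.models.message import BaseMessage

# Swap the ChatGPTAgent from the quickstart for a locally downloaded GPT4All model.
agent = GPT4AllAgent(
    GPT4AllAgentConfig(
        model_path="path/to/ggml-gpt4all-j.bin",  # local GPT4All weights
        initial_message=BaseMessage(text="Hello!"),
        prompt_preamble="The AI is having a pleasant conversation about life.",
    )
)
# Pass this as agent=agent to the StreamingConversation constructor above.
```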
To use a Llama.cpp model instead, install llama-cpp-python by running `pip install llama-cpp-python`.
Then plug a Llama.cpp-based agent into StreamingConversation as follows.
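A sketch of what this could look like, assuming vocode exposes a LlamacppAgent whose LlamacppAgentConfig forwards llamacpp_kwargs to llama-cpp-python's Llama constructor (the class names, import paths, and model filename are assumptions here):

```python
from vocode.streaming.agent.llamacpp_agent import LlamacppAgent
from vocode.streaming.models.agent import LlamacppAgentConfig

# Run a local GGUF/GGML model through llama-cpp-python instead of GPT4All.
agent = LlamacppAgent(
    LlamacppAgentConfig(
        prompt_preamble="The AI is having a pleasant conversation about life.",
        llamacpp_kwargs={
            "model_path": "path/to/llama-model.gguf",  # local model weights
        },
    )
)
# Pass this as agent=agent to the StreamingConversation constructor above.
```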
You can also add n_gpu_layers to the llamacpp_kwargs to offload some of the model's layers to a GPU.
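For instance, assuming the same LlamacppAgentConfig as above (the layer count is illustrative, and GPU offloading requires a GPU-enabled build of llama-cpp-python):

```python
from vocode.streaming.models.agent import LlamacppAgentConfig

# Offload 32 layers to the GPU; tune the count to your model and VRAM.
agent_config = LlamacppAgentConfig(
    prompt_preamble="The AI is having a pleasant conversation about life.",
    llamacpp_kwargs={
        "model_path": "path/to/llama-model.gguf",
        "n_gpu_layers": 32,
    },
)
```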
Finally, replace the Azure synthesizer with a local, open-source synthesizer and plug it into StreamingConversation as follows.
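A sketch using Coqui's open-source TTS models, assuming vocode provides a CoquiTTSSynthesizer and CoquiTTSSynthesizerConfig wrapper for them (the class names, import paths, config fields, and model name here are assumptions):

```python
from vocode.streaming.synthesizer.coqui_tts_synthesizer import CoquiTTSSynthesizer
from vocode.streaming.models.synthesizer import CoquiTTSSynthesizerConfig

# Replace the AzureSynthesizer from the quickstart with a local Coqui TTS model.
synthesizer = CoquiTTSSynthesizer(
    CoquiTTSSynthesizerConfig.from_output_device(
        speaker_output,  # from the quickstart
        tts_kwargs={"model_name": "tts_models/en/ljspeech/tacotron2-DDC"},
    )
)
# Pass this as synthesizer=synthesizer to the StreamingConversation constructor.
```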
Putting it all together, the final StreamingConversation instance looks like the following.
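A sketch of the fully local setup, under the same class name and config field assumptions as the snippets above:

```python
# No hosted services: transcription, LLM, and synthesis all run on your machine.
conversation = StreamingConversation(
    output_device=speaker_output,
    transcriber=WhisperCPPTranscriber(
        WhisperCPPTranscriberConfig.from_input_device(
            microphone_input,
            buffer_size_seconds=1,
            libname="/whisper.cpp/libwhisper.so",
            fname_model="/whisper.cpp/models/ggml-tiny.bin",
        )
    ),
    agent=LlamacppAgent(
        LlamacppAgentConfig(
            prompt_preamble="The AI is having a pleasant conversation about life.",
            llamacpp_kwargs={"model_path": "path/to/llama-model.gguf"},
        )
    ),
    synthesizer=CoquiTTSSynthesizer(
        CoquiTTSSynthesizerConfig.from_output_device(
            speaker_output,
            tts_kwargs={"model_name": "tts_models/en/ljspeech/tacotron2-DDC"},
        )
    ),
)
```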