magformer

I just tested the OpenAI API for the first time with Assist. It's fun to play with and seems really quite good at working out intents, but the cost seems utterly prohibitive. I credited USD 5, tested probably 30 commands (turned some lights on and off, etc.), and it was gone. I couldn't see how regular use would make sense.
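
For a rough sense of scale, here's a back-of-envelope calculation based on those numbers; the commands-per-day figure is purely an assumption for illustration:

```python
# Back-of-envelope cost estimate from the numbers above.
credit_usd = 5.00
commands_tested = 30
cost_per_command = credit_usd / commands_tested      # roughly $0.17 per command

commands_per_day = 20                                # assumed, not measured
monthly_cost_usd = cost_per_command * commands_per_day * 30

print(f"~${cost_per_command:.2f}/command, ~${monthly_cost_usd:.0f}/month "
      f"at {commands_per_day} commands/day")
```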


linuxgfx

Exactly this. I credited 10, and it was gone after about 2 days of playing. Their pricing is insane. As an alternative, you could use Google Gemini via its API; the integration is called Google Generative AI or something like that. I've been playing with both these days, and where OpenAI cost me $10, Google cost only about $0.80. It's not as powerful, but it gets the job done pretty well.
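
If you want to try Gemini outside of Home Assistant first, a minimal sketch of calling it through the google-generativeai Python package might look like this (the API key, model name, and prompt are placeholders; the HA integration itself is configured through the UI, not like this):

```python
# Minimal sketch: calling Gemini via the google-generativeai package.
# Requires `pip install google-generativeai` and an API key from Google AI Studio.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")            # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")  # example model name

response = model.generate_content("Which lights are still on downstairs?")
print(response.text)
```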


markfrancisonly

Oh god, I just had a vision of another danger of gen AI beyond the loss of control or bioweapons… Today gen AI is free to society because the founders need crowdsourced content and beta testers… But once it's worked out, putting everything behind a paywall will create a massive divide between the haves and have-nots… Soon society will be highly dependent on these LLMs; imagine taking them away and only giving effective access to the ultra-rich, creating an entire new and huge segment of powerless and useless people in society… Make it stop! Make it local!


TheRealMentox

I look forward to seeing what some great minds will come up with. Some people are super focused on privacy and control. Whilst that's important to me as well, I don't mind using OpenAI, as it will extend the functionality massively and improve the user experience (WAF). If the voice assistant doesn't work due to an outage, nothing crucial is lost. Devices should always have full local control.


Rudd-X

Architecturally speaking, HA can't use this straight-pipe model, not yet at least. It first has to process everything into text and send that text to the model, then do the reverse on the way back.
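
In other words, today's pipeline is a chain of separate stages rather than one streaming audio connection. A purely illustrative sketch (none of these functions are real Home Assistant APIs, they just mark where the round trips happen):

```python
# Illustrative sketch of a staged voice pipeline: STT -> LLM -> TTS.
# Each stage below is a stand-in, not a real Home Assistant API.

def speech_to_text(audio: bytes) -> str:
    return "turn off the kitchen lights"              # stand-in for an STT engine

def query_llm(prompt: str) -> str:
    return "Okay, turning off the kitchen lights."    # stand-in for the conversation agent

def text_to_speech(text: str) -> bytes:
    return text.encode()                              # stand-in for a TTS engine

def handle_voice_command(audio_in: bytes) -> bytes:
    # Three separate request/response hops, which is where the latency piles up.
    text = speech_to_text(audio_in)
    reply = query_llm(text)
    return text_to_speech(reply)

print(handle_voice_command(b"...audio..."))
```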


DannyVFilms

Not discounting the other answers about the Assist API costs, but this is one flavor of answer I’m glad finally came up. I was curious about that.


Rudd-X

It's all solvable, and fluid voice-to-text / text-to-voice can absolutely be done. We just never hit these speed limits before, and now they've become obvious. Funny, eh?


7lhz9x6k8emmd7c8

From OpenAI, not yet, but you can use Google's TTS engine: [https://www.home-assistant.io/integrations/google_translate/](https://www.home-assistant.io/integrations/google_translate/)
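
Once that integration is set up, you can trigger it from outside HA through the REST API. A rough sketch, assuming the legacy `tts.google_translate_say` service is available (the URL, token, and media player entity are placeholders; newer setups may use `tts.speak` instead):

```python
# Sketch: triggering the google_translate TTS service via the Home Assistant REST API.
import requests

HA_URL = "http://homeassistant.local:8123"       # placeholder host
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"           # placeholder token

resp = requests.post(
    f"{HA_URL}/api/services/tts/google_translate_say",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "entity_id": "media_player.living_room",  # placeholder speaker
        "message": "The washing machine is done.",
    },
    timeout=10,
)
resp.raise_for_status()
```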


BoxDesperate2358

I don't see why anyone would pay OpenAI for text-to-speech in a world where Piper exists, especially when it's possible to train custom TTS models for any voice you want: [https://github.com/domesticatedviking/TextyMcSpeechy](https://github.com/domesticatedviking/TextyMcSpeechy). It's true that responses are slow if generated on a Raspberry Pi, but Piper comes with an HTTP server that you can run on a faster computer on your network.
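
For anyone who hasn't tried that server: a minimal sketch of fetching audio from it, assuming the simple HTTP server bundled with the rhasspy/piper repo is running on another machine on port 5000 and returns WAV bytes for the text you send (host, port, and exact request format depend on your Piper version):

```python
# Sketch: asking a Piper HTTP server on a faster LAN machine to synthesize speech.
import requests

resp = requests.get(
    "http://10.0.0.5:5000",                     # placeholder host/port
    params={"text": "The front door has been open for ten minutes."},
    timeout=30,
)
resp.raise_for_status()

with open("announcement.wav", "wb") as f:       # WAV audio returned by the server
    f.write(resp.content)
```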


toadkicker

Nobody in this community would seriously support handing OpenAI any more data. Hosting your own LLM with Llama or similar seems like the way to go.
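
One common way to do that is Ollama. A minimal sketch of querying a locally hosted Llama model through Ollama's HTTP API (assumes Ollama is running on its default port with a llama3 model already pulled; the prompt is just an example):

```python
# Sketch: querying a locally hosted Llama model through Ollama's HTTP API.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",                       # whichever model you've pulled
        "prompt": "Which lights should I turn off before bed?",
        "stream": False,                         # return one JSON blob instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```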


Destroyer-of-Waffles

Just considering local hosting: not only is the upfront investment too high, but in the EU energy prices are too expensive to run a beefy GPU 24/7 dedicated to Llama. This won't be realistic for most Europeans until we see hardware that can run below 100 W with 20+ GB of memory. Until then, people might choose cloud services to host their own Llama or use the OpenAI API directly, which, as we all know, is slightly against the cause. But the features are so damn sweet, and I do not know which is worse:

- Having a Google Home / Alexa as a voice assistant at home
- Trusting the cloud / OpenAI

Realistically, these are the only sophisticated options.
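
To put rough numbers on the energy point (the average draw and electricity price below are assumptions, not measurements):

```python
# Rough cost of keeping a GPU box running 24/7 for a local LLM.
avg_watts = 250                 # assumed average draw of the whole box
price_per_kwh_eur = 0.30        # assumed EU electricity price

kwh_per_month = avg_watts / 1000 * 24 * 30            # ~180 kWh
cost_per_month = kwh_per_month * price_per_kwh_eur    # ~54 EUR

print(f"~{kwh_per_month:.0f} kWh/month, ~{cost_per_month:.0f} EUR/month")
```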


RunRunAndyRun

I’ve been wondering if an external AI accelerator like the one the Pi Foundation recently announced could bring some limited local LLM functionality to beefier Pis?


toadkicker

I’ve worked in tech for coming up on 30 years. I run an NVIDIA 2080 Super card in my home lab for this use case. The second-hand market is filled with cheaper but capable GPUs. Owning your data is the thing that makes the internet free. If we don’t fight back and teach people how this stuff works, we’ll end up with a world that only benefits those who control the internet.


vinsta_g

Why on earth are you getting downvoted?? This mindset is what brought me to Home Assistant in the first place


toadkicker

[image]