The fingers in the thumbnail lol 😂
I don't even notice
@veldik I guess AI images will be your downfall then. Lol.
Fits the new AI voice translation 😅
love it, nice intentional touch with the AI VT
B
I really hope next year is the "year of the user", where they improve user permissions, custom dashboards, groups, etc...
Having the scenes kick off while setting them up was the reason I never used scenes. Spousal approval was 0/5, so I just did automations to change things. This is a big improvement.
Same here, I use scripts instead of scenes and then run the appropriate scripts using automations. Honestly, it will be too much of a hassle to convert each and every script to a scene now.
The scene improvement is nice but not what I was hoping for. I'd like to be able to edit everything about the scene without triggering any devices, and have a preview mode that is separate from editing altogether.
I would agree; however, having discussed it with some of the HA team, I can see why it's not as straightforward as doing that
@EverythingSmartHome What reason is that?
Seems pretty easy, make sure the entities exist and then don't verify anything after that until we go into preview mode.
I have a "Good night" scene in HomeKit that activates my home alarm (along with setting lights and thermostats), and I explicitly use HomeKit scenes because I can edit them without triggering them. I definitely understand if the technical implementation makes that not nearly as simple, but this 'update' is completely useless to me until that time. I couldn't edit that scene during the day if using HA, as my wife and kids open our exterior doors quite often and I don't want to cause the blaring alarm to randomly go off.
Precisely. This doesn't really solve anything, you're still triggering the scene if you're tinkering with what the scene actually does.
I don't understand why this is a problem in scenes but for automations and scripts you can set them up without activating.
Make 2025 the year of the NOOB. Auto-populated dashboards. Easier dashboards. Auto-add devices, etc... basically all the stuff that Google/Alexa have natively
Alexa Media Player seems to be nuked when updating to core 2024.12. It is advised to stay on 2024.11 if you rely on AMP, until it can be fixed.
It would've been nice for the boffins at Home Assistant to mention this breakage in the release notes! Unfortunately, I updated. Now what!
It's a custom integration; they can't be expected to warn about custom integrations breaking, as there are simply too many
@EverythingSmartHome Sure, but it's been a known issue for a week or more (beta testers spotted it). Not sure why the issue wasn't relayed to HA - it's no big deal, just my doorbell isn't announced with Alexa saying "ding dong" :D
Ugh, I wish I read more comments before upgrading. This was known during Beta but went all the way through to Release, so definitely strange that it wasn't flagged more or even delayed.
@jaromanda It's fixed now. 5.0.1
Thank you for being with us all year long
I have scene issues, and this update solved them. I don't use many scenes, but they're easier to use when you've got scripts for lighting and need custom "scenes" for certain light modes.
For me, the Nanoleaf integration does not work well to change the lighting. It does work as part of a scene, though, so I have a handful of scenes. I'd prefer they were all automations, but for now they have to be scenes. This is a good improvement.
0:54 - turning lights on and off is a better use case, but I tried setting up scenes with outdoor blinds and they went up and down quite a few times before I gave up
I appreciate a good comedy Vacuum robot name. I named mine Busta Grimes.
Oh that's amazing 😂
My Dreame D10 is named DreameBot Annie
You know you're British when you say "Blackpool Illuminations" - actually laughed lmfaooo
or when you say out loud quickly: "purple burglar alarm" 😂
Absolutely essential summary, thank you. Does your house always look that tidy?
Definitely not with a toddler!
Where did you get that hoodie?
Thanks Lewis for another great video! I SURE do appreciate it. 😬
Thank goodness you still have your beard!
My first thought when seeing that sweater was "wanker"
I read that too.
LOL 😂
My concern here is that the breaking changes are the most important part, since they can create a lot of headaches for some of us who are using this platform.
Is there any way to query the state of a number sensor using voice? I couldn't phrase a request so that Home Assistant tells me the home battery state or something similar
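For anyone else trying this: one approach is a custom sentence plus an intent_script that speaks the value back - a minimal sketch, assuming a made-up sensor.home_battery entity (swap in your own entity ID and phrasing):

```yaml
# config/custom_sentences/en/home_battery.yaml
language: "en"
intents:
  HomeBatteryLevel:
    data:
      - sentences:
          - "what is the home battery level"
          - "how full is the home battery"
```

```yaml
# configuration.yaml
intent_script:
  HomeBatteryLevel:
    speech:
      # sensor.home_battery is a placeholder entity ID
      text: "The home battery is at {{ states('sensor.home_battery') }} percent."
```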
Does the translation of measurement units mean I can flip the sensor config back to inches from mm for the EPL?
Uh no this is for local language translation if you don't use English
Is there a problem with DuckDNS? I have lost access to my Home Assistant 😒😒
Never really got the point of scenes myself, except to save a bunch of states on the fly through automation, and that is sadly still not working the way I wish it did. But I did open a WTH for it (WTH can't scenes restore the state of off entities), so who knows, maybe they'll fix that too! The combined voice features seem pretty neat; I might actually try that out - this may be compelling enough at this point to give voice an actual try.
Just voted for this. I would like to use scenes created in the Home Assistant UI for Alarmo, instead of having a list of entities and creating a scene from Node-RED using that list.
Can you help me with the UniFi controller hostname/IP setup?
Where do I set it up?
I don't have ChatGPT, and I did the update yesterday...
Following
The order should be reversed: first to the LLM, which will clean up your command (when it concludes that it must be an HA command), translate it to English if needed when you choose to use any other language, and then make sure that what is sent back to the HA assistant is 100% in line with what HA will understand.
Every so often, spoken words are incorrectly interpreted (STT), e.g. "turn on kitchen" becomes "turn on kitten". Home Assistant Assist has no mechanism to correct your command with AI, as most of the available LLMs do - and quite well too. They will understand that you probably said, or intended to say, "kitchen", and then in turn formulate a correctly formatted HA command and send it back to HA Assist: "Turn on kitchen light", even adding the missing but assumed entity or device name.
Really strange, but I've updated to the latest version and I don't see the fallback option in the voice commands section. Any ideas, please?
Did you make sure to change to an LLM in the conversation agent?
@EverythingSmartHome Thanks for helping, Lewis. I only have "Home Assistant" as a single option in the Conversation Agent drop down. Same when I try to Add a new one.
PP&L Electric has a Zigbee smart meter installed, but you have to give them information so they'll add your Zigbee device, and they are the HOST. Is there any way to get that meter into Home Assistant? Web access?
How about Veolia USA? (They have ~15 minute usage info on their site.)
I'm not sure where my UGI gas meter is; the read point is a little black wireless thing by the power meter. It must be in the basement ceiling.
The more stuff in Home Assistant, the better.
I mean i was just looking at how to do scenes in HA perfect timing. I've been using hue scenes until now
I'm learning more about the voice features. Would Gemini, Meta AI, etc... be considered a voice agent? As far as I know they aren't supported by Home Assistant, and they are cloud-based to my understanding... but I would be intrigued if someone attempted to get that to work.
That's a thing already! I've pulled Gemini in (using the free Flash 1.5 model), it's a big improvement over nothing. Not to mention the ability to use the local voice agent, and kick over to the LLM if needed, is huge. It means that custom sentence triggers now work!
To set it up, just look for the "Google Generative AI" integration. It's free, but I believe they use your data for training purposes
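In case it helps anyone: those custom sentence triggers are just the conversation trigger in an automation, and with the new fallback, phrases that don't match get handed off to the LLM agent instead. A minimal sketch, where the phrases and scene.movie_time are made-up examples:

```yaml
# configuration.yaml (or the automations UI in YAML mode)
automation:
  - alias: "Movie mode by voice"
    trigger:
      # Fires when Assist hears one of these exact phrases
      - platform: conversation
        command:
          - "movie time"
          - "start movie mode"
    action:
      # scene.movie_time is a placeholder - point this at your own scene
      - action: scene.turn_on
        target:
          entity_id: scene.movie_time
```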
The scene thing is genuinely a lifesaver
The feature I want is a good UI for voice commands and responses. Especially for non-English users it is very useful.
Could you do a video on setting up USB dongles for Home Assistant on a Synology NAS? I'm new to HA and have no clue what to do.
Watch out for the breaking change on Spotify, which wasn't shown in the video but is in the changelog now. Spotify made some API changes which led to the integration breaking; it should be fixed with 2024.12 now :)
Where is Voice Assistant satellite hardware?
👀
I never saw the need for scenes (personally) as most of my lights etc are controlled using automations
Running an LLM on a potato? Ha, Luxury!
We had to run ours on a raspberry!
And get up to go to school 3 hours before we went to bed!
Then our dad beat us with a stick until we were dead!
You tell that to kids these days and they don't believe you!
:D
Oh cool, Home Assistant is essentially implementing my Fallback Conversation Agent... Neat. :| Well, guess I don't need to maintain it anymore.
I had a friend who had that many fingers too 😅. All village women loved him
😂
I have a sleep mode for my office scene where I turn off my monitors via a smart plug. And yeah when editing that scene my monitors turned off :D
Never understood why scenes operated that way. Was incredibly annoying.
EP2 with cloaking device release date?
What ppppp me off is that I am unable to install or update add-ons, as the devs decided I must update first. What a way to do "local control". I am still on 2024.6 and will not update, as the call service option disappears. I am slowly moving everything to containers, as the devs do what they want and think they are among the big boys now. I miss Home Assistant in the 2021 era.
I'm really confused as to what you mean. Call service is still there; it's just renamed to "actions", but the functionality is unchanged.
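For reference, in YAML the rename only changes the key name, and the old key is still accepted - a quick before/after sketch using a hypothetical light.kitchen entity:

```yaml
# Old "call service" style (still accepted)
- service: light.turn_on
  target:
    entity_id: light.kitchen

# New "action" style after the rename - same behaviour
- action: light.turn_on
  target:
    entity_id: light.kitchen
```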
The Seven Fingered Man is freaking me out!
WTH is also the month of posting topics about things that are already implemented, but you just didn't read the documentation or Google it.
Already planning on some Ollama setup at home, and falling back to a faster and less intensive option seems like a good thing there.
Scenes are a very powerful tool that doesn't get a lot of love, but setting them up was a pain in the ass. I was setting up scenes the other day that included my blinds and lights at midnight. The blinds and lights kept going up and down constantly. Killed the batteries in my blinds.
Scenes Improvement is huge
I think NetworkChuck set up a private LLM, locally.
Oh, you've been able to do it for a while now; it just requires a good amount of compute to be useful. My comment was more about when it gets a lot easier to run 😅
Thanks for another excellent video - I found a great "hack": I place the Presence Lite behind the curtains, and because it can see through the curtain, my partner does not see it, complain or remark on it... and it works perfectly. I also set the luminance so that when it is daytime the lights will not come on and you have to open the curtains. Great product, great videos. Many thanks and have an excellent Christmas!
Glad to hear you are enjoying it 🙏🏻
Don’t upgrade if you are using Alexa Media Player. It will break it !!!!!
AMP is now fixed with version 5.0.1. My door switch automation is now reporting the door is open on my Alexa Show and Echo Dot.
Wow, very good for voice commands!! Loved it!
The next big addition will be word streaming with Ollama!
When you use TTS you need the whole sentence, and that can take long on low-power servers running a local LLM, but with streaming, an LLM response at 3 or 4 tokens/s can make the assistant as fluid as Alexa, for example.
2024.12 - FIXED BMW integration | BROKEN Alexa Integration. 👿
AMP is now fixed with version 5.0.1. My door switch automation is now reporting the door is open on my Alexa Show and Echo Dot.
There was a Fallback Conversation Agent in HACS before. It's useless now, I guess.
First
Voice Fallback is fantastic. Now the price on tokens just needs to come down; I can't trust my kids to be responsible w/ free access to a metered LLM!
Was really expecting more about voice and devices to challenge Google Home and Alexa. 😢