• 4am@lemm.ee

    I believe there is a ChatGPT integration in the works (optional, of course)

    • Serinus@lemmy.world

      If it runs locally, that’ll be awesome. I just hope it never decides to turn the heat up to 90F.

      • Buddahriffic@lemmy.world

        Ideally, IMO, you’d want a system with safeties in place, like acceptable temperature ranges or limits on how long the oven can stay on, to avoid situations where the software misinterprets a command in a dangerous way.

        Something like this:

        User: Set temperature to 19 degrees. (Yeah, it’s on the cold side even for Celsius, but not by a crazy amount, since room temperature is around 22 degrees.)

        Assistant: Setting temperature to 90 degrees. (Deadly in Celsius… Water boils at around 100 degrees, depending on pressure)

        Assistant: 90 degrees is outside of the safe range defined by your configuration. Intrusion suspected. Deploying sentry guns.
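
        In code, a guard like that could look something like this minimal Python sketch (SAFE_RANGE_C, set_thermostat(), and apply_setpoint() are made-up names for illustration, not any particular smart-home API):

        ```python
        # Hypothetical guard-rail layer sitting between the assistant and the thermostat.
        # All names here are made up for illustration.
        SAFE_RANGE_C = (15.0, 28.0)  # acceptable setpoints in Celsius

        def set_thermostat(target_c: float) -> None:
            """Stand-in for whatever call actually talks to the thermostat."""
            print(f"Thermostat set to {target_c:.1f} °C")

        def apply_setpoint(requested_c: float) -> None:
            low, high = SAFE_RANGE_C
            if low <= requested_c <= high:
                set_thermostat(requested_c)
            else:
                # Refuse rather than clamp, so a misheard "90" never silently becomes "28".
                print(f"Refusing setpoint {requested_c} °C: outside safe range {SAFE_RANGE_C}")

        apply_setpoint(19)   # accepted
        apply_setpoint(90)   # rejected; no sentry guns required
        ```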

        • AA5B@lemmy.world

          Good question - I have an allowed range configured on my thermostat, but I don’t know if it applies to API calls or is just enforced in the UI.
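
          One way to find out would be to request a deliberately out-of-range setpoint through Home Assistant’s REST API and read the state back afterwards - a rough sketch, where the host, token, and entity id are placeholders:

          ```python
          # Rough sketch: probe whether the thermostat's allowed range is also
          # enforced for API calls. Host, token, and entity_id are placeholders.
          import requests

          BASE = "http://homeassistant.local:8123/api"
          HEADERS = {"Authorization": "Bearer YOUR_LONG_LIVED_TOKEN"}
          ENTITY = "climate.living_room"

          # Ask for a deliberately out-of-range setpoint.
          requests.post(
              f"{BASE}/services/climate/set_temperature",
              headers=HEADERS,
              json={"entity_id": ENTITY, "temperature": 90},
              timeout=10,
          )

          # Read the target temperature back and see whether it was clamped or ignored.
          state = requests.get(f"{BASE}/states/{ENTITY}", headers=HEADERS, timeout=10).json()
          print(state["attributes"].get("temperature"))
          ```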

      • Saik0@lemmy.saik0.com

        There are plenty of local LLM options these days. It’s entirely feasible to run one in-house.

        And if someone can do it… I would suspect that there’ll be a HACS module up about 2 weeks ago…
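
        For what it’s worth, wiring a local model in doesn’t take much. Here’s a minimal sketch assuming something like Ollama or llama.cpp is serving an OpenAI-compatible chat endpoint on localhost; the URL and model name are placeholders:

        ```python
        # Minimal sketch: ask a locally hosted model to pick a thermostat setpoint.
        # Assumes an OpenAI-compatible /v1/chat/completions endpoint on localhost
        # (e.g. Ollama's default port); URL and model name are placeholders.
        import requests

        LOCAL_LLM_URL = "http://localhost:11434/v1/chat/completions"

        payload = {
            "model": "llama3",  # whatever model is pulled locally
            "messages": [
                {"role": "system",
                 "content": "You control a thermostat. Reply only with a Celsius setpoint."},
                {"role": "user", "content": "Set temperature to 19 degrees."},
            ],
        }

        resp = requests.post(LOCAL_LLM_URL, json=payload, timeout=30)
        reply = resp.json()["choices"][0]["message"]["content"]
        print(reply)  # a guard-rail layer like the one above would still validate this number
        ```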