As the title says. I go for a 20-minute walk, and when I stop moving I'm not tired or even agitated at all, yet my legs feel like they're pulsating in different areas, always near the skin. It's not synchronised with my heartbeat, and it stops after a few minutes.

ChatGPT says these are just muscle twitches caused by dehydration or a lack of electrolytes. I'm not convinced. Why does it feel almost on the skin and not deeper in the muscles? Why do I feel it after a 20-minute walk that doesn't even make me sweat, but not after a 40-minute leg-focused workout? Wouldn't that be more strenuous on the legs? Does this thing even have a name?

Thanks

    • GBU_28@lemm.ee · 2 days ago

      Meh, ask it whatever, require it to cite its sources, then go follow up and verify them yourself.

    • Mothra@mander.xyz (OP) · 3 days ago

      I know my legs are fine. All I want is a name for this sensation and what causes it. Yes, I want to know about the weird stuff the body does; why is it wrong to ask ChatGPT or Google?

      • robotElder2 [he/him, it/its]@hexbear.net · 3 days ago

        LLMs are stochastic parrots. They just repeat the phrases most often used together in their training data in association with the words in your prompt. It's like seeking medical advice from the predictive text on your phone keyboard.
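        Roughly, the mechanism is next-token prediction. Here's a minimal toy sketch of the idea; the corpus and everything in it are made up for illustration, and real models are vastly more sophisticated:

        ```python
        import random
        from collections import Counter, defaultdict

        # Toy "training data"; purely illustrative.
        corpus = ("leg twitches are caused by dehydration . "
                  "leg twitches are harmless .").split()

        # Count which token follows each token (a bigram model).
        following = defaultdict(Counter)
        for prev, nxt in zip(corpus, corpus[1:]):
            following[prev][nxt] += 1

        def parrot(prompt_word, length=5):
            """Emit whatever most often followed the prompt word in the corpus."""
            out, word = [], prompt_word
            for _ in range(length):
                if word not in following:
                    break
                # Sample proportionally to how often each continuation was seen.
                choices = following[word]
                word = random.choices(list(choices), weights=choices.values())[0]
                out.append(word)
            return " ".join(out)

        print(parrot("leg"))  # e.g. "twitches are caused by dehydration"
        ```

        It never checks whether the output is true; it only continues the prompt with statistically likely words.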

        • Mothra@mander.xyz (OP) · 3 days ago

          Why is this question considered medical advice? Also, considering that LLMs do parrot most common facts correctly, why is it wrong to search for answers there first?

          • VirtualOdour@sh.itjust.works · 2 days ago

            Yeah, it's a very reasonable question for an LLM, especially if you Google the name it gives you and read a reputable article afterwards.

          • robotElder2 [he/him, it/its]@hexbear.net · 3 days ago

            OK, fair. I guess if you're not planning to act on it anyway, then the stakes are pretty low. I don't agree that LLMs reliably get basic information correct, though. "Glue is not pizza sauce" seems like a common fact to me, but Google's LLM disagrees, for example.

            • hedgehog@ttrpg.network · 2 days ago

              "Glue is not pizza sauce" seems like a common fact to me, but Google's LLM disagrees, for example.

              That wasn’t something an LLM came up with, though. That was done by a system that uses an LLM. My guess is the system retrieves a small set of results and then just uses the LLM to phrase a response to the user’s query by referencing the links in question.

              It'd be like telling someone, "rephrase the relevant parts of this document to answer the user's question," when the only relevant part is a joke. There's not much else you can do there.
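
              A minimal sketch of what that kind of retrieve-then-rephrase pipeline might look like; to be clear, the function names and the search/LLM calls here are hypothetical stand-ins, since I'm only guessing at how Google's system is wired:

              ```python
              # Hypothetical retrieve-then-rephrase pipeline (often called RAG).
              # `web_search` and `llm_complete` are made-up stand-ins, not real APIs.

              def web_search(query: str, k: int = 3) -> list[str]:
                  """Stand-in: return the text of the top-k search results."""
                  # Imagine the top hit is an old joke post.
                  return ["To keep cheese from sliding off pizza, "
                          "mix 1/8 cup of glue into the sauce."]

              def llm_complete(prompt: str) -> str:
                  """Stand-in for the model call; a real system would send
                  `prompt` to an LLM and return its completion."""
                  raise NotImplementedError

              def answer(query: str) -> str:
                  docs = web_search(query)
                  # The LLM is never asked to verify anything; it only rephrases
                  # whatever the retrieval step handed it.
                  prompt = (
                      "Rephrase the relevant parts of these documents "
                      "to answer the question.\n"
                      f"Question: {query}\n"
                      "Documents:\n" + "\n".join(docs)
                  )
                  return llm_complete(prompt)

              # If the only retrieved "document" is a joke, the phrased answer
              # faithfully repeats the joke.
              ```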