• 0 Posts
  • 1 Comment
Joined 10 months ago
Cake day: February 4th, 2024

  • We use words to describe our thoughts and understanding. LLMs order words by following algorithms that predict what the user wants to hear. They don’t understand the meaning or implications of the words they return (a toy sketch of that prediction step is below).

    It can tell you the definition of an apple, or how many people eat apples, or whatever apple data it was trained on, but it has no thoughts of its own about apples.

    That’s the point OOP was making. People confuse ordering words with understanding. It has no understanding of anything. It’s a large language model - it’s not capable of independent thought.
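
    To illustrate what “predicting the next word” means, here’s a deliberately toy sketch. It is not how any real LLM is implemented (real models are trained neural networks over tokens), but the principle is the same: pick whatever is statistically likely to come next, with no notion of meaning.

    ```python
    # Toy "next word" predictor: just frequency lookup over word pairs.
    # Illustrative only - a real LLM is far more sophisticated, but it is
    # still predicting likely continuations, not reasoning about apples.

    from collections import Counter, defaultdict

    corpus = "people eat apples . apples are red . people like apples".split()

    # Count which word follows which in the "training" text.
    next_word_counts = defaultdict(Counter)
    for current, following in zip(corpus, corpus[1:]):
        next_word_counts[current][following] += 1

    def predict_next(word: str) -> str:
        """Return the most frequently seen follower of `word`."""
        followers = next_word_counts.get(word)
        if not followers:
            return "."  # nothing learned about this word
        return followers.most_common(1)[0][0]

    # Generate a few words: each step is a lookup, not a thought.
    word = "people"
    sentence = [word]
    for _ in range(4):
        word = predict_next(word)
        sentence.append(word)

    print(" ".join(sentence))  # e.g. "people eat apples . apples"
    ```

    The output reads like a sentence about apples, but nothing in the program has any idea what an apple is.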