- cross-posted to:
- techtakes@awful.systems
cross-posted from: https://lemmy.world/post/23009603
This is horrifying. But also sort of expected. Link to the full research paper:
Not really caught. The devs intentionally connected it to specific systems (like other servers), gave it vague instructions that amounted to “ensure you achieve your goal in the long term at all costs,” and then let it do its thing.
It’s not like it did something it wasn’t instructed to do; it didn’t perform some menial task and then also invent its own secret agenda on the side when nobody was looking.
It says the frontier models weren’t changed, though… Do you think the end of the introduction is incorrect?
I never said anything of the kind. I just pointed out that it didn’t do anything it wasn’t instructed to do. They gave it intentionally vague instructions, and it did as it was told. That it did so in a novel way is interesting, but hardly paradigm-shattering.
However, the idea that it “schemed” is anthropomorphization, and I think their use of the term is deliberate, meant to get rubes to think more highly of it (as something near AGI) than they should.