• 0 Posts
  • 12 Comments
Joined 1 year ago
Cake day: June 27th, 2023




  • DrQuint@lemmy.world to linuxmemes@lemmy.world · ditch discord!
    9 months ago

    That is absolutely not true unless you have exact word matches, and anyone with half a brain knows it’s not about searching within Discord, but about searching outside of it.

    Discord is a black hole of information. What happens inside is unknown from the outside. This is why every single FOSS project using Discord loses the right to call itself FOSS - an issues page is equally free, has way, way better features for relating an issue to patches and releases, and is actually indexable.




  • DrQuint@lemmy.world to Privacy@lemmy.ml · Goodbye Skiff
    9 months ago

    MAKE PRODUCT AT LOSS

    GET CAPITAL TO MAKE PRODUCT LOOK BIGGER

    SELL PRODUCT FOR MORE THAN SPENT

    FIRE EVERYONE TO “MAKE SUSTAINABLE” (LIE)

    LEAVE WITH GOLDEN PARACHUTE

    REPEAT TILL YOU CAN BUY ENOUGH PROPERTIES TO RAISE SHITHEAD KID WHO WILL RUIN YOUR FORTUNE


  • Don’t even need to make it about code. I once asked what a term meant on a certain well known FOSS application’s benchmarks page. It gave me a lot of unrelated garbage because it made an assumption about the term - exactly the assumption I was trying to avoid. I tried to steer it away from that, and it failed to say anything coherent, then looped back and gave that initial attempt as the answer again. I couldn’t stop it from hallucinating.

    How? Why?

    Basically, it was information you could only find by looking at the GitHub code, and it was pretty straightforward - but the LLM sees “benchmark” and it must therefore make a bajillion assumptions.

    Even if asked not to.

    I have a conclusion to make. It does do the code thing too, and it is directly related. I once asked about a library, and it found a post where someone was ASKING if XYZ was what a piece of code was for - and it gave that out as if it were the answer. It wasn’t. And this is the root of the problem:

    AIs never say “I don’t know”.

    It must ALWAYS know. It must ALWAYS assume something, anything, because not knowing is a crime and it won’t commit it.

    And that makes them shit.