Vechev and his team found that the large language models that power advanced chatbots can accurately infer an alarming amount of personal information about users—including their race, location, occupation, and more—from conversations that appear innocuous.

  • abhibeckert@lemmy.world · 1 year ago

    > Unfortunately, the larger models are way too big to run client-side.

    Memory isn’t that expensive. NVIDIA generally only gives you a lot of it if you also buy a huge amount of compute (which is expensive), but other hardware manufacturers (e.g. Apple) offer lots of memory with a modest amount of compute power, and they run these models with great performance on hardware that doesn’t break the bank.

    Now that there’s a mass-market use case for a lot of memory with a modest amount of compute power, I expect other hardware manufacturers will catch up to Apple and ship offerings of their own.
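
    For a sense of scale, here is a rough back-of-the-envelope sketch in Python of how much RAM it takes just to hold a model’s weights locally. The parameter counts and quantization levels are assumed examples, not figures for any particular model, and the estimate ignores activation memory and the KV cache, so treat the numbers as lower bounds.

    ```python
    # Rough sizing: RAM needed just to hold an LLM's weights in memory.
    # Model sizes and quantization levels below are illustrative assumptions.

    BYTES_PER_PARAM = {
        "fp16": 2.0,   # half-precision weights
        "int8": 1.0,   # 8-bit quantization
        "int4": 0.5,   # 4-bit quantization, common for local inference
    }

    MODEL_SIZES_B = [7, 13, 70, 180]  # parameter counts in billions (assumed)

    for params_b in MODEL_SIZES_B:
        cells = []
        for fmt, bytes_per in BYTES_PER_PARAM.items():
            gib = params_b * 1e9 * bytes_per / 2**30  # bytes -> GiB
            cells.append(f"{fmt}: {gib:6.1f} GiB")
        print(f"{params_b:>4}B params  " + "  ".join(cells))
    ```

    Under these assumptions, a 13B-parameter model at 4-bit precision fits in well under 16 GB of memory, while a 70B model needs roughly 33 GiB for weights alone, which is the sense in which memory, rather than raw compute, is the gating factor for running larger models client-side.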

    > You’d have to be crazy to let Google store all your personal emails for all eternity! And yet everybody does it.

    There are other email providers…