- cross-posted to:
- technology@lemmy.world
BBC Panorama discovered dozens of deepfakes portraying black people as supporting the former president.
Mr Trump has openly courted black voters, who were key to Joe Biden’s election win in 2020.
But there’s no evidence directly linking these images to Mr Trump’s campaign.
The co-founder of Black Voters Matter, a group which encourages black people to vote, said the manipulated images were pushing a “strategic narrative” designed to show Mr Trump as popular in the black community.
A creator of one of the images told the BBC: “I’m not claiming it’s accurate.”
What gets me about these AI bros is that they could use Photoshop to fix the minor flaws, like Trump's hand having the wrong color under it (apparently a white hand on black skin fucks up the algorithm), but they never do.
They are such talentless hacks that even the most trivial work a real photographer does is insurmountable to them.
They could, but that would require them to understand how to use Photoshop rather than just typing “Trump Black Campaign Popular” into an LLM and weeding out the few images that don’t look entirely surreal.
The scary shit about AI imagery is that, eventually, folks are going to get wise and start smoothing these out (by applying actual labor to the images rather than just letting the computer do all the work).
And then you really will have folks posting “Politician In Front Of A Large Crowd of Unlikely Supporters” images that aren’t easily debunked or dismissed.
You’ll also have a ton of FUD, such that real images that have merely been touched up in Photoshop are going to routinely be dismissed as AI-generated. So someone’s inevitably going to come through with the “Joe Biden wasn’t really at Event X” claims, leading into “Joe Biden has been dead for 10 days and the White House won’t admit it” conspiracies. And that’s going to get very ugly relatively quickly.