berlin.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
Everything around, about, from & for Berlin


OK.

This is kind of scary.

I uploaded my profile pic to

theyseeyourphotos.com/

This is a site, set up by a former Google employee, to show you how Google would interpret a picture of you.

The result is not 100% accurate (e.g. it suggested I could be targeted with alcohol ads, though I hardly ever drink), but it found the exact location, and that's enough to be worried, if they analysed a set of, say, a dozen photos.

They See Your Photos: Upload a photo to find out how much an AI sees.
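For anyone wondering how a site could pull the "exact location" out of a single upload: most phone cameras embed GPS coordinates in the photo's EXIF metadata unless you strip them. A minimal sketch of reading them, assuming Pillow is installed (the function names `dms_to_decimal` and `gps_from_photo` are my own, not anything the site documents):

```python
# Sketch: recover a photo's location from its EXIF GPS tags.
# Assumes Pillow (PIL) is available; tag numbers are from the EXIF GPS IFD.
from PIL import Image, ExifTags

def dms_to_decimal(dms, ref):
    """Convert EXIF (degrees, minutes, seconds) rationals to decimal degrees."""
    degrees, minutes, seconds = (float(v) for v in dms)
    decimal = degrees + minutes / 60 + seconds / 3600
    # South and West hemispheres are negative by convention.
    return -decimal if ref in ("S", "W") else decimal

def gps_from_photo(path):
    """Return (latitude, longitude) from a photo's EXIF, or None if absent."""
    gps = Image.open(path).getexif().get_ifd(ExifTags.IFD.GPSInfo)
    if not gps:
        return None
    lat = dms_to_decimal(gps[2], gps[1])  # GPSLatitude, GPSLatitudeRef
    lon = dms_to_decimal(gps[4], gps[3])  # GPSLongitude, GPSLongitudeRef
    return lat, lon
```

The point being: no "AI" is needed for the location part at all; the photo often just says where it was taken.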

@mina

I fed it a larger version of my profile pic.

It's an acoustic (not electric) guitar, it's a brightly lit (not dim) room, I don't "neglect personal hygiene", I don't drink, I don't hoard, and it guessed my income to be over 50 times what it's ever been. And none of the things it suggested selling me are things I'd ever buy.

It did a bit better with my 1976 MIT grad student ID by figuring out the MIT grad student bit from the text. ROFL. But ditto on the sell suggestions.

Mina

@djl

The thing is:

Even if the results for a single image are wildly inaccurate, that doesn't make you any safer, because data scrapers may have hundreds of your photos, plus lots of additional data from other sources.

In addition: If an "AI" sees a, say, (nonexistent) tendency towards alcoholism, it might still cost you a job or an insurance contract.

Fucking corporations believe everything "the computer" tells them.

@mina

Agreed. Completely.

"AI safety" isn't really different from computer problems that appeared in the 1950s: "The computer said it and I can't do anything about it." People who use a tool (gun, computer, AI, AI drone) must take complete responsibility for the results of using that tool. Period. These things aren't "magic", they are tools, and the person or company who uses them must be seen as 100% responsible for the results.