Meet my AI doppelganger

When I saw the photos, I blanched.

What happened to my neck? Whose boobs are those? The face looks…kind of like me, but that is so not me.

And then a whole flurry of other things popped into my brain.

  • Is this what I would look like pretty and thin? Maybe I should try harder to get on Semaglutide.

  • Is this what I would look like more Asian?

  • Would I be more successful if this was my profile picture?

Hang on. Let’s back it up.

I spent the better part of the week researching AI-based HR tools, specifically for recruiting and job searches. I wanted to know who’s doing what, how, and whether it works. I’ve read the same articles as much of the general population, but I’m not completely ignorant of how these tools are trained. I’d call myself somewhat technical and of the belief that technology is a force for good, mostly.

I’d also played with a glut of generative AI tools, mostly language-based, but some image-based too. I turned DALL-E on myself only once when it first came out - and was so taken aback by what it generated that I put it away and didn’t look at it again.

Me, and DALL-E Me

Until today.

Eight months had passed since my DALL-E experiment, and that’s a long time for the tools to improve. So I gave it another go. This time, I went right for a headshot generator. Paid for it and everything. Guess I was feeling … I dunno, hopeful?

I uploaded 20 photos for it to use as references. Technically, I offered it 60 different shots of me taken over the last 4 years; it rejected 12 because they were too similar to others or because there was something else the algorithm didn’t like. I had to go through the rest and pick the 20 best. I did, and I was happy with what I chose. I paused only for a moment to wonder how the tool would match my clothing choices (an option it offers) to my body, and what it would do to my ever-changing hair length. I clicked the button and off it went. A few hours later, I got the results.

Here are some things I knew intellectually but hadn’t really internalized:

  • The generator was trained on images that were probably sourced in bulk or collected from other customers. Where the company was founded (or where the tool was built) probably influenced which images were used to train the AI.

  • Hilke Schellmann and Gianluca Mauro wrote an excellent piece in the Guardian in February about how AI algorithms objectify women’s bodies. My use case was different - I wasn’t putting a social media filter on - but I did wonder if my image would be sexualized in some way.

  • Much has been written about how AI struggles with people with dark skin. It was with immense privilege that I assumed that bias wouldn’t come into play for me, and it didn’t. A different bias did.

Yeah, all those things - and also:

  • I’m a pretty confident woman but I still struggle with body image issues. My first reaction was not that the photos were wrong, but that somehow I was wrong and not skinny enough. Still working on that.

  • I thought it might take my face, smooth it out a little bit, and put it on a proportional body. Not that it would do all that, thin my face down, and drop my head on a body that looked like me at 12.

  • With the immense confidence that comes after reaching a certain age, the fact that I’m statistically less hirable just stinks. I worked hard for every line, every freckle, every damn silver hair. The person in these photos is de-aged considerably. Would anyone believe she’s done all the things I’ve done? Would she be more hirable because she looks younger?

Here are my photos - the originals, untouched, and a selection of the generated headshots. I’m sharing these because I want folks to see the before and after for themselves. And because I think we need more realism, more cultural representation, and more variation in our AI training sets.

Oh yeah, one more thing. If you’re going to comment, don’t be a troll. Be kind.



