AI can’t write my books yet, but it still took my author photos


When the Screen Actors Guild first announced it would be striking, in July, I found myself scanning the list of grievances. Despite being a card-carrying, sidewalk-pacing member of its sister group, the Writers Guild of America, I briefly indulged some knee-jerk thoughts. Writers, after all, are not showered with free products they can easily afford. Writers do not employ personal publicists or stylists. But financial solvency does not come with a SAG card either, and actors bear the consequences for this type of presumptive thinking. (Four lines on a teen drama? Guess who’s paying for dinner!) Even so, part of me wondered how bad things really were.

And then I focused on a problem familiar to every writer: the lack of contractual safeguarding when it comes to artificial intelligence. AI can use an actor’s likeness like a marionette. It can create “digital doubles” and replicate background actors, reducing whole careers to room tone. No one knows exactly how AI will impact those in creative industries, but our fear is not just about the unknown; it’s about being whisked into an agency-free abyss. Authors have already been living in an existential horror movie for some time. It takes a lifetime to develop a style. You do it alone. Then, once every solar eclipse, one of us gets to buy an apartment.

My own essay collections were included in Books3, a pirated data set of thousands of books used to train large language models like the ones behind ChatGPT, detailed earlier this year in a series by The Atlantic’s Alex Reisner. Even before Books3 came to light (my novels did not make the cut — oh, to be passed over by thieves!), I’d read enough about AI to give my nightmares all the texture they needed.

Every writer with whom I’ve spoken about AI shares a nagging feeling that whatever we have created is, in some ways, already gone. We are steadfast in our belief in books as sacred objects, in ourselves as artists, in our conviction that AI is no match for the true geniuses among us. In these moments, if we are resentful of anything, it’s that we’ve been forced to so earnestly call ourselves Artists with an A. But other times, we feel preyed upon and full of rage, peppered with quiet insecurities about our own originality. We are being batted around by a tiger cub that will, in all likelihood, eat us. On the bright side? Unlike actors, we will never have to contend with anyone doing weird stuff to our faces. At least, that’s what I thought up until a few weeks ago. When someone started doing weird stuff to my face.

While searching for the name of a photographer who’d once taken my picture, I stumbled across 14 artificially manipulated images of myself. A user on a website called Civitai, “the home of open-source generative AI,” had used Stable Diffusion, an open-source image-generation model, to Frankenstein dozens of people, mostly women. (This seems par for the course, for the medium and for life in general — that AI would hit women first and worst.)

Users of Civitai seem to exist somewhere on the spectrum between fan and poster designer. Many have a penchant for Tomb Raider. This one user had compiled several of my author photos (which were inexplicably credited to Herb Ritts and Diane Arbus), making my face thinner or wider, younger or more exhausted, manipulating the angles and grafting the results onto a catalogue of bodies. It should be said: They did a fantastic job with the hair. It should also be said: These images were in no way pornographic. In a way, the mundanity was part of the strangeness, an exercise in coding futility. Why not make me fly?

Generative AI is not quite the same thing as editing tools that allow you to move objects around. According to the metadata, there were 30 different images, presumably including different backdrops and outfits, used to fabricate my photos, to create an uncanny valley of stock photography. There “I” was, wearing a trench coat on a windy city street. Or squatting, as one does, in the middle of a marsh. Or with my arm around a duplicate of myself (is it navel-gazing if there are two of you?). Or clutching a portrait featuring yet another version of myself, a visual that would stump Nietzsche and M.C. Escher alike.

Civitai removed the images from the site within hours of my contacting them (among the criteria for removal is “false impersonation,” the cleanest definition of AI I’ve seen yet). Before I did this, however, I downloaded them and posted a few to Instagram. Creepiness notwithstanding, I felt flattered to be on the radar of anyone who also saw fit to mess with Kate Beckinsale’s face. Still, it had not occurred to me that my face could do what my words could: teach a robot to ape a human being. That part, the two-pronged nature of it, made me feel like “The Giving Tree” (not in the Books3 database), which is full of horrible lessons for children about toxic self-sacrifice. When outraged authors posted about their presence in Books3 on social media, some were accused of humblebragging. With nearly 200,000 titles involved, that’s not humblebragging, that’s bearing witness. This, what I’m doing right now, this is humblebragging.

In the end, this incident is a mere keyhole into the ways things are going to get freakier, if not devastating, for all Artists, for anyone who makes the world from scratch. But I confess that the longer the doctored pictures stay up on my grid, the more they start to look … well, not like me, but the gap between what is real and what is wrong has narrowed. I’ve become uncomfortably accustomed to seeing manipulated versions of myself. Which has given me a new AI-based anxiety for all writers. If and when the technology gets good enough to impersonate authors, no matter how distinctive we fancy ourselves, will we be able to separate the sentences we wrote from those we didn’t? Will we be able to identify the body? Will I recognize these very words as mine?

Sloane Crosley is the author of three essay collections and two novels. Her memoir, “Grief Is for People,” will be published in February.

