
Rehtaeh Parsons and the horrors of automation

Jesse Brown on the ubiquity of auto content, and the problem of responsibility


The "accidental" ad, now blocked by Facebook.

Months after her death, the online abuse of Rehtaeh Parsons continues. An admin for Ionechat, the spammy dating site that ghoulishly illustrated a Facebook ad with Parsons’ face, says that he’s sorry.

“I sincerely apologize,” Anh Dung told CTV News. “I simply used a tool to scrape images randomly on Google Images and inserted it into the ad campaign.”

It might seem simple enough within technology circles, where the use of algorithms to automatically pull and repurpose Internet content is the norm. The code for Ionechat’s ad campaign probably worked something like this: use Facebook’s advertising platform to segment straight male users, determine where they live from their profile data, use that data to generate Google Image search terms, such as “Canadian girl face.” Scrape the images that result and randomly plop them in, and you’ve got a user-targeted ad that maybe one in ten thousand viewers will click on. There are thousands of little businesses that use these methods, or variations of them. Like Dung said, it’s simple.
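The deterministic parts of that pipeline can be sketched in a few lines of Python. This is a hypothetical reconstruction for illustration only — the function names, the profile fields, and the query template are invented, and the actual scraping and ad-targeting code is omitted:

```python
import random


def build_search_term(profile):
    """Turn profile data into an image-search query.

    Hypothetical reconstruction of the tool's logic, per the
    article's example query ("Canadian girl face").
    """
    nationality = profile.get("country", "")
    return f"{nationality} girl face".strip()


def pick_ad_image(image_urls, seed=None):
    """Randomly choose one scraped image URL as the ad creative.

    The seed parameter is only here to make the sketch testable;
    a real campaign would choose non-deterministically.
    """
    rng = random.Random(seed)
    return rng.choice(image_urls)


# A Canadian male user's profile yields the query from the article:
profile = {"country": "Canadian", "gender": "male"}
term = build_search_term(profile)  # "Canadian girl face"
```

The point of the sketch is how little judgment is involved: no step in the chain ever looks at what the chosen image actually depicts.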

To the rest of us, it’s sci-fi level horror, a gross future where invisible and unfeeling machines mush together random artifacts from our lives, even after we’re dead, in order to sell cheap goods and services.

Facebook has also apologized, and blacklisted Ionechat from ever advertising with them again. Ionechat is offline, at least for now.

You can expect the fallout to end there. To dig any further would be to question automated content itself, which social media relies on to function. No matter how many people are hired to monitor sites like Facebook for bullying or abuse, they will only ever be able to scratch the surface of the billions of communications that occur on that platform. Similarly, YouTube employees couldn’t possibly pre-screen the one hundred hours of video that are uploaded to it every minute, but engineers are working hard to write code that analyzes video for a certain percentage of pink, porny pixels or for copyrighted content. The whole point is to replace costly humans with bots that can make decisions almost as well. But there will always be a margin of error. No human would mistake a video of squirming piglets for porn, or block a home video on copyright grounds because a radio is playing a Justin Bieber song in the background.
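The pixel-counting approach can be sketched with a rule-of-thumb skin-tone filter. This is a toy illustration, not YouTube's actual code; the RGB thresholds below are a commonly used heuristic from the computer-vision literature, not anything described in the article:

```python
def is_skin_tone(r, g, b):
    """Crude rule-based skin-tone test on RGB values (0-255).

    A classic heuristic: reddish pixels where red dominates
    green and blue by a clear margin.
    """
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and (r - min(g, b)) > 15)


def flag_frame(pixels, threshold=0.4):
    """Flag a frame if the share of skin-tone pixels exceeds threshold.

    pixels: a list of (r, g, b) tuples for one video frame.
    """
    if not pixels:
        return False
    skin = sum(1 for (r, g, b) in pixels if is_skin_tone(r, g, b))
    return skin / len(pixels) > threshold
```

A filter this crude makes the piglet mistake obvious: pink piglets fall squarely inside the same RGB range as human skin, so a pen of them trips the flag just as readily as actual nudity.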

And no human actively chose to use the image of a tragically dead child as an enticement for online dating. It just sort of happened. Anh Dung calls it an “accident.” Facebook says it’s “extremely unfortunate.” Just like the abuse Rehtaeh Parsons suffered while she was alive, everyone feels awful, and no one feels responsible.

Follow Jesse on Twitter @JesseBrown



  1. The problem here is related to internet piracy and privacy generally. In traditional advertising, you needed release forms to use images of individuals. Unless an image was clearly outside of copyright, you had to get permission from the creator (and likely pay a fee) as well.
    What happened to these laws? How is it now legal for bots to grab images off the net to use as they want?
    I could perhaps see using images of actual subscribers to their service (as long as the conditions-of-use agreement stated upfront that any posted images may be used in this manner) – but use of random, unlicensed images by a business should (still) be against the law. And there should be a presumed copyright on all personal images, with a requirement to actively seek permission before using them commercially.

    • Agreed. Last year a Facebook ad in the sidebar told me I was being challenged to an IQ contest. It included a photo of… my girlfriend… yep, they scraped the only photo they could: her profile photo.

      When I heard of this Rehtaeh story I wasn’t surprised in the least… maybe just that it took so long.

    • I’m curious about this tool that was being used. If the tool was specifically created to scrape Google Images for the purpose of making ads, then the tool itself would seem to be breaking the law. If the tool was just collecting images from Google and then the USER was using the scraped images to make ads, then the USER was using the tool in an illegal manner. As you say, one can’t just take an image from a Google search and use it without the permission of the image’s owner.

      Of course, with the tools used by Facebook itself, Facebook can do this all legally, so long as they’re only scraping images that have been uploaded to Facebook itself. If you’ve ever uploaded a photo to Facebook then you’ve agreed to their TOS, which allows for them to use any image that you upload for advertising purposes.

  2. No human would mistake a video of squirming piglets for porn, or block a
    home video on copyright grounds because a radio is playing a Justin Bieber song in the background.

    You think too highly of humans. We too use algorithms to recognize what we see and we too have a margin of error.

    But to get to the basic issue, isn’t it false advertising to entice users with an image of a person who does not actually use your service? If that was enforced, automatic image scraping would not have been used. It just has to be fined high enough to make the expected return negative. And that’s probably not even all that high.

  3. Am I the only one who finds it ironic that this very same post is attached to the very same photo of Rehtaeh? Look at the very bottom of the page under “macleans exclusives”. Is there really any difference between someone using her image, not knowing her story, to sell a dating site and a news magazine using her image to attract clicks?

    The fact of the matter is that it’s actually newspapers that are responsible for her image ranking prominently for certain terms.

    If I were to share this article on Facebook right now, the very same image would appear next to it, essentially as an ad for Macleans magazine.

    So really, isn’t this very much a case of the pot calling the kettle black?

    • Now that is an interesting question.

    • zing!

    • Hi Rick – all fair questions and concerns. We’ve now changed the image that appears alongside this story. Appreciate your feedback.