
Time to turn off Facebook’s digital fire hose


I got a call from my newly retired father recently, asking indignantly why ads for funeral homes and will-writing businesses were following him around the internet. He hadn’t been googling anything directly related, but had looked up some medications for my grandmother. I suggested that perhaps this, combined with his age, had caused some algorithm to conclude that he might require these services imminently. We tried to laugh it off, but I could tell the incident had unnerved him.

Many of us have our own algorithm war stories. Sometimes they’re relatively trivial. Sometimes downright cruel. Londoner Melissa Elliott, 38, lost her twins in 2019, four months into her pregnancy, but continued to receive maternity-related advertisements on Facebook for months after. She told the Huffington Post that she was driven to repeatedly google “miscarriage” in the hope it would nudge Facebook’s blinkered algorithm to stop the fire hose of baby ads.

Elliott’s attempt to appeal to the humanity of the algorithms, to connect the dots between miscarriages and mummy yoga, is particularly affecting. Because that simply isn’t how algorithms work.

To target ads, Facebook uses two methods: it allows companies to define what type of viewer they want to reach, and it uses its own algorithms to learn about you and to decide which clients’ ads are best suited to you.

Facebook’s own algorithms target ads based on your browsing of its platforms, including Instagram, as well as your behaviour on the wider internet, which it tracks via “pixels” — tools that stalk you around the web, logging where you click and what you buy. The algorithms use such a plethora of data points to triangulate your desires that their methodology is opaque even to Facebook. There also isn’t any obvious way to turn them off.
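In essence, a pixel is a tiny snippet a website embeds that requests an invisible image from the tracker’s domain whenever you do something; the cookie sent along with that request is what links your visits across otherwise unrelated sites. A minimal sketch of the general technique, in TypeScript, assuming a hypothetical tracker domain, event names and parameters (this is not Facebook’s actual code):

```typescript
// Illustrative sketch of a third-party tracking "pixel" in general.
// The tracker domain, event names and parameters are hypothetical.
function firePixel(event: string, data: Record<string, string>): void {
  const params = new URLSearchParams({
    ev: event,          // e.g. "PageView" or "Purchase"
    url: location.href, // the page you are currently on
    ...data,            // whatever else the site chooses to report
  });
  // Requesting a 1x1 image from the tracker's domain sends the
  // tracker's cookie along with it; that cookie is what ties your
  // activity together across sites that embed the same snippet.
  new Image().src = `https://tracker.example/px?${params.toString()}`;
}

// A shop might fire this when you complete a purchase:
firePixel("Purchase", { value: "29.99", currency: "GBP" });
```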

So how much control does Facebook give users over ads? The answer, despite its protestations, seems to be hardly any.

A new study conducted by Northeastern University computer scientist Piotr Sapiezynski and Panoptykon, a Polish civil liberties organisation, focused on the ads seen by a single Facebook user, an individual with a history of anxiety around health issues. When this person, let’s call her Anna, became pregnant, she noticed she was receiving dozens of extremely disturbing ads relating to terminally ill children, including those with severe genetic disorders. These inflamed Anna’s existing anxieties.

“I wouldn’t say Facebook caused my health-related anxiety, but I feel it is exploited against me and it just fuels it and makes it worse,” she told me. The issue that Anna is describing is at the heart of how Facebook measures engagement — it assumes anything you click on is something you want to see more of.
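To see why that assumption is so hard to escape, consider a deliberately naive scoring rule of the kind the article describes. The sketch below is invented for illustration (the topic names, actions and weights are assumptions, not Facebook’s real model): a click driven by anxiety raises a topic’s score just as much as a click driven by interest, and hiding an ad barely offsets it.

```typescript
// A deliberately naive engagement model, invented for illustration.
// It cannot tell "clicked because interested" apart from "clicked
// because alarmed": both raise the topic's score.
type Action = "click" | "dwell" | "hide";

const topicScores = new Map<string, number>();

function recordInteraction(topic: string, action: Action): void {
  // Hiding an ad barely offsets the signal from engaging with one.
  const delta = action === "hide" ? -1 : +2;
  topicScores.set(topic, (topicScores.get(topic) ?? 0) + delta);
}

// Anna anxiously clicks three distressing ads, then hides one:
const actions: Action[] = ["click", "click", "click", "hide"];
actions.forEach((a) => recordInteraction("child-health", a));

console.log(topicScores.get("child-health")); // 5: still reads as strong interest
```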

Panoptykon and Sapiezynski’s experiment aimed to test Facebook’s ad-control tools, to see how far turning off specific ad settings could influence the content Anna was shown. They discovered that using the tools barely changed her experience. She deleted 21 health-related interests from a list Facebook compiles, and turned off the “parenting” topic. But neither action affected what Facebook’s ad-targeting algorithm had learned about her from its platforms, or what it continued to feed her. At the end of the month-long experiment, every third ad that Anna saw still touched on these topics, nearly the same rate as before she activated any controls.

Facebook knows the impact it’s having on users. Last month, it was revealed that internal researchers had found Instagram contributed to body image issues, anxiety, depression and suicidal thoughts, most notably in teenage girls. These individual harms are propagated by Facebook itself, rather than by advertisers.

Yet it admits there’s no way to turn the whole machine off. “We tell people . . . that removing interests or hiding topics will not stop every related ad, which is why we offer a range of ways to improve the ads experience,” a spokesperson told me. For instance, users can disconnect all their off-Facebook activity, making it harder for the algorithms to profile them. But finding this setting is onerous: after multiple red herrings on the settings page, and some googling, I eventually discovered that turning it off takes a six-step process.

“A massive problem” is how Panoptykon policy adviser Karolina Iwańska describes the way the algorithms work. “The user cannot control the process, and the law also neglects that aspect of ad targeting.” She believes the law should compel companies to provide an option to turn off all targeting algorithms. Only then will we truly have control over our individual human experience in Facebook’s world.

Madhumita Murgia is the FT’s European technology correspondent

Follow @FTMag on Twitter to find out about our latest stories first




