Algorithms and experiments make strange bedfellows at SXSW

Data has to be a two-way street, these panelists agreed.
  • Lamont Wood (Computerworld (US))
  • 15 March, 2016 05:50

Search and social media algorithms can produce strange results for their users -- but fiddling with them can produce strange results from their users. In both cases the results may be too important to leave purely to machines.

That was the message from two conference sessions this weekend at South by Southwest (SXSW) Interactive in Austin, as the largely youth-oriented technology conference moved into its second day under clearing skies.

"How to raise your IQ by eating gifted children," suggests the auto-complete feature of a search engine. Another tagged a specific ethnic group of humans as gorillas. In yet another case, women were not shown high-end job openings. And, Facebook will say that you are getting a particular advertisement "because you are similar to our customers."

Christian Sandvig, a professor at the University of Michigan, presented these examples as part of a session titled "Algorithmic Lunacy and What to Do About It."

"We have a crisis of confidence in algorithms," he said, calling for "seamfull" (as opposed to seamless) design that makes the algorithms visible rather than invisible. "But that is premised on having non-evil designers who are trying to help (rather than delude) you," he added.

Fellow panelist and Intel research scientist Dawn Nafus told of her fitness monitor's algorithm harping at her as unhealthy for not taking enough steps each day, during a period when she was confined to a wheelchair after an accident. As a result, she would not have qualified for a workplace health insurance discount, she complained.

She called on the industry to make data downloads the norm so that users can go "off script" and "domesticate" their data through personal analysis.

She recalled a friend who analyzed the data from his sleep monitor and found he was waking every morning at exactly 3 a.m. He discovered that a computer in the room ran a backup at that hour and was waking him. Another friend saw that her periodic exercise routine was triggering her autoimmune disease and so was actually hurting her health.

"Make data downloading the norm," she pleaded. "It's super-cheap to build in a data download button."

But when Facebook experimented with the algorithm that selects the items in individual users' news feeds, many users were deeply troubled, as panelists discussed in a session titled "Massive Online Experiments: Practical Advice." The so-called emotional contagion experiment of 2012 (published in 2014) altered the feeds of about 700,000 users to see whether their emotions, as evidenced by the language of their own postings, changed in response to changes in the emotional content of what they were shown.
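The emotional measure in the published study was a simple word count over users' posts. A stripped-down illustration of that kind of measure, with toy word lists standing in for the full lexicons the researchers used:

```python
# An illustrative sketch (not the study's actual code) of a word-count
# emotion measure: the rate of positive and negative words in a user's
# own posts. These tiny sets are stand-ins for real lexicons.
POSITIVE = {"happy", "love", "great", "wonderful"}
NEGATIVE = {"sad", "angry", "awful", "terrible"}

def emotion_rates(posts):
    words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
    total = max(len(words), 1)
    pos = sum(w in POSITIVE for w in words) / total
    neg = sum(w in NEGATIVE for w in words) / total
    return pos, neg

# Averaging these rates over users in each condition and comparing them
# is the basic shape of such an analysis.
print(emotion_rates(["What a wonderful day!", "I love this band."]))
```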

There could be a number of reasons for the backlash, said Duncan Watts, principal researcher at Microsoft Research. The word "experiment" has negative associations with rats and petri dishes, he said, and the need to resort to an experiment implies disappointing ignorance on the part of those running things. Meanwhile, people are not comfortable with machine-based randomization, although it is integral to many experiments; they would rather believe that humans are making the choices, he added.
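Machine-based randomization of the kind Watts describes is typically a deterministic hash of a user ID, so that no person picks who sees what. A sketch, with hypothetical names:

```python
# Machine-based randomization, sketched: hash the user ID with an
# experiment name and map it to a bucket. Assignment is deterministic
# and repeatable, and no human chooses who is treated -- the very
# property Watts says makes people uneasy.
import hashlib

def bucket(user_id: str, experiment: str, n_buckets: int = 2) -> int:
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % n_buckets

condition = "treatment" if bucket("user_42", "feed_study") == 1 else "control"
print(condition)
```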

Jeff Hancock, a Stanford University professor who was involved in the study, analyzed the hate mail he received and found four themes. Some asked, "How dare you manipulate my news feed?" although the feed was always generated by an algorithm of some sort. Others accused him of attempted mind control. Some complained that the news feed was important to them. And some asked whether they had been part of the experiment.

But the biggest problem, he added, was probably that Facebook violated their expectations. "They thought of it as a platform, and platforms don't experiment on people," he said.

Consent is not always required for such experiments, Hancock added, especially when users are exposed to only minimal risk.

Elizabeth Churchill, Google's director of user experience, told the conference that designers should routinely run experiments to support their design decisions as part of the process of creating a user experience, using qualitative and quantitative measures that work together to improve overall quality.
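Churchill did not prescribe methods, but the quantitative half of that advice often comes down to comparing outcomes across design variants. A minimal sketch with made-up numbers, using a standard two-proportion z-test:

```python
# A minimal sketch of a quantitative check behind a design decision:
# compare task-completion rates for two design variants with a
# two-proportion z-test. All counts here are made up for illustration.
from math import sqrt

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(430, 1000, 465, 1000)  # variant A vs. variant B
print(f"z = {z:.2f}")  # |z| > 1.96 would indicate significance at p < .05
```

The numbers say whether a variant wins; the qualitative work Churchill pairs them with explains why.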