Human Rights Commission: We're both the beneficiaries of tech and the “ones facing the guillotine”

Commission seeking submissions as it works to protect society from harmful tech

Today’s technology can bring us to tears. Speaking yesterday at the Australian Human Rights Commission’s technology conference in Sydney, Emma Bennison of Blind Citizens Australia recounted her experience of using an app called Aira for the first time.

She had come out of an unfamiliar building and wanted a coffee. So through Aira, she connected to a remote assistant who could view a live stream from a camera fitted to her glasses. The assistant found a nearby café, guided her to her seat and took her through the menu.

“It really does spell liberation for me,” Bennison said. “I have found myself reduced to tears by this technology, just because it is such a freeing experience.”

But it's not always tears of joy. On the other side of the coin, technologies like the NSW Police’s data- and algorithm-driven ‘crime prediction’ Suspect Target Management Program (STMP) have overwhelmingly targeted Aboriginal youths (some as young as ten).

Analysis has shown that an Aboriginal child under the age of 15 is almost 31 times more likely to be singled out by the STMP than their white counterparts, leading the program to be dubbed “racist”.

Whether we like it or not, we are living in “revolutionary times” brought about by various technologies that can both help and harm society and its citizens, said Human Rights Commissioner Edward Santow yesterday at the launch of a major project on rights and technology.

“I began by saying we are living in revolutionary times. I noticed when I said that there was the odd raised eyebrows. And fair enough, where are the angry citizens with pitchforks, the catchy but menacing songs, the old leaders meeting a brutal end?” Santow said.

“Clearly this revolution is different. As we make and consume technology, we are simultaneously the revolution’s beneficiaries and also the ones facing the guillotine. As we surround ourselves with ever-increasing numbers of more powerful tech gadgets we risk sleepwalking into a world that cannot and does not protect our most basic human rights. But it’s not too late,” he added.

The aim of the two-to-three-year project is to identify the issues, research and consult on how to respond to them, and then develop a “practical and innovative roadmap for reform”. The project has kicked off with a new issues paper, which is now open for submissions.

“One response would be to reject technological innovation. But like Canute [the king famed for trying to stop the tide] we would fail. New technology is coming whether we like it or not. And in trying to create a Luddites’ paradise, we could lose important opportunities to benefit from AI and related technology,” said Santow.

“The smarter alternative is to understand the challenges that new technology poses to our basic human rights, and establish a legal and broader framework that addresses those risks,” he added.

At the event, the audience heard from major technology vendors like Microsoft, Google and Salesforce; academics including Genevieve Bell, Toby Walsh and Mary-Anne Williams; and government agencies. Also appearing were ABC managing director Michelle Guthrie and UN Special Rapporteur on the right to privacy Dr Joseph Cannataci.

The conference’s line-up received criticism from some quarters for its lack of representation from civil liberties and privacy groups. Only one speaker fell into that category: Brett Solomon, executive director of international digital rights group Access Now.

A better way

Australia’s Chief Scientist Dr Alan Finkel explained the dystopian potential of AI and emerging technologies with a reminder from history.

“It was data that made a crime on the scale of the Holocaust possible. Every conceivable dataset was turned to the service of the Nazis and their cronies,” he said.

There was now an opportunity, he said, for Australia to determine its own response to the “revolution”.

“It does not mean that we have to accept a future that we are handed by companies from China, or Europe, or the United States. To the contrary, we can define our own futures by being leaders in the fields of ethics and human rights. And that is my aspiration for Australia, to be human custodians,” Finkel said.

“In my mind that means showing the world how an open society, a liberal democracy, and a fair minded people, can build artificial intelligence into a better way of being, a better way of living,” he added.
