I am also interested in this thread - particularly the bit about preventing
bias and discrimination being coded into AI (and other digital technology),
and how we can possibly find algorithms for spotting this bias and flagging
it up - which assumes in the first instance that we recognise that it's
there - and also the use of other bias interrupters.

Any other real examples of this would be gratefully received, as I am trying
to put together case studies for undergrads at Aston at the moment. Thanks
for the examples in the message below.

 

Best wishes

Dawn

 

Dawn Bonfield MBE

CEng HonFIStructE FICE FIMMM FWES

Royal Academy of Engineering Visiting Professor of Inclusive Engineering,
Aston University

Director, Towards Vision

[log in to unmask]

Tel. 01438 820850 | M. 07881905520

@dawnbonfield | www.dawnbonfield.com | www.towardsvision.org

 

Working towards a vision of diversity and inclusion in engineering.

2018 is the Year of Engineering. Join me on my 'Meet the Parents' roadtrip to
promote engineering careers:
http://www.towardsvision.org/year-of-engineering-roadshow.html

 

From: psci-com: on public engagement with science
[mailto:[log in to unmask]] On Behalf Of BARKER Daniel
Sent: 01 November 2017 17:11
To: [log in to unmask]
Subject: Re: [PSCI-COM] Guardian: Public backlash to AI

 

I agree with Thomas: there are things to worry about concerning machine
learning and AI. Sometimes these are the things reported; sometimes not.

I've recently read a few books on the topic, and highly recommend "Weapons
of Math Destruction" by Cathy O'Neil. One of her case studies is how
algorithms for policing and sentencing encode and perpetuate prejudices such
as racism.

Also:
"Digital Exhaust" by Dale Neef
"Rise of the Robots" by Martin Ford
"Postcapitalism" by Paul Mason
"Data and Goliath" by Bruce Schneier

And on somewhat related topics:
"The Internet is not the Answer" by Andrew Keen
"The Dark Net" by Jamie Bartlett
"Programmed Inequality: How Britain Discarded Women Technologists and Lost
Its Edge in Computing" by Marie Hicks

I am intrigued by our use of social media in public engagement. Should we
encourage the public to hand privacy over to Twitter, Google and Facebook, so
they can find our thoughts or sign up for our events? This could be argued
either way. I would like to see more debate.

- Daniel

On 01/11/2017 14:34, Thomas Hornigold wrote:

Steve,

I don't want to point fingers, but I feel that there are many aspects to
this problem.

The first is that there are genuine fears of an existential risk that might
arise from superintelligent AI (which, by a general consensus of experts, we
are at the very least decades away from, even if it is possible;
https://en.wikipedia.org/wiki/Superintelligence:_Paths,_Dangers,_Strategies
is a good source for some of the academic work surrounding this). These
genuine concerns are then expressed by popular figures like Stephen Hawking
and Elon Musk, and turned into soundbites of a single line that are splashed
around by the media (usually with some ridiculously misleading picture of
the Terminator attached.) To read some accounts, and certainly some
headlines, you'd think that we already had superintelligent AI cooked up in
some Google lab somewhere that's on the verge of destroying the world. This
is being recklessly exaggerated - and it doesn't help that, e.g., the focus
on robotics and artificial intelligence is often to blow developments out of
proportion.

Take two prominent stories of late: Saudi Arabia "giving a robot
citizenship":
http://www.independent.co.uk/life-style/gadgets-and-tech/news/saudi-arabia-robot-sophia-citizenship-android-riyadh-citizen-passport-future-a8021601.html

Facebook's AI "invents its own language":
http://www.telegraph.co.uk/technology/2017/08/01/facebook-shuts-robots-invent-language/


Both had major flaws in the reporting, especially when you go to less
reputable publications. Nowhere, for example, in the Sophia article do they
mention that all of its responses are pre-scripted; it's hardly as
intelligent as you might believe at first glance.

Facebook's AI story was an interesting scientific paper about how the
particular algorithms they were using deviated from what the researchers
were expecting (and, eventually, from English); but "AI experiment shut down
for getting too intelligent and inventing its own language" (as it was
widely expressed) is highly misleading.


So much for strong AI. Then we have to get onto the topic of what actually
exists: weak AI (optimization algorithms), robotics and automation.

I think in many ways people are right to be skeptical about how well these
fields can benefit them. We're constantly warned that many jobs are in
danger of becoming automated, and you don't often see economic or
social arguments for the benefits to society as a whole. Algorithms are not
always used responsibly; they can take away the human element in
decision-making and hence also the responsibility. The fact that they're
often used to sell us products and direct advertising in a way that some
view as manipulative also gives people cause for concern. Like any new
technology, there is always skepticism about the negative impacts before
there is widespread acceptance; in this case, it is exacerbated by failures
to understand what AI can and can't do, and various looming dystopian
futures that get more coverage for being more lurid.

If anyone is working in robotics or artificial intelligence and would like
to be interviewed for a science communication project that discusses the
benefits and risks of artificial intelligence/robotics, do drop me a line on
this email and we can get in touch.

Rant over...

Thomas


 


  _____  


From: psci-com: on public engagement with science <[log in to unmask]> on
behalf of Steve Pritchard <[log in to unmask]>
Sent: 01 November 2017 14:02
To: [log in to unmask]
Subject: [PSCI-COM] Guardian: Public backlash to AI

 

Hi Sci-Com peeps,

Spotted this in today's Guardian -
https://www.theguardian.com/science/2017/nov/01/artificial-intelligence-risks-gm-style-public-backlash-experts-warn

Now I'd hope that sci-com professionals could help in this case, but it does
seem that we're not learning from previous experience of failure to engage
over new technology.

What's the cause of this? Funding? People not seeing the importance of good
communications? Or is the message getting across - given that this is a
warning in advance of the problem?

Answers on a postcard...

Steve

**********************************************************************

psci-com how-to:
Once subscribed, send emails for the list to [log in to unmask]. If not
subscribed, either subscribe here
https://www.jiscmail.ac.uk/cgi-bin/webadmin?A0=psci-com or send requests for
items to be posted on your behalf to [log in to unmask]

To unsubscribe (or silence messages while away) send an email (any subject)
to [log in to unmask] with one of the following messages (ignoring text in
brackets)

. signoff psci-com (to leave the list)
. set psci-com nomail (to stop receiving messages while on holiday)
. set psci-com mail (to resume getting messages)

Contact list owner at [log in to unmask]
Small print and JISCMail acceptable use policy
https://sites.google.com/site/pscicomjiscmail/the-small-print

**********************************************************************






-- 
Dr Daniel Barker
Institute of Evolutionary Biology
University of Edinburgh
Charlotte Auerbach Road
The Kings Buildings
Edinburgh
EH9 3FL
United Kingdom
 
The University of Edinburgh is a charitable body, registered in Scotland,
with registration number SC005336. 



