Is human behavior becoming more and more predictable through the combination of intelligent algorithms and digital footprints? What are the consequences of this development? These questions were at the center of the discussions at the 9th SURPRISE FACTORS SYMPOSIUM, where three internationally renowned experts were invited to share their insights with ACADEMIA SUPERIOR and to develop ideas and recommendations.

The discussions were moderated by the journalist Dr. Melinda Crane and the scientific director of ACADEMIA SUPERIOR, Univ.-Prof. Dr. Markus Hengstschläger. Members of the Scientific Advisory Board and the Young Academia also took part. "The technological developments, like the thesis 'privacy is gone', are here and we have to deal with them. Turning back the clock is not possible. So the big question is: how can we integrate these new technologies into our society so that our society remains appreciative, benevolent, open, capable of criticism and responsible," said LH-Stv. Mag. Christine Haberlander, describing the aim of the discussions.

Privacy is an illusion

Prof. Michal Kosinski, PhD, of Stanford University showed in a 2016 article titled "I just showed that the bomb is there" that with today's data analysis capabilities, people's behavior is not only predictable but can also be influenced.

Kosinski underscored his provocative thesis that privacy is an obsolete model with clear facts and figures: "We all leave a digital footprint with our activities on the Internet. As early as 2012, a person generated a data volume of 500 megabytes per day. According to forecasts, this will rise to 62 gigabytes per day by 2025." This digital footprint results from the use of smartphones, social media, the Internet, voice assistants, surveillance cameras, sensors, credit cards and the like, because all 'smart' devices record data.

Facebook knows us better than our partners

In a study with 60,000 participants, Kosinski's team showed that an algorithm needs only about 250 Facebook Likes to assess a person in a personality test as accurately as his or her partner can. Such profiles can be used for individually tailored advertising or marketing messages.
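To make the mechanism behind such Like-based profiling concrete, the sketch below follows the general recipe of representing users as a binary user-by-Like matrix, compressing it into latent dimensions, and regressing those against personality questionnaire scores. It is a minimal sketch on assumed, synthetic data; the matrix sizes, the SVD-plus-ridge pipeline and all names are illustrative stand-ins, not the study's published code.

```python
# Minimal illustrative sketch (not the study's actual code): predicting a
# personality trait score from a binary user-by-Like matrix.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

n_users, n_likes = 5_000, 2_000  # placeholder sizes, not the study's data
X = (rng.random((n_users, n_likes)) < 0.01).astype(np.float32)  # 1 = user liked page
y = rng.normal(size=n_users)     # stand-in questionnaire trait scores

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Compress the sparse Like matrix into latent taste dimensions, then fit a
# linear model from those dimensions to the questionnaire scores.
model = make_pipeline(TruncatedSVD(n_components=50, random_state=0),
                      Ridge(alpha=1.0))
model.fit(X_train, y_train)
print("R^2 on held-out users:", round(model.score(X_test, y_test), 3))
```

On real Like data, the latent dimensions would capture taste patterns that correlate with personality traits; on the random stand-in data used here, the held-out R² naturally hovers around zero.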

The desire to protect privacy is increasingly reaching its limits. "Companies may still be forced to respect it, but criminal organizations or individuals will always get access to sensitive data, and people's own convenience will do the rest," said the psychologist and data analyst from Stanford. Moreover, the vast majority of people "share" their data voluntarily in order to use online services that make their lives easier and better; such services would not work without this data. The expert's conclusion is thought-provoking, because one could almost say: whoever does not release his data behaves antisocially and benefits from the fact that many others do.

But the development has darker sides: in his latest study, Kosinski showed that an artificial intelligence needs only five profile pictures of a person to classify his or her sexual orientation with 80–90 percent certainty. "A circumstance that is rather secondary in our liberal societies, but in states where homosexuality is punished by death, it is a matter of life and death," Kosinski pointed out.

"Privacy is an illusion. The sooner you accept this reality, the sooner you can reasonably talk about the necessary policy." – Michal Kosinski

"I am also worried about data misuse, but I am convinced that 99.9% of algorithms are used positively, to help people. Therefore, I am in favor of accepting that privacy is a thing of the past, and I think we should focus on minimizing the risks and maximizing the benefits," Kosinski emphasized, adding: "Only if we accept this reality can we start to discuss the necessary policies and get the most out of the new technology."

Ethical rules for robots

Robot researcher and designer Univ.-Prof. Dr. Nadia Thalmann from Nanyang Technological University described her experiences with the use of social robots. Technological advances enable machines to recognize human language, gestures and emotions from movement, sound and image, to remember people, to answer questions and to speak in multiple languages.

The robot Nadine, created by Prof. Thalmann, is currently deployed in a test trial in the customer service department of an insurance company. Preliminary results show not only that the robot responds to inquiries faster than its human colleagues, but also that customers accept it well.

The researcher at NTU Singapore currently sees the future of her robots above all in care work with elderly people, an area facing a huge labor shortage within a few years. "Social robots could act as companions and support for people," Thalmann said.

"Politics and society should now decide what we want to allow robots to do. The time to discuss this is now." – Nadia Thalmann

To make sure that robots can really be put to good use, suitable framework conditions are needed. There must be rules for robots that control their behavior: "Even if robots never really feel and only simulate emotions, we still have to anchor limits to their behavior in their software," argued Thalmann, who has programmed her robot Nadine to be honest.
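What "anchoring limits in software" might look like in the simplest case is sketched below: a rule layer that vets every action a robot plans against a fixed list of prohibitions before execution. The action names, rules and API are hypothetical illustrations, not Nadine's actual implementation.

```python
# Hypothetical sketch: hard behavioral limits enforced in software, outside
# the robot's dialogue logic. Rules and action names are illustrative.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    utterance: str = ""

# Limits the robot may never cross, regardless of its dialogue state.
FORBIDDEN_ACTIONS = {"deceive_user", "withhold_identity"}

def is_permitted(action: Action) -> bool:
    """An action passes only if it violates none of the hard rules."""
    return action.name not in FORBIDDEN_ACTIONS

def execute(action: Action) -> None:
    """Run the rule check before any planned action is carried out."""
    if not is_permitted(action):
        print(f"Blocked by rule layer: {action.name}")
        return
    print(f"Executing: {action.name} -> {action.utterance!r}")

# A robot programmed to be honest must, for example, disclose that it is a
# machine rather than pretend to be human.
execute(Action("disclose_identity", "I am a robot assistant."))
execute(Action("deceive_user", "I am a human agent."))
```

The point of such a design is that the check sits outside the conversational model: however the dialogue system behaves, actions on the forbidden list can never reach execution.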

Robot workforce

Will robots replace the human workforce? Robot researcher Thalmann sees no evidence for this in the near future, even though her tests and projects are extremely successful: "Nadine can handle some narrowly defined areas well, but she will not take on a full-fledged human job in the foreseeable future, only partial tasks," Thalmann explained. She pointed out that, despite centuries of research, we understand only a fraction of human psychology and physiology. "But we know everything about our robots. They are by far not as complex as humans," says Thalmann.

In general, the native Swiss, who now lives in Singapore, identified significant cultural differences between the US, Europe and East Asia in the use of robots. While Asians tend to be more open to new technologies, Europeans are more skeptical and cautious at first. "In Asia, for example, if you find out that a robot can do a job as well as or better than a human, then you will use a robot. Above all, efficiency is what counts here," says Thalmann.

More technology assessment needed

The journalist Susanne Gaschke warned against "digital dumbing down" and called for more intensive technology assessment in order to predict and minimize the risks of digitization: "In many cases, we use digital possibilities out of sheer convenience without sufficiently considering their negative effects: online retail, for example, leaves inner cities desolate and adds to the traffic problem. The huge amounts of data require ever higher storage capacities with corresponding energy requirements; this is a highly unecological system," said Gaschke.

Digitization also opens up new educational challenges: "Adults and children need to learn how to find meaningful and accurate information online." News communication on the Internet also carries risks, said the journalist: "Extreme opinions are rated highly on social platforms and are accordingly widespread. Today, journalists deliberately exploit this logic and thus contribute to the strengthening of these ideas."

"You have to know a lot in order to get a lot out of the internet." – Susanne Gaschke

Especially in education, the use of digital technologies should be carefully considered, because new studies show that children who come into contact with "digital distraction machines" early may practice essential (human) skills too little. Nevertheless, digitization also has many positive sides: regarding the lack of workers in the care sector, Gaschke said that robots as nurses are still better than no care at all.

She pleads for a broader discussion of the negative and positive consequences of technological development: "Technological progress is almost considered sacrosanct today. You cannot discuss it publicly. But that should be possible."

Don’t do everything you can do

Markus Hengstschläger argued in the discussions that humans should not implement everything that is technically or scientifically possible. In certain areas, where the risks cannot be predicted, caution is required. As an example of something that was possible but internationally rejected, the geneticist referred to the recently published cases of genetically modified embryos in China. Within a very short time, politics and science around the world, and even in China itself, decided not to allow such experiments.

"It is also up to politics to slow down technological development so that people can still keep up with it," emphasized Hengstschläger, underlining that more effort is needed on global agreements about the major challenges humanity is facing. All of these new technologies can and should be used for the benefit of people, "but we need permanent ethical support in discussing technological innovations," said Hengstschläger. And the moderator of the symposium, Melinda Crane, summarized: "We cannot stop these new technologies. But we have to, and can, shape them, to minimize the risks and maximize the benefits."

SURPRISE FACTORS PLENUM
First results and insights into the topic of the symposium were presented at the PLENUM to more than 700 visitors in the congress center Toscana.
All videos and content from the PLENUM >

SURPRISE FACTORS REPORT
The results and ideas of the symposium will be summarized in the SURPRISE FACTORS REPORT.

The reports of the last years >