Artificial intelligence and digital technologies were the focus of the 9th Surprise Factors Symposium of ACADEMIA SUPERIOR. The invited experts discussed the risks and opportunities of technological developments and the use of the data that people leave behind as a digital footprint on the Internet. The results were presented at the PLENUM, attended by more than 700 visitors, in the Toscana Congress Center.

"We are pleased to have won psychologist and data scientist Michal Kosinski as the keynote speaker of this year's symposium. He was featured in a sensational article titled 'I've only showed that the bomb exists', in which he demonstrated that with today's data analysis capabilities the behavior of people can not only be predicted but also influenced. This was evident in the last US presidential election and the Brexit vote, both of which had a completely different outcome than expected," emphasized chairman Michael Strugl in his speech. "It is a highly dramatic development that the results of social behavior analysis are used not only for commercial purposes but also to influence political decisions," Strugl said.

Human or machine?

"It's not about who is stronger or smarter, the machine or the human, but about how we can use technology to shape a positive future. Upper Austria should actively take part in this, so it is good that a new study program for Artificial Intelligence has been created at the Johannes Kepler University in Linz, which also conducts research in this area," emphasized Governor Thomas Stelzer in his speech. "I am very much in favor of using and further developing these fascinating technologies in Upper Austria, bringing together artificial and human intelligence in order to meet the challenges of the future," Stelzer continued.

Privacy is an obsolete model

Prof. Michal Kosinski underpinned his provocative thesis that privacy is an obsolete model with concrete facts: "We all leave a digital footprint with our activities on the Internet. Already in 2012, an individual produced a data volume of 500 megabytes per day, and according to forecasts this will reach 62 gigabytes per day by 2025. And it takes only 250 likes on Facebook for an algorithm to assess a person as well as his or her life partner does," said Kosinski.

"I too am concerned about data misuse, but I am convinced that 99.9% of algorithms are used positively, to help people. That is why I believe we have to accept that privacy is a thing of the past and should focus instead on minimizing risks and maximizing benefits," emphasized Kosinski, adding: "Only if we accept reality can we discuss the necessary policies."

"Privacy is an illusion. The sooner you accept reality, the sooner you can reasonably talk about the necessary policy." – Michal Kosinski

In the following panel, journalist and digitization critic Susanne Gaschke, robot researcher and designer Nadia Thalmann, who has created a social robot named Nadine that is also capable of emotional behavior, and geneticist and Scientific Director of ACADEMIA SUPERIOR Markus Hengstschläger discussed the topic, moderated by Melinda Crane.

More technology assessment needed

The German journalist Susanne Gaschke warned against "digital dumbing down" and pleaded for intensive "technology impact assessment" to reduce the risks of digitization: "We often use digital possibilities out of sheer convenience, without sufficiently considering their negative effects: online commerce, for example, leaves inner cities deserted and increases the traffic problem, and the huge amounts of data require ever higher storage capacities with corresponding power requirements," said Gaschke.

Ethical rules for robots

Robot researcher and designer Nadia Thalmann emphasized that robots must also have rules limiting their behavior: "Even if robots will never really feel and can only simulate emotions, we still have to anchor their behavioral limitations in their software," said Thalmann.

Slowing down technical progress

Markus Hengstschläger, Scientific Director of ACADEMIA SUPERIOR, argued that humans should not implement everything that is technically or scientifically possible. "It is also up to politicians to slow down technological development in such a way that humans can still keep up with it," underlined Hengstschläger.

Christine Haberlander new chair of ACADEMIA SUPERIOR

After switching from politics to the business sector, Michael Strugl has now also handed over the presidency of the think tank ACADEMIA SUPERIOR, which he founded together with Markus Hengstschläger, to Vice-Governor Christine Haberlander. "I am looking forward to this new task and invite everyone to develop the future a little further with us," said Haberlander.