It is obvious that the next generation of “digital natives” – people who have never known a world without the Internet – has a completely different approach to algorithmic and AI-driven predictions, influences and controls. At our student workshop, it became evident that, although young people have little hesitation about engaging with new technologies, they are certainly critical of the possible effects on our communities.

Analog and digital are not mutually exclusive

We live in a world where it seems neither possible nor reasonable to turn away from digitization. Why should we? In most cases the benefits outweigh the drawbacks; rejecting digital technologies would amount to withdrawing from society. Anyone who longs for the good old analog world may also believe that everything was better in the old days. But that’s not the point. We cannot do without “analog experiences” or digital progress, because they complement each other. For example, it is important that children be introduced to digital technologies with reason and critical reflection, while self-awareness in a supposedly analog world should not be neglected. If “analog” is understood as allowing users and policy-makers more active control and decision-making in the future, temporary slowdowns in technological development may be accepted. Rejecting digital technologies altogether, however, is the wrong approach.

Algorithms have to be able to explain themselves

It was emphasized on several occasions in the discussions that machines, unlike humans, have no morality, no feelings, no consciousness and no intentions. That is why it is up to us – citizens, decision-makers and politicians – to define the framework within which intelligent systems are allowed to operate. Creating this framework is perhaps one of the greatest challenges of our time. This also means that we should not develop and use machines or technologies that we no longer understand or whose decisions and actions are no longer manageable or predictable by people.

How do we want to use social robots?

Under the right circumstances, a social robot in the form of a humanoid companion has many benefits. It could provide support or relieve stress in various situations, for example as a companion for lonely or elderly people with dementia. It could not only tackle loneliness but also alert emergency services in dangerous situations.

It would be a human regression, however, if such robots were used out of laziness or a low regard for interpersonal relationships, and if they replaced all human interaction. This is a question of dignity, and there are no technological shortcuts here.

If privacy is an illusion

Maybe the end of privacy has already begun. But even if we leave behind the “illusion” of our privacy, we should think about who controls our data and what they are used for. Politicians should ensure that the data of Austrian citizens are stored and archived on servers in Europe or Austria, not in the US or China. In any case, the discussion about the end of privacy should inspire us all to re-examine our own opinion-forming processes, to promote the meaningful use of new technologies and to actively participate in shaping them.


In order to incorporate the views and concerns of young people, ACADEMIA SUPERIOR organizes an intensive, interdisciplinary workshop for students from a wide range of disciplines every year in the run-up to the symposium. Four of them were given the opportunity to participate in the SURPRISE FACTORS SYMPOSIUM as members of the Young Academia and to discuss “Measuring the Future” with the invited international experts.

These four students participated in this year’s symposium:

Alexander Grentner
Medical and Bioinformatics, University of Applied Sciences Hagenberg

Barbara Angelika Siedler, BSc.
Industrial Design, Art University Linz

Philip Tazl, BSc.
Philosophy, University of Vienna, and Economics, Vienna University of Economics and Business

Julia Wiesinger, BA
Supply Chain Management, University of Applied Sciences Steyr