December 12th 2023
Programme
10:00 Opening words (Thomas Joseph White OP, Rector of the PUST / Helen Alford OP, Dean of the Faculty of Social Sciences, PUST)
10:15 Introduction (Gábor L. Ambrus, lead researcher of the project ‘Human Freedom in the Age of AI’)
10:45 ‘Why Is It Not AI That Is Dangerous?’ (Richard Benjamins); response (Frantisek Stech); discussion
11:45 ‘A.I. from the Perspective of Christian Metaphysics’ (Gregory Reichberg); response (Matthieu Raffray); discussion
12:45 Lunch
14:15 ‘A.I. and Human Freedom: How New is the Challenge?’ (Anselm Ramelow OP); response (Mariusz Tabaczek OP); discussion
15:15 ‘Noosphere and Biosphere from Teilhard de Chardin to Our Time’ (Riccardo Pozzo); response (Giovanni Cogliandro); discussion
16:15 Break
16:45 Roundtable discussion with all speakers / Conclusions
17:45 End
The Faculty of Social Sciences at the Pontifical University of St. Thomas Aquinas (Rome) and the Centre for Digital Culture of the Dicastery for Culture and Education (Vatican), in cooperation with the Theology and Contemporary Culture Research Group at Charles University (Prague), cordially invite you to the research seminar entitled
‘Changing the Global Game: Artificial Intelligence, Predictive Analytics and the Human Condition’.
Current trends in the development of artificial intelligence should be a cause for far more concern than the spectre of its becoming a conscious agent with superhuman capabilities. What primarily looms on the horizon is not ‘human obsolescence’, let alone ‘human extinction’, but rather a grave and growing threat to human freedom. This threat is posed by the spreading employment of a specific kind of ‘weak AI’ known as ‘predictive analytics’. Predictive analytics is a method that enables AI to extrapolate from past developments to future outcomes. When trained on a sufficient amount of data from past events, it can identify patterns and predict what these events most probably point towards in the future. So far, the employment of predictive analytics has been restricted to distinct and separate fields, such as social media with its orchestration of third-party ads (Zuboff), medical science working towards goals like cancer prevention (Mittelstadt), or the insurance sector with its optimisation of risk assessment (Loi and Christen). These fields belong to a broad spectrum of uses of predictive analytics in society, which also includes job interviews, college applications, credit analysis, policing and sentencing – all contributing to the current tendency of artificial intelligence to increase social inequality and injustice (O’Neil). To talk about ‘inequality’ and ‘injustice’ in this context is to orient the issue of artificial intelligence and predictive analytics towards the established discourse of ‘the ethics of AI’; to problematise ‘human freedom’ in the same context, however, is to move the same issue into what may be the foundation of ethics, but certainly lies beneath and beyond it: the ontology of human beings.
Incongruent as the terms ‘freedom’ and ‘ontology’ may sound in relation to predictive analytics, the latter is much more than a purely external factor. It does not merely impose confines on human action from without; it also works within human beings. The move of predictive analytics from the ‘external’ to the ‘internal’ is well epitomised by social media, where the algorithm’s ‘nudging’ of users in a certain direction increasingly shapes their ‘inner life’. At the same time, the whole spectrum of applications of predictive analytics is marked by a major aspect of human unfreedom in the face of artificial intelligence in general: the almost blind faith of human beings in its competence and its benefits. This blind faith, combined with the otherwise powerful workings of predictive analytics, lays bare the extreme fragility of human freedom in contemporary technological society. Human unfreedom is not a given condition; rather, it is something that can be performed.
Although the present state of AI and predictive analytics is unsettling enough with regard to human freedom, accelerating trends in information technology, and in technological society as a whole, also cast their shadow on a not-too-distant future. An intensifying technological totalitarianism further accelerates ‘the convergence of the planet’ – a convergence not only by means of technology, but within technology itself. This convergence was already recognised – and also called ‘confluence’ or ‘coalescence’ – by Pierre Teilhard de Chardin in the mid-20th century, prompting his techno-optimistic theological vision of the evolution of the cosmos from matter to life to consciousness to a final stage of history that he termed the ‘Omega point’. For Teilhard de Chardin, the emergence of consciousness, especially after entering the phase of modernity with its transportation and communication technologies, would bring about a total unification of our planet, making it converge into a total ‘sphere of the mind’ and of ‘the hyper-personal’. Yet he was aware that convergence in technological society might indicate an alternative future. Mass society may form, he wrote, ‘in such a way that, instead of the expected mind, a new wave of determinism surges up – that is to say, of materiality.’ And he added that ‘… it is mechanisation that seems to emerge inevitably from totalisation.’ His mid-20th-century vocabulary notwithstanding, Teilhard de Chardin’s insight still applies to the convergence of technologies and of the global social fabric in the wake of digitisation today. For, from the steady convergence of media and entertainment to phenomena like ‘the internet of things’, signs of unification and totalisation are proliferating everywhere. What if a similar convergence awaits the currently separate applications of predictive analytics?
What if Teilhard de Chardin’s ‘Omega point’ were to assume a reversed significance and, instead of becoming the higher consciousness of the ‘hyper-personal’, turned out to be the nadir of unfreedom within a techno-totalitarian determinism?