Politicians and Artificial Intelligence Refusal: Brief Considerations


  • Marius Vacarelu, National School of Political and Administrative Studies




artificial intelligence, politicians' interests, the advent of technologies, legal framework, democratic institutions, citizens' needs


The technological development of recent decades has changed not only working procedures but also economic hierarchies and, in many cases, has offered ambitious leaders an instrument to bring a new perspective to countries, continents, or even the world in different areas: economy, law, politics, etc.

Within these new dimensions of human life, politicians are asked to create rules for societies, but also to define their own specific tasks. At the same time, politicians are asked to think about the future and to set the main directions for national development, in a way that benefits both today's and future generations. For such development plans, politicians need to consult a great deal of data and create a specific legal framework, able to increase people's skills in every area, within a coherent vision that includes Artificial Intelligence.

Artificial Intelligence may have a special field of action in political competition, which must be regulated by the very actors able to use it. Such a specific legal possibility – being able to regulate one of your own tools – raises many questions for politicians about the limits of Artificial Intelligence use. Politics is a matter of power, and history shows that politicians have often used many tools and administrative procedures to achieve or preserve power. Artificial Intelligence could be used for the same purposes, and it will be important for both scholars and citizens to study how politicians regulate it. In this context, an important topic is whether Artificial Intelligence will be accepted in political competitions and whether politicians will try to restrict its use in the political arena through specific prohibitions. It will also be important to see whether such prohibitions are motivated by the danger to democratic institutions or by other reasons, such as psychological risks to the human mind, implementation costs, etc.