Towards an International Legal Framework for Lethal Artificial Intelligence Based on Respect for Human Rights: Mission Impossible
| Main Authors: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | University of Ljubljana, Faculty of Law, 2024-12-01 |
| Series: | Zbornik Znanstvenih Razprav |
| Subjects: | |
| Online Access: | https://journals.uni-lj.si/LLR/article/view/22890 |
| Summary: | This article considers the potential use of autonomous weapons both in and outside armed conflict, including in law enforcement. It analyses the phenomenon from the perspective of human rights law, with a particular focus on the right to life. For over a decade, the international community has debated whether technological advances pertaining to the development of autonomous weapons require the establishment of new rules within the framework of international humanitarian law. In contrast, consideration of such technology from a human rights law perspective has been limited, despite its implications for the right to life and other human rights. In parallel, several international initiatives have emerged in recent years aiming to establish non-binding and binding rules for the development and use of artificial intelligence (AI) based on respect for human rights. This article reviews four such initiatives: the OECD Recommendation on AI, the UNESCO Recommendation on the Ethics of AI, the INTERPOL and UNICRI Toolkit for Responsible AI Innovation in Law Enforcement, and the Council of Europe AI Convention. It examines the extent to which these initiatives address the specific concerns raised by autonomous weapons. |
|---|---|
| ISSN: | 1854-3839 2464-0077 |