Optimising Peace through a Universal Global Peace Treaty to Constrain Risk of War from a Militarised Artificial Superintelligence
John Draper
No 4268q, SocArXiv from Center for Open Science
Abstract:
An artificial superintelligence (ASI) emerging in a world where war is still normalised may constitute a catastrophic existential risk, either because the ASI might be employed by a single nation-state to wage war for global supremacy or because the ASI goes to war on its own behalf to establish global supremacy; these risks are not mutually exclusive, in that the first can transition to the second. We presently live in a world where few states actually declare war on each other or even wage war on each other. This is because Article 2 of the 1945 United Nations Charter states that UN member states should “refrain in their international relations from the threat or use of force against the territorial integrity or political independence of any state”, while allowing for “military measures by UN Security Council resolutions” and the “exercise of self-defense”. In this theoretical ideal, wars are not declared; instead, 'international armed conflicts' occur. However, costly interstate conflicts, both ‘hot’ and ‘cold’, still exist, for instance the Kashmir Conflict and the Korean War. Furthermore, a ‘New Cold War’ between AI superpowers (the United States and China) looms. An ASI-directed or ASI-enabled future interstate war could trigger ‘total war’, including nuclear war, and is therefore ‘high risk’. One risk-reduction strategy would be optimising peace through a Universal Global Peace Treaty (UGPT), which could contribute to ending existing wars and preventing future wars through conforming instrumentalism. A critical juncture for optimising peace via the UGPT is emerging: the treaty could be leveraged off the announcement of a ‘burning plasma’ fusion reaction, expected circa 2025 to 2035, much as was attempted, unsuccessfully, with fission and atomic war in 1946. While this strategy cannot cope with non-state actors, it could influence state actors, including those developing ASIs, or an ASI with agency.
Date: 2020-04-15
Downloads: https://osf.io/download/5e97108243016604b5a0478d/
Persistent link: https://EconPapers.repec.org/RePEc:osf:socarx:4268q
DOI: 10.31219/osf.io/4268q