The Center for Global Agenda thanks the CEO of The Millennium Project for his expert statement
New York, N.Y., April 20, 2022 (PRLog) - The Center for Global Agenda (CGA) at Unbuilt Labs is leading the global stakeholder consultation for the Recommended UN Action Plan to Close the Compliance Gap, a publication of the United Nations Institute for Training and Research (UNITAR). Action Plan CCG will recommend directing impact investment funds and efforts towards four areas of diplomacy, with five themes within each area.
We would like to thank Jerome Glenn, Co-Founder and CEO of The Millennium Project, for his expert statement on Area 4: Disaster Diplomacy, Theme 16: Strategic Foresight. Marvin Cheung, Co-Director of CGA, also expresses his thanks: "I am pleased to have been a signatory to the Open Letter to the UN Secretary-General to conduct a feasibility study for a UN Office on Strategic or Existential Threats, co-authored by The Millennium Project, and to continue supporting research on strategic foresight. Glenn, who has been meeting with Ayaka Suzuki, Director of the Strategic Planning and Monitoring Unit in the Executive Office of the Secretary-General (EOSG), at the recommendation of UN Secretary-General António Guterres, presents a valuable perspective to the Recommended UN Action Plan to Close the Compliance Gap. His statement on Artificial General Intelligence (AGI) complements ongoing deliberations on strategic threats, including advanced global warming, solar radiation, malicious nanotechnology, nuclear incidents, pandemics, and other high-risk events that pose a severe challenge to shared peace and prosperity for all."
The complete statement is included below:
"An international assessment of how to govern the potential transition from Artificial Narrow Intelligence (ANI) to potential Artificial General Intelligence (AGI) is needed. If the initial conditions of AGI are not 'right,' it could evolve into the kind of Artificial Super Intelligence (ASI) that Stephen Hawking, Elon Musk, and Bill Gates have warned the public could threaten the future of humanity via the future globally connected Internet of Things (IoT).
"There are many excellent centers studying values and the ethical issues of ANI, but not potential global governance models for the transition to AGI.
"The distinctions among ANI, AGI, and ASI are usually missing in these studies.
"Current work on AI governance is designed to catch up with the artificial narrow intelligence proliferating worldwide today. Meanwhile, investment into AGI development is forecast to be $50 billion by 2023. Expert judgments about when AGI will be possible vary. Some working to develop AGI believe it is possible to have AGI in as soon as ten years. It is likely to take ten years to: 1) develop ANI to AGI international or global agreements; 2) design the governance system; and 3) begin implementation. Hence, it would be wise to begin exploring potential governance approaches and their potential effectiveness now. We need to jump ahead to anticipate governance requirements for what AGI could become. Beginning now to explore and assess rules for governance of AGI will not stifle its development, since such rules would not be in place for at least ten years. (Consider how long it is taking to create a global governance system for climate change.)"
We invite everyone to participate in the global stakeholder consultation process. Anyone can now submit a public statement. Organizations that wish to co-host one or two workshops with us can also apply for consultative status with CGA. For more details, see https://unbuiltlabs.com/cga. Thank you for your time and support.
###
Media Contact
Marvin Cheung
Co-Director, Center for Global Agenda, Unbuilt Labs
marvin@unbuiltlabs.com