{"id":59,"date":"2024-02-28T00:08:17","date_gmt":"2024-02-28T00:08:17","guid":{"rendered":"https:\/\/caltechaia.org\/?page_id=59"},"modified":"2024-02-28T00:35:59","modified_gmt":"2024-02-28T00:35:59","slug":"what-is-ai-alignment","status":"publish","type":"page","link":"https:\/\/caltechaia.org\/index.php\/what-is-ai-alignment\/","title":{"rendered":"What is AI Alignment?"},"content":{"rendered":"\n<p>As an emerging field, AI Alignment has many definitions. Broadly, AI Alignment is a research field aimed at tackling two questions: \u201cHow do we ensure that the development of advanced artificial intelligence benefits humanity?\u201d and \u201cHow do we avoid catastrophic failures while building advanced AI systems?\u201d<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Starter resources<\/strong><\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Short Article: <a href=\"https:\/\/www.vox.com\/future-perfect\/2018\/12\/21\/18126576\/ai-artificial-intelligence-machine-learning-safety-alignment\">The case for taking AI seriously as a threat to humanity<\/a> by Kelsey Piper, Vox<\/li>\n\n\n\n<li>Article: <a href=\"https:\/\/80000hours.org\/problem-profiles\/artificial-intelligence\/\">Preventing an AI-related catastrophe<\/a> by Ben Hilton, 80,000 Hours<\/li>\n\n\n\n<li>Video: <a href=\"https:\/\/www.youtube.com\/watch?v=pYXy-A4siMw\">Intro to AI Safety<\/a> by Rob Miles<\/li>\n\n\n\n<li>Report: <a href=\"https:\/\/futureoflife.org\/background\/benefits-risks-of-artificial-intelligence\/\">Benefits &amp; Risks of Artificial Intelligence<\/a> by Ariel Conn, Future of Life Institute<\/li>\n\n\n\n<li>Syllabus: <a href=\"https:\/\/www.eacambridge.org\/agi-safety-fundamentals\">AGI Safety Fundamentals Curriculum<\/a> by Richard Ngo, OpenAI<\/li>\n\n\n\n<li>More: <a href=\"https:\/\/www.aisafetysupport.org\/resources\/lots-of-links\">Lots of Links<\/a> from AI Safety Support<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image size-full\"><img fetchpriority=\"high\" 
decoding=\"async\" width=\"828\" height=\"466\" src=\"https:\/\/caltechaia.org\/wp-content\/uploads\/2024\/02\/datahazards_industries_2560x1440.png\" alt=\"\" class=\"wp-image-84\" srcset=\"https:\/\/caltechaia.org\/wp-content\/uploads\/2024\/02\/datahazards_industries_2560x1440.png 828w, https:\/\/caltechaia.org\/wp-content\/uploads\/2024\/02\/datahazards_industries_2560x1440-300x169.png 300w, https:\/\/caltechaia.org\/wp-content\/uploads\/2024\/02\/datahazards_industries_2560x1440-768x432.png 768w\" sizes=\"(max-width: 828px) 100vw, 828px\" \/><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>As an emerging field, AI Alignment has many definitions. Broadly, AI Alignment is a research field aimed at tackling two questions: \u201cHow do we ensure that the development of advanced artificial intelligence benefits humanity?\u201d and \u201cHow do we avoid catastrophic failures while building advanced AI systems?\u201d Starter resources<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"om_disable_all_campaigns":false,"footnotes":""},"class_list":["post-59","page","type-page","status-publish","hentry"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/caltechaia.org\/index.php\/wp-json\/wp\/v2\/pages\/59","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/caltechaia.org\/index.php\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/caltechaia.org\/index.php\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/caltechaia.org\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/caltechaia.org\/index.php\/wp-json\/wp\/v2\/comments?post=59"}],"version-history":[{"count":4,"href":"https:\/\/caltechaia.org\/index.php\/wp-json\/wp\/v2\/pages\/59\/revisions"}],"predecessor-version":[{"id":85,"href":"https:\/\/caltechaia.org\/index.php\/wp-json\/wp\/v2\/pages\/59\/revisions\/85"}]
,"wp:attachment":[{"href":"https:\/\/caltechaia.org\/index.php\/wp-json\/wp\/v2\/media?parent=59"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}