{"id":22787,"date":"2024-11-09T09:56:02","date_gmt":"2024-11-09T01:56:02","guid":{"rendered":"https:\/\/www.1ai.net\/?p=22787"},"modified":"2024-11-09T09:56:02","modified_gmt":"2024-11-09T01:56:02","slug":"%e8%8b%b1%e5%9b%bd%e6%98%8e%e5%b9%b4%e5%b0%86%e7%ab%8b%e6%b3%95%e9%98%b2%e8%8c%83-ai-%e9%a3%8e%e9%99%a9%ef%bc%8c%e4%b8%bb%e8%a6%81%e9%9d%a2%e5%90%91-chatgpt-%e7%ad%89%e5%89%8d%e6%b2%bf","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/22787.html","title":{"rendered":"UK to legislate against AI risks next year, mainly for \"cutting-edge models\" like ChatGPT"},"content":{"rendered":"<p>according to<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e8%8b%b1%e5%9b%bd\" title=\"[Sees articles with [British] labels]\" target=\"_blank\" >U.K.<\/a>The Financial Times reports that the UK plans to pass legislation next year to tighten up the <a href=\"https:\/\/www.1ai.net\/en\/tag\/ai\" title=\"[View articles tagged with [AI]]\" target=\"_blank\" >AI<\/a> potentially<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e9%a3%8e%e9%99%a9\" title=\"_Other Organiser\" target=\"_blank\" >exposures<\/a>The precaution. 
<p>The country's science and technology minister, Peter Kyle, said the government will also invest in infrastructure to support the growth of the AI sector.</p>

<p><img loading="lazy" decoding="async" class="alignnone size-full wp-image-22788" title="7c5c05f6j00smnuod00p0d000pu00jkp" src="https://www.1ai.net/wp-content/uploads/2024/11/7c5c05f6j00smnuod00p0d000pu00jkp.jpg" alt="7c5c05f6j00smnuod00p0d000pu00jkp" width="930" height="704" /></p>

<p>Speaking at the Financial Times' Future of Artificial Intelligence Summit on Wednesday local time, Kyle noted that the UK's current voluntary AI testing agreements "work well and are a good framework," but said the upcoming AI bill will turn such agreements with major developers into <strong>legal obligations</strong>.</p>

<p>The bill, which is due to be introduced in the current parliament, would also make the UK's AI Safety Institute an independent body, at arm's length from the government, acting "solely in the interests of the British public."</p>

<p>The legislation will reportedly target primarily <strong>ChatGPT-style "frontier" models</strong>: the state-of-the-art systems, developed by a handful of companies, that can <strong>generate text, image, and video</strong> content.</p>

<p>Kyle also pledged to invest in advanced computing technology to help the UK develop its own home-grown AI and large language models (LLMs). The pledge follows criticism of the UK government's decision to cancel financial support for the University of Edinburgh's planned <strong>exascale supercomputing project</strong>, which had originally been promised £800 million (currently about RMB 7.42 billion) in government funding.</p>