{"id":5463,"date":"2025-10-20T11:49:17","date_gmt":"2025-10-20T11:49:17","guid":{"rendered":"https:\/\/cybersecurityinfocus.com\/?p=5463"},"modified":"2025-10-20T11:49:17","modified_gmt":"2025-10-20T11:49:17","slug":"openai-tightens-sora-rules-after-fake-martin-luther-king-videos","status":"publish","type":"post","link":"https:\/\/cybersecurityinfocus.com\/?p=5463","title":{"rendered":"OpenAI Tightens Sora Rules After Fake Martin Luther King Videos"},"content":{"rendered":"<p>OpenAI has halted the use of Martin Luther King Jr.\u2019s likeness in Sora after users created fake clips depicting the civil rights leader in \u201cdisrespectful\u201d ways. The company confirmed the move following growing backlash over AI-generated videos that appeared to mimic Dr. King\u2019s voice and image.<\/p>\n<p>In a joint statement posted on X, OpenAI and the King Estate said Sora generations featuring Dr. King are paused while the company strengthens \u201cguardrails for historical figures.\u201d<\/p>\n<h2 class=\"wp-block-heading\">When tribute turns tacky<\/h2>\n<p>The uproar followed the spread of hyperrealistic clips showing a digital Dr. King speaking and behaving in ways far removed from the activist\u2019s real-life image. The <a href=\"https:\/\/www.eweek.com\/artificial-intelligence\/deepfake\/\">deepfakes<\/a>, generated with <a href=\"https:\/\/www.eweek.com\/openai\/news-openai-sora-2-video\/\">OpenAI\u2019s Sora tool<\/a>, blurred the line between homage and exploitation, and quickly ignited public anger over how easily technology can twist historical memory.<\/p>\n<p>The <a href=\"https:\/\/x.com\/OpenAINewsroom\/status\/1979005850166648933\" target=\"_blank\" rel=\"noopener\">joint statement on X <\/a>framed the pause as part of a broader effort to tighten safeguards around public figures. 
OpenAI said it recognizes the balance between creative freedom and respect for legacy, emphasizing that families and estate holders should have a say in how their likeness is used.\u00a0<\/p>\n<p>The <a href=\"https:\/\/www.eweek.com\/artificial-intelligence\/ai-companies\/\">AI company<\/a> added that the action was guided by dialogue with Dr. Bernice A. King, John Hope Bryant, and OpenAI\u2019s <a href=\"https:\/\/www.eweek.com\/artificial-intelligence\/ai-ethics\/\">AI Ethics<\/a> Council on preserving dignity in digital representations.<\/p>\n<h2 class=\"wp-block-heading\">Sora\u2019s growing list of red lines<\/h2>\n<p>While OpenAI moves to protect real faces, its fictional ones are stirring new fights.<\/p>\n<p>Since the release of Sora 2, users have been churning out videos <a href=\"https:\/\/www.eweek.com\/news\/sora-generating-copyrighted-characters\/\">featuring household names like SpongeBob, Pikachu, and Mario<\/a>, pushing the <a href=\"https:\/\/www.eweek.com\/artificial-intelligence\/ai-software\/\">AI tool<\/a>\u2019s creative boundaries and legal limits.<\/p>\n<p>To stem the tide, OpenAI has started <a href=\"https:\/\/www.eweek.com\/news\/news-sora-update-copyright\/\">letting rightsholders opt out<\/a> of having their characters or works recreated. Disney was among the first to exercise that option, effectively blocking Spider-Man and Darth Vader from appearing in Sora generations.<\/p>\n<p>The new policy mirrors OpenAI\u2019s approach to human likenesses. Just as estates can now shield public figures like Dr. King, studios and artists can draw their own lines, deciding where creativity ends and infringement begins.<\/p>\n<h2 class=\"wp-block-heading\">Digital immortality without dignity<\/h2>\n<p>Dr. King\u2019s likeness is only the latest to be pulled into AI\u2019s growing habit of reanimation. 
In recent weeks, Robin Williams\u2019 daughter, Zelda Williams, <a href=\"https:\/\/www.eweek.com\/news\/ai-celebrity-deepfakes-backlash\/\">condemned videos that used her father\u2019s digital recreations<\/a>, calling them \u201cgross\u201d and a \u201cwaste of time and energy.\u201d She said the clips reduced a person\u2019s memory to something that only \u201cvaguely looks and sounds like them.\u201d<\/p>\n<p>Another AI-generated video showed rapper Tupac Shakur casually shopping in a Target store, a scene many viewers found unsettling. As realism accelerates, Hollywood and celebrity estates are scrambling to reclaim control.<\/p>\n<p>What began as creative play now raises deeper questions about ownership, memory, and respect. Recreation without consent turns innovation into possession.<\/p>\n<p>The post <a href=\"https:\/\/www.eweek.com\/openai\/openai-halts-mlk-sora-videos\/\">OpenAI Tightens Sora Rules After Fake Martin Luther King Videos<\/a> appeared first on <a href=\"https:\/\/www.eweek.com\/\">eWEEK<\/a>.<\/p>","protected":false},"excerpt":{"rendered":"<p>OpenAI has halted the use of Martin Luther King Jr.\u2019s likeness in Sora after users created fake clips depicting the civil rights leader in \u201cdisrespectful\u201d ways. The company confirmed the move following growing backlash over AI-generated videos that appeared to mimic Dr. King\u2019s voice and image. 
In a joint statement posted on X, OpenAI and [&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[],"class_list":["post-5463","post","type-post","status-publish","format-standard","hentry","category-news"],"_links":{"self":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/posts\/5463"}],"collection":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=5463"}],"version-history":[{"count":0,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/posts\/5463\/revisions"}],"wp:attachment":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=5463"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=5463"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=5463"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}