{"id":5181,"date":"2025-10-02T18:58:52","date_gmt":"2025-10-02T18:58:52","guid":{"rendered":"https:\/\/cybersecurityinfocus.com\/?p=5181"},"modified":"2025-10-02T18:58:52","modified_gmt":"2025-10-02T18:58:52","slug":"sora-2-can-generate-spongebob-pikachu-and-other-copyrighted-characters","status":"publish","type":"post","link":"https:\/\/cybersecurityinfocus.com\/?p=5181","title":{"rendered":"Sora 2 Can Generate SpongeBob, Pikachu, and Other Copyrighted Characters"},"content":{"rendered":"<p>Sora 2 is being used to generate videos featuring SpongeBob SquarePants, Pikachu, and other copyrighted characters. This comes after OpenAI reportedly allowed studios and talent agencies to opt out of having their work recreated by its latest artificial intelligence generator.<\/p>\n<p>Since the new model was <a href=\"https:\/\/www.eweek.com\/openai\/news-openai-sora-2-video\/\">released on Tuesday<\/a>, Sora 2 users have been enjoying experimenting with their favourite animated characters and testing the system\u2019s boundaries. 
While OpenAI is rolling out access gradually, it is already possible to find videos of <a href=\"https:\/\/x.com\/jake_joseph\/status\/1973383226166026597?s=46&amp;t=6XFwPPAGRifEw1jHjmAfqQ\" target=\"_blank\" rel=\"noopener\">Patrick Star rapping<\/a>, <a href=\"https:\/\/x.com\/skirano\/status\/1973184329619743217?s=46&amp;t=6XFwPPAGRifEw1jHjmAfqQ\" target=\"_blank\" rel=\"noopener\">Nintendo\u2019s Mario escaping from his game<\/a>, and a <a href=\"https:\/\/x.com\/libankano\/status\/1973600978470629778?s=46&amp;t=6XFwPPAGRifEw1jHjmAfqQ\" target=\"_blank\" rel=\"noopener\">boxing match between Pikachu and Blue from Blue\u2019s Clues<\/a> circulating on X.<\/p>\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-1 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<div class=\"wp-block-embed__wrapper\">\n<p>Blues clues vs Pikachu from Sora 2.<br \/>There is no going back \u2026 <a href=\"https:\/\/t.co\/b90Xa6DVet\">pic.twitter.com\/b90Xa6DVet<\/a><\/p>\n<p>\u2014 Liban (lee~ben) Kano (@libankano) <a href=\"https:\/\/twitter.com\/libankano\/status\/1973600978470629778?ref_src=twsrc%5Etfw\">October 2, 2025<\/a>\n<\/p><\/div>\n<p>Source: X\n<\/p><\/div>\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n<p>Sometimes, copyrighted material appears to inspire clips even if it is not explicitly in the prompt. 
One user produced a <a href=\"https:\/\/x.com\/happysmash27\/status\/1973280605027836323?s=46&amp;t=6XFwPPAGRifEw1jHjmAfqQ\" target=\"_blank\" rel=\"noopener\">convincing Rocket Raccoon imitation<\/a> when they asked for a \u201cphotorealistic raccoon.\u201d<\/p>\n<p>Another generated an <a href=\"https:\/\/x.com\/javilopen\/status\/1973298791521677622\" target=\"_blank\" rel=\"noopener\">anime version of The NeverEnding Story<\/a> after requesting a \u201ccute young woman riding a dragon in a flower world, Studio Ghibli style.\u201d It\u2019s interesting that Studio Ghibli-fication is still allowed, given the backlash earlier this year when users of the image generator built into GPT-4o <a href=\"https:\/\/www.eweek.com\/news\/openai-studio-ghibli-ai-art-copyright\/\">started creating cartoons in the iconic style<\/a>.<\/p>\n<h2 class=\"wp-block-heading\">Opt-out requests and Disney\u2019s block<\/h2>\n<p>Just before the model and its accompanying Sora app were launched, reports indicated that OpenAI was contacting copyright holders to give them the <a href=\"https:\/\/www.eweek.com\/news\/openai-sora-2-copyright-opt-out\/\">option to exclude their IP from Sora 2<\/a>. But OpenAI\u2019s process for honouring any requests likely involved blocking specific outputs through modifications to the system prompt, rather than ensuring the material was excluded from the initial training data.<\/p>\n<p>Why can we assume this? 
OpenAI only started asking studios if they wanted to opt out in the last week, according to <a href=\"https:\/\/www.wsj.com\/tech\/ai\/openais-new-sora-video-generator-to-require-copyright-holders-to-opt-out-071d8b2a\" target=\"_blank\" rel=\"noopener\">The Wall Street Journal<\/a>, making it unlikely that Sora 2 was retrained in the short period leading up to its release.<\/p>\n<p>Furthermore, <a href=\"https:\/\/www.washingtonpost.com\/technology\/interactive\/2025\/openai-training-data-sora\/\" target=\"_blank\" rel=\"noopener\">The Washington Post<\/a> found that the previous iteration of Sora, released long before the opt-out requests, was trained on swathes of copyrighted material. The likes of Netflix and Twitch confirmed that they had not provided their source material for training, and experts said its training data was <a href=\"https:\/\/arxiv.org\/pdf\/2412.17847\" target=\"_blank\" rel=\"noopener\">most likely sourced from YouTube<\/a> and other publicly available online sources.<\/p>\n<p>However, OpenAI appears to be honouring the opt-out requests it received. Disney was among those opting out, according to <a href=\"https:\/\/www.reuters.com\/business\/media-telecom\/openai-launches-new-ai-video-app-spun-copyrighted-content-2025-09-30\/\" target=\"_blank\" rel=\"noopener\">Reuters<\/a>, and users have reported being unable to generate Sora 2 videos featuring Spider-Man and Darth Vader.<\/p>\n<p>When asked about how Sora 2 is generating clips containing copyrighted characters, OpenAI told <a href=\"https:\/\/gizmodo.com\/the-first-24-hours-of-sora-2-chaos-copyright-violations-sam-altman-shoplifting-and-more-2000666216\" target=\"_blank\" rel=\"noopener\">Gizmodo<\/a> that it sees them as new opportunities for creators to deepen their connection with fans. 
It also claimed it was \u201cworking with rightsholders to understand their preferences for how their content appears across our ecosystem, including Sora.\u201d<\/p>\n<h2 class=\"wp-block-heading\">OpenAI is stricter on human likenesses than on copyright, but is it strict enough?<\/h2>\n<p>While many copyrighted characters remain accessible in Sora 2, the company states outright that the model will not generate images of recognizable public figures without their consent.<\/p>\n<p>The challenge lies in enforcing this while also maintaining its \u201ccameo\u201d feature, which allows users to insert a realistic avatar of themselves, or any human, animal, or object, into a generated clip.<\/p>\n<p>Through the new Sora app, which is primarily designed for social networking, friends can remix and edit each other\u2019s cameos. Users control who has access to their cameo and can revoke permissions at any time, as well as review and remove any videos featuring it.<\/p>\n<p>While OpenAI has emphasized safeguards against <a href=\"https:\/\/www.eweek.com\/news\/scarlett-johansson-ai-deepfake-video\">nonconsensual use of a person\u2019s likeness<\/a>, it is too early to determine whether these safeguards will actually prevent bullying, harassment, disinformation, and other abuses as the model becomes more powerful and avatars increasingly realistic.<\/p>\n<h2 class=\"wp-block-heading\">Altman memes highlight the limits of safeguards<\/h2>\n<p>Sam Altman, OpenAI\u2019s CEO, is ironically already demonstrating the limits of these protections. 
He allowed his likeness to be used in Sora 2, and users have created parody videos of him <a href=\"https:\/\/x.com\/GabrielPeterss4\/status\/1973120058907041902\" target=\"_blank\" rel=\"noopener\">getting caught shoplifting GPUs on CCTV<\/a> and <a href=\"https:\/\/x.com\/PJaccetturo\/status\/1973427223580606793\" target=\"_blank\" rel=\"noopener\">stealing art from the Studio Ghibli HQ<\/a>.<\/p>\n<p>And even if an original video is deleted, copies are easy to make and can persist. Even though AI-generated videos carry a Sora 2 watermark and an embedded <a href=\"https:\/\/openai.com\/index\/launching-sora-responsibly\/\" target=\"_blank\" rel=\"noopener\">AI disclaimer in the metadata<\/a>, it remains to be seen whether people will actually check these markers before sharing <a href=\"https:\/\/www.eweek.com\/news\/deepfake-images-female-soldiers-oan-network\/\">politically misleading<\/a> or <a href=\"https:\/\/www.eweek.com\/news\/grok-imagine-ai-deepfake-videos-celebrities\/\">even explicit<\/a> deepfakes.<\/p>\n<p><strong>Read eWeek\u2019s coverage of actress <\/strong><a href=\"https:\/\/www.eweek.com\/news\/scarlett-johansson-ai-deepfake-video\/\"><strong>Scarlett Johansson calling out the use of her likeness in a deepfake video<\/strong><\/a><strong>. And how the judges of the 2026 Academy Awards will not take into account whether <\/strong><a href=\"https:\/\/www.eweek.com\/news\/oscars-ai-guidelines-films\/\"><strong>generative AI tools were used in the making of a movie<\/strong><\/a><strong>.<\/strong><\/p>\n<p>The post <a href=\"https:\/\/www.eweek.com\/news\/sora-generating-copyrighted-characters\/\">Sora 2 Can Generate SpongeBob, Pikachu, and Other Copyrighted Characters<\/a> appeared first on <a href=\"https:\/\/www.eweek.com\/\">eWEEK<\/a>.<\/p>","protected":false},"excerpt":{"rendered":"<p>Sora 2 is being used to generate videos featuring SpongeBob SquarePants, Pikachu, and other copyrighted characters. 
This comes after OpenAI reportedly allowed studios and talent agencies to opt out of having their work recreated by its latest artificial intelligence generator. Since the new model was released on Tuesday, Sora 2 users have been enjoying experimenting [&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[],"class_list":["post-5181","post","type-post","status-publish","format-standard","hentry","category-news"],"_links":{"self":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/posts\/5181"}],"collection":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=5181"}],"version-history":[{"count":0,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=\/wp\/v2\/posts\/5181\/revisions"}],"wp:attachment":[{"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=5181"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=5181"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cybersecurityinfocus.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=5181"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}