LLMs hallucinating non-existent developer packages could fuel supply chain attacks

Large Language Models (LLMs) have a serious “package hallucination” problem that could lead to a wave of maliciously coded packages entering the software supply chain, researchers have discovered in one of the largest and most in-depth studies yet to investigate the problem.

The problem is substantial: across 30 different tests, the researchers generated 2.23 million code samples in two of the most popular programming languages, Python and JavaScript, using 16 LLMs for Python and 14 for JavaScript. Of those samples, 440,445 (19.7%) contained references to hallucinated packages, i.e., packages that do not actually exist.
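The danger is that an attacker can register a frequently hallucinated name on a public registry and wait for developers to install it. One hedged mitigation, not described in the study itself, is to screen LLM-suggested dependencies against a vetted allowlist before they ever reach `pip install`. The package names and allowlist below are purely illustrative assumptions:

```python
# Hypothetical sketch: screen LLM-suggested dependencies against a
# vetted allowlist before installing, to catch hallucinated names.
# KNOWN_GOOD and the example package names are illustrative assumptions.

KNOWN_GOOD = {"requests", "numpy", "flask"}  # packages your team has vetted

def screen_suggestions(suggested):
    """Split LLM-suggested package names into approved and flagged lists."""
    approved = [pkg for pkg in suggested if pkg in KNOWN_GOOD]
    flagged = [pkg for pkg in suggested if pkg not in KNOWN_GOOD]
    return approved, flagged

# "totally-real-utils" stands in for a hallucinated package name
approved, flagged = screen_suggestions(["requests", "totally-real-utils"])
print(approved)  # ['requests']
print(flagged)   # ['totally-real-utils']
```

In practice, anything flagged would go to a human reviewer or be checked against the registry's publication date and download history before being trusted.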

Continue reading on InfoWorld.
