The document outlines seven main clusters that should be present in an SBOM for AI: metadata, models, key performance indicators (KPI), infrastructure, security properties (SP), system level properties (SLP), and dataset properties (DP).

The metadata cluster should include elements about the SBOM itself, including its author, version, data format, author signature, tool name and version, generation context, timestamp, and dependency relationships.

The SLP cluster should contain information about the AI system, including name, producer, version, components, timestamp, data flow and usage, input/output properties, and intended application area.

The guidance recommends creating a models cluster that contains information about the models used by the AI, including name, identifier, version, producer, description, timestamp, hash value and algorithm, properties, license, and external references.

The DP cluster should include information about the datasets used by the model, while the infrastructure cluster covers the software and hardware used to operate and support the AI system.

The SP cluster should cover security controls, security compliance, cybersecurity policy information, and vulnerability referencing, while the KPI cluster should contain information on security metrics and operational performance.

[Read: Are SBOMs Failing? Security Teams Struggling With SBOM Data]

The authoring agencies noted, “These minimum elements are not mandatory; do not create requirements, standards, or legislation; and are open to further refinements to keep pace with technological development and evolution of legal or policy frameworks within G7 members.”

Nigel Douglas, head of developer relations at artifact management and software supply chain security firm Cloudsmith, commented on the new AI SBOM guidance, noting that “The G7 framework raises the right requirements, but organizations trying to implement them will find that documentation applied retrospectively doesn’t reconstruct origin.”

“Continuous, automated SBOM generation is already a baseline requirement for organizations serious about software supply chain security, and the G7’s new AI SBOM framework extends that logic into territory where the tooling and governance haven’t caught up yet. To its credit, the guidance is being candid about its own limits, acknowledging that most of its seven data clusters are difficult to measure consistently across organizations,” Douglas said.

“Where it runs into difficulty is structural, rooted in how systems are being built. GenAI tools have made it routine for developers to create applications or pull in software dependencies outside any formal review pipeline.

“Meanwhile, traditional SBOMs were designed for supply chains with relatively traceable edges, but AI-assisted development is producing code, workflows, and dependencies that may enter production without ever passing through established inventory or assurance processes – a dynamic that attacks like s1ngularity have already begun to exploit directly,” he added.

Related: SBOM Pioneer Allan Friedman Joins NetRise to Advance Supply Chain Visibility
Related: Global Cyber Agencies Issue AI Security Guidance for Critical Infrastructure OT
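To make the seven-cluster layout concrete, the sketch below models a minimal AI SBOM as a Python dictionary and checks it for the clusters the guidance names. The cluster and field names mirror the article's description, but the key spellings, nesting, and the `missing_clusters` helper are illustrative assumptions — the G7 document does not prescribe a serialization format.

```python
# Hypothetical sketch only: cluster names follow the G7 guidance as
# summarized above, but the exact schema and key names are assumptions.
REQUIRED_CLUSTERS = {
    "metadata",
    "models",
    "kpi",
    "infrastructure",
    "security_properties",
    "system_level_properties",
    "dataset_properties",
}

def missing_clusters(sbom: dict) -> set:
    """Return the guidance clusters absent from an AI SBOM document."""
    return REQUIRED_CLUSTERS - set(sbom)

example_sbom = {
    # Metadata cluster: elements about the SBOM itself.
    "metadata": {
        "author": "example-org",
        "version": "1.0",
        "data_format": "json",
        "tool": {"name": "example-sbom-gen", "version": "0.1"},
        "timestamp": "2025-01-01T00:00:00Z",
    },
    # Models cluster: one entry per model used by the AI system.
    "models": [
        {
            "name": "example-model",
            "identifier": "urn:example:model:1",
            "version": "1.0",
            "producer": "example-org",
            "hash": {"algorithm": "sha256", "value": "<digest>"},
            "license": "Apache-2.0",
        }
    ],
    # Remaining clusters left empty in this minimal example.
    "kpi": {},
    "infrastructure": {},
    "security_properties": {},
    "system_level_properties": {},
    "dataset_properties": {},
}

print(sorted(missing_clusters(example_sbom)))  # prints []
```

A real implementation would likely express this as an extension of an existing SBOM format such as SPDX or CycloneDX rather than an ad hoc dictionary; the point here is only the cluster-level completeness check.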

Source: SecurityWeek