Sunday, March 1, 2026

IEEE standard offers 6 steps for AI system procurement


For more than three years, an IEEE Standards Association working group has been refining a draft standard for procuring artificial intelligence and automated decision systems, IEEE 3119-2025. It is intended to help procurement teams identify and manage risks in high-risk domains. Such systems are used by government entities involved in education, health, employment, and many other public-sector areas. Last year the working group partnered with a European Union agency to evaluate the draft standard's components and to gather information about users' needs and their views on the standard's value.

At the time, the standard included five processes to help users develop their solicitations and to identify, mitigate, and monitor harms commonly associated with high-risk AI systems.

Those processes were problem definition, vendor evaluation, solution evaluation, contract negotiation, and contract monitoring.

The EU agency's feedback led the working group to reconsider the processes and the sequence of several activities. The final draft now includes an additional process: solicitation preparation, which comes right after the problem-definition process. The working group believes the added process addresses the challenges organizations experience in preparing AI-specific solicitations, such as the need to add clear and robust data requirements and to incorporate questions regarding the maturity of vendor AI governance.

The EU agency also emphasized that it is essential to include solicitation preparation, which gives procurement teams additional opportunities to adapt their solicitations with technical requirements and questions regarding responsible AI system choices. Leaving room for adjustments is especially relevant when acquisitions of AI are occurring within emerging and changing regulatory environments.

— Gisele Waters

IEEE 3119's place in the standards ecosystem

Currently there are several internationally accepted standards for AI management, AI ethics, and general software acquisition. Those from the IEEE Standards Association and the International Organization for Standardization target AI design, use, and life-cycle management.

Until now, there has been no internationally accepted, consensus-based standard that focuses on the procurement of AI tools and offers operational guidance for responsibly purchasing high-risk AI systems that serve the public interest.

The IEEE 3119 standard addresses that gap. Unlike the AI management standard ISO 42001 and other certifications related to generic AI oversight and risk governance, IEEE's new standard offers a risk-based, operational approach to help government agencies adapt traditional procurement practices.

Governments have an important role to play in the responsible deployment of AI. However, market dynamics and unequal AI expertise between industry and government can be barriers that discourage success.

One of the standard's core goals is to better inform procurement leaders about what they are buying before they make high-risk AI purchases. IEEE 3119 defines high-risk AI systems as those that make, or are a substantial factor in making, consequential decisions that could have significant impacts on people, groups, or society. The definition is similar to the one used in Colorado's 2024 AI Act, the first U.S. state-level law comprehensively addressing high-risk systems.

The standard's processes, however, do complement ISO 42001 in several ways. The relationship between the two is illustrated below.

International standards, often characterized as soft law, are used to shape AI development and encourage international cooperation regarding its governance.

Hard laws for AI, or legally binding rules and obligations, are a work in progress around the world. In the United States, a patchwork of state legislation governs different aspects of AI, and the approach to national AI regulation is fragmented, with different federal agencies implementing their own guidelines.

Europe has led by passing the European Union's AI Act, which began governing AI systems based on their risk levels when it went into effect last year.

But the world lacks regulatory hard laws with a global scope.

The IEEE 3119-2025 standard does align with existing hard laws. Because of its focus on procurement, the standard supports the high-risk provisions outlined in Chapter III of the EU AI Act and in Colorado's AI Act. The standard also conforms to the proposed Texas HB 1709 legislation, which would mandate reporting on the use of AI systems by certain business entities and state agencies.

Because most AI systems used in the public sector are procured rather than built in-house, IEEE 3119 applies to commercial AI products and services that don't require substantial modifications or customizations.

The standard's target audience

The standard is intended for:

  • Mid-level procurement professionals and interdisciplinary team members with a moderate level of AI governance and AI system knowledge.
  • Public- and private-sector procurement professionals who serve as coordinators or buyers, or hold equivalent roles, within their entities.
  • Non-procurement managers and supervisors who are either responsible for procurement or oversee staff who provide procurement functions.
  • Professionals employed by governing entities involved with public education, utilities, transportation, and other publicly funded services who either work on or manage procurement and are interested in adapting purchasing processes for AI tools.
  • AI vendors seeking to understand new transparency and disclosure requirements for their high-risk commercial products and solutions.

Training program in the works

The IEEE Standards Association has partnered with the AI Procurement Lab to offer the IEEE Responsible AI Procurement Training program. The course covers how to apply the standard's core processes and adapt existing practices for the procurement of high-risk AI.

The standard includes more than 26 tools and rubrics across the six processes, and the training program explains how to use many of those tools. For example, the training includes instructions on how to conduct a risk-appetite assessment, apply the vendor-evaluation scoring guide to analyze AI vendor claims, and create an AI procurement "risk register" tied to identified use-case risks and their potential mitigations. The training session is now available for purchase.
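A risk register of the kind the training describes is, at its simplest, a table of use-case risks with a severity score and a planned mitigation. The sketch below is purely illustrative; the field names, the 1-5 likelihood/impact scales, and the scoring threshold are assumptions for this example, not the standard's actual rubric:

```python
from dataclasses import dataclass


@dataclass
class RiskEntry:
    """One row of a hypothetical AI procurement risk register."""
    use_case: str
    risk: str
    likelihood: int   # assumed scale: 1 (rare) .. 5 (almost certain)
    impact: int       # assumed scale: 1 (negligible) .. 5 (severe)
    mitigation: str

    @property
    def score(self) -> int:
        """Simple likelihood-times-impact severity score."""
        return self.likelihood * self.impact


def high_risk_entries(register: list[RiskEntry], threshold: int = 12) -> list[RiskEntry]:
    """Return entries at or above the threshold, most severe first."""
    flagged = [entry for entry in register if entry.score >= threshold]
    return sorted(flagged, key=lambda entry: entry.score, reverse=True)
```

Keeping each risk tied to a concrete use case and a named mitigation, as the training suggests, is what turns the register from a checklist into something a contract-monitoring team can act on.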

It's still early days for AI integration. Decision makers don't yet have much experience purchasing AI for high-risk domains and mitigating those risks. The IEEE 3119-2025 standard aims to help agencies build and strengthen their AI risk-mitigation muscles.
