With no federal AI regulations currently in place, companies often look to guidelines from the National Institute of Standards and Technology for a common language and standards. (Photo by R. Wilson/NIST)
As Congress debates what federal regulation of artificial intelligence could look like, experts and lawmakers have repeatedly pointed over the past two years to existing guidelines created by the National Institute of Standards and Technology (NIST) as a model.
“I think [NIST] has guidelines on AI, and they are actually pretty good,” said Bhavin Shah, founder and CEO of the AI company Moveworks, at a June 5 hearing of the House Oversight Committee.
“They provide many recommendations that we actually follow,” he added.
Shah was referring to NIST’s AI Risk Management Framework, a set of voluntary guidelines published in 2023 and developed through a collaboration between private- and public-sector experts.
The framework aims to give developers and deployers of AI systems steps they can follow to improve the safety of their AI models and build trust with their users. By design, the framework is flexible and general enough that companies and organizations of different sizes can implement its suggestions.
In 2024, NIST released a similar framework specific to generative AI, the newer class of AI models capable of generating novel text, images and other content.
Why is NIST’s AI framework so influential?
With no federal AI regulations currently in place, companies often look to NIST for a common language and standards, said Patrice Williams-Lindo, an Atlanta-based business consultant and CEO of the career consulting company Career Nomad.
“This is where trust meets technical rigor,” Williams-Lindo said. “It is not regulatory, but it creates this common language.”
The NIST AI framework evaluates the full life cycle of an AI product, from conception to post-deployment monitoring, using a strategy of “map, measure and manage,” said Anthony Habayeb, co-founder and CEO of the AI governance platform Monitaur. Users who follow the framework map and identify risks in their AI model, measure the model’s performance and risk levels, then manage the identified risks and respond to them.
“What NIST then does is give you tactical guidance on how to build toward transparency, how to build for better fairness, without prescribing the details of how you do it,” Habayeb said.
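The map-measure-manage cycle Habayeb describes can be pictured as a simple risk register that an organization maintains for each model. The sketch below is purely illustrative: the class names, fields and severity threshold are assumptions for this example, not anything NIST prescribes.

```python
# Illustrative sketch of a map -> measure -> manage risk register.
# All names and the 0.5 severity threshold are hypothetical choices.
from dataclasses import dataclass, field

@dataclass
class Risk:
    name: str              # Map: the risk identified in context
    severity: float = 0.0  # Measure: quantified risk level, 0 to 1
    mitigation: str = ""   # Manage: the response chosen for the risk

@dataclass
class ModelRiskRegister:
    model_name: str
    risks: list = field(default_factory=list)

    def map_risk(self, name: str) -> None:
        """Map: record a risk observed in the model's deployment context."""
        self.risks.append(Risk(name))

    def measure(self, name: str, severity: float) -> None:
        """Measure: attach a quantified severity to a mapped risk."""
        next(r for r in self.risks if r.name == name).severity = severity

    def manage(self, name: str, mitigation: str) -> None:
        """Manage: record the mitigation chosen for a measured risk."""
        next(r for r in self.risks if r.name == name).mitigation = mitigation

    def unmanaged(self) -> list:
        """Risks measured above the threshold with no mitigation yet."""
        return [r.name for r in self.risks
                if r.severity > 0.5 and not r.mitigation]

register = ModelRiskRegister("support-chatbot")
register.map_risk("hallucinated answers")
register.map_risk("training-data leakage")
register.measure("hallucinated answers", 0.8)
register.measure("training-data leakage", 0.6)
register.manage("hallucinated answers", "human review of low-confidence replies")
print(register.unmanaged())  # -> ['training-data leakage']
```

The point of the structure is the one the framework makes in prose: every mapped risk should eventually acquire a measurement and a management response, and anything left in `unmanaged()` is unfinished governance work.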
NIST’s research and findings often lay the groundwork for national laws and guidelines. After the September 11 attacks, for example, NIST was one of the agencies that investigated the technical causes of the buildings’ collapse, and its subsequent reports changed aspects of communications standards and building codes.
In the early 2010s, NIST served as a technical adviser in the development of the FedRAMP program, which shapes how government agencies procure software today. In 2014, NIST released its Cybersecurity Framework, a collection of input from hundreds of workshops and participants that was widely adopted by private companies, Shah said in his June 5 testimony.
“NIST is often the soft law before the hard law arrives,” Williams-Lindo said.
NIST sits within the U.S. Department of Commerce, a position that Ylli Bajraktari, president and CEO of the tech-focused think tank Special Competitive Studies Project, said at the June 5 hearing could be an advantage in developing best-practice AI guidelines. Whether or not Congress enacts a federal policy, Bajraktari said, NIST’s AI framework has had a positive impact on the private and public sectors.
“I think Nist is well positioned for this,” he said.
Could NIST standards become law?
Lawmakers from both sides of the aisle, along with experts from a range of backgrounds, have praised NIST’s clear, impartial AI framework during a handful of congressional hearings held since Republicans proposed a 10-year moratorium on state-level AI laws in the “big, beautiful bill” moving through Congress.
“Congress likes NIST because NIST does not issue regulations,” said Thomas Leithauser, a Boise, Idaho-based legal analyst at the software and information services company Wolters Kluwer. “NIST offers recommendations and guidance.”
Although many technology companies follow the guidelines described in the AI framework, doing so is entirely voluntary. They are standards, not regulations, Habayeb said.
However, some lawmakers pushing for AI regulation at the federal level, and legislating in their own states, argue that a framework like NIST’s is not sufficient. Many AI laws are framed as consumer protection measures, including recently passed state laws covering facial recognition, banking, hiring and healthcare.
True regulations are necessary to protect people from potential harms caused by AI algorithms, said Rep. Kathy Castor, a Democrat from Florida, at a May 21 hearing of the House subcommittee on Commerce, Manufacturing and Trade.
“What the hell is Congress doing?” Castor said of Republican efforts to block state-level regulation. “What are you doing to take the bull by the horns while states have acted to protect us?”
NIST’s AI framework enjoys bipartisan support, but could it be turned into federal law? Yes and no, Williams-Lindo said.
It is a robust starting point and a guidepost for many technology companies, she said. But it does not serve as the regulatory authority that may be needed to check rapidly advancing AI against the harms it could cause.
“NIST is operational, so we will still need a real plan that measures … algorithmic harm or ensures that historically excluded communities are not collateral damage,” she said. “There is a gap in federal leadership here.”
Still, some industry players say companies should not need the threat of regulation to follow NIST standards. Safety concerns aside, AI builders should see the competitive advantage in following safety guidelines, Habayeb said.
“If you skip a step of certain testing or data validation, you have a less-than-ideal system,” Habayeb said. “And you should care about that for business reasons, not regulation.”

