The US House of Representatives' administrative chief, Catherine Szpindor, has reportedly set several restrictions this week on using ChatGPT in congressional offices, likely due to data security concerns.

This is based on an internal memo in which Szpindor limited the use of any AI large language models with weak privacy protections. As for ChatGPT, staff can only use its paid Plus service, with privacy settings manually enabled to keep sensitive information from being fed into the model or leaking.

New Rules For ChatGPT Usage at Congress

OpenAI’s ChatGPT is a resounding success that prompted big tech companies to create their own models or partner with OpenAI to leverage its GPT large language model. While the AI chatbot is easing our everyday work, it’s also drawing heavy criticism for lacking guardrails in some cases – especially for offering questionable life advice and, at times, racially charged remarks.

This has led companies and governments to set strict rules for staff use of the chatbot while also trying to regulate the new AI tools for society’s good. In this pursuit, the US House of Representatives' administrative chief, Catherine Szpindor, has reportedly set new rules for ChatGPT use by House staff.

This comes from Axios, which cites a memo outlining several restrictions on congressional staff using ChatGPT and similar AI language models in their offices. Staff can only use the paid version of ChatGPT (the Plus service), citing its tighter privacy controls. Even then, usage must be limited to “research and evaluation” and cannot be part of everyday work.

Further, House offices can only use the chatbot with publicly accessible data, and must manually enable the privacy settings to prevent their interactions from feeding data back into the AI model, Szpindor adds. This isn’t surprising, as many experts and researchers have warned against using AI chatbots for sensitive work, since they can sometimes produce misleading or hateful responses.

Thus, the new rules may not draw much opposition, as Congress has been pushing to regulate AI for the public good lately.