> Strict limits on governmental regulation wherein any restrictions must be demonstrably necessary and narrowly tailored to a compelling public safety or health interest.
> Mandatory safety protocols for AI-controlled critical infrastructure, including a shutdown mechanism and compulsory annual risk management reviews.
Read: industry can do whatever it wants, but the government also has to put up barriers to entry that favor large incumbents.
This has nothing to do with rights or even computing, it's just regulatory capture.
ToucanLoucan 1 hour ago [-]
You know if we're gonna pass laws to make it illegal for the government to interfere with the Torment Nexus, the least they could do is not gaslight us with the fucking name of the law. Just tell us the billionaires get to fuck the planet in the eye and the rest of us have to deal with it, at least it's honest that way.
bigfishrunning 35 minutes ago [-]
They can't be that blatant, that's how you lose your next term
Mistletoe 48 minutes ago [-]
So it should be renamed the Right to Datacenter Act. And here I thought they were giving people power over their private computers and over being surveilled on them…
hermannj314 1 hour ago [-]
When a "right to..." law is passed, there is usually an accompanying narrative that explains a past injustice that will be corrected. Matthew Shepard hate crime, Civil Rights Voting act, etc.
The absence of such a story makes me think this law doesn't protect shit. What exactly did a Montanan get killed or arrested trying to do with a computer that is now protected? Can I use AI during a traffic stop, or use AI to surveil and doxx government employees? What exactly is the government giving up by granting me this right?
Or is this just about suppressing opposition to data centers?
culi 59 minutes ago [-]
Yeah I think it's pretty obviously the AI industry trying to ban its own regulation
> Nationally, the Right to Compute movement is gaining traction. Spearheaded by the grassroots group RightToCompute.ai, the campaign argues that computation — like speech and property — is a fundamental human right. “A computer is an extension of the human capacity to think,” the organization states.
staplers 29 minutes ago [-]
> computation — like speech and property — is a fundamental human right
Computation, however, requires a vast supply chain where certain middlemen have a near monopoly on distribution of said "fundamental right". The incentives for lobbyists seem clear.
I don't necessarily disagree with the idea, but until profit is shared with taxpayers, this is a one-way transaction of taxpayers bankrolling AI companies.
dismalaf 47 minutes ago [-]
Regulation is just regulatory capture by incumbents and also a national security risk.
hdgvhicv 29 minutes ago [-]
You argue that food safety regulations are just regulatory capture?
akersten 58 minutes ago [-]
Eh, if states can pass restrictive laws on AI in the absence of a correspondingly negative motivating event, I don't see any contradiction in doing the opposite.
gosub100 48 minutes ago [-]
so the jobs have to be lost _first_, then we can ban it?
moate 51 minutes ago [-]
>>absence of a correspondingly negative motivating event.
You don't think there are reasons to pass laws banning AI... datacenters?
Because what state is banning the concept of AI? They're banning/restricting the creation of a type of infrastructure within their borders because they feel it is detrimental to their citizens. Maybe it's NIMBY/Luddite BS to you, but people not wanting their resources to go toward ensuring some dork can have a chat-bot girlfriend seems normal to me.
hparadiz 43 minutes ago [-]
I'm already running an LLM locally. This is just me renting space in a data center. Since when did we restrict people's ability to do things? For the record my local models run off the solar bolted to my roof. Even including the data center I'm using 1/10th of the energy we were using on tube monitors back in the 90s. This is exhausting. My GPU would be demonstrably using more power by playing a videogame right now than when I run a local LLM.
jrmg 38 minutes ago [-]
> Since when did we restrict people's ability to do things?
This question is not the obvious winner you think it is. To me, and I am sure many, it sort of undermines your argument.
Even in the most ‘free' cultures, society has _always_ restricted people’s individual ability to do things that it collectively deems harmful to the whole society.
hparadiz 29 minutes ago [-]
This is literally why America was founded. Too many people stifle innovation. Move to Europe if you want to be stuck in the 20th century, frankly. That doesn't mean we can't take care of folks. But the Luddites need to get the fuck out of the way. You're all exhausting.
cheeeeeeeese 10 minutes ago [-]
[dead]
Arainach 37 minutes ago [-]
>Since when did we restrict people's ability to do things?
When those things impact other people - such as by skyrocketing utility prices, overloading the electrical grid, and more.
hparadiz 27 minutes ago [-]
I thought this was a free market? Or is that not how things work anymore?
moate 27 minutes ago [-]
>>when did we restrict people's abilities to do things?
That's literally what most laws are: saying what you can and can't do. This is, like, a foundational understanding of what government/regulation is.
>>this is just me renting space...
Okay, so a "network effect" is when things have greater impact due to larger usage. So the data center usage that you're talking about does not represent the overall impact of the data center. Saying "I only pour ONE cup of bleach into the ocean, so I don't see why it's so bad to have the bleach factory pump all its waste in as well" is a WILD take.
akersten 46 minutes ago [-]
I didn't say any of that in my comment, nor express an opinion about this whole thing writ large. I'm only pointing out that it's not weird for a legislature to preempt a real-world use case by way of pointing out similar laws.
moate 38 minutes ago [-]
I'm going to do this again:
>>>>absence of a correspondingly negative motivating event.
What did you mean? Why do you believe there has not been a motivating event to ban data centers when those bans have happened, which is literally what you said?
akersten 26 minutes ago [-]
In the context of the discussion, a correspondingly negative event would have been along the lines of "we built a data center and then it exploded, we need to make sure that doesn't happen." Not "we're worried about the effects the data center might have," which is comparable to "we're worried about the effects banning AI might have." All I'm saying is neither of those last two is a weird reason to enact a law.
GP was insisting that "rights" named laws always come after some negative event and it is weird that we have this "rights" named law without someone being deprived of their computation or whatever. I'm disagreeing with the premise that that's weird by pointing out laws preempt real world events all the time, in either direction (restrictive or permissive).
baggy_trough 38 minutes ago [-]
> Maybe it's NIMBY/Luditte BS to you, but people not wanting their resources to go help ensure some dork can have a chat-bot girlfriend seems normal to me.
Why would it be your business, or anyone else's, to stop someone from doing this?
Arainach 25 minutes ago [-]
Because these data centers are at best overstressing utility grids and elevating prices for everyone, and at worst running dirty generators and poisoning entire communities, for a start.
lukeschlather 1 hour ago [-]
I was really hoping this gave people the right to use their computers, but it really looks like it simply prevents "the government" from regulating the right to "make use of computational resources." So Google or Apple can still prevent me from using my phone for lawful purposes, the government just can't regulate it (and the government might not be able to write restrictions that prevent manufacturers from violating my right to compute.)
einpoklum 10 minutes ago [-]
Imagine if Montana required all compute platforms sold in the state to be free of user restrictions: that they be amenable to modification; that all source code, firmware, and hardware specs be open; and, when that is not the case, that the company be compelled to release the relevant information on pain of having assets seized, being required to refund payments, etc. That would have been a hoot :-)
dynm 1 hour ago [-]
I think the main content of this law (https://legiscan.com/MT/text/SB212/id/3212152) is just two paragraphs. I'd suggest reading them yourself rather than relying on secondary description:
"Government actions that restrict the ability to privately own or make use of computational resources for lawful purposes, which infringes on citizens' fundamental rights to property and free expression, must be limited to those demonstrably necessary and narrowly tailored to fulfill a compelling government interest."
"When critical infrastructure facilities are controlled in whole or in part by a critical artificial intelligence system, the deployer shall develop a risk management policy after deploying the system that is reasonable and considers guidance and standards in the latest version of the artificial intelligence risk management framework from the national institute of standards and technology, the ISO/IEC 42001 artificial intelligence standard from the international organization for standardization, or another nationally or internationally recognized risk management framework for artificial intelligence systems. A plan prepared under federal requirements constitutes compliance with this section."
In particular, I think the reporting is straight wrong that there's a shutdown requirement. That was in an earlier version (https://legiscan.com/MT/text/SB212/id/3078731) and remains in the title of this version, but seems to have been removed from the actual text.
scuff3d 1 minute ago [-]
When you contextualize the law with comments like this
"The initiative... contrasts with recent restrictive legislation efforts in states like California and Virginia. Zolnikov, a noted advocate for privacy, has been instrumental in pushing for tech-friendly policies that ensure individual liberties in an evolving digital landscape.
"'As governments around the world and in our own country try to crack down on individual freedom and gain state control over modern technologies,' Zolnikov said. 'Montana is doing the opposite by protecting freedom and restraining the government.'"
And it's the normal framing we always see with this crap. This is more an attempt to protect corporations from regulation than it is to protect individuals.
RobRivera 60 minutes ago [-]
So the government is afforded the opportunity to restrict compute if it serves a government interest.
This bill seems to expand powers, not restrict
dynm 26 minutes ago [-]
Before the law, I think the state government or local governments could (by passing a law) restrict computing for any reason, even without a government interest. Now, they'd have to repeal this first.
RobRivera 20 minutes ago [-]
How?
I know the whole 90s meme of 'I am a controlled munition' went around because cryptography was classified as a munition subject to export control laws, and therefore code that performed those kinds of computations was forbidden from being sold abroad, under penalty of felony.
What happens today? Government gets rights to source code, logs, and rubber stamps/rejects your code from executing in the cloud?
Government limits your access to commodity infrastructure?
toomanystraws 12 minutes ago [-]
"... the deployer shall develop a risk management policy after deploying the system...."
This is a complete sham. Anything really geared towards protecting people would have protections in place before deployment.
hnsdev 1 hour ago [-]
With laws such as the Brazilian one or the one proposed in New York, I am curious to know what will be the future for computing.
On one hand, forbidding and limiting people from using computers as they wish is somewhat impossible, as too many computers without restrictions have already been produced. You can always use old hardware and, with open source projects, fork an old version that respects your right to compute. At some point, though, it will become a problem as hardware stops working and software becomes incompatible with everything. The thing is, those who will be doing this are mostly people who already grew accustomed to not living in an Orwellian state, while, on the other hand, newer generations will all be using new systems with these restrictions as if they were normal. The smart ones will find ways of circumventing it (as if it would be hard to get your parents' CC and verify it as if you were over 18).
Given that, they will be computing in a restrictive and controlled environment. I feel sorry for them.
I am going to college (Computer Science) as an older student with previous experience in programming, and it never ceases to amaze me that the current generation of students doesn't think out of the box and is completely dependent on ChatGPT. We all suffered from conditioning from governments and corporations throughout the years, but it is accelerating at an alarming rate.
Acts like this one (Montana's) are positive, but it's unfortunate that they have to exist at all, and they're somewhat irrelevant when the big dogs (California, New York, and whole countries such as Australia) approve legislation that will promptly be followed by most companies/projects, which will in turn force things to happen that way everywhere else.
heavyset_go 51 minutes ago [-]
This won't touch age verification and surveillance laws; it's not meant to protect people, it's meant to protect the interests of capital.
preinheimer 46 minutes ago [-]
What about a “right to create act” giving people the right to create things and not have their creation be ingested to train ai for billion dollar companies?
ProllyInfamous 28 minutes ago [-]
Some sort of pre-emptive auto-opt-AI't.
It's ridiculous that AIco's arguments are dwindling down to "it's not copyright infringement to ingest others' work and make 'derivatives' [which often are identical to original authors' works]."
----
We desperately need younger politicians, who can not only keep up with information more sharply (i.e. aren't legally decades-retireable), but also are of the age where their own children are being affected by government re-funding flows away from youth/education/future.
At this point I'm willing to concede that our future probably has companies' individual LLM/genAI products competing against one-another, as digital politicians ["the digital pimp, hard at work... we have needs"--Matrix' Mouse]. Nobody knows how either flesh nor silicon congressmen work, inside; but I think the latter could act more human[e]ly...
s_dev 2 hours ago [-]
I really dislike how 'compute' as a noun took over 'computational' as an adjective. I just find that the sentence 'I need more computational resources' flows so much nicer than 'I need more compute'.
DennisP 1 hour ago [-]
"Right to compute" sounds to me more like they're using "compute" as a verb, which predates "computational" by a couple centuries.
moate 48 minutes ago [-]
Someone said "right to computers" and someone else said "that sounds dumb... make it compute!"
hackyhacky 1 hour ago [-]
Interpret the word "compute" in the title as a verb, not a noun. "I have the right to compute" is grammatically analogous to "I have the right to vote" or "I have the right to assemble".
moate 47 minutes ago [-]
Glad Montana is securing the right to do math.
codethief 1 hour ago [-]
The "compute" in "right to compute" could also be a verb, though. :-)
jasonlotito 9 minutes ago [-]
Compute is the...
FTA: right to own, access, and use computational resources
It's a verb.
hackyhacky 1 hour ago [-]
How about "we've got the best nuclear"
soulofmischief 1 hour ago [-]
Well, language evolves, and I personally prefer compute as a noun when talking about resources. It's great though because we can each say it in our preferred way without judging one another.
sockaddr 52 minutes ago [-]
I agree. This is language evolving. If someone from the 16th century could hear a modern well-educated person speak English today they would likely be horrified at how degenerate it would sound to them.
So I don't think current English is in some perfect state that should not change.
TL;DR: Basically the AI industry trying to ban governments from regulating it
152334H 38 minutes ago [-]
> Apr 21, 2025
why is this posted now?
kmeisthax 2 hours ago [-]
This is extremely light on details, but I'm pretty sure "Right to Compute" has absolutely nothing to do with software freedom and everything to do with making it harder to oppose giant datacenter buildouts for AI companies, so they can blast you with infrasound, spike the price of electricity and RAM, and build surveillance systems to take away your rights.
hrimfaxi 1 hour ago [-]
Well they do define compelling government interest to include
> "Compelling government interest" means a government interest of the highest order in protecting the public that cannot be achieved through less restrictive means. This includes but is not limited to: (a) ensuring that a critical infrastructure facility controlled by an artificial intelligence system develops a risk management policy; (b) addressing conduct that deceives or defrauds the public; (c) protecting individuals, especially minors, from harm by a person who distributes deepfakes and other harmful synthetic content with actual knowledge of the nature of that material; and (d) taking actions that prevent or abate common law nuisances created by physical datacenter infrastructure.
(d) seems to address that, potentially.
perfect-blue 1 hour ago [-]
My thoughts exactly. It reads a lot like they are trying to minimize the state's power to regulate AI. I'm not sure that's such a good thing. Regulation is one of the only ways that we can manage the "bads" that come with any new technology. In the US, we've never been very good at regulating new technologies before industry stakeholders entrench themselves in the lobbying circuit.
glaslong 38 minutes ago [-]
Proactively shielding themselves from the eventual, justified realization that spiking a population's price of water and electricity such that they cannot use them IS an externality just as bad as polluting the water supply.
jeffbee 2 hours ago [-]
It amuses me how contradictory the two bullet points from the article are.
- Strict limits on governmental regulation, wherein any restrictions must be demonstrably necessary and narrowly tailored to a compelling public safety or health interest.
- Mandatory safety protocols for AI-controlled critical infrastructure, including a shutdown mechanism and compulsory annual risk management reviews.
How were the necessity and scope of the second rule shown to satisfy the first rule?
In essence, it doesn't really mandate anything; it says you should have a plan, and only for "critical infrastructure facilities":
"Section 4. Infrastructure controlled by critical artificial intelligence system. (1) When critical infrastructure facilities are controlled in whole or in part by a critical artificial intelligence system, the deployer shall develop a risk management policy after deploying the system that is reasonable and considers guidance and standards in the latest version of the artificial intelligence risk management framework from the national institute of standards and technology, the ISO/IEC 42001 artificial intelligence standard from the international organization for standardization, or another nationally or internationally recognized risk management framework for artificial intelligence systems. A plan prepared under federal requirements constitutes compliance with this section."
So it's essentially lip service to AI safety, probably to quell some objections to a bill that otherwise limits regulation of tech platforms.
jeffbee 1 hour ago [-]
I did read it. The point is there are no findings that justify the regulation in light of the grant of rights in the same bill. The only WHEREAS that approaches the level of a finding amounts to "many are saying..."
janice1999 1 hour ago [-]
The 2nd rule is clearly intended to be a shield and distraction. It's there to pretend the law serves the public, when in reality it's designed to defend datacenter builders from the public interest. Politicians can talk about meaningless sci-fi concepts like Skynet and how they can defeat it with off switches, instead of real issues like noise pollution, tax giveaways, electricity prices, and mass surveillance.
hnsdev 1 hour ago [-]
Probably one applies to individuals while the other, as described, applies to infrastructure.
selectively 44 minutes ago [-]
The tragedy is that 'right to compute' is such a great name for something actually useful. Requiring OEMs to allow users to load any OS they want, requiring OEMs to allow full control over a device/OS ('root access') etc.
Instead, it's wasted on AI slop.
Nevermark 5 minutes ago [-]
"Write to Computer 2.0" sounds good to me. Might as well slipstream.
Sell it as the "individual freedom of compute" complement to "data center freedom of compute".
Put that way, it practically sells itself. What politician wants to vote against individual freedom, after they voted in the same for data centers?
amelius 36 minutes ago [-]
Yeah, "you can own compute hardware" doesn't really help if nobody makes hardware that can be owned.
dboreham 1 hour ago [-]
(2025)
righthand 44 minutes ago [-]
This is a law designed to force data centers to be built. This is nothing but a bipartisan corporate handout. Nothing to celebrate.