Perhaps you're thinking of the EULA kind of situation. EULAs are probably not enforceable because they require the user to agree to additional terms after the main contract of sale is complete. It's considered a one-sided agreement because the user doesn't get anything in return (the contract of sale for the software already grants the right to use it).
For LLMs it really depends on the situation. If it's presented EULA-style, where you already bought the rights to use the LLM and ClosedAI handed you the EULA with additional terms afterwards, then the logic above applies. But EULAs are widely understood to be of dubious enforceability, and since almost nobody buys packaged software any more, this scenario is quite unlikely these days.
So, if the clause is just one of the many conditions in their main contract of service, which you had ample opportunity to review before purchasing/agreeing to use their service, then as long as the terms are legal (e.g. don't contradict some law), parties are generally free to agree to whatever they want in a contract, and courts will generally uphold those terms.
"Can't use them to feed employees of competing milkshake companies" is probably enforceable. Sounds silly, but I can't think of any reason why it wouldn't be upheld, unless there are antitrust factors involved.
"Can't use output of their API to train competing models" is most likely enforceable, again barring antitrust factors. These kinds of terms are pretty common, and nobody seriously thinks they're unenforceable per se.
Of course there are practical barriers to enforcing a contract -- the aggrieved party has to discover the breach, gather sufficient evidence, and file a lawsuit. As an average Joe individual, you probably don't need to worry about getting sued by a company for trivial breaches of service agreements. Most likely the service provider will just cut off the service rather than spend thousands of dollars tracking you down (and risk taking a PR hit for going after the little guy). But between businesses, the risk of getting sued by a competitor is real, and no sane lawyer would advise a business to ignore such contract terms.
(Btw, I am not a lawyer. I've studied these things a bit though.)