

Part of the reason that this jailbreak worked is that the Windows keys, a mix of Home, Pro, and Enterprise keys, had been trained into the model, Figueroa told The Register.
Isn’t that the whole point? They’re using prompting tricks to tease out the training data. This has been done several times with copyrighted written works. That’s the only reasonable way ChatGPT could produce valid Windows keys. What would be the alternative? That ChatGPT somehow reverse-engineered the algorithm for generating valid Windows product keys?
I don’t follow Mexican politics closely, but this could be part of an effort to curb obesity. I’ve heard they introduced taxes on sugary drinks for this, so this might be another avenue.
If people want cheap snacks, and private companies are only making unhealthy ones, you can introduce regulations to micromanage what they can produce, or you can introduce a complex taxation scheme to disincentivize sugary snacks. Or you can introduce your own product that meets a perceived unmet demand in an underserved market.