There’s a preprint out that claims to prove that the technology underlying LLMs can never be extended to AGI, because of the exponentially increasing resources it would require. I don’t know enough formal CS to evaluate their methods, but to the extent I understand their argument, it’s compelling.
Cluster munitions are bad when you’re an invading army because some of the submunitions fail to detonate, endangering civilians who come across them later. Ukraine, however, is using them on its own territory to fight russia, which 1) is already using cluster munitions with a higher failure rate than the ones the US is providing Ukraine, and 2) deliberately mines the areas it invades in ways designed to kill civilians (e.g., rigging a mine to explode if you try to move the corpse of a beloved family dog). So in this case, using the US’s cluster munitions to push russia out is a net positive.