- cross-posted to:
- [email protected]
I’ve actually heard from pretty respectable folks that one of the goals of NATO in Ukraine, and possibly quite an important one, is to gather training data for war/propaganda-oriented AI via the Palantir company. Apparently the same goes for Israel, so for this purpose who wins is not relevant, just that the fight rages on.
And lastly, and quite concerning, Palantir has apparently expanded into South Korea, which is alarming if the trend continues.
There is no doubt in my mind that AIs are being trained on the data. We already know systems like this exist, for example Sentient: https://www.theverge.com/2019/7/31/20746926/sentient-national-reconnaissance-office-spy-satellites-artificial-intelligence-ai
That said though, the tech is not unique to the US, and that means Russia and China would be training predictive systems on the data as well. Russia might be somewhat behind in that regard, but China most certainly would have military AIs that can rival the US.
War, war has changed.
Wouldn’t happen to have a source on that somewhere? I mean, with all the shit the west gets up to, I could def believe it. But I also wouldn’t be too quick to be scared by such a thing. On a fundamental level, one thing to remember is that even the best human generals can make tactical blunders, and AI is nowhere near the level of human intelligence in the first place. I could see attempts to use it for statistical judgments, but such judgments are likely to be locked into a specific scenario, and it’ll be hard to generalize them. At the end of the day, we’re still dealing with material conditions, and so is AI.
And though it might not be exactly the same tech, based on what I’ve seen so far with generative AI, it’s a lot less effective at generalizing than people tend to think. To put it in specifics, one of the problems in generative AI is that if the dataset has no coverage of a subject, the model is likely going to struggle to do anything with that subject; e.g. if a text model was never trained on any data about Legos, it won’t somehow extrapolate that Legos exist in the world (which makes intuitive sense when you think about an example like that). The same is true of humans, but it’s worse with AI: even we can only generalize so far beyond what we know for sure, and we overcome this by learning new things as we encounter them. But a lot of what’s getting called AI doesn’t learn a damn thing unless you explicitly make it do so, and training gets expensive fast. And if you try to make it self-learning, it can easily run off in a direction you don’t want, like the Microsoft Tay incident.
So I mean, they can try, but colonialism has gone on as long as it has with AI not even in existence for most of that time. AI might impact how brutality is carried out, but the brutality has been going on for hundreds of years. And in spite of that, China is doing well, as are some other AES states. BRICS is making progress. The empire can be resisted and will be until its war criminals are brought to justice.
Edit: wording
I don’t fear war AI right now; what scares me is the US stirring up war in Korea (again) just to gather more data to train their models. And here are some sources, nothing conclusive but quite suggestive: https://www.palantir.com/platforms/aip/defense/
https://time.com/6691662/ai-ukraine-war-palantir/
https://www.theregister.com/2024/04/15/hyundai_palantir_autonomous_ships/
My reading of this (now we’re into full speculation mode) is that all this AI training madness in Ukraine is the US both realizing that the eastern bloc might be better armed than them, and trying to use the head start they have in advanced machine-learning hardware, via Nvidia and the like, to get the military upper hand via AI before it’s too late. I think this is more out of despair than sound strategy, but this whole analysis might be wishful thinking leading me to think the US is in worse shape than it really is.