let's delve for five minutes into that fantasyland where the military considers that current genai has some military application. ai-powered murderbots are still not gonna happen; something closer to whatever surveilling hellscape peter thiel is cooking seems more likely (which comes with the same set of problems as the current israeli ai-"aided" targeting, and is imo a fig leaf for diffusing responsibility. but anyway)
DoD is no stranger to overpriced, useless boondoggles, so they should have enough sense to throw it out. but if, and that's a massive, load-bearing if, the military adopts some variant of commandergpt, then it's gonna happen iff it provides some new capability that wasn't there before, or the improvement is so vast that it's worth adopting over whatever side effects, costs of tech, training etc. if it provides some new critical capability, then it will be tightly classified, because under no circumstances can it be allowed to find its way to the most probable adversary. a bunch of software sitting in some server farm in Utah is mostly safe there, unless mormon separatism becomes a thing overnight. putting it in a drone or missile, well, it's not impossible, but it's much harder. it has been done, though:
one way it was solved can be seen in this FGM-148 Javelin ATGM guidance module teardown. don't ask me why it's on youtube or how many watchlists this will get you on. you'll notice that there's no permanent storage there, and lots of the actual processing happens in a bunch of general-purpose FPGAs. the way it maybe perhaps works is that during cooldown of the IR sensor, the actual software and configuration of these FPGAs is uploaded from the CLU (command launch unit, which is a classified item) to the missile (which is not), and even if it's a dud, the enemy can't find out how the missile works: power is lost in seconds, RAM is wiped, and the missile reverts to a bricked state. this is to avoid what happened to the AIM-9B, which got cloned as the K-13/AA-2 Atoll.
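a minimal sketch of that idea, in case it's unclear: SRAM-based FPGAs hold their configuration only in volatile cells, so the loader just streams the bitstream from the launcher into the config port and never writes flash. all register addresses, the link function, and the framing here are invented for illustration; nothing below reflects the actual Javelin design.

```c
/* Hypothetical sketch of volatile FPGA configuration at launch.
 * All addresses and the umbilical protocol are made up. */

#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

/* Assumed memory-mapped slave-serial FPGA config interface. */
#define FPGA_CFG_DATA   ((volatile uint8_t  *)0x40000000u)
#define FPGA_CFG_STATUS ((volatile uint32_t *)0x40000004u)
#define CFG_DONE        (1u << 0)

/* Assumed blocking read from the CLU umbilical link. */
extern int umbilical_read(uint8_t *buf, size_t len);

/* Stream the bitstream straight from the CLU into the FPGA's
 * configuration port. It lives only in SRAM configuration cells:
 * there is no flash write anywhere in this path, so losing power
 * reverts the part to a blank, unconfigured state. */
bool load_bitstream_from_clu(size_t bitstream_len)
{
    uint8_t chunk[256];

    while (bitstream_len > 0) {
        size_t n = bitstream_len < sizeof chunk ? bitstream_len
                                                : sizeof chunk;
        if (umbilical_read(chunk, n) != 0)
            return false;               /* link fault: abort; nothing persists */
        for (size_t i = 0; i < n; i++)
            *FPGA_CFG_DATA = chunk[i];  /* shift byte into config logic */
        bitstream_len -= n;
    }
    /* the FPGA asserts DONE once the configuration CRC checks out */
    return (*FPGA_CFG_STATUS & CFG_DONE) != 0;
}
```

point being: the classified bits exist on the missile only between umbilical upload and loss of power, which is exactly why a dud on the battlefield tells the enemy nothing.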
power, space, and weight on a missile are limited. in order to make it work, that software has to be small, elegant, power-efficient, fast, reliable, redundant, hardened to conditions possible and impossible, sanely-behaving, and comprehensible to its maintainers. whatever openai has is anything but
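for a taste of what "redundant and hardened" means in that world, here's a generic sketch of one classic technique, triple modular redundancy: every critical value is stored three times and read back through a bitwise 2-of-3 majority vote, so a single bit flip in any one copy gets outvoted. this is textbook stuff, not anything from an actual guidance system.

```c
/* Generic TMR (triple modular redundancy) illustration. */
#include <stdint.h>

typedef struct {
    uint32_t a, b, c;   /* three independent copies of one value */
} tmr_u32;

static void tmr_write(tmr_u32 *t, uint32_t v)
{
    t->a = v; t->b = v; t->c = v;
}

/* Bitwise majority vote: each result bit is set iff at least
 * two of the three copies agree on it. */
static uint32_t tmr_read(const tmr_u32 *t)
{
    return (t->a & t->b) | (t->b & t->c) | (t->a & t->c);
}

int main(void)
{
    tmr_u32 fin_angle;
    tmr_write(&fin_angle, 0x00000C0Du);
    fin_angle.b ^= (1u << 7);   /* simulate a single-event upset */
    /* the vote masks the corrupted copy; returns 0 on success */
    return tmr_read(&fin_angle) == 0x00000C0Du ? 0 : 1;
}
```

now imagine applying that level of paranoia to a nondeterministic multi-gigabyte model.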
What really struck me was how Microsoft's big pitch for defense applications of LLMs was ... corporate slop. Just the same generic shit.
The US military has many corporate characteristics, and I'm quite sure the military has even more use cases than your average corporation for text that nobody wanted to write and nobody wanted to read. But I'd also have thought that a lying bullshit machine was an obviously bad fit for situations where the details matter because the enemy is trying to fucking kill you. Evidently I'm not quite up on modern military thought.
they have to, otherwise they risk interfering with something real that has real-life consequences, starting with things like not being within specification and getting reports that this shit doesn't work, breaks something mission-critical, or, worse yet, contributed to a fatal incident