sparcboxbuck
What happened to my cash?
Soylent Green
It's coming
The generative AI stuff, and what’s available in foundational models alone, is mind-blowing. I’ve been doing what amounts to deep learning since the mid ‘90s, when my first job was DoD-related… so I’ve been at this for a long time. The reason I share this: as a cognitive scientist (by education, anyhow), I’ve been floored by what we’re seeing.
You can stop there if you want. If you want more explanation, keep reading…
Not to put anyone in a panic, but I’m deeply involved in the AI work being done in our company. Given my background, as a practitioner and by education, I’m a bit beside myself at how good this stuff really is. It used to be that if you wanted to train a BIG deep learning model, or something like one of these generative language models, there wasn’t enough computational horsepower to pull it off. That’s no longer the case. But what’s really crazy is that now the computational resources exist, and with them, a LOT of the heavy lift you’d have to do for a bespoke solution is already being done and put into the public domain. That is to say, I could train an LLM (large language model) from scratch, but at a cost of $30-$50K, minimum, in computational resources alone. There already exist, however, a number of general LLMs that have been “pre-trained” (that’s what the “PT” stands for in “GPT”). The pre-trained models can be fine-tuned at a significantly lower cost, and with a very limited amount of data, to adapt them for custom use cases.
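To make that concrete, here’s a minimal sketch of fine-tuning a pre-trained model on a small custom text file using the Hugging Face Transformers library. The checkpoint name, data file, and hyperparameters are placeholder assumptions, not anything from our actual work:

```python
# Minimal fine-tuning sketch: adapt a pre-trained ("PT") causal language model
# to a small domain-specific corpus. Checkpoint, file name, and hyperparameters
# are illustrative placeholders.
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from datasets import load_dataset

model_name = "gpt2"  # any pre-trained causal LM checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# A relatively small amount of domain text is often enough to adapt the
# general model to a custom use case.
dataset = load_dataset("text", data_files={"train": "my_domain_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="finetuned-model",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    learning_rate=5e-5,
)

Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()
```

The point is the shape of the workflow: the expensive pre-training is someone else’s sunk cost, and the adaptation step runs on a fraction of the hardware.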
So, great… I can make a sex-bot from a foundation model on the cheap? What’s the big deal?
The number of fucks I give about your sex-bot? Zero. The big deal is where this shit is going… and how quickly we’re going to get there. I’m sure some of you have seen the AI-based Morgan Freeman?
Or the Joe Rogan deep fake? Guess what, that shit is funny. Maybe a bit spooky, but it’s fucking funny…
That is, until you realize that this technology is literally in its infancy relative to the computational resources we have today. NGL, this is light-years ahead of the DoD work we were doing in the ‘90s… and back then we had nowhere near the kind of resources we have now. I won’t even go into the shit we had access to in DoD work at the time… even by today’s standards, it was impressive… but it sure as fuck wasn’t available to my 18-year-old son and a credit card.
But back to the point of foundational models. When the cost to adapt someone else’s heavy lift is 1% of the initial lift, it democratizes the technology (not necessarily in a good way). That makes it available to any hack with decent skills and a credit card… combine that with bad intentions, and this shit can go sideways in a hurry.
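For a sense of what that ratio means in dollars, here’s the arithmetic using the post’s own numbers (the 1% figure is the post’s estimate, not a measured cost):

```python
# Applying the "1% of the initial lift" estimate to the $30-50K
# from-scratch training figure quoted above. Illustrative arithmetic only.
from_scratch_usd = (30_000, 50_000)   # cost range to train an LLM from scratch
adaptation_ratio = 0.01               # fine-tuning as a fraction of that lift

print([cost * adaptation_ratio for cost in from_scratch_usd])  # [300.0, 500.0]
```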
And we’ve only discussed generative AI for language, video and voice.
This is where you need to do what the sorority girl from UMD (recall the “cunt punch” email) said. How did she put it? Ahh, that’s right… “tie yourself down to whatever chair you’re sitting in, this is going to be a rough fucking ride.”
Imagine a parallel technology with equal or greater computational power being applied to the encryption algorithms that do things like protect networks and… awww… fuck… our national security… and those of our friends and our foes.
I’ll stop there and let that shit sink in for a bit. The world is being sidetracked by the shiny object called GPT, while spoofing someone’s likeness and having an AI model write a term paper are the LEAST OF YOUR FUCKING CONCERNS, or sure as fuck should be.
Pray that the encryption technology stays well in front of whatever the hell is being used to break it. I’m not convinced it’s a fair race.
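For a sense of scale on the brute-force side of that race, some back-of-the-envelope arithmetic (the guess rate is an absurdly generous assumption, not a claim about any real attacker):

```python
# Back-of-the-envelope arithmetic: time to exhaust a 128-bit key space
# at a hypothetical, extremely generous guess rate. Illustrative only.
keyspace = 2 ** 128                   # possible 128-bit keys
guesses_per_second = 1e18             # assumed exascale-class attacker
seconds_per_year = 60 * 60 * 24 * 365

years = keyspace / (guesses_per_second * seconds_per_year)
print(f"{years:.2e} years to try every key")   # ~1.1e13 years
```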
Last thing I’ll say is that we discuss ethical uses of this technology all the time. That’s all well and fucking good when everyone buys into the idea of ethical use. No need to say any more.