- cross-posted to:
- technology@hexbear.net
None of what I write in this newsletter is about sowing doubt or “hating,” but a sober evaluation of where we are today and where we may end up on the current path. I believe that the artificial intelligence boom — which would be better described as a generative AI boom — is (as I’ve said before) unsustainable, and will ultimately collapse. I also fear that said collapse could be ruinous to big tech, deeply damaging to the startup ecosystem, and will further sour public support for the tech industry.
Can’t blame Zitron for being pretty downbeat in this - given the AI bubble’s size and side-effects, it’s easy to see how its bursting could have some cataclysmic effects.
(Shameless self-promo: I ended up writing a bit about the potential aftermath as well)
deleted by creator
a phrase so load-bearing you could build skyscrapers out of it
“custom-built AI based on sound data science”?
Summarizing emails is a valid purpose. If you want to be pedantic about what AI means, go gatekeep somewhere else.
Me, showing up to a chemistry discussion group I wasn’t invited to:
Or it would be, if LLMs were sufficiently dependable.
JFC how many novel-length emails do you get in a week?
I think a more constructive way to handle this problem is to train people to write better emails.
a sort of problem that only LW forum users have
“AI, please summarize this LW”
“Certainly. Here is the summary: give all your money to Yud or burn in virtual hell forevermore”
who the fuck are you again? go post somewhere else
ah yes, thanks for that extremely valuable drive-by input that you gave after someone most definitely was speaking directly to you
oh, wait. hang on. sorry, I just checked my notes here. it’s the other thing, the opposite one.