On LinkedIn I see a lot of posts aimed at software developers along the lines of “If you’re not using these AI tools (X, Y, Z), you’re going to be left behind.”

Two things about that:

  1. No, you’re not. If you have good soft skills (clear communication, showing up on time, solid time management), you’re already in excellent shape. No AI can do that stuff, and for that alone, no AI can replace people.
  2. This rhetoric comes directly from the billionaires who are laying off tech workers by the hundreds of thousands as part of the class war they’ve been waging against all working people since the 1940s. They want you to believe you have to scramble and claw over one another to learn the “AI” they’re forcing onto the world, so that you stop honing the skills that actually matter (see #1) and become easier to make obsolete later. Don’t fall for it. And it’s far from clear how any of this will shake out once governments get off their asses and start regulating it: most of these “AI” tools are blatantly breaking copyright and other IP laws, and someday that will catch up with them.

That said, it is helpful to know thy enemy.


I generally don’t buy into hype anyway: I didn’t buy into the “Cloud” hype or the “Cryptocurrency” hype, and I’m sure as hell not buying into the so-called “AI” hype. Wake me up when I can run this shit™ on my own GPU-powered machines 🤣 As a vision-impaired person, the only use case I’ve found that’s remotely useful for me is summarizing text. The problem, though? I’m not going to use OpenAI’s service to do it. Why? Privacy! Fuck me, do you really think I’m just going to dump shit™ into your API endpoints (or web app)?!
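
For what it’s worth, that “run it on my own GPU” summarization use case is already doable entirely offline. Here’s a minimal sketch (the model choice and input file are illustrative assumptions, not a description of any particular setup) using the Hugging Face transformers library: the model is downloaded once to the local cache, and after that nothing gets sent to a hosted API.

```python
# Minimal sketch: summarize text on a local GPU with a locally cached model,
# so nothing leaves the machine. The model name and input file are
# illustrative assumptions, not a specific recommendation.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="sshleifer/distilbart-cnn-12-6",  # any local summarization model works
    device=0,                               # first CUDA GPU; use -1 for CPU
)

with open("article.txt") as f:  # hypothetical input file
    text = f.read()

summary = summarizer(text, max_length=120, min_length=30, do_sample=False)
print(summary[0]["summary_text"])
```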

One of these days I’d like to build a small GPU cluster, but I haven’t decided how yet. A cluster of Raspberry Pis, a more expensive cluster of 1RU pizza boxes with NVIDIA Tesla cards, or a 1RU 4-node NVIDIA Jetson AGX cluster?


@prologic@twtxt.net yeah. I’d add “Big Data” to that hype list, and I’m sure there are a bunch more that I’m forgetting.

On the topic of a GPU cluster, the optimal design is going to depend a lot on what workloads you intend to run on it. The weakest link in these things is the data transfer rate between nodes, but that won’t matter much for compute-heavy workloads. If your workloads involve moving a lot of data, though, you’d be better off with a smaller number of high-VRAM cards than with a larger number of interconnected cards. I guess that’s hardware engineering 101 stuff, but still…
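
To put rough numbers on that “weakest link” point, a roofline-style back-of-envelope check is enough to tell whether a workload would even notice the interconnect. All figures below are made-up assumptions purely for illustration:

```python
# Rough roofline-style check: is a workload limited by compute or by the
# link that feeds each card its data? All numbers are illustrative assumptions.
def bottleneck(flops_per_byte: float, peak_tflops: float, link_gb_per_s: float) -> str:
    """Compare the workload's arithmetic intensity (FLOPs of work per byte
    moved over the link) against the machine balance (peak FLOP/s divided
    by link bytes/s). Higher intensity -> compute is the limit."""
    machine_balance = (peak_tflops * 1e12) / (link_gb_per_s * 1e9)
    return "compute-bound" if flops_per_byte >= machine_balance else "transfer-bound"

# Example: a ~30 TFLOPS card fed over gigabit Ethernet (~0.125 GB/s usable).
print(bottleneck(flops_per_byte=500, peak_tflops=30, link_gb_per_s=0.125))
# -> "transfer-bound": even 500 FLOPs of work per byte can't keep the card busy,
#    which is why data-heavy workloads favour fewer high-VRAM cards over many
#    cards behind a slow network.
```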

