godforsaken.website is a uk-based mastodon instance boasting literally thousands of posts about bumholes and UNESCO world heritage sites


I'm really really really not interested in computers getting more powerful.
I am super interested in them being more repairable and modifiable, drawing less power, lasting and being supported for way longer etc. That stuff still gets me excited

Computers are as fast as they need to be. I know that's an old joke we've been making for decades, but these days it feels like software is just getting more and more bloated for the sake of selling faster computers, and features for the sake of more=good capitalism

Shrig 🐌

I've built a few really high-end PCs for people recently and nobody needs a computer that fast. It's like things load before you've even realised you've clicked on them. We don't need more than that! It's really silly

@Shrigglepuss hah I was telling someone today "computers peaked around Nehalem and SSDs"

@arrjay I've used a few PCIe 5 SSDs now, it's ridiculous

@Shrigglepuss @arrjay

The two biggest computer upgrades I've seen are multicore and SSDs.

@jookia @Shrigglepuss @arrjay IMO M.2/NVMe SSDs feel to (SATA) SSDs like SSDs felt to HDDs

@arrjay @Shrigglepuss It really was around that exact time, wasn't it?

4-8GB of RAM was good enough for almost everything, we had multi-core processors for parallel tasks, we had GHz clock speeds. Our screens were all at least "high definition" LCDs. Nehalem was kind of a game changer, especially for laptops (lower TDP, good iGPU, some good modern features), and SSDs getting cheaper were the last bottleneck surpassed.

Everything since then was either minor, a huge cost/size/power increase, or actually inconvenient as fuck and less durable.

@eldaking @Shrigglepuss I'm still using a Xeon W3565 for "generic desktop stuff"

SSDs and a newer Radeon and it's...fine, y'know? totally competent day-to-day.

@Shrigglepuss I think most of us are really quite relieved that ten-year-old laptops are...fine, really. 4GB of RAM is...fine, I guess. SSDs made laptops feel faster than any CPU improvements ever did.

Years ago I was excited to get another used laptop as a backup, or re-purpose a chromebook. These days I look around and think "I have too many laptops: I don't use all of them" and that's kind of amazing.

@spacehobo It is! I get given people's old laptops sometimes, so I refurbish them and donate them onwards to a few local charities I'm in contact with.
I have no need for them, but a person being supported through homelessness can do a course and their admin on one, as long as it can still browse the web and do word processing. There's simply too much tech out there now, and it's distributed all wrong

@Shrigglepuss Well, _some_ people need machines that fast. People who play high-end computer games, people who use video editing software, people who do professional graphics editing or 3D modelling, people working with AI. The kind of stuff for which you would use an incredibly expensive workstation 20-40 years ago, but which nowadays can be done with a machine for €800-3000.
However, we really need to make our high-end electronics, whether computer hardware or anything else, last a lot longer.

@LordCaramac @Shrigglepuss That is like saying "some people need an oscilloscope" or "some people need a lathe". It is technically true but it should not change anything for anyone else.

@eldaking @Shrigglepuss I own two oscilloscopes, and I've been thinking about building a small lathe for myself using the electric motor from an old drill.

@eldaking @Shrigglepuss Most of the time I don't even start my big PC, though, because my Raspberry Pi 400 is more than enough.

@LordCaramac @Shrigglepuss The need for high-end computers for games is a marketing fabrication. Read the reviews of games from 10 years ago and tell me how often you see "This is the best of its time and it still sucks, but fortunately 10 years from now we'll reach a decent level". Or tell me how bad it is today and how much you're looking forward to the graphics 10 years from now.
We used to have fun with old consoles. We were only convinced they're old and their games ugly because we're shown nicer graphics now.
If we were to freeze graphics quality, 3D modelling would also suddenly be less demanding.

I would add that games are made more and more complex for no reason. Nowadays you need to follow a series of tutorials just to play. Is there any fun in a tutorial?? Where is the good old "launch it and you'll have it figured out in 5 minutes"? We went from "shooting game", to "multiple choices of weapons", and soon to "full training on the weapons available to all the armies in the world". Why not add an exam at the end of the damned game course, because that's what I'd expect from a game: feeling like I'm back at school with all kinds of evaluations and tons of information to learn by heart.

@matlag @Shrigglepuss I've never been into shooter games; my favourite games have always been adventure games (the kind where you play the lead in a story but don't know the story, and you need to solve riddles, find out what to say to people, how to find items, and what to do with them, in order for the story to progress), turn-based strategy games, simulations, and puzzle games. For many of them, you need to read the manual first in order to know what to do.

@matlag @Shrigglepuss And we're not quite at the point where more powerful GPUs don't make sense anymore. Real-time graphics, in resolution, detail, and physics simulation, are not quite at the point where you can't tell them apart from video of the real world, but it's getting very close, partly because games use more and more machine learning nowadays, which makes details and movements look more natural.

@LordCaramac @Shrigglepuss And frankly, do you enjoy games more with more realistic, real-world-like rendering? Because that's precisely my point: the best games were never the best because they had the best rendering. The gameplay doesn't need it. Gamers don't spend more time on games now than they did 10 years ago because of more realistic rendering.

@matlag @Shrigglepuss Well, for the kind of game I like to play, graphics aren't even that important, but I'm not your typical gamer.

@matlag @LordCaramac @Shrigglepuss Monster Hunter and Final Fantasy have too many text tutorials for sure, but you can make learning fun. That’s what a video game is, isn’t it? An extended experience where you learn how to complete, or even, le gasp, enjoy completing as much of it as you want.

@MxVerda @matlag @Shrigglepuss My favourite kind of game is the one where you stumble through a story and try to find out what you need to do for things to happen. I still think the best games ever are The Secret of Monkey Island (VGA version) and Space Quest IV, both from the early 1990s; adventure games peaked back then and never got any better. However, I'd love to play a game like that with the looks of a Pixar feature film, which would be feasible, but adventure games aren't popular.

@MxVerda @matlag @Shrigglepuss They used to be very popular in the 1980s and 1990s, but the first few attempts at making 3D adventure games had horrible gameplay, so people mostly stuck to making 2D ones, and younger players who had grown up on 3D graphics never got into the genre.
I think it would be absolutely possible to make them work in 3D; it's just that nobody is interested in that anymore. There isn't enough money in the genre.

@matlag @LordCaramac @Shrigglepuss Because depth is what makes the game interesting? As someone who actively plays an MMO looter shooter (which is much more complex than your average shooter), part of the fun is making a build and designing your own experience, or engaging in the story, lore, quests, etc.
Granted, there is an onboarding problem for new players, but that's a speed hump.
Improved performance lets artists improve the atmosphere, the density of stuff, the systems, and the tone, if used correctly

@Shrigglepuss
I think I'm in the one field where it's still viable to build a 128 core behemoth with 4 GPUs as a workstation, and still want it under a desk for a single user.
Even for us it's getting rare and impractical.
I like to freak those folks out by A: being more senior and better at everything while also B: using a rockpro64.

@Shrigglepuss
This but also it's like some apps keep requiring more and more specs the more you have to give them and in the end you don't really do "more". Just the same things but less efficiently

@Shrigglepuss We do need loading that fast, but we had it 25 years ago, and lost it because the software keeps getting worse thanks to new ideologies and frameworks aimed at making the people who make software fungible and disempowered rather than a skilled craft we only need a small number of people who actually care to be doing.

@dalias @Shrigglepuss I would like to request rephrasing in shorter sentences to aid comprehension please

@MxVerda @Shrigglepuss 25 years ago we had apps that loaded instantly when you pressed enter/clicked. This is important to usability. The reason we don't now is bloated frameworks and development ideologies around using them. The driving force behind these ideologies is enabling corporations to put tens or hundreds of low paid replaceable devs on a project, which is very inefficient and hard to make work due to principles described in Mythical Man Month.

@Shrigglepuss

Well, DevOps, Serverless, and Kubernetes on the server side will ensure that computers are never fast enough. ;-)

If you have been around long enough: GEM ran desktop publishing (Ventura?) on something like a 2MB 386, in less disk space than a regular web page takes today.

It's mind-blowing to think about, especially having lived it.

@Shrigglepuss the last few notebooks I've bought all come from e-recyclers. great thinkpads, slapped linux on them, bam total daily drivers

all but 1 of the last 12-15 machines, in as many years, i've bought or built came from erecyclers. the only new one i got was for my kid's gaming rig, and even that runs linux.

i have two machines from 07, 08 which still run just fine with linux.

it can be done. just not by apple or microsoft.

used machines last decades. i have proof

@Shrigglepuss hmmm, ok I gotta be contrary but also I am curious.

I want to stream, run a vtuber with hand and face tracking, play a video game like subnautica (yes, it’s still buggy and not at all optimised years later, but I still love the concept), aaaand… run OBS for recording the whole thing too.
Also run a second screen for twitch chat (or peertube? Idk. I’ll figure something out).

What do.

/ would even that be covered by some eco-friendly non top-spec setup?

I don’t do anything else but video editing and maybe animation eventually.. ooh! Live 2D Cubism! That eats up a lot, right?

I am not particularly techie tbqh.

@Shrigglepuss I want a computer that's at least an order of magnitude faster so I can run a simulated processor using CXXRTL at 1 MHz instead of 100 kHz

this still isn't enough to run real-time tasks in, well, real time, but it's better

also high end FPGA compilation tasks regularly take 4-8 hours, up to 24 in some cases; I would definitely want that to be faster

@Shrigglepuss My frustration with video games is that they keep requiring more powerful GPU. I don’t want to have better graphics than Portal 2, I want to have fun.

@Shrigglepuss imo the last upgrade I got was necessary. Getting Linux kernel compile times down from almost an hour on my old system to barely double-digit minutes is a *huge* time saver if you regularly do kernel dev.
I do agree though that for normal usage (mostly web browsing) an x230 with a 3rd-gen Intel Core is still usable.

@jakob @Shrigglepuss
1/2 Reposting this:
I started electronic music creation in 1992 on a Mac with a 25 MHz processor. The Hyperprism app let me move the mouse to modulate the pitch & time of a 16-bit file as it was playing, clean audio ready to record.
Nothing remotely like this appears available today for machines with 2.6 GIGAhertz processors.

@jakob @Shrigglepuss
2/2
Could someone more learned in personal computers assure me this is NOT because most of the machines' processing is dedicated to net surveillance, creativity a hood ornament to attract PC & Apple shoppers?
Corollary: If I made my aging brain struggle to learn Unix, would that system offer more creative opportunity & less spying?