

I call bullshit. They might say it's opt-in, but I bet they have some way to use the personal data that technically doesn't violate the very specific wording of the rule.
And a bunch of people didn't, but we don't talk about them; it was the norm back then.
That you know of
Or at least willing to push whatever barely working bullshit they forgot to test.
on things that actually happened.
The thing you're saying didn't happen literally happened. What are you talking about?
OK, OK. I will ask our IT to buy some for the office; since it's such a general purpose computer, it's cheaper than the laptops we're using. I would probably buy one for my mom, too: tax season is coming up, her computer is quite old, and she will like the upgrade. Then I will get one for my wife; this general purpose computer will help her with her scientific research. I would probably not get one for my uncle, though. He's not great with computers, and the only thing he does is play games, so some dedicated gaming console will suit him better; he's always a bit confused by general purpose laptops and needs something dedicated to gaming.
Shit like that needs to be stopped while it's just someone's wild wet dream. By the time it becomes solid plans with a budget and deadlines, it will already be too late.
Framework's whole model is to accommodate exceptions.
Some people have dead laptops with fried motherboards, dead batteries, or cracked screens that are absolutely unrepairable but still have working memory and SSDs. I know I do.
The more powerful i5 with 32 GB of RAM and a bunch of expansion cards amounted to about 1200€. A bit more than similarly specced 15-17″ laptops, but on a preliminary search I couldn't find anything with these specs in this form factor, so I couldn't compare properly.
It’s a portable gaming console with in-built gaming controls. What do you think it is?
Most people didn't grow up with Windows or Mac; that was a blip in time. Most people grew up with a phone. When it comes to PCs they're a blank slate: they have as much familiarity with the idea of a Windows Start menu as they do with a Linux console. That is to say, they saw it in a movie.
Most people do know how to use a computer though.
That was kind of true for a brief period of time, and even then it wasn't entirely true. Now most people first encounter a computer when they enter the workforce. They know shit about shit; they never had to tinker with computers, and most of them never had one outside of some Chromebook that could render two web pages. In most cases they start from a basically blank slate.
Most people do in fact associate a cog with settings.
Most people don't know that it's a cog. Most people don't know it's a button. Most people don't have the concept of a button in mind. Most people entering the workforce right this moment have never used a mouse to press a cog button in their life. Unless they're in IT or engineering.
Also, I’m not talking about fixing problems
This is usually when you're kind of required to use the console on Linux; that's why I was talking about it.
But my broader point was against the so-called intuitive, self-explanatory nature of the menu you have to click with your mouse.
In my recent experience, pointing someone coming from Windows or Mac towards Ubuntu, or whatever Ubuntu derivative is the most popular now, brings a worse experience than giving them Arch. The Ubuntu problems that will eventually arise will be neither fun nor basic. Have you ever tried to teach a newbie to install the correct Nvidia drivers? Or to teach someone how to upgrade their system-wide Python version so the third-party PPA repo you just taught them how to add will update correctly?
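For reference, the "correct Nvidia drivers" dance on Ubuntu usually looks something like this. This is just a sketch of the common path; the exact driver package name (535 here) is illustrative and varies by release and GPU:

```shell
# List the driver packages Ubuntu recommends for the detected GPU
ubuntu-drivers devices

# Install the recommended proprietary driver automatically
sudo ubuntu-drivers autoinstall

# ...or pin a specific version by hand (version number is illustrative)
sudo apt install nvidia-driver-535

# Reboot so the new kernel module gets loaded
sudo reboot
```

And that's the happy path; it says nothing about Secure Boot MOK prompts or a driver breaking on the next kernel update, which is where the newbie actually gets stuck.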
I don't know, maybe 15 years ago it was different, but right now I can tell you from numerous experiences: Arch with KDE goes down way more smoothly than any Ubuntu or even Debian.
Unless there are zero problems and everything just works, and the user just uses the computer as an interface for Chrome, in which case it kind of doesn’t matter.
but not really a UX
What else could it be if not UX? Not being able to set up a shortcut for switching the keyboard layout without a bunch of bullshit hoops is an eXperience I have as a User.
Any UI could be faulted for that then
Yes, it's a metric by which we measure the experience. Some things should and could be easily customisable, and if they aren't, it's a fault of the UI.
As for apps not written for it again, not something they have control over
If they're making a window manager, they need to consider the apps a user might run with it. If, for example, a browser doesn't render half of the internet correctly because it added unexpected rendering conventions, it's a shit browser. The same could be said about a desktop environment.
Other DEs are expected to run apps; Gnome expects you to write your app with Gnome in mind. That's a big difference.
If you don’t see the difference between a gaming console and a general purpose personal computer, I really don’t know what to tell you.
I agree on consistency. It does have vision and it is consistently implemented.
It has different problems. It doesn't play well with apps not written for it, it doesn't allow for a good deal of customisation, and it's full of bugs and questionable decisions. All the UI stuff is subjective, but bugs and unresponsiveness aren't.
We should've stopped doing it before he got into office. Now his ramblings are interpreted by his followers as laws, so you have to pay as much attention to them as possible. For your own safety, and for the safety of others.
Hey, I'll do you one better: I worked professionally teaching people to use various software, but it only taught me that we know even less about what people think and how people operate than we thought we did.
That’s my point, I don’t believe blogposts paint the whole picture.