Yes…? All are except Microsoft, which is why most companies I work with aren’t looking that way.
I know several large companies looking to Microsoft, Xen, and Proxmox. Though the smart ones are more interested in the open source solutions to avoid future rug-pulls.
The 2009 era was also when Intel leveraged their position in the compiler market to cripple all non-Intel processors. Nearly every benchmarking tool used that compiler, which put an enormous handicap on AMD processors by locking them to either no SSE at all or, later, back to SSE2.
My friends all thought I was crazy for buying AMD, but accusations had started circulating about the compiler heavily favoring Intel at least as early as 2005, and they were finally ordered to stop in 2010 by the FTC… Though of course they have been caught cheating in several other ways since.
Everyone has this picture in their heads of AMD being the scrappy underdog and Intel being the professional choice, but Intel hasn’t really worn the crown since the release of Athlon. Except during Bulldozer/Piledriver, but who can blame AMD for trying something crazy after 10 years of frustration?
Dishonor on you! Dishonor on your cow!
I host my own to avoid running into timeouts; it’s fairly easy.
MRSA infection following hospital admission for pneumonia. That shit is serious and way more prevalent than people think; it’s just that it usually kills people who are already terminally ill.
Unlikely to be an assassination. But not impossible. Either way, looks very bad.
The recommendation to shareholders from Boeing’s independent proxy advisor is to vote out several board members who are responsible for safety and QA. Crazy to see at a Fortune 100.
You found one video supporting your viewpoint. Kaspersky’s role in Russian intelligence has been an open secret since the mid 2010s. This is Facebook Anti-Vaxxer “research” methodology.
No, I’m not conflating “a” with “b”. I’m using stability exactly as it’s used in physics.
https://phys.libretexts.org/Bookshelves/College_Physics/College_Physics_1e_(OpenStax)/09%3A_Statics_and_Torque/9.03%3A_Stability
My point is, it’s a completely valid use of the word. And yes, so is reliable, though I think “reliable” fails to capture the essence of a system changing but maintaining its state, hence why we don’t study “reliable systems” in physics.
I recommend picking something else to be pedantic about.
Amazingly, for someone so eager to give a lesson in linguistics, you managed to ignore literal definitions of the words in question and entirely skip relevant information in my (quite short) reply.
Both are widely used in that context. Language is like that.
Further, the textbook definition of stability:
the quality, state, or degree of being stable: such as
a: the strength to stand or endure : firmness
b: the property of a body that causes it when disturbed from a condition of equilibrium or steady motion to develop forces or moments that restore the original condition
c: resistance to chemical change or to physical disintegration
Pay particular attention to “b”.
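For what it’s worth, “b” is exactly the restoring-force picture from the physics link above. A minimal worked example, using the standard mass-on-a-spring system (my choice of illustration, not from the linked article):

```latex
% Equilibrium at x = 0; disturb the mass by a displacement x.
% Hooke's law gives a restoring force with spring constant k:
F = -kx, \quad k > 0
% The force always points back toward equilibrium, so the
% disturbed system "develops forces ... that restore the
% original condition" -- stable, per definition (b).
```

An inverted pendulum is the opposite case: any disturbance grows instead of being restored, so it’s unstable in exactly this sense.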
The state of my system is “running”. Something changes. If the system doesn’t return to the “running” state, the system is unstable BY TEXTBOOK DEFINITION.
I think the confusion comes from the meaning of stable. In software there are two relevant meanings:
1. Unchanging, or changing as little as possible.
2. Not crashing / not requiring intervention to keep running.
Debian, for example, focuses on #1, with the assumption that #2 will follow. And it generally does, until you have to update and the changes are truly massive and the upgrade is brittle, or you have to run software with newer requirements and your hacks to get it working are brittle.
Arch, on the other hand, focuses on #2, attempting to ensure that every change, while frequent, is small, with a handful of notable exceptions.
Honestly, both strategies work well. I’ve had Debian systems running for 15 years and Arch systems running for 12+ years (and that limitation is really only due to the system I run Arch on, rather than their update strategy).
It really depends on the user’s needs and maintenance frequency.
The author doesn’t seem to understand that executives everywhere are full of bullshit, and that marketing and journalism everywhere are perversely incentivized to inflate claims.
But that doesn’t mean the technology behind all that executive bluster, marketing, and journalism isn’t game-changing.
Full disclosure: I’m both well informed and undoubtedly biased as someone in the industry, but I’ll share my perspective. Also, I’ll use “AI” here the way the author does, to represent the cutting edge of machine learning, generative self-reinforcement learning algorithms, and large language models. Yes, AI is a marketing catch-all, but most people better understand what “AI” means, so I’ll use it.
AI is capable of revolutionizing important niches in nearly every industry. This isn’t really in question: there have been dozens of scientific papers and case studies proving this in healthcare, fraud prevention, physics, mathematics, and many, many more.
The problem right now is one of transparency, maturity, and economics.
The biggest companies are either notoriously tight-lipped about anything they think might give them a market advantage, or notoriously slow to adopt new technologies. We know AI has been deeply integrated in the Google Search stack and in other core lines of business, for example. But with pressure to resell this AI investment to their customers via the Gemini offering, we’re very unlikely to see them publicly examine ROI anytime soon. The same story is playing out at nearly every company with the technical chops and cash to invest.
As far as maturity, AI is growing by astronomical leaps each year, as mathematicians and computer scientists discover better ways to do even the simplest steps in an AI. Hell, the groundbreaking papers that are literally the cornerstone of every single commercial AI right now are “Attention Is All You Need” (2017) and “Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks” (2020). Moving from a scientific paper to production generally takes more than a decade in most industries. The fact that we’re publishing new techniques today and pushing them to prod a scant few months later should give you an idea of the breakneck speed the industry is moving at right now.
And finally, economically, building, training, and running a new AI oriented towards either specific or general tasks is horrendously expensive. One of the biggest breakthroughs we’ve had with AI is realizing that the accuracy plateau we hit in the early 2000s was largely caused by limits on data scale and quality. Fixing these issues at a scale large enough to make a useful model takes insane amounts of hardware and energy, and if you find a better way to do things next week, you have to start all over. Further, you need specialized programmers, mathematicians, and operations folks to build and run the code.
Long story short, start-ups are struggling to come to market with AI outside of basic applications, and of course cut-throat Silicon Valley does its thing, and most of these companies are either priced out, acquired, or otherwise forced out of business before bringing something to the general market.
Call the tech industry out for the slime it generally is, but the AI technology itself is extremely promising.
Xorg needs several of its extensions to function at the same level as Weston+Wayland. At minimum you’d need xorg server, proto, lib, and driver… Maybe a few other things I’m forgetting.
Weston is, by file size, about equal to xserver. But really there is more utility in Weston than in xserver.
They put ads in.
But what if they don’t need that many people working on Firefox? What if AI, VR, and network programmers have fundamentally different skills from a web browser programmer, and don’t want to change their career trajectory?
What if, by not firing these people, Mozilla folds in 3 years and everyone ends up without a job?
Not every project makes 2x the money with 2x the people. It’s the “why can’t nine moms make a baby in one month” problem. Hell, most projects will slow down significantly with an influx like that.
Look, layoffs suck, but employment is quid pro quo: employees can leave at any time too. If a company isn’t abusive or arbitrary with its layoff decisions, has decent layoff benefits, and doesn’t refuse to give job recommendations, it’s hard for me to hold a layoff against the employer.
No argument here. I’m a PF2e player since beta and won’t touch HasWizards products with a 10 foot disintegrate.
The base ruleset (SRD) only. Everything else is OGL, which has proven to be as open as Wizards Hasbro wants to make it.
Over years of using Vim both professionally and personally, I’ve learned to just install LunarVim and add only a handful of packages/overrides. Otherwise I waste too much time tinkering instead of doing the things I need to do.