• 0 Posts
  • 31 Comments
Joined 1 year ago
Cake day: June 1st, 2023




  • Kata1yst@kbin.social to Technology@lemmy.world: The decline of Intel..
    2 months ago

    2009 era was also when Intel leveraged their position in the compiler market to cripple all non-Intel processors. Nearly every benchmarking tool used that compiler, which put an enormous handicap on AMD processors by locking them to either no SSE at all or, later, to SSE2 only.

    My friends all thought I was crazy for buying AMD, but accusations had started circulating about the compiler heavily favoring Intel at least as early as 2005, and Intel was finally ordered to stop in 2010 by the FTC… though of course they have been caught cheating in several other ways since.
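    The alleged behavior was a runtime dispatch that keyed on the CPUID vendor string rather than on actual feature flags. A minimal sketch of that pattern (illustrative only: "GenuineIntel" and "AuthenticAMD" are the real CPUID vendor strings, but the function and path names here are made up):

```python
def choose_code_path(vendor: str, has_sse2: bool) -> str:
    """Pick an optimized code path the way a vendor-gated dispatcher would.

    A feature-based dispatcher would check only `has_sse2`; gating on the
    vendor string instead sends capable non-Intel CPUs down the slow path.
    """
    if vendor == "GenuineIntel" and has_sse2:
        return "sse2_fast_path"
    # Non-Intel CPUs fall through here even when they support SSE2.
    return "generic_baseline"

# An SSE2-capable AMD chip still gets the unoptimized baseline:
choose_code_path("AuthenticAMD", True)   # -> "generic_baseline"
choose_code_path("GenuineIntel", True)   # -> "sse2_fast_path"
```

    The fix the FTC settlement pushed toward is the obvious one: branch on the feature flag, not the vendor.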

    Everyone has this picture in their heads of AMD being the scrappy underdog and Intel being the professional choice, but Intel hasn’t really worn the crown since the release of Athlon. Except during Bulldozer/Piledriver, but who can blame AMD for trying something crazy after 10 years of frustration?








  • Kata1yst@kbin.social to linuxmemes@lemmy.world: Arch with XZ
    3 months ago

    Amazingly, for someone so eager to give a lesson in linguistics, you managed to ignore literal definitions of the words in question and entirely skip relevant information in my (quite short) reply.

    Both are widely used in that context. Language is like that.

    Further, the textbook definition of stability:

    the quality, state, or degree of being stable: such as

    a: the strength to stand or endure : firmness

    b: the property of a body that causes it when disturbed from a condition of equilibrium or steady motion to develop forces or moments that restore the original condition

    c: resistance to chemical change or to physical disintegration

    Pay particular attention to “b”.

    The state of my system is “running”. Something changes. If the system doesn’t return to the “running” state, it is unstable BY TEXTBOOK DEFINITION.



  • I think the confusion comes from the meaning of stable. In software there are two relevant meanings:

    1. Unchanging, or changing the least possible amount.

    2. Not crashing / requiring intervention to keep running.

    Debian, for example, focuses on #1, with the assumption that #2 will follow. And it generally does, until you have to upgrade across truly massive changes and the upgrade is brittle, or you have to run software with newer requirements and the hacks you use to get it working are brittle.

    Arch, for example, instead focuses on the second definition, by attempting to ensure that every change, while frequent, is small, with a handful of notable exceptions.

    Honestly, both strategies work well. I’ve had Debian systems running for 15 years and Arch systems running for 12+ years (and that limit is really due to the hardware I run Arch on rather than the update strategy).

    It really depends on the user’s needs and maintenance frequency.


  • Author doesn’t seem to understand that executives everywhere are full of bullshit, and that marketing and journalism everywhere are perversely incentivized to inflate claims.

    But that doesn’t mean the technology behind those executives, marketers, and journalists isn’t game changing.

    Full disclosure, I’m both well informed and undoubtedly biased as someone in the industry, but I’ll share my perspective. Also, I’ll use “AI” here the way the author does, to represent the cutting edge of Machine Learning, Generative Self-Reinforcement Learning Algorithms, and Large Language Models. Yes, AI is a marketing catch-all. But most people better understand what “AI” means, so I’ll use it.

    AI is capable of revolutionizing important niches in nearly every industry. This isn’t really in question. There have been dozens of scientific papers and case studies proving this in healthcare, fraud prevention, physics, mathematics, and many, many more.

    The problem right now is one of transparency, maturity, and economics.

    The biggest companies are either notoriously tight-lipped about anything they think might give them a market advantage, or notoriously slow to adopt new technologies. We know AI has been deeply integrated in the Google Search stack and in other core lines of business, for example. But with pressure to resell this AI investment to their customers via the Gemini offering, we’re very unlikely to see them publicly examine ROI anytime soon. The same story is playing out at nearly every company with the technical chops and cash to invest.

    As far as maturity goes, AI is growing by astronomical leaps each year, as mathematicians and computer scientists discover better ways to do even the simplest steps in an AI pipeline. Hell, the groundbreaking papers that are literally the cornerstone of every single commercial AI right now are “Attention Is All You Need” (2017) and “Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks” (2020). Moving from a scientific paper to production generally takes more than a decade in most industries. The fact that we’re publishing new techniques today and pushing them to prod a scant few months later should give you an idea of the breakneck speed the industry is moving at right now.

    And finally, economically, building, training, and running a new AI oriented towards either specific or general tasks is horrendously expensive. One of the biggest breakthroughs we’ve had with AI is realizing that the accuracy plateau we hit in the early 2000s was largely due to limits in data scale and quality. Fixing these issues at a scale large enough to make a useful model takes insane amounts of hardware and energy, and if you find a better way to do things next week, you have to start all over. Further, you need specialized programmers, mathematicians, and operations folks to build and run the code.
    Long story short, start-ups are struggling to come to market with AI outside of basic applications, and of course cut-throat Silicon Valley does its thing, and most of these companies are either priced out, acquired, or otherwise forced out of business before bringing something to the general market.

    Call the tech industry out for the slime it generally is, but the AI technology itself is extremely promising.





  • But what if they don’t need that many people working on Firefox? What if AI, VR, and network programmers have fundamentally different skills from web browser programmers, and don’t want to change their career trajectory?

    What if, by not firing these people, Mozilla folds in 3 years and everyone ends up without a job?

    Not every project makes 2x the money with 2x the people. It’s the “why can’t 9 moms give birth in 1 month” problem. Hell, most projects will slow down significantly with an influx like that.

    Look, layoffs suck, but it’s quid pro quo. Employees can leave at any time too. If a company isn’t abusive or arbitrary with its layoff decisions, has decent layoff benefits, and doesn’t refuse to give job recommendations, it’s hard for me to hold a layoff against the employer.