This article argues that organizations either tried skills-based hiring and reverted to degree-required hiring because that reversal was warranted, or they never adapted their process at all, in spite of executive vision.

Since this article isn’t industry-specific, what are your observations or opinions about the technology sector? What about the general business sector?

Should first-world employees be required to obtain degrees if they reasonably expect a business-related job?

Do college experience and academic rigor reveal higher-achieving employees?

Is undergraduate education a minimum standard for a more enlightened society? Or a way to maintain separation between social classes and status?

Is a master’s degree the new way to differentiate yourself, the way an undergraduate degree used to be?

Edit: fixed multiple typos; I guess that’s proof that I should have done more college 😄

  • SpaceNoodle@lemmy.world · 10 months ago

    I’m not going to dig up decades-old code for you to pick over, but I do recall that the labyrinthine, ever-more-complex, and buggy behavior of the multitudinous builtins was an unending pain in the ass.

    • tsonfeir@lemm.ee · 10 months ago

      I was just wondering if you had anything off the top of your head. Any language can be spaghetti if you make it spaghetti. 🤷‍♂️

            • tsonfeir@lemm.ee · 10 months ago

              Using them incorrectly would be incorrect. Without an example, it’s hard to tell.

              But pretty much everyone was doing the web “wrong” back in the day. Server-side HTML generation? Gag me. Or worse, inserting PHP into HTML?! Shudder. But that’s how it was for many backend languages.
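
              To make that concrete, here’s roughly what the old pattern looked like. A minimal sketch in TypeScript/Node rather than PHP, with purely illustrative names and data:

              ```typescript
              // Server-side HTML generation: the full page is built as a string per request.
              import { createServer } from "node:http";

              // Illustrative data; in the old pattern this usually came from inline SQL.
              const users = [{ name: "Alice" }, { name: "Bob" }];

              createServer((_req, res) => {
                // Data and markup are interleaved directly, the pattern in question.
                const rows = users.map((u) => `<li>${u.name}</li>`).join("");
                res.writeHead(200, { "Content-Type": "text/html" });
                res.end(`<html><body><h1>Users</h1><ul>${rows}</ul></body></html>`);
              }).listen(8080);
              ```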

              IMO, nowadays, if it’s not a reactive JS front end using the backend as an API, it’s doing it wrong. But I’m sure in 10 years we will all be laughing at how seriously we were taking JavaScript.
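
              The frontend half of what I mean might look like this minimal sketch; the /api/users endpoint and the data shape are assumptions, not any particular app:

              ```typescript
              // Frontend using the backend as an API: fetch JSON, render in the browser.
              type User = { name: string };

              async function renderUsers(): Promise<void> {
                const res = await fetch("/api/users"); // assumed endpoint
                const users: User[] = await res.json();
                const list = document.querySelector("ul#users");
                if (list) {
                  // All rendering happens client-side; the server never ships markup.
                  list.innerHTML = users.map((u) => `<li>${u.name}</li>`).join("");
                }
              }

              void renderUsers();
              ```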

              • SpaceNoodle@lemmy.world · 10 months ago

                It makes me shudder to think how the modern web is just treating browsers as JavaScript application environments. Converting a little backend load into a massive frontend headache is the exact opposite of where we thought we were headed twenty years ago.

                • tsonfeir@lemm.ee · 10 months ago

                  Well, it’s not a massive frontend headache if you do it right. And by passing off a lot of the easy stuff to the browser, your server can handle more load. As a bonus, it’s easier to decouple your architecture. Not only is this more efficient, but it’s also easier to maintain, test, and deploy.
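
                  For example, the decoupled backend can be as small as a single JSON endpoint. A rough sketch with illustrative names, no framework implied:

                  ```typescript
                  // Backend as a pure JSON API: data only, no markup.
                  import { createServer } from "node:http";

                  const users = [{ name: "Alice" }, { name: "Bob" }];

                  createServer((req, res) => {
                    if (req.method === "GET" && req.url === "/api/users") {
                      res.writeHead(200, { "Content-Type": "application/json" });
                      res.end(JSON.stringify(users));
                      return;
                    }
                    res.writeHead(404).end();
                  }).listen(8080);
                  ```

                  Either side can then be tested, swapped out, or redeployed without touching the other.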

                  • SpaceNoodle@lemmy.world · 10 months ago

                    It’s sacrificing efficiency on the frontend for the backend. It makes the backend easier to test, while making the frontend more complex. It significantly jacks up requirements for the clients while reducing them for the host.

                    You backend people are forgetting that there are devices on the other end that need to process and render this bullshit. It sells more new iPhones, though, so who the fuck cares?