• 0 Posts
  • 5 Comments
Joined 1 year ago
Cake day: June 14th, 2023




  • myslsl@lemmy.world to linuxmemes@lemmy.world · Htop too
    5 months ago

    If you have a fixed collection of processes to run on a single processor and unlimited time to schedule them in, you can always brute force all permutations of the processes and then pick whichever permutation maximizes or minimizes whatever property you care about (a sketch of this is below). The problem with this approach is that enumerating all n! orderings gives it factorial time complexity, so it only works for tiny inputs.

    Edit: There are probably other subtle issues that can arise too, like I/O interrupts and other weird events, fwiw.
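
    A minimal sketch of that brute-force idea, assuming Python and made-up job lengths: enumerate every ordering of the jobs and keep the one that minimizes total waiting time.

```python
# Brute-force scheduling sketch: try every permutation of a fixed set of jobs
# on one processor and keep the order that minimizes total waiting time.
# Enumerating permutations costs O(n! * n), which is why it only scales to tiny n.
from itertools import permutations

def total_waiting_time(schedule):
    """Sum of the time each job spends waiting before it starts."""
    waited, elapsed = 0, 0
    for burst in schedule:
        waited += elapsed
        elapsed += burst
    return waited

def brute_force_schedule(bursts):
    """Return the job ordering with the smallest total waiting time."""
    return min(permutations(bursts), key=total_waiting_time)

# Example with made-up burst times; the optimum here is shortest-job-first.
print(brute_force_schedule([6, 2, 8, 3]))  # -> (2, 3, 6, 8)
```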


  • Machine learning techniques are often thought of as fancy function-approximation tools (e.g. for regression and classification problems): they take in a set of input values and spit out some discrete or continuous prediction.
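
    As a toy illustration of that function-approximation framing (a sketch assuming Python/NumPy and synthetic data): fit a line to noisy samples with least squares, then use it to produce a continuous prediction for a new input.

```python
# Toy "function approximation": learn y ≈ slope * x + intercept from noisy
# synthetic samples, then predict a continuous value for an unseen input.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 3.0 * x + 2.0 + rng.normal(scale=1.5, size=50)  # hidden "true" function plus noise

slope, intercept = np.polyfit(x, y, deg=1)  # ordinary least-squares fit

print(f"learned model: y ≈ {slope:.2f}*x + {intercept:.2f}")
print("prediction at x = 4.0:", slope * 4.0 + intercept)
```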

    One use case is that there are a lot of really hard and important problems within CS that we can’t solve both exactly and efficiently (look up TSP, SOP, SAT and so on), but that we can solve with heuristics or approximations in reasonable time. Often the accuracy of the heuristic even determines the efficiency of the overall solution.
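
    For a concrete taste of the heuristic side, here is a sketch (assuming Python and made-up coordinates) of the classic nearest-neighbour heuristic for TSP: polynomial time and usually a decent tour, but with no optimality guarantee, unlike an exponential exact search.

```python
# Nearest-neighbour heuristic for TSP: from the current city, always travel to
# the closest unvisited city. Polynomial time, no optimality guarantee.
import math

def nearest_neighbour_tour(points):
    """Return a tour (list of indices) built greedily starting from points[0]."""
    unvisited = list(range(1, len(points)))
    tour = [0]
    while unvisited:
        here = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(here, points[i]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

# Made-up city coordinates for illustration.
cities = [(0, 0), (4, 1), (1, 5), (6, 6), (2, 2)]
print(nearest_neighbour_tour(cities))  # -> [0, 4, 1, 2, 3]
```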

    Additionally, sometimes we want predictions for other reasons: software that ranks items by user preference, predicts home values, assesses the safety of an engineering plan, estimates the likelihood that a person has cancer, flags whether an object in a video frame is a human, etc.

    These tools have legitimate and important use cases; it’s just that a lot of the hype right now is centered around the dumbest possible uses, pushed by a bunch of idiots trying to make money regardless of any associated ethical concerns or consequences.