• simonced@lemmy.one · +2 · 6 days ago

    Letting your text editor write your code, not using version control… I don’t feel sad at all. Hope lesson was learned.

  • gamer@lemm.ee · +5 · 7 days ago

    I made a game engine and a game back in high school, but all that code is lost because I didn’t know how to use git. I knew git existed (and even knew enough to know it was better than Mercurial or SVN), but I was too lazy to learn it.

    • Lucy :3@feddit.org · +8 −1 · edited · 8 days ago

      To be fair, you have to do that for the first push if a README file was autogenerated.

        • Lucy :3@feddit.org · +2 −1 · 8 days ago

          Huh? I’m talking about existing code sitting in a directory, then initting a git repo there, creating a counterpart repo on your hoster of choice, and then pushing to it. Wouldn’t cloning the repo from step 3 onto the code from step 1 overwrite the contents there?

          • stembolts@programming.dev · +8 · edited · 8 days ago

            There are multiple solutions to this that don’t use --force.

            Moving the files away, cloning, moving the files back, committing, and pushing is the most straightforward one I can summon at this time… but I’ve solved this dozens of times and have never used --force.

            • Hoimo@ani.social · +3 · 8 days ago

              If your remote is completely empty and has no commits, you can just push normally. If it has an auto-generated “initial commit” (pretty sure GitHub does something like that), you can either force-push, or merge your local branch into the remote branch and push normally. I think cloning the repo and copying the contents of your local repo into it is the worst option: you’ll lose all your local commits.
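
              The merge route can be sketched like this. A minimal illustration only: the forge and its generated initial commit are simulated as local repos, and all names are hypothetical.

```shell
# Sketch: reconcile local history with a remote that already has an
# auto-generated "initial commit", without force-pushing.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare --initial-branch=main hoster.git

# Simulate the forge generating an initial README commit.
git init -q --initial-branch=main seed && cd seed
git config user.email forge@example.com && git config user.name Forge
echo "# project" > README.md
git add README.md && git commit -q -m "Initial commit"
git push -q ../hoster.git main
cd ..

# The local repo with its own, unrelated history.
git init -q --initial-branch=main project && cd project
git config user.email you@example.com && git config user.name You
echo 'print("hi")' > main.py
git add . && git commit -q -m "Local work"

# Merge the remote's unrelated history, then push normally.
git remote add origin ../hoster.git
git fetch -q origin
git merge -q --allow-unrelated-histories -m "Merge generated initial commit" origin/main
git push -q origin main
```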

              • Jayjader@jlai.lu · +2 · 6 days ago

                If it’s a single, generated “initial” commit that I actually want to keep (say, for example, I used the forge to generate a license file), then I would often rebase on top of it. Quick, and it doesn’t get rid of anything.
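
                That rebase flow might look roughly like this, a sketch under the same caveats: the forge is simulated locally and all names are hypothetical.

```shell
# Sketch: keep the forge's generated initial commit by rebasing
# local history on top of it. Hoster simulated as a local bare repo.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare --initial-branch=main hoster.git

# Simulate the forge generating a LICENSE commit.
git init -q --initial-branch=main seed && cd seed
git config user.email forge@example.com && git config user.name Forge
echo "MIT" > LICENSE
git add LICENSE && git commit -q -m "Add LICENSE"
git push -q ../hoster.git main
cd ..

# Local repo with unrelated history.
git init -q --initial-branch=main project && cd project
git config user.email you@example.com && git config user.name You
echo 'print("hi")' > main.py
git add . && git commit -q -m "Local work"

git remote add origin ../hoster.git
git fetch -q origin
# Replay the local commits on top of the generated commit,
# then a plain fast-forward push suffices.
git rebase -q origin/main
git push -q origin main
```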

              • stembolts@programming.dev · +2 · edited · 8 days ago

                True, in the situation where you have local history, maybe it’s worthwhile to use --force to nuke a nearly empty remote. In that case it’s practical to do so. I just typically like to find non-force options.

          • dev_null@lemmy.ml · +3 · 8 days ago

            Yeah, I was thinking of a new repo with no existing code.

            In your case you’d want to uncheck the creation of a README, so the hosted repo is empty and can be pushed to without having to overwrite (force) anything.
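
            The empty-remote case really is the simple one. A minimal sketch, with the hoster again simulated as a local bare repo and all names hypothetical:

```shell
# Sketch: pushing existing local history to a truly empty remote
# needs no --force at all.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare --initial-branch=main hoster.git   # empty, no README

git init -q --initial-branch=main project && cd project
git config user.email you@example.com && git config user.name You
echo 'print("hi")' > main.py
git add . && git commit -q -m "Initial local commit"

git remote add origin ../hoster.git
git push -q -u origin main    # plain push, nothing to overwrite
```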

      • computergeek125@lemmy.world · +5 · 8 days ago

        Does that still happen if you use the merge unrelated histories option? (Been a minute since I last had to use that option in git)

        • Lucy :3@feddit.org · +3 · 8 days ago

          Never heard of that, but in the case where you also have a README, that will be even more complicated, I imagine. So just adding -f is the easier option.

  • Lucy :3@feddit.org · +141 −1 · edited · 8 days ago

    “Developer”
    “my” 4 months of “work”

    Those are the ones easily replaced by AI. 99% of stuff “they” did was done by AI anyway!

  • dan@upvote.au · +130 · 9 days ago

    Before Git, we used SVN (Subversion), and CVS before that. Microsoft shops used TFS or whatever it’s called now (or was called in the past)

    • i_stole_ur_taco@lemmy.ca · +56 · 9 days ago

      Wasn’t it Visual SourceSafe or something like that?

      God, what a revolution it was when subversion came along and we didn’t have to take turns checking out a file to have exclusive write access.

      • dan@upvote.au · +23 · edited · 8 days ago

        Visual SourceSafe

        Yes! That’s the one I was struggling to remember the name of. My previous employer started on Visual SourceSafe in the 90s and migrated to Team Foundation Server (TFS) in the 2000s. There were still remnants of SourceSafe when I worked there (2010 to 2013).

        I remember TFS had locks for binary files. There was one time we had to figure out how to remove locks held by an ex-employee - they were doing a big branch merge when they left the company, and left all the files locked. It didn’t automatically drop the locks when their account was deleted.

        They had a bunch of VB6 COM components last modified in 1999 that I’m 80% sure are still in prod today. It was still working and Microsoft were still supporting VB6 and Classic ASP, so there wasn’t a big rush to rewrite it.

        • HarkMahlberg@kbin.earth · +7 · 8 days ago

          Welcome to my world… our new lead architect has mandated that we move everything from TFS to GitLab before the end of the year. I hope it comes true.

          • Flames5123@sh.itjust.works · +3 · 8 days ago

            At the start of COVID, I migrated our three projects to git from VSS. I also wrote a doc for our other teams to do the same. It was amazing once we got it working. Small team of 3, but we started using feature branches which enabled us to easily merge everything into a testing branch and release only certain features at a time. So much cleaner.

            Before I left, I almost got semi automatic CI/CD working with Jenkins!

          • nogooduser@lemmy.world · +3 · 8 days ago

            I remember when our company split up and we had to give them the source code of some older versions that they still used. We couldn’t, because the repo was corrupt, meaning we couldn’t access some older revisions. We’d had no problems using it day to day, so nobody noticed, which meant that all the backups were corrupted too.

      • HarkMahlberg@kbin.earth · +8 · 8 days ago

        Yeah VSS was the predecessor to TFS, and now TFS is called Azure DevOps… whatever the fuck that means, Microsoft needs to get it together with product naming. Anyway TFS sucks major rotten ass. I have my problems with git - namely user friendliness - but TortoiseGit has put all those troubles to rest.

        Nothing like that can fix TFS.

        • Pieisawesome@lemmy.dbzer0.com · +4 · 8 days ago

          I started at a company that uses ADO (migrating to GitHub this year), and it took me like 20 minutes to figure out how to change repositories in the UI… idk how they built something that user-unfriendly.

          • HarkMahlberg@kbin.earth · +5 · 8 days ago

            I could go all day with my grievances… For some fucking reason, Team Foundation Server thought it would be a good idea to model its source control on folders and files rather than on atomic nodes of changes like git.

            I’m sure someone thought this was intuitive, but it falls apart once you realize you can check in cross-branch or even cross-project files in a single changeset. This makes it easy to pollute projects you’re working on but didn’t intend to modify yet, if you forget to exclude their files. And then, when your code reviewer checks the history of the project folder you modified, they don’t even notice all the files you changed that WEREN’T in that folder but were part of the same changeset. So you pass your review, and all of a sudden there are unwanted, unnoticed, and untested changes in some other project, with a nice code-review stamp on them!

            And the entire checkout/checkin system is just flipping the read-only flag on the files in file explorer. It’s the most amateurish shit. If you edit a file in an open, active project, the file gets checked out automatically. But if you’re editing loose scripts that aren’t part of a bespoke SLN or CSPROJ, you have to check those out manually… which it will only tell you to do once you try to save the file.

            And then Visual Studio cannot understand that I might need to switch regularly between two types of version control systems. If you’re not on the right VCS plugin when you want to open a recent project, it doesn’t automatically switch for you; it just refuses to load the project. The only way to reliably switch is to go into the options menu, change it there, THEN load the project.

            git is practically made of grease compared to how stuttery and clunky TFS is. I’ll shed no tears for the fossils who are having a hard time learning git, they will be better off whether they realize it or not.

    • The_Decryptor@aussie.zone · +10 · 8 days ago

      A place I worked at did it by duplicating and modifying a function, then commenting out the existing one. The dev would leave their name and date each time, because they never deleted the old commented out functions of course, history is important.

      They’d also copy the source tree around on burnt CDs, so good luck finding out who had the latest copy at any one point (Hint: It was always the lead dev, because they wouldn’t share their code, so “merging to main” involved giving them a copy of your source tree on a burnt disk)

    • GhostlyPixel@lemmy.world · +8 · 8 days ago

      At my first SWE job out of college in 2019, they were still using SVN because none of the seniors could be bothered to learn git.

      The “well, this is how we’ve always done it” attitude had a death grip on that place.

      • dan@upvote.au · +7 · edited · 8 days ago

        For what it’s worth, SVN is a much simpler object model compared to Git, which makes it easier to understand.

        It’s centralized rather than distributed like Git is, which has some disadvantages. Most operations require access to the server, as opposed to Git where you usually have a copy of the entire repo and can work offline. Git users can clone the repo from other users rather than relying on a centralized server.

        On the other hand, a centralized server also simplifies some things. For example, instead of commit hashes, SVN has revision numbers, which are natural numbers that start at 1 and are incremented for every commit. A lot of software that used SVN used to use the revision number as part of the version or build number.
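
        For illustration (a sketch of mine, not something from the comment above): git has no global revision numbers, but a common stand-in for build numbering is the count of commits reachable from HEAD.

```shell
# Sketch: the commit count as a rough git analogue of SVN's
# monotonically increasing revision number (demo repo is hypothetical).
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q --initial-branch=main demo && cd demo
git config user.email you@example.com && git config user.name You

# Make three commits.
for i in 1 2 3; do
  echo "$i" > file.txt
  git add file.txt && git commit -q -m "Commit $i"
done

# SVN would call this state "r3"; git's rough analogue:
git rev-list --count HEAD   # prints: 3
```

Unlike SVN revisions, this count is per-branch and changes under rebases, so it is only a loose approximation.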

        Git is definitely the source control system to choose today, but SVN can still have its place.

      • JakenVeina@lemm.ee · +3 · 8 days ago

        Thank god, we STILL use TFS at work, and its core version control model is reeeeeally fucking awful.

    • ByteJunk@lemmy.world · +4 · edited · 8 days ago

      Oh yeah, I remember using TortoiseCVS briefly.

      Mercurial and Bazaar also showed up at around the same time as git, I think all spurred by BitKeeper ending their free licenses for Linux kernel devs.

      An interesting shot to the foot, that one.

      BitKeeper was a proprietary version control system that somehow (and with a lot of controversy) ended up being adopted by a big chunk of the Linux kernel developers, while others were adamantly against it.

      In any case, they provided free licenses to Linux devs, with some features (including viewing the full version history) restricted to paying clients, while devs who worked on competing open-source systems were barred from even buying a licence.

      When someone started work on a client that unlocked those restricted features, BitKeeper revoked the free licenses, and a host of replacements started being developed immediately. Linus Torvalds himself started work on git, which eventually got adopted by the whole Linux ecosystem and, nowadays, the world.

      As for BitKeeper, it’s been dead for years now.

  • darklamer@lemmy.dbzer0.com · +104 · 8 days ago

    The first version control system I ever used was CVS and it was first released in 1986 so it was already old and well established when I first came to use it.

    Anyone who, in these past forty years, hasn’t used a version control system to keep track of their source code has only themselves to blame.

    • barsoap@lemm.ee · +32 · 8 days ago

      CVS was, for the longest time, the only player in the FLOSS world. It was bad, but so were commercial offerings, and it was better than RCS.

      It’s since been completely supplanted by SVN, which was written specifically to be CVS-but-not-broken and is almost exactly as old as git. If you find yourself using git lfs, you might want to have a look at SVN.

      Somewhat ironically, RCS is still maintained, with the last patch a mere 19 months ago, to this… CVS repo. Dammit, I did say “completely supplanted” already, didn’t I? I didn’t account for the sheer pig-headedness of the OpenBSD devs.

      • lud@lemm.ee · +5 · 8 days ago

        Pretty sure GTA V use(d) SVN or something like that. I remember reading the source code and being surprised that they didn’t use Git.

          • barsoap@lemm.ee · +3 · edited · 8 days ago

            You definitely need something other than git for large assets, yes; its storage layer just isn’t built for that. And the way art pipelines generally work, you don’t get merge conflicts anyway, because there’s no sane way to merge those files, so artists take care not to have multiple people working on the same asset at once; a lock+server model is natural. You also want a way to nuke old revisions to keep the size of everything under control.

      • I Cast Fist@programming.dev · +4 · 8 days ago

        “We’ve always done things this way, we ain’t changing!” - some folks in the Foss community, like those RCS maintainers

  • Artyom@lemm.ee · +87 −1 · 8 days ago

    I just want to pause a moment to wish a “fuck you” to the guy who named an AI coding tool “Cursor”, as if that’s a useful name. It’s like they’re expecting accidental Google searches to be a major source of recruitment.

  • yarr@feddit.nl · +73 · 8 days ago

    It’s scary how many projects these days are managed by a bunch of ZIP files:

    • Program-2.4.zip
    • Program-2.4-FIXED.zip
    • Program-2.4-FIXED2.zip
    • Program-2.4-FIXED-final.zip
    • Program-2.4-FIXED-final-REAL.zip
    • Program-2.4-FIXED-FINAL-no-seriously.zip
    • Program-2.4-FINAL-use-this.zip
    • Program-2.4-FINAL-use-this-2.zip
    • Program-2.4-working-maybe.zip
    • Program-2.4-FINAL-BUGFIX-LAST-ONE.zip
    • Program-2.4-FINAL-BUGFIX-LAST-ONE-v2.zip
    • Boakes@lemmy.world · +4 · edited · 8 days ago

      • Program-1.5-DeleteThis.zip
      • Program-1.6-ScuffedDontUse.zip
      • CanWeDeleteThesePlease.txt (last edit 8 months ago)

      Inspired by a small collaboration project from a few years ago.

    • iegod@lemm.ee · +3 · 8 days ago

      If we’re talking actual builds then zip files are perfectly fine as long as the revs make chronological sense.

      • yarr@feddit.nl · +3 · 7 days ago

        I’m not. I’m talking about in companies where dev A wants dev B to do some work, but they don’t use git or any kind of source control, so you email over a cursed ZIP file, then dev B does the work and sends it back with a different name. It’s a highly cursed situation.

  • Eager Eagle@lemmy.world · +68 · edited · 9 days ago

    if this is real, that’s the kind of people who should be worried about being replaced by an ai

    it’s also Claude

    lmao

    • Scrubbles@poptalk.scrubbles.tech · +23 · 8 days ago

      Was playing around with it. It’s neat tech. It’s interesting all the side projects I can spin up now. It absolutely cannot replace an engineer with a brain.

      I’ve caught so many little things I’ve had to fix or change. It’s an amazing way to kick off a project, but I can’t ever blindly trust what it’s doing. It can get the first 80% of a small project off the ground, and then you’re going to spend 7x as long on that last 20%, prompt-engineering it to get it right. At which point I’m usually like, “I could have just done it by now.”

      I see kids now blindly trusting what it’s doing, and man are they going to fall face first in the corporate world. I honestly see a place for vibe coding in the corporate world. However I also see you still needing a brain to stitch it all together too.

      • Lucy :3@feddit.org · +10 · 8 days ago

        Yeah, a coworker (also a trainee) spent 2 days trying to debug some C# MVC thing. It took me around 5 mins, having last seen C# code 7 years ago, to realize that the quotes were part of the literal string and needed to be checked too.

        He did literally everything with the internal ChatGPT instance (or so a coworker said; I don’t know which model actually runs there). I asked if he wrote the JS code; he said no. Even though there was JS in the cshtml file, he technically didn’t lie, since he didn’t write it.

  • zovits@lemmy.world · +53 · 8 days ago

    It’s actually reassuring to see that despite all warnings and doomsayers there will still be opportunities for programmers capable of solving problems using natural intelligence.

    • finitebanjo@lemmy.world · +18 · 8 days ago

      If anything it feels like we’re the doomsayers trying to warn people that their AI bullshit won’t ever work and they’re just not listening as they lay off the masses and push insecure and faulty code.