[-] FizzyOrange@programming.dev 11 points 1 week ago

I used to use Git stash, but in the end I found just making "real" commits was better.
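
For anyone curious, the workflow is roughly this (a minimal sketch; the branch state and commit message are just examples):

```shell
# Instead of `git stash`, park work-in-progress as an ordinary commit:
git add -A
git commit -m "WIP: half-finished refactor"

# ...switch branches, do other work, come back...

# Then unpack the WIP commit and keep working; --soft leaves the
# changes staged exactly as they were before the commit.
git reset --soft HEAD~1
```

Unlike a stash, the WIP commit shows up in `git log`, belongs to a specific branch, and can even be pushed somewhere as a backup.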

[-] FizzyOrange@programming.dev 12 points 2 months ago

If you are expecting stuff to never go wrong and software to never be updated in a way you disagree with on Linux then you're in for disappointment.

Remember the KDE kidney bean? What about Gnome's idiotic hamburger menus?

[-] FizzyOrange@programming.dev 12 points 3 months ago* (last edited 3 months ago)

Yeah I think that's mostly a myth. When I looked up salaries they were definitely good (for programming; amazing for the average person), but not "I would write COBOL for that" good.

There aren't really that many old COBOL systems around. I think it's mostly just over-reported because you can write an article about how some government department still uses COBOL but you can't write about one that switched to Java.

[-] FizzyOrange@programming.dev 13 points 3 months ago

Maybe not dumb but I've definitely been forced to at least partly learn a few terrible languages so I could use some system:

  • PHP so I could write custom linters for Phabricator. Pretty successful. PHP is a bad language but it's fairly easy to read and write.
  • Ruby so I could understand what the hell Gitlab is doing. Total failure here, Ruby is completely incomprehensible especially in a large codebase.
  • OCaml so I can work on a super niche compiler written in OCaml. It's a decent language except the syntax is pretty terrible, OPAM is super buggy, and I dunno if it's this codebase or just OCaml people in general but there are approximately zero comments and identifiers are like ityp, nsec, ef_bin... The sort of names where you already need to know what they are.
[-] FizzyOrange@programming.dev 13 points 3 months ago

That brings more problems. Despite the scaling challenges, monorepos are clearly the way to go for company code in most cases.

Unfortunately my company heavily uses submodules and it is a complete mess. People duplicating work all over the place, updates in submodules breaking their super-modules because testing becomes intractable. Tons of duplicate submodules because of transitive dependencies. Making cross-repo changes becomes extremely difficult.

[-] FizzyOrange@programming.dev 11 points 3 months ago

IMO Julia just had way too many big issues to gain critical mass:

  1. Copied 1-based indexing from MATLAB. Why? We've known that's the worse option for decades.

  2. For ages it had extremely slow startup times, I think because it compiles everything from scratch. Even cached, it would take something like 20s to load the plotting library; you could start MATLAB several times in that time. I believe they improved this fairly recently, but they clearly got the runtime/compile-time balance completely wrong for a research language.

  3. There's an article somewhere from someone who was really on board with Julia about all the issues that made them leave.

I still feel like there's space for a MATLAB replacement... Hopefully someone will give it a better attempt at some point.

[-] FizzyOrange@programming.dev 11 points 3 months ago* (last edited 3 months ago)

  • Pijul: patch-based like Darcs but apparently solves its performance issues. In theory this improves conflict resolution.
  • Jujutsu: kind of an alternative front-end to a git repo (but not a front-end to git). Has some different ideas, like no staging area (draft commit), and some other stuff I can't remember.
  • Sapling: from Facebook. Unfortunately only part of it is available; the server is not public yet (I guess it's tied up in Facebook infrastructure too much).

And it's definitely not a solved problem. Aside from the obvious UX disaster, Git has some big issues:

  • Monorepo support is relatively poor, especially on Mac and Linux.
  • Submodule support is extremely buggy and has particularly bad UX even for Git.
  • Support for large files via LFS is tacked on and half-arsed.
  • Conflict resolution is very very dumb. I think there are third party efforts to improve this.
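
One built-in mitigation worth knowing about (it doesn't fix the underlying merge algorithm): `git rerere` ("reuse recorded resolution") records how you resolve each conflict hunk and replays that resolution if the identical conflict shows up again, e.g. across repeated rebases:

```shell
# Record conflict resolutions and reuse them automatically:
git config --global rerere.enabled true

# Also auto-stage hunks that rerere was able to resolve:
git config --global rerere.autoUpdate true
```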

I think the biggest issue is dealing with very large code bases, like the code for a mid-to-large company. You either go with a monorepo and deal with slowness, Windows-only optimisations, and bare-minimum partial checkout support.

Or you go with submodules and then you have even bigger problems. Honestly I'm not sure there's really an answer for this with Git currently.

It's not hard to imagine how this might work better. For instance if Git repos were relocatable, so trees were relative to some directory, then submodules could be added to a repo natively just by adding the commits and specifying the relative location. (Git subtree almost does this but again it's a tacked on third party solution which doesn't integrate well, like LFS.)
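
For reference, `git subtree` (shipped in Git's contrib directory, so availability varies by install) looks roughly like this; the prefix and URL here are made up for illustration:

```shell
# Vendor a library into ./vendor/lib as ordinary commits in this repo.
# --squash collapses the library's history into a single commit.
git subtree add --prefix=vendor/lib https://example.com/lib.git main --squash

# Later, pull upstream changes into the same prefix:
git subtree pull --prefix=vendor/lib https://example.com/lib.git main --squash
```

The vendored files become plain tracked content, so clones and checkouts just work with no extra init/update steps, which is exactly the "commits plus a relative location" behaviour described above.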

[-] FizzyOrange@programming.dev 12 points 3 months ago

Someone find the commit where they accidentally removed this critical component 😄

[-] FizzyOrange@programming.dev 12 points 4 months ago

I mean... let's just hope he isn't doing this professionally.

[-] FizzyOrange@programming.dev 13 points 5 months ago

A recent notable example is xz, but there's also the event-stream npm package from a few years ago that got infected with Bitcoin-stealing code.

They're asking if the entire project is somehow fake, not if it's a real project that got backdoored. That's obviously impossible to tell just based on stars, language quality, and similar heuristic signals.

[-] FizzyOrange@programming.dev 11 points 5 months ago

You were so preoccupied wondering what asinine comment you could make that you never bothered to read the article and learn the reasons that they should.

[-] FizzyOrange@programming.dev 12 points 6 months ago

Your manager is an idiot.


FizzyOrange

joined 1 year ago