[-] EmbeddedEntropy@lemmy.ml 6 points 10 months ago* (last edited 10 months ago)

I must be ancient then. I recognized, and I think used, all of those cards/chips.

Some personally. Some at work. At work I used to maintain an MS-DOS / early Windows graphics program, and I had to test the program’s compatibility with a stack of graphics cards.

[-] EmbeddedEntropy@lemmy.ml 11 points 11 months ago

I’d rather have an M.2 connector without requiring a HAT.

I’ll stick with my Orange Pi 5 for now which comes with one, tyvm.

[-] EmbeddedEntropy@lemmy.ml 4 points 11 months ago

I've written hundreds (thousands?) of GNU Makefiles over the past 30 years and never had a need to unconditionally run particular targets before all others. GNU Make is a rule-based language. I'd suggest what you're attempting is to force an imperative programming model onto a rule-based one, and you're going to run into trouble coding against a different model than the tool's native one.

Can you explain what you mean by "check the environment", and why you'd need to do that before anything else?

For example, in the past I've wanted to determine whether a particular command was installed (and which one), so I have this near the top of my Makefile:

# Candidate commands, in order of preference.
container_command_defaults = podman docker

# Expanded once at parse time: resolves each candidate found to its full path.
container_command_default_paths := $(shell command -v $(container_command_defaults))

# Honor a container_command supplied by the caller; otherwise take the
# first candidate found, and bail out if none exists.
ifndef container_command
  container_command = $(firstword $(container_command_default_paths))
  ifeq ($(container_command),)
    $(error You must have docker or podman installed)
  endif
endif

Using the := operator with $(shell ...) is a way to run a command while GNU Make is initially parsing your Makefile. Normally, the := assignment operator is antithetical to a rule-based language, so you want to limit its use as much as possible, but unusual exceptions do exist.

I'm also unclear what you mean by "ensure variables are set". What kind of variables?

The above snippet shows how you can check if a makefile variable is set when the Makefile is first parsed, if not, declare an error and exit. (The same approach works for environment variables too.)
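
For instance, here's a minimal sketch using a hypothetical DEPLOY_ENV variable (the name is purely for illustration) that fails at parse time if the caller hasn't exported it:

# DEPLOY_ENV is a hypothetical variable; any environment variable the
# caller exports is visible to make and can be checked the same way.
ifndef DEPLOY_ENV
  $(error DEPLOY_ENV must be set, e.g. DEPLOY_ENV=staging make all)
endif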

Preparing a particular layout ahead of time is not the best approach. I'd suggest a given layout is nothing more than a set of dependencies, and it should be declared as such.

Also, running specific targets or rules unconditionally can lead to trouble later as your Makefile grows. You may eventually add targets that, say, report on the build's state or run checks or tests. You wouldn't necessarily want those targets to go off and build an entire tree of directories for you or take other unnecessary actions.
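
For instance, a purely informational target (show-config is a hypothetical name here) shouldn't need to create the build tree just to print a few values:

# Hypothetical informational target: prints configuration without
# creating or touching anything on disk.
show-config:
	@echo 'container_command = $(container_command)'
	@echo 'current directory = $(CURDIR)'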

If you want to ensure certain directories are present, add them as order-only prerequisites for those targets with the | character. For example:

build_directory ?= build
build_make = $(MAKE) ...
targets = ...

all: FORCE | $(build_directory)
	$(build_make) $(targets)

# Order-only prerequisite: created if missing, but its timestamp never
# forces 'all' to rebuild.
$(build_directory):
	mkdir -p -- '$@'

# Empty rule so 'all' always runs.
FORCE:

Even though I've been writing GNU Makefiles for decades, I'm still learning new stuff constantly, so if someone has better or different ways, I'm certainly up for studying them.

[-] EmbeddedEntropy@lemmy.ml 12 points 1 year ago

I think the joke would have been better and more understandable if it had used different corporate names rather than states. But, of course, that might have been legally problematic.

[-] EmbeddedEntropy@lemmy.ml 14 points 1 year ago* (last edited 1 year ago)

Unless Gitlab changed things very recently, you only needed to provide a CC/DC if you wanted the free CI/CD pipeline enabled for your projects. Decline, and everything except the free pipeline works just fine.

[-] EmbeddedEntropy@lemmy.ml 8 points 1 year ago

A 1979 TV show about a guy who put together a junk spaceship to salvage junk from the moon: Salvage 1.

My teenage self found it entertaining at the time. Hmmm, now where did I leave my parrot? I wonder if he could help me find a copy…

[-] EmbeddedEntropy@lemmy.ml 13 points 1 year ago

If they can have someone program a fee into their accounting systems, that means they know exactly what the fee is and under what conditions it’s applicable. From there it’s trivial to sort, filter, and list them.

[-] EmbeddedEntropy@lemmy.ml 9 points 1 year ago

I think they’re trying to simplify the exposed interfaces, which simplifies everyone else’s job, at the expense of a more complex implementation.

[-] EmbeddedEntropy@lemmy.ml 11 points 1 year ago

At my company, we have around 400,000 servers in production. When we last surveyed them, we found several thousand over 12 years old, with the oldest at 17 years. And that wasn’t counting our lab and admin servers, which can be even older because they’re often repurposed from prod decomms.

We had a huge internal effort to virtualize their loads, but in the end, only about 15% were transferred just due to the sheer number of hidden edge cases that kept turning up.

[-] EmbeddedEntropy@lemmy.ml 7 points 1 year ago

No, I am not “free” to do whatever I want. Distributing that source is explicitly allowed under the GPL, but RH penalizes me for exercising that right by terminating my account. That’s a restriction. How is being penalized for doing what I’m allowed to do not a restriction?

How about yet another angle for you? Say I download and distribute the source RPM for the version of gcc running on my box, and RH terminates my account. Now I want to download and distribute the source RPM for the kernel running on my box. How do I do that with a terminated account?

[-] EmbeddedEntropy@lemmy.ml 12 points 1 year ago* (last edited 1 year ago)

They are violating section 6 of the GPLv2 by adding further restrictions.

But ignoring that, I haven’t heard anyone discussing what could happen once this anti-GPL angle spreads beyond Red Hat. What’s going to happen once internet-connected IoT devices or, say, a company like Tesla pull the same technique? If Tesla kills your account for any reason, you lose all your paid-for services and your car becomes a brick.

[-] EmbeddedEntropy@lemmy.ml 30 points 1 year ago

Since being forced to use this terrible communication method in my teams and groups, I’ve been copy-and-pasting good Q&A threads into text files that I push to an enterprise GitHub repo for perma-store. At least that way other engineers and I can either use GitHub’s search or clone the repo locally, grep it, and even contribute back with PRs. Sometimes from there a thread turns into a wiki page, but that’s pretty rare. My approach is horribly inefficient and so much stuff is still lost, but it’s better than Discord’s search or dealing with Confluence.
