Over time, do you think this process could lock you into an inflexible state?
I'm reminded of the trade-off between automation and manual work. Automation crystallizes process, and thus the system as a whole loses its ability to adapt in a dynamic environment.
heh, I'm using this font in my game. Picking fonts is hard, and I feel like I've just dipped my toe in the water so far. I'm not 100% satisfied with the non-monospace font I use (Adobe Source Sans), but I have more important things to focus on right now.
So happy I bought this game on GOG before they replaced it with the revamped version (modern-looking art, etc.).
I played through the orc campaign last year and had fun. It's definitely aged, but it makes me wonder if something like that could exist today. Story games are popular, and I think they always will be (people like stories).
Instead of a solo protagonist, can we bring back the hero (a la WarCraft III) and their army? Or even the invisible god, like in WC2?
Ah wow, RIP the awesome fonts used in the original game. The new ones are so "clean" and boring :(
Glad to see the art itself was not too badly modified. Something is off, though: the vegetation in the Farm building looks strange. In the original version you can tell it's some kind of yellow fruit or vegetable, but in the remaster the yellow dots are unusually small and don't really "feel right". Strikes me as AI upscaling rather than hand-crafted editing.
Thanks for putting this together. Shaders are something I wish I had the time to dive deep into, but since I'm making a game my time is very limited for now, i.e. I only learn what I need to.
Only critique is: if you're sharing to teach, your compact/one-line [460]-char GLSL code is a poor delivery mechanism.
> Only critique is: if you're sharing to teach, your compact/one-line [460]-char GLSL code is a poor delivery mechanism.
Understandable. Though the demos are here to illustrate "what you can do with the trick I'm sharing". It's like I'm teaching you how to do watercolor and illustrating it with some paintings you won't be able to reproduce with that knowledge alone. They're meant to inspire you to create. You're not looking at a tutorial, you're looking at art.
For once, instead of being handed a ready-made solution, there's a short explanation of the core idea with a live example; but instead of a fully documented Shadertoy, it's as if the answer were ROT13'd, which makes me itch to implement a solution myself.
Same. I go one step further and create a macro _STOP, which is defined as whatever your language's DebugBreak() is. And if it's really important, _CRASH (this forces me to fix the issue immediately).
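For what it's worth, here is a minimal C++ sketch of those macros (the _STOP/_CRASH names come from the comment above; the exact definitions and the MSVC-vs-POSIX split are my own assumptions, not the poster's code):

#include <cstdlib>

#if defined(_MSC_VER)
  #include <intrin.h>
  #define _STOP() __debugbreak()        // MSVC intrinsic: break into the attached debugger
#else
  #include <csignal>
  #define _STOP() std::raise(SIGTRAP)   // POSIX: SIGTRAP pauses execution under a debugger
#endif

#define _CRASH() std::abort()           // unconditional crash, so the issue cannot be ignored

int checked_divide(int a, int b) {
    if (b == 0) {
        _STOP();    // inspect state interactively; execution can continue afterwards
        return 0;
    }
    return a / b;
}

(Strictly speaking, identifiers starting with an underscore plus a capital letter are reserved in C++; the names are kept here only to match the comment.)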
That is not the same thing at all. Unreachable means that entire branch cannot be taken, and the compiler is free to apply optimizations assuming that's the case. It doesn't need to crash when the assumption is violated - indeed it probably won't. It's the equivalent of having something like:
x->foo();
if (x == nullptr) {
    return error…;
}
This literally caused a security vulnerability in the Linux kernel: dereferencing null is UB (even in the kernel, where engineers assumed it had well-defined semantics), so the compiler elided the null pointer check, which created the vulnerability.
I would say that using unreachable() in mission-critical software is super dangerous, more so than an allocation failing. You want to remove all potential for UB (i.e. safe Rust with no or minimal unsafe, not sprinkling in UB as a form of documentation).
You're right, the thing I linked to does exactly that. I should have read it more closely.
The projects that I've worked on unconditionally define it as something that crashes (e.g. `std::abort` with a message). They don't actually use the standard C/C++ unreachable() (because C23 is too new), and apparently it would be wrong to do so.
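A minimal sketch of that "always crash" flavor, assuming a hand-rolled macro rather than C23's unreachable(); the macro name, message format, and example function are illustrative, not taken from any particular project:

#include <cstdio>
#include <cstdlib>

// Always aborts with a location and message; it never tells the optimizer
// "this branch cannot happen", so there is no UB for it to exploit.
#define UNREACHABLE(msg)                                          \
    do {                                                          \
        std::fprintf(stderr, "unreachable hit at %s:%d: %s\n",    \
                     __FILE__, __LINE__, (msg));                  \
        std::abort();                                             \
    } while (0)

int sign(int x) {
    if (x > 0) return 1;
    if (x < 0) return -1;
    if (x == 0) return 0;
    UNREACHABLE("sign(): impossible value");   // diagnosed at runtime instead of being UB
}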
For many types of projects and approaches, avoiding UB is necessary but not at all sufficient. It's perfectly possible to have critical bugs that can cause loss of health or life or loss of millions of dollars, without any undefined behavior being involved.
Funnily enough, Rust's pattern matching, an innovation among systems languages without GCs (a small space inhabited by languages like C, C++, and Ada), may matter more for correctness and reliability than its famous borrow checker.
Possibly; I am not sure, though Delphi, a successor language, doesn't seem to advertise itself as having pattern matching.
Maybe it is too primitive to be considered proper pattern matching as it is known these days. Pattern matching has actually evolved quite a bit over the decades.
I'm an encryption noob. Less than a noob. But something I've been wondering about is: how can homomorphic computing remain opaque/undecryptable?
If you are able to monitor what happens to encrypted data being processed by an LLM, could you not match that with the same patterns created by unencrypted data?
Real simple example: let's say I have a program that sums numbers. One version sends the data to an LLM (or whatever) unencrypted, the other encrypted.
Wouldn't the same part of the LLM/compute machine "light up" so to speak?
I don’t actually think an LLM is a good way to sum numbers, but it is a pretty good example to explain the phenomenon you’re interested in. When you run an LLM, you essentially take your input and matrix multiply it with the weights to get an output. This happens regardless of what the input is or what the model “needs” to do. So, to some extent, the same part of the machine is being used every time even though the results seem very different.
(Of course, the reality is much more complicated than this; you can trace things like power and in theory track individual activations with great difficulty and then do interpretability to see what the model is doing. But hopefully it illustrates that you can usually take some sort of special operation and turn it into a process that does the same operation on different data.)
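A toy C++ sketch of that point: the same dense layer runs the exact same multiply-adds no matter what values flow through it; only the data differs. (The sizes, weights, and names below are made up purely for illustration.)

#include <array>
#include <cstdio>

constexpr int N = 4;

// Fixed "weights", standing in for model parameters.
const std::array<std::array<float, N>, N> W = {{
    {0.1f, 0.2f, 0.3f, 0.4f},
    {0.5f, 0.6f, 0.7f, 0.8f},
    {0.9f, 1.0f, 1.1f, 1.2f},
    {1.3f, 1.4f, 1.5f, 1.6f},
}};

// One dense layer: identical control flow and operation count for every input.
std::array<float, N> forward(const std::array<float, N>& x) {
    std::array<float, N> y{};
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j)
            y[i] += W[i][j] * x[j];   // the same multiply-adds run regardless of x
    return y;
}

int main() {
    auto a = forward({2.0f, 3.0f, 0.0f, 0.0f});    // "plaintext" input
    auto b = forward({0.7f, -1.2f, 4.4f, 9.9f});   // some other (e.g. encrypted-looking) input
    std::printf("%f %f\n", a[0], b[0]);
}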
Haha nice, I saw your game come up a few times when I was googling to remember the name of the one I suggested. I'll be putting yours on my wishlist as well!
Such a decision merely tips the scale toward brittle-structure territory. It introduces critical points of failure (funneling responsibility through fewer "nodes"; stronger reliance on compute, electricity, the internet, and more) and reduces adaptability (e.g. data bias, data cutoff dates, unawareness of subtly evolving human needs, etc.).
I'm reminded of the trade-off between automation and manual work. Automation crystallizes process, and thus the system as a whole loses its ability to adapt in a dynamic environment.