Hacker News | urxvtcd's favorites

Yep. This was the biggest thing that turned me off Go. I ported the same little program (some text based operational transform code) to a bunch of languages - JS (+ typescript), C, rust, Go, python, etc. Then compared the experience. How were they to use? How long did the programs end up being? How fast did they run?

I did C and typescript first. At the time, my C implementation ran about 20x faster than typescript. But the typescript code was only 2/3rds as many lines and much easier to code up. (JS & TS have gotten much faster since then thanks to improvements in V8).

Rust was the best of all worlds - the code was small, simple and easy to code up like typescript. And it ran just as fast as C. Go was the worst - it was annoying to program (due to a lack of enums). It was horribly verbose. And it still ran slower than rust and C at runtime.

I understand why Go exists. But I can't think of any reason I'd ever use it.


I've heard AI coding described as "It makes the first 80% fast, and the last 20% impossible."

...which makes it a great fit for executives that live by the 80/20 rule and just choose not to do the last 20%.


> While hardware folks study and learn from the successes and failures of past hardware, software folks do not

I've been managing, designing, building and implementing ERP type software for a long time and in my opinion the issue is typically not the software or tools.

The primary issue I see is lack of qualified people managing large/complex projects because it's a rare skill. To be successful requires lots of experience and the right personality (i.e. low ego, not a person that just enjoys being in charge but rather a problem solver that is constantly seeking a better understanding).

People without the proper experience won't see the landscape in front of them. They will see a nice little walking trail over some hilly terrain that extends a few miles.

In reality, it's more like the Fellowship of the Ring trying to make it to Mount Doom, but that realization happens slowly.



It should be noted that many species are occasionally hit at altitudes thought to be impossible for them to fly at.

One notable example: https://news.alaskaair.com/alaska-airlines/flying-fish/


I recommend McGill's Back Mechanic book, which is an end-user focused distillation of his academic work.

It suggests simple tests to discover exactly where your pain is coming from and then appropriate exercises to mechanically strengthen the right area and a few workarounds to avoid stressing that area in regular life e.g. alternate ways to pick up light items from the floor.

McGill's big three are three simple exercises that are generally good; for those with no patience for ordering a book, intros to them can be found all over YouTube.


IMO the best argument for time zones is it keeps calendar days roughly[1] aligned with “natural” days, such that day changes happen when it’s dark outside and most people[2] are asleep and most people won’t be entering a new day during a work meeting.

Why? Because it is confusing if most people’s natural days are divided into different calendar days. If I wake up and the calendar says Tuesday, then I can be sure that I don’t need to worry about the dental appointment on Wednesday afternoon. But if Wednesday starts midday, then I have to spend extra brain cycles to work out which parts of my natural day belong to Tuesday and which parts are Wednesday. It’s a lot of hassle avoided by having days change over when I’m asleep.

If you want a real-life example of such confusion, look up “early morning flight confusion” on your favourite search engine. It turns out a lot of people are confused by flight times like “Wednesday 1:20am”: if you are catching a 1:20am flight, you head to the airport before midnight on the previous day, so mentally the flight feels like part of the previous “natural” day. What sometimes happens is that people book a flight for Wednesday 1:20am, head to the airport on Wednesday evening because they think “well, the flight is on Wednesday, right?”, and find out that it actually departed after midnight, eighteen-odd hours earlier.

By the way, one successful attempt at addressing this problem is Japanese late night anime schedules. If an anime is aired on Wednesday 1:20am, the TV station will instead write “Tuesday 25:20” on the schedule. It makes no sense from a technical point of view, but feels right for the human because what is the early morning of Wednesday if not just Tuesday night prolonged?

[1] Even extreme cases like Spain are only a couple of hours out of sync, it’s not like the sun shines on Madrid during “midnight”.

[2] the article mentions remote workers and frequent travelers, which I am willing to wager are a tiny (but over represented in nerd spaces) fraction of the society.
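The 30-hour-clock notation is easy to mechanize. Here's a minimal sketch (the day names and function name are my own, not from any broadcaster's spec) converting a late-night listing like “Tuesday 25:20” back to the calendar day it actually falls on:

```python
DAYS = ["Monday", "Tuesday", "Wednesday", "Thursday",
        "Friday", "Saturday", "Sunday"]

def from_30h(day, hhmm):
    """Convert a 30-hour-clock listing, e.g. ('Tuesday', '25:20'),
    to the calendar (day, time) it actually airs on."""
    h, m = map(int, hhmm.split(":"))
    base = DAYS.index(day)
    actual_day = DAYS[(base + h // 24) % 7]  # hours past 24 roll into the next day
    return actual_day, f"{h % 24}:{m:02d}"

print(from_30h("Tuesday", "25:20"))  # ('Wednesday', '1:20')
```

The inverse, writing “Tuesday 25:20” instead of “Wednesday 1:20”, is exactly the confusion-avoidance trick the schedules use.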


The article pretty much says this as well, but a concise way I saw Andrew Kelley put it recently in the context of Zig (but it seems to apply well for any language that has errors-as-values + panics) is: "Assertions for programmer mistakes; errors for user mistakes."[0]

One interesting difference between Zig and other languages with similar error stories (Rust, Go) is that panicking can be handled but not recovered. Panicking is always fatal in Zig. The nice thing about this is that you cannot use panics for exception-style control flow, which removes any temptation to use them for user errors.

[0] https://ziggit.dev/t/i-wrote-a-simple-sudoku-solver/9924/12?...
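The rule of thumb translates outside Zig, too. A hedged Python sketch (the function and its invariant are illustrative, not from the linked thread): assert the invariants only a programmer can violate; raise catchable errors for anything a user can get wrong.

```python
def withdraw(balance: int, amount: int) -> int:
    # Programmer mistake: a caller should never pass a negative balance.
    # An assertion, not an error -- if it fires, the code itself is wrong.
    assert balance >= 0, "internal invariant violated: negative balance"

    # User mistakes: surfaced as recoverable errors the caller can handle.
    if amount < 0:
        raise ValueError("amount must be non-negative")
    if amount > balance:
        raise ValueError("insufficient funds")

    return balance - amount
```

The Zig twist described above is that the assertion path is always fatal, so there's no temptation to catch it and route user errors through it.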


> The laws of thermodynamics pretty much guarantees this anyways does it not?

Yes, but:

http://www.thelastquestion.net/


At one point, you could leave an open <script> tag at the end of the HTML with the language attribute set to "javascript9.9" or something non-existent, and the JavaScript banner ads wouldn't load.

Good times, those were.


I see this problem differently, as mainly exhibiting a problem that cannot be analyzed without self-reference. Self-reference causes problems.

The more obvious solution that the postscript hints at is the question "what would you say if I asked you whether [this door] leads to the castle?". Here we immediately cancel out the influence of whichever goblin we're speaking to and get the right answer directly, whereas the movie's solution deliberately incorporates the influence of the lying goblin and so yields the wrong answer, which must then be inverted.

But the movie's solution is better for the movie, because "what would you say if I asked you X?" is a normal English way to ask "X", and in this case, where the two questions are different, the audience is guaranteed to be confused.
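The cancellation can be checked exhaustively. A small sketch (function names my own) modelling a truth-teller and a liar, verifying that the postscript's nested question always yields the true answer:

```python
def ask_direct(goblin_lies, door_is_right):
    """The goblin's answer to: 'does this door lead to the castle?'"""
    return (not door_is_right) if goblin_lies else door_is_right

def ask_nested(goblin_lies, door_is_right):
    """Answer to: 'what would you say if I asked you whether
    this door leads to the castle?'"""
    would_say = ask_direct(goblin_lies, door_is_right)
    # The liar lies about his own (already false) answer: the negations cancel.
    return (not would_say) if goblin_lies else would_say

# The nested answer is truthful regardless of which goblin you asked:
for lies in (False, True):
    for right in (False, True):
        assert ask_nested(lies, right) == right
```

Four cases, and in all four the self-referential lie cancels itself out.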


Astronomy is one of the few sciences that I've studied that has given me existential dread.

In particular, the scale of the universe just hurts my brain: If you were to scale down the Sun to the size of a coarse grain of sand (1 mm), then the orbit of the Earth would be about 20 cm across, with the planet itself being microscopic at this scale (10 micrometers). Jupiter would be a barely visible speck 55cm from the Sun.

At this incredibly tiny scale, the next nearest star in the galaxy is about 30 kilometers (18 miles) away! That's roughly the same as a trip across a typical city, but our Voyager probes at this scale have gone only 15 meters over a period of 45 years! That's comparable to the rate at which hair grows (1 mm/day).

Hence, a good mental model for thinking about the scale of our galaxy is: Stars are grains of sand separated by tens of kilometers on average across a circular space the size of the orbit of the Moon.
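The analogy holds up to back-of-the-envelope arithmetic. A quick sketch using round published figures (all constants approximate, Voyager 1 taken at roughly 160 AU):

```python
# Rough check of the sand-grain analogy.
SUN_DIAMETER_M = 1.39e9        # ~1.39 million km
AU_M = 1.496e11                # one astronomical unit
LIGHT_YEAR_M = 9.461e15

scale = SUN_DIAMETER_M / 1e-3  # shrink the Sun to a 1 mm grain of sand

earth_orbit_cm  = 2 * AU_M / scale * 100        # ~21.5 cm across
earth_diam_um   = 1.27e7 / scale * 1e6          # ~9 micrometres
jupiter_dist_cm = 5.2 * AU_M / scale * 100      # ~56 cm from the Sun
proxima_km      = 4.25 * LIGHT_YEAR_M / scale / 1000  # ~29 km away
voyager1_m      = 160 * AU_M / scale            # ~17 m travelled

print(f"{earth_orbit_cm:.0f} cm, {proxima_km:.0f} km, {voyager1_m:.0f} m")
```

Every figure in the comment above lands within rounding of these.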


Have you ever noticed that even after using screens/computers/phones for 12 hours a day, they almost never appear in your dreams when you sleep?

My opinion is that Go has the same problems as UNIX:

- Reactionary in the face of Multics' (resp. Java/C#) complexity, but overshooting to the point of "worse is better".

- Basically confusing "no complexity" with punting of said complexity to users who will solve it in multiple horrible and/or incompatible ways; like a badly designed R5RS Scheme.

- Some decent ideas but poorly and non-uniformly implemented compared to Plan 9 (whatever will fix Go).

Which is quite logical, knowing who made both of these...

I don't know enough about Java to comment on that switch (as modern Java seems to have gotten a lot of cool stuff like pattern matching and https://openjdk.org/jeps/485) BUT Java's runtime (which also powers Clojure and Scala) is evolving at an impressive rate compared to Go's: especially GraalVM and generational ZGC.


A coworker once implemented a name validation regex that would reject his own name. It still mystifies me how much convincing it took to get him to make it less strict.
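A hedged illustration of the trap (the pattern below is a made-up example of a "strict" validator, not the coworker's actual regex):

```python
import re

# Letters only, exactly "First Last": looks reasonable, rejects real names.
strict_name = re.compile(r"^[A-Za-z]+ [A-Za-z]+$")

names = ["Mary Smith",      # accepted
         "Conan O'Brien",   # rejected: apostrophe
         "Jean-Luc Picard", # rejected: hyphen
         "José García"]     # rejected: accented letters
for name in names:
    print(f"{name!r}: {'ok' if strict_name.match(name) else 'rejected'}")
```

Three of the four perfectly ordinary names fail, which is the usual fate of any name validator stricter than "non-empty".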

The Alt-Mann Amnesia Effect, maybe.

A former colleague of mine had a good way to describe it: technical debt is like financial debt, too much will kill you but if you don’t have any you will be outgunned by those that do.

The trick is how to manage tech debt properly, and the widespread scrum fake-agile in use provides no means for ever tackling tech debt once taken on. This is one reason for the explosion in SRE teams.


Sane defaults are the bomb. Defused bomb.

Very very small tweezers.

I see this so often.

Mocks and stubs get used all over the place because nobody understands what it means to write code that’s easily testable. They’re great when used correctly (e.g., remote services or inherently stateful APIs like time). But they almost never are.

You end up with tests that ensure one and only one thing: the code is written the way it’s currently written.

Tests should do two things: find unexpected out-of-spec behavior, and prevent regressions during the course of editing and refactoring. These overly-mocked tests by definition can’t do the first one and they actively inhibit the second. They have negative value insofar as they constantly trigger failure while making completely benign edits.
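A minimal sketch of the failure mode (hypothetical service and test, not from any real codebase). The first assertion tests behaviour; the rest pin implementation details, so a benign refactor, say batching the lookups into one call, breaks the test while behaviour stays correct:

```python
from unittest.mock import MagicMock

def total_price(db, item_ids):
    # Sums per-item prices fetched from a (mocked) data source.
    return sum(db.get_price(i) for i in item_ids)

def test_total_price():
    db = MagicMock()
    db.get_price.side_effect = [10, 20]

    # The assertion that actually tests behaviour:
    assert total_price(db, ["a", "b"]) == 30

    # Brittle assertions that merely restate how the code is written today:
    db.get_price.assert_any_call("a")
    assert db.get_price.call_count == 2

test_total_price()
```

Delete the last two lines of the test and it survives refactors; keep them and every internal rearrangement shows up as a "failure".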


Handcrafted artisanal stuff is a luxury because that's the only niche that makes economic sense for it now, given mass production and other recent developments (too lazy to list, sorry).

Consider how you can't really get by in most of America without a car because we designed our cities to require them. It would be a mistake to conclude that, because life is harder in a car-optimized society without a car, society must be better off optimizing for cars.


Btw, it goes without saying that if you wanted to have a general-purpose relational language then you should use Prolog, not Datalog. After all, if you use Prolog there's nothing stopping you from sticking to Datalog when you want, and only using the full power of Turing-completeness when you must.

Seriously. Learn Prolog. It's a powerful language and you'll never worry about the Object-Relational Impedance-Mismatch ever again in your life. The only reason not to learn it is that you will forever be sad that you can't use it in your day job. Or you'll find one where you can, like I did.


The decoupling narrative is oversold for queues.

There's essential decoupling and accidental decoupling; decoupling you want, and decoupling which mostly just obscures your business logic.

Resilience in the face of failure, where multiple systems are communicating, or there's a lot of long-running work which you want to continue as seamlessly as possible, is the biggest essential decoupling. You externalize transitions in the state machine (edges in the state graph) of the logic as serialized messages, so you can blow away services and bring them back and the global state machine can continue.

Scaling from a single consumer to multiple consumers, from multiple CPUs to multiple machines, is mostly essential decoupling. Making decisions about how to parallelize subgraphs of your state machine, removing scaling bottlenecks, is an orthogonal problem to the correctness of the state machine on its own, and representing it as a state machine with messages in queues for edges helps with that orthogonality. You can scale things up without external persistent queues but you'll end up with queues somewhere, even if it's just worker queues for threads.
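The "you'll end up with queues somewhere" point is easy to see in miniature. A sketch (worker count and workload invented for illustration) of scaling one consumer to several inside a single process, with nothing but a thread-safe queue as the edge between producer and workers:

```python
import queue
import threading

tasks = queue.Queue()
results = []
lock = threading.Lock()

def worker():
    while True:
        item = tasks.get()
        if item is None:          # poison pill: shut this worker down
            break
        with lock:
            results.append(item * 2)   # stand-in for real work

# Scaling from one consumer to four changes nothing about the logic above.
threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for i in range(10):
    tasks.put(i)
for _ in threads:
    tasks.put(None)
for t in threads:
    t.join()

print(sorted(results))  # all ten items processed, each doubled
```

Swap `queue.Queue` for an external broker and the same shape survives process restarts; that's the essential decoupling, the rest is plumbing.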

Accidental decoupling is where you have a complex state machine encapsulating a business procedure with multiple steps, and it's coordinated as messages between and actions in multiple services. The business logic might say something like: take order from user; send email notification; complete billing steps; remove stock from inventory system; schedule delivery; dispatch stock; etc.

All this logic needs to complete, in sequence, but without higher order workflow systems which encode the state machine, a series of messages and producers and consumers is like so much assembly code hiding the logic. It's easy to end up with the equivalent of COMEFROM code in a message system.

https://en.wikipedia.org/wiki/COMEFROM


This seems reminiscent of Session Types to me:

https://en.wikipedia.org/wiki/Session_type

I think one difference is that session types capture which state a process/co-routine is in after receiving a sequence of messages, whereas this system does not; it captures how one can respond to each state in isolation (e.g. I don't think you can statically capture that once a door is closed and locked, you can only receive the locked state from it and respond by unlocking it).


…and the problem with that is what, exactly?

The only meaningful thing in this discussion is about people who want to make money easy, but can’t, because of the rules they don’t like.

Well, suck it up.

You don’t get to build a cheap shitty factory that pours its waste into the local river either.

Rules exist for a reason.

You want the lifestyle, but also all the good things, and also no rules. You can’t have your cake and eat it too.

/shrug

If China builds amazing AI tech (and they will) then the rest of the world will just use it. Some of it will be open source. It won’t be a big deal.

This “we must out compete China by being as shit and horrible as they are” meme is stupid.

If you want to live in China, go live in China. I assure you, you will not find it to be the lawless freehold of “anything goes” that you somehow imagine.


We live in Mordor and have the ring.

Clock people have great words for things. Their word for features is "complications".

In Brazil even beggars accept money through their phones and everywhere there's contactless payment.

Yes, it would seem preferable to reuse the same energy storage over and over again, as opposed to digging it out of the ground at huge expense, shipping it across the world, and then spreading it out into the environment as a cloud of toxic particles after one use.

Behold uber for forums
