pak9rabid's comments

Never I hope

I was able to accomplish this by doing each test within its own transaction session that gets rolled back after each test. This way I'm allowed to modify the database to suit my needs for each test, and then it gets magically reset back to its known state for the next test. Transaction rollbacks are very quick.
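A minimal sketch of the pattern in plain SQL (the table and column names here are made up; in practice the test framework issues the BEGIN/ROLLBACK around each test body):

    BEGIN;
    -- the test can mutate data however it needs
    INSERT INTO orders (customer_id, total) VALUES (42, 19.99);
    UPDATE customers SET balance = balance - 19.99 WHERE id = 42;
    -- run assertions against the modified state here
    ROLLBACK;  -- everything above disappears; the next test starts from the known baseline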

As a consultant, I saw many teams doing that and it works well.

The only detail is that autoincrements (SEQUENCEs for PostgreSQL folks) get bumped even if the transaction rolls back.
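For example (assuming a hypothetical users table with a serial id column), the id consumed inside the rolled-back transaction is simply skipped:

    BEGIN;
    INSERT INTO users (name) VALUES ('test');  -- consumes, say, id 1 from the sequence
    ROLLBACK;
    INSERT INTO users (name) VALUES ('real');  -- gets id 2; the rollback doesn't give id 1 back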

So tables tend to get large ids quickly. But it's just a dev database, so no problem.


Unfortunately a lot of our tests use transactions themselves because we lock the user row when we do anything to ensure consistency, and I'm pretty sure nested transactions are still not a thing.

You can emulate nested transactions using savepoints. A client of mine uses that in production, and others use it in unit tests.
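Roughly, the inner "transaction" becomes a savepoint inside the outer per-test transaction (table and savepoint names here are illustrative):

    BEGIN;                                 -- outer per-test transaction
    SAVEPOINT user_update;                 -- stands in for the nested BEGIN
    SELECT * FROM users WHERE id = 42 FOR UPDATE;
    UPDATE users SET email = 'new@example.com' WHERE id = 42;
    RELEASE SAVEPOINT user_update;         -- stands in for the nested COMMIT
    -- on error you'd instead do: ROLLBACK TO SAVEPOINT user_update;
    ROLLBACK;                              -- the whole test still gets rolled back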

Bingo...this is how I get around that.

This doesn’t work for testing migrations because MySQL/MariaDB doesn’t support DDL inside transactions, unlike PostgreSQL.

Migrations are kind of a different beast. In that case I just stand up a test environment in Docker that does what it needs, then trash it once things have been tested and verified.

Range types are a godsend when you need to calculate things like overlapping or intersecting time/date ranges.

Can you give a real world example?

I think the examples here are pretty good: https://boringsql.com/posts/beyond-start-end-columns/

This is kind of a complicated example, but here goes:

Say we want to create a report that determines how long a machine has been down, but we only want to count time during normal operational hours (aka operational downtime).

Normally this would be as simple as counting the time from when the machine was first reported down to when it was reported to be back up. However, since we're only allowed to count certain time ranges within a day as operational downtime, we need a way to essentially "mask out" the non-operational hours. This can be done efficiently by finding the intersection of various time ranges and summing the duration of each of these intersections.

In the case of PostgreSQL, I would start by creating a tsrange (timestamp range) that encompasses the entire time range that the machine was down. I would then create multiple tsranges (one for each day the machine was down), limited to each day's operational hours. For each one of these operational hour ranges I would then take the intersection of it against the entire downtime range, and sum the duration of each of these intersecting time ranges to get the amount of operational downtime for the machine.

PostgreSQL has a number of range functions and operators that can make this very easy and efficient. In this example I would make use of the '*' operator to determine where two time ranges intersect, and then subtract that intersection's lower bound (using the lower() range function) from its upper bound (using the upper() range function) to get the duration of only the "overlapping" part of the two time ranges.
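A rough sketch of what that could look like (the downtime window and operational hours are made up; a real report would generate the per-day operational ranges from a calendar table or generate_series() rather than hard-coding them):

    SELECT sum(upper(downtime * op_hours) - lower(downtime * op_hours)) AS operational_downtime
    FROM (SELECT tsrange('2024-01-01 14:30', '2024-01-03 10:15') AS downtime) d
    CROSS JOIN (VALUES
        (tsrange('2024-01-01 08:00', '2024-01-01 17:00')),
        (tsrange('2024-01-02 08:00', '2024-01-02 17:00')),
        (tsrange('2024-01-03 08:00', '2024-01-03 17:00'))
    ) AS oh(op_hours)
    WHERE downtime && op_hours;  -- && skips days with no overlap at all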

Here's a list of functions and operators that can be used on range types:

https://www.postgresql.org/docs/9.3/functions-range.html

Hope this helps.


Back when video games separated the men from the boys.


Damn, that was a walk down memory lane.


Well, look at how Microsoft tried to hijack the JVM back in the 90s. I think the big fear is that somebody creates a "mostly compatible" product that in fact isn't 100% compatible, and tries to market it as if it were the original.


Yes, a VM with extremely tight integration with the Windows environment, making things that would otherwise take a lot of time to set up a breeze. I use it as my daily driver for dev work (at work, since we're required to use Windows :( ), and to be honest it's quite pleasant most of the time.


I usually work in a VM hosted by my company, but the performance is really starting to irritate me. I've been considering switching to WSL2, but last time I checked all they supported was Debian-based distros, and we do all our work on RHEL8. I don't think it would matter much, but it's still annoying working on an entirely different setup from the rest of the team.

How is the graphical app support these days?



They support Oracle Linux. It is almost the same as RHEL.


How is it "tight" when even `ps` or `top` show VM processes instead of OS ones? Could you give an example of functionality that can't be done with `docker run -it ubuntu`?

I used it for a little bit (got a Windows laptop, thought maybe I'd switch, but no), and just hated that split-brain workspace.


That's the difference between an engine and game developer.


It's the difference between someone working in a world of engineering tradeoffs and a fantasist imagining their next übergame with everything awesome in it.

I looked into the development cycle behind Daikatana, partly because it had its 25th anniversary this year and so for some reason, YouTube was recommending me Daikatana content. And... there's a reason why Romero's first dev team all quit. Daikatana started life as a 400-page document full of everything Romero found awesome at the time. There were going to be time travel mechanics and a roleplaying party system like Chrono Trigger. It was going to have like a hundred awesome weapons that totally reinvented how to make things go boom. It was going to have the sweetest graphics imaginable. Etc. It was like something Imari Stevenson would have written as a teenager, which is somewhat surprising since Romero could now call himself a seasoned industry professional.

What's worse, "Design is Law" basically meant "what I say goes". It was his job to have the ideas, and it was his team's job to implement them. Romero wanted to be the "idea guy", and Daikatana was an "idea guy" game. I doubt he had the maturity at the time to understand what design is, in terms of solving a problem with the tools and constraints you have. He wanted Daikatana, and Quake before that, to have everything, and didn't know how to pare it down to the essentials, make compromises, and most importantly, listen to his team. Maybe there's an alternate-universe Quake or Daikatana somewhere that's just a bit more ambitious than the Quake we got, incorporating more roleplaying elements into a focused experience. But in our timeline, Romero didn't want to make that game.

Of course, after taking the L on Daikatana's eventual release, Romero wised up and started delivering much more focused and polished experiences, learning to work within constraints and going a long way toward rehabilitating his reputation. But that's not where he was when he criticized Quake for not pushing the envelope enough.


Yeah, Romero seemed too ambitious. I didn't play the original Daikatana, but the Game Boy Color port was surprisingly good. The basic 2D graphics of the system helped enforce technical simplicity. So an undisciplined game designer could actually go nuts (within the technical limitations of a tile-based 8-bit machine) without much risk of overwhelming programmers or artists.

However, the GBC game wasn't actually developed by Romero's team, but outsourced to the Japanese studio Kemco. Romero was involved, though I'm not sure how much.

Anyway, what made the GBC game unique is that it played like a linear, story-driven game in the mold of Half-Life, just not in first person.

The presentation was top-down, like Zelda: Link's Awakening, but the world wasn't split into an overworld and dungeons, nor was it split into "levels". Instead you would just walk from location to location, where side paths would be blocked off, similar to Half-Life. On the way you were solving environmental (story-related) puzzles, killing enemies, and meeting allies, all while advancing the elaborate plot through occasional dialog scenes. It felt pretty modern, like playing an action thriller.

For some reason I never saw another 2D game which played like that. I assume one reason is that this type of story-driven action game was only invented with Half-Life (1998), at which point 2D games were already out of fashion on PC and home consoles. Though this doesn't explain why it didn't catch on for mobile consoles.

So in conclusion, I think Romero (his own studio) might have been better off developing ambitious 2D action adventures for constrained mobile consoles rather than trying the same on more challenging 3D hardware. It would have been a nice niche that no other team occupied at the time, to my knowledge.


Well, that's kinda what he did. He formed a new studio, Monkeystone Games, and released the 2D top-down adventure, Hyperspace Delivery Boy, on Windows CE devices, which was pretty well received at the time. Like I said, he took the L pretty well and learned its lessons.

I've played Daikatana GBC. It's pretty good. Years back, Romero released the ROM on his web site as a free download. I suspect ION was pretty involved, at least from a standpoint of making sure the story unfolded more or less as it did in the main game.


They theoretically had more than enough time for game design in the ~2 year development period, which was long for the time.


I'm guessing none of these diehards were running domain controllers.


Dialing up a friend to play Quake, there was essentially no lag.

Dialing up to the Internet to play Quake via TCP/IP...shit tons of lag (150+ ms).

