Spain is still pushing out great titles for these machines - the scene around the Oric-1/Atmos computers (competitors to the ZX Spectrum) is particularly vibrant among the Spanish retro hacking community. Some truly astonishing things are coming from those folks for these 40+ year old machines ..
I've always had a TV or screen of some sort devoted to background music or light films, just to fill the void between lines of code. For some, having that kind of light stuff going on is a productivity booster. I once got a dev team that had been struggling to finish things well and truly over the finish line by putting a fat TV in the room and letting folks line up their playlists for the day, as long as nothing was too violent or inappropriate for the workplace.
We side-watched a ton of stuff together as a team - it was great for morale - and we actually shipped stuff, too. Of course the TV eventually became a console for the build server, but it was always available to anyone who wanted to put something on in the background. Definitely a nice way to make a team a bit more cohesive - as long as what's being played isn't too crazy.
One of my first audio dev jobs was to add Creative Labs AWE32 support to a game called "Spectre VR", which at the time was one of the most popular multiplayer/networked games for PCs.
It was, literally, a blast. The AWE32 was basically a sampler on a card for PCs, and it was CL/Velocity's intention to bundle SpectreVR with the AWE32 for release. SpectreVR had a PC-speaker driver, which was .. viable .. if you had the processing power for it. But AWE32 support just kicked serious ass, so it was a major push to get it working.
We got it working, it was hilariously fun, and the upgrade in audio had everyone in the office clamoring for the limited AWE32s we'd been allocated for development.
Then, DOOM hit the 'net, and everyone moved on, really fast, from SpectreVR, and it wasn't long until the bundling deal came undone, and DOOM got the AWE32 treatment instead. That was a frustrating state of affairs, but at least I got a free AWE32 out of it ..
It's still the case today, it's just that America has gotten louder about its academic accomplishments being a key factor in economic success.
Your average German or Austrian university has plenty of expat Americans, there precisely because the education systems of the two nations differ so much. They are understated and under-represented in mainstream conversations about academia, but there are certainly still Americans making the pilgrimage to the older universities for the diversity and strengths they offer.
America buys in a lot of talent for its universities today, although of course there are plenty of bright Americans too. German universities today are, in general, stricter and more rigorous than American ones.
I think the USA, like the UK, does tend to rely on name recognition. Oxford and Cambridge use interviews to filter people out, yet their graduates are disproportionately represented in power structures.
While I agree with you that Germany doesn't have the intellectual prowess it once may have had, I don't think you can consider the Nobel prize a valid metric, personally. The Nobel prize has subverted itself many times over.
While German academia was rebuilding itself, American academia was chasing clout - one side effect being that the Nobel prize is more of a carnival attraction than an academic accomplishment.
Another example of Windows' technical debt sitting there as low-hanging fruit, waiting to be cashed in by performance-oriented developers. It's interesting that YouTube changing the timer resolution propagates to other threads .. does this hark back to darker days in the MS-DOS era? YouTube, another Turbo button?
I was told a story by some hackers in the old multi-user mainframe days. They said that a good speed booster was to have the program open a terminal because it made the mainframe OS think it was a real-time user interactive program and give it more resources.
.. because there was, once, a day when the terminal buffer available for this pipe was bigger than available memory offered to a process by default, meaning the thing would unpack faster if I did that versus:
$ tar zxvf somefile.tar.gz
Admittedly, this discrepancy in available memory was usually because the BOFH hadn't realized there was also an allocation for pipes-per-user, so it was a neat trick to get around the hard limits that the BOFH had imposed on some of my processes in terms of heap allocation ..
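For context, the piped variant was something along these lines (from memory, so treat the exact incantation as illustrative):

$ cat somefile.tar.gz | tar zxvf -

.. the point being that the pipe's buffer, rather than the process's default heap allowance, did the heavy lifting.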
Is this related to the fact that when you are scrolling and selecting within a document, and you wiggle the mouse, it scrolls faster? I always thought it was just a nice UI optimisation, but I could believe it's actually some accidental side-effect at play.
(Like, make a 20-page Word doc and start selecting from the first page and drag through - it will go faster if you jiggle. Same in Excel and nearly every Windows app, even Windows Explorer.)
No, it has to do with the fact that every time you move the mouse over a window, a mouse-move event is sent to the application, which runs its main event loop. Either the installer only updated its progress bar when an event happened (in which case it would only appear to be going faster, because the progress bar would move more smoothly), or there was some really terribly written code that literally only made progress when an (unrelated) event happened. My guess is the former.
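Just to make the second (badly written) case concrete, here's a hypothetical sketch of the pattern - work only advances when a message arrives, so wiggling the mouse literally generates more progress:

    // Hypothetical Win32 sketch - NOT the actual installer code.
    // GetMessage blocks until some message (e.g. WM_MOUSEMOVE) arrives,
    // so one chunk of work happens per message received.
    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0)) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
        CopyNextInstallChunk();   // hypothetical helper: one unit of install work
        UpdateProgressBar();      // hypothetical helper
    }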
There must be so many subtle features like these that people use subconsciously, and when they move to another operating system, they try the same thing, nothing happens, and they get frustrated.
A performance issue related to this is more likely a shortcoming in the software experiencing it.
The setting in question is the minimum timer resolution. Changing this will only have an impact on applications that depend heavily on that resolution, i.e. it's not some sort of turbo button for general execution speed. In fact according to the docs, a higher resolution can "reduce overall system performance, because the thread scheduler switches tasks more often."
An application whose performance depends on the timer resolution should be setting that resolution itself, using the Win32 API function mentioned in the thread, timeBeginPeriod, which includes the following in its documentation:
> For processes which call this function, Windows uses the lowest value (that is, highest resolution) requested by any process. For processes which have not called this function, Windows does not guarantee a higher resolution than the default system resolution.
> Starting with Windows 11, if a window-owning process becomes fully occluded, minimized, or otherwise invisible or inaudible to the end user, Windows does not guarantee a higher resolution than the default system resolution. See SetProcessInformation for more information on this behavior.
> Setting a higher resolution can improve the accuracy of time-out intervals in wait functions. However, it can also reduce overall system performance, because the thread scheduler switches tasks more often. High resolutions can also prevent the CPU power management system from entering power-saving modes.
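The usual pattern is roughly this (a minimal sketch, not production code): query the supported range, then bracket only the latency-sensitive section so the rest of the system isn't dragged along:

    #include <windows.h>
    #include <timeapi.h>   /* link with winmm.lib */

    TIMECAPS tc;
    if (timeGetDevCaps(&tc, sizeof(tc)) == MMSYSERR_NOERROR) {
        timeBeginPeriod(tc.wPeriodMin);    /* request e.g. 1 ms resolution */
        /* ... latency-sensitive work: tight Sleep()/wait loops, media, etc. ... */
        timeEndPeriod(tc.wPeriodMin);      /* always pair with the same value */
    }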
Think of it this way: the global timer resolution of the system is minOf(allProcessesTimerResolution). If no process needs higher-accuracy timing, then there is nothing hindering the system from sleeping for longer periods to save power and/or reduce interrupt overhead (a feature, I'd say).
These APIs are from the early 90s, and back then having a global system interrupt firing 1000 times per second could very well have taken a percent or two or more off overall CPU performance (people already complained about the "overhead" of having a "real OS").
On the other hand, writing audio players on DOS you had the luxury of receiving your own interrupt within a few samples' worth of audio, which meant you could run very tight audio buffers with lower latency and quicker response to user triggers.
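Roughly the kind of thing we did back then, sketched from memory (Watcom-style DOS C, details illustrative only):

    /* Sketch from memory: hook the PIT at a higher rate to feed an audio buffer. */
    #include <dos.h>
    #include <conio.h>

    static void (__interrupt __far *old_irq0)(void);

    static void __interrupt __far timer_isr(void)
    {
        /* push the next few samples to the DAC / sound card here */
        outp(0x20, 0x20);                 /* send EOI to the PIC ourselves */
    }

    void hook_timer(unsigned hz)          /* e.g. ~8000 Hz for sample feeding */
    {
        unsigned divisor = (unsigned)(1193182UL / hz);   /* PIT base clock */
        old_irq0 = _dos_getvect(0x08);
        _dos_setvect(0x08, timer_isr);
        outp(0x43, 0x36);                 /* channel 0, lo/hi byte, rate generator */
        outp(0x40, divisor & 0xFF);
        outp(0x40, divisor >> 8);
        /* remember to chain to old_irq0 at ~18.2 Hz to keep the DOS clock sane */
    }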
Not having that timing fidelity available would have made Windows a no-go platform for audio software, so giving developers the freedom to enable it when needed was necessary. Removing it in the next 10 years would probably have risked bad regressions.
Like a sibling comment noted, they finally removed it during Windows 10's lifespan; with modern CPUs _AND_ multicore they probably felt safe enough on performance margins to confine high-accuracy threads/processes to separate cores, let the other cores sleep more, and actually win some battery life out of it.
It might not be "perfect engineering", but considering the number of schedulers written for Linux over the years to address desktop (audio) vs server loads, it was a fairly practical and usable design.
DOS was basically bare-metal programming with a few hardware and software calls thrown in. With 50-cent ARM processors these days having the power of an 80's mainframe, bare-metal on a $5 dev board is still my preferred way to go for simple projects that boot instantly and never need updates. I'm currently listening to music on a DOS MP3 player on a throwaway industrial x86 motherboard I built into an amplifier case 23 years ago.
> There should at least be mention that changing this resolution can affect other processes.
That sorta is what it’s saying. If you don’t set it yourself, you won’t get any better than the “default system resolution”. But if the default system resolution changes (say, by entering a sort of “performance mode” when playing games or watching videos), then it would imply it will affect all processes that are using the default, right?
Sorta - prior to Windows 10, version 2004. From the same Microsoft page:
“Prior to Windows 10, version 2004, this function affects a global Windows setting. For all processes Windows uses the lowest value (that is, highest resolution) requested by any process. Starting with Windows 10, version 2004, this function no longer affects global timer resolution.”
I mean, sure, it implies things. But we all know that devs have a hard time reading between the lines when the compiler is boiling away.
You get it, I get it, but I guarantee you there are a thousand developers for each one of us who won't get it and wonder why the heck things change now and then, without realizing they also need to test their timer-dependent code under less than hygienic conditions in order to validate the results ..
I think this is, technically, a distasteful situation, and whoever wrote those technical docs kind of wanted to avoid admitting the painful truth and out-and-out stating that changing the timer resolution will have a system-wide impact, because .. really .. why should it? There is no good reason for it. Only bad reasons, imho. Ancient, technical-debt'ish kinda reasons.
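If you want to see what "less than hygienic conditions" actually does, a throwaway probe along these lines (a rough sketch, nothing more) makes the effective granularity visible - run it on its own, then again while something media-heavy is running, and compare (on newer Windows the cross-process effect is reduced, per the docs quoted above, but the probe still shows your own process's behaviour):

    /* Rough sketch: measure the effective Sleep(1) granularity, which varies
       with whatever timer resolution is currently in effect. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        LARGE_INTEGER freq, t0, t1;
        QueryPerformanceFrequency(&freq);
        for (int i = 0; i < 10; i++) {
            QueryPerformanceCounter(&t0);
            Sleep(1);                               /* ask for 1 ms */
            QueryPerformanceCounter(&t1);
            printf("Sleep(1) took %.2f ms\n",
                   (t1.QuadPart - t0.QuadPart) * 1000.0 / freq.QuadPart);
        }
        return 0;
    }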
Beej has been a stable reference for me for decades, also. By applying the knowledge gained in his networking guide, I've won many contracts for work that others considered too difficult (or "impossible" within the budget constraints) .. but which I managed to deliver, on time and under budget, because Beej had led the way.
Easily one of my top 5 favourite people on the Internet, alongside Linus Torvalds and so on .. I would say Beej's impact on technology has been understated, but it's definitely an order of magnitude or two greater than most.
Once, back in the woolly days of the Internet and before crypto became a household name, Wired magazine published an article about Phil Zimmermann's work .. and included his phone number at the end of the article, for some strange reason.
I was immediately star-struck, and without even thinking about it, called him up - not really expecting an answer.
He answered, and I was suddenly lost for words. What the heck was I calling him for? I told him, sorry, I just wanted to see if that phone number was real, and whether or not he was really so accessible to the general public to discuss crypto things.
He was very cordial, said yes indeed he enjoyed hearing from people who had constructive ideas about his work, and what was I really interested in. I glibly told him, I was just really testing the phone number - how did he feel about being so contactable - and he replied he had no problem with it whatsoever, and thanks for my call. I stupidly tried to explain to him how important it all was, and bumbled my way out of the call .. and I still, decades later, distinctly remember his chuckle as he put the receiver down on the other end ..
Every time I fire up "cmake" I chant a little spell to protect me from the goblins that live on the other side of FetchContent, and promise the Gods of the Repo that I will, eventually, review everything to make sure I'm not shipping poop nuggets .. just as soon as I get the build done, tested .. and shipped, of course .. but I never, ever do.
Does anyone know if the foveated rendering feature of the Steam Frame depends on eye-tracking? Is it tracking the iris to determine the center of foveation, or is there some other trick to doing this?