I tried WebGL about a year ago, and it wasn't ready yet. It pegged the CPU quite a bit, and performance varied noticeably from browser to browser. Has this changed in the last year?
I think that, like all things on the world wide web, the majority of it will be implemented sub-par, perhaps even terribly. It's not so much WebGL that is the bottleneck as all the experimental implementations of it (guilty as charged here).
Those people, like Mr Simon here, who take the time to optimize every last bit can indeed make it purr.
Everything on the web starts as a framework or proprietary bundle of some kind, and if it proves its merit, it may just get built into the W3C standards themselves.
We currently have a low-level technology (WebGL) in the W3C standards, and libraries like three.js provide the get-you-by that elevates it to average code legibility. But I think we are still a long way from 3D being a JavaScript-native ability for the average developer in a performant implementation, at which point we could really expect this www-metaverse experience getting hyped everywhere to blow the lid off the two-dimensional web we're currently stuck in.
What do you mean you 'tried' WebGL? It's just OpenGL in the browser. As mentioned, it just works, the same way OpenGL/Vulkan/DirectX just work.
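For what it's worth, the entry point really is just a canvas context that speaks GL. A minimal sketch, assuming a hypothetical `<canvas id="view">` element on the page:

```typescript
// Minimal sketch: WebGL exposes familiar GL-style calls directly in the browser.
// The canvas id "view" is a made-up example, not from the thread.
const canvas = document.getElementById("view") as HTMLCanvasElement;
const gl = canvas.getContext("webgl2");
if (!gl) throw new Error("WebGL2 not available on this browser/device");

// The same state-machine style calls you would write against OpenGL ES.
gl.viewport(0, 0, canvas.width, canvas.height);
gl.clearColor(0.1, 0.1, 0.1, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
```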
Now, various WebGL frameworks (Three.js, Babylon.js, etc.) are more performant or easier to use, or whatever. Or you can compile native 3D engines to wasm. Which did you try?
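For comparison, here's roughly what the standard three.js "spinning cube" getting-started scene looks like; the material, colors, and light choice are my own illustration, not anything from the thread:

```typescript
// Rough sketch of a three.js starter scene, to show how a framework hides
// the raw GL boilerplate. Assumes the "three" npm package is installed.
import * as THREE from "three";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 1000);
camera.position.z = 5;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshStandardMaterial({ color: 0x44aa88 }),
);
scene.add(cube);
scene.add(new THREE.DirectionalLight(0xffffff, 1));

renderer.setAnimationLoop(() => {
  cube.rotation.x += 0.01; // spin so we can see it's actually rendering
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
});
```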
OpenGL/Vulkan/DirectX just work; WebGL depends on the whims of the browser and how it evaluates the client machine, possibly blacklisting the GPU or silently replacing calls with software emulation instead.
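You can often catch that fallback yourself. A rough sketch, assuming the browser exposes the WEBGL_debug_renderer_info extension (a real extension, though some browsers mask it):

```typescript
// Sketch: detect when the browser has quietly fallen back to a software
// rasterizer (e.g. SwiftShader) instead of the real GPU.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl2") ?? canvas.getContext("webgl");
if (!gl) {
  console.warn("WebGL disabled or blacklisted on this machine");
} else {
  const info = gl.getExtension("WEBGL_debug_renderer_info");
  const renderer = info
    ? gl.getParameter(info.UNMASKED_RENDERER_WEBGL)
    : gl.getParameter(gl.RENDERER); // masked fallback, often just a generic string
  console.log("Reported renderer:", renderer);
  if (typeof renderer === "string" && /swiftshader|llvmpipe|software/i.test(renderer)) {
    console.warn("Running on a software rasterizer; expect poor performance");
  }
}
```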
OpenGL/Vulkan/DirectX can use 2021 hardware capabilities; WebGL is stuck on a GL ES 3.0 subset, basically 2012 hardware.
Cool if you want to do iPhone 6 style graphics, but that is about it.
OpenGL ES 3.0 is derived from both OpenGL 3.3 and 4.2 while getting rid of some legacy stuff, so it's a 'subset', but not really. And considering the range of devices browsers run on, it's not bad. Very few devices other than consoles and gaming rigs actually make use of bleeding-edge graphics APIs.
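If you're curious what that ES 3.0-level context actually gives you on a particular device, you can just ask it. A quick sketch (the specific limits queried here are my own pick):

```typescript
// Sketch: query what the ES 3.0-level WebGL2 context exposes on this device.
// Limits vary widely across the phones and laptops that browsers run on.
const gl = document.createElement("canvas").getContext("webgl2");
if (gl) {
  console.log("Version:          ", gl.getParameter(gl.VERSION)); // e.g. "WebGL 2.0 (OpenGL ES 3.0 ...)"
  console.log("GLSL:             ", gl.getParameter(gl.SHADING_LANGUAGE_VERSION));
  console.log("Max texture size: ", gl.getParameter(gl.MAX_TEXTURE_SIZE));
  console.log("Max draw buffers: ", gl.getParameter(gl.MAX_DRAW_BUFFERS)); // ES 3.0 / WebGL2 feature
  console.log("Max color attach: ", gl.getParameter(gl.MAX_COLOR_ATTACHMENTS));
}
```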