> Despite attempts to make Atom—an Electron application—more responsive, it never reached the performance standards the team yearned for.
This feels like an attempt at deflecting blame. VSCode is another Electron application that ended up having better performance than Atom. There's another Electron-adjacent application that has good performance: the one you're probably using right now to read this page.
Reading this static HTML page and executing several megabytes of JavaScript to read and edit short lines of text are fundamentally different tasks, even if the same program can perform both actions.
VSCode can pretend to be fast on my desktop, and I would not care because desktops today are computing monsters that rival supercomputers of the past, but Sublime Text is still much faster at any text editing task.
I’m not saying that VSCode is the snappiest editor out there, but for all its flaws it’s still the most responsive editor for my use case on moderately sized CUDA/HIP projects.
What are you saying here? It is true that VS Code is less bad than Atom in terms of responsiveness. Zed, however, is written in Rust (i.e., not Electron), and I would guess it is at least an order of magnitude more responsive than VS Code across every possible scenario.
Web technologies are an unrivaled technological marvel for what they are, but it is disingenuous to imply they represent anything near the peak of what we are capable of in the context of performance.
It depends. It definitely opens faster and the general UI seems a bit faster, but open a largish file (a few MB) and VSCode will easily outperform Zed, because it doesn't have that fancy CRDT thing.
In my experience VSCode is plenty fast. Use it with no extensions and you will have zero problems with performance (though memory use isn't great). The real problems come when you add extensions, especially because it's often impossible to attribute performance issues to a specific one: they all do their work in the same "extension host" process.
> you will have zero problems with performance (though memory use isn't great)
Memory use is a part of performance, though, so I definitely would say VS Code has serious performance issues. It's why I no longer use it, in fact. It's inexcusable for something to eat as much RAM as VS Code does.
If you use VSCode on a platform with limited resources you will see that it is absolutely NOT fast, and Zed outperforms it by a long way, extensions or not. And Electron is a plague that needs to pass from this world.
Based on this comment [0], they're building DeltaDB as a version control system which uses a CRDT, so I assume even in single-player mode, the file will instantiate its own CRDT for fine-grained tracking of changes.
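To make "fine-grained tracking of changes" concrete, here's a minimal sketch of a per-file sequence CRDT, in Go to match the other snippets in this thread. Nothing below is from DeltaDB or Zed; the types, the newest-first tiebreak, and the replica names are made up for illustration.

package main

import (
    "fmt"
    "sort"
)

// ID uniquely identifies one inserted character across replicas.
type ID struct {
    Site    string // replica identifier
    Counter int    // per-replica monotonically increasing counter
}

// Elem is an inserted character plus the ID of the character it was
// typed after (the zero ID acts as the document root).
type Elem struct {
    ID     ID
    Parent ID
    Ch     rune
}

// Doc is the replicated state: just a set of insert operations.
// Merging replicas is a union of their element maps, so the state
// converges regardless of delivery order.
type Doc struct {
    Elems map[ID]Elem
}

func NewDoc() *Doc { return &Doc{Elems: map[ID]Elem{}} }

// Apply records an insert; applying the same element twice is harmless.
func (d *Doc) Apply(e Elem) { d.Elems[e.ID] = e }

// Merge folds another replica's elements into this one.
func (d *Doc) Merge(other *Doc) {
    for _, e := range other.Elems {
        d.Apply(e)
    }
}

// String linearizes the tree of inserts: children of each parent are
// visited newest-ID-first, so concurrent inserts order deterministically.
func (d *Doc) String() string {
    children := map[ID][]Elem{}
    for _, e := range d.Elems {
        children[e.Parent] = append(children[e.Parent], e)
    }
    for _, c := range children {
        sort.Slice(c, func(i, j int) bool {
            if c[i].ID.Counter != c[j].ID.Counter {
                return c[i].ID.Counter > c[j].ID.Counter
            }
            return c[i].ID.Site > c[j].ID.Site
        })
    }
    var out []rune
    var walk func(ID)
    walk = func(p ID) {
        for _, e := range children[p] {
            out = append(out, e.Ch)
            walk(e.ID)
        }
    }
    walk(ID{}) // start at the root
    return string(out)
}

func main() {
    a, b := NewDoc(), NewDoc()
    // Replica A types "hi" starting from the root.
    h := Elem{ID{"a", 1}, ID{}, 'h'}
    i := Elem{ID{"a", 2}, h.ID, 'i'}
    a.Apply(h)
    a.Apply(i)
    b.Merge(a)
    // Concurrently, A and B each insert after 'i'.
    a.Apply(Elem{ID{"a", 3}, i.ID, '!'})
    b.Apply(Elem{ID{"b", 1}, i.ID, '?'})
    // After exchanging state, both replicas print the same text.
    a.Merge(b)
    b.Merge(a)
    fmt.Println(a.String(), b.String())
}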
I had a coworker who lost a lot of weight and showed up at work one day wearing new clothes and looking sharp. The pants were from Costco. I have since gone and bought a few pairs of pants from them. They feel fairly high quality, made of sturdy and comfortable materials, and are wife-approved. And of course they are very inexpensive.
I'm sure expensive pants have their benefits but no matter how much money I have, I will always baby expensive things, and it's very inconvenient to baby clothes (e.g. must be dry cleaned, can't use a washer or dryer, can't risk getting stains on them). There are good reasons why dads get their clothes from Costco.
Yeah, it's a nation. A nation of immigrants. Where did your great-great-great grandfather come from? Did he spontaneously erupt from this common ancestry and heritage that you speak of?
Glad to see that Perfetto now has support for CPU sample flame graphs that's on par with Google's internal pprof visualizer, as alluded to in the talk. Using the internal visualizer to share Windows ETW traces with colleagues was the primary motivation for developing [EtwToPprof](https://github.com/google/EtwToPprof). Now that Perfetto supports this natively, I might look into developing EtwToPerfetto :-)
Fun fact: Perfetto also gained support for the pprof format within the last month :)
It opens a special "aggregate flame graph" view of the profile since pprof does not preserve time information. But it works! We use it for visualizing aggregates across thousands of profiles in production!
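For anyone who wants a file to try that view on, here's a minimal Go sketch using the standard runtime/pprof package; the output filename and the busywork loop are arbitrary, and this only shows how a pprof-format CPU profile gets produced, not anything specific to Perfetto's importer.

package main

import (
    "fmt"
    "os"
    "runtime/pprof"
)

func main() {
    // Output filename is arbitrary; any pprof-capable viewer can load it.
    f, err := os.Create("cpu.pprof")
    if err != nil {
        panic(err)
    }
    defer f.Close()

    if err := pprof.StartCPUProfile(f); err != nil {
        panic(err)
    }
    defer pprof.StopCPUProfile()

    // Some CPU-bound busywork so the sampler has something to record.
    sum := 0
    for i := 0; i < 100_000_000; i++ {
        sum += i * i
    }
    fmt.Println(sum)
}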
That's just the nature of these tools though. For example, Windows has its own powerful ETW tracing framework, but using it for real profiling and debugging requires learning a lot about the tools: https://randomascii.wordpress.com/2015/09/24/etw-central/
Republicans can end the filibuster in the Senate with a simple majority, after which they can pass the funding bill with a simple majority. But of course they won't do that, since it would open up other bills to be passed with a simple majority in the future too. So it's not factual.
It’s weird to me that any Democrat would be rooting for the nuclear option right now. After losing so badly last year, with demographic trends continuing to degrade their ability to win majorities in either house or to win in most swing states, and with Trump improving his numbers even in blue states, Democrats are currently daring Republicans to eliminate their last-resort option to block anything, even though they should not be confident that the DNC will ever be in a position to benefit from this new power themselves.
I think shutdowns are usually pretty stupid, but I’m not rooting for the “nuclear option” to be the way this one gets resolved.
While I'd prefer to get the first legislative crack at it, I am extremely convinced that the filibuster is poison to our country.
Congress cannot pass anything meaningful except one omnibus spending bill each year via budget reconciliation, which has arcane rules around flat budget impact after 10 years. Since congress can't do anything, we naturally move more and more of the details of federal governance to executive agencies and executive orders. While not all of this is bad, it has a few horrible effects.
First, the supreme court is more willing to interfere with executive action, amplifying its power as it finds reason to protect actions by the favored party and cancel actions by the disfavored party. Increasing the power of the supreme court shifts federal power towards an unelected branch that is the slowest to adjust to changing voter preference.
Second, as more power within the executive gets concentrated specifically with the president, we enable more and more federal action at the whims of exactly one person. This exposes us to the current situation, where Trump wields the executive branch unfettered, rather than merely guiding it while its power stays distributed across the executive branch.
And we've also got the general popular dissatisfaction with congress and the democratic process because they can't get shit done. A country that has an enormously unfavorable opinion of congress is more primed for collapse into anti-democratic governance.
Yes, if we didn't have the filibuster then the GOP could pass all sorts of nightmare legislation. But I'd prefer legislation enabled through the will of the people to this slow collapse into authoritarianism.
> Since congress can't do anything, we naturally move more and more of the details of federal governance to executive agencies and executive orders.
It's really interesting how wildly different this is from parliamentary systems, where Parliament is the ultimate authority and the executive's tenure simply ends if it loses the 'confidence' of a majority in Parliament.
Instead we have a useless legislature as you described, and an executive whose claim to all this additional power is actually quite dubious, yet that executive controls nearly everything. I don't think a swing to the other party changes this, either (not that the DNC is capable of winning enough elections to ever hold the kind of power the GOP now has). I think from now on, the President will rule by executive action, and use creative avenues like rulings from friendly courts to vaguely legitimize this power.
Would the interface nil example be clearer if checking for `nil` didn't use the `==` operator? For example, with a hypothetical `is` operator:
package main

import "fmt"

type I interface{}
type S struct{}

func main() {
    var i I
    var s *S
    fmt.Println(s, i)                       // <nil> <nil>
    fmt.Println(s is nil, i is nil, s == i) // t,t,f: Not confusing anymore?
    i = s
    fmt.Println(s, i)                       // <nil> <nil>
    fmt.Println(s is nil, i is nil, s == i) // t,f,t: Still not confusing?
}
Of course, this means you have to precisely define the semantics of `is` and `==`:
- `is` for interfaces checks both value and interface type.
- `==` for interfaces uses only the value and not the interface type.
- For struct values, `is` and `==` behave identically since there's only a value to check.
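For comparison, here's what today's Go actually does with plain ==; a runnable sketch of the status quo that the hypothetical `is` above is meant to disambiguate:

package main

import "fmt"

type I interface{}
type S struct{}

func main() {
    var i I
    var s *S

    fmt.Println(s == nil, i == nil) // true true: both start out "empty"

    i = s // i now holds a non-nil interface value wrapping a nil *S

    fmt.Println(s == nil, i == nil) // true false: the classic surprise
    fmt.Println(i == (*S)(nil))     // true: dynamic type and value both match
}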
Ever heard of subpixel rendering? You can have pretty sharp text even at 90 ppi if your OS supports it. macOS doesn't, probably because they didn't want the complexity of supporting it throughout their compositing/graphics stack, and also likely because Apple doesn't sell any low-ppi displays.