Thoughts
Also, I didn't post about this, but there was a comment a couple of days ago claiming that interpreted languages have only linear performance costs.
(Implying that compiled languages aren't worth using because they don't provide faster-than-linear speedups.)
Which is super interesting, because it's so obviously wrong: the "linear" (constant-factor) difference between Ruby and Zig is very frequently around 1,000x, and it turns out that even a 10x constant factor is significant in real-world applications.
But there's also an argument that it's insightful, because Ruby tends to be used in applications where constant-factor performance genuinely doesn't matter.
I used to be in the "linear performance doesn't matter" camp, but I didn't understand just how significant the performance differences can be.
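To make "linear cost" concrete, here's a minimal Ruby sketch (a hypothetical hot loop, not anything from the original comment): the work is O(n) in any language, interpreted or compiled; what a compiler changes is the constant cost per iteration, not the shape of the curve.

```ruby
require 'benchmark'

# Hypothetical hot loop: sum of squares from 0 to n-1.
# The algorithm is O(n) whether it runs in Ruby or Zig;
# an interpreter just pays a larger constant per iteration.
def sum_of_squares(n)
  total = 0
  i = 0
  while i < n
    total += i * i
    i += 1
  end
  total
end

n = 1_000_000
seconds = Benchmark.realtime { sum_of_squares(n) }
puts format("n=%d took %.4f s", n, seconds)
```

Doubling n roughly doubles the time in both languages; the 10x–1,000x gap lives entirely in the per-iteration constant, which is exactly why "it's only linear" undersells the difference.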