tl;dr: he says “x86 took over the server market” because it was the same architecture developers at companies already had on their own machines, which made it very easy to develop applications locally and then ship them to the servers.

Now these, among other points he made, are very good arguments for how and why it is hard for ARM to go mainstream in the datacenter. However, I also feel like he kind of lost touch with reality on this one…

He’s comparing two very different situations, or more specifically, two very different eras. Developers aren’t tied to the underlying hardware the way they used to be. The software development market has evolved from C to very high-level languages such as JavaScript/TypeScript, and the majority of new software is, or will be, written in those languages, so the CPU architecture becomes irrelevant.

Obviously, very big companies such as Google, Microsoft and Amazon are more than happy to pay the little “tax” of making sure JavaScript runs fine on ARM rather than keep paying the big bucks they pay for x86…

What are your thoughts?

  • Skull giver@popplesburger.hilciferous.nl · 1 year ago

    Have you used ARM servers? They’re a massive pain to work with because they just need that one little extra step every time. Oops, this Docker image doesn’t do aarch64, gotta build it yourself. Oops, this package isn’t available, gotta compile it yourself. Oops, this tool doesn’t work, gotta find an alternative or run it through the much slower qemu layer.
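
    To make the “oops” concrete: a lot of tooling picks a prebuilt native binary keyed on the CPU architecture, and aarch64 is often simply missing from the published list. A minimal sketch of that failure mode in Java (the `PickBinary` class and the artifact names are hypothetical, just for illustration):

    ```java
    import java.util.Map;

    // Hypothetical sketch: a tool that selects a prebuilt binary by CPU
    // architecture. If only x64 builds were ever published, ARM users get
    // the "build it yourself or fall back to qemu" extra step.
    public class PickBinary {
        // Hypothetical artifact names; note there is no aarch64 entry.
        static final Map<String, String> ARTIFACTS = Map.of(
            "amd64",  "tool-linux-amd64.tar.gz",
            "x86_64", "tool-linux-amd64.tar.gz"
        );

        public static void main(String[] args) {
            String arch = System.getProperty("os.arch"); // "aarch64" on ARM servers
            String artifact = ARTIFACTS.get(arch);
            if (artifact == null) {
                // This is the extra step ARM users keep running into.
                System.err.println("No prebuilt binary for " + arch
                    + "; compile from source or run through qemu.");
                System.exit(1);
            }
            System.out.println("Downloading " + artifact);
        }
    }
    ```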

    The M1 was the first usable ARM development machine for the mainstream, and at launch it was plagued with tons of “how do I develop on this” problems. Apple provided x64 compatibility as a workaround because basically every piece of software you’d want to run was built for another platform. Things are moving forward, but I haven’t heard of any companies announcing how their lives improved by switching to Graviton. Maybe if Apple released a 200-core M2 server it would start to make sense to use ARM, but knowing Apple they’d probably force you to run macOS.

    Linux was released in 1991, not 1960. There were tons of programming languages out there. BASIC ran on basically anything, as did C++. Pascal and Fortran are still used to write high-demand applications to this day. Nobody was stuck with C.

    Also, when you actually need performance, JavaScript needs to go. Java and .NET have the same cross-platform advantages at much higher speeds. When those become too slow for you (not that hard, they both have huge overhead), you get into the realm of C++ and Rust. After that, you can go one step further and write your code in C or Fortran (Fortran is especially good at number crunching, beating C at many tasks).

    For a while, developers were stuck compiling stuff for their servers. Then Java came out. Java did what you say JavaScript does: write once, run anywhere. Since the late nineties, server architecture hasn’t strictly mattered. You can take most .jar files and serve them from your server, your Power9 box, or your Android phone; it’ll all just work after downloading a runtime.
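
    That portability is easy to demonstrate. A minimal sketch (the `WhereAmI` class name is mine): compile this once, package it into a .jar, and the same bytecode runs on any box with a JVM; only the runtime itself is architecture-specific.

    ```java
    // Compile once (javac WhereAmI.java), then run the same .class/.jar
    // anywhere a JVM exists; the bytecode never mentions the CPU architecture.
    public class WhereAmI {
        public static void main(String[] args) {
            System.out.println("arch: " + System.getProperty("os.arch")); // e.g. amd64, aarch64
            System.out.println("os:   " + System.getProperty("os.name"));
        }
    }
    ```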

    Nothing changed, really. The minority of developers running on ARM will usually still deploy to amd64. Unlike in the past, ARM cores on the desktop are faster than ARM cores on the server, so matching your dev machine’s architecture buys you nothing; there’s no benefit to running ARM servers. Running slow software like PHP and JavaScript becomes especially problematic on slower hardware, so for those cross-platform runtimes you’re still better off running on amd64. That’s part of the reason companies like Oracle are handing out free ARM VPS products with tons of RAM: to convince people to try their ARM product for real.

    Maybe Graviton will take off, who knows. People said the same thing about Power9 and they’re saying great stuff about RISC-V too. For now, I don’t see much change.