• Kissaki · 5 days ago

    I don’t think they mention maintenance burden specifically. Using a framework with packages means you have to track upgrades, do upgrades, check release notes, breaking changes, support and end-of-life cycles, license changes, etc. It’s a real maintenance burden if you keep the site live, even if you don’t intend to make any changes.

    Vanilla doesn’t have this problem. Server-side has it too, but in a slightly different flavor.

    The heavier and more integrated the framework, and the more additional packages you include, the heavier the burden.

  • kalkulat@lemmy.world · 8 days ago

    I can’t speak for the performance of frameworks, but today’s ‘vanilla’ javascript both compiles and executes blisteringly fast. The better-optimized your code is, the faster it runs, and staying away from modern syntactic sugar may also help.

    I can understand that those who choose to use frameworks may be newcomers, or may have productivity pressures. Neither source of slowdowns can be blamed on JS itself.

    But this headline distorts that reality by leaving out the fact that the article itself is about using JS frameworks. Whatever the ‘long-term performance goals’ may be, writing the fastest code can’t hurt them. Suggesting otherwise is a disservice to JS.

    • dan@upvote.au · 7 days ago

      Sites that only use vanilla JS can still be heavy, too. I think the underlying message is to do the heavy processing on the server side, and keep the client side relatively light.

      This is also why frameworks/libraries like Astro and htmx are becoming more popular. Both of them focus on having minimal frontend JS. htmx effectively reverts back to a pattern that was somewhat common 20 years ago - a small amount of reusable JavaScript to handle common use cases, that hits the server and inserts its response HTML somewhere on the page. I was a web developer back then so it’s been interesting to see old patterns come back.
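      That old “fetch a fragment, insert it” pattern can be sketched in a few lines of vanilla JS (the names here are illustrative, not htmx’s actual API; `fetchFn` is injectable only to make the sketch easy to test):

```javascript
// Hypothetical sketch of the 20-year-old pattern htmx formalizes:
// request server-rendered HTML and drop it into the page, with no
// client-side templating at all.
async function loadFragment(url, target, fetchFn = fetch) {
  const res = await fetchFn(url);
  if (!res.ok) throw new Error(`request failed: ${res.status}`);
  target.innerHTML = await res.text(); // insert the response HTML as-is
}
```

      In a browser you’d call something like `loadFragment('/cart', document.querySelector('#cart'))` from a click handler.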

      • kalkulat@lemmy.world · 5 days ago (edited)

        That may not be the wisest choice. For one thing, the client side can be as fast or faster than the ‘server’ side. Sending data to a ‘server’ which is ‘serving’ as a mainframe (reverting to a model from the past) will consume more time and bandwidth. So much for the performance goals.

        That also has the potential to create security concerns at both ends (which were not as intense in those bygone days). Furthermore, why should the client trust the quality of the code the ‘server’ uses? The popularity of the ‘new’ frameworks aside, the cost of bandwidth and processing by the ‘server’ will be borne by the ‘client’. I’m not seeing any pluses except for thin clients, and big potential pluses for the ‘server’.

        • dan@upvote.au · 4 days ago

          the client side can be as fast or faster than the ‘server’ side.

          That’s not the case on a lot of JS-heavy sites, though. A lot of logic runs on the main thread, which slows things down. The only way to run things off the main thread is by using web workers, but that adds extra serialization/deserialization overhead.

          That also has the potential to create security concerns at both ends

          Generally, the more logic you have on the client-side, the more likely you are to have security issues relating to untrusted input or behaviour. The client is a completely untrusted environment (since a user can do whatever they want with your JS code), and increasing the amount of logic on the client side increases your attack surface there. Any code on the server-side can be trusted, since you wrote it and users can’t modify its behaviour.
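          That’s the classic “re-validate on the server no matter what the client did” rule. A made-up example (the function and its limits are purely illustrative):

```javascript
// Illustrative only: even if the client-side form already enforced
// this, the server must re-check, because a user can bypass or modify
// the client's JS and send arbitrary values.
function validateQuantity(input) {
  const n = Number(input);
  if (!Number.isInteger(n) || n < 1 || n > 100) {
    throw new RangeError('quantity must be an integer from 1 to 100');
  }
  return n;
}
```

          The client-side copy of a check like this is purely a UX nicety; only the server-side copy is a security control.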