• Geodad@lemm.ee
    link
    fedilink
    English
    arrow-up
    1
    ·
    6 hours ago

    We’ve already seen a few crotchety old-timers who don’t want to add Rust to the Linux kernel.

    At some point, a complete rewrite of the kernel in a memory safe language would be prudent. Unfortunately, the dinosaurs will have to go extinct before that can happen.

  • ZILtoid1991@lemmy.world
    link
    fedilink
    arrow-up
    8
    ·
    1 day ago

    I do not think C is going to completely go away. If nothing more, it will be used as an ABI, to glue various other languages together.

    On the other hand, C is going to fade out, not just because of memory-safety issues, but also due to “language jank”: design choices that made sense on 60s and 70s mainframes but are no longer needed. Later languages tried to rectify them in their “C-influenced” syntax, but had the drawback of also being much higher-level than C.

    Also, Rust is just the most hyped replacement for C; depending on your use case, other languages might be much better. D has a syntax very close to C without the jank, especially when used in its betterC mode.

    • FizzyOrange@programming.dev
      link
      fedilink
      arrow-up
      7
      ·
      edit-2
      1 day ago

      Well COBOL hasn’t completely gone away… I don’t think anyone expects C to become completely extinct; just very legacy.

      D missed its chance. Zig is clearly going to be the successor to C, for people who don’t want to use Rust.

  • Troy@lemmy.ca
    link
    fedilink
    arrow-up
    22
    arrow-down
    6
    ·
    2 days ago

    No.

    C is going to be around and useful long after COBOL is collecting dust. Too many core things are built with C. The Linux kernel, the CPython interpreter, etc. Making C go away will require major rewrites of projects that have millions upon millions of hours of development.

    Even Fortran has a huge installed base (compared to COBOL) and is still actively used for development. Sometimes the right tool for a job is an old tool, because it is so well refined for a specific task.

    Forth anyone?

    The rewrite-it-in-rust gang arrives in 3, 2 …

    • deathmetal27@lemmy.world
      link
      fedilink
      arrow-up
      7
      arrow-down
      3
      ·
      2 days ago

      People tend to be obsessed with bleeding edge technology. But those who truly understand know that “bleeding edge” is an anti-pattern and there’s a reason it’s called that: it can bleed you as well.

      If it ain’t broken, don’t fix it.

      • bamboo@lemm.ee
        link
        fedilink
        arrow-up
        8
        arrow-down
        1
        ·
        2 days ago

        If it ain’t broken, don’t fix it.

        That’s the thing, it is broken and there is a fix desperately needed. C lacks memory safety, which is responsible for many, many security vulnerabilities. And they’re entirely avoidable.

      • Troy@lemmy.ca
        link
        fedilink
        arrow-up
        1
        ·
        1 day ago

        I agree. And those decades of development come with huge advantages. Libraries. Patterns. Textbooks! Billions of lines of code you can cross reference and learn from!

        It’s fun to bleed a little when you are tinkering. It’s not fun to have to reinvent the wheel because you chose a language that doesn’t have an existing ecosystem. That becomes a chicken-and-egg problem. The tinkerers fill this role (building out the ecosystem) and also tend to advocate for their tinkering language of choice, but there needs to be a real critical mass.

        It takes decades to shift an entrenched ecosystem. Check in ten years whether the following exist in languages other than C/C++: an enterprise-grade database, a Python(/etc.) interpreter that isn’t marked experimental, an OS kernel that is used somewhere real, an embedded manufacturer that ships the language as a first-class citizen, an AAA game using it under the engine…

        Like, in the last 15 years, I’m only aware of a single AAA game that used a memory safe language – Neverwinter Nights 2 used C# for part of the Electron Engine…

        Rust is the most likely candidate here, although you see things like Erlang being used to build some databases (CouchDB). People see Rust being used in some real infrastructure projects that gain actual traction (Polars comes to mind). Polars is an interesting use case, though: it’s simply better than the other projects in its particular space, so people are switching to it, not because it is written in Rust at all… And honestly, that’s probably the only way this happens.

    • atzanteol@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      4
      ·
      2 days ago

      Making C go away will require major rewrites of projects that have millions upon millions of hours of development.

      Yep. And it’ll be done. Yes, it’ll take a while, but this is what it means for C to become like COBOL (which also still exists). The more it can be marginalized, the better off we’ll all be security-wise.

      The rewrite-it-in-rust gang arrives in 3, 2 …

      Cattle not pets. They’re just computer languages.

    • cm0002@lemmy.worldOP
      link
      fedilink
      arrow-up
      9
      arrow-down
      13
      ·
      2 days ago

      “It’s too much work, so let’s just not do anything and stubbornly stick with a problematic unsafe language that fewer and fewer people are willing to learn”

      • Troy@lemmy.ca
        link
        fedilink
        arrow-up
        9
        ·
        2 days ago

        Certainly, if I had said that.

        It’s like the Brits trying to convince everyone else to switch to their electrical socket. Sure, the design is better for higher voltage and current, has all these extra safety features, etc. But you cannot dramatically shift an entrenched ecosystem for free.

        • cm0002@lemmy.worldOP
          link
          fedilink
          arrow-up
          1
          arrow-down
          2
          ·
          2 days ago

          Yea, mb, on reread yea.

          But still, nothing new should be written in it, and everything old should be rewritten or deprecated over time. Entrenched and around, yes; useful… no.

          There’s very little benefit to starting something new in C and a whole lot of downsides. At least FORTRAN and COBOL have niche use cases. C doesn’t really have a good niche that something newer and more secure can’t fill, AFAIK.

          • Janovich@lemmy.world
            link
            fedilink
            arrow-up
            5
            ·
            2 days ago

            The problem is switching for enterprises because of how much momentum there is. Especially in embedded.

            I worked on a 30-year-old C code base that’s still being developed now for future products. Some components are literally 20+ years old and mostly untouched. Sure, they could switch to Rust or something, but they’re fucked, since nearly none of the staff have relevant experience in anything but the in-house C build system, and changing over many thousands of C files to another language will literally take years even if you got people trained up.

            Plus, in embedded pretty much no big HW supplier provides BSPs or drivers in anything but C. If NXP etc. aren’t giving you anything but C, management doesn’t want to start combining languages.

            I advocated for Rust when we started a ground-up new project, but got shot down every which way. Only those younger than like 35 were into the idea. Old managers are scared of anything new and their whole life has been C. I don’t know how you convince those kinds of people and maybe we’ll get some movement in another 10 years but enterprises are a slow cautious mess.

  • tunetardis@lemmy.ca
    link
    fedilink
    English
    arrow-up
    10
    ·
    2 days ago

    My wife was telling me at her work they’re desperate for cobol programmers, as they’re all retiring boomers leaving behind a giant code base. At my work, it’s legacy fortran that’s all over the place, but we’re a much smaller company.

    • mesamune@lemmy.world
      link
      fedilink
      English
      arrow-up
      5
      ·
      edit-2
      2 days ago

      I know COBOL; it’s pretty easy to pick up, honestly. It’s just the 40+ years of context and the “WHY”s behind what they did that are hard.

      With how many people are looking for COBOL I really should just make a resume specific to it, put salary * 1.2 and see if any remote positions pop up. Love my job, but with the latest news…

        • mesamune@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          ·
          2 days ago

          I work in a field that is 1. in the news every day and 2. seeing HEAVY changes. I’m sure you can figure it out ;).

          They can’t fire me, but I might be by accident haha.

          • tunetardis@lemmy.ca
            link
            fedilink
            English
            arrow-up
            2
            ·
            2 days ago

            Ouch. Well if you wanna take your chances in Canada, definitely advertise your senior COBOL dev skills! ;)

    • IllNess@infosec.pub
      link
      fedilink
      arrow-up
      2
      ·
      2 days ago

      Can I take a guess? If not, please ignore the next sentence.

      Finance companies whose “database” is a gigantic flat file?

      • tunetardis@lemmy.ca
        link
        fedilink
        English
        arrow-up
        4
        ·
        2 days ago

        Lol yeah she’s in insurance! I bet you could probably also infer from the fortran that I work for a science-y outfit.

  • pycorax@lemmy.world
    link
    fedilink
    arrow-up
    3
    ·
    2 days ago

    That’s a rather misleading headline; it’s quite different from the article, which is more about adopting good practices.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social
    link
    fedilink
    English
    arrow-up
    4
    ·
    edit-2
    2 days ago

    Serious question: how can a programming language be more or less secure than another? I am just a hobbyist, not a professional, so I am genuinely curious.

    My dad, who is a software engineer, can’t even answer my question. But then, he’s old, and I’ve only seen this argument coming from the young bloods.

    • onlinepersona@programming.dev
      link
      fedilink
      English
      arrow-up
      1
      ·
      23 hours ago

      It’s about memory management.

      In programming terms: allocated memory has to contain the data you expect in order for your program to work. Unsafe languages leave that to you: you manually ensure it’s good, mostly at runtime, or you just assume the data is valid, write code that looks valid, and have somebody check it before the program runs, or some mix thereof. In all cases, it requires a lot of human intervention, and because humans are fallible and have different skill levels, this fails quite often.

      Safe languages either are built on top of battle-tested unsafe languages and do lots of runtime checks behind the scenes (interpreted languages like Python, Ruby, JavaScript, etc.), or they check actions at compile time, like Rust. Rust will tell you that the memory you’re trying to access could be modified by another part of the code, which might make unexpected changes, and that certain conditions have to be met before you can access it.

      In layman’s terms: imagine you work at a storage facility (memory) and have to store and retrieve packages. To know where to store and retrieve them, you have a piece of paper with the aisle, shelf, rack, and position on the rack. That’s your pointer. To store something, you make space on a rack, put the item there, write down the name of the item (variable) and its location (memory address) on a piece of paper, and keep it on you.

      Imagine keeping all of that in order. You have to make sure you don’t write down the wrong location (an off-by-one error), don’t keep a piece of paper after its item is gone (a dangling reference), don’t throw away the piece of paper without removing the item (a memory leak), don’t follow a piece of paper without checking that what you expect to be there actually is, and so many other things.
      Those are the mistakes unsafe languages allow you to make.
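      In C, those same mistakes look something like this. A minimal sketch: the hazardous lines are left as comments, since actually executing them is undefined behavior.

      ```c
      #include <stdio.h>
      #include <stdlib.h>

      int main(void) {
          int *rack = malloc(5 * sizeof *rack);   /* reserve space for 5 items */
          if (!rack)
              return 1;

          for (int i = 0; i < 5; i++)             /* correct bound: i < 5 */
              rack[i] = i;
          /* rack[5] = 5;             off-by-one: writes past the allocation */

          free(rack);                             /* the paper slip is now invalid */
          /* printf("%d\n", rack[0]);  dangling reference: use after free */
          rack = NULL;                            /* defensive: cannot dangle now */

          /* rack = malloc(5 * sizeof *rack); rack = NULL;
             losing the pointer without free(): a memory leak */

          puts("no errors, because every step was checked by hand");
          return 0;
      }
      ```

      C happily compiles each of the commented-out lines; nothing in the language enforces the manual discipline shown here.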

      Safe languages either enforce that you check things before doing certain operations (runtime checks), or require that before you even start, you plan out what you’re going to do, and that plan is checked (compile-time checks).

      The chaotic storage facilities are what most of our world runs on at the moment, and there are a whole lot of people who love them because they’re simple and familiar. “Just tell the intern to get that box there, I made sure it’ll be fine. Trust me, I’ve been doing it this way for years.” Meanwhile, somebody gets the wrong medicine because a piece of paper said another one was supposed to be on the shelf. There are also a bunch of people who have thought about ways to improve this, implemented them, tested them, and are using them to manage their storage facilities.

      Anti Commercial-AI license

    • solrize@lemmy.world
      link
      fedilink
      arrow-up
      9
      ·
      edit-2
      2 days ago

      Concrete technical answer (one of many): imagine you have a list (“array”) of 5 numbers, and you try to print the 10th number in the array. A secure language will say “error! it’s a list of 5 numbers, there is no 10th one!!”. C will instead print some random garbage (whatever happens to be in the part of memory following the 5-element list), or maybe do something even crazier (try searching “nasal demons”), without indicating that anything has gone wrong. There are many other issues like this with C. You end up with programs going completely into the weeds, turning control over to attackers, etc.
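      A minimal C sketch of that exact scenario: the raw out-of-bounds access compiles without complaint, so any bounds check has to be hand-rolled (the `get_checked` helper here is hypothetical, not a standard API).

      ```c
      #include <stdbool.h>
      #include <stdio.h>

      /* hand-rolled bounds check: report an error instead of reading garbage */
      static bool get_checked(const int *a, size_t len, size_t i, int *out) {
          if (i >= len)
              return false;   /* "there is no 10th one!" */
          *out = a[i];
          return true;
      }

      int main(void) {
          int nums[5] = {10, 20, 30, 40, 50};
          int v;

          if (get_checked(nums, 5, 9, &v))
              printf("%d\n", v);
          else
              puts("error: index 9 out of bounds for length 5");

          /* printf("%d\n", nums[9]);  compiles without complaint in C;
             prints whatever garbage follows the array (undefined behavior) */
          return 0;
      }
      ```

      A bounds-checked language performs the equivalent of `get_checked` on every single access, automatically.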

      Abstract philosophical answer: Secure languages like Ada and (hopefully) Rust are designed to help you ensure the absence of unwanted behaviours, rather than just the presence of wanted ones. If you want behaviour X, the goal of old languages like C was to make sure you could write a program in which X was present. That was a big enough challenge in the old days that language designers stopped once they reached that point. If you don’t want behaviour Y (let’s say Y is a security attack), it’s up to you to just write the program without behaviour Y. 50+ years of experience have shown that to be inhumanly difficult once the program gets complicated, so you really do need help from the language. Accountants invented double-entry bookkeeping 700 years ago for similar sorts of reasons: to keep small errors in complicated systems from sending the system into a nose dive.

      Ensuring the absence of behaviours is the classic problem of proving a negative, so there are limits on how thorough the checking can be, and the technical features (like the notorious Rust borrow checker) can be difficult to use. But if you’re willing to endure a certain amount of pain and runtime inefficiency (requiring the program to do a little extra work at each operation to make sure the result makes sense, like the example of the 10th element of the 5-element list), you can make programs much safer than you can in C.

      Does that help?

      Added: Rust is getting some flak because it is pretty new, is still a work in progress, has various unmet goals, etc. It’s not fully baked yet but it is getting there (I’m studying it right now). Ada is an older language that is way more mature than Rust, but is more of a pain to use in many ways, so Rust is currently getting more attention.

    • atzanteol@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      6
      ·
      edit-2
      2 days ago

      It’s mostly about memory access. Modern languages throw errors if, for example, you try to reference an element of an array that is “outside the bounds” of the array. C does not: it gladly reads whatever sits at the memory address past the end of the array. So the programmer has to check that the index satisfies 0 <= x < array_size whenever they access an array entry. That’s a pain, so they don’t.
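      That manual guard looks something like this in practice (a minimal sketch; assume `x` arrives from user input or a file rather than being hard-coded):

      ```c
      #include <stdio.h>

      #define ARRAY_SIZE 5

      int main(void) {
          int data[ARRAY_SIZE] = {1, 2, 3, 4, 5};
          int x = 7;   /* index from user input, a file, the network, etc. */

          /* the check the programmer has to remember every single time */
          if (0 <= x && x < ARRAY_SIZE)
              printf("data[%d] = %d\n", x, data[x]);
          else
              printf("rejected index %d\n", x);

          return 0;
      }
      ```

      Forget this guard once, anywhere in the program, and C will silently read (or write) past the array.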

    • 3h5Hne7t1K@lemmy.world
      link
      fedilink
      arrow-up
      3
      ·
      2 days ago

      It’s about the kinds of operations the compiler allows you to do, more or less. Like sharing mutable references that can be independently changed in a hard-to-keep-track-of manner. Other things the compiler tries to eliminate include buffer overruns, int overflows, etc.
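      For the int-overflow case, C again leaves the check to the programmer. A sketch using `__builtin_add_overflow`, which is a GCC/Clang extension rather than standard C:

      ```c
      #include <limits.h>
      #include <stdio.h>

      int main(void) {
          int a = INT_MAX, b = 1, sum;

          /* signed overflow is undefined behavior in C,
             so the test has to happen before it occurs */
          if (__builtin_add_overflow(a, b, &sum))
              puts("overflow detected, result discarded");
          else
              printf("sum = %d\n", sum);

          return 0;
      }
      ```

      A checking compiler can insert the equivalent of this test (or prove it unnecessary) on every arithmetic operation; in C you write it yourself, or the overflow silently happens.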

      Rust, for example, sometimes makes trivial things a royal pain; see linked lists. It also has an ecosystem prone to micro-dependency/supply-chain attacks, and the compiler interface is not stable (AFAIK this caused some issues in Linux). There is also no spec.

      I have experience with both, and I love both, but C is my favourite. It’s often trivial to imagine the codegen with C, and there is no shortage of quality compilers. The language is also small enough that implementing a compiler is actually feasible.

  • Irdial@lemmy.sdf.org
    link
    fedilink
    arrow-up
    4
    arrow-down
    1
    ·
    2 days ago

    It’s great that there are new languages coming along that strike a balance between performance and safety. However, there’s always going to be a need for unsafe, low-level code. I work in semiconductors and occasionally have to write firmware and drivers for our devices. There’s no avoiding C in those environments.

    • bamboo@lemm.ee
      link
      fedilink
      arrow-up
      1
      ·
      2 days ago

      Unsafe rust has proven that it can be an effective alternative here, ideal especially when the consumers are also rust.