When I first started using Linux 15 years ago (on Ubuntu), if there was some software you wanted that wasn’t in the distro’s repos, you could probably bet that there was a PPA you could add to your system in order to get it.

It seems that nowadays this is basically dead. Some people provide an AppImage, snap or Flatpak, but these don’t integrate well into the system at all and don’t integrate with the system updater.

I use Spek for audio analysis and yesterday it told me I didn’t have permission to read a file, in a directory that I owned, that I definitely had permission to read. It took me ages to realise it was because Spek was a snap.
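(In case anyone hits the same wall: a strictly confined snap can sometimes be granted file access manually. A rough sketch, assuming the snap is installed as `spek` and actually declares the standard `home` and `removable-media` plugs:)

```shell
# see which interfaces the snap declares and which are connected
snap connections spek

# grant access to non-hidden files under $HOME
sudo snap connect spek:home

# grant access to /media and /mnt (only works if the plug exists)
sudo snap connect spek:removable-media
```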

I get that these new package formats provide all the dependencies an app needs, but PPAs felt more centralised and integrated in terms of system updates and the system itself. Have they just fallen out of favour?

  • jmcs@discuss.tchncs.de
    link
    fedilink
    arrow-up
    118
    arrow-down
    5
    ·
    5 months ago

    Probably because PPAs only work on Ubuntu and there are more Linux distros and even then it meant having to build and test a package for a couple of different Ubuntu versions.

    • Jesus_666@feddit.de
      link
      fedilink
      arrow-up
      49
      ·
      5 months ago

      Also, Ubuntu is moving towards using snaps for everything so they’re pretty much the successor to PPAs.

      • jmcs@discuss.tchncs.de
        link
        fedilink
        arrow-up
        10
        ·
        5 months ago

        Theoretically they can, but in practice it’s less than ideal. And that doesn’t solve all the other distros, or the combinatorial explosion of supporting several distros and versions.

        Flatpaks on the other hand give you a single runtime of your choice to worry about (though they still have lots of cons too).

        • acockworkorange@mander.xyz
          link
          fedilink
          arrow-up
          5
          ·
          5 months ago

          Oh I’m not defending PPAs at all, I’m glad we’ve moved past them, I just thought it was a Debian tech that got boosted by Ubuntu. I see I was in error. Thanks for clarifying!

          • Possibly linux@lemmy.zip
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            1
            ·
            5 months ago

            Debian focuses on stability. They tell you not to add any extra repos ever as it introduces untested software.

            • acockworkorange@mander.xyz
              link
              fedilink
              arrow-up
              2
              ·
              5 months ago

              Encouraging something and disabling something are two different things. They have Flatpak in stable, which is untested software. That’s not why they didn’t use PPAs.

  • BananaTrifleViolin@lemmy.world
    link
    fedilink
    English
    arrow-up
    72
    arrow-down
    5
    ·
    edit-2
    5 months ago

    PPAs are flawed and limited to the Debian/Ubuntu ecosystem. They’re a security issue as you really need to trust the person or group who has set up the PPA (yet many people just added PPAs for all sorts of random software based on a Google search). They need to be maintained, which varies depending on the size of the project, and for developers they’re only a route to supporting part of the entire Linux ecosystem. They can also conflict with the main system-provided packages and repos, which can break entire systems or break upgrades (happened to me on Mint, and I needed to do a complete system reinstall to remove legacy package conflicts).

    They’ve fallen out of fashion and rightly so.

    There are other ways to get software to users. Arch has its AUR, which is basically a huge open repo. OpenSuSE has its OBS, which is also a huge open repo. These are also not without their risks, as it’s hard to curate everything on such an expansive repo. However, others can take over packages if the original developer stops updating them, and you can see how the package was built rather than just downloading binaries, which allays some security concerns. They are also centralised and integrated into the system, while PPAs are a bit of a free-for-all.

    Flatpaks are a popular alternative now - essentially you download and run software which runs in a sandbox with its own dependencies. Flatpaks share their sandboxed dependencies but it does lead to some bloat as you’ll have system level libraries and separate Flatpak versions of the same libraries both installed and running at the same time. However it does mean software can be run on different systems without breaking the whole system if library dependencies don’t match. There are issues around signing though - flathub allows anyone to maintain software rather than insisting on the original devs doing so. That allows software to be in a Flatpak that might otherwise not happen but adds a potential security risk of bad actors packaging software or not keeping up to date. They do now have a verified tick in Flathub to show if a Flatpak is official.

    Snap is the Canonical alternative to Flatpak - it’s controversial as it’s proprietary and arguably more cumbersome. The backend is closed source and in Canonical’s control. Snaps are also different in that they’re for more than just desktop apps and can be used in servers and other software stacks, while Flatpak is focused only on desktop apps. Canonical are also forcing Ubuntu users to use it - for example Firefox only comes as a snap on Ubuntu now. It has similar fundamental issues around bloat. It has mostly the same benefits and issues as Flatpak, although Flatpaks are faster to start up.

    AppImages are another alternative way to distribute software - they are basically an all-in-one image. You are essentially “mounting” the image and running the software inside. It includes all the libraries etc. within the image and uses those instead of the local libraries. It can use some local libraries too; the idea is to include the specific libraries that are unlikely to be on most target systems. So again it has bloat associated with it, and also security risks if the AppImage is running insecure older libraries. An AppImage can be in a sandbox but doesn’t have to be, unlike Flatpak where sandboxing is mandatory - which is a security concern. Also AppImages are standalone and need to be manually updated individually, while Flatpaks and snaps are usually kept up to date via an update system.

    I used to use PPAs when I was still using Ubuntu and Mint. Now I personally use Flatpak, and rarely Appimages, and occasionally apps from the OBS as I’m on OpenSuSE Tumbleweed. I don’t bother with snaps at all - that’s not to say they don’t have value but it’s not for me.

    Edit: in terms of permissions, with Flatpak you can install Flatseal and manage software’s permissions and access per app. You can give software access to more locations including system level folders should you need to or all devices etc for example. I assume you can do the same with snap but I don’t know how.
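    Flatseal is just a front end for `flatpak override`, so the same thing can be done from the CLI. A quick sketch (the app ID `com.example.App` is a placeholder, not a real package):

    ```shell
    # show the permissions the app shipped with
    flatpak info --show-permissions com.example.App

    # grant read/write access to an extra directory, for this user only
    flatpak override --user --filesystem=/mnt/music com.example.App

    # review or undo your overrides
    flatpak override --user --show com.example.App
    flatpak override --user --reset com.example.App
    ```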

    Also you can of course build software from source so it runs natively, if you can’t find it in a repo. I’ve done that a few times - can be fiddly but can also be easy.

    • jayandp@sh.itjust.works
      link
      fedilink
      arrow-up
      7
      ·
      5 months ago

      Another issue I’ve had with Snaps is just increased boot times. Something to do with mounting all the virtual images involved or something, makes boot take noticeably longer. I’ve tested having an Ubuntu install with Snaps, and then removed the snaps and snapd while installing the same software via Flatpak, and had a noticeable boot time improvement. Hopefully they’ve been working to improve this, but it just soured me on them even more.

      As for another install method, mostly for CLI tools, but working with a lot of GUI apps too now, there’s Distrobox. It has a bit of a bloat issue, because you’re basically installing an entire extra headless Linux Distro with it, but it for example allows you to run AUR inside an Arch based Box, and then you can integrate the app you installed with AUR into the host OS, running it near seamlessly, while keeping its dependencies contained in the Box which you can easily remove. By default apps in the Box will have access to the host’s filesystem but you can mitigate this if you want. Distrobox is especially great on atomic read-only Distros, where you can’t directly touch system directories, by allowing you to install apps that expect such access from things like AUR.
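      To make that concrete, the Distrobox workflow described above looks roughly like this (a sketch; it assumes podman or docker is installed, and the package name is just an example):

      ```shell
      # create an Arch container and step inside it
      distrobox create --name arch --image docker.io/library/archlinux:latest
      distrobox enter arch

      # inside the box: install software from the Arch repos (or via an AUR helper)
      sudo pacman -S spek

      # still inside the box: export a launcher for it to the host
      distrobox-export --app spek
      ```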

    • federico3@lemmy.ml
      link
      fedilink
      English
      arrow-up
      4
      ·
      edit-2
      5 months ago

      They do now have a verified tick in Flathub to show if a Flatpak is official

      Jia Tan liked your comment

      Without the traditional distribution workflow, what prevents flatpaks from being full of security issues? Unfortunately sandboxing cannot protect the data you put into the application.

      • Gecko@lemmy.world
        link
        fedilink
        arrow-up
        6
        arrow-down
        1
        ·
        edit-2
        5 months ago

        Jia Tan liked your comment

        Without the traditional distribution workflow […]

        You are aware that the xz exploit made it into Debian Testing and Fedora 40 despite the traditional distribution workflows? Distro maintainers are not a silver bullet when it comes to security. They have to watch hundreds to thousands of packages so having them do security checks for each package is simply not feasible.

        • federico3@lemmy.ml
          link
          fedilink
          English
          arrow-up
          3
          arrow-down
          1
          ·
          5 months ago

          I am well aware of it. It is an example of the traditional distribution workflow preventing a backdoor from landing in Debian Stable and other security-focused distributions. Of course the backdoor could have been spotted sooner, but it could also have gone unnoticed much longer, given its sophistication.

          In the specific case of xz, “Jia Tan” had to spend years of effort gaining trust and then very carefully concealing the backdoor (and still failed to reach Debian Stable and other distributions). Why so much effort? Because many simpler backdoors or vulnerabilities have been spotted sooner. Also, many less popular FOSS projects from unknown or untrusted upstream authors are simply not packaged.

          Contrast that with distributing large “blobs”, be it containers from docker hub or flatpak, snap etc, or statically linked binaries or pulling dependencies straight from upstream repositories (e.g. npm install): any vulnerability or backdoor can reach end users quickly and potentially stay unnoticed for years, as it happened many times.

          There have been various reports and papers published around the topic, for example https://www.securityweek.com/analysis-4-million-docker-images-shows-half-have-critical-vulnerabilities/

          They have to watch hundreds to thousands of packages so having them do security checks for each package is simply not feasible.

          That is what we do and yes, it takes effort, but it is still working better than the alternatives. Making attacks difficult and time consuming is good security.

          If there is anything to learn from the xz attack, it is that both package maintainers and end users should be less lenient in accepting blobs of any kind.

    • Possibly linux@lemmy.zip
      link
      fedilink
      English
      arrow-up
      3
      ·
      5 months ago

      Even if Snap were completely open it would not change anything. Snaps are just far too complex. Flatpaks, on the other hand, are fairly simple because they use Bubblewrap for isolation.

    • Samueru@lemmy.ml
      link
      fedilink
      arrow-up
      3
      arrow-down
      2
      ·
      5 months ago

      AppImages don’t bloat the system; they are often even smaller than native packages thanks to their compression (LibreWolf being 100 MiB instead of 300 MiB, LibreOffice being 300 MiB instead of 600 MiB).

      And those are “lazy” AppImages made with linuxdeploy; if you do some tricks with static linking you can get their size down much further. For example, qBittorrent’s official AppImage is 100 MiB, while a fork called “qBittorrent Enhanced Edition” got its AppImage down to 26 MiB.

      I also don’t know what you mean by security risks with the libraries; the AppImage gets made in CI (usually Ubuntu 20.04 or Debian stable) and the libraries from those distros get bundled and released. The only way this could be a security risk is if the whole AppImage is outdated, or Debian/Ubuntu haven’t caught up on updating their packages.

      My big issue with Flatpak is that they don’t follow the XDG base directory spec, nor do they add the binaries to PATH (and they’ve said they won’t fix those issues, btw), making them only useful for some graphical applications while pulling several gigabytes of runtimes and dependencies. The more I’ve been using and understanding AppImage, the more I think both Flatpak and snap should never have existed, as 99% of what they do could have been done with AppImage already, plus a centralised repo of approved AppImages for security concerns.

    • refalo@programming.dev
      link
      fedilink
      arrow-up
      3
      arrow-down
      3
      ·
      5 months ago

      obvious bot account is obvious.

      snaps are so much worse AND less trustworthy it’s not even funny.

    • corsicanguppy@lemmy.ca
      link
      fedilink
      arrow-up
      3
      arrow-down
      4
      ·
      5 months ago

      Former OS security chief here.

      Please, God, avoid flatpaks, appimages and snaps. They break rules just to break more rules, and you’re the victim.

      • Possibly linux@lemmy.zip
        link
        fedilink
        English
        arrow-up
        2
        ·
        5 months ago

        Flatpaks are always going to be better than just installing random software off the Internet. This is true from both a security and a reliability standpoint. Software inside a Flatpak only has the permissions it needs, which is an example of least privilege. Furthermore, chances are you are getting the software from Flathub, which is more trustworthy than some random repo. If someone tried to do something problematic, such as a fake crypto app, it would likely be caught.

  • Oisteink@feddit.nl
    link
    fedilink
    arrow-up
    66
    arrow-down
    5
    ·
    5 months ago

    A ppa is a repo. It’s Ubuntu stuff, and there’s no reason to work your ass off for Ubuntu for free. They’ll just shit on you and claim that snaps are great (they’re not)

  • lemmyvore@feddit.nl
    link
    fedilink
    English
    arrow-up
    42
    arrow-down
    1
    ·
    5 months ago

    PPAs are a nice idea but a terrible design. They work well as long as they are kept up to date and don’t overwrite distro packages. But in practice, as you’ve noticed, they tend to be abandoned after a while, and they don’t respect the rule of not superseding original packages. Together these two faults lead to terrible consequences: as time passes they corrupt your Debian/Ubuntu dependencies and lead to unsolvable situations. Basically your system reaches a dead end where it cannot upgrade anymore (or only partially, which makes things even worse).

    Aptitude has a very smart and thorough dependency solver that can recover a system from this kind of situation, but it usually involves uprooting most of it by removing and reinstalling a huge number of packages, some of which are essential. It takes a long time, if anything goes wrong you’re screwed, and you may end up with older packages than what you had installed, which may cause your user data for those apps not to be understood anymore, leading to malfunctions or crashes. So yeah, it can be done, but at that point you might as well do a clean reinstall.
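    For completeness, the usual recovery attempt before giving up looks something like this (a sketch; `ppa-purge` is a separate package that downgrades a PPA’s packages back to the archive versions, and the PPA name here is a placeholder):

    ```shell
    # downgrade everything from the offending PPA back to distro versions
    sudo apt install ppa-purge
    sudo ppa-purge ppa:someuser/some-ppa

    # then let aptitude's solver propose removals/downgrades for the rest
    sudo apt install aptitude
    sudo aptitude full-upgrade
    ```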

  • fine_sandy_bottom@discuss.tchncs.de
    link
    fedilink
    arrow-up
    31
    arrow-down
    2
    ·
    5 months ago

    This seems as good a place as any to point out that I just perpetually have problems with flatpaks and snaps. Appimages less so but I wish they were better integrated.

    Yes I understand why devs like these new packages. Yes I think that in the future they will be great. Yes they probably work fine for everyone else. I personally dislike them.

    • Possibly linux@lemmy.zip
      link
      fedilink
      English
      arrow-up
      3
      ·
      5 months ago

      It really depends on what you are using them for. I would avoid snap as it is a mess, but Flatpaks are fairly similar to regular apps. The big difference is that the app configs live under ~/.var/app in your home directory, and Flatpaks use sandboxing.

  • Possibly linux@lemmy.zip
    link
    fedilink
    English
    arrow-up
    24
    arrow-down
    1
    ·
    5 months ago

    Because we have Flatpak. Also, everyone kind of dislikes Ubuntu now, especially the more technical users who historically maintained the PPAs.

  • Fliegenpilzgünni@slrpnk.net
    link
    fedilink
    arrow-up
    22
    arrow-down
    5
    ·
    5 months ago

    Because they’re outdated. They are a lot of work and can cause package conflicts or errors, making the whole system less reliable.

    If you need something, that’s not in your package manager, then use Distrobox and create an Arch container, and use the AUR for example.
    You can export the program after installing, and it integrates better into your system.

    By doing that, the devs have to do the work only once and you will have less problems.

  • NauticalNoodle@lemmy.ml
    link
    fedilink
    arrow-up
    16
    arrow-down
    1
    ·
    edit-2
    5 months ago

    I used to crash my Debian and eventually Ubuntu distros with regularity due to outdated PPAs. It was such a headache, and it’s why I still put my /home directory on a separate partition, just to make a reinstall safe for my personal files. I thought I didn’t like AppImages and their bloat until snap came along. I hated snap so much it convinced me to switch distros again. Now I’m on Pop! and I love Flatpaks by comparison, and now think AppImages are alright…

    It’s 10+ years later and I still irrationally worry about crashing my system due to outdated & conflicting source dependencies. In hindsight the problems with PPAs clearly had a lasting impact on me.

  • Smoolak@lemmy.world
    link
    fedilink
    arrow-up
    21
    arrow-down
    6
    ·
    5 months ago

    Most people I know just use Arch Linux and the AUR. It seems to be the easiest system around for maximal package support and it’s well maintained.

    • christophski@feddit.ukOP
      link
      fedilink
      English
      arrow-up
      10
      ·
      5 months ago

      As someone that prefers the repo method to the all-in-one package method, Arch is becoming more and more appealing

      • lemmyvore@feddit.nl
        link
        fedilink
        English
        arrow-up
        18
        arrow-down
        1
        ·
        5 months ago

        Please keep in mind though that the reason the AUR tends to work well is because it’s a very loose wrapper over source packages. Compiling from source is a very flexible process which adapts well to system changes. But at the same time it’s resource-consuming (CPU time and RAM).

        Most importantly, the AUR is completely unsupported by any of the Arch-based distros, including Arch itself. Anybody who mentions “AUR compatibility” either doesn’t know what they’re saying or is making a tongue-in-cheek observation about how their system happens to be coping well with the very specific selection of AUR packages they are using at that particular moment. But there’s absolutely no guarantee that your system will do well with whatever AUR packages you attempt to use, or that they’ll keep working a month or a year from now.

        • Dempf@lemmy.zip
          link
          fedilink
          arrow-up
          4
          ·
          5 months ago

          AUR tends to work really well for me. There are binary packages for almost every software that I use. Things do go wrong occasionally, but when they do it’s almost always solvable. AUR packages are just scripts, so you can go and fix the problem yourself and then tell the maintainer how you did it.
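          For anyone who hasn’t looked inside one: an AUR package is essentially just a PKGBUILD shell script that pacman’s tooling runs. A minimal hypothetical example (the package name, URL and layout are placeholders):

          ```shell
          # PKGBUILD -- build recipe for a made-up 'hellotool'
          pkgname=hellotool
          pkgver=1.0.0
          pkgrel=1
          pkgdesc="Example CLI tool"
          arch=('x86_64')
          url="https://example.org/hellotool"
          license=('MIT')
          depends=('glibc')
          source=("$url/hellotool-$pkgver.tar.gz")
          sha256sums=('SKIP')

          build() {
            cd "hellotool-$pkgver"
            make
          }

          package() {
            cd "hellotool-$pkgver"
            make DESTDIR="$pkgdir" PREFIX=/usr install
          }
          ```

          You build and install it with `makepkg -si` from the directory containing the PKGBUILD, which is exactly what AUR helpers automate.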

          • lemmyvore@feddit.nl
            link
            fedilink
            English
            arrow-up
            4
            ·
            5 months ago

            Arch and Gentoo basically approach this issue from the opposite sides. Gentoo is source-first with optional binaries as-needed, Arch is binary first with optional source as-needed. Gentoo also tends to support the exceptions (the binaries) much better than Arch supports AUR (which is not at all).

        • Smoolak@lemmy.world
          link
          fedilink
          arrow-up
          1
          ·
          5 months ago

          On the contrary, the AUR seems to have a lot more binary packages than source packages in my experience. Tons of packages also have a “-bin” version (e.g. yay).

          Your “unsupported” comment is a bit weird. It’s the AUR user community that supports Arch and makes the AUR compatible with it. I don’t know why somebody would contemplate the other way around. I mean, it’s the whole philosophy of the AUR.

          I’ve been using it for the past 12 years and I’ve rarely had any issues with it. I think you’re fearmongering quite a bit. Sure, you run into some abandoned packages from time to time, and once in a blue moon you get a dependency that doesn’t install properly. When that happens you post a comment on the AUR or flag the package, and it’s solved in a matter of days most of the time. It’s surprising that such a system would work so well, but it does.

  • Mactan@lemmy.ml
    link
    fedilink
    arrow-up
    12
    arrow-down
    1
    ·
    5 months ago

    Whenever somebody brings up some terribly ancient Debian/Ubuntu distro with outdated packages, we end up having them use a .deb instead, since the PPA is long gone, and it’s been fine. Wild that they’re often stuck on 4-year-old packages though.

  • MentalEdge@sopuli.xyz
    link
    fedilink
    arrow-up
    13
    arrow-down
    2
    ·
    5 months ago

    Because they only work on one distro/package manager.

    Software distribution is simply transitioning to working in a distro-agnostic way. It’s only a matter of time until distros start updating flatpaks along with system packages. Many already do.

    And some apps distributed as appimages self-update. (RPCS3 for example)

    Not to mention that Ubuntu itself has basically ditched apt for snap.

    • nelov@feddit.de
      link
      fedilink
      arrow-up
      10
      arrow-down
      3
      ·
      5 months ago

      PPAs are the reason why I stopped using Debian-based distros about 8 years ago.

      For me, those have been the primary source of pain and anger. Back then, almost every dude had a PPA. Keeping track was hard. Not only that, but often those were full of other unrelated software or libs. The outcome was broken systems left and right.

    • 0x0@programming.dev
      link
      fedilink
      arrow-up
      4
      arrow-down
      1
      ·
      5 months ago

      Distributing software is simply transitioning to work in a distro-agnostic way. It’s only a matter of time until distros start updating flatpaks along with system packages. Many already do.

      I guess Canonical, being money-driven, wants to cut costs, so reducing the number of packagers is a viable way. So what if many packages ship the same lib? It’s all isolated and drive space is not an issue, right?

        • 0x0@programming.dev
          link
          fedilink
          arrow-up
          2
          ·
          5 months ago

          It is, but snap helps Canonical become the walled garden it wants to be, so let’s bitch about how troublesome it is to do packages for all architectures omg what a downer…

  • iopq@lemmy.world
    link
    fedilink
    arrow-up
    11
    arrow-down
    1
    ·
    5 months ago

    I used PPAs and then noticed the updates took forever and started failing as those PPAs no longer existed. I switched to NixOS to eliminate having to deal with this. NixOS packages integrate perfectly with your system and you can install almost anything you need, even networking software and other things that need root. For everything else you can package it yourself, and nixpkgs will accept your pull requests.
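    As a taste of the workflow (a sketch; it assumes `spek` is in nixpkgs, which it was last time I checked):

    ```shell
    # try a package ad hoc, without installing it permanently
    nix-shell -p spek --run spek

    # or declare it once in /etc/nixos/configuration.nix and rebuild:
    #   environment.systemPackages = with pkgs; [ spek ];
    sudo nixos-rebuild switch
    ```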

    • Possibly linux@lemmy.zip
      link
      fedilink
      English
      arrow-up
      1
      ·
      5 months ago

      Nix is also a pain in the ass in some ways. Also it doesn’t seem to care about licensing.

      I’ll just stick to Debian, Fedora and Linux Mint.

      • iopq@lemmy.world
        link
        fedilink
        arrow-up
        1
        ·
        5 months ago

        It literally doesn’t install non-free software until you manually configure it to do so. What do you mean by not caring about licensing?

      • Tony Bark@pawb.social
        link
        fedilink
        English
        arrow-up
        2
        ·
        5 months ago

        While they aren’t perfect, it’s certainly better than waiting on the distro or dealing with potential package conflicts that PPAs also had a habit of causing.