🪿

  • Natanox@discuss.tchncs.de · 4 days ago

    Puritans on Linux are a real menace. Every time someone calls an OS install image of 3-4 GB “bloated” I want to scream uncontrollably. Not statically linking stuff is part of this cultural issue.

    Flatpak might solve these issues in the long run. Of course the same people hate it for exactly that reason, because it’s “bloated” and “convoluted”.

    <rant> How dare we have different versions of the same lib! Where will we end up, like MS Windows? Where I can boot up apps as old as myself? Outrageous! Not my precious mebibytes! </rant>

    • qqq@lemmy.world · 2 days ago

      But you can do that: Linux provides a ton of ways to use different versions of the same lib. The distro is there to provide a solid foundation, not be the base for every single thing you want to run. The idea is you get a core usable operating system and then do whatever you want on top of that.

    • Delilah (She/Her)@lemmy.blahaj.zone · 3 days ago

      The core principle of GNU, from which every other principle is derived, is “I shouldn’t need an ancient unmaintained printer driver that only works on Windows 95 to use my god damned printer. I should have the source code so I can adapt it to work with my smart toaster.”

      If an app is open source, I’ve almost never encountered a situation where I can’t build a working version. It’s happened to me once that I remember: a Synthesia clone called Linthesia. It would not compile for love nor money, and the provided binary was built for Ubuntu 12 or something.

      Linux was probably ready for the 64-bit apocalypse even before Apple for this exact reason. Anything open source will just run, on anything, because some hobbyist has wanted to use it on their favourite platform at some point. And if not, you’d be surprised how not hard it is to check out the source code from GitHub and make your own port. Difficult, but far from impossible.

      Steam games do not distribute source code, which means they break, and when they break the community can’t fix them. They can’t statically link glibc because that would put them in violation of the GPL (as far as I’m aware anyway). They are fundamentally second-class citizens on Linux because they refuse to embrace its culture. FOSS apps basically never die while there’s someone to maintain them.

      It’s like when American companies come to Europe, realise the workers have rights, and then get a reputation as scuzzballs for trying to rules-lawyer those rights.

      • Kairos@lemmy.today · 2 days ago

        Okay, so bundle glibc. As far as I know the dynamic loader is set up to look in the working directory first.

        Why would statically compiling it violate the GPL?

        • qqq@lemmy.world · 2 days ago

          It wouldn’t; glibc is LGPL not GPL. The person you’re replying to was mistaken.

          • Delilah (She/Her)@lemmy.blahaj.zone · 2 days ago

            You know what, that explains how they can exist on Linux at all. Because from what I understand, if glibc were GPL and not LGPL, closed-source software would basically be impossible to run on the platform. Which… maybe isn’t the best outcome when you think about it. As much as I hate the Zoom VDI bridge, I don’t want “using Windows” to be the alternative.

            And yeah, from the source you provided, I can see why they don’t statically link. “If you dynamically link against an LGPLed library already present on the user’s computer, you need not convey the library’s source”. So basically, if they bundle glibc they need to provide the glibc source to users on request, but if they just distribute a binary linked against the system one, their obligations are met.

            Welcome to “complying with the LGPL for the terminally lazy”; I’ll be your host, “Every early Linux port of a Steam game!”

            • qqq@lemmy.world · 1 day ago

              My understanding of the linking rules for the GPL is that they’re pretty much always broken, and I’m not even sure they’re believed to be enforceable? I’m far out of my element there. I personally use MPLv2 when I want my project to be “use as you please and, if you change this code, please give your contributions back to the main project”.

            • qqq@lemmy.world · 2 days ago

              It should be noted that statically linking against an LGPL library does still come with some constraints. https://www.gnu.org/licenses/gpl-faq.html#LGPLStaticVsDynamic

              You have to provide the source code for the version of the library you’re linking somewhere. So basically, if you ship a statically linked glibc executable, you need to provide the source code for the glibc part that you included. I think the actual ideal way to distribute it would be to not statically link it and instead deliver a shared library bundled with your application.

              EDIT: Statically linking libc is also a big pain in general; for example, you lose dlopen. It’s best not to statically link it if possible. All other libraries, go for it.
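
              If anyone is wondering what losing dlopen actually costs you, here’s a minimal C sketch of runtime plugin loading (the libplugin.so / plugin_init names are made up for the example); this is the kind of thing that stops working reliably once libc is statically linked:

              ```c
              /* plugin_host.c -- load a shared object at runtime and call a function from it.
               * Build: cc plugin_host.c -o plugin_host -ldl   (the -ldl is only needed on older glibc) */
              #include <dlfcn.h>
              #include <stdio.h>

              int main(void) {
                  void *handle = dlopen("./libplugin.so", RTLD_NOW);  /* hypothetical plugin */
                  if (!handle) {
                      fprintf(stderr, "dlopen failed: %s\n", dlerror());
                      return 1;
                  }

                  /* look up a symbol the plugin is expected to export */
                  int (*plugin_init)(void) = (int (*)(void))dlsym(handle, "plugin_init");
                  if (plugin_init)
                      printf("plugin_init() -> %d\n", plugin_init());

                  dlclose(handle);
                  return 0;
              }
              ```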

        • Delilah (She/Her)@lemmy.blahaj.zone · 2 days ago

          As far as I know the dynamic loader is set up to look in the working directory first

          Not so much, but that’s easily fixed with an export LD_LIBRARY_PATH=.
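
          Toy example if anyone wants to see it in action (a sketch only; libgreet.so and greet() are made-up names):

          ```c
          /* hello_bundled.c -- an app linked against a shared lib shipped next to the binary.
           *
           * greet.c (the bundled lib) is just:  #include <stdio.h>
           *                                     void greet(void) { puts("hi"); }
           * Build:
           *   cc -shared -fPIC greet.c -o libgreet.so
           *   cc hello_bundled.c -o hello -L. -lgreet
           * Run:
           *   ./hello                    # fails: the loader does not search the current directory
           *   LD_LIBRARY_PATH=. ./hello  # works
           *   # or bake the search path in at link time: cc ... -Wl,-rpath,'$ORIGIN'
           */
          void greet(void);  /* provided by libgreet.so */

          int main(void) {
              greet();
              return 0;
          }
          ```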

          Why would statically compiling it violate the GPL?

          Because you’ve created something that contains compiled GPL code that can’t be untangled or swapped out. The licence for the GNU C Compiler is basically designed so you can’t use it to build closed-source software. It’s a deal with a communist devil. If you want to build a binary that contains GPL code (which is what glibc is) then you have to make everything in that binary licensed under a GPL-compatible license. That’s what the whole “Linux is a cancer that attaches itself in an intellectual property sense to everything it touches” quote from Steve Ballmer was in aid of. And he was correct, and this was literally the system operating as intended.

          Dynamic linking is some looney tunes ass “see, technically not violating the GPL” shit that corporations use to get around this.

            • Delilah (She/Her)@lemmy.blahaj.zone · 2 days ago

              From a technical standpoint, yes. From a legal standpoint:

              If you dynamically link against an LGPLed library already present on the user’s computer, you need not convey the library’s source

              Welcome to “what did you think was going to happen if you told for-profit corporations that if they want to distribute a library in a bundle they also have to provide the source code, but if they just provide it linked against an ancient version that nobody will be using in 5 years, and don’t even tell you which one, they’re 100% in compliance?”

              Could they? Yes. Will they? Probably not; that takes too much work.

              This is why Steam’s own Linux “Soldier” runtime environment (which is available from the same dropdown as Proton) had to become a thing.

                • Delilah (She/Her)@lemmy.blahaj.zone · 23 hours ago

                  As long as you run it from the command line, that is. On my system at least, if there’s a library missing it will just silently fail to launch. I love Linux but it does not make it easy.

                • Delilah (She/Her)@lemmy.blahaj.zone · 23 hours ago

                  .so files are distro-dependent. (This is theoretically a good thing; it means Debian can distribute a stable version and Arch can distribute a version so fresh the hen doesn’t know it’s missing yet.) If you’re a command-line guru you can run a pacman or deb query to find out what package you have to install to add that library at a system level. But oftentimes you can’t just use a different .so, because the .so was built to depend on another .so and you basically have to solve a dependency chain by hand. It’s a big mess that apt or pacman or even Gentoo’s famously obtuse emerge solves for you invisibly.

                  As to where to find the ancient version that binary was built for? Well, my go-to is archive.archlinux.org, but it’s an asshole of a process and not for the faint of heart. Sometimes you just need to build the library from scratch, which is BULLSHIT, and I’m not unaware of that fact.

                  Can you just go to a website and download the .so? Yes, actually. But it might just decide not to work. This is a problem that individual distros are meant to be solving, and they do when distributing open-source software. As for Steam games, I think Valve solves this problem with the Soldier runtime environment which I mentioned above.

      • Natanox@discuss.tchncs.de · 3 days ago

        This shit is the exact reason Linux doesn’t just have ridiculously bad backwards compatibility but has also alienated literally everyone who isn’t a developer, and why the most stable ABI on Linux is god damn Win32 through Wine. Hell, for the same reason, fundamentally important things like accessibility tools keep breaking, something where the only correct answer is this blogpost. FOSS is awesome and all, but not if it demands that you become a developer and continuously invest hundreds of hours just so things won’t break. We should be able to have both: free software AND good compatibility.

        What you describe is in no way a strength; it’s Linux’s core problem. Something we have to overcome ASAP.

        • qqq@lemmy.world · 2 days ago

          The Linux ABI stability is tiered, with the syscall interface promising never to change, which should be enough for any application that depends on libc. Applications that depend on unstable ABIs are either poorly written (an ecosystem problem, not fixable by the kernel team; they’re very explicit about what isn’t stable) or are inherently unstable and assume some expertise from the user. I’d say the vast majority of programs are just gonna use the kernel through libc and thus should work almost indefinitely.
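
          To make the tiering concrete, here’s a minimal sketch: the raw syscall numbers are the part the kernel promises to keep stable, and libc is a thin wrapper over them.

          ```c
          /* getpid_two_ways.c -- the same kernel interface, with and without the libc wrapper.
           * Build: cc getpid_two_ways.c -o getpid_two_ways */
          #include <stdio.h>
          #include <sys/syscall.h>  /* SYS_getpid */
          #include <unistd.h>       /* syscall(), getpid() */

          int main(void) {
              long raw = syscall(SYS_getpid);  /* hit the stable kernel ABI directly */
              long lib = (long)getpid();       /* what nearly all programs do, via libc */
              printf("raw syscall: %ld, via libc: %ld\n", raw, lib);
              return 0;
          }
          ```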

        • Semperverus@lemmy.world · 3 days ago

          It isn’t a core problem, it’s a filter, and a damn good one. Keeps the bad behavior out of Linux. That’s why people keep turning to it for lack of enshittification. Stable ABIs are what lead to corpo-capital interests infecting every single piece of technology and chaining us to their systems via vendor lock-in.

          I wish the Windows users who are sick of Windows would stop moving to Linux and trying to change it into Windows. Yes, move to Linux if you want, but use Linux.

          • Natanox@discuss.tchncs.de · 3 days ago

            This might be the most awful Linuxbro take I’ve read this year, congratulations. Linux has to lack a stable ABI to keep the capitalists away and make apps constantly require maintenance to filter out bad behaviour? Just wow.

            I really hope for way more people to come over so nonsense like this finally stops.

            • Semperverus@lemmy.world · 13 hours ago

              My favorite part about your post is how you intentionally spin it to mean the opposite of what I said.

              Linux requires source compilation by design. This ensures that the Linux ecosystem stays open no matter what. The term “ABI” literally stands for Application Binary Interface. Having a stable one is literally the antithesis of a libre/open-source project. You could try for something like reproducible builds, but that prevents distros from making their own builds as needed with the necessary flags and patches enabled or disabled.

              The purpose of this isn’t to make apps require maintenance, it’s to enforce the open-source nature of the project. Stallman was a gross toenail-eating weirdo but he was dead right on the principles he held, and it’s because of him and thousands of developers like him that you even have an OS to escape from Microsoft onto.

              People like you just want Linux to be “Windows without the bullshit” instead of trying to set aside your decades of conditioning in order to learn how to use the tool properly. If someone hands you a hammer, I bet you’d try to spin it on top of a nail to get it to sink into the wood instead of realizing it’s not a screwdriver.

              • hanni@discuss.tchncs.de · 11 hours ago

                The term “ABI” literally stands for Application Binary Interface. Having a stable one is literally the antithesis of a libre/open-source project.

                Can you elaborate? Having an unstable ABI does not make a project libre/open-source… but one might be wrong. And this creature is open to a different point of view.

                Linux requires source compilation by design.

                So does every other piece of software on this planet? Microsoft Windows requires compilation from source. The source just happens not to be under an open-source license (or even source-available, for one to have a look inside…)

                edit:typo

            • Delilah (She/Her)@lemmy.blahaj.zone · 2 days ago

              No. It’s not about driving away the capitalists. It’s about forcing them to bend to the community. It’s not “Linux has to lack a stable ABI to keep the capitalists away”, it’s “Linux is not here to baby rich corporations and exempt them from rules that literally nobody, including little Timmy who’s 14 and just submitted his first PHP patch, has a problem with”. This is developers who are used to living in houses trying to set up shop in an apartment complex, then finding out different rules apply and being colossal babies about it.

              The point of the GNU project was to destroy the concept of closed-source software. Which is a completely justified response to Xerox Incorporated telling you your printer is no longer supported and you just have to buy a new one. Capitalists are welcome. Anti-right-to-repair people can fuck right off, and if we had the right to repair their software we wouldn’t have this problem in the first place, because someone else would have already fixed it.

              • Natanox@discuss.tchncs.de · 1 day ago

                And that fight against closed-source and anti-consumer shit is awesome, but it changes absolutely nothing about Linux being completely awful in terms of long-term support. Running old software is a whole project (for enthusiasts) in itself almost every single time; meanwhile, I can run almost any decade-old software on systems like Android or Windows simply by installing it, without having to be an IT professional.

                that literally nobody, including little Timmy who’s 14 and just submitted his first PHP patch, has a problem with

                Except that this causes usability issues for the 99.99% of users who aren’t that little Timmy you just made up, and it causes accessibility tools, which are freaking essential for many people, to simply break. Old games becoming unplayable isn’t an issue only because of their Windows versions and Wine, DXVK etc.; we literally have to fall back to Windows software to keep software running because of how badly the Linux system architecture works for desktop usage. What a disgrace.

                if we had the right to repair their software we wouldn’t have this problem in the first place because someone else would have already fixed it.

                Literally has nothing to do with Linux’s own problems.

                • Delilah (She/Her)@lemmy.blahaj.zone · 23 hours ago

                  Linux’s own problem is that we have a culture of “tear everything down and make way for progress”, which I personally approve of. However, things keep getting left behind in the rebuilding process, and that’s a very real cultural problem. We should have been rebuilding those accessibility tools with everything else, and the reason we haven’t is that, quite frankly, the Linux community itself hates disabled people.

                  I see no other reason that disabled people would be relying on old and unmaintained code in the first place. That’s not a problem with the build-and-rebuild attitude, that’s a problem of who we accept into the community. Why is the only wheelchair-accessible building 20 years old and full of rotting floorboards?

                  Linux is built by the community and always does what it thinks is best for the community. The fact that “what’s best” does not include maintaining the accessibility features is fucking deplorable, and that’s a legitimate thing to complain about. But a system shouldn’t need to support legacy junk just to provide accessibility features that should have been core parts of the system from the beginning. In that, no Linux developer has the right to look a Microslop developer in the eye.

                  • Natanox@discuss.tchncs.de · 22 hours ago

                    I think it’s slowly getting better though; more people are finally listening. At least that’s what I notice. Still, those purists who don’t give a proper shit (“The CLI is perfectly accessible! It’s all text, where’s the problem?”) and believe everyone has to be a developer or get filtered out are really loud and annoying.

                    Of course the system should inherently be accessible. Better backwards compatibility would just make a lot of things simpler, even if what’s being made simpler is dealing with bad decisions and exclusion. Enabling people (everyone, not just the abled or developers) is always good.

                • qqq@lemmy.world · 1 day ago

                  Android

                  Android is Linux! You’re running your decades-old software, on Linux. What was the last completely unmaintained binary that you pulled on Windows and ran (with no tweaking), and the last one that failed on Linux?

                  Why do you keep sharing that link instead of this one? https://fireborn.mataroa.blog/blog/i-want-to-love-linux-it-doesnt-love-me-back-post-4-wayland-is-growing-up-and-now-we-dont-have-a-choice/ The one where the same person you’ve been posting clearly says people are working on accessibility and things are improving?

                  Have you considered joining the community and working with it – like the author of the blog that you keep sharing – instead of trying to insult everyone who works on it and calling it a disgrace?

                  • Natanox@discuss.tchncs.de · 23 hours ago

                    Android is Linux!

                    That’s a rabbit hole of semantics I’m not going down. 😅 I think it’s clear we are talking about Desktop Linux, which is very different to Android.

                    What was the last completely unmaintained binary that you pulled on Windows and ran (with no tweaking) and the last one that failed on Linux?

                    Oof, I haven’t used Windows (10) for years. Probably either Photoshop CS6 or one of my old favs like Total Annihilation (1998). On Linux the last app that failed also happened to be a (native) game, Life is Strange: Before the Storm. I saw someone fixed it with a glibc shim, and a friend likes that game.

                    Why do you keep sharing that link instead of this one?

                    Because I’m complaining about puritans and Linux-bros who keep sugarcoating real problems that have existed for a long time now, or who even still make a fuss about things like systemd or Flatpak (which solve a lot of long-lasting issues). That blogpost is a perfect example of this. I said it in my first comment: “Puritans on Linux are a real menace”. Everything after that is merely me putting my finger into open wounds (which are being worked on by devs, and I’m absolutely celebrating that, please don’t get me wrong!) which are regularly being sugarcoated by those people.

                    Have you considered joining the community and working with it – like the author of the blog that you keep sharing – instead of trying to insult every one who works on it and calling it a disgrace?

                    It wasn’t my intention to insult any dev working on these issues; if it sounds like that, I’m genuinely sorry. I’m mad about puritans who behave as if Desktop Linux is a silver bullet for long-term app support, or people like Semperverus who think it’s a good thing non-devs (and those who simply don’t have the time to invest into their computer) are being “filtered out”. And if someone sugarcoats big issues like how Linux systems historically handled packages and dependencies and the problems that causes, I’ll use strong words to make abundantly clear how wrong they are, because I’m fed up with this willful ignorance.

                    (The same willful ignorance, in my opinion, is the reason why accessibility deteriorated to the current degree even though we once had that part figured out. That’s why I used it as an argument.)

    • srestegosaurio@lemmy.dbzer0.com · 3 days ago

      What, you don’t like role-playing software development & distribution as if we were still in the 90s?? 🥺🥺 /j

      But srs, most of Linux’s biggest technical problems are either caused by cultural legacy or blocked by it, the distribution model being one of the most pungent examples.

      • Natanox@discuss.tchncs.de · 3 days ago

        Fortunately we do have a steady influx of new people incl. those who demand shit to god damn work, finally shifting this notion.

        For the time being we still have to resort to using the Windows version and Wine for old software though… But I already had the situation where the (unmaintained but working) app also had a Flatpak which was last updated many years ago and it just worked, which made me incredibly happy and hopeful. ❤️

        Good thing there’s a battle-proven response if people don’t like this because it’s “not what Linux is supposed to be” or some other nonsense: If you don’t like it just fork it yourself. 😚

        • qqq@lemmy.world · 2 days ago

          Fortunately we do have a steady influx of new people incl. those who demand shit to god damn work, finally shifting this notion.

          What the hell is going on in this thread? Linux has been actively developed by people who want “shit to god damn work” forever. What are the concrete examples of things that don’t work? Old games? Is that the problem here? These things that were developed for the locked-in Windows ecosystem since time immemorial and never ran on Linux, and now, through all of the work of the Linux ecosystem, do, by some miracle, run on Linux. It’s amazing that these things work at all: they were never intended to!

          • Natanox@discuss.tchncs.de · 1 day ago

            What the hell is going on in this thread? Linux has been actively developed by people who want “shit to god damn work” forever.

            Yes and no. Yes as in “you can fix it” (if you’re a programmer), but no in terms of “everything is set up so binaries will still run in 20 years as-is”. Dependency hell, missing library versions, binaries being linked against old glibc versions you can’t provide… all of these are known issues, and devs are often being discouraged from compiling tools in a way that makes them work forever (since that makes the app bigger and potentially consume more memory). And better not tell someone who’s blind (and used Linux before) what’s quoted above; they’ll either laugh at you or get really angry. It’s also one of the reasons I’m angry (I’m able to see, but I hate this hypocrisy in the community). Linux on the desktop utterly alienated disabled people, simply because stuff like screen readers keeps breaking.

            • qqq@lemmy.world · 1 day ago

              Running 20-year-old binaries is not the primary use case, and it is very manageable if you actually want to do that. I’ve been amazed at some completely ancient programs that I’ve been able to run, but I don’t see any reason a 20-year-old binary should “just work”; that kind of support is a bit silly. Instead maybe we should encourage abandonware to not be abandonware? If you’re not going to support your project, and that project is important to people, provide the source. I don’t blame the Linux developers for that kind of thing at all.

              devs are often being discouraged from compiling tools in a way that makes them work forever (since that makes the app bigger and potentially consume more memory)

              This is simply not true. If you want your program to be a core part of a distribution, yes, you must follow that distribution’s packaging and linking guidelines: I’m not sure what else a dev would expect. There is no requirement that your program be part of a distribution’s core. Dynamic linking isn’t some huge burden holding everyone back, and I have absolutely no idea why anyone would pretend it is. If you want to statically link, go for it? There is literally nothing stopping you.

              Linux desktop isn’t actively working against disabled people, don’t be obtuse. There is so much work being done for literally no money by volunteers and they are unable to prioritize accessibility. That’s unfortunate but it’s not some sort of hypocritical alienation. That also has likely very little to do with the Linux kernel ABI stability like you claimed earlier.

              But this idea that “finally we have people that want Linux to work” is infuriating. Do you have any idea how much of an uphill battle it has been to just get WiFi working on Linux? That isn’t because the volunteer community is lazy and doesn’t want things to work: that’s because literally every company is hostile to the open source community to the point of sometimes deliberately changing things just to screw us over. The entitlement in that statement is truly infuriating.

              • Natanox@discuss.tchncs.de · 23 hours ago

                Running 20-year-old binaries is not the primary use case, and it is very manageable if you actually want to do that. I’ve been amazed at some completely ancient programs that I’ve been able to run, but I don’t see any reason a 20-year-old binary should “just work”; that kind of support is a bit silly. Instead maybe we should encourage abandonware to not be abandonware? If you’re not going to support your project, and that project is important to people, provide the source. I don’t blame the Linux developers for that kind of thing at all.

                I see your point. What I think, though, is that it’s particularly hard on Linux to fix programs, especially if you are not a developer (which is always the perspective I try to see things from). The most notable architectural difference here between e.g. Windows and Linux would be how you’re able to simply throw a library into the same folder as the executable on Windows for the program to use it (an action every common user can do and fully understand). On Linux you hypothetically can work with LD_PRELOAD, but (assuming someone already wrote a tutorial and points to the file for you to grab) even that already requires more knowledge about some system concepts.

                Of course software not becoming abandonware would be best, but that’s not really something we can expect to happen. Even if Europe made the absolutely banger move of enforcing open-sourcing of software a few years after abandonment, it would still require a developer to fix issues. The architecture of the OS should be set up so it’s as easy as possible to make something run, using concepts (like file management) as many people as possible are familiar with.

                devs are often being discouraged from compiling tools in a way that makes them work forever (since that makes the app bigger and potentially consume more memory)

                This is simply not true.

                We might be in different bubbles in this case. Please be aware I’m talking about the very loud toxic minority (hopefully it’s a minority…) who constantly complain that things aren’t following “KISS” closely enough, that your app or distro is bloated, etc. It feels like if I collected all the statements against Flatpak, systemd, or even just static linking that boil down to “It’s bloated! It’s not KISS! Bad!” (so, not well-reasoned criticism) that I read or hear, including around my local hackspace or at events, I could fill whole books.

                Linux desktop isn’t actively working against disabled people, don’t be obtuse.

                Not actively, no. The issue here is rather that, for way too long, we didn’t care enough. We had things working comparatively nicely one or two decades ago, but in more recent history the support deteriorated to such a degree that the Linux desktop has become largely inaccessible to blind people (mostly due to issues with Wayland). I didn’t save those blogposts or statements to show in discussions like these, but the takeaway from all of them is that “It used to work for me many years ago, but if I want a system that respects me today I’m forced to use a Mac”. But of course you’re also right, it’s slowly getting better! (Correct me if I’m wrong, not a native speaker: “being alienated” doesn’t inherently imply malicious intent, does it?)

                But this idea that “finally we have people that want Linux to work” is infuriating. Do you have any idea how much of an uphill battle it has been to just get WiFi working on Linux? That isn’t because the volunteer community is lazy and doesn’t want things to work: that’s because literally every company is hostile to the open source community to the point of sometimes deliberately changing things just to screw us over. The entitlement in that statement is truly infuriating.

                Sorry, I was really pissed off yesterday evening by earlier comments in the chain implying it’s good to “filter out people” and got carried away. This one is completely on me.

                • qqq@lemmy.world · 21 hours ago

                  What I think, though, is that it’s particularly hard on Linux to fix programs, especially if you are not a developer (which is always the perspective I try to see things from). The most notable architectural difference here between e.g. Windows and Linux would be how you’re able to simply throw a library into the same folder as the executable on Windows for the program to use it (an action every common user can do and fully understand). On Linux you hypothetically can work with LD_PRELOAD, but (assuming someone already wrote a tutorial and points to the file for you to grab) even that already requires more knowledge about some system concepts.

                  You’re not even realizing how advanced a user on Windows you have to be to realize that putting a DLL in the correct directory will make that the library used by the program running from that directory. Most users won’t even know what a DLL is. Also, I work in security professionally and I’ve used this fun little fact to get remote code execution multiple times, so I don’t see how it’s a good thing, especially when you consider that Linux’s primary use case is servers. You can do the exact same thing on Linux, as you said; it’s just opt-in behavior. If you are knowledgeable enough to know what a DLL is and what effects placing one in a given folder has, you’re knowledgeable enough to know what a shared library is and how to open a text editor and type LD_LIBRARY_PATH or LD_PRELOAD. I don’t buy this argument at all.
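
                  For reference, the opt-in Linux version of that trick is roughly this (a sketch; libfaketime_demo.so is a made-up name, and overriding time() is just an obvious demo):

                  ```c
                  /* fake_time.c -- a tiny LD_PRELOAD interposer, the explicit cousin of
                   * "drop a DLL next to the .exe".
                   * Build: cc -shared -fPIC fake_time.c -o libfaketime_demo.so
                   * Use:   LD_PRELOAD=./libfaketime_demo.so ./some_program   # any program that calls time() */
                  #include <time.h>

                  /* Replace libc's time(): always report the Unix epoch. */
                  time_t time(time_t *out) {
                      time_t fixed = 0;  /* 1970-01-01 00:00:00 UTC, just to make the effect obvious */
                      if (out) *out = fixed;
                      return fixed;
                  }
                  ```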

                  Linux Desktop is predominantly a volunteer project. It is not backed by millions of dollars and devs from major corporations like the kernel or base system. It is backed by people who are doing way too much work for free. They likely care about accessibility and people using their project, but they also care about the myriad other issues that they face for the other 90+% of their user base. Is that hugely unfortunate? Yes, it sucks. I wish there were money invested in Linux as a desktop platform, but compared to macOS and Windows it’s fair to say there is a rounding error towards $0.

    • highball@lemmy.world · 4 days ago

      I really think it’s just not that common. There are ways to do this for the few and not pollute the OS for the many. Steam does it for their use case. If it were a more common need, I would expect distro maintainers to take care of it, the same way they did for 32-bit libraries back in the day. When was the last time you had to install a 32-bit distro alongside your 64-bit distro so you could run 32-bit applications? Sometimes I need a bleeding-edge build of an application. I run a stable distro, so I build the application myself or install a quick chroot. These days there is distrobox, which makes it even easier. There are solutions. Easy from my perspective. That’s why I think, if this were such a common need, distro maintainers would provide a simple solution (automatically done for you).

    • Calfpupa [she/her]@lemmy.ml · 3 days ago

      This hasn’t been a problem for a decade or two, but with drive costs inflating immensely, I wonder how it will impact how “bloat” is perceived. Not everyone has infinite access to storage. Btrfs and other filesystem dedup features may be an acceptable workaround, but I don’t know Flatpak’s structure well enough to know whether it can benefit from them.