Is Ubuntu becoming Windows?

Since 14.04, when I discovered Ubuntu, I have watched it get worse and worse with each LTS, 24.04 being the worst (slower than ever, buggy, etc.).

Is it because of Ubuntu Pro?

I see the same thing in Laravel (php framework).
I see the same in cars.
I see the same in almost any product out there.

What is wrong with humanity?
Update: What is wrong with "If it works, DO NOT CHANGE IT!!!"?

4 Likes
  1. I do wholeheartedly agree with you.
  2. "Times are bad: children no longer obey their parents and everyone is writing a book." (attributed to Cicero. 43 BC)
3 Likes

My first exposure to Ubuntu was with 8.04. As an experiment, I installed it on a partitioned MacBook (the little white one, remember?). As much as I liked it, getting wpa_supplicant working was a horror show. Still, I remember with humor sitting outside an Apple store in Las Vegas, camping on their free wifi, running Linux.

Much of Linux/Ubuntu has improved. But yes, it's gotten more complex to make it a "modern" OS. Windows? Please.

Of all the OSes out there, Windows is the one with the worst track record when it comes to convenience and usability. Remember "DLL Hell?" The Registry? Internet Exploder, which you couldn't remove?

As a Mac user primarily, I simply want a computer that "just works." That USED to be the Mac. Linux isn't there yet, but it's awfully close. Windows? Blech.

1 Like

Your post has given me the opportunity to address a topic I have long wanted to raise, with everyone in mind. I hope many of you will take the time to consider these thoughts.

It is unfortunate, but the reality of Free Open Source Software (FOSS) is that

  • it is free ... because of the generosity of others,
  • it is open ... so that everyone can see, for themselves, whether it is safe or abusive,
  • it is source ... so that those with the creative genius can share their works for posterity, and
  • because of that sharing, anyone, including yourselves, can "roll-your-own" as they say.

This last one IS FOSS's greatest advantage in its "tortoise-like" race with the proprietary software in that, once out there, it can never go away, except by being deliberately ignored (likely because a better tool has emerged from another creative genius)!

That is unlike the $W that will disappear because people/corporations want to blame someone when (not if) things go wrong.

The biggest benefit (or challenge depending on your point of view) is that it is subject to the whims of the "owner's/creator's" sense/perception of what is good, best, or needs to evolve.

For those of us who cannot follow, let alone absorb, the details of every evolutionary step in our technological baseline, we must resign ourselves to the fact that others control our technological destiny.

Given that, as end-users of such "freebies", we must find ways to form communities of like mind (Ubuntu MATE is one such community) where, as a group, and by whatever means the creators look favourably upon, we can nurture and encourage their continued devotion to a shared view of how "things" should be!

That's enough for the "why" and "how".

What you need to "flesh out", document and publish/communicate/share is the what, namely a very clear vision of every element of exactly the reality you consider desirable, being specific on:

  • I want to keep this as is,
  • I want this changed, and
  • I don't want that change,

and being very clear about your reasoning for, or against, any of those things. Without that, there is nothing for individuals to "gravitate" towards.

With that out there, for everyone to see, and more importantly to discuss and come to a consensus of shared perception of desirability, that vision will come to be ... over time ... because you are still relying on the generosity of creators being willing to "consume/spend/dedicate" their limited budget of focus/energies on your community's shared roadmap towards your long-term vision.

It is very hard to find shared consensus.

Within just the Linux community, each of the 100+ Linux distros represents a divergence from consensus!

While disheartening, that is just another reflection of human nature, the same nature that leaves us with nearly 200 countries rather than a single humanity!

Getting back to Linux and the desirability of its various aspects: try to keep the "target" small enough that it is crystal clear to all concerned, and easy enough to find like-minded individuals to form a community of interest around it. If that community includes "worker bees" able to create the reality, so that the remaining "drones" (not meant negatively) can leverage it into a productive framework that becomes a sustainable society/economy, then you will have a winner.

I do hope you can achieve your goal, for all our sakes!

It is an old and "corny" expression, but in the FOSS world we must try to rally under as few "flags" as possible, preferably a single one, for our vision, and put aside the many differences which are primarily "preferences" rather than "mechanics". I propose the use of the cry,

"All for one and one for all!"

shared with the world by Alexandre Dumas in his novel "The Three Musketeers".

I have unwavering faith that the FOSS community at large will, someday, see the light, even if conflicts such as systemd vs non-systemd, or GNOME vs KDE vs MATE, persist. These, in my view, are primarily differences in perception regarding the "primacy" of the "default" configuration, not a matter of "this plug-in" vs "that plug-in" in the choice of method/workflow to achieve a functional implementation.

Food for thought ... for the community at large.

6 Likes

You seem to find the plethora of Linux distros disheartening, but I think it's one of the strengths of the OS. For example, I have an old Asus eee PC "netbook" that is still running today thanks to the ability to find a Linux that will run on an Atom CPU with 1 GB of memory. Windows won't. Mac won't. Even MATE/Mint or the "mainstream" distros won't.

The fact remains that if one wants a specific purpose computer, there's likely a Linux that will provide it. Raspberry Pi is an example. I use Linux (MATE) as my server software because I find I can get all the bang I need without having to go through the Microsoft or Apple hoops.

1 Like

You can use a protest distro like antiX linux and feel fine. But the world moves forward with everything.

Hi ratatoskr

I think you're missing a point here.

antiX is not a protest distro, but a distro aimed at older hardware.
You see, antiX is a wordplay on "antique", not on "anti".

I don't think you ever used it, otherwise you wouldn't be so dismissive about it.

No, it doesn't. What makes you think so?

You mean GNOME™? Even something simple like Fluxbox already dwarfs the usability of GNOME. It's also much faster (no stuttering GUI), and at only a fraction of the resources. And Fluxbox is very old.

What is "forward" about that ? Increasing entropy ? :smile:

We all have to deal with the fallout of GNOME's design decisions and breakage, and with some decisions of our mother distro, and try to keep everything working along the way.

@marius-ciclistu may come on a bit strong, but there are several grains of truth in his rant which we shouldn't dismiss out of hand in a knee-jerk reaction.

Also, the reaction of @ericmarceau is exactly on point.
It's better to stand together with well-formulated arguments and pull requests than to say "the world moves forward" and do nothing.

It reminds me of an old Steppenwolf song, "The Ostrich":
"we stick our heads into the sand, just pretend that all is grand, and everything will turn out okay"

5 Likes

Thank you for replying, @OldStrummer,

But my point was that there is too much "segmentation" of the Linux community into "warring factions". I recognize that this is because, unlike the old days when there were only a dozen nationwide TV channels, the Linux "nation" no longer has a "shared experience", which inevitably leads to divergence in experience and in values as to what is good vs bad in everything around us, including the functional elements of Linux.

I tried to outline concrete steps to reduce that "balkanization" of the Linux "mind-space" and to encourage those who have the energy to voice an opinion to also make the additional effort to build their own like-minded community, not just find one where they can commiserate about how bad things are or how poorly someone is doing.

Not meaning to offend, but talk is always cheap. "Doing" is what is concrete and builds the world. Talk is good, but only if it eventually leads, somehow, to building. Otherwise it is wasted energy, better spent constructively elsewhere.

That is one pearl of wisdom distilled from this 69-year-old's life experience.

1 Like

@ericmarceau Good point.

But you see, even if someone wants to help, the politics behind free and open source projects nowadays kills it (I have no development skills where Ubuntu is concerned, only in PHP (Laravel) and in cars, as I exemplified in my post). It feels like it is meant to fail, meant to be worse. Why? For bigger profit. Follow the money and the illogical becomes logical.

An excellent old satire on the topic:

http://harmful.cat-v.org/software/c++/I_did_it_for_you_all

2 Likes

@ugnvs =))

I saw oop's dark side also :)))

"Stroustrup: It was only supposed to be a joke, I never thought people would take the book seriously. Anyone with half a brain can see that object-oriented programming is counter-intuitive, illogical and inefficient.."

Unit tests fall in the same pit :)))

1 Like

@ugnvs no wonder this lib gained no popularity: macropay-solutions/laravel-crud-wizard-free - Packagist

Low delivery time is not desirable.

You remind me of the systemd-versus-init debate, which still rages today. Why should everyone line up behind a specific component or subsystem? This is about heterogeneity rather than homogeneity.

Apple and Microsoft have a lock on what goes in or comes out of their OSes. I'm currently having a bit of an issue with Apple removing an internal piece of macOS that completely broke an app I had developed (for personal use). They did so without ever announcing it, and when I asked, all I got in reply was, "Send a feature request." Right. Apple's going to listen to one voice in the wilderness (although I've read many people on GitHub, Stack Exchange and elsewhere ask for the same capability).

Linux lets the end user decide what's best. For them. GNOME? KDE? Cinnamon? Want your menus on the top? Or the bottom? Fine. This, to me, is the supreme value that Distrowatch brings to the playing field. It's why I can keep an old EeePC running long after it's been abandoned by ASUS, Intel, and pretty much everybody else. No, it's not my primary computer, and I suspect some day it will simply die for lack of support or usability, but no one has forced me into using one flavor of an OS over another.

vive la différence!

2 Likes

@marius-ciclistu, On your point about money, I will admit to paranoia that those with the money, trying to protect their interests, use puppet strings on paid "agents of chaos" to stir the pot and prevent the Linux community from coming together as a whole, with a unified vision.

@OldStrummer, thank you again for your comments.

Unfortunately, I think people have again misunderstood my stand.

I am not for homogeneity!

I am for full, top-to-bottom, customizability, but having that achievable from a single, all-encompassing, build source! I know that some would think that I am crazy to venture such a "monstrous" concept, but disciplined coding would ensure documented/embedded conditionals that would simplify eventual integration of other components that are not already within the "accumulated framework".

Think of LEGO blocks.

All those basic common pieces (new pieces added all the time), but in the end all one happy family that can be plugged-and-played as you see fit ... but without locking out other pieces.

That is where the Devuan split was a disservice to Linux. They should have stayed "within the family" to ensure that the init approach remained an option at install time, so that an init-driven build was the outcome of a fresh build, rather than a potentially kludgy post-install retrofit.

For a long time, I leaned towards going to Devuan, but I never even tested it, simply because they were starting from scratch and, for me, that implied too many opportunities to introduce new "blind" issues that would surface as "gotchas", and that is one thing I could never tolerate.

The same situation has arisen with snap vs no-snap.

I don't care that some people want their packages as snap, or others don't.

I only care that when I do an install, I have the ability to do everything, top-to-bottom, in my chosen method which, as it happens, is no-snap.

I disagree with it being forced on me by the default distro and having to perform "patchwork" to get rid of it. I absolutely resent that some packages will only be delivered as snap, without giving me that choice, because they think the package "is better suited for that process"!!! I will stop myself there.

My point is every user should have every "customizable choice" at install time. Yes, that may become onerous. But here is my solution to that problem ...

Split the process into two parts:

  • prompting for specification, then
  • building.

While we have the appearance of that approach now, that is not really the case, because the current process is step/prompt/step/prompt/step/...

I believe that more thought put into that initial "specification definition" engine, with the specification generated as an XML (or other-formatted) file on the user's side, without the need to immediately start the build process, would give every user a fully-captured definition of his configuration, portable to any computer in case of catastrophic failure.
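For what it's worth, the two-phase idea can be sketched in miniature. Everything here (field names, JSON instead of XML, the two functions) is hypothetical illustration, not any existing installer's API; Ubuntu's preseed/autoinstall files already gesture in this general direction.

```python
import json

def capture_specification(answers):
    """Phase 1: collect every customization choice up front.

    `answers` stands in for the interactive prompting; a real
    configurator would run a full question/answer session.
    """
    spec = {
        "version": 1,
        "desktop": answers.get("desktop", "MATE"),
        "init": answers.get("init", "systemd"),
        "packaging": answers.get("packaging", "deb-only"),  # e.g. no snap
        "locales": answers.get("locales", ["en"]),
    }
    # A portable, human-readable file the user can keep and reuse.
    return json.dumps(spec, indent=2)

def build_from_specification(spec_text):
    """Phase 2: a later (possibly offline) build step reads the spec
    and derives the package selection, with no further prompting."""
    spec = json.loads(spec_text)
    selection = ["base-system", f"desktop-{spec['desktop'].lower()}"]
    if spec["packaging"] == "deb-only":
        selection.append("no-snapd")
    return selection

spec_file = capture_specification({"desktop": "MATE", "packaging": "deb-only"})
print(build_from_specification(spec_file))
```

The point of the split is exactly the portability argued above: the spec file alone captures the whole configuration, so a rebuild on new hardware needs no re-prompting.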

Yes ... I know that is a huge endeavour, but that is my vision. Unfortunately, I am not sufficiently technically adept, so I must rely on what I can only call faith that those with the ability might see the benefits in such a vision to maybe pick up that torch and go with it!

I apologize for what might be characterized as being long-winded! However, some concepts cannot be properly characterized, or fleshed-out, without a more substantial outlay of "flavoured concept-strings"! :slight_smile:

At no point do I intend to demean by subjecting others to a "flood of words", or to project the image that I am trying to "baffle with BS"! That is not my intention.

1 Like

Wouldn't that be nice? One OS that does everything for everybody! Even if that were technically possible, the resulting code base would be so huge that it would take a monstrously large and fast computer to run it, because it would be such a ponderously slow behemoth.

Developers all the time have to consider trade-offs when creating software. "Yes, we could add that feature, but it would slow the program to a snail's pace." And imagine the regressions that would be introduced!

I believe this is a primary reason there are so many distros available. Each distro is the result of choices being made on what to include and what to exclude.

Have you ever heard of OpenDoc? In the 1990s, a collaborative effort called AIM was formed: Apple, IBM and Motorola. OpenDoc was an attempt to create a framework for embedding technologies into documents. It was meant to compete with Microsoft's OLE (object linking and embedding) technology. I played with it in its early stages, but it never got fully baked. In a way, it was a solution looking for a problem.

I think AI is another attempt to provide a "one stop shop" for information exchange. One ring to rule them all, so to speak.

I don't think we'll all of a sudden see the entire FOSS community have a moment of enlightenment and stand ready to build the monolith. HAL 9000, are you listening?

1 Like

I must be explaining myself badly.

I don't mean to say that the installed OS should be all-encompassing, or keep "parallel functionality" in place at all times.

My vision is that only the source code base, used for the build at install time, is all-encompassing, and that the "user-selected customization phase", which produces the "specification file", is the only stage that deals with that full breadth.

In my view, post-install would only involve minor stuff, like a change of wallpaper because of mood changes, everything else having been sorted out prior to the first actual build/install step.

Naturally, post-install changes could be applied using a debconf-like tool, with choices like

  • "You currently have MATE. Your choices are MATE (no change), GNOME, KDE, xfce, ... Which would you like to change to?"

Once the change has been applied, there would be a "burn-in period" after which the OS would prompt if you want to purge the old, or revert to the old and purge the new. That way, the OS is always lean, as lean as it can be for runtime.
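The burn-in idea above reduces to a small decision rule. A minimal sketch, where the function name and the 14-day trial window are my own assumptions, not anything an existing tool implements:

```python
from datetime import datetime, timedelta

BURN_IN = timedelta(days=14)  # assumed trial window, purely illustrative

def burn_in_decision(switched_at, now, keep_new):
    """After a desktop switch, keep both the old and new environments
    until the burn-in window closes, then purge whichever the user
    rejects so the installed OS stays as lean as possible."""
    if now - switched_at < BURN_IN:
        return "keep-both"  # still trialling the new desktop
    return "purge-old" if keep_new else "revert-and-purge-new"

t0 = datetime(2024, 1, 1)
assert burn_in_decision(t0, t0 + timedelta(days=3), keep_new=True) == "keep-both"
assert burn_in_decision(t0, t0 + timedelta(days=20), keep_new=True) == "purge-old"
assert burn_in_decision(t0, t0 + timedelta(days=20), keep_new=False) == "revert-and-purge-new"
```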

Again, I realize that is a lot to ask from developers, but they didn't get to the Moon without a vision to stimulate the creativity! :slight_smile: Kennedy didn't know how to do it himself, only how to inspire them to do it!

As you can see, I am trying my hand at that last part ... possibly failing! :slight_smile:

1 Like

I don't mean to say that the installed OS should be all-encompassing, or keep "parallel functionality" in place at all times.

My vision is that the source code base, used only for the build at time of install, is all-encompassing.

At first, this seems to be a contradictory statement, but I think I get what you're saying: Make it all encompassing until installation, let the user choose the features desired, and then remove the leftovers. Is that right?

When software came on disk (floppy, CD, optical) a lot of times this was the primary deployment option. Even today, ISO files mimic this behavior. I think the limiting factor here might be the assumed size of the media (I don't know; is there a limit on how much can be fit on a virtual disk rather than physical media?). It's an interesting wish, but I suspect most developers don't want to be put in the position of having to support every possible hardware configuration, disk specification, NIC, monitor, etc. That's an awful lot of work with diminishing return on value.

That is exactly what I mean!

I view the install process as involving 3 distinct steps:

  • download the "configuration-/specification-engine",
  • download the relevant packages, with all dependencies (to create the install media), then
  • build the OS from that install media.

The "configurator" would have all the language-related files to ensure universality, in totality. But, the downloads and install steps would be strictly limited to the languages specified by the User, including English (the currently-designated standard) as the one universal common denominator in all cases as fallback for developers, troubleshooting and bug reporting.

Regarding space limitations on a virtual disk, I believe that for all 64-bit distros the issue is moot, since you are likely to have ample hard disk on which to create a (potentially dedicated) partition for the temporary download of files before burning onto 4.4 GB optical media, or onto a USB stick, which come in all sizes these days. I am still partial to a DVD image myself, as a tamper-proof backup and recovery tool.

Let's not forget that the code used for hardware probing/reporting involves packages that may not be critical to success during the install process. Any such packages not required during the actual build could be left off until after the full install, freeing that much more capacity on the media for the packages that represent the customized selection/build.

Not to belittle the issue, but that is what conditional compilation is all about. A "merged monolithic code base" would have the developers focused on a given piece of hardware contribute their knowledge of those bits as code for shared use in the overall context. Contributors who don't want to manage that code don't need to, because those who do, would. There would obviously be instances where "quid pro quo" negotiations/concessions arise to ensure everyone gets what they need accomplished.

The only real danger I see in such a monolithic model is the purging of defunct code (code for which there is no user): scanning the code for conditionals matching hardware that is no longer reported as being in use. The only gotcha is how recent such a scan is relative to the code-base purge. The smaller the window, the better.
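That purge scan could look roughly like the following. The `HW_` flag convention and the function are invented for illustration; a real code base would have its own conditional-compilation idioms:

```python
import re

def defunct_conditionals(source, reported_hardware):
    """Find hardware feature flags that appear in the code base but are
    no longer reported by any installed system; these are candidates
    for purging from the monolithic source tree."""
    flags = set(re.findall(r"#ifdef\s+(HW_\w+)", source))
    in_use = {f"HW_{hw.upper()}" for hw in reported_hardware}
    return sorted(flags - in_use)

code = """
#ifdef HW_ATOM_N270
/* netbook quirks */
#endif
#ifdef HW_RTL8139
/* legacy NIC driver */
#endif
"""
# Only the Atom netbook is still reported in the field.
print(defunct_conditionals(code, ["atom_n270"]))
```

The recency caveat in the text maps directly onto `reported_hardware`: the older that list, the more likely the scan flags code that is actually still in use.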

To that end, a critical part of the "configurator tool" is the harvesting, and feeding back to the source project, of details of the hardware components used in a configuration. I visualize that being managed (to avoid duplicate reporting) using a UUID for each installed OS (or network "tap"). Maybe there is a better mechanism, but that I leave to the experts who would know how best to manage it.
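One possible shape for that UUID-based deduplication, as a sketch only: the fingerprint source and all names are assumptions, and `uuid5` is just one way to derive a stable identifier per install.

```python
import uuid

def install_id(machine_fingerprint):
    """Derive a stable per-install UUID from some local fingerprint,
    so that repeat reports from the same machine collapse to one ID."""
    return str(uuid.uuid5(uuid.NAMESPACE_DNS, machine_fingerprint))

def dedupe_reports(reports):
    """Keep only the latest hardware report per install UUID."""
    seen = {}
    for fingerprint, hardware in reports:
        seen[install_id(fingerprint)] = hardware
    return seen

reports = [
    ("host-a", ["atom_n270"]),
    ("host-a", ["atom_n270"]),  # duplicate report, same machine
    ("host-b", ["rtl8139"]),
]
print(len(dedupe_reports(reports)))
```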

You might be right, but I never looked at it that way. When I tried to get Nortel (London, Ontario) into "AI", I never used that label. I preferred the label "Expert System": a means to relieve the tool designers of the "grunt work" and help roll out a more structured workflow. Corporate R&D accepted my proposal to pursue the "Design Verification Checklist and Troubleshooting Assistant" for plastic injection-moulded housings for desktop residential/business telephones, and it was progressing through its initial phases (Teknowledge consulting; a semi-automated layout-generating and drawing auto-dimensioning tool). We never got the chance to finish because, one year into the multi-year project, the corporation brought down the axe and shuttered the division: "plastics", the division's core revenue, was deemed no longer core business and was consequently shed like old skin. They failed to realize that it was one of their primary "gravy trains" sustaining the rest of the business, even if it wasn't predominantly electronics.