This is very interesting. I have no real idea what is causing this problem, but based upon the facts you have provided so far, I can make a wild-donkey guess (my avatar image is a donkey!).
To recap, I believe you said:
- If you boot up with the `splash` boot option, the system hangs at a black screen.
- If you boot up without the `splash` boot option, the system boots, but autologin still doesn't work -- I'm guessing the system hangs with a black screen when autologin is about to begin?
- If you boot up with the `nomodeset` boot option, the system boots and autologin works -- but games run very slowly (probably using the `swrast`/`llvmpipe` software-rendering graphics driver instead of the `radeonsi` driver, which uses the GPU's computing capabilities; there's a quick way to check which one is in use just below this list).
- If you first boot into Windows, then reboot into Ubuntu MATE, everything works, even autologin, even with modesetting enabled (as it is by default).
- If you boot from a USB-/SATA-connected SSD instead of your normal NVMe-attached SSD, everything works even without first booting into Windows.
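By the way, if you want to confirm what is actually doing the rendering when you boot with `nomodeset` (or without it), these two commands should tell you -- `glxinfo` comes from the `mesa-utils` package:

```
# Which OpenGL renderer is Mesa using right now?
# "llvmpipe" = software rendering; an "AMD ..." string = radeonsi on the GPU.
glxinfo | grep "OpenGL renderer"

# Is the kernel's GPU driver loaded at all?
lsmod | grep -E "amdgpu|radeon"
```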
Now here's my suspicion: I suspect this is a problem involving the PCI Express bus(es) to which both your graphics card and your SSD are connected. After all, NVMe is Non-Volatile Memory Express -- PCI Express. You probably have a multi-core CPU, and when modesetting is enabled and you're booting from your NVMe SSD, the drivers for the two devices probably load at the same time. I'll bet that when you load Ubuntu MATE with modesetting enabled, Linux tries to load some "firmware" used to drive the GPU at the same time that it's trying to communicate with the SSD to load the rest of the operating system -- they probably step on each other's toes, the firmware doesn't get loaded into the GPU, and the graphics therefore don't work.
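If that guess is anywhere near the truth, the kernel log should complain about it. On a "bad" boot (modesetting enabled, booted from the NVMe SSD), it might be worth looking for firmware errors -- I'm assuming here that your card uses the `amdgpu` (or older `radeon`) kernel driver, since `radeonsi` sits on top of one of those:

```
# Look for GPU / firmware complaints in the kernel log on a bad boot
dmesg | grep -iE "amdgpu|radeon|firmware"

# The firmware blobs themselves live here on Ubuntu (linux-firmware package)
ls /lib/firmware/amdgpu/
```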
But if you load Windows first, then reboot, Windows probably loads the firmware into the GPU; then when you reboot into Linux, the firmware has already been loaded into the GPU, so Linux can concentrate on communicating with the SSD and no conflicts occur on the PCI Express bus.
And if you boot from the SSD attached via USB: While the USB controller in your system is most likely attached via PCI Express, it's probably integrated on your system's chipset and thus uses a separate, internal PCI Express bus, instead of the external PCI Express slots that your graphics card and NVMe SSD use. There's no conflict if the SSD and GPU use different busses.
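You can see the actual topology for yourself with `lspci`; the tree view makes it fairly obvious whether the graphics card, the NVMe controller, and the USB (xHCI) controller hang off the same root port or different ones:

```
# Show the PCI / PCI Express device tree; find the VGA controller,
# the NVMe controller, and the USB (xHCI) controller and note which
# root ports / bridges each one sits under.
lspci -tv
```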
So, what to do? I sadly don't know. I kid you not, my hardware is so old and lame that I don't have to worry about weird conflicts like this. But whatever the specific solution is, it sounds to me like you need to find some way to ensure that the driver for your NVMe SSD loads only after the driver for your GPU has loaded completely and has loaded its firmware, too. Whatever the answer is, you probably need to research boot parameters understood by your initrd or initramfs; one possible (untested) approach is sketched below.
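One thing that might be worth experimenting with -- and this is only a sketch, I haven't tested it, and it assumes your GPU uses the `amdgpu` kernel driver and that the `nvme` driver is built as a module rather than into the kernel -- is a modprobe "softdep" rule that asks the module loader to bring up the GPU driver before the NVMe driver:

```
# Hypothetical file: /etc/modprobe.d/load-gpu-before-nvme.conf
# "softdep" asks modprobe to load amdgpu before it loads nvme.
# This only has an effect if nvme is a loadable module, not built into the kernel.
softdep nvme pre: amdgpu
```

After creating that file, rebuild the initramfs with `sudo update-initramfs -u` (so the rule is available that early in the boot) and reboot. No promises, but it's cheap to try and easy to undo.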
Also, is it possible for you to boot up in UEFI mode instead of legacy BIOS mode? In my general experience with legacy BIOS mode, Linux boots with the graphics initially in VGA mode, which does not use the GPU hardware at all (it uses fallback hardware with no fancy GPU computing capabilities) -- then Linux switches into full-resolution mode, using the GPU hardware (which requires the firmware to be loaded). Whereas every UEFI I've seen initializes the GPU in full-resolution mode from the very start, giving Linux a bit of a head start (so it never needs to begin in VGA mode and then initialize the GPU totally afresh). I'll bet modesetting will work with your NVMe SSD on Linux if you enable UEFI boot.
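(A quick way to check how you're currently booted, by the way: the directory below only exists when the system was started in UEFI mode.)

```
# If this directory exists, the running system was booted in UEFI mode;
# otherwise it was booted in legacy BIOS / CSM mode.
[ -d /sys/firmware/efi ] && echo "UEFI boot" || echo "legacy BIOS boot"
```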