Hello, I’m facing some rather annoying issues with running KDE6 on Wayland. Despite consulting the Arch Wiki and various online resources, I haven’t been able to get it working. X11 works fine, so I’ve been sticking with X exclusively due to my limited experience with desktop Linux and troubleshooting.
For context, my setup includes an AMD 7950x with Unraid as the main OS. My daily desktop runs on a mainline Arch VM with an RTX 3070ti and a dedicated USB controller card passed through. This setup has been stable for months, serving my gaming and media needs.
When attempting a Wayland session, I select it, observe a brief black screen flash, and then return to the login prompt. This issue persisted through upgrades and tests on different distributions like Manjaro and EndeavourOS. I’ve also tried multiple Wayland compositors, all with the same outcome.
The only success I’ve had with Wayland was on Garuda, which defaults to Wayland. I’m keen on using Wayland due to its evolving features, especially since it’s essential for running Waydroid.
Any assistance would be greatly appreciated. While my use case may be unique, Waydroid is crucial for specific Android apps vital to my workflow.
Thanks in advance!
This should work for all Nvidia users running Plasma 6 on Arch, Garuda, etc.:
Edit /etc/mkinitcpio.conf
MODULES=()
Change the line and add the Nvidia modules:
MODULES=(nvidia nvidia_modeset nvidia_uvm nvidia_drm)
Save the file and regenerate the initramfs:
sudo mkinitcpio -P
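If you want to double-check that the modules actually ended up in the image, something like this should work (assuming the default /boot/initramfs-linux.img for the stock linux kernel; adjust the path if you use linux-lts, linux-zen, etc.):

```
# List the contents of the generated image and check that the Nvidia modules are in it
lsinitcpio /boot/initramfs-linux.img | grep nvidia
```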
Now add the kernel flags to GRUB: nano /etc/default/grub
There is a line like this:
GRUB_CMDLINE_LINUX_DEFAULT="loglevel=3 quiet"
add this to the line:
GRUB_CMDLINE_LINUX_DEFAULT="loglevel=3 quiet nvidia_drm.modeset=1"
Now generate the new GRUB config:
sudo grub-mkconfig -o /boot/grub/grub.cfg
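Before rebooting, you can quickly confirm the flag made it into the generated config, for example:

```
# The parameter should appear in the kernel command lines of the generated config
grep nvidia_drm.modeset /boot/grub/grub.cfg
```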
Reboot and have fun with Wayland. You may also need to install xorg-xwayland.
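After the reboot, a couple of quick checks (and the optional package) look roughly like this:

```
# Should print "Y" once the nvidia_drm module is loaded with modeset enabled
cat /sys/module/nvidia_drm/parameters/modeset

# Inside the new session this should print "wayland"
echo $XDG_SESSION_TYPE

# XWayland is needed for X11-only apps running under the Wayland session
sudo pacman -S --needed xorg-xwayland
```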
It is not necessary to add the Nvidia modules to the initramfs. The important part is
nvidia_drm.modeset=1.
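If you just want to confirm that the parameter is actually reaching the kernel, check the running command line after a reboot:

```
# nvidia_drm.modeset=1 should show up here once you have rebooted with the new config
cat /proc/cmdline
```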
Sorry it took me so long to test this all out. Thanks a bunch, this seems to have solved the issue. I guess I did indeed miss a step. There are certainly some weird bugs I’m getting right now with flickery apps and such, but that’s sort of what I expected. HDR even works, which is very nice to see. Regardless, I appreciate you taking the time to assist me here; I will have a play around and get my Waydroid config all sorted now.
Thank you!
Honestly, you can probably just wait it out. The problem comes from a lack of explicit sync support, and that’s just been merged into Wayland and KWin. Once the Nvidia beta drivers with the feature are released (a month or two?), most Nvidia/Wayland problems should go away.
I’m guessing you forgot to enable DRM kernel mode setting. What you need to do is create
/etc/modprobe.d/nvidia.conf
, and add
options nvidia_drm modeset=1
blacklist nouveau
then edit
/etc/mkinitcpio.conf
to have
MODULES=(nvidia nvidia_modeset nvidia_uvm nvidia_drm)
and ensure
/etc/mkinitcpio.conf
also has the modconf hook:
HOOKS=(... modconf ...)
then run
mkinitcpio -P
, restart, and enjoy Wayland.
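Put together, the relevant bits of the two files would look roughly like this (only the lines that matter; leave the rest of your mkinitcpio.conf alone):

```
# /etc/modprobe.d/nvidia.conf
options nvidia_drm modeset=1
blacklist nouveau

# /etc/mkinitcpio.conf (relevant lines only)
MODULES=(nvidia nvidia_modeset nvidia_uvm nvidia_drm)
# the existing HOOKS=(...) line just needs to contain modconf
```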
It’s been working for months, so I guess you have the drivers installed. For me, I needed the “wayland-protocols” package for it to work on my RTX 3070.
And the obvious ones others have pointed out, the kernel parameters.
I do not think your issues are in any way unique. I am also using Plasma 6 on Arch with a 3080 Ti. Initially, it simply booted to a black screen with nothing but a cursor and no ability to do anything.
I added the changes to /etc/mkinitcpio.conf per @markus@hubzilla.markusgarlichs.de 's reply, as well as adding those kernel params to my /boot/refind_linux.conf, and now it boots into a desktop, but I am seeing large amounts of flickering: windows going black, or visual elements flickering in and out. Looking at my installed packages, I already had both xorg-xwayland and wayland-protocols (per @Varen@kbin.social 's reply) installed.
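For anyone else on rEFInd rather than GRUB, the change amounted to appending the parameter to the options string in /boot/refind_linux.conf, roughly like this (the label and root= value below are placeholders, not my actual entry):

```
# /boot/refind_linux.conf: append nvidia_drm.modeset=1 to the existing options string
"Boot with standard options"  "root=PARTUUID=xxxxxxxx rw nvidia_drm.modeset=1"
```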
It seems like there are still more steps before what I would consider a usable result. Personally, it was not “obvious” to me when Plasma 6 rolled out that I needed to do any of this.
Some of the flickering can be gotten rid of by disabling hardware acceleration for QtWebEngine.
I’ve got
export QTWEBENGINE_DISABLE_GPU_THREAD=1
export QTWEBENGINE_CHROMIUM_FLAGS="--disable-gpu-compositing --num-raster-threads=1 --enable-viewport --main-frame-resizes-are-orientation-changes --disable-composited-antialiasing"
in .bashrc. Note that there is still enough flickering left to annoy, and some (AppImage?) apps don’t seem to register the setting.
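Since .bashrc only reaches apps started from a terminal, it may also be worth putting the same exports in a script under ~/.config/plasma-workspace/env/ so the whole Plasma session picks them up; a sketch (the file name is just an example):

```
# ~/.config/plasma-workspace/env/qtwebengine.sh (file name is just an example)
# Scripts in this directory are sourced when the Plasma session starts,
# so the variables also reach apps launched from the application menu.
export QTWEBENGINE_DISABLE_GPU_THREAD=1
export QTWEBENGINE_CHROMIUM_FLAGS="--disable-gpu-compositing --num-raster-threads=1 --enable-viewport --main-frame-resizes-are-orientation-changes --disable-composited-antialiasing"
```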
Thanks for the reply!
I tried this, and it does not seem to have helped much, if at all. Visual elements like scroll bars and text boxes in Steam continue to flicker, and apps like Discord flicker or go completely black randomly. My main use case for this desktop is gaming, so sadly without more of a fix I am not sure that I can move away from X11 yet.
Confirming, this is my experience as well. It’s unfortunate, but a thing regardless.
Is there any specific reason you’re on nVidia? Can’t you sell your card and get an AMD instead?
This is not a good answer to what’s asked, though. You’re now asking me to sell what I already have?
First of all, it wasn’t an answer, it was a question. I was genuinely curious if OP had a legitimate reason to use nVidia.
Second, it makes sense to use hardware that’s well supported on Linux and to avoid hardware known to cause issues. Even two seconds of googling would show you how notorious nVidia is on Linux; heck, even Linus Torvalds had some less-than-polite things to say about nVidia.
And what’s wrong with selling? People buy not-fit-for-purpose things all the time, or people’s requirements may change, and what was once useful may not be ideal any more. There’s no law saying you need to use something you buy for the rest of your life.
I already own the 3070ti because it was just what I had before switching to Linux. It works great, and I’m not keen on trying to swap cards until I can afford one that is a reasonably large step up. Plus, when I do upgrade, I plan on making this one a dedicated transcoding/AI card, so I don’t want to sell it. I do intend to never buy Nvidia again after living through tons of driver BS, while my other AMD system has been a very smooth experience. Generally, it seems to me that selling the card and buying another when it’s not needed is sort of a throwing-the-baby-out-with-the-bathwater situation.
Linus is absolutely right on Nvidia. The thing is: The graphics server should work with the card, not the other way round.
Why replace functioning, unbroken hardware just because the software can’t handle it?