What’s so much better about Wayland than X? I mean, I’m not really a fan of X and the security nightmare that it is, but as a user it’s all pretty plug and play these days. What does a normal user get out of Wayland? Would they even know they’re using it?
I’d love to try it, but it currently won’t work with some software I use, so I haven’t bothered… And honestly I’m kind of confused about how everybody is talking about how amazing Wayland is (and how it seems to suddenly be the one true path for a bunch of distros) when my only experience with Wayland is people talking about how great it is and then not being able to screenshare or whatever… Which doesn’t make it seem great from the outside? That maybe sounds a bit flippant, but I genuinely don’t understand why “normal” people are so excited? I mean, I can see people caring about features like HDR and maybe that’s easier to build into Wayland than ancient X11, but I’d be more excited about the specific feature than Wayland itself which may make implementing these things easier?
Wayland cuts out all of the dead features and lets content be drawn to the screen more directly. The result is a simpler architecture and better battery life.
Do wmctrl, xdotool and similar work with it, and if not, what are their equivalents under Wayland?
Wayland is a protocol, so everything like that is implemented by the desktop.
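To make that concrete, here's a minimal sketch of what "talking the protocol" looks like (an illustration only, assuming libwayland-client is installed; build with `gcc list-globals.c -lwayland-client`). It connects to the running compositor and prints the interfaces it advertises. Core Wayland has no global interface for managing other clients' windows or injecting input, which is why wmctrl/xdotool have no generic drop-in equivalents; that functionality lives in compositor-specific protocols and tools.

```c
// list-globals.c — minimal sketch, assuming libwayland-client is available.
// Connects to the running Wayland compositor and prints every global
// interface it advertises. Note there is no core interface for managing
// other clients' windows or injecting input — that's up to the compositor.
#include <stdio.h>
#include <wayland-client.h>

static void on_global(void *data, struct wl_registry *registry,
                      uint32_t name, const char *interface, uint32_t version)
{
    printf("%-40s v%u\n", interface, version);
}

static void on_global_remove(void *data, struct wl_registry *registry, uint32_t name) {}

static const struct wl_registry_listener listener = {
    .global = on_global,
    .global_remove = on_global_remove,
};

int main(void)
{
    struct wl_display *display = wl_display_connect(NULL); // uses $WAYLAND_DISPLAY
    if (!display) {
        fprintf(stderr, "no Wayland compositor found\n");
        return 1;
    }
    struct wl_registry *registry = wl_display_get_registry(display);
    wl_registry_add_listener(registry, &listener, NULL);
    wl_display_roundtrip(display); // wait for the compositor to announce its globals
    wl_display_disconnect(display);
    return 0;
}
```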
Other than that, it doesn’t really bring much to the table currently. Not everyone needs (or wants) HDR and many of the other features that I would like to have are still in the works, so… I don’t really see a reason to use it, at least not now.
Support for HDR, variable refresh rate, direct draw and battery improvements sound like a very good list to have, other than the overall leaner build. You personally not caring about it doesn’t change the fact that it’s good to not stagnate when it comes to things like this.
VFR 🤨… I mean, does anyone actually use that? It flopped for video content, I seriously doubt anyone is gonna use that on a PC.
DirectDraw is an MS specific thing, part of DirectX. How does that fit into Wayland?
The second, I would actually LOVE to get in any frame server, X or Wayland, but that will most probably never happen.
Variable refresh rate has become the de facto standard in modern gaming. They aren’t referring to the DirectDraw API, but to the fact that Wayland doesn’t have the extra baggage of drawing to the screen through a separate display server. Wayland just draws to the screen directly, saving time and overhead.
VRR is fantastic for games, I really notice the difference and I use Wayland because of it.
The downside to that is (from my understanding) Wayland forces some form of Vsync on everything, so if you don’t have a VRR monitor then games can become very stuttery and have noticeable input lag. There is an option to “force lowest latency” which supposedly allows screen tearing for things like games, though I didn’t test how well it worked myself.
If people are interested in experimenting, then VRRTest is a great utility to see what VRR is doing and to test various settings.
The biggest feature of Wayland for me is that mixed refresh-rate monitors work OOB. On X this is a pain to get even remotely working, and it’s impossible if your refresh rates don’t divide evenly (120/60 works, 144/60 stutters).
In my experience this is becoming a much more common issue (high refresh-rate laptops with 60Hz external monitors at businesses, or a high refresh-rate monitor for gaming plus a smaller secondary monitor for info lookup/Discord).
Other than that, Xorg does win the “more stable” prize for me, but if I wanted stability, I should’ve become a carpenter.
144/60 works fine for me on X. I only had to disable Vsync for the compositor. Games now run at full 144Hz on my main monitor, and the other two are running perfectly fine at 60Hz.
Though I’m still waiting for the day that I can finally make the jump to Wayland when nvidia support improves (or I have enough money for a new AMD GPU).
If you’re using the latest Nvidia drivers, try it out. I heard support improved dramatically with the latest releases.
Literally just plug in the monitor and it works. Is this what Wayland people consider hard? No wonder they won’t implement anything remotely complex in their protocol.
Mixed refresh rates do not work well because X technically is not doing true multi-monitor. Both monitors are rendered from the same “screen” that uses one refresh rate. If it’s running at 144Hz, the 60Hz monitor gets frame pacing issues. If it runs at 60Hz, then the 144Hz monitor is slow and gets frame pacing issues, and from most anecdotes and videos I’ve seen, it’s usually the latter and a pain to fix. If you wanted perfect frame pacing on both, you’d have to run the X11 screen at 720Hz (the lowest rate that both 144 and 60 divide into evenly), which modern systems can’t actually drive. Wayland, on the other hand, has multi-monitor support built in and actively used. Each display has its own output and renders at its preferred refresh rate, giving perfect frame rates and frame times for both.
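To sanity-check the arithmetic (just a sketch, using the numbers from this thread): a single shared X screen can only pace both monitors perfectly if its rate is a common multiple of both refresh rates, i.e. at least their least common multiple.

```c
// Sketch: smallest single refresh rate that can perfectly pace two monitors.
#include <stdio.h>

// Greatest common divisor via Euclid's algorithm.
static unsigned gcd(unsigned a, unsigned b)
{
    while (b != 0) {
        unsigned t = a % b;
        a = b;
        b = t;
    }
    return a;
}

// Least common multiple: the lowest rate both monitors divide into evenly.
static unsigned lcm(unsigned a, unsigned b)
{
    return a / gcd(a, b) * b;
}

int main(void)
{
    printf("120Hz + 60Hz -> needs a %uHz shared screen\n", lcm(120, 60)); // 120: fine
    printf("144Hz + 60Hz -> needs a %uHz shared screen\n", lcm(144, 60)); // 720: not realistic
    return 0;
}
```

Which is why 120/60 feels fine on a single X screen, while one of the two monitors in a 144/60 setup always ends up with uneven frame pacing.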
It literally doesn’t work on X11 lol
Then how am I doing it?
It’s been plug and play for a decade but sure.
No it hasn’t. You need to do a weird workaround.
Yeah, well, not knowing that it can’t work or needs a workaround, I’ll just keep doing it then.
“I made that chair. It’s stable AF.”
😂😂😂
Don’t you need a HDR monitor for HDR?
Yes, I believe so.
I don’t even want to know the price. I bought myself a new monitor for Christmas and I doubt it has that.
Mine are all standard as well, usually 10+ years old. I absolutely have no need for HDR, but I get that some people would like to use that.
Here’s the sad truth that Wayland haters hate: Wayland is way more performant and streamlined. X11 is an overly patched mess.
Every time I had to install a distro, EVERY TIME, I had to do some text-file hacking to avoid screen tearing with X11. In Wayland that bug is virtually impossible.
Forget about making touchscreens work properly in X11, especially with a secondary screen.
I also remember all the weird bugs that appear in X11 when you have two screens with different scaling. No issue at all with Wayland.
Pretty basic stuff in any modern setup.
Wayland performs perfectly on platforms like KDE Plasma or GNOME. I miss no features. It just requires that some proprietary apps realise its potential, and that is already happening and will continue throughout 2024.
First of all, X is not a security nightmare. There have been zero cases of someone getting hacked because of an X exploit. It’s FUD.
Now, Wayland is a fad (haha). It’s not that much better than X, and when it was drafted 10 years ago everyone just ignored it. Over the decade it became clear that X is stuck and will at some point become obsolete, so people started looking at alternatives and Wayland started getting some traction. Over time different tools started getting Wayland support, some people started getting excited about it, and a kind of new meme developed where using Wayland meant that you’re ahead of everyone else (just like using Arch, BTW). In the end it’s just a nice PR stunt. Ask people what specifically is so great about Wayland and they will mention some obscure features most people don’t need and features that it will have ‘soon’. In the long term the move will hopefully be a good thing, but as of now, if you don’t specifically need the few features it has, you can keep ignoring it.
One of my favorite features: no tearing when watching movies.
Who uses that?
There are some really major deficiencies in Xorg that aren’t present in Wayland. The main one that made me switch was proper support for variable refresh rate, and the ability to mix and match any fixed or variable refresh rate displays you want.
It’s a super common use case to have a primary monitor with high refresh rate and VRR, plus one or two cheaper monitors that don’t. Xorg doesn’t really support that at all without some really hokey tricks that severely impede usability.
Proper sync support is another one. Yes, you can set TearFree in X, but the implementation is crap. You’ll still get tearing in a lot of programs, and at least in my experience it introduces a pretty significant and perceptible input lag, far more than needed to eliminate tearing.
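For anyone who hasn’t had to do the “textfile hacking” mentioned upthread: the TearFree being referred to is a per-driver Xorg option, typically set in an xorg.conf.d snippet along these lines (an example sketch for the amdgpu DDX driver; the intel driver has an equivalent option):

```
# /etc/X11/xorg.conf.d/20-amdgpu.conf — example sketch, assuming the amdgpu DDX driver
Section "Device"
    Identifier "AMD Graphics"
    Driver     "amdgpu"
    Option     "TearFree" "true"
EndSection
```

It trades tearing for extra buffering, which is presumably where the added input lag comes from.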
I wish Wayland shills would stop spreading this lie about mixed refresh-rate setups not working on Xorg. It literally just works. In fact, I’m doing it right now on my laptop with a 144Hz 1080p monitor and an external 60Hz 1440p monitor connected over Thunderbolt, with a dual-GPU setup (iGPU + nVidia, which Wayland doesn’t properly support, yet this is somehow nVidia’s fault even though Wayland compositors run entirely in user space, without interacting with the driver directly).
That’s why it doesn’t make sense to argue about it with Wayland fans. They always find this one obscure feature that X is missing and then claim it’s absolutely essential for everyone to have. Most people have just one monitor, two equal/similar monitors, a handheld device with one screen, or (and that’s the vast majority) simply don’t give a fuck that one of their monitors is running at a lower refresh rate. I’m glad Wayland finally found some traction with gamers obsessed with those things and is being adopted, but the constant BS about everyone needing it is getting boring.
For one, mixed VRR is not an obscure feature. Most of my friends with gaming rigs have a primary monitor with VRR and use their old fixed-rate monitors as secondary displays. Does it make a massive difference compared to fixed refresh rate? No, but it is noticeable and nice to have. Windows can do it, and I paid for the hardware. Without parity on this kind of stuff, Linux is a hard sell to the people who do care about it.
Does it matter to Joe Schmoe? Probably not, but Joe Schmoe probably doesn’t care about Linux to begin with. You have to go for the tech enthusiasts first before you can get it to the masses.
1.6% of gamers use Linux. 25% of developers use Linux. The typical tech enthusiast is not a gamer. Just because people in your bubble use VRR doesn’t mean it’s important to the majority of users. Most Linux users don’t care.
1.6% of gamers is still millions of people. Entire industries exist on the back of much smaller customer bases than that. Might as well say we should stop caring about desktop Linux completely, since the server market dwarfs it.
I’m not saying we should just ignore it. I’m saying that it took the time it took (a decade) for Wayland to become a thing because most people don’t need it. Some people do, and that’s why it’s now getting traction, but most people can still safely ignore it.
With VRR? Xorg definitely did not support this as of a year or so ago without running a separate Xorg screen for each monitor, which prevents you from doing stuff like moving windows between your displays.
Mixed refresh rates worked okay-ish but VRR definitely did not work well in multi monitor setups.
I believe we’re specifically talking about VRR, which for me in Kubuntu did not work properly without switching to Wayland.
Well explained 👍.
It is not a “fad”. Major distros have defaulted to Wayland (Ubuntu, Fedora, Red Hat, Debian, Manjaro, etc.).
X11 is old and designed for the use cases of the 1980s. A lot of features have gradually moved out of X11 into the kernel or into compositors, but the core X11 system is still limited by legacy design decisions and needs workarounds (which are complex to build and maintain).
Wayland is built to be a modern system designed for current usage and needs. A lot of the benefits are not immediately obvious to the end user - a desktop is a desktop. But desktop projects like KDE, who build user interfaces, are hitting X11’s limitations all the time, and a lot of effort goes into working around X11’s limits compared to working with Wayland. Effort spent working around X11 is time and work that could have been spent elsewhere on other fixes, new features and innovations.
The push to Wayland is deliberate and necessary, but was not always inevitable. Now that it’s being adopted so widely as the default by big distros and projects it is likely inevitable. It has essentially reached critical mass.
I think a lot of people asking “what’s the point” are not the ones working to build systems and distros at the back end. It’s easy for us as end users to take for granted all the work behind the scenes that make our desktops “just work”. But if you’re a volunteer building a compositor fit for 2024, I can see why it’d be frustrating working around the limitations of a system built for 1984.
X11 has served us incredibly well and is a hugely important project. But Wayland is the way forward.
When Wayland can do and run everything X11 can, without problems, plus everything it promises it can do, then I’ll make the switch. Till that time comes, I’m sorry, but it’s just not for me 🤷.
Sorry, I used the term “fad” to make a pun on X’s flaws being ‘FUD’ (haha). It’s not a fad in the sense that it will soon disappear. What I meant is that the excitement around it is not founded in actual benefits, and that it only recently became fashionable to support it.
While I don’t think X11 is great, I also don’t think Wayland compositors are inherently easier to develop. wlroots had to be created to make things easier for compositor devs.
It’s great on newer hardware, especially phones and tablets. For your 5-year-old laptop, it’s likely about the same as X11.
What does it do on new hardware? Not a lot of people are running normal desktop Linux on phones / tablets, are they? Which, totally cool if it works better on those things… but I guess I’m just surprised by how much hype there is for Wayland when X just works for me and would presumably just work for most people’s use cases. Like… who are all of these people that are emotionally invested in display servers, and what am I missing?
I mean, 20 years ago or whatever there was always the pain of black screens and X configs… but it just kind of works now in my experience?
For example, Pinetab 2 was developed and tested with Wayland and is more stable on it. Plus way better touchscreen support.