There is another side to the issue of user authorization. Sometimes a user is authorized to do something organizationally, but the technical controls refuse to allow it. Often this is simply because we can't write programs that do everything, but too often we also decide to limit what a program can do in the name of security. Security, as we know, is not a one-size-fits-all process, and there are deep problems when we decide very early what security is. That decision is the other side of the Authorized User problem.
I think the poster child for this problem is Accessibility. I want us to be on the same page about accessibility, so I went hunting for a primary source, and I think the best one is Linux After Dark Episode 72. If you want to see where I'm coming from, that episode is a quick summary (but I want to stress that there are many sources; I just wanted one to ground the discussion in, because this topic is complex). Also, please: if any of this is surprising to you, go find out more and figure out how you can help, because it's important. The important bits of the discussion for my purposes are minutes 7 to 10 of the podcast.
The salient points are:
- Many tools require a flag to allow accessibility
- Many tools have guard rails to prevent a tool from reading from another user (including privileged users)
- Many tools have guard rails preventing users from subscribing to the same resources from multiple TTYs (a pattern I sketch in code just after this list)
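
To make the pattern concrete, here is a minimal sketch of the kind of gate those points describe. Everything in it is hypothetical: the `ENABLE_A11Y` flag, the function name, and the parameters are stand-ins for illustration, not any real tool's API.

```python
import os

# A hypothetical gate in the style the list above describes: accessibility is an
# exceptional mode that must be requested with a flag, and requests are refused
# across user and TTY boundaries.

def allow_screen_reader(requesting_uid: int, target_uid: int,
                        requesting_tty: str, target_tty: str) -> bool:
    # 1. The whole feature is off unless an explicit flag was set at launch.
    if os.environ.get("ENABLE_A11Y", "0") != "1":
        return False
    # 2. Guard rail: never read content owned by another user (including root).
    if requesting_uid != target_uid:
        return False
    # 3. Guard rail: never subscribe to the same resource from a second TTY.
    if requesting_tty != target_tty:
        return False
    return True
```

Much of the rest of this post is about why each of those `return False` lines, reasonable as it looks in isolation, is the wrong default for a general-purpose computer.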
I think that security people tend to be [afraid of certain classes of threat](https://www.businessofapps.com/news/malicious-screen-readers-can-extract-data-from-92-of-finance-apps/). That is a pretty reasonable thing to worry about. There's a strange dynamic here: when someone is doing something unexpected, security people will want security by default, and secure by default is a good idea in our daily security work. I even think the point that you need to bake accessibility into a product from the beginning probably resonates with security people, since we all strongly believe that you need to bake security into the product, and we've all met products that can't ever be made secure because of decisions taken years ago.
But we can all agree that it's absurd to force someone to set a flag in order to use a monitor. Can't we? Can we agree that it's absurd to prevent a user from seeing a program they ran with sudo? Imagine you run `gksudo synaptic` and suddenly a black window shows up, so you file a bug. Then the GitHub issue is closed with "Users should not be able to display windows owned by root, that would be absurd from a security perspective. A nearby person might see the window's contents!". We don't make this argument, because we all agree that computers are meant to be used, that users can be asked to use them responsibly, and that if they want to black out the screen they can. Yet for some reason it's considered reasonable to give exactly this experience to users of screen readers.
I recently saw a comment claiming that it would be absurd for there to be a universal way to zoom the screen on Wayland. I ... did not shout at anyone. Only because it was an older comment in a thread I was reading while researching this post.
There's a trend in computer design which amounts to deciding that certain kinds of users won't be allowed to use their computers. The flagship of this trend is the Pwn2Own competition, a competition to confirm that device owners are not able to use their devices in ways not approved by the designer. Which is a slightly wild idea for a project. These devices are designed so that users are not technically authorized to do things that the organization that owns the device wants them to be able to do. And Wayland's security model seems to be that users should be prevented from doing things with their compositor that they want to be able to do (with accessibility added as a workaround). The really important question is: what should a user be authorized to do?
Of course, we need to be clear that we can layer additional tools on top of these things. It's valid to disallow screen readers by policy on a computer where you know the full list of valid users and none of them have chosen to use a screen reader. But that needs to be the non-default state. We can have an accessibility flag, but it needs to default to on, allowing screen readers, not off. A lot of device-level MDM tools are designed for exactly this kind of thing: making devices not generic tools, but locked-down, purpose-specific ones. This is also why everyone who has ever used SELinux without a good reason hates it. SELinux isn't for daily living; it's designed to do this kind of thing. Am I advocating for security by exception? No! I'm advocating for good security modelling.
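
As a sketch of what I mean by defaulting to on, and with the caveat that the policy object and every name here are invented for illustration, the shape I want looks like this:

```python
from dataclasses import dataclass

@dataclass
class MachinePolicy:
    # Defaults to True: a generic computer serves screen-reader users out of the box.
    # An administrator who knows every valid user can deliberately opt out.
    allow_assistive_tech: bool = True

def allow_screen_reader(policy: MachinePolicy) -> bool:
    # Denial is an explicit, local policy decision, not a designer's default.
    return policy.allow_assistive_tech

# Default machine: screen readers just work.
assert allow_screen_reader(MachinePolicy())
# A locked-down fleet machine where the admin knows nobody needs one.
assert not allow_screen_reader(MachinePolicy(allow_assistive_tech=False))
```

The code is trivial on purpose; the entire argument is about which value goes in the default.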
When designing a security model for an operating system, a compositor, or any other tool, you need to be able to ask questions about valid authorizations. Designing a threat model is often the beginning of wisdom, not the end. If you are worried about a threat where a screen reader leaks your data, then the question that needs to be asked is how to mitigate that threat. And I'm worried that lots of tools are deciding that the solution is to disallow the screen reader in some cases. Which is at the very least immoral, definitely discrimination, probably illegal in your jurisdiction, and probably not a good security decision either.
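
To show the difference between banning the tool and mitigating the threat, here is one possible shape of a mitigation. Again, every name is hypothetical and this is only a sketch; the point is that the check targets untrusted clients rather than accessibility itself.

```python
# The threat is an unexpected client scraping screen contents, so the mitigation
# targets unexpected clients, not screen readers as a category.

def grant_accessibility_access(client_id: str, user_trusted_clients: set[str]) -> bool:
    # The user (or their administrator) enrols the assistive tools they actually
    # use; anything else is refused, whether or not it calls itself a screen reader.
    return client_id in user_trusted_clients

# The screen reader the user chose keeps working...
assert grant_accessibility_access("orca", {"orca"})
# ...while an unknown client claiming to be one does not.
assert not grant_accessibility_access("definitely-not-malware", {"orca"})
```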
I think this comes out of the attitude driving Pwn2Own. The large OS and device designers have a limited set of designed usage patterns, and a desire to force people onto their clouds and their tools, in order to keep them viewing their ads and to manage their expectations. And this lets them feel that there's a very narrow list of valid actions and normal activities, and they are willing to cry "security" to disallow any other flows. This has knock-on effects which I would like to keep us aware of. There's an old joke where the first security person says "The only secure computer is the one thrown down the Marianas Trench", and the second one says "encase it in concrete first". This joke exists for a very good reason. We need to accept that computers must do things that carry threats. We need to allow people accessibility, and accessibility needs to exist even off the paths you thought of.
Because we can't just secure computers.