Electrical engineer here who also does hobby projects. I'm with you. I think part of the reason is that modern GaN-type green or blue LEDs are absurdly efficient, so only a couple of mA of drive current is enough to make them insanely bright.
When I build LEDs into my projects as simple indicator lights, I might run them at only a tenth of a milliamp and still get ample brightness to tell whether something is on in a lit room. Giving them the full rated 10 or 20 mA would be blindingly bright. I also usually design most things with a hard on/off switch so they can be turned all the way off when not in use.
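For anyone who wants to try this, it's just Ohm's law across the series resistor. A rough back-of-envelope sketch, assuming a 3.3 V rail and ~2 V forward drop (made-up but typical numbers, not from any specific part):

```python
# Back-of-envelope series resistor sizing for a dim indicator LED.
# All values are assumptions for illustration, not from a real design.

V_SUPPLY = 3.3      # volts, assumed supply rail
V_FORWARD = 2.0     # volts, rough Vf for a modern green/blue LED
I_TARGET = 0.0001   # amps (0.1 mA), the "visible but subtle" level

r = (V_SUPPLY - V_FORWARD) / I_TARGET
print(f"Series resistor: {r:.0f} ohms")   # -> 13000 ohms (~13 kΩ, an E24 value)
```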
Speaking of things I own: I also have two power strips with absurdly bright LEDs indicating the surge protection is active. They light up my whole living room with the lights off. If I had to have something like that in my bedroom, I would probably open it up and disconnect the LEDs somehow, or modify the resistor values to run at the lowest current I could get away with.
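If you're curious what that mod looks like numerically: for a fixed supply and forward drop, current scales inversely with the series resistance. A quick sketch with assumed values (a hypothetical 5 V / 150 Ω stock design, not measured from any real power strip):

```python
# Hypothetical stock indicator circuit: 5 V supply, ~2 V Vf, 150 ohm resistor.
V_SUPPLY, V_FORWARD = 5.0, 2.0   # volts, assumed
R_STOCK = 150.0                  # ohms, assumed stock resistor
I_TARGET = 0.0001                # amps (0.1 mA), the dim target

i_stock = (V_SUPPLY - V_FORWARD) / R_STOCK    # = 0.020 A, i.e. 20 mA stock
r_new = (V_SUPPLY - V_FORWARD) / I_TARGET     # = 30000 ohms for 0.1 mA
print(f"Stock: {i_stock*1000:.0f} mA; swap {R_STOCK:.0f} Ω for ~{r_new/1000:.0f} kΩ")
```

So taking a stock 20 mA design down to 0.1 mA means roughly a 200x larger resistor, which is a one-component swap if you can get the strip open.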
I feel like designers have lost sight of the fact that these lights are meant to be indicators only, i.e. a subtle cue about the status of something, not room lighting, and yet they default to driving them at full rated current as if they were the dim older-generation LEDs from 20+ years ago.
I could see that being the case for some products, at least where an older design is being updated or value-engineered (VE'd). Perhaps some sourcing person changes the LED part number on the AVL and forgets to check with engineering whether the matching resistor value still gives a sane brightness.