I design electronics sometimes. Generally, people want an indicator light on their product, since it's a cheap way to show the state of a system.
The main problem is that the human eye adapts to darkness. You can still clearly see an LED in a dark room when only a few microamperes pass through it, but at that current it's invisible under normal room lighting. There's no single amount of current that produces light bright enough in a lit room yet not too bright in a dark one.
I can fix that by occasionally turning the LED off and measuring the voltage across it (LEDs detect light in addition to emitting it), then dimming it when the room is dark. However, this is fairly involved: it requires a capable microcontroller and a pretty ninja embedded systems programmer, and most product developers I know won't think to do it.
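A minimal sketch of that idea, assuming an Arduino-style part with the LED anode driven from a PWM pin and the same node also routed to an ADC pin; the pin numbers, thresholds, and timing are made up for illustration:

```c
// Illustrative ambient-light sensing using the indicator LED itself.
const int LED_PIN   = 5;   // PWM-capable pin driving the LED anode (assumption)
const int SENSE_PIN = A0;  // ADC pin wired to the same anode node (assumption)

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  // Tristate the drive pin so the LED floats and acts as a photodiode.
  pinMode(LED_PIN, INPUT);
  delayMicroseconds(200);               // let the junction settle

  // Under ambient light the LED generates a small photovoltage;
  // in the dark this reads near zero.
  int ambient = analogRead(SENSE_PIN);  // 0..1023 on a 10-bit ADC

  // Resume driving, scaling the duty cycle to ambient light. The floor
  // of 2 keeps the indicator barely visible at night (assumed value).
  pinMode(LED_PIN, OUTPUT);
  int duty = constrain(map(ambient, 0, 200, 2, 255), 2, 255);
  analogWrite(LED_PIN, duty);

  delay(50);                            // sample again in ~50 ms
}
```

The off-period is brief enough that the eye never notices the measurement gap.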
Finally, I can save 0.1 cents (plus board space and assembly complexity, which cost more) by connecting an LED directly to a microcontroller pin instead of limiting the current with a resistor. Some microcontrollers explicitly allow this, up to 10 or 20 milliamperes per pin, which is already enough to be too bright in some contexts. Margins in hardware manufacturing are extremely thin, so shaving even a cent off a board is pretty important.
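For contrast, sizing that resistor the normal way is just Ohm's law; a quick sketch where the supply voltage, forward drop, and target current are illustrative assumptions, not values from any particular part:

```c
#include <stdio.h>

/* Ohm's-law sizing for the usual LED series resistor. */
static double led_resistor_ohms(double v_supply, double v_forward, double i_amps) {
  return (v_supply - v_forward) / i_amps;
}

int main(void) {
  /* e.g. a red LED (~2.0 V drop) from a 3.3 V pin at 5 mA:
   * (3.3 - 2.0) / 0.005 = 260 ohms, so the next standard
   * value up (270 ohms) would do. */
  printf("%.0f ohms\n", led_resistor_ohms(3.3, 2.0, 0.005));
  return 0;
}
```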
All of this together leads to a lot of LED proliferation, which I don't like either. The stuff I build for myself often has a way to control LED brightness, although as a general rule that would be too expensive to add to a consumer product. My small devices have a tilt switch inside that turns off the indicator LEDs if you turn the device upside down and hold it there for a few seconds. That way you can just reach over at night and fix it without fumbling for switches or controls.
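A sketch of that hold-to-toggle logic, assuming an Arduino-style part with a tilt switch to ground on an input pin; the pins, the three-second hold time, and the toggle-back-on behavior on a second flip are all assumptions for illustration:

```c
// Illustrative tilt-switch latch for the indicator LEDs.
const int TILT_PIN = 2;              // tilt switch to GND, internal pull-up (assumption)
const int LED_PIN  = 5;              // indicator LED (assumption)
const unsigned long HOLD_MS = 3000;  // how long to hold it upside down (assumption)

unsigned long invertedSince = 0;     // 0 means "not currently inverted"
bool ledsEnabled = true;

void setup() {
  pinMode(TILT_PIN, INPUT_PULLUP);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  bool inverted = (digitalRead(TILT_PIN) == LOW);  // switch closed = upside down

  if (inverted) {
    if (invertedSince == 0) invertedSince = millis();
    // Held upside down long enough: toggle the LEDs, then wait until the
    // device is righted so one flip only toggles once.
    if (millis() - invertedSince >= HOLD_MS) {
      ledsEnabled = !ledsEnabled;
      invertedSince = 0;
      while (digitalRead(TILT_PIN) == LOW) { delay(10); }
    }
  } else {
    invertedSince = 0;  // reset the hold timer when righted
  }

  digitalWrite(LED_PIN, ledsEnabled ? HIGH : LOW);
}
```

The hold requirement is what makes this livable: a brief jostle doesn't kill the indicators, only a deliberate flip does.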