It seems that we perceive light similarly. I used to be a voracious reader and my parents were worried about my eyesight, as I tended to read in fairly dark conditions. Light that may seem necessary to others can make the pages of a book dazzle my eyes.
It's not that I prefer to read in dark conditions. I love sunlight, albeit not necessarily directly on the page of my book. I just don't like artificial lights to be too bright, especially later at night.
Screen/monitor backlight is usually configurable. In a brighter place turn it up. In a darker place turn it down. In a dark room minimal light is sufficient, but no light is, well, not enough.
Without backlight a modern LCD is simply very dark gray. My netbook has a function key to turn off the screen to save energy, something I've started using reflexively (like Ctrl+s) because it can really prolong battery life. What it actually does is not turn off the screen, but turn off the backlight: you can still make out some text and window decorations.
Anyway, turning the backlight all the way down is tolerable, but I'd like to go further. For example, I typically use my desktop monitor in a range of about 0 to 6, occasionally up to as much as 10 when it's particularly bright out and the sun shines in through the window just the "right" way. The thing goes up to a maximum brightness of 100...
On my laptop and netbook it's similar. I don't know whether the actual panels support the same fine-tuning as my desktop monitor, but only 10 steps or fewer are actually exposed to the user (and no, that's not a Linux bug, although on my netbook Linux does have this strange quirk where it changes the brightness two steps at a time, so I typically have to go all the way up or all the way down and then reverse to reach the brightness in between -- luckily it's an odd number of steps).
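Incidentally, on Linux the panel often accepts far finer raw values than the handful of steps the function keys expose. Here's a minimal sketch, assuming a sysfs backlight device (the name under /sys/class/backlight varies by driver, e.g. intel_backlight) and write permission (typically root or a udev rule):

```python
#!/usr/bin/env python3
"""Set the backlight to an arbitrary raw level via Linux sysfs."""
import glob
import sys

def set_backlight(level: int) -> None:
    # Pick the first backlight device sysfs exposes.
    devices = glob.glob("/sys/class/backlight/*")
    if not devices:
        sys.exit("no sysfs backlight device found")
    dev = devices[0]
    with open(f"{dev}/max_brightness") as f:
        max_level = int(f.read())
    # Clamp to the panel's raw range, which is often much finer
    # than the ~10 steps the brightness keys give you.
    with open(f"{dev}/brightness", "w") as f:
        f.write(str(max(0, min(level, max_level))))

if __name__ == "__main__":
    set_backlight(int(sys.argv[1]))
```

Whether raw level 1 actually sits below the lowest function-key step depends on the driver, but on many panels max_brightness is in the hundreds or thousands, so there's room to go dimmer.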
I have heard that it's bullshit, yes, but I want to work out for myself why that is. I have never bought a monitor specifically; I have only used other people's leftovers.
Apple means to imply that you can't see any pixels and that therefore the screen quality couldn't get any better with a higher PPI. To be "fair", I believe they say that applies at a distance of 12 inches (30 cm), but who the heck holds their phone or tablet that far away? In any case, you can simply compare the "Retina" display on the iPhone to the full HD display on something like the Samsung Galaxy S4 (a little over 300 PPI vs a little over 400 PPI) and you'll see the S4 looks sharper -- even at 30 cm.
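For reference, PPI follows directly from the panel resolution and diagonal; a quick sketch (the resolutions and diagonals below are the published specs of the iPhone 4 and the Galaxy S4):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(960, 640, 3.5)))    # iPhone 4 "Retina": ~330 PPI
print(round(ppi(1920, 1080, 5.0)))  # Samsung Galaxy S4: ~441 PPI
```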
There are some very simple tests you can perform on screens (though they're less easy to do on phones...) to tell whether such claims of not seeing pixels are bullshit.
1. The checkerboard pattern:
http://carltonbale.com/pixel_by_pixel_checkerboard/ Does it look like uniform gray? If not, you're seeing pixels. This is how (some) printers make gray. My 300 DPI laser printer used to look quite checkerish when printing gray, even if its letters looked nice and sharp. A slightly later 1200 DPI model didn't look too checkerish at a superficial glance. I believe nicer inkjet printers can put out a wide variety of colors and shades of dots (only up to a gross or so, but still), so their dithering methods are a lot more sophisticated than those of mid-'90s laser printers, which could only do black in various densities. Even relatively cheap models offer 2400 DPI. Let's say that because of the dithering the effective resolution is only a fourth of that compared to a monitor -- that's still 600 effective DPI. And that's consumer-level, medium-quality print.
2. The line test. I couldn't find a quick example, but create an image with two one-pixel lines one pixel apart (the sketch after this list generates images for both tests). If you can't tell the lines apart, you're not seeing pixels. I've never seen a display on which I wasn't easily able to discern the two separate lines, although I haven't had the pleasure of trying it on a display like the Samsung Galaxy S4's. I expect it to be slightly harder than on the iPhone, but not impossible.
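If you want to try both tests yourself, here's a minimal sketch using Pillow; be sure to view the output at 100% zoom, since any scaling defeats the point:

```python
from PIL import Image

SIZE = 256

# 1. Checkerboard: black and white alternating per pixel. On a dense
#    enough display (or from far enough away) it fuses into flat gray.
checker = Image.new("L", (SIZE, SIZE))
checker.putdata([255 * ((x + y) % 2) for y in range(SIZE) for x in range(SIZE)])
checker.save("checkerboard.png")

# 2. Line test: two one-pixel white lines with one black pixel between.
#    If they merge into a single thick line, you aren't resolving pixels.
lines = Image.new("L", (SIZE, SIZE), 0)
for x in range(SIZE):
    lines.putpixel((x, 100), 255)
    lines.putpixel((x, 102), 255)
lines.save("linetest.png")
```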
Some more technical background can be found here. It implies that beyond 600 PPI we'd be unlikely to notice much of an improvement for typical use cases (whatever that means). That may or may not be true, but we're hardly there yet. Me, I tend to think all of these "typical" figures are written with 50-year-olds with faltering vision in mind, given how obviously non-"Retina" those Apple displays are (even if they're very nice compared to ye olde pixel monsters).
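The 600 PPI figure is at least in the right ballpark if you do the math on visual acuity: standard 20/20 vision resolves about one arcminute, and sharper-eyed viewers (or hyperacuity tasks like the line test) do noticeably better. A rough sketch -- the half-arcminute value is my assumption for a sharp-eyed viewer, not a figure from the article:

```python
import math

def ppi_limit(distance_mm: float, acuity_arcmin: float) -> float:
    """PPI at which one pixel subtends the given visual angle."""
    pixel_pitch_mm = distance_mm * math.radians(acuity_arcmin / 60.0)
    return 25.4 / pixel_pitch_mm  # 25.4 mm per inch

print(round(ppi_limit(300, 1.0)))  # ~291 PPI: the "Retina" threshold at 30 cm
print(round(ppi_limit(300, 0.5)))  # ~582 PPI: a sharper-eyed viewer
```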