Putting some numbers to that using my monitor as an example:
Max output is 350 cd/m2 (candelas per square meter), but in my room that is too bright, so I have the brightness set lower, at about 250 cd/m2. A popular figure on the web for black level is 0.3 cd/m2 but, assuming that to be the minimum setting on my monitor, it causes shadow detail to disappear for whatever reason. So I have recently set it higher - I'm guessing maybe 0.75 cd/m2. That means my monitor is set to a contrast ratio of 250/0.75 = a miserable 333:1. Since each stop is a doubling, that works out to log2(333), or about 8.4 EV. It doesn't bother me too much because, after processing, 8 EV is about as good as my camera gets, and I don't print.
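For anyone who wants to plug in their own numbers, here's a quick Python sketch of that arithmetic (the white and black levels are just my measured/guessed values, so swap in your own):

```python
import math

# Luminance values from my setup, in cd/m2 (my guesses, not measurements)
white_level = 250.0   # brightness turned down from the 350 cd/m2 max
black_level = 0.75    # raised from 0.3 to keep shadow detail visible

# Contrast ratio is simply white luminance over black luminance
contrast_ratio = white_level / black_level

# One stop (EV) is a doubling, so convert with log base 2
stops = math.log2(contrast_ratio)

print(f"Contrast ratio: {contrast_ratio:.0f}:1")   # ~333:1
print(f"Dynamic range:  {stops:.1f} EV")           # ~8.4 EV
```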
I don't understand that either. With my monitor, reducing the black level increases the apparent contrast in the image, while raising it usually results in the image looking a little dull and flat. But, in the literature, there are conflicting statements about monitor settings - I've seen brightness described as altering the black level, for example. A thorough understanding of this difference in perception is complex and way above my head.
So I conclude that our monitors are probably different insofar as basic adjustment is concerned. It might be fun to produce a target and spot meter it with the camera while adjusting brightness, contrast, and black level in the dark - see the sketch below.
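If anyone wants to try that experiment, a simple step-wedge target is easy to generate. Here's a sketch using Pillow; the patch count and sizes are arbitrary choices of mine, not from any standard:

```python
from PIL import Image

# Build a grayscale step wedge: STEPS patches from black to white,
# each one big enough to spot meter while tweaking monitor controls.
STEPS = 16
PATCH_W, PATCH_H = 120, 600

img = Image.new("L", (STEPS * PATCH_W, PATCH_H))
for i in range(STEPS):
    value = round(i * 255 / (STEPS - 1))   # 0, 17, 34, ..., 255
    patch = Image.new("L", (PATCH_W, PATCH_H), value)
    img.paste(patch, (i * PATCH_W, 0))

img.save("step_wedge.png")
```

Display the saved image full screen in the dark, spot meter the black and white patches, and the difference in the camera's readings (in stops) is a rough measure of the monitor's actual contrast at those settings.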