I wonder if there is a way to see how physical monitor quality and size improvements have led to more complicated code, never mind the move off of punch cards.
The type of monitor doesn't matter a whole lot, because the real limit is human eyesight. A high-res monitor lets you render things tiny if you disable hi-dpi scaling, but then the text is unreadable. If you use a big 8K TV to display everything larger, you have to sit further away to view it comfortably. If you add more monitors, at some point it becomes too hard to keep track of so many things at once.
Personally, my setup shows the same number of vim rows/cols on my 4K monitor as on my 1080p one (or whatever you call the 16:10 equivalents).
The effective resolution of the eye is best expressed as an angular value. For me, the smallest comfortable character width is about 1/8°. The width of my 14" FHD laptop screen spans about 30° of my view, so the full width fits about 240 characters (30° / 0.125°). That is a large upgrade from the 80 columns of a VT220 or VGA text mode, and almost twice the densest VT220 mode of 132 columns.
My 28" 4K monitor is exactly like four 14" FHD screens put together. It offers me 480 columns (a bit fewer, because window borders and scroll bars of multiple windows).
So indeed, better screens let me see much more text than the poorer screens of old. There is a limit, but it definitely had not been reached 30-40 years ago.