Back in the 80s, as the home computer revolution got going, computers were typically wired up to small, cheap, portable TVs as display devices. These TVs used shadow masks, and the computer's video output was usually modulated onto a TV signal, with the TV 'tuned' to the computer. All of this added a large amount of blur and distortion even before the signal reached the screen.
By the mid 80s it was perhaps more typical to buy a dedicated CRT monitor, with the computer connected via composite or maybe even an RGB feed, allowing higher resolution and much improved quality.
For the well heeled, this route also led to the holy grail: a Trinitron tube!
With each step, the aesthetic of the display technology changed, but probably the best memories come from the original blurry stuff, since that was the magical moment of actually getting something out of a home computer.
For a long time, my only "monitor" as a kid was a 12" B+W TV for my ZX Spectrum. On the day of my birthday, when I got it, I was allowed to hook it up to the family's 14" color TV, but after that it was back to the B+W for the next couple of years!
(funnily enough, when I finally got a PC years later, the only monitor I could afford was a Philips monochrome VGA -- I guess they now sell for multiples of the original retail price? https://www.ebay.com/itm/176945464730)