In a standard television-like computer monitor, an image is produced on the screen by a beam of electrons sweeping rapidly across the surface of the picture tube, lighting up the screen as it passes. Starting at the top, the beam traces one horizontal row across the screen, shifts down a bit and does another row, and so on, until the full height of the screen has been covered.
In an interlaced monitor, the electron beam takes two passes to form a complete image: it skips every other row on the first pass, and then goes back and fills in the missing rows. A non-interlaced monitor does the whole job in one pass, tracing each row consecutively. Interlaced monitors are easier to build and therefore cheaper, but, as you can guess, they aren't as good as non-interlaced monitors. The problem is that, all other things being equal, it takes twice as long to create the complete screen image on an interlaced monitor. That's long enough to spoil the illusion that you're looking at a steady picture, so the image on the screen flickers annoyingly.
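The difference between the two scanning schemes comes down to the order in which the rows are traced. As a rough illustration (the function names and the row counts here are just for the example, not part of any real display standard), the two orders can be sketched like this:

```python
def progressive_order(rows):
    # A non-interlaced (progressive) monitor traces every row,
    # top to bottom, in a single pass.
    return list(range(rows))

def interlaced_order(rows):
    # An interlaced monitor skips every other row on the first pass
    # (the even-numbered rows), then goes back and fills in the
    # odd-numbered rows on a second pass.
    return list(range(0, rows, 2)) + list(range(1, rows, 2))

print(progressive_order(6))  # [0, 1, 2, 3, 4, 5]
print(interlaced_order(6))   # [0, 2, 4, 1, 3, 5]
```

Notice that in the interlaced order, any given row is refreshed only once per two passes, which is why the complete image takes twice as long to build and why the flicker becomes visible.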