Why do I get black bands on my TV or computer screen when I take a picture with my camera, and how can I fix it?


A black band often appears in photos of a TV or computer screen, even though the screen looks perfectly clear to the naked eye. This happens because of a mismatch between the display’s scanning lines and the camera’s exposure time, and it can be fixed by increasing the exposure time.

 

Have you ever taken a picture of a TV or computer screen with your camera and noticed something strange? It’s not uncommon for a screen that looks perfectly fine to the naked eye to show a black band when you photograph it with your phone. It’s frustrating that the camera can’t capture the crisp screen sitting right in front of you, so what’s going on?
To understand this phenomenon, you first need to realize that the tiny lights that make up your display are not emitting a steady stream of light. The display has to redraw constantly to keep the image fresh, and it does this by switching its many LEDs on and off in short cycles. They blink in horizontal rows, which we call “scanning lines”. The display divides these scanning lines into odd and even rows and alternates between them rapidly to fill the screen. The black streaks are scanning lines caught mid-crossing, and that is what the display actually looks like.
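As a rough illustration (my own simplification, not a model of any real panel), here is a toy Python sketch of that odd/even alternation: freeze the screen at any single instant, and half the rows are dark, just like the bands in the photo.

```python
# Toy model (purely illustrative) of interlaced scanning: at any instant,
# only the odd rows or the even rows are lit, and that frozen instant is
# what a fast camera captures.
ROWS, COLS = 8, 16

def screen_at(field: str) -> str:
    """Text picture of the screen: '█' = lit row, '.' = dark row."""
    start = 0 if field == "even rows" else 1
    lines = [("█" if r % 2 == start else ".") * COLS for r in range(ROWS)]
    return "\n".join(lines)

for field in ("even rows", "odd rows"):
    print(f"--- instant with {field} lit ---")
    print(screen_at(field))
```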
However, our eyes don’t see any of this. This is due to a phenomenon called “afterimage”. An afterimage is a visual impression that lingers in the visual system after the light stimulus has been removed. In other words, the image you just saw stays in front of your eyes for a moment. The effect lasts about 1/16 of a second, and visual changes faster than that go unnoticed. This phenomenon also plays an important role in video media such as movies and animation. In reality, what we see is just a series of still images played back in rapid succession, but thanks to the afterimage effect, we don’t perceive the individual frames and instead see continuous motion. Similarly, in the case of displays, the afterimage hides the flickering scanning lines.
Let’s go back to displays for a moment. Most displays on the market have a refresh rate of 60 Hz, which means the LEDs complete a full on-off cycle every 1/60 of a second. Our visual system holds on to a stimulus for about 1/16 of a second after the image is gone, while the display refreshes at the faster rate of once every 1/60 of a second. The afterimage therefore bridges even the briefest moments when the LEDs are off. Because of this, our eyes never catch the scanning lines crossing and perceive a continuous picture.
However, a camera has no afterimage. A camera that captures images in very short slices of time, down to a few hundredths or thousandths of a second, records the scanning lines exactly as they cross. In that split second, the lit LEDs show the original image, while the unlit LEDs appear as the “black band” we saw.
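To put those timings side by side, here is a small Python sketch (the shutter speeds are illustrative picks of mine, not figures from this article): anything that gathers light for at least one full refresh period sees a complete picture, while anything faster freezes a partial refresh.

```python
REFRESH_PERIOD_S = 1 / 60   # one full refresh of a typical 60 Hz display
AFTERIMAGE_S = 1 / 16       # persistence of vision cited above

# Illustrative "observers" and how long each one integrates light.
observers = {
    "human eye (afterimage)": AFTERIMAGE_S,
    "camera, 1/100 s shutter": 1 / 100,
    "camera, 1/1000 s shutter": 1 / 1000,
}

for name, window_s in observers.items():
    cycles = window_s / REFRESH_PERIOD_S
    verdict = ("sees at least one full refresh -> smooth picture"
               if cycles >= 1
               else "freezes a partial refresh -> black band")
    print(f"{name}: {window_s * 1000:5.1f} ms window, "
          f"{cycles:4.2f} refresh cycles, {verdict}")
```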
So, how do we avoid these scanning lines? A camera has no afterimage, but it does have something called ‘exposure’. Familiar examples are people drawing letters in the air with sparklers, or photographers capturing the trails of stars across the night sky. Just as our pupils dilate to gather more light in the dark, a camera accumulates light by holding its shutter (the gate that lets light onto the sensor) open for a longer period. This duration is called the ‘exposure time’. Because light literally “builds up”, any movement during the exposure time will show in the photo. This accumulation can stand in for the afterimage the camera lacks.
The fix, then, is to increase the exposure time. A longer exposure keeps the shutter open long enough for every LED on the display to complete its on-off cycle. The afterimage in our eyes lasts about 1/16 of a second, so, similarly, setting the exposure time to about 1/10 of a second or longer will give you a smooth, sharp photo of the display. Longer exposures also eliminate black bands caused by flicker. Of course, a longer exposure lets in more light and brightens the image, so you’ll need to compensate by adjusting the aperture and ISO.
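To see how much compensation that takes, here is a quick back-of-the-envelope sketch in Python (the starting shutter speed and ISO are hypothetical examples, not settings from this article): lengthening the shutter adds light in “stops”, and each stop can be offset by halving the ISO or narrowing the aperture one step.

```python
import math

def extra_stops(old_shutter_s: float, new_shutter_s: float) -> float:
    """Extra light, in photographic stops, from lengthening the shutter."""
    return math.log2(new_shutter_s / old_shutter_s)

# Hypothetical example: going from a 1/60 s auto exposure to the 1/10 s
# suggested above to smooth out the display's refresh.
stops = extra_stops(1 / 60, 1 / 10)
print(f"Extra light: {stops:.1f} stops")          # about 2.6 stops

# Offset it by lowering ISO by the same number of stops (each stop halves
# the ISO); narrowing the aperture works the same way.
old_iso = 800
new_iso = old_iso / 2 ** stops
print(f"ISO {old_iso} -> roughly ISO {new_iso:.0f}")  # about ISO 133
```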
As you can see, the black bands we never noticed were not “weird” at all; they were the display’s real appearance, the flickering “scanning lines.” Our eyes were simply tricked into not seeing them. The human eye is an all-around camera that adjusts contrast, perspective, focus, and everything else on AUTO, but in front of the display, the camera was seeing better than our eyes.
With the development of cameras, we have entered the world of digital images. Cameras record reality differently from the human eye, and that difference can sometimes offer a new visual experience. In the end, what we “see” is not just what our eyes register but how we perceive and interpret it, and devices like cameras become tools that expand our perspective.

 

About the author

Blogger

I'm a blog writer. I like to write things that touch people's hearts. I want everyone who visits my blog to find happiness through my writing.
