Question:

I was under the impression that "interlaced" video modes were only a thing for broadcast television content, because sending only 50% of the data to the homes saved bandwidth, so more TV channels could fit in the same bandwidth (air or cable).

I thought that all video games, being local to the customer's home with a video cable between the console and the TV, used "progressive" mode, meaning "not interlaced", since there was a dedicated video cable between the two units. Why would a console have to send "half the data" in that scenario?

But then I watched a video on YouTube where it was casually mentioned that some games used "480i" instead of "480p", without further explanation. Apparently this was done very late, so it was not some kind of early-1980s video game console technique either.

Why would any game console or individual game send an "interlaced" picture to the TV when it's right there above the console? I assume that the picture must suffer in some subtle way from getting "only every other line of image data", and that this must be far more noticeable for video game content than for a movie or TV show.

Answer:

Indeed, early devices such as the C64, the NES, or an IBM PC with a CGA adapter did not use interlacing, but simply sent 240p to the TV. Later devices such as the Amiga could send either 480i or 240p. But TVs were not 480p capable, only 480i or 240p capable, so it was not possible to use 480p.

For example, the Amiga 500 can send either interlaced 480i for hi-res graphics or progressive 240p for low-res graphics, and a game is free to select the mode at any time. This can be used, for example, to show a static title screen in 480i for high resolution and the moving game action in 240p at lower resolution without flickering.

The reason for switching to 240p, instead of sending the 240-line content as two 480i fields, is to keep that content from flickering: it would look as if the same 240-line image were jumping up and down by half a line on every 60 Hz field.

So while early gaming devices used a progressive 240p signal only, some games on newer devices later used an interlaced signal to display even the game itself in interlaced 480-line resolution.

Update: Standard TV programs are interlaced 480i (or 576i) in order to have both a high screen update rate of 60 Hz (or 50 Hz), which prevents flickering, and a high resolution of 480 lines for static images. TVs can only accept a video signal with a single horizontal frequency of about 15.734 kHz (or 15.625 kHz). Gaming consoles are free to send any kind of signal they want, but the TV must be able to lock onto it, so consoles must use the same horizontal rate and vertical field rate to send their video.

This also means that an interlaced image is really sent as two 240-line (or 288-line) fields, 60 (50) times per second, with a half-line offset between them, so a full frame made of both interlaced fields arrives at 30 (25) Hz. It follows that the progressive signal comparable to 480i is not 480p but 240p. In 240p there is no flickering, because only one kind of field is sent, so the video lines of successive fields are drawn exactly on top of each other. The difference is that if a game screen has 240 lines but sends them as interlaced 480i, some flicker can be seen: the same lines are drawn in alternating fields, so the picture seems to jump up and down by one line of 480i (half a line of 240p).

(I used 480 lines and 60 Hz since that's what the question asked about; the same applies to the 576-line, 50 Hz system.)
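The timing arithmetic in the update can be checked directly. Below is a minimal sketch (mine, not part of the original answer); the rate constants are nominal NTSC/PAL values, and real consoles deviate from them slightly:

```python
# Sketch: derive scanlines per field from the fixed horizontal/vertical rates.
# The rate constants are nominal NTSC/PAL values (an assumption of this sketch).

NTSC_H_HZ = 15_734.26  # horizontal line rate, ~15.734 kHz
NTSC_V_HZ = 59.94      # vertical field rate, ~60 Hz
PAL_H_HZ = 15_625.0    # ~15.625 kHz
PAL_V_HZ = 50.0        # 50 Hz

def lines_per_field(h_rate_hz: float, v_rate_hz: float) -> float:
    """Scanlines drawn during one vertical field: horizontal rate / field rate."""
    return h_rate_hz / v_rate_hz

print(lines_per_field(NTSC_H_HZ, NTSC_V_HZ))  # ~262.5 -> 525 lines per frame (480i)
print(lines_per_field(PAL_H_HZ, PAL_V_HZ))    # 312.5 -> 625 lines per frame (576i)

# The half line per field is what shifts odd and even fields vertically,
# producing interlace. A 240p signal instead sends a whole number of lines
# per field (e.g. 262), so every field's scanlines land in the same spots
# and a static image does not jump up and down.
```

Note that both modes keep the TV's fixed ~15.7 kHz horizontal rate and ~60/50 Hz field rate; interlace versus 240p comes down to whether each field carries that extra half line.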