Can someone explain this to me pls?
It used to be (before digital TVs, flat screens, etc.) that you didn’t have the option of switching inputs, so you would have to go to channel 3, which doubled as the “input” channel. This was the case with consoles like the NES or Sega.
Also, we had a box that would let you use channel 3 OR 4, but 4 either didn’t work as well or had more useful content on it (i.e., the local FOX station).
At one point I remember having the NES hooked up through a VCR, so the TV had to be on channel 3 to show what the VCR was putting out, and the VCR had to be tuned to channel 4 to pick up the game.
O shit I remember that too
Old TVs only had one input: a coaxial cable connection that went to your antenna or cable provider. The TV had a tuner to select channels.
Because there was only one input, the only way to display a console was to spoof the broadcast signal the TV expected to get from the antenna.
To set this up, you connected both the console and your antenna to an RF converter box. This replaced the signal on channel 3 or 4 with your console’s, so you could play games while still being able to watch TV.
Once consoles and VCRs became an established thing, TV makers started adding other plugs. But in the early days, this was the only way to play consoles on home TVs.
Even when RCA jacks were added, you would sometimes have to tune the TV to channel 2 or 3 to use them. The TV we had when I was a kid worked like that.
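If a picture helps, here’s a toy model in Python (purely my own illustration - the channel names and contents are made up, and real hardware obviously isn’t a dictionary): the TV’s only “input selector” is its tuner, and the switch box just overwrites one channel’s slot with the console’s signal.

```python
# Toy model of the one-input setup (illustration only, not actual hardware).
# The "airwaves" carry one signal per channel; the switch box overwrites
# channel 3 with the console, and the TV's tuner just picks a channel number.

airwaves = {
    2: "some local broadcast",   # made-up example content
    4: "local FOX broadcast",    # the channel you still want to watch
    # channel 3 is unused in this area, which is why the console defaults to it
}

def rf_switch_box(signals, console_signal, channel=3):
    """Return what the TV's antenna input 'sees' with the console switched in."""
    combined = dict(signals)
    combined[channel] = console_signal  # console replaces whatever was on ch. 3
    return combined

def tv_tuner(signals, channel):
    """The TV's only 'input selector' is its channel tuner."""
    return signals.get(channel, "static / snow")

seen_by_tv = rf_switch_box(airwaves, "NES video + audio", channel=3)
print(tv_tuner(seen_by_tv, 3))   # NES video + audio
print(tv_tuner(seen_by_tv, 4))   # local FOX broadcast - you can still watch TV
```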
I’m sure there were other consoles that did it, but the one I remember is the NES; it had a physical switch on it for channel 3 or 4, and you had to have your TV tuned to the respective channel to get the game to show up. As to why that was necessary on a technical level, I’m not sure. But it was a thing you had to do.
It’s because the console encoded its video in the same NTSC format that TV stations transmitted over the airwaves at the time. The little adapter box it came with sat between the antenna and the TV, merged the console’s signal with the antenna’s signal, and fed the result to the TV, which treated it as if the antenna had picked it up.
It had a channel selector to let you pick which of two frequencies to center the console’s signal on, in case one or the other was in use by a real TV station. Where I (and apparently the previous poster) lived, channel 3 was unused but channel 4 had a TV station, so only channel 3 worked for video games.
At a more technical level, that adapter box contained an RF oscillator and a “frequency mixer” - the mixer likely being a transistor or two switched on and off by the RF oscillator at a very high speed, with the effect of frequency-shifting the signal from the console. It’s similar to the way a camera’s shutter speed shifts the apparent frequency of things like car wheels, helicopter blades, old TVs and some LED dimmers down to rates the human eye can see. Almost all radio systems are built on that concept.
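In case the “mixer” idea sounds abstract, here’s a tiny numerical sketch (my own illustration in Python/NumPy, not the adapter’s actual circuit, and the frequencies are made-up small numbers so the demo runs fast): multiplying a slow signal by a fast oscillator shifts its energy up next to the oscillator’s frequency, which is the whole trick.

```python
import numpy as np

fs = 1_000_000                      # 1 MHz sample rate (arbitrary, demo only)
t = np.arange(0, 0.01, 1 / fs)      # 10 ms of samples

baseband = np.sin(2 * np.pi * 1_000 * t)       # stand-in "video" at 1 kHz
oscillator = np.sin(2 * np.pi * 100_000 * t)   # stand-in RF carrier at 100 kHz
mixed = baseband * oscillator                  # the "frequency mixer" step

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), d=1 / fs)
print(f"strongest component after mixing: {freqs[np.argmax(spectrum)]:.0f} Hz")
# ~99,000 Hz (with a matching peak at 101,000 Hz): the 1 kHz signal now sits
# right next to the carrier, i.e. it has been frequency-shifted - which is what
# the adapter does to park the console's video on channel 3 or 4.
```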
It’s possible that channel 4 just plain didn’t work very well in that design, but in my area I’m pretty sure it was interference. I remember that channel 4 looked empty at first glance, but if you sat and watched the snow it would occasionally pick up something very faint - there was likely enough RF signal there to interfere, but not enough for the TV to consistently lock on.
It’s because it used radio frequency. Yes, through a cable. The cable was basically standing in for the antenna. The RF box on the back of your console (or whatever) would send the RF signal at a specific frequency, usually channel 3, but some had switches that let you change it to channel 2 or 4.
It’s basically like tuning in to a radio station.
Might as well paste my answer here as well.
Edit: That’s also why you’d get static when running a microwave, a vacuum or whatever. The TV would pick up the noise generated by those machines.
Edit 2: As another user has mentioned, the option to switch the channel existed in case one channel had interference issues.
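For the curious, those “channels” are just fixed slices of the VHF band, each with a standard video carrier frequency under the US NTSC plan. Here’s a quick sketch (the carrier numbers are the standard values; the little helper function is just my own illustration):

```python
# Standard US NTSC VHF video carrier frequencies for the low-band channels.
# The channel 3/4 switch on the console just moves its tiny transmitter
# between two of these slots so it can dodge a local station.
NTSC_VIDEO_CARRIER_MHZ = {
    2: 55.25,
    3: 61.25,
    4: 67.25,
    5: 77.25,
    6: 83.25,
}

def pick_console_channel(occupied_channels, options=(3, 4)):
    """Pick whichever of channels 3/4 has no local station on it."""
    for ch in options:
        if ch not in occupied_channels:
            return ch, NTSC_VIDEO_CARRIER_MHZ[ch]
    return None, None  # both busy: you'd just live with the interference

# Example: a local FOX affiliate occupies channel 4, so the console goes on 3.
print(pick_console_channel(occupied_channels={4}))   # (3, 61.25)
```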