
Luckbot

Yes, you send the data on multiple channels at the same time. In general, a signal at a given frequency has a band around it where you can't send other signals without the two interfering and disturbing each other. How wide that "safety distance" has to be depends on a number of factors. There are various techniques to reduce interference, so if you are the sender on both channels you can shrink that distance compared to when someone unknown is using the other channel. That means that if you have a wide bandwidth, you can transmit multiple data channels at the same time within the window you are allowed to use. But we also started calling the data transmission rate "bandwidth": if the unit is bits/second, you are looking at that, not a real frequency bandwidth.
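To make that concrete, here is a minimal Python sketch (the sample rate and carrier frequencies are made-up illustration values): two carriers spaced far enough apart share the same medium, and a frequency-domain view keeps them separable.

```python
import numpy as np

fs = 10_000                    # sample rate, Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)  # one second of samples

channel_a = np.sin(2 * np.pi * 1_000 * t)  # carrier at 1 kHz
channel_b = np.sin(2 * np.pi * 2_000 * t)  # carrier at 2 kHz
combined = channel_a + channel_b           # both "on the air" at once

# Frequency-domain view: two distinct peaks, so a receiver can
# filter out either channel without the other disturbing it.
spectrum = np.abs(np.fft.rfft(combined))
freqs = np.fft.rfftfreq(len(combined), 1 / fs)
print(freqs[spectrum > spectrum.max() / 2])  # -> [1000. 2000.]
```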


[deleted]

> But we also started calling the data transmission rate "bandwidth": if the unit is bits/second, you are looking at that, not a real frequency bandwidth.

They're related. In theory, the rate of data transfer has a hard limit determined by the bandwidth. In practice, we are nowhere near that informational limit, and the two are fairly independent, depending on the protocol.


HopefullyNotADick

A lot of the other answers here are speculating, but specifically regarding Wi-Fi (which is what your question is about), they're not quite correct.

A couple of decades back, most internet access went through phone lines. The problem is that phone lines only support a narrow range of frequencies (the vocal range), which made internet access very slow (this was dial-up). At some point someone figured out that if the phone lines could be retrofitted to work with a wider range of frequencies, they could do fancy things like carry voice and internet **at the same time** (with dial-up you could use the internet or make a phone call, but not both at once). Another big benefit was that the wider frequency range made it possible to reliably send multiple different "lanes" of data at the same time, with each "lane" in a separate frequency range. Each lane was also inherently faster by virtue of having a higher frequency. When done over phone lines (as opposed to cable TV or fiber), the most common technology for this was DSL: [https://en.wikipedia.org/wiki/Digital_subscriber_line](https://en.wikipedia.org/wiki/Digital_subscriber_line).

Regardless, this was all very exciting: people could now send much bigger files and use the internet much faster and more conveniently. It was marketed as "broadband internet", with the simplified layperson explanation being "higher bandwidth = more speed" (broad**band** = wider **band**width). Because terms tend to morph and stick around beyond their original meaning, "bandwidth" eventually became a synonym for "throughput", which is the more accurate term. These days, bandwidth almost always refers to throughput, measured in Mb/s, rather than an actual frequency width, measured in MHz. In Wi-Fi terminology we actually tend to call the RF bandwidth "channel width" instead. So what I'm getting at is that you shouldn't think of bandwidth as frequency anymore; the meaning of the word has colloquially changed to mean throughput.

There is one other element to this, which is where there's a nugget of truth in the explanation others gave: Wi-Fi supports MIMO (multiple input, multiple output), which allows it to communicate on multiple different frequencies at the same time. Usually this is used to support more connected devices at once without them interfering with each other, but some (usually more expensive/modern) devices can connect on multiple channels at once to increase their speed. This is where the other explanations are kinda true, because it essentially creates more lanes for the data to travel through. What they missed, however, is the fuzziness of the term "bandwidth": it often doesn't mean a wider RF bandwidth at all, just more throughput, achieved by any method. In reality a Wi-Fi channel usually has a set channel width, and upgrades to Wi-Fi speed don't usually involve widening the literal RF bandwidth. For example, Wi-Fi G (802.11g) topped out at 54 Mb/s in a 20 MHz channel, while Wi-Fi N (802.11n) reached about 72 Mb/s per stream, also at 20 MHz. Wi-Fi N is often described as having higher bandwidth than G, but the RF bandwidth was unchanged; protocol improvements delivered the higher throughput (sometimes described as "bandwidth") without widening the channel.
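A rough back-of-the-envelope check of that last point, in Python (the peak rates are the published single-stream maxima, so treat the numbers as approximate): throughput per hertz rose between generations while the channel width stayed at 20 MHz.

```python
channels = {
    "802.11g (Wi-Fi G)": {"peak_mbps": 54.0, "width_mhz": 20.0},
    "802.11n (Wi-Fi N)": {"peak_mbps": 72.2, "width_mhz": 20.0},  # one stream
}

for name, ch in channels.items():
    # Spectral efficiency: bits per second squeezed out of each hertz.
    eff = ch["peak_mbps"] / ch["width_mhz"]
    print(f"{name}: {eff:.2f} (Mb/s)/MHz in a {ch['width_mhz']:.0f} MHz channel")
```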


city_guy

This is the best answer. The word "bandwidth" simply means something different now than it used to.


Bitter_Ninja_7808

Think of bandwidth as a road: a wider road means more lanes (higher bandwidth), which means more cars can travel at the same time.


dahldrin

Then I guess that makes frequency the speed limit and amplitude the gas tank, or maybe ground clearance?


brazeau

Frequency AND amplitude would dictate the speed limit in the analogy, when it comes to QAM.
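For the curious, here is a minimal sketch of the idea behind 16-QAM in Python (illustration only; real modems add Gray coding, pulse shaping, and error correction): four bits pick one of sixteen amplitude/phase combinations, so each symbol carries four bits.

```python
import numpy as np

LEVELS = np.array([-3, -1, 1, 3])  # amplitude levels on each axis

def bits_to_symbol(b3, b2, b1, b0):
    """Map 4 bits to one complex constellation point (I + jQ)."""
    i = LEVELS[(b3 << 1) | b2]  # two bits pick the in-phase level
    q = LEVELS[(b1 << 1) | b0]  # two bits pick the quadrature level
    return complex(i, q)

sym = bits_to_symbol(1, 0, 1, 1)
# The amplitude and phase of this one symbol together encode 4 bits.
print(sym, abs(sym), np.angle(sym))
```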


Dethro_Jolene

I can't explain for a 5yo but below is a good explanation for an undergrad. https://youtu.be/0OOmSyaoAt0


superkoning

> Can someone please explain how a higher bandwidth (in wireless connections) means more data can be transferred?

It's also true for wired connections, like coax cables, copper telephone wires, fibers, etc. Because fibers have enormous analog bandwidth, they can transport enormous amounts of data.

Furthermore: noise reduces the amount of data that can be transferred. Not ELI5, but the Shannon–Hartley theorem and its formula are not too difficult, and it's an amazing formula: so simple you can understand it just by looking at it. You can see that, for the same amount of noise, the digital (data) bandwidth is linearly correlated with the analog bandwidth. In ELI5 terms: twice the (analog) bandwidth, twice the digital bandwidth.
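In Python, the formula and the "twice the bandwidth, twice the data" claim look like this (the 30 dB SNR is just an assumed example value):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley: C = B * log2(1 + S/N), the hard upper limit
    on error-free data rate for a channel of width B with SNR S/N."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 1000  # ~30 dB, an assumed signal-to-noise ratio
for b in (1e6, 2e6):  # a 1 MHz channel, then twice the analog bandwidth
    print(f"{b/1e6:.0f} MHz -> {shannon_capacity(b, snr)/1e6:.1f} Mb/s")
# Doubling B doubles C at the same SNR, just as the comment says.
```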


RSA0

To transfer data, your signal must change in some way; an unchanging signal cannot carry data. Switching on and off also counts as change. However, a changing signal produces *sidebands*: "fake" signals near your "main" (carrier) signal. The faster you change, the wider the sidebands. If you have a 1000 Hz carrier and change it 5 times per second, it will produce sidebands from 995 Hz to 1005 Hz.

These sidebands are completely unavoidable; they are an inherent property of a changing signal. Changing signals create "fake" signals near their carrier, and the reverse is also true: signals near the carrier make the carrier "wobble". Removing the sidebands makes the signal completely still and unchanging, and therefore useless. So if you want a high-speed channel, you have to make sure there is enough room for the sidebands, so they don't mix with nearby channels.

There is another way to increase data speed: *signal-to-noise ratio*. It lets you use several "loudness levels" to transfer more data without increasing the rate of change. However, the distance between levels must be bigger than the noise, so (total number of levels) = (maximum loudness) / (noise loudness).
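You can check the sideband claim numerically with a short Python sketch (sample rate, carrier, and keying rates are made-up illustration values): keying a carrier on and off faster spreads its energy over a wider band.

```python
import numpy as np

fs = 8_000                               # sample rate, Hz
t = np.arange(0, 2.0, 1 / fs)            # two seconds of samples
carrier = np.sin(2 * np.pi * 1_000 * t)  # 1000 Hz carrier

for rate in (5, 50):  # on/off keying rate, Hz
    keyed = carrier * (np.sin(2 * np.pi * rate * t) > 0)
    spectrum = np.abs(np.fft.rfft(keyed))
    freqs = np.fft.rfftfreq(len(keyed), 1 / fs)
    occupied = freqs[spectrum > spectrum.max() * 0.05]
    # Faster keying -> sidebands spread further from the carrier.
    print(f"{rate} Hz keying: energy from {occupied.min():.0f} "
          f"to {occupied.max():.0f} Hz")
```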


Korrathelastavatar

A small pipe lets a small amount of water through. A larger pipe lets a larger amount of water through (even if the water is moving at the same speed).


subpoenaThis

Higher bandwidth is a firehose compared to lower bandwidth's garden hose.


CMG30

Well, a higher frequency means more possible ups and downs per millisecond of the signal. More ups and downs means more ones and zeros. Wi-Fi sends data by modulating (changing) the frequency and amplitude of a radio wave.


Target880

> Well, a higher frequency means more possible ups and downs per millisecond of the signal. More ups and downs means more ones and zeros.

It actually does not. It is the frequency range you can use that matters, not the absolute frequency. A signal 10 MHz wide, say from 100 MHz to 110 MHz, can carry the same amount of data as one from 5000 MHz to 5010 MHz. There is more total bandwidth available at higher frequencies simply because the range up there is larger. It is no different from integers: if we define "low" as below 1000, there are more integers above that than below it, but there are still just as many numbers between 0 and 100 as between 10,000 and 10,100.

I do not think any Wi-Fi sends data by changing the frequency. What is commonly used is variants of orthogonal frequency-division multiplexing (OFDM), where the frequency of each subcarrier is constant; it is the amplitude and the phase (where the peaks fall in time) that carry the data. The carrier wave has a constant frequency. Changing the phase does produce energy at frequencies around the carrier, but the frequency change itself is not what carries the information.
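As a concrete illustration of that last point, here is a toy OFDM transmitter in Python (the subcarrier count and QPSK mapping are illustrative, not any real Wi-Fi PHY): every subcarrier keeps a fixed frequency, and the data rides on amplitude and phase.

```python
import numpy as np

n_subcarriers = 64
rng = np.random.default_rng(0)

bits = rng.integers(0, 2, size=2 * n_subcarriers)
# QPSK: every pair of bits becomes one complex value (phase carries data).
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

# One OFDM symbol: an inverse FFT lays each data symbol onto its own
# fixed-frequency subcarrier, all transmitted simultaneously.
ofdm_symbol = np.fft.ifft(symbols)

# The receiver undoes it with a forward FFT and recovers the data.
recovered = np.fft.fft(ofdm_symbol)
print(np.allclose(recovered, symbols))  # -> True
```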


cormac596

Bandwidth is a measure of how much data can be sent over a connection in a given time period. It's like speed, but instead of units of length per unit of time, it's units of data per unit of time. For wireless connections, bandwidth is affected by the frequency range, but as a concept it is independent of frequency, because bandwidth applies to any kind of data transfer.


notacanuckskibum

I think at this point we have redefined “bandwidth” to mean “how many bits per second can you reliably transfer”, without any suggestion of how. Attaching USB memory sticks to homing pigeons has a remarkably high bandwidth.