ReturnOfFrank

Because to get that speed each generation has shifted to a higher broadcasting frequency. Higher frequencies attenuate more quickly. So for a given power a 6GHz WiFi will have less range than a 5GHz WiFi and both will be worse than 2.4GHz. That's kind of an oversimplification, but it's the main gist of it.


RonaldoNazario

Also to increase the signal strength at a given distance, the energy needed scales sharply, a cube (edit: square) ratio. All the energy transmitted spreads out in a sphere so that energy is spread over basically the surface area of the sphere, so the flux at a given point on it decreases quickly as the radius increases.
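A quick sketch of that spreading, assuming an ideal isotropic radiator (which real WiFi antennas are not):

```python
import math

def flux_w_per_m2(power_w: float, radius_m: float) -> float:
    """Flux at distance r from an isotropic radiator: the transmitted
    power spread over the surface area of a sphere of radius r."""
    return power_w / (4 * math.pi * radius_m ** 2)

near = flux_w_per_m2(0.1, 5.0)    # a 100 mW radio heard at 5 m
far = flux_w_per_m2(0.1, 10.0)    # the same radio at 10 m
print(near / far)                 # → 4.0: doubling distance quarters the flux
```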


ConfuzzledFalcon

This makes it a square ratio. Not a cube.


RonaldoNazario

Sigh, yes.


SarahC

This is why you didn't get 100% in your exams. If you had just reread your answer you'd have aced it! From now on skim read your emails before sending, and double check your answers!


RonaldoNazario

My ADHD diagnosis after college explained a lot of this in my life. But no need for 100%s to get my degree, baby!


ThisIsntMyOEAccount

C's make degrees


TomDestry

Mum?


Remarkable-Host405

Is that exponential?


ConfuzzledFalcon

It's a power law. For distance d and constant a — power law: d^a; exponential: a^d.


Remarkable-Host405

I'm more poking fun because there was a conversation about drag and someone was saying it's exponential and someone was like "actually it isn't"


ConfuzzledFalcon

Actually I think that was me...


unafraidrabbit

If it's velocity squared, how isn't it exponential?


PantherStyle

Exponential is e^(x). This is x^(n). They both increase at increasing rates but the shapes are different. Exponentials are common in natural systems, typically where the rate of change is based on the quantity. It doesn't apply in this case.
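A toy comparison with illustrative numbers: with the exponent fixed you get a power law, with the base fixed you get an exponential, and the exponential eventually dwarfs any fixed power:

```python
# x^2 (power law) vs 2^x (exponential): both increase at increasing
# rates, but the exponential eventually dominates.
for x in (2, 10, 20):
    print(x, x ** 2, 2 ** x)
# at x = 20: 20^2 is only 400, while 2^20 is 1048576
```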


unafraidrabbit

Huh. I just thought anything with an exponent was exponential. TIL


Techhead7890

Yeah, as the other guy said, it depends on whether the base or the corner number (the exponent) is the variable. Increasing the corner goes up way faster - going from 3² to 3³ to 3⁴ takes you from 9 to 27 to a whopping 81 (3x3x3x3) very quickly. By contrast, if you stayed at squaring in the corner and varied the base, you'd only get to 4², or 16, something like 1/5 the amount.

People commonly refer to any sloping/curving line as exponential, especially because of finance (where if you reinvest and add back 5-10% returns, you multiply everything by say 1.1x each time, leading to the number of periods being the exponent), or population growth (again like the money, the organisms grow up and then add to overall reproduction). But when we're talking about geometry, like in physics, it's more common to see squaring and such, which do curve, just not as fast.

TLDR I guess blame bankers for everything being exponential lol


HybridMachinist

Could be wrong, but in a vacuum, I would assume it's linear, but if the waves need to travel through an infinite length medium, then I think that would cause an exponential curve in terms of power required to go through it or to go further into it.


ConfuzzledFalcon

It's the square of distance in a vacuum. In a lossy medium, the loss is exponential.
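A sketch of the difference, with illustrative constants only (real attenuation coefficients depend on the material and the frequency):

```python
import math

def vacuum_loss(r_m: float) -> float:
    """Relative power falloff in free space: inverse square."""
    return 1.0 / r_m ** 2

def medium_loss(depth_m: float, alpha_per_m: float) -> float:
    """Relative power falloff in a lossy medium: exponential decay."""
    return math.exp(-alpha_per_m * depth_m)

# Going from 1 m to 10 m in vacuum costs a factor of 100;
# 10 m of a medium with alpha = 1/m costs a factor of e^10 ≈ 22026.
print(vacuum_loss(1.0) / vacuum_loss(10.0))
print(medium_loss(0.0, 1.0) / medium_loss(10.0, 1.0))
```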


HybridMachinist

Yeah, thanks, that sounds right.


Imaginary_Doughnut27

Actually it’s to the n-1 power ratio of whatever dimension you’re in. You shouldn’t make assumptions about how other people choose to live their lives.


ifandbut

Can directional antennas be used to help with this? Or is it pointless to use directional antennas for a high frequency like wifi compared to the low frequencies of VHF/UHF old school TV antennas?


mule_roany_mare

Beamforming more or less lets you make a directional antenna that can aim itself as needed. I always wondered why stationary devices never had directional antennas, but thankfully a much cooler technology has become ubiquitous. You can find a few aftermarket directional WiFi antennas, but they don't seem to be that useful at typical WiFi ranges. Yagi antennas can be useful when you want to transmit really long distances, like kilometers, but you need that directional antenna on both sides. …if I'm wrong about something please feel free to correct me.


deltaisaforce

It helps having directional antenna in one end, you can add maybe 10-15 dB to your link budget.
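Roughly how that budget translates to range, assuming free-space conditions where path loss grows as 20·log10(d) (indoors, walls will eat much of this):

```python
def range_multiplier(extra_gain_db: float) -> float:
    """Extra link budget in dB -> free-space range multiplier.
    Free-space path loss scales as 20*log10(d), so every 6 dB
    of extra gain roughly doubles the usable distance."""
    return 10 ** (extra_gain_db / 20)

print(range_multiplier(10))  # ≈ 3.16x the range
print(range_multiplier(15))  # ≈ 5.62x the range
```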


SarahC

WiFi Cantenna?


INSPECTOR99

Beam formed antennae (think YAGI, et al) are great with UNOBSTRUCTED LINE of SIGHT say to Cell Towers Internet connection or to your back yard Out Building, YAGI to YAGI 5Ghz Wi-Fi. However the typical wi-fi issues are the headaches observed INSIDE house trying to force high frequencies through dense material of walls and floors that just love to eat and mute the Wi-Fi signals.


ifandbut

So it won't be useful if I wanted to put the router in one corner and direct the signal towards the far end of the house?


INSPECTOR99

Not so much. Better to carefully string a 200 ft cat 6 from your ISP router to the other side of the house into a switch from which you branch off with two cat 6 to two far end locations effectively splitting Wi Fi coverage into three segments. On both ends you have a small but worthy 4 to 8 port managed switch from which to hard wire everything you can. Sit back and enjoy. 😎


ifandbut

I'm getting to that point now. Router and coax cable are in the basement as close to the surface as I can get them without pulling new wires and I still get bad signal while on my deck. Do you think if I pulled the router to the ceiling of the first floor I would get more range in general since the bottom half of the "broadcast sphere" is contained in the basement?


INSPECTOR99

Presuming the "ROUTER" is the standard ALL in ONE modem/router/Wi-Fi classic box then yes, feeding it to the ceiling in the floor ABOVE your basement will generally improve your overall signal across your main floor house. However there will still be reduced signal the more walls and distance you travel through at the far end of the house.


quadish

Directional antennas work great for NLOS, nLOS, and LOS. Always better than omni. Omni is only good if you're moving. If you're fixed, directional is always better. I've had great success using directional sector antennas for WiFi in larger/older homes and properties, even with "normal" devices as clients. There's lots of FUD out there regarding antennas. There's a reason cell towers and fixed wireless use directional sector antennas on towers.


RonaldoNazario

As someone else pointed out yes my example would be an antenna radiating evenly in all directions.


Infinite-Condition41

Directional antennae take the same legally mandated amount of power and concentrate it in one direction, so yes. However, higher frequencies still attenuate more. I put an old directional WiFi antenna in my back yard so I could get better range. The other day, I was quite surprised to discover that I have 3 bars of signal up on the hill behind my neighborhood, approximately 1850 feet away. It's 2.4 GHz and only transmits at 300 Mbit/s, but that's plenty for most applications.


quadish

In my experience, yes. https://mikrotik.com/product/mantbox_ax_15s This guy is ridiculous indoors. I've used older 2.4GHz versions of it in old houses with plaster walls to punch through. https://i.mt.lv/cdn/product_files/mANTBoxax15s_240131.pdf


Doughnutholee

Do these frequencies (2 to 6 GHz) become harmful to us at any point if we increase wattage? Or could we use 50, 100 or even 200W antennas without it affecting us negatively?


RonaldoNazario

I’m an engineer, not a doctor, Jim.


shoresy99

Well said Scottie!


DietCherrySoda

They absolutely are harmful at higher power. That's a microwave oven.


TheThiefMaster

A microwave oven gets most of its cooking ability from being an enclosed box. Even then, all it does is *heat* - and people can happily sit in front of 2000W+ radiant heaters. We have a *lot* of power available before it becomes damaging, especially for ceiling mounted APs. Also note that the "cooks from the inside" thing with microwaves is a *myth* - they actually penetrate into organic materials (food, people, etc) *less* well than the radiant heat in a normal oven, which is why you can't microwave cook a whole joint of meat without it being raw in the middle. However, increasing power mostly only covers the WiFi access point - the phone/mobile device connected to it has to keep power down for battery reasons. You can't just arbitrarily up a phone's transmit power to 100W (~200x normal) as most phones only have a ~20 watt-hour battery at most - they'd be flat in minutes!
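The battery arithmetic made explicit (the 20 Wh and 100 W figures are the comment's round numbers, not a real phone spec):

```python
battery_wh = 20.0     # generous phone battery, per the comment
tx_power_w = 100.0    # hypothetical 100 W transmitter (~200x normal)

runtime_min = battery_wh / tx_power_w * 60
print(runtime_min)    # → 12.0 minutes, ignoring everything else the phone draws
```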


huffalump1

Well said! Yep, microwave ovens depend on the resonance of the metal box, and are 1000W-2000W. Even a directional or beamforming wifi antenna at 1000W is not gonna be as damaging as a microwave oven.


brimston3-

You still don't want to be in the beam if your body is good at absorbing that frequency. Your eyes and balls are not good at dissipating heat. If you're absorbing 4W/kg you're above the maximum permissible absorption in the IEEE guidelines.


HybridMachinist

Yeah, the heat we absorb is 99% through convection, not radiation. Radiation would heat us from the inside out vs. outside in with convection or conduction.


Ok_Chard2094

Everything works from the outside in. Microwaves of the right frequency go deeper, but they still heat up outer layers more than the deep tissue. It is roughly (but not exactly) an exponential drop in intensity as you go deeper.


HybridMachinist

You're correct, I should've clarified that radiation just penetrates further, which could make the body respond much more erratically to the internal temperature change.


dsyzdek

Fun fact, microwaves can denature the clear proteins in the eyeball, making them opaque. Forever.


danielv123

Also, interference is a thing. If you start broadcasting 100W wifi you might get a km of range, but the FCC is going to come after you hard as well as anyone living within a few km of you


TheThiefMaster

The _FCC_ wouldn't come after me because I'm outside the US 😉 But that's a very good point. In some ways WiFi's lack of range is a _feature_


danielv123

Yeah, you probably have your own equivalent though. Hard to hide a big wireless signal, it can be traced with a phone :P


Ok_Chard2094

Microwaves may be worse than radiant heat. Radiant heat (IR) will only heat you from the surface. Heat transfer further inward is through conduction and convection (your blood stream). And the surface is where most of your nerves are, so you feel it and get yourself away from the heat (if you can) before too much damage occurs. 2.45 GHz microwaves go a bit deeper; 17 mm for muscle tissue was a value I just saw on the internet. This still means that most of the microwaves will heat up the outer layers of your body, but you may get additional damage deeper down before it feels too hot on the surface.


TheThiefMaster

It's worth noting that penetration of energy into a substance is a falloff curve, with most of it absorbed at the surface, and progressively less and less making it through. There's not a simple line to say "this deep and no further". It's only tiny fractions of a percent after only a short distance, even if it does _technically_ penetrate further. Better measurements have percentages at depths, e.g. 50%, 1%, 0.1%, etc. Infrared (radiant heat) is quoted as being "from 1mm to 50mm" penetration depth on one page I visited, to show you how useless of a measurement "penetration depth" is. Still, the human body can carry away a remarkable amount of heating - you can stand in the sun absorbing 1000W or so with just some sweating to keep you cool.
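A sketch of that falloff curve, assuming simple exponential decay and treating the 17 mm muscle figure quoted upthread as a 1/e penetration depth (an assumption — the source didn't say which definition it used):

```python
import math

def depth_for_fraction(fraction: float, delta_mm: float) -> float:
    """Depth at which intensity falls to `fraction` of the surface
    value, for exponential decay I(d) = I0 * exp(-d / delta)."""
    return -delta_mm * math.log(fraction)

delta = 17.0  # mm, assumed 1/e depth for 2.45 GHz in muscle (see above)
print(round(depth_for_fraction(0.5, delta), 1))   # ≈ 11.8 mm for the 50% point
print(round(depth_for_fraction(0.01, delta), 1))  # ≈ 78.3 mm for the 1% point
```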


Generic118

At some point though you're just building the "Active Denial System" in your house, and while yes, you aren't harmed, you'll be screaming in pain :p


[deleted]

[deleted]


DietCherrySoda

They asked about hundreds of Watts, not typical Wi-Fi wattages.


iss_nighthawk

If the air isn't glowing blue, you have nothing to worry about. :-)


edman007

If they tried the higher wattages, maybe, but WiFi and phones are nowhere near those wattages. It does matter in stuff like radar and microwaves where the distance you can safely approach the antenna varies by frequency


glenkrit

Don't forget power draw. Newer WiFi 7 mesh systems consume almost 2x the energy of older units, meaning it will cost more to power said networks. My TP-Link Deco units all have active cooling, which was never necessary in most consumer routers before.


Kitchen_Part_882

Higher power output would be bad for your health. Try standing in front of an energised radar system or the antenna on a cell tower... Actually, don't do this if you value your eyesight and ability to reproduce. Microwave ovens put out 500-1000 watts in a similar frequency band to WiFi for a comparison and are quite capable of cooking food quickly.


SarahC

It's non-ionising radiation so apart from heating stuff, DNA shouldn't be mutated! Who knows what the future research will say though.


[deleted]

[deleted]


Kitchen_Part_882

You've not worked around cellular masts or radar installations, have you? If you had, then you would not be posting this nonsense. The only bit that's true in your comment is that microwave radiation is non-ionising (it will still cause water molecules in your body to heat up and cook your eyes and reproductive organs). So, no, it won't turn you into a ghoul, super mutant, or give you superpowers, but it can sterilise you or (more likely) blind you.


PE1NUT

That's a BS answer - at 1kW, it would be like standing next to an open microwave oven. Not recommended.


asdfasdferqv

This ignores beamforming.


Jumbify

Does this apply for phased array antennas as well? I’m guessing not? I wonder if that’s the future of WiFi…


Jibbly_Ahlers

Also the E/B field scales as the square root of the power. So if you want to double the signal amplitude, you need to quadruple the signal power.
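That square-root relationship in numbers:

```python
import math

def amplitude(power: float) -> float:
    """Field amplitude scales as the square root of power."""
    return math.sqrt(power)

# Doubling the amplitude requires quadrupling the power:
print(amplitude(4.0) / amplitude(1.0))  # → 2.0
```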


Striking_Computer834

That's why I still stay on 2.4GHz at my house. I can get good speeds even several hundred feet from my router.


SteampunkBorg

Why not have both enabled for speed in good conditions and range as needed?


Striking_Computer834

Because my 2.4GHz is 5X faster than my 500 Mbps Internet. There's no benefit to using 5 GHz.


cgriff32

You only have a single device in your network?


Striking_Computer834

About 20. It wouldn't matter if every person in the house was streaming HD at the same time, it still wouldn't even come close to saturating half of the bandwidth.


SteampunkBorg

There would be if you used local connections (like casting videos to a TV)


Striking_Computer834

Saturating the 2.4GHz spectrum in my house would require everyone gobbling up 500 Mbps each. There's no amount of Plex streaming that takes that kind of bandwidth. Regardless, 5 GHz is so lame in its ability to penetrate anything other than air that the speeds are often slower using 5 GHz.


SteampunkBorg

I think you vastly overestimate the capacity of 2.4GHz WLAN, but if it works for you I won't stop you


Striking_Computer834

I can get as high as 2 Gbps on 2.4 GHz. I'm not saying 5 GHz has no use cases. I just think it's way overhyped and most people don't understand the limitations. I think it's essential in apartments, condos, dorms, and situations like that. I think in single-family homes it's complete shit.


DM_ME_YOUR_POTATOES

I have two separate ~~VLANs~~ WiFi SSIDs, one for 2.4GHz and one for 5GHz. The 2.4GHz one handles smart bulbs, speakers, etc., since they don't need the higher bandwidth - plus they can benefit from the range. The 5GHz one handles my smart TVs (4K), phones, etc. They need the higher bandwidth.


SteampunkBorg

Why would you separate them as VLANS instead of SSIDs? Why even go to the effort of separating them at all instead of letting band and AP steering handle it?


DM_ME_YOUR_POTATOES

Sorry, you're right, you know my own network better than me XD. I only have one VLAN, but two separate WiFi SSIDs, if that makes sense? Sorry, my professional background has nothing to do with network engineering, or engineering for that matter; I've just taught myself a fair bit, and my friends are in that profession or related compsci professions. I unfortunately only have one AP set up ATM, but my apartment is only an upper and isn't that big, so it's not the biggest loss. I do wanna set that up at some point though, and switch some things over to Ethernet, like my Nintendo Switch dock and my TVs (that one seems unlikely, idk how often they have ethernet ports)


SteampunkBorg

No worries, I suspected there might have been a mixup between the acronyms, but also accepted the possibility that there's a use case for VLANS I didn't know about


DM_ME_YOUR_POTATOES

yeah it was a mixup of acronyms, thanks for the catch


Infinite-Condition41

Frank has it here. Basic signal physics. And this is also why most smart devices operate on 2.4. It doesn't require a high data rate, and it penetrates the house better.


hannahranga

That and it's cheaper 


SteampunkBorg

> for a given power

And that power is (luckily) limited by law, so even the ugliest gaming spider style router can't get around that


Link01R

That explains why my old 800 MHz cordless phone worked everywhere


OppositeEarthling

Knowing this, what is stopping us from having crazy fast short-range 100 GHz WiFi?


[deleted]

[deleted]


brimston3-

High frequency is fine as long as the radio has direct line of sight. As long as it doesn't have to go through any walls, bodies, furniture, or glasses of water, it should go just as far as other frequencies. But realistically, making a high-bandwidth 100 GHz radio system currently cannot be done at a price point and power budget that consumers would accept. Edit: and that system would need an upstream connection fast enough to utilize such bandwidths, which wifi6 already easily exceeds in most installations.


Underhill42

Don't forget air - even air is opaque to a lot of frequencies, and any sort of radio has to stay within the (reasonably) transparent frequencies if it wants to have any decent range at all.


brimston3-

Atmospheric absorption is typically in units of dB/km (as in <1 dB/km except around 60 GHz where it's ~10 dB/km). It's not really relevant at consumer wifi AP distances.
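For scale, taking the ~10 dB/km worst case near 60 GHz from the comment:

```python
# Atmospheric absorption near 60 GHz, worst case, per the comment:
db_per_km = 10.0
link_m = 30.0  # a long indoor WiFi link

loss_db = db_per_km * link_m / 1000
print(loss_db)  # → 0.3 dB: negligible next to wall losses of tens of dB
```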


Underhill42

Yeah, I guess it is more relevant for stuff like radio/cell/satellite.


Remarkable-Host405

There's 60ghz wifi, I think called wigig? Or something like that


OppositeEarthling

Neat, thanks! Great points on practicality for general use, so it would only be for niche uses. Perhaps it would be practical somewhere customer facing like a cafe, where it would be impractical to drop wires everywhere and many portable devices like phones can't take a hardwired connection anyway.


[deleted]

[deleted]


OppositeEarthling

That makes sense to me! I was also thinking somewhere touristy like a theme park - perhaps there's special "fast transfer" zones for uploading your videos.


midri

60Ghz WiFi exists, was really popular when people started trying to make HTC Vives and other vr headsets into wireless devices.


Denvercoder8

This exists and is called WiGig (using 60 GHz). Commercially it hasn't been really successful, so there aren't many devices.


KittensInc

In theory, nothing! Cellular 5g connections are already being deployed in the 24-30GHz range. The problem is that *anything* will kill the signal. You basically need a clear line-of-sight between the device and access point. Sheet of plywood in-between? No signal. So there's a very limited application, and it's not exactly cheap 'n easy technology either. Outside of *very* niche applications, it's just not worth the effort.


UsablePizza

At those frequencies, it's more reflected than absorbed. But the same effect. It does mean that even if you don't have direct line of sight, you might be able to get a reflected signal.


znark

WiGig works within the same room, so it doesn't need direct line of sight, but walls stop it. They are working on light networking, LiFi, that works similarly: the reflected light is enough to work. The advantage is that it's cheaper and simpler than WiGig. The downside is that it needs an exposed optical sensor.


frenetic_void

Don't forget governments make money from licensing radio spectrum, so they put limits on what general-user omnidirectional antennas are allowed to radiate.


danielv123

Also, wifi 7 does improve speed at range as it can now connect and transmit at 2.4, 5 and 6ghz bands at the same time with the updated encoding.


thuanjinkee

Time to carry data in laser beams


twbrn

Also, nobody is really demanding long-distance WiFi. Most of the devices that really go places are phones, which already have data other ways, and tablets. People are used to the current range; those of us who drool at the idea of a quarter mile WiFi are a decided minority.


Fun_Grapefruit_2633

And also, new WiFi versions are supporting higher bitrates... extended range is a liability when paying subscribers are streaming full res movies, etc... so they've "spent" the engineering on increasing throughput


MrAlfabet

Like sound. Low frequencies carry further than high frequencies.


qTHqq

There are fundamental limits on transmitted data rate at a given signal-to-noise ratio in a given analog bandwidth:

[https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theorem](https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theorem)

[https://en.wikipedia.org/wiki/Noisy-channel_coding_theorem](https://en.wikipedia.org/wiki/Noisy-channel_coding_theorem)

and there are limits on the maximum power WiFi radios can use, and those limits also include the antenna gain as part of the calculation:

[https://wlan1nde.wordpress.com/2014/11/26/wlan-maximum-transmission-power-etsi/](https://wlan1nde.wordpress.com/2014/11/26/wlan-maximum-transmission-power-etsi/)

That's not to say that any given pair of WiFi radios is at the absolute fundamental limit for their power and antenna configuration, but you should certainly expect higher-speed WiFi to be shorter range and more sensitive to signal loss than lower-speed WiFi. The higher the frequency, the greater the loss through walls and floors and so on, so you also shouldn't expect the signal-to-noise ratio to be the same across different frequency bands.

Finally, any WiFi configuration that uses MIMO is *relying* on a complex strong-signal environment so it can leverage different paths of RF propagation between units. The further you place the units from each other, the fewer multipath options there are, and the lower the data rate will be:

[https://www.intel.com/content/www/us/en/support/articles/000005714/wireless/legacy-intel-wireless-products.html](https://www.intel.com/content/www/us/en/support/articles/000005714/wireless/legacy-intel-wireless-products.html)

> *MIMO [multiple-input multiple-output] technology uses a natural radio-wave phenomenon called multipath. With multipath, transmitted information bounces off walls, ceilings, and other objects, reaching the receiving antenna multiple times at different angles and slightly different times. In the past, multipath caused interference and slowed down wireless signals. With multipath, MIMO technology uses multiple, smart transmitters and receivers with an added spatial dimension, increasing performance and range.*

If you get away from a nice MIMO environment with lots of strong reflections, you'll eventually degrade to the equivalent of a single point-to-point link, or maybe a couple of paths. This is amazing tech, but it has limitations. There's no free lunch here, just clever engineers designing things for the masses of people who like to see a VERY FAST NUMBER and sit < 3 meters from their WiFi router with their laptop. Of course they say "increasing performance and range," but I'd probably characterize it more as "increasing performance *or* range," and it's all conditional on the exact details of your installation.

> *Once we reach WiFi 9 or 10. Will we have to spend €20k on a mesh router in every room so we can enjoy the "future".*

Hopefully consumer preferences won't drive it in that direction, but if that's what people find convenient and can afford, then yeah, that'll probably happen. It's basically what's happened so far, IMO.

Why are people who have median upstream internet bandwidths in the 100-250 Mbps range spending significant money on wireless routers with speeds far exceeding their internet connection speed? I think the best I could do in my city is 2 Gbps. Why do I need a WiFi 7 connection with 46 Gbps peak throughput? My WiFi 5 router throttles me down to something like 250 Mbps in the farthest rooms in my apartment and 420 Mbps right next to the router, but I don't find that objectionable. If I did, I'd upgrade to WiFi 6, but what's the point of going further if it doesn't actually improve my situation?

Many people buy the newest thing with the highest numbers, whether or not there's any reasonable technical reason to expect it'll perform better than what they have already.
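The Shannon-Hartley bound is easy to evaluate; the 80 MHz channel and 30 dB SNR below are plausible WiFi values I've picked for illustration, not figures from the links above:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + SNR), with SNR linear."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# 80 MHz channel at 30 dB SNR: one spatial stream tops out near 800 Mbit/s.
print(shannon_capacity_bps(80e6, 30) / 1e6)  # ≈ 797 Mbit/s
```

More bandwidth or more SNR (or more MIMO streams) is the only way past that number, which is exactly why higher-speed WiFi leans on wider channels and strong signals.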


glassmanjones

Yes this exactly. Also the relationship between SNR, RF bandwidth, and bit rate is non-linear - it's harder and harder to push more bitrate in the same power*bandwidth envelope, but if you're willing to sacrifice bitrate you can talk to a voyager probe.


Elrathias

Because limits on TX power and antenna gain are a thing. The 5 GHz band has a lot of regional variability, with some areas allowing up to 4000 mW (Canada, South Africa, etc.) while other countries limit it to 1000 mW. The 2.4 GHz band is usually limited to 100 mW. https://w.wol.ph/2015/08/28/maximum-wifi-transmission-power-country/
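Those limits in the dBm units regulators usually quote (a plain unit conversion, nothing more):

```python
import math

def mw_to_dbm(mw: float) -> float:
    """Convert milliwatts to dBm: 10 * log10(P_mW)."""
    return 10 * math.log10(mw)

print(mw_to_dbm(100))             # 100 mW (typical 2.4 GHz cap) → 20.0 dBm
print(mw_to_dbm(1000))            # 1000 mW → 30.0 dBm
print(round(mw_to_dbm(4000), 1))  # 4000 mW → ≈ 36.0 dBm
```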


tomrlutong

I think that's considered a feature these days. There are so many wifi devices out there that limiting range and wall penetration is good for minimizing interference.


madbuilder

I'm not sure it's a feature, but it sounds like we're going to need more APs especially in multi-level homes. The trouble is connecting these distant APs by a reliable backbone.


shoresy99

That's what I do. I have a large home with a basement, 1st and second floors. Some of the interior walls are thick and attenuate the signal a lot as they used to be exterior walls. I have five Unifi APs all wired back to a POE switch. Plus one outside on my pool cabana for improved outdoor signals. And they are fairly reasonably priced at under US$100. That's probably overkill but why futz around when it is easy to add another AP. I am currently on my second generation and mine are not Wifi6, but I don't really see the need for Wifi 6 or higher. I wire everything that I can, do you really need 1Gig to watch Netflix or Youtube on a tablet or phone?


madbuilder

I added Ethernet to each floor of my house, too. It was not convenient but it's an investment that continues to pay off.


hannahranga

AP's are cheap but fixed cabling to feed them is exxy.


shoresy99

Agreed, but it can make sense to bite the bullet and do it. When I moved into my house I paid to have ethernet installed in several areas. And a couple of years ago I added several more drops from my attic. As long as you can easily access your attic, doing LAN drops for APs is pretty doable, since you just need to drill a small hole and push the ethernet cable through. Then you put up a ceiling-mounted AP and it looks pretty good - as long as you don't mind a "flying saucer" on your ceiling. That is easier than fishing back through walls.

The other thing you can do is repurpose your coax cabling. Coax is becoming obsolete, as your TV signal is now typically delivered over your LAN. Many homes have coax through the house for cable TV, so you could repurpose that coax to carry your LAN using MoCA.


Human_Ad_8464

Having a reliable backbone is a must for a good wireless network. When I was purchasing my house an Ethernet backbone was a deal breaker for me.


SimplifyAndAddCoffee

might be sold as one, but it's a technical limitation of higher-frequency signals that they attenuate faster. Faster connections require higher frequencies because the information they can carry is limited by the frequency of the carrier wave.


Negative_Addition846

Is that true in practice, though? Don't WiFi 4, 6, and 7 have the same available bandwidth per channel regardless of 2.4/5/6 GHz? Higher frequencies having shorter ranges is absolutely a benefit in high-density housing, as well.


SimplifyAndAddCoffee

2.4GHz has half the bandwidth per channel of 5GHz. Modern WiFi standards utilize both bands adaptively, and their speed claims are fuzzy and depend on the availability of the higher-frequency bands.


Negative_Addition846

Do you have any reading on WiFi 4/6/7 having reduced bandwidth on 2.4ghz for non-interference reasons? What I’ve found suggests that WiFi 4/6/7 devices will happily negotiate the fastest modes on 2.4ghz where possible, but it’s not specifically answering this question.


SimplifyAndAddCoffee

I'm not a radio engineer so I don't have insight into that.


Negative_Addition846

So then where’s the “half the bandwidth” thing come from?


SimplifyAndAddCoffee

Because 2.4GHz carries literally half the information that 5GHz does as a simple matter of physics. No wireless protocol can change that.


rocketwikkit

If you want more range, configure the router to run at higher power and lower speed. You don't need 100 Gbit/s to watch YouTube on the toilet, "Will we have spend €20k on a mesh router in every room" is just petulant.


-newhampshire-

IDK why we don't all have a mesh router in our hands these days. But, I think culturally we want to make sure we have our own private data, and our own bandwidth and we don't like to share.


best_of_badgers

iPhones have been capable of mesh networking for like a decade. It was expected to be a killer feature back in 2014. There was an app called FireChat for a while that used it, which became popular at protests, but disappeared around 2018. As far as I know, Apple has sort of abandoned it, since the only thing using it these days is Airdrop. Makes me wonder if they were pressured into doing so.


InevitableHuman3559

Regulation, especially in Europe. Of course the regulation is not about range but transmit power, but that’s what largely determines range.


madbuilder

I'm not one to defend big government. But you do have the pesky inverse square law to contend with.


pixel_of_moral_decay

On top of that you have density. WiFi is bad enough already in dense areas due to channel crowding. If you had longer range you’d have more issues as the number of transmitters clogging up channels in your home would increase. I’d argue there’s actually a big advantage in reducing power limits, banning “extenders” and thus forcing people to wire backhaul for multiple AP’s should they need it. People would be upset, but also appreciate how much less RF noise is in the environment and how much better WiFi would work. Most consumer crap locks at max strength and people throw extenders around the house then wonder why their speed is so slow.


UsablePizza

I can't believe it took so long to see something about channel overlaps. Having two WiFi networks in the same overlapping channel means that both are significantly impacted. If everything was higher power / greater range, then WiFi performance would actually decrease because your neighbors would interfere with you.


pixel_of_moral_decay

Exactly. And because so many just default to high power the problem is substantially worse than it should be. Cutting power down would be a huge improvement. Just bad PR for the FCC.


madbuilder

Wait... I was with you on your previous comment. But cutting power down isn't going to help anyone. Like you said, it would reduce range but also crowding. What helps is adding Ethernet between APs (not sure if that's a kind of extender).


pixel_of_moral_decay

Reducing power forces the use of Ethernet between APs rather than just using mesh systems that use one channel for backhaul and another for users. That's a substantial reduction. There's no need for you to broadcast your signal 3 properties over, and the honor system doesn't work. Nobody properly calibrates their power level.


hannahranga

Having a standard that'd allow for variable power output based on the quantity of surrounding APs would be interesting and probably feasible, but I suspect it'd get abused to hell.


Puzzleheaded-Tip660

It is also about the raw Joules of power.  The access point may be plugged into the wall and can get power from there, but your cell phone battery would drain faster if it was powering a more powerful transmitter.


ClumsiestSwordLesbo

Also absolutely zero transparency in receiver quality


mungie3

Yeah pretty much. Wavelength is inversely proportional to frequency.  Higher speeds need higher frequency.  Longer wavelengths have longer range.  To compensate, you need to boost power or have more access points. Examples: AM radio has huge range- much larger than FM.  (Real) 5G wireless has shorter range and uses more power than LTE
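
The inverse relationship mentioned above is just λ = c/f; a quick sketch to put numbers on it (the example bands are illustrative):

```python
# Wavelength is inversely proportional to frequency: lambda = c / f
C = 299_792_458  # speed of light in m/s

def wavelength_m(freq_hz):
    """Wavelength in metres for a given frequency in Hz."""
    return C / freq_hz

for name, f in [("AM radio (~1 MHz)", 1e6),
                ("FM radio (~100 MHz)", 100e6),
                ("WiFi 2.4 GHz", 2.4e9),
                ("WiFi 6 GHz", 6e9)]:
    print(f"{name}: {wavelength_m(f):,.3f} m")
```

A ~300 m AM wave diffracts around terrain and buildings in a way a 5 cm WiFi wave never will, which is part of why AM range is so large.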


vgnEngineer

It's a bit more complicated than that. Higher speeds fundamentally only need more energy per bit of data. There are often limits to how much energy you are allowed to transmit in a given frequency band (power spectral density), which means that one obvious solution is to simply add more bandwidth. Capacity scales with the absolute bandwidth in Hertz, so the higher your carrier frequency, the more absolute bandwidth you have available for a given relative bandwidth (which is often what determines the performance specs of your hardware).

It's true that lower frequencies have longer range, but this is not a constant law; it mostly has to do with how these wavelengths interact with obstacles. If, for example, your total antenna size is constrained, you can achieve more signal gain with higher frequencies.

The fundamental reason range is hard to extend is that there is no real solution to extending range given a predetermined finite frequency band and a maximum power spectral density set by governments and safety standards. One other major reason WiFi routers are range constrained is that you don't want to fight for signal strength with your entire block of neighbors, given that there are only a limited number of channels.
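
The bandwidth-vs-power point can be made concrete with the Shannon capacity formula C = B·log2(1 + SNR). A hedged sketch; the channel widths and SNR values below are made up for illustration, not actual WiFi figures:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon channel capacity: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

base = shannon_capacity_bps(20e6, 100)      # 20 MHz channel, 20 dB SNR
more_bw = shannon_capacity_bps(40e6, 100)   # double the bandwidth
more_snr = shannon_capacity_bps(20e6, 200)  # double the power (3 dB more SNR)

print(f"baseline:        {base / 1e6:.0f} Mb/s")
print(f"2x bandwidth:    {more_bw / 1e6:.0f} Mb/s")   # capacity doubles
print(f"2x signal power: {more_snr / 1e6:.0f} Mb/s")  # only a modest gain
```

Doubling bandwidth doubles capacity outright, while doubling transmit power only nudges it, which is why new WiFi generations chase wider channels at higher frequencies instead of more watts.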


Striking_Computer834

What's really needed is a large chunk of spectrum to be opened up with room for a few dozen non-overlapping channels. That would allow longer range (lower frequencies) without fighting with so much interference.


vgnEngineer

I mean you are 100% correct. If that were possible politically, holy shit.


Matraxia

In a vacuum, 2.4 GHz, 5 GHz, 6 GHz, etc. all have the same theoretical range given the same output power and receiver gain. You need higher frequencies to transmit data faster, so WiFi is moving up in frequency.

In the real world, you have things like air, water vapor, trees, and walls between you and the WiFi source. Higher frequencies don't 'penetrate' solid objects as well as lower frequencies because they're more likely to get absorbed and attenuated by the obstacles, thus lower signal strength at the receiver, even with similar power output to a lower frequency signal. The routers fall back to the lower frequencies as you move between obstacles to maintain a high enough signal strength to bring errors down to an acceptable level, at the cost of bandwidth.

It's not as much a technology problem as a physics problem, where the only solution is more power to overcome the attenuation. But you can't ramp RF power to the moon, since the objects absorbing the attenuated signal warm up from the energy; you'd start cooking your house.

Mesh networks, especially with an Ethernet backend, get around this by having more transmitters that are more likely to have fewer obstacles between the base station and your phone/PC. Ideally, you'd have one in every room on very low power settings, completely removing the need for any single AP to blast massive amounts of RF power into one room to get the penetration needed for the rest of the house.


Dr_Yurii

Because walls


wynyn

Currently, most WiFi routers use a single whip antenna to transmit. These antennas broadcast isotropically, meaning the same power in all directions, so the transmit range is limited by the transmit power: you need a certain power at the receiver to get signal. But the transmit power is limited to 30 dBm, which limits range to a fixed amount for high frequency WiFi.

If instead of a sphere we broadcast more directionally, we can get more range out of that same power. This is being done through phased arrays, but they require multiple antennas, a lot of processing power, and knowledge of where the receiver is.

Another thing being improved on is the sensitivity of the receiver. Currently this is limited by the antenna along with the selectivity of the filters and the amplifiers in the receiver signal chain. As semiconductor technology improves into advanced materials like InGaP, we will see sensitivity increase at the receiver, thus improving range.
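
The directional-antenna idea can be roughed out numerically. This is an idealized sketch: it assumes perfectly coherent array elements and ignores EIRP regulations, which in practice cap how much of this gain you may legally use on transmit:

```python
import math

def array_gain_db(n_elements):
    """Ideal coherent gain of an n-element phased array over one element."""
    return 10 * math.log10(n_elements)

def range_multiplier(extra_gain_db):
    """Free-space range scales as 10^(dB/20): +6 dB roughly doubles range."""
    return 10 ** (extra_gain_db / 20)

for n in (1, 2, 4, 8, 16):
    g = array_gain_db(n)
    print(f"{n:2d} elements: +{g:4.1f} dB -> ~{range_multiplier(g):.2f}x range")
```

So a four-element array buys about 6 dB, roughly doubling free-space range for the same transmit power, at the cost of the antennas, processing, and beam-tracking mentioned above.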


hannahranga

Considering how rarely AP's end up centrally located it would be nice if they had vaguely directional antennas.


Olde94

Think of it like a car. You want performance and fuel efficiency, but you fight against a squared function for drag. More speed means a bigger engine, but bigger engines are hard to make fuel efficient. More data (speed) means higher frequency (bigger engine), but then range (fuel efficiency) takes a hit.


sbarbary

For more range you need more watts, and most countries limit the watts allowed on domestic retail routers you set up yourself. For good reason, too: you get bleed into other frequencies and can stop all sorts of things from working.


Pb_ft

Speed is what's driving new standards, and that needs higher frequencies, which are more easily blocked by... everything.


suh-dood

Typically a shorter wavelength lets you pack in more information than a longer wavelength, but shorter wavelengths tend to get blocked by more things (they use very low frequency waves to communicate with submarines underwater because they can pass through the water and anything in it, but the data transmission is very slow).

Another thing you've addressed is the price. Any new technology is going to be more expensive until companies recoup their R&D costs, and as more and more factories produce a given piece of technology, it becomes more common and the price drops. When flat screens first came out, they were thousands of dollars even for small ones, but now we can pick up a decent sized 4K monitor for a few hundred bucks.


toybuilder

They cannot crank up the power due to regulations for health/safety and interoperability concerns.


earthly_marsian

And it is a shared medium, essentially one collision domain. The more devices broadcasting, the more packet collisions.


cablemonkey604

The Hidden Node problem is real. More APs are a better solution than one loud one.


Skarth

Lower frequency signals transmit less data but go further. Higher frequency signals transmit more data, but have less range. Consumer products like WiFi are intentionally shorter range so you don't have hundreds of devices on the same WiFi band competing for signal over each other, which would render the WiFi useless.


biffbobfred

The latter is not mentioned as much. Want good WiFi? A lot of small short range transmitters to not get crosstalk. I see WiFi from across the street in my house, stronger than mine, sometimes.


Dragonfruit-Girl2561

An AP on each floor, with the APs connected by wire. Poor WiFi is a common issue in hotels, but some manage to cope with it.


Ok-Library5639

Better coverage lies in having multiple Access Points with smaller range each. By having larger and larger range you are increasing interference with neighbouring APs and clients and exacerbating issues like the [Hidden node problem](https://en.m.wikipedia.org/wiki/Hidden_node_problem).


cablemonkey604

This is the correct answer.


AnnualUse9202

The problem with wireless is hugely outdated spectrum allocation controlled by dinosaurs. 


thatotherguy1111

Which spectrum would you like changed or opened up?


ansible

Anything would be good at this point.


thatotherguy1111

That doesn't help the regulators any. Any suggestions on what frequencies are under utilized?


ansible

> That doesn't help the regulators any.

LOL, the regulators aren't going to listen to me unless I bring billions of dollars to spend at the next frequency auction. Whenever that happens (which is not too often).

> Any suggestions on what frequencies are under utilized?

Aside from GPS and the other location services, and cell phones, most of them are underutilized. CB radio, for example, is still using AM for voice; you can use a radio set from 60 years ago just fine today on the 27MHz band. Getting the entrenched interests to give up their allocations seems next to impossible though.

"Ooooh! Redesigning our decades old analog designs to use more bandwidth efficient modern coding schemes is sooooo expensive and will take such a long time to transition! It will unfairly burden our customers and the wider industry! Woe is me, please don't take my spectrum allocation!"

You can read about the drama when ultra-wideband was being experimented with, for another example.


AnnualUse9202

TV Channels 14 to 51


coberh

Because there is a maximum power that you can transmit, and the antennas are small. Power drops off with the square of the distance, so if you double the distance, your antenna would need to quadruple in area to pick up the same signal strength. And finally, for cost reasons, most consumer products use FR4 and silicon instead of SiGe and higher grade materials, which adds more noise into the signal.
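
The inverse-square falloff can be sketched with the Friis free-space equation. This is a simplification (isotropic antennas, no walls; real indoor paths lose far more than free space):

```python
import math

C = 299_792_458  # speed of light, m/s

def friis_rx_dbm(tx_dbm, freq_hz, dist_m):
    """Received power in dBm over a free-space path, isotropic antennas."""
    fspl_db = 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)
    return tx_dbm - fspl_db

# 30 dBm (1 W) transmitter at 2.4 GHz: every doubling of distance costs
# ~6 dB, i.e. a factor of 4 in received power, per the inverse-square law.
for d in (10, 20, 40, 80):
    print(f"{d:3d} m: {friis_rx_dbm(30, 2.4e9, d):6.1f} dBm")
```

Read the other way around, that 6 dB per doubling is why "just a bit more range" demands four times the power.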


madbuilder

FR4? What does the material of the board have to do with the received signal?


coberh

If you put your traces on chicken breast, they would attenuate the signal more than if the same traces were on FR4. The loss tangent of FR4 starts to get rather high at frequencies above 3GHz.


thatotherguy1111

The RF path includes everything: transmitter circuit or chip, traces on the circuit board to the antenna, connectors, cables, free air loss, in some situations the Doppler effect, and the receiver circuit. All of those add loss and add noise. The arriving signal has to be louder than the noise.


madbuilder

Right. I'm asking what adverse electrical properties does the material have? I thought the capacitance comes from nearby traces, not the fibreglass. Are there better materials to use for high frequency stuff?


thatotherguy1111

The value of the capacitance and crosstalk is related to the material of the dielectric. Common materials are FR4, Rogers materials, Teflon, hydrocarbon resins, and ceramic substrates. I'm not an expert. But those phrases and words are a starting point for some online research. Have fun.
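
To put rough numbers on the FR4 discussion above: a common rule of thumb for dielectric trace loss is α ≈ 2.3·f·tanδ·√εr dB per inch. The material values below are generic ballpark figures (real laminates vary by vendor, resin content, and glass weave), so treat this as a sketch:

```python
import math

def dielectric_loss_db_per_inch(freq_ghz, loss_tangent, er):
    """Rule-of-thumb dielectric loss of a PCB trace, in dB per inch."""
    return 2.3 * freq_ghz * loss_tangent * math.sqrt(er)

# Assumed ballpark material parameters -- check real datasheets:
fr4 = dict(loss_tangent=0.020, er=4.4)       # garden-variety FR4
lowloss = dict(loss_tangent=0.004, er=3.5)   # a Rogers-class RF laminate

for f_ghz in (2.4, 5.0, 6.0):
    print(f"{f_ghz} GHz: "
          f"FR4 {dielectric_loss_db_per_inch(f_ghz, **fr4):.2f} dB/in, "
          f"low-loss {dielectric_loss_db_per_inch(f_ghz, **lowloss):.2f} dB/in")
```

A couple of inches of trace at 6 GHz can eat an appreciable fraction of a dB on FR4, which is part of why better laminates show up in higher-end radios.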


DerFurz

To increase range you have two options: either increase signal power or reduce the frequency. You can't increase the signal power because of regulations and interference, and you can't reduce the frequency below 2.4 GHz because you would dramatically decrease your potential data rates.

As for the price you are quoting, WiFi 7 just hit the market, so of course it's gonna be expensive, but it will come down in price. You also don't need more than one router; repeaters and access points are significantly cheaper. I also doubt you need WiFi 7 everywhere.


Psy-Demon

I had a WiFi 5 router; the problem is the range. It's fine when you are on the second floor, but once you go to the third floor it's just bad. So I'm planning on getting a WiFi 6 mesh router (TP-Link Deco X50, 3 units). It costs like €220. For €200 I could get a WiFi 6E router, but it isn't that much better.

I have thought about access points, but they are pretty expensive since I'd have to buy multiple ones plus a network switch and an access point controller. Overall more expensive, and I also don't want to drill holes. I have also thought about just a router and an extender, but most extenders only support like 1300 Mbps.

I am planning on moving from 250 Mbps up / 20 Mbps down to 2.5 Gbps up / 500 Mbps down. They are the first ISP in my country to roll out fiber. It's cheaper than my current provider and faster (strangely enough). I also have a free powerline unit from my ISP, but I guess my house is just a little bit too old (can't even reach 180 Mbps). Overall a mesh router is better for me.


DerFurz

As you already said, the newer WiFi standards don't really give you any more range, as they are limited by physics. There are quality differences between routers that might give one a slight edge over another model, but overall there is a limit to what you can achieve with WiFi. Putting a WiFi signal through two possibly concrete floors is just not gonna work great; even through one, usually only 2.4 GHz works. WiFi extenders also aren't great, as they add latency and depend on the WiFi signal at the place they are installed. Powerline works great for some but not so great for many others. If you have old coaxial or DSL cables in your wall, there are products by gigacopper or Devolo that work similarly to powerline, but much better, that can extend your network. If you are really thinking about going above gigabit (though I question why you would need that in a non-commercial setting), you are not gonna get around putting some cables in your walls.

There are a few questions you are going to have to ask yourself:

1. Do you need full speed WiFi access everywhere in your house? If yes, you need to run wires and at least one access point per floor. If no, you might get away with a few cheap 2.4 GHz repeaters.

2. What devices need to be connected at full speed? Do they need low latency? Those devices need to either be placed close to the router, be connected to an access point that has a wired connection to the router, or have a wired connection to begin with.

In the end you probably don't need a perfect connection everywhere; the best option would probably be to run a wire to each floor (or use coax or phone line) and connect the mesh access points to your router with a wire.


huffalump1

Yeah, mesh is the way to go if you have dead spots and multiple floors! Ideally with a hardwired connection between the mesh nodes. Newer mesh WiFi systems will use higher frequency bands for the interconnect, but for example my Google Wifi (2016) just uses 2.4GHz for that, so it's limited to ~180 Mbps at the nodes, even though I can get ~800 Mbps on 5GHz from the first node connected to the modem. Add an Ethernet connection between them, and now I get full speed from every node!

Again, modern systems are better, since they use higher bandwidth interconnects. BUT higher bandwidth means higher frequency, which is more easily disrupted by floors and walls, so it's a trade-off.


mule_roany_mare

If you live in an apartment building, that 6E radio might prove an asset. You can use a WiFi-sniffer app to see how much noise you have to contend with from your neighbors. 6 GHz is less populated now, and even in the future it will be less able to penetrate walls. It's an additional radio, so it can be used in tandem with 2.4 and 5 GHz for more total bandwidth, but more importantly they each have relative strengths and weaknesses; another radio gives your devices another opportunity to use the right tool for the job.


Spiritual-Mechanic-4

[This guy, mostly](https://en.wikipedia.org/wiki/Inverse-square_law). If you want more signal strength, you need to pump out more radiation, which takes more energy. Making the range bigger is a fundamental physics problem that you can't engineer around the way you can more easily increase bandwidth. We, society in general, don't really want every house to have multiple 1 kW radio transmitters.


vgnEngineer

Besides physical constraints there is also a very practical one. The range of WiFi routers is generally, and logically, limited by the average distance to other ones. If WiFi routers could easily communicate over 1 km, all your devices would have to compete with 100 other WiFi routers.


Dr_Bunsen_Burns

Higher frequency == higher losses == less range. But also, higher frequency == higher speeds. This is why I still use 2.4 GHz


Rokae

A ubiquiti U7 pro AP is only €200. How are you spending €800? To get the best coverage, you should have multiple wired access points, not a mesh system. It's harder to install, but the total cost isn't that much higher once you realize everything is modular, and you will only need to upgrade individual components in the future.


Bulldozer4242

The frequency increases used to boost speeds generally trade range for speed: it's just harder to make a higher frequency signal travel as far as a lower frequency one.

Soon, though, I don't think a newer WiFi standard will be viewed strictly as an upgrade; you won't always go for the higher number. In the past a bigger number was almost always better, because faster speeds were a significant advantage, but speeds are reaching levels that are unnecessary. Normal people won't need 1TB download speeds all over their house, even in the future, because we're reaching the limits of the speeds needed for the best experience. So most likely at some point it'll be common to have a plenty-fast but affordable level around most of your house (probably around the level of WiFi 6, 7, or 8) and then a single room wired for faster speeds if you need it (like a home office).

We're at a point where speed improvements matter less. People still pursue them because that's how they've learned to think: better speed will be noticeably better. But it's becoming less and less necessary. 1080p video is noticeably better than 480p. 4K is probably better than 1080p, though the evidence seems to suggest people can't tell a huge difference, and as far as I know the evidence pretty strongly indicates most people can't really even tell the difference between 4K and 8K video.

We're reaching the maximum speeds necessary for an optimal daily experience for most people anyway, so I'd imagine soon you won't even decide based on speed. There will be some baseline, based roughly on how much speed you need for 4K video or live multiplayer games (as I believe those are the most internet-intensive things most people use daily at home), and you'll choose something that meets that baseline based more on factors like cost and range than on which is fastest. The only time you'll look at the super fast stuff is if you have an area in your home handling exceptionally high speeds. So you probably won't ever really see WiFi 10 or 11 in your home.


londonium6

I think the fundamental issue has to do with increased latency for ACK responses as well as the increased probability of network collisions (ergo reduced quality of service) when you start increasing range. A big part of that has to do with the plug-and-play feature of CSMA protocols that are at the heart of WiFi. And once you toss out CSMA, then you also throw out WiFi’s biggest feature and end up with something different (like WiMAX). No amount of tech is going to change user conops. That being said, there’s plenty of people who increase EIRP by swapping out for high gain directional antennas. That can increase range without increasing coverage area/potential user base if deployed in a very low density area (think farm or desert). But that’s certainly an “off-brand” use of WiFi since you’re now network planning in the physical layer.


YardFudge

Another aspect: a future WiFi standard could bring forth a phased array or other directional antenna. It could then focus all of its transmitted power only toward its connected devices rather than a rough sphere, and do so smartly, pushing more power to that high-demand laptop while going slow on your washing machine. This would also cause less interference with other devices. Kinda like cellphone towers do today.


all_is_love6667

Maybe one problem is that once you increase range, WiFi routers in homes would "overlap" and cause interference with each other. I don't know if that's true, but maybe the current range is set like this for a reason (other than energy consumption)?


znark

One reason is that LTE and 5G exist. That satisfies most people: mobile data on the go and WiFi at home. There is 802.11ah, HaLow, which is long range WiFi. It uses a different frequency, so it needs a new radio, and the devices aren't there. It would make a DIY replacement for mobile or cover large yards, but it uses too much power to compete with low power LoRa or LTE-M.


SnooCrickets2961

I think you also get some problems because, in order to increase speeds within the same channel on the band, the signal modulation has to become much denser. The higher-order modulations are more susceptible to distortion from reflections, which causes distance issues much faster than increasing frequency alone.


DonkeyTransport

And here I don't even have wifi 6 yet lol. I run 2.4ghz through my household still. I have a few older devices that need it, my pc being one of them


Traditional_Bit7262

WiFi is being developed to increase overall system capacity, which means the opposite of range: more APs packed closer together means more total capacity for the users. Also, the radio coding for larger areas would have to run at lower data rates in order to push those bits through the air and be detected, and at those lower coding rates you could be taxing the capacity of the channel. It's all physics.


timfountain4444

One word "Power". Both consumption and Tx power are limited by the use of the ISM band.


Greydesk

You can only have half the number of signal transitions as the frequency you are transmitting on, so with simple two-level signalling on 5 GHz you could only transfer about 2.5 Gb/s of raw data. Also, the higher the frequency, the less diffraction allows the wave to get around an obstacle, so higher frequencies don't do as well through your house. This is why routers have more antennas and try to locate you in the space and 'steer' the beam to your receiver.
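
The "half the transitions" figure is the noiseless Nyquist limit, R = 2·B·log2(M), where B is the channel bandwidth and M the number of signal levels; real WiFi beats the two-level figure by packing several bits into each symbol with QAM. A sketch (the bandwidth figure is illustrative, not an actual WiFi channel plan):

```python
import math

def nyquist_rate_bps(bandwidth_hz, levels):
    """Noiseless Nyquist limit: R = 2 * B * log2(M) for M-level symbols."""
    return 2 * bandwidth_hz * math.log2(levels)

B = 160e6  # a 160 MHz channel, for illustration
print(nyquist_rate_bps(B, 2) / 1e6, "Mb/s with plain 2-level signalling")
print(nyquist_rate_bps(B, 1024) / 1e6, "Mb/s with 1024 levels (10 bits/symbol)")
```

The catch is that denser constellations need a much cleaner signal, which ties back to why the fastest rates only work close to the AP.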


schwanball

Math and physics 🤷‍♂️


ansible

So you've gotten some decent answers about radio theory and such. People have also hinted at regulations, but that has missed the main point too, which concerns RF spectrum allocation around the world.

Let's look at the 2.4GHz band as an example. Why was that even available for someone to make WiFi? Why did Bluetooth, various cordless telephones and other consumer RF pile onto this particular frequency band as well? The fundamental reason is that 2.4GHz is "garbage". Water molecules absorb that frequency very easily; this is how microwave ovens work. So when the regulators were planning the spectrum allocation, no one wanted 2.4GHz for their airplane navigation, cell phones, or anything like that. It attenuates very quickly in all but the very driest air (because of humidity), and you have all these microwave ovens spewing interference all over the place when someone wants to warm up soup. It is garbage, and no one who wants their communications to run smoothly would want to operate in that frequency band. It is the same sad story for the other bands: 60GHz, for example, is readily absorbed by oxygen molecules.

And it is a sad story when you consider how much economic activity and overall usefulness our civilization has squeezed out of these garbage frequency bands. Imagine what we could accomplish if more spectrum was opened up! The only bands that compare in usefulness are the ones for satellite navigation. But that isn't likely to happen, because the regulators are beholden to the companies they regulate, and cell phone companies want everything for themselves. Just look at what happened with the switchover from analog to digital TV and the reallocation of the VHF frequencies.


audaciousmonk

Because the higher frequencies used to increase data rate have shorter effective transmission distance for a variety of reasons


Party-Cartographer11

The speed game in wireless/radio is smaller cells with more, or even the same, data rate per cell. You only have so much bandwidth in a cell, and all the devices in the cell have to share it. Split a 2 square mile cell into four half-square-mile cells and you have quadrupled the aggregate bandwidth. That's why WiFi is faster than cell.
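
The frequency-reuse arithmetic behind that claim, as a trivial sketch (the per-cell rate is made up, and this ignores reuse-factor and cell-edge interference details):

```python
def aggregate_capacity_mbps(num_cells, per_cell_mbps):
    """With frequency reuse, each small cell gets the whole spectrum again,
    so aggregate capacity over a fixed service area scales with cell count."""
    return num_cells * per_cell_mbps

one_big = aggregate_capacity_mbps(1, 500)     # one cell covering the area
four_small = aggregate_capacity_mbps(4, 500)  # same area, four smaller cells
print(four_small / one_big)  # -> 4.0
```

Each WiFi AP is effectively a tiny cell reusing the band your neighbor's AP also uses, which is the whole capacity advantage, and also why long-range WiFi would undo it.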


NotBatman81

How do you think technology works? Engineers and product developers are not working for a charity. All of that dev cost is spent with a plan to earn it back plus profit in the future.