The other bit of advice buried in there that no one wants to hear for residences: the best way to speed up your Wi-Fi is to not use it. You might think it's convenient to have your TV connect to Netflix via Wi-Fi, and it is, but it makes everything else that really needs the Wi-Fi slower. Hooking up everything you possibly can over Ethernet is a much better answer than the more traveled route of more channels and more congestion with mesh Wi-Fi.
Absolutely. Everything other than cell phones and laptops-not-at-a-desk should be on Ethernet.
I had wires run in 2020 when I started doing even more video calls. Huge improvement in usability.
(We do have one internet-connected device which permanently lives about an inch away from one of the ethernet sockets, but it is, ironically, a wifi-only device with no RJ45 port.)
The endpoint in my living room also has a wifi AP so signal is pretty good for laptops and whatnot.
In NYC every channel is congested, I can see like 25 access points at any time and half are poorly configured. Any wired medium is better than the air, I could probably propagate a signal through the drywall that's more reliable than wifi here.
So having something I can just plug into the wall is pretty nice compared to running cables even if it's a fraction of gigE standards.
So they would have to do quite a bit of work to run cable. The same goes for people living in apartments who can't just start drilling through walls.
I'd say most people use wifi because they have to, not out of pure convenience.
So true!
Other tips I’ve found useful:
Separate 2.4ghz network for only IoT devices. They tend to have terrible WiFi chipsets and use older WiFi standards. Slower speed = more airtime used for the same amount of data. This way the “slow” IoT devices don’t interfere with your faster devices which…
Faster devices such as laptops and phones belong on a 5ghz-only network, if you're able to get enough coverage. Prefer wired backhaul and more access points, as you're better off with a device talking on another channel to an AP closer to it rather than tying up airtime with lots of retries to a far-away AP (which impacts all the other clients also trying to talk to that AP).
WiFi is super solid at our house but it took some tweaking and wiring everything that doesn’t move.
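To put rough numbers on the airtime point, here's a back-of-the-envelope sketch. The PHY rates are illustrative assumptions (an old 802.11b IoT chipset vs. a typical 2x2 802.11ac link), not measurements from any particular device:

```python
# Rough airtime comparison: the same payload at a low legacy PHY rate
# occupies the shared channel far longer than at a modern 5GHz rate.

def airtime_seconds(payload_bytes: int, phy_rate_mbps: float) -> float:
    """Time the medium is busy sending `payload_bytes` at `phy_rate_mbps`
    (ignores preambles, ACKs and retries, which widen the gap further)."""
    return payload_bytes * 8 / (phy_rate_mbps * 1_000_000)

one_megabyte = 1_000_000
slow = airtime_seconds(one_megabyte, 11)    # assumed 802.11b IoT device
fast = airtime_seconds(one_megabyte, 866)   # assumed 2x2 802.11ac client

print(f"11 Mbps:  {slow:.3f} s of airtime")
print(f"866 Mbps: {fast:.4f} s of airtime")
print(f"the slow device holds the channel ~{slow / fast:.0f}x longer")
```

Since airtime is shared, that ~79x factor is time no other client on the channel can transmit, which is why a single chatty 802.11b device drags everyone down.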
The only devices on wifi should be cell phones and laptops if they can't be plugged in. Everything else, including TVs, should be ethernet.
When I moved into my last house with roommates their network was gaaarbage cuz everything was running off the same router. The 2.4ghz congestion slowed the 5ghz connections because the router was having to deal with so much 2.4ghz noise.
A good way of thinking about it is that every 2.4ghz device you add onto a network will slow all the other devices by a small amount. This compounds as you add more devices. So those smart lights? Yeaaahh
As a broad concept: Ever since my last Sonos device [that they didn't deliberately brick] died, I don't have any even vaguely bandwidth-intensive devices left in my world that are 2.4GHz-only.
Whatever laptop I have this year prefers the 5GHz network, and has for 20 years. My phone, whatever it is today, does as well and has for 15 years. My CCwGTV Chromecast would also prefer hanging out on the 5GHz network if it weren't plugged into the $12 ethernet switch behind the TV.
Even things like the Google Home Mini speakers that I buy on the used market for $10 or $15 seem to prefer using 5GHz 802.11ac, and do so at a reasonably-quick (read: low-airtime) modulation rate.
The only time I spend with my phone or tablet or whatever on the singular 2.4GHz network I have is when I'm at the edge of what I can reach with my access points -- like, when I visit the neighbors or something, where range is more important than speed and 2.4GHz tends to go a wee bit further.
So the only things I have left in normal use that requires a 2.4GHz network are IoT things like smart plugs and light bulbs and other small stuff like my own little ESP/Pi Zero W projects that require so little bandwidth that the contention doesn't matter. (I mean... the ye olde Wii console and PSP handheld only do 2.4GHz, but they don't have much to talk about on the network anymore and never really did even in the best of times.)
It's difficult to imagine that others' wifi devices aren't in similar form, because there's just not much stuff left out there in the world that's both not IoT and that can't talk at 5GHz.
I can see some merit to having a separate IoT VLAN with its own SSID where that's appropriate (just to prevent their little IoT fingers from ever reaching out to the rest of the stuff on my LAN and discovering how insecure it may be), but that's a side-trip from your suggestion wherein the impetus is just logical isolation -- not spectral isolation.
So yes, of course: Build out a robust wireless network. Make it awesome -- and use it for stuff.
But unless I'm missing something, it sounds like building two separate-but-parallel 2.4GHz networks is just an exercise in solving a problem that hasn't really existed for a number of years.
A few things come to mind...
- You can buy ethernet adapters... for iPhone/ipad/etc. Operations are so much faster, especially large downloads like offline maps.
- many consumer devices suck wrt wifi. For example, there seem to be ZERO soundbars with wired subwoofers. They all incorporate wifi.
- also, if anyone has lived in a really dense urban environment, wifi is a liability in just about every way.
- What's worse is how promiscuous many devices are. Why do Macs show all the neighbors' televisions in the AirPlay menu?
- and you can't really turn off wifi on a mac without turning off sip. (in settings, wifi OFF toggle is stuck on but greyed out)
That's a feature that can be configured on the TV/AirPlay receiver. They've configured it to allow streaming from "Anyone", which is probably the default. They could disable this in settings and limit it to only clients on their home network. And you can't actually stream without entering a confirmation code shown on the TV.
When you stream to an AirPlay device this way it sets up an ad-hoc device-to-device wireless connection, which usually performs much better than going through a wifi network/router and is why screen sharing can be so snappy. It's part of the 'Apple Wireless Direct Link' proprietary secret sauce also used by AirDrop. You can sniff the awdl0 or llw0 interfaces to see the traffic. Open AirDrop and then run `ping6 ff02::1%awdl0` to see all the Apple devices your Mac is in contact with (not necessarily on your wifi network).
> and you can't really turn off wifi on a mac without turning off sip.
Just `sudo ifconfig en0 down` doesn't work? You can also do `networksetup -setairportpower en0 off`. Never had issues turning off wifi.
Sonos has its issues, but I do need to point out that their subs (and the rest) all have Ethernet ports in addition to WiFi.
For my IoT network I just block most every device's access to the internet. That cuts down on a lot of their background chatter and gives me some minor protection.
Also honestly, I feel the majority of wifi problems could be fixed by having proper coverage (more access points), using hardwired access points (no meshing), and getting better equipment. I like Ubiquiti/Unifi stuff but other good options out there. Avoid TP-Link and anything provided by an ISP. If you do go meshing, insist on a 6ghz backhaul, though that hurts the range.
Certainly this is the brute-force way to do it and can work if you can run enough UTP everywhere. As a counterexample, I went all-in on WiFi and have 5 access points with dedicated backhauls. This is in SF too, so neighbors are right up against us. I have ~60 devices on the WiFi and have no issues, with fast roaming handoff, low jitter, and ~500Mbit up/down. I built this on UniFi, but I suspect Eero PoE gear could get you pretty close too, given how well even their mesh backhaul gear performs.
I'm glad it works but lol that's just hilarious.
I’ve connected a switch and a second access point with mine.
Also, I think they work best if there are fewer of them on the same circuit. But I'm not sure; check first.
I have helped some people who had many troubles with wifi devices (particularly printers), and when they didn't want to run a cable to solve the problem forever I told them to fuck off. If there is one thing that is certain with wifi, it's that it will break at some point and randomly show poor performance. Anything that doesn't absolutely have to be wireless should be wired; problem solved, forever.
Wired connection is an absolute hack.
Now put an access point into every room and wire them to the router, and things start looking very differently.
People say this until it takes 3 days to restore a fibre cut, when the wireless guys just work around the problem with replacement radios etc.
The issue with wireless is usually the wireless operator. And most of them do work hard to give wireless a bad rep.
Proliferation of consumer hardware that lacks ethernet ports is probably a contributing factor
IMHO, the greatest utility of wifi is wireless keyboards and monitors, not wireless internet access
The ability to remotely control multiple computers not on the same network from the same keyboard, for example
But I've always had a bias for using a (mechanical) external keyboards over built-in laptop keyboards, even before there were wireless keyboards
Sometimes DFS certification comes after general device approval, but I'm not aware of any that just flat out doesn't support it. It supported it 10+ years ago.
There is other stuff to watch - like uhd bluray backups and those need more than the crappy 100mbps lan port can deliver.
TV streaming seems like a bad example, since it's usually much lower average bandwidth than e.g. a burst of mobile app updates installing with equal priority on the network as soon as a phone is plugged in for charging, or starting a cloud photo backup.
That's true of any client with older and crappier WiFi chips though, but TVs are such a race to the bottom when it comes to performance in so many other things.
We've gone from 100 Mbps being standard consumer level to 2.5 or 10 Gbps being standard now. That sounds substantial to me.
Your take is really weird and doesn't represent the real world. What blog did you read this on and why haven't you bothered to attack that obviously wrong stance?
If you want to spend a really long time optimizing your wifi, this is the resource: https://www.wiisfi.com/
If you are experiencing problems, this might give you an angle to think about that you hadn't otherwise, if you just naively assume Wifi is as good as a dedicated wire. Modern Wifi has an awful lot of resources, though. I only notice degradation of any kind when I have one computer doing a full-speed transfer for quite a while to another, but that's a pretty exceptional case and not one I'm going to run any more wires around for for something that happens less than once a month.
Also that's an amazing resource, thanks for linking.
Add another idiot sitting on channel 8 or 9 and the other half of the bandwidth is also polluted, now even your mediocre IoT devices that cannot be on 5GHz are going to struggle for signal and instead of the theoretical 70/70mbps you could get off a well placed 20MHz channel you are lucky to get 30.
Add another 4 people and you cannot make a FaceTime call without disabling wifi or forcing 5GHz.
I just now reduced it to 20MHz, and though there is a slight but perceptible latency cost, those 5 extra dB I gained in signal-to-noise have given me wifi in the bedroom again.
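For what it's worth, part of that gain falls straight out of the physics: thermal noise power scales with channel bandwidth (N = kTB), so halving the channel width drops the noise floor by about 3 dB all by itself. A quick sketch (the remaining dB presumably come from less overlap with neighboring networks):

```python
import math

# SNR improvement from narrowing a WiFi channel, all else equal:
# noise power is proportional to bandwidth, so the gain is
# 10 * log10(old_width / new_width) dB.

def snr_gain_db(old_width_mhz: float, new_width_mhz: float) -> float:
    return 10 * math.log10(old_width_mhz / new_width_mhz)

print(f"40 -> 20 MHz: +{snr_gain_db(40, 20):.1f} dB")  # ~3 dB
print(f"80 -> 20 MHz: +{snr_gain_db(80, 20):.1f} dB")  # ~6 dB
```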
The best resource out there. Period.
Their `networkQuality` implementation is on the CLI for any Mac recently updated. It's pretty interesting and I've found it to be very good at predicting which networks will be theoretically fast, but feel unreliable and laggy, and which ones will feel snappy and fast. It measures Round-trips Per Minute under idle and load condition. It's a much better predictor of how fast casual browsing will be than a speed test.
My house is old and has stone walls up to 120cm thick, including the inner walls, so I have to have access points in nearly all rooms.
I never had a truly seamless roaming experience. Today I have TP-Link Omada and it works better than previous solutions, but it is still not as good as DECT phones, for example.
For example, if I watch a Twitch stream in my room and go to the kitchen to grab something with my tablet or my phone, I get a freeze about 30% of the time, though not a very long one. Before, I sometimes had to turn the wifi off and on on my device for it to roam.
I followed all the Omada and general WiFi best practices I could find about frequency, overlap, and so on. But yes, it is still not fully seamless.
Most people place wifi repeaters incorrectly, or invest in crappy repeater/mesh devices that do not have multiple radios. A WiFi repeater or mesh device with a single radio by definition cuts your throughput in half for every hop.
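The halving compounds per hop, which a toy model makes obvious (the 300 Mbps link rate is illustrative; real repeaters also add retries and protocol overhead on top):

```python
# A single-radio repeater must receive and retransmit on the same channel,
# so each such hop at best halves the usable throughput.

def best_case_throughput(link_mbps: float, single_radio_hops: int) -> float:
    """Best-case client throughput after N single-radio repeater hops."""
    return link_mbps / (2 ** single_radio_hops)

print(best_case_throughput(300, 0))  # direct to AP: 300.0
print(best_case_throughput(300, 1))  # one repeater: 150.0
print(best_case_throughput(300, 2))  # daisy-chained repeaters: 75.0
```

A repeater with a second (ideally dedicated-backhaul) radio avoids this, since it can receive on one channel while forwarding on another.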
I run an ISP. Customers always cheap out when it comes to their in home wireless networks while failing to understand the consequences of their choices (even when carefully explained to them).
The design of roaming being largely client initiated means roaming doesn't really work how people intuitively think it should, because at least every device I've ever seen seems to be programmed to aggressively cling to a single AP.
"The basement"
"Uh, i can send someone out to install some repeaters for $$$"
"No just make internet good now"
I assume you have hardwired all the APs; otherwise that would be the first step. Make sure they're on different channels and have narrow channel widths selected (20MHz for 2.4GHz, 40MHz for 5GHz).
Only use 1,6,11 for 2.4GHz and don't use the DFS channels on 5GHz as they will regularly hang everything.
Afterwards you can try reducing the 5GHz transmission power so there is no/less overlap in the far rooms.
Unfortunately you probably need the 2.4GHz (at least I do) but as the range is so much higher it might make sense to deactivate it on some APs to prevent overlaps.
Doing this basically eliminated the issues for me.
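The channel-assignment part of those steps can be sketched mechanically. This is only a naive round-robin over the three non-overlapping 2.4GHz channels; the AP names are made up, and a real plan would also account for which APs can actually hear each other:

```python
# Cycle the three clean 2.4GHz channels (1, 6, 11) across hardwired APs
# so consecutive APs never share a channel.

NON_OVERLAPPING_24 = [1, 6, 11]

def plan_24ghz(ap_names):
    """Assign each AP one of the non-overlapping 2.4GHz channels."""
    return {ap: NON_OVERLAPPING_24[i % 3] for i, ap in enumerate(ap_names)}

plan = plan_24ghz(["living-room", "office", "basement", "garage"])
print(plan)  # {'living-room': 1, 'office': 6, 'basement': 11, 'garage': 1}
```

With four or more APs a channel inevitably repeats, which is exactly when it helps to lower transmit power (or disable 2.4GHz on some APs) so the repeats don't overlap.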
I have worked with networks for many years, and users blaming all sorts of issues on the network is a classic, so of course in their minds they need more speed and more bandwidth. But improvements only makes sense up to some point. After that it is just psychological.
Is that actually a thing? Why would any ISP intentionally add unnecessary load to their network?
So they're not really increasing their network load a measurable amount since the data never actually leaves their internal network. My ISP's network admin explained this to me one day when I asked about it. He said they don't really notice any difference.
(at least as per my understanding)
* https://en.wikipedia.org/wiki/IEEE_802.11bn
So other considerations are in play.
While the two are not the same, they are not exactly separable.
You will not get good Internet speed out of a flaky network, because the interrupted flow of acknowledgements, and the need to retransmit lost segments, will not only itself impact the performance directly, but also trigger congestion-reducing algorithms.
Most users are not aware whether they are getting good speed most of the time, if they are only browsing the web, because of the latencies of the load times of complex pages. Individual video streams are not enough to stress the system either. You have to be running downloads (e.g. torrents) to have a better sense of that.
The flakiness of web page loads and insufficient load caused by streams can conceal both: some good amount of unreliability and poor throughput.
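The loss-sensitivity point can be illustrated with the well-known Mathis et al. approximation for steady-state TCP throughput, rate ≈ (MSS / RTT) · (1 / √loss). The MSS and RTT values below are just plausible assumptions:

```python
import math

# Mathis approximation: steady-state TCP throughput is limited by
# segment size, round-trip time, and (crucially) the square root of
# the packet loss rate -- raw link bandwidth doesn't even appear.

def tcp_throughput_mbps(mss_bytes=1460, rtt_s=0.020, loss=0.0001):
    return (mss_bytes * 8) / (rtt_s * math.sqrt(loss)) / 1_000_000

print(f"0.01% loss: {tcp_throughput_mbps(loss=0.0001):.1f} Mbit/s")
print(f"1% loss:    {tcp_throughput_mbps(loss=0.01):.2f} Mbit/s")
```

A hundredfold increase in loss costs a factor of ten in throughput, which is why a flaky wifi link can feel slow even when a speed test over a clean link looks fine.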
> Many ISPs, device manufacturers, and consumers automate periodic, high-intensity speed tests that negatively impact the consumer internet experience as demonstrated
But there's no support for this claim presented, and frankly I am skeptical. What WiFi devices are regularly conducting speed tests without being asked?

ISP-provided routers do; at least Xfinity's. I've gotten emails from them (before I ripped out their equipment and put my own in) saying "Great news, you're getting more than your plan's promised speeds" with speedtest results in the email, because they ran speed tests at like 3AM.
I wouldn't be surprised if it's happening often across all the residential ISPs, most likely for marketing purposes.
Really? DOCSIS has been the bottleneck out of Wi-Fi, DOCSIS, and wider Internet every time I've had the misfortune of having to use it in an apartment.
Especially the tiny uplink frequency slice of DOCSIS 3 and below is pathetic.
I used to run a Docker container that ran a speed test every hour and graphed the results, but I haven't done that in a while now.
WiFi 8 will probably be another standard homes can skip. Like WiFi 6, it is going to bring little that they need to utilise their fibre home connections well across their home.
There are these cool new features like MLO, but maybe devices could mostly use narrow channels and only use more RF bandwidth when they actually need it.
IEEE 802.11ah: 900ish MHz.
IEEE 802.11ax (WiFi 6): traditional channels can be subdivided into resource units ranging from 26 to 2x996 tones according to need (effectively a 2MHz channel at the low end). This means multiple devices can be transmitted to within the same transmit opportunity.
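The "effectively a 2MHz channel" figure falls out of the subcarrier math: 802.11ax uses 78.125 kHz subcarrier spacing, so a resource unit's rough spectral width is its tone count times that spacing. A quick sketch (real RUs include guard and null tones, so occupied bandwidth differs slightly):

```python
# Back-of-envelope width of 802.11ax OFDMA resource units.

SUBCARRIER_HZ = 78_125  # 802.11ax subcarrier spacing

def ru_width_mhz(tones: int) -> float:
    """Approximate spectral width of an RU with the given tone count."""
    return tones * SUBCARRIER_HZ / 1e6

for tones in (26, 52, 106, 242, 484, 996):
    print(f"{tones:>4}-tone RU ~= {ru_width_mhz(tones):.2f} MHz")
```

So a 26-tone RU is about 2.03 MHz wide, and a 996-tone RU roughly fills an 80 MHz channel.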
> How about some modulations designed to work at low received power and/or low SNR?
802.11(og), 1 & 2 Mbps.
> 802.11(og), 1 & 2 Mbps
I’m a little vague on the details, but those are rather old and I don’t think there is anything that low-rate in the current MCS table. Do they actually work well? Do they support modern advancements like LDPC?
For people who don't follow WiFi closely: while WiFi 8, 7, and 6 all have the intended features for their release, those features are either not mandated or don't work as well as they should. Instead, every release has been a fully refined execution of the previous version. So the best of what WiFi 6 (OFDMA) originally promised only came with WiFi 7. And current WiFi 7 features like Multi-Link Operation will likely work better in WiFi 8. So if you want a fully working WiFi 8 as they marketed it, you'd better wait for WiFi 9.
But WiFi has come a long way. Not only has it exceeded 1Gbps in real-world performance, it is coming close to 2.5Gbps, maxing out 2.5Gbps Ethernet. And we are now working on more efficient and reliable WiFi.
I wonder how many of those could be wired.
The only thing that makes wifi in a large condo building viable is the 6GHz channels available on WiFi 6E.
Also MIMO.
And don't think it's relevant to compare what to do in a large space with what one should do at home. The requirements are entirely different.
In a large space with many users I'd use small channels and many access points. I want it to work well enough for everyone to have calls, and to have good aggregate throughput.
In a two bed home I'd use large channels and probably only one AP. Peak single device speed is MUCH more important than aggregate speed.
And in a home it matters much more which channels are being occupied by neighbors.
For latency, of course, there is only wired. Even with few devices.
So yeah, I do think speed is more important.
Responsiveness doesn’t matter that often and when it does, plugging in Ethernet takes it out of the equation.
I have 50+ ESP-based devices on WiFi, and while they are low-bandwidth (and on their own SSID), I really wish there were affordable options to "wire" them for comms. Since they mostly control mains appliances, the rules and considerations for mixing data and mains in one package make that prohibitively expensive.
I don't see a way to change that setting and I don't see a way to see what it's currently set to.
Use a dedicated 2.4ghz AP for all IoT devices. Firewall this network and only allow the traffic those devices need. This greatly reduces congestion.
Use 5ghz for phones/laptops and keep IoT off that network.
That's really about it. If you have special circumstances there are other solutions, but generally the solution to bad wifi is to not use the wifi, lol.
I operate a large enterprise wireless network with 80MHz 5GHz channels and 160MHz 6GHz channels. It is possible if your environment allows.
Router, and extenders (multi floor house): 1-4
Chromecast|Sonos|Apple speaker/Chromecast|google|firestick|roku|apple TV/smart speaker/hifi receiver/eavesdropping devices: 2-10
Smart doorbell/light switch/temperature sensor/weather station/co2|co detector/flood detector/bulb/led strip/led light/nanoleaf/garage door: 4-16
Some cars: 0-2
Some smart watches speak wifi: 0-4
Computers.. maybe the desktops are wired (likely still support wifi), all laptops, chromebooks, and tablets : 3-8
All game consoles, many TVs, some computer monitors: 3-8
Some smart appliances: 0-4 (based on recent news of ads, best to aim for 0)
The biggest factor in your count, and I think it is the one with the highest ceiling, is smart devices. Trouble is, even by sources like https://www.consumeraffairs.com/homeowners/average-number-of..., around half of all households still have zero, and the average household has only 2.6 people.
In this thread (from its root), we have various users defending the reasonableness of the numbers, some providing numbers in their own houses: 10, 11, 14, 17, 19, 23, 28, 34, over 50, 60+. Averaging, I'll say, about 27, and that's with two pretty big outliers -- if you excluded them (maybe reasonable, maybe not), you'd be down to 19.5. And these sorts of users are already likely to be above-average, it's the nature of HN, compounded by them being the ones commenting (confirmation bias). Yet already (with the fiddling of removing what I'm calling outliers) they're under the claimed average.

And for each one of them, there's another household with zero smart home devices; and the 20% of the population with no broadband are, I imagine, effectively using zero wifi devices, though discounting in this way is a little too simplistic. However you look at it, the average will drop quite a bit. In fact, if you return to the original 27 and simplify the portion of the population without smart home devices to a 30% zero rate (mildly arbitrary, but I think reasonable enough as a starting point) and let the other 70% be average… your 27 has dropped to about 19.

In order to reach the 21 across the population, you'd need to establish these HN users, defenders of high wifi device counts, to be below-average users of wifi devices, which is implausible.
If the number was 10, I'd consider it plausible, though honestly I'd still expect the number to be lower. But I think my reasoning backs up my initial feeling that 21 is pretty outlandish for your national average. I'd like to see Deloitte Insights' methodology; I reckon it's a furphy. I bet it's come from some grossly misleading survey data, or from sales figures of devices that are wifi-capable even though half of them never get used that way, or from terrible sampling bias (surveys are notorious for that), or something like that. Wouldn't be the first wildly wrong or grossly misleading result one of those sorts of companies has published.
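For transparency, the arithmetic behind that argument checks out (treating "over 50" and "60+" as 50 and 60):

```python
# Reproducing the averaging argument from the counts reported in the thread.

counts = [10, 11, 14, 17, 19, 23, 28, 34, 50, 60]

mean_all = sum(counts) / len(counts)        # ~26.6, "about 27"
trimmed = [c for c in counts if c < 50]     # drop the two big outliers
mean_trimmed = sum(trimmed) / len(trimmed)  # 19.5

# Assume 30% of households have zero wifi devices and the rest are "average":
population_mean = 0.7 * mean_all            # ~18.6, "about 19"

print(round(mean_all, 1), mean_trimmed, round(population_mean, 1))
```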
I had probably 20 prior to swapping out some smart light bulbs and switches for Zigbee.
21 for an average household isn’t nuts.
We have 2 phones, a tablet for the kids, a couple of Google homes, a Chromecast, 2 yoto players, a printer, a smart TV, 2 laptops, a raspberry pi, a solar power Inverter, an Oculus Quest, and a couple of things that have random hostnames.
It adds up.
Add a few wifi security cameras and other IoT devices and 30+ is probably pretty common.
I currently have 23, my parent's house has 19
People have all kinds of stuff on wifi these days - cameras, light bulbs, dishwashers, irrigation, solar, hifi..
Wireless temperature monitor
Sync module for some Blink cameras
2 smart plugs
Roomba
5 smart lights
RPi 3
3 of the smart lights I currently don't need, so they aren't actually connected. That leaves 8 connected 2.4 GHz devices.

On 5 GHz I've got 16 devices:
Amazon Fire Stick
iPad
Printer
Echo Show
Apple Watch
Surface Pro 4
iMac
Nintendo Switch
EV charger
Mac Studio
A smart plug
Google Home Mini
Echo Dot
RPi 4
Kindle
iPhone
The iMac and the Surface Pro 4 are almost never turned on, and the printer is off most of the time too. That leaves 13 regularly connected 5 GHz devices.

That's a total of 21 devices usually connected on my WiFi, right at what the article says is average. :-)