What’s the speed? Do you have a shitty 10 Mbps connection like my parents? Then WiFi, because you’re easily saturating that line either way.
Do you have gigabit? Then Ethernet, but then again, getting like 600 Mbps wirelessly is good enough.
The biggest thing is having GOOD coverage. My house has multiple access points so that my connection is great everywhere. People with a shitty ISP router shoved in a basement cupboard make no sense lol.
Only if latency doesn’t matter. WiFi has a lot more jitter, whether your WAN connection is 10 or 1000 Mbps.
What crap are you doing that’s so intensive that WiFi causes latency? It’s essentially a negligible difference unless you’re saturating the signal. We’re talking less than 3 ms for a reliable round trip.
There are lots of factors that can cause jitter on WiFi, and they’re mostly outside of your control if you live somewhere more densely populated. My apartment randomly gets a lot of noise, and as a result my WiFi starts to get unacceptable amounts of packet loss and jitter. It doesn’t happen often enough to motivate me to go around doing signal analysis, but still…
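For anyone who does want to put numbers on it before reaching for a spectrum analyzer, a minimal sketch like this (Python; the target host and sample count are made-up values) will show the loss and jitter pretty clearly:

```python
#!/usr/bin/env python3
"""Rough RTT loss/jitter probe: shells out to the system `ping`
(Linux-style flags assumed) and summarizes the replies."""
import re
import statistics
import subprocess

HOST = "192.168.1.1"   # hypothetical: point this at your router/AP
COUNT = 50

out = subprocess.run(
    ["ping", "-c", str(COUNT), "-i", "0.2", HOST],
    capture_output=True, text=True, check=False,
).stdout

# Pull the per-packet RTTs ("time=2.34 ms") out of the output.
rtts = [float(m) for m in re.findall(r"time=([\d.]+)", out)]
if not rtts:
    raise SystemExit("no replies at all; that's your answer")

loss_pct = 100.0 * (COUNT - len(rtts)) / COUNT
# Jitter here = mean absolute difference between consecutive RTTs.
jitter = statistics.mean(
    abs(a - b) for a, b in zip(rtts, rtts[1:])
) if len(rtts) > 1 else 0.0

print(f"loss {loss_pct:.1f}%  avg rtt {statistics.mean(rtts):.2f} ms  jitter {jitter:.2f} ms")
```

Run it twice, once on WiFi and once on the cable, and the difference speaks for itself.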
deleted by creator
15 wired devices, kthx. Once & done.
No more “why’s it down now”; no deauth attacks; no weird outages when highway traffic spikes and nav/music-streaming users hit tower timeouts, causing their phones to aggressively cry out for every known SSID.
With wired connections, I set it up once & it keeps working. With WiFi, it’s a constant shouting match version of the Telephone game, with openly malicious actors literally headquartered a few blocks away.
Wi-Fi has constant retransmissions. This adds perceptible latency because the checksum check, turnaround, and packet transmission add a lot of time compared to the speed of light through air across 3 meters.
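Back-of-envelope numbers make the point (a sketch; the PHY rate and overhead figures below are illustrative assumptions, not measurements):

```python
# Compare 3 m of propagation against one Wi-Fi retransmission cycle.
C = 3e8                       # speed of light, m/s
prop_delay = 3 / C            # ~10 nanoseconds across the room

FRAME_BYTES = 1500
PHY_RATE_BPS = 100e6          # assume ~100 Mbps effective data rate
SIFS_S = 16e-6                # interframe spacing (5 GHz value)
ACK_S = 44e-6                 # rough ACK airtime at a low basic rate

frame_time = FRAME_BYTES * 8 / PHY_RATE_BPS       # ~120 microseconds
retx_cycle = frame_time + SIFS_S + ACK_S          # cost of one failed try

print(f"propagation: {prop_delay * 1e9:.0f} ns")
print(f"one retransmission cycle: {retx_cycle * 1e6:.0f} us")
print(f"ratio: ~{retx_cycle / prop_delay:,.0f}x")
```

Every retry costs thousands of times more than the flight time of the signal itself; stack a few retries per packet and it starts to show up in the tail latencies.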
A few ms. It’s absolutely imperceptible.
I’ll often be downloading a film, streaming YouTube or music, and playing video games. Latency matters to me, and WiFi is just not stable enough.
Can you not contradict yourself?
“and playing video games.” No offense, but did you read the whole comment? I need good latency for my games and I need it while downloading a bunch of other stuff. Idk if you’ve ever tried downloading a few torrents while gaming, but it’ll definitely have an impact. Especially if you’re on WiFi.
I did. Just pointing out the uselessness of mentioning big downloads in the context of latency. Just don’t bring bad arguments.
Yes, but it also means setting up network priority (QoS), limiting torrent transfer speeds, and other stuff.
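The “limiting torrent transfer speed” part is just a token bucket under the hood, whether it runs in the client or in the router’s QoS. A minimal sketch of the idea (Python; the rate and chunk numbers are made up for illustration):

```python
import time

class TokenBucket:
    """Minimal token-bucket limiter, the same idea torrent clients
    and QoS shapers use to cap transfer speed."""

    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate = rate_bytes_per_s
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def consume(self, nbytes: int) -> None:
        """Block until nbytes may be sent at the configured rate."""
        while True:
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= nbytes:
                self.tokens -= nbytes
                return
            time.sleep((nbytes - self.tokens) / self.rate)

# Illustrative: cap the bulk download at 2 MB/s so game packets
# never queue behind torrent pieces.
bucket = TokenBucket(rate_bytes_per_s=2_000_000, burst_bytes=64_000)
for chunk in (b"x" * 16_384 for _ in range(8)):   # stand-ins for pieces
    bucket.consume(len(chunk))
    # ... send/write the chunk here ...
```

The same shaping on the router, prioritizing small latency-sensitive packets over bulk flows, is what actually keeps game traffic smooth.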
You’re probably right, but I’m not a power user, nor do I care to be. I can make all my problems go away just by plugging in a cable and making sure I have good Internet otherwise. That’s my point, and it’s what matters to me in this discussion.
Something is wrong. None of this is perceptible to humans.
If you don’t want to figure it out, cool. But it ain’t the protocol causing your issues.
I’m pretty sure it’s WiFi from the neighbours interfering. I can’t be bothered to deal with that, not when I have a cable lying around. Plus, no matter what, a cable will always be more stable than WiFi.
deleted by creator
Wi-Fi adds like 4ms. It’s not high latency.
Packet loss, really, and the latency and jitter that loss can contribute to.
Radio waves through air travel at (essentially) the speed of light, faster than a signal through a medium like copper. Not that it matters at such a small scale, but it’s helpful to have a good picture of the elements at work here. The further you are from the receiver, the more obstacles (matter) there are to obstruct the signal. But in ideal conditions WiFi is better than most people think. Replicating those ideal conditions, though…
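For scale, here’s the propagation delay across 3 m in a few media (a sketch; the velocity factors are typical datasheet values and vary by cable, so treat them as assumptions):

```python
# Propagation delay over 3 m for different media.
C = 299_792_458        # speed of light in vacuum, m/s
DISTANCE_M = 3

media = {
    "air (WiFi)":            1.00,  # radio in air is ~c
    "twisted pair (copper)": 0.67,  # typical Cat5e/Cat6 velocity factor
    "glass fiber":           0.67,  # light in glass is roughly 2/3 c
}

for name, vf in media.items():
    delay_ns = DISTANCE_M / (C * vf) * 1e9
    print(f"{name:24s} {delay_ns:5.1f} ns")
# Everything lands in the 10-15 ns range; the medium is never
# where the milliseconds come from.
```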
Except that copper Ethernet is baseband, so it’s not radio waves. WiFi is still faster than copper AFAIK (there was a huge debate about this between YouTubers not that long ago), at least for signal propagation, but the difference is smaller than you’d think. Light (which is EM, the same kind of wave as radio/WiFi) through glass travels at about 2/3 c (c being the speed of light), which actually makes fiber a lot slower than Ethernet or WiFi for propagation delay. However, WiFi must use CSMA/CA and other tricks to make sure it doesn’t step on itself, and that it doesn’t step on other sources of radio interference (microwave ovens, wireless controllers like the Xbox’s, Bluetooth, Zigbee, etc. on 2.4 GHz, and stuff like RADAR on 5 GHz). It’s half-duplex, so only one station can transmit at a time, hence CSMA/CA being required, whereas Ethernet doesn’t need any collision avoidance or detection except in the rare case of 10/100 half duplex; all gigabit is full duplex. Half duplex on wireline networks is basically eliminated at this point, so it’s little more than a footnote.
Factoring all of this in, actually getting the data down the line, WiFi loses in almost every case because of all the considerations it has to take into account.
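To make the CSMA/CA cost concrete, here’s a toy contention simulation (Python; the slot time is the real 802.11 OFDM value, but the model is simplified far below what real 802.11 does, so read it as a sketch):

```python
import random

SLOT_US = 9              # 802.11 OFDM slot time
CW_MIN, CW_MAX = 15, 1023

def medium_access_delay(n_stations: int, rng: random.Random) -> float:
    """Microseconds until exactly one station wins the medium.

    Toy model: every station draws a random backoff; ties collide
    and everyone retries with a doubled contention window (binary
    exponential backoff). Real 802.11 keeps residual backoffs and
    also burns airtime on the collided frames, so this undercounts.
    """
    cw, waited = CW_MIN, 0.0
    while True:
        backoffs = [rng.randint(0, cw) for _ in range(n_stations)]
        winner = min(backoffs)
        waited += winner * SLOT_US
        if backoffs.count(winner) == 1:   # unique winner: success
            return waited
        cw = min(2 * cw + 1, CW_MAX)      # collision: widen the window

rng = random.Random(42)
for n in (2, 5, 10, 20):
    trials = [medium_access_delay(n, rng) for _ in range(10_000)]
    print(f"{n:2d} stations: avg wait {sum(trials) / len(trials):7.1f} us")
# A wired full-duplex port never plays this game at all: it just transmits.
```

More stations (including the neighbours’ networks on the same channel) means more collisions and wider backoff windows, and that’s where the jitter comes from.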
As an IT professional who has worked with a lot of wireless systems, I approve. This is the way.
I had 100 Mbps Ethernet because an incompetent ISP worker crimped only two pairs out of four (gigabit needs all four pairs, so the link negotiates down to 100BASE-TX). And I had, AFAIR, a 150 Mbps plan! I don’t know what to wish on that idiot.