Why do I care? Why does it need to be so fast?
What is everyone doing with their internet that I’m apparently missing out on?
For me, the normal stuff. Mathematically my gig fiber is overkill for my usage. And internet services can rarely keep up with it anyway: want to download some update or a new game? It’s throttled at the source regardless of your internet connection.
But in reality, when I visit people with “fast enough” internet, I always see glitches and buffering and lag. While their service usually serves the need and sometimes hits advertised bandwidth, gig fiber always serves the need. I shouldn’t have to complain about my network or worry about how many streams, how big a download is, or how many people are on their phones. I should never worry about lag during games or interrupted video calls. And I shouldn’t have to worry about sketchy broadband providers (like xFinity/ConCast) way over-provisioning their lines or otherwise never delivering marketed bandwidth.
Gig fiber delivers. Always. Like any good infrastructure, you don’t even have to think about it: it just always does the job.
But computers are getting faster: even mid-level laptops are shipping with 2.5GbE, everything is more and more digital, and we expect more all the time. Yes, I do expect to want a faster connection within 5-10 years even without doing anything high-bandwidth. Heck, if history holds, another couple of upgrades of JavaScript and we’ll need 50G to load web pages.
Decades ago…
“Why do I need electricity? I have candles. Lights seem excessive.”
Yes, but once most people have electricity, new products will be designed to take advantage of it. Now you can have a washing machine, for example.
Broadband is the same. Once most of your population has high bandwidth, we can start to design things that will use it. Right now we’re still designing for DSL speeds.
That’s entirely speculative. There are diminishing returns. Unless you’re going to host your own YouTube, the use case for 50Gbps connections to the home is quite small. 4K video streaming at Ultra HD Blu-ray bitrates doesn’t even come close to saturating 1Gbps, and all streaming services compress 4K video significantly more than what Ultra HD Blu-ray offers. The server side is the limit, not home connections.
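The arithmetic behind that claim, as a rough sketch (the bitrates are approximate public figures, not measurements):

```python
# Back-of-envelope: how much of a 1 Gbps home link does 4K video use?
LINK_MBPS = 1000            # 1 Gbps fiber
UHD_BLURAY_PEAK_MBPS = 128  # approximate Ultra HD Blu-ray peak video bitrate
STREAMING_4K_MBPS = 25      # rough high-end streaming 4K bitrate

def utilization(stream_mbps, link_mbps=LINK_MBPS):
    """Fraction of the link a single stream consumes."""
    return stream_mbps / link_mbps

print(f"UHD Blu-ray peak: {utilization(UHD_BLURAY_PEAK_MBPS):.0%} of the link")
print(f"Streamed 4K:      {utilization(STREAMING_4K_MBPS):.1%} of the link")
# Even disc-quality 4K uses roughly an eighth of a gigabit link; a
# compressed streaming feed uses a few percent.
```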
Now, if you want to talk about self-hosting stuff and returning the Internet to a more peer-to-peer architecture, then you need IPv6. Having any kind of NAT in the way is not going to work. Connection speed still isn’t that important.
Take a look at devContainers as an idea that might be generalized. They’re just Docker containers, so big but not huge, but consider the use case.

devContainers are a complete portable development environment, with support from major IDEs. Let’s say I want to work on a Java service. I open my IDE, it pulls the latest Java devContainer with my environment and all my tools, fetches the latest from git, and I’m ready to go. The problem with this use case is that I’m waiting the whole time. I don’t want to sit around for a minute or two every time I want to edit a program. The latest copy needs to be here, now, as I open my IDE.

But you could generalize this idea. Maybe it’s the next ChromeOS-like thing. All you need is something that can run containers, and everything you do starts with downloading a container with everything you need. If something like this happens, there’s a great example of needing to be responsive with a lot more data.
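For concreteness, a devContainer is just an image plus a small JSON config the IDE reads on open. A minimal sketch (the image URL and extension list here are made-up placeholders, not a real registry):

```json
{
  "name": "java-service",
  "image": "ghcr.io/example/devcontainers/java:latest",
  "customizations": {
    "vscode": {
      "extensions": ["vscjava.vscode-java-pack"]
    }
  },
  "postCreateCommand": "git pull --ff-only"
}
```

The "pull the latest image on open" step is exactly where the bandwidth cost lands: these images are typically hundreds of megabytes to a few gigabytes.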
Maybe don’t rely on cloud garbage for basic development?
Technically I don’t. I’m also the guy running CI/CD building devContainers for my engineers. They no longer have to worry about updating certificates, tools, versions, or security patches, and IT doesn’t have to worry about a lot of crap on their laptops that IT doesn’t manage. Engineers can use a standard laptop install and just get the latest of everything they need, scanned and verified, as soon as it’s available. And since it’s all automated, I can support many variations: yes, they can pull any older version from the repo if they need to, and every project can easily be on different versions of different tools and languages.
At work I’m on the same network, but working from home I still need the responsiveness to do my job.
There could be some new thing that no one has even bothered to think about because of the limitations. Imagine pitching streaming back when downloading a few kilobytes an hour was considered reasonable; people would have laughed at the very thought of it.
We’re not using the bandwidth we have. Many US cities have service with 1Gbps download speed available. I have it for my own reasons. Servers are the bottleneck; they rarely even reach half that speed.
If we’re not using 1Gbps, why should we believe something would pop up if we had 50Gbps?
Now, direct addressing where everyone can be a server and bandwidth utilization is spread more towards the edges of the network? Then you have something that could saturate 1Gbps. But you can’t do that on IPv4.
Unless you’re going to host your own YouTube…
This is exactly what PeerTube is struggling with. This bandwidth would solve the video federation problem.
See, you get it!
Except we need IPv6 before that’s at all viable.
We are not even filling out the bandwidth of pipes we have to the home right now. “If you build it, they will come” does not apply when there’s already something there that isn’t being fully utilized.
Oh, maybe. I’m not familiar with bandwidth utilization in China.
How exactly does NAT prevent that? On good hardware it adds insignificant latency.
It has nothing to do with latency, and everything to do with not being able to directly address things behind NAT.
Edit: and please, nobody argue that NAT increases security. That dumbass argument should have died the moment it was first uttered.
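A small illustration of the addressing point, using Python’s stdlib `ipaddress` module: hosts behind a home NAT hold RFC 1918 private addresses, which simply aren’t routable on the public internet, so nothing outside can open a connection to them directly.

```python
import ipaddress

# Hosts behind a home NAT get addresses from RFC 1918 private ranges.
# Those addresses are not routable on the public internet, so an outside
# peer cannot initiate a connection to them without port-forwarding hacks.
for addr in ["192.168.1.50", "10.0.0.7", "172.16.4.2", "8.8.8.8"]:
    ip = ipaddress.ip_address(addr)
    print(f"{addr:>12}  private={ip.is_private}  globally_routable={ip.is_global}")
```

With IPv6 every host can have a globally routable address, which is what makes direct peer-to-peer addressing workable again.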
Yes but have you considered China bad?
China is a totalitarian regime with major human rights violations, continuing atrocities, corruption, and illegal trade/business practices.
They are also
Those accomplishments and many more can be celebrated without losing sight of the basic horribleness of their government.
China is morally bankrupt and developing at a staggering pace, though that pace has been somewhat stymied: their scoffing at regulations in favor of backroom dealings is kneecapping them.
So if you zoom in close enough, like looking at this amazingly fast reported internet speed and only at this speed, China “good.”
Notice how many extra hoops you jumped through to get here
To arrive at “China Good,” yes you do need to jump through many hoops. Glad we’re on the same page, even if you started out strangely.
Lmao the irony
You have serious ego issues that you will need self-reflection to fix.
Feel free to elaborate. I have no idea what you’re talking about other than it seems like tankie screeching to me.
And then he blocked me XD
All these egotistical children with nothing to be proud of
Who blocked you?
So I’m just going to be a completely different person once I have access to these speeds or you are suggesting new tech that will be made available to consumers?
The second one.
Think back to when you were on dial-up. The concept of a streaming movie service would have been a fantasyland. No one was creating one. The infrastructure wasn’t there. It was impossible.
As soon as people started getting broadband, and enough people got it, streaming services could exist.
Are you different? No, you just want to watch a movie. But now you don’t have to go to Blockbuster.
360 VR experience with 16K resolution, highly textured touchable surfaces, and smell-o-vision. Only a $40 Meta subscription with ads.
Latency is much more critical than bandwidth for any sort of real-time VR.
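A rough sketch of why, using assumed numbers (a ~20 ms motion-to-photon comfort budget and light traveling through fiber at about 200,000 km/s): distance alone sets a latency floor for cloud-rendered VR that no amount of bandwidth removes.

```python
# Toy motion-to-photon budget check for cloud-rendered VR.
# Assumed: ~20 ms comfort budget; light in fiber covers ~200 km per ms.
BUDGET_MS = 20.0
FIBER_KM_PER_MS = 200.0  # roughly 2/3 of c in glass

def rtt_ms(distance_km):
    """Idealized round-trip time over fiber, ignoring queuing and routing."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (100, 500, 2000):
    print(f"server {km:>4} km away: RTT >= {rtt_ms(km):.1f} ms "
          f"({rtt_ms(km) / BUDGET_MS:.0%} of the budget)")
# A server 2000 km away eats the entire budget before rendering a frame.
```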
We’ll solve that with AI. Because you can solve anything by saying “AI”.
What about quantum computing? I don’t want anything without quantum computing.
Quantum computing with AI
That goes without saying.
I tried to upload some 8K 360 footage to FB before I left it: “We’re sorry, but an error has occurred.”
Tried over several days, no good. Tried again a month later, still no good.
The camera is more or less useless if you can’t host the footage anyway. :/
It’s not about being faster; it’s about more bandwidth, which means more people can be connected on one line. Speed will remain the same.
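A toy model of that distinction, under the simplifying assumption of ideal fair sharing with no protocol overhead: upgrading a shared line’s capacity mostly adds subscribers rather than speeding up any one of them.

```python
# "More bandwidth" on a shared line means more concurrent users,
# not a faster experience for each one. Toy fair-sharing model.
LINK_MBPS = 1000  # a shared 1 Gbps line

def per_user_mbps(users, link_mbps=LINK_MBPS):
    """Each user's share under ideal fair sharing (no overhead modeled)."""
    return link_mbps / users

for n in (1, 10, 40):
    print(f"{n:>2} users: {per_user_mbps(n):.0f} Mbps each")
# 40 households on one gigabit line still get 25 Mbps each (enough for a
# 4K stream), so the operator sells capacity, not per-user speed.
```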