Whoosh!
Posted Jul 29, 2010 5:41 UTC (Thu) by thedevil (guest, #32913)
In reply to: Whoosh! by robertm
Parent article: GUADEC: Luis Villa points GNOME at the web
Posted Jul 29, 2010 6:30 UTC (Thu) by Tara_Li (guest, #26706)
Seriously, just this evening I was randomly unable to reach huge chunks of the Internet - at one point even Google wouldn't come up for about 30 minutes. If I have nothing local, my desktop is a paperweight whenever my network goes down.
And honestly, the best way to keep a computer virus-free? Don't connect it to the Net!
Posted Jul 29, 2010 7:53 UTC (Thu) by dlang (guest, #313)
All these extra layers will work fine if you are on a high-powered, mains-powered system connected to a high-speed, low-latency network connection, but change any of these criteria and the feasibility of 'everything is the web' starts falling apart rapidly.
OLPC and the netbooks pointed out to people that there is a flip side to Moore's law, namely that the same capability gets cheaper over time. Today a few people are willing to pay large sums of money for smartphones. When the same capability that sells today for $600 drops to $100, there will be a _LOT_ more of them around.
If you are on a high-latency network connection (say a satellite feed in a rural area), every round trip to the server is very painful (approximately 1 second).
I already mentioned the battery life issue: radios are expensive to power (and as the power draw of the rest of the system drops over time, they become even more expensive in relative terms).
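To put a rough number on that round-trip cost, here is a small illustrative sketch (all figures are assumptions, not measurements): on a link with a one-second round-trip time, every user action that needs several dependent requests waits on that latency once per request, regardless of how much bandwidth is available.

    // Back-of-the-envelope only; both numbers below are assumptions.
    const roundTripSeconds = 1.0;   // assumed satellite round-trip time
    const dependentRequests = 5;    // assumed sequential fetches per user action

    // Dependent requests cannot overlap, so their round trips add up.
    const waitFromLatencyAlone = roundTripSeconds * dependentRequests;
    console.log(`At least ${waitFromLatencyAlone} s of waiting per action`);
    // A local application pays none of this, whatever the bandwidth.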
Posted Jul 29, 2010 10:35 UTC (Thu) by NAR (subscriber, #1313)
Exactly. In my experience it takes about 2-5 seconds to open an e-mail in gmail, while with the local pine client it's instant. WWW stands for World Wide Wait. That kind of latency is simply unacceptable for most web-based applications. Reliability is also worse: it's not enough that my computer works; the connection to my ISP must work, my ISP's international line must work, and the server must not be down. That's a number of extra components that can break down.
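A minimal sketch of that chain-of-components argument, with made-up availability figures purely for illustration: when every link in the chain must be up at once, the chance that the whole thing works is the product of the individual chances, so each extra component drags it down.

    // Hypothetical availability figures, for illustration only.
    const availability = {
      localMachine: 0.999,          // my computer works
      homeConnection: 0.99,         // connection to my ISP works
      ispInternationalLink: 0.995,  // ISP's international line works
      remoteServer: 0.999,          // the server is not down
    };

    // The whole chain is up only when every component is up at once.
    const chainUp = Object.values(availability).reduce((p, a) => p * a, 1);
    console.log(`Whole chain available: ${(chainUp * 100).toFixed(2)}%`);
    // A purely local client depends only on the first term (~99.9%).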
Posted Jul 29, 2010 7:56 UTC (Thu) by mjthayer (guest, #39183)
I am one of the most clueless people around when it comes to web applications, but my understanding was that JavaScript applications are stored on a remote server but cached and executed locally. Surely with a bit of tweaking (like making sure that the whole application gets cached and not just part of it) they could continue to work once your connection is gone. apt-get also normally relies on a network connection initially.
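A minimal sketch of that "cache the whole application" idea, assuming a hypothetical resource list and plain browser storage (an illustration of the concept, not how any particular web stack actually does it): fetch every part of the application once while online, keep a local copy, and fall back to it when the network is gone.

    // Hypothetical list of everything the application needs to run offline.
    const APP_RESOURCES = ["index.html", "app.js", "style.css"];

    // While online: pull the whole application into local storage,
    // not just the parts the user happened to visit.
    async function cacheWholeApp(): Promise<void> {
      for (const path of APP_RESOURCES) {
        const response = await fetch(path);
        localStorage.setItem("cache:" + path, await response.text());
      }
    }

    // Later: prefer the network, but fall back to the cached copy if it fails.
    async function loadResource(path: string): Promise<string> {
      try {
        return await (await fetch(path)).text();
      } catch {
        const cached = localStorage.getItem("cache:" + path);
        if (cached === null) throw new Error("offline and not cached: " + path);
        return cached;
      }
    }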