Why Web Components Aren’t Ready for Production… Yet


Web components are the new hotness. And now that a complete web components implementation landed in Chrome 36, we finally have a stable, unprefixed, unflagged version to try out. But although web components are certainly something to be excited about, and a technology worth experimenting with, that doesn’t mean they’re ready to use in your production applications — because for most applications, they’re not.

In this article you’ll see why that is. We’ll discuss the current issues with using web components in production, and what needs to be done to solve them.

Browser support

The obvious reason to avoid web components is browser support. Although web components landed in Chrome 36, they have only partial support in Firefox, and they are not present in Safari or IE. Because full cross-browser support won’t arrive for a very long time, if it arrives at all, a polyfill is a long-term necessity for developers who want to use web components outside of Chrome.

And there is seemingly good news on this front: the Polymer team maintains a complete set of polyfills collectively known as “The Platform”. But although Polymer’s official documentation makes it seem like getting cross-browser web components support is as easy as including platform.js, things aren’t quite that simple.

Not all polyfills are created equal

Although developers tend to think of a polyfill as a polyfill, the complexity of the implementation can vary widely depending on the technology being polyfilled. APIs that are syntactic sugar for existing APIs tend to be the easiest to implement. For instance, you can write a Function.prototype.bind() polyfill in a handful of lines of code, and a classList polyfill in about 70 lines of code.
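For reference, here is roughly what such a sugar-only polyfill looks like. This is a deliberately simplified sketch of a Function.prototype.bind() polyfill that skips edge cases (such as being called on a bound constructor), but it illustrates how little code these polyfills need:

// Minimal sketch of a Function.prototype.bind() polyfill.
// Real polyfills (like the one documented on MDN) also handle
// `new`-invocation and other edge cases; this shows only the core idea.
if (!Function.prototype.bind) {
  Function.prototype.bind = function (context) {
    var fn = this;
    var boundArgs = Array.prototype.slice.call(arguments, 1);

    return function () {
      var args = boundArgs.concat(Array.prototype.slice.call(arguments));
      return fn.apply(context, args);
    };
  };
}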

APIs that add entirely new behavior, or that include CSS syntax changes, tend to be harder to write. For example, consider a pointer events polyfill. Because pointer events work with the touch-action CSS property, and because browsers ignore CSS rules for properties they do not understand, pointer event polyfills have to get creative. Polymer’s pointer events polyfill requires you to use a touch-action attribute on elements — e.g. <div touch-action="none"></div> — and forgo the CSS declaration entirely. Microsoft’s HandJS polyfill does a text-based search of all <style> and <link rel="stylesheet"> tags for the touch-action property, and then replicates that functionality in JavaScript — which obviously has non-trivial performance implications.
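To get a feel for what a text-based approach involves, here is a rough sketch of how a polyfill could hunt for touch-action declarations in <style> tags. This is purely illustrative and is not HandJS’s actual code; handling <link rel="stylesheet"> tags would additionally require fetching each stylesheet over XHR before scanning it:

// Illustrative sketch only, not HandJS's implementation.
// Scan every <style> tag for rules containing `touch-action: none`
// and remember which selectors need their touch behavior suppressed.
var selectorsToHandle = [];
var styleTags = document.querySelectorAll('style');

Array.prototype.forEach.call(styleTags, function (styleTag) {
  // Naive pattern: "selector { ... touch-action: none ... }"
  var rule = /([^{}]+)\{[^}]*touch-action\s*:\s*none[^}]*\}/g;
  var match;
  while ((match = rule.exec(styleTag.textContent)) !== null) {
    selectorsToHandle.push(match[1].trim());
  }
});

// The polyfill would then attach touch/mouse handlers to every
// matching element to replicate what the browser would do natively.
selectorsToHandle.forEach(function (selector) {
  Array.prototype.forEach.call(
    document.querySelectorAll(selector),
    function (element) { /* wire up pointer event handling here */ }
  );
});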

The point is that some technologies are harder to polyfill than others. Why is this relevant? Well, as it turns out, web components are the king of all polyfilling challenges. Simply put, writing a web components polyfill is absurdly difficult, and that inherent complexity has non-trivial implications for the viability of using web components polyfills in production. Let’s look at some examples to demonstrate this.

The insanity that is polyfilling web components

To start, consider the encapsulation that the shadow DOM specification allows. When HTML elements live within a shadow root, they are hidden from functions such as querySelector() and querySelectorAll(). For example, in the screenshot below, querySelectorAll() does not find the <h1> element because it’s within a shadow root.
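If you want to try this yourself, the scenario boils down to a few lines like the following (using the unprefixed createShadowRoot() API that ships in Chrome 36):

// Create a host element and put an <h1> inside its shadow root.
// createShadowRoot() is the unprefixed shadow DOM API in Chrome 36.
var host = document.createElement('div');
document.body.appendChild(host);

var root = host.createShadowRoot();
var heading = document.createElement('h1');
heading.textContent = 'Hello from a shadow root';
root.appendChild(heading);

// The <h1> renders on the page, but selector APIs can't reach it.
console.log(document.querySelectorAll('h1').length); // 0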

[Screenshot: http://i.imgur.com/YBaWXTZ.png, showing querySelectorAll() in Chrome returning no results for an <h1> that lives inside a shadow root]

Think for a moment about how you would polyfill this behavior. Crazy, right? But if you add Polymer’s platform.js to the same example, it unbelievably works. That is, querySelectorAll() does not find elements within a shadow root — in browsers that have no concept of a shadow root. The following screenshot shows this behavior in Safari:

[Screenshot: http://i.imgur.com/exy2SAT.png, showing the same test in Safari with platform.js loaded: querySelectorAll() still does not find the <h1> in the polyfilled shadow root]

I had to run this a few times to verify my own sanity, as I had no idea how this was even possible. How does a polyfill do this?

The answer is that Polymer’s shadow DOM polyfill wraps a large number of DOM methods — at least 25 of them — with a series of customized shims that exclude elements residing within its internal list of shadow roots, and it does this for over 30 HTML element interfaces. No, I’m not kidding.
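Conceptually, the technique looks something like the sketch below. To be clear, this is a drastically simplified illustration of the wrapping approach and not Polymer’s actual code, which also has to return proper wrapper objects rather than plain arrays:

// Drastically simplified illustration of the wrapping approach.
// Polymer's real shadow DOM polyfill does this, with far more care,
// for dozens of methods across dozens of element interfaces.
var polyfilledShadowRoots = [];  // roots the polyfill has created

function insideShadowRoot(element) {
  return polyfilledShadowRoots.some(function (root) {
    return root.contains(element);
  });
}

var nativeQuerySelectorAll = Document.prototype.querySelectorAll;
Document.prototype.querySelectorAll = function (selector) {
  var matches = nativeQuerySelectorAll.call(this, selector);
  // Hide anything that lives inside a polyfilled shadow root.
  return Array.prototype.filter.call(matches, function (element) {
    return !insideShadowRoot(element);
  });
};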

And we’re just getting started with the crazy polyfilling challenges that web components present. The shadow DOM specification also defines a number of new CSS concepts for exposing and selecting elements that reside within shadow roots. Because browsers discard CSS selectors and rules they don’t understand, polyfills must resort to text-based searches of <style> and <link rel="stylesheet"> tags. This 38-line block of regular expressions defined in platform.js’s source should give you an idea of the difficulty of doing this:

[Screenshot: http://i.imgur.com/G6YFprt.png, showing the 38-line block of regular expressions in platform.js’s shadow DOM CSS shim]

Why this stuff matters

I want to make it clear that the code examples I show above are not meant as a criticism of the engineering effort put forth by Google, but rather as an attempt to show the complexity inherent in reimplementing web components in JavaScript. The web components specification aims to give hooks into browser internals, so it makes sense that it’s nearly impossible to replicate that functionality in JavaScript. Given this, the job Google did with the platform polyfills is extraordinary.

But, given that Google has already done the hard work, why should you care about how difficult it was? Because the complexity has several adverse effects on the feasibility of using these polyfills in production. Let’s look at each of these issues in turn.

Not everything is supported

The first issue is that, even with code like the examples shown above, Polymer’s polyfills do not support all of the functionality that web components offer (most notably shadow DOM). As the source itself says:

“The intention here is to support only the styling features which can be relatively simply implemented. The goal is to allow users to avoid the most obvious pitfalls and do so without compromising performance significantly.”

Although I disagree with Google’s interpretation of “relatively simply”, I understand the sentiment here, as truly polyfilling all of shadow DOM would require rewriting CSS in JavaScript, and even Google agrees that goes too far.

The problem is that, from a developer’s perspective, it’s hard to know what will work and what won’t. The only documentation of this I can find is hidden away in places like this 113-line comment in a source file, which discusses which portions of shadow DOM’s CSS are supported.

Remember the example above of querySelectorAll() not finding an <h1>? That same example behaves differently when you style that <h1> in a <style> tag. The screenshot below shows the difference in Chrome and Safari.
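In case you’re curious what that difference looks like in code, the gist is roughly this. With native shadow DOM the document-level rule stops at the shadow boundary; with the polyfill the <h1> really lives in the regular DOM, so nothing stops the rule from matching it:

// Add a document-level rule that targets h1 elements.
var style = document.createElement('style');
style.textContent = 'h1 { color: red; }';
document.head.appendChild(style);

// Chrome 36 (native shadow DOM): the <h1> inside the shadow root is
// unaffected, because document styles don't cross the shadow boundary.
// Safari + platform.js (polyfill): the <h1> turns red, because the
// "shadow" content is ordinary DOM that the rule happily matches.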

[Screenshot: http://i.imgur.com/cQMWpkf.png, showing the same example with an h1 style rule applied, rendering differently in Chrome (native shadow DOM) and Safari (polyfill)]

The polyfills are also big

The next issue is sheer file size. As of version 0.3.3, the platform polyfills comprise 151K of JavaScript, 44K gzipped. If you choose to use Polymer on top of the polyfills, you can add another 66K, 20K gzipped. For comparison, people complain incessantly about the size of jQuery, which is 29K gzipped.

Polymer does do a good job of separating the polyfills into separate modules so you can pick and choose the ones you need, but in practice almost no one actually does that (the one exception being Mozilla’s X-Tags library). All of Polymer’s elements, and most (all?) of the elements listed on http://customelements.io/ and http://component.kitchen/ depend on Polymer, which depends on the platform in its entirety.

Performance considerations

Beyond the overhead of shipping the polyfills across the network, there’s also the time it takes for the browser to parse and interpret the JavaScript code, which is a known performance bottleneck on mobile devices. Polymer’s documentation states that they do not yet do performance benchmarking, which is unfortunate considering that many of the things Polymer’s polyfills do are inherently slow.

One performance issue worth specifically discussing is the numerous requests generated by HTML imports. The idea behind HTML imports is great: a single .html file that contains everything you need — templates, CSS, JavaScript, other HTML, and so forth. As such, it’s relatively common to see an HTML import that depends on several other resources, each of which must be resolved over HTTP.
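For example, kicking off an import takes a single <link> element (created here from JavaScript so the snippet is self-contained; my-component.html is a hypothetical file), but everything that file references, including its own nested imports, stylesheets, and scripts, costs an additional request:

// Load a hypothetical component via an HTML import.
// The declarative equivalent is:
//   <link rel="import" href="components/my-component.html">
var link = document.createElement('link');
link.rel = 'import';
link.href = 'components/my-component.html';  // hypothetical path

link.onload = function () {
  // my-component.html may itself import other components and pull in
  // stylesheets and scripts; each of those is another HTTP request.
  console.log('Import and its dependency tree finished loading');
};
link.onerror = function () {
  console.error('Failed to load the import');
};

document.head.appendChild(link);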

In browsers that support HTML imports natively — i.e. Chrome — the browser can resolve these resources using parallel connections, usually ~6–8 per hostname, and can leverage speculative parsing to optimize their loading. But in all other browsers, where HTML imports aren’t natively supported, polyfills must resort to queued-up XHRs, which can be painfully slow.

This isn’t very noticeable in a demo of one web component on a high-end development machine, but it shows on pages with multiple components — especially on slow networks and mobile devices. The best example of this is actually Polymer’s site itself, which uses Polymer to build everything you see. Visit http://www.polymer-project.org/ in any browser that isn’t Chrome 36+ and see the performance issues for yourself (iOS Safari and IE Mobile are particularly bad).

Now, there is some good news on this front, as tooling is emerging to help with the performance issues. Vulcanize is a build tool from the Polymer team built specifically for HTML imports. In a nutshell, you pass Vulcanize an HTML file and it outputs that HTML file with all HTML imports inlined, including deep dependencies. For example, the following inlines all imports in an index.html file and places the output in a built.html file:

$ vulcanize -o built.html index.html

That being said, build tools such as Vulcanize are still in their infancy. A lot more research needs to be done to show how to incorporate a tool like this into existing development processes and workflows.

So what does this all mean?

Web components and Polymer are exciting technologies that may fundamentally change the way we develop web applications, but because of the large performance gap between browsers that support the technologies natively (aka Chrome 36+) and those that don’t (aka every other browser), it will be difficult for most developers to use web components until they’re implemented everywhere, and there’s no way of knowing how long that will be.

Here are a few things I think would help the adoption of web components:

  • Browser support—Duh. But seriously, web components without any polyfills perform great in Chrome 36+; it’s the polyfilling that causes the issues.
  • Better modularity—99% of published web components rely on Polymer, which relies on platform.js, which means always downloading over 60K of gzipped JavaScript (44K for the platform polyfills plus 20K for Polymer). jQuery doesn’t get a free pass for 29K, so Polymer shouldn’t get one for more than double that (mental note: register youmightnotneedpolymer.com). Polymer could be better about micromanaging its dependencies, and about documenting how to do that in third-party components. Shadow DOM is easily the biggest and most complex web components polyfill, and many components don’t even need it — yet 99% of them are downloading and parsing it.
  • Performance benchmarks—If the Polymer team expects people to use web components in real apps, then they should start benchmarking the polyfills in the browsers developers need to support, especially mobile ones.

Bear in mind that none of this is especially surprising given that Polymer is still officially designated as a “developer preview”. I’m confident that better tools, better modularity, and better performance will come in time, but Polymer and web components are still really new in the web development world.

How This Affects Kendo UI

At Telerik we get a lot of questions about when Kendo UI will support web components, as UI widgets are seen as an ideal use case for the web components model. But although we certainly look forward to the day we can provide <kendo-calendar> as an API, we have to weigh that against the need to provide high-quality, performant components for our users in the browsers we support.

We’re continuously researching Polymer and the web components specification so that we can leverage the benefits they provide when appropriate, and the research presented above is a direct result of that. Simply put, Kendo UI does not support web components today because we cannot provide high quality and performant components using them.

We are, however, concentrating on making our widgets as easy as possible to use with production-ready technologies; therefore, although you can’t write <kendo-calendar> yet, you can write <div data-role="calendar"> using our declarative data bindings, as well as <div kendo-calendar="..." k-ng-model="..."> using our comprehensive set of Angular directives. We’re also constantly listening to what our users want. If you’re interested in seeing web components and/or Polymer in Kendo UI, let us know in our feedback portal.

What’s next

This article purposely focused on the feasibility of using web components in production today, and did not touch on whether web components are actually worth using — that discussion is coming in a future article.

Comments

  • You bring up the use of <kendo-calendar> and wishing you could write them. Why not? Why can’t you adopt custom elements that work everywhere instead of more annoying role attributes? While registerElement is part of the Web Component specification, you could just as easily adopt the custom tags now and migrate slowly.

    It’s not clear why you see components as an all or nothing endeavor.

    • Hey @tbranyen:disqus, thanks for commenting. I present it this way because it is the message that is being conveyed to web developers currently. All of the HTML5Rocks articles on web components, all Google I/O talks, and so forth recommend that if you want to use web components today, you should use Polymer, which requires platform.js, which includes every platform polyfill. The main purpose of this article was to show that there are performance issues associated with doing that.

      We need more documentation and tutorials on using the individual polyfills. What are the performance implications of using each? What are the limitations? That’s the main thing I want to see come out of this article.

      The custom elements portion of web components has the most value for libraries like Kendo UI and jQuery UI, so we’re looking into starting with just that, as the polyfills seem sane.

      • I agree completely with the messaging and ecosystem. I’m also a bit skeptical about a lot of the new underlying technologies. They don’t seem to fit in well with how I see the web, and I’m troubled by the direction. The CSS scoping problem seemed so easy until Shadow DOM was brought into the mix, and now I have no idea if I need ::shadow or /deep/ to just change a simple style.

        Besides the browser compatibility, the decision making has definitely held me back from fully implementing to spec.

        • I feel the same way. In my prototypes of Kendo UI and jQuery UI with web components, when I place the widgets in a shadow DOM I have to completely rewrite the theming systems to account for /deep/ in the appropriate places, which gets messy quickly.

          That being said, although I’m skeptical of shadow DOM, I still think it has good use cases where you really need encapsulation—a Facebook Like button or ads for instance, but I’m not sure how applicable shadow DOM is to general widgets. With all of these technologies I think we need time for developers to use them in production and for best practices to develop before encouraging mass adoption.

          • Christopher Sanders

            So basically you have a chicken and egg problem. You need developers to use Web Components in production so we can figure out best practices, but you just said we shouldn’t use them in production. Which is it?

          • It’s definitely a bit of a chicken and egg problem. My point is that we need a few people to experiment before the masses should adopt.

          • Christopher Thomas

            it’s not really a chicken and the egg situation at all.

            custom elements are the way to go, I should be able to supply my own elements and they work like any other element and I can bind a javascript behaviour object to it and the browser will just react as if it was a div or span node.

            shadow dom allows me to hide all the complexity of how the node is actually working, take the <video> tag for example, it uses the shadow dom to hide all the video controls and provide a standard CSS style and define what happens when I click all the buttons. I can use CSS to access the internals using /deep/ when I want my website to override the default styling.

            the only problem for me is the polyfill stuff, it’s a bit heavy, but overtime that platform will dissolve and disappear leaving only the native code.

            I really don’t understand the problem you guys are having, these two technologies allow me to define custom elements, their default css and javascript and allow it all to be packaged up in ways that won’t break if I include it on a page with styles that overlap except if I specifically want them to override the internal definitions.

            that’s a win-win all around. Apart from the performance issue, it’s an amazing step forward

  • Great article!

    I don’t see it as far-fetched that Firefox will soon follow Chrome in fully supporting Web Components, and then why not Safari at that point? IE11 is great, but the truth of the matter is that “production-ready” means supporting IE9. So although Web Components on the desktop may take years (or at least as long as it takes for IE9 to be phased out), Web Components in mobile web development may arrive much sooner than we think.

    I also agree with Tim that you should still be able to implement <kendo-calendar> today without the rest of Web Components. This is even possible with AngularJS directives without Polymer. And since Kendo has committed to AngularJS, you might as well use it for future-proofing Kendo. That way, when Web Components do finally get here, <kendo-calendar> still remains while the plumbing underneath it changes.

    • Thanks Basem Emara. IMO this site has the most up-to-date information on where other browser vendors stand on web components: http://jonrimmer.github.io/are-we-componentized-yet/.

      Everybody has a different definition of “production-ready”, but I think you’re right that for the average developer it means supporting at least a couple of versions of IE. A lot of developers are also supporting Android 2.3, which is debatably harder than supporting IE8.

      Web components are fine for most developers to use as long as there are reasonably performant polyfills available for those browsers. The problem right now is that developers don’t have that information. If you use Polymer you get all the polyfills, and some of them are subject to the performance issues I outline in this article. As I mentioned in my reply to Tim, we are looking into using the custom elements polyfill specifically, as custom elements provide the most value and have a relatively sane polyfill.

      • Christopher Sanders

        Agreed on that point about Android 2.3 being harder to support than IE8. I too have been following Jon’s site. He needs to update it though.

    • Dmitri

      Valid point – AngularJS already lets you implement quite a lot without Polymer.

      • Christopher Thomas

        both the polymer and angular teams are working together to merge some of this stuff together, so the directive part of angular will dissolve into using the same stuff from polymer as it’s obvious the native custom components solve this problem in a far better way

  • I could have sworn ASP already did this minus the whole client side/performance issue thing.

  • Good job on writing this up, with all the nuances and considerations.

    We have run into the same problem, but we are still using some web components in production. For this we simply created a subset of the platform.js polyfills (https://github.com/Versal/component-runtime), which indeed excludes the Shadow DOM polyfill. Most other polyfills are small and simple in comparison.

    On the other hand, the Shadow DOM is arguably one of the most useful features of web components, so I hope for quick browser adoption!

    • Awesome and thanks. I’m curious if you are using HTML imports at all? Since writing this article, I’ve had a few people tell me that the custom elements polyfill is production ready, but I haven’t seen anyone using HTML imports yet.

      Also it would be great if you could publish your experiences in some fashion. We need more documentation on the performance ramifications of the various polyfills, especially in a production environment.

        We are using HTML imports, but we haven’t “componentized” many things yet, so we’re not using them that much. But the polyfill seems to work mostly fine. I’m planning on writing some more about our experiences when we’ve used it for a bit longer!

        • Dan

          HTML Imports is working for you on IE9? I am currently trying to add HTML Imports into a custom framework, but we have to support IE9, and HTML Imports don’t seem to be working there.

  • lamtran

    I came to the same conclusion while researching Web Components and Polymer to see if I could adopt them in my work. I blogged about it here http://blog.lamtran.com/lets-not-apply-that-new-trick-you-just-learned-to-everything/, though I wasn’t as nice about it as you are 🙂

    • Hey @lamtran:disqus, thanks for sharing. It’s good to see that you came to the same basic conclusion independently, and your post was an amusing read 🙂

  • Check out spock on npm if you’re interested in a tool to preprocess `rel=”import”`s as a gulp plugin 🙂

  • Johnny Larue

    No one can dispute that browser support for web components is weaker today than it will be in the future, but that will most surely change in the very near future.
    My sense is that Web Components are by far the best bet going forward, given that they are part of the HTML5 specification and given that the Web Component approach is a much more intuitive way to build client applications than the current div soup and JS entanglement of today.
    Personally, I’d rather invest my time developing with Web Components, even if that means my applications run slower with polyfills, with the intent being that the polyfills will eventually fall aside and the same applications will run much faster once browsers provide native support for Web Components.

  • Christopher Sanders

    TJ, I have a question for you. In what case would you encourage the use of Web Components? How far would you take them? Do you all have any guidelines on what might be the best case of using Web Components?

  • spacemonkey82

    Firefox will implement these features in no time, but what I don’t understand is why Apple doesn’t want to play ball here. Can someone explain why?

  • Michael Trotter

    One extra thing to note: Google’s platform.js polyfill does not work at all on Safari 8 (releasing this fall). WebKit has mis-implemented one of the built-in properties, preventing platform.js from doing its thing: https://bugs.webkit.org/show_bug.cgi?id=49739

  • Dmitri

    As an AngularJS user, I see Web Components as merely an architectural design pattern. If I like it, I can easily implement them with Angular directives, can’t I? Am I missing something?

  • johnrhunt

    Shadow DOM is not being removed from WebKit… it was just being removed from that branch or something like that. Please update your article.

  • Trey Shugart

    Very nice writeup, TJ. There are a lot of very valid reasons to be selective about the parts of web components to adopt now and the parts to be wary of. We’ve settled on using Custom Elements and leaving the rest until they’re better supported and/or the polyfilling becomes better. It gives us the benefits of offering a very clean, clear and concise API whilst using the most stable part of the spec.

    Besides the issues you’ve pointed out, we also found that the Mutation Observer polyfill used was unacceptably slow in IE 9 and 10 and the size of the polyfills or Polymer itself wasn’t something we were prepared to swallow. X-Tags is an alternative, and while smaller, they use the same Mutation Observer polyfill that is included in Polymer’s platform.

    After considerable deliberation, we ended up rolling our own solution. The result is much smaller than even X-Tags, with a similar API and something that yields acceptable performance all the way back to IE9. I hope you won’t mind my shameless plug here, but in case you or anyone else is interested you can find it at: https://github.com/skatejs/skatejs.

    Again, great writeup and all the best!

  • jokeyrhyme

    Looks like some of this is being addressed:

    http://updates.html5rocks.com/2015/01/polymer-state-of-the-union

  • Thanks for this great post

  • Christian Tzolov

    @tjvantoll:disqus It looks like Polymer 1.0 has addressed some of the issues?

  • Just to update that Polymer 1.0 now ships with the recent `webcomponents.js` (re-branded from `platform.js`), which is broken down to be a bit lighter. It comes as `webcomponents.js` for all of the polyfills and `webcomponents-lite.js` for all polyfills except shadow DOM. Check it out here https://github.com/webcomponents/webcomponentsjs.

  • Umair

    It’s 2016 and now they are production ready 🙂