title: JS Objects: Distractions
url: http://davidwalsh.name/javascript-objects-distractions
JavaScript has been plagued since the beginning with misunderstanding and awkwardness around its “prototypal inheritance” system, mostly due to the fact that “inheritance” isn’t how JS works at all, and trying to do that only leads to gotchas and confusion that we have to pave over with user-land helper libs. Instead, embracing that JS has “behavior delegation” (merely delegation links between objects) fits naturally with how JS syntax works, which creates more sensible code without the need for helpers.
When you set aside distractions like mixins, polymorphism, composition, classes, constructors, and instances, and only focus on the objects that link to each other, you gain a powerful tool in behavior delegation that is easier to write, reason about, explain, and code-maintain. Simpler is better. JS is “objects-only” (OO). Leave the classes to those other languages!
I’d like to thank the following amazing devs for their generous time in feedback/tech review of this article series: David Bruant, Hugh Wood, Mark Trostler, and Mark McDonnell. I am also honored that David Walsh wanted to publish these articles on his fantastic blog.
In part 1 of this article series, I went into great detail (aka, wordiness) about the differences between what the traditional definition of “inheritance” means and how JS’s [[Prototype]] mechanism works. We saw that JS operates oppositely to “inheritance”, being better labeled as “behavior delegation”. If you haven’t read it and you have any twinges of doubt or confusion about that statement, I’d encourage you to go read part 1 first.
Inheritance implies, to an extent, copying of behavioral definition down the chain, whereas behavior delegation implies delegating behavior up the chain. These aren’t just word semantics, but an important distinction that, when examined, can de-mystify a lot of confusing stuff around JS objects.
I’m by far not the first dev to realize this truth about JS. What differs here is my reaction to that realization. The usual response is to layer on other concepts to smooth out the rough edges or unexpected consequences of how “prototypal inheritance” can surprise us, to try to make JS feel more like classical OO.
I think those attempts just distract us from the plain truth of how JS works.
I would rather identify the things which are merely distractions, set them aside, and embrace only the true essence of how JS’s [[Prototype]] works. Rather than trying to make JS more “inheritance friendly”, I’d rather rip out everything that confuses me (and others) into thinking JS has “inheritance” at all.
It’s often cited that in JavaScript, if you declare a function and add things to that function’s prototype, then that alone makes a definition of a custom “type”, which can be instantiated. If we were in a traditional OO language, that sort of thinking might be more appropriate, but here in JS land, it’s just one of many distractions.
You’re not really creating a new type in any real sense of that word. It’s not a new type that will be revealed by the typeof operator, and it’s not going to affect the internal [[Class]] characteristic of a value (what would be reported by default via Object#toString()). It is true that you can do some self-reflection to check if an object is an “instance of” some function’s construction (via the instanceof operator). But importantly, foo1 instanceof Foo is just following the internal [[Prototype]] chain of your object foo1 to see if at any level of that chain it happens to find the .prototype object attached to the Foo function.
In other words, the reflection you’re doing is not about checking if the value is a specified type at all, nor is it about the function constructor. It’s only about asking if one object is in another object’s [[Prototype]] chain. The name and semantics of the instanceof operator (referring to “instances” and “constructor functions”) are layering on extra and unnecessary meaning, which only confuses you into thinking there’s anything more than simple [[Prototype]] chain checking going on.
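To see that instanceof is nothing more than a chain check, consider this sketch (the names Foo and foo1 echo the discussion above):

```javascript
function Foo() {}
var foo1 = new Foo();

// `instanceof` just walks foo1's [[Prototype]] chain, looking for Foo.prototype
console.log(foo1 instanceof Foo);               // true
console.log(Foo.prototype.isPrototypeOf(foo1)); // true -- the same check, minus the "class" vocabulary

// No new "type" was actually created
console.log(typeof foo1);                          // "object"
console.log(Object.prototype.toString.call(foo1)); // "[object Object]"

// Re-point Foo.prototype, and the "instance" relationship silently evaporates
Foo.prototype = {};
console.log(foo1 instanceof Foo); // false -- it was only ever a chain check
```

The last line is the tell: nothing about foo1 changed, only which object Foo.prototype currently points at.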
Some developers frown on the usage of instanceof, and so an alternate form of determining the “type” of some object is called duck typing, which is basically inferring a value’s “type” by inspecting the object for one or more characteristic features, like a specific method or property.
Either way, these aren’t really “types”, they’re just approximations of types, which is one part of what makes JS’s object mechanism more complicated than other languages.
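As a quick sketch of duck typing (the isThenable name here is my own illustration, not a standard API), a value gets treated as a “promise” purely because it has a then method:

```javascript
// Duck typing: infer a "type" from characteristic features, not from instanceof
function isThenable(val) {
  return val != null &&
    (typeof val === "object" || typeof val === "function") &&
    typeof val.then === "function";
}

var fakePromise = { then: function (onOk) { onOk(42); } };

console.log(isThenable(fakePromise)); // true -- it quacks like a promise
console.log(isThenable({ then: 1 })); // false -- has the name, but not a function
console.log(isThenable(null));        // false
```

Note that fakePromise passes the check without being linked to any particular constructor or prototype at all, which is exactly why this is an approximation of a type rather than a type.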
Another distraction is trying to mimic the automatic “copying” of inheritance by using the “mixin” pattern, which essentially manually iterates through all the methods/properties on an object and makes a “copy” (technically just a reference for functions and objects) onto the target object.
I’m not saying that mixins are bad -- they’re a very useful pattern. However, mixins have nothing to do with the [[Prototype]] chain or inheritance or delegation or any of that -- they rely entirely on implicit assignment of this by having an “owning object” at the call-time of a function/method. They are, in fact, completely circumventing the [[Prototype]] chain.
Take any two independent objects, call them A and B (they don’t have to be linked via [[Prototype]] at all), and you can still mixin A’s stuff into B. If that style of code works for your situation, use it! But just note that it has nothing to do with [[Prototype]] or inheritance. Trying to think of them as related is just a distraction.
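A minimal version of that manual copying might look like this (mixin here is a hand-rolled sketch, not a library call). Notice that no [[Prototype]] link between A and B is ever created:

```javascript
// Copy enumerable own properties; for functions/objects this copies only the reference
function mixin(target, source) {
  for (var key in source) {
    if (source.hasOwnProperty(key)) {
      target[key] = source[key];
    }
  }
  return target;
}

var A = { greet: function () { return "hello from " + this.name; } };
var B = { name: "B" };

mixin(B, A);

console.log(B.greet());                      // "hello from B" -- `this` is the owning object at call time
console.log(B.greet === A.greet);            // true -- same function reference, merely copied
console.log(Object.getPrototypeOf(B) === A); // false -- no delegation link exists
```

The “hello from B” result is the implicit-this binding doing all the work; the chain was never consulted.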
Another related distraction is when the inevitable desire to create “multiple inheritance” comes up, because JavaScript only allows an object to be [[Prototype]] linked to one other object at a time. When we read about the lack of multiple inheritance in JavaScript, several problems come up, and various “solutions” are often proposed, but we never actually solve them; we just do more fancy hand-waving to distract us from the difficulties that JS poses at the syntax/semantic level.
For example, you basically end up doing some form of “mixin” to get multiple different sets of properties/methods added into your object, but these techniques don’t, without elaborate and inefficient work-arounds, gracefully handle collision if two of your “ancestor” objects have the same property or method name. Only one version of the property/method is going to end up on your object, and that’s usually going to be the last one you mixed-in. There’s not really a clean way to have your object reference the different versions simultaneously.
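For instance (using ES6’s Object.assign as the copying mechanism here, which post-dates most of this discussion but performs exactly this kind of manual copy), a name collision is resolved by plain overwrite:

```javascript
var canSpeak = { perform: function () { return "speaking"; } };
var canSing  = { perform: function () { return "singing"; } };

// "Multiple inheritance" by mixing both in -- the last copy wins
var artist = Object.assign({}, canSpeak, canSing);

console.log(artist.perform()); // "singing" -- canSpeak's perform() was silently clobbered
```

There is no error and no warning; the earlier version of perform() is simply gone from artist.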
Some people choose another distraction to resolve these problems, by using the “composition” pattern. Basically, instead of wiring your object C to both A and B, you just maintain a separate instance of each of A and B inside your C object’s properties/members. Again, this is not a bad pattern; it has plenty of goodness to it.
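A sketch of that composition wiring (the names A, B, and C match the description above):

```javascript
var A = { describe: function () { return "I am A"; } };
var B = { describe: function () { return "I am B"; } };

// C is not [[Prototype]]-wired to A or B; it simply *holds* delegates to them as members
var C = {
  a: Object.create(A),
  b: Object.create(B),
  describeBoth: function () {
    // No collision problem: both versions remain reachable, explicitly
    return this.a.describe() + " / " + this.b.describe();
  }
};

console.log(C.describeBoth()); // "I am A / I am B"
```

Unlike the mixin collision above, both describe() methods survive, because each lives in its own named slot.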
Parasitic Inheritance is another example of a pattern that works around this “problem” that [[Prototype]] doesn’t work like classes, by simply avoiding [[Prototype]] altogether. It’s a fine pattern, but I think it’s a confusing distraction because it makes you feel like you’re doing OO when you’re not.
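A sketch of the parasitic pattern (factory names here are my own illustration): take an ordinary object from a “base” factory, graft new behavior directly onto it, and return it -- [[Prototype]] never enters the picture.

```javascript
function createBase() {
  return {
    type: "base",
    hello: function () { return "hello"; }
  };
}

// "Parasitically" augment the object the base factory returns
function createLoud() {
  var obj = createBase();
  obj.type = "loud";
  obj.shout = function () { return this.hello().toUpperCase() + "!"; };
  return obj;
}

var thing = createLoud();
console.log(thing.shout());                                     // "HELLO!"
console.log(Object.getPrototypeOf(thing) === Object.prototype); // true -- no custom chain at all
```

It reads like “inheritance”, but every object is a flat, self-contained copy of the combined behavior.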
Whatever technique you use here, you basically end up ignoring the [[Prototype]] chain and doing things manually, which means you’ve moved away from JavaScript’s “prototypal inheritance” mechanism altogether.
One particular distraction that ends up creating some of the most awkward code patterns we deal with in JS is polymorphism: the practice of having the same method or property name at different levels of your “inheritance chain”, and then using super-style relative references to access ancestor versions of the same.
The problem is mechanical: JavaScript provides a this binding, but importantly it is always rooted at the bottom of the [[Prototype]] chain, not at whatever level of the chain the current function was found. While it’s true that this.foobar() might end up resolving (finding) foobar() at an ancestor level of the chain, inside that call, its this will still be the original rooted this object.
Put more simply, this is not relative, but absolute to the beginning of the call stack. If JS had a super or a currentThis (as I proposed recently), then those references would be relative to whatever the currently resolved link in the [[Prototype]] chain was, which would allow you to make a relative reference to a link “above”. But JS does not currently have any such mechanism. And this being absolutely rooted makes it an ineffective (or inefficient at best, and thus impractical) solution to the relative references that polymorphism requires.
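Here’s the rooting behavior in a two-line chain (names are illustrative): the method is found on the parent, but this stays pinned to the object the call started from.

```javascript
var parent = {
  label: "parent",
  whoami: function () { return this.label; } // this function lives on `parent`...
};

var child = Object.create(parent);
child.label = "child";

// whoami() is *found* one level up the chain, but `this` stays rooted at `child`
console.log(child.whoami()); // "child", not "parent"
```

That rooting is exactly what delegation wants (the child “borrows” behavior but keeps its own state), and exactly what makes super-style relative references awkward.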
Most of the OO helper libraries try to provide a way for you to make super calls, but all of them end up having to do inefficient things under the covers to make that kind of relative call work.
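A typical hand-rolled workaround looks something like this sketch: climb the chain explicitly, then re-root this with .call(). It works for a fixed two-level chain, but it’s verbose, and it breaks (infinite recursion) as soon as a third object delegates to derived, which is the kind of fragility the helper libs pay for under the covers.

```javascript
var base = {
  describe: function () { return "base"; }
};

var derived = Object.create(base);
derived.describe = function () {
  // Manually reach "one level up", then force `this` back to the original object
  return Object.getPrototypeOf(this).describe.call(this) + " -> derived";
};

console.log(derived.describe()); // "base -> derived"
```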
Lastly, I think the long and hotly debated topic of class { .. } syntax coming to the language in ES6 represents more of the attempt to cover up what JS actually does with what people wished JS did. These sorts of distractions are not likely to make understanding JS’s actual mechanisms better. Some speculate that it will make JS more approachable to traditional OO devotees, which may be true at first, but I suspect the ultimate result is that they’ll quickly become frustrated about how it doesn’t work as they’d expect.
What’s important to understand is that the new class syntax we’re getting is not introducing radically new behavior or a more classical version of inheritance. It’s wrapping up how JS [[Prototype]] delegation currently works, in a syntax and semantics which come pre-loaded with lots of baggage understanding and expectation, which run quite contradictory to what you’ll really get with JS classes. If you currently do not understand, or don’t like, JS object “inheritance”, the class {..} syntax is pretty unlikely to satisfy you.
Yes, the syntax takes away some of the boilerplate of explicitly adding items to a “constructor” function’s .prototype object, and goodness knows we all will love not having to type the function keyword as many times. Hooray! If you already fully understand the awkward parts of JS “classes”, and you can’t wait for class {..} to sugar up the syntax, I’m sure you’re happy, but I also think you’re probably in the minority. The syntax made far too many compromises just to make it into the language for it to fully please a broad range of totally opposite opinions.
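To show that the class form is syntax over the same wiring, here’s a class next to a rough desugaring (SpeakerFn is just my illustrative name for the equivalent constructor-function form):

```javascript
class Speaker {
  constructor(name) { this.name = name; }
  speak() { return this.name + " speaks"; }
}

// Roughly what it sugars over:
function SpeakerFn(name) { this.name = name; }
SpeakerFn.prototype.speak = function () { return this.name + " speaks"; };

var a = new Speaker("a");
var b = new SpeakerFn("b");

console.log(a.speak());                                      // "a speaks"
console.log(b.speak());                                      // "b speaks"
console.log(Object.getPrototypeOf(a) === Speaker.prototype); // true -- still plain [[Prototype]] delegation
```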
The underlying [[Prototype]] system isn’t changing, and almost none of the difficulties we just outlined are getting measurably any better. The only exception is the addition of the super keyword. That will be a welcome change, I suppose.
Although, as a side note, the engine won’t actually bind super dynamically (at call time) to the appropriate link in the [[Prototype]] chain, but will instead bind it statically (at definition time) based on the owning object of a function reference. This is going to possibly create some weird WTF’s, because the engine is going to have to treat functions that use super specially as they are assigned around to different owning objects. It’s possible (unconfirmed suspicion) that it may not work in all cases as you’d expect if super were instead bound dynamically.
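As ES6 eventually shipped, that static binding is observable: super is fixed by a method’s “home object” at definition time, so moving the function reference onto another object does not re-aim it. A sketch, runnable in modern engines (names are illustrative):

```javascript
const A = { hello() { return "A"; } };
const B = { hello() { return "B"; } };

const obj = {
  __proto__: A,
  hello() { return "hi " + super.hello(); } // super is pinned here, at definition
};

console.log(obj.hello()); // "hi A"

// Move the very same function onto an object whose chain points at B...
const moved = { __proto__: B, hello: obj.hello };
console.log(moved.hello()); // still "hi A" -- super did not re-bind dynamically
```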
We’ve just examined a variety of ways that many JS devs try to layer extra abstractions and concepts on top of JS’s core object mechanism. I argue that this is a mistake that takes us further from the beauty of core JavaScript. Rather than adding complexity to smooth out the rough edges, I think we need to strip things out to get to the good stuff.
In part 3, I will address exactly that, taking JS from the more complex world of classes and inheritance back to the simpler world of objects and delegation links.