I thought it may be time to discuss multi-format module interop. This is a difficult discussion, and most discussions about this tend to reach a stalemate, since you have to go quite far down a road of thought to see where it falls over, so it can be tricky to switch between viewpoints in a single discussion. I'm starting it off entirely from my viewpoint, which is particularly based around the ES6 module loader, so do be wary of my biases too. There is mixed agreement about this stuff, so I'll try to give as balanced a view as I can.
Consider an application that contains both native CommonJS modules and CommonJS modules generated from ES6, where the two need to play well together.
An example scenario is:
es6-main.js
import p from 'cjs';
import { q } from 'es6';
export function r() {}
cjs.js
module.exports = 'cjs';
es6.js
export var q = 'es6';
Here es6.js and es6-main.js need to be transpiled to CommonJS that will work with cjs.js after transpilation.
The output Traceur currently produces is something like this (in a slightly more readable form):
es6.js ->
exports.q = 'es6';
exports.__esModule = true;
es6-main.js ->
var cjsModule = require('cjs');
if (!cjsModule || !cjsModule.__esModule)
cjsModule = { default: cjsModule }; // treat non-ES6 as default-only exports
var p = cjsModule.default;
var es6Module = require('es6');
if (!es6Module || !es6Module.__esModule)
es6Module = { default: es6Module }; // this won't pass for the ES6 module
exports.r = function r() {};
exports.__esModule = true;
Note that this is a case of unlinked interop compilation, where we consider how to compile each file individually, without doing any deeper lookups into the module tree at compile time. If we did look at the tree more closely, we could remove the second conditional, since we would know 'es6' is an ES6 module. It would be possible to write a resolver that deduces this information at compile time while remaining flexible enough to allow modules that don't resolve until runtime.
The basic assumption of unlinked interop is that we are trying to compile so that imports work against other module formats, without having to assume in advance how those imports resolve. The reason I like this scenario is that it maintains module portability. I can publish to npm, and perhaps tomorrow the package behind cjs.js gets updated on npm to be based on ES6 code. Similarly, I can run my code in an ES6 module loader as ES6, and it still behaves like the CommonJS-compiled version did.
If we don't think about the above and continue doing what we are doing currently here (sorry, I didn't want to go into all this before), which is:
import * as p from 'cjs';
import { q } from 'es6';
Then if I try to load the above in an ES6 module loader, it will load cjs.js and create a new module object for it, Module({ default: 'cjs' }). We thus get the wrong thing when loading the ES6 module.
The point is, code written this way does not support non-object CJS exports, and it will not easily run in an ES6 module loader. The reason is that we have to treat CommonJS modules as a default export only in an ES6 module loader, since they can be any object type, while module objects in the loader registry must be plain namespaces.
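To make that constraint concrete, here is a minimal sketch (toNamespace is a hypothetical name, not a loader API) of the only lossless mapping available when the CommonJS export is not a plain object:

```javascript
// An ES6 module namespace must be a plain object of named bindings,
// but module.exports can be any value - a string, a function, a class.
// The only general place such a value can live is under `default`.
function toNamespace(cjsExports) {
  // Namespace objects are immutable in a real loader; freeze to mimic that.
  return Object.freeze({ default: cjsExports });
}

const stringModule = toNamespace('cjs');              // module.exports = 'cjs'
const fnModule = toNamespace(function readFile() {}); // module.exports = fn

console.log(stringModule.default); // 'cjs'
console.log(typeof fnModule.default); // 'function'
```

There is simply no set of named bindings that can represent a bare string or function export, which is why `import * as p from 'cjs'` cannot give you the exports value itself.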
The pain point here for consumers is that they want to be able to write:
import { readFile } from 'fs';
And have that work against existing NodeJS modules. But interop and compiler concerns do trump this, in my opinion. One addition we could make to support it is to extend a CommonJS module by iterating its exported names and decorating the namespace with them: Module(extend(getAllProperties(fs), { default: fs })). This way it may still be possible to import pieces off of a CommonJS object like above. This could potentially be a middle ground and is currently a point of debate. My personal opinion is against this type of work, but a lot of others are for it.
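A rough sketch of what that extend/getAllProperties idea could look like at runtime - cjsToNamespace and fakeFs are hypothetical names for illustration, and a real loader would also need to handle getters, prototypes, and non-enumerable properties:

```javascript
// Copy the own enumerable properties of a CommonJS exports value onto a
// namespace object, alongside a `default` entry holding the original.
function cjsToNamespace(cjsExports) {
  const ns = { default: cjsExports };
  if (cjsExports &&
      (typeof cjsExports === 'object' || typeof cjsExports === 'function')) {
    for (const key of Object.keys(cjsExports)) {
      ns[key] = cjsExports[key];
    }
  }
  return ns;
}

// With this, `import { readFile } from 'fs'` could compile roughly to:
//   const readFile = cjsToNamespace(require('fs')).readFile;
const fakeFs = { readFile() { return 'data'; } };
const ns = cjsToNamespace(fakeFs);

console.log(ns.readFile()); // 'data'
console.log(ns.default === fakeFs); // true
```

The cost of this middle ground is that the copied names are a snapshot, not live bindings, and properties added to the CommonJS object later will not appear on the namespace.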
Note also that the exact rules and choices in this process depend entirely on the linking assumptions of the compiler and the runtime assumptions of the code. My bias is towards unlinked, portable, single-file static compilation as the most general case, which also runs in the dynamic ES6 module loader. But it may be interesting to explore the middle grounds mentioned above, where the linked / unlinked boundary is given separate treatment.
The other extreme is to resolve any imported modules at compile time (CommonJS node lookup / UMD support etc.), and use those resolutions to decide how to treat the imports. If you do want to allow custom resolution at compile time, this is as simple as a normalize hook into the compiler - normalize(path, parentPath) - and is how the ES6 module loader works.
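As a sketch of what such a hook might look like - this is an assumed, deliberately simplified resolver that only handles plain relative segments, not the actual ES6 module loader implementation:

```javascript
// Simplified normalize hook: turn an import specifier into a canonical
// module name, resolved relative to the importing module's path.
function normalize(path, parentPath) {
  if (!path.startsWith('./') && !path.startsWith('../')) {
    return path; // bare specifier: leave for package/registry lookup
  }
  // Resolve relative segments against the parent module's directory.
  const base = parentPath.split('/').slice(0, -1);
  for (const segment of path.split('/')) {
    if (segment === '.') continue;
    if (segment === '..') base.pop();
    else base.push(segment);
  }
  return base.join('/');
}

console.log(normalize('./es6', 'app/es6-main'));       // 'app/es6'
console.log(normalize('../shared/cjs', 'app/sub/main')); // 'app/shared/cjs'
console.log(normalize('cjs', 'app/es6-main'));         // 'cjs'
```

A compiler could call a hook like this for each import and, when the result resolves to a known format, emit linked output (dropping the runtime __esModule checks); unresolved specifiers would fall back to the unlinked interop shown earlier.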
There are lots of long discussions about this stuff, but sometimes it's easier to just go through everything fresh again, so I'm happy to discuss further. In case you are interested: google/traceur-compiler#1388, google/traceur-compiler#1295, esnext/es6-module-transpiler#156, esnext/es6-module-transpiler#86, esnext/es6-module-transpiler#85.