
augmentChunkHash plugin hook #2921

Merged
merged 16 commits on Aug 13, 2019

Conversation

@isidrok (Contributor) commented Jun 10, 2019

This PR contains:

  • bugfix
  • feature
  • refactor
  • documentation
  • other

Are tests included?

  • yes (bugfixes and features will not be merged without tests)
  • no

Breaking Changes?

  • yes (breaking changes will not be merged unless absolutely necessary)
  • no

List any relevant issue numbers:

closes #2739
related #2839

Description

Introduce a new plugin hook, augmentChunkHash, that gives plugins the ability to augment chunk hashes based on implicit dependencies, or on explicit ones that the plugin takes into account in the renderChunk hook when changing the code.

For example:

// rollup-plugin-babel
const path = require('path');
const fs = require('fs');

{
  augmentChunkHash() {
    const pkgPath = path.join(path.dirname(require.resolve('rollup-plugin-babel')), 'package.json');
    const version = require(pkgPath).version;
    const config = fs.readFileSync(options.babelrc, 'utf8');
    return version + config;
  }
}

This would invalidate chunk hashes whenever the plugin is updated or the babel config file changes.

Alternatively plugins can decide to augment the hash only of certain chunks:

// rollup-plugin-whatever-css
{
  augmentChunkHash(chunk) {
    const sheets = chunk.imports.filter(isCss);
    if (!sheets.length) {
      return;
    }
    return Promise.all(sheets.map(sheet => fs.promises.readFile(sheet, 'utf8')))
      .then(contents => contents.join(':'));
  }
}
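
Conceptually, whatever the hook returns simply becomes additional input to the content hash. A minimal sketch of that idea (computeChunkHash and its arguments are hypothetical, not Rollup's actual internals):

// Illustrative sketch only, not Rollup's actual implementation.
import { createHash } from 'crypto';

function computeChunkHash(renderedCode: string, augmentations: string[]): string {
  const hash = createHash('sha256');
  hash.update(renderedCode);
  // Every string returned from an augmentChunkHash hook is mixed into the input,
  // so a new plugin version or a changed config file yields a different [hash].
  for (const value of augmentations) hash.update(value);
  return hash.digest('hex').slice(0, 8);
}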

TODO

  • Don't use hookSeqSync in favour of hookReduceValue, or alternatively a new hookReduceValueSync: I think this hook should definitely be async, but introducing async code into getRenderedHash requires a major refactor (which could be mitigated by using async/await).

  • Pass chunk information to the hook, at least its dependencies since those would be the ones impacting the final hash apart from the plugin options and implicit dependencies.

  • Write docs once the previous points are solved.

Hook to extend chunk hashes, callable from within plugins, in order to take into account implicit dependencies such as plugin options or version when creating the chunk hash

@isidrok (Contributor, Author) commented Jun 11, 2019

One possible way to make this hook async and parallel without major changes would be to call it in the generate phase and pass its value to assignChunkIds.

@lukastaegert (Member)

Thanks for your work on this, hope to find a little more time to look into your suggestions in detail.

> One possible way to make this hook async and parallel without major changes would be to call it in the generate phase and pass its value to assignChunkIds.

It should definitely be called in the generate phase and not the build phase, as changes made in renderChunk can vary between outputs (e.g. different output formats).

@isidrok (Contributor, Author) commented Jun 12, 2019

> Thanks for your work on this, hope to find a little more time to look into your suggestions in detail.

We can discuss the API whenever you have time, and I'll update the docs once it's reviewed.

Also, I will need some help with the failing tests since I can't relate them to the PR changes.

@@ -1328,4 +1328,157 @@ module.exports = input;
]);
});
});
it('supports augmentChunkHash hook', () => {

Member:

I think it might be interesting to convert some of those tests to file-hashes tests, i.e. tests/file-hashes/samples/.... These are specifically designed to compare two different setups and throw when there are files with the same hashes but different content. Also, those tests are closer to real world scenarios.
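
A rough sketch of what such a sample could look like (the directory layout and the options1/options2 fields follow my reading of the existing file-hashes samples and may differ):

// tests/file-hashes/samples/augment-chunk-hash/_config.js (rough sketch; the
// sample layout and field names here are assumptions, not the actual test)
const createPlugin = banner => ({
  name: 'test-plugin',
  renderChunk: code => `/* ${banner} */\n${code}`,
  // Without this, both setups could emit the same [hash] for different content.
  augmentChunkHash: () => banner
});

module.exports = {
  description: 'augmentChunkHash keeps the hashes of differing outputs apart',
  options1: { input: 'main.js', plugins: [createPlugin('setup A')] },
  options2: { input: 'main.js', plugins: [createPlugin('setup B')] }
};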

Contributor Author:

done

})
)
.then(output => {
assert.equal(augmentChunkHashCalls, 1);

Member:

I do not think this is too important to test, as the call is implicitly tested by any test that also checks that the hash changed, which is what is actually important to us.

Contributor Author:

In the end I left this one inside the hook tests and created a file-hashes test that covers the rest of them.

}
return Promise.all([bundleWithoutAugment(), bundleWithAugment()]).then(([base, augmented]) => {
assert.notEqual(base, augmented);
});

Member:

This test is a prime candidate for being converted to a file-hashes test.

]);
});
const facadeModule = chunk.facadeModule;
const chunkInfo = {

Member:

I do not like calculating this information twice. Besides performance considerations, this information will need to be maintained and synced in more than one place which is always problematic.

One way forward might be to first create a Map of Chunk -> PreRenderedChunk info and use this map in the second usage.
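
A minimal sketch of that suggestion (all names are placeholders, not the final implementation): compute the info once per chunk and reuse it wherever it is needed a second time.

// Generic per-chunk cache; createInfo stands in for the existing, potentially
// expensive calculation and runs at most once per chunk.
function memoizePerChunk<ChunkType extends object, Info>(
  createInfo: (chunk: ChunkType) => Info
): (chunk: ChunkType) => Info {
  const cache = new Map<ChunkType, Info>();
  return chunk => {
    let info = cache.get(chunk);
    if (info === undefined) {
      info = createInfo(chunk);
      cache.set(chunk, info);
    }
    return info;
  };
}

// e.g. const getPreRenderedChunkInfo = memoizePerChunk(createPreRenderedChunkInfo);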

Contributor Author:

Sure, could also do it in an array and just match positions.

Member:

I think that would be fine as well.

});
})
).then(() => {});
});

Member:

I am not sure what exactly causes the test failures, e.g. misc/'handles different import paths for different outputs', but it is definitely caused by some of the changes in this file. Maybe it becomes clearer once the code is deduplicated. My guess is that there is something that has a side effect that does not look side-effectful here.

@isidrok (Contributor, Author) commented Jul 18, 2019:

I have found what's causing the errors, and it's pretty strange:

return graph.pluginDriver
  .hookParallel('renderStart', [])
  .then(() => createAddons(graph, outputOptions))
  .then(addons => {
    // ...
    return Promise.all(
      chunks.map(chunk => {
        const outputChunk = outputBundle[chunk.id] as OutputChunk;
        return chunk.render(outputOptions, addons, outputChunk).then(rendered => {
          // ...  
          return graph.pluginDriver.hookParallel('ongenerate', [
            { bundle: outputChunk, ...outputOptions },
           outputChunk
          ]);
        });
      })
    ).then(() => {});
})

Works fine and tests pass, whereas if I move the Promise.all block into another promise:

return graph.pluginDriver
  .hookParallel('renderStart', [])
  .then(() => createAddons(graph, outputOptions))
  .then(addons => {
    // ...
    const augmentChunkHashes = Promise.resolve();
    return augmentChunkHashes.then(() => {
      // Moving this Promise.all inside a then breaks test
      return Promise.all(
        chunks.map(chunk => {
          const outputChunk = outputBundle[chunk.id] as OutputChunk;
          return chunk.render(outputOptions, addons, outputChunk).then(rendered => {
            // ...  
            return graph.pluginDriver.hookParallel('ongenerate', [
              { bundle: outputChunk, ...outputOptions },
              outputChunk
            ]);
          });
        })
      ).then(() => {});
   })
})

Some tests break but if I understand correctly:

Promise.all(promises)
  .then(()=>{}) 
=== 
Promise.resolve()
  .then(()=>Promise.all(promises).then(()=>{}))

Any idea?

Member:

There is a slight difference in that errors thrown synchronously when generating the promises will be caught and reject the Promise in the second case while in the first case, they will be thrown synchronously. But I do not think this is the issue here.

I cannot seem to confirm your finding; I will do some digging myself.
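
A tiny standalone illustration of the difference described above (not PR code): when the callback that creates the promises throws, the first form throws synchronously while the second turns the error into a rejection.

const makePromises = (): Promise<void>[] => {
  // thrown while *creating* the promises, before any of them exists
  throw new Error('boom');
};

// Form 1: the error escapes synchronously, before Promise.all ever runs.
try {
  Promise.all(makePromises()).then(() => {});
} catch (err) {
  console.log('form 1: thrown synchronously');
}

// Form 2: the callback runs inside then(), so the error becomes a rejection.
Promise.resolve()
  .then(() => Promise.all(makePromises()).then(() => {}))
  .catch(() => console.log('form 2: rejected asynchronously'));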

Member:

Also in the second case, execution will be one micro-tick slower, which would point to a possible race condition. But I really hope this is not the issue.

Member:

Ok, I understand the issue now. It could be argued that this is a bug in Rollup but it is actually only a bug with your modification.
The problem is that the core part of chunk rendering is stateful, i.e. you cannot just interleave the renderings of two different chunks as they leave information about the current rendering process in variables etc. I know this is not ideal, but for the time being I fear we will need to live with this.

The state only needs to be preserved between chunk.preRender and chunk.render. Previously, this was guaranteed but with your modification, there is now an asynchronous micro-tick between those two calls where another output rendering can sneak in. Which is a problem as the CLI does indeed use Promise.all to render outputs.
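
A simplified illustration of that interleaving (hypothetical stand-ins for preRender/render, not Rollup code): with an extra micro-tick between the two calls, Promise.all over several outputs lets another output's preRender run in between.

let currentRenderState: string | null = null;

function preRender(output: string) {
  currentRenderState = output; // shared, stateful
}

function render(output: string) {
  if (currentRenderState !== output) {
    throw new Error(`render of ${output} sees state of ${currentRenderState}`);
  }
}

async function generate(output: string) {
  preRender(output);
  await Promise.resolve(); // the extra micro-tick introduced by the async hook
  render(output); // another output's preRender may have run in the meantime
}

// preRender('esm') now runs before render('cjs'), so the first render throws.
Promise.all(['cjs', 'esm'].map(generate)).catch(err => console.log(err.message));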

Member:

Honestly not sure what a good solution is except forbidding augmentChunkHash to be asynchronous for now.

@isidrok (Contributor, Author) commented Jul 18, 2019:

Thanks for looking into this, I was losing my mind. So if I get it right, the correct order of events would be to preRender and then render all chunks for each output, with the constraint that this sequence has to be completed before another output goes in. Moving the rendering to another micro-task makes it possible, when several outputs are generated, for the sequence to go preRender-ChunkA-OutA, preRender-ChunkA-OutB, render-ChunkA-OutA or something along those lines, breaking the sequence since the outputs are generated inside a Promise.all in parallel.

Another possible solution could be running this hook before generate, but I would rather have the information generated during preRender and make this synchronous, with a new hookReduceValueSync hook for example.
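
A rough sketch of what such a synchronous reducing hook could look like (the signature is an assumption, not the final pluginDriver API): every plugin's augmentChunkHash result is folded into a single string that is then mixed into the hash.

type AugmentChunkHashHook = (chunkInfo: object) => string | void;

function hookReduceValueSync(
  plugins: { name?: string; augmentChunkHash?: AugmentChunkHashHook }[],
  initialValue: string,
  chunkInfo: object
): string {
  return plugins.reduce((value, plugin) => {
    // Skip plugins that do not implement the hook; ignore non-string results.
    const result = plugin.augmentChunkHash ? plugin.augmentChunkHash(chunkInfo) : undefined;
    return typeof result === 'string' ? value + result : value;
  }, initialValue);
}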

@shellscape (Contributor):

@isidrok @lukastaegert what do we need to get some traction on this PR?

codecov bot commented Aug 12, 2019

Codecov Report

Merging #2921 into master will increase coverage by 0.03%.
The diff coverage is 100%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #2921      +/-   ##
==========================================
+ Coverage   88.69%   88.73%   +0.03%     
==========================================
  Files         165      165              
  Lines        5716     5735      +19     
  Branches     1744     1748       +4     
==========================================
+ Hits         5070     5089      +19     
  Misses        388      388              
  Partials      258      258
Impacted Files Coverage Δ
src/utils/pluginDriver.ts 87.5% <100%> (+0.33%) ⬆️
src/Chunk.ts 92.63% <100%> (+0.2%) ⬆️

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 04b7d52...74f8305. Read the comment docs.

@isidrok isidrok marked this pull request as ready for review August 12, 2019 16:03
@shellscape (Contributor):

@isidrok thanks so much for jumping on this one.

function assignChunksToBundle(
	chunks: Chunk[],
	outputBundle: OutputBundleWithPlaceholders
): OutputBundle {
	for (let i = 0; i < chunks.length; i++) {
		const chunk = chunks[i];
		const facadeModule = chunk.facadeModule;

		outputBundle[chunk.id as string] = {

Contributor Author:

In the end I ended up duplicating this info: a great part of it must be recalculated anyway because internal ids change between augmentChunkHashes and assignChunksToBundle, and I thought it would be messier to keep track of what should and shouldn't be updated.

@lukastaegert (Member) left a comment:

Thanks for picking this up again! I left some small comments, the main question being if it wouldn't make more sense to calculate the hash augmentation only on demand now that the hook is sync. Thus if [hash] is not used, the new logic would not need to run. Also some other notes around types.
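
A small sketch of the on-demand idea (hypothetical helper, not the final code): run the augmentChunkHash hooks lazily, the first time a [hash] placeholder actually has to be substituted, and cache the result.

// Hypothetical helper: defer the hook calls until a [hash] is actually needed.
function createLazyHashAugmentation(runAugmentChunkHashHooks: () => string): () => string {
  let cached: string | undefined;
  return () => {
    if (cached === undefined) cached = runAugmentChunkHashHooks();
    return cached;
  };
}

// Invoked only while substituting [hash] in a file name pattern, so configurations
// whose patterns contain no [hash] never call the plugin hooks at all.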

bin/rollup Outdated
@@ -0,0 +1,1566 @@
#!/usr/bin/env node

Member:

This file was probably checked in accidentally and should be removed. The reason this happened is probably that the bin file has been moved and the .gitignore was changed accordingly.

bin/rollup.map Outdated
@@ -0,0 +1 @@
{"version":3,"file":"rollup","sources":["../node_modules/minimist/index.js","../src/utils/mergeOptions.ts","../node_modules/require-relative/index.js","../src/utils/path.ts","../src/utils/relativeId.ts","../node_modules/turbocolor/index.js","src/logging.ts","src/run/batchWarnings.ts","../node_modules/parse-ms/index.js","../node_modules/pretty-ms/index.js","src/sourceMappingUrl.ts","../node_modules/pretty-bytes/index.js","src/run/timings.ts","src/run/build.ts","src/run/loadConfigFile.ts","../node_modules/time-zone/index.js","../node_modules/date-time/index.js","../node_modules/signal-exit/signals.js","../node_modules/signal-exit/index.js","src/run/resetScreen.ts","src/run/watch.ts","src/run/index.ts","src/index.ts"],"sourcesContent":["module.exports = function (args, opts) {\n if (!opts) opts = {};\n \n var flags = { bools : {}, strings : {}, unknownFn: null };\n\n if (typeof opts['unknown'] === 'function') {\n flags.unknownFn = opts['unknown'];\n }\n\n if (typeof opts['boolean'] === 'boolean' && opts['boolean']) {\n flags.allBools = true;\n } else {\n [].concat(opts['boolean']).filter(Boolean).forEach(function (key) {\n flags.bools[key] = true;\n });\n }\n \n var aliases = {};\n Object.keys(opts.alias || {}).forEach(function (key) {\n aliases[key] = [].concat(opts.alias[key]);\n aliases[key].forEach(function (x) {\n aliases[x] = [key].concat(aliases[key].filter(function (y) {\n return x !== y;\n }));\n });\n });\n\n [].concat(opts.string).filter(Boolean).forEach(function (key) {\n flags.strings[key] = true;\n if (aliases[key]) {\n flags.strings[aliases[key]] = true;\n }\n });\n\n var defaults = opts['default'] || {};\n \n var argv = { _ : [] };\n Object.keys(flags.bools).forEach(function (key) {\n setArg(key, defaults[key] === undefined ? false : defaults[key]);\n });\n \n var notFlags = [];\n\n if (args.indexOf('--') !== -1) {\n notFlags = args.slice(args.indexOf('--')+1);\n args = args.slice(0, args.indexOf('--'));\n }\n\n function argDefined(key, arg) {\n return (flags.allBools && /^--[^=]+$/.test(arg)) ||\n flags.strings[key] || flags.bools[key] || aliases[key];\n }\n\n function setArg (key, val, arg) {\n if (arg && flags.unknownFn && !argDefined(key, arg)) {\n if (flags.unknownFn(arg) === false) return;\n }\n\n var value = !flags.strings[key] && isNumber(val)\n ? Number(val) : val\n ;\n setKey(argv, key.split('.'), value);\n \n (aliases[key] || []).forEach(function (x) {\n setKey(argv, x.split('.'), value);\n });\n }\n\n function setKey (obj, keys, value) {\n var o = obj;\n keys.slice(0,-1).forEach(function (key) {\n if (o[key] === undefined) o[key] = {};\n o = o[key];\n });\n\n var key = keys[keys.length - 1];\n if (o[key] === undefined || flags.bools[key] || typeof o[key] === 'boolean') {\n o[key] = value;\n }\n else if (Array.isArray(o[key])) {\n o[key].push(value);\n }\n else {\n o[key] = [ o[key], value ];\n }\n }\n \n function aliasIsBoolean(key) {\n return aliases[key].some(function (x) {\n return flags.bools[x];\n });\n }\n\n for (var i = 0; i < args.length; i++) {\n var arg = args[i];\n \n if (/^--.+=/.test(arg)) {\n // Using [\\s\\S] instead of . because js doesn't support the\n // 'dotall' regex modifier. 
See:\n // http://stackoverflow.com/a/1068308/13216\n var m = arg.match(/^--([^=]+)=([\\s\\S]*)$/);\n var key = m[1];\n var value = m[2];\n if (flags.bools[key]) {\n value = value !== 'false';\n }\n setArg(key, value, arg);\n }\n else if (/^--no-.+/.test(arg)) {\n var key = arg.match(/^--no-(.+)/)[1];\n setArg(key, false, arg);\n }\n else if (/^--.+/.test(arg)) {\n var key = arg.match(/^--(.+)/)[1];\n var next = args[i + 1];\n if (next !== undefined && !/^-/.test(next)\n && !flags.bools[key]\n && !flags.allBools\n && (aliases[key] ? !aliasIsBoolean(key) : true)) {\n setArg(key, next, arg);\n i++;\n }\n else if (/^(true|false)$/.test(next)) {\n setArg(key, next === 'true', arg);\n i++;\n }\n else {\n setArg(key, flags.strings[key] ? '' : true, arg);\n }\n }\n else if (/^-[^-]+/.test(arg)) {\n var letters = arg.slice(1,-1).split('');\n \n var broken = false;\n for (var j = 0; j < letters.length; j++) {\n var next = arg.slice(j+2);\n \n if (next === '-') {\n setArg(letters[j], next, arg)\n continue;\n }\n \n if (/[A-Za-z]/.test(letters[j]) && /=/.test(next)) {\n setArg(letters[j], next.split('=')[1], arg);\n broken = true;\n break;\n }\n \n if (/[A-Za-z]/.test(letters[j])\n && /-?\\d+(\\.\\d*)?(e-?\\d+)?$/.test(next)) {\n setArg(letters[j], next, arg);\n broken = true;\n break;\n }\n \n if (letters[j+1] && letters[j+1].match(/\\W/)) {\n setArg(letters[j], arg.slice(j+2), arg);\n broken = true;\n break;\n }\n else {\n setArg(letters[j], flags.strings[letters[j]] ? '' : true, arg);\n }\n }\n \n var key = arg.slice(-1)[0];\n if (!broken && key !== '-') {\n if (args[i+1] && !/^(-|--)[^-]/.test(args[i+1])\n && !flags.bools[key]\n && (aliases[key] ? !aliasIsBoolean(key) : true)) {\n setArg(key, args[i+1], arg);\n i++;\n }\n else if (args[i+1] && /true|false/.test(args[i+1])) {\n setArg(key, args[i+1] === 'true', arg);\n i++;\n }\n else {\n setArg(key, flags.strings[key] ? '' : true, arg);\n }\n }\n }\n else {\n if (!flags.unknownFn || flags.unknownFn(arg) !== false) {\n argv._.push(\n flags.strings['_'] || !isNumber(arg) ? arg : Number(arg)\n );\n }\n if (opts.stopEarly) {\n argv._.push.apply(argv._, args.slice(i + 1));\n break;\n }\n }\n }\n \n Object.keys(defaults).forEach(function (key) {\n if (!hasKey(argv, key.split('.'))) {\n setKey(argv, key.split('.'), defaults[key]);\n \n (aliases[key] || []).forEach(function (x) {\n setKey(argv, x.split('.'), defaults[key]);\n });\n }\n });\n \n if (opts['--']) {\n argv['--'] = new Array();\n notFlags.forEach(function(key) {\n argv['--'].push(key);\n });\n }\n else {\n notFlags.forEach(function(key) {\n argv._.push(key);\n });\n }\n\n return argv;\n};\n\nfunction hasKey (obj, keys) {\n var o = obj;\n keys.slice(0,-1).forEach(function (key) {\n o = (o[key] || {});\n });\n\n var key = keys[keys.length - 1];\n return key in o;\n}\n\nfunction isNumber (x) {\n if (typeof x === 'number') return true;\n if (/^0x[0-9a-f]+$/i.test(x)) return true;\n return /^[-+]?(?:\\d+(?:\\.\\d*)?|\\.\\d+)(e[-+]?\\d+)?$/.test(x);\n}\n\n","import {\n\tInputOptions,\n\tOutputOptions,\n\tWarningHandler,\n\tWarningHandlerWithDefault\n} from '../rollup/types';\n\nexport interface GenericConfigObject {\n\t[key: string]: unknown;\n}\n\nexport interface CommandConfigObject {\n\texternal: string[];\n\tglobals: { [id: string]: string } | undefined;\n\t[key: string]: unknown;\n}\n\nconst createGetOption = (config: GenericConfigObject, command: GenericConfigObject) => (\n\tname: string,\n\tdefaultValue?: unknown\n): any =>\n\tcommand[name] !== undefined\n\t\t? 
command[name]\n\t\t: config[name] !== undefined\n\t\t? config[name]\n\t\t: defaultValue;\n\nconst normalizeObjectOptionValue = (optionValue: any) => {\n\tif (!optionValue) {\n\t\treturn optionValue;\n\t}\n\tif (typeof optionValue !== 'object') {\n\t\treturn {};\n\t}\n\treturn optionValue;\n};\n\nconst getObjectOption = (\n\tconfig: GenericConfigObject,\n\tcommand: GenericConfigObject,\n\tname: string\n) => {\n\tconst commandOption = normalizeObjectOptionValue(command[name]);\n\tconst configOption = normalizeObjectOptionValue(config[name]);\n\tif (commandOption !== undefined) {\n\t\treturn commandOption && configOption ? { ...configOption, ...commandOption } : commandOption;\n\t}\n\treturn configOption;\n};\n\nconst defaultOnWarn: WarningHandler = warning => {\n\tif (typeof warning === 'string') {\n\t\tconsole.warn(warning);\n\t} else {\n\t\tconsole.warn(warning.message);\n\t}\n};\n\nconst getOnWarn = (\n\tconfig: GenericConfigObject,\n\tdefaultOnWarnHandler: WarningHandler = defaultOnWarn\n): WarningHandler =>\n\tconfig.onwarn\n\t\t? warning => (config.onwarn as WarningHandlerWithDefault)(warning, defaultOnWarnHandler)\n\t\t: defaultOnWarnHandler;\n\nconst getExternal = (config: GenericConfigObject, command: CommandConfigObject) => {\n\tconst configExternal = config.external;\n\treturn typeof configExternal === 'function'\n\t\t? (id: string, ...rest: string[]) =>\n\t\t\t\tconfigExternal(id, ...rest) || command.external.indexOf(id) !== -1\n\t\t: (typeof config.external === 'string'\n\t\t\t\t? [configExternal]\n\t\t\t\t: Array.isArray(configExternal)\n\t\t\t\t? configExternal\n\t\t\t\t: []\n\t\t ).concat(command.external);\n};\n\nexport const commandAliases: { [key: string]: string } = {\n\tc: 'config',\n\td: 'dir',\n\te: 'external',\n\tf: 'format',\n\tg: 'globals',\n\th: 'help',\n\ti: 'input',\n\tm: 'sourcemap',\n\tn: 'name',\n\to: 'file',\n\tv: 'version',\n\tw: 'watch'\n};\n\nexport default function mergeOptions({\n\tconfig = {},\n\tcommand: rawCommandOptions = {},\n\tdefaultOnWarnHandler\n}: {\n\tcommand?: GenericConfigObject;\n\tconfig: GenericConfigObject;\n\tdefaultOnWarnHandler?: WarningHandler;\n}): {\n\tinputOptions: InputOptions;\n\toptionError: string | null;\n\toutputOptions: any;\n} {\n\tconst command = getCommandOptions(rawCommandOptions);\n\tconst inputOptions = getInputOptions(config, command, defaultOnWarnHandler as WarningHandler);\n\n\tif (command.output) {\n\t\tObject.assign(command, command.output);\n\t}\n\n\tconst output = config.output;\n\tconst normalizedOutputOptions = Array.isArray(output) ? output : output ? 
[output] : [];\n\tif (normalizedOutputOptions.length === 0) normalizedOutputOptions.push({});\n\tconst outputOptions = normalizedOutputOptions.map(singleOutputOptions =>\n\t\tgetOutputOptions(singleOutputOptions, command)\n\t);\n\n\tconst unknownOptionErrors: string[] = [];\n\tconst validInputOptions = Object.keys(inputOptions);\n\taddUnknownOptionErrors(\n\t\tunknownOptionErrors,\n\t\tObject.keys(config),\n\t\tvalidInputOptions,\n\t\t'input option',\n\t\t/^output$/\n\t);\n\n\tconst validOutputOptions = Object.keys(outputOptions[0]);\n\taddUnknownOptionErrors(\n\t\tunknownOptionErrors,\n\t\toutputOptions.reduce<string[]>((allKeys, options) => allKeys.concat(Object.keys(options)), []),\n\t\tvalidOutputOptions,\n\t\t'output option'\n\t);\n\n\tconst validCliOutputOptions = validOutputOptions.filter(\n\t\toption => option !== 'sourcemapPathTransform'\n\t);\n\taddUnknownOptionErrors(\n\t\tunknownOptionErrors,\n\t\tObject.keys(command),\n\t\tvalidInputOptions.concat(\n\t\t\tvalidCliOutputOptions,\n\t\t\tObject.keys(commandAliases),\n\t\t\t'config',\n\t\t\t'environment',\n\t\t\t'silent'\n\t\t),\n\t\t'CLI flag',\n\t\t/^_|output|(config.*)$/\n\t);\n\n\treturn {\n\t\tinputOptions,\n\t\toptionError: unknownOptionErrors.length > 0 ? unknownOptionErrors.join('\\n') : null,\n\t\toutputOptions\n\t};\n}\n\nfunction addUnknownOptionErrors(\n\terrors: string[],\n\toptions: string[],\n\tvalidOptions: string[],\n\toptionType: string,\n\tignoredKeys: RegExp = /$./\n) {\n\tconst unknownOptions = options.filter(\n\t\tkey => validOptions.indexOf(key) === -1 && !ignoredKeys.test(key)\n\t);\n\tif (unknownOptions.length > 0)\n\t\terrors.push(\n\t\t\t`Unknown ${optionType}: ${unknownOptions.join(\n\t\t\t\t', '\n\t\t\t)}. Allowed options: ${validOptions.sort().join(', ')}`\n\t\t);\n}\n\nfunction getCommandOptions(rawCommandOptions: GenericConfigObject): CommandConfigObject {\n\tconst external =\n\t\trawCommandOptions.external && typeof rawCommandOptions.external === 'string'\n\t\t\t? rawCommandOptions.external.split(',')\n\t\t\t: [];\n\treturn {\n\t\t...rawCommandOptions,\n\t\texternal,\n\t\tglobals:\n\t\t\ttypeof rawCommandOptions.globals === 'string'\n\t\t\t\t? 
rawCommandOptions.globals.split(',').reduce((globals, globalDefinition) => {\n\t\t\t\t\t\tconst [id, variableName] = globalDefinition.split(':');\n\t\t\t\t\t\tglobals[id] = variableName;\n\t\t\t\t\t\tif (external.indexOf(id) === -1) {\n\t\t\t\t\t\t\texternal.push(id);\n\t\t\t\t\t\t}\n\t\t\t\t\t\treturn globals;\n\t\t\t\t }, Object.create(null))\n\t\t\t\t: undefined\n\t};\n}\n\nfunction getInputOptions(\n\tconfig: GenericConfigObject,\n\tcommand: CommandConfigObject = { external: [], globals: undefined },\n\tdefaultOnWarnHandler: WarningHandler\n): InputOptions {\n\tconst getOption = createGetOption(config, command);\n\n\tconst inputOptions: InputOptions = {\n\t\tacorn: config.acorn,\n\t\tacornInjectPlugins: config.acornInjectPlugins as any,\n\t\tcache: getOption('cache'),\n\t\tchunkGroupingSize: getOption('chunkGroupingSize', 5000),\n\t\tcontext: config.context as any,\n\t\texperimentalCacheExpiry: getOption('experimentalCacheExpiry', 10),\n\t\texperimentalOptimizeChunks: getOption('experimentalOptimizeChunks'),\n\t\texperimentalTopLevelAwait: getOption('experimentalTopLevelAwait'),\n\t\texternal: getExternal(config, command) as any,\n\t\tinlineDynamicImports: getOption('inlineDynamicImports', false),\n\t\tinput: getOption('input', []),\n\t\tmanualChunks: getOption('manualChunks'),\n\t\tmoduleContext: config.moduleContext as any,\n\t\tonwarn: getOnWarn(config, defaultOnWarnHandler),\n\t\tperf: getOption('perf', false),\n\t\tplugins: config.plugins as any,\n\t\tpreserveModules: getOption('preserveModules'),\n\t\tpreserveSymlinks: getOption('preserveSymlinks'),\n\t\tshimMissingExports: getOption('shimMissingExports'),\n\t\tstrictDeprecations: getOption('strictDeprecations', false),\n\t\ttreeshake: getObjectOption(config, command, 'treeshake'),\n\t\twatch: config.watch as any\n\t};\n\n\t// support rollup({ cache: prevBuildObject })\n\tif (inputOptions.cache && (inputOptions.cache as any).cache)\n\t\tinputOptions.cache = (inputOptions.cache as any).cache;\n\n\treturn inputOptions;\n}\n\nfunction getOutputOptions(\n\tconfig: GenericConfigObject,\n\tcommand: GenericConfigObject = {}\n): OutputOptions {\n\tconst getOption = createGetOption(config, command);\n\tlet format = getOption('format');\n\n\t// Handle format aliases\n\tswitch (format) {\n\t\tcase 'esm':\n\t\tcase 'module':\n\t\t\tformat = 'es';\n\t\t\tbreak;\n\t\tcase 'commonjs':\n\t\t\tformat = 'cjs';\n\t}\n\n\treturn {\n\t\tamd: { ...config.amd, ...command.amd } as any,\n\t\tassetFileNames: getOption('assetFileNames'),\n\t\tbanner: getOption('banner'),\n\t\tchunkFileNames: getOption('chunkFileNames'),\n\t\tcompact: getOption('compact', false),\n\t\tdir: getOption('dir'),\n\t\tdynamicImportFunction: getOption('dynamicImportFunction'),\n\t\tentryFileNames: getOption('entryFileNames'),\n\t\tesModule: getOption('esModule', true),\n\t\texports: getOption('exports'),\n\t\textend: getOption('extend'),\n\t\tfile: getOption('file'),\n\t\tfooter: getOption('footer'),\n\t\tformat: format === 'esm' ? 
'es' : format,\n\t\tfreeze: getOption('freeze', true),\n\t\tglobals: getOption('globals'),\n\t\tindent: getOption('indent', true),\n\t\tinterop: getOption('interop', true),\n\t\tintro: getOption('intro'),\n\t\tname: getOption('name'),\n\t\tnamespaceToStringTag: getOption('namespaceToStringTag', false),\n\t\tnoConflict: getOption('noConflict'),\n\t\toutro: getOption('outro'),\n\t\tpaths: getOption('paths'),\n\t\tpreferConst: getOption('preferConst'),\n\t\tsourcemap: getOption('sourcemap'),\n\t\tsourcemapExcludeSources: getOption('sourcemapExcludeSources'),\n\t\tsourcemapFile: getOption('sourcemapFile'),\n\t\tsourcemapPathTransform: getOption('sourcemapPathTransform'),\n\t\tstrict: getOption('strict', true)\n\t};\n}\n","/*\nrelative require\n*/'use strict';\n\nvar path = require('path');\nvar Module = require('module');\n\nvar modules = {};\n\nvar getModule = function(dir) {\n var rootPath = dir ? path.resolve(dir) : process.cwd();\n var rootName = path.join(rootPath, '@root');\n var root = modules[rootName];\n if (!root) {\n root = new Module(rootName);\n root.filename = rootName;\n root.paths = Module._nodeModulePaths(rootPath);\n modules[rootName] = root;\n }\n return root;\n};\n\nvar requireRelative = function(requested, relativeTo) {\n var root = getModule(relativeTo);\n return root.require(requested);\n};\n\nrequireRelative.resolve = function(requested, relativeTo) {\n var root = getModule(relativeTo);\n return Module._resolveFilename(requested, root);\n};\n\nmodule.exports = requireRelative;\n","const absolutePath = /^(?:\\/|(?:[A-Za-z]:)?[\\\\|/])/;\nconst relativePath = /^\\.?\\.\\//;\n\nexport function isAbsolute(path: string) {\n\treturn absolutePath.test(path);\n}\n\nexport function isRelative(path: string) {\n\treturn relativePath.test(path);\n}\n\nexport function normalize(path: string) {\n\tif (path.indexOf('\\\\') == -1) return path;\n\treturn path.replace(/\\\\/g, '/');\n}\n\nexport { basename, dirname, extname, relative, resolve } from 'path';\n","import { basename, extname, isAbsolute, relative } from './path';\n\nexport function getAliasName(id: string) {\n\tconst base = basename(id);\n\treturn base.substr(0, base.length - extname(id).length);\n}\n\nexport default function relativeId(id: string) {\n\tif (typeof process === 'undefined' || !isAbsolute(id)) return id;\n\treturn relative(process.cwd(), id);\n}\n\nexport function isPlainName(name: string) {\n\t// not starting with \"./\", \"/\". \"../\"\n\treturn !(\n\t\tname[0] === '/' ||\n\t\t(name[1] === '.' && (name[2] === '/' || (name[2] === '.' && name[3] === '/')))\n\t);\n}\n","\"use strict\"\n\nconst tc = {\n enabled:\n process.env.FORCE_COLOR ||\n process.platform === \"win32\" ||\n (process.stdout.isTTY && process.env.TERM && process.env.TERM !== \"dumb\")\n}\nconst Styles = (tc.Styles = {})\nconst defineProp = Object.defineProperty\n\nconst init = (style, open, close, re) => {\n let i,\n len = 1,\n seq = [(Styles[style] = { open, close, re })]\n\n const fn = s => {\n if (tc.enabled) {\n for (i = 0, s += \"\"; i < len; i++) {\n style = seq[i]\n s =\n (open = style.open) +\n (~s.indexOf((close = style.close), 4) // skip first \\x1b[\n ? 
s.replace(style.re, open)\n : s) +\n close\n }\n len = 1\n }\n return s\n }\n\n defineProp(tc, style, {\n get: () => {\n for (let k in Styles)\n defineProp(fn, k, {\n get: () => ((seq[len++] = Styles[k]), fn)\n })\n delete tc[style]\n return (tc[style] = fn)\n },\n configurable: true\n })\n}\n\ninit(\"reset\", \"\\x1b[0m\", \"\\x1b[0m\", /\\x1b\\[0m/g)\ninit(\"bold\", \"\\x1b[1m\", \"\\x1b[22m\", /\\x1b\\[22m/g)\ninit(\"dim\", \"\\x1b[2m\", \"\\x1b[22m\", /\\x1b\\[22m/g)\ninit(\"italic\", \"\\x1b[3m\", \"\\x1b[23m\", /\\x1b\\[23m/g)\ninit(\"underline\", \"\\x1b[4m\", \"\\x1b[24m\", /\\x1b\\[24m/g)\ninit(\"inverse\", \"\\x1b[7m\", \"\\x1b[27m\", /\\x1b\\[27m/g)\ninit(\"hidden\", \"\\x1b[8m\", \"\\x1b[28m\", /\\x1b\\[28m/g)\ninit(\"strikethrough\", \"\\x1b[9m\", \"\\x1b[29m\", /\\x1b\\[29m/g)\ninit(\"black\", \"\\x1b[30m\", \"\\x1b[39m\", /\\x1b\\[39m/g)\ninit(\"red\", \"\\x1b[31m\", \"\\x1b[39m\", /\\x1b\\[39m/g)\ninit(\"green\", \"\\x1b[32m\", \"\\x1b[39m\", /\\x1b\\[39m/g)\ninit(\"yellow\", \"\\x1b[33m\", \"\\x1b[39m\", /\\x1b\\[39m/g)\ninit(\"blue\", \"\\x1b[34m\", \"\\x1b[39m\", /\\x1b\\[39m/g)\ninit(\"magenta\", \"\\x1b[35m\", \"\\x1b[39m\", /\\x1b\\[39m/g)\ninit(\"cyan\", \"\\x1b[36m\", \"\\x1b[39m\", /\\x1b\\[39m/g)\ninit(\"white\", \"\\x1b[37m\", \"\\x1b[39m\", /\\x1b\\[39m/g)\ninit(\"gray\", \"\\x1b[90m\", \"\\x1b[39m\", /\\x1b\\[39m/g)\ninit(\"bgBlack\", \"\\x1b[40m\", \"\\x1b[49m\", /\\x1b\\[49m/g)\ninit(\"bgRed\", \"\\x1b[41m\", \"\\x1b[49m\", /\\x1b\\[49m/g)\ninit(\"bgGreen\", \"\\x1b[42m\", \"\\x1b[49m\", /\\x1b\\[49m/g)\ninit(\"bgYellow\", \"\\x1b[43m\", \"\\x1b[49m\", /\\x1b\\[49m/g)\ninit(\"bgBlue\", \"\\x1b[44m\", \"\\x1b[49m\", /\\x1b\\[49m/g)\ninit(\"bgMagenta\", \"\\x1b[45m\", \"\\x1b[49m\", /\\x1b\\[49m/g)\ninit(\"bgCyan\", \"\\x1b[46m\", \"\\x1b[49m\", /\\x1b\\[49m/g)\ninit(\"bgWhite\", \"\\x1b[47m\", \"\\x1b[49m\", /\\x1b\\[49m/g)\n\nmodule.exports = tc\n","import tc from 'turbocolor';\nimport { RollupError } from '../../src/rollup/types';\nimport relativeId from '../../src/utils/relativeId';\n\n// log to stderr to keep `rollup main.js > bundle.js` from breaking\nexport const stderr = console.error.bind(console);\n\nexport function handleError(err: RollupError, recover = false) {\n\tlet description = err.message || err;\n\tif (err.name) description = `${err.name}: ${description}`;\n\tconst message =\n\t\t((err as { plugin?: string }).plugin\n\t\t\t? `(plugin ${(err as { plugin?: string }).plugin}) ${description}`\n\t\t\t: description) || err;\n\n\tstderr(tc.bold.red(`[!] 
${tc.bold(message.toString())}`));\n\n\tif (err.url) {\n\t\tstderr(tc.cyan(err.url));\n\t}\n\n\tif (err.loc) {\n\t\tstderr(`${relativeId((err.loc.file || err.id) as string)} (${err.loc.line}:${err.loc.column})`);\n\t} else if (err.id) {\n\t\tstderr(relativeId(err.id));\n\t}\n\n\tif (err.frame) {\n\t\tstderr(tc.dim(err.frame));\n\t}\n\n\tif (err.stack) {\n\t\tstderr(tc.dim(err.stack));\n\t}\n\n\tstderr('');\n\n\tif (!recover) process.exit(1);\n}\n","import tc from 'turbocolor';\nimport { RollupWarning } from '../../../src/rollup/types';\nimport relativeId from '../../../src/utils/relativeId';\nimport { stderr } from '../logging';\n\nexport interface BatchWarnings {\n\tadd: (warning: string | RollupWarning) => void;\n\treadonly count: number;\n\tflush: () => void;\n}\n\nexport default function batchWarnings() {\n\tlet allWarnings = new Map<string, RollupWarning[]>();\n\tlet count = 0;\n\n\treturn {\n\t\tget count() {\n\t\t\treturn count;\n\t\t},\n\n\t\tadd: (warning: string | RollupWarning) => {\n\t\t\tif (typeof warning === 'string') {\n\t\t\t\twarning = { code: 'UNKNOWN', message: warning };\n\t\t\t}\n\n\t\t\tif ((warning.code as string) in immediateHandlers) {\n\t\t\t\timmediateHandlers[warning.code as string](warning);\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\tif (!allWarnings.has(warning.code as string)) allWarnings.set(warning.code as string, []);\n\t\t\t(allWarnings.get(warning.code as string) as RollupWarning[]).push(warning);\n\n\t\t\tcount += 1;\n\t\t},\n\n\t\tflush: () => {\n\t\t\tif (count === 0) return;\n\n\t\t\tconst codes = Array.from(allWarnings.keys()).sort((a, b) => {\n\t\t\t\tif (deferredHandlers[a] && deferredHandlers[b]) {\n\t\t\t\t\treturn deferredHandlers[a].priority - deferredHandlers[b].priority;\n\t\t\t\t}\n\n\t\t\t\tif (deferredHandlers[a]) return -1;\n\t\t\t\tif (deferredHandlers[b]) return 1;\n\t\t\t\treturn (\n\t\t\t\t\t(allWarnings.get(b) as RollupWarning[]).length -\n\t\t\t\t\t(allWarnings.get(a) as RollupWarning[]).length\n\t\t\t\t);\n\t\t\t});\n\n\t\t\tcodes.forEach(code => {\n\t\t\t\tconst handler = deferredHandlers[code];\n\t\t\t\tconst warnings = allWarnings.get(code);\n\n\t\t\t\tif (handler) {\n\t\t\t\t\thandler.fn(warnings as RollupWarning[]);\n\t\t\t\t} else {\n\t\t\t\t\t(warnings as RollupWarning[]).forEach(warning => {\n\t\t\t\t\t\ttitle(warning.message);\n\n\t\t\t\t\t\tif (warning.url) info(warning.url);\n\n\t\t\t\t\t\tconst id = (warning.loc && warning.loc.file) || warning.id;\n\t\t\t\t\t\tif (id) {\n\t\t\t\t\t\t\tconst loc = warning.loc\n\t\t\t\t\t\t\t\t? `${relativeId(id)}: (${warning.loc.line}:${warning.loc.column})`\n\t\t\t\t\t\t\t\t: relativeId(id);\n\n\t\t\t\t\t\t\tstderr(tc.bold(relativeId(loc)));\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tif (warning.frame) info(warning.frame);\n\t\t\t\t\t});\n\t\t\t\t}\n\t\t\t});\n\n\t\t\tallWarnings = new Map();\n\t\t\tcount = 0;\n\t\t}\n\t};\n}\n\nconst immediateHandlers: {\n\t[code: string]: (warning: RollupWarning) => void;\n} = {\n\tUNKNOWN_OPTION: warning => {\n\t\ttitle(`You have passed an unrecognized option`);\n\t\tstderr(warning.message);\n\t},\n\n\tMISSING_NODE_BUILTINS: warning => {\n\t\ttitle(`Missing shims for Node.js built-ins`);\n\n\t\tconst detail =\n\t\t\t(warning.modules as string[]).length === 1\n\t\t\t\t? 
`'${(warning.modules as string[])[0]}'`\n\t\t\t\t: `${(warning.modules as string[])\n\t\t\t\t\t\t.slice(0, -1)\n\t\t\t\t\t\t.map((name: string) => `'${name}'`)\n\t\t\t\t\t\t.join(', ')} and '${(warning.modules as string[]).slice(-1)}'`;\n\t\tstderr(\n\t\t\t`Creating a browser bundle that depends on ${detail}. You might need to include https://www.npmjs.com/package/rollup-plugin-node-builtins`\n\t\t);\n\t},\n\n\tMIXED_EXPORTS: () => {\n\t\ttitle('Mixing named and default exports');\n\t\tstderr(\n\t\t\t`Consumers of your bundle will have to use bundle['default'] to access the default export, which may not be what you want. Use \\`output.exports: 'named'\\` to disable this warning`\n\t\t);\n\t},\n\n\tEMPTY_BUNDLE: () => {\n\t\ttitle(`Generated an empty bundle`);\n\t}\n};\n\n// TODO select sensible priorities\nconst deferredHandlers: {\n\t[code: string]: {\n\t\tfn: (warnings: RollupWarning[]) => void;\n\t\tpriority: number;\n\t};\n} = {\n\tUNUSED_EXTERNAL_IMPORT: {\n\t\tfn: warnings => {\n\t\t\ttitle('Unused external imports');\n\t\t\twarnings.forEach(warning => {\n\t\t\t\tstderr(`${warning.names} imported from external module '${warning.source}' but never used`);\n\t\t\t});\n\t\t},\n\t\tpriority: 1\n\t},\n\n\tUNRESOLVED_IMPORT: {\n\t\tfn: warnings => {\n\t\t\ttitle('Unresolved dependencies');\n\t\t\tinfo('https://rollupjs.org/guide/en/#warning-treating-module-as-external-dependency');\n\n\t\t\tconst dependencies = new Map();\n\t\t\twarnings.forEach(warning => {\n\t\t\t\tif (!dependencies.has(warning.source)) dependencies.set(warning.source, []);\n\t\t\t\tdependencies.get(warning.source).push(warning.importer);\n\t\t\t});\n\n\t\t\tArray.from(dependencies.keys()).forEach(dependency => {\n\t\t\t\tconst importers = dependencies.get(dependency);\n\t\t\t\tstderr(`${tc.bold(dependency)} (imported by ${importers.join(', ')})`);\n\t\t\t});\n\t\t},\n\t\tpriority: 1\n\t},\n\n\tMISSING_EXPORT: {\n\t\tfn: warnings => {\n\t\t\ttitle('Missing exports');\n\t\t\tinfo('https://rollupjs.org/guide/en/#error-name-is-not-exported-by-module-');\n\n\t\t\twarnings.forEach(warning => {\n\t\t\t\tstderr(tc.bold(warning.importer as string));\n\t\t\t\tstderr(`${warning.missing} is not exported by ${warning.exporter}`);\n\t\t\t\tstderr(tc.gray(warning.frame as string));\n\t\t\t});\n\t\t},\n\t\tpriority: 1\n\t},\n\n\tTHIS_IS_UNDEFINED: {\n\t\tfn: warnings => {\n\t\t\ttitle('`this` has been rewritten to `undefined`');\n\t\t\tinfo('https://rollupjs.org/guide/en/#error-this-is-undefined');\n\t\t\tshowTruncatedWarnings(warnings);\n\t\t},\n\t\tpriority: 1\n\t},\n\n\tEVAL: {\n\t\tfn: warnings => {\n\t\t\ttitle('Use of eval is strongly discouraged');\n\t\t\tinfo('https://rollupjs.org/guide/en/#avoiding-eval');\n\t\t\tshowTruncatedWarnings(warnings);\n\t\t},\n\t\tpriority: 1\n\t},\n\n\tNON_EXISTENT_EXPORT: {\n\t\tfn: warnings => {\n\t\t\ttitle(`Import of non-existent ${warnings.length > 1 ? 
'exports' : 'export'}`);\n\t\t\tshowTruncatedWarnings(warnings);\n\t\t},\n\t\tpriority: 1\n\t},\n\n\tNAMESPACE_CONFLICT: {\n\t\tfn: warnings => {\n\t\t\ttitle(`Conflicting re-exports`);\n\t\t\twarnings.forEach(warning => {\n\t\t\t\tstderr(\n\t\t\t\t\t`${tc.bold(relativeId(warning.reexporter as string))} re-exports '${\n\t\t\t\t\t\twarning.name\n\t\t\t\t\t}' from both ${relativeId((warning.sources as string[])[0])} and ${relativeId(\n\t\t\t\t\t\t(warning.sources as string[])[1]\n\t\t\t\t\t)} (will be ignored)`\n\t\t\t\t);\n\t\t\t});\n\t\t},\n\t\tpriority: 1\n\t},\n\n\tMISSING_GLOBAL_NAME: {\n\t\tfn: warnings => {\n\t\t\ttitle(`Missing global variable ${warnings.length > 1 ? 'names' : 'name'}`);\n\t\t\tstderr(\n\t\t\t\t`Use output.globals to specify browser global variable names corresponding to external modules`\n\t\t\t);\n\t\t\twarnings.forEach(warning => {\n\t\t\t\tstderr(`${tc.bold(warning.source as string)} (guessing '${warning.guess}')`);\n\t\t\t});\n\t\t},\n\t\tpriority: 1\n\t},\n\n\tSOURCEMAP_BROKEN: {\n\t\tfn: warnings => {\n\t\t\ttitle(`Broken sourcemap`);\n\t\t\tinfo('https://rollupjs.org/guide/en/#warning-sourcemap-is-likely-to-be-incorrect');\n\n\t\t\tconst plugins = Array.from(new Set(warnings.map(w => w.plugin).filter(Boolean)));\n\t\t\tconst detail =\n\t\t\t\tplugins.length === 0\n\t\t\t\t\t? ''\n\t\t\t\t\t: plugins.length > 1\n\t\t\t\t\t? ` (such as ${plugins\n\t\t\t\t\t\t\t.slice(0, -1)\n\t\t\t\t\t\t\t.map(p => `'${p}'`)\n\t\t\t\t\t\t\t.join(', ')} and '${plugins.slice(-1)}')`\n\t\t\t\t\t: ` (such as '${plugins[0]}')`;\n\n\t\t\tstderr(`Plugins that transform code${detail} should generate accompanying sourcemaps`);\n\t\t},\n\t\tpriority: 1\n\t},\n\n\tPLUGIN_WARNING: {\n\t\tfn: warnings => {\n\t\t\tconst nestedByPlugin = nest(warnings, 'plugin');\n\n\t\t\tnestedByPlugin.forEach(({ key: plugin, items }) => {\n\t\t\t\tconst nestedByMessage = nest(items, 'message');\n\n\t\t\t\tlet lastUrl: string;\n\n\t\t\t\tnestedByMessage.forEach(({ key: message, items }) => {\n\t\t\t\t\ttitle(`Plugin ${plugin}: ${message}`);\n\t\t\t\t\titems.forEach(warning => {\n\t\t\t\t\t\tif (warning.url !== lastUrl) info((lastUrl = warning.url as string));\n\n\t\t\t\t\t\tif (warning.id) {\n\t\t\t\t\t\t\tlet loc = relativeId(warning.id);\n\t\t\t\t\t\t\tif (warning.loc) {\n\t\t\t\t\t\t\t\tloc += `: (${warning.loc.line}:${warning.loc.column})`;\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\tstderr(tc.bold(loc));\n\t\t\t\t\t\t}\n\t\t\t\t\t\tif (warning.frame) info(warning.frame);\n\t\t\t\t\t});\n\t\t\t\t});\n\t\t\t});\n\t\t},\n\t\tpriority: 1\n\t}\n};\n\nfunction title(str: string) {\n\tstderr(`${tc.bold.yellow('(!)')} ${tc.bold.yellow(str)}`);\n}\n\nfunction info(url: string) {\n\tstderr(tc.gray(url));\n}\n\nfunction nest<T>(array: T[], prop: string) {\n\tconst nested: { items: T[]; key: string }[] = [];\n\tconst lookup = new Map<string, { items: T[]; key: string }>();\n\n\tarray.forEach(item => {\n\t\tconst key = (item as any)[prop];\n\t\tif (!lookup.has(key)) {\n\t\t\tlookup.set(key, {\n\t\t\t\titems: [],\n\t\t\t\tkey\n\t\t\t});\n\n\t\t\tnested.push(lookup.get(key) as { items: T[]; key: string });\n\t\t}\n\n\t\t(lookup.get(key) as { items: T[]; key: string }).items.push(item);\n\t});\n\n\treturn nested;\n}\n\nfunction showTruncatedWarnings(warnings: RollupWarning[]) {\n\tconst nestedByModule = nest(warnings, 'id');\n\n\tconst sliced = nestedByModule.length > 5 ? 
nestedByModule.slice(0, 3) : nestedByModule;\n\tsliced.forEach(({ key: id, items }) => {\n\t\tstderr(tc.bold(relativeId(id)));\n\t\tstderr(tc.gray(items[0].frame as string));\n\n\t\tif (items.length > 1) {\n\t\t\tstderr(`...and ${items.length - 1} other ${items.length > 2 ? 'occurrences' : 'occurrence'}`);\n\t\t}\n\t});\n\n\tif (nestedByModule.length > sliced.length) {\n\t\tstderr(`\\n...and ${nestedByModule.length - sliced.length} other files`);\n\t}\n}\n","'use strict';\nmodule.exports = milliseconds => {\n\tif (typeof milliseconds !== 'number') {\n\t\tthrow new TypeError('Expected a number');\n\t}\n\n\tconst roundTowardsZero = milliseconds > 0 ? Math.floor : Math.ceil;\n\n\treturn {\n\t\tdays: roundTowardsZero(milliseconds / 86400000),\n\t\thours: roundTowardsZero(milliseconds / 3600000) % 24,\n\t\tminutes: roundTowardsZero(milliseconds / 60000) % 60,\n\t\tseconds: roundTowardsZero(milliseconds / 1000) % 60,\n\t\tmilliseconds: roundTowardsZero(milliseconds) % 1000,\n\t\tmicroseconds: roundTowardsZero(milliseconds * 1000) % 1000,\n\t\tnanoseconds: roundTowardsZero(milliseconds * 1e6) % 1000\n\t};\n};\n","'use strict';\nconst parseMilliseconds = require('parse-ms');\n\nconst pluralize = (word, count) => count === 1 ? word : word + 's';\n\nmodule.exports = (milliseconds, options = {}) => {\n\tif (!Number.isFinite(milliseconds)) {\n\t\tthrow new TypeError('Expected a finite number');\n\t}\n\n\tif (options.compact) {\n\t\toptions.secondsDecimalDigits = 0;\n\t\toptions.millisecondsDecimalDigits = 0;\n\t}\n\n\tconst result = [];\n\n\tconst add = (value, long, short, valueString) => {\n\t\tif (value === 0) {\n\t\t\treturn;\n\t\t}\n\n\t\tconst postfix = options.verbose ? ' ' + pluralize(long, value) : short;\n\n\t\tresult.push((valueString || value) + postfix);\n\t};\n\n\tconst secondsDecimalDigits =\n\t\ttypeof options.secondsDecimalDigits === 'number' ?\n\t\t\toptions.secondsDecimalDigits :\n\t\t\t1;\n\n\tif (secondsDecimalDigits < 1) {\n\t\tconst difference = 1000 - (milliseconds % 1000);\n\t\tif (difference < 500) {\n\t\t\tmilliseconds += difference;\n\t\t}\n\t}\n\n\tconst parsed = parseMilliseconds(milliseconds);\n\n\tadd(Math.trunc(parsed.days / 365), 'year', 'y');\n\tadd(parsed.days % 365, 'day', 'd');\n\tadd(parsed.hours, 'hour', 'h');\n\tadd(parsed.minutes, 'minute', 'm');\n\n\tif (\n\t\toptions.separateMilliseconds ||\n\t\toptions.formatSubMilliseconds ||\n\t\tmilliseconds < 1000\n\t) {\n\t\tadd(parsed.seconds, 'second', 's');\n\t\tif (options.formatSubMilliseconds) {\n\t\t\tadd(parsed.milliseconds, 'millisecond', 'ms');\n\t\t\tadd(parsed.microseconds, 'microsecond', 'µs');\n\t\t\tadd(parsed.nanoseconds, 'nanosecond', 'ns');\n\t\t} else {\n\t\t\tconst millisecondsAndBelow =\n\t\t\t\tparsed.milliseconds +\n\t\t\t\t(parsed.microseconds / 1000) +\n\t\t\t\t(parsed.nanoseconds / 1e6);\n\n\t\t\tconst millisecondsDecimalDigits =\n\t\t\t\ttypeof options.millisecondsDecimalDigits === 'number' ?\n\t\t\t\t\toptions.millisecondsDecimalDigits :\n\t\t\t\t\t0;\n\n\t\t\tconst millisecondsString = millisecondsDecimalDigits ?\n\t\t\t\tmillisecondsAndBelow.toFixed(millisecondsDecimalDigits) :\n\t\t\t\tMath.ceil(millisecondsAndBelow);\n\n\t\t\tadd(\n\t\t\t\tparseFloat(millisecondsString, 10),\n\t\t\t\t'millisecond',\n\t\t\t\t'ms',\n\t\t\t\tmillisecondsString\n\t\t\t);\n\t\t}\n\t} else {\n\t\tconst seconds = (milliseconds / 1000) % 60;\n\t\tconst secondsDecimalDigits =\n\t\t\ttypeof options.secondsDecimalDigits === 'number' ?\n\t\t\t\toptions.secondsDecimalDigits :\n\t\t\t\t1;\n\t\tconst secondsFixed = 
seconds.toFixed(secondsDecimalDigits);\n\t\tconst secondsString = options.keepDecimalsOnWholeSeconds ?\n\t\t\tsecondsFixed :\n\t\t\tsecondsFixed.replace(/\\.0+$/, '');\n\t\tadd(parseFloat(secondsString, 10), 'second', 's', secondsString);\n\t}\n\n\tif (result.length === 0) {\n\t\treturn '0' + (options.verbose ? ' milliseconds' : 'ms');\n\t}\n\n\tif (options.compact) {\n\t\treturn '~' + result[0];\n\t}\n\n\tif (typeof options.unitCount === 'number') {\n\t\treturn '~' + result.slice(0, Math.max(options.unitCount, 1)).join(' ');\n\t}\n\n\treturn result.join(' ');\n};\n","let SOURCEMAPPING_URL = 'sourceMa';\nSOURCEMAPPING_URL += 'ppingURL';\n\nexport default SOURCEMAPPING_URL;\n","'use strict';\n\nconst UNITS = [\n\t'B',\n\t'kB',\n\t'MB',\n\t'GB',\n\t'TB',\n\t'PB',\n\t'EB',\n\t'ZB',\n\t'YB'\n];\n\n/*\nFormats the given number using `Number#toLocaleString`.\n- If locale is a string, the value is expected to be a locale-key (for example: `de`).\n- If locale is true, the system default locale is used for translation.\n- If no value for locale is specified, the number is returned unmodified.\n*/\nconst toLocaleString = (number, locale) => {\n\tlet result = number;\n\tif (typeof locale === 'string') {\n\t\tresult = number.toLocaleString(locale);\n\t} else if (locale === true) {\n\t\tresult = number.toLocaleString();\n\t}\n\n\treturn result;\n};\n\nmodule.exports = (number, options) => {\n\tif (!Number.isFinite(number)) {\n\t\tthrow new TypeError(`Expected a finite number, got ${typeof number}: ${number}`);\n\t}\n\n\toptions = Object.assign({}, options);\n\n\tif (options.signed && number === 0) {\n\t\treturn ' 0 B';\n\t}\n\n\tconst isNegative = number < 0;\n\tconst prefix = isNegative ? '-' : (options.signed ? '+' : '');\n\n\tif (isNegative) {\n\t\tnumber = -number;\n\t}\n\n\tif (number < 1) {\n\t\tconst numberString = toLocaleString(number, options.locale);\n\t\treturn prefix + numberString + ' B';\n\t}\n\n\tconst exponent = Math.min(Math.floor(Math.log10(number) / 3), UNITS.length - 1);\n\t// eslint-disable-next-line unicorn/prefer-exponentiation-operator\n\tnumber = Number((number / Math.pow(1000, exponent)).toPrecision(3));\n\tconst numberString = toLocaleString(number, options.locale);\n\n\tconst unit = UNITS[exponent];\n\n\treturn prefix + numberString + ' ' + unit;\n};\n","import prettyBytes from 'pretty-bytes';\nimport tc from 'turbocolor';\nimport { SerializedTimings } from '../../../src/rollup/types';\n\nexport function printTimings(timings: SerializedTimings) {\n\tObject.keys(timings).forEach(label => {\n\t\tconst color =\n\t\t\tlabel[0] === '#' ? (label[1] !== '#' ? 
tc.underline : tc.bold) : (text: string) => text;\n\t\tconst [time, memory, total] = timings[label];\n\t\tconst row = `${label}: ${time.toFixed(0)}ms, ${prettyBytes(memory)} / ${prettyBytes(total)}`;\n\t\tconsole.info(color(row));\n\t});\n}\n","import ms from 'pretty-ms';\nimport * as rollup from 'rollup';\nimport tc from 'turbocolor';\nimport {\n\tInputOptions,\n\tOutputAsset,\n\tOutputChunk,\n\tOutputOptions,\n\tRollupBuild,\n\tSourceMap\n} from '../../../src/rollup/types';\nimport relativeId from '../../../src/utils/relativeId';\nimport { handleError, stderr } from '../logging';\nimport SOURCEMAPPING_URL from '../sourceMappingUrl';\nimport { BatchWarnings } from './batchWarnings';\nimport { printTimings } from './timings';\n\nexport default function build(\n\tinputOptions: InputOptions,\n\toutputOptions: OutputOptions[],\n\twarnings: BatchWarnings,\n\tsilent = false\n) {\n\tconst useStdout = !outputOptions[0].file && !outputOptions[0].dir;\n\n\tconst start = Date.now();\n\tconst files = useStdout\n\t\t? ['stdout']\n\t\t: outputOptions.map(t => relativeId(t.file || (t.dir as string)));\n\tif (!silent) {\n\t\tlet inputFiles: string = undefined as any;\n\t\tif (typeof inputOptions.input === 'string') {\n\t\t\tinputFiles = inputOptions.input;\n\t\t} else if (inputOptions.input instanceof Array) {\n\t\t\tinputFiles = inputOptions.input.join(', ');\n\t\t} else if (typeof inputOptions.input === 'object' && inputOptions.input !== null) {\n\t\t\tinputFiles = Object.keys(inputOptions.input)\n\t\t\t\t.map(name => (inputOptions.input as Record<string, string>)[name])\n\t\t\t\t.join(', ');\n\t\t}\n\t\tstderr(tc.cyan(`\\n${tc.bold(inputFiles)} → ${tc.bold(files.join(', '))}...`));\n\t}\n\n\treturn rollup\n\t\t.rollup(inputOptions)\n\t\t.then((bundle: RollupBuild) => {\n\t\t\tif (useStdout) {\n\t\t\t\tconst output = outputOptions[0];\n\t\t\t\tif (output.sourcemap && output.sourcemap !== 'inline') {\n\t\t\t\t\thandleError({\n\t\t\t\t\t\tcode: 'MISSING_OUTPUT_OPTION',\n\t\t\t\t\t\tmessage: 'You must specify a --file (-o) option when creating a file with a sourcemap'\n\t\t\t\t\t});\n\t\t\t\t}\n\n\t\t\t\treturn bundle.generate(output).then(({ output: outputs }) => {\n\t\t\t\t\tfor (const file of outputs) {\n\t\t\t\t\t\tlet source: string | Buffer;\n\t\t\t\t\t\tif ((file as OutputAsset).isAsset) {\n\t\t\t\t\t\t\tsource = (file as OutputAsset).source;\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tsource = (file as OutputChunk).code;\n\t\t\t\t\t\t\tif (output.sourcemap === 'inline') {\n\t\t\t\t\t\t\t\tsource += `\\n//# ${SOURCEMAPPING_URL}=${((file as OutputChunk)\n\t\t\t\t\t\t\t\t\t.map as SourceMap).toUrl()}\\n`;\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t\tif (outputs.length > 1)\n\t\t\t\t\t\t\tprocess.stdout.write('\\n' + tc.cyan(tc.bold('//→ ' + file.fileName + ':')) + '\\n');\n\t\t\t\t\t\tprocess.stdout.write(source);\n\t\t\t\t\t}\n\t\t\t\t});\n\t\t\t}\n\n\t\t\treturn Promise.all(outputOptions.map(output => bundle.write(output) as Promise<any>)).then(\n\t\t\t\t() => bundle\n\t\t\t);\n\t\t})\n\t\t.then((bundle?: RollupBuild) => {\n\t\t\tif (!silent) {\n\t\t\t\twarnings.flush();\n\t\t\t\tstderr(\n\t\t\t\t\ttc.green(`created ${tc.bold(files.join(', '))} in ${tc.bold(ms(Date.now() - start))}`)\n\t\t\t\t);\n\t\t\t\tif (bundle && bundle.getTimings) {\n\t\t\t\t\tprintTimings(bundle.getTimings());\n\t\t\t\t}\n\t\t\t}\n\t\t})\n\t\t.catch((err: any) => {\n\t\t\tif (warnings.count > 0) warnings.flush();\n\t\t\thandleError(err);\n\t\t});\n}\n","import path from 'path';\nimport rollup from 'rollup';\nimport tc from 
'turbocolor';\nimport { RollupBuild, RollupOutput } from '../../../src/rollup/types';\nimport { GenericConfigObject } from '../../../src/utils/mergeOptions';\nimport relativeId from '../../../src/utils/relativeId';\nimport { handleError, stderr } from '../logging';\nimport batchWarnings from './batchWarnings';\n\ninterface NodeModuleWithCompile extends NodeModule {\n\t_compile(code: string, filename: string): any;\n}\n\nexport default function loadConfigFile(\n\tconfigFile: string,\n\tcommandOptions: any = {}\n): Promise<GenericConfigObject[]> {\n\tconst silent = commandOptions.silent || false;\n\tconst warnings = batchWarnings();\n\n\treturn rollup\n\t\t.rollup({\n\t\t\texternal: (id: string) =>\n\t\t\t\t(id[0] !== '.' && !path.isAbsolute(id)) || id.slice(-5, id.length) === '.json',\n\t\t\tinput: configFile,\n\t\t\tonwarn: warnings.add,\n\t\t\ttreeshake: false\n\t\t})\n\t\t.then((bundle: RollupBuild) => {\n\t\t\tif (!silent && warnings.count > 0) {\n\t\t\t\tstderr(tc.bold(`loaded ${relativeId(configFile)} with warnings`));\n\t\t\t\twarnings.flush();\n\t\t\t}\n\n\t\t\treturn bundle.generate({\n\t\t\t\texports: 'named',\n\t\t\t\tformat: 'cjs'\n\t\t\t});\n\t\t})\n\t\t.then(({ output: [{ code }] }: RollupOutput) => {\n\t\t\t// temporarily override require\n\t\t\tconst defaultLoader = require.extensions['.js'];\n\t\t\trequire.extensions['.js'] = (module: NodeModule, filename: string) => {\n\t\t\t\tif (filename === configFile) {\n\t\t\t\t\t(module as NodeModuleWithCompile)._compile(code, filename);\n\t\t\t\t} else {\n\t\t\t\t\tdefaultLoader(module, filename);\n\t\t\t\t}\n\t\t\t};\n\n\t\t\tdelete require.cache[configFile];\n\n\t\t\treturn Promise.resolve(require(configFile))\n\t\t\t\t.then(configFileContent => {\n\t\t\t\t\tif (configFileContent.default) configFileContent = configFileContent.default;\n\t\t\t\t\tif (typeof configFileContent === 'function') {\n\t\t\t\t\t\treturn configFileContent(commandOptions);\n\t\t\t\t\t}\n\t\t\t\t\treturn configFileContent;\n\t\t\t\t})\n\t\t\t\t.then(configs => {\n\t\t\t\t\tif (Object.keys(configs).length === 0) {\n\t\t\t\t\t\thandleError({\n\t\t\t\t\t\t\tcode: 'MISSING_CONFIG',\n\t\t\t\t\t\t\tmessage: 'Config file must export an options object, or an array of options objects',\n\t\t\t\t\t\t\turl: 'https://rollupjs.org/guide/en/#configuration-files'\n\t\t\t\t\t\t});\n\t\t\t\t\t}\n\n\t\t\t\t\trequire.extensions['.js'] = defaultLoader;\n\n\t\t\t\t\treturn Array.isArray(configs) ? configs : [configs];\n\t\t\t\t});\n\t\t});\n}\n","'use strict';\nmodule.exports = date => {\n\tconst offset = (date || new Date()).getTimezoneOffset();\n\tconst absOffset = Math.abs(offset);\n\tconst hours = Math.floor(absOffset / 60);\n\tconst minutes = absOffset % 60;\n\tconst minutesOut = minutes > 0 ? ':' + ('0' + minutes).slice(-2) : '';\n\n\treturn (offset < 0 ? '+' : '-') + hours + minutesOut;\n};\n","'use strict';\nconst timeZone = require('time-zone');\n\nconst dateTime = options => {\n\toptions = Object.assign({\n\t\tdate: new Date(),\n\t\tlocal: true,\n\t\tshowTimeZone: false,\n\t\tshowMilliseconds: false\n\t}, options);\n\n\tlet {date} = options;\n\n\tif (options.local) {\n\t\t// Offset the date so it will return the correct value when getting the ISO string\n\t\tdate = new Date(date.getTime() - (date.getTimezoneOffset() * 60000));\n\t}\n\n\tlet end = '';\n\n\tif (options.showTimeZone) {\n\t\tend = ' UTC' + (options.local ? 
timeZone(date) : '');\n\t}\n\n\tif (options.showMilliseconds && date.getUTCMilliseconds() > 0) {\n\t\tend = ` ${date.getUTCMilliseconds()}ms${end}`;\n\t}\n\n\treturn date\n\t\t.toISOString()\n\t\t.replace(/T/, ' ')\n\t\t.replace(/\\..+/, end);\n};\n\nmodule.exports = dateTime;\n// TODO: Remove this for the next major release\nmodule.exports.default = dateTime;\n","// This is not the set of all possible signals.\n//\n// It IS, however, the set of all signals that trigger\n// an exit on either Linux or BSD systems. Linux is a\n// superset of the signal names supported on BSD, and\n// the unknown signals just fail to register, so we can\n// catch that easily enough.\n//\n// Don't bother with SIGKILL. It's uncatchable, which\n// means that we can't fire any callbacks anyway.\n//\n// If a user does happen to register a handler on a non-\n// fatal signal like SIGWINCH or something, and then\n// exit, it'll end up firing `process.emit('exit')`, so\n// the handler will be fired anyway.\n//\n// SIGBUS, SIGFPE, SIGSEGV and SIGILL, when not raised\n// artificially, inherently leave the process in a\n// state from which it is not safe to try and enter JS\n// listeners.\nmodule.exports = [\n 'SIGABRT',\n 'SIGALRM',\n 'SIGHUP',\n 'SIGINT',\n 'SIGTERM'\n]\n\nif (process.platform !== 'win32') {\n module.exports.push(\n 'SIGVTALRM',\n 'SIGXCPU',\n 'SIGXFSZ',\n 'SIGUSR2',\n 'SIGTRAP',\n 'SIGSYS',\n 'SIGQUIT',\n 'SIGIOT'\n // should detect profiler and enable/disable accordingly.\n // see #21\n // 'SIGPROF'\n )\n}\n\nif (process.platform === 'linux') {\n module.exports.push(\n 'SIGIO',\n 'SIGPOLL',\n 'SIGPWR',\n 'SIGSTKFLT',\n 'SIGUNUSED'\n )\n}\n","// Note: since nyc uses this module to output coverage, any lines\n// that are in the direct sync flow of nyc's outputCoverage are\n// ignored, since we can never get coverage for them.\nvar assert = require('assert')\nvar signals = require('./signals.js')\n\nvar EE = require('events')\n/* istanbul ignore if */\nif (typeof EE !== 'function') {\n EE = EE.EventEmitter\n}\n\nvar emitter\nif (process.__signal_exit_emitter__) {\n emitter = process.__signal_exit_emitter__\n} else {\n emitter = process.__signal_exit_emitter__ = new EE()\n emitter.count = 0\n emitter.emitted = {}\n}\n\n// Because this emitter is a global, we have to check to see if a\n// previous version of this library failed to enable infinite listeners.\n// I know what you're about to say. But literally everything about\n// signal-exit is a compromise with evil. 
Get used to it.\nif (!emitter.infinite) {\n emitter.setMaxListeners(Infinity)\n emitter.infinite = true\n}\n\nmodule.exports = function (cb, opts) {\n assert.equal(typeof cb, 'function', 'a callback must be provided for exit handler')\n\n if (loaded === false) {\n load()\n }\n\n var ev = 'exit'\n if (opts && opts.alwaysLast) {\n ev = 'afterexit'\n }\n\n var remove = function () {\n emitter.removeListener(ev, cb)\n if (emitter.listeners('exit').length === 0 &&\n emitter.listeners('afterexit').length === 0) {\n unload()\n }\n }\n emitter.on(ev, cb)\n\n return remove\n}\n\nmodule.exports.unload = unload\nfunction unload () {\n if (!loaded) {\n return\n }\n loaded = false\n\n signals.forEach(function (sig) {\n try {\n process.removeListener(sig, sigListeners[sig])\n } catch (er) {}\n })\n process.emit = originalProcessEmit\n process.reallyExit = originalProcessReallyExit\n emitter.count -= 1\n}\n\nfunction emit (event, code, signal) {\n if (emitter.emitted[event]) {\n return\n }\n emitter.emitted[event] = true\n emitter.emit(event, code, signal)\n}\n\n// { <signal>: <listener fn>, ... }\nvar sigListeners = {}\nsignals.forEach(function (sig) {\n sigListeners[sig] = function listener () {\n // If there are no other listeners, an exit is coming!\n // Simplest way: remove us and then re-send the signal.\n // We know that this will kill the process, so we can\n // safely emit now.\n var listeners = process.listeners(sig)\n if (listeners.length === emitter.count) {\n unload()\n emit('exit', null, sig)\n /* istanbul ignore next */\n emit('afterexit', null, sig)\n /* istanbul ignore next */\n process.kill(process.pid, sig)\n }\n }\n})\n\nmodule.exports.signals = function () {\n return signals\n}\n\nmodule.exports.load = load\n\nvar loaded = false\n\nfunction load () {\n if (loaded) {\n return\n }\n loaded = true\n\n // This is the number of onSignalExit's that are in play.\n // It's important so that we can count the correct number of\n // listeners on signals, and don't wait for the other one to\n // handle it instead of us.\n emitter.count += 1\n\n signals = signals.filter(function (sig) {\n try {\n process.on(sig, sigListeners[sig])\n return true\n } catch (er) {\n return false\n }\n })\n\n process.emit = processEmit\n process.reallyExit = processReallyExit\n}\n\nvar originalProcessReallyExit = process.reallyExit\nfunction processReallyExit (code) {\n process.exitCode = code || 0\n emit('exit', process.exitCode, null)\n /* istanbul ignore next */\n emit('afterexit', process.exitCode, null)\n /* istanbul ignore next */\n originalProcessReallyExit.call(process, process.exitCode)\n}\n\nvar originalProcessEmit = process.emit\nfunction processEmit (ev, arg) {\n if (ev === 'exit') {\n if (arg !== undefined) {\n process.exitCode = arg\n }\n var ret = originalProcessEmit.apply(this, arguments)\n emit('exit', process.exitCode, null)\n /* istanbul ignore next */\n emit('afterexit', process.exitCode, null)\n return ret\n } else {\n return originalProcessEmit.apply(this, arguments)\n }\n}\n","import { stderr } from '../logging';\n\nconst CLEAR_SCREEN = '\\u001Bc';\n\nexport function getResetScreen(clearScreen: boolean) {\n\tif (clearScreen) {\n\t\treturn (heading: string) => stderr(CLEAR_SCREEN + heading);\n\t}\n\n\tlet firstRun = true;\n\treturn (heading: string) => {\n\t\tif (firstRun) {\n\t\t\tstderr(heading);\n\t\t\tfirstRun = false;\n\t\t}\n\t};\n}\n","import dateTime from 'date-time';\nimport fs from 'fs';\nimport ms from 'pretty-ms';\nimport * as rollup from 'rollup';\nimport onExit from 
'signal-exit';\nimport tc from 'turbocolor';\nimport {\n\tInputOption,\n\tRollupBuild,\n\tRollupError,\n\tRollupWatchOptions,\n\tWarningHandler,\n\tWatcherOptions\n} from '../../../src/rollup/types';\nimport mergeOptions, { GenericConfigObject } from '../../../src/utils/mergeOptions';\nimport relativeId from '../../../src/utils/relativeId';\nimport { handleError, stderr } from '../logging';\nimport batchWarnings from './batchWarnings';\nimport loadConfigFile from './loadConfigFile';\nimport { getResetScreen } from './resetScreen';\nimport { printTimings } from './timings';\n\ninterface WatchEvent {\n\tcode?: string;\n\tduration?: number;\n\terror?: RollupError | Error;\n\tinput?: InputOption;\n\toutput?: string[];\n\tresult?: RollupBuild;\n}\n\ninterface Watcher {\n\tclose: () => void;\n\ton: (event: string, fn: (event: WatchEvent) => void) => void;\n}\n\nexport default function watch(\n\tconfigFile: string,\n\tconfigs: GenericConfigObject[],\n\tcommand: any,\n\tsilent = false\n) {\n\tconst isTTY = Boolean(process.stderr.isTTY);\n\tconst warnings = batchWarnings();\n\tconst initialConfigs = processConfigs(configs);\n\tconst clearScreen = initialConfigs.every(\n\t\tconfig => (config.watch as WatcherOptions).clearScreen !== false\n\t);\n\n\tconst resetScreen = getResetScreen(isTTY && clearScreen);\n\tlet watcher: Watcher;\n\tlet configWatcher: Watcher;\n\n\tfunction processConfigs(configs: GenericConfigObject[]): RollupWatchOptions[] {\n\t\treturn configs.map(options => {\n\t\t\tconst merged = mergeOptions({\n\t\t\t\tcommand,\n\t\t\t\tconfig: options,\n\t\t\t\tdefaultOnWarnHandler: warnings.add\n\t\t\t});\n\n\t\t\tconst result: RollupWatchOptions = {\n\t\t\t\t...merged.inputOptions,\n\t\t\t\toutput: merged.outputOptions\n\t\t\t};\n\n\t\t\tif (!result.watch) result.watch = {};\n\n\t\t\tif (merged.optionError)\n\t\t\t\t(merged.inputOptions.onwarn as WarningHandler)({\n\t\t\t\t\tcode: 'UNKNOWN_OPTION',\n\t\t\t\t\tmessage: merged.optionError\n\t\t\t\t});\n\n\t\t\treturn result;\n\t\t});\n\t}\n\n\tfunction start(configs: RollupWatchOptions[]) {\n\t\twatcher = rollup.watch(configs);\n\n\t\twatcher.on('event', (event: WatchEvent) => {\n\t\t\tswitch (event.code) {\n\t\t\t\tcase 'FATAL':\n\t\t\t\t\thandleError(event.error as RollupError, true);\n\t\t\t\t\tprocess.exit(1);\n\t\t\t\t\tbreak;\n\n\t\t\t\tcase 'ERROR':\n\t\t\t\t\twarnings.flush();\n\t\t\t\t\thandleError(event.error as RollupError, true);\n\t\t\t\t\tbreak;\n\n\t\t\t\tcase 'START':\n\t\t\t\t\tif (!silent) {\n\t\t\t\t\t\tresetScreen(tc.underline(`rollup v${rollup.VERSION}`));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\n\t\t\t\tcase 'BUNDLE_START':\n\t\t\t\t\tif (!silent) {\n\t\t\t\t\t\tlet input = event.input;\n\t\t\t\t\t\tif (typeof input !== 'string') {\n\t\t\t\t\t\t\tinput = Array.isArray(input)\n\t\t\t\t\t\t\t\t? 
input.join(', ')\n\t\t\t\t\t\t\t\t: Object.keys(input as Record<string, string>)\n\t\t\t\t\t\t\t\t\t\t.map(key => (input as Record<string, string>)[key])\n\t\t\t\t\t\t\t\t\t\t.join(', ');\n\t\t\t\t\t\t}\n\t\t\t\t\t\tstderr(\n\t\t\t\t\t\t\ttc.cyan(\n\t\t\t\t\t\t\t\t`bundles ${tc.bold(input)} → ${tc.bold(\n\t\t\t\t\t\t\t\t\t(event.output as string[]).map(relativeId).join(', ')\n\t\t\t\t\t\t\t\t)}...`\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t);\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\n\t\t\t\tcase 'BUNDLE_END':\n\t\t\t\t\twarnings.flush();\n\t\t\t\t\tif (!silent)\n\t\t\t\t\t\tstderr(\n\t\t\t\t\t\t\ttc.green(\n\t\t\t\t\t\t\t\t`created ${tc.bold(\n\t\t\t\t\t\t\t\t\t(event.output as string[]).map(relativeId).join(', ')\n\t\t\t\t\t\t\t\t)} in ${tc.bold(ms(event.duration as number))}`\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t);\n\t\t\t\t\tif (event.result && event.result.getTimings) {\n\t\t\t\t\t\tprintTimings(event.result.getTimings());\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\n\t\t\t\tcase 'END':\n\t\t\t\t\tif (!silent && isTTY) {\n\t\t\t\t\t\tstderr(`\\n[${dateTime()}] waiting for changes...`);\n\t\t\t\t\t}\n\t\t\t}\n\t\t});\n\t}\n\n\t// catch ctrl+c, kill, and uncaught errors\n\tconst removeOnExit = onExit(close);\n\tprocess.on('uncaughtException', close);\n\n\t// only listen to stdin if it is a pipe\n\tif (!process.stdin.isTTY) {\n\t\tprocess.stdin.on('end', close); // in case we ever support stdin!\n\t}\n\n\tfunction close(err: Error) {\n\t\tremoveOnExit();\n\t\tprocess.removeListener('uncaughtException', close);\n\t\t// removing a non-existent listener is a no-op\n\t\tprocess.stdin.removeListener('end', close);\n\n\t\tif (watcher) watcher.close();\n\n\t\tif (configWatcher) configWatcher.close();\n\n\t\tif (err) {\n\t\t\tconsole.error(err);\n\t\t\tprocess.exit(1);\n\t\t}\n\t}\n\n\ttry {\n\t\tstart(initialConfigs);\n\t} catch (err) {\n\t\tclose(err);\n\t\treturn;\n\t}\n\n\tif (configFile && !configFile.startsWith('node:')) {\n\t\tlet restarting = false;\n\t\tlet aborted = false;\n\t\tlet configFileData = fs.readFileSync(configFile, 'utf-8');\n\n\t\tconst restart = () => {\n\t\t\tconst newConfigFileData = fs.readFileSync(configFile, 'utf-8');\n\t\t\tif (newConfigFileData === configFileData) return;\n\t\t\tconfigFileData = newConfigFileData;\n\n\t\t\tif (restarting) {\n\t\t\t\taborted = true;\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\trestarting = true;\n\n\t\t\tloadConfigFile(configFile, command)\n\t\t\t\t.then((_configs: RollupWatchOptions[]) => {\n\t\t\t\t\trestarting = false;\n\n\t\t\t\t\tif (aborted) {\n\t\t\t\t\t\taborted = false;\n\t\t\t\t\t\trestart();\n\t\t\t\t\t} else {\n\t\t\t\t\t\twatcher.close();\n\t\t\t\t\t\tstart(initialConfigs);\n\t\t\t\t\t}\n\t\t\t\t})\n\t\t\t\t.catch((err: Error) => {\n\t\t\t\t\thandleError(err, true);\n\t\t\t\t});\n\t\t};\n\n\t\tconfigWatcher = fs.watch(configFile, (event: string) => {\n\t\t\tif (event === 'change') restart();\n\t\t});\n\t}\n}\n","import { realpathSync } from 'fs';\nimport relative from 'require-relative';\nimport { WarningHandler } from '../../../src/rollup/types';\nimport mergeOptions, { GenericConfigObject } from '../../../src/utils/mergeOptions';\nimport { getAliasName } from '../../../src/utils/relativeId';\nimport { handleError } from '../logging';\nimport batchWarnings from './batchWarnings';\nimport build from './build';\nimport loadConfigFile from './loadConfigFile';\nimport watch from './watch';\n\nexport default function runRollup(command: any) {\n\tlet inputSource;\n\tif (command._.length > 0) {\n\t\tif (command.input) {\n\t\t\thandleError({\n\t\t\t\tcode: 
'DUPLICATE_IMPORT_OPTIONS',\n\t\t\t\tmessage: 'use --input, or pass input path as argument'\n\t\t\t});\n\t\t}\n\t\tinputSource = command._;\n\t} else if (typeof command.input === 'string') {\n\t\tinputSource = [command.input];\n\t} else {\n\t\tinputSource = command.input;\n\t}\n\n\tif (inputSource && inputSource.length > 0) {\n\t\tif (inputSource.some((input: string) => input.indexOf('=') !== -1)) {\n\t\t\tcommand.input = {};\n\t\t\tinputSource.forEach((input: string) => {\n\t\t\t\tconst equalsIndex = input.indexOf('=');\n\t\t\t\tconst value = input.substr(equalsIndex + 1);\n\t\t\t\tlet key = input.substr(0, equalsIndex);\n\t\t\t\tif (!key) key = getAliasName(input);\n\t\t\t\tcommand.input[key] = value;\n\t\t\t});\n\t\t} else {\n\t\t\tcommand.input = inputSource;\n\t\t}\n\t}\n\n\tif (command.environment) {\n\t\tconst environment = Array.isArray(command.environment)\n\t\t\t? command.environment\n\t\t\t: [command.environment];\n\n\t\tenvironment.forEach((arg: string) => {\n\t\t\targ.split(',').forEach((pair: string) => {\n\t\t\t\tconst [key, value] = pair.split(':');\n\t\t\t\tif (value) {\n\t\t\t\t\tprocess.env[key] = value;\n\t\t\t\t} else {\n\t\t\t\t\tprocess.env[key] = String(true);\n\t\t\t\t}\n\t\t\t});\n\t\t});\n\t}\n\n\tlet configFile = command.config === true ? 'rollup.config.js' : command.config;\n\n\tif (configFile) {\n\t\tif (configFile.slice(0, 5) === 'node:') {\n\t\t\tconst pkgName = configFile.slice(5);\n\t\t\ttry {\n\t\t\t\tconfigFile = relative.resolve(`rollup-config-${pkgName}`, process.cwd());\n\t\t\t} catch (err) {\n\t\t\t\ttry {\n\t\t\t\t\tconfigFile = relative.resolve(pkgName, process.cwd());\n\t\t\t\t} catch (err) {\n\t\t\t\t\tif (err.code === 'MODULE_NOT_FOUND') {\n\t\t\t\t\t\thandleError({\n\t\t\t\t\t\t\tcode: 'MISSING_EXTERNAL_CONFIG',\n\t\t\t\t\t\t\tmessage: `Could not resolve config file ${configFile}`\n\t\t\t\t\t\t});\n\t\t\t\t\t}\n\n\t\t\t\t\tthrow err;\n\t\t\t\t}\n\t\t\t}\n\t\t} else {\n\t\t\t// find real path of config so it matches what Node provides to callbacks in require.extensions\n\t\t\tconfigFile = realpathSync(configFile);\n\t\t}\n\n\t\tif (command.watch) process.env.ROLLUP_WATCH = 'true';\n\n\t\tloadConfigFile(configFile, command)\n\t\t\t.then(configs => execute(configFile, configs, command))\n\t\t\t.catch(handleError);\n\t} else {\n\t\treturn execute(configFile, [{ input: null }] as any, command);\n\t}\n}\n\nfunction execute(configFile: string, configs: GenericConfigObject[], command: any) {\n\tif (command.watch) {\n\t\twatch(configFile, configs, command, command.silent);\n\t} else {\n\t\tlet promise = Promise.resolve();\n\t\tfor (const config of configs) {\n\t\t\tpromise = promise.then(() => {\n\t\t\t\tconst warnings = batchWarnings();\n\t\t\t\tconst { inputOptions, outputOptions, optionError } = mergeOptions({\n\t\t\t\t\tcommand,\n\t\t\t\t\tconfig,\n\t\t\t\t\tdefaultOnWarnHandler: warnings.add\n\t\t\t\t});\n\n\t\t\t\tif (optionError)\n\t\t\t\t\t(inputOptions.onwarn as WarningHandler)({ code: 'UNKNOWN_OPTION', message: optionError });\n\t\t\t\treturn build(inputOptions, outputOptions, warnings, command.silent);\n\t\t\t});\n\t\t}\n\t\treturn promise;\n\t}\n}\n","import help from 'help.md';\nimport minimist from 'minimist';\nimport { version } from 'package.json';\nimport { commandAliases } from '../../src/utils/mergeOptions';\nimport run from './run/index';\n\nconst command = minimist(process.argv.slice(2), {\n\talias: commandAliases\n});\n\nif (command.help || (process.argv.length <= 2 && process.stdin.isTTY)) 
{\n\tconsole.log(`\\n${help.replace('__VERSION__', version)}\\n`);\n} else if (command.version) {\n\tconsole.log(`rollup v${version}`);\n} else {\n\ttry {\n\t\trequire('source-map-support').install();\n\t} catch (err) {\n\t\t// do nothing\n\t}\n\n\trun(command);\n}\n"],"names":["path","Module","basename","extname","relative","tc","parseMilliseconds","rollup\n .rollup","SOURCEMAPPING_URL","ms","rollup","signals","require$$0","require$$1","rollup.watch","rollup.VERSION","dateTime","onExit","fs","realpathSync","run"],"mappings":";;;;;;;;;;;;;;;;;AAAA,YAAc,GAAG,UAAU,IAAI,EAAE,IAAI;IACjC,IAAI,CAAC,IAAI;QAAE,IAAI,GAAG,EAAE,CAAC;IAErB,IAAI,KAAK,GAAG,EAAE,KAAK,EAAG,EAAE,EAAE,OAAO,EAAG,EAAE,EAAE,SAAS,EAAE,IAAI,EAAE,CAAC;IAE1D,IAAI,OAAO,IAAI,CAAC,SAAS,CAAC,KAAK,UAAU,EAAE;QACvC,KAAK,CAAC,SAAS,GAAG,IAAI,CAAC,SAAS,CAAC,CAAC;KACrC;IAED,IAAI,OAAO,IAAI,CAAC,SAAS,CAAC,KAAK,SAAS,IAAI,IAAI,CAAC,SAAS,CAAC,EAAE;QAC3D,KAAK,CAAC,QAAQ,GAAG,IAAI,CAAC;KACvB;SAAM;QACL,EAAE,CAAC,MAAM,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC,OAAO,CAAC,UAAU,GAAG;YAC5D,KAAK,CAAC,KAAK,CAAC,GAAG,CAAC,GAAG,IAAI,CAAC;SAC3B,CAAC,CAAC;KACJ;IAED,IAAI,OAAO,GAAG,EAAE,CAAC;IACjB,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,KAAK,IAAI,EAAE,CAAC,CAAC,OAAO,CAAC,UAAU,GAAG;QAC/C,OAAO,CAAC,GAAG,CAAC,GAAG,EAAE,CAAC,MAAM,CAAC,IAAI,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC;QAC1C,OAAO,CAAC,GAAG,CAAC,CAAC,OAAO,CAAC,UAAU,CAAC;YAC5B,OAAO,CAAC,CAAC,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC,MAAM,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC,MAAM,CAAC,UAAU,CAAC;gBACrD,OAAO,CAAC,KAAK,CAAC,CAAC;aAClB,CAAC,CAAC,CAAC;SACP,CAAC,CAAC;KACN,CAAC,CAAC;IAEH,EAAE,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC,OAAO,CAAC,UAAU,GAAG;QACxD,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,GAAG,IAAI,CAAC;QAC1B,IAAI,OAAO,CAAC,GAAG,CAAC,EAAE;YACd,KAAK,CAAC,OAAO,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC,GAAG,IAAI,CAAC;SACtC;KACH,CAAC,CAAC;IAEJ,IAAI,QAAQ,GAAG,IAAI,CAAC,SAAS,CAAC,IAAI,EAAE,CAAC;IAErC,IAAI,IAAI,GAAG,EAAE,CAAC,EAAG,EAAE,EAAE,CAAC;IACtB,MAAM,CAAC,IAAI,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,OAAO,CAAC,UAAU,GAAG;QAC1C,MAAM,CAAC,GAAG,EAAE,QAAQ,CAAC,GAAG,CAAC,KAAK,SAAS,GAAG,KAAK,GAAG,QAAQ,CAAC,GAAG,CAAC,CAAC,CAAC;KACpE,CAAC,CAAC;IAEH,IAAI,QAAQ,GAAG,EAAE,CAAC;IAElB,IAAI,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,EAAE;QAC3B,QAAQ,GAAG,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,GAAC,CAAC,CAAC,CAAC;QAC5C,IAAI,GAAG,IAAI,CAAC,KAAK,CAAC,CAAC,EAAE,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC;KAC5C;IAED,SAAS,UAAU,CAAC,GAAG,EAAE,GAAG;QACxB,OAAO,CAAC,KAAK,CAAC,QAAQ,IAAI,WAAW,CAAC,IAAI,CAAC,GAAG,CAAC;YAC3C,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,IAAI,KAAK,CAAC,KAAK,CAAC,GAAG,CAAC,IAAI,OAAO,CAAC,GAAG,CAAC,CAAC;KAC9D;IAED,SAAS,MAAM,CAAE,GAAG,EAAE,GAAG,EAAE,GAAG;QAC1B,IAAI,GAAG,IAAI,KAAK,CAAC,SAAS,IAAI,CAAC,UAAU,CAAC,GAAG,EAAE,GAAG,CAAC,EAAE;YACjD,IAAI,KAAK,CAAC,SAAS,CAAC,GAAG,CAAC,KAAK,KAAK;gBAAE,OAAO;SAC9C;QAED,IAAI,KAAK,GAAG,CAAC,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,IAAI,QAAQ,CAAC,GAAG,CAAC;cAC1C,MAAM,CAAC,GAAG,CAAC,GAAG,GAAG,CACtB;QACD,MAAM,CAAC,IAAI,EAAE,GAAG,CAAC,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,CAAC,CAAC;QAEpC,CAAC,OAAO,CAAC,GAAG,CAAC,IAAI,EAAE,EAAE,OAAO,CAAC,UAAU,CAAC;YACpC,MAAM,CAAC,IAAI,EAAE,CAAC,CAAC,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,CAAC,CAAC;SACrC,CAAC,CAAC;KACN;IAED,SAAS,MAAM,CAAE,GAAG,EAAE,IAAI,EAAE,KAAK;QAC7B,IAAI,CAAC,GAAG,GAAG,CAAC;QACZ,IAAI,CAAC,KAAK,CAAC,CAAC,EAAC,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC,UAAU,GAAG;YAClC,IAAI,CAAC,CAAC,GAAG,CAAC,KAAK,SAAS;gBAAE,CAAC,CAAC,GAAG,CAAC,GAAG,EAAE,CAAC;YACtC,CAAC,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC;SACd,CAAC,CAAC;QAEH,IAAI,GAAG,GAAG,IAAI,CAAC,IAAI,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC;QAChC,IAAI,CAAC,CAAC,GAAG,CAAC,KAAK,SAAS,IA
AI,KAAK,CAAC,KAAK,CAAC,GAAG,CAAC,IAAI,OAAO,CAAC,CAAC,GAAG,CAAC,KAAK,SAAS,EAAE;YACzE,CAAC,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC;SAClB;aACI,IAAI,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,EAAE;YAC5B,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;SACtB;aACI;YACD,CAAC,CAAC,GAAG,CAAC,GAAG,CAAE,CAAC,CAAC,GAAG,CAAC,EAAE,KAAK,CAAE,CAAC;SAC9B;KACJ;IAED,SAAS,cAAc,CAAC,GAAG;QACzB,OAAO,OAAO,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,UAAU,CAAC;YAChC,OAAO,KAAK,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;SACzB,CAAC,CAAC;KACJ;IAED,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;QAClC,IAAI,GAAG,GAAG,IAAI,CAAC,CAAC,CAAC,CAAC;QAElB,IAAI,QAAQ,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE;;;;YAIpB,IAAI,CAAC,GAAG,GAAG,CAAC,KAAK,CAAC,uBAAuB,CAAC,CAAC;YAC3C,IAAI,GAAG,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC;YACf,IAAI,KAAK,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC;
Copy link
Member

This file should be removed as well.

@@ -187,6 +187,23 @@ Kind: `async, parallel`

Called initially each time `bundle.generate()` or `bundle.write()` is called. To get notified when generation has completed, use the `generateBundle` and `renderError` hooks.

#### `augmentChunkHash`
Copy link
Member

Thanks for adding documentation! It would be nice to stick with the alphabetical ordering of the hooks and move this up.

@@ -187,6 +187,23 @@ Kind: `async, parallel`

Called initially each time `bundle.generate()` or `bundle.write()` is called. To get notified when generation has completed, use the `generateBundle` and `renderError` hooks.

#### `augmentChunkHash`
Type: `(preRenderedChunk: PreRenderedChunk) => any`<br>
Copy link
Member

Is `any` really the correct type? Looking at how the return value is actually used, the logic assumes these are strings, or at least values that can safely be converted to strings, so definitely no functions or objects. I would use `string` here to avoid confusion.

src/Chunk.ts Outdated
@@ -136,6 +136,7 @@ export default class Chunk {
return chunk;
}

augmentedHash?: string;
Copy link
Member

From how this is generated, I would say this is more like a hashAugmentation as the actual hash is composed of more things. But maybe we can get rid of this property, cf. my other comments?

Copy link
Contributor Author

That makes sense and will actually give us the lazy behaviour.

src/Chunk.ts Outdated
@@ -341,6 +342,9 @@ export default class Chunk {
if (this.renderedHash) return this.renderedHash;
if (!this.renderedSource) return '';
const hash = sha256();
if (this.augmentedHash) {
hash.update(this.augmentedHash);
Copy link
Member

As the renderedHash is effectively cached and determined at most once, and sometimes not at all (i.e. if there is no `[hash]` in the name pattern), I wondered if it would not be better to generate this value on the fly. The only necessary parameter, `graph`, is available on each chunk anyway, so we would not need the additional property on the Chunk class.
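For illustration only, a minimal sketch of that idea, using a simplified plugin shape that is an assumption rather than Rollup's real plugin driver: the augmentation is reduced from every plugin's augmentChunkHash result at the moment the hash is computed, so nothing has to be stored on the chunk.

// Sketch only: ChunkInfoSketch and PluginSketch are simplified stand-ins,
// not Rollup's actual internal types or plugin driver.
interface ChunkInfoSketch {
	name: string;
}

interface PluginSketch {
	name: string;
	augmentChunkHash?: (chunk: ChunkInfoSketch) => string | void;
}

// Reduce all plugin augmentations into a single string right before hashing.
function calculateHashAugmentation(plugins: PluginSketch[], chunk: ChunkInfoSketch): string {
	let augmentation = '';
	for (const plugin of plugins) {
		const result = plugin.augmentChunkHash ? plugin.augmentChunkHash(chunk) : undefined;
		if (typeof result === 'string') augmentation += result;
	}
	return augmentation;
}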

@@ -327,6 +327,7 @@ interface OnWriteOptions extends OutputOptions {
}

export interface PluginHooks {
augmentChunkHash: (this: PluginContext, chunk: PreRenderedChunk) => void;
Copy link
Member

I think the correct type would be something like ... => string?
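Sketched out, the suggested typing could look roughly like this (a hedged sketch with simplified stand-in names, not necessarily the exact declaration that was merged):

// Sketch only: a minimal stand-in for the chunk information passed to the hook.
interface PreRenderedChunkSketch {
	name: string;
	exports: string[];
}

interface PluginHooksSketch {
	// Return a string to mix into the chunk hash, or nothing to leave it untouched.
	augmentChunkHash?: (chunk: PreRenderedChunkSketch) => string | void;
}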

@isidrok
Copy link
Contributor Author

isidrok commented Aug 12, 2019

Alright, I got the requested changes done. Looking closely into how the file-hashes tests work, I don't think they are suitable for this case, since they validate hashes against file contents and this has nothing to do with that. Should I move them back to the hooks suite?

@lukastaegert
Copy link
Member

lukastaegert commented Aug 13, 2019

Should I move them back to the hooks suite?

We could do that, but in the end this does have to do with file contents, so we could also make the test closer to the real use case, which would then also serve as a good illustrative example for this hook:

For both sets of options, we have a plugin that implements both the renderChunk and the augmentChunkHash hooks. The renderChunk hook could be really simple, like prepending a comment, but the change should be slightly different in both cases (e.g. a different comment), and the hash augmentation should also be slightly different in a way that corresponds to the change (e.g. the content of the comment). Without the augmentChunkHash hook the test would be red, but thanks to the hook it is green.
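A hedged sketch of what such a plugin pair might look like (plugin names, banners and file name pattern are illustrative assumptions, not the actual test fixture):

// Each build adds a different banner in renderChunk and feeds the same banner
// into augmentChunkHash, so the [hash] differs along with the rendered output.
const bannerPlugin = (banner: string) => ({
	name: `banner-${banner}`,
	renderChunk(code: string) {
		return `/* ${banner} */\n${code}`;
	},
	augmentChunkHash() {
		return banner;
	}
});

// Hypothetical usage: one config with bannerPlugin('A'), another with bannerPlugin('B'),
// both writing output with entryFileNames: '[name]-[hash].js'. Without augmentChunkHash
// both builds would emit the same hashed file name; with it, the names diverge.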

Copy link
Member

@lukastaegert lukastaegert left a comment

This looks really good. Tell me if you want to do the changes to the test as proposed, otherwise from my side this can be merged.

@@ -341,6 +342,8 @@ export default class Chunk {
if (this.renderedHash) return this.renderedHash;
if (!this.renderedSource) return '';
const hash = sha256();
const hashAugmentation = this.calculateHashAugmentation();
hash.update(hashAugmentation);
Copy link
Member

👍

@isidrok
Copy link
Contributor Author

isidrok commented Aug 13, 2019

This looks really good. Tell me if you want to do the changes to the test as proposed, otherwise from my side this can be merged.

Really nice test case, done!

@isidrok isidrok changed the title from [WIP] augmentChunkHash plugin hook to augmentChunkHash plugin hook on Aug 13, 2019
Development

Successfully merging this pull request may close these issues.

File hash is not updated when changing code in renderChunk
3 participants