Only partial data sent to process.stdout when output reaches a certain size #1823

Closed
baseten opened this issue Dec 7, 2016 · 5 comments

@baseten

baseten commented Dec 7, 2016

  • NPM version: 3.10.8
  • Node version: v6.9.1
  • Node Process:
{ http_parser: '2.7.0',
  node: '6.9.1',
  v8: '5.1.281.84',
  uv: '1.9.1',
  zlib: '1.2.8',
  ares: '1.10.1-DEV',
  icu: '57.1',
  modules: '48',
  openssl: '1.0.2j' }
  • Node Platform: darwin
  • Node architecture: x64
  • node-sass version:
node-sass	3.13.0	(Wrapper)	[JavaScript]
libsass  	3.3.6	(Sass Compiler)	[C/C++]
  • npm node-sass versions: node-sass@3.13.0

When piping output to another script, if the compiled CSS is over a certain length, only the first chunk appears to be sent. Running from the command line, everything looks fine:

./node_modules/node-sass/bin/node-sass sass/app.scss

But when piping to another node script, the output is cut off:

./node_modules/node-sass/bin/node-sass sass/app.scss | scripts/test.js

Where scripts/test.js is the following:

#!/usr/bin/env node

process.stdin.on( 'readable', () => {
    var chunk = process.stdin.read();

    if ( chunk !== null ) {
        console.log( chunk.toString() ); // logs incomplete css
    }
} );

process.stdin.on( 'end', () => {
    console.log( 'end' );
} );
@baseten
Author

baseten commented Dec 7, 2016

I'm not sure if this is of any use, but Buffer.byteLength( chunk ) returns 65536 for my particular project. Not sure if this is related to the size of a buffer chunk or not?
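65536 bytes is 64 KiB, which matches a typical pipe/stream buffer size, so the first read() returning exactly that much is not conclusive by itself. As a minimal diagnostic sketch (hypothetical, not part of this project), accumulating every chunk and reporting the total at 'end' shows whether any data beyond the first 64 KiB ever arrives:

#!/usr/bin/env node

// Hypothetical diagnostic: collect every chunk instead of logging each one,
// then report the total byte count once stdin ends.
var chunks = [];

process.stdin.on( 'readable', () => {
    var chunk;
    while ( ( chunk = process.stdin.read() ) !== null ) {
        chunks.push( chunk );
    }
} );

process.stdin.on( 'end', () => {
    var data = Buffer.concat( chunks );
    console.log( 'total bytes received: ' + data.length );
} );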

@xzyfer
Contributor

xzyfer commented Dec 8, 2016 via email

@saper saper self-assigned this Dec 9, 2016
@marcusdarmstrong

marcusdarmstrong commented Dec 12, 2016

I ran into this issue a few days ago and downgraded to 3.8.0 to eliminate the blockage. It looked like it was related to a Node.js change to no longer flush output buffers on exit unless explicitly told to.

I was also able to work around this by inserting this snippet into the package as a local mod:

[process.stdout, process.stderr].forEach(function(stream) {
    if (stream._handle && stream._handle.setBlocking) {
        stream._handle.setBlocking(true);
    }
});
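Presumably this works because setBlocking(true) makes writes on the underlying libuv handle synchronous, so stdout/stderr are fully written before the process can exit. Note that stream._handle is an undocumented internal, so this is strictly a local patch rather than a supported API.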

@xzyfer
Contributor

xzyfer commented Dec 13, 2016

There were two separate issues in play. Firstly, console.log has a maximum buffer length that could be exceeded by large CSS output; secondly, we were exiting before fully flushing the output buffer.

This has been fixed in #1834.
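As a rough illustration of the second part (a generic sketch of the pattern only, not the actual patch in #1834), the idea is to pass a callback to process.stdout.write() and exit only once that chunk has been flushed:

// Generic sketch; writeAndExit is a hypothetical helper, not node-sass code.
function writeAndExit( css, code ) {
    process.stdout.write( css, function() {
        // The callback fires once the chunk has been handed off to the OS,
        // so exiting here no longer truncates piped output.
        process.exit( code );
    } );
}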

@arunstan

arunstan commented Aug 21, 2017

For anyone who is looking, this was shipped in v4.1.0.
