...
 
@@ -91,9 +91,11 @@ On macOS you will need to install the `Xcode Command Line Tools` by running
 installed, you can find them under the menu `Xcode -> Open Developer Tool ->
 More Developer Tools...`. This step will install `clang`, `clang++`, and
 `make`.
-* You may want to setup [firewall rules](tools/macosx-firewall.sh)
+* After building, you may want to setup [firewall rules](tools/macosx-firewall.sh)
 to avoid popups asking to accept incoming network connections when running tests:
+If the path to your build directory contains a space, the build will likely fail.
 ```console
 $ sudo ./tools/macosx-firewall.sh
 ```
@@ -127,6 +129,28 @@ To run the tests:
 $ make test
 ```
+
+At this point you are ready to make code changes and re-run the tests!
+Optionally, continue below.
+
+To run the tests and generate code coverage reports:
+
+```console
+$ ./configure --coverage
+$ make coverage
+```
+
+This will generate coverage reports for both JavaScript and C++ tests (if you
+only want to run the JavaScript tests then you do not need to run the first
+command `./configure --coverage`).
+
+The `make coverage` command downloads some tools to the project root directory
+and overwrites the `lib/` directory. To clean up after generating the coverage
+reports:
+
+```console
+$ make coverage-clean
+```

 To build the documentation:

 This will build Node.js first (if necessary) and then use it to build the docs:
@@ -135,7 +159,7 @@ This will build Node.js first (if necessary) and then use it to build the docs:
 $ make doc
 ```

-If you have an existing Node.js you can build just the docs with:
+If you have an existing Node.js build, you can build just the docs with:

 ```console
 $ NODE=/path/to/node make doc-only
@@ -174,6 +198,8 @@ Prerequisites:
 [Git for Windows](http://git-scm.com/download/win) includes Git Bash
 and tools which can be included in the global `PATH`.
+If the path to your build directory contains a space, the build will likely fail.
+
 ```console
 > .\vcbuild nosign
 ```
......
@@ -26,7 +26,10 @@ release.
   </tr>
   <tr>
     <td valign="top">
-      <b><a href="doc/changelogs/CHANGELOG_V6.md#6.11.3">6.11.3</a></b><br/>
+      <b><a href="doc/changelogs/CHANGELOG_V6.md#6.12.0">6.12.0</a></b><br/>
+      <a href="doc/changelogs/CHANGELOG_V6.md#6.11.5">6.11.5</a><br/>
+      <a href="doc/changelogs/CHANGELOG_V6.md#6.11.4">6.11.4</a><br/>
+      <a href="doc/changelogs/CHANGELOG_V6.md#6.11.3">6.11.3</a><br/>
      <a href="doc/changelogs/CHANGELOG_V6.md#6.11.2">6.11.2</a><br/>
      <a href="doc/changelogs/CHANGELOG_V6.md#6.11.1">6.11.1</a><br/>
      <a href="doc/changelogs/CHANGELOG_V6.md#6.11.0">6.11.0</a><br/>
......
@@ -176,7 +176,8 @@ Running `make test`/`vcbuild test` will run the linter as well unless one or
 more tests fail.

 If you want to run the linter without running tests, use
-`make lint`/`vcbuild lint`.
+`make lint`/`vcbuild lint`. It will run both JavaScript linting and
+C++ linting.

 If you are updating tests and just want to run a single test to check it, you
 can use this syntax to run it exactly as the test harness would:
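(The exact command is collapsed in this view. The invocation below is the form commonly documented for the Node.js test harness of this era and should be read as an illustrative assumption, with a placeholder test name.)

```console
$ python tools/test.py -J --mode=release parallel/test-stream2-transform
```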
......
@@ -527,8 +527,8 @@ The externally maintained libraries used by Node.js are:
 - stdint-msvc2008.h (from msinttypes), copyright Alexander Chemeris. Three
   clause BSD license.

-- pthread-fixes.h, pthread-fixes.c, copyright Google Inc. and Sony Mobile
-  Communications AB. Three clause BSD license.
+- pthread-fixes.c, copyright Google Inc. and Sony Mobile Communications AB.
+  Three clause BSD license.

 - android-ifaddrs.h, android-ifaddrs.c, copyright Berkeley Software Design
   Inc, Kenneth MacKay and Emergya (Cloud4all, FP7/2007-2013, grant agreement
......
@@ -753,23 +753,27 @@ bench-idle:
   sleep 1
   $(NODE) benchmark/idle_clients.js &

-jslint:
+lint-js:
   @echo "Running JS linter..."
   $(NODE) tools/eslint/bin/eslint.js --cache --rulesdir=tools/eslint-rules --ext=.js,.md \
     benchmark doc lib test tools

-jslint-ci:
+jslint: lint-js
+
+lint-js-ci:
   @echo "Running JS linter..."
-  $(NODE) tools/jslint.js $(PARALLEL_ARGS) -f tap -o test-eslint.tap \
+  $(NODE) tools/lint-js.js $(PARALLEL_ARGS) -f tap -o test-eslint.tap \
     benchmark doc lib test tools

-CPPLINT_EXCLUDE ?=
-CPPLINT_EXCLUDE += src/node_root_certs.h
-CPPLINT_EXCLUDE += src/queue.h
-CPPLINT_EXCLUDE += src/tree.h
-CPPLINT_EXCLUDE += $(wildcard test/addons/??_*/*.cc test/addons/??_*/*.h)
+jslint-ci: lint-js-ci
+
+LINT_CPP_EXCLUDE ?=
+LINT_CPP_EXCLUDE += src/node_root_certs.h
+LINT_CPP_EXCLUDE += src/queue.h
+LINT_CPP_EXCLUDE += src/tree.h
+LINT_CPP_EXCLUDE += $(wildcard test/addons/??_*/*.cc test/addons/??_*/*.h)

-CPPLINT_FILES = $(filter-out $(CPPLINT_EXCLUDE), $(wildcard \
+LINT_CPP_FILES = $(filter-out $(LINT_CPP_EXCLUDE), $(wildcard \
   src/*.c \
   src/*.cc \
   src/*.h \
@@ -781,19 +785,21 @@ CPPLINT_FILES = $(filter-out $(CPPLINT_EXCLUDE), $(wildcard \
   tools/icu/*.h \
   ))

-cpplint:
+lint-cpp:
   @echo "Running C++ linter..."
-  @$(PYTHON) tools/cpplint.py $(CPPLINT_FILES)
+  @$(PYTHON) tools/cpplint.py $(LINT_CPP_FILES)
   @$(PYTHON) tools/check-imports.py

-ifneq ("","$(wildcard tools/eslint/bin/eslint.js)")
+cpplint: lint-cpp
+
+ifneq ("","$(wildcard tools/eslint/)")
 lint:
   @EXIT_STATUS=0 ; \
-  $(MAKE) jslint || EXIT_STATUS=$$? ; \
-  $(MAKE) cpplint || EXIT_STATUS=$$? ; \
+  $(MAKE) lint-js || EXIT_STATUS=$$? ; \
+  $(MAKE) lint-cpp || EXIT_STATUS=$$? ; \
   exit $$EXIT_STATUS

 CONFLICT_RE=^>>>>>>> [0-9A-Fa-f]+|^<<<<<<< [A-Za-z]+
-lint-ci: jslint-ci cpplint
+lint-ci: lint-js-ci lint-cpp
   @if ! ( grep -IEqrs "$(CONFLICT_RE)" benchmark deps doc lib src test tools ) \
     && ! ( find . -maxdepth 1 -type f | xargs grep -IEqs "$(CONFLICT_RE)" ); then \
     exit 0 ; \
@@ -807,18 +813,17 @@ lint:
   @echo "Linting is not available through the source tarball."
   @echo "Use the git repo instead:" \
     "$ git clone https://github.com/nodejs/node.git"
-  exit 1

 lint-ci: lint
 endif

-.PHONY: lint cpplint jslint bench clean docopen docclean doc dist distclean \
+.PHONY: lint lint-cpp lint-js bench clean docopen docclean doc dist distclean \
   check uninstall install install-includes install-bin all staticlib \
   dynamiclib test test-all test-addons test-addons-clean build-addons \
   website-upload pkg blog blogclean tar binary release-only \
   bench-http-simple bench-idle bench-all bench bench-misc bench-array \
   bench-buffer bench-net bench-http bench-fs bench-tls cctest run-ci \
   test-v8 test-v8-intl test-v8-benchmarks test-v8-all v8 lint-ci \
-  bench-ci jslint-ci doc-only $(TARBALL)-headers test-ci test-ci-native \
+  bench-ci lint-js-ci doc-only $(TARBALL)-headers test-ci test-ci-native \
   test-ci-js build-ci test-hash-seed clear-stalled
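For reference, with these renames in place the lint targets would be invoked as below (a usage sketch of the renamed targets; the legacy `jslint`/`cpplint` names remain as aliases):

```console
$ make lint-js    # JavaScript linting only
$ make lint-cpp   # C++ linting only
$ make lint       # both, as referenced from CONTRIBUTING.md
```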
 'use strict';
-var common = require('../common.js');
+const common = require('../common.js');

-var types = [
-  'Array',
-  'Buffer',
-  'Int8Array',
-  'Uint8Array',
-  'Int16Array',
-  'Uint16Array',
-  'Int32Array',
-  'Uint32Array',
-  'Float32Array',
-  'Float64Array'
-];
-
-var bench = common.createBenchmark(main, {
-  type: types,
+const bench = common.createBenchmark(main, {
+  type: [
+    'Array',
+    'Buffer',
+    'Int8Array',
+    'Uint8Array',
+    'Int16Array',
+    'Uint16Array',
+    'Int32Array',
+    'Uint32Array',
+    'Float32Array',
+    'Float64Array'
+  ],
   n: [25]
 });

 function main(conf) {
-  var type = conf.type;
-  var clazz = global[type];
-  var n = +conf.n;
+  const type = conf.type;
+  const clazz = global[type];
+  const n = +conf.n;

   bench.start();
   var arr = new clazz(n * 1e6);
......
 'use strict';
-var common = require('../common.js');
+const common = require('../common.js');

-var types = [
-  'Array',
-  'Buffer',
-  'Int8Array',
-  'Uint8Array',
-  'Int16Array',
-  'Uint16Array',
-  'Int32Array',
-  'Uint32Array',
-  'Float32Array',
-  'Float64Array'
-];
-
-var bench = common.createBenchmark(main, {
-  type: types,
+const bench = common.createBenchmark(main, {
+  type: [
+    'Array',
+    'Buffer',
+    'Int8Array',
+    'Uint8Array',
+    'Int16Array',
+    'Uint16Array',
+    'Int32Array',
+    'Uint32Array',
+    'Float32Array',
+    'Float64Array'
+  ],
   n: [25]
 });

 function main(conf) {
-  var type = conf.type;
-  var clazz = global[type];
-  var n = +conf.n;
+  const type = conf.type;
+  const clazz = global[type];
+  const n = +conf.n;

   bench.start();
   var arr = new clazz(n * 1e6);
......
 'use strict';
-var common = require('../common.js');
+const common = require('../common.js');

-var types = [
-  'Array',
-  'Buffer',
-  'Int8Array',
-  'Uint8Array',
-  'Int16Array',
-  'Uint16Array',
-  'Int32Array',
-  'Uint32Array',
-  'Float32Array',
-  'Float64Array'
-];
-
-var bench = common.createBenchmark(main, {
-  type: types,
+const bench = common.createBenchmark(main, {
+  type: [
+    'Array',
+    'Buffer',
+    'Int8Array',
+    'Uint8Array',
+    'Int16Array',
+    'Uint16Array',
+    'Int32Array',
+    'Uint32Array',
+    'Float32Array',
+    'Float64Array'
+  ],
   n: [25]
 });

 function main(conf) {
-  var type = conf.type;
-  var clazz = global[type];
-  var n = +conf.n;
+  const type = conf.type;
+  const clazz = global[type];
+  const n = +conf.n;

   bench.start();
   var arr = new clazz(n * 1e6);
......
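The three array benchmarks above now carry their parameter matrix directly in `createBenchmark`. They can still be run standalone with `key=value` overrides parsed by `benchmark/common.js`; the file name below is illustrative only, since this view does not show which files the hunks belong to:

```console
$ node benchmark/arrays/zero-int.js type=Float64Array n=25
```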
@@ -18,7 +18,7 @@ keylen_list.forEach(function(key) {
 var bench = common.createBenchmark(main, {
   writes: [500],
-  algo: ['RSA-SHA1', 'RSA-SHA224', 'RSA-SHA256', 'RSA-SHA384', 'RSA-SHA512'],
+  algo: ['SHA1', 'SHA224', 'SHA256', 'SHA384', 'SHA512'],
   keylen: keylen_list,
   len: [1024, 102400, 2 * 102400, 3 * 102400, 1024 * 1024]
 });
......
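The benchmark switches from the legacy `RSA-SHA*` aliases to plain digest names. A minimal sketch of the distinction, with `pem` standing in for an RSA private key that this excerpt does not show:

```js
'use strict';
const crypto = require('crypto');

// Both spellings resolve to the same digest; the plain name is the portable one.
const legacy = crypto.createSign('RSA-SHA256');
const plain = crypto.createSign('SHA256');
legacy.update('hello');
plain.update('hello');
// With an RSA private key in `pem`, either object would produce the same signature:
//   plain.sign(pem, 'hex')
console.log('both Sign objects created');
```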
@@ -10,6 +10,7 @@ const configs = {
 };

 const bench = common.createBenchmark(main, configs);
+const noop = () => {};

 function main(conf) {
   const n = +conf.n;
@@ -19,19 +20,27 @@ function main(conf) {
   if (port !== undefined && address !== undefined) {
     bench.start();
     for (let i = 0; i < n; i++) {
-      dgram.createSocket('udp4').bind(port, address).unref();
+      dgram.createSocket('udp4').bind(port, address)
+        .on('error', noop)
+        .unref();
     }
     bench.end(n);
   } else if (port !== undefined) {
     bench.start();
     for (let i = 0; i < n; i++) {
-      dgram.createSocket('udp4').bind(port).unref();
+      dgram.createSocket('udp4')
+        .bind(port)
+        .on('error', noop)
+        .unref();
     }
     bench.end(n);
   } else if (port === undefined && address === undefined) {
     bench.start();
     for (let i = 0; i < n; i++) {
-      dgram.createSocket('udp4').bind().unref();
+      dgram.createSocket('udp4')
+        .bind()
+        .on('error', noop)
+        .unref();
     }
     bench.end(n);
   }
......
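The added `noop` 'error' listener keeps a failed `bind()` (for example `EADDRINUSE` when the loop reuses a port) from crashing the benchmark with an unhandled 'error' event. The same defensive pattern in isolation, as a small sketch:

```js
'use strict';
const dgram = require('dgram');

const socket = dgram.createSocket('udp4');
// Without an 'error' listener, a bind failure would throw and end the process.
socket.on('error', (err) => console.error('bind failed:', err.code));
socket.bind(0, () => {
  console.log('bound to port', socket.address().port);
  socket.close();
});
```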
@@ -27,7 +27,7 @@ const bench = common.createBenchmark(main, {
     'foo\nbar',
     '\x7F'
   ],
-  n: [5e8],
+  n: [1e6],
 });

 function main(conf) {
......
@@ -37,7 +37,7 @@ const bench = common.createBenchmark(main, {
     ':alternate-protocol', // fast bailout
     'alternate-protocol:' // slow bailout
   ],
-  n: [5e8],
+  n: [1e6],
 });

 function main(conf) {
......
-#!/usr/bin/env python
+#!/bin/sh
+
+# Locate python2 interpreter and re-execute the script. Note that the
+# mix of single and double quotes is intentional, as is the fact that
+# the ] goes on a new line.
+_=[ 'exec' '/bin/sh' '-c' '''
+which python2.7 >/dev/null && exec python2.7 "$0" "$@"
+which python2 >/dev/null && exec python2 "$0" "$@"
+exec python "$0" "$@"
+''' "$0" "$@"
+]
+del _

 import sys
 if sys.version_info[0] != 2 or sys.version_info[1] not in (6, 7):
@@ -433,6 +444,11 @@ parser.add_option('--without-ssl',
     dest='without_ssl',
     help='build without SSL (disables crypto, https, inspector, etc.)')

+parser.add_option('--without-node-options',
+    action='store_true',
+    dest='without_node_options',
+    help='build without NODE_OPTIONS support')
+
 parser.add_option('--xcode',
     action='store_true',
     dest='use_xcode',
@@ -964,6 +980,9 @@ def configure_openssl(o):
   o['variables']['openssl_no_asm'] = 1 if options.openssl_no_asm else 0
   if options.use_openssl_ca_store:
     o['defines'] += ['NODE_OPENSSL_CERT_STORE']
+  o['variables']['node_without_node_options'] = b(options.without_node_options)
+  if options.without_node_options:
+    o['defines'] += ['NODE_WITHOUT_NODE_OPTIONS']
   if options.openssl_fips:
     o['variables']['openssl_fips'] = options.openssl_fips
     fips_dir = os.path.join(root_dir, 'deps', 'openssl', 'fips')
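With the new switch in place, disabling NODE_OPTIONS support at build time would look like this (a usage sketch; everything other than `--without-node-options` is just the ordinary build sequence shown earlier):

```console
$ ./configure --without-node-options
$ make -j4
```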
......
+nodejs (6.12.0~dfsg-2~bpo9+1) stretch-backports; urgency=medium
+
+  * Rebuild for stretch-backports.
+
+ -- Pirate Praveen <praveen@debian.org>  Wed, 02 May 2018 15:27:18 +0530
+
+nodejs (6.12.0~dfsg-2) unstable; urgency=medium
+
+  * Whitelist allowed architectures. Closes: #881735.
+
+ -- Jérémy Lal <kapouer@melix.org>  Tue, 14 Nov 2017 18:25:41 +0100
+
+nodejs (6.12.0~dfsg-1) unstable; urgency=medium
+
+  * New upstream version 6.12.0~dfsg
+
+ -- Jérémy Lal <kapouer@melix.org>  Tue, 14 Nov 2017 00:42:34 +0100
+
+nodejs (6.11.4~dfsg-1) unstable; urgency=medium
+
+  * New upstream version 6.11.4~dfsg
+  * Testsuite field is no longer needed
+  * Standards-Version 4.1.1
+
+ -- Jérémy Lal <kapouer@melix.org>  Wed, 04 Oct 2017 13:33:00 +0200
+
 nodejs (6.11.3~dfsg-1) unstable; urgency=medium

   * New upstream version 6.11.3~dfsg
......
 Source: nodejs
-Section: web
+Section: javascript
 Priority: optional
 Maintainer: Debian Javascript Maintainers <pkg-javascript-devel@lists.alioth.debian.org>
 Uploaders: Jérémy Lal <kapouer@melix.org>,
@@ -25,16 +25,15 @@ Build-Depends: cdbs,
  libuv1-dev
 Build-Depends-Indep: node-yamlish,
  node-marked
-Standards-Version: 4.0.0
+Standards-Version: 4.1.1
 Homepage: http://nodejs.org/
 Vcs-Browser: https://anonscm.debian.org/gitweb/?p=collab-maint/nodejs.git
 Vcs-Git: https://anonscm.debian.org/git/collab-maint/nodejs.git
-Testsuite: autopkgtest

 Package: nodejs-dev
 Section: devel
-Priority: extra
-Architecture: any
+Priority: optional
+Architecture: amd64 arm64 armhf i386 kfreebsd-i386 kfreebsd-amd64 mips mips64el mipsel ppc64 ppc64el s390x
 Depends: ${cdbs:Depends},
  ${misc:Depends},
  nodejs (= ${binary:Version})
@@ -49,7 +48,7 @@ Description: evented I/O for V8 javascript (development files)
 This package is needed to build plugins.

 Package: nodejs
-Architecture: any
+Architecture: amd64 arm64 armhf i386 kfreebsd-i386 kfreebsd-amd64 mips mips64el mipsel ppc64 ppc64el s390x
 Depends: ${shlibs:Depends},
  ${misc:Depends}
 Provides: ${cdbs:Provides}
......
@@ -7,16 +7,15 @@ Uploaders: Jérémy Lal <kapouer@melix.org>,
 Build-Depends: @cdbs@
 Build-Depends-Indep: node-yamlish,
  node-marked
-Standards-Version: 4.0.0
+Standards-Version: 4.1.1
 Homepage: http://nodejs.org/
 Vcs-Browser: https://anonscm.debian.org/gitweb/?p=collab-maint/nodejs.git
 Vcs-Git: https://anonscm.debian.org/git/collab-maint/nodejs.git
-Testsuite: autopkgtest

 Package: nodejs-dev
 Section: devel
-Priority: extra
-Architecture: any
+Priority: optional
+Architecture: amd64 arm64 armhf i386 kfreebsd-i386 kfreebsd-amd64 mips mips64el mipsel ppc64 ppc64el s390x
 Depends: ${cdbs:Depends},
  ${misc:Depends},
  nodejs (= ${binary:Version})
@@ -31,7 +30,7 @@ Description: evented I/O for V8 javascript (development files)
 This package is needed to build plugins.

 Package: nodejs
-Architecture: any
+Architecture: amd64 arm64 armhf i386 kfreebsd-i386 kfreebsd-amd64 mips mips64el mipsel ppc64 ppc64el s390x
 Depends: ${shlibs:Depends},
  ${misc:Depends}
 Provides: ${cdbs:Provides}
......
@@ -24,7 +24,7 @@ Forwarded: not-needed
      if (added) {
 --- a/tools/doc/html.js
 +++ b/tools/doc/html.js
-@@ -383,10 +383,7 @@
+@@ -440,10 +440,7 @@
    }
    if (tok.type !== 'heading') return;
......
Description: pass tests with openssl 1.1 cli
Use more robust, supported ciphers
Origin: https://github.com/nodejs/node/pull/8491
Last-Update: 2016-12-19
--- a/test/parallel/test-tls-set-ciphers.js
+++ b/test/parallel/test-tls-set-ciphers.js
@@ -15,7 +15,7 @@
const options = {
key: fs.readFileSync(`${common.fixturesDir}/keys/agent2-key.pem`),
cert: fs.readFileSync(`${common.fixturesDir}/keys/agent2-cert.pem`),
- ciphers: 'DES-CBC3-SHA'
+ ciphers: 'AES256-SHA'
};
const reply = 'I AM THE WALRUS'; // something recognizable
--- a/test/parallel/test-tls-ecdh-disable.js
+++ b/test/parallel/test-tls-ecdh-disable.js
@@ -14,7 +14,7 @@
const options = {
key: fs.readFileSync(`${common.fixturesDir}/keys/agent2-key.pem`),
cert: fs.readFileSync(`${common.fixturesDir}/keys/agent2-cert.pem`),
- ciphers: 'ECDHE-RSA-RC4-SHA',
+ ciphers: 'ECDHE-RSA-AES128-SHA',
ecdhCurve: false
};
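The replacement cipher names can be sanity-checked against the OpenSSL the tests will actually use (a quick check, assuming the `openssl` command-line tool is installed):

```console
$ openssl ciphers -v 'AES256-SHA:ECDHE-RSA-AES128-SHA'
```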
@@ -16,11 +16,11 @@ Author: Jérémy Lal <kapouer@melix.org>
  [$system==linux]
 --- a/test/parallel/test-tls-session-cache.js
 +++ b/test/parallel/test-tls-session-cache.js
-@@ -70,7 +70,6 @@
+@@ -66,7 +66,6 @@
    server.listen(0, function() {
      const args = [
        's_client',
 -      '-tls1',
        '-connect', `localhost:${this.address().port}`,
        '-servername', 'ohgod',
-       '-key', join(common.fixturesDir, 'agent.key'),
+       '-key', fixtures.path('agent.key'),
 openssl/s_client_1.1.patch
 openssl/s_client_tls12.patch
 use_system_node_gyp.patch
 privacy_breach.patch
......
@@ -158,7 +158,7 @@ Author: Jérémy Lal <kapouer@melix.org>
 +// }
 --- a/test/common/index.js
 +++ b/test/common/index.js
-@@ -282,24 +282,7 @@
+@@ -276,24 +276,7 @@
  };
  exports.platformTimeout = function(ms) {
......
@@ -11,7 +11,7 @@
 #define V8_MAJOR_VERSION 5
 #define V8_MINOR_VERSION 1
 #define V8_BUILD_NUMBER 281
-#define V8_PATCH_LEVEL 107
+#define V8_PATCH_LEVEL 108

 // Use 1 for candidates and 0 otherwise.
 // (Boolean macro values are not supported by all preprocessors.)
......
@@ -795,14 +795,18 @@ namespace {
  */
 class ArrayConcatVisitor {
  public:
-  ArrayConcatVisitor(Isolate* isolate, Handle<Object> storage,
+  ArrayConcatVisitor(Isolate* isolate, Handle<HeapObject> storage,
                      bool fast_elements)
       : isolate_(isolate),
         storage_(isolate->global_handles()->Create(*storage)),
         index_offset_(0u),
-        bit_field_(FastElementsField::encode(fast_elements) |
-                   ExceedsLimitField::encode(false) |
-                   IsFixedArrayField::encode(storage->IsFixedArray())) {
+        bit_field_(
+            FastElementsField::encode(fast_elements) |
+            ExceedsLimitField::encode(false) |
+            IsFixedArrayField::encode(storage->IsFixedArray()) |
+            HasSimpleElementsField::encode(storage->IsFixedArray() ||
+                                           storage->map()->instance_type() >
+                                               LAST_CUSTOM_ELEMENTS_RECEIVER)) {
     DCHECK(!(this->fast_elements() && !is_fixed_array()));
   }
@@ -891,12 +895,16 @@ class ArrayConcatVisitor {
   // (otherwise)
   Handle<FixedArray> storage_fixed_array() {
     DCHECK(is_fixed_array());
+    DCHECK(has_simple_elements());
     return Handle<FixedArray>::cast(storage_);
   }
   Handle<JSReceiver> storage_jsreceiver() {
     DCHECK(!is_fixed_array());
     return Handle<JSReceiver>::cast(storage_);
   }
+  bool has_simple_elements() const {
+    return HasSimpleElementsField::decode(bit_field_);
+  }

  private:
   // Convert storage to dictionary mode.
@@ -929,12 +937,14 @@ class ArrayConcatVisitor {
   inline void set_storage(FixedArray* storage) {
     DCHECK(is_fixed_array());
+    DCHECK(has_simple_elements());
     storage_ = isolate_->global_handles()->Create(storage);
   }

   class FastElementsField : public BitField<bool, 0, 1> {};
   class ExceedsLimitField : public BitField<bool, 1, 1> {};
   class IsFixedArrayField : public BitField<bool, 2, 1> {};
+  class HasSimpleElementsField : public BitField<bool, 3, 1> {};

   bool fast_elements() const { return FastElementsField::decode(bit_field_); }
   void set_fast_elements(bool fast) {
@@ -1166,8 +1176,6 @@ bool IterateElementsSlow(Isolate* isolate, Handle<JSReceiver> receiver,
   visitor->increase_index_offset(length);
   return true;
 }
-
-
 /**
  * A helper function that visits "array" elements of a JSReceiver in numerical
  * order.
@@ -1201,7 +1209,8 @@ bool IterateElements(Isolate* isolate, Handle<JSReceiver> receiver,
     return IterateElementsSlow(isolate, receiver, length, visitor);
   }

-  if (!HasOnlySimpleElements(isolate, *receiver)) {
+  if (!HasOnlySimpleElements(isolate, *receiver) ||
+      !visitor->has_simple_elements()) {
     return IterateElementsSlow(isolate, receiver, length, visitor);
   }
   Handle<JSObject> array = Handle<JSObject>::cast(receiver);
@@ -1476,7 +1485,7 @@ Object* Slow_ArrayConcat(Arguments* args, Handle<Object> species,
       // In case of failure, fall through.
     }
-    Handle<Object> storage;
+    Handle<HeapObject> storage;
     if (fast_case) {
       // The backing storage array must have non-existing elements to preserve
       // holes across concat operations.
@@ -1494,7 +1503,7 @@ Object* Slow_ArrayConcat(Arguments* args, Handle<Object> species,
     ASSIGN_RETURN_FAILURE_ON_EXCEPTION(
         isolate, storage_object,
         Execution::New(isolate, species, species, 1, &length));
-    storage = storage_object;
+    storage = Handle<HeapObject>::cast(storage_object);
   }
   ArrayConcatVisitor visitor(isolate, storage, fast_case);
......
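On the JavaScript side, the situation the new `HasSimpleElementsField` guard covers arises roughly when `Array.prototype.concat` builds its result through `Symbol.species` and the resulting storage is not a plain array. A hedged illustration of that shape (not a reproduction of the original bug, just the kind of input that must take the generic slow path):

```js
'use strict';
class OddArray extends Array {
  // concat() consults Symbol.species to create its result storage.
  static get [Symbol.species]() {
    return function StorageWithExoticElements() {
      return new Proxy({}, {}); // storage whose elements are not "simple"
    };
  }
}
console.log(new OddArray(1, 2, 3).concat([4, 5]));
```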
@@ -842,6 +842,12 @@ bool ScopeIterator::CopyContextExtensionToScopeObject(
 void ScopeIterator::GetNestedScopeChain(Isolate* isolate, Scope* scope,
                                         int position) {
+  if (scope->is_function_scope()) {
+    // Do not collect scopes of nested inner functions inside the current one.
+    Handle<JSFunction> function =
+        Handle<JSFunction>::cast(frame_inspector_->GetFunction());
+    if (scope->end_position() < function->shared()->end_position()) return;
+  }
   if (!scope->is_eval_scope()) {
     nested_scope_chain_.Add(ExtendedScopeInfo(scope->GetScopeInfo(isolate),
                                               scope->start_position(),
......
@@ -146,8 +146,11 @@ class SamplingAllocationObserver : public AllocationObserver {
   void Step(int bytes_allocated, Address soon_object, size_t size) override {
     USE(heap_);
     DCHECK(heap_->gc_state() == Heap::NOT_IN_GC);
-    DCHECK(soon_object);
-    profiler_->SampleObject(soon_object, size);
+    if (soon_object) {
+      // TODO(ofrobots): it would be better to sample the next object rather
+      // than skipping this sample epoch if soon_object happens to be null.
+      profiler_->SampleObject(soon_object, size);
+    }
   }

   intptr_t GetNextStepSize() override { return GetNextSampleInterval(rate_); }
......
@@ -1131,7 +1131,7 @@ RegExpEngine::CompilationResult RegExpCompiler::Assemble(
   Handle<HeapObject> code = macro_assembler_->GetCode(pattern);
   heap->IncreaseTotalRegexpCodeGenerated(code->Size());
   work_list_ = NULL;
-#ifdef ENABLE_DISASSEMBLER
+#if defined(ENABLE_DISASSEMBLER) && !defined(V8_INTERPRETED_REGEXP)
   if (FLAG_print_code) {
     CodeTracer::Scope trace_scope(heap->isolate()->GetCodeTracer());
     OFStream os(trace_scope.file());
......
@@ -31,7 +31,7 @@ enum CategoryGroupEnabledFlags {
   kEnabledForETWExport_CategoryGroupEnabledFlags = 1 << 3,
 };