| CVE ID | Published | Description | Score | Severity |
| --- | --- | --- | --- | --- |
| | | In Apache Hadoop, the unTar function uses the unTarUsingJava function on Windows and the built-in tar utility on Unix and other OSes. As a result, a TAR entry may create a symlink under the expected extraction directory that points to an external directory, and a subsequent TAR entry may then extract an arbitrary file into that directory via the symlink name. On Unix this is caught by the targetDirPath check thanks to the getCanonicalPath call, but on Windows getCanonicalPath does not resolve symbolic links, so the check is bypassed and unpackEntries follows symbolic links during TAR extraction, allowing writes outside the expected base directory. Addressed in Apache Hadoop 3.2.3. A containment-check sketch follows the table. | 9.8 | Critical |
| | | An information disclosure vulnerability in the short-circuit reads feature of HDFS in Apache Hadoop before 2.6.4 and 2.7.x before 2.7.2. A local user on an HDFS DataNode may be able to craft a block token that grants unauthorized read access to random files by guessing certain fields in the token. | 5.5 | Medium |
| | | The HDFS web UI in Apache Hadoop before 2.7.0 is vulnerable to a cross-site scripting (XSS) attack through an unescaped query parameter (see the output-escaping sketch below). | 6.1 | Medium |
| | | HDFS clients interact with a servlet on the DataNode to browse the HDFS namespace. In Apache Hadoop before 2.7.0, the NameNode is provided as a query parameter that is not validated (see the parameter-validation sketch below). | 7.3 | High |
| | | The RPC protocol implementation in Apache Hadoop 2.x before 2.0.6-alpha, 0.23.x before 0.23.9, and 1.x before 1.2.1, when the Kerberos security features are enabled, allows man-in-the-middle attackers to disable bidirectional authentication and obtain sensitive information by forcing a downgrade to simple authentication (see the client-configuration sketch below). | 3.2 | Low |
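
The first entry describes a classic archive-extraction escape: one TAR entry plants a symlink inside the extraction directory, and a later entry writes through it to an external location; the getCanonicalPath-based containment check misses this on Windows because symlinks are not resolved there. As a rough illustration of the kind of containment check involved (a minimal sketch, not Hadoop's actual unpackEntries code; the class and method names are made up), one can normalize each entry's target and additionally resolve the nearest existing ancestor with java.nio's toRealPath(), which does follow symlinks:

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class SafeUntarCheck {

    /**
     * Rejects a TAR entry whose resolved target escapes the extraction directory.
     * The description above says a getCanonicalPath()-only check was bypassed on
     * Windows, so this sketch also resolves the already-extracted parent directory
     * with toRealPath(), which follows symlinks on all platforms.
     */
    static void checkEntryTarget(File baseDir, String entryName) throws IOException {
        Path base = baseDir.toPath().toRealPath();          // canonical, symlink-free base
        Path target = base.resolve(entryName).normalize();  // strip ../ components

        if (!target.startsWith(base)) {
            throw new IOException("TAR entry escapes extraction dir: " + entryName);
        }

        // Resolve the nearest existing ancestor; if a previous entry planted a
        // symlink there, toRealPath() exposes the real location it points to.
        Path existing = target;
        while (existing != null && !Files.exists(existing)) {
            existing = existing.getParent();
        }
        if (existing != null && !existing.toRealPath().startsWith(base)) {
            throw new IOException("TAR entry resolves outside extraction dir: " + entryName);
        }
    }
}
```

A caller would invoke checkEntryTarget(extractDir, entry.getName()) before writing each entry, so a symlink created by an earlier entry cannot redirect a later write outside extractDir.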
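The XSS entry comes down to a query parameter being echoed into HTML without escaping. Below is a minimal sketch of the vulnerable pattern and its fix, using a hypothetical servlet and parameter name rather than the actual HDFS web UI code:

```java
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class BrowseServlet extends HttpServlet {

    /** Minimal HTML escaping of the characters that enable script injection. */
    static String escapeHtml(String s) {
        StringBuilder out = new StringBuilder(s.length());
        for (char c : s.toCharArray()) {
            switch (c) {
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '&':  out.append("&amp;");  break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#39;");  break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String dir = req.getParameter("dir");  // attacker-controlled query parameter (illustrative name)
        resp.setContentType("text/html;charset=UTF-8");
        // Echoing `dir` verbatim is the XSS pattern; escape it before rendering.
        resp.getWriter().println("<h1>Listing " + escapeHtml(dir == null ? "/" : dir) + "</h1>");
    }
}
```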
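For the unvalidated NameNode query parameter, the generic remedy is to accept only values that match an expected host:port shape, ideally checked against a configured allow-list. A sketch under those assumptions (the class, method, and regex are illustrative, not the actual fix shipped in Hadoop 2.7.0):

```java
import java.util.Set;
import java.util.regex.Pattern;

public class NameNodeParamCheck {

    // Hostname-like token followed by a numeric port; illustrative allow-list shape only.
    private static final Pattern HOST_PORT =
        Pattern.compile("^[A-Za-z0-9][A-Za-z0-9.-]{0,253}:(\\d{1,5})$");

    /** Accepts only values that look like host:port and, if configured, appear in the allow-list. */
    static boolean isAcceptableNameNodeAddr(String value, Set<String> allowList) {
        if (value == null || !HOST_PORT.matcher(value).matches()) {
            return false;
        }
        int port = Integer.parseInt(value.substring(value.lastIndexOf(':') + 1));
        return port > 0 && port <= 65535 && (allowList.isEmpty() || allowList.contains(value));
    }
}
```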
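The last entry is a protocol downgrade: a man-in-the-middle strips Kerberos from the negotiation and the client silently continues with simple authentication. Later Hadoop releases expose client configuration that requires Kerberos and refuses fallback to simple auth; the sketch below assumes those properties apply to the deployment in question, and the principal and keytab path are placeholders:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class SecureClientSetup {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Require Kerberos on the client and refuse negotiation down to simple auth.
        conf.set("hadoop.security.authentication", "kerberos");
        conf.setBoolean("ipc.client.fallback-to-simple-auth-allowed", false);

        UserGroupInformation.setConfiguration(conf);
        // Placeholder principal and keytab path; substitute real values.
        UserGroupInformation.loginUserFromKeytab(
            "hdfs/client.example.com@EXAMPLE.COM",
            "/etc/security/keytabs/hdfs.keytab");
    }
}
```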