Apache Software Foundation Hadoop 2.0.0

CPE Details

Apache Software Foundation Hadoop 2.0.0
2.0.0
2019-10-10 13h28 +00:00
2019-10-10 13h28 +00:00

CPE Name: cpe:2.3:a:apache:hadoop:2.0.0:-:*:*:*:*:*:*

Information

Vendor

apache

Product

hadoop

Version

2.0.0

Update

-

Related CVE


CVE ID | Published | Score (Severity) | Description
CVE-2022-25168 | Published: 2022-08-04 12h30 +00:00 | Score: 9.8 (Critical)
Apache Hadoop's FileUtil.unTar(File, File) API does not escape the input file name before passing it to the shell, so an attacker can inject arbitrary commands. In Hadoop 3.3 this code path is only used by InMemoryAliasMap.completeBootstrapTransfer, which is only ever run by a local user. In Hadoop 2.x it has been used for YARN localization, where it does enable remote code execution. It is also used in Apache Spark via the SQL command ADD ARCHIVE; since ADD ARCHIVE adds new binaries to the classpath, being able to execute shell scripts does not confer new permissions on the caller. SPARK-38305 ("Check existence of file before untarring/zipping"), included in 3.3.0, 3.1.4, and 3.2.2, prevents shell commands from being executed regardless of which version of the Hadoop libraries is in use. Users should upgrade to Apache Hadoop 2.10.2, 3.2.4, 3.3.3 or later (including HADOOP-18136).
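
The root cause is a classic shell command injection pattern: a file name is concatenated into a shell command string instead of being passed as a discrete argument. As a minimal, hypothetical Java sketch of the safer pattern (not Hadoop's actual fix; the class and method names below are invented for illustration), the external tar utility can be invoked through ProcessBuilder with an argument array, so the file name is never interpreted by a shell:

import java.io.File;
import java.io.IOException;

public class SafeUntarSketch {
    // Illustrative only: each argument is a separate array element and no shell
    // is involved, so a name like "foo.tar; rm -rf /" is treated as plain data.
    public static void unTar(File archive, File targetDir) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "tar", "-xf", archive.getAbsolutePath(),
                "-C", targetDir.getAbsolutePath());
        pb.redirectErrorStream(true);
        Process p = pb.start();
        if (p.waitFor() != 0) {
            throw new IOException("tar exited with non-zero status " + p.exitValue());
        }
    }
}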
CVE-2022-26612 | Published: 2022-04-07 16h20 +00:00 | Score: 9.8 (Critical)
In Apache Hadoop, the unTar function uses the unTarUsingJava function on Windows and the built-in tar utility on Unix and other OSes. As a result, a TAR entry may create a symlink under the expected extraction directory which points to an external directory, and a subsequent TAR entry may then extract an arbitrary file into that external directory using the symlink name. On Unix this is caught by the targetDirPath check because of the getCanonicalPath call, but on Windows getCanonicalPath does not resolve symbolic links, which bypasses the check. unpackEntries therefore follows symbolic links during TAR extraction, allowing writes outside the expected base directory on Windows. This was addressed in Apache Hadoop 3.2.3.
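
As a hedged illustration of the class of defense involved (not Hadoop's actual patch; the helper below is hypothetical), an extractor can resolve each entry against the extraction root with java.nio.file APIs, which resolve symbolic links on Windows as well, unlike File.getCanonicalPath:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TarEntryGuardSketch {
    // Illustrative only: rejects an entry whose normalized or symlink-resolved
    // location would fall outside the extraction root.
    public static Path resolveSafely(Path extractionRoot, String entryName) throws IOException {
        Path root = extractionRoot.toRealPath();            // canonical, link-free root
        Path target = root.resolve(entryName).normalize();  // collapse "." and ".." lexically
        if (!target.startsWith(root)) {
            throw new IOException("TAR entry escapes extraction directory: " + entryName);
        }
        // If the parent already exists, resolve its symlinks too; a symlinked
        // parent pointing outside the root is rejected before anything is written.
        Path parent = target.getParent();
        if (parent != null && Files.exists(parent) && !parent.toRealPath().startsWith(root)) {
            throw new IOException("TAR entry parent resolves outside extraction directory: " + entryName);
        }
        return target;
    }
}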
CVE-2020-9492 | Published: 2021-01-26 11h55 +00:00 | Score: 8.8 (High)
In Apache Hadoop 3.2.0 to 3.2.1, 3.0.0-alpha1 to 3.1.3, and 2.0.0-alpha to 2.10.0, the WebHDFS client might send the SPNEGO authorization header to a remote URL without proper verification.
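
The risk described here is credential forwarding: an authorization header negotiated for one service is re-sent to an arbitrary remote URL. A generic, hypothetical sketch of the kind of check involved (not the WebHDFS client's actual logic) compares scheme and host before re-attaching a sensitive header on redirect:

import java.net.URI;

public class AuthHeaderPolicySketch {
    // Illustrative only: a SPNEGO/Authorization header obtained for originalUri
    // should only be re-sent when the redirect target has the same scheme and
    // host; anything else risks leaking the credential to a third party.
    public static boolean mayForwardAuthHeader(URI originalUri, URI redirectUri) {
        return originalUri.getScheme() != null
                && originalUri.getScheme().equalsIgnoreCase(redirectUri.getScheme())
                && originalUri.getHost() != null
                && originalUri.getHost().equalsIgnoreCase(redirectUri.getHost());
    }
}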
CVE-2018-11768 | Published: 2019-10-04 11h56 +00:00 | Score: 7.5 (High)
In Apache Hadoop 3.1.0 to 3.1.1, 3.0.0-alpha1 to 3.0.3, 2.9.0 to 2.9.1, and 2.0.0-alpha to 2.8.4, user/group information can be corrupted when it is stored in the fsimage and read back from it.
CVE-2018-8009 | Published: 2018-11-13 20h00 +00:00 | Score: 8.8 (High)
Apache Hadoop 3.1.0, 3.0.0-alpha to 3.0.2, 2.9.0 to 2.9.1, 2.8.0 to 2.8.4, 2.0.0-alpha to 2.7.6, and 0.23.0 to 0.23.11 are exploitable via the zip slip vulnerability in places that accept a zip file.
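
Zip slip is the archive-extraction variant of path traversal: an entry named something like "../../etc/cron.d/evil" is written outside the intended directory. A minimal, hypothetical sketch of the standard guard (not taken from Hadoop's code) validates every entry's normalized destination before writing:

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class ZipSlipGuardSketch {
    // Illustrative only: extracts a zip stream, refusing entries that escape destDir.
    public static void extract(InputStream zipData, Path destDir) throws IOException {
        Path root = destDir.toAbsolutePath().normalize();
        try (ZipInputStream zis = new ZipInputStream(zipData)) {
            ZipEntry entry;
            while ((entry = zis.getNextEntry()) != null) {
                // Normalizing collapses ".." components; a result that no longer
                // starts with the destination root is a zip slip attempt.
                Path target = root.resolve(entry.getName()).normalize();
                if (!target.startsWith(root)) {
                    throw new IOException("Blocked zip slip entry: " + entry.getName());
                }
                if (entry.isDirectory()) {
                    Files.createDirectories(target);
                } else {
                    Files.createDirectories(target.getParent());
                    Files.copy(zis, target, StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }
}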
CVE-2016-5001 | Published: 2017-08-30 19h00 +00:00 | Score: 5.5 (Medium)
This is an information disclosure vulnerability in Apache Hadoop before 2.6.4 and 2.7.x before 2.7.2 in the short-circuit reads feature of HDFS. A local user on an HDFS DataNode may be able to craft a block token that grants unauthorized read access to random files by guessing certain fields in the token.
CVE-2017-3161 | Published: 2017-04-26 18h00 +00:00 | Score: 6.1 (Medium)
The HDFS web UI in Apache Hadoop before 2.7.0 is vulnerable to a cross-site scripting (XSS) attack through an unescaped query parameter.
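
The underlying issue is reflected XSS: a query parameter is echoed into the HTML response without escaping. As a hypothetical sketch (not the HDFS web UI's actual code; it assumes the javax.servlet API and an invented parameter name), a servlet should HTML-encode any user-supplied value before writing it into the page:

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class EscapedEchoServletSketch extends HttpServlet {
    // Minimal HTML escaping for text content and attribute values.
    private static String escapeHtml(String s) {
        StringBuilder sb = new StringBuilder(s.length());
        for (char c : s.toCharArray()) {
            switch (c) {
                case '&':  sb.append("&amp;");  break;
                case '<':  sb.append("&lt;");   break;
                case '>':  sb.append("&gt;");   break;
                case '"':  sb.append("&quot;"); break;
                case '\'': sb.append("&#39;");  break;
                default:   sb.append(c);
            }
        }
        return sb.toString();
    }

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String dir = req.getParameter("dir");                      // attacker-controlled
        resp.setContentType("text/html;charset=UTF-8");
        resp.getWriter().println("<p>Browsing: "
                + escapeHtml(dir == null ? "" : dir) + "</p>");    // escaped before output
    }
}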
CVE-2017-3162 | Published: 2017-04-26 18h00 +00:00 | Score: 7.3 (High)
HDFS clients interact with a servlet on the DataNode to browse the HDFS namespace. The NameNode is provided as a query parameter that is not validated in Apache Hadoop before 2.7.0.
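
Accepting a server address from a query parameter invites request-forgery style abuse, since clients can be pointed at an attacker-chosen host. A minimal, hypothetical sketch of the kind of validation involved (not Hadoop's actual fix; the class below is invented for illustration) accepts only values that appear in trusted configuration:

import java.util.Set;

public class NameNodeParamValidatorSketch {
    // Hypothetical allowlist; in practice this would come from trusted
    // configuration, never from anything the client sends.
    private final Set<String> allowedNameNodes;

    public NameNodeParamValidatorSketch(Set<String> allowedNameNodes) {
        this.allowedNameNodes = allowedNameNodes;
    }

    // Returns the parameter only if it exactly matches a configured NameNode address.
    public String validate(String nnAddressParam) {
        if (nnAddressParam == null || !allowedNameNodes.contains(nnAddressParam)) {
            throw new IllegalArgumentException("Unrecognized NameNode address: " + nnAddressParam);
        }
        return nnAddressParam;
    }
}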