Apache Software Foundation Hadoop 2.8.2

CPE Details

Apache Software Foundation Hadoop 2.8.2
2.8.2
2019-03-27 14:33 +00:00
2019-03-27 14:33 +00:00

CPE Name: cpe:2.3:a:apache:hadoop:2.8.2:*:*:*:*:*:*:*

Information

Vendor: apache
Product: hadoop
Version: 2.8.2
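The Vendor, Product, and Version fields above are simply components of the CPE 2.3 name shown earlier (the "a" part means application, followed by vendor, product, and version). As a rough illustration, and not an official CPE parser, the sketch below splits the formatted string on ':' and reads those positions; it ignores the escaping rules of the full CPE 2.3 grammar.

```java
public final class CpeFields {
    /** Splits a CPE 2.3 formatted string and prints the vendor, product and version
     *  components. Illustrative only: the quoting/escaping rules of the CPE 2.3
     *  grammar are not handled. */
    public static void main(String[] args) {
        String cpe = "cpe:2.3:a:apache:hadoop:2.8.2:*:*:*:*:*:*:*";
        String[] parts = cpe.split(":");
        // parts[0]="cpe", parts[1]="2.3", parts[2]=part ("a" = application),
        // parts[3]=vendor, parts[4]=product, parts[5]=version
        System.out.println("vendor  = " + parts[3]);   // apache
        System.out.println("product = " + parts[4]);   // hadoop
        System.out.println("version = " + parts[5]);   // 2.8.2
    }
}
```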

Related CVEs

CVE ID | Published | Score (Severity) | Description

CVE-2022-25168 | Published: 2022-08-04 12:30 +00:00 | Score: 9.8 (Critical)
Apache Hadoop's FileUtil.unTar(File, File) API does not escape the input file name before it is passed to the shell, so an attacker can inject arbitrary commands. In Hadoop 3.3 this code path is only used by InMemoryAliasMap.completeBootstrapTransfer, which is only ever run by a local user. In Hadoop 2.x it has been used for YARN localization, which does enable remote code execution. It is also used in Apache Spark via the SQL command ADD ARCHIVE; since ADD ARCHIVE adds new binaries to the classpath, being able to execute shell scripts does not confer new permissions on the caller. SPARK-38305 ("Check existence of file before untarring/zipping"), included in Spark 3.3.0, 3.1.4, and 3.2.2, prevents shell commands from being executed regardless of which version of the Hadoop libraries is in use. Users should upgrade to Apache Hadoop 2.10.2, 3.2.4, 3.3.3 or later (which include HADOOP-18136).

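The root cause described in CVE-2022-25168 is a shell-injection pattern: an untrusted file name is concatenated into a command string that is handed to a shell. The sketch below is not Hadoop's actual FileUtil code; it is a minimal illustration, under that assumption, of the difference between interpolating a name into a "bash -c" command line and passing it as a single argument with no shell involved.

```java
import java.io.File;
import java.io.IOException;

public final class UntarSketch {

    /** Vulnerable pattern: the archive name becomes part of a shell command line with no
     *  escaping, so a file named e.g. "a.tar; rm -rf $HOME" causes a second command to run. */
    static Process untarViaShell(File archive, File targetDir) throws IOException {
        String cmd = "cd " + targetDir + " && tar -xf " + archive;
        return Runtime.getRuntime().exec(new String[] { "bash", "-c", cmd });
    }

    /** Safer pattern: no shell is involved and the archive path is a single argv element,
     *  so shell metacharacters in the name are never interpreted. */
    static Process untarWithoutShell(File archive, File targetDir) throws IOException {
        return new ProcessBuilder("tar", "-xf", archive.getAbsolutePath())
                .directory(targetDir)
                .start();
    }
}
```
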
CVE-2021-33036 | Published: 2022-06-15 12:25 +00:00 | Score: 8.8 (High)
In Apache Hadoop 2.2.0 to 2.10.1, 3.0.0-alpha1 to 3.1.4, 3.2.0 to 3.2.2, and 3.3.0 to 3.3.1, a user who can escalate to the yarn user can potentially run arbitrary commands as the root user. Users should upgrade to Apache Hadoop 2.10.2, 3.2.3, 3.3.2 or higher.

CVE-2022-26612 | Published: 2022-04-07 16:20 +00:00 | Score: 9.8 (Critical)
In Apache Hadoop, the unTar function uses the unTarUsingJava function on Windows and the built-in tar utility on Unix and other OSes. As a result, a TAR entry may create a symlink under the expected extraction directory which points to an external directory, and a subsequent TAR entry may extract an arbitrary file into that external directory using the symlink name. On Unix this is caught by the targetDirPath check because of the getCanonicalPath call; on Windows, however, getCanonicalPath does not resolve symbolic links, which bypasses the check. unpackEntries therefore follows symbolic links during TAR extraction, allowing writes outside the expected base directory on Windows. This was addressed in Apache Hadoop 3.2.3.

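The check mentioned in CVE-2022-26612 is the usual containment test: is the resolved extraction target still under the base directory? The sketch below is not Hadoop's unpackEntries code; as an assumed simplification it contrasts a File.getCanonicalPath() based check (which, per the description above, does not resolve symlinks on Windows) with java.nio's Path.toRealPath(), which does resolve symbolic links but requires the checked path to already exist.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Path;

public final class ExtractionTargetCheck {

    /** Containment check based on File.getCanonicalPath(): it resolves ".." (and, on Unix,
     *  symlinks), so a target escaping baseDir is rejected. Per the CVE description, the
     *  same call does not resolve symbolic links on Windows, which is the bypass. */
    static boolean isUnderBaseCanonical(File baseDir, File target) throws IOException {
        String base = baseDir.getCanonicalPath() + File.separator;
        return target.getCanonicalPath().startsWith(base);
    }

    /** Alternative check: Path.toRealPath() resolves symbolic links on all platforms but
     *  requires the path to exist, so the existing parent of the entry is validated here
     *  before anything is written beneath it. */
    static boolean isUnderBaseReal(Path baseDir, Path entryParent) throws IOException {
        return entryParent.toRealPath().startsWith(baseDir.toRealPath());
    }
}
```
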
CVE-2020-9492 | Published: 2021-01-26 11:55 +00:00 | Score: 8.8 (High)
In Apache Hadoop 3.2.0 to 3.2.1, 3.0.0-alpha1 to 3.1.3, and 2.0.0-alpha to 2.10.0, the WebHDFS client might send the SPNEGO authorization header to a remote URL without proper verification.

CVE-2018-11765 | Published: 2020-09-30 15:02 +00:00 | Score: 7.5 (High)
In Apache Hadoop versions 3.0.0-alpha2 to 3.0.0, 2.9.0 to 2.9.2, and 2.8.0 to 2.8.5, any user can access some servlets without authentication when Kerberos authentication is enabled and SPNEGO through HTTP is not enabled.

CVE-2018-11768 | Published: 2019-10-04 11:56 +00:00 | Score: 7.5 (High)
In Apache Hadoop 3.1.0 to 3.1.1, 3.0.0-alpha1 to 3.0.3, 2.9.0 to 2.9.1, and 2.0.0-alpha to 2.8.4, user/group information can be corrupted when it is stored in the fsimage and read back from it.

CVE-2018-8029 | Published: 2019-05-30 13:15 +00:00 | Score: 8.8 (High)
In Apache Hadoop versions 3.0.0-alpha1 to 3.1.0, 2.9.0 to 2.9.1, and 2.2.0 to 2.8.4, a user who can escalate to the yarn user can potentially run arbitrary commands as the root user.

CVE-2018-1296 | Published: 2019-02-07 22:00 +00:00 | Score: 7.5 (High)
In Apache Hadoop 3.0.0-alpha1 to 3.0.0, 2.9.0, 2.8.0 to 2.8.3, and 2.5.0 to 2.7.5, HDFS exposes extended attribute key/value pairs during listXAttrs while verifying only path-level search access to the directory, rather than path-level read permission on the referent.

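The distinction in CVE-2018-1296 is between search (execute) permission on the parent directory, which only allows traversal, and read permission on the file whose attributes are listed. The sketch below is a deliberately simplified, hypothetical model of requiring both checks before returning xattr values; the Checker interface and Access enum are illustrative stand-ins, not HDFS's real permission checker.

```java
import java.util.Map;

/** Hypothetical model of the access check, for illustration only. */
final class XAttrListingSketch {

    enum Access { READ, EXECUTE }

    /** Stand-in for HDFS's permission checker; not a real Hadoop interface. */
    interface Checker {
        boolean allows(String path, Access access);
    }

    /** Before the fix, only EXECUTE (search) access on the parent directory was verified,
     *  so xattr names and values leaked for files the caller could not read.
     *  After the fix, READ permission on the target path itself is also required. */
    static Map<String, byte[]> listXAttrs(String parentDir, String path,
                                          Map<String, byte[]> xattrs, Checker checker) {
        if (!checker.allows(parentDir, Access.EXECUTE)) {
            throw new SecurityException("no search access to " + parentDir);
        }
        if (!checker.allows(path, Access.READ)) {          // the added check
            throw new SecurityException("no read permission on " + path);
        }
        return xattrs;
    }
}
```
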
CVE-2018-8009 | Published: 2018-11-13 20:00 +00:00 | Score: 8.8 (High)
Apache Hadoop 3.1.0, 3.0.0-alpha to 3.0.2, 2.9.0 to 2.9.1, 2.8.0 to 2.8.4, 2.0.0-alpha to 2.7.6, and 0.23.0 to 0.23.11 are exploitable via the zip slip vulnerability in places that accept a zip file.

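"Zip slip" refers to archive entries whose names contain path traversal (for example "../../outside.txt"), so that extraction writes files outside the intended directory. A generic guard, not Hadoop's actual fix, looks roughly like this:

```java
import java.io.File;
import java.io.IOException;

public final class ZipSlipGuard {

    /** Resolves an archive entry name against the extraction directory and rejects it
     *  if the canonical result is not inside that directory. Entry names such as
     *  "../../outside.txt" fail the startsWith check and are refused. */
    static File safeTarget(File extractDir, String entryName) throws IOException {
        File target = new File(extractDir, entryName);
        String basePath = extractDir.getCanonicalPath() + File.separator;
        if (!target.getCanonicalPath().startsWith(basePath)) {
            throw new IOException("Blocked zip slip entry: " + entryName);
        }
        return target;
    }
}
```
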
CVE-2017-15713 | Published: 2018-01-19 17:00 +00:00 | Score: 6.5 (Medium)
A vulnerability in Apache Hadoop 0.23.x, 2.x before 2.7.5, 2.8.x before 2.8.3, and 3.0.0-alpha through 3.0.0-beta1 allows a cluster user to expose private files owned by the user running the MapReduce job history server process. A malicious user can construct a configuration file containing XML directives that reference sensitive files on the MapReduce job history server host.

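The attack in CVE-2017-15713 relies on XML features (external entities and includes) that make the parser read local files on the host parsing the configuration. As a generic hardening sketch, not Hadoop's actual Configuration loader, the snippet below parses untrusted XML with the JDK's DocumentBuilderFactory while disabling DOCTYPEs, external entities, and XInclude.

```java
import java.io.File;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public final class HardenedXmlParse {

    /** Parses an untrusted XML file with DOCTYPEs, external entities, and XInclude
     *  disabled, so directives in the document cannot pull in files from the parsing host. */
    static Document parseUntrusted(File xml) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        dbf.setFeature("http://xml.org/sax/features/external-general-entities", false);
        dbf.setFeature("http://xml.org/sax/features/external-parameter-entities", false);
        dbf.setXIncludeAware(false);
        dbf.setExpandEntityReferences(false);
        DocumentBuilder builder = dbf.newDocumentBuilder();
        return builder.parse(xml);
    }
}
```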