Details

Type: Bug
Status: Resolved
Priority: Major
Resolution: Invalid
Affects Version/s: 3.0.0
Fix Version/s: None
Environment: Generic Linux - but these dependencies are in the libraries that Spark pulls in.
Given that a number of these are several years old and highly severe (remote code execution is possible), these libraries are ripe for exploitation, and it is highly likely that exploits currently exist for these issues. Please upgrade the dependent libraries and run OWASP Dependency-Check prior to all future releases.
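As a concrete starting point for that check, below is a minimal sketch of wiring the OWASP Dependency-Check Maven plugin into a build so a release fails when a dependency carries a known high-severity CVE. The plugin coordinates (org.owasp:dependency-check-maven) and the check goal are the tool's real ones; the version and CVSS threshold are illustrative choices, not values taken from Spark's build.

```xml
<!-- Sketch: fail the build on dependencies with known high-severity CVEs.
     Plugin coordinates and the "check" goal are OWASP Dependency-Check's own;
     the version and threshold below are illustrative. -->
<plugin>
  <groupId>org.owasp</groupId>
  <artifactId>dependency-check-maven</artifactId>
  <version>6.5.3</version>
  <configuration>
    <!-- Break the build for any finding scored CVSS 7.0 or higher -->
    <failBuildOnCVSS>7</failBuildOnCVSS>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this in place, `mvn verify` produces a dependency-check report and aborts on findings at or above the threshold.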
Flags: Patch, Important
CVE-2018-17190, CVE-2018-11777, CVE-2018-21234, CVE-2017-15718, CVE-2018-8009, CVE-2018-11766, CVE-2018-8029, CVE-2018-1337, CVE-2015-3250
Description

| CVE | Description |
| --- | --- |
| CVE-2018-1337 | In Apache Directory LDAP API before 1.0.2 - upgrade dependency to 1.0.2 |
| CVE-2018-17190 | In all versions of Apache Spark, its standalone resource manager accepts code to execute on a 'master' host, that then runs that code on 'worker' hosts. |
| CVE-2017-15718 | The YARN NodeManager in Apache Hadoop 2.7.3 and 2.7.4 - upgrade lib |
| CVE-2018-21234 | Jodd before 5.0.4 performs Deserialization of Untrusted JSON Data when setClassMetadataName is set. |
| CVE-2019-17571 | Included in Log4j 1.2 is a SocketServer class that is vulnerable to deserialization of untrusted data, which can be exploited to remotely execute arbitrary code when combined with a deserialization gadget while listening to untrusted network traffic for log data. This affects Log4j versions 1.2 up to 1.2.17. |
| CVE-2020-9480 | In Apache Spark 2.4.5 and earlier, a standalone resource manager's master may be configured to require authentication (spark.authenticate) via a shared secret. |
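On that last row: spark.authenticate and spark.authenticate.secret are Spark's real settings for shared-secret RPC authentication; a minimal spark-defaults.conf sketch follows, with an illustrative secret value. Note that CVE-2020-9480 describes a bypass of this very mechanism in standalone mode on 2.4.5 and earlier, so enabling the secret alone is not a fix on affected versions; upgrading to a patched release is.

```
# spark-defaults.conf - sketch only; the secret value is illustrative
spark.authenticate         true
spark.authenticate.secret  change-me-long-random-secret
```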