Filtered by vendor Apache
Total: 2615 CVE
CVE | Vendors | Products | Updated | CVSS v2 | CVSS v3 |
---|---|---|---|---|---|
CVE-2023-43826 | 1 Apache | 1 Guacamole | 2025-02-13 | N/A | 7.5 HIGH |
Apache Guacamole 1.5.3 and older do not consistently ensure that values received from a VNC server will not result in integer overflow. If a user connects to a malicious or compromised VNC server, specially-crafted data could result in memory corruption, possibly allowing arbitrary code to be executed with the privileges of the running guacd process. Users are recommended to upgrade to version 1.5.4, which fixes this issue.
CVE-2023-43622 | 1 Apache | 1 Http Server | 2025-02-13 | N/A | 7.5 HIGH |
An attacker, opening an HTTP/2 connection with an initial window size of 0, was able to block handling of that connection indefinitely in Apache HTTP Server. This could be used to exhaust worker resources in the server, similar to the well-known "slow loris" attack pattern. This has been fixed in version 2.4.58, so that such connections are terminated properly after the configured connection timeout. This issue affects Apache HTTP Server: from 2.4.55 through 2.4.57. Users are recommended to upgrade to version 2.4.58, which fixes the issue.
CVE-2023-43123 | 1 Apache | 1 Storm | 2025-02-13 | N/A | 5.5 MEDIUM |
On unix-like systems, the temporary directory is shared between all users. As such, writing to this directory using APIs that do not explicitly set the file/directory permissions can lead to information disclosure. Of note, this does not impact modern macOS operating systems. The method File.createTempFile on unix-like systems creates a file with a predefined, easily identifiable name, and by default creates it with the permissions -rw-r--r--. Thus, if sensitive information is written to this file, other local users can read this information. File.createTempFile(String, String) will create a temporary file in the system temporary directory if the 'java.io.tmpdir' system property is not explicitly set. This affects the class https://github.com/apache/storm/blob/master/storm-core/src/jvm/org/apache/storm/utils/TopologySpoutLag.java#L99 and was introduced by https://issues.apache.org/jira/browse/STORM-3123. In practice, this has a very limited impact, as this class is used only if ui.disable.spout.lag.monitoring is set to false, while its value is true by default. Moreover, the temporary file gets deleted soon after its creation. The solution is to use Files.createTempFile ( https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/nio/file/Files.html#createTempFile(java.lang.String,java.lang.String,java.nio.file.attribute.FileAttribute...) ) instead. We recommend that all users upgrade to the latest version of Apache Storm.
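To make the recommended change concrete, here is a minimal, self-contained Java sketch contrasting the two APIs the advisory names; the "spout-lag-" prefix and ".json" suffix are invented for the example and are not Storm's actual file naming.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TempFileExample {
    public static void main(String[] args) throws IOException {
        // Problematic pattern described in the advisory: File.createTempFile
        // places the file in java.io.tmpdir with default permissions
        // (typically -rw-r--r-- on unix-like systems), readable by other local users.
        File insecure = File.createTempFile("spout-lag-", ".json"); // illustrative name only
        System.out.println("File.createTempFile:  " + insecure.getAbsolutePath());

        // Replacement recommended by the advisory: Files.createTempFile applies
        // restrictive, owner-only permissions on POSIX file systems by default.
        Path secure = Files.createTempFile("spout-lag-", ".json"); // illustrative name only
        System.out.println("Files.createTempFile: " + secure);

        // Clean up the demo files.
        insecure.delete();
        Files.deleteIfExists(secure);
    }
}
```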
CVE-2023-42794 | 1 Apache | 1 Tomcat | 2025-02-13 | N/A | 5.9 MEDIUM |
Incomplete Cleanup vulnerability in Apache Tomcat. The internal fork of Commons FileUpload packaged with Apache Tomcat 9.0.70 through 9.0.80 and 8.5.85 through 8.5.93 included an unreleased, in-progress refactoring that exposed a potential denial of service on Windows if a web application opened a stream for an uploaded file but failed to close the stream. The file would never be deleted from disk, creating the possibility of an eventual denial of service due to the disk being full. Users are recommended to upgrade to version 9.0.81 onwards or 8.5.94 onwards, which fixes the issue.
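On the application side, the defensive pattern implied by the advisory is simply to always close the stream obtained for an uploaded file. Below is a hedged, hypothetical servlet sketch (the URL pattern, form field name and output path are invented for illustration) using try-with-resources so the stream is closed even on error:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import javax.servlet.ServletException;
import javax.servlet.annotation.MultipartConfig;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.Part;

// Illustrative servlet for Tomcat 8.5/9 (javax.servlet namespace). The key point
// is the try-with-resources block, which guarantees the upload stream is closed
// so the temporary file backing the Part can be removed.
@WebServlet("/upload")
@MultipartConfig
public class UploadServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        Part filePart = req.getPart("file"); // "file" is a hypothetical form field name
        try (InputStream in = filePart.getInputStream();
             OutputStream out = Files.newOutputStream(Paths.get("/tmp/received.bin"))) {
            in.transferTo(out); // InputStream.transferTo is available since Java 9
        }
        filePart.delete(); // also discard the temporary upload file explicitly
        resp.setStatus(HttpServletResponse.SC_NO_CONTENT);
    }
}
```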
CVE-2023-42792 | 1 Apache | 1 Airflow | 2025-02-13 | N/A | 6.5 MEDIUM |
Apache Airflow, in versions prior to 2.7.2, contains a security vulnerability that allows an authenticated user with limited access to some DAGs to craft a request that could give the user write access to various DAG resources for DAGs that the user had no access to, thus enabling the user to clear DAGs they should not be able to. Users of Apache Airflow are strongly advised to upgrade to version 2.7.2 or newer to mitigate the risk associated with this vulnerability.
CVE-2023-42663 | 1 Apache | 1 Airflow | 2025-02-13 | N/A | 6.5 MEDIUM |
Apache Airflow, in versions before 2.7.2, has a vulnerability that allows an authorized user who only has access to read specific DAGs to read information about task instances in other DAGs. Users of Apache Airflow are advised to upgrade to version 2.7.2 or newer to mitigate the risk associated with this vulnerability.
CVE-2023-42505 | 1 Apache | 1 Superset | 2025-02-13 | N/A | 4.3 MEDIUM |
An authenticated user with read permissions on database connections metadata could potentially access sensitive information such as the connection's username. This issue affects Apache Superset before 3.0.0.
CVE-2023-42504 | 1 Apache | 1 Superset | 2025-02-13 | N/A | 5.8 MEDIUM |
An authenticated malicious user could initiate multiple concurrent requests, each requesting multiple dashboard exports, leading to a possible denial of service. This issue affects Apache Superset: before 3.0.0.
CVE-2023-42503 | 1 Apache | 1 Commons Compress | 2025-02-13 | N/A | 5.5 MEDIUM |
Improper Input Validation, Uncontrolled Resource Consumption vulnerability in Apache Commons Compress in TAR parsing. This issue affects Apache Commons Compress: from 1.22 before 1.24.0. Users are recommended to upgrade to version 1.24.0, which fixes the issue.

A third party can create a malformed TAR file by manipulating file modification time headers which, when parsed with Apache Commons Compress, will cause a denial of service via CPU consumption. In version 1.22 of Apache Commons Compress, support was added for file modification times with higher precision (issue COMPRESS-612 [1]). The format for the PAX extended headers carrying this data consists of two numbers separated by a period [2], indicating seconds and subsecond precision (for example “1647221103.5998539”). The impacted fields are “atime”, “ctime”, “mtime” and “LIBARCHIVE.creationtime”. No input validation is performed prior to the parsing of header values. Parsing of these numbers uses the BigDecimal class [3] from the JDK, which has a publicly known algorithmic complexity issue when doing operations on large numbers, causing denial of service (see issue JDK-6560193 [4]). A third party can manipulate file time headers in a TAR file by placing a number with a very long fraction (300,000 digits) or a number with exponent notation (such as “9e9999999”) within a file modification time header, and the parsing of files with these headers will take hours instead of seconds, leading to a denial of service via exhaustion of CPU resources. This issue is similar to CVE-2012-2098 [5].

Only applications using the CompressorStreamFactory class (with auto-detection of file types), TarArchiveInputStream and TarFile classes to parse TAR files are impacted. Since this code was introduced in v1.22, only that version and later versions are impacted.

[1]: https://issues.apache.org/jira/browse/COMPRESS-612
[2]: https://pubs.opengroup.org/onlinepubs/9699919799/utilities/pax.html#tag_20_92_13_05
[3]: https://docs.oracle.com/javase/8/docs/api/java/math/BigDecimal.html
[4]: https://bugs.openjdk.org/browse/JDK-6560193
[5]: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2012-2098
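As a rough illustration only, and not the actual validation added upstream in 1.24.0 (which may differ), a pre-parse guard in the spirit of the advisory could reject over-long or exponent-notation PAX time strings before they ever reach BigDecimal:

```java
import java.math.BigDecimal;

// Hypothetical pre-parse guard (not part of Commons Compress): reject PAX time
// values whose textual form is suspiciously long or uses exponent notation
// before handing them to BigDecimal.
public final class PaxTimeGuard {
    private static final int MAX_LENGTH = 32; // generous for "seconds.subseconds"

    public static BigDecimal parsePaxTime(String value) {
        if (value == null || value.length() > MAX_LENGTH
                || value.indexOf('e') >= 0 || value.indexOf('E') >= 0) {
            throw new IllegalArgumentException("Rejected suspicious PAX time value");
        }
        // Ordinary values such as "1647221103.5998539" (the advisory's example)
        // pass the guard and are parsed normally.
        return new BigDecimal(value);
    }

    public static void main(String[] args) {
        System.out.println(parsePaxTime("1647221103.5998539"));
        // parsePaxTime("9e9999999"); // would be rejected by the guard above
    }
}
```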
CVE-2023-42501 | 1 Apache | 1 Superset | 2025-02-13 | N/A | 4.3 MEDIUM |
Unnecessary read permissions within the Gamma role would allow authenticated users to read configured CSS templates and annotations. This issue affects Apache Superset: before 2.1.2. Users should upgrade to version 2.1.2 or above and run `superset init` to reconstruct the Gamma role, or remove the `can_read` permission from the mentioned resources.
CVE-2023-41267 | 1 Apache | 1 Airflow Hdfs Provider | 2025-02-13 | N/A | 7.8 HIGH |
In the Apache Airflow HDFS Provider, versions prior to 4.1.1, the documentation pointed users to install an incorrect pip package. As this package name was unclaimed, in theory, an attacker could claim the package and provide code that would be executed when it was installed. The Airflow team has since taken ownership of the package (neutralizing the risk) and fixed the doc strings in version 4.1.1.
CVE-2023-40743 | 1 Apache | 1 Axis | 2025-02-13 | N/A | 9.8 CRITICAL |
** UNSUPPORTED WHEN ASSIGNED ** When integrating Apache Axis 1.x in an application, it may not have been obvious that looking up a service through "ServiceFactory.getService" allows potentially dangerous lookup mechanisms such as LDAP. When passing untrusted input to this API method, this could expose the application to DoS, SSRF and even attacks leading to RCE. As Axis 1 has been EOL, we recommend you migrate to a different SOAP engine, such as Apache Axis 2/Java. As a workaround, you may review your code to verify no untrusted or unsanitized input is passed to "ServiceFactory.getService", or apply the patch from https://github.com/apache/axis-axis1-java/commit/7e66753427466590d6def0125e448d2791723210 . The Apache Axis project does not expect to create an Axis 1.x release fixing this problem, though contributors who would like to work towards this are welcome.
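To illustrate the suggested code-review workaround, here is a hypothetical guard (not part of Apache Axis; the class name and the rejected schemes are assumptions) that an application could apply to a lookup name before it is ever passed to "ServiceFactory.getService":

```java
import java.util.Locale;
import java.util.Set;

// Hypothetical input guard: reject lookup names that look like JNDI/LDAP URLs
// before forwarding them to "ServiceFactory.getService". Adjust the deny list
// to your deployment; an allowlist of known service names is even safer.
public final class ServiceLookupGuard {
    private static final Set<String> DENIED_SCHEMES =
            Set.of("ldap:", "ldaps:", "rmi:", "dns:", "iiop:");

    public static String requireSafeServiceName(String name) {
        String lower = name == null ? "" : name.toLowerCase(Locale.ROOT);
        boolean denied = DENIED_SCHEMES.stream().anyMatch(lower::startsWith);
        if (denied || lower.contains("://")) {
            throw new IllegalArgumentException("Refusing to look up service name: " + name);
        }
        return name;
    }
}
```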
CVE-2023-40610 | 1 Apache | 1 Superset | 2025-02-13 | N/A | 6.3 MEDIUM |
Improper authorization check and possible privilege escalation on Apache Superset up to but excluding 2.1.2. Using the default examples database connection that allows access to both the examples schema and Apache Superset's metadata database, an attacker using a specially crafted CTE SQL statement could change data on the metadata database. This weakness could result in tampering with the authentication/authorization data.
CVE-2023-40272 | 1 Apache | 1 Apache-airflow-providers-apache-spark | 2025-02-13 | N/A | 7.5 HIGH |
Apache Airflow Spark Provider, in versions before 4.1.3, is affected by a vulnerability that allows an attacker to pass in malicious parameters when establishing a connection, giving an opportunity to read files on the Airflow server. It is recommended to upgrade to a version that is not affected.
CVE-2023-40037 | 1 Apache | 1 Nifi | 2025-02-13 | N/A | 6.5 MEDIUM |
Apache NiFi 1.21.0 through 1.23.0 support JDBC and JNDI JMS access in several Processors and Controller Services with connection URL validation that does not provide sufficient protection against crafted inputs. An authenticated and authorized user can bypass connection URL validation using custom input formatting. The resolution enhances connection URL validation and introduces validation for additional related properties. Upgrading to Apache NiFi 1.23.1 is the recommended mitigation.
CVE-2023-39913 | 1 Apache | 1 Uimaj | 2025-02-13 | N/A | 8.8 HIGH |
Deserialization of Untrusted Data, Improper Input Validation vulnerability in Apache UIMA Java SDK. This issue affects Apache UIMA Java SDK: before 3.5.0. Users are recommended to upgrade to version 3.5.0, which fixes the issue. There are several locations in the code where serialized Java objects are deserialized without verifying the data. This affects in particular:
* the deserialization of a Java-serialized CAS, but also other binary CAS formats that include TSI information, using the CasIOUtils class;
* the CAS Editor Eclipse plugin, which uses the CasIOUtils class to load data;
* the deserialization of a Java-serialized CAS by the Vinci Analysis Engine service, which can receive Java-serialized CAS objects over network connections;
* the CasAnnotationViewerApplet and the CasTreeViewerApplet;
* the checkpointing feature of the CPE module.

Note that the UIMA framework by default does not start any remotely accessible services (i.e. Vinci) that would be vulnerable to this issue. A user or developer would need to make an active choice to start such a service. However, users or developers may use the CasIOUtils in their own applications and services to parse serialized CAS data. They are affected by this issue unless they ensure that the data passed to CasIOUtils is not a serialized Java object. When using Vinci or using CasIOUtils in own services/applications, the unrestricted deserialization of Java-serialized CAS files may allow arbitrary (remote) code execution.

As a remedy, it is possible to set up a global or context-specific ObjectInputFilter (cf. https://openjdk.org/jeps/290 and https://openjdk.org/jeps/415 ) if running UIMA on a Java version that supports it. Note that Java 1.8 does not support the ObjectInputFilter, so there is no remedy when running on this out-of-support platform. An upgrade to a recent Java version is strongly recommended if you need to secure a UIMA version that is affected by this issue.

To mitigate the issue on a Java 9+ platform, you can configure a filter pattern through the "jdk.serialFilter" system property, using a semicolon as a separator. To allow deserializing Java-serialized binary CASes, add the classes:
* org.apache.uima.cas.impl.CASCompleteSerializer
* org.apache.uima.cas.impl.CASMgrSerializer
* org.apache.uima.cas.impl.CASSerializer
* java.lang.String

To allow deserializing CPE Checkpoint data, add the following classes (and any custom classes your application uses to store its checkpoints):
* org.apache.uima.collection.impl.cpm.CheckpointData
* org.apache.uima.util.ProcessTrace
* org.apache.uima.util.impl.ProcessTrace_impl
* org.apache.uima.collection.base_cpm.SynchPoint

Make sure to use "!*" as the final component of the filter pattern to disallow deserialization of any classes not listed in the pattern. Apache UIMA 3.5.0 uses tightly scoped ObjectInputFilters when reading Java-serialized data, depending on the type of data being expected. Configuring a global filter is not necessary with this version.
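For reference, the same filter can be installed programmatically through java.io.ObjectInputFilter on Java 9+; doing it in code rather than via the -Djdk.serialFilter system property is an assumption of this sketch, while the class list and the trailing "!*" come straight from the advisory above:

```java
import java.io.ObjectInputFilter;

// Programmatic equivalent of the jdk.serialFilter mitigation described above,
// using the class list given in the advisory and "!*" to reject everything else.
public final class UimaSerialFilterSetup {
    public static void install() {
        String pattern = String.join(";",
                // classes needed to deserialize Java-serialized binary CASes
                "org.apache.uima.cas.impl.CASCompleteSerializer",
                "org.apache.uima.cas.impl.CASMgrSerializer",
                "org.apache.uima.cas.impl.CASSerializer",
                "java.lang.String",
                // classes needed for CPE checkpoint data
                "org.apache.uima.collection.impl.cpm.CheckpointData",
                "org.apache.uima.util.ProcessTrace",
                "org.apache.uima.util.impl.ProcessTrace_impl",
                "org.apache.uima.collection.base_cpm.SynchPoint",
                // reject any class not listed above
                "!*");
        ObjectInputFilter.Config.setSerialFilter(ObjectInputFilter.Config.createFilter(pattern));
    }

    public static void main(String[] args) {
        install();
        // Any ObjectInputStream created in this JVM from now on is subject to the filter.
    }
}
```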
CVE-2023-39553 | 1 Apache | 1 Apache-airflow-providers-apache-drill | 2025-02-13 | N/A | 7.5 HIGH |
Improper Input Validation vulnerability in Apache Software Foundation Apache Airflow Drill Provider. The Apache Airflow Drill Provider is affected by a vulnerability that allows an attacker to pass in malicious parameters when establishing a connection with DrillHook, giving an opportunity to read files on the Airflow server. This issue affects Apache Airflow Drill Provider: before 2.4.3. It is recommended to upgrade to a version that is not affected.
CVE-2023-39508 | 1 Apache | 1 Airflow | 2025-02-13 | N/A | 8.8 HIGH |
Execution with Unnecessary Privileges and Exposure of Sensitive Information to an Unauthorized Actor vulnerability in Apache Software Foundation Apache Airflow. The "Run Task" feature enables an authenticated user to bypass some of the restrictions put in place. It allows code execution in the webserver context, as well as bypassing the limits on which DAGs the user has access to. The "Run Task" feature is considered dangerous and has been removed entirely in Airflow 2.6.0. This issue affects Apache Airflow: before 2.6.0.
CVE-2023-39410 | 1 Apache | 1 Avro | 2025-02-13 | N/A | 7.5 HIGH |
When deserializing untrusted or corrupted data, it is possible for a reader to consume memory beyond the allowed constraints and thus cause an out-of-memory condition on the system. This issue affects Java applications using the Apache Avro Java SDK up to and including 1.11.2. Users should update to apache-avro version 1.11.3, which addresses this issue.
CVE-2023-39196 | 1 Apache | 1 Ozone | 2025-02-13 | N/A | 5.3 MEDIUM |
Improper Authentication vulnerability in Apache Ozone. The vulnerability allows an attacker to download metadata internal to the Storage Container Manager service without proper authentication. The attacker is not allowed to do any modification within the Ozone Storage Container Manager service using this vulnerability. The accessible metadata does not contain sensitive information that can be used to exploit the system later on, and the accessible data does not make it possible to gain access to actual user data within Ozone. This issue affects Apache Ozone: 1.2.0 and subsequent releases up until 1.3.0. Users are recommended to upgrade to version 1.4.0, which fixes the issue.