Install the amazon provider to be able to store DAG logs in S3

If the S3 remote-logging configuration is in place but the provider is not installed, the webserver crashes with the following error:

Unable to load the config, contains a configuration error.
Traceback (most recent call last):
  File "/usr/lib/python3.11/logging/config.py", line 389, in resolve
    found = getattr(found, frag)
            ^^^^^^^^^^^^^^^^^^^^
AttributeError: module 'airflow.providers' has no attribute 'amazon'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.11/logging/config.py", line 391, in resolve
    self.importer(used)
ModuleNotFoundError: No module named 'airflow.providers.amazon'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.11/logging/config.py", line 562, in configure
    handler = self.configure_handler(handlers[name])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/logging/config.py", line 724, in configure_handler
    klass = self.resolve(cname)
            ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/logging/config.py", line 396, in resolve
    raise v from e
ValueError: Cannot resolve 'airflow.providers.amazon.aws.log.s3_task_handler.S3TaskHandler': No module named 'airflow.providers.amazon'

According to https://github.com/apache/airflow/blob/main/hatch_build.py#L354, the correct extra to install is apache-airflow[amazon].
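A minimal sketch of the fix, assuming a pip-based install; the bucket path and connection id below are placeholders, not values from this change:

```
# Install the provider so airflow.providers.amazon.aws.log.s3_task_handler.S3TaskHandler resolves:
pip install 'apache-airflow[amazon]'
```

This pairs with the remote-logging settings in airflow.cfg that trigger the crash when the provider is missing:

```
[logging]
remote_logging = True
remote_base_log_folder = s3://my-airflow-logs/dags   ; placeholder bucket/path
remote_log_conn_id = aws_default                     ; placeholder connection id
```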

Bug: T372787

Edited by Brouberol
