
dput -O debusine_workflow_data.enable_reverse_dependencies_autopkgtest=true results in a DebianPipelineWorkflowData backtrace

I'm following the (new) wiki.d.o/DebusineDebianNet tutorial to make my first upload 😄 This is on an up-to-date (2025-04-25) system with debusine-client 0.11.0 and dput-ng 1.43.

I ran dput -O debusine_workflow_data.enable_reverse_dependencies_autopkgtest=true ...changes, as (optionally) recommended in said tutorial, and was presented with the backtrace below, seemingly because "reverse_dependencies_autopkgtest_suite" is required if "enable_reverse_dependencies_autopkgtest" is set.

I'm not sure whether the documentation is wrong to suggest setting enable_reverse_dependencies_autopkgtest without also setting reverse_dependencies_autopkgtest_suite, but in any case, a raw backtrace in response to erroneous input is likely not the intended behavior.
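
If the suite really does need to be passed explicitly, I would guess the intended invocation looks something like the following. The repeated -O and the suite value are assumptions on my part; I haven't verified what format reverse_dependencies_autopkgtest_suite expects:

# the repeated -O and the value "sid" are unverified guesses
$ dput -O debusine_workflow_data.enable_reverse_dependencies_autopkgtest=true \
       -O debusine_workflow_data.reverse_dependencies_autopkgtest_suite=sid \
       debusine.debian.net librdkafka_2.10.0-1_source.changes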

Finally, note that an artifact was created (1653562) and linked to in the dput-ng output. It's unclear to me what I can do with it, given that no workflow was created; for example, I would have liked a way to clean it up.

Full backtrace:

$ dput -O debusine_workflow_data.enable_reverse_dependencies_autopkgtest=true debusine.debian.net librdkafka_2.10.0-1_source.changes 
Uploading librdkafka using debusine to debusine.debian.net (host: debusine.debian.net; directory: /)
running debusine-check-workflow: check debusine workflow for distribution
running checksum: verify checksums before uploading
running suite-mismatch: check the target distribution for common errors
running gpg: check GnuPG signatures before the upload
Not checking GPG signature due to allow_unsigned_uploads being set.
Uploading librdkafka_2.10.0-1.dsc
Uploading librdkafka_2.10.0.orig.tar.gz
Uploading librdkafka_2.10.0-1.debian.tar.xz
Uploading librdkafka_2.10.0-1_amd64.buildinfo
Uploading librdkafka_2.10.0-1_source.changes
Created artifact: https://debusine.debian.net/debian/developers/artifact/1653562/
running debusine-create-workflow: create a debusine workflow
Traceback (most recent call last):
  File "/usr/bin/dput", line 130, in <module>
    upload_package(changes, args)
    ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/dput/uploader.py", line 352, in invoke_dput
    run_post_hooks(changes, profile)
    ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/dput/hook.py", line 64, in run_post_hooks
    run_hook(name, hook, changes, profile)
    ~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/dput/hook.py", line 83, in run_hook
    return run_func_by_name('hooks', name, changes, profile)
  File "/usr/lib/python3/dist-packages/dput/util.py", line 379, in run_func_by_name
    obj(changes, profile, interface)
    ~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/debusine/client/dput_ng/hooks.py", line 81, in create_workflow
    workflow_created = debusine.workflow_create(workflow)
  File "/usr/lib/python3/dist-packages/debusine/client/debusine.py", line 270, in workflow_create
    return self._debusine_http_client.post(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        "/workflow/",
        ^^^^^^^^^^^^^
    ...<2 lines>...
        expected_statuses=[requests.codes.created],
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/lib/python3/dist-packages/debusine/client/debusine_http_client.py", line 102, in post
    return self._api_request(
           ~~~~~~~~~~~~~~~~~^
        "POST",
        ^^^^^^^
    ...<3 lines>...
        expected_statuses=expected_statuses,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/lib/python3/dist-packages/debusine/client/debusine_http_client.py", line 294, in _api_request
    raise exceptions.DebusineError(error)
debusine.client.exceptions.DebusineError: {'title': 'Cannot create workflow', 'detail': '1 validation error for DebianPipelineWorkflowData\n__root__\n  "reverse_dependencies_autopkgtest_suite" is required if "enable_reverse_dependencies_autopkgtest" is set (type=value_error)'}