Commits on Source (17)
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
include MANIFEST.in
include LICENSE.txt
include *requirements.txt
include gittaggers.py Makefile cwltool.py
include tests/*
include tests/tmp1/tmp2/tmp3/.gitkeep
include tests/tmp4/alpha/*
include tests/wf/*
include tests/override/*
include tests/checker_wf/*
include tests/subgraph/*
include cwltool/schemas/v1.0/*.yml
include cwltool/schemas/draft-2/*.yml
include cwltool/schemas/draft-3/*.yml
include cwltool/schemas/draft-3/*.md
include cwltool/schemas/draft-3/salad/schema_salad/metaschema/*.yml
include cwltool/schemas/draft-3/salad/schema_salad/metaschema/*.md
include cwltool/schemas/v1.0/*.yml
include cwltool/schemas/v1.0/*.md
include cwltool/schemas/v1.0/salad/schema_salad/metaschema/*.yml
@@ -19,6 +20,24 @@ include cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/*.yml
include cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/*.md
include cwltool/cwlNodeEngine.js
include cwltool/cwlNodeEngineJSConsole.js
include cwltool/cwlNodeEngineWithContext.js
include cwltool/extensions.yml
include cwltool/jshint/jshint_wrapper.js
include cwltool/jshint/jshint.js
prune cwltool/schemas/v1.0/salad/typeshed
prune cwltool/schemas/v1.0/salad/schema_salad/tests
prune cwltool/schemas/v1.1.0-dev1/salad/typeshed
prune cwltool/schemas/v1.1.0-dev1/salad/schema_salad/tests
prune cwltool/schemas/presentations
prune cwltool/schemas/draft-2
prune cwltool/schemas/draft-1
prune cwltool/schemas/draft-3
prune cwltool/schemas/site
prune cwltool/schemas/v1.0/examples
prune cwltool/schemas/v1.0/v1.0
prune cwltool/schemas/v1.1.0-dev1/examples
prune cwltool/schemas/v1.1.0-dev1/v1.1.0-dev1
recursive-exclude cwltool/schemas *.py
exclude debian.img
global-exclude *~
global-exclude *.pyc
@@ -15,7 +15,7 @@
#
# Contact: common-workflow-language@googlegroups.com
# make pep8 to check for basic Python code compliance
# make pycodestyle to check for basic Python code compliance
# make autopep8 to fix most pep8 errors
# make pylint to check Python code for enhanced compliance including naming
# and documentation
@@ -26,16 +26,24 @@ MODULE=cwltool
# `SHELL=bash` doesn't work for some, so don't use BASH-isms like
# `[[` conditional expressions.
PYSOURCES=$(wildcard ${MODULE}/**.py tests/*.py) setup.py
DEVPKGS=pep8 diff_cover autopep8 pylint coverage pydocstyle flake8 pytest isort mock
DEVPKGS=pycodestyle diff_cover autopep8 pylint coverage pydocstyle flake8 \
pytest pytest-xdist isort
DEBDEVPKGS=pep8 python-autopep8 pylint python-coverage pydocstyle sloccount \
python-flake8 python-mock shellcheck
VERSION=1.0.$(shell date +%Y%m%d%H%M%S --utc --date=`git log --first-parent \
--max-count=1 --format=format:%cI`)
mkfile_dir := $(dir $(abspath $(lastword $(MAKEFILE_LIST))))
UNAME_S=$(shell uname -s)
ifeq ($(UNAME_S),Linux)
nproc=$(shell nproc)
endif
ifeq ($(UNAME_S),Darwin)
nproc=$(shell sysctl -n hw.physicalcpu)
endif
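The per-platform CPU detection above can be sketched portably in Python. Note one deliberate difference: the Makefile queries physical cores on macOS (``sysctl -n hw.physicalcpu``), while this sketch uses ``os.cpu_count()``, which reports logical CPUs on every platform.

```python
import os

# Portable analogue of the Makefile's nproc detection: no shelling out to
# `nproc` or `sysctl`, and a fallback of 1 when the count is unknown.
nproc = os.cpu_count() or 1
```

The resulting value plays the same role as the Makefile's ``$(nproc)``: a parallelism degree for ``pylint -j`` and ``pytest -n``.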
## all : default task
all:
./setup.py develop
pip install -e .
## help : print this help message and exit
help: Makefile
@@ -43,7 +51,7 @@ help: Makefile
## install-dep : install most of the development dependencies via pip
install-dep:
pip install --upgrade $(DEVPKGS)
pip install --upgrade $(DEVPKGS) -rtest-requirements.txt
## install-deb-dep: install most of the dev dependencies via apt-get
install-deb-dep:
@@ -51,7 +59,12 @@ install-deb-dep:
## install : install the ${MODULE} module and schema-salad-tool
install: FORCE
pip install .
pip install .[deps]
## dev : install the ${MODULE} module in dev mode
dev: install-dep
pip install -e .[deps]
## dist : create a module package for distribution
dist: dist/${MODULE}-$(VERSION).tar.gz
@@ -71,15 +84,18 @@ clean: FORCE
sort_imports:
isort ${MODULE}/*.py tests/*.py setup.py
## pep8 : check Python code style
pep8: $(PYSOURCES)
pep8 --exclude=_version.py --show-source --show-pep8 $^ || true
pep8: pycodestyle
## pycodestyle : check Python code style
pycodestyle: $(PYSOURCES)
pycodestyle --exclude=_version.py --show-source --show-pep8 $^ || true
pep8_report.txt: $(PYSOURCES)
pep8 --exclude=_version.py $^ > pep8_report.txt || true
pep8_report.txt: pycodestyle_report.txt
pycodestyle_report.txt: $(PYSOURCES)
pycodestyle --exclude=_version.py $^ > $@ || true
diff_pep8_report: pep8_report.txt
diff-quality --violations=pep8 pep8_report.txt
diff_pep8_report: diff_pycodestyle_report
diff_pycodestyle_report: pycodestyle_report.txt
diff-quality --violations=pycodestyle $^
pep257: pydocstyle
## pydocstyle : check Python code style
@@ -87,7 +103,7 @@ pydocstyle: $(PYSOURCES)
pydocstyle --ignore=D100,D101,D102,D103 $^ || true
pydocstyle_report.txt: $(PYSOURCES)
pydocstyle setup.py $^ > pydocstyle_report.txt 2>&1 || true
pydocstyle setup.py $^ > $@ 2>&1 || true
diff_pydocstyle_report: pydocstyle_report.txt
diff-quality --violations=pycodestyle $^
@@ -104,46 +120,48 @@ format: autopep8
## pylint : run static code analysis on Python code
pylint: $(PYSOURCES)
pylint --msg-template="{path}:{line}: [{msg_id}({symbol}), {obj}] {msg}" \
$^ || true
$^ -j$(nproc)|| true
pylint_report.txt: ${PYSOURCES}
pylint --msg-template="{path}:{line}: [{msg_id}({symbol}), {obj}] {msg}" \
$^ > pylint_report.txt || true
$^ -j$(nproc)> $@ || true
diff_pylint_report: pylint_report.txt
diff-quality --violations=pylint pylint_report.txt
.coverage: $(PYSOURCES) all
export COVERAGE_PROCESS_START=${mkfile_dir}.coveragerc; \
cd ${CWL}; ./run_test.sh RUNNER=cwltool
coverage run setup.py test
coverage combine ${CWL} ${CWL}/draft-3/ ./
.coverage: testcov
coverage: .coverage
coverage report
coverage.xml: .coverage
python-coverage xml
coverage xml
coverage.html: htmlcov/index.html
htmlcov/index.html: .coverage
python-coverage html
coverage html
@echo Test coverage of the Python code is now in htmlcov/index.html
coverage-report: .coverage
python-coverage report
coverage report
diff-cover: coverage-gcovr.xml coverage.xml
diff-cover coverage-gcovr.xml coverage.xml
diff-cover: coverage.xml
diff-cover $^
diff-cover.html: coverage-gcovr.xml coverage.xml
diff-cover coverage-gcovr.xml coverage.xml \
--html-report diff-cover.html
diff-cover.html: coverage.xml
diff-cover $^ --html-report diff-cover.html
## test : run the ${MODULE} test suite
test: FORCE
./setup.py test
test: $(pysources)
python setup.py test --addopts "-n$(nproc) --dist=loadfile"
## testcov : run the ${MODULE} test suite and collect coverage
testcov: $(pysources)
python setup.py test --addopts "--cov cwltool -n$(nproc) --dist=loadfile"
sloccount.sc: ${PYSOURCES} Makefile
sloccount --duplicates --wide --details $^ > sloccount.sc
sloccount --duplicates --wide --details $^ > $@
## sloccount : count lines of code
sloccount: ${PYSOURCES} Makefile
@@ -153,7 +171,6 @@ list-author-emails:
@echo 'name, E-Mail Address'
@git log --format='%aN,%aE' | sort -u | grep -v 'root'
mypy2: ${PYSOURCES}
rm -Rf typeshed/2and3/ruamel/yaml
ln -s $(shell python -c 'from __future__ import print_function; import ruamel.yaml; import os.path; print(os.path.dirname(ruamel.yaml.__file__))') \
@@ -175,9 +192,11 @@ mypy3: ${PYSOURCES}
MYPYPATH=$$MYPYPATH:typeshed/3:typeshed/2and3 mypy --disallow-untyped-calls \
--warn-redundant-casts \
cwltool
release: FORCE
release-test: FORCE
git diff-index --quiet HEAD -- || ( echo You have uncommitted changes, please commit them and try again; false )
./release-test.sh
release: release-test
. testenv2/bin/activate && \
testenv2/src/${MODULE}/setup.py sdist bdist_wheel && \
pip install twine && \
@@ -2,16 +2,27 @@
Common Workflow Language tool description reference implementation
==================================================================
CWL conformance tests: |Build Status| Travis CI: |Unix Build Status|
CWL conformance tests: |Conformance Status| |Linux Status| |Windows Status| |Coverage Status|
.. |Unix Build Status| image:: https://img.shields.io/travis/common-workflow-language/cwltool/master.svg?label=unix%20build
.. |Conformance Status| image:: https://ci.commonwl.org/buildStatus/icon?job=cwltool-conformance
:target: https://ci.commonwl.org/job/cwltool-conformance/
.. |Linux Status| image:: https://img.shields.io/travis/common-workflow-language/cwltool/master.svg?label=Linux%20builds
:target: https://travis-ci.org/common-workflow-language/cwltool
.. |Windows Status| image:: https://img.shields.io/appveyor/ci/mr-c/cwltool/master.svg?label=Windows%20builds
:target: https://ci.appveyor.com/project/mr-c/cwltool
.. |Coverage Status| image:: https://img.shields.io/codecov/c/github/common-workflow-language/cwltool.svg
:target: https://codecov.io/gh/common-workflow-language/cwltool
This is the reference implementation of the Common Workflow Language. It is
intended to feature complete and provide comprehensive validation of CWL
intended to be feature complete and provide comprehensive validation of CWL
files as well as provide other tools related to working with CWL.
This is written and tested for Python ``2.7 and 3.x {x = 3, 4, 5, 6}``
This is written and tested for
`Python <https://www.python.org/>`_ ``2.7 and 3.x {x = 4, 5, 6, 7}``
The reference implementation consists of two packages. The ``cwltool`` package
is the primary Python module containing the reference implementation in the
@@ -57,35 +68,9 @@ Remember, if co-installing multiple CWL implementations then you need to
maintain which implementation ``cwl-runner`` points to via a symbolic file
system link or `another facility <https://wiki.debian.org/DebianAlternatives>`_.
Running tests locally
---------------------
- Running basic tests ``(/tests)``:
To run the basic tests after installing `cwltool`, execute the following:
.. code:: bash
pip install pytest mock
py.test --ignore cwltool/schemas/ --pyarg cwltool
To run various tests in all supported Python environments we use `tox <https://github.com/common-workflow-language/cwltool/tree/master/tox.ini>`_. To run the test suite in all supported Python environments,
first download the complete code repository (see the ``git clone`` instructions above) and then run
the following in the terminal:
``pip install tox; tox``
A list of all environments can be seen using:
``tox --listenvs``
and a specific test env can be run using:
``tox -e <env name>``
- Running the entire suite of CWL conformance tests:
The GitHub repository for the CWL specifications contains a script that tests a CWL
implementation against a wide array of valid CWL files using the `cwltest <https://github.com/common-workflow-language/cwltest>`_
program.
Instructions for running these tests can be found in the Common Workflow Language Specification repository at https://github.com/common-workflow-language/common-workflow-language/blob/master/CONFORMANCE_TESTS.md
=====
Usage
=====
Run on the command line
-----------------------
@@ -101,7 +86,7 @@ the default cwl-runner use::
Use with boot2docker
--------------------
boot2docker is running docker inside a virtual machine and it only mounts ``Users``
boot2docker runs Docker inside a virtual machine and it only mounts ``Users``
on it. The default behavior of CWL is to create temporary directories under e.g.
``/Var``, which is not accessible to Docker containers.
@@ -110,14 +95,11 @@ and ``--tmp-outdir-prefix`` to somewhere under ``/Users``::
$ cwl-runner --tmp-outdir-prefix=/Users/username/project --tmpdir-prefix=/Users/username/project wc-tool.cwl wc-job.json
.. |Build Status| image:: https://ci.commonwl.org/buildStatus/icon?job=cwltool-conformance
:target: https://ci.commonwl.org/job/cwltool-conformance/
Using user-space replacements for Docker
----------------------------------------
Some shared computing environments don't support Docker software containers for technical or policy reasons.
As a work around, the CWL reference runner supports using a alternative ``docker`` implementations on Linux
As a workaround, the CWL reference runner supports using alternative ``docker`` implementations on Linux
with the ``--user-space-docker-cmd`` option.
One such "user space" friendly docker replacement is ``udocker`` https://github.com/indigo-dc/udocker and another
@@ -132,8 +114,8 @@ Run `cwltool` just as you normally would, but with the new option, e.g. from the
.. code:: bash
cwltool --user-space-docker-cmd=udocker https://raw.githubusercontent.com/common-workflow-language/common-workflow-language/master/v1.0/v1.0/test-cwl-out2.cwl https://github.com/common-workflow-language/common-workflow-language/blob/master/v1.0/v1.0/empty.json
or
or
.. code:: bash
@@ -149,8 +131,8 @@ To use Singularity as the Docker container runtime, provide ``--singularity`` co
cwltool --singularity https://raw.githubusercontent.com/common-workflow-language/common-workflow-language/master/v1.0/v1.0/v1.0/cat3-tool-mediumcut.cwl https://github.com/common-workflow-language/common-workflow-language/blob/master/v1.0/v1.0/cat-job.json
Tool or workflow loading from remote or local locations
-------------------------------------------------------
Running a tool or workflow from remote or local locations
---------------------------------------------------------
``cwltool`` can run tool and workflow descriptions on both local and remote
systems via its support for HTTP[S] URLs.
@@ -161,48 +143,117 @@ is referenced and that document isn't found in the current directory then the
following locations will be searched:
http://www.commonwl.org/v1.0/CommandLineTool.html#Discovering_CWL_documents_on_a_local_filesystem
You can also use `cwldep <https://github.com/common-workflow-language/cwldep>`_
to manage dependencies on external tools and workflows.
Use with GA4GH Tool Registry API
--------------------------------

Cwltool can launch tools directly from `GA4GH Tool Registry API`_ endpoints.

By default, cwltool searches https://dockstore.org/ . Use --add-tool-registry to add other registries to the search path.

For example ::

   cwltool --non-strict quay.io/collaboratory/dockstore-tool-bamstats:master test.json

and (defaults to latest when a version is not specified) ::

   cwltool --non-strict quay.io/collaboratory/dockstore-tool-bamstats test.json

For this example, grab the test.json (and input file) from https://github.com/CancerCollaboratory/dockstore-tool-bamstats

.. _`GA4GH Tool Registry API`: https://github.com/ga4gh/tool-registry-schemas

Import as a module
------------------

Add

.. code:: python

   import cwltool

to your script.

The easiest way to use cwltool to run a tool or workflow from Python is to use a Factory

.. code:: python

   import cwltool.factory
   fac = cwltool.factory.Factory()
   echo = fac.make("echo.cwl")
   result = echo(inp="foo")
   # result["out"] == "foo"

Overriding workflow requirements at load time
---------------------------------------------

Sometimes a workflow needs additional requirements to run in a particular
environment or with a particular dataset. To avoid the need to modify the
underlying workflow, cwltool supports requirement "overrides".

The format of the "overrides" object is a mapping of item identifier (workflow,
workflow step, or command line tool) to the process requirements that should be applied.

.. code:: yaml

   cwltool:overrides:
     echo.cwl:
       requirements:
         EnvVarRequirement:
           envDef:
             MESSAGE: override_value

Overrides can be specified either on the command line, or as part of the job
input document. Workflow steps are identified using the name of the workflow
file followed by the step name as a document fragment identifier "#id".
Override identifiers are relative to the toplevel workflow document.

.. code:: bash

   cwltool --overrides overrides.yml my-tool.cwl my-job.yml

.. code:: yaml

   input_parameter1: value1
   input_parameter2: value2
   cwltool:overrides:
     workflow.cwl#step1:
       requirements:
         EnvVarRequirement:
           envDef:
             MESSAGE: override_value

.. code:: bash

   cwltool my-tool.cwl my-job-with-overrides.yml

Combining parts of a workflow into a single document
----------------------------------------------------

Use ``--pack`` to combine a workflow made up of multiple files into a
single compound document. This operation takes all the CWL files
referenced by a workflow and builds a new CWL document with all
Process objects (CommandLineTool and Workflow) in a list in the
``$graph`` field. Cross references (such as ``run:`` and ``source:``
fields) are updated to internal references within the new packed
document. The top level workflow is named ``#main``.

.. code:: bash

   cwltool --pack my-wf.cwl > my-packed-wf.cwl
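The merge that an overrides mapping describes can be sketched in plain Python. This is an illustrative sketch only; ``apply_overrides`` is a hypothetical helper, not cwltool's API.

```python
# Hypothetical sketch: layer an overrides entry (like the YAML above) on top
# of a process's existing requirements. Not cwltool's implementation.
overrides = {
    "echo.cwl": {
        "requirements": {
            "EnvVarRequirement": {"envDef": {"MESSAGE": "override_value"}}
        }
    }
}

def apply_overrides(process_id, requirements, overrides):
    """Return requirements with any override entry for process_id layered on top."""
    merged = dict(requirements)
    merged.update(overrides.get(process_id, {}).get("requirements", {}))
    return merged
```

A requirement already present on the process is replaced wholesale by the override entry of the same name, matching the "mapping of item identifier to process requirements" description above.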
Running only part of a workflow
-------------------------------
You can run a partial workflow with the ``--target`` (``-t``) option. This
takes the name of an output parameter, workflow step, or input
parameter in the top level workflow. You may provide multiple
targets.
.. code:: bash
cwltool --target step3 my-wf.cwl
If a target is an output parameter, it will run only the steps
that contribute to that output. If a target is a workflow step, it
will run the workflow starting from that step. If a target is an
input parameter, it will run only the steps that are connected to
that input.
Use ``--print-targets`` to get a listing of the targets of a workflow.
To see exactly which steps will run, use ``--print-subgraph`` with
``--target`` to get a printout of the workflow subgraph for the
selected targets.
.. code:: bash
cwltool --print-targets my-wf.cwl
cwltool --target step3 --print-subgraph my-wf.cwl > my-wf-starting-from-step3.cwl
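The step selection that ``--target`` performs for an output parameter can be pictured as a backwards reachability walk over the workflow graph. The sketch below is illustrative only (the step and output names are hypothetical), not cwltool's implementation:

```python
def steps_for_output(target, produced_by, step_inputs):
    """Collect the steps needed to produce `target` by walking dependencies backwards."""
    needed, stack = set(), [target]
    while stack:
        out = stack.pop()
        step = produced_by.get(out)          # which step produces this value?
        if step and step not in needed:
            needed.add(step)
            stack.extend(step_inputs[step])  # also need whatever that step consumes
    return needed

# Toy workflow: step3 consumes step2's output, step2 consumes step1's.
produced_by = {"o1": "step1", "o2": "step2", "o3": "step3"}
step_inputs = {"step1": [], "step2": ["o1"], "step3": ["o2"]}
```

Targeting ``o3`` selects all three steps, while targeting ``o1`` selects only ``step1`` — the same pruning ``--print-subgraph`` makes visible.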
Visualizing a CWL document
--------------------------
The ``--print-dot`` option will print a file suitable for the Graphviz ``dot`` program. Here is a bash one-liner to generate a Scalable Vector Graphics (SVG) file:
.. code:: bash
cwltool --print-dot my-wf.cwl | dot -Tsvg > my-wf.svg
Modeling a CWL document as RDF
------------------------------
CWL documents can be expressed as RDF triple graphs.
.. code:: bash
cwltool --print-rdf --rdf-serializer=turtle mywf.cwl
Leveraging SoftwareRequirements (Beta)
--------------------------------------
@@ -220,7 +271,7 @@ installing cwltool. For instance::
Installing cwltool in this fashion enables several new command line options.
The most general of these options is ``--beta-dependency-resolvers-configuration``.
This option allows one to specify a dependency resolvers configuration file.
This option allows one to specify a dependency resolver's configuration file.
This file may be specified as either XML or YAML and very simply describes various
plugins to enable to "resolve" ``SoftwareRequirement`` dependencies.
@@ -412,48 +463,87 @@ at the following links:
- `Specifications - Implementation <https://github.com/galaxyproject/galaxy/commit/81d71d2e740ee07754785306e4448f8425f890bc>`__
- `Initial cwltool Integration Pull Request <https://github.com/common-workflow-language/cwltool/pull/214>`__
Overriding workflow requirements at load time
---------------------------------------------
Use with GA4GH Tool Registry API
--------------------------------
Sometimes a workflow needs additional requirements to run in a particular
environment or with a particular dataset. To avoid the need to modify the
underlying workflow, cwltool supports requirement "overrides".
Cwltool can launch tools directly from `GA4GH Tool Registry API`_ endpoints.
The format of the "overrides" object is a mapping of item identifier (workflow,
workflow step, or command line tool) to the process requirements that should be applied.
By default, cwltool searches https://dockstore.org/ . Use ``--add-tool-registry`` to add other registries to the search path.
.. code:: yaml
For example ::
cwltool:overrides:
echo.cwl:
requirements:
EnvVarRequirement:
envDef:
MESSAGE: override_value
cwltool quay.io/collaboratory/dockstore-tool-bamstats:develop test.json
Overrides can be specified either on the command line, or as part of the job
input document. Workflow steps are identified using the name of the workflow
file followed by the step name as a document fragment identifier "#id".
Override identifiers are relative to the toplevel workflow document.
and (defaults to latest when a version is not specified) ::
.. code:: bash
cwltool quay.io/collaboratory/dockstore-tool-bamstats test.json
cwltool --overrides overrides.yml my-tool.cwl my-job.yml
For this example, grab the test.json (and input file) from https://github.com/CancerCollaboratory/dockstore-tool-bamstats ::
.. code:: yaml
wget https://dockstore.org/api/api/ga4gh/v2/tools/quay.io%2Fbriandoconnor%2Fdockstore-tool-bamstats/versions/develop/PLAIN-CWL/descriptor/test.json
wget https://github.com/CancerCollaboratory/dockstore-tool-bamstats/raw/develop/rna.SRR948778.bam
input_parameter1: value1
input_parameter2: value2
cwltool:overrides:
workflow.cwl#step1:
requirements:
EnvVarRequirement:
envDef:
MESSAGE: override_value
.. _`GA4GH Tool Registry API`: https://github.com/ga4gh/tool-registry-schemas
===========
Development
===========
Running tests locally
---------------------
- Running basic tests ``(/tests)``:
To run the basis tests after installing `cwltool` execute the following:
.. code:: bash
cwltool my-tool.cwl my-job-with-overrides.yml
pip install -rtest-requirements.txt
py.test --ignore cwltool/schemas/ --pyarg cwltool
To run various tests in all supported Python environments we use `tox <https://github.com/common-workflow-language/cwltool/tree/master/tox.ini>`_. To run the test suite in all supported Python environments
first downloading the complete code repository (see the ``git clone`` instructions above) and then run
the following in the terminal:
``pip install tox; tox``
List of all environment can be seen using:
``tox --listenvs``
and running a specfic test env using:
``tox -e <env name>``
and additionally run a specific test using this format:
``tox -e py36-unit -- tests/test_examples.py::TestParamMatching``
- Running the entire suite of CWL conformance tests:
The GitHub repository for the CWL specifications contains a script that tests a CWL
implementation against a wide array of valid CWL files using the `cwltest <https://github.com/common-workflow-language/cwltest>`_
program.

Instructions for running these tests can be found in the Common Workflow Language Specification repository at https://github.com/common-workflow-language/common-workflow-language/blob/master/CONFORMANCE_TESTS.md
Import as a module
------------------
Add

.. code:: python

  import cwltool

to your script.

The easiest way to use cwltool to run a tool or workflow from Python is to use a Factory:

.. code:: python

  import cwltool.factory
  fac = cwltool.factory.Factory()

  echo = fac.make("echo.cwl")
  result = echo(inp="foo")

  # result["out"] == "foo"
CWL Tool Control Flow
---------------------
Technical outline of how cwltool works internally, for maintainers.
#. ``CommandLineTool`` job() objects yield a single runnable object.
#. The CommandLineTool ``job()`` method calls ``make_job_runner()`` to create a
   ``CommandLineJob`` object
#. The job method configures the CommandLineJob object by setting public
   attributes
Extension points
----------------
The following functions can be passed to main() to override or augment
the listed behaviors.

executor
  ::

    executor(tool, job_order_object, runtimeContext, logger)
    (Process, Dict[Text, Any], RuntimeContext) -> Tuple[Dict[Text, Any], Text]

  An implementation of the toplevel workflow execution loop; it should
  synchronously run a process object to completion and return the
  output object.
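For illustration, a custom executor must be a callable of exactly this shape. The sketch below is hypothetical (``make_timing_executor`` is not part of cwltool's API); it wraps any existing executor, such as the default one in ``cwltool.executors``, with wall-clock timing:

```python
import logging
import time

def make_timing_executor(base_executor):
    """Wrap any executor callable with wall-clock timing (illustrative sketch)."""
    def executor(tool, job_order_object, runtimeContext, logger):
        start = time.time()
        # Delegate the actual workflow run to the wrapped executor.
        out, status = base_executor(tool, job_order_object, runtimeContext, logger)
        logger.info("workflow finished with status %s in %.1fs",
                    status, time.time() - start)
        return out, status
    return executor
```

The resulting callable can then be supplied to main() as the executor; the return contract remains the ``(output_object, status)`` tuple described above.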
versionfunc
  ::

@@ -556,13 +631,16 @@ versionfunc

  Return version string.

logger_handler
  ::

    logger_handler
    logging.Handler

  Handler object for logging.

The following functions can be set in LoadingContext to override or
augment the listed behaviors.
fetcher_constructor
  ::

@@ -580,10 +658,47 @@ resolver

  Resolve a relative document identifier to an absolute one which can be fetched.
The following functions can be set in RuntimeContext to override or
augment the listed behaviors.

construct_tool_object
  ::

    construct_tool_object(toolpath_object, loadingContext)
    (MutableMapping[Text, Any], LoadingContext) -> Process

  Hook to construct a Process object (eg CommandLineTool) from a document.
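As a sketch of such a hook: here ``MyCommandLineTool`` is a hypothetical stand-in for a real subclass, and the ``default_make_tool`` fallback parameter is injected only to keep the example self-contained (the real hook receives just the two documented arguments):

```python
class MyCommandLineTool(object):
    """Hypothetical stand-in for a site-specific CommandLineTool subclass."""
    def __init__(self, toolpath_object, loadingContext):
        self.tool = toolpath_object

def construct_tool_object(toolpath_object, loadingContext, default_make_tool=None):
    """Route CommandLineTool documents to the custom class (illustrative sketch)."""
    if toolpath_object.get("class") == "CommandLineTool":
        return MyCommandLineTool(toolpath_object, loadingContext)
    # Fall back to stock construction for Workflow, ExpressionTool, etc.
    return default_make_tool(toolpath_object, loadingContext)
```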
select_resources
  ::

    select_resources(request, runtimeContext)
    (Dict[str, int], RuntimeContext) -> Dict[Text, int]

  Take a resource request and turn it into a concrete resource assignment.
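For example, a replacement that picks a value inside each requested range while applying a site-wide cap could be sketched as follows (the ``Min``/``Max`` key names follow CWL's ResourceRequirement; the limits themselves are hypothetical):

```python
SITE_LIMITS = {"cores": 8, "ram": 16384}  # hypothetical site policy (ram in MiB)

def select_resources(request, runtimeContext=None):
    """Turn a Min/Max resource request into a concrete, capped assignment."""
    result = {}
    for key in ("cores", "ram", "tmpdirSize", "outdirSize"):
        request_min = request.get(key + "Min") or 1
        request_max = request.get(key + "Max") or request_min
        # Cap at the site limit, but never go below the requested minimum.
        result[key] = max(request_min,
                          min(request_max, SITE_LIMITS.get(key, request_max)))
    return result
```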
make_fs_access
  ::

    make_fs_access(basedir)
    (Text) -> StdFsAccess

  Return a file system access object.
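A minimal sketch of the shape of such an object follows; the method names shown are only a subset, and the authoritative interface is ``StdFsAccess`` in ``cwltool/stdfsaccess.py``, which a real override should subclass:

```python
import glob as globmodule
import os

class LocalFsAccess(object):
    """Minimal StdFsAccess-style sketch resolving paths against a base directory."""
    def __init__(self, basedir):
        self.basedir = basedir

    def _abs(self, path):
        # Resolve relative paths against the configured base directory.
        return path if os.path.isabs(path) else os.path.join(self.basedir, path)

    def glob(self, pattern):
        return globmodule.glob(self._abs(pattern))

    def open(self, fn, mode):
        return open(self._abs(fn), mode)

    def exists(self, fn):
        return os.path.exists(self._abs(fn))

def make_fs_access(basedir):  # (Text) -> LocalFsAccess
    return LocalFsAccess(basedir)
```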
In addition, when providing custom subclasses of Process objects, you can override the following methods:

CommandLineTool.make_job_runner
  ::

    make_job_runner(RuntimeContext)
    (RuntimeContext) -> Type[JobBase]

  Create and return a job runner object (this implements concrete execution of a command line tool).
Workflow.make_workflow_step
  ::

    make_workflow_step(toolpath_object, pos, loadingContext, parentworkflowProv)
    (Dict[Text, Any], int, LoadingContext, Optional[ProvenanceProfile]) -> WorkflowStep

  Create and return a workflow step object.
LICENSE.txt
MANIFEST.in
Makefile
README.rst
cwltool.py
gittaggers.py
requirements.txt
setup.cfg
setup.py
test-requirements.txt
cwltool/__init__.py
cwltool/__main__.py
cwltool/argparser.py
cwltool/builder.py
cwltool/checker.py
cwltool/command_line_tool.py
cwltool/context.py
cwltool/cwlNodeEngine.js
cwltool/cwlNodeEngineJSConsole.js
cwltool/cwlNodeEngineWithContext.js
cwltool/cwlrdf.py
cwltool/docker.py
cwltool/docker_id.py
cwltool/draft2tool.py
cwltool/errors.py
cwltool/executors.py
cwltool/expression.py
@@ -30,13 +35,17 @@ cwltool/mutation.py
cwltool/pack.py
cwltool/pathmapper.py
cwltool/process.py
cwltool/provenance.py
cwltool/resolver.py
cwltool/sandboxjs.py
cwltool/secrets.py
cwltool/singularity.py
cwltool/software_requirements.py
cwltool/stdfsaccess.py
cwltool/subgraph.py
cwltool/update.py
cwltool/utils.py
cwltool/validate_js.py
cwltool/workflow.py
cwltool.egg-info/PKG-INFO
cwltool.egg-info/SOURCES.txt
@@ -45,40 +54,8 @@ cwltool.egg-info/entry_points.txt
cwltool.egg-info/requires.txt
cwltool.egg-info/top_level.txt
cwltool.egg-info/zip-safe
cwltool/schemas/draft-2/CommonWorkflowLanguage.yml
cwltool/schemas/draft-2/cwl-avro.yml
cwltool/schemas/draft-3/CommandLineTool-standalone.yml
cwltool/schemas/draft-3/CommandLineTool.yml
cwltool/schemas/draft-3/CommonWorkflowLanguage.yml
cwltool/schemas/draft-3/Process.yml
cwltool/schemas/draft-3/README.md
cwltool/schemas/draft-3/UserGuide.yml
cwltool/schemas/draft-3/Workflow.yml
cwltool/schemas/draft-3/concepts.md
cwltool/schemas/draft-3/contrib.md
cwltool/schemas/draft-3/index.yml
cwltool/schemas/draft-3/intro.md
cwltool/schemas/draft-3/invocation.md
cwltool/schemas/draft-3/userguide-intro.md
cwltool/schemas/draft-3/salad/schema_salad/metaschema/field_name.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/field_name_proc.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/field_name_schema.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/field_name_src.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/ident_res.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/ident_res_proc.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/ident_res_schema.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/ident_res_src.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/import_include.md
cwltool/schemas/draft-3/salad/schema_salad/metaschema/link_res.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/link_res_proc.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/link_res_schema.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/link_res_src.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/metaschema.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/salad.md
cwltool/schemas/draft-3/salad/schema_salad/metaschema/vocab_res.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/vocab_res_proc.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/vocab_res_schema.yml
cwltool/schemas/draft-3/salad/schema_salad/metaschema/vocab_res_src.yml
cwltool/jshint/jshint.js
cwltool/jshint/jshint_wrapper.js
cwltool/schemas/v1.0/CommandLineTool-standalone.yml
cwltool/schemas/v1.0/CommandLineTool.yml
cwltool/schemas/v1.0/CommonWorkflowLanguage.yml
@@ -144,9 +121,17 @@ cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/link_res.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/link_res_proc.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/link_res_schema.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/link_res_src.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/map_res.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/map_res_proc.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/map_res_schema.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/map_res_src.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/metaschema.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/metaschema_base.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/salad.md
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/typedsl_res.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/typedsl_res_proc.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/typedsl_res_schema.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/typedsl_res_src.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/vocab_res.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/vocab_res_proc.yml
cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/vocab_res_schema.yml
@@ -154,10 +139,15 @@ cwltool/schemas/v1.1.0-dev1/salad/schema_salad/metaschema/vocab_res_src.yml
tests/2.fasta
tests/2.fastq
tests/__init__.py
tests/echo-cwlrun-job.yaml
tests/bundle-context.jsonld
tests/echo-job.yaml
tests/echo.cwl
tests/echo_broken_outputs.cwl
tests/listing-job.yml
tests/listing2-job.yml
tests/non_portable.cwl
tests/non_portable2.cwl
tests/portable.cwl
tests/random_lines.cwl
tests/random_lines_job.json
tests/random_lines_mapping.cwl
@@ -165,14 +155,20 @@ tests/seqtk_seq.cwl
tests/seqtk_seq_job.json
tests/seqtk_seq_with_docker.cwl
tests/seqtk_seq_wrong_name.cwl
tests/test_anon_types.py
tests/test_bad_outputs_wf.cwl
tests/test_check.py
tests/test_cwl_version.py
tests/test_default_path.py
tests/test_dependencies.py
tests/test_deps_env_modules_resolvers_conf.yml
tests/test_deps_env_resolvers_conf.yml
tests/test_deps_env_resolvers_conf_rewrite.yml
tests/test_deps_mapping.yml
tests/test_docker.py
tests/test_docker_info.py
tests/test_docker_warning.py
tests/test_empty_input.py
tests/test_examples.py
tests/test_ext.py
tests/test_fetch.py
@@ -183,28 +179,75 @@ tests/test_override.py
tests/test_pack.py
tests/test_parallel.py
tests/test_pathmapper.py
tests/test_provenance.py
tests/test_rdfprint.py
tests/test_relax_path_checks.py
tests/test_relocate.py
tests/test_secrets.py
tests/test_singularity.py
tests/test_subgraph.py
tests/test_target.py
tests/test_toolargparse.py
tests/test_udocker.py
tests/test_validate_js.py
tests/utf_doc_example.cwl
tests/util.py
tests/checker_wf/broken-wf.cwl
tests/checker_wf/broken-wf2.cwl
tests/checker_wf/cat.cwl
tests/checker_wf/echo.cwl
tests/checker_wf/functional-wf.cwl
tests/override/echo-job-ov.yml
tests/override/echo-job-ov2.yml
tests/override/echo-job.yml
tests/override/echo-wf.cwl
tests/override/echo.cwl
tests/override/env-tool.cwl
tests/override/env-tool_cwl-requirement_override.yaml
tests/override/env-tool_cwl-requirement_override_default.yaml
tests/override/env-tool_cwl-requirement_override_default_wrongver.yaml
tests/override/env-tool_v1.1.0-dev1.cwl
tests/override/ov.yml
tests/override/ov2.yml
tests/override/ov3.yml
tests/subgraph/count-lines1-wf.cwl
tests/subgraph/extract_count_output.json
tests/subgraph/extract_file1.json
tests/subgraph/extract_file2.json
tests/subgraph/extract_file3.json
tests/subgraph/extract_output3.json
tests/subgraph/extract_output4.json
tests/subgraph/extract_output5.json
tests/subgraph/extract_step1.json
tests/subgraph/extract_step2.json
tests/subgraph/extract_step3.json
tests/subgraph/extract_step4.json
tests/subgraph/extract_step5.json
tests/subgraph/parseInt-tool.cwl
tests/subgraph/wc-tool.cwl
tests/tmp1/tmp2/tmp3/.gitkeep
tests/tmp4/alpha/baker
tests/tmp4/alpha/charlie
tests/tmp4/alpha/delta
tests/tmp4/alpha/echo
tests/tmp4/alpha/foxtrot
tests/wf/1st-workflow.cwl
tests/wf/910.cwl
tests/wf/arguments.cwl
tests/wf/badout1.cwl
tests/wf/badout2.cwl
tests/wf/badout3.cwl
tests/wf/cache_test_workflow.cwl
tests/wf/cat-tool.cwl
tests/wf/cat.cwl
tests/wf/count-lines1-wf.cwl
tests/wf/default-dir5.cwl
tests/wf/default-wf5.cwl
tests/wf/default_path.cwl
tests/wf/directory.cwl
tests/wf/echo.cwl
tests/wf/empty.ttl
tests/wf/empty2.ttl
tests/wf/expect_packed.cwl
tests/wf/formattest-job.json
tests/wf/formattest.cwl
@@ -212,6 +255,8 @@ tests/wf/hello-workflow.cwl
tests/wf/hello.txt
tests/wf/hello_single_tool.cwl
tests/wf/iwdr-entry.cwl
tests/wf/iwdr_permutations.cwl
tests/wf/iwdr_permutations_inplace.yml
tests/wf/js_output.cwl
tests/wf/js_output_workflow.cwl
tests/wf/listing_deep.cwl
@@ -219,19 +264,36 @@ tests/wf/listing_none.cwl
tests/wf/listing_shallow.cwl
tests/wf/listing_v1_0.cwl
tests/wf/malformed_outputs.cwl
tests/wf/missing-tool.cwl
tests/wf/missing_cwlVersion.cwl
tests/wf/mut.cwl
tests/wf/mut2.cwl
tests/wf/mut3.cwl
tests/wf/nested.cwl
tests/wf/networkaccess-fail.cwl
tests/wf/networkaccess.cwl
tests/wf/no-parameters-echo.cwl
tests/wf/override-no-secrets.yml
tests/wf/parseInt-tool.cwl
tests/wf/revsort-job.json
tests/wf/revsort.cwl
tests/wf/revtool.cwl
tests/wf/scatter-job2.json
tests/wf/scatter-wf4.cwl
tests/wf/scatter-wf4.json
tests/wf/scatterfail.cwl
tests/wf/sec-tool.cwl
tests/wf/sec-wf-out.cwl
tests/wf/sec-wf.cwl
tests/wf/secret_job.cwl
tests/wf/secret_wf.cwl
tests/wf/separate_without_prefix.cwl
tests/wf/sorttool.cwl
tests/wf/tar-param.cwl
tests/wf/timelimit-fail.cwl
tests/wf/timelimit.cwl
tests/wf/touch_tool.cwl
tests/wf/trick_defaults.cwl
tests/wf/updatedir.cwl
tests/wf/updatedir_inplace.cwl
tests/wf/updateval.cwl
@@ -242,4 +304,6 @@ tests/wf/wc-job.json
tests/wf/wc-tool.cwl
tests/wf/wffail.cwl
tests/wf/whale.txt
tests/wf/workreuse-fail.cwl
tests/wf/workreuse.cwl
tests/wf/wrong_cwlVersion.cwl
\ No newline at end of file
[console_scripts]
cwltool = cwltool.main:run
setuptools
requests>=2.6.1
ruamel.yaml<=0.15.77,>=0.12.4
rdflib<4.3.0,>=4.2.2
shellescape<3.5,>=3.4.1
schema-salad<3.1,>=3.0
mypy-extensions
six>=1.9.0
psutil
scandir
prov==1.5.1
bagit>=1.6.4
typing-extensions
[:os.name=="posix" and python_version<"3.5"]
subprocess32>=3.5.0
[:python_version<"3"]
pathlib2==2.3.2
[:python_version<"3.6"]
typing>=3.5.3
six>=1.8.0
[deps]
galaxy-lib>=17.09.9
#!/usr/bin/env python
from __future__ import absolute_import
"""Convenience entry point for cwltool.

This can be used instead of the recommended method of `./setup.py install`
or `./setup.py develop` and then using the generated `cwltool` executable.
@@ -11,4 +11,4 @@ import sys
from cwltool import main
if __name__ == "__main__":
    main.run(sys.argv[1:])
from __future__ import absolute_import
__author__ = 'pamstutz@veritasgenetics.com'
"""Default entrypoint for the cwltool module."""
from __future__ import absolute_import
import sys
from . import main
main.run()
"""Command line argument parsing for cwltool."""
from __future__ import absolute_import, print_function

import argparse
import logging
import os
from typing import (Any, AnyStr, Dict, List, MutableMapping, MutableSequence,
                    Optional, Sequence, Union, cast)

from schema_salad.ref_resolver import file_uri
from typing_extensions import Text  # pylint: disable=unused-import
# move to a regular typing import when Python 3.3-3.6 is no longer supported

from .loghandler import _logger
from .process import Process, shortname  # pylint: disable=unused-import
from .resolver import ga4gh_tool_registries
from .software_requirements import SOFTWARE_REQUIREMENTS_ENABLED
from .utils import DEFAULT_TMP_PREFIX
def arg_parser(): # type: () -> argparse.ArgumentParser
@@ -26,49 +25,47 @@ def arg_parser():  # type: () -> argparse.ArgumentParser
                        help="Output directory, default current directory")

    parser.add_argument("--parallel", action="store_true", default=False,
                        help="[experimental] Run jobs in parallel. ")
    envgroup = parser.add_mutually_exclusive_group()
    envgroup.add_argument("--preserve-environment", type=Text, action="append",
                          help="Preserve specific environment variable when "
                          "running CommandLineTools. May be provided multiple "
                          "times.", metavar="ENVVAR", default=["PATH"],
                          dest="preserve_environment")
    envgroup.add_argument("--preserve-entire-environment", action="store_true",
                          help="Preserve all environment variables when running "
                          "CommandLineTools.", default=False,
                          dest="preserve_entire_environment")
    exgroup = parser.add_mutually_exclusive_group()
    exgroup.add_argument("--rm-container", action="store_true", default=True,
                         help="Delete Docker container used by jobs after they exit (default)",
                         dest="rm_container")

    exgroup.add_argument(
        "--leave-container", action="store_false", default=True,
        help="Do not delete Docker container used by jobs after they exit",
        dest="rm_container")

    cidgroup = parser.add_argument_group(
        "Options for recording the Docker container identifier into a file.")
    # Disabled as containerid is now saved by default
    cidgroup.add_argument("--record-container-id", action="store_true",
                          default=False,
                          help=argparse.SUPPRESS,
                          dest="record_container_id")

    cidgroup.add_argument(
        "--cidfile-dir", type=Text, help="Store the Docker "
        "container ID into a file in the specified directory.",
        default=None, dest="cidfile_dir")

    cidgroup.add_argument(
        "--cidfile-prefix", type=Text,
        help="Specify a prefix to the container ID filename. "
        "Final file name will be followed by a timestamp. "
        "The default is no prefix.",
        default=None, dest="cidfile_prefix")
    parser.add_argument("--tmpdir-prefix", type=Text,
                        help="Path prefix for temporary directories",
@@ -79,8 +76,9 @@ def arg_parser():  # type: () -> argparse.ArgumentParser
                        help="Path prefix for intermediate output directories",
                        default=DEFAULT_TMP_PREFIX)

    exgroup.add_argument(
        "--cachedir", type=Text, default="",
        help="Directory to cache intermediate workflow outputs to avoid recomputing steps.")
    exgroup = parser.add_mutually_exclusive_group()
    exgroup.add_argument("--rm-tmpdir", action="store_true", default=True,
@@ -92,9 +90,10 @@ def arg_parser():  # type: () -> argparse.ArgumentParser
                         dest="rm_tmpdir")

    exgroup = parser.add_mutually_exclusive_group()
    exgroup.add_argument(
        "--move-outputs", action="store_const", const="move", default="move",
        help="Move output files to the workflow output directory and delete "
        "intermediate output directories (default).", dest="move_outputs")

    exgroup.add_argument("--leave-outputs", action="store_const", const="leave", default="move",
                         help="Leave output files in intermediate output directories.",
@@ -120,6 +119,43 @@ def arg_parser():  # type: () -> argparse.ArgumentParser
                        type=float,
                        default=20)
    provgroup = parser.add_argument_group("Options for recording provenance "
                                          "information of the execution")
    provgroup.add_argument("--provenance",
                           help="Save provenance to specified folder as a "
                           "Research Object that captures and aggregates "
                           "workflow execution and data products.",
                           type=Text)
    provgroup.add_argument("--enable-user-provenance", default=False,
                           action="store_true",
                           help="Record user account info as part of provenance.",
                           dest="user_provenance")
    provgroup.add_argument("--disable-user-provenance", default=False,
                           action="store_false",
                           help="Do not record user account info in provenance.",
                           dest="user_provenance")
    provgroup.add_argument("--enable-host-provenance", default=False,
                           action="store_true",
                           help="Record host info as part of provenance.",
                           dest="host_provenance")
    provgroup.add_argument("--disable-host-provenance", default=False,
                           action="store_false",
                           help="Do not record host info in provenance.",
                           dest="host_provenance")
    provgroup.add_argument(
        "--orcid", help="Record user ORCID identifier as part of "
        "provenance, e.g. https://orcid.org/0000-0002-1825-0097 "
        "or 0000-0002-1825-0097. Alternatively the environment variable "
        "ORCID may be set.", dest="orcid", default=os.environ.get("ORCID", ''),
        type=Text)
    provgroup.add_argument(
        "--full-name", help="Record full name of user as part of provenance, "
        "e.g. Josiah Carberry. You may need to use shell quotes to preserve "
        "spaces. Alternatively the environment variable CWL_FULL_NAME may "
        "be set.", dest="cwl_full_name", default=os.environ.get("CWL_FULL_NAME", ''),
        type=Text)
    exgroup = parser.add_mutually_exclusive_group()
    exgroup.add_argument("--print-rdf", action="store_true",
                         help="Print corresponding RDF graph for workflow and exit")
@@ -132,6 +168,10 @@ def arg_parser():  # type: () -> argparse.ArgumentParser
    exgroup.add_argument("--version", action="store_true", help="Print version and exit")
    exgroup.add_argument("--validate", action="store_true", help="Validate CWL document only.")
    exgroup.add_argument("--print-supported-versions", action="store_true", help="Print supported CWL specs.")
    exgroup.add_argument("--print-subgraph", action="store_true",
                         help="Print workflow subgraph that will execute "
                         "(can combine with --target)")
    exgroup.add_argument("--print-targets", action="store_true", help="Print targets (output parameters)")
    exgroup = parser.add_mutually_exclusive_group()
    exgroup.add_argument("--strict", action="store_true",
@@ -148,10 +188,21 @@ def arg_parser():  # type: () -> argparse.ArgumentParser
    exgroup.add_argument("--quiet", action="store_true", help="Only print warnings and errors.")
    exgroup.add_argument("--debug", action="store_true", help="Print even more logging")

    parser.add_argument(
        "--strict-memory-limit", action="store_true", help="When running with "
        "software containers and the Docker engine, pass either the "
        "calculated memory allocation from ResourceRequirements or the "
        "default of 1 gigabyte to Docker's --memory option.")

    parser.add_argument("--timestamps", action="store_true", help="Add "
                        "timestamps to the errors, warnings, and "
                        "notifications.")
    parser.add_argument("--js-console", action="store_true", help="Enable javascript console output")
    parser.add_argument("--disable-js-validation", action="store_true", help="Disable javascript validation.")
    parser.add_argument("--js-hint-options-file",
                        type=Text,
                        help="File of options to pass to jshint. "
                        "This includes the added option \"includewarnings\".")
    dockergroup = parser.add_mutually_exclusive_group()
    dockergroup.add_argument("--user-space-docker-cmd", metavar="CMD",
                             help="(Linux/OS X only) Specify a user space docker "
@@ -203,12 +254,12 @@ def arg_parser():  # type: () -> argparse.ArgumentParser
                        help="Specify a default docker container that will be used if the workflow fails to specify one.")
    parser.add_argument("--no-match-user", action="store_true",
                        help="Disable passing the current uid to `docker run --user`")
    parser.add_argument("--custom-net", type=Text,
                        help="Passed to `docker run` as the '--net' "
                        "parameter when NetworkAccess is true.")
    parser.add_argument("--disable-validate", dest="do_validate",
                        action="store_false", default=True,
                        help=argparse.SUPPRESS)
    exgroup = parser.add_mutually_exclusive_group()
    exgroup.add_argument("--enable-ga4gh-tool-registry", action="store_true", help="Enable resolution using GA4GH tool registry API",
@@ -220,8 +271,9 @@ def arg_parser():  # type: () -> argparse.ArgumentParser
                        dest="ga4gh_tool_registries", default=[])

    parser.add_argument("--on-error",
                        help="Desired workflow behavior when a step fails. One of 'stop' (do not submit any more steps) or "
                        "'continue' (may submit other steps that are not downstream from the error). Default is 'stop'.",
                        default="stop", choices=("stop", "continue"))
    exgroup = parser.add_mutually_exclusive_group()
    exgroup.add_argument("--compute-checksum", action="store_true", default=True,
@@ -247,6 +299,10 @@ def arg_parser():  # type: () -> argparse.ArgumentParser
    parser.add_argument("--overrides", type=str,
                        default=None, help="Read process requirement overrides from file.")

    parser.add_argument("--target", "-t", action="append",
                        help="Only execute steps that contribute to "
                        "listed targets (can provide more than once).")

    parser.add_argument("workflow", type=Text, nargs="?", default=None,
                        metavar='cwl_document', help="path or URL to a CWL Workflow, "
                        "CommandLineTool, or ExpressionTool. If the `inputs_object` has a "
@@ -266,7 +322,7 @@ def get_default_args():
    Get default values of cwltool's command line options
    """
    ap = arg_parser()
    args = ap.parse_args([])
    return vars(args)
@@ -335,8 +391,8 @@ def add_argument(toolparser, name, inptype, records, description="",
    else:
        flag = "--"

    required = default is None
    if isinstance(inptype, MutableSequence):
        if inptype[0] == "null":
            required = False
            if len(inptype) == 2:
@@ -346,23 +402,23 @@ def add_argument(toolparser, name, inptype, records, description="",
                return None

    ahelp = description.replace("%", "%%")
    action = None  # type: Optional[Union[argparse.Action, Text]]
    atype = None  # type: Any

    if inptype == "File":
        action = cast(argparse.Action, FileAction)
    elif inptype == "Directory":
        action = cast(argparse.Action, DirectoryAction)
    elif isinstance(inptype, MutableMapping) and inptype["type"] == "array":
        if inptype["items"] == "File":
            action = cast(argparse.Action, FileAppendAction)
        elif inptype["items"] == "Directory":
            action = cast(argparse.Action, DirectoryAppendAction)
        else:
            action = "append"
    elif isinstance(inptype, MutableMapping) and inptype["type"] == "enum":
        atype = Text
    elif isinstance(inptype, MutableMapping) and inptype["type"] == "record":
        records.append(name)
        for field in inptype['fields']:
            fieldname = name + "." + shortname(field['name'])
@@ -372,7 +428,7 @@ def add_argument(toolparser, name, inptype, records, description="",
                toolparser, fieldname, fieldtype, records,
                fielddescription)
        return
    elif inptype == "string":
        atype = Text
    elif inptype == "int":
        atype = int
@@ -382,11 +438,7 @@ def add_argument(toolparser, name, inptype, records, description="",
        atype = float
    elif inptype == "boolean":
        action = "store_true"
    else:
        _logger.debug(u"Can't make command line argument from %s", inptype)
        return None
"""Static checking of CWL workflow connectivity."""
from collections import namedtuple
from typing import Any, Dict, List, MutableMapping, MutableSequence, Optional
import six
from schema_salad import validate
from schema_salad.sourceline import SourceLine, bullets, strip_dup_lineno
from typing_extensions import Text # pylint: disable=unused-import
# move to a regular typing import when Python 3.3-3.6 is no longer supported
from .errors import WorkflowException
from .loghandler import _logger
from .process import shortname
from .utils import json_dumps
def _get_type(tp):
    # type: (Any) -> Any
    if isinstance(tp, MutableMapping):
        if tp.get("type") not in ("array", "record", "enum"):
            return tp["type"]
    return tp
def check_types(srctype, sinktype, linkMerge, valueFrom):
    # type: (Any, Any, Optional[Text], Optional[Text]) -> Text
    """Check if the source and sink types are "pass", "warning", or "exception"."""
    if valueFrom is not None:
        return "pass"
    if linkMerge is None:
        if can_assign_src_to_sink(srctype, sinktype, strict=True):
            return "pass"
        if can_assign_src_to_sink(srctype, sinktype, strict=False):
            return "warning"
        return "exception"
    if linkMerge == "merge_nested":
        return check_types({"items": _get_type(srctype), "type": "array"},
                           _get_type(sinktype), None, None)
    if linkMerge == "merge_flattened":
        return check_types(merge_flatten_type(_get_type(srctype)), _get_type(sinktype), None, None)
    raise WorkflowException(u"Unrecognized linkMerge enum '{}'".format(linkMerge))
def merge_flatten_type(src):
    # type: (Any) -> Any
    """Return the merge flattened type of the source type."""
    if isinstance(src, MutableSequence):
        return [merge_flatten_type(t) for t in src]
    if isinstance(src, MutableMapping) and src.get("type") == "array":
        return src
    return {"items": src, "type": "array"}
def can_assign_src_to_sink(src, sink, strict=False):  # type: (Any, Any, bool) -> bool
    """Check for identical type specifications, ignoring extra keys like inputBinding.

    src: admissible source types
    sink: admissible sink types

    In non-strict comparison, at least one source type must match one sink type.
    In strict comparison, all source types must match at least one sink type.
    """
    if src == "Any" or sink == "Any":
        return True
    if isinstance(src, MutableMapping) and isinstance(sink, MutableMapping):
        if sink.get("not_connected") and strict:
            return False
        if src["type"] == "array" and sink["type"] == "array":
            return can_assign_src_to_sink(src["items"], sink["items"], strict)
        if src["type"] == "record" and sink["type"] == "record":
            return _compare_records(src, sink, strict)
        if src["type"] == "File" and sink["type"] == "File":
            for sinksf in sink.get("secondaryFiles", []):
                if not [1 for srcsf in src.get("secondaryFiles", []) if sinksf == srcsf]:
                    if strict:
                        return False
            return True
        return can_assign_src_to_sink(src["type"], sink["type"], strict)
    if isinstance(src, MutableSequence):
        if strict:
            for this_src in src:
                if not can_assign_src_to_sink(this_src, sink):
                    return False
            return True
        for this_src in src:
            if can_assign_src_to_sink(this_src, sink):
                return True
        return False
    if isinstance(sink, MutableSequence):
        for this_sink in sink:
            if can_assign_src_to_sink(src, this_sink):
                return True
        return False
    return src == sink
def _compare_records(src, sink, strict=False):
# type: (MutableMapping[Text, Any], MutableMapping[Text, Any], bool) -> bool
"""Compare two records, ensuring they have compatible fields.
This handles normalizing record names, which will be relative to workflow
step, so that they can be compared.
"""
def _rec_fields(rec): # type: (MutableMapping[Text, Any]) -> MutableMapping[Text, Any]
out = {}
for field in rec["fields"]:
name = shortname(field["name"])
out[name] = field["type"]
return out
srcfields = _rec_fields(src)
sinkfields = _rec_fields(sink)
for key in six.iterkeys(sinkfields):
if (not can_assign_src_to_sink(
srcfields.get(key, "null"), sinkfields.get(key, "null"), strict)
and sinkfields.get(key) is not None):
_logger.info("Record comparison failure for %s and %s\n"
"Did not match fields for %s: %s and %s",
src["name"], sink["name"], key, srcfields.get(key),
sinkfields.get(key))
return False
return True
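In outline, `_compare_records` reduces both records to a name-to-type map and checks each sink field against the same-named source field, treating a missing field as `"null"`. A toy illustration with pre-normalized field maps, using plain equality in place of `can_assign_src_to_sink`:

```python
def compare_fields(srcfields, sinkfields):
    """Every sink field must be satisfiable by the same-named source
    field; a field absent from the source counts as "null"."""
    for key, sinktype in sinkfields.items():
        if srcfields.get(key, "null") != sinktype and sinktype is not None:
            return False
    return True

print(compare_fields({"name": "string", "size": "int"}, {"name": "string"}))
# True: extra source fields are fine
print(compare_fields({"name": "string"}, {"size": "int"}))
# False: the sink's "size" field has no source
```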
def static_checker(workflow_inputs, workflow_outputs, step_inputs, step_outputs, param_to_step):
# type: (List[Dict[Text, Any]], List[Dict[Text, Any]], List[Dict[Text, Any]], List[Dict[Text, Any]], Dict[Text, Dict[Text, Any]]) -> None
"""Check if all source and sink types of a workflow are compatible before run time.
"""
# source parameters: workflow_inputs and step_outputs
# sink parameters: step_inputs and workflow_outputs
# make a dictionary of source parameters, indexed by the "id" field
src_parms = workflow_inputs + step_outputs
src_dict = {}
for parm in src_parms:
src_dict[parm["id"]] = parm
step_inputs_val = check_all_types(src_dict, step_inputs, "source")
workflow_outputs_val = check_all_types(src_dict, workflow_outputs, "outputSource")
warnings = step_inputs_val["warning"] + workflow_outputs_val["warning"]
exceptions = step_inputs_val["exception"] + workflow_outputs_val["exception"]
warning_msgs = []
exception_msgs = []
for warning in warnings:
src = warning.src
sink = warning.sink
linkMerge = warning.linkMerge
if sink.get("secondaryFiles") and sorted(
sink.get("secondaryFiles", [])) != sorted(src.get("secondaryFiles", [])):
msg1 = "Sink '%s'" % (shortname(sink["id"]))
msg2 = SourceLine(sink.get("_tool_entry", sink), "secondaryFiles").makeError(
"expects secondaryFiles: %s but" % (sink.get("secondaryFiles")))
if "secondaryFiles" in src:
msg3 = SourceLine(src, "secondaryFiles").makeError(
"source '%s' has secondaryFiles %s." % (shortname(src["id"]), src.get("secondaryFiles")))
else:
msg3 = SourceLine(src, "id").makeError(
"source '%s' does not include secondaryFiles." % (shortname(src["id"])))
msg4 = SourceLine(src, "id").makeError("To fix, add secondaryFiles: %s to definition of '%s'." % (sink.get("secondaryFiles"), shortname(src["id"])))
msg = SourceLine(sink).makeError("%s\n%s" % (msg1, bullets([msg2, msg3, msg4], " ")))
elif sink.get("not_connected"):
msg = SourceLine(sink, "type").makeError(
"'%s' is not an input parameter of %s, expected %s"
% (shortname(sink["id"]), param_to_step[sink["id"]]["run"],
", ".join(shortname(s["id"])
for s in param_to_step[sink["id"]]["inputs"]
if not s.get("not_connected"))))
else:
msg = SourceLine(src, "type").makeError(
"Source '%s' of type %s may be incompatible"
% (shortname(src["id"]), json_dumps(src["type"]))) + "\n" + \
SourceLine(sink, "type").makeError(
" with sink '%s' of type %s"
% (shortname(sink["id"]), json_dumps(sink["type"])))
if linkMerge is not None:
msg += "\n" + SourceLine(sink).makeError(" source has linkMerge method %s" % linkMerge)
warning_msgs.append(msg)
for exception in exceptions:
src = exception.src
sink = exception.sink
linkMerge = exception.linkMerge
msg = SourceLine(src, "type").makeError(
"Source '%s' of type %s is incompatible"
% (shortname(src["id"]), json_dumps(src["type"]))) + "\n" + \
SourceLine(sink, "type").makeError(
" with sink '%s' of type %s"
% (shortname(sink["id"]), json_dumps(sink["type"])))
if linkMerge is not None:
msg += "\n" + SourceLine(sink).makeError(" source has linkMerge method %s" % linkMerge)
exception_msgs.append(msg)
for sink in step_inputs:
if ('null' != sink["type"] and 'null' not in sink["type"]
and "source" not in sink and "default" not in sink and "valueFrom" not in sink):
msg = SourceLine(sink).makeError(
"Required parameter '%s' does not have source, default, or valueFrom expression"
% shortname(sink["id"]))
exception_msgs.append(msg)
all_warning_msg = strip_dup_lineno("\n".join(warning_msgs))
all_exception_msg = strip_dup_lineno("\n".join(exception_msgs))
if warnings:
_logger.warning("Workflow checker warning:\n%s", all_warning_msg)
if exceptions:
raise validate.ValidationException(all_exception_msg)
SrcSink = namedtuple("SrcSink", ["src", "sink", "linkMerge"])
def check_all_types(src_dict, sinks, sourceField):
# type: (Dict[Text, Any], List[Dict[Text, Any]], Text) -> Dict[Text, List[SrcSink]]
# sourceField is either "source" or "outputSource"
"""Given a list of sinks, check if their types match with the types of their sources.
"""
validation = {"warning": [], "exception": []} # type: Dict[Text, List[SrcSink]]
for sink in sinks:
if sourceField in sink:
valueFrom = sink.get("valueFrom")
if isinstance(sink[sourceField], MutableSequence):
srcs_of_sink = [src_dict[parm_id] for parm_id in sink[sourceField]]
linkMerge = sink.get("linkMerge", ("merge_nested"
if len(sink[sourceField]) > 1 else None))
else:
parm_id = sink[sourceField]
srcs_of_sink = [src_dict[parm_id]]
linkMerge = None
for src in srcs_of_sink:
check_result = check_types(src, sink, linkMerge, valueFrom)
if check_result == "warning":
validation["warning"].append(SrcSink(src, sink, linkMerge))
elif check_result == "exception":
validation["exception"].append(SrcSink(src, sink, linkMerge))
return validation
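In outline, `check_all_types` resolves each sink's source parameter(s) through `src_dict`, classifies each src/sink pair, and buckets the result into warnings and exceptions. A toy run with a stand-in classifier (`check_types` itself lives elsewhere in this module; the data here is invented for illustration):

```python
from collections import namedtuple

SrcSink = namedtuple("SrcSink", ["src", "sink", "linkMerge"])

def toy_check(src, sink):
    # Stand-in for check_types: exact match passes, "Any" warns, else fails.
    if src["type"] == sink["type"]:
        return "pass"
    if "Any" in (src["type"], sink["type"]):
        return "warning"
    return "exception"

src_dict = {"#wf/in1": {"id": "#wf/in1", "type": "File"}}
sinks = [{"id": "#step/x", "type": "string", "source": "#wf/in1"}]

validation = {"warning": [], "exception": []}
for sink in sinks:
    src = src_dict[sink["source"]]
    result = toy_check(src, sink)
    if result != "pass":
        validation[result].append(SrcSink(src, sink, None))

print(len(validation["exception"]))  # 1: File cannot feed a string sink
```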
"""Shared context objects that replace use of kwargs."""
import copy
import threading # pylint: disable=unused-import
from typing import (Any, Callable, Dict, Iterable, List, MutableMapping,
Optional)
from schema_salad import schema
from schema_salad.ref_resolver import (ContextType, # pylint: disable=unused-import
Fetcher, Loader)
from typing_extensions import (TYPE_CHECKING, # pylint: disable=unused-import
Text)
# move to a regular typing import when Python 3.3-3.6 is no longer supported
from .builder import Builder, HasReqsHints
from .mutation import MutationManager
from .pathmapper import PathMapper
from .secrets import SecretStore
from .software_requirements import DependenciesConfiguration
from .stdfsaccess import StdFsAccess
from .utils import DEFAULT_TMP_PREFIX
if TYPE_CHECKING:
from .process import Process
from .provenance import (ResearchObject, # pylint: disable=unused-import
ProvenanceProfile)
class ContextBase(object):
def __init__(self, kwargs=None):
# type: (Optional[Dict[str, Any]]) -> None
if kwargs:
for k, v in kwargs.items():
if hasattr(self, k):
setattr(self, k, v)
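`ContextBase` folds a kwargs dict into attributes that subclasses pre-declare, silently dropping unknown keys. A self-contained demonstration of the pattern (class and data invented for illustration):

```python
class DemoContext:
    def __init__(self, kwargs=None):
        # Subclasses declare their defaults first ...
        self.debug = False
        self.strict = True
        # ... then kwargs override only attributes that already exist.
        if kwargs:
            for k, v in kwargs.items():
                if hasattr(self, k):
                    setattr(self, k, v)

ctx = DemoContext({"debug": True, "unknown_key": 1})
print(ctx.debug)                    # True
print(hasattr(ctx, "unknown_key"))  # False: unknown keys are ignored
```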
def make_tool_notimpl(toolpath_object, # type: MutableMapping[Text, Any]
loadingContext # type: LoadingContext
): # type: (...) -> Process
raise NotImplementedError()
default_make_tool = make_tool_notimpl # type: Callable[[MutableMapping[Text, Any], LoadingContext], Process]
class LoadingContext(ContextBase):
def __init__(self, kwargs=None):
# type: (Optional[Dict[str, Any]]) -> None
self.debug = False # type: bool
self.metadata = {} # type: Dict[Text, Any]
self.requirements = None
self.hints = None
self.overrides_list = [] # type: List[Dict[Text, Any]]
self.loader = None # type: Optional[Loader]
self.avsc_names = None # type: Optional[schema.Names]
self.disable_js_validation = False # type: bool
self.js_hint_options_file = None
self.do_validate = True # type: bool
self.enable_dev = False # type: bool
self.strict = True # type: bool
self.resolver = None
self.fetcher_constructor = None
self.construct_tool_object = default_make_tool
self.research_obj = None # type: Optional[ResearchObject]
self.orcid = '' # type: str
self.cwl_full_name = "" # type: str
self.host_provenance = False # type: bool
self.user_provenance = False # type: bool
self.prov_obj = None # type: Optional[ProvenanceProfile]
super(LoadingContext, self).__init__(kwargs)
def copy(self):
# type: () -> LoadingContext
return copy.copy(self)
class RuntimeContext(ContextBase):
def __init__(self, kwargs=None):
# type: (Optional[Dict[str, Any]]) -> None
select_resources_callable = Callable[ # pylint: disable=unused-variable
[Dict[str, int], RuntimeContext], Dict[str, int]]
self.user_space_docker_cmd = "" # type: Text
self.secret_store = None # type: Optional[SecretStore]
self.no_read_only = False # type: bool
self.custom_net = "" # type: Text
self.no_match_user = False # type: bool
self.preserve_environment = "" # type: Optional[Iterable[str]]
self.preserve_entire_environment = False # type: bool
self.use_container = True # type: bool
self.force_docker_pull = False # type: bool
self.tmp_outdir_prefix = DEFAULT_TMP_PREFIX # type: Text
self.tmpdir_prefix = DEFAULT_TMP_PREFIX # type: Text
self.tmpdir = "" # type: Text
self.rm_tmpdir = True # type: bool
self.pull_image = True # type: bool
self.rm_container = True # type: bool
self.move_outputs = "move" # type: Text
self.singularity = False # type: bool
self.disable_net = False # type: bool
self.debug = False # type: bool
self.compute_checksum = True # type: bool
self.name = "" # type: Text
self.default_container = "" # type: Text
self.find_default_container = None # type: Optional[Callable[[HasReqsHints], Optional[Text]]]
self.cachedir = None # type: Optional[Text]
self.outdir = None # type: Optional[Text]
self.stagedir = "" # type: Text
self.part_of = "" # type: Text
self.basedir = "" # type: Text
self.toplevel = False # type: bool
self.mutation_manager = None # type: Optional[MutationManager]
self.make_fs_access = StdFsAccess # type: Callable[[Text], StdFsAccess]
self.path_mapper = PathMapper
self.builder = None # type: Optional[Builder]
self.docker_outdir = "" # type: Text
self.docker_tmpdir = "" # type: Text
self.docker_stagedir = "" # type: Text
self.js_console = False # type: bool
self.job_script_provider = None # type: Optional[DependenciesConfiguration]
self.select_resources = None # type: Optional[select_resources_callable]
self.eval_timeout = 20 # type: float
self.postScatterEval = None # type: Optional[Callable[[MutableMapping[Text, Any]], Dict[Text, Any]]]
self.on_error = "stop" # type: Text
self.strict_memory_limit = False # type: bool
self.cidfile_dir = None
self.cidfile_prefix = None
self.workflow_eval_lock = None # type: Optional[threading.Condition]
self.research_obj = None # type: Optional[ResearchObject]
self.orcid = '' # type: str
self.cwl_full_name = "" # type: str
self.process_run_id = None # type: Optional[str]
self.prov_obj = None # type: Optional[ProvenanceProfile]
super(RuntimeContext, self).__init__(kwargs)
def copy(self):
# type: () -> RuntimeContext
return copy.copy(self)
def getdefault(val, default):
# type: (Any, Any) -> Any
if val is None:
return default
else:
return val
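Note that `getdefault` substitutes the default only for `None`, so explicit falsy values such as `0` or the empty string pass through unchanged:

```python
def getdefault(val, default):
    # Only None triggers the fallback; falsy values are kept.
    return default if val is None else val

print(getdefault(None, "stop"))  # stop
print(getdefault(0, 5))          # 0
print(getdefault("", "x"))       # '' (empty string preserved)
```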
"use strict";
process.stdin.setEncoding("utf8");
var incoming = "";
var firstInput = true;
var context = {};
process.stdin.on("data", function(chunk) {
incoming += chunk;
var i = incoming.indexOf("\n");
while (i > -1) {
try{
var input = incoming.substr(0, i);
incoming = incoming.substr(i+1);
var fn = JSON.parse(input);
if(firstInput){
context = require("vm").runInNewContext(fn, {});
}
else{
process.stdout.write(JSON.stringify(require("vm").runInNewContext(fn, context)) + "\n");
}
}
catch(e){
console.error(e);
}
if(firstInput){
firstInput = false;
}
else{
/*strings to indicate the process has finished*/
console.log("r1cepzbhUTxtykz5XTC4");
console.error("r1cepzbhUTxtykz5XTC4");
}
i = incoming.indexOf("\n");
}
});
process.stdin.on("end", process.exit);
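The Node engine above frames messages as newline-delimited JSON, buffering partial chunks until a full line arrives. The same framing loop, sketched in Python for clarity (function name and test data are illustrative):

```python
import json

def frame_messages(chunks):
    """Reassemble newline-delimited JSON messages from arbitrarily
    split chunks, mirroring the buffering loop in the engine above."""
    incoming = ""
    messages = []
    for chunk in chunks:
        incoming += chunk
        i = incoming.find("\n")
        while i > -1:
            line = incoming[:i]
            incoming = incoming[i + 1:]
            messages.append(json.loads(line))
            i = incoming.find("\n")
    return messages

print(frame_messages(['{"a": 1}\n{"b"', ': 2}\n']))
# [{'a': 1}, {'b': 2}]
```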
from __future__ import absolute_import
from typing import IO, Any, Dict, Text
from rdflib import Graph
from typing import IO, Any, Dict
from rdflib import Graph
from schema_salad.jsonld_context import makerdf
from schema_salad.ref_resolver import ContextType
from six.moves import urllib
from typing_extensions import Text # pylint: disable=unused-import
# move to a regular typing import when Python 3.3-3.6 is no longer supported
from .process import Process
@@ -20,17 +22,20 @@ def gather(tool, ctx): # type: (Process, ContextType) -> Graph
return g
def printrdf(wf, ctx, sr):
# type: (Process, ContextType, Text) -> Text
return gather(wf, ctx).serialize(format=sr).decode('utf-8')
def printrdf(wflow, ctx, style): # type: (Process, ContextType, Text) -> Text
"""Serialize the CWL document into a string, ready for printing."""
rdf = gather(wflow, ctx).serialize(format=style, encoding='utf-8')
if not rdf:
return u""
assert rdf is not None
return rdf.decode('utf-8')
def lastpart(uri): # type: (Any) -> Text
uri = Text(uri)
if "/" in uri:
return uri[uri.rindex("/") + 1:]
else:
return uri
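`lastpart` trims a URI down to its final path segment, which the dot output below uses for node labels. A standalone sketch:

```python
def lastpart(uri):
    """Return the text after the final "/", or the whole string if none."""
    uri = str(uri)
    if "/" in uri:
        return uri[uri.rindex("/") + 1:]
    return uri

print(lastpart("file:///tmp/workflow.cwl"))  # workflow.cwl
print(lastpart("step1"))                     # step1
```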
def dot_with_parameters(g, stdout): # type: (Graph, IO[Any]) -> None
@@ -41,7 +46,7 @@ def dot_with_parameters(g, stdout): # type: (Graph, IO[Any]) -> None
?run rdf:type ?runtype .
}""")
for step, run, runtype in qres:
for step, run, _ in qres:
stdout.write(u'"%s" [label="%s"]\n' % (lastpart(step), "%s (%s)" % (lastpart(step), lastpart(run))))
qres = g.query(