Compare revisions

Changes are shown as if the source revision was being merged into the target revision.


Commits on Source (91)
Showing with 72 additions and 73 deletions
@@ -34,3 +34,4 @@ doc/source/modules.rst
 doc/source/buildstream.rst
 doc/source/buildstream.*.rst
 doc/build/
+versioneer.pyc
-image: buildstream/testsuite-debian:9-master-114-4cab18e3
+image: buildstream/testsuite-debian:9-master-119-552f5fc6

 cache:
   key: "$CI_JOB_NAME-"
@@ -78,7 +78,7 @@ source_dist:
   # Go back to the toplevel and collect our reports
   - cd ../..
   - mkdir -p coverage-linux/
-  - cp dist/buildstream/.coverage.* coverage-linux/coverage."${CI_JOB_NAME}"
+  - cp dist/buildstream/.coverage coverage-linux/coverage."${CI_JOB_NAME}"
   except:
   - schedules
   artifacts:
@@ -86,25 +86,25 @@ source_dist:
     - coverage-linux/

 tests-debian-9:
-  image: buildstream/testsuite-debian:9-master-117-aa3a33b3
+  image: buildstream/testsuite-debian:9-master-119-552f5fc6
   <<: *linux-tests

 tests-fedora-27:
-  image: buildstream/testsuite-fedora:27-master-117-aa3a33b3
+  image: buildstream/testsuite-fedora:27-master-119-552f5fc6
   <<: *linux-tests

 tests-fedora-28:
-  image: buildstream/testsuite-fedora:28-master-117-aa3a33b3
+  image: buildstream/testsuite-fedora:28-master-119-552f5fc6
   <<: *linux-tests

 tests-ubuntu-18.04:
-  image: buildstream/testsuite-ubuntu:18.04-master-117-aa3a33b3
+  image: buildstream/testsuite-ubuntu:18.04-master-119-552f5fc6
   <<: *linux-tests

 tests-unix:
   # Use fedora here, to a) run a test on fedora and b) ensure that we
   # can get rid of ostree - this is not possible with debian-8
-  image: buildstream/testsuite-fedora:27-master-117-aa3a33b3
+  image: buildstream/testsuite-fedora:27-master-119-552f5fc6
   stage: test
   variables:
     BST_FORCE_BACKEND: "unix"
@@ -128,7 +128,7 @@ tests-unix:
   # Go back to the toplevel and collect our reports
   - cd ../..
   - mkdir -p coverage-unix/
-  - cp dist/buildstream/.coverage.* coverage-unix/coverage.unix
+  - cp dist/buildstream/.coverage coverage-unix/coverage.unix
   except:
   - schedules
   artifacts:
......
@@ -8,19 +8,29 @@ include README.rst

 # Documentation package includes
 include doc/Makefile
+include doc/badges.py
+include doc/bst2html.py
 include doc/source/conf.py
-include doc/source/index.rst
+include doc/source/plugin.rsttemplate
+recursive-include doc/source *.rst
+recursive-include doc/source *.py
+recursive-include doc/source *.in
+recursive-include doc/source *.html
+recursive-include doc/source *.odg
+recursive-include doc/source *.svg
+recursive-include doc/examples *

 # Tests
-recursive-include tests *.py
-recursive-include tests *.yaml
-recursive-include tests *.bst
-recursive-include tests *.conf
-recursive-include tests *.sh
-recursive-include tests *.expected
+recursive-include tests *
+include conftest.py
+include .coveragerc
+include .pylintrc

 # Protocol Buffers
 recursive-include buildstream/_protos *.proto

 # Requirements files
 include dev-requirements.txt
+
+# Versioneer
+include versioneer.py
@@ -31,6 +31,15 @@ buildstream 1.3.1
    new the `conf-root` variable to make the process easier. And there has been
    a bug fix to workspaces so they can be build in workspaces too.

+  o Creating a build shell through the interactive mode or `bst shell --build`
+    will now use the cached build tree. It is now easier to debug local build
+    failures.
+
+  o `bst shell --sysroot` now takes any directory that contains a sysroot,
+    instead of just a specially-formatted build-root with a `root` and `scratch`
+    subdirectory.
+
 =================
 buildstream 1.1.5
 =================
......
@@ -156,7 +156,7 @@ class ArtifactCache():
     def setup_remotes(self, *, use_config=False, remote_url=None):

         # Ensure we do not double-initialise since this can be expensive
-        assert(not self._remotes_setup)
+        assert not self._remotes_setup
         self._remotes_setup = True

         # Initialize remote artifact caches. We allow the commandline to override
@@ -252,7 +252,7 @@ class ArtifactCache():
     # (int): The size of the cache after having cleaned up
     #
     def clean(self):
-        artifacts = self.list_artifacts()
+        artifacts = self.list_artifacts()  # pylint: disable=assignment-from-no-return

         # Build a set of the cache keys which are required
         # based on the required elements at cleanup time
@@ -294,7 +294,7 @@ class ArtifactCache():
             if key not in required_artifacts:

                 # Remove the actual artifact, if it's not required.
-                size = self.remove(to_remove)
+                size = self.remove(to_remove)  # pylint: disable=assignment-from-no-return

                 # Remove the size from the removed size
                 self.set_cache_size(self._cache_size - size)
@@ -311,7 +311,7 @@ class ArtifactCache():
     # (int): The size of the artifact cache.
     #
     def compute_cache_size(self):
-        self._cache_size = self.calculate_cache_size()
+        self._cache_size = self.calculate_cache_size()  # pylint: disable=assignment-from-no-return
         return self._cache_size
......
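The `assert(not self._remotes_setup)` cleanup above is more than cosmetic: `assert` is a statement, not a function. With a single expression the parentheses are only redundant, but the call-like style invites adding a message, which silently turns the argument into a two-element tuple that is always truthy. A minimal sketch of the trap (the `demo` function and its message are illustrative, not BuildStream code):

```python
def demo():
    flag = False
    try:
        # Looks like assert(condition, message), but it is really
        # `assert <tuple>` - a non-empty tuple is always truthy,
        # so this assertion can never fail.
        assert (flag, "remotes already set up")
    except AssertionError:
        return "raised"
    return "silently passed"

print(demo())
```

Written without the parentheses, `assert flag, "remotes already set up"` would raise as intended; CPython even emits a `SyntaxWarning` ("assertion is always true") for the tuple form.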
@@ -33,11 +33,11 @@ import grpc

 from .. import _yaml

+from .._protos.google.rpc import code_pb2
 from .._protos.google.bytestream import bytestream_pb2, bytestream_pb2_grpc
 from .._protos.build.bazel.remote.execution.v2 import remote_execution_pb2, remote_execution_pb2_grpc
 from .._protos.buildstream.v2 import buildstream_pb2, buildstream_pb2_grpc

-from .._message import MessageType, Message
 from .. import _signals, utils
 from .._exceptions import ArtifactError
@@ -81,8 +81,9 @@ class CASCache(ArtifactCache):
     ################################################
     def preflight(self):
-        if (not os.path.isdir(os.path.join(self.casdir, 'refs', 'heads')) or
-                not os.path.isdir(os.path.join(self.casdir, 'objects'))):
+        headdir = os.path.join(self.casdir, 'refs', 'heads')
+        objdir = os.path.join(self.casdir, 'objects')
+        if not (os.path.isdir(headdir) and os.path.isdir(objdir)):
             raise ArtifactError("CAS repository check failed for '{}'"
                                 .format(self.casdir))
@@ -918,7 +919,7 @@ class CASCache(ArtifactCache):
                 # Skip download, already in local cache.
                 pass
             elif (digest.size_bytes >= remote.max_batch_total_size_bytes or
-                    not remote.batch_read_supported):
+                  not remote.batch_read_supported):
                 # Too large for batch request, download in independent request.
                 self._ensure_blob(remote, digest)
                 in_local_cache = True
@@ -958,7 +959,7 @@ class CASCache(ArtifactCache):
         batch = _CASBatchRead(remote)

         while len(fetch_queue) + len(fetch_next_queue) > 0:
-            if len(fetch_queue) == 0:
+            if not fetch_queue:
                 batch = self._fetch_directory_batch(remote, batch, fetch_queue, fetch_next_queue)

             dir_digest = fetch_queue.pop(0)
@@ -1087,6 +1088,10 @@ class _CASRemote():
         self.bytestream = None
         self.cas = None
         self.ref_storage = None
+        self.batch_update_supported = None
+        self.batch_read_supported = None
+        self.capabilities = None
+        self.max_batch_total_size_bytes = None

     def init(self):
         if not self._initialized:
@@ -1191,13 +1196,13 @@ class _CASBatchRead():
         assert not self._sent
         self._sent = True

-        if len(self._request.digests) == 0:
+        if not self._request.digests:
             return

         batch_response = self._remote.cas.BatchReadBlobs(self._request)

         for response in batch_response.responses:
-            if response.status.code != grpc.StatusCode.OK.value[0]:
+            if response.status.code != code_pb2.OK:
                 raise ArtifactError("Failed to download blob {}: {}".format(
                     response.digest.hash, response.status.code))
             if response.digest.size_bytes != len(response.data):
@@ -1236,13 +1241,13 @@ class _CASBatchUpdate():
         assert not self._sent
         self._sent = True

-        if len(self._request.requests) == 0:
+        if not self._request.requests:
             return

         batch_response = self._remote.cas.BatchUpdateBlobs(self._request)

         for response in batch_response.responses:
-            if response.status.code != grpc.StatusCode.OK.value[0]:
+            if response.status.code != code_pb2.OK:
                 raise ArtifactError("Failed to upload blob {}: {}".format(
                     response.digest.hash, response.status.code))
......
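The `preflight()` rewrite above is a small readability refactor: name the two paths once, then collapse the two negated `isdir()` checks with De Morgan's law (`not A or not B` is equivalent to `not (A and B)`). A standalone sketch of the same shape, with `RuntimeError` standing in for BuildStream's `ArtifactError`:

```python
import os
import tempfile

def preflight(casdir):
    # Name the paths, then apply De Morgan:
    # (not isdir(A)) or (not isdir(B))  ==  not (isdir(A) and isdir(B))
    headdir = os.path.join(casdir, 'refs', 'heads')
    objdir = os.path.join(casdir, 'objects')
    if not (os.path.isdir(headdir) and os.path.isdir(objdir)):
        raise RuntimeError("CAS repository check failed for '{}'".format(casdir))

casdir = tempfile.mkdtemp()
os.makedirs(os.path.join(casdir, 'refs', 'heads'))
os.makedirs(os.path.join(casdir, 'objects'))
preflight(casdir)  # both directories exist, so no error is raised
```

An empty directory (missing either subdirectory) makes the same call raise, matching the original compound condition branch for branch.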
@@ -364,7 +364,6 @@ class Context():
         assert self._message_handler

         self._message_handler(message, context=self)
-        return

     # silence()
     #
......
@@ -111,10 +111,8 @@ class BstError(Exception):
         #
         self.detail = detail

-        # The build sandbox in which the error occurred, if the
-        # error occurred at element assembly time.
-        #
-        self.sandbox = None
+        # A sandbox can be created to debug this error
+        self.sandbox = False

         # When this exception occurred during the handling of a job, indicate
         # whether or not there is any point retrying the job.
......
@@ -20,7 +20,6 @@
 from contextlib import contextmanager
 import os
 import sys
-import resource
 import traceback
 import datetime
 from textwrap import TextWrapper
@@ -598,7 +597,7 @@ class App():
             click.echo("\nDropping into an interactive shell in the failed build sandbox\n", err=True)
             try:
                 prompt = self.shell_prompt(element)
-                self.stream.shell(element, Scope.BUILD, prompt, directory=failure.sandbox, isolate=True)
+                self.stream.shell(element, Scope.BUILD, prompt, isolate=True)
             except BstError as e:
                 click.echo("Error while attempting to create interactive shell: {}".format(e), err=True)
         elif choice == 'log':
......
@@ -18,8 +18,8 @@
 #        Tristan Van Berkom <tristan.vanberkom@codethink.co.uk>

 import os
 import sys
-import click
 import curses
+import click

 # Import a widget internal for formatting time codes
 from .widget import TimeCode
......
@@ -668,17 +668,6 @@ class LogLine(Widget):
                 extra_nl = True

-            if message.sandbox is not None:
-                sandbox = self._indent + 'Sandbox directory: ' + message.sandbox
-
-                text += '\n'
-                if message.message_type == MessageType.FAIL:
-                    text += self._err_profile.fmt(sandbox, bold=True)
-                else:
-                    text += self._detail_profile.fmt(sandbox)
-                text += '\n'
-                extra_nl = True
-
             if message.scheduler and message.message_type == MessageType.FAIL:
                 text += '\n'
......
@@ -42,9 +42,11 @@ from .mount import Mount
 #
 class SafeHardlinks(Mount):

-    def __init__(self, directory, tempdir, fuse_mount_options={}):
+    def __init__(self, directory, tempdir, fuse_mount_options=None):
         self.directory = directory
         self.tempdir = tempdir
+        if fuse_mount_options is None:
+            fuse_mount_options = {}
         super().__init__(fuse_mount_options=fuse_mount_options)

     def create_operations(self):
......
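The `fuse_mount_options={}` to `None` change above (applied to `Mount` in the next hunk as well) fixes Python's classic mutable-default pitfall: a default dict is evaluated once at `def` time and shared by every call, so mutations leak between calls. A minimal illustration (the function and key names are made up):

```python
def shared_default(options={}):
    # This dict is created once, when the function is defined,
    # and reused across every call that omits the argument.
    options.setdefault("ro", True)
    return options

def fresh_default(options=None):
    # The pattern the diff adopts: allocate inside the call.
    if options is None:
        options = {}
    options.setdefault("ro", True)
    return options

leaky = shared_default()
leaky["extra"] = 1
print("extra" in shared_default())   # True: state leaked between calls
clean = fresh_default()
clean["extra"] = 1
print("extra" in fresh_default())    # False: each call gets a new dict
```

`Mount.__init__` below uses the equivalent one-liner form, `{} if fuse_mount_options is None else fuse_mount_options`.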
@@ -87,8 +87,8 @@ class Mount():
     #                  User Facing API                  #
     ################################################
-    def __init__(self, fuse_mount_options={}):
-        self._fuse_mount_options = fuse_mount_options
+    def __init__(self, fuse_mount_options=None):
+        self._fuse_mount_options = {} if fuse_mount_options is None else fuse_mount_options

     # mount():
     #
@@ -182,7 +182,7 @@ class Mount():
         # Ask the subclass to give us an Operations object
         #
-        self.__operations = self.create_operations()
+        self.__operations = self.create_operations()  # pylint: disable=assignment-from-no-return

         # Run fuse in foreground in this child process, internally libfuse
         # will handle SIGTERM and gracefully exit its own little main loop.
......
@@ -146,8 +146,8 @@ def _extract_depends_from_node(node, *, key=None):
     depends = _yaml.node_get(node, list, key, default_value=[])
     output_deps = []

-    for dep in depends:
-        dep_provenance = _yaml.node_get_provenance(node, key=key, indices=[depends.index(dep)])
+    for index, dep in enumerate(depends):
+        dep_provenance = _yaml.node_get_provenance(node, key=key, indices=[index])

         if isinstance(dep, str):
             dependency = Dependency(dep, provenance=dep_provenance, dep_type=default_dep_type)
@@ -177,10 +177,8 @@ def _extract_depends_from_node(node, *, key=None):
                                     provenance=dep_provenance)

         else:
-            index = depends.index(dep)
-            p = _yaml.node_get_provenance(node, key=key, indices=[index])
             raise LoadError(LoadErrorReason.INVALID_DATA,
-                            "{}: Dependency is not specified as a string or a dictionary".format(p))
+                            "{}: Dependency is not specified as a string or a dictionary".format(dep_provenance))

         output_deps.append(dependency)
......
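The switch to `enumerate()` above is both a speedup and a correctness fix: `depends.index(dep)` rescans the list on every iteration (O(n²) overall) and always returns the *first* match, so duplicate dependency entries would all be attributed to one position. A sketch with hypothetical element names:

```python
deps = ["base.bst", "tools.bst", "base.bst"]  # note the duplicate entry

# Old approach: .index() only ever finds the first occurrence.
old_indices = [deps.index(d) for d in deps]
# New approach: enumerate() yields each item's true position.
new_indices = [i for i, _ in enumerate(deps)]

print(old_indices)  # [0, 1, 0] - the duplicate maps to the wrong index
print(new_indices)  # [0, 1, 2]
```

With the wrong index, the provenance reported for the second `base.bst` would point at the first one's YAML location, producing a misleading error message.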
@@ -70,7 +70,7 @@ class Message():
         self.elapsed = elapsed        # The elapsed time, in timed messages
         self.depth = depth            # The depth of a timed message
         self.logfile = logfile        # The log file path where commands took place
-        self.sandbox = sandbox        # The sandbox directory where an error occurred (if any)
+        self.sandbox = sandbox        # The error that caused this message used a sandbox
         self.pid = os.getpid()        # The process pid
         self.unique_id = unique_id    # The plugin object ID issueing the message
         self.task_id = task_id        # The plugin object ID of the task
......
@@ -43,9 +43,9 @@ class OptionBool(Option):
         self.value = _yaml.node_get(node, bool, self.name)

     def set_value(self, value):
-        if value == 'True' or value == 'true':
+        if value in ('True', 'true'):
             self.value = True
-        elif value == 'False' or value == 'false':
+        elif value in ('False', 'false'):
             self.value = False
         else:
             raise LoadError(LoadErrorReason.INVALID_DATA,
......
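The membership-test form above, `value in ('True', 'true')`, is the idiomatic replacement for chained equality comparisons. The same logic as a standalone sketch (`parse_bool` is an illustrative name, not BuildStream API):

```python
def parse_bool(value):
    # Tuple membership replaces `value == 'True' or value == 'true'`.
    if value in ('True', 'true'):
        return True
    elif value in ('False', 'false'):
        return False
    raise ValueError("Expected a boolean value, not: {}".format(value))

print(parse_bool('true'))   # True
print(parse_bool('False'))  # False
```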
@@ -16,9 +16,7 @@
 #  License along with this library. If not, see <http://www.gnu.org/licenses/>.

 import os
-import resource

-from .._exceptions import PlatformError
 from ..sandbox import SandboxDummy

 from . import Platform
@@ -29,10 +27,6 @@ class Darwin(Platform):
     # This value comes from OPEN_MAX in syslimits.h
     OPEN_MAX = 10240

-    def __init__(self):
-        super().__init__()
-
     def create_sandbox(self, *args, **kwargs):
         kwargs['dummy_reason'] = \
             "OSXFUSE is not supported and there are no supported sandbox" + \
......
@@ -22,7 +22,6 @@ import subprocess

 from .. import _site
 from .. import utils
-from .._message import Message, MessageType
 from ..sandbox import SandboxDummy

 from . import Platform
@@ -112,8 +111,4 @@ class Linux(Platform):
         except subprocess.CalledProcessError:
             output = ''

-        if output == 'root':
-            return True
-        else:
-            return False
+        return output == 'root'
......
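The last change above collapses an `if/else` that returned literal booleans: the `==` comparison already evaluates to a `bool`, so it can be returned directly. Sketch (the helper name is invented for illustration):

```python
def is_root(output):
    # `output == 'root'` already yields True or False; no branch needed.
    return output == 'root'

print(is_root('root'))  # True
print(is_root(''))      # False
```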
@@ -414,7 +414,7 @@ class Job():
         try:
             # Try the task action
-            result = self.child_process()
+            result = self.child_process()  # pylint: disable=assignment-from-no-return
         except SkipJob as e:
             elapsed = datetime.datetime.now() - starttime
             self.message(MessageType.SKIPPED, str(e),
......
@@ -57,7 +57,7 @@ class PullQueue(Queue):
     def done(self, _, element, result, success):

         if not success:
-            return False
+            return

         element._pull_done()