
Compare revisions

Changes are shown as if the source revision was being merged into the target revision.


Commits on Source (58)
Showing changes with 898 additions and 641 deletions
@@ -166,6 +166,12 @@ docs:
   BST_EXT_REF: 1d6ab71151b93c8cbc0a91a36ffe9270f3b835f1 # 0.5.1
   FD_SDK_REF: 88d7c22c2281b987faa02edd57df80d430eecf1f # 18.08.11-35-g88d7c22c
   before_script:
+  - |
+    mkdir -p "${HOME}/.config"
+    cat <<EOF >"${HOME}/.config/buildstream.conf"
+    scheduler:
+      fetchers: 2
+    EOF
   - (cd dist && ./unpack.sh && cd buildstream && pip3 install .)
   - pip3 install --user -e ${BST_EXT_URL}@${BST_EXT_REF}#egg=bst_ext
   - git clone https://gitlab.com/freedesktop-sdk/freedesktop-sdk.git
@@ -2,6 +2,15 @@
 buildstream 1.3.1
 =================
 
+  o `bst shell` learned the `-e` option for staging multiple elements
+    provided the element's kind implements `BST_GRANULAR_STAGE`.
+
+  o BREAKING CHANGE: The 'manual' element lost its default 'MAKEFLAGS' and 'V'
+    environment variables. There is already a 'make' element with the same
+    variables. Note that this is a breaking change, it will require users to
+    make changes to their .bst files if they are expecting these environment
+    variables to be set.
+
   o Failed builds are included in the cache as well.
     `bst checkout` will provide anything in `%{install-root}`.
     A build including cached fails will cause any dependant elements

@@ -31,6 +40,15 @@ buildstream 1.3.1
     new the `conf-root` variable to make the process easier. And there has been
     a bug fix to workspaces so they can be build in workspaces too.
 
+  o Creating a build shell through the interactive mode or `bst shell --build`
+    will now use the cached build tree. It is now easier to debug local build
+    failures.
+
+  o `bst shell --sysroot` now takes any directory that contains a sysroot,
+    instead of just a specially-formatted build-root with a `root` and
+    `scratch` subdirectory.
+
 =================
 buildstream 1.1.5
 =================
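(For context: with the new `-e` option a shell can be staged from several elements at once, e.g. `bst shell -e base.bst -e app.bst -- sh`, where the element names here are purely illustrative. A single positional element remains supported as before.)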
@@ -17,17 +17,22 @@
 # Authors:
 #        Tristan Maat <tristan.maat@codethink.co.uk>
 
+import multiprocessing
 import os
+import signal
 import string
 from collections import namedtuple
 from collections.abc import Mapping
 
 from ..types import _KeyStrength
-from .._exceptions import ArtifactError, ImplError, LoadError, LoadErrorReason
+from .._exceptions import ArtifactError, CASError, LoadError, LoadErrorReason
 from .._message import Message, MessageType
+from .. import _signals
 from .. import utils
 from .. import _yaml
 
+from .cascache import CASCache, CASRemote
+
 
 CACHE_SIZE_FILE = "cache_size"

@@ -93,7 +98,8 @@ class ArtifactCache():
     def __init__(self, context):
         self.context = context
         self.extractdir = os.path.join(context.artifactdir, 'extract')
-        self.tmpdir = os.path.join(context.artifactdir, 'tmp')
+
+        self.cas = CASCache(context.artifactdir)
 
         self.global_remote_specs = []
         self.project_remote_specs = {}

@@ -104,12 +110,15 @@ class ArtifactCache():
         self._cache_lower_threshold = None    # The target cache size for a cleanup
         self._remotes_setup = False           # Check to prevent double-setup of remotes
 
+        # Per-project list of _CASRemote instances.
+        self._remotes = {}
+
+        self._has_fetch_remotes = False
+        self._has_push_remotes = False
+
         os.makedirs(self.extractdir, exist_ok=True)
-        os.makedirs(self.tmpdir, exist_ok=True)
 
-    ################################################
-    #  Methods implemented on the abstract class   #
-    ################################################
+        self._calculate_cache_quota()
 
     # get_artifact_fullname()
     #

@@ -240,8 +249,10 @@ class ArtifactCache():
         for key in (strong_key, weak_key):
             if key:
                 try:
-                    self.update_mtime(element, key)
-                except ArtifactError:
+                    ref = self.get_artifact_fullname(element, key)
+
+                    self.cas.update_mtime(ref)
+                except CASError:
                     pass
 
     # clean():

@@ -252,7 +263,7 @@ class ArtifactCache():
     #    (int): The size of the cache after having cleaned up
     #
     def clean(self):
-        artifacts = self.list_artifacts()  # pylint: disable=assignment-from-no-return
+        artifacts = self.list_artifacts()
 
         # Build a set of the cache keys which are required
         # based on the required elements at cleanup time

@@ -294,7 +305,7 @@ class ArtifactCache():
             if key not in required_artifacts:
 
                 # Remove the actual artifact, if it's not required.
-                size = self.remove(to_remove)  # pylint: disable=assignment-from-no-return
+                size = self.remove(to_remove)
 
                 # Remove the size from the removed size
                 self.set_cache_size(self._cache_size - size)

@@ -311,7 +322,7 @@ class ArtifactCache():
    #    (int): The size of the artifact cache.
    #
    def compute_cache_size(self):
-        self._cache_size = self.calculate_cache_size()  # pylint: disable=assignment-from-no-return
+        self._cache_size = self.cas.calculate_cache_size()
 
        return self._cache_size

@@ -380,28 +391,12 @@ class ArtifactCache():
     def has_quota_exceeded(self):
         return self.get_cache_size() > self._cache_quota
 
-    ################################################
-    # Abstract methods for subclasses to implement #
-    ################################################
-
     # preflight():
     #
     # Preflight check.
     #
     def preflight(self):
-        pass
+        self.cas.preflight()
 
-    # update_mtime()
-    #
-    # Update the mtime of an artifact.
-    #
-    # Args:
-    #     element (Element): The Element to update
-    #     key (str): The key of the artifact.
-    #
-    def update_mtime(self, element, key):
-        raise ImplError("Cache '{kind}' does not implement update_mtime()"
-                        .format(kind=type(self).__name__))
-
     # initialize_remotes():
     #

@@ -411,7 +406,59 @@ class ArtifactCache():
     #     on_failure (callable): Called if we fail to contact one of the caches.
     #
     def initialize_remotes(self, *, on_failure=None):
-        pass
+        remote_specs = self.global_remote_specs
+
+        for project in self.project_remote_specs:
+            remote_specs += self.project_remote_specs[project]
+
+        remote_specs = list(utils._deduplicate(remote_specs))
+
+        remotes = {}
+        q = multiprocessing.Queue()
+        for remote_spec in remote_specs:
+            # Use subprocess to avoid creation of gRPC threads in main BuildStream process
+            # See https://github.com/grpc/grpc/blob/master/doc/fork_support.md for details
+            p = multiprocessing.Process(target=self.cas.initialize_remote, args=(remote_spec, q))
+
+            try:
+                # Keep SIGINT blocked in the child process
+                with _signals.blocked([signal.SIGINT], ignore=False):
+                    p.start()
+
+                error = q.get()
+                p.join()
+            except KeyboardInterrupt:
+                utils._kill_process_tree(p.pid)
+                raise
+
+            if error and on_failure:
+                on_failure(remote_spec.url, error)
+            elif error:
+                raise ArtifactError(error)
+            else:
+                self._has_fetch_remotes = True
+                if remote_spec.push:
+                    self._has_push_remotes = True
+
+                remotes[remote_spec.url] = CASRemote(remote_spec)
+
+        for project in self.context.get_projects():
+            remote_specs = self.global_remote_specs
+            if project in self.project_remote_specs:
+                remote_specs = list(utils._deduplicate(remote_specs + self.project_remote_specs[project]))
+
+            project_remotes = []
+
+            for remote_spec in remote_specs:
+                # Errors are already handled in the loop above,
+                # skip unreachable remotes here.
+                if remote_spec.url not in remotes:
+                    continue
+
+                remote = remotes[remote_spec.url]
+                project_remotes.append(remote)
+
+            self._remotes[project] = project_remotes
 
     # contains():
     #

@@ -425,8 +472,9 @@ class ArtifactCache():
     # Returns: True if the artifact is in the cache, False otherwise
     #
     def contains(self, element, key):
-        raise ImplError("Cache '{kind}' does not implement contains()"
-                        .format(kind=type(self).__name__))
+        ref = self.get_artifact_fullname(element, key)
+
+        return self.cas.contains(ref)
 
     # list_artifacts():
     #

@@ -437,8 +485,7 @@ class ArtifactCache():
     #               `ArtifactCache.get_artifact_fullname` in LRU order
     #
     def list_artifacts(self):
-        raise ImplError("Cache '{kind}' does not implement list_artifacts()"
-                        .format(kind=type(self).__name__))
+        return self.cas.list_refs()
 
     # remove():
     #

@@ -450,9 +497,31 @@ class ArtifactCache():
     #                          generated by
     #                          `ArtifactCache.get_artifact_fullname`)
     #
-    def remove(self, artifact_name):
-        raise ImplError("Cache '{kind}' does not implement remove()"
-                        .format(kind=type(self).__name__))
+    # Returns:
+    #    (int|None) The amount of space pruned from the repository in
+    #               Bytes, or None if defer_prune is True
+    #
+    def remove(self, ref):
+
+        # Remove extract if not used by other ref
+        tree = self.cas.resolve_ref(ref)
+        ref_name, ref_hash = os.path.split(ref)
+        extract = os.path.join(self.extractdir, ref_name, tree.hash)
+        keys_file = os.path.join(extract, 'meta', 'keys.yaml')
+        if os.path.exists(keys_file):
+            keys_meta = _yaml.load(keys_file)
+            keys = [keys_meta['strong'], keys_meta['weak']]
+            remove_extract = True
+            for other_hash in keys:
+                if other_hash == ref_hash:
+                    continue
+                remove_extract = False
+                break
+
+            if remove_extract:
+                utils._force_rmtree(extract)
+
+        return self.cas.remove(ref)
 
     # extract():
     #

@@ -472,8 +541,11 @@ class ArtifactCache():
     # Returns: path to extracted artifact
     #
     def extract(self, element, key):
-        raise ImplError("Cache '{kind}' does not implement extract()"
-                        .format(kind=type(self).__name__))
+        ref = self.get_artifact_fullname(element, key)
+
+        path = os.path.join(self.extractdir, element._get_project().name, element.normal_name)
+
+        return self.cas.extract(ref, path)
 
     # commit():
     #

@@ -485,8 +557,9 @@ class ArtifactCache():
     #     keys (list): The cache keys to use
     #
     def commit(self, element, content, keys):
-        raise ImplError("Cache '{kind}' does not implement commit()"
-                        .format(kind=type(self).__name__))
+        refs = [self.get_artifact_fullname(element, key) for key in keys]
+
+        self.cas.commit(refs, content)
 
     # diff():
     #

@@ -500,8 +573,10 @@ class ArtifactCache():
     #     subdir (str): A subdirectory to limit the comparison to
     #
     def diff(self, element, key_a, key_b, *, subdir=None):
-        raise ImplError("Cache '{kind}' does not implement diff()"
-                        .format(kind=type(self).__name__))
+        ref_a = self.get_artifact_fullname(element, key_a)
+        ref_b = self.get_artifact_fullname(element, key_b)
+
+        return self.cas.diff(ref_a, ref_b, subdir=subdir)
 
     # has_fetch_remotes():
     #

@@ -513,7 +588,16 @@ class ArtifactCache():
     # Returns: True if any remote repositories are configured, False otherwise
     #
     def has_fetch_remotes(self, *, element=None):
-        return False
+        if not self._has_fetch_remotes:
+            # No project has fetch remotes
+            return False
+        elif element is None:
+            # At least one (sub)project has fetch remotes
+            return True
+        else:
+            # Check whether the specified element's project has fetch remotes
+            remotes_for_project = self._remotes[element._get_project()]
+            return bool(remotes_for_project)
 
     # has_push_remotes():
     #

@@ -525,7 +609,16 @@ class ArtifactCache():
     # Returns: True if any remote repository is configured, False otherwise
     #
     def has_push_remotes(self, *, element=None):
-        return False
+        if not self._has_push_remotes:
+            # No project has push remotes
+            return False
+        elif element is None:
+            # At least one (sub)project has push remotes
+            return True
+        else:
+            # Check whether the specified element's project has push remotes
+            remotes_for_project = self._remotes[element._get_project()]
+            return any(remote.spec.push for remote in remotes_for_project)
 
     # push():
     #

@@ -542,8 +635,28 @@ class ArtifactCache():
     #   (ArtifactError): if there was an error
     #
     def push(self, element, keys):
-        raise ImplError("Cache '{kind}' does not implement push()"
-                        .format(kind=type(self).__name__))
+        refs = [self.get_artifact_fullname(element, key) for key in list(keys)]
+
+        project = element._get_project()
+
+        push_remotes = [r for r in self._remotes[project] if r.spec.push]
+
+        pushed = False
+
+        for remote in push_remotes:
+            remote.init()
+            display_key = element._get_brief_display_key()
+            element.status("Pushing artifact {} -> {}".format(display_key, remote.spec.url))
+
+            if self.cas.push(refs, remote):
+                element.info("Pushed artifact {} -> {}".format(display_key, remote.spec.url))
+                pushed = True
+            else:
+                element.info("Remote ({}) already has {} cached".format(
+                    remote.spec.url, element._get_brief_display_key()
+                ))
+
+        return pushed
 
     # pull():
     #

@@ -558,8 +671,130 @@ class ArtifactCache():
     #   (bool): True if pull was successful, False if artifact was not available
     #
     def pull(self, element, key, *, progress=None):
-        raise ImplError("Cache '{kind}' does not implement pull()"
-                        .format(kind=type(self).__name__))
+        ref = self.get_artifact_fullname(element, key)
+
+        project = element._get_project()
+
+        for remote in self._remotes[project]:
+            try:
+                display_key = element._get_brief_display_key()
+                element.status("Pulling artifact {} <- {}".format(display_key, remote.spec.url))
+
+                if self.cas.pull(ref, remote, progress=progress):
+                    element.info("Pulled artifact {} <- {}".format(display_key, remote.spec.url))
+                    # no need to pull from additional remotes
+                    return True
+                else:
+                    element.info("Remote ({}) does not have {} cached".format(
+                        remote.spec.url, element._get_brief_display_key()
+                    ))
+            except CASError as e:
+                raise ArtifactError("Failed to pull artifact {}: {}".format(
+                    element._get_brief_display_key(), e)) from e
+
+        return False
+
+    # pull_tree():
+    #
+    # Pull a single Tree rather than an artifact.
+    # Does not update local refs.
+    #
+    # Args:
+    #     project (Project): The current project
+    #     digest (Digest): The digest of the tree
+    #
+    def pull_tree(self, project, digest):
+        for remote in self._remotes[project]:
+            digest = self.cas.pull_tree(remote, digest)
+
+            if digest:
+                # no need to pull from additional remotes
+                return digest
+
+        return None
+
+    # push_directory():
+    #
+    # Push the given virtual directory to all remotes.
+    #
+    # Args:
+    #     project (Project): The current project
+    #     directory (Directory): A virtual directory object to push.
+    #
+    # Raises:
+    #     (ArtifactError): if there was an error
+    #
+    def push_directory(self, project, directory):
+        if self._has_push_remotes:
+            push_remotes = [r for r in self._remotes[project] if r.spec.push]
+        else:
+            push_remotes = []
+
+        if not push_remotes:
+            raise ArtifactError("push_directory was called, but no remote artifact " +
+                                "servers are configured as push remotes.")
+
+        if directory.ref is None:
+            return
+
+        for remote in push_remotes:
+            self.cas.push_directory(remote, directory)
+
+    # push_message():
+    #
+    # Push the given protobuf message to all remotes.
+    #
+    # Args:
+    #     project (Project): The current project
+    #     message (Message): A protobuf message to push.
+    #
+    # Raises:
+    #     (ArtifactError): if there was an error
+    #
+    def push_message(self, project, message):
+        if self._has_push_remotes:
+            push_remotes = [r for r in self._remotes[project] if r.spec.push]
+        else:
+            push_remotes = []
+
+        if not push_remotes:
+            raise ArtifactError("push_message was called, but no remote artifact " +
+                                "servers are configured as push remotes.")
+
+        for remote in push_remotes:
+            message_digest = self.cas.push_message(remote, message)
+
+        return message_digest
+
+    # verify_digest_pushed():
+    #
+    # Check whether the object is already on the server in which case
+    # there is no need to upload it.
+    #
+    # Args:
+    #     project (Project): The current project
+    #     digest (Digest): The object digest.
+    #
+    def verify_digest_pushed(self, project, digest):
+        if self._has_push_remotes:
+            push_remotes = [r for r in self._remotes[project] if r.spec.push]
+        else:
+            push_remotes = []
+
+        if not push_remotes:
+            raise ArtifactError("verify_digest_pushed was called, but no remote artifact " +
+                                "servers are configured as push remotes.")
+
+        pushed = False
+
+        for remote in push_remotes:
+            if self.cas.verify_digest_on_remote(remote, digest):
+                pushed = True
+
+        return pushed
 
     # link_key():
     #

@@ -571,19 +806,10 @@ class ArtifactCache():
     #     newkey (str): A new cache key for the artifact
     #
     def link_key(self, element, oldkey, newkey):
-        raise ImplError("Cache '{kind}' does not implement link_key()"
-                        .format(kind=type(self).__name__))
+        oldref = self.get_artifact_fullname(element, oldkey)
+        newref = self.get_artifact_fullname(element, newkey)
 
-    # calculate_cache_size()
-    #
-    # Return the real artifact cache size.
-    #
-    # Returns:
-    #    (int): The size of the artifact cache.
-    #
-    def calculate_cache_size(self):
-        raise ImplError("Cache '{kind}' does not implement calculate_cache_size()"
-                        .format(kind=type(self).__name__))
+        self.cas.link_ref(oldref, newref)
 
     ################################################
     #             Local Private Methods            #
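Note on the subprocess handshake in initialize_remotes() above: as the inline
comment says, gRPC does not cope well with fork() once its threads exist in a
process, so each remote is probed in a short-lived child process that reports
back over a Queue. A minimal standalone sketch of the same pattern, with
illustrative names that are not BuildStream API:

    import multiprocessing

    def _probe_remote(url, queue):
        # Runs in a child process, so any gRPC threads created while
        # contacting the remote never exist in the parent.
        try:
            # ... open a channel to `url` and check its capabilities ...
            queue.put(None)        # success: report "no error"
        except Exception as e:
            queue.put(str(e))      # failure: report the error as a string

    def check_remote(url):
        q = multiprocessing.Queue()
        p = multiprocessing.Process(target=_probe_remote, args=(url, q))
        p.start()
        error = q.get()            # blocks until the child reports
        p.join()
        return error               # None if the remote was reachable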
This diff is collapsed.
@@ -32,8 +32,9 @@
 from .._protos.build.bazel.remote.execution.v2 import remote_execution_pb2, remote_execution_pb2_grpc
 from .._protos.google.bytestream import bytestream_pb2, bytestream_pb2_grpc
 from .._protos.buildstream.v2 import buildstream_pb2, buildstream_pb2_grpc
 
-from .._exceptions import ArtifactError
-from .._context import Context
+from .._exceptions import CASError
+
+from .cascache import CASCache
 
 
 # The default limit for gRPC messages is 4 MiB.

@@ -55,26 +56,23 @@
 #     enable_push (bool): Whether to allow blob uploads and artifact updates
 #
 def create_server(repo, *, enable_push):
-    context = Context()
-    context.artifactdir = os.path.abspath(repo)
-
-    artifactcache = context.artifactcache
+    cas = CASCache(os.path.abspath(repo))
 
     # Use max_workers default from Python 3.5+
     max_workers = (os.cpu_count() or 1) * 5
     server = grpc.server(futures.ThreadPoolExecutor(max_workers))
 
     bytestream_pb2_grpc.add_ByteStreamServicer_to_server(
-        _ByteStreamServicer(artifactcache, enable_push=enable_push), server)
+        _ByteStreamServicer(cas, enable_push=enable_push), server)
 
     remote_execution_pb2_grpc.add_ContentAddressableStorageServicer_to_server(
-        _ContentAddressableStorageServicer(artifactcache, enable_push=enable_push), server)
+        _ContentAddressableStorageServicer(cas, enable_push=enable_push), server)
 
     remote_execution_pb2_grpc.add_CapabilitiesServicer_to_server(
         _CapabilitiesServicer(), server)
 
     buildstream_pb2_grpc.add_ReferenceStorageServicer_to_server(
-        _ReferenceStorageServicer(artifactcache, enable_push=enable_push), server)
+        _ReferenceStorageServicer(cas, enable_push=enable_push), server)
 
     return server

@@ -333,7 +331,7 @@ class _ReferenceStorageServicer(buildstream_pb2_grpc.ReferenceStorageServicer):
             response.digest.hash = tree.hash
             response.digest.size_bytes = tree.size_bytes
 
-        except ArtifactError:
+        except CASError:
             context.set_code(grpc.StatusCode.NOT_FOUND)
 
         return response

@@ -437,7 +435,7 @@ def _clean_up_cache(cas, object_size):
         return 0
 
     # obtain a list of LRP artifacts
-    LRP_artifacts = cas.list_artifacts()
+    LRP_artifacts = cas.list_refs()
 
     removed_size = 0  # in bytes
     while object_size - removed_size > free_disk_space:
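With this change the artifact server is built directly on a CASCache instead of
constructing a full Context. A rough sketch of driving the refactored
create_server(), assuming a made-up repository path and port (the real entry
point is the CLI wrapper elsewhere in this file):

    server = create_server('/srv/artifact-repo', enable_push=True)
    server.add_insecure_port('localhost:50051')
    server.start()
    # ... then keep the process alive while the server runs.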
@@ -31,7 +31,6 @@
 from ._exceptions import LoadError, LoadErrorReason, BstError
 from ._message import Message, MessageType
 from ._profile import Topics, profile_start, profile_end
 from ._artifactcache import ArtifactCache
-from ._artifactcache.cascache import CASCache
 from ._workspaces import Workspaces
 from .plugin import _plugin_lookup

@@ -233,7 +232,7 @@ class Context():
     @property
     def artifactcache(self):
         if not self._artifactcache:
-            self._artifactcache = CASCache(self)
+            self._artifactcache = ArtifactCache(self)
 
         return self._artifactcache
@@ -47,7 +47,6 @@ class ElementFactory(PluginContext):
     # Args:
     #    context (object): The Context object for processing
     #    project (object): The project object
-    #    artifacts (ArtifactCache): The artifact cache
     #    meta (object): The loaded MetaElement
     #
     # Returns: A newly created Element object of the appropriate kind

@@ -56,9 +55,9 @@ class ElementFactory(PluginContext):
     #    PluginError (if the kind lookup failed)
     #    LoadError (if the element itself took issue with the config)
     #
-    def create(self, context, project, artifacts, meta):
+    def create(self, context, project, meta):
         element_type, default_config = self.lookup(meta.kind)
-        element = element_type(context, project, artifacts, meta, default_config)
+        element = element_type(context, project, meta, default_config)
         version = self._format_versions.get(meta.kind, 0)
         self._assert_plugin_format(element, version)
         return element
@@ -90,6 +90,7 @@ class ErrorDomain(Enum):
     APP = 12
     STREAM = 13
     VIRTUAL_FS = 14
+    CAS = 15
 
 
 # BstError is an internal base exception class for BuildSream

@@ -111,10 +112,8 @@ class BstError(Exception):
         #
         self.detail = detail
 
-        # The build sandbox in which the error occurred, if the
-        # error occurred at element assembly time.
-        #
-        self.sandbox = None
+        # A sandbox can be created to debug this error
+        self.sandbox = False
 
         # When this exception occurred during the handling of a job, indicate
         # whether or not there is any point retrying the job.

@@ -276,6 +275,15 @@ class ArtifactError(BstError):
         super().__init__(message, detail=detail, domain=ErrorDomain.ARTIFACT, reason=reason, temporary=True)
 
 
+# CASError
+#
+# Raised when errors are encountered in the CAS
+#
+class CASError(BstError):
+    def __init__(self, message, *, detail=None, reason=None, temporary=False):
+        super().__init__(message, detail=detail, domain=ErrorDomain.CAS, reason=reason, temporary=True)
+
+
 # PipelineError
 #
 # Raised from pipeline operations
@@ -305,7 +305,6 @@ class App():
         directory = self._main_options['directory']
         directory = os.path.abspath(directory)
         project_path = os.path.join(directory, 'project.conf')
-        elements_path = os.path.join(directory, element_path)
 
         try:
             # Abort if the project.conf already exists, unless `--force` was specified in `bst init`

@@ -335,6 +334,7 @@ class App():
             raise AppError("Error creating project directory {}: {}".format(directory, e)) from e
 
         # Create the elements sub-directory if it doesnt exist
+        elements_path = os.path.join(directory, element_path)
         try:
             os.makedirs(elements_path, exist_ok=True)
         except IOError as e:

@@ -597,7 +597,7 @@ class App():
                 click.echo("\nDropping into an interactive shell in the failed build sandbox\n", err=True)
                 try:
                     prompt = self.shell_prompt(element)
-                    self.stream.shell(element, Scope.BUILD, prompt, directory=failure.sandbox, isolate=True)
+                    self.stream.shell(element, Scope.BUILD, prompt, isolate=True)
                 except BstError as e:
                     click.echo("Error while attempting to create interactive shell: {}".format(e), err=True)
             elif choice == 'log':
@@ -572,11 +572,13 @@ def show(app, elements, deps, except_, order, format_):
               help="Mount a file or directory into the sandbox")
 @click.option('--isolate', is_flag=True, default=False,
               help='Create an isolated build sandbox')
+@click.option('--element', '-e', 'elements', multiple=True,
+              type=click.Path(readable=False), required=False)
 @click.argument('element',
-                type=click.Path(readable=False))
+                type=click.Path(readable=False), required=False)
 @click.argument('command', type=click.STRING, nargs=-1)
 @click.pass_obj
-def shell(app, element, sysroot, mount, isolate, build_, command):
+def shell(app, element, elements, sysroot, mount, isolate, build_, command):
     """Run a command in the target element's sandbox environment
 
     This will stage a temporary sysroot for running the target

@@ -597,13 +599,21 @@ def shell(app, element, elements, sysroot, mount, isolate, build_, command):
     from .._project import HostMount
     from .._pipeline import PipelineSelection
 
+    if elements and element is not None:
+        command = (element,) + command
+        element = None
+    if not elements and element is not None:
+        elements = (element,)
+    if not elements:
+        raise AppError('No elements specified to open a shell in')
+
     if build_:
         scope = Scope.BUILD
     else:
         scope = Scope.RUN
 
     with app.initialized():
-        dependencies = app.stream.load_selection((element,), selection=PipelineSelection.NONE)
+        dependencies = app.stream.load_selection(elements, selection=PipelineSelection.NONE)
         element = dependencies[0]
         prompt = app.shell_prompt(element)
         mounts = [

@@ -611,7 +621,7 @@ def shell(app, element, elements, sysroot, mount, isolate, build_, command):
             for host_path, path in mount
         ]
 
         try:
-            exitcode = app.stream.shell(element, scope, prompt,
+            exitcode = app.stream.shell(dependencies, scope, prompt,
                                         directory=sysroot,
                                         mounts=mounts,
                                         isolate=isolate,
@@ -668,17 +668,6 @@ class LogLine(Widget):
 
                 extra_nl = True
 
-            if message.sandbox is not None:
-                sandbox = self._indent + 'Sandbox directory: ' + message.sandbox
-
-                text += '\n'
-                if message.message_type == MessageType.FAIL:
-                    text += self._err_profile.fmt(sandbox, bold=True)
-                else:
-                    text += self._detail_profile.fmt(sandbox)
-
-                text += '\n'
-                extra_nl = True
-
             if message.scheduler and message.message_type == MessageType.FAIL:
                 text += '\n'
@@ -537,7 +537,7 @@ class Loader():
             raise LoadError(LoadErrorReason.INVALID_DATA,
                             "{}: Expected junction but element kind is {}".format(filename, meta_element.kind))
 
-        element = Element._new_from_meta(meta_element, self._context.artifactcache)
+        element = Element._new_from_meta(meta_element)
         element._preflight()
 
         sources = list(element.sources())
@@ -70,7 +70,7 @@ class Message():
         self.elapsed = elapsed                 # The elapsed time, in timed messages
         self.depth = depth                     # The depth of a timed message
         self.logfile = logfile                 # The log file path where commands took place
-        self.sandbox = sandbox                 # The sandbox directory where an error occurred (if any)
+        self.sandbox = sandbox                 # The error that caused this message used a sandbox
         self.pid = os.getpid()                 # The process pid
         self.unique_id = unique_id             # The plugin object ID issueing the message
         self.task_id = task_id                 # The plugin object ID of the task
@@ -106,7 +106,7 @@ class Pipeline():
 
         profile_start(Topics.LOAD_PIPELINE, "_".join(t.replace(os.sep, '-') for t in targets))
 
-        elements = self._project.load_elements(targets, self._artifacts,
+        elements = self._project.load_elements(targets,
                                                rewritable=rewritable,
                                                fetch_subprojects=fetch_subprojects)
@@ -224,18 +224,17 @@ class Project():
     # Instantiate and return an element
     #
     # Args:
-    #    artifacts (ArtifactCache): The artifact cache
     #    meta (MetaElement): The loaded MetaElement
     #    first_pass (bool): Whether to use first pass configuration (for junctions)
     #
     # Returns:
     #    (Element): A newly created Element object of the appropriate kind
     #
-    def create_element(self, artifacts, meta, *, first_pass=False):
+    def create_element(self, meta, *, first_pass=False):
         if first_pass:
-            return self.first_pass_config.element_factory.create(self._context, self, artifacts, meta)
+            return self.first_pass_config.element_factory.create(self._context, self, meta)
         else:
-            return self.config.element_factory.create(self._context, self, artifacts, meta)
+            return self.config.element_factory.create(self._context, self, meta)
 
     # create_source()
     #

@@ -305,7 +304,6 @@ class Project():
     #
     # Args:
     #    targets (list): Target names
-    #    artifacts (ArtifactCache): Artifact cache
     #    rewritable (bool): Whether the loaded files should be rewritable
     #                       this is a bit more expensive due to deep copies
     #    fetch_subprojects (bool): Whether we should fetch subprojects as a part of the

@@ -314,7 +312,7 @@ class Project():
     # Returns:
     #    (list): A list of loaded Element
     #
-    def load_elements(self, targets, artifacts, *,
+    def load_elements(self, targets, *,
                       rewritable=False, fetch_subprojects=False):
         with self._context.timed_activity("Loading elements", silent_nested=True):
             meta_elements = self.loader.load(targets, rewritable=rewritable,

@@ -323,7 +321,7 @@ class Project():
         with self._context.timed_activity("Resolving elements"):
             elements = [
-                Element._new_from_meta(meta, artifacts)
+                Element._new_from_meta(meta)
                 for meta in meta_elements
             ]
@@ -25,15 +25,17 @@
 import stat
 import shlex
 import shutil
 import tarfile
-from contextlib import contextmanager
+from contextlib import contextmanager, ExitStack
 from tempfile import TemporaryDirectory
 
 from ._exceptions import StreamError, ImplError, BstError, set_last_task_error
 from ._message import Message, MessageType
-from ._scheduler import Scheduler, SchedStatus, TrackQueue, FetchQueue, BuildQueue, PullQueue, PushQueue
 from ._pipeline import Pipeline, PipelineSelection
+from ._platform import Platform
+from .sandbox._config import SandboxConfig
+from ._scheduler import Scheduler, SchedStatus, TrackQueue, FetchQueue, BuildQueue, PullQueue, PushQueue
 from . import utils, _yaml, _site
-from . import Scope, Consistency
+from . import SandboxFlags, Scope, Consistency
 
 
 # Stream()

@@ -117,7 +119,7 @@
     # Run a shell
     #
     # Args:
-    #    element (Element): An Element object to run the shell for
+    #    elements (List of Element): Elements to run the shell for
     #    scope (Scope): The scope for the shell (Scope.BUILD or Scope.RUN)
     #    prompt (str): The prompt to display in the shell
     #    directory (str): A directory where an existing prestaged sysroot is expected, or None

@@ -128,7 +130,7 @@
     # Returns:
     #    (int): The exit code of the launched shell
     #
-    def shell(self, element, scope, prompt, *,
+    def shell(self, elements, scope, prompt, *,
               directory=None,
               mounts=None,
               isolate=False,

@@ -140,14 +142,114 @@
         if directory is None:
             missing_deps = [
                 dep._get_full_name()
-                for dep in self._pipeline.dependencies([element], scope)
+                for dep in self._pipeline.dependencies(elements, scope)
                 if not dep._cached()
             ]
             if missing_deps:
                 raise StreamError("Elements need to be built or downloaded before staging a shell environment",
                                   detail="\n".join(missing_deps))
 
-        return element._shell(scope, directory, mounts=mounts, isolate=isolate, prompt=prompt, command=command)
+        # Assert we're not mixing virtual directory compatible
+        # and non-virtual directory compatible elements
+        if any(e.BST_VIRTUAL_DIRECTORY for e in elements) and not all(e.BST_VIRTUAL_DIRECTORY for e in elements):
+            raise StreamError(
+                "Elements do not support multiple-element staging",
+                detail=("Multi-element staging is not supported" +
+                        " because elements {} support BST_VIRTUAL_DIRECTORY and {} do not.").format(
+                            ', '.join(e.name for e in elements if e.BST_VIRTUAL_DIRECTORY),
+                            ', '.join(e.name for e in elements if not e.BST_VIRTUAL_DIRECTORY)))
+
+        with ExitStack() as stack:
+            # Creation logic duplicated from Element.__sandbox
+            # since most of it is creating the tmpdir
+            # and deciding whether to make a remote sandbox,
+            # which we don't want to.
+            if directory is None:
+                os.makedirs(self._context.builddir, exist_ok=True)
+                rootdir = stack.enter_context(TemporaryDirectory(dir=self._context.builddir))
+            else:
+                rootdir = directory
+
+            # SandboxConfig comes from project, element defaults and MetaElement sandbox config
+            # In the absence of it being exposed to other APIs and a merging strategy
+            # just make it from the project sandbox config.
+            sandbox_config = SandboxConfig(_yaml.node_get(self._project._sandbox, int, 'build-uid'),
+                                           _yaml.node_get(self._project._sandbox, int, 'build-gid'))
+
+            platform = Platform.get_platform()
+            sandbox = platform.create_sandbox(context=self._context,
+                                              project=self._project,
+                                              directory=rootdir,
+                                              stdout=None, stderr=None, config=sandbox_config,
+                                              bare_directory=directory is not None,
+                                              allow_real_directory=not any(e.BST_VIRTUAL_DIRECTORY
+                                                                           for e in elements))
+
+            # Configure the sandbox with the last element taking precedence for config.
+            for e in elements:
+                e.configure_sandbox(sandbox)
+
+            # Stage contents if not passed --sysroot
+            if not directory:
+                if not all(e.BST_GRANULAR_STAGE for e in elements):
+                    if len(elements) > 1:
+                        raise StreamError(
+                            "Elements do not support multiple-element staging",
+                            detail=("Elements {} do not support multi-element staging " +
                                    " because element kinds {} do not support BST_GRANULAR_STAGE").format(
+                                        ', '.join(e.name for e in elements if not e.BST_GRANULAR_STAGE),
+                                        ', '.join(set(e.get_kind() for e in elements))))
+                    elements[0].stage(sandbox)
+                else:
+                    visited = {}
+                    for e in elements:
+                        e.stage_dependency_artifacts(sandbox, scope, visited=visited)
+
+                    visited = {}
+                    for e in elements:
+                        e.integrate_dependencies(sandbox, scope, visited=visited)
+
+                    for e in elements:
+                        e.post_integration_staging(sandbox)
+
+            environment = {}
+            for e in elements:
+                environment.update(e.get_environment())
+            flags = SandboxFlags.INTERACTIVE | SandboxFlags.ROOT_READ_ONLY
+            shell_command, shell_environment, shell_host_files = self._project.get_shell_config()
+            environment['PS1'] = prompt
+            # Special configurations for non-isolated sandboxes
+            if not isolate:
+
+                # Open the network, and reuse calling uid/gid
+                #
+                flags |= SandboxFlags.NETWORK_ENABLED | SandboxFlags.INHERIT_UID
+
+                # Apply project defined environment vars to set for a shell
+                for key, value in _yaml.node_items(shell_environment):
+                    environment[key] = value
+
+            # Setup any requested bind mounts
+            if mounts is None:
+                mounts = []
+
+            for mount in shell_host_files + mounts:
+                if not os.path.exists(mount.host_path):
+                    if not mount.optional:
+                        self._message(MessageType.WARN,
+                                      "Not mounting non-existing host file: {}".format(mount.host_path))
+                else:
+                    sandbox.mark_directory(mount.path)
+                    sandbox._set_mount_source(mount.path, mount.host_path)
+
+            if command:
+                argv = list(command)
+            else:
+                argv = shell_command
+
+            self._message(MessageType.STATUS, "Running command", detail=" ".join(argv))
+
+            # Run shells with network enabled and readonly root.
+            return sandbox.run(argv, flags, env=environment)
 
     # build()
     #
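Note the ordering in the multi-element staging path above: every element's
dependency artifacts are staged first, then all integration commands run, and
only then does each element perform its post-integration staging. Sharing a
single `visited` dictionary per pass means dependencies common to several
elements are staged and integrated once, not once per element.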
@@ -23,7 +23,7 @@
 # This version is bumped whenever enhancements are made
 # to the `project.conf` format or the core element format.
 #
-BST_FORMAT_VERSION = 17
+BST_FORMAT_VERSION = 18
 
 
 # The base BuildStream artifact version
@@ -1049,6 +1049,12 @@ class ChainMap(collections.ChainMap):
         for key in clearable:
             del self[key]
 
+    def get(self, key, default=None):
+        try:
+            return self[key]
+        except KeyError:
+            return default
+
 
 def node_chain_copy(source):
     copy = ChainMap({}, source)
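The get() override above matters because this ChainMap subclass customises
lookup elsewhere in _yaml.py, while the stock collections.ChainMap.get() does a
containment check and then indexes. A standalone sketch of the failure mode the
override avoids, using a hypothetical deletion sentinel that is not BuildStream
API:

    import collections

    _DELETED = object()   # hypothetical sentinel marking a key as removed

    class Overlay(collections.ChainMap):
        # Indexing treats the sentinel as "key not present".
        def __getitem__(self, key):
            value = super().__getitem__(key)
            if value is _DELETED:
                raise KeyError(key)
            return value

        # Without this, the inherited get() would see the key via
        # __contains__, then index it and raise instead of returning
        # the default.
        def get(self, key, default=None):
            try:
                return self[key]
            except KeyError:
                return default

    overlay = Overlay({'a': _DELETED}, {'a': 1, 'b': 2})
    assert overlay.get('a', 'gone') == 'gone'
    assert overlay.get('b') == 2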
@@ -208,7 +208,6 @@ class BuildElement(Element):
         sandbox.set_environment(self.get_environment())
 
     def stage(self, sandbox):
-
         # Stage deps in the sandbox root
         with self.timed_activity("Staging dependencies", silent_nested=True):
             self.stage_dependency_artifacts(sandbox, Scope.BUILD)

@@ -216,9 +215,11 @@ class BuildElement(Element):
         # Run any integration commands provided by the dependencies
         # once they are all staged and ready
         with self.timed_activity("Integrating sandbox"):
-            for dep in self.dependencies(Scope.BUILD):
-                dep.integrate(sandbox)
+            self.integrate_dependencies(sandbox, Scope.BUILD)
+
+        self.post_integration_staging(sandbox)
 
+    def post_integration_staging(self, sandbox):
         # Stage sources in the build root
         self.stage_sources(sandbox, self.get_variable('build-root'))
This diff is collapsed.