Compare revisions

Changes are shown as if the source revision was being merged into the target revision.


Commits on Source (35)
Showing 822 additions and 270 deletions
......@@ -1547,6 +1547,24 @@ Tests that run a sandbox should be decorated with::
and use the integration cli helper.
You should first aim to write tests that exercise your changes from the cli.
This is so that the testing is end-to-end, and the changes are guaranteed to
work for the end-user. The cli is considered stable, and so tests written in
terms of it are unlikely to require updating as the internals of the software
change over time.

It may be impractical to sufficiently examine some changes this way. For
example, the number of cases to test and the running time of each test may be
too high. It may also be difficult to contrive circumstances to cover every
line of the change. If this is the case, you can next consider also writing
unit tests that work more directly on the changes.

It is important to write unit tests in such a way that they do not break due to
changes unrelated to what they are meant to test. For example, if a test
relies heavily on BuildStream internals, a large refactoring will likely
require the test to be rewritten. Pure functions that rely only on the Python
Standard Library are excellent candidates for unit testing.
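For example, a unit test for a pure function might look like this (a minimal
sketch using pytest; the function under test is hypothetical)::

    import pytest

    def _parse_size(text):
        # Hypothetical pure helper: parse "10K"-style sizes into bytes.
        units = {'K': 1024, 'M': 1024 ** 2, 'G': 1024 ** 3}
        if text[-1] in units:
            return int(text[:-1]) * units[text[-1]]
        return int(text)

    @pytest.mark.parametrize("text,expected", [
        ("512", 512),
        ("10K", 10240),
        ("2M", 2097152),
    ])
    def test_parse_size(text, expected):
        assert _parse_size(text) == expected
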
Measuring performance
---------------------
......
......@@ -38,13 +38,31 @@ buildstream 1.3.1
a bug fix to workspaces so they can be built in workspaces too.
o Creating a build shell through the interactive mode or `bst shell --build`
will now use the cached build tree. It is now easier to debug local build
failures.
will now use the cached build tree if available locally. It is now easier to
debug local build failures.
o `bst shell --sysroot` now takes any directory that contains a sysroot,
instead of just a specially-formatted build-root with a `root` and `scratch`
subdirectory.
o The buildstream.conf file learned new 'prompt.auto-init',
'prompt.really-workspace-close-remove-dir', and
'prompt.really-workspace-reset-hard' options. These allow users to suppress
certain confirmation prompts, e.g. double-checking that the user meant to
run the command as typed.
o Due to an element's `build tree` now being cached in its artifact, artifact
size has significantly increased in some cases. In *most* cases the build
trees are not utilised when building targets, so by default `bst pull` and
`bst build` will not fetch build trees from remotes. This behaviour can be
overridden with the cli main option '--pull-buildtrees' or the user
configuration cache group option 'pull-buildtrees = True' (see the sketch
after this hunk). The override will also add the build tree to already
cached artifacts. When populating an artifactcache server with cached
artifacts, only 'complete' elements can be pushed; if an element is expected
to have a populated build tree, it must be cached before pushing.
o Added new `bst source-checkout` command to checkout sources of an element.
=================
buildstream 1.1.5
......
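A sketch of the pull-buildtrees override described in the NEWS entry above (the key and option names are taken from this diff; the element name is hypothetical):

    # In the user configuration (buildstream.conf):
    cache:
      pull-buildtrees: True

    # Or for a single invocation, via the new main option:
    bst --pull-buildtrees build hello.bst
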
......@@ -476,6 +476,22 @@ class ArtifactCache():
return self.cas.contains(ref)
# contains_subdir_artifact():
#
# Check whether an artifact element contains a digest for a subdir
# which is populated in the cache, i.e. non-dangling.
#
# Args:
# element (Element): The Element to check
# key (str): The cache key to use
# subdir (str): The subdir to check
#
# Returns: True if the subdir exists & is populated in the cache, False otherwise
#
def contains_subdir_artifact(self, element, key, subdir):
ref = self.get_artifact_fullname(element, key)
return self.cas.contains_subdir_artifact(ref, subdir)
# list_artifacts():
#
# List artifacts in this cache in LRU order.
......@@ -533,6 +549,7 @@ class ArtifactCache():
# Args:
# element (Element): The Element to extract
# key (str): The cache key to use
# subdir (str): Optional specific subdir to extract
#
# Raises:
# ArtifactError: In cases there was an OSError, or if the artifact
......@@ -540,12 +557,12 @@ class ArtifactCache():
#
# Returns: path to extracted artifact
#
def extract(self, element, key):
def extract(self, element, key, subdir=None):
ref = self.get_artifact_fullname(element, key)
path = os.path.join(self.extractdir, element._get_project().name, element.normal_name)
return self.cas.extract(ref, path)
return self.cas.extract(ref, path, subdir=subdir)
# commit():
#
......@@ -666,11 +683,13 @@ class ArtifactCache():
# element (Element): The Element whose artifact is to be fetched
# key (str): The cache key to use
# progress (callable): The progress callback, if any
# subdir (str): The optional specific subdir to pull
# excluded_subdirs (list): The optional list of subdirs to not pull
#
# Returns:
# (bool): True if pull was successful, False if artifact was not available
#
def pull(self, element, key, *, progress=None):
def pull(self, element, key, *, progress=None, subdir=None, excluded_subdirs=None):
ref = self.get_artifact_fullname(element, key)
project = element._get_project()
......@@ -680,8 +699,13 @@ class ArtifactCache():
display_key = element._get_brief_display_key()
element.status("Pulling artifact {} <- {}".format(display_key, remote.spec.url))
if self.cas.pull(ref, remote, progress=progress):
if self.cas.pull(ref, remote, progress=progress, subdir=subdir, excluded_subdirs=excluded_subdirs):
element.info("Pulled artifact {} <- {}".format(display_key, remote.spec.url))
if subdir:
# Attempt to extract the subdir into the artifact extract dir if the dir
# already exists without containing the subdir. If the artifact extract
# dir does not exist, a complete extraction will be performed.
self.extract(element, key, subdir)
# no need to pull from additional remotes
return True
else:
......@@ -811,6 +835,20 @@ class ArtifactCache():
self.cas.link_ref(oldref, newref)
# checkout_artifact_subdir()
#
# Checkout given artifact subdir into provided directory
#
# Args:
# element (Element): The Element
# key (str): The cache key to use
# subdir (str): The subdir to checkout
# tmpdir (str): The dir to place the subdir content
#
def checkout_artifact_subdir(self, element, key, subdir, tmpdir):
ref = self.get_artifact_fullname(element, key)
return self.cas.checkout_artifact_subdir(ref, subdir, tmpdir)
################################################
# Local Private Methods #
################################################
......
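A short sketch of how the new subdir arguments compose for a partial pull (method names are from the hunks above; the artifacts, element and key variables are assumed to be in scope):

    # Default pull: fetch the artifact but exclude its buildtree.
    artifacts.pull(element, key, excluded_subdirs=['buildtree'])

    # Later, complete the partial artifact by pulling only the buildtree;
    # pull() will also extract the subdir into an existing extraction.
    if not artifacts.contains_subdir_artifact(element, key, 'buildtree'):
        artifacts.pull(element, key, subdir='buildtree')
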
......@@ -24,7 +24,6 @@ import os
import stat
import tempfile
import uuid
import errno
from urllib.parse import urlparse
import grpc
......@@ -82,6 +81,27 @@ class CASCache():
# This assumes that the repository doesn't have any dangling pointers
return os.path.exists(refpath)
# contains_subdir_artifact():
#
# Check whether the specified artifact element tree has a digest for a subdir
# which is populated in the cache, i.e. non-dangling.
#
# Args:
# ref (str): The ref to check
# subdir (str): The subdir to check
#
# Returns: True if the subdir exists & is populated in the cache, False otherwise
#
def contains_subdir_artifact(self, ref, subdir):
tree = self.resolve_ref(ref)
# This assumes that the subdir digest is present in the element tree
subdirdigest = self._get_subdir(tree, subdir)
objpath = self.objpath(subdirdigest)
# True if subdir content is cached or if empty as expected
return os.path.exists(objpath)
# extract():
#
# Extract cached directory for the specified ref if it hasn't
......@@ -90,37 +110,44 @@ class CASCache():
# Args:
# ref (str): The ref whose directory to extract
# path (str): The destination path
# subdir (str): Optional specific dir to extract
#
# Raises:
# CASError: In cases there was an OSError, or if the ref did not exist.
#
# Returns: path to extracted directory
#
def extract(self, ref, path):
def extract(self, ref, path, subdir=None):
tree = self.resolve_ref(ref, update_mtime=True)
dest = os.path.join(path, tree.hash)
originaldest = dest = os.path.join(path, tree.hash)
# If the artifact is already extracted, check whether the optional subdir
# has also been extracted. If the artifact has not been extracted, a full
# extraction will include the optional subdir.
if os.path.isdir(dest):
# directory has already been extracted
if subdir:
if not os.path.isdir(os.path.join(dest, subdir)):
dest = os.path.join(dest, subdir)
tree = self._get_subdir(tree, subdir)
else:
return dest
else:
return dest
with tempfile.TemporaryDirectory(prefix='tmp', dir=self.tmpdir) as tmpdir:
checkoutdir = os.path.join(tmpdir, ref)
self._checkout(checkoutdir, tree)
os.makedirs(os.path.dirname(dest), exist_ok=True)
try:
os.rename(checkoutdir, dest)
utils.move_atomic(checkoutdir, dest)
except utils.DirectoryExistsError:
# Another process beat us to rename
pass
except OSError as e:
# With rename it's possible to get either ENOTEMPTY or EEXIST
# in the case that the destination path is a not empty directory.
#
# If rename fails with these errors, another process beat
# us to it so just ignore.
if e.errno not in [errno.ENOTEMPTY, errno.EEXIST]:
raise CASError("Failed to extract directory for ref '{}': {}".format(ref, e)) from e
return dest
return originaldest
# commit():
#
......@@ -193,11 +220,13 @@ class CASCache():
# ref (str): The ref to pull
# remote (CASRemote): The remote repository to pull from
# progress (callable): The progress callback, if any
# subdir (str): The optional specific subdir to pull
# excluded_subdirs (list): The optional list of subdirs to not pull
#
# Returns:
# (bool): True if pull was successful, False if ref was not available
#
def pull(self, ref, remote, *, progress=None):
def pull(self, ref, remote, *, progress=None, subdir=None, excluded_subdirs=None):
try:
remote.init()
......@@ -209,7 +238,12 @@ class CASCache():
tree.hash = response.digest.hash
tree.size_bytes = response.digest.size_bytes
self._fetch_directory(remote, tree)
# Check if the element artifact is present; if so, just fetch the subdir.
if subdir and os.path.exists(self.objpath(tree)):
self._fetch_subdir(remote, tree, subdir)
else:
# Fetch artifact, excluded_subdirs determined in pullqueue
self._fetch_directory(remote, tree, excluded_subdirs=excluded_subdirs)
self.set_ref(ref, tree)
......@@ -370,6 +404,21 @@ class CASCache():
return True
# checkout_artifact_subdir():
#
# Checkout given artifact subdir into provided directory
#
# Args:
# ref (str): The ref to check
# subdir (str): The subdir to checkout
# tmpdir (str): The dir to place the subdir content
#
def checkout_artifact_subdir(self, ref, subdir, tmpdir):
tree = self.resolve_ref(ref)
# This assumes that the subdir digest is present in the element tree
subdirdigest = self._get_subdir(tree, subdir)
self._checkout(tmpdir, subdirdigest)
# objpath():
#
# Return the path of an object based on its digest.
......@@ -607,6 +656,8 @@ class CASCache():
stat.S_IRGRP | stat.S_IXGRP | stat.S_IROTH | stat.S_IXOTH)
for dirnode in directory.directories:
# Don't try to checkout a dangling ref
if os.path.exists(self.objpath(dirnode.digest)):
fullpath = os.path.join(dest, dirnode.name)
self._checkout(fullpath, dirnode.digest)
......@@ -863,11 +914,14 @@ class CASCache():
# Args:
# remote (Remote): The remote to use.
# dir_digest (Digest): Digest object for the directory to fetch.
# excluded_subdirs (list): The optional list of subdirs to not fetch
#
def _fetch_directory(self, remote, dir_digest):
def _fetch_directory(self, remote, dir_digest, *, excluded_subdirs=None):
fetch_queue = [dir_digest]
fetch_next_queue = []
batch = _CASBatchRead(remote)
if not excluded_subdirs:
excluded_subdirs = []
while len(fetch_queue) + len(fetch_next_queue) > 0:
if not fetch_queue:
......@@ -882,6 +936,7 @@ class CASCache():
directory.ParseFromString(f.read())
for dirnode in directory.directories:
if dirnode.name not in excluded_subdirs:
batch = self._fetch_directory_node(remote, dirnode.digest, batch,
fetch_queue, fetch_next_queue, recursive=True)
......@@ -892,6 +947,10 @@ class CASCache():
# Fetch final batch
self._fetch_directory_batch(remote, batch, fetch_queue, fetch_next_queue)
def _fetch_subdir(self, remote, tree, subdir):
subdirdigest = self._get_subdir(tree, subdir)
self._fetch_directory(remote, subdirdigest)
def _fetch_tree(self, remote, digest):
# download but do not store the Tree object
with tempfile.NamedTemporaryFile(dir=self.tmpdir) as out:
......
......@@ -63,25 +63,25 @@ class Context():
self.artifactdir = None
# The locations from which to push and pull prebuilt artifacts
self.artifact_cache_specs = []
self.artifact_cache_specs = None
# The directory to store build logs
self.logdir = None
# The abbreviated cache key length to display in the UI
self.log_key_length = 0
self.log_key_length = None
# Whether debug mode is enabled
self.log_debug = False
self.log_debug = None
# Whether verbose mode is enabled
self.log_verbose = False
self.log_verbose = None
# Maximum number of lines to print from build logs
self.log_error_lines = 0
self.log_error_lines = None
# Maximum number of lines to print in the master log for a detailed message
self.log_message_lines = 0
self.log_message_lines = None
# Format string for printing the pipeline at startup time
self.log_element_format = None
......@@ -90,19 +90,40 @@ class Context():
self.log_message_format = None
# Maximum number of fetch or refresh tasks
self.sched_fetchers = 4
self.sched_fetchers = None
# Maximum number of build tasks
self.sched_builders = 4
self.sched_builders = None
# Maximum number of push tasks
self.sched_pushers = 4
self.sched_pushers = None
# Maximum number of retries for network tasks
self.sched_network_retries = 2
self.sched_network_retries = None
# What to do when a build fails in non interactive mode
self.sched_error_action = 'continue'
self.sched_error_action = None
# Size of the artifact cache in bytes
self.config_cache_quota = None
# Whether or not to attempt to pull build trees globally
self.pull_buildtrees = None
# Boolean, whether to offer to create a project for the user, if we are
# invoked outside of a directory where we can resolve the project.
self.prompt_auto_init = None
# Boolean, whether we double-check with the user that they meant to
# remove a workspace directory.
self.prompt_workspace_close_remove_dir = None
# Boolean, whether we double-check with the user that they meant to do
# a hard reset of a workspace, potentially losing changes.
self.prompt_workspace_reset_hard = None
# Whether to include artifact buildtrees in workspaces if available
self.workspace_build_trees = True
# Whether elements must be rebuilt when their dependencies have changed
self._strict_build_plan = None
......@@ -120,7 +141,6 @@ class Context():
self._workspaces = None
self._log_handle = None
self._log_filename = None
self.config_cache_quota = 'infinity'
# load()
#
......@@ -160,7 +180,7 @@ class Context():
_yaml.node_validate(defaults, [
'sourcedir', 'builddir', 'artifactdir', 'logdir',
'scheduler', 'artifacts', 'logging', 'projects',
'cache'
'cache', 'prompt', 'workspacebuildtrees'
])
for directory in ['sourcedir', 'builddir', 'artifactdir', 'logdir']:
......@@ -178,13 +198,19 @@ class Context():
# our artifactdir - the artifactdir may not have been created
# yet.
cache = _yaml.node_get(defaults, Mapping, 'cache')
_yaml.node_validate(cache, ['quota'])
_yaml.node_validate(cache, ['quota', 'pull-buildtrees'])
self.config_cache_quota = _yaml.node_get(cache, str, 'quota', default_value='infinity')
self.config_cache_quota = _yaml.node_get(cache, str, 'quota')
# Load artifact share configuration
self.artifact_cache_specs = ArtifactCache.specs_from_config_node(defaults)
# Load pull build trees configuration
self.pull_buildtrees = _yaml.node_get(cache, bool, 'pull-buildtrees')
# Load workspace buildtrees configuration
self.workspace_build_trees = _yaml.node_get(defaults, bool, 'workspacebuildtrees', default_value=True)
# Load logging config
logging = _yaml.node_get(defaults, Mapping, 'logging')
_yaml.node_validate(logging, [
......@@ -206,12 +232,34 @@ class Context():
'on-error', 'fetchers', 'builders',
'pushers', 'network-retries'
])
self.sched_error_action = _yaml.node_get(scheduler, str, 'on-error')
self.sched_error_action = _node_get_option_str(
scheduler, 'on-error', ['continue', 'quit', 'terminate'])
self.sched_fetchers = _yaml.node_get(scheduler, int, 'fetchers')
self.sched_builders = _yaml.node_get(scheduler, int, 'builders')
self.sched_pushers = _yaml.node_get(scheduler, int, 'pushers')
self.sched_network_retries = _yaml.node_get(scheduler, int, 'network-retries')
# Load prompt preferences
#
# We convert string options to booleans here, so we can be both user
# and coder-friendly. The string options are worded to match the
# responses the user would give at the cli, for least surprise. The
# booleans are converted here because it's easiest to eyeball that the
# strings are right.
#
prompt = _yaml.node_get(
defaults, Mapping, 'prompt')
_yaml.node_validate(prompt, [
'auto-init', 'really-workspace-close-remove-dir',
'really-workspace-reset-hard',
])
self.prompt_auto_init = _node_get_option_str(
prompt, 'auto-init', ['ask', 'no']) == 'ask'
self.prompt_workspace_close_remove_dir = _node_get_option_str(
prompt, 'really-workspace-close-remove-dir', ['ask', 'yes']) == 'ask'
self.prompt_workspace_reset_hard = _node_get_option_str(
prompt, 'really-workspace-reset-hard', ['ask', 'yes']) == 'ask'
# Load per-projects overrides
self._project_overrides = _yaml.node_get(defaults, Mapping, 'projects', default_value={})
......@@ -222,13 +270,6 @@ class Context():
profile_end(Topics.LOAD_CONTEXT, 'load')
valid_actions = ['continue', 'quit']
if self.sched_error_action not in valid_actions:
provenance = _yaml.node_get_provenance(scheduler, 'on-error')
raise LoadError(LoadErrorReason.INVALID_DATA,
"{}: on-error should be one of: {}".format(
provenance, ", ".join(valid_actions)))
@property
def artifactcache(self):
if not self._artifactcache:
......@@ -581,3 +622,30 @@ class Context():
os.environ['XDG_CONFIG_HOME'] = os.path.expanduser('~/.config')
if not os.environ.get('XDG_DATA_HOME'):
os.environ['XDG_DATA_HOME'] = os.path.expanduser('~/.local/share')
# _node_get_option_str()
#
# Like _yaml.node_get(), but also checks that the value fetched from the
# dictionary node is one of the pre-defined allowed option strings.
#
# Args:
# node (dict): The dictionary node
# key (str): The key to get a value for in node
# allowed_options (iterable): Only accept these values
#
# Returns:
# The value, if found in 'node'.
#
# Raises:
# LoadError, when the value is not of the expected type, or is not found.
#
def _node_get_option_str(node, key, allowed_options):
result = _yaml.node_get(node, str, key)
if result not in allowed_options:
provenance = _yaml.node_get_provenance(node, key)
raise LoadError(LoadErrorReason.INVALID_DATA,
"{}: {} should be one of: {}".format(
provenance, key, ", ".join(allowed_options)))
return result
......@@ -182,7 +182,8 @@ class App():
'fetchers': 'sched_fetchers',
'builders': 'sched_builders',
'pushers': 'sched_pushers',
'network_retries': 'sched_network_retries'
'network_retries': 'sched_network_retries',
'pull_buildtrees': 'pull_buildtrees'
}
for cli_option, context_attr in override_map.items():
option_value = self._main_options.get(cli_option)
......@@ -221,6 +222,7 @@ class App():
# Let's automatically start a `bst init` session in this case
if e.reason == LoadErrorReason.MISSING_PROJECT_CONF and self.interactive:
click.echo("A project was not detected in the directory: {}".format(directory), err=True)
if self.context.prompt_auto_init:
click.echo("", err=True)
if click.confirm("Would you like to create a new project here?"):
self.init_project(None)
......
......@@ -219,6 +219,8 @@ def print_version(ctx, param, value):
help="Specify a project option")
@click.option('--default-mirror', default=None,
help="The mirror to fetch from first, before attempting other mirrors")
@click.option('--pull-buildtrees', is_flag=True, default=None,
help="Include an element's build tree when pulling remote element artifacts")
@click.pass_context
def cli(context, **kwargs):
"""Build and manipulate BuildStream projects
......@@ -662,6 +664,33 @@ def checkout(app, element, location, force, deps, integrate, hardlinks, tar):
tar=tar)
##################################################################
# Source Checkout Command #
##################################################################
@cli.command(name='source-checkout', short_help='Checkout sources for an element')
@click.option('--except', 'except_', multiple=True,
type=click.Path(readable=False),
help="Except certain dependencies")
@click.option('--deps', '-d', default='none',
type=click.Choice(['build', 'none', 'run', 'all']),
help='The dependencies whose sources to checkout (default: none)')
@click.option('--fetch', 'fetch_', default=False, is_flag=True,
help='Fetch elements if they are not fetched')
@click.argument('element',
type=click.Path(readable=False))
@click.argument('location', type=click.Path())
@click.pass_obj
def source_checkout(app, element, location, deps, fetch_, except_):
"""Checkout sources of an element to the specified location
"""
with app.initialized():
app.stream.source_checkout(element,
location=location,
deps=deps,
fetch=fetch_,
except_targets=except_)
##################################################################
# Workspace Command #
##################################################################
......@@ -676,17 +705,21 @@ def workspace():
##################################################################
@workspace.command(name='open', short_help="Open a new workspace")
@click.option('--no-checkout', default=False, is_flag=True,
help="Do not checkout the source, only link to the given directory")
help="Do not checkout the source or cached buildtree, only link to the given directory")
@click.option('--force', '-f', default=False, is_flag=True,
help="Overwrite files existing in checkout directory")
@click.option('--track', 'track_', default=False, is_flag=True,
help="Track and fetch new source references before checking out the workspace")
@click.option('--no-cache', default=False, is_flag=True,
help="Do not checkout the cached buildtree")
@click.argument('element',
type=click.Path(readable=False))
@click.argument('directory', type=click.Path(file_okay=False))
@click.pass_obj
def workspace_open(app, no_checkout, force, track_, element, directory):
"""Open a workspace for manual source modification"""
def workspace_open(app, no_checkout, force, track_, no_cache, element, directory):
"""Open a workspace for manual source modification, the elements buildtree
will be provided if available in the local artifact cache.
"""
if os.path.exists(directory):
......@@ -698,11 +731,15 @@ def workspace_open(app, no_checkout, force, track_, element, directory):
click.echo("Checkout directory is not empty: {}".format(directory), err=True)
sys.exit(-1)
if not no_cache and not no_checkout:
click.echo("WARNING: Workspace will be opened without the cached buildtree if not cached locally")
with app.initialized():
app.stream.workspace_open(element, directory,
no_checkout=no_checkout,
track_first=track_,
force=force)
force=force,
no_cache=no_cache)
##################################################################
......@@ -743,7 +780,7 @@ def workspace_close(app, remove_dir, all_, elements):
if nonexisting:
raise AppError("Workspace does not exist", detail="\n".join(nonexisting))
if app.interactive and remove_dir:
if app.interactive and remove_dir and app.context.prompt_workspace_close_remove_dir:
if not click.confirm('This will remove all your changes, are you sure?'):
click.echo('Aborting', err=True)
sys.exit(-1)
......@@ -777,7 +814,7 @@ def workspace_reset(app, soft, track_, all_, elements):
if all_ and not app.stream.workspace_exists():
raise AppError("No open workspaces to reset")
if app.interactive and not soft:
if app.interactive and not soft and app.context.prompt_workspace_reset_hard:
if not click.confirm('This will remove all your changes, are you sure?'):
click.echo('Aborting', err=True)
sys.exit(-1)
......
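A sketch of the command-line surface added above (element and directory names are hypothetical):

    # Checkout the sources of an element and its build dependencies,
    # fetching them first if necessary:
    bst source-checkout --fetch --deps build hello.bst ./sources

    # Open a workspace; the cached buildtree is staged when available:
    bst workspace open hello.bst ./workspace

    # Opt out of staging the cached buildtree:
    bst workspace open --no-cache hello.bst ./workspace
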
......@@ -370,7 +370,7 @@ class Pipeline():
detail += " Element: {} is inconsistent\n".format(element._get_full_name())
for source in element.sources():
if source._get_consistency() == Consistency.INCONSISTENT:
detail += " Source {} is missing ref\n".format(source)
detail += " {} is missing ref\n".format(source)
detail += '\n'
detail += "Try tracking these elements first with `bst track`\n"
......@@ -383,6 +383,33 @@ class Pipeline():
detail += " " + element._get_full_name() + "\n"
raise PipelineError("Inconsistent pipeline", detail=detail, reason="inconsistent-pipeline-workspaced")
# assert_sources_cached()
#
# Asserts that sources for the given list of elements are cached.
#
# Args:
# elements (list): The list of elements
#
def assert_sources_cached(self, elements):
uncached = []
with self._context.timed_activity("Checking sources"):
for element in elements:
if element._get_consistency() != Consistency.CACHED:
uncached.append(element)
if uncached:
detail = "Sources are not cached for the following elements:\n\n"
for element in uncached:
detail += " Following sources for element: {} are not cached:\n".format(element._get_full_name())
for source in element.sources():
if source._get_consistency() != Consistency.CACHED:
detail += " {}\n".format(source)
detail += '\n'
detail += "Try fetching these elements first with `bst fetch`,\n" + \
"or run this command with `--fetch` option\n"
raise PipelineError("Uncached sources", detail=detail, reason="uncached-sources")
#############################################################
# Private Methods #
#############################################################
......
......@@ -379,27 +379,7 @@ class Stream():
elements, _ = self._load((target,), (), fetch_subprojects=True)
target = elements[0]
if not tar:
try:
os.makedirs(location, exist_ok=True)
except OSError as e:
raise StreamError("Failed to create checkout directory: '{}'"
.format(e)) from e
if not tar:
if not os.access(location, os.W_OK):
raise StreamError("Checkout directory '{}' not writable"
.format(location))
if not force and os.listdir(location):
raise StreamError("Checkout directory '{}' not empty"
.format(location))
elif os.path.exists(location) and location != '-':
if not os.access(location, os.W_OK):
raise StreamError("Output file '{}' not writable"
.format(location))
if not force and os.path.exists(location):
raise StreamError("Output file '{}' already exists"
.format(location))
self._check_location_writable(location, force=force, tar=tar)
# Stage deps into a temporary sandbox first
try:
......@@ -443,6 +423,42 @@ class Stream():
raise StreamError("Error while staging dependencies into a sandbox"
": '{}'".format(e), detail=e.detail, reason=e.reason) from e
# source_checkout()
#
# Checkout sources of the target element to the specified location
#
# Args:
# target (str): The target element whose sources to checkout
# location (str): Location to checkout the sources to
# deps (str): The dependencies to checkout
# fetch (bool): Whether to fetch missing sources
# except_targets (list): List of targets to except from staging
#
def source_checkout(self, target, *,
location=None,
deps='none',
fetch=False,
except_targets=()):
self._check_location_writable(location)
elements, _ = self._load((target,), (),
selection=deps,
except_targets=except_targets,
fetch_subprojects=True)
# Assert all sources are cached
if fetch:
self._fetch(elements)
self._pipeline.assert_sources_cached(elements)
# Stage all sources determined by scope
try:
self._write_element_sources(location, elements)
except BstError as e:
raise StreamError("Error while writing sources"
": '{}'".format(e), detail=e.detail, reason=e.reason) from e
# workspace_open
#
# Open a project workspace
......@@ -453,11 +469,17 @@ class Stream():
# no_checkout (bool): Whether to skip checking out the source
# track_first (bool): Whether to track and fetch first
# force (bool): Whether to ignore contents in an existing directory
# no_cache (bool): Whether to not include the cached buildtree
#
def workspace_open(self, target, directory, *,
no_checkout,
track_first,
force):
force,
no_cache):
# Override no_cache if the global user conf workspacebuildtrees is false
if not self._context.workspace_build_trees:
no_cache = True
if track_first:
track_targets = (target,)
......@@ -470,6 +492,20 @@ class Stream():
target = elements[0]
directory = os.path.abspath(directory)
# Check if given target has a buildtree artifact cached locally
buildtree = None
if target._cached():
buildtree = self._artifacts.contains_subdir_artifact(target, target._get_cache_key(), 'buildtree')
# If we're running in the default state, make the user aware of buildtree usage
if not no_cache and not no_checkout:
if buildtree:
self._message(MessageType.INFO, "{} buildtree artifact is available,"
" workspace will be opened with it".format(target.name))
else:
self._message(MessageType.WARN, "{} buildtree artifact not available,"
" workspace will be opened with source checkout".format(target.name))
if not list(target.sources()):
build_depends = [x.name for x in target.dependencies(Scope.BUILD, recurse=False)]
if not build_depends:
......@@ -501,6 +537,7 @@ class Stream():
"fetch the latest version of the " +
"source.")
# Presume workspace to be forced if previous StreamError not raised
if workspace:
workspaces.delete_workspace(target._get_full_name())
workspaces.save_config()
......@@ -510,9 +547,14 @@ class Stream():
except OSError as e:
raise StreamError("Failed to create workspace directory: {}".format(e)) from e
# Handle opening workspace with buildtree included
if (buildtree and not no_cache) and not no_checkout:
workspaces.create_workspace(target._get_full_name(), directory, cached_build=buildtree)
with target.timed_activity("Staging buildtree to {}".format(directory)):
target._open_workspace(buildtree=buildtree)
else:
workspaces.create_workspace(target._get_full_name(), directory)
if not no_checkout:
if (not buildtree or no_cache) and not no_checkout:
with target.timed_activity("Staging sources to {}".format(directory)):
target._open_workspace()
......@@ -598,10 +640,24 @@ class Stream():
.format(workspace_path, e)) from e
workspaces.delete_workspace(element._get_full_name())
workspaces.create_workspace(element._get_full_name(), workspace_path)
with element.timed_activity("Staging sources to {}".format(workspace_path)):
element._open_workspace()
# Create the workspace, ensuring the original optional cached build state is preserved if
# possible.
buildtree = False
if workspace.cached_build and element._cached():
if self._artifacts.contains_subdir_artifact(element, element._get_cache_key(), 'buildtree'):
buildtree = True
# Warn the user if the workspace cannot be opened with the original cached build state
if workspace.cached_build and not buildtree:
self._message(MessageType.WARN, "{} original buildtree artifact not available,"
" workspace will be opened with source checkout".format(element.name))
workspaces.create_workspace(element._get_full_name(), workspace_path,
cached_build=buildtree)
with element.timed_activity("Staging to {}".format(workspace_path)):
element._open_workspace(buildtree=buildtree)
self._message(MessageType.INFO,
"Reset workspace for {} at: {}".format(element.name,
......@@ -726,7 +782,7 @@ class Stream():
if self._write_element_script(source_directory, element)
]
self._write_element_sources(tempdir, elements)
self._write_element_sources(os.path.join(tempdir, "source"), elements)
self._write_build_script(tempdir, elements)
self._collect_sources(tempdir, tar_location,
target.normal_name, compression)
......@@ -1068,6 +1124,39 @@ class Stream():
self._enqueue_plan(fetch_plan)
self._run()
# _check_location_writable()
#
# Check if given location is writable.
#
# Args:
# location (str): Destination path
# force (bool): Allow files to be overwritten
# tar (bool): Whether destination is a tarball
#
# Raises:
# (StreamError): If the destination is not writable
#
def _check_location_writable(self, location, force=False, tar=False):
if not tar:
try:
os.makedirs(location, exist_ok=True)
except OSError as e:
raise StreamError("Failed to create destination directory: '{}'"
.format(e)) from e
if not os.access(location, os.W_OK):
raise StreamError("Destination directory '{}' not writable"
.format(location))
if not force and os.listdir(location):
raise StreamError("Destination directory '{}' not empty"
.format(location))
elif os.path.exists(location) and location != '-':
if not os.access(location, os.W_OK):
raise StreamError("Output file '{}' not writable"
.format(location))
if not force and os.path.exists(location):
raise StreamError("Output file '{}' already exists"
.format(location))
# Helper function for checkout()
#
def _checkout_hardlinks(self, sandbox_vroot, directory):
......@@ -1089,10 +1178,9 @@ class Stream():
# Write all source elements to the given directory
def _write_element_sources(self, directory, elements):
for element in elements:
source_dir = os.path.join(directory, "source")
element_source_dir = os.path.join(source_dir, element.normal_name)
element_source_dir = self._get_element_dirname(directory, element)
if list(element.sources()):
os.makedirs(element_source_dir)
element._stage_sources_at(element_source_dir)
# Write a master build script to the sandbox
......@@ -1122,3 +1210,25 @@ class Stream():
with tarfile.open(tar_name, permissions) as tar:
tar.add(directory, arcname=element_name)
# _get_element_dirname()
#
# Get path to directory for an element based on its normal name.
#
# For cross-junction elements, the path will be prefixed with the name
# of the junction element.
#
# Args:
# directory (str): path to base directory
# element (Element): the element
#
# Returns:
# (str): Path to directory for this element
#
def _get_element_dirname(self, directory, element):
parts = [element.normal_name]
while element._get_project() != self._project:
element = element._get_project().junction
parts.append(element.normal_name)
return os.path.join(directory, *reversed(parts))
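To illustrate _get_element_dirname(), a standalone sketch of the same path logic (element and junction names are hypothetical):

    import os

    def element_dirname(directory, names):
        # names: the element's normal name first, then the normal names of
        # enclosing junction elements, innermost first, mirroring how
        # _get_element_dirname() walks up via element._get_project().junction.
        return os.path.join(directory, *reversed(names))

    # An element 'app' behind a junction 'deps' lands under the junction:
    assert element_dirname('/tmp/sources', ['app', 'deps']) == '/tmp/sources/deps/app'
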
......@@ -24,7 +24,7 @@ from . import _yaml
from ._exceptions import LoadError, LoadErrorReason
BST_WORKSPACE_FORMAT_VERSION = 3
BST_WORKSPACE_FORMAT_VERSION = 4
# Workspace()
......@@ -43,9 +43,11 @@ BST_WORKSPACE_FORMAT_VERSION = 3
# running_files (dict): A dict mapping dependency elements to files
# changed between failed builds. Should be
# made obsolete with failed build artifacts.
# cached_build (bool): If the workspace is staging the cached build artifact
#
class Workspace():
def __init__(self, toplevel_project, *, last_successful=None, path=None, prepared=False, running_files=None):
def __init__(self, toplevel_project, *, last_successful=None, path=None, prepared=False,
running_files=None, cached_build=False):
self.prepared = prepared
self.last_successful = last_successful
self._path = path
......@@ -53,6 +55,7 @@ class Workspace():
self._toplevel_project = toplevel_project
self._key = None
self.cached_build = cached_build
# to_dict()
#
......@@ -65,7 +68,8 @@ class Workspace():
ret = {
'prepared': self.prepared,
'path': self._path,
'running_files': self.running_files
'running_files': self.running_files,
'cached_build': self.cached_build
}
if self.last_successful is not None:
ret["last_successful"] = self.last_successful
......@@ -224,12 +228,13 @@ class Workspaces():
# Args:
# element_name (str) - The element name to create a workspace for
# path (str) - The path in which the workspace should be kept
# cached_build (bool) - If the workspace is staging the cached build artifact
#
def create_workspace(self, element_name, path):
def create_workspace(self, element_name, path, cached_build=False):
if path.startswith(self._toplevel_project.directory):
path = os.path.relpath(path, self._toplevel_project.directory)
self._workspaces[element_name] = Workspace(self._toplevel_project, path=path)
self._workspaces[element_name] = Workspace(self._toplevel_project, path=path, cached_build=cached_build)
return self._workspaces[element_name]
......@@ -396,6 +401,7 @@ class Workspaces():
'path': _yaml.node_get(node, str, 'path'),
'last_successful': _yaml.node_get(node, str, 'last_successful', default_value=None),
'running_files': _yaml.node_get(node, dict, 'running_files', default_value=None),
'cached_build': _yaml.node_get(node, bool, 'cached_build', default_value=False)
}
return Workspace.from_dict(self._toplevel_project, dictionary)
......
......@@ -351,6 +351,7 @@ _sentinel = object()
# expected_type (type): The expected type for the value being searched
# key (str): The key to get a value for in node
# indices (list of ints): Optionally descend into lists of lists
# default_value: Optionally return this value if the key is not found
#
# Returns:
# The value if found in node, otherwise default_value is returned
......
......@@ -35,6 +35,9 @@ cache:
# to the size of the file system containing the cache.
quota: infinity
# Whether to pull build trees when downloading element artifacts
pull-buildtrees: False
#
# Scheduler
#
......@@ -97,3 +100,35 @@ logging:
[%{elapsed}][%{key}][%{element}] %{action} %{message}
#
# Prompt overrides
#
# Here you can suppress 'are you sure?' and other kinds of prompts by supplying
# override values. Note that e.g. 'yes' and 'no' have the same meaning here as
# they do in the actual cli prompt.
#
prompt:
# Whether to create a project with 'bst init' if we are invoked outside of a
# directory where we can resolve the project.
#
# ask - Prompt the user to choose.
# no - Never create the project.
#
auto-init: ask
# Whether to really proceed with 'bst workspace close --remove-dir' removing
# a workspace directory, potentially losing changes.
#
# ask - Ask the user if they are sure.
# yes - Always remove, without asking.
#
really-workspace-close-remove-dir: ask
# Whether to really proceed with 'bst workspace reset' doing a hard reset of
# a workspace, potentially losing changes.
#
# ask - Ask the user if they are sure.
# yes - Always hard reset, without asking.
#
really-workspace-reset-hard: ask
......@@ -85,7 +85,8 @@ import shutil
from . import _yaml
from ._variables import Variables
from ._versions import BST_CORE_ARTIFACT_VERSION
from ._exceptions import BstError, LoadError, LoadErrorReason, ImplError, ErrorDomain
from ._exceptions import BstError, LoadError, LoadErrorReason, ImplError, \
ErrorDomain
from .utils import UtilError
from . import Plugin, Consistency, Scope
from . import SandboxFlags
......@@ -1397,12 +1398,12 @@ class Element(Plugin):
with self.timed_activity("Staging local files at {}"
.format(workspace.get_absolute_path())):
workspace.stage(temp_staging_directory)
elif self._cached():
# We have a cached buildtree to use, instead
# Check if we have a cached buildtree to use
elif self.__cached_buildtree():
artifact_base, _ = self.__extract()
import_dir = os.path.join(artifact_base, 'buildtree')
else:
# No workspace, stage directly
# No workspace or cached buildtree, stage source directly
for source in self.sources():
source._stage(temp_staging_directory)
......@@ -1553,7 +1554,6 @@ class Element(Plugin):
self.__dynamic_public = _yaml.node_copy(self.__public)
# Call the abstract plugin methods
collect = None
try:
# Step 1 - Configure
self.configure_sandbox(sandbox)
......@@ -1564,7 +1564,7 @@ class Element(Plugin):
# Step 4 - Assemble
collect = self.assemble(sandbox) # pylint: disable=assignment-from-no-return
self.__set_build_result(success=True, description="succeeded")
except BstError as e:
except ElementError as e:
# Shelling into a sandbox is useful to debug this error
e.sandbox = True
......@@ -1586,12 +1586,16 @@ class Element(Plugin):
self.warn("Failed to preserve workspace state for failed build sysroot: {}"
.format(e))
if isinstance(e, ElementError):
collect = e.collect # pylint: disable=no-member
self.__set_build_result(success=False, description=str(e), detail=e.detail)
self._cache_artifact(rootdir, sandbox, e.collect)
raise
else:
return self._cache_artifact(rootdir, sandbox, collect)
finally:
cleanup_rootdir()
def _cache_artifact(self, rootdir, sandbox, collect):
if collect is not None:
try:
sandbox_vroot = sandbox.get_virtual_directory()
......@@ -1630,7 +1634,7 @@ class Element(Plugin):
pass
# Copy build log
log_filename = context.get_log_filename()
log_filename = self._get_context().get_log_filename()
self._build_log_path = os.path.join(logsdir, 'build.log')
if log_filename:
shutil.copyfile(log_filename, self._build_log_path)
......@@ -1681,9 +1685,6 @@ class Element(Plugin):
"unable to collect artifact contents"
.format(collect))
# Finally cleanup the build dir
cleanup_rootdir()
return artifact_size
def _get_build_log(self):
......@@ -1691,7 +1692,9 @@ class Element(Plugin):
# _pull_pending()
#
# Check whether the artifact will be pulled.
# Check whether the artifact will be pulled. If the pull operation is to
# include a specific subdir of the element artifact (from cli or user conf),
# then the local cache is queried for the subdir's existence.
#
# Returns:
# (bool): Whether a pull operation is pending
......@@ -1701,8 +1704,15 @@ class Element(Plugin):
# Workspace builds are never pushed to artifact servers
return False
if self.__strong_cached:
# Artifact already in local cache
# Check whether the pull has been invoked with a specific subdir requested
# in user context, so as to complete a partial artifact
subdir, _ = self.__pull_directories()
if self.__strong_cached and subdir:
# If we've specified a subdir, check if the subdir is cached locally
if self.__artifacts.contains_subdir_artifact(self, self.__strict_cache_key, subdir):
return False
elif self.__strong_cached:
return False
# Pull is pending if artifact remote server available
......@@ -1724,33 +1734,6 @@ class Element(Plugin):
self._update_state()
def _pull_strong(self, *, progress=None):
weak_key = self._get_cache_key(strength=_KeyStrength.WEAK)
key = self.__strict_cache_key
if not self.__artifacts.pull(self, key, progress=progress):
return False
# update weak ref by pointing it to this newly fetched artifact
self.__artifacts.link_key(self, key, weak_key)
return True
def _pull_weak(self, *, progress=None):
weak_key = self._get_cache_key(strength=_KeyStrength.WEAK)
if not self.__artifacts.pull(self, weak_key, progress=progress):
return False
# extract strong cache key from this newly fetched artifact
self._pull_done()
# create tag for strong cache key
key = self._get_cache_key(strength=_KeyStrength.STRONG)
self.__artifacts.link_key(self, weak_key, key)
return True
# _pull():
#
# Pull artifact from remote artifact repository into local artifact cache.
......@@ -1763,11 +1746,15 @@ class Element(Plugin):
def progress(percent, message):
self.status(message)
# Get optional specific subdir to pull and optional list to not pull
# based off of user context
subdir, excluded_subdirs = self.__pull_directories()
# Attempt to pull artifact without knowing whether it's available
pulled = self._pull_strong(progress=progress)
pulled = self.__pull_strong(progress=progress, subdir=subdir, excluded_subdirs=excluded_subdirs)
if not pulled and not self._cached() and not context.get_strict():
pulled = self._pull_weak(progress=progress)
pulled = self.__pull_weak(progress=progress, subdir=subdir, excluded_subdirs=excluded_subdirs)
if not pulled:
return False
......@@ -1787,10 +1774,12 @@ class Element(Plugin):
# No push remotes for this element's project
return True
if not self._cached():
# Do not push elements that aren't cached, or that are cached with a dangling buildtree
# artifact, unless the element type is expected to have an empty buildtree directory
if not self.__cached_buildtree():
return True
# Do not push tained artifact
# Do not push tainted artifact
if self.__get_tainted():
return True
......@@ -1891,7 +1880,10 @@ class Element(Plugin):
# This requires that a workspace already be created in
# the workspaces metadata first.
#
def _open_workspace(self):
# Args:
# buildtree (bool): Whether to open workspace with artifact buildtree
#
def _open_workspace(self, buildtree=False):
context = self._get_context()
workspace = self._get_workspace()
assert workspace is not None
......@@ -1904,12 +1896,22 @@ class Element(Plugin):
# files in the target directory actually works without any
# additional support from Source implementations.
#
os.makedirs(context.builddir, exist_ok=True)
with utils._tempdir(dir=context.builddir, prefix='workspace-{}'
.format(self.normal_name)) as temp:
with utils._tempdir(dir=context.builddir, prefix='workspace-source-{}'
.format(self.normal_name)) as temp,\
utils._tempdir(dir=context.builddir, prefix='workspace-buildtree-{}'
.format(self.normal_name)) as buildtreetemp:
for source in self.sources():
source._init_workspace(temp)
# Overwrite the source checkout with the cached buildtree
if buildtree:
self.__artifacts.checkout_artifact_subdir(self, self._get_cache_key(), 'buildtree', buildtreetemp)
if utils._call([utils.get_host_tool('cp'), '-pfr', "".join((buildtreetemp, '/.')), temp])[0] != 0:
raise ElementError("Failed to copy buildtree into workspace checkout at {}".format(buildtreetemp))
# Now hardlink the files into the workspace target.
utils.link_files(temp, workspace.get_absolute_path())
......@@ -2674,6 +2676,106 @@ class Element(Plugin):
return utils._deduplicate(keys)
# __pull_strong():
#
# Attempt pulling given element from configured artifact caches with
# the strict cache key
#
# Args:
# progress (callable): The progress callback, if any
# subdir (str): The optional specific subdir to pull
# excluded_subdirs (list): The optional list of subdirs to not pull
#
# Returns:
# (bool): Whether or not the pull was successful
#
def __pull_strong(self, *, progress=None, subdir=None, excluded_subdirs=None):
weak_key = self._get_cache_key(strength=_KeyStrength.WEAK)
key = self.__strict_cache_key
if not self.__artifacts.pull(self, key, progress=progress, subdir=subdir,
excluded_subdirs=excluded_subdirs):
return False
# update weak ref by pointing it to this newly fetched artifact
self.__artifacts.link_key(self, key, weak_key)
return True
# __pull_weak():
#
# Attempt pulling given element from configured artifact caches with
# the weak cache key
#
# Args:
# progress (callable): The progress callback, if any
# subdir (str): The optional specific subdir to pull
# excluded_subdirs (list): The optional list of subdirs to not pull
#
# Returns:
# (bool): Whether or not the pull was successful
#
def __pull_weak(self, *, progress=None, subdir=None, excluded_subdirs=None):
weak_key = self._get_cache_key(strength=_KeyStrength.WEAK)
if not self.__artifacts.pull(self, weak_key, progress=progress, subdir=subdir,
excluded_subdirs=excluded_subdirs):
return False
# extract strong cache key from this newly fetched artifact
self._pull_done()
# create tag for strong cache key
key = self._get_cache_key(strength=_KeyStrength.STRONG)
self.__artifacts.link_key(self, weak_key, key)
return True
# __cached_buildtree():
#
# Check if cached element artifact contains expected buildtree
#
# Returns:
# (bool): True if artifact cached with buildtree, False if
# element not cached or missing expected buildtree
#
def __cached_buildtree(self):
context = self._get_context()
if not self._cached():
return False
elif context.get_strict():
if not self.__artifacts.contains_subdir_artifact(self, self.__strict_cache_key, 'buildtree'):
return False
elif not self.__artifacts.contains_subdir_artifact(self, self.__weak_cache_key, 'buildtree'):
return False
return True
# __pull_directories():
#
# Which directories to include or exclude given the current
# context
#
# Returns:
# subdir (str): The optional specific subdir to include, based
# on user context
# excluded_subdirs (list): The optional list of subdirs to skip
# when pulling, kept consistent with the subdir value
#
def __pull_directories(self):
context = self._get_context()
# Current default exclusions on pull
excluded_subdirs = ["buildtree"]
subdir = ''
# If buildtrees are to be pulled, remove the value from exclusion list
# and set specific subdir
if context.pull_buildtrees:
subdir = "buildtree"
excluded_subdirs.remove(subdir)
return (subdir, excluded_subdirs)
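Concretely, the two possible results, derived directly from the code above:

    # context.pull_buildtrees is False (the default):
    #     __pull_directories()  ->  ('', ['buildtree'])
    #     i.e. pull everything except the buildtree subdir
    #
    # context.pull_buildtrees is True:
    #     __pull_directories()  ->  ('buildtree', [])
    #     i.e. pull everything, with no exclusions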
def _overlap_error_detail(f, forbidden_overlap_elements, elements):
if forbidden_overlap_elements:
......
@@ -19,7 +19,7 @@ variables:
cmake-args: |
-DCMAKE_INSTALL_PREFIX:PATH="%{prefix}" \
-DCMAKE_INSTALL_LIBDIR=%{lib} %{cmake-extra} %{cmake-global} %{cmake-local}
-DCMAKE_INSTALL_LIBDIR:PATH="%{lib}" %{cmake-extra} %{cmake-global} %{cmake-local}
cmake: |
......
@@ -86,7 +86,6 @@ This plugin also utilises the following configurable core plugin warnings:
"""
import os
import errno
import re
import shutil
from collections.abc import Mapping
@@ -97,6 +96,7 @@ from configparser import RawConfigParser
from buildstream import Source, SourceError, Consistency, SourceFetcher
from buildstream import utils
from buildstream.plugin import CoreWarnings
from buildstream.utils import move_atomic, DirectoryExistsError
GIT_MODULES = '.gitmodules'
@@ -141,19 +141,14 @@ class GitMirror(SourceFetcher):
fail="Failed to clone git repository {}".format(url),
fail_temporarily=True)
# Attempt atomic rename into destination, this will fail if
# another process beat us to the punch
try:
os.rename(tmpdir, self.mirror)
except OSError as e:
# When renaming and the destination repo already exists, os.rename()
# will fail with ENOTEMPTY, since an empty directory will be silently
# replaced
if e.errno == errno.ENOTEMPTY:
move_atomic(tmpdir, self.mirror)
except DirectoryExistsError:
# Another process was quicker to download this repository.
# Let's discard our own
self.source.status("{}: Discarding duplicate clone of {}"
.format(self.source, url))
else:
except OSError as e:
raise SourceError("{}: Failed to move cloned git repository {} from '{}' to '{}': {}"
.format(self.source, url, tmpdir, self.mirror, e)) from e
......
@@ -68,7 +68,6 @@ details on common configuration options for sources.
The ``pip`` plugin is available since :ref:`format version 16 <project_format_version>`
"""
import errno
import hashlib
import os
import re
@@ -80,6 +79,7 @@ _PYPI_INDEX_URL = 'https://pypi.org/simple/'
# Used only for finding pip command
_PYTHON_VERSIONS = [
'python', # when running in a venv, we might not have the exact version
'python2.7',
'python3.0',
'python3.1',
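For context, a sketch of how such a version list is typically probed (the actual loop lives outside this hunk, so treat the shape below as an assumption rather than the plugin's real code); utils.get_host_tool() raises ProgramNotFoundError when a binary is missing from the host:

    # Hypothetical probe over _PYTHON_VERSIONS:
    host_python = None
    for version in _PYTHON_VERSIONS:
        try:
            host_python = utils.get_host_tool(version)
            break
        except utils.ProgramNotFoundError:
            pass

    if host_python is None:
        raise SourceError("{}: no usable python interpreter found on the host".format(self))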
@@ -192,13 +192,14 @@ class PipSource(Source):
# process has fetched the sources before us and ensure that we do
# not raise an error in that case.
try:
os.makedirs(self._mirror)
os.rename(package_dir, self._mirror)
except FileExistsError:
return
utils.move_atomic(package_dir, self._mirror)
except utils.DirectoryExistsError:
# Another process has beaten us and has fetched the sources
# before us.
pass
except OSError as e:
if e.errno != errno.ENOTEMPTY:
raise
raise SourceError("{}: Failed to move downloaded pip packages from '{}' to '{}': {}"
.format(self, package_dir, self._mirror, e)) from e
def stage(self, directory):
with self.timed_activity("Staging Python packages", silent_nested=True):
......
@@ -182,7 +182,7 @@ class SandboxRemote(Sandbox):
# to replace the sandbox's virtual directory with that. Creating a new virtual directory object
# from another hash will be interesting, though...
new_dir = CasBasedDirectory(self._get_context(), ref=dir_digest)
new_dir = CasBasedDirectory(self._get_context().artifactcache.cas, ref=dir_digest)
self._set_virtual_directory(new_dir)
def run(self, command, flags, *, cwd=None, env=None):
@@ -191,7 +191,7 @@ class SandboxRemote(Sandbox):
if isinstance(upload_vdir, FileBasedDirectory):
# Make a new temporary directory to put source in
upload_vdir = CasBasedDirectory(self._get_context(), ref=None)
upload_vdir = CasBasedDirectory(self._get_context().artifactcache.cas, ref=None)
upload_vdir.import_files(self.get_virtual_directory()._get_underlying_directory())
upload_vdir.recalculate_hash()
......
@@ -156,7 +156,7 @@ class Sandbox():
"""
if self._vdir is None or self._never_cache_vdirs:
if 'BST_CAS_DIRECTORIES' in os.environ:
self._vdir = CasBasedDirectory(self.__context, ref=None)
self._vdir = CasBasedDirectory(self.__context.artifactcache.cas, ref=None)
else:
self._vdir = FileBasedDirectory(self._root)
return self._vdir
......
@@ -249,13 +249,11 @@ class CasBasedDirectory(Directory):
_pb2_path_sep = "/"
_pb2_absolute_path_prefix = "/"
def __init__(self, context, ref=None, parent=None, common_name="untitled", filename=None):
self.context = context
self.cas_directory = os.path.join(context.artifactdir, 'cas')
def __init__(self, cas_cache, ref=None, parent=None, common_name="untitled", filename=None):
self.filename = filename
self.common_name = common_name
self.pb2_directory = remote_execution_pb2.Directory()
self.cas_cache = context.artifactcache.cas
self.cas_cache = cas_cache
if ref:
with open(self.cas_cache.objpath(ref), 'rb') as f:
self.pb2_directory.ParseFromString(f.read())
@@ -270,7 +268,7 @@ class CasBasedDirectory(Directory):
if self._directory_read:
return
for entry in self.pb2_directory.directories:
buildStreamDirectory = CasBasedDirectory(self.context, ref=entry.digest,
buildStreamDirectory = CasBasedDirectory(self.cas_cache, ref=entry.digest,
parent=self, filename=entry.name)
self.index[entry.name] = IndexEntry(entry, buildstream_object=buildStreamDirectory)
for entry in self.pb2_directory.files:
@@ -333,7 +331,7 @@ class CasBasedDirectory(Directory):
.format(name, str(self), type(newdir)))
dirnode = self._find_pb2_entry(name)
else:
newdir = CasBasedDirectory(self.context, parent=self, filename=name)
newdir = CasBasedDirectory(self.cas_cache, parent=self, filename=name)
dirnode = self.pb2_directory.directories.add()
dirnode.name = name
......
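Taken together, these hunks change the constructor contract: CasBasedDirectory now receives the CAS cache object directly instead of a Context, so the class no longer reaches into context.artifactcache itself. A minimal sketch of the new call pattern (context here is an assumed stand-in for wherever a Context is already available):

    # Callers resolve the CAS cache once and hand it down:
    cas = context.artifactcache.cas

    root = CasBasedDirectory(cas, ref=None)                      # empty virtual directory
    child = CasBasedDirectory(cas, parent=root, filename='src')  # shares the same cache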
@@ -72,6 +72,11 @@ class ProgramNotFoundError(BstError):
super().__init__(message, domain=ErrorDomain.PROG_NOT_FOUND, reason=reason)
class DirectoryExistsError(OSError):
"""Raised when a `os.rename` is attempted but the destination is an existing directory.
"""
class FileListResult():
"""An object which stores the result of one of the operations
which run on a list of files.
@@ -500,6 +505,38 @@ def get_bst_version():
.format(__version__))
def move_atomic(source, destination, ensure_parents=True):
"""Move the source to the destination using atomic primitives.
This uses `os.rename` to move a file or directory to a new destination.
It wraps some of the `OSError`s thrown by `os.rename` to ensure consistent handling.
The main reason for this function to exist is that `rename` can throw different
errors for the same symptom (see https://www.unix.com/man-page/POSIX/3posix/rename/).
We are especially interested here in the case when the destination already
exists. In this case, either `EEXIST` or `ENOTEMPTY` may be thrown.
To ensure consistent handling of these exceptions, this function should be
used instead of `os.rename`.
Args:
source (str or Path): source to rename
destination (str or Path): destination to which to move the source
ensure_parents (bool): Whether or not to create the destination's parent
directories (default: True)
"""
if ensure_parents:
os.makedirs(os.path.dirname(str(destination)), exist_ok=True)
try:
os.rename(str(source), str(destination))
except OSError as exc:
if exc.errno in (errno.EEXIST, errno.ENOTEMPTY):
raise DirectoryExistsError(*exc.args) from exc
raise
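A usage sketch matching the updated call sites in the git and pip plugins above (the destination path and the cleanup policy here are illustrative, not part of the API):

    import shutil
    import tempfile

    from buildstream.utils import move_atomic, DirectoryExistsError

    staging = tempfile.mkdtemp(prefix='download-')
    # ... populate staging ...
    try:
        move_atomic(staging, '/path/to/mirror')  # hypothetical destination
    except DirectoryExistsError:
        # Another process completed the same download first; drop our copy
        shutil.rmtree(staging)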
@contextmanager
def save_file_atomic(filename, mode='w', *, buffering=-1, encoding=None,
errors=None, newline=None, closefd=True, opener=None, tempdir=None):
......