Compare revisions

Changes are shown as if the source revision was being merged into the target revision.


Commits on Source (57)
Showing 440 additions and 254 deletions
......@@ -23,6 +23,11 @@ a reasonable timeframe for identifying these.
Patch submissions
-----------------
If you want to submit a patch, do ask for developer permissions on our
IRC channel first (GitLab's button also works, but you may need to draw
our attention to it, as we often overlook these requests); for CI
reasons, it's much easier if patches are in branches of the main
repository.

Branches must be submitted as merge requests on GitLab. If the branch
fixes an issue or is related to any issues, those issues must be mentioned
in the merge request, or preferably in the commit messages themselves.
......
=================
buildstream 1.3.1
buildstream 1.1.5
=================
o Add a `--tar` option to `bst checkout` which allows a tarball to be
......@@ -9,6 +9,15 @@ buildstream 1.3.1
and the preferred mirror to fetch from can be defined in the command
line or user config.
o Added new `remote` source plugin for downloading file blobs
o Failed builds are included in the cache as well.
`bst checkout` will provide anything in `%{install-root}`.
A build that includes cached failures will cause dependent elements
not to be scheduled; they fail during artifact assembly, and the
retry prompt is displayed during an interactive session.
=================
buildstream 1.1.4
=================
......
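A minimal sketch of the behaviour this NEWS entry describes; `Element`, `cached_failure` and `schedulable` below are illustrative stand-ins, not BuildStream API:

class Element:
    def __init__(self, name, cached_failure=False, deps=()):
        self.name = name
        self.cached_failure = cached_failure
        self.deps = list(deps)

def schedulable(element):
    # An element is only scheduled when neither it nor anything in its
    # build dependency chain carries a cached failure.
    return (not element.cached_failure
            and all(schedulable(dep) for dep in element.deps))

lib = Element("lib.bst", cached_failure=True)
app = Element("app.bst", deps=[lib])
assert not schedulable(app)  # app fails at assembly rather than being built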
......@@ -32,6 +32,7 @@ from .._protos.google.bytestream import bytestream_pb2, bytestream_pb2_grpc
from .._protos.build.bazel.remote.execution.v2 import remote_execution_pb2, remote_execution_pb2_grpc
from .._protos.buildstream.v2 import buildstream_pb2, buildstream_pb2_grpc
from .._message import MessageType, Message
from .. import _signals, utils
from .._exceptions import ArtifactError
......@@ -264,7 +265,7 @@ class CASCache(ArtifactCache):
for remote in push_remotes:
remote.init()
skipped_remote = True
element.info("Pushing {} -> {}".format(element._get_brief_display_key(), remote.spec.url))
try:
......@@ -280,8 +281,6 @@ class CASCache(ArtifactCache):
if response.digest.hash == tree.hash and response.digest.size_bytes == tree.size_bytes:
# ref is already on the server with the same tree
element.info("Skipping {}, remote ({}) already has artifact cached".format(
element._get_brief_display_key(), remote.spec.url))
continue
except grpc.RpcError as e:
......@@ -309,6 +308,7 @@ class CASCache(ArtifactCache):
missing_blobs[d.hash] = d
# Upload any blobs missing on the server
skipped_remote = False
for digest in missing_blobs.values():
def request_stream():
resource_name = os.path.join(digest.hash, str(digest.size_bytes))
......@@ -344,6 +344,13 @@ class CASCache(ArtifactCache):
if e.code() != grpc.StatusCode.RESOURCE_EXHAUSTED:
raise ArtifactError("Failed to push artifact {}: {}".format(refs, e), temporary=True) from e
if skipped_remote:
self.context.message(Message(
None,
MessageType.SKIPPED,
"Remote ({}) already has {} cached".format(
remote.spec.url, element._get_brief_display_key())
))
return pushed
################################################
......
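A minimal sketch of the flag this hunk introduces: `skipped_remote` starts True for each remote, is cleared as soon as any ref actually needs pushing, and the SKIPPED message fires only when everything was already present. The names `push`, `already_cached`, `upload` and `report_skipped` are hypothetical stand-ins for the real methods:

def push(remotes, refs, already_cached, upload, report_skipped):
    pushed = False
    for remote in remotes:
        skipped_remote = True
        for ref in refs:
            if already_cached(remote, ref):
                continue          # leaves skipped_remote set
            skipped_remote = False
            upload(remote, ref)
            pushed = True
        if skipped_remote:
            # Fires only when every ref was already on this remote
            report_skipped(remote)
    return pushed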
......@@ -197,29 +197,55 @@ class Context():
"\nValid values are, for example: 800M 10G 1T 50%\n"
.format(str(e))) from e
# If we are asked not to set a quota, we set it to the maximum
# disk space available minus a headroom of 2GB, such that we
# at least try to avoid raising Exceptions.
# Headroom intended to give BuildStream a bit of leeway.
# This acts as the minimum size of cache_quota and is also
# subtracted from the user-requested cache_quota.
#
# Of course, we might still end up running out during a build
# if we end up writing more than 2G, but hey, this stuff is
# already really fuzzy.
#
if cache_quota is None:
stat = os.statvfs(artifactdir_volume)
# Again, the artifact directory may not yet have been
# created
if not os.path.exists(self.artifactdir):
cache_size = 0
else:
cache_size = utils._get_dir_size(self.artifactdir)
cache_quota = cache_size + stat.f_bsize * stat.f_bavail
if 'BST_TEST_SUITE' in os.environ:
headroom = 0
else:
headroom = 2e9
stat = os.statvfs(artifactdir_volume)
available_space = (stat.f_bsize * stat.f_bavail)
# Again, the artifact directory may not have been created yet
#
if not os.path.exists(self.artifactdir):
cache_size = 0
else:
cache_size = utils._get_dir_size(self.artifactdir)
# Ensure system has enough storage for the cache_quota
#
# If cache_quota is none, set it to the maximum it could possibly be.
#
# Also check that cache_quota is at least as large as our headroom.
#
if cache_quota is None: # Infinity, set to max system storage
cache_quota = cache_size + available_space
if cache_quota < headroom: # Check minimum
raise LoadError(LoadErrorReason.INVALID_DATA,
"Invalid cache quota ({}): ".format(utils._pretty_size(cache_quota)) +
"BuildStream requires a minimum cache quota of 2G.")
elif cache_quota > cache_size + available_space: # Check maximum
raise LoadError(LoadErrorReason.INVALID_DATA,
("Your system does not have enough available " +
"space to support the cache quota specified.\n" +
"You currently have:\n" +
"- {used} of cache in use at {local_cache_path}\n" +
"- {available} of available system storage").format(
used=utils._pretty_size(cache_size),
local_cache_path=self.artifactdir,
available=utils._pretty_size(available_space)))
# Place a slight headroom (2e9 (2GB) on the cache_quota) into
# cache_quota to try and avoid exceptions.
#
# Of course, we might still end up running out during a build
# if we end up writing more than 2G, but hey, this stuff is
# already really fuzzy.
#
self.cache_quota = cache_quota - headroom
self.cache_lower_threshold = self.cache_quota / 2
......
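The checks above reduce to a small piece of arithmetic. A minimal sketch, assuming the same 2G headroom, with a plain ValueError standing in for LoadError and `resolve_quota` as a hypothetical helper:

HEADROOM = 2e9  # the 2GB leeway used above

def resolve_quota(requested, cache_size, available_space):
    # None means "no quota": cap at everything we could possibly use
    if requested is None:
        requested = cache_size + available_space
    if requested < HEADROOM:
        raise ValueError("BuildStream requires a minimum cache quota of 2G")
    if requested > cache_size + available_space:
        raise ValueError("not enough available storage for this quota")
    # The effective quota keeps the headroom in reserve
    return requested - HEADROOM

# A 10G request with 3G already cached and 50G free yields 8G effective
assert resolve_quota(10e9, 3e9, 50e9) == 8e9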
......@@ -88,6 +88,7 @@ class ErrorDomain(Enum):
ELEMENT = 11
APP = 12
STREAM = 13
VIRTUAL_FS = 14
# BstError is an internal base exception class for BuildStream
......
......@@ -68,9 +68,10 @@ def complete_path(path_type, incomplete, base_directory='.'):
# If there was nothing on the left of the last separator,
# we are completing files in the filesystem root
base_path = os.path.join(base_directory, base_path)
elif os.path.isdir(incomplete):
base_path = incomplete
else:
incomplete_base_path = os.path.join(base_directory, incomplete)
if os.path.isdir(incomplete_base_path):
base_path = incomplete_base_path
try:
if base_path:
......
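A hedged illustration of why the join above matters: the old check consulted the process's own working directory rather than the completion root.

import os
import tempfile

base_directory = tempfile.mkdtemp()
os.mkdir(os.path.join(base_directory, "elements"))

incomplete = "elements"
# The fixed code resolves the fragment against base_directory first;
# a bare os.path.isdir(incomplete) would depend on the process cwd.
incomplete_base_path = os.path.join(base_directory, incomplete)
assert os.path.isdir(incomplete_base_path)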
......@@ -160,6 +160,7 @@ class TypeName(Widget):
MessageType.START: "blue",
MessageType.SUCCESS: "green",
MessageType.FAIL: "red",
MessageType.SKIPPED: "yellow",
MessageType.ERROR: "red",
MessageType.BUG: "red",
}
......@@ -368,7 +369,9 @@ class LogLine(Widget):
if consistency == Consistency.INCONSISTENT:
line = p.fmt_subst(line, 'state', "no reference", fg='red')
else:
if element._cached():
if element._cached_failure():
line = p.fmt_subst(line, 'state', "failed", fg='red')
elif element._cached_success():
line = p.fmt_subst(line, 'state', "cached", fg='magenta')
elif consistency == Consistency.RESOLVED:
line = p.fmt_subst(line, 'state', "fetch needed", fg='red')
......@@ -522,12 +525,15 @@ class LogLine(Widget):
text += "\n\n"
if self._failure_messages:
text += self.content_profile.fmt("Failure Summary\n", bold=True)
values = OrderedDict()
for element, messages in sorted(self._failure_messages.items(), key=lambda x: x[0].name):
values[element.name] = ''.join(self._render(v) for v in messages)
text += self._format_values(values, style_value=False)
for queue in stream.queues:
if any(el.name == element.name for el in queue.failed_elements):
values[element.name] = ''.join(self._render(v) for v in messages)
if values:
text += self.content_profile.fmt("Failure Summary\n", bold=True)
text += self._format_values(values, style_value=False)
text += self.content_profile.fmt("Pipeline Summary\n", bold=True)
values = OrderedDict()
......
......@@ -36,6 +36,7 @@ class MessageType():
START = "start" # Status start message
SUCCESS = "success" # Successful status complete message
FAIL = "failure" # Failing status complete message
SKIPPED = "skipped"
# Messages which should be reported regardless of whether
......
......@@ -489,7 +489,7 @@ class _Planner():
self.plan_element(dep, depth)
# Don't try to plan builds of elements that are cached already
if not element._cached():
if not element._cached_success():
for dep in element.dependencies(Scope.BUILD, recurse=False):
self.plan_element(dep, depth + 1)
......@@ -501,4 +501,4 @@ class _Planner():
self.plan_element(root, 0)
depth_sorted = sorted(self.depth_map.items(), key=itemgetter(1), reverse=True)
return [item[0] for item in depth_sorted if plan_cached or not item[0]._cached()]
return [item[0] for item in depth_sorted if plan_cached or not item[0]._cached_success()]
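The return statement above emits the plan deepest-first, so every element is planned before anything that depends on it. A toy depth map (element names made up) illustrates the ordering:

from operator import itemgetter

# Deeper numbers sit further down the build dependency chain
depth_map = {"app.bst": 0, "lib.bst": 1, "toolchain.bst": 2}

# Deepest-first, exactly as in the return statement above
plan = [name for name, _ in
        sorted(depth_map.items(), key=itemgetter(1), reverse=True)]
assert plan == ["toolchain.bst", "lib.bst", "app.bst"]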
......@@ -18,8 +18,12 @@
# Tristan Van Berkom <tristan.vanberkom@codethink.co.uk>
# Jürg Billeter <juerg.billeter@codethink.co.uk>
from datetime import timedelta
from . import Queue, QueueStatus
from ..jobs import ElementJob
from ..resources import ResourceType
from ..._message import MessageType
# A queue which assembles elements
......@@ -30,6 +34,38 @@ class BuildQueue(Queue):
complete_name = "Built"
resources = [ResourceType.PROCESS]
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self._tried = set()
def enqueue(self, elts):
to_queue = []
for element in elts:
if not element._cached_failure() or element in self._tried:
to_queue.append(element)
continue
# Bypass queue processing entirely the first time it's tried.
self._tried.add(element)
_, description, detail = element._get_build_result()
logfile = element._get_build_log()
self._message(element, MessageType.FAIL, description,
detail=detail, action_name=self.action_name,
elapsed=timedelta(seconds=0),
logfile=logfile)
job = ElementJob(self._scheduler, self.action_name,
logfile, element=element, queue=self,
resources=self.resources,
action_cb=self.process,
complete_cb=self._job_done,
max_retries=self._max_retries)
self._done_queue.append(job)
self.failed_elements.append(element)
self._scheduler._job_complete_callback(job, False)
return super().enqueue(to_queue)
def process(self, element):
element._assemble()
return element._get_unique_id()
......@@ -43,7 +79,7 @@ class BuildQueue(Queue):
# Keep it in the queue.
return QueueStatus.WAIT
if element._cached():
if element._cached_success():
return QueueStatus.SKIP
if not element._buildable():
......
......@@ -296,6 +296,7 @@ class Queue():
# See the Job object for an explanation of the call signature
#
def _job_done(self, job, element, success, result):
element._update_state()
# Update values that need to be synchronized in the main task
# before calling any queue implementation
......@@ -335,8 +336,9 @@ class Queue():
# No exception occurred, handle the success/failure state in the normal way
#
self._done_queue.append(job)
if success:
self._done_queue.append(job)
if processed:
self.processed_elements.append(element)
else:
......
......@@ -407,15 +407,16 @@ class Stream():
integrate=integrate) as sandbox:
# Copy or move the sandbox to the target directory
sandbox_root = sandbox.get_directory()
sandbox_vroot = sandbox.get_virtual_directory()
if not tar:
with target.timed_activity("Checking out files in '{}'"
.format(location)):
try:
if hardlinks:
self._checkout_hardlinks(sandbox_root, location)
self._checkout_hardlinks(sandbox_vroot, location)
else:
utils.copy_files(sandbox_root, location)
sandbox_vroot.export_files(location)
except OSError as e:
raise StreamError("Failed to checkout files: '{}'"
.format(e)) from e
......@@ -424,14 +425,12 @@ class Stream():
with target.timed_activity("Creating tarball"):
with os.fdopen(sys.stdout.fileno(), 'wb') as fo:
with tarfile.open(fileobj=fo, mode="w|") as tf:
Stream._add_directory_to_tarfile(
tf, sandbox_root, '.')
sandbox_vroot.export_to_tar(tf, '.')
else:
with target.timed_activity("Creating tarball '{}'"
.format(location)):
with tarfile.open(location, "w:") as tf:
Stream._add_directory_to_tarfile(
tf, sandbox_root, '.')
sandbox_vroot.export_to_tar(tf, '.')
except BstError as e:
raise StreamError("Error while staging dependencies into a sandbox"
......@@ -1050,46 +1049,13 @@ class Stream():
# Helper function for checkout()
#
def _checkout_hardlinks(self, sandbox_root, directory):
def _checkout_hardlinks(self, sandbox_vroot, directory):
try:
removed = utils.safe_remove(directory)
utils.safe_remove(directory)
except OSError as e:
raise StreamError("Failed to remove checkout directory: {}".format(e)) from e
if removed:
# Try a simple rename of the sandbox root; if that
# doesn't cut it, then do the regular link files code path
try:
os.rename(sandbox_root, directory)
except OSError:
os.makedirs(directory, exist_ok=True)
utils.link_files(sandbox_root, directory)
else:
utils.link_files(sandbox_root, directory)
# Add a directory entry deterministically to a tar file
#
# This function takes extra steps to ensure the output is deterministic.
# First, it sorts the results of os.listdir() to ensure the ordering of
# the files in the archive is the same. Second, it sets a fixed
# timestamp for each entry. See also https://bugs.python.org/issue24465.
@staticmethod
def _add_directory_to_tarfile(tf, dir_name, dir_arcname, mtime=0):
for filename in sorted(os.listdir(dir_name)):
name = os.path.join(dir_name, filename)
arcname = os.path.join(dir_arcname, filename)
tarinfo = tf.gettarinfo(name, arcname)
tarinfo.mtime = mtime
if tarinfo.isreg():
with open(name, "rb") as f:
tf.addfile(tarinfo, f)
elif tarinfo.isdir():
tf.addfile(tarinfo)
Stream._add_directory_to_tarfile(tf, name, arcname, mtime)
else:
tf.addfile(tarinfo)
sandbox_vroot.export_files(directory, can_link=True, can_destroy=True)
# Write the element build script to the given directory
def _write_element_script(self, directory, element):
......
#
# Copyright (C) 2016 Codethink Limited
# Copyright (C) 2018 Bloomberg Finance LP
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
......@@ -204,8 +205,9 @@ class BuildElement(Element):
def prepare(self, sandbox):
commands = self.__commands['configure-commands']
if commands:
for cmd in commands:
self.__run_command(sandbox, cmd, 'configure-commands')
with self.timed_activity("Running configure-commands"):
for cmd in commands:
self.__run_command(sandbox, cmd, 'configure-commands')
def generate_script(self):
script = ""
......@@ -240,4 +242,5 @@ class BuildElement(Element):
exitcode = sandbox.run(['sh', '-c', '-e', cmd + '\n'],
SandboxFlags.ROOT_READ_ONLY)
if exitcode != 0:
raise ElementError("Command '{}' failed with exitcode {}".format(cmd, exitcode))
raise ElementError("Command '{}' failed with exitcode {}".format(cmd, exitcode),
collect=self.get_variable('install-root'))
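The new `collect` keyword is what lets a failed build still land in the cache (see the NEWS entry above): BuildStream caches whatever is staged under the given directory despite the nonzero exit. A sketch of the failure path, mirroring the hunk rather than adding new API; `run_command` is a free function here purely for illustration:

from buildstream import ElementError, SandboxFlags

def run_command(element, sandbox, cmd):
    exitcode = sandbox.run(['sh', '-c', '-e', cmd + '\n'],
                           SandboxFlags.ROOT_READ_ONLY)
    if exitcode != 0:
        # collect= names the directory to cache despite the failure,
        # so `bst checkout` can later inspect the partial result
        raise ElementError(
            "Command '{}' failed with exitcode {}".format(cmd, exitcode),
            collect=element.get_variable('install-root'))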
......@@ -34,7 +34,6 @@ The default configuration and possible options are as such:
"""
import os
from buildstream import utils
from buildstream import Element, Scope
......@@ -56,6 +55,9 @@ class ComposeElement(Element):
# added, to reduce the potential for confusion
BST_FORBID_SOURCES = True
# This plugin has been modified to avoid the use of Sandbox.get_directory
BST_VIRTUAL_DIRECTORY = True
def configure(self, node):
self.node_validate(node, [
'integrate', 'include', 'exclude', 'include-orphans'
......@@ -104,7 +106,8 @@ class ComposeElement(Element):
orphans=self.include_orphans)
manifest.update(files)
basedir = sandbox.get_directory()
# Make a snapshot of all the files.
vbasedir = sandbox.get_virtual_directory()
modified_files = set()
removed_files = set()
added_files = set()
......@@ -116,38 +119,24 @@ class ComposeElement(Element):
if require_split:
# Make a snapshot of all the files before integration-commands are run.
snapshot = {
f: getmtime(os.path.join(basedir, f))
for f in utils.list_relative_paths(basedir)
}
snapshot = set(vbasedir.list_relative_paths())
vbasedir.mark_unmodified()
for dep in self.dependencies(Scope.BUILD):
dep.integrate(sandbox)
if require_split:
# Calculate added, modified and removed files
basedir_contents = set(utils.list_relative_paths(basedir))
post_integration_snapshot = vbasedir.list_relative_paths()
modified_files = set(vbasedir.list_modified_paths())
basedir_contents = set(post_integration_snapshot)
for path in manifest:
if path in basedir_contents:
if path in snapshot:
preintegration_mtime = snapshot[path]
if preintegration_mtime != getmtime(os.path.join(basedir, path)):
modified_files.add(path)
else:
# If the path appears in the manifest but not the initial snapshot,
# it may be a file staged inside a directory symlink. In this case
# the path we got from the manifest won't show up in the snapshot
# because utils.list_relative_paths() doesn't recurse into symlink
# directories.
pass
elif path in snapshot:
if path in snapshot and path not in basedir_contents:
removed_files.add(path)
for path in basedir_contents:
if path not in snapshot:
added_files.add(path)
self.info("Integration modified {}, added {} and removed {} files"
.format(len(modified_files), len(added_files), len(removed_files)))
......@@ -166,8 +155,7 @@ class ComposeElement(Element):
# instead of into a subdir. The element assemble() method should
# support this in some way.
#
installdir = os.path.join(basedir, 'buildstream', 'install')
os.makedirs(installdir, exist_ok=True)
installdir = vbasedir.descend(['buildstream', 'install'], create=True)
# We already saved the manifest for created files in the integration phase,
# now collect the rest of the manifest.
......@@ -191,19 +179,12 @@ class ComposeElement(Element):
with self.timed_activity("Creating composition", detail=detail, silent_nested=True):
self.info("Composing {} files".format(len(manifest)))
utils.link_files(basedir, installdir, files=manifest)
installdir.import_files(vbasedir, files=manifest, can_link=True)
# And we're done
return os.path.join(os.sep, 'buildstream', 'install')
# Like os.path.getmtime(), but doesn't explode on symlinks
#
def getmtime(path):
stat = os.lstat(path)
return stat.st_mtime
# Plugin entry point
def setup():
return ComposeElement
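The snapshot bookkeeping above amounts to set arithmetic over path listings: mark_unmodified() takes the "before" snapshot, list_modified_paths() reports in-place edits, and added/removed files fall out of the difference. A sketch with plain sets standing in for the virtual directory API:

before = {"etc/passwd", "usr/bin/sh", "tmp/scratch"}       # pre-integration
after = {"etc/passwd", "usr/bin/sh", "etc/ld.so.cache"}    # post-integration
modified = {"etc/passwd"}  # what list_modified_paths() would report

added_files = after - before
removed_files = before - after
assert added_files == {"etc/ld.so.cache"}
assert removed_files == {"tmp/scratch"}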
......@@ -31,7 +31,6 @@ The empty configuration is as such:
"""
import os
import shutil
from buildstream import Element, BuildElement, ElementError
......@@ -39,6 +38,9 @@ from buildstream import Element, BuildElement, ElementError
class ImportElement(BuildElement):
# pylint: disable=attribute-defined-outside-init
# This plugin has been modified to avoid the use of Sandbox.get_directory
BST_VIRTUAL_DIRECTORY = True
def configure(self, node):
self.source = self.node_subst_member(node, 'source')
self.target = self.node_subst_member(node, 'target')
......@@ -68,27 +70,22 @@ class ImportElement(BuildElement):
# Do not mount workspaces as the files are copied from outside the sandbox
self._stage_sources_in_sandbox(sandbox, 'input', mount_workspaces=False)
rootdir = sandbox.get_directory()
inputdir = os.path.join(rootdir, 'input')
outputdir = os.path.join(rootdir, 'output')
rootdir = sandbox.get_virtual_directory()
inputdir = rootdir.descend(['input'])
outputdir = rootdir.descend(['output'], create=True)
# The directory to grab
inputdir = os.path.join(inputdir, self.source.lstrip(os.sep))
inputdir = inputdir.rstrip(os.sep)
inputdir = inputdir.descend(self.source.strip(os.sep).split(os.sep))
# The output target directory
outputdir = os.path.join(outputdir, self.target.lstrip(os.sep))
outputdir = outputdir.rstrip(os.sep)
# Ensure target directory parent
os.makedirs(os.path.dirname(outputdir), exist_ok=True)
outputdir = outputdir.descend(self.target.strip(os.sep).split(os.sep), create=True)
if not os.path.exists(inputdir):
if inputdir.is_empty():
raise ElementError("{}: No files were found inside directory '{}'"
.format(self, self.source))
# Move it over
shutil.move(inputdir, outputdir)
outputdir.import_files(inputdir)
# And we're done
return '/output'
......
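descend() takes a list of path components, hence the strip/split dance above; for example:

import os

# "/usr/local/" becomes ["usr", "local"] before being handed to descend()
assert "/usr/local/".strip(os.sep).split(os.sep) == ["usr", "local"]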
......@@ -24,13 +24,15 @@ Stack elements are simply a symbolic element used for representing
a logical group of elements.
"""
import os
from buildstream import Element
# Element implementation for the 'stack' kind.
class StackElement(Element):
# This plugin has been modified to avoid the use of Sandbox.get_directory
BST_VIRTUAL_DIRECTORY = True
def configure(self, node):
pass
......@@ -52,7 +54,7 @@ class StackElement(Element):
# Just create a dummy empty artifact, its existence is a statement
# that all this stack's dependencies are built.
rootdir = sandbox.get_directory()
vrootdir = sandbox.get_virtual_directory()
# XXX FIXME: This is currently needed because the artifact
# cache won't let us commit an empty artifact.
......@@ -61,10 +63,7 @@ class StackElement(Element):
# the actual artifact data in a subdirectory, then we
# will be able to store some additional state in the
# artifact cache, and we can also remove this hack.
outputdir = os.path.join(rootdir, 'output', 'bst')
# Ensure target directory parent
os.makedirs(os.path.dirname(outputdir), exist_ok=True)
vrootdir.descend(['output', 'bst'], create=True)
# And we're done
return '/output'
......
......@@ -71,6 +71,7 @@ git - stage files from a git repository
"""
import os
import errno
import re
import shutil
from collections import Mapping
......@@ -119,11 +120,21 @@ class GitMirror(SourceFetcher):
fail="Failed to clone git repository {}".format(url),
fail_temporarily=True)
# Attempt atomic rename into destination; this will fail if
# another process beat us to the punch
try:
shutil.move(tmpdir, self.mirror)
except (shutil.Error, OSError) as e:
raise SourceError("{}: Failed to move cloned git repository {} from '{}' to '{}'"
.format(self.source, url, tmpdir, self.mirror)) from e
os.rename(tmpdir, self.mirror)
except OSError as e:
# When the destination repo already exists and is non-empty,
# os.rename() fails with ENOTEMPTY; an empty destination
# directory would be silently replaced
if e.errno == errno.ENOTEMPTY:
self.source.status("{}: Discarding duplicate clone of {}"
.format(self.source, url))
else:
raise SourceError("{}: Failed to move cloned git repository {} from '{}' to '{}': {}"
.format(self.source, url, tmpdir, self.mirror, e)) from e
def _fetch(self, alias_override=None):
url = self.source.translate_url(self.url, alias_override=alias_override)
......
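The pattern relies on POSIX rename() being atomic: exactly one clone wins the race, and the loser sees ENOTEMPTY. A sketch of the idea; the rmtree cleanup is an assumption (the hunk above only logs and discards the duplicate):

import errno
import os
import shutil

def adopt_clone(tmpdir, mirror):
    try:
        # POSIX rename() is atomic; if another process already populated
        # `mirror`, renaming a directory onto it fails with ENOTEMPTY
        os.rename(tmpdir, mirror)
    except OSError as e:
        if e.errno != errno.ENOTEMPTY:
            raise
        # Lost the race: the other clone is equivalent, discard ours
        shutil.rmtree(tmpdir)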
......@@ -32,7 +32,8 @@ from .._fuse import SafeHardlinks
class Mount():
def __init__(self, sandbox, mount_point, safe_hardlinks):
scratch_directory = sandbox._get_scratch_directory()
root_directory = sandbox.get_directory()
# Getting external_directory here is acceptable as we're part of the sandbox code.
root_directory = sandbox.get_virtual_directory().external_directory
self.mount_point = mount_point
self.safe_hardlinks = safe_hardlinks
......
......@@ -56,7 +56,9 @@ class SandboxBwrap(Sandbox):
def run(self, command, flags, *, cwd=None, env=None):
stdout, stderr = self._get_output()
root_directory = self.get_directory()
# Accessing the underlying storage directly is acceptable here,
# as we're part of the sandbox code
root_directory = self.get_virtual_directory().external_directory
# Fallback to the sandbox default settings for
# the cwd and env.
......