
Compare revisions

Changes are shown as if the source revision was being merged into the target revision.


Commits on Source (5)

  • Rework Skipped usage · 93fd7645 (Qinusty)
    The SKIPPED message type is now used to indicate the end of a task
    that completed successfully without having to perform its work.

    This overhauls the use of `Queue.done()`: queues no longer need to
    provide a processed/skipped return value from `done()`. Instead, a
    `SkipJob` exception is raised from within `Queue.process()`.

  • tests.py: Test skip on push · c3442679 (Qinusty)
    Adds a test to ensure that BuildStream alerts the user of a skipped
    push when the remote already has the artifact cached.

  • element.py: Remove redundant timed_activity · bfa5362f (Qinusty)
    Removes the timed_activity around an element's _push action; the job
    is already timed elsewhere.

  • element.py: reword Downloaded to Pulled · ce91b02e (Qinusty)
    Changes the info phrase used when an artifact is pulled from a
    remote server.

  • cascache.py: Modify messaging API calls · 75704f89 (Qinusty)
    Both the pulling and pushing INFO messages are now status messages.

    Calls to the messaging API through `self.context.message()` have
    been switched to `element.info`.
@@ -228,7 +228,7 @@ class CASCache(ArtifactCache):
         try:
             remote.init()

-            element.info("Pulling {} <- {}".format(element._get_brief_display_key(), remote.spec.url))
+            element.status("Pulling {} <- {}".format(element._get_brief_display_key(), remote.spec.url))

             request = buildstream_pb2.GetReferenceRequest()
             request.key = ref
@@ -250,11 +250,8 @@ class CASCache(ArtifactCache):
             raise ArtifactError("Failed to pull artifact {}: {}".format(
                 element._get_brief_display_key(), e)) from e
         else:
-            self.context.message(Message(
-                None,
-                MessageType.SKIPPED,
-                "Remote ({}) does not have {} cached".format(
-                    remote.spec.url, element._get_brief_display_key())
-            ))
+            element.info("Remote ({}) does not have {} cached".format(
+                remote.spec.url, element._get_brief_display_key()
+            ))

             return False
@@ -279,7 +276,7 @@ class CASCache(ArtifactCache):
         for remote in push_remotes:
             remote.init()

             skipped_remote = True
-            element.info("Pushing {} -> {}".format(element._get_brief_display_key(), remote.spec.url))
+            element.status("Pushing {} -> {}".format(element._get_brief_display_key(), remote.spec.url))

             try:
                 for ref in refs:
@@ -361,11 +358,8 @@ class CASCache(ArtifactCache):
                     raise ArtifactError("Failed to push artifact {}: {}".format(refs, e), temporary=True) from e

             if skipped_remote:
-                self.context.message(Message(
-                    None,
-                    MessageType.SKIPPED,
-                    "Remote ({}) already has {} cached".format(
-                        remote.spec.url, element._get_brief_display_key())
-                ))
+                element.info("Remote ({}) already has {} cached".format(
+                    remote.spec.url, element._get_brief_display_key()
+                ))

         return pushed
@@ -309,3 +309,17 @@ class StreamError(BstError):
 class AppError(BstError):
     def __init__(self, message, detail=None, reason=None):
         super().__init__(message, detail=detail, domain=ErrorDomain.APP, reason=reason)
+
+
+# SkipJob
+#
+# Raised from a child process within a job when the job should be
+# considered skipped by the parent process.
+#
+class SkipJob(Exception):
+    def __init__(self, *, detail=""):
+        super().__init__()
+        self._detail = detail
+
+    def __str__(self):
+        return self._detail
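As defined above, `SkipJob` carries only a human-readable detail string; `str()` on the exception yields that detail, which becomes the body of the SKIPPED message logged by the child process. A minimal illustration, using a standalone copy of the class:

```python
# Standalone copy of the SkipJob exception from the diff above,
# shown with the str() behaviour the child process relies on.
class SkipJob(Exception):
    def __init__(self, *, detail=""):
        super().__init__()
        self._detail = detail

    def __str__(self):
        return self._detail


# The detail is typically the queue's action name, e.g. "Push".
err = SkipJob(detail="Push")
print(str(err))  # Push
```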
@@ -31,7 +31,7 @@ import multiprocessing

 import psutil

 # BuildStream toplevel imports
-from ..._exceptions import ImplError, BstError, set_last_task_error
+from ..._exceptions import ImplError, BstError, set_last_task_error, SkipJob
 from ..._message import Message, MessageType, unconditional_messages
 from ... import _signals, utils
@@ -40,6 +40,7 @@ from ... import _signals, utils
 RC_OK = 0
 RC_FAIL = 1
 RC_PERM_FAIL = 2
+RC_SKIPPED = 3

 # Used to distinguish between status messages and return values
@@ -117,7 +118,7 @@ class Job():
         self._max_retries = max_retries   # Maximum number of automatic retries
         self._result = None               # Return value of child action in the parent
         self._tries = 0                   # Try count, for retryable jobs
+        self._skipped_flag = False        # Indicate whether the job was skipped.

         # If False, a retry will not be attempted regardless of whether _tries is less than _max_retries.
         #
         self._retry_flag = True
@@ -275,6 +276,10 @@ class Job():
     def set_task_id(self, task_id):
         self._task_id = task_id

+    @property
+    def skipped(self):
+        return self._skipped_flag
+
     #######################################################
     #                  Abstract Methods                   #
     #######################################################
@@ -396,6 +401,13 @@ class Job():
         try:
             # Try the task action
             result = self.child_process()
+        except SkipJob as e:
+            elapsed = datetime.datetime.now() - starttime
+            self.message(MessageType.SKIPPED, str(e),
+                         elapsed=elapsed, logfile=filename)
+
+            # Alert parent of skip by return code
+            self._child_shutdown(RC_SKIPPED)
         except BstError as e:
             elapsed = datetime.datetime.now() - starttime
             self._retry_flag = e.temporary
@@ -441,6 +453,7 @@ class Job():
                 self.message(MessageType.SUCCESS, self.action_name, elapsed=elapsed,
                              logfile=filename)

+        # XXX Verify below.
         # Shutdown needs to stay outside of the above context manager,
         # make sure we dont try to handle SIGTERM while the process
         # is already busy in sys.exit()
@@ -547,14 +560,18 @@ class Job():
         # We don't want to retry if we got OK or a permanent fail.
         # This is set in _child_action but must also be set for the parent.
         #
-        self._retry_flag = returncode not in (RC_OK, RC_PERM_FAIL)
+        self._retry_flag = (returncode == RC_FAIL)
+
+        # Set the flag to alert Queue that this job skipped.
+        self._skipped_flag = (returncode == RC_SKIPPED)

         if self._retry_flag and (self._tries <= self._max_retries) and not self._scheduler.terminated:
             self.spawn()
             return

-        self.parent_complete(returncode == RC_OK, self._result)
-        self._scheduler.job_completed(self, returncode == RC_OK)
+        success = (returncode in (RC_OK, RC_SKIPPED))
+        self.parent_complete(success, self._result)
+        self._scheduler.job_completed(self, success)

     # _parent_process_envelope()
     #
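The parent-side classification above can be condensed into a small pure function; the RC_* values come from the diff, while `classify` itself is only a sketch of the logic:

```python
# Return codes from the job module; RC_SKIPPED is the newly added one.
RC_OK, RC_FAIL, RC_PERM_FAIL, RC_SKIPPED = 0, 1, 2, 3


def classify(returncode):
    """Mirror the parent-side flags: only RC_FAIL is retryable,
    RC_SKIPPED marks the job skipped, and both RC_OK and RC_SKIPPED
    count as success."""
    retry = returncode == RC_FAIL
    skipped = returncode == RC_SKIPPED
    success = returncode in (RC_OK, RC_SKIPPED)
    return retry, skipped, success
```

Note that a skipped job is deliberately treated as a success here, so the scheduler moves on rather than retrying.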
@@ -54,12 +54,7 @@ class BuildQueue(Queue):
                               detail=detail, action_name=self.action_name,
                               elapsed=timedelta(seconds=0),
                               logfile=logfile)
-            job = ElementJob(self._scheduler, self.action_name,
-                             logfile, element=element, queue=self,
-                             resources=self.resources,
-                             action_cb=self.process,
-                             complete_cb=self._job_done,
-                             max_retries=self._max_retries)
+            job = self._create_job(element)
             self._done_queue.append(job)
             self.failed_elements.append(element)
             self._scheduler._job_complete_callback(job, False)
@@ -109,5 +104,3 @@ class BuildQueue(Queue):
         # This has to be done after _assemble_done, such that the
         # element may register its cache key as required
         self._check_cache_size(job, element)
-
-        return True
@@ -70,13 +70,8 @@ class FetchQueue(Queue):
         return QueueStatus.READY

     def done(self, _, element, result, success):
-        if not success:
-            return False
-
-        element._update_state()

-        # Successful fetch, we must be CACHED now
-        assert element._get_consistency() == Consistency.CACHED
-
-        return True
+        if success:
+            element._update_state()
+
+            # Successful fetch, we must be CACHED now
+            assert element._get_consistency() == Consistency.CACHED
@@ -21,6 +21,7 @@
 # Local imports
 from . import Queue, QueueStatus
 from ..resources import ResourceType
+from ..._exceptions import SkipJob


 # A queue which pulls element artifacts
@@ -33,7 +34,12 @@ class PullQueue(Queue):

     def process(self, element):
         # returns whether an artifact was downloaded or not
-        return element._pull()
+        pulled = element._pull()
+
+        if not pulled:
+            raise SkipJob(detail=self.action_name)
+
+        return pulled

     def status(self, element):
         # state of dependencies may have changed, recalculate element state
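`PullQueue.process()` now turns a `False` return from `Element._pull()` into a `SkipJob`. The shape of that conversion can be sketched with a stubbed pull callable in place of the real element; names here are illustrative:

```python
# Illustrative stand-in for the exception; the real one carries a
# detail string exactly as added in _exceptions.py.
class SkipJob(Exception):
    def __init__(self, *, detail=""):
        super().__init__()
        self._detail = detail

    def __str__(self):
        return self._detail


def process(pull, action_name="Pull"):
    # Raise SkipJob when nothing was downloaded, as PullQueue now does;
    # PushQueue applies the identical pattern to Element._push().
    pulled = pull()
    if not pulled:
        raise SkipJob(detail=action_name)
    return pulled
```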
def status(self, element):
# state of dependencies may have changed, recalculate element state
......@@ -53,17 +59,10 @@ class PullQueue(Queue):
return QueueStatus.SKIP
def done(self, _, element, result, success):
if not success:
return False
if success:
element._pull_done()
# Build jobs will check the "approximate" size first. Since we
# do not get an artifact size from pull jobs, we have to
# actually check the cache size.
self._scheduler._check_cache_size_real()
# Element._pull() returns True if it downloaded an artifact,
# here we want to appear skipped if we did not download.
return result
@@ -21,6 +21,7 @@
 # Local imports
 from . import Queue, QueueStatus
 from ..resources import ResourceType
+from ..._exceptions import SkipJob


 # A queue which pushes element artifacts
@@ -33,20 +34,15 @@ class PushQueue(Queue):

     def process(self, element):
         # returns whether an artifact was uploaded or not
-        return element._push()
+        pushed = element._push()
+
+        if not pushed:
+            raise SkipJob(detail=self.action_name)
+
+        return pushed

     def status(self, element):
         if element._skip_push():
             return QueueStatus.SKIP

         return QueueStatus.READY
-
-    def done(self, _, element, result, success):
-
-        if not success:
-            return False
-
-        # Element._push() returns True if it uploaded an artifact,
-        # here we want to appear skipped if the remote already had
-        # the artifact.
-        return result
@@ -136,10 +136,6 @@ class Queue():
     #    success (bool): True if the process() implementation did not
     #        raise any exception
     #
-    #    Returns:
-    #        (bool): True if the element should appear to be processsed,
-    #            Otherwise False will count the element as "skipped"
-    #
     def done(self, job, element, result, success):
         pass
@@ -158,20 +154,8 @@ class Queue():
         if not elts:
             return

-        # Note: The internal lists work with jobs. This is not
-        #       reflected in any external methods (except
-        #       pop/peek_ready_jobs).
-        def create_job(element):
-            logfile = self._element_log_path(element)
-            return ElementJob(self._scheduler, self.action_name,
-                              logfile, element=element, queue=self,
-                              resources=self.resources,
-                              action_cb=self.process,
-                              complete_cb=self._job_done,
-                              max_retries=self._max_retries)
-
         # Place skipped elements directly on the done queue
-        jobs = [create_job(elt) for elt in elts]
+        jobs = [self._create_job(elt) for elt in elts]
         skip = [job for job in jobs if self.status(job.element) == QueueStatus.SKIP]
         wait = [job for job in jobs if job not in skip]
@@ -308,8 +292,7 @@ class Queue():
         #     and determine if it should be considered as processed
         #     or skipped.
         try:
-            processed = self.done(job, element, result, success)
+            self.done(job, element, result, success)

         except BstError as e:

             # Report error and mark as failed
@@ -339,7 +322,7 @@ class Queue():
         self._done_queue.append(job)

         if success:
-            if processed:
+            if not job.skipped:
                 self.processed_elements.append(element)
             else:
                 self.skipped_elements.append(element)
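With `done()` no longer returning a flag, the queue's bookkeeping reduces to reading `job.skipped`. The partition of successful jobs can be sketched as follows; the tiny `Job` stub is illustrative, not BuildStream's real class:

```python
class Job:
    # Minimal stand-in exposing only the skipped property used below.
    def __init__(self, element, skipped=False):
        self.element = element
        self.skipped = skipped


def partition(successful_jobs):
    # A successful job lands in processed_elements unless its skipped
    # flag was set by the RC_SKIPPED return code in the parent.
    processed = [j.element for j in successful_jobs if not j.skipped]
    skipped = [j.element for j in successful_jobs if j.skipped]
    return processed, skipped
```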
@@ -360,3 +343,15 @@ class Queue():
         logfile = "{key}-{action}".format(key=key, action=action)

         return os.path.join(project.name, element.normal_name, logfile)
+
+    # Note: The internal lists work with jobs. This is not
+    #       reflected in any external methods (except
+    #       pop/peek_ready_jobs).
+    def _create_job(self, element):
+        logfile = self._element_log_path(element)
+        return ElementJob(self._scheduler, self.action_name,
+                          logfile, element=element, queue=self,
+                          resources=self.resources,
+                          action_cb=self.process,
+                          complete_cb=self._job_done,
+                          max_retries=self._max_retries)
@@ -49,20 +49,10 @@ class TrackQueue(Queue):
         return QueueStatus.READY

     def done(self, _, element, result, success):
-        if not success:
-            return False
-
-        changed = False
-
-        # Set the new refs in the main process one by one as they complete
-        for unique_id, new_ref in result:
-            source = _plugin_lookup(unique_id)
-
-            # We appear processed if at least one source has changed
-            if source._save_ref(new_ref):
-                changed = True
+        if success:
+            # Set the new refs in the main process one by one as they complete
+            for unique_id, new_ref in result:
+                source = _plugin_lookup(unique_id)
+                source._save_ref(new_ref)

-        element._tracking_done()
-
-        # We'll appear as a skipped element if tracking resulted in no change
-        return changed
+            element._tracking_done()
......@@ -1746,7 +1746,7 @@ class Element(Plugin):
# Notify successfull download
display_key = self._get_brief_display_key()
self.info("Downloaded artifact {}".format(display_key))
self.info("Pulled artifact {}".format(display_key))
return True
# _skip_push():
......@@ -1785,14 +1785,13 @@ class Element(Plugin):
self.warn("Not pushing tainted artifact.")
return False
display_key = self._get_brief_display_key()
with self.timed_activity("Pushing artifact {}".format(display_key)):
# Push all keys used for local commit
pushed = self.__artifacts.push(self, self.__get_cache_keys_for_commit())
if not pushed:
return False
# Notify successful upload
display_key = self._get_brief_display_key()
self.info("Pushed artifact {}".format(display_key))
return True
......
@@ -356,4 +356,5 @@ def test_pull_missing_notifies_user(caplog, cli, tmpdir, datafiles):
     assert not result.get_pulled_elements(), \
         "No elements should have been pulled since the cache was empty"

-    assert "SKIPPED Remote ({}) does not have".format(share.repo) in result.stderr
+    assert "INFO Remote ({}) does not have".format(share.repo) in result.stderr
+    assert "SKIPPED Pull" in result.stderr
@@ -386,3 +386,26 @@ def test_push_cross_junction(cli, tmpdir, datafiles):
     cache_key = cli.get_element_key(project, 'junction.bst:import-etc.bst')
     assert share.has_artifact('subtest', 'import-etc.bst', cache_key)
+
+
+@pytest.mark.datafiles(DATA_DIR)
+def test_push_already_cached(caplog, cli, tmpdir, datafiles):
+    project = os.path.join(datafiles.dirname, datafiles.basename)
+    caplog.set_level(1)
+
+    with create_artifact_share(os.path.join(str(tmpdir), 'artifactshare')) as share:
+
+        cli.configure({
+            'artifacts': {'url': share.repo, 'push': True}
+        })
+        result = cli.run(project=project, args=['build', 'target.bst'])
+
+        result.assert_success()
+        assert "SKIPPED Push" not in result.stderr
+
+        result = cli.run(project=project, args=['push', 'target.bst'])
+
+        result.assert_success()
+        assert not result.get_pushed_elements(), "No elements should have been pushed since the cache was populated"
+        assert "INFO Remote ({}) already has ".format(share.repo) in result.stderr
+        assert "SKIPPED Push" in result.stderr
@@ -178,7 +178,7 @@ class Result():
         return list(pushed)

     def get_pulled_elements(self):
-        pulled = re.findall(r'\[\s*pull:(\S+)\s*\]\s*INFO\s*Downloaded artifact', self.stderr)
+        pulled = re.findall(r'\[\s*pull:(\S+)\s*\]\s*INFO\s*Pulled artifact', self.stderr)
         if pulled is None:
             return []
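The updated pattern keys on the new "Pulled artifact" phrase. A quick check against a synthetic log line (the exact log format here is assumed for illustration, inferred from the regex rather than from real BuildStream output):

```python
import re

# Synthetic stderr line in the shape the regex expects.
stderr = "[ pull:core/base.bst ] INFO    Pulled artifact 3e19f0b1"

pulled = re.findall(r'\[\s*pull:(\S+)\s*\]\s*INFO\s*Pulled artifact', stderr)
print(pulled)  # ['core/base.bst']
```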