Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Commits on Source (3)
@@ -45,7 +45,7 @@ from .. import _yaml
 _MAX_PAYLOAD_BYTES = 1024 * 1024
 
-class CASRemoteSpec(namedtuple('CASRemoteSpec', 'url push server_cert client_key client_cert')):
+class CASRemoteSpec(namedtuple('CASRemoteSpec', 'url push server_cert client_key client_cert instance_name')):
 
     # _new_from_config_node
     #
@@ -53,7 +53,7 @@ class CASRemoteSpec(namedtuple('CASRemoteSpec', 'url push server_cert client_key
     #
     @staticmethod
     def _new_from_config_node(spec_node, basedir=None):
-        _yaml.node_validate(spec_node, ['url', 'push', 'server-cert', 'client-key', 'client-cert'])
+        _yaml.node_validate(spec_node, ['url', 'push', 'server-cert', 'client-key', 'client-cert', 'instance_name'])
         url = _yaml.node_get(spec_node, str, 'url')
         push = _yaml.node_get(spec_node, bool, 'push', default_value=False)
         if not url:
@@ -61,6 +61,8 @@ class CASRemoteSpec(namedtuple('CASRemoteSpec', 'url push server_cert client_key
             raise LoadError(LoadErrorReason.INVALID_DATA,
                             "{}: empty artifact cache URL".format(provenance))
 
+        instance_name = _yaml.node_get(spec_node, str, 'instance_name', default_value=None)
+
         server_cert = _yaml.node_get(spec_node, str, 'server-cert', default_value=None)
         if server_cert and basedir:
             server_cert = os.path.join(basedir, server_cert)
@@ -83,10 +85,10 @@ class CASRemoteSpec(namedtuple('CASRemoteSpec', 'url push server_cert client_key
             raise LoadError(LoadErrorReason.INVALID_DATA,
                             "{}: 'client-cert' was specified without 'client-key'".format(provenance))
 
-        return CASRemoteSpec(url, push, server_cert, client_key, client_cert)
+        return CASRemoteSpec(url, push, server_cert, client_key, client_cert, instance_name)
 
 
-CASRemoteSpec.__new__.__defaults__ = (None, None, None)
+CASRemoteSpec.__new__.__defaults__ = (None, None, None, None)
 
 
 class BlobNotFound(CASError):
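The spec tuple gains a sixth, optional field here. A minimal standalone sketch of how the extended namedtuple behaves (the URL and the 'main' instance name are invented for illustration):

    from collections import namedtuple

    CASRemoteSpec = namedtuple('CASRemoteSpec', 'url push server_cert client_key client_cert instance_name')
    # The defaults cover the last four fields, so instance_name is optional
    # and existing call sites that omit it simply get None for it.
    CASRemoteSpec.__new__.__defaults__ = (None, None, None, None)

    spec = CASRemoteSpec('https://cache.example.com:11002', push=True)
    assert spec.instance_name is None

    spec = CASRemoteSpec('https://cache.example.com:11002', push=True, instance_name='main')
    assert spec.instance_name == 'main'

Because the new field is appended last and defaulted, configurations that never mention instance_name keep their previous behaviour.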
@@ -248,7 +250,7 @@ class CASCache():
            remote = CASRemote(remote_spec)
            remote.init()

-           request = buildstream_pb2.StatusRequest()
+           request = buildstream_pb2.StatusRequest(instance_name=remote_spec.instance_name)
            response = remote.ref_storage.Status(request)

            if remote_spec.push and not response.allow_updates:
@@ -284,7 +286,7 @@ class CASCache():
        try:
            remote.init()

-           request = buildstream_pb2.GetReferenceRequest()
+           request = buildstream_pb2.GetReferenceRequest(instance_name=remote.spec.instance_name)
            request.key = ref
            response = remote.ref_storage.GetReference(request)
@@ -369,7 +371,7 @@ class CASCache():
            # Check whether ref is already on the server in which case
            # there is no need to push the ref
            try:
-               request = buildstream_pb2.GetReferenceRequest()
+               request = buildstream_pb2.GetReferenceRequest(instance_name=remote.spec.instance_name)
                request.key = ref
                response = remote.ref_storage.GetReference(request)
@@ -384,7 +386,7 @@ class CASCache():
            self._send_directory(remote, tree)

-           request = buildstream_pb2.UpdateReferenceRequest()
+           request = buildstream_pb2.UpdateReferenceRequest(instance_name=remote.spec.instance_name)
            request.keys.append(ref)
            request.digest.hash = tree.hash
            request.digest.size_bytes = tree.size_bytes
@@ -448,7 +450,7 @@ class CASCache():
    def verify_digest_on_remote(self, remote, digest):
        remote.init()

-       request = remote_execution_pb2.FindMissingBlobsRequest()
+       request = remote_execution_pb2.FindMissingBlobsRequest(instance_name=remote.spec.instance_name)
        request.blob_digests.extend([digest])

        response = remote.cas.FindMissingBlobs(request)
@@ -908,7 +910,10 @@ class CASCache():
            yield from self._required_blobs(dirnode.digest)

    def _fetch_blob(self, remote, digest, stream):
-       resource_name = '/'.join(['blobs', digest.hash, str(digest.size_bytes)])
+       resource_name_components = ['blobs', digest.hash, str(digest.size_bytes)]
+       if remote.spec.instance_name:
+           resource_name_components.insert(0, remote.spec.instance_name)
+       resource_name = '/'.join(resource_name_components)

        request = bytestream_pb2.ReadRequest()
        request.resource_name = resource_name
        request.read_offset = 0
@@ -1064,8 +1069,11 @@ class CASCache():
        return dirdigest

    def _send_blob(self, remote, digest, stream, u_uid=uuid.uuid4()):
-       resource_name = '/'.join(['uploads', str(u_uid), 'blobs',
-                                 digest.hash, str(digest.size_bytes)])
+       resource_name_components = ['uploads', str(u_uid), 'blobs',
+                                   digest.hash, str(digest.size_bytes)]
+       if remote.spec.instance_name:
+           resource_name_components.insert(0, remote.spec.instance_name)
+       resource_name = '/'.join(resource_name_components)

        def request_stream(resname, instream):
            offset = 0
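Both the download and the upload path now build the ByteStream resource name from a component list so that the instance name, when configured, can be prefixed. A standalone sketch of the resulting names, using hypothetical helper names, a shortened fake digest hash, and the invented instance name 'main':

    import uuid

    def read_resource_name(digest_hash, size_bytes, instance_name=None):
        components = ['blobs', digest_hash, str(size_bytes)]
        if instance_name:
            components.insert(0, instance_name)
        return '/'.join(components)

    def upload_resource_name(u_uid, digest_hash, size_bytes, instance_name=None):
        components = ['uploads', str(u_uid), 'blobs', digest_hash, str(size_bytes)]
        if instance_name:
            components.insert(0, instance_name)
        return '/'.join(components)

    # Without an instance name the names are identical to the old behaviour:
    print(read_resource_name('8d7a19...', 4096))
    # blobs/8d7a19.../4096
    print(read_resource_name('8d7a19...', 4096, instance_name='main'))
    # main/blobs/8d7a19.../4096
    print(upload_resource_name(uuid.uuid4(), '8d7a19...', 4096, instance_name='main'))
    # main/uploads/<uuid4>/blobs/8d7a19.../4096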
@@ -1097,7 +1105,7 @@ class CASCache():
        missing_blobs = dict()
        # Limit size of FindMissingBlobs request
        for required_blobs_group in _grouper(required_blobs, 512):
-           request = remote_execution_pb2.FindMissingBlobsRequest()
+           request = remote_execution_pb2.FindMissingBlobsRequest(instance_name=remote.spec.instance_name)

            for required_digest in required_blobs_group:
                d = request.blob_digests.add()
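The loop above keeps each FindMissingBlobsRequest under the payload limit by querying the required blobs in groups of 512. An illustrative chunking helper in the same spirit as the _grouper() call (a sketch, not necessarily the exact helper defined in this module):

    import itertools

    def grouper(iterable, size):
        # Yield successive lists of at most `size` items from `iterable`.
        iterator = iter(iterable)
        while True:
            chunk = list(itertools.islice(iterator, size))
            if not chunk:
                return
            yield chunk

    # Each chunk would back one FindMissingBlobsRequest:
    print(list(grouper(range(5), 2)))   # [[0, 1], [2, 3], [4]]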
@@ -1193,7 +1201,7 @@ class CASRemote():
            self.max_batch_total_size_bytes = _MAX_PAYLOAD_BYTES
            try:
-               request = remote_execution_pb2.GetCapabilitiesRequest()
+               request = remote_execution_pb2.GetCapabilitiesRequest(instance_name=self.spec.instance_name)
                response = self.capabilities.GetCapabilities(request)
                server_max_batch_total_size_bytes = response.cache_capabilities.max_batch_total_size_bytes
                if 0 < server_max_batch_total_size_bytes < self.max_batch_total_size_bytes:
@@ -1206,7 +1214,7 @@ class CASRemote():
            # Check whether the server supports BatchReadBlobs()
            self.batch_read_supported = False
            try:
-               request = remote_execution_pb2.BatchReadBlobsRequest()
+               request = remote_execution_pb2.BatchReadBlobsRequest(instance_name=self.spec.instance_name)
                response = self.cas.BatchReadBlobs(request)
                self.batch_read_supported = True
            except grpc.RpcError as e:
@@ -1216,7 +1224,7 @@ class CASRemote():
            # Check whether the server supports BatchUpdateBlobs()
            self.batch_update_supported = False
            try:
-               request = remote_execution_pb2.BatchUpdateBlobsRequest()
+               request = remote_execution_pb2.BatchUpdateBlobsRequest(instance_name=self.spec.instance_name)
                response = self.cas.BatchUpdateBlobs(request)
                self.batch_update_supported = True
            except grpc.RpcError as e:
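These two hunks keep the existing probe-and-fallback pattern: an empty batch request is sent once at init time, and a server that answers UNIMPLEMENTED simply has batching disabled for it. A hedged sketch of that pattern (the helper name is invented; grpc and the generated request types are the ones already used in this file):

    import grpc

    def rpc_supported(call, request):
        # Return True if the unary RPC accepts the request, False if the
        # server answers UNIMPLEMENTED; any other error is re-raised.
        try:
            call(request)
            return True
        except grpc.RpcError as e:
            if e.code() == grpc.StatusCode.UNIMPLEMENTED:
                return False
            raise

    # Roughly equivalent to the probe above:
    # batch_read_supported = rpc_supported(
    #     self.cas.BatchReadBlobs,
    #     remote_execution_pb2.BatchReadBlobsRequest(instance_name=self.spec.instance_name))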
@@ -1233,7 +1241,7 @@ class _CASBatchRead():
    def __init__(self, remote):
        self._remote = remote
        self._max_total_size_bytes = remote.max_batch_total_size_bytes
-       self._request = remote_execution_pb2.BatchReadBlobsRequest()
+       self._request = remote_execution_pb2.BatchReadBlobsRequest(instance_name=remote.spec.instance_name)
        self._size = 0
        self._sent = False
@@ -1280,7 +1288,7 @@ class _CASBatchUpdate():
    def __init__(self, remote):
        self._remote = remote
        self._max_total_size_bytes = remote.max_batch_total_size_bytes
-       self._request = remote_execution_pb2.BatchUpdateBlobsRequest()
+       self._request = remote_execution_pb2.BatchUpdateBlobsRequest(instance_name=remote.spec.instance_name)
        self._size = 0
        self._sent = False
@@ -61,15 +61,20 @@ class SandboxRemote(Sandbox):
        self.storage_url = config.storage_service['url']
        self.exec_url = config.exec_service['url']

        if config.action_service:
            self.action_url = config.action_service['url']
        else:
            self.action_url = None

+       self.server_instance = config.exec_service.get('instance', None)
+       self.storage_instance = config.storage_service.get('instance', None)
+
        self.storage_remote_spec = CASRemoteSpec(self.storage_url, push=True,
                                                 server_cert=config.storage_service['server-cert'],
                                                 client_key=config.storage_service['client-key'],
-                                                client_cert=config.storage_service['client-cert'])
+                                                client_cert=config.storage_service['client-cert'],
+                                                instance_name=self.storage_instance)

        self.operation_name = None

    def info(self, msg):
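On the sandbox side the two new attributes are read from the parsed service configuration with .get('instance', None), so projects that never set an instance keep the previous behaviour, and the storage instance is forwarded into the CASRemoteSpec through the new instance_name keyword. A minimal sketch with plain dicts standing in for the parsed YAML nodes (the URLs and the 'main' instance name are invented):

    config_exec_service = {'url': 'http://execution.example.com:50051', 'instance': 'main'}
    config_storage_service = {'url': 'https://storage.example.com:11002'}

    server_instance = config_exec_service.get('instance', None)      # 'main'
    storage_instance = config_storage_service.get('instance', None)  # None, i.e. left unset

The validation hunk below correspondingly allows an 'instance' key next to 'url' on the execution-service node.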
@@ -104,7 +109,7 @@ class SandboxRemote(Sandbox):
        remote_exec_storage_config = require_node(remote_config, 'storage-service')
        remote_exec_action_config = remote_config.get('action-cache-service')

-       _yaml.node_validate(remote_exec_service_config, ['url'])
+       _yaml.node_validate(remote_exec_service_config, ['url', 'instance'])
        _yaml.node_validate(remote_exec_storage_config, ['url'] + tls_keys)
        if remote_exec_action_config:
            _yaml.node_validate(remote_exec_action_config, ['url'])
@@ -142,7 +147,8 @@ class SandboxRemote(Sandbox):
        # Try to create a communication channel to the BuildGrid server.
        stub = remote_execution_pb2_grpc.ExecutionStub(channel)
-       request = remote_execution_pb2.ExecuteRequest(action_digest=action_digest,
+       request = remote_execution_pb2.ExecuteRequest(instance_name=self.server_instance,
+                                                     action_digest=action_digest,
                                                      skip_cache_lookup=False)
......