Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Commits on Source (38)
Showing 494 additions and 115 deletions
......@@ -15,6 +15,7 @@ tmp
.coverage
.coverage.*
.cache
.pytest_cache/
*.bst/
# Pycache, in case buildstream is run directly from within the source
......
......@@ -23,6 +23,11 @@ a reasonable timeframe for identifying these.
Patch submissions
-----------------
If you want to submit a patch, please ask for developer permissions on our
IRC channel first (GitLab's request button also works, but you may need to
draw our attention to it, as we often overlook these); for CI reasons, it is
much easier if patches are in branches of the main repository.
Branches must be submitted as merge requests on GitLab. If the branch
fixes an issue or is related to any issues, these issues must be mentioned
in the merge request or preferably the commit messages themselves.
......
=================
buildstream 1.3.1
buildstream 1.1.5
=================
o Add a `--tar` option to `bst checkout` which allows a tarball to be
created from the artifact contents.
o Fetching and tracking will consult mirrors defined in project config,
and the preferred mirror to fetch from can be defined in the command
line or user config.
o Added new `remote` source plugin for downloading file blobs
=================
buildstream 1.1.4
=================
......
......@@ -25,7 +25,7 @@ BuildStream offers the following advantages:
* **Declarative build instructions/definitions**
BuildStream provides a a flexible and extensible framework for the modelling
BuildStream provides a flexible and extensible framework for the modelling
of software build pipelines in a declarative YAML format, which allows you to
manipulate filesystem data in a controlled, reproducible sandboxed environment.
......@@ -61,25 +61,29 @@ How does BuildStream work?
==========================
BuildStream operates on a set of YAML files (.bst files), as follows:
* loads the YAML files which describe the target(s) and all dependencies
* evaluates the version information and build instructions to calculate a build
* Loads the YAML files which describe the target(s) and all dependencies.
* Evaluates the version information and build instructions to calculate a build
graph for the target(s) and all dependencies and unique cache-keys for each
element
* retrieves elements from cache if they are already built, or builds them in a
sandboxed environment using the instructions declared in the .bst files
* transforms/configures and/or deploys the resulting target(s) based on the
element.
* Retrieves previously built elements (artifacts) from a local/remote cache, or
builds the elements in a sandboxed environment using the instructions declared
in the .bst files.
* Transforms/configures and/or deploys the resulting target(s) based on the
instructions declared in the .bst files.
How can I get started?
======================
The easiest way to get started is to explore some existing .bst files, for example:
To start using BuildStream, first,
`install <https://buildstream.gitlab.io/buildstream/main_install.html>`_
BuildStream onto your machine and then follow our
`tutorial <https://buildstream.gitlab.io/buildstream/using_tutorial.html>`_.
We also recommend exploring some existing BuildStream projects:
* https://gitlab.gnome.org/GNOME/gnome-build-meta/
* https://gitlab.com/freedesktop-sdk/freedesktop-sdk
* https://gitlab.com/baserock/definitions
* https://gitlab.com/BuildStream/buildstream-examples/tree/master/build-x86image
* https://gitlab.com/BuildStream/buildstream-examples/tree/master/netsurf-flatpak
If you have any questions please ask on our `#buildstream <irc://irc.gnome.org/buildstream>`_ channel in `irc.gnome.org <irc://irc.gnome.org>`_
......@@ -29,7 +29,7 @@ if "_BST_COMPLETION" not in os.environ:
from .utils import UtilError, ProgramNotFoundError
from .sandbox import Sandbox, SandboxFlags
from .plugin import Plugin
from .source import Source, SourceError, Consistency
from .source import Source, SourceError, Consistency, SourceFetcher
from .element import Element, ElementError, Scope
from .buildelement import BuildElement
from .scriptelement import ScriptElement
......@@ -197,29 +197,55 @@ class Context():
"\nValid values are, for example: 800M 10G 1T 50%\n"
.format(str(e))) from e
# If we are asked not to set a quota, we set it to the maximum
# disk space available minus a headroom of 2GB, such that we
# at least try to avoid raising Exceptions.
# Headroom intended to give BuildStream a bit of leeway.
# This acts as the minimum size of cache_quota and also
# is taken from the user requested cache_quota.
#
# Of course, we might still end up running out during a build
# if we end up writing more than 2G, but hey, this stuff is
# already really fuzzy.
#
if cache_quota is None:
stat = os.statvfs(artifactdir_volume)
# Again, the artifact directory may not yet have been
# created
if not os.path.exists(self.artifactdir):
cache_size = 0
else:
cache_size = utils._get_dir_size(self.artifactdir)
cache_quota = cache_size + stat.f_bsize * stat.f_bavail
if 'BST_TEST_SUITE' in os.environ:
headroom = 0
else:
headroom = 2e9
stat = os.statvfs(artifactdir_volume)
available_space = (stat.f_bsize * stat.f_bavail)
# Again, the artifact directory may not have been created yet
#
if not os.path.exists(self.artifactdir):
cache_size = 0
else:
cache_size = utils._get_dir_size(self.artifactdir)
# Ensure system has enough storage for the cache_quota
#
# If cache_quota is None, set it to the maximum it could possibly be.
#
# Also check that cache_quota is at least as large as our headroom.
#
if cache_quota is None: # Infinity, set to max system storage
cache_quota = cache_size + available_space
if cache_quota < headroom: # Check minimum
raise LoadError(LoadErrorReason.INVALID_DATA,
"Invalid cache quota ({}): ".format(utils._pretty_size(cache_quota)) +
"BuildStream requires a minimum cache quota of 2G.")
elif cache_quota > cache_size + available_space: # Check maximum
raise LoadError(LoadErrorReason.INVALID_DATA,
("Your system does not have enough available " +
"space to support the cache quota specified.\n" +
"You currently have:\n" +
"- {used} of cache in use at {local_cache_path}\n" +
"- {available} of available system storage").format(
used=utils._pretty_size(cache_size),
local_cache_path=self.artifactdir,
available=utils._pretty_size(available_space)))
# Place a slight headroom (2e9 (2GB) on the cache_quota) into
# cache_quota to try and avoid exceptions.
#
# Of course, we might still end up running out during a build
# if we end up writing more than 2G, but hey, this stuff is
# already really fuzzy.
#
self.cache_quota = cache_quota - headroom
self.cache_lower_threshold = self.cache_quota / 2
......@@ -259,7 +285,7 @@ class Context():
# Shallow validation of overrides, parts of buildstream which rely
# on the overrides are expected to validate elsewhere.
for _, overrides in _yaml.node_items(self._project_overrides):
_yaml.node_validate(overrides, ['artifacts', 'options', 'strict'])
_yaml.node_validate(overrides, ['artifacts', 'options', 'strict', 'default-mirror'])
profile_end(Topics.LOAD_CONTEXT, 'load')
......
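The quota logic in the hunk above can be sketched as a standalone function; this is a minimal illustration with a hypothetical name (`compute_cache_quota`), assuming the same 2 GB headroom and the same minimum/maximum checks:

```python
HEADROOM = 2e9  # 2 GB of leeway, as in the hunk above

# Hypothetical standalone sketch of the quota calculation: if no quota is
# configured, use everything currently cached plus all remaining free space,
# then subtract the headroom so builds are less likely to run out of disk.
def compute_cache_quota(cache_size, available_space, requested_quota=None,
                        headroom=HEADROOM):
    quota = requested_quota
    if quota is None:  # "infinity": cap at what the volume can hold
        quota = cache_size + available_space
    if quota < headroom:
        raise ValueError("cache quota below the 2G minimum")
    if quota > cache_size + available_space:
        raise ValueError("not enough available storage for this quota")
    return quota - headroom

# Example with made-up numbers: 1G cached, 50G free, no explicit quota.
print(compute_cache_quota(1e9, 50e9))
```

In the real code `cache_size` comes from `utils._get_dir_size(self.artifactdir)` and `available_space` from `os.statvfs`.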
......@@ -202,7 +202,8 @@ class App():
# Load the Project
#
try:
self.project = Project(directory, self.context, cli_options=self._main_options['option'])
self.project = Project(directory, self.context, cli_options=self._main_options['option'],
default_mirror=self._main_options.get('default_mirror'))
except LoadError as e:
# Let's automatically start a `bst init` session in this case
......
......@@ -217,6 +217,8 @@ def print_version(ctx, param, value):
help="Elements must be rebuilt when their dependencies have changed")
@click.option('--option', '-o', type=click.Tuple([str, str]), multiple=True, metavar='OPTION VALUE',
help="Specify a project option")
@click.option('--default-mirror', default=None,
help="The mirror to fetch from first, before attempting other mirrors")
@click.pass_context
def cli(context, **kwargs):
"""Build and manipulate BuildStream projects
......
......@@ -522,12 +522,15 @@ class LogLine(Widget):
text += "\n\n"
if self._failure_messages:
text += self.content_profile.fmt("Failure Summary\n", bold=True)
values = OrderedDict()
for element, messages in sorted(self._failure_messages.items(), key=lambda x: x[0].name):
values[element.name] = ''.join(self._render(v) for v in messages)
text += self._format_values(values, style_value=False)
for queue in stream.queues:
if any(el.name == element.name for el in queue.failed_elements):
values[element.name] = ''.join(self._render(v) for v in messages)
if values:
text += self.content_profile.fmt("Failure Summary\n", bold=True)
text += self._format_values(values, style_value=False)
text += self.content_profile.fmt("Pipeline Summary\n", bold=True)
values = OrderedDict()
......
......@@ -513,7 +513,7 @@ class Loader():
if self._fetch_subprojects:
if ticker:
ticker(filename, 'Fetching subproject from {} source'.format(source.get_kind()))
source.fetch()
source._fetch()
else:
detail = "Try fetching the project with `bst fetch {}`".format(filename)
raise LoadError(LoadErrorReason.SUBPROJECT_FETCH_NEEDED,
......
......@@ -19,7 +19,7 @@
import os
import multiprocessing # for cpu_count()
from collections import Mapping
from collections import Mapping, OrderedDict
from pluginbase import PluginBase
from . import utils
from . import _cachekey
......@@ -35,9 +35,6 @@ from ._projectrefs import ProjectRefs, ProjectRefStorage
from ._versions import BST_FORMAT_VERSION
# The separator we use for user specified aliases
_ALIAS_SEPARATOR = ':'
# Project Configuration file
_PROJECT_CONF_FILE = 'project.conf'
......@@ -70,7 +67,7 @@ class HostMount():
#
class Project():
def __init__(self, directory, context, *, junction=None, cli_options=None):
def __init__(self, directory, context, *, junction=None, cli_options=None, default_mirror=None):
# The project name
self.name = None
......@@ -94,6 +91,8 @@ class Project():
self.base_env_nocache = None # The base nocache mask (list) for the environment
self.element_overrides = {} # Element specific configurations
self.source_overrides = {} # Source specific configurations
self.mirrors = OrderedDict() # contains dicts of alias-mappings to URIs.
self.default_mirror = default_mirror # The name of the preferred mirror.
#
# Private Members
......@@ -133,8 +132,8 @@ class Project():
# fully qualified urls based on the shorthand which is allowed
# to be specified in the YAML
def translate_url(self, url):
if url and _ALIAS_SEPARATOR in url:
url_alias, url_body = url.split(_ALIAS_SEPARATOR, 1)
if url and utils._ALIAS_SEPARATOR in url:
url_alias, url_body = url.split(utils._ALIAS_SEPARATOR, 1)
alias_url = self._aliases.get(url_alias)
if alias_url:
url = alias_url + url_body
......@@ -202,6 +201,36 @@ class Project():
self._assert_plugin_format(source, version)
return source
# get_alias_uri()
#
# Returns the URI for a given alias, if it exists
#
# Args:
# alias (str): The alias.
#
# Returns:
# str: The URI for the given alias; or None: if there is no URI for
# that alias.
def get_alias_uri(self, alias):
return self._aliases.get(alias)
# get_alias_uris()
#
# Returns a list of every URI to replace an alias with
def get_alias_uris(self, alias):
if not alias or alias not in self._aliases:
return [None]
mirror_list = []
for key, alias_mapping in self.mirrors.items():
if alias in alias_mapping:
if key == self.default_mirror:
mirror_list = alias_mapping[alias] + mirror_list
else:
mirror_list += alias_mapping[alias]
mirror_list.append(self._aliases[alias])
return mirror_list
# _load():
#
# Loads the project configuration file in the project directory.
......@@ -249,7 +278,7 @@ class Project():
'aliases', 'name',
'artifacts', 'options',
'fail-on-overlap', 'shell',
'ref-storage', 'sandbox'
'ref-storage', 'sandbox', 'mirrors',
])
# The project name, element path and option declarations
......@@ -290,6 +319,10 @@ class Project():
#
self.options.process_node(config)
# Override default_mirror if not set by command-line
if not self.default_mirror:
self.default_mirror = _yaml.node_get(overrides, str, 'default-mirror', default_value=None)
#
# Now all YAML composition is done, from here on we just load
# the values from our loaded configuration dictionary.
......@@ -414,6 +447,21 @@ class Project():
self._shell_host_files.append(mount)
mirrors = _yaml.node_get(config, list, 'mirrors', default_value=[])
for mirror in mirrors:
allowed_mirror_fields = [
'name', 'aliases'
]
_yaml.node_validate(mirror, allowed_mirror_fields)
mirror_name = _yaml.node_get(mirror, str, 'name')
alias_mappings = {}
for alias_mapping, uris in _yaml.node_items(mirror['aliases']):
assert isinstance(uris, list)
alias_mappings[alias_mapping] = list(uris)
self.mirrors[mirror_name] = alias_mappings
if not self.default_mirror:
self.default_mirror = mirror_name
# _assert_plugin_format()
#
# Helper to raise a PluginError if the loaded plugin is of a lesser version than
......
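The ordering implemented by `get_alias_uris()` above can be shown in isolation; a sketch with invented mirror data in the same shape `Project` builds from `project.conf` (the default mirror's URIs come first, the project's own alias URI is tried last):

```python
from collections import OrderedDict

# Invented example data: mirror name -> {alias -> list of substitute URIs}
mirrors = OrderedDict([
    ('oz', {'upstream': ['https://oz.example.com/']}),
    ('middle-earth', {'upstream': ['https://me.example.com/']}),
])
aliases = {'upstream': 'https://example.com/'}  # the project's own mapping
default_mirror = 'middle-earth'

def get_alias_uris(alias):
    if not alias or alias not in aliases:
        return [None]
    uris = []
    for name, mapping in mirrors.items():
        if alias in mapping:
            if name == default_mirror:
                uris = mapping[alias] + uris   # preferred mirror goes first
            else:
                uris += mapping[alias]
    uris.append(aliases[alias])                # project default tried last
    return uris

print(get_alias_uris('upstream'))
```

This ordering is what lets `Source._fetch()` try the preferred mirror before any others, falling back to the upstream alias URI only at the end.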
......@@ -23,7 +23,7 @@
# This version is bumped whenever enhancements are made
# to the `project.conf` format or the core element format.
#
BST_FORMAT_VERSION = 10
BST_FORMAT_VERSION = 11
# The base BuildStream artifact version
......
#
# Copyright (C) 2016 Codethink Limited
# Copyright (C) 2018 Bloomberg Finance LP
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
......@@ -204,8 +205,9 @@ class BuildElement(Element):
def prepare(self, sandbox):
commands = self.__commands['configure-commands']
if commands:
for cmd in commands:
self.__run_command(sandbox, cmd, 'configure-commands')
with self.timed_activity("Running configure-commands"):
for cmd in commands:
self.__run_command(sandbox, cmd, 'configure-commands')
def generate_script(self):
script = ""
......@@ -231,13 +233,12 @@ class BuildElement(Element):
return commands
def __run_command(self, sandbox, cmd, cmd_name):
with self.timed_activity("Running {}".format(cmd_name)):
self.status("Running {}".format(cmd_name), detail=cmd)
# Note the -e switch to 'sh' means to exit with an error
# if any untested command fails.
#
exitcode = sandbox.run(['sh', '-c', '-e', cmd + '\n'],
SandboxFlags.ROOT_READ_ONLY)
if exitcode != 0:
raise ElementError("Command '{}' failed with exitcode {}".format(cmd, exitcode))
self.status("Running {}".format(cmd_name), detail=cmd)
# Note the -e switch to 'sh' means to exit with an error
# if any untested command fails.
#
exitcode = sandbox.run(['sh', '-c', '-e', cmd + '\n'],
SandboxFlags.ROOT_READ_ONLY)
if exitcode != 0:
raise ElementError("Command '{}' failed with exitcode {}".format(cmd, exitcode))
......@@ -102,7 +102,7 @@ class BzrSource(Source):
def track(self):
with self.timed_activity("Tracking {}".format(self.url),
silent_nested=True):
self._ensure_mirror()
self._ensure_mirror(skip_ref_check=True)
ret, out = self.check_output([self.host_bzr, "version-info",
"--custom", "--template={revno}",
self._get_branch_dir()],
......@@ -214,7 +214,7 @@ class BzrSource(Source):
yield repodir
self._atomic_replace_mirrordir(repodir)
def _ensure_mirror(self):
def _ensure_mirror(self, skip_ref_check=False):
with self._atomic_repodir() as repodir:
# Initialize repo if no metadata
bzr_metadata_dir = os.path.join(repodir, ".bzr")
......@@ -223,18 +223,21 @@ class BzrSource(Source):
fail="Failed to initialize bzr repository")
branch_dir = os.path.join(repodir, self.tracking)
branch_url = self.url + "/" + self.tracking
if not os.path.exists(branch_dir):
# `bzr branch` the branch if it doesn't exist
# to get the upstream code
branch_url = self.url + "/" + self.tracking
self.call([self.host_bzr, "branch", branch_url, branch_dir],
fail="Failed to branch from {} to {}".format(branch_url, branch_dir))
else:
# `bzr pull` the branch if it does exist
# to get any changes to the upstream code
self.call([self.host_bzr, "pull", "--directory={}".format(branch_dir)],
self.call([self.host_bzr, "pull", "--directory={}".format(branch_dir), branch_url],
fail="Failed to pull new changes for {}".format(branch_dir))
if not skip_ref_check and not self._check_ref():
raise SourceError("Failed to ensure ref '{}' was mirrored".format(self.ref),
reason="ref-not-mirrored")
def setup():
......
......@@ -71,6 +71,7 @@ git - stage files from a git repository
"""
import os
import errno
import re
import shutil
from collections import Mapping
......@@ -78,7 +79,7 @@ from io import StringIO
from configparser import RawConfigParser
from buildstream import Source, SourceError, Consistency
from buildstream import Source, SourceError, Consistency, SourceFetcher
from buildstream import utils
GIT_MODULES = '.gitmodules'
......@@ -88,18 +89,20 @@ GIT_MODULES = '.gitmodules'
# for the primary git source and also for each submodule it
# might have at a given time
#
class GitMirror():
class GitMirror(SourceFetcher):
def __init__(self, source, path, url, ref):
super().__init__()
self.source = source
self.path = path
self.url = source.translate_url(url)
self.url = url
self.ref = ref
self.mirror = os.path.join(source.get_mirror_directory(), utils.url_directory_name(self.url))
self.mirror = os.path.join(source.get_mirror_directory(), utils.url_directory_name(url))
self.mark_download_url(url)
# Ensures that the mirror exists
def ensure(self):
def ensure(self, alias_override=None):
# Unfortunately, git does not know how to only clone just a specific ref,
# so we have to download all of those gigs even if we only need a couple
......@@ -112,22 +115,57 @@ class GitMirror():
# system configured tmpdir is not on the same partition.
#
with self.source.tempdir() as tmpdir:
self.source.call([self.source.host_git, 'clone', '--mirror', '-n', self.url, tmpdir],
fail="Failed to clone git repository {}".format(self.url),
url = self.source.translate_url(self.url, alias_override=alias_override)
self.source.call([self.source.host_git, 'clone', '--mirror', '-n', url, tmpdir],
fail="Failed to clone git repository {}".format(url),
fail_temporarily=True)
# Attempt atomic rename into destination, this will fail if
# another process beat us to the punch
try:
shutil.move(tmpdir, self.mirror)
except (shutil.Error, OSError) as e:
raise SourceError("{}: Failed to move cloned git repository {} from '{}' to '{}'"
.format(self.source, self.url, tmpdir, self.mirror)) from e
def fetch(self):
self.source.call([self.source.host_git, 'fetch', 'origin', '--prune'],
fail="Failed to fetch from remote git repository: {}".format(self.url),
os.rename(tmpdir, self.mirror)
except OSError as e:
# When renaming and the destination repo already exists, os.rename()
# will fail with ENOTEMPTY, since an empty directory will be silently
# replaced
if e.errno == errno.ENOTEMPTY:
self.source.status("{}: Discarding duplicate clone of {}"
.format(self.source, url))
else:
raise SourceError("{}: Failed to move cloned git repository {} from '{}' to '{}': {}"
.format(self.source, url, tmpdir, self.mirror, e)) from e
def _fetch(self, alias_override=None):
url = self.source.translate_url(self.url, alias_override=alias_override)
if alias_override:
remote_name = utils.url_directory_name(alias_override)
_, remotes = self.source.check_output(
[self.source.host_git, 'remote'],
fail="Failed to retrieve list of remotes in {}".format(self.mirror),
cwd=self.mirror
)
if remote_name not in remotes:
self.source.call(
[self.source.host_git, 'remote', 'add', remote_name, url],
fail="Failed to add remote {} with url {}".format(remote_name, url),
cwd=self.mirror
)
else:
remote_name = "origin"
self.source.call([self.source.host_git, 'fetch', remote_name, '--prune'],
fail="Failed to fetch from remote git repository: {}".format(url),
fail_temporarily=True,
cwd=self.mirror)
def fetch(self, alias_override=None):
self.ensure(alias_override)
if not self.has_ref():
self._fetch(alias_override)
self.assert_ref()
def has_ref(self):
if not self.ref:
return False
......@@ -171,13 +209,14 @@ class GitMirror():
def init_workspace(self, directory):
fullpath = os.path.join(directory, self.path)
url = self.source.translate_url(self.url)
self.source.call([self.source.host_git, 'clone', '--no-checkout', self.mirror, fullpath],
fail="Failed to clone git mirror {} in directory: {}".format(self.mirror, fullpath),
fail_temporarily=True)
self.source.call([self.source.host_git, 'remote', 'set-url', 'origin', self.url],
fail='Failed to add remote origin "{}"'.format(self.url),
self.source.call([self.source.host_git, 'remote', 'set-url', 'origin', url],
fail='Failed to add remote origin "{}"'.format(url),
cwd=fullpath)
self.source.call([self.source.host_git, 'checkout', '--force', self.ref],
......@@ -277,6 +316,8 @@ class GitSource(Source):
checkout = self.node_get_member(submodule, bool, 'checkout')
self.submodule_checkout_overrides[path] = checkout
self.mark_download_url(self.original_url)
def preflight(self):
# Check if git is installed, get the binary at the same time
self.host_git = utils.get_host_tool('git')
......@@ -328,31 +369,13 @@ class GitSource(Source):
.format(self.tracking, self.mirror.url),
silent_nested=True):
self.mirror.ensure()
self.mirror.fetch()
self.mirror._fetch()
# Update self.mirror.ref and node.ref from the self.tracking branch
ret = self.mirror.latest_commit(self.tracking)
return ret
def fetch(self):
with self.timed_activity("Fetching {}".format(self.mirror.url), silent_nested=True):
# Here we are only interested in ensuring that our mirror contains
# the self.mirror.ref commit.
self.mirror.ensure()
if not self.mirror.has_ref():
self.mirror.fetch()
self.mirror.assert_ref()
# Here after performing any fetches, we need to also ensure that
# we've cached the desired refs in our mirrors of submodules.
#
self.refresh_submodules()
self.fetch_submodules()
def init_workspace(self, directory):
# XXX: may wish to refactor this as some code dupe with stage()
self.refresh_submodules()
......@@ -384,6 +407,10 @@ class GitSource(Source):
if checkout:
mirror.stage(directory)
def get_source_fetchers(self):
self.refresh_submodules()
return [self.mirror] + self.submodules
###########################################################
# Local Functions #
###########################################################
......@@ -405,6 +432,7 @@ class GitSource(Source):
# Assumes that we have our mirror and we have the ref which we point to
#
def refresh_submodules(self):
self.mirror.ensure()
submodules = []
# XXX Here we should issue a warning if either:
......@@ -426,19 +454,6 @@ class GitSource(Source):
self.submodules = submodules
# Ensures that we have mirrored git repositories for all
# the submodules existing at the given commit of the main git source.
#
# Also ensure that these mirrors have the required commits
# referred to at the given commit of the main git source.
#
def fetch_submodules(self):
for mirror in self.submodules:
mirror.ensure()
if not mirror.has_ref():
mirror.fetch()
mirror.assert_ref()
# Plugin entry point
def setup():
......
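The per-mirror remote naming in `GitMirror._fetch()` above can be sketched in isolation. The `url_directory_name` stand-in here is hypothetical (the real `utils.url_directory_name()` may encode differently); the point is that each alias override gets a deterministic remote name, while the default URL keeps using `origin`:

```python
import urllib.parse

# Hypothetical stand-in for utils.url_directory_name(): collapse a URL into
# a single filesystem-safe path component.
def url_directory_name(url):
    return urllib.parse.quote(url, safe='')

# Per-mirror remotes, as the hunk above does: "origin" for the default URL,
# and a name derived from the override URI for each mirror, so repeated
# fetches through the same mirror reuse the same git remote.
def remote_name_for(alias_override=None):
    if alias_override:
        return url_directory_name(alias_override)
    return "origin"

print(remote_name_for())
print(remote_name_for('https://mirror.example.org/'))
```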
......@@ -65,6 +65,33 @@ these methods are mandatory to implement.
**Optional**: If left unimplemented, this will default to calling
:func:`Source.stage() <buildstream.source.Source.stage>`
* :func:`Source.get_source_fetchers() <buildstream.source.Source.get_source_fetchers>`
Get the objects that are used for fetching.
**Optional**: This only needs to be implemented for sources that need to
download from multiple URLs while fetching (e.g. a git repo and its
submodules). For details on how to define a SourceFetcher, see
:ref:`SourceFetcher <core_source_fetcher>`.
.. _core_source_fetcher:
SourceFetcher - Object for fetching individual URLs
===================================================
Abstract Methods
----------------
SourceFetchers expose the following abstract methods. Unless explicitly
mentioned, these methods are mandatory to implement.
* :func:`SourceFetcher.fetch() <buildstream.source.SourceFetcher.fetch>`
Fetches the URL associated with this SourceFetcher, optionally taking an
alias override.
"""
import os
......@@ -114,6 +141,63 @@ class SourceError(BstError):
super().__init__(message, detail=detail, domain=ErrorDomain.SOURCE, reason=reason, temporary=temporary)
class SourceFetcher():
"""SourceFetcher()
This interface exists so that a source that downloads from multiple
places (e.g. a git source with submodules) has a consistent interface for
fetching and substituting aliases.
*Since: 1.2*
"""
def __init__(self):
self.__alias = None
#############################################################
# Abstract Methods #
#############################################################
def fetch(self, alias_override=None):
"""Fetch remote sources and mirror them locally, ensuring at least
that the specific reference is cached locally.
Args:
alias_override (str): The alias to use instead of the default one
defined by the :ref:`aliases <project_source_aliases>` field
in the project's config.
Raises:
:class:`.SourceError`
Implementors should raise :class:`.SourceError` if there is some
network error or if the source reference could not be matched.
"""
raise ImplError("Source fetcher '{}' does not implement fetch()".format(type(self)))
#############################################################
# Public Methods #
#############################################################
def mark_download_url(self, url):
"""Identifies the URL that this SourceFetcher uses to download
This must be called during the fetcher's initialization
Args:
url (str): The url used to download.
"""
# Not guaranteed to be a valid alias yet.
# Ensuring it's a valid alias currently happens in Project.get_alias_uris
alias, _ = url.split(utils._ALIAS_SEPARATOR, 1)
self.__alias = alias
#############################################################
# Private Methods used in BuildStream #
#############################################################
# Returns the alias used by this fetcher
def _get_alias(self):
return self.__alias
class Source(Plugin):
"""Source()
......@@ -125,7 +209,7 @@ class Source(Plugin):
__defaults = {} # The defaults from the project
__defaults_set = False # Flag, in case there are not defaults at all
def __init__(self, context, project, meta):
def __init__(self, context, project, meta, *, alias_override=None):
provenance = _yaml.node_get_provenance(meta.config)
super().__init__("{}-{}".format(meta.element_name, meta.element_index),
context, project, provenance, "source")
......@@ -135,6 +219,11 @@ class Source(Plugin):
self.__element_kind = meta.element_kind # The kind of the element owning this source
self.__directory = meta.directory # Staging relative directory
self.__consistency = Consistency.INCONSISTENT # Cached consistency state
self.__alias_override = alias_override # Tuple of alias and its override to use instead
self.__expected_alias = None # A hacky way to store the first alias used
# FIXME: Reconstruct a MetaSource from a Source instead of storing it.
self.__meta = meta # MetaSource stored so we can copy this source later.
# Collect the composited element configuration and
# ask the element to configure itself.
......@@ -284,6 +373,36 @@ class Source(Plugin):
"""
self.stage(directory)
def mark_download_url(self, url):
"""Identifies the URL that this Source uses to download
This must be called during :func:`~buildstream.plugin.Plugin.configure` if
:func:`~buildstream.source.Source.translate_url` is not called.
Args:
url (str): The url used to download
*Since: 1.2*
"""
alias, _ = url.split(utils._ALIAS_SEPARATOR, 1)
self.__expected_alias = alias
def get_source_fetchers(self):
"""Get the objects that are used for fetching
If this source doesn't download from multiple URLs,
returning None and falling back on the default behaviour
is recommended.
Returns:
list: A list of SourceFetchers. If SourceFetchers are not supported,
this will be an empty list.
*Since: 1.2*
"""
return []
#############################################################
# Public Methods #
#############################################################
......@@ -300,18 +419,42 @@ class Source(Plugin):
os.makedirs(directory, exist_ok=True)
return directory
def translate_url(self, url):
def translate_url(self, url, *, alias_override=None):
"""Translates the given url which may be specified with an alias
into a fully qualified url.
Args:
url (str): A url, which may be using an alias
alias_override (str): Optionally, a URI to override the alias with. (*Since: 1.2*)
Returns:
str: The fully qualified url, with aliases resolved
"""
project = self._get_project()
return project.translate_url(url)
# Alias overriding can happen explicitly (by command-line) or
# implicitly (the Source being constructed with an __alias_override).
if alias_override or self.__alias_override:
url_alias, url_body = url.split(utils._ALIAS_SEPARATOR, 1)
if url_alias:
if alias_override:
url = alias_override + url_body
else:
# Implicit alias overrides may only be done for one
# specific alias, so that sources that fetch from multiple
# URLs and use different aliases default to only overriding
# one alias, rather than getting confused.
override_alias = self.__alias_override[0]
override_url = self.__alias_override[1]
if url_alias == override_alias:
url = override_url + url_body
return url
else:
# Sneakily store the alias if it hasn't already been stored
if not self.__expected_alias and url and utils._ALIAS_SEPARATOR in url:
url_alias, _ = url.split(utils._ALIAS_SEPARATOR, 1)
self.__expected_alias = url_alias
project = self._get_project()
return project.translate_url(url)
def get_project_directory(self):
"""Fetch the project base directory
......@@ -375,7 +518,45 @@ class Source(Plugin):
# Wrapper function around plugin provided fetch method
#
def _fetch(self):
self.fetch()
project = self._get_project()
source_fetchers = self.get_source_fetchers()
if source_fetchers:
for fetcher in source_fetchers:
alias = fetcher._get_alias()
success = False
for uri in project.get_alias_uris(alias):
try:
fetcher.fetch(uri)
# FIXME: Need to consider temporary vs. permanent failures,
# and how this works with retries.
except BstError as e:
last_error = e
continue
success = True
break
if not success:
raise last_error
else:
alias = self._get_alias()
if not project.mirrors or not alias:
self.fetch()
return
context = self._get_context()
source_kind = type(self)
for uri in project.get_alias_uris(alias):
new_source = source_kind(context, project, self.__meta,
alias_override=(alias, uri))
new_source._preflight()
try:
new_source.fetch()
# FIXME: Need to consider temporary vs. permanent failures,
# and how this works with retries.
except BstError as e:
last_error = e
continue
return
raise last_error
# Wrapper for stage() api which gives the source
# plugin a fully constructed path considering the
......@@ -582,7 +763,7 @@ class Source(Plugin):
# Wrapper for track()
#
def _track(self):
new_ref = self.track()
new_ref = self.__do_track()
current_ref = self.get_ref()
if new_ref is None:
......@@ -594,10 +775,48 @@ class Source(Plugin):
return new_ref
# Returns the alias if it's defined in the project
def _get_alias(self):
alias = self.__expected_alias
project = self._get_project()
if project.get_alias_uri(alias):
# The alias must already be defined in the project's aliases
# otherwise http://foo gets treated like it contains an alias
return alias
else:
return None
#############################################################
# Local Private Methods #
#############################################################
# Tries to call track for every mirror, stopping once it succeeds
def __do_track(self):
project = self._get_project()
# If there are no mirrors, or no aliases to replace, there's nothing to do here.
alias = self._get_alias()
if not project.mirrors or not alias:
return self.track()
context = self._get_context()
source_kind = type(self)
# NOTE: We are assuming here that tracking only requires substituting the
# first alias used
for uri in reversed(project.get_alias_uris(alias)):
new_source = source_kind(context, project, self.__meta,
alias_override=(alias, uri))
new_source._preflight()
try:
ref = new_source.track()
# FIXME: Need to consider temporary vs. permanent failures,
# and how this works with retries.
except BstError as e:
last_error = e
continue
return ref
raise last_error
# Ensures a fully constructed path and returns it
def __ensure_directory(self, directory):
......
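The new `SourceFetcher` interface documented above can be illustrated with a minimal sketch. The stand-in base class below mirrors the interface so the example runs on its own; a real plugin would subclass `buildstream.SourceFetcher`, and `FileFetcher` and its alias table are invented for illustration:

```python
import urllib.request

_ALIAS_SEPARATOR = ':'  # the same separator utils now exposes

# Stand-in for buildstream.SourceFetcher so this sketch is self-contained.
class SourceFetcher:
    def __init__(self):
        self.__alias = None

    def mark_download_url(self, url):
        alias, _ = url.split(_ALIAS_SEPARATOR, 1)
        self.__alias = alias

    def _get_alias(self):
        return self.__alias

# Hypothetical fetcher for a single file, showing how fetch() honours an
# alias override exactly as the interface requires.
class FileFetcher(SourceFetcher):
    def __init__(self, url, dest, aliases):
        super().__init__()
        self.url = url            # e.g. 'upstream:foo.txt'
        self.dest = dest
        self.aliases = aliases    # alias -> base URI, like project.conf aliases
        self.mark_download_url(url)

    def translate(self, alias_override=None):
        alias, body = self.url.split(_ALIAS_SEPARATOR, 1)
        base = alias_override or self.aliases[alias]
        return base + body

    def fetch(self, alias_override=None):
        url = self.translate(alias_override)
        # A failure here propagates, letting the caller try the next mirror.
        urllib.request.urlretrieve(url, self.dest)

fetcher = FileFetcher('upstream:foo.txt', '/tmp/foo.txt',
                      {'upstream': 'https://example.com/'})
print(fetcher._get_alias())
print(fetcher.translate(alias_override='https://mirror.example.org/'))
```

`Source._fetch()` then drives this: for each fetcher it calls `fetch()` once per URI from `Project.get_alias_uris()`, stopping at the first success and re-raising the last error if every mirror fails.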
......@@ -42,6 +42,10 @@ from . import _signals
from ._exceptions import BstError, ErrorDomain
# The separator we use for user specified aliases
_ALIAS_SEPARATOR = ':'
class UtilError(BstError):
"""Raised by utility functions when system calls fail.
......@@ -608,6 +612,27 @@ def _parse_size(size, volume):
return int(num) * 1024**units.index(unit)
# _pretty_size()
#
# Converts a number of bytes into a string representation in KB, MB, GB, TB
# represented as K, M, G, T etc.
#
# Args:
# size (int): The size to convert in bytes.
# dec_places (int): The number of decimal places to output to.
#
# Returns:
# (str): The string representation of the size, in the largest
#         appropriate unit.
def _pretty_size(size, dec_places=0):
psize = size
unit = 'B'
for unit in ('B', 'K', 'M', 'G', 'T'):
if psize < 1024:
break
else:
psize /= 1024
return "{size:g}{unit}".format(size=round(psize, dec_places), unit=unit)
# A sentinel to be used as a default argument for functions that need
# to distinguish between a kwarg set to None and an unset kwarg.
_sentinel = object()
......
......@@ -118,8 +118,11 @@ html devhelp: templates sessions
$(SPHINXBUILD) -b $@ $(ALLSPHINXOPTS) "$(BUILDDIR)/$@" \
$(wildcard source/*.rst) \
$(wildcard source/tutorial/*.rst) \
$(wildcard source/advanced-features/*.rst) \
$(wildcard source/examples/*.rst) \
$(wildcard source/elements/*.rst) \
$(wildcard source/sources/*.rst)
@echo
@echo "Build of $@ finished, output: $(CURDIR)/$(BUILDDIR)/$@"
# Makefile for Sphinx documentation
#
kind: import
sources:
- kind: local
path: files/callHello.sh
depends:
- filename: hello.bst
junction: hello-junction.bst
kind: junction
# Specify the source of the BuildStream project
# We are going to use the autotools examples distributed with BuildStream in the
# doc/examples/autotools directory
sources:
- kind: local
path: ../autotools