[pyAgrum] regenerating files+improvement in sphinx doc

parent 6bacdea0
......@@ -68,7 +68,7 @@ dag : pyAgrum.DAG
an initial DAG structure
"
%feature("docstring") gum::learning::BNLearner::useEM
%feature("docstring") gum::learning::genericBNLearner::useEM
"
Indicates if we use EM for parameter learning.
......@@ -129,7 +129,7 @@ list
"
%feature("docstring") gum::learning::BNLearner::setSliceOrder
%feature("docstring") gum::learning::genericBNLearner::setSliceOrder
"
Set a partial order on the nodes.
......@@ -165,7 +165,7 @@ weight : double
Indicate that we wish to use a greedy hill climbing algorithm.
"
%feature("docstring") gum::learning::BNLearner::useK2
%feature("docstring") gum::learning::genericBNLearner::useK2
"
Indicate that we wish to use K2.
......@@ -175,7 +175,7 @@ order : list
a list of ids
"
%feature("docstring") gum::learning::BNLearner::useLocalSearchWithTabuList
%feature("docstring") gum::learning::genericBNLearner::useLocalSearchWithTabuList
"
Indicate that we wish to use a local search with tabu list
......@@ -183,10 +183,21 @@ Parameters
----------
tabu_size : int
The size of the tabu list
nb_decrease : int
nb_decrease : int
The max number of changes decreasing the score consecutively that we allow to apply
"
%feature("docstring") gum::learning::genericBNLearner::hasMissingValues
"
Indicates whether there are missing values in the database.
Returns
-------
bool
True if there are some missing values in the database.
"
%feature("docstring") gum::learning::BNLearner::useNoApriori
"
Use no apriori.
......@@ -223,7 +234,7 @@ Indicate that we wish to use a Log2Likelihood score.
"
%feature("docstring") gum::learning::BNLearner::idFromName
%feature("docstring") gum::learning::genericBNLearner::idFromName
"
Parameters
----------
......@@ -233,7 +244,7 @@ var_names : str
Returns
-------
int
the node id corresponding to a variable name
the column id corresponding to a variable name
Raises
------
......@@ -241,9 +252,9 @@ gum.MissingVariableInDatabase
If a variable of the BN is not found in the database.
"
%feature("docstring") gum::learning::BNLearner::learnDAG
%feature("docstring") gum::learning::genericBNLearner::learnDAG
"
learn a structure from a file (must have read the db before)
learn a structure from a file
Returns
-------
......@@ -252,7 +263,7 @@ pyAgrum.DAG
"
%feature("docstring") gum::learning::BNLearner::eraseForbiddenArc
%feature("docstring") gum::learning::genericBNLearner::eraseForbiddenArc
"
Allow the arc given in parameters to be added if necessary.
......@@ -270,7 +281,7 @@ tail :
a variable's name (str)
"
%feature("docstring") gum::learning::BNLearner::eraseMandatoryArc
%feature("docstring") gum::learning::genericBNLearner::eraseMandatoryArc
"
Parameters
----------
......@@ -286,7 +297,7 @@ tail :
a variable's name (str)
"
%feature("docstring") gum::learning::BNLearner::addForbiddenArc
%feature("docstring") gum::learning::genericBNLearner::addForbiddenArc
"
The arc given in parameters won't be added.
......@@ -304,7 +315,7 @@ tail :
a variable's name (str)
"
%feature("docstring") gum::learning::BNLearner::addMandatoryArc
%feature("docstring") gum::learning::genericBNLearner::addMandatoryArc
"
Allows adding prior structural knowledge.
......@@ -336,7 +347,7 @@ vector<pos,size>
the number of modalities of the database's variables.
"
%feature("docstring") gum::learning::BNLearner::nameFromId
%feature("docstring") gum::learning::genericBNLearner::nameFromId
"
Parameters
----------
......@@ -349,7 +360,7 @@ str
the variable's name
"
%feature("docstring") gum::learning::BNLearner::names
%feature("docstring") gum::learning::genericBNLearner::names
"
Returns
-------
......@@ -367,14 +378,27 @@ max_indegree : int
"
%feature("docstring") gum::learning::BNLearner::setAprioriWeight
%feature("docstring") gum::learning::genericBNLearner::setAprioriWeight
"
Set the weight of the prior
Parameters
----------
weight : double
the apriori weight
"
%feature("docstring") gum::learning::genericBNLearner::setDatabaseWeight
"
Set the database weight.
Parameters
----------
weight : double
the database weight
"
%feature("docstring") gum::learning::BNLearner::chi2
"
chi2 computes the chi2 statistic and p-value for two columns, given a list of other columns.
......@@ -397,9 +421,10 @@ statistic,pvalue
the chi2 statistic and the associated p-value as a Tuple
"
%feature("docstring") gum::learning::BNLearner::LL
%feature("docstring") gum::learning::genericBNLearner::logLikelihood
"
LL computes the log-likelihood for the columns in vars, given the columns in the list knowing (optional)
logLikelihood computes the log-likelihood for the columns in vars, given the columns in the list knowing (optional)
Parameters
----------
......@@ -409,14 +434,17 @@ vars: List[str]
knowing : List[str]
the (optional) list of names of conditioning columns
Returns
-------
double
the log-likelihood (base 2)
"
%feature("docstring") gum::learning::BNLearner::nbRows
%feature("docstring") gum::learning::genericBNLearner::nbRows
"
Return the number of rows in the database
Returns
-------
int
......@@ -424,8 +452,11 @@ int
"
%feature("docstring") gum::learning::BNLearner::nbCols
%feature("docstring") gum::learning::genericBNLearner::nbCols
"
Return the number of columns in the database
Returns
-------
int
......
Inference
---------
Inference is the process of computing new probabilistic information from a Bayesian network and some evidence. aGrUM/pyAgrum mainly focuses on computing (joint) posteriors for some variables of the Bayesian network, given soft or hard evidence in the form of likelihoods on some variables.
Inference is a hard task (NP-complete). aGrUM/pyAgrum implements exact inference but also approximate inference, which may converge slowly and even inexactly, yet is useful in many applications.
Exact Inference
---------------
Lazy Propagation
~~~~~~~~~~~~~~~~
Lazy Propagation is the main exact inference algorithm for classical Bayesian networks in aGrUM/pyAgrum.
.. autoclass:: pyAgrum.LazyPropagation
:inherited-members:
:exclude-members: setFindBarrenNodesType, setRelevantPotentialsFinderType, setTriangulation
Shafer Shenoy Inference
~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: pyAgrum.ShaferShenoyInference
:exclude-members: setFindBarrenNodesType, setRelevantPotentialsFinderType, setTriangulation
Variable Elimination
~~~~~~~~~~~~~~~~~~~~
.. autoclass:: pyAgrum.VariableElimination
:exclude-members: setFindBarrenNodesType, setRelevantPotentialsFinderType, setTriangulation
Approximate Inference
---------------------
Loopy Belief Propagation
~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: pyAgrum.LoopyBeliefPropagation
:exclude-members: asIApproximationSchemeConfiguration
Sampling
~~~~~~~~
Gibbs Sampling
++++++++++++++
.. autoclass:: pyAgrum.GibbsSampling
:exclude-members: asIApproximationSchemeConfiguration
Monte Carlo Sampling
++++++++++++++++++++
.. autoclass:: pyAgrum.MonteCarloSampling
:exclude-members: asIApproximationSchemeConfiguration
Weighted Sampling
+++++++++++++++++
.. autoclass:: pyAgrum.WeightedSampling
:exclude-members: asIApproximationSchemeConfiguration
Importance Sampling
+++++++++++++++++++
.. autoclass:: pyAgrum.ImportanceSampling
:exclude-members: asIApproximationSchemeConfiguration
Loopy sampling
~~~~~~~~~~~~~~
Loopy Gibbs Sampling
++++++++++++++++++++
.. autoclass:: pyAgrum.LoopyGibbsSampling
:exclude-members: asIApproximationSchemeConfiguration
Loopy Monte Carlo Sampling
++++++++++++++++++++++++++
.. autoclass:: pyAgrum.LoopyMonteCarloSampling
:exclude-members: asIApproximationSchemeConfiguration
Loopy Weighted Sampling
+++++++++++++++++++++++
.. autoclass:: pyAgrum.LoopyWeightedSampling
:exclude-members: asIApproximationSchemeConfiguration
Loopy Importance Sampling
+++++++++++++++++++++++++
.. autoclass:: pyAgrum.LoopyImportanceSampling
:exclude-members: asIApproximationSchemeConfiguration
Learning
--------
pyAgrum gathers all the learning processes for Bayesian networks in a single class, BNLearner. This class gives direct access to the complete learning algorithms and their parameters (such as priors, scores, constraints, etc.) but also offers low-level functions that ease the development of new learning algorithms (for instance, computing chi2 or conditional likelihood on the database).
.. autoclass:: pyAgrum.BNLearner
:exclude-members: asIApproximationSchemeConfiguration, thisown
Model
-----
.. autoclass:: pyAgrum.BayesNet
:exclude-members: setProperty, property, propertyWithDefault
Bayesian Network
================
The Bayesian network is the main object of pyAgrum. A Bayesian network is a probabilistic graphical model: it represents a joint distribution over a set of random variables. In pyAgrum, the variables are (for now) only discrete. A Bayesian network uses a directed acyclic graph (DAG) to represent conditional independencies in the joint distribution. These conditional independencies factorize the joint distribution and thereby allow very large distributions to be represented compactly. Moreover, inference algorithms can use this graph to speed up computations. Finally, Bayesian networks can be learned from data.
Model
-----
.. autoclass:: pyAgrum.BayesNet
:exclude-members: setProperty, property, propertyWithDefault
Exact Inference
---------------
Lazy Propagation
~~~~~~~~~~~~~~~~
.. autoclass:: pyAgrum.LazyPropagation
:inherited-members:
:exclude-members: setFindBarrenNodesType, setRelevantPotentialsFinderType, setTriangulation
Shafer Shenoy Inference
~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: pyAgrum.ShaferShenoyInference
:exclude-members: setFindBarrenNodesType, setRelevantPotentialsFinderType, setTriangulation
Variable Elimination
~~~~~~~~~~~~~~~~~~~~
.. autoclass:: pyAgrum.VariableElimination
:exclude-members: setFindBarrenNodesType, setRelevantPotentialsFinderType, setTriangulation
Approximate Inference
---------------------
Loopy Belief Propagation
~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: pyAgrum.LoopyBeliefPropagation
:exclude-members: asIApproximationSchemeConfiguration
Sampling
~~~~~~~~
.. figure:: _static/causal.png
:align: center
:alt: Causality in pyAgrum
Gibbs Sampling
++++++++++++++
.. autoclass:: pyAgrum.GibbsSampling
:exclude-members: asIApproximationSchemeConfiguration
Monte Carlo Sampling
++++++++++++++++++++
.. autoclass:: pyAgrum.MonteCarloSampling
:exclude-members: asIApproximationSchemeConfiguration
Weighted Sampling
+++++++++++++++++
.. autoclass:: pyAgrum.WeightedSampling
:exclude-members: asIApproximationSchemeConfiguration
Importance Sampling
+++++++++++++++++++
.. autoclass:: pyAgrum.ImportanceSampling
:exclude-members: asIApproximationSchemeConfiguration
Loopy sampling
~~~~~~~~~~~~~~
Loopy Gibbs Sampling
++++++++++++++++++++
.. autoclass:: pyAgrum.LoopyGibbsSampling
:exclude-members: asIApproximationSchemeConfiguration
Loopy Monte Carlo Sampling
++++++++++++++++++++++++++
.. autoclass:: pyAgrum.LoopyMonteCarloSampling
:exclude-members: asIApproximationSchemeConfiguration
Loopy Weighted Sampling
+++++++++++++++++++++++
.. autoclass:: pyAgrum.LoopyWeightedSampling
:exclude-members: asIApproximationSchemeConfiguration
.. toctree::
   :maxdepth: 3

   BNModel
   BNInference
   BNLearning
Loopy Importance Sampling
+++++++++++++++++++++++++
.. autoclass:: pyAgrum.LoopyImportanceSampling
   :exclude-members: asIApproximationSchemeConfiguration
Learning
--------
.. autoclass:: pyAgrum.BNLearner
   :exclude-members: asIApproximationSchemeConfiguration
......@@ -53,8 +53,8 @@ Causality in pyAgrum
Causal
Python helpers : pyAgrum.lib
============================
pyAgrum.lib
===========
pyAgrum.lib is a set of Python tools.
......