# WSaO

Walsh Surrogate-assisted Optimization framework.

## What is WSaO?

WSaO is the first surrogate-assisted optimization algorithm built on the
mathematical foundations of discrete Walsh functions, combined with powerful
grey-box optimization techniques, to solve pseudo-boolean optimization
problems.

For more details, see [Walsh functions as surrogate model for pseudo-boolean optimization problems](https://hal.archives-ouvertes.fr/hal-02190141/document), published at the Genetic and Evolutionary Computation Conference (GECCO), 2019.

## Authors

[Florian Leprêtre](https://www-lisic.univ-littoral.fr/~flepretre/),
[Cyril Fonlupt](https://www-lisic.univ-littoral.fr/~fonlupt/),
[Sébastien Verel](https://www-lisic.univ-littoral.fr/~verel/),
[Virginie Marion](http://www-lisic.univ-littoral.fr/~poty/)

Université du Littoral Côte d'Opale, Calais, France

## Setup

Using a Python [virtual environment](https://virtualenv.pypa.io/en/latest/) is recommended.

```sh
git clone https://gitlab.com/florianlprt/wsao.git
cd wsao
# activate your python virtual environment, then
pip install pkgconfig setuptools
pip install . --upgrade
```

## Usage

### Command line

```sh
cli_restart.py <problem> <model> <surrogate> <algorithm> <run> <analysis>
```

### Parameters

**`problem`:** Set up the pseudo-boolean problem (see the sketch after this list)
- `nkl`: NK-landscapes
    - `size`: number of variables
    - `k`: number of epistatic interactions per variable
- `ubqp`: Unconstrained Binary Quadratic Programming
    - `size`: number of variables
    - `d`: density of zeros in the *Q* matrix
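
For intuition, the sketch below shows how an NK-landscape with `size` variables and `k` interactions per variable could be generated and evaluated. It is purely illustrative (the function names are made up for this example) and is not the implementation used by WSaO.

```python
import numpy as np

def random_nk_landscape(size, k, rng):
    """Random NK-landscape: variable i depends on itself and k other
    variables; each subfunction is a random table over the 2^(k+1)
    configurations of those variables."""
    neighbors = [
        np.concatenate(([i], rng.choice([j for j in range(size) if j != i],
                                        size=k, replace=False)))
        for i in range(size)
    ]
    tables = rng.random((size, 2 ** (k + 1)))
    return neighbors, tables

def nk_fitness(x, neighbors, tables):
    """Fitness is the average of the per-variable subfunction values."""
    total = 0.0
    for nbh, table in zip(neighbors, tables):
        idx = int("".join(str(x[j]) for j in nbh), 2)  # bits of x restricted to nbh
        total += table[idx]
    return total / len(neighbors)

rng = np.random.default_rng(0)
neighbors, tables = random_nk_landscape(size=10, k=1, rng=rng)
x = rng.integers(0, 2, size=10)        # a random point of {0,1}^10
print(nk_fitness(x, neighbors, tables))
```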

**`model`:** Set up the regression model (from [scikit-learn](https://scikit-learn.org/stable/)); each model can be further parameterized according to its scikit-learn documentation
- `lin`: linear model ([LinearRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LinearRegression.html))
- `lasso`: linear model with L1 prior as regularizer ([Lasso](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.Lasso.html))
- `lars`: least angle regression ([Lars](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.Lars.html))
- `lassolars`: Lasso model fit with LARS ([LassoLars](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LassoLars.html))
- `gaussianprocess`: Gaussian process regression ([GaussianProcessRegressor](https://scikit-learn.org/stable/modules/generated/sklearn.gaussian_process.GaussianProcessRegressor.html))

**`surrogate`:** Set up the surrogate model (see the sketch after this list)
- `multilinear`: multilinear polynomial basis
- `walsh`: Walsh functions basis
- `gaussianprocess`: combinatorial Gaussian process
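
To make the `walsh` surrogate and its `order` parameter concrete, here is a minimal sketch, assuming an order-2 Walsh expansion fitted with scikit-learn's `Lasso` (the `model`/`surrogate` combination used in the examples below). It is not the package's internal code: binary variables are mapped to ±1, and every product of at most two of them becomes a basis function whose sparse coefficients are learned by the regressor.

```python
from itertools import combinations

import numpy as np
from sklearn.linear_model import Lasso

def walsh_features_order2(X):
    """Order-2 Walsh basis for 0/1 vectors: a constant term, the
    single-variable terms s_i, and the pairwise products s_i * s_j,
    with s = 1 - 2x in {-1, +1}."""
    S = 1 - 2 * np.asarray(X)                      # {0,1} -> {+1,-1}
    n = S.shape[1]
    columns = [np.ones(len(S))]                    # order-0 term
    columns += [S[:, i] for i in range(n)]         # order-1 terms
    columns += [S[:, i] * S[:, j]                  # order-2 terms
                for i, j in combinations(range(n), 2)]
    return np.column_stack(columns)

# toy data: random 0/1 points with placeholder fitness values
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(100, 10))
y = rng.random(100)                                # replace with real evaluations

model = Lasso(alpha=1e-5, max_iter=100_000)        # sparse Walsh coefficients
model.fit(walsh_features_order2(X), y)
print("non-zero coefficients:", np.count_nonzero(model.coef_))
```

With `order=2` and `n` variables the basis contains `1 + n + n(n-1)/2` functions, which is why a sparse regressor such as Lasso is a natural choice.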

**`algorithm`:** Set up the algorithm to be executed (see the sketch after this list)
- `fitter`: Compute the accuracy of surrogate models (mean absolute error, MAE)
- `exhaustive`: Exhaustive optimization
- `sie`: Simple Incremental Evaluation
- `die`: Double Incremental Evaluation
- `bocs`: Bayesian Optimization of Combinatorial Structures ([BOCS](https://github.com/baptistar/BOCS))
- `ego`: Efficient Global Optimization for Combinatorial Problems ([EGO](http://www.spotseven.de/wp-content/papercite-data/pdf/zaef14b.pdf))
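
The incremental-evaluation algorithms (`sie`, `die`) rely on the grey-box structure of the Walsh surrogate: flipping one bit only changes the sign of the terms that contain that variable, so the surrogate value can be updated with a small delta instead of a full re-evaluation. The sketch below illustrates that idea under the assumption that the surrogate is stored as a map from variable subsets to coefficients; it is not the package's data structure or algorithm.

```python
import numpy as np

# hypothetical order-2 Walsh surrogate on 4 variables:
# keys are variable subsets, values are the learned coefficients
coeffs = {(): 0.5, (0,): 0.3, (2,): -0.2, (0, 2): 0.7, (1, 3): -0.4}

def surrogate_value(s, coeffs):
    """Full evaluation: sum of coefficient * product of the +/-1 variables."""
    return sum(c * np.prod([s[i] for i in term]) for term, c in coeffs.items())

def flip_delta(s, coeffs, i):
    """Change of the surrogate value when bit i is flipped: only the
    terms containing i are affected, and they simply change sign."""
    return -2 * sum(c * np.prod([s[j] for j in term])
                    for term, c in coeffs.items() if i in term)

s = np.array([1, -1, 1, 1])          # a solution encoded in {-1, +1}^4
before = surrogate_value(s, coeffs)
delta = flip_delta(s, coeffs, i=0)   # incremental update for flipping x_0
s[0] = -s[0]
after = surrogate_value(s, coeffs)
assert np.isclose(before + delta, after)
```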


**`run`:** Set up run-specific parameters (e.g. sample sizes, restarts, iterations, as in the examples below)

**`analysis`:** Set up the output analysis and log files (`sampler`, `fitter` or `optimizer`, as in the examples below)

### Examples

#### Sampling examples

Sample `sample_size` points of the search space defined by a `problem`
and store them in a `logfile`.

The `model`, `surrogate`, `algorithm` and `run` parameters are set to `none` as
they are not useful in this context.

```sh
cli_restart.py problem=nkl,size=10,k=1 \
               none \
               none \
               none \
               none \
               analysis=sampler,sample_size=10000,logfile=out_sample.csv
```

#### Fitting models

Compute the accuracy of a Walsh surrogate model trained with a Lasso regressor on an NK-landscape problem.

```sh
cli_restart.py problem=nkl,size=10,k=1 \
               model=lasso,alpha=1e-5 \
               surrogate=walsh,order=2 \
               algo=fitter,algo_restarts=10 \
               sample=100,step=10 \
               analysis=fitter,logfile=out_fit.csv,testfile=out_sample.csv
```

#### Optimization

Maximize an NK-landscape problem with the surrogate-assisted optimization approach (WSaO).

```sh
cli_restart.py problem=nkl,size=10,k=1 \
               model=lasso,alpha=1e-5 \
               surrogate=walsh,order=2 \
               algo=die,target=max,algo_restarts=10 \
               initial_sample=10,max_sample=100,restarts=100,iterations=100 \
               analysis=optimizer,logfile=out_optim.csv
```

#### Plots

Plot the mean absolute error of fitted surrogates (`plot_mae.py`) or the optimization results (`plot_optim.py`).

```sh
python analysis/plot_mae.py out_fit.csv # [out_fit2.csv out_fit3.csv ...]
python analysis/plot_optim.py out_optim.csv # [out_optim2.csv out_optim3.csv ...]
```

#### More

More experiments are available in the `run/params.txt` file.