Draft: TaoRegularizer and TaoProx.
TaoRegularizer is a new object, analogous to TaoLineSearch, for adding a regularizer g(x) to a main Tao object.
TaoRegularizer can also act as a Bregman divergence g(x, y), which is needed by TaoProx, the new proximal-algorithm solver.
For example, for the squared L2 (Euclidean) distance, g(x) would be \|x\|_2^2, with the corresponding objective and gradient routines.
To make a TaoRegularizer act as a Bregman divergence, one calls TaoRegularizerSetCentralVector(TaoRegularizer, Vec). In the L2 case above, this gives g(x, y) = \|x - y\|_2^2.
Currently, TaoRegularizer supports the L2 and KL divergence regularizers; Hessian routines are not supported yet.
TaoProx is a new Tao solver for proximal algorithms. Generally speaking, TaoProx is only meant to be used with its built-in proximal operators. If a user wants a more elaborate proximal solver, it makes more sense for them to write their own custom TaoShell solver.
(Note: as far as monitoring goes, I am only considering proximal operators that are not iterative, e.g., soft thresholding or projection onto the simplex, so there is not much monitoring inside TaoSolve_Prox. I can't think of a proximal operator of practical use that needs an iterative solver. Perhaps such support/usage should be left for future work?)
I would like to have TaoProx tested and merged into main before I venture into writing more robust operator-splitting solvers (e.g., FISTA) for Tao.
Currently, TaoProx only supports the L1 prox (soft thresholding); more to come. (Note: there is a robust Matlab/Python package for proximal algorithms that supports a plethora of functions, convex sets, etc.: http://proximity-operator.net/. It is unclear to me to what extent these routines should be supported.)
(11/3 Update: CI passed, aside from some coverage tests. But I wanted some feedback on how this code is structured before moving on and adding additional features. @tisaac @tmunson @adener )