
Support categorical parameters in optimization #144

@pavelkomarov

Description


This has been in the back of my mind, but it came up again because of #138: it makes more sense to say all the TVR methods are expressions of one method and to optimize over the derivative order, $\nu$, in addition to $\gamma$, rather than requiring three separate optimize calls. Then, as part of #143, I made the previously-private common backing functions for the TVR, Kalman, and Iterated FD modules public, so a single function per module can be optimized. That of course meant touching the optimization code, and I realized Iterated FD needs a categorical order: it can be 2 or 4 but not 3, so a numerical parameter rounded to the nearest int wasn't going to work.

After some work, categorical parameters are supported, and in the process I've realized it doesn't actually make sense to optimize over $\nu$; in fact I often see worse results. Essentially, by making the optimization problem higher-dimensional, we're less likely to find its true minimum. And treating whole-number orders, which have only a few options, as continuous ranges doesn't really save work. It's probably better to treat them as categorical parameters, run a separate optimization in each bin, and compare, because otherwise the optimizer may guess fractional values along that dimension, which just get rounded anyway, possibly to the same values if the guesses are close. Same goes for booleans: better to treat them as categoricals.
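A minimal sketch of the per-bin idea, with illustrative names (`loss`, `optimize_gamma` are stand-ins, not functions from this repo): run one continuous optimization of $\gamma$ per allowed order and keep the best, instead of letting the optimizer propose fractional orders that only get rounded.

```python
def loss(gamma, order):
    # Stand-in objective; a real one would score derivative quality.
    return (gamma - order) ** 2 + 0.1 * order

def optimize_gamma(order, lo=0.0, hi=6.0, iters=60):
    # Golden-section search over the continuous parameter gamma,
    # with the categorical order held fixed.
    phi = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if loss(c, order) < loss(d, order):
            b = d
        else:
            a = c
    g = (a + b) / 2
    return g, loss(g, order)

# Iterated FD allows order 2 or 4, but not 3 -- a categorical choice.
results = {order: optimize_gamma(order) for order in (2, 4)}
best_order = min(results, key=lambda o: results[o][1])
```

Each inner search stays one-dimensional and never wastes evaluations on orders that will be rounded away.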

Labels: enhancement (New feature or improvement)
