Merged

Commits (33, all by pavelkomarov):
- aa5f4b7 (Jul 7, 2025): big changes! Basically fully rewrote the optimizer code. Getting same…
- 3a6433a (Jul 8, 2025): moved rest of linear_model optimization
- f05e9ed (Jul 8, 2025): moved finite difference to new optimizer
- 18d4bc1 (Jul 8, 2025): moved smooth fd to new optimization
- 0ce0922 (Jul 8, 2025): moved TVR to new optimizer
- 40b460f (Jul 8, 2025): moved kalman module, to complete all 5
- d4769b5 (Jul 8, 2025): addressed #82
- 209731d (Jul 8, 2025): changed init_conds to search_space because it's a flashier, more memo…
- 685e5e0 (Jul 8, 2025): removed a couple unnecessary imports, reran basic_tutorial notebook (…
- 8965557 (Jul 8, 2025): Merge pull request #126 from florisvb/notebooks-update-for-optimizer
- a4ef0b3 (Jul 8, 2025): Merge branch 'master' of github.com:florisvb/PyNumDiff into notebooks…
- 687778f (Jul 8, 2025): Merge branch 'master' of github.com:florisvb/PyNumDiff into hack-the-…
- 2cae18a (Jul 8, 2025): realized TV is defined as normalizing by m, not m-1 in the paper, so …
- 3c7e453 (Jul 8, 2025): Merge branch 'hack-the-optimizer-more' of github.com:florisvb/PyNumDi…
- 3d56cef (Jul 8, 2025): fully updated notebook 2a
- d5f212b (Jul 8, 2025): notebook 2b now uses the new optimizer
- a5b79b8 (Jul 8, 2025): added search_space arg to test_polydiff in optimizer tests to lock do…
- 47058db (Jul 8, 2025): Merge pull request #128 from florisvb/notebooks-update-for-optimizer
- 3d0aff8 (Jul 8, 2025): importing the optimize function to the top level doesn't work, becaus…
- 0a5ea09 (Jul 9, 2025): realized unit tests assume padding='auto', but metrics assume padding…
- 6949e81 (Jul 9, 2025): improved docstring, hadn't said what padding does at the optimizer level
- feef529 (Jul 14, 2025): added tools field to .readthedocs.yaml, trying to get the documentati…
- a6787bb (Jul 14, 2025): fixing another readthedocs build error
- 0191a82 (Jul 14, 2025): more wrestling with readthedocs
- 5a3a246 (Jul 14, 2025): removed obsolete tag from the readthedocs yaml
- 49ad154 (Jul 14, 2025): trying readthedocs yet again
- 62d27ee (Jul 14, 2025): docs didn't build right for a couple modules. fixing
- 3bac5d5 (Jul 14, 2025): Merge branch 'master' of github.com:florisvb/PyNumDiff into hack-the-…
- 02994b7 (Jul 14, 2025): added cvxpy to documentation requirements so all methods get imported…
- 267b8c8 (Jul 14, 2025): parmesan
- 7b5eaee (Jul 14, 2025): Merge branch 'master' of github.com:florisvb/PyNumDiff into hack-the-…
- 7f826c8 (Jul 15, 2025): updated doc source to reflect changes on this branch
- c188f42 (Jul 15, 2025): Merge branch 'master' of github.com:florisvb/PyNumDiff into hack-the-…
11 changes: 2 additions & 9 deletions docs/source/optimize.rst
@@ -1,12 +1,5 @@
optimize
=========

.. toctree::
:maxdepth: 1

optimize/__optimize__
optimize/__finite_difference__
optimize/__kalman_smooth__
optimize/__linear_model__
optimize/__smooth_finite_difference__
optimize/__total_variation_regularization__
.. automodule:: pynumdiff.optimize
:members:
7 changes: 0 additions & 7 deletions docs/source/optimize/__finite_difference__.rst

This file was deleted.

7 changes: 0 additions & 7 deletions docs/source/optimize/__kalman_smooth__.rst

This file was deleted.

7 changes: 0 additions & 7 deletions docs/source/optimize/__linear_model__.rst

This file was deleted.

7 changes: 0 additions & 7 deletions docs/source/optimize/__optimize__.rst

This file was deleted.

7 changes: 0 additions & 7 deletions docs/source/optimize/__smooth_finite_difference__.rst

This file was deleted.

7 changes: 0 additions & 7 deletions docs/source/optimize/__total_variation_regularization__.rst

This file was deleted.

68 changes: 29 additions & 39 deletions examples/1_basic_tutorial.ipynb

Large diffs are not rendered by default.

532 changes: 174 additions & 358 deletions examples/2a_optimizing_parameters_with_dxdt_known.ipynb

Large diffs are not rendered by default.

530 changes: 181 additions & 349 deletions examples/2b_optimizing_parameters_with_dxdt_unknown.ipynb

Large diffs are not rendered by default.

52 changes: 29 additions & 23 deletions pynumdiff/linear_model/_linear_model.py
@@ -13,16 +13,16 @@
#########################
# Savitzky-Golay filter #
#########################
def savgoldiff(x, dt, params=None, options=None, polynomial_order=None, window_size=None, smoothing_win=None):
def savgoldiff(x, dt, params=None, options=None, poly_order=None, window_size=None, smoothing_win=None):
Review comment from pavelkomarov (Collaborator, Author):
Decided to rename this parameter, because typing out the full word can take up a bunch of space and feel unnecessary.
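As a quick illustration of the renamed keyword, here is a minimal, hypothetical usage sketch. The toy signal and the specific parameter values are made up, and the (x_hat, dxdt_hat) return pair is assumed from the function's docstring:

```python
import numpy as np
from pynumdiff.linear_model import savgoldiff

# Toy noisy signal (illustrative values only)
dt = 0.01
t = np.arange(0, 4, dt)
x = np.sin(t) + 0.05 * np.random.randn(len(t))

# After this PR the keyword is poly_order (previously polynomial_order)
x_hat, dxdt_hat = savgoldiff(x, dt, poly_order=3, window_size=51, smoothing_win=51)
```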

"""Use the Savitzky-Golay filter to smooth the data and calculate the first derivative. It uses
scipy.signal.savgol_filter. The Savitzky-Golay filter is very similar to the sliding polynomial fit,
but slightly noisier, and much faster.

:param np.array[float] x: data to differentiate
:param float dt: step size
:param list params: (**deprecated**, prefer :code:`polynomial_order`, :code:`window_size`, and :code:`smoothing_win`)
:param list params: (**deprecated**, prefer :code:`poly_order`, :code:`window_size`, and :code:`smoothing_win`)
:param dict options: (**deprecated**)
:param int polynomial_order: order of the polynomial
:param int poly_order: order of the polynomial
:param int window_size: size of the sliding window, must be odd (if not, 1 is added)
:param int smoothing_win: size of the window used for gaussian smoothing, a good default is
window_size, but smaller for high frequency data
@@ -32,17 +32,17 @@ def savgoldiff(x, dt, params=None, options=None, polynomial_order=None, window_s
- **dxdt_hat** -- estimated derivative of x
"""
if params != None: # Warning to support old interface for a while. Remove these lines along with params in a future release.
warn("`params` and `options` parameters will be removed in a future version. Use `polynomial_order`, " +
warn("`params` and `options` parameters will be removed in a future version. Use `poly_order`, " +
"`window_size`, and `smoothing_win` instead.", DeprecationWarning)
polynomial_order, window_size, smoothing_win = params
elif polynomial_order == None or window_size == None or smoothing_win == None:
raise ValueError("`polynomial_order`, `window_size`, and `smoothing_win` must be given.")
poly_order, window_size, smoothing_win = params
elif poly_order == None or window_size == None or smoothing_win == None:
raise ValueError("`poly_order`, `window_size`, and `smoothing_win` must be given.")

window_size = np.clip(window_size, polynomial_order + 1, len(x)-1)
if not window_size % 2: window_size += 1 # window_size needs to be odd
window_size = np.clip(window_size, poly_order + 1, len(x)-1)
if window_size % 2 == 0: window_size += 1 # window_size needs to be odd
smoothing_win = min(smoothing_win, len(x)-1)

dxdt_hat = scipy.signal.savgol_filter(x, window_size, polynomial_order, deriv=1)/dt
dxdt_hat = scipy.signal.savgol_filter(x, window_size, poly_order, deriv=1)/dt

kernel = utility.gaussian_kernel(smoothing_win)
dxdt_hat = utility.convolutional_smoother(dxdt_hat, kernel)
@@ -57,16 +57,16 @@ def savgoldiff(x, dt, params=None, options=None, polynomial_order=None, window_s
######################
# Polynomial fitting #
######################
def polydiff(x, dt, params=None, options=None, polynomial_order=None, window_size=None, step_size=1,
def polydiff(x, dt, params=None, options=None, poly_order=None, window_size=None, step_size=1,
kernel='friedrichs'):
"""Fit polynomials to the data, and differentiate the polynomials.

:param np.array[float] x: data to differentiate
:param float dt: step size
:param list[int] params: (**deprecated**, prefer :code:`polynomial_order` and :code:`window_size`)
:param list[int] params: (**deprecated**, prefer :code:`poly_order` and :code:`window_size`)
:param dict options: (**deprecated**, prefer :code:`step_size` and :code:`kernel`)
a dictionary consisting of {'sliding': (bool), 'step_size': (int), 'kernel_name': (str)}
:param int polynomial_order: order of the polynomial
:param int poly_order: order of the polynomial
:param int window_size: size of the sliding window, if not given no sliding
:param int step_size: step size for sliding
:param str kernel: name of kernel to use for weighting and smoothing windows ('gaussian' or 'friedrichs')
@@ -76,24 +76,27 @@ def polydiff(x, dt, params=None, options=None, polynomial_order=None, window_siz
- **dxdt_hat** -- estimated derivative of x
"""
if params != None:
warn("`params` and `options` parameters will be removed in a future version. Use `polynomial_order` " +
warn("`params` and `options` parameters will be removed in a future version. Use `poly_order` " +
"and `window_size` instead.", DeprecationWarning)
polynomial_order = params[0]
poly_order = params[0]
if len(params) > 1: window_size = params[1]
if options != None:
if 'sliding' in options and not options['sliding']: window_size = None
if 'step_size' in options: step_size = options['step_size']
if 'kernel_name' in options: kernel = options['kernel_name']
elif polynomial_order == None or window_size == None:
raise ValueError("`polynomial_order` and `window_size` must be given.")
elif poly_order == None or window_size == None:
raise ValueError("`poly_order` and `window_size` must be given.")

if window_size < polynomial_order*3:
window_size = polynomial_order*3+1
if window_size < poly_order*3:
window_size = poly_order*3+1
if window_size % 2 == 0:
window_size += 1
Review comment from pavelkomarov (Collaborator, Author):
This thing was failing because the optimizer wanted to give even-length windows, so I've added back this +1.

warn("Kernel window size should be odd. Added 1 to length.")

def _polydiff(x, dt, polynomial_order, weights=None):
def _polydiff(x, dt, poly_order, weights=None):
t = np.arange(len(x))*dt

r = np.polyfit(t, x, polynomial_order, w=weights) # polyfit returns highest order first
r = np.polyfit(t, x, poly_order, w=weights) # polyfit returns highest order first
dr = np.polyder(r) # power rule already implemented for us

dxdt_hat = np.polyval(dr, t) # evaluate the derivative and original polynomials at points t
@@ -102,10 +105,10 @@ def _polydiff(x, dt, polynomial_order, weights=None):
return x_hat, dxdt_hat

if not window_size:
return _polydiff(x, dt, polynomial_order)
return _polydiff(x, dt, poly_order)

kernel = {'gaussian':utility.gaussian_kernel, 'friedrichs':utility.friedrichs_kernel}[kernel](window_size)
return utility.slide_function(_polydiff, x, dt, kernel, polynomial_order, stride=step_size, pass_weights=True)
return utility.slide_function(_polydiff, x, dt, kernel, poly_order, stride=step_size, pass_weights=True)


#############
@@ -309,6 +312,9 @@ def _lineardiff(x, dt, order, gamma, solver=None):

if not window_size:
return _lineardiff(x, dt, order, gamma, solver)
elif window_size % 2 == 0:
window_size += 1
warn("Kernel window size should be odd. Added 1 to length.")

kernel = {'gaussian':utility.gaussian_kernel, 'friedrichs':utility.friedrichs_kernel}[kernel](window_size)

5 changes: 2 additions & 3 deletions pynumdiff/optimize/__init__.py
@@ -2,13 +2,12 @@
"""
try:
import cvxpy
from . import total_variation_regularization
except ImportError:
from warnings import warn
warn("Limited support for total variation regularization and linear model detected! " +
"Some functions in the `total_variation_regularization` and `linear_model` modules require " +
"CVXPY to be installed. You can still use pynumdiff.optimize for other functions.")

from . import finite_difference, smooth_finite_difference, linear_model, kalman_smooth
from ._optimize import optimize
Review comment from pavelkomarov (Collaborator, Author), Jul 8, 2025:
only need to import a single thing now, one function to rule them all


__all__ = ['finite_difference', 'smooth_finite_difference', 'linear_model', 'kalman_smooth', 'total_variation_regularization'] # So these get treated as direct members of the module by sphinx
__all__ = ['optimize'] # So these get treated as direct members of the module by sphinx
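Given the consolidation above, the user-facing effect is that the optimizer is now reached through one import instead of the per-method submodules. A short sketch follows; only the import line is taken from this diff, and the keywords mentioned in the comments (search_space, padding) are inferred from the commit messages rather than from code, which is not rendered on this page:

```python
from pynumdiff.optimize import optimize  # single entry point after this PR

# Previously the optimizer mirrored the method families, e.g.
#   pynumdiff.optimize.linear_model, pynumdiff.optimize.kalman_smooth, ...
# Now one function handles all of them. Its exact argument list (including the
# search_space introduced in commit 209731d and the padding behavior documented
# in commit 6949e81) lives in pynumdiff/optimize/_optimize.py, which this diff
# does not render, so no call signature is reproduced here.
help(optimize)  # inspect the consolidated optimizer's docstring
```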
144 changes: 0 additions & 144 deletions pynumdiff/optimize/__optimize__.py

This file was deleted.
