Conference paper

Notes on soft minimum and other function approximations


Authors list: Strickert, Marc; Hüllermeier, Eyke

Appeared in: MIWOCI Workshop 2013

Editor list: Schleif, Frank-Michael; Villmann, Thomas

Publication year: 2013

Pages: 60-70

URL: https://www.techfak.uni-bielefeld.de/~fschleif/mlr/mlr_04_2013.pdf

Conference: 5th Mittweida Workshop on Computational Intelligence

Title of series: Machine Learning Reports

Number in series: 2013, 04


Abstract

Optimization of non-differentiable functions mixed with continuous expressions is a frequent problem in machine learning. For example, optimal continuous models over triangular norms, ℓ1-norms, and rankings involve non-differentiable minimum, maximum, relational, or counting operators. Soft formulations of such operators are investigated for model optimization by gradient-based methods. Several aspects of the original presentation on ‘Amazing Soft-Min’ at the fifth Mittweida Workshop on Computational Intelligence are summarized in the following and extended beyond the mere minimum operator.
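
As a brief illustration of the kind of smooth minimum the abstract refers to: a common choice (assumed here; the paper's own formulation may differ) is the scaled log-sum-exp approximation softmin_beta(x) = -(1/beta) * log(sum_i exp(-beta * x_i)), which approaches min(x) as beta grows while remaining differentiable. The sketch below, with illustrative names softmin and softmin_grad and parameter beta chosen for this note, shows how such an operator and its gradient can be used in gradient-based optimization.

import numpy as np

def softmin(x, beta=10.0):
    """Smooth approximation of min(x) via a scaled log-sum-exp.

    Larger beta gives a tighter approximation of the hard minimum while
    the function stays differentiable; shifting by x.min() keeps the
    exponentials numerically stable. (Illustrative sketch, not code
    taken from the paper.)
    """
    x = np.asarray(x, dtype=float)
    shift = x.min()
    return shift - np.log(np.sum(np.exp(-beta * (x - shift)))) / beta

def softmin_grad(x, beta=10.0):
    """Gradient of softmin w.r.t. x: a softmax of -beta*x that
    concentrates its weight on the smallest entries as beta grows."""
    x = np.asarray(x, dtype=float)
    w = np.exp(-beta * (x - x.min()))
    return w / w.sum()

# Example: the value approaches min([3.0, 1.0, 2.5]) = 1.0 as beta increases.
print(softmin([3.0, 1.0, 2.5], beta=1.0))        # about 0.69, a loose approximation
print(softmin([3.0, 1.0, 2.5], beta=50.0))       # about 1.0
print(softmin_grad([3.0, 1.0, 2.5], beta=50.0))  # weight concentrated on index 1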




Citation Styles

Harvard Citation style: Strickert, M. and Hüllermeier, E. (2013) Notes on soft minimum and other function approximations, in Schleif, F.-M. and Villmann, T. (eds.) MIWOCI Workshop 2013. Mittweida: University of Applied Sciences. pp. 60-70. https://www.techfak.uni-bielefeld.de/~fschleif/mlr/mlr_04_2013.pdf

APA Citation style: Strickert, M., & Hüllermeier, E. (2013). Notes on soft minimum and other function approximations. In Schleif, F.-M., & Villmann, T. (Eds.), MIWOCI Workshop 2013 (pp. 60-70). University of Applied Sciences. https://www.techfak.uni-bielefeld.de/~fschleif/mlr/mlr_04_2013.pdf

