Using fractal analysis in neural network optimization algorithms in medical diagnostics
Abstract
Relevance. The development of optimization methods for neural networks in medical tasks is limited by noisy and imbalanced data, which complicate the application of classical algorithms. Fractal analysis makes it possible to create new approaches for improving the robustness, stability, and accuracy of models.
Goal. To improve the convergence and stability of training deep neural networks in medical diagnostics through a new optimization algorithm based on fractal self-similarity.
Methods. The proposed algorithm, FractalMomentAdam, extends the Adam optimizer by introducing fractal modulation of gradient moments through multiscale averaging. Two temporal moments are maintained: a short-term component that reflects local gradient trends and a long-term component that accumulates fractal-smoothed information over multiple scales. The update rule incorporates a fractal coefficient that controls the balance between local adaptability and global stability. This design allows the optimizer to perform gradient corrections in a self-similar manner, analogous to fractional-order dynamics.
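The abstract does not include a reference implementation, but the description above maps naturally onto an Adam-style optimizer with one extra, multiscale-smoothed moment. The sketch below is an illustrative PyTorch reading of that description: the particular set of scales, the geometric blending weights, and the way the fractal coefficient `phi` mixes the short- and long-term moments are assumptions for illustration, not the authors' published formulation.

```python
# Minimal sketch of a "fractal-moment" Adam variant, assuming geometric
# (self-similar) weights over several smoothing scales. Illustrative only.
import torch
from torch.optim import Optimizer


class FractalMomentAdam(Optimizer):
    """Adam-like optimizer with an additional long-term, multiscale-smoothed moment.

    A short-term moment m tracks local gradient trends (as in Adam); a
    long-term component accumulates exponential averages at several time
    scales and blends them with geometrically decaying weights, mimicking
    self-similar smoothing of the gradient history.
    """

    def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8,
                 phi=0.5, scales=(1, 2, 4, 8)):
        defaults = dict(lr=lr, betas=betas, eps=eps, phi=phi, scales=scales)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            beta1, beta2 = group["betas"]
            phi, scales = group["phi"], group["scales"]
            for p in group["params"]:
                if p.grad is None:
                    continue
                g = p.grad
                state = self.state[p]
                if not state:
                    state["step"] = 0
                    state["m"] = torch.zeros_like(p)   # short-term first moment
                    state["v"] = torch.zeros_like(p)   # second moment
                    # one smoothed gradient per scale (long-term component)
                    state["f"] = [torch.zeros_like(p) for _ in scales]
                state["step"] += 1
                t = state["step"]

                # Standard Adam moments (local adaptability).
                state["m"].mul_(beta1).add_(g, alpha=1 - beta1)
                state["v"].mul_(beta2).addcmul_(g, g, value=1 - beta2)

                # Multiscale averaging: coarser scales decay more slowly;
                # contributions are combined with geometric (self-similar) weights.
                frac = torch.zeros_like(p)
                norm = 0.0
                for k, (s, f_s) in enumerate(zip(scales, state["f"])):
                    beta_s = 1.0 - (1.0 - beta1) / s
                    f_s.mul_(beta_s).add_(g, alpha=1 - beta_s)
                    w = 0.5 ** k
                    frac.add_(f_s, alpha=w)
                    norm += w
                frac.div_(norm)

                # Fractal coefficient phi balances local vs. global information.
                m_hat = ((1 - phi) * state["m"] + phi * frac) / (1 - beta1 ** t)
                v_hat = state["v"] / (1 - beta2 ** t)
                p.addcdiv_(m_hat, v_hat.sqrt().add_(group["eps"]),
                           value=-group["lr"])
        return loss
```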
Results. Experimental results showed that the FractalMomentAdam optimizer achieved superior performance across several key metrics. The algorithm reached a validation accuracy of 96.44%, exceeding the baseline Adam by 2.5%, while also demonstrating smoother convergence and reduced loss oscillations between epochs. The multiscale fractal smoothing contributed to better noise resistance and more stable training dynamics in the presence of data imbalance. The combination of adaptive moment estimation and fractal modulation effectively enhanced both convergence speed and final model quality.
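For context, a drop-in usage sketch follows: it swaps the hypothetical FractalMomentAdam class from the previous block in place of torch.optim.Adam in a plain training loop. The tiny model, random tensors, and hyperparameters are placeholders and do not reproduce the experimental setup behind the reported 96.44% accuracy.

```python
# Hypothetical usage sketch (relies on the FractalMomentAdam class sketched
# above): the optimizer is used exactly like Adam in an ordinary training loop.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 28 * 28, 128),
                      nn.ReLU(), nn.Linear(128, 9))
optimizer = FractalMomentAdam(model.parameters(), lr=1e-3, phi=0.5)
criterion = nn.CrossEntropyLoss()

images = torch.randn(32, 3, 28, 28)      # stand-in for a batch of medical images
labels = torch.randint(0, 9, (32,))      # stand-in class labels

for epoch in range(5):
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```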
Conclusions. The research confirms that the fractal approach to optimization provides a robust and efficient alternative to traditional gradient-based methods. By incorporating self-similar structures into moment estimation, FractalMomentAdam enhances the stability, reliability, and adaptability of neural network training in medical tasks. These findings open prospects for further research in the field of adaptive fractal optimizers, including dynamic parameter tuning, hybridization with metaheuristic strategies, and application to broader classes of medical datasets.
References
Ward P. MRI artifacts still require significant care and attention. 2023. URL: https://www.auntminnieeurope.com/clinical-news/article/15657705/mri-artifacts-still-require-significant-care-and-attention (date of access: 18.06.2025).
Bodner B. 10 PyTorch Optimizers Everyone Is Using. 2024. URL: https://medium.com/@benjybo7/10-pytorch-optimizers-you-must-know-c99cf3390899 (date of access: 21.05.2025).
Kingma D. P., Ba J. L. Adam: A Method for Stochastic Optimization. 2015. URL: https://arxiv.org/pdf/1412.6980 (date of access: 10.06.2025).
Schmidt R. M., Schneider F., Hennig P. Descending through a Crowded Valley — Benchmarking Deep Learning Optimizers. 2021. URL: https://arxiv.org/pdf/2007.01547 (date of access: 21.05.2025).
GeeksforGeeks. Optimization Algorithms in Machine Learning. 2025. URL: https://www.geeksforgeeks.org/machine-learning/optimization-algorithms-in-machine-learning/ (date of access: 25.05.2025).
Reyad M., Sarhan A. M., Arafa M. A modified Adam algorithm for deep neural network optimization. 2023. URL: https://link.springer.com/article/10.1007/s00521-023-08568-z (date of access: 28.05.2025).
Hinton G. E., Srivastava N., Krizhevsky A., Sutskever I., Salakhutdinov R. R. Improving neural networks by preventing co-adaptation of feature detectors. 2012. URL: https://arxiv.org/pdf/1207.0580 (date of access: 21.05.2025).
Salmi M., Atif D., Oliva D., Abraham A., Ventura S. Handling imbalanced medical datasets: review of a decade of research. 2024. URL: https://link.springer.com/article/10.1007/s10462-024-10884-2 (date of access: 18.06.2025).
Barnsley M. F., Rising H. Fractals Everywhere. Boston: Academic Press Professional, 1993. ISBN 0-12-079061-0.
Yang J., Shi R., Wei D., Liu Z., Zhao L., Ke B., Pfister H., Ni B. MedMNIST v2 - A large-scale lightweight benchmark for 2D and 3D biomedical image classification. 2023. URL: https://www.nature.com/articles/s41597-022-01721-8 (date of access: 18.06.2025).
Kather J. N. Predicting survival from colorectal cancer histology slides using deep learning: A retrospective multicenter study. 2019. URL: https://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.1002730 (date of access: 18.06.2025).
Convolutional Neural Network (CNN) TensorFlow Guide for CIFAR. 2025. URL: https://www.tensorflow.org/tutorials/images/cnn (date of access: 25.06.2025).