Abstract:
In medical image analysis, accurate brain tumor segmentation remains challenging, motivating researchers to explore advanced deep-learning methods. While U-Net models have produced promising results, improving their performance through optimized training techniques is still necessary. Because Adam is commonly used as the default optimizer in such tasks, this study examines how different Adam optimizer variants affect U-Net performance on the well-known BraTS 2020 dataset. We evaluated the Adam, AdamW, Adagrad, Adamax, Adafactor, and RMSprop optimizers, comparing them on key metrics: training loss, validation loss, F-score, Intersection over Union (IoU), precision, and recall. The results show that Adamax achieves the highest F-score (0.8120) and IoU, demonstrating superior performance in segmenting tumor regions in medical images; AdamW also performed strongly, with lower training and validation losses and good precision and recall, highlighting its efficiency and accuracy. These findings underscore the importance of selecting the right optimizer for U-Net-based brain tumor segmentation and encourage further exploration of optimized training strategies in medical image analysis.
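To make the comparison concrete, the sketch below implements the Adamax update rule (the variant the abstract reports as best-performing) in plain Python on a toy scalar problem. This is an illustrative sketch only, not the paper's training code: the hyperparameters (lr, beta1, beta2) and the toy objective f(x) = (x - 3)^2 are assumptions chosen for demonstration.

```python
# Minimal sketch of the Adamax update rule, the optimizer variant that
# achieved the highest F-score in the reported comparison.
# Toy example (an assumption, not the paper's setup): minimize f(x) = (x - 3)^2.

def adamax_step(x, m, u, t, grad, lr=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adamax update for a scalar parameter x."""
    m = beta1 * m + (1 - beta1) * grad   # exponential moving average of gradients
    u = max(beta2 * u, abs(grad))        # infinity-norm-based second moment
    x = x - (lr / (1 - beta1 ** t)) * m / (u + eps)  # bias-corrected step
    return x, m, u

x, m, u = 0.0, 0.0, 0.0
for t in range(1, 3001):
    grad = 2 * (x - 3)                   # gradient of (x - 3)^2
    x, m, u = adamax_step(x, m, u, t, grad)

print(x)  # should settle near the minimum at x = 3
```

Unlike Adam, which scales steps by a root-mean-square of past gradients, Adamax uses the infinity norm (the `max` in the update), which can make step sizes more stable; in a real segmentation pipeline this rule is applied per parameter by the framework's built-in Adamax optimizer.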
CITATION:
IEEE format
S. Offorjindu, M. Marjanović, T. Bezdan, "Effects of Adam Optimizer Variants on Brain Tumor Segmentation Task," in Sinteza 2025 - International Scientific Conference on Information Technology, Computer Science, and Data Science, Singidunum University, Belgrade, Serbia, 2025, pp. 41-47. doi:10.15308/Sinteza-2025-41-47
APA format
Offorjindu, S., Marjanović, M., & Bezdan, T. (2025). Effects of Adam Optimizer Variants on Brain Tumor Segmentation Task. Paper presented at Sinteza 2025 - International Scientific Conference on Information Technology, Computer Science, and Data Science, Singidunum University, Belgrade, Serbia. https://doi.org/10.15308/Sinteza-2025-41-47