Please use this identifier to cite or link to this item: https://scidar.kg.ac.rs/handle/123456789/23028
Full metadata record
DC Field | Value | Language
dc.contributor.author | Vučićević, Nemanja | -
dc.date.accessioned | 2026-02-18T12:02:28Z | -
dc.date.available | 2026-02-18T12:02:28Z | -
dc.date.issued | 2024 | -
dc.identifier.uri | https://scidar.kg.ac.rs/handle/123456789/23028 | -
dc.description.abstract | This paper deals with the minimization of unconstrained objective functions in the form of finite sums. We present an extra-gradient method with a line search strategy, together with an algorithm that uses a variable sample size and thus makes the process significantly cheaper. The method is non-monotone, and the adaptive step size αk obtained from the line search is a random variable that depends on the sample ξk. An inevitable consequence is that the errors do not induce martingales. The algorithm is tested on several examples, including machine learning problems. [1, 2] (An illustrative sketch of the extra-gradient update appears after this metadata table.) | en_US
dc.language.iso | en_US | en_US
dc.publisher | Univerzitet u Beogradu, Matematički fakultet | en_US
dc.subject | finite sum minimization | en_US
dc.subject | machine learning | en_US
dc.subject | line search extragradient | en_US
dc.title | A machine learning method with extra-gradient step | en_US
dc.type | conferenceObject | en_US
dc.type.version | PublishedVersion | en_US
dc.source.conference | XV SERBIAN MATHEMATICAL CONGRESS | en_US
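
The abstract describes the method only at a high level; the full algorithm is given in the cited work. Purely for orientation, below is a minimal Python sketch of one plausible reading: an extra-gradient update y_k = x_k - αk ∇f_ξk(x_k), x_{k+1} = x_k - αk ∇f_ξk(y_k), where the gradient of the finite sum is estimated on a sample ξk, the step size αk comes from a backtracking line search on the sampled objective, and the sample size grows over the iterations. The function names, the non-monotone relaxation term eta / (k + 1)**2, and the geometric sample-growth schedule are illustrative assumptions, not the authors' algorithm.

import numpy as np

def subsampled_grad(grad_i, x, sample):
    # Average of per-example gradients over the current sample xi_k.
    return np.mean([grad_i(x, i) for i in sample], axis=0)

def extragradient_vss(f_i, grad_i, x0, n, max_iter=100, alpha0=1.0,
                      beta=0.5, c=1e-4, eta=0.1, seed=0):
    # Illustrative extra-gradient loop with a non-monotone backtracking
    # line search and a growing (variable) sample size; schedule is a
    # hypothetical choice, not taken from the paper.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = min(4, n)  # initial sample size (assumption)
    for k in range(max_iter):
        sample = rng.choice(n, size=m, replace=False)
        fx = np.mean([f_i(x, i) for i in sample])
        g = subsampled_grad(grad_i, x, sample)
        # Backtracking (Armijo-type) search on the sampled objective;
        # the eta / (k + 1)**2 term makes the acceptance non-monotone.
        alpha = alpha0
        while True:
            y = x - alpha * g
            fy = np.mean([f_i(y, i) for i in sample])
            if fy <= fx - c * alpha * g.dot(g) + eta / (k + 1) ** 2 or alpha < 1e-10:
                break
            alpha *= beta
        # Extra-gradient step: re-evaluate the gradient at the trial point y.
        x = x - alpha * subsampled_grad(grad_i, y, sample)
        m = min(n, int(np.ceil(1.1 * m)))  # grow the sample size geometrically
    return x

if __name__ == "__main__":
    # Tiny least-squares demo on synthetic data (illustrative only).
    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 3))
    b = rng.standard_normal(50)
    f_i = lambda x, i: 0.5 * (A[i] @ x - b[i]) ** 2
    grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]
    print(extragradient_vss(f_i, grad_i, np.zeros(3), n=50))

Because the gradient at the trial point y is computed on the same sample ξk that produced αk, the step size and the stochastic error are dependent, which matches the abstract's remark that the errors do not induce martingales.
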
Appears in Collections: Faculty of Science, Kragujevac

Files in This Item:
File | Description | Size | Format
SMAK_2024-67.pdf | Split PDF document from the book of abstracts | 317.81 kB | Adobe PDF


Items in SCIDAR are protected by copyright, with all rights reserved, unless otherwise indicated.