Please use this identifier to cite or link to this item:
https://scidar.kg.ac.rs/handle/123456789/23028

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Vučićević, Nemanja | - |
| dc.date.accessioned | 2026-02-18T12:02:28Z | - |
| dc.date.available | 2026-02-18T12:02:28Z | - |
| dc.date.issued | 2024 | - |
| dc.identifier.uri | https://scidar.kg.ac.rs/handle/123456789/23028 | - |
| dc.description.abstract | This paper deals with the minimization of unconstrained objective functions in the form of finite sums. We present an extra-gradient method with a line search strategy and an algorithm that uses a variable sample size, which makes the process significantly cheaper. The method is non-monotone, and the adaptive step size $\alpha_k$ obtained in the line search is a random variable dependent on the sample $\xi_k$. An inevitable consequence is that the errors do not form a martingale. The algorithm is tested on several examples, including machine learning problems. [1, 2] | en_US |
| dc.language.iso | en_US | en_US |
| dc.publisher | Univerzitet u Beogradu, Matematički fakultet | en_US |
| dc.subject | finite sum minimization | en_US |
| dc.subject | machine learning | en_US |
| dc.subject | line search extragradient | en_US |
| dc.title | A machine learning method with extra-gradient step | en_US |
| dc.type | conferenceObject | en_US |
| dc.type.version | PublishedVersion | en_US |
| dc.source.conference | XV SERBIAN MATHEMATICAL CONGRESS | en_US |
| Appears in Collections: | Faculty of Science, Kragujevac | |
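The abstract above describes a subsampled extra-gradient iteration with a non-monotone line search and a growing sample size. Since the conference abstract gives no pseudocode, the following is only a minimal illustrative sketch of that idea on a synthetic least-squares finite sum; the problem instance, the geometric sample-growth schedule, the Armijo constants `c`, `beta`, and the non-monotonicity slack `eta` are all assumptions, not the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite-sum least-squares problem: f(x) = (1/N) * sum_i (a_i^T x - b_i)^2
N, d = 1000, 5
A = rng.normal(size=(N, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.01 * rng.normal(size=N)

def f_sub(x, idx):
    """Subsampled objective over the mini-batch idx."""
    r = A[idx] @ x - b[idx]
    return np.mean(r ** 2)

def grad_sub(x, idx):
    """Subsampled gradient over the mini-batch idx."""
    r = A[idx] @ x - b[idx]
    return 2.0 * A[idx].T @ r / len(idx)

def extragradient_vss(x0, iters=50, s0=10, growth=1.3,
                      alpha0=1.0, beta=0.5, c=1e-4, eta=1e-3):
    """Extra-gradient iteration with a non-monotone Armijo-type line
    search on the current subsample; the sample size grows geometrically.
    The step size alpha_k depends on the drawn sample, so it is a random
    variable, as noted in the abstract."""
    x, s = x0.copy(), s0
    for k in range(iters):
        idx = rng.choice(N, size=min(int(s), N), replace=False)
        g = grad_sub(x, idx)
        alpha, fx = alpha0, f_sub(x, idx)
        # Backtrack until a relaxed (non-monotone) Armijo condition holds;
        # the slack eta permits occasional increases of the objective.
        while f_sub(x - alpha * g, idx) > fx - c * alpha * (g @ g) + eta:
            alpha *= beta
            if alpha < 1e-10:
                break
        y = x - alpha * g                    # extrapolation (trial) point
        x = x - alpha * grad_sub(y, idx)     # extra-gradient update
        s *= growth                          # enlarge the sample
    return x

x_hat = extragradient_vss(np.zeros(d))
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The growing sample size is what makes early iterations cheap: small batches suffice far from a solution, while larger batches later reduce the gradient noise without re-evaluating the full sum at every step.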
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| SMAK_2024-67.pdf | Split PDF document from the book of abstracts | 317.81 kB | Adobe PDF |

