Matches in Nanopublications for { ?s ?p ?o <https://w3id.org/sciencelive/np/RAxXRIhluryUsyPzQCAqfZKaE2XU7CBIIys7rN6r39jtY/assertion>. }
Showing items 1 to 5 of 5.
All five matches share the same subject: the AIDA sentence "The model training utilized a batch size of 6, conducted 50 epochs, initialized the learning rate at 0.1 with scheduled decrease by a factor of 0.5 between the 30th to 50th epochs, and used Stochastic Gradient Descent optimizer." The matching triples, all in the assertion graph, are:

- type: AIDA-Sentence
- about: Q197536
- about: Q2539
- about: Q192776
- obtainsSupportFrom: j.jag.2024.104034
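The quad pattern in the heading treats the assertion URI as the fourth (graph) position, so it can be expressed in standard SPARQL with a GRAPH clause. The following is a minimal sketch, assuming a nanopublication store that exposes a SPARQL endpoint with named-graph support; no endpoint URL is given on this page, so that part is left unspecified.

```sparql
# Minimal sketch: list all triples inside the assertion graph of the
# nanopublication shown above. Assumes a SPARQL endpoint over a
# nanopublication store with named-graph support (endpoint not named here).
SELECT ?s ?p ?o
WHERE {
  GRAPH <https://w3id.org/sciencelive/np/RAxXRIhluryUsyPzQCAqfZKaE2XU7CBIIys7rN6r39jtY/assertion> {
    ?s ?p ?o .
  }
}
LIMIT 100
```

Run against such an endpoint, this query would return one row per matching triple; the LIMIT 100 mirrors the page size reported above, and the five rows correspond to the five matches listed.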