NON-LINEAR ACTIVATION FUNCTIONS IN NEURAL NETWORKS AND THEIR USE
Keywords:
Sigmoid, Tanh, ReLU, Leaky ReLU, Parametric ReLU, ELU, Softmax
Abstract
Non-linear activation functions resolve the following limitations of linear activation functions: they make backpropagation possible, because the derivative now depends on the input, so the network can propagate the error backwards and determine which weights on the input neurons lead to better predictions; and they make it possible to stack multiple layers of neurons, because the output becomes a non-linear combination of the input passed through several layers. Any output can then be represented as a functional computation in the neural network.
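A minimal NumPy sketch of the activation functions listed in the keywords is given below. The slope and scale parameters (alpha) are illustrative assumptions rather than values taken from the paper, and the NumPy-based formulation is only one possible implementation.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))           # squashes input to (0, 1)

def tanh(x):
    return np.tanh(x)                          # squashes input to (-1, 1)

def relu(x):
    return np.maximum(0.0, x)                  # zero for negative input, identity otherwise

def leaky_relu(x, alpha=0.01):                 # alpha is an assumed small fixed slope
    return np.where(x > 0, x, alpha * x)

def parametric_relu(x, alpha):                 # same form, but alpha is learned during training
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):                         # smooth exponential saturation for negative input
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softmax(x):
    e = np.exp(x - np.max(x))                  # subtract the max for numerical stability
    return e / e.sum()                         # non-negative outputs that sum to 1

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))                                 # negative entries become 0, positive ones pass through
print(softmax(x))                              # a probability-like distribution over the entries

Because each of these functions has an input-dependent derivative (unlike a linear map), gradients computed during backpropagation differ across neurons, which is exactly the property the abstract relies on.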