- Neural networks as computational models. The classification problem. The neuron as a classifier. Rosenblatt's perceptron.
- Rosenblatt's convergence theorem. Minsky and Papert's counterexample.
- Class separability. Cover's counting theorem. Separability of random variables. Generalization according to Cover.
- Nonlinear separability according to Vapnik and Chervonenkis. The VC dimension. Generalization according to Vapnik and Chervonenkis.
- Mapping multi-variable binary functions by neural networks. Existence and uniqueness theorems.
- Support Vector Classification (SVC) and Support Vector Regression (SVR).
- Continuous function approximation by neural networks. Regularization.
- Learning and information storage in neural networks. The backpropagation method. Hebbian learning.
- Associative memories: Hopfield's and Kanerva's models. Storage capacities. Error correction.
- The dynamics of discrete networks. Convergence and oscillations.
- Some real-world examples.
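As a small illustration of the opening topics, here is a hedged sketch of Rosenblatt's perceptron learning rule, which (by the convergence theorem listed above) halts after finitely many mistake-driven updates when the classes are linearly separable. The function names, the toy AND-style dataset, and the choice of labels in {-1, +1} are illustrative assumptions, not part of the syllabus.

```python
import numpy as np

def perceptron_train(X, y, epochs=100):
    """Rosenblatt's rule: on each misclassified sample, update w += y_i * x_i.
    Converges in finitely many updates if the data are linearly separable."""
    X = np.hstack([X, np.ones((X.shape[0], 1))])  # append constant bias input
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for x_i, y_i in zip(X, y):
            if y_i * (w @ x_i) <= 0:   # misclassified (labels in {-1, +1})
                w += y_i * x_i
                errors += 1
        if errors == 0:                # one clean pass -> converged
            break
    return w

def perceptron_predict(w, X):
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.sign(X @ w)

# Toy linearly separable data (AND-like labeling), purely illustrative
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., -1., -1., 1.])
w = perceptron_train(X, y)
```

After training on separable data, `perceptron_predict(w, X)` reproduces the labels `y`; Minsky and Papert's XOR-style counterexamples are exactly the label assignments for which no such `w` exists.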