
Thursday tutorials during the lockdown
Dear students, Again, due to the lockdown, I won't be able to attend the tutorial tomorrow morning. The recording of today's tutorial will be ready for you on Panopto. If needed, I will arrange a reception hour tomorrow evening. Cheers, Ronen.
Created on 20/1/2021, 21:59:53
Fix to the dry part of Homework 5
Dear students, Following a comment by a student, we fixed Q2 in the dry part of HW5. The linear layers should be *homogeneous* and not include any bias vectors. The fixed version of the assignment is available under the "Homework" section. Sorry for the inconvenience. Best regards
Created on 17/1/2021, 15:05:50
Homework 5 is out
Hello students, The 5th exercise is out and you can download it from the homework section in the webcourse. The due date is 26/1/2021. Parent students and their partners can treat this assignment as MAGEN and submit their assignment on 30/1/2021. If you have questions regarding the exercise, please post them in the appropriate section in the course Piazza: https://piazza.com/class/kg83owlgunp67a Good luck, - Course staff
Created on 15/1/2021, 16:39:57
Thursday Tutorial
Hello all, Sadly, due to the lockdown, I will be unable to attend the deep learning tutorial tomorrow (a cute but rather noisy 2 y/o was set free a couple of days ago). The recording of today's tutorial is ready for you on Panopto. Cheers, Ronen.
Created on 13/1/2021, 22:28:52
4th exercise due date
Hello students, You have one more day to complete the 4th exercise, due to a small mistake in the assignment instructions. That is, the final due date is Thursday, 14/1/2021. The due date for parent students remains the same. Good luck, - Course staff
Created on 12/1/2021, 15:05:55
Correction to HW4
Dear students, We corrected subsection 2.a in the dry part of HW4. You should express the probability of a weight vector w *without* conditioning on the examples x, y. The corrected version is available under the assignment section. Best regards
Created on 4/1/2021, 20:24:48
Class recording online
Dear students, The last lecture's recording is online. We covered the first (out of two) lectures on deep learning. We defined what a (deep) neural network is and how it differs from linear learning, discussed its expressivity (what it can compute), and then saw how to train it using back propagation. Next lecture we will cover some "tricks of the trade" - common and useful tricks that improve training. This includes the dropout technique and various ways to initialize the weights of a network. We shall also talk about auto-encoders, an unsupervised learning method using deep networks. Time permitting, we will also talk about network "compression" (how to reduce the size of big networks after training, so that they fit in low-energy devices). Exciting stuff! Happy 2021 and see you next week. Nir
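To make the back propagation step above concrete, here is a minimal sketch (a toy example of my own, not the course's code): a one-hidden-layer network trained by full-batch gradient descent on XOR, with the forward and backward passes written out explicitly in plain numpy. The architecture, learning rate, and iteration count are arbitrary choices for the illustration.

```python
import numpy as np

# Toy XOR data: the classic example a linear model cannot fit.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

rng = np.random.default_rng(0)
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # hidden layer (8 tanh units)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # output layer (sigmoid)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(2000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # backward pass: gradients of the cross-entropy loss, layer by layer
    dlogits = (p - y) / len(X)             # d(loss)/d(output logits)
    dW2 = h.T @ dlogits; db2 = dlogits.sum(0)
    dh = dlogits @ W2.T * (1 - h ** 2)     # chain rule through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0)
    # gradient descent step
    for P, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        P -= lr * g

pred = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
```

The only "deep learning" ingredient here is the chain rule applied layer by layer; everything else is ordinary gradient descent.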
Created on 1/1/2021, 10:03:54
Homework 4 is out
Hello students, The 4th exercise is out and you can download it from the homework section in the webcourse. The due date is 13/1/2021, but parent students and their partners are allowed to submit their assignment on 20/1/2021. If you have questions regarding the exercise, please post them in the appropriate section in the course Piazza: https://piazza.com/class/kg83owlgunp67a Good luck, - Course staff
Created on 30/12/2020, 10:17:11
This week's lecture on Boosting and Ensemble Learning online
Hello, Please find the link on the website, or here: https://panoptotech.cloud.panopto.eu/Panopto/Pages/Viewer.aspx?id=676b928d-2ea1-41cd-9d6c-ac9a00b8d0be Have a great weekend, Nir
Created on 25/12/2020, 18:32:35
Important: Reading material before this week's lecture
Dear students, Under the materials for lecture 10 you can find a short document explaining regression trees and ensemble methods. Please read this document *before* attending the upcoming lecture. Best regards
Last updated on 20/12/2020, 20:41:20
A note on yesterday's tutorial 08
Dear students, Following Wednesday's tutorial on generative models, I made slight changes to the last exercise on ridge regression and MAP. I split the exercise into two: (1) first we show that the LS problem maximizes the likelihood function (MLE), and then (2) we show that ridge regression corresponds to adding a prior on the weight vector and maximizing the posterior probability (MAP). The updated slides are in the course material section. The recorded updated version was taught in today's tutorial (Thursday) and can be found on Panopto. If you watched Wednesday's tutorial this week, please take a few minutes to review the changes. Happy Hanukkah, Itay
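For a quick numerical sanity check of part (2), here is a small sketch (the data and variable names are mine, not from the exercise): the ridge solution w = (X^T X + lam*I)^{-1} X^T y is the unique point where the gradient of the penalized least-squares objective vanishes, and this is exactly the MAP estimate under a Gaussian likelihood with a Gaussian prior on w, with lam = sigma^2/tau^2.

```python
import numpy as np

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=50)

# Closed-form ridge estimator: argmin ||Xw - y||^2 + lam * ||w||^2.
lam = 0.3
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

# The gradient of the ridge objective vanishes at w_ridge, confirming
# it is the optimum (and hence the MAP point of the Gaussian posterior).
grad = 2 * X.T @ (X @ w_ridge - y) + 2 * lam * w_ridge
```

Setting lam = 0 recovers the plain LS solution, matching part (1) (MLE).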
Created on 10/12/2020, 14:44:11
Spectral clustering demo
Dear students, As I mentioned yesterday, I am providing you with code that demonstrates how spectral clustering works. There are two datasets (two concentric circles and two half moons). For each dataset, you will see a plot of the embedding of the points obtained by two eigenvectors of the Laplacian (this gives the matrix \tilde H, using the class notation). As you can see, it is very easy to cluster based on this embedding! The code is here: https://www.cs.technion.ac.il/~nailon/236756_spectral_clustering.py Note: it is based on the code on the following page, which is recommended reading as well: https://scikit-learn.org/stable/auto_examples/cluster/plot_cluster_comparison.html#sphx-glr-auto-examples-cluster-plot-cluster-comparison-py I removed most of the datasets from the original code. Aside from the spectral clustering part, I kept the comparison between different clustering algorithms, so you can see that as well. Later today I will also share the video recording from yesterday. Have a great Hanukkah vacation :-) Nir
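If you prefer a self-contained numpy version of the same idea (this is my own compact sketch; the linked demo uses scikit-learn), the following builds an RBF affinity graph on two concentric circles, takes the bottom eigenvectors of the normalized Laplacian as the embedding \tilde H, and recovers the two circles with a trivial threshold:

```python
import numpy as np

# Two concentric circles with a little noise (toy data of my own).
rng = np.random.default_rng(2)
t = rng.uniform(0, 2 * np.pi, 200)
r = np.repeat([1.0, 4.0], 100)                       # inner / outer radius
X = np.c_[r * np.cos(t), r * np.sin(t)] + 0.05 * rng.normal(size=(200, 2))

# RBF affinity and symmetric normalized Laplacian.
d2 = ((X[:, None] - X[None]) ** 2).sum(-1)           # pairwise squared dists
W = np.exp(-d2 / 0.5)
D = W.sum(1)
L_sym = np.eye(200) - W / np.sqrt(np.outer(D, D))

# Embedding \tilde H: the two eigenvectors of smallest eigenvalue.
vals, vecs = np.linalg.eigh(L_sym)                   # ascending eigenvalues
H = vecs[:, :2]

# In this embedding the circles are linearly separated, so even a median
# threshold on one coordinate clusters them.
labels = (H[:, 1] > np.median(H[:, 1])).astype(int)
```

The point is the same as in class: the hard geometry (concentric circles) becomes trivial after the Laplacian embedding.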
Created on 10/12/2020, 13:36:34
Homework 3 is out
Hello students, The third exercise is out and you can download it from the homework section in the webcourse. The due date is 24/12/2020 but parent students and their partners are allowed to submit their assignment on 31/12/2020. If you have questions regarding the exercise, please post them in the appropriate section in the course Piazza: https://piazza.com/class/kg83owlgunp67a Good luck, - Course staff
Created on 8/12/2020, 14:23:10
Lectures
Dear students, Last Wednesday we discussed generative models and explained the difference between "discriminative" and "generative". We discussed the MLE and MAP methods for estimating parameters of generative models. We discussed Naive Bayes, and also LDA. Finally, we covered the EM algorithm and proved its guarantees. Next week we will talk about data clustering and unsupervised learning. See you then! Nir
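As a small companion to the EM part of the lecture, here is a minimal sketch (toy data and notation are mine, not the lecture's) of EM for a 1-d mixture of two Gaussians: alternate between computing posterior responsibilities (E-step) and re-estimating the parameters by responsibility-weighted maximum likelihood (M-step).

```python
import numpy as np

# Toy data: a balanced mixture of N(-2, 1) and N(3, 1).
rng = np.random.default_rng(6)
x = np.r_[rng.normal(-2, 1, 300), rng.normal(3, 1, 300)]

mu = np.array([-1.0, 1.0])       # initial component means
sigma = np.array([1.0, 1.0])     # initial standard deviations
w = np.array([0.5, 0.5])         # initial mixing weights
for _ in range(50):
    # E-step: r[i, k] = P(component k | x_i) under current parameters.
    dens = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * sigma ** 2)) \
             / (sigma * np.sqrt(2 * np.pi))
    r = dens / dens.sum(1, keepdims=True)
    # M-step: responsibility-weighted MLE updates.
    nk = r.sum(0)
    mu = (r * x[:, None]).sum(0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(0) / nk)
    w = nk / len(x)
```

Each iteration provably does not decrease the data log-likelihood, which is the guarantee proved in class.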
Created on 5/12/2020, 18:01:17
Update
Hello students, Unfortunately, I cannot attend tomorrow's tutorial session. The reception hour will still take place as planned. A recording of the lecture will be uploaded to Panopto soon, and I will host an additional reception hour on Sunday at 9:30. Zoom link: https://technion.zoom.us/j/99780355516 Cheers, Ronen
Created on 2/12/2020, 17:08:00
Class 6 (Linear Regression + Model Selection and Validation) online
Dear class, Yesterday's lecture video recording is online. Next week we will start talking about generative models. Have a great weekend, Nir
Created on 26/11/2020, 20:51:25
Update
Dear students, Yesterday during the lecture, we went from hard-SVM to soft-SVM (a.k.a. simply "SVM"). This was done by adding "slack variables" (\xi_i) to the optimization problem. At the optimum, we saw that the value of the slack variables is a function of the margins y_i<x_i,w>. This function is called the hinge loss. It looks like this:

\
 \
  \
   \________

where the "elbow" meets the horizontal axis at +1. We then discussed the Representer theorem, which allowed us to rewrite the optimization function for SVM using dual variables (one per training point). The dual view depends only on the data Gram matrix G and the label vector y_1..y_m. This means that if we want to consider alternative, richer feature spaces (such as nonlinear combinations of "vanilla" features), then we really just need to compute dot products in this richer space, and we never need to compute the features themselves. For two such mappings we saw that although the dimension of the richer space can be huge (even infinite), computing the Gram matrix G can be extremely efficient. One case was the "polynomial kernel" and the other the "RBF kernel".

To help with some confusing terminology: the "kernel function" takes two input points x, x' and returns their dot product in some feature space of our choice. The value of the "Gram matrix" at position i,j is the kernel function evaluated at sample points x_i, x_j. The "kernel trick" refers to cases in which the kernel function can be computed extremely fast, even if the dimension of the embedding space is huge, or even infinite.

After discussing SVM, we discussed logistic regression, which is also a linear classifier for binary classification, but with a probabilistic justification. The sigmoid function was introduced, and will make a reappearance later in the course. We also discussed the log-sigmoid function and saw that it is concave. Next week we will discuss linear regression, and model evaluation and selection. Have a great weekend! Nir
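Here is a small numerical check of the kernel idea (my own illustration, not from the lecture slides): for the degree-2 polynomial kernel k(x, x') = (1 + <x, x'>)^2 on 2-d inputs, there is an explicit feature map phi into R^6 with k(x, x') = <phi(x), phi(x')>, so the Gram matrix never requires forming phi at all.

```python
import numpy as np

def poly_kernel(x, z):
    # Degree-2 polynomial kernel: computed directly from the dot product.
    return (1.0 + x @ z) ** 2

def phi(x):
    # Explicit degree-2 feature map for 2-d inputs (for verification only;
    # the whole point of the trick is that we never need to build this).
    x1, x2 = x
    s = np.sqrt(2.0)
    return np.array([1.0, s * x1, s * x2, x1 ** 2, x2 ** 2, s * x1 * x2])

rng = np.random.default_rng(3)
X = rng.normal(size=(5, 2))

# Gram matrix via the kernel vs. via explicit dot products in feature space.
G_kernel = np.array([[poly_kernel(a, b) for b in X] for a in X])
G_explicit = np.array([[phi(a) @ phi(b) for b in X] for a in X])
```

The two matrices agree entry by entry; for the RBF kernel the corresponding feature space is infinite-dimensional, so only the kernel-side computation is possible at all.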
Created on 19/11/2020, 18:22:24
Notes for Tutorial 05 - Linear classification
Good morning, I added my handwritten notes for tutorial 05 under the course material. You are welcome to use them as a short summary of the tutorial. Itay
Created on 19/11/2020, 09:18:47
Homework 2 is out
Hello students, The second exercise is out and you can download it from the homework section in the webcourse. The due date is 7/12/2020 but parent students and their partners are allowed to submit their assignment on 18/12/2020. If you have questions regarding the exercise, please post them in the appropriate section in the course Piazza: https://piazza.com/class/kg83owlgunp67a Good luck, - Course staff
Created on 16/11/2020, 12:14:39
Reading material before this week's tutorial
Hello, Under the materials for tutorial 05 you can find a short document explaining the intuition behind the gradient descent algorithm. Please read this document *before* attending the tutorial. Itay
Created on 14/11/2020, 21:04:53
Reception Hour
Hello, Sadly, tomorrow (12/11) I won't be able to attend my reception hour. However, the tutorial will be held as planned. Cheers, Ronen.
Created on 11/11/2020, 22:05:06
Update
Dear Students, Yesterday we finished discussing feature selection, and then discussed PCA at length. We started with a visual motivation, and using the spectral theorem from linear algebra we went very far! We proved the SVD (Singular Value Decomposition) theorem en route. We also saw an alternative definition of PCA as an optimal encoder/decoder pair for linear mappings. Here are some notes on the lecture:
- In the updated presentation that I uploaded to the website, I consistently use the terms "singular values" and "right/left singular vectors". When I say "principal component", I am actually referring to a tuple: (i'th singular value, i'th left singular vector, i'th right singular vector). It is also OK to say "principal value" and "right/left principal vector" - that terminology is used in certain places as well, and I believe it is also correct.
- In the random projection slides, I changed the notation to be consistent (dimensionality=n, number of points=m).
Keep in mind that tutorials may differ. It is not intentional, but it should keep you on your toes! Nir
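The PCA-as-encoder/decoder view can be checked numerically in a few lines (a sketch of my own, using the lecture's n-dimensions/m-points convention): take the SVD of the centered data, keep the top-k right singular vectors as the encoder, and verify that the reconstruction error equals the sum of the discarded squared singular values.

```python
import numpy as np

# m = 100 points in n = 5 dimensions (synthetic, illustrative only).
rng = np.random.default_rng(4)
X = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 5))
Xc = X - X.mean(0)                       # center the data

U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
P = Vt[:k]                               # top-k right singular vectors

X_proj = Xc @ P.T                        # encoder: R^5 -> R^2
X_rec = X_proj @ P                       # decoder: R^2 -> R^5

# Squared reconstruction error of the optimal rank-k linear encoder/decoder.
err = ((Xc - X_rec) ** 2).sum()
```

The identity err = sum of the discarded s_i^2 is exactly what makes the top singular vectors optimal among all linear encoder/decoder pairs.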
Last updated on 5/11/2020, 11:18:27
Updated Pandas Jupyter Notebook in HW1
Hello, Note that we have uploaded an updated Jupyter notebook file to the HW1 section in the webcourse. - course staff
Created on 1/11/2020, 09:18:52
Homework 1 is out
Hello students, The first exercise is out and you can download it from the homework section in the webcourse. Like any other exercise in this course, this exercise is to be done in pairs. The due date is 12/11/2020 but parent students and their partners are allowed to submit their assignment on 19/11/2020. If you have questions regarding the exercise, please post them in the Piazza: https://piazza.com/class/kg83owlgunp67a Good luck, - Course staff
Created on 29/10/2020, 17:43:55
Video of yesterday's class
Dear students, Yesterday we discussed data preparation and cleaning. We spent much time on EM (we will learn it more theoretically later in the semester), and en route also learned some facts about multivariate Gaussian distributions. We also discussed various heuristics for dealing with missing data, normalizing existing features and generating new ones. We also discussed class imbalance, why and when it is a problem, and ways to deal with it. A link to the lecture can be found in the class material section. I did not get to finish the presentation, and would like to ask you to go over the remaining slides, which are mainly on feature selection methods. It is a simple (but important) subject and I would like to cover it quickly beginning next week. In particular, make sure you understand the definitions of Pearson correlation and mutual information, so they are not new to you next week. Regards and have a great weekend, Nir
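As a quick refresher before next week (the example data below is mine, chosen only for illustration), here are the two quantities computed in a few lines: Pearson correlation between two continuous features, and mutual information between two discrete variables, straight from its definition.

```python
import numpy as np

rng = np.random.default_rng(5)

# Pearson correlation: detects *linear* dependence.
x = rng.normal(size=1000)
y_feat = x + 0.5 * rng.normal(size=1000)       # linearly related feature
pearson = np.corrcoef(x, y_feat)[0, 1]

def mutual_information(a, b):
    """I(A;B) = sum_{a,b} p(a,b) * log(p(a,b) / (p(a) p(b))), in nats,
    for arrays of non-negative integer labels."""
    joint = np.zeros((a.max() + 1, b.max() + 1))
    for ai, bi in zip(a, b):
        joint[ai, bi] += 1
    joint /= joint.sum()
    pa = joint.sum(1, keepdims=True)
    pb = joint.sum(0, keepdims=True)
    nz = joint > 0                              # skip zero-probability cells
    return (joint[nz] * np.log(joint[nz] / (pa @ pb)[nz])).sum()

a = rng.integers(0, 2, 1000)
mi_self = mutual_information(a, a)              # maximal dependence: I(A;A) = H(A)
mi_indep = mutual_information(a, rng.integers(0, 2, 1000))  # near 0
```

Note the asymmetry in what they measure: Pearson correlation can miss nonlinear dependence entirely, while mutual information captures any statistical dependence (for discrete variables).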
Created on 29/10/2020, 09:31:41
Student parents adjustments
Hello, As previously reported, we are preparing to make some adjustments aimed at helping parent students with the assignment load. To that end, we ask all parent students to contact Ronen via email and fill out the request form here as soon as possible: https://tinyurl.com/yys8ruax Thank you, course staff
Created on 24/10/2020, 23:29:16
Zoom recordings and a personal request
Hello students, The videos are up and you can find them under the "material" section. For now, it seems that only students with a Technion campus account (@campus...) have access to the videos, which are located on the Panopto video server. I'll give you an update about it as soon as possible, but meanwhile I suggest that the free listeners try to attend the live sessions. Also, I'd like to ask you to use Piazza for everything that is not highly personal. E.g.:
"When will this week's videos be ready?" ==> Piazza
"Can you tell me more about data exploration?" ==> Piazza
"What time will the tutorial/lecture take place?" ==> Google, and then Piazza
"Reception hour..." ==> Piazza
"I have some medical condition..." ==> talk to Ronen via mail (uronennatcampus...)
"I can't log into Piazza" ==> talk to Ronen via mail (uronennatcampus...)
You are a group of very motivated students and we want you to have a good learning experience, but please keep in mind that we have some other responsibilities. Cheers, -course staff
Created on 22/10/2020, 16:42:02
Emailing the lecturer
Dear students, When you contact me, please state your name, your ID number, and the course number 236756 in your message. I teach more than one course and have hundreds of students; emails missing these details will not be answered. In addition, for matters related to the technical side of the course - for example recorded lectures, how to work with Jupyter/Python, homework, etc. - please try the TA team first. My reception hours will be on Sundays at 12:30, but please email me in advance if you intend to come. Best regards and good luck, Nir Ailon, Lecturer in charge
Created on 22/10/2020, 09:13:43
Zoom meetings
Dear students, The links to the course Zoom meetings as well as today's presentations are detailed in the "materials" section in the webcourse. Good luck! -course staff
Created on 21/10/2020, 09:42:10
Get to know you questionnaire
Hello again, Please take the time to fill out the questionnaire at the following link. We hope it will help us get to know you and your needs better ahead of the upcoming semester. https://tinyurl.com/y6qcrhpn Also, there's an open text box at the bottom of the form; you can use it to bring any further course-related details to our attention. -course staff
Created on 18/10/2020, 22:28:41
Course forum in Piazza
Dear students, Welcome to the Introduction to Machine Learning course! We'll be conducting all class-related discussions on Piazza this term. Please visit the course forum via this link and get familiar with it: https://piazza.com/technion.ac.il/winter2021/236756/home The sooner you begin asking questions on Piazza (rather than via emails), the sooner you'll benefit from the collective knowledge of your classmates and instructors. We encourage you to ask questions when you're struggling to understand a concept - you can even do so anonymously. -course staff
Created on 18/10/2020, 15:30:54
Welcome to Introduction to Machine Learning (Winter 2020/21)
Dear Students, We are excited to open course 236756 (Introduction to Machine Learning). We have a few announcements:
1. Unless otherwise stated, we will always use the CS webcourse platform for communicating with students (announcements, course material, links to videos, homework publication, submission, and grading feedback).
2. Presentations, announcements, and homework assignments will be published in English. Online and frontal classes will be offered in Hebrew. The exam text will be written in Hebrew; if requested by students, it will be translated into English.
3. Most of the homework will be "wet", i.e. will require coding and code execution, and we require good documentation, understandable code, and simplicity. Some homework will be "dry", i.e. will require handing in free-text answers. Answers to dry homework must be typed (handwritten/scanned submissions are not accepted). You may submit answers to dry questions in either Hebrew or English. The grading feedback will be in Hebrew (unless the student explicitly asks for English).
4. Late homework submission will cost you grade points. The exact deadlines and late-submission "price tags" will be announced in due time. Deadlines and price tags are non-negotiable.
5. The course is not based on a single book. We do offer some references to prominent literature in the "course material" tab, and we may point to relevant chapters as the course progresses.
6. Corona!!!
6.1 As of the date of writing these lines, the government has imposed a major three-week lockdown to combat the high infection rates in Israel. It is impossible to know what the policy will be during the semester, so we will be prepared to offer the lectures and tutorials both on campus and online, as required by the government and the Technion. In one extreme scenario, all classes and tutorials will be offered on campus, with no limit on student numbers. In the other extreme scenario, all classes and tutorials will be held online over Zoom. We will most likely follow an in-between scenario, allowing X students to physically attend the lectures and Y students to physically attend the tutorials, where X and Y depend on the Corona policy and the classroom size. Enforcement of physical attendance (including possibly mandatory pre-registration) will be done using Technion protocols. All lectures will be recorded.
6.2 The Corona policy will not affect homework rules, because they are 100% online anyway.
6.3 There are three exam scenarios: (1) onsite, (2) online, and (3) take-home exam. We leave all three scenarios open and will decide in due course based on the guidelines.
7. Course grading: a passing grade in the final exam is a necessary condition for passing the course (if you fail the exam, you fail the course). If you pass the exam, your final course grade will be 60% exam, 40% home assignments.
* Assignments are in pairs.
* Changing partners is NOT allowed after the first assignment is submitted.
8. Note that we have increased the registration quota for the course to 80 students, so there are 20 more places for those who are interested.
Good luck and health, The course staff (Nir Ailon, Ronen Nir, Tom Avrech)
Last updated on 3/11/2020, 09:47:13