Since 2021-2022, this course has been available to a broader range of study programs. Exams from before 2021-2022 can be found here: https://wiki.wina.be/examens/index.php/Machine_Learning_and_Inductive_Inference
2021-2022:
26 January:
Small questions:
Connect 6 concepts with the correct sentence about the concept or its definition.
Same as the previous question.
3 questions where you have to answer what the concept is.
Which of the three has the largest entropy? {a,a,a,a,a,a} OR {a,b} OR {a,a,a,a,a,b}
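As a quick check, here is a minimal Python sketch (the helper name entropy is just illustrative) that computes the Shannon entropy of each multiset:

import math
from collections import Counter

def entropy(items):
    # Shannon entropy (in bits) of the value distribution in items
    counts = Counter(items)
    n = len(items)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

for s in [list("aaaaaa"), list("ab"), list("aaaaab")]:
    print(s, round(entropy(s), 3))

This prints 0.0, 1.0 and about 0.65 bits respectively: the uniform multiset {a,b} has the largest entropy.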
Big questions:
12 examples are given in which 4 attributes (2 or 3 possible values per attribute) predict 1 class (3 possible values). The 4 attribute values of a new instance are given.
Classify the new instance according to naive Bayes (m = 2, q = 1/2); see the first sketch after this question.
Classify the new instance according to 4-NN (calculate the distances to the examples); see the second sketch below.
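A minimal sketch of naive Bayes with the m-estimate, assuming categorical attributes and the smoothing P(attr_i = v | c) = (n_cv + m*q) / (n_c + m); the toy dataset is made up, and whether the class prior is also smoothed varies by course:

from collections import Counter

def nb_m_estimate(X, y, x_new, m=2, q=0.5):
    # X: list of attribute tuples, y: list of class labels
    n = len(y)
    scores = {}
    for c, n_c in Counter(y).items():
        score = n_c / n  # class prior P(c), unsmoothed here
        for i, v in enumerate(x_new):
            # n_cv: examples of class c whose i-th attribute equals v
            n_cv = sum(1 for xj, yj in zip(X, y) if yj == c and xj[i] == v)
            score *= (n_cv + m * q) / (n_c + m)
        scores[c] = score
    return max(scores, key=scores.get)

X = [("sunny", "hot"), ("rainy", "mild"), ("sunny", "mild")]
y = ["yes", "no", "yes"]
print(nb_m_estimate(X, y, ("rainy", "hot")))  # -> 'yes' on this toy data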
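And a matching 4-NN sketch, assuming Hamming distance (the number of attributes on which two instances disagree), the natural choice for categorical data; ties in the vote are broken arbitrarily here:

from collections import Counter

def knn_classify(X, y, x_new, k=4):
    # distance = number of mismatching attributes (Hamming)
    dists = sorted((sum(a != b for a, b in zip(xj, x_new)), yj)
                   for xj, yj in zip(X, y))
    # majority vote among the k nearest examples
    votes = Counter(yj for _, yj in dists[:k])
    return votes.most_common(1)[0][0]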
The hypothesis "Tony likes pizza" is formulated with 0, 1 or 2 literals at most. The possible attributes that predict the hypothesis are {mushrooms, ham, tomatoes, onion}. Example that is valid"TOny likes pizza with mushrooms and tomatoes" or for 0 literals "Tony likes all pizza". Invalid is"TOny likes pizza with mushrooms or ham".
Show that the VC dimension of this hypothesis space is at most 3.
Show that the VC dimension of this hypothesis space is at least 3.
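A possible line of reasoning, assuming a hypothesis is a conjunction of at most 2 of the 4 attributes (only positive literals, as in the examples above). For the upper bound, shattering d instances requires 2^d distinct labelings, hence 2^d <= |H|:

|H| = \binom{4}{0} + \binom{4}{1} + \binom{4}{2} = 1 + 4 + 6 = 11, \qquad 2^d \le 11 \implies d \le \lfloor \log_2 11 \rfloor = 3.

For the lower bound, the three pizzas {mushrooms, tomatoes}, {ham, tomatoes} and {mushrooms, ham} are shattered: the empty conjunction accepts all three, each single literal (tomatoes, mushrooms, ham) accepts exactly the two pizzas containing it, each 2-literal conjunction of those accepts exactly one, and the literal onion accepts none. All 8 labelings are realized, so the VC dimension is at least 3.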
Given 2 clauses (something like p(X,X) <-- q(X,Y), ... and p(X,Y) <-- ...; both had a total of 3 literals).
How many variables are there after combining them to calculate the LGG?
What is the LGG? (multiple choice, a-g)
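Since the exact clauses are not fully remembered, here is only the mechanics: a minimal Python sketch of Plotkin's term-level LGG (clause-level LGG pairs up compatible literals and applies this to their arguments). The key point for counting variables is that the same pair of differing subterms is always mapped to the same fresh variable:

def lgg(t1, t2, table):
    # terms: constants/variables as strings, compound terms as
    # (functor, arg1, arg2, ...) tuples
    if t1 == t2:
        return t1  # identical terms generalize to themselves
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        # same functor and arity: generalize argument-wise
        return (t1[0],) + tuple(lgg(a, b, table)
                                for a, b in zip(t1[1:], t2[1:]))
    # differing subterms: one fresh variable per distinct pair, reused
    # whenever the same pair occurs again
    if (t1, t2) not in table:
        table[(t1, t2)] = "V%d" % len(table)
    return table[(t1, t2)]

table = {}
print(lgg(("p", "X", "X"), ("p", "X", "Y"), table))  # ('p', 'X', 'V0')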
Calculate the ROC based on the decision tree below. The proportions in the leaves describe how the training data is labeled. Assume that a new instance is classified as positive if the proportion of positives in its leaf is higher than a threshold C. Let the threshold C vary from 0 to 1 and draw the ROC curve for this decision tree classifier.
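A minimal sketch of that threshold sweep, with made-up leaf counts: each leaf is (n_pos, n_neg); as C decreases from 1 to 0, leaves turn positive in order of decreasing proportion of positives, and each distinct proportion contributes one new (FPR, TPR) point:

from itertools import groupby

def roc_points(leaves):
    P = sum(p for p, _ in leaves)  # total positives in the training data
    N = sum(n for _, n in leaves)  # total negatives
    prop = lambda leaf: leaf[0] / (leaf[0] + leaf[1])
    points = [(0.0, 0.0)]  # C near 1: nothing classified positive
    tp = fp = 0
    # leaves with the same proportion flip at the same threshold
    for _, group in groupby(sorted(leaves, key=prop, reverse=True), key=prop):
        for p, n in group:
            tp += p
            fp += n
        points.append((fp / N, tp / P))
    return points

print(roc_points([(8, 2), (3, 3), (1, 9)]))  # staircase from (0,0) to (1,1)

Connecting the points gives the piecewise-linear ROC curve of the tree.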