Exams
We will have an in-class midterm exam in mid-March and a final from 1:00pm to 3:00pm Friday, May 17 in our classroom, Fine Arts 210.
Midterm exam
Here are some notes on the midterm exam, which will be held in class in mid-March, probably on Wednesday, March 14.
(1) There may be questions that ask you to write or understand very simple Python code.
(2) The exam will be based on the concepts, techniques and algorithms discussed in our textbook and in class.
(3) It's important to have read all of chapters 1-6 and section 17.5 in our text. This will fill in some of the gaps in our class coverage and discussions and also provide more background knowledge.
(4) You can look at the old midterm exams linked from this page. Note that there will be no questions on Lisp, Prolog or any topics related to them (e.g., unification).
(5) Listed below are the things you should be prepared to do.
Chapter 2: Intelligent agents
- Understand the basic frameworks and characteristics of environments and agents introduced in chapter 2.
Chapters 3 and 4: Search
- Take a problem description and come up with a way to represent it as a search problem by developing representations for the states and actions and a way to recognize a goal state.
- Be able to analyze a state space to estimate the number of states, the 'branching factor' of its actions, etc.
- Know the properties that characterize a state space (finite, infinite, etc.) and the properties of the algorithms used to find solutions (completeness, soundness, optimality, etc.).
- Understand both uninformed and informed search algorithms, including breadth-first, depth-first, best-first, algorithm A, algorithm A*, iterative deepening, depth-limited, bi-directional search, beam search, uniform-cost search, etc.
- Understand local search algorithms, including hill-climbing and its variants, simulated annealing, and genetic algorithms.
- Know how to simulate these algorithms (a sketch of A* appears after this list).
- Be able to develop heuristic functions and tell which ones are admissible.
- Understand how to tell if one heuristic is more informed than another.
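To make this concrete, here is a minimal Python sketch of A* search on a toy grid. It is not code from the text; the 5x5 grid world, unit step costs, and helper names are illustrative assumptions. The Manhattan-distance heuristic is admissible here because every move costs 1, so no path can reach the goal in fewer steps than the Manhattan distance.

```python
import heapq

def astar(start, goal, neighbors, h):
    """Minimal A*: returns a list of states from start to goal, or None.
    The heuristic h must be admissible for the result to be optimal."""
    frontier = [(h(start), 0, start, [start])]   # entries are (f, g, state, path)
    best_g = {start: 0}                          # cheapest cost found to each state
    while frontier:
        f, g, state, path = heapq.heappop(frontier)
        if state == goal:
            return path
        for nxt, cost in neighbors(state):
            g2 = g + cost
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(frontier, (g2 + h(nxt), g2, nxt, path + [nxt]))
    return None

# Toy 5x5 grid with 4-connected moves of cost 1.
def neighbors(s):
    x, y = s
    return [((x + dx, y + dy), 1) for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]
            if 0 <= x + dx < 5 and 0 <= y + dy < 5]

goal = (4, 4)
h = lambda s: abs(s[0] - goal[0]) + abs(s[1] - goal[1])   # Manhattan distance
print(astar((0, 0), goal, neighbors, h))
```

Replacing h with a function that always returns 0 turns this into uniform-cost search, which is a good way to see how an informed heuristic focuses the search.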
Chapter 5: Constraint Satisfaction Problems
- Understand the basics of CSP, including variables, domains, constraints.
- Be able to take a problem description and set it up as a CSP. For example, identify a reasonable set of variables, indicate the domain of each, and describe the constraints that must hold on variables or sets of variables.
- Understand the forward checking and AC-3 algorithms and be able to simulate them (a sketch of AC-3 appears after this list).
- Understand the min-conflicts algorithm and be able to simulate it.
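Here is a minimal sketch of AC-3, assuming binary constraints are given as consistency-checking functions (that representation is an illustrative choice, not the book's pseudocode). Note that the toy example ends up arc consistent even though it has no globally consistent solution -- arc consistency prunes domains but does not by itself solve the CSP.

```python
from collections import deque

def ac3(domains, constraints):
    """Minimal AC-3. domains: var -> set of values. constraints: (X, Y) ->
    function ok(x, y) saying whether that pair of values is consistent.
    Prunes domains in place; returns False if a domain becomes empty."""
    queue = deque(constraints)                    # start with every arc (X, Y)
    while queue:
        X, Y = queue.popleft()
        ok = constraints[(X, Y)]
        # Remove values of X that have no supporting value in Y's domain.
        removed = {x for x in domains[X] if not any(ok(x, y) for y in domains[Y])}
        if removed:
            domains[X] -= removed
            if not domains[X]:
                return False                      # inconsistency detected
            # X's domain shrank, so re-examine arcs pointing at X.
            queue.extend((Z, W) for (Z, W) in constraints if W == X and Z != Y)
    return True

# Toy example: three variables that must pairwise differ, each with domain {1, 2}.
ne = lambda a, b: a != b
doms = {"A": {1, 2}, "B": {1, 2}, "C": {1, 2}}
cons = {(x, y): ne for x in doms for y in doms if x != y}
print(ac3(doms, cons), doms)   # True -- arc consistent, yet no solution exists
```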
Chapter 6: Adversarial search (and 17.6: Game theory)
- Understand the basic characteristics of games
- Understand and be able to simulate Minimax with and without alpha-beta pruning, given a game tree and the values of the static evaluation function on the leaves (a sketch appears after this list).
- Be able to take a game and develop a representation for it and its moves and to describe a reasonable static evaluation function.
- Understand how to handle games with uncertainty.
- Be familiar with the basic concepts of game theory -- strategies, payoffs, Nash equilibria, the prisoner's dilemma, dominant strategies, etc. -- and how and when they come into play.
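Here is a minimal sketch of Minimax with alpha-beta pruning. Encoding the game tree as nested lists with static-evaluation values at the leaves is an illustrative assumption, but it matches the kind of tree you would simulate by hand on an exam.

```python
def minimax(node, alpha, beta, maximizing):
    """Alpha-beta Minimax over a tree given as nested lists, where each
    leaf is a static-evaluation value. Returns the backed-up value."""
    if not isinstance(node, list):                # leaf: static evaluation
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, minimax(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:                     # beta cutoff: MIN avoids this node
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, minimax(child, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:                     # alpha cutoff: MAX avoids this node
                break
        return value

# A MAX root with three MIN children; the backed-up value is 3. Note how the
# second MIN node is pruned after its first leaf, since 2 <= alpha = 3.
tree = [[3, 12, 8], [2, 4, 6], [14, 5, 2]]
print(minimax(tree, float("-inf"), float("inf"), True))   # -> 3
```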
Some old midterms
Here are some old 471 midterm exams and a few from 671, the graduate version of 471.
Final exam
The final exam will be given from 1:00pm to 3:00pm Friday, May 17 in our classroom, Fine Arts 210.
The final will be comprehensive with more emphasis on material since the midterm exam. Review the slides we showed in class, the homework assignments, and the old exams I've given in previous semesters.
Chapters 7 and 8: Logical Agents (7.1-7.6; 8.1-8.2; notes)
- Understand how an agent can use logic to model its environment, make decisions, and achieve goals (e.g., a player in the Wumpus World)
- Understand the syntax and semantics of propositional logic
- Understand the concept of a model for a set of propositional sentences
- Understand the concept of a valid sentence (tautology) and an inconsistent sentence
- Know how to find all models of a set of propositional sentences
- Understand the concepts of soundness and completeness for a logic reasoner
- Understand the resolution inference rule and how a resolution theorem prover works
- Know what a Horn clause is in propositional logic, how to determine whether a propositional sentence is a Horn clause, and why Horn clauses are significant
- Know how to convert a set of propositional sentences to conjunctive normal form (CNF) and then use resolution to try to prove that an additional propositional sentence is true (a sketch appears after this list)
- Understand the limitations of propositional logic as a representation and reasoning tool
- Know what it means for a proposition to be satisfiable
- Understand first-order logic (FOL), its notation(s) and quantifiers
- Understand how FOL differs from higher-order logics
- Be able to represent the meaning of an English sentence in FOL and to paraphrase a FOL sentence in English
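To make resolution concrete, here is a minimal sketch of a propositional resolution refutation prover. Representing clauses as frozensets of string literals is an illustrative choice, and the input is assumed to already be in CNF.

```python
from itertools import combinations

def resolve(ci, cj):
    """All resolvents of two clauses. A clause is a frozenset of literals;
    a literal is a string such as 'P' or its negation '-P'."""
    out = []
    for lit in ci:
        neg = lit[1:] if lit.startswith("-") else "-" + lit
        if neg in cj:                              # complementary pair found
            out.append(frozenset((ci - {lit}) | (cj - {neg})))
    return out

def entails(kb, negated_query):
    """Resolution refutation: KB entails the query iff KB plus the negated
    query derives the empty clause."""
    clauses = set(kb) | set(negated_query)
    while True:
        new = set()
        for ci, cj in combinations(clauses, 2):
            for r in resolve(ci, cj):
                if not r:                          # empty clause: contradiction
                    return True
                new.add(r)
        if new <= clauses:                         # nothing new: no proof exists
            return False
        clauses |= new

# KB: (P -> Q) and P, in CNF: {-P, Q} and {P}. Query: Q, so add its negation {-Q}.
kb = [frozenset({"-P", "Q"}), frozenset({"P"})]
print(entails(kb, [frozenset({"-Q"})]))            # -> True
```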
Chapter 10: Planning (10.1-10.2; notes)
- Understand the blocks world domain
- Understand classical STRIPS planning (a sketch of the STRIPS state-update rule appears after this list)
- Understand algorithms for state-space search
- Understand the partial-order planning approach and its advantages
- Be familiar with the PDDL planning representation
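Here is a minimal sketch of the core STRIPS idea, where an action is a triple of precondition, add, and delete sets of ground facts and a state is a set of facts; the blocks-world predicates and names are illustrative assumptions.

```python
# A STRIPS action is (preconditions, add list, delete list), all sets of facts.
def applicable(state, action):
    pre, add, delete = action
    return pre <= state                      # all preconditions hold in the state

def apply_action(state, action):
    pre, add, delete = action
    return (state - delete) | add            # the STRIPS state-update rule

# Blocks world: stack block A (currently on the table) onto block B.
state = {("on_table", "A"), ("clear", "A"), ("on_table", "B"), ("clear", "B")}
stack_A_on_B = (
    {("clear", "A"), ("clear", "B"), ("on_table", "A")},   # preconditions
    {("on", "A", "B")},                                    # add list
    {("on_table", "A"), ("clear", "B")},                   # delete list
)
if applicable(state, stack_A_on_B):
    state = apply_action(state, stack_A_on_B)
print(sorted(state))   # A is now on B, and B is no longer clear
```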
Chapter 18: Learning from Examples (18.1-18.6; 18.9-18.11; notes)
- Supervised vs. unsupervised ML
- Tasks: regression and classification
- Regression: linear and logistic
- Decision trees
  - entropy and information gain (a sketch of computing these appears after this list)
  - ID3 algorithm using information gain
  - advantages and disadvantages of decision trees
- Support Vector Machines (SVMs)
  - linear separability of data
  - margin and support vectors
  - soft margin for allowing non-separable data
- Tools
- ML methodology
  - separate development, test, and validation data
  - k-fold cross validation
  - metrics: precision, recall, accuracy, F1
  - learning curve
  - confusion matrix
- ML ensembling
  - bagging and its variants
  - random forests of decision trees
  - advantages
- Unsupervised ML
  - clustering data
    - k-means clustering
    - hierarchical clustering
      - dendrogram
      - bottom-up agglomerative vs. top-down divisive
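Here is a minimal sketch of computing entropy and information gain, the quantities ID3 uses to choose which attribute to split on; the tiny data set is made up purely for illustration.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """H(S) = -sum_i p_i * log2(p_i) over the class distribution of labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Gain(S, A) = H(S) - sum_v (|S_v| / |S|) * H(S_v), splitting on attr."""
    n = len(labels)
    splits = {}
    for row, y in zip(rows, labels):
        splits.setdefault(row[attr], []).append(y)
    remainder = sum(len(sub) / n * entropy(sub) for sub in splits.values())
    return entropy(labels) - remainder

# Made-up examples: 'outlook' predicts the label perfectly, 'windy' not at all.
rows = [{"outlook": "sun", "windy": False}, {"outlook": "sun", "windy": True},
        {"outlook": "rain", "windy": False}, {"outlook": "rain", "windy": True}]
labels = ["yes", "yes", "no", "no"]
print(info_gain(rows, labels, "outlook"))   # 1.0: a perfect split
print(info_gain(rows, labels, "windy"))     # 0.0: no information
```

ID3 simply computes this gain for every remaining attribute, splits on the best one, and recurses on each subset.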
Chapter 18: Neural Networks (18.7; notes)
- Basic elements: nodes (inputs, hidden layers, outputs), connections, weights, activation function
- Types of neural networks and their advantages/disadvantages/purpose
  - basic perceptron (single layer, step activation function); a minimal sketch appears after this list
  - MLP: multi-layer perceptron
  - feed-forward networks
  - RNN: recurrent neural network
  - CNN: convolutional neural network
- Training process
  - loss function
  - backpropagation
  - activation functions (step, ReLU, sigmoid, tanh)
  - batches and epochs
  - dropout
- Awareness of tools: TensorFlow, Keras
- Advantages and disadvantages of neural networks for supervised machine learning compared to other methods (e.g., decision trees, SVMs)
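Here is a minimal sketch of training a basic perceptron (single layer, step activation) with the perceptron learning rule on the linearly separable AND function; the learning rate and epoch count are illustrative choices.

```python
def step(x):
    return 1 if x >= 0 else 0

def train_perceptron(data, lr=0.1, epochs=20):
    """Perceptron learning rule: w += lr * (target - output) * input."""
    w = [0.0, 0.0]                         # one weight per input
    b = 0.0                                # bias weight
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = step(w[0] * x1 + w[1] * x2 + b)
            err = target - out             # 0 when the example is classified correctly
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]   # logical AND
w, b = train_perceptron(data)
for (x1, x2), target in data:
    print((x1, x2), "->", step(w[0] * x1 + w[1] * x2 + b), "target:", target)
```

Because AND is linearly separable, the perceptron converges; XOR is not, which is exactly why multi-layer networks with nonlinear activations are needed.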
Here are old exams that you can use as examples of what to expect. The content has varied over the years, so you should ignore anything that we did not cover this semester.