Journal: Proceedings of the Canadian Engineering Education Association
Year: 2017
Publisher: The Canadian Engineering Education Association (CEEA)
Abstract: An assessment consists of questions that address the required learning outcomes of a course. If a pool of questions of various types is available, assessment design reduces to selecting questions, one by one, from that pool. Because the number of possible questions for a course can be quite large and several preferences must be matched simultaneously, manual selection of a suitable question is impractical. This paper presents an enhanced implementation of a previously presented assessment-design methodology, applied to a Hydraulics course with an initial pool of 1,000 questions. Each question is tagged with a set of attributes, and the selection rules are generated by the expert system itself. A relevance score is introduced, and the enhanced implementation displays a set of candidate questions with their relevance scores, rather than a single question, so that the instructor can choose among them. An instance of MS SQL Server on Azure is used for the web-based cloud implementation.
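As a rough illustration of the attribute tagging and relevance scoring described in the abstract, the sketch below ranks questions from a pool against an instructor's preferences. The attribute names, weights, and matching scheme are assumptions made for illustration; the paper's actual rule base and scoring formula are not reproduced here.

```python
from dataclasses import dataclass, field

# Hypothetical attribute set; the paper's real tags (e.g. learning outcome,
# question type, difficulty, topic) are assumed here for illustration only.
@dataclass
class Question:
    qid: int
    outcome: str        # learning outcome the question addresses
    qtype: str          # e.g. "multiple-choice", "numerical", "essay"
    difficulty: int     # assumed scale 1 (easy) .. 5 (hard)
    topic: str

@dataclass
class Preferences:
    outcome: str
    qtype: str
    difficulty: int
    topic: str
    # Assumed weights; the paper does not specify how attributes are weighted.
    weights: dict = field(default_factory=lambda: {
        "outcome": 0.4, "qtype": 0.2, "difficulty": 0.2, "topic": 0.2})

def relevance(q: Question, p: Preferences) -> float:
    """Score a question against the instructor's preferences (0..1).

    A simple weighted attribute match, used only as a stand-in for the
    relevance score described in the paper.
    """
    score = 0.0
    score += p.weights["outcome"] * (q.outcome == p.outcome)
    score += p.weights["qtype"] * (q.qtype == p.qtype)
    # Difficulty matched gradually: closer difficulty gives a higher contribution.
    score += p.weights["difficulty"] * (1 - abs(q.difficulty - p.difficulty) / 4)
    score += p.weights["topic"] * (q.topic == p.topic)
    return score

def top_candidates(pool, prefs, k=5):
    """Return the k highest-scoring questions so the instructor can choose."""
    ranked = sorted(pool, key=lambda q: relevance(q, prefs), reverse=True)
    return [(q.qid, round(relevance(q, prefs), 2)) for q in ranked[:k]]

if __name__ == "__main__":
    pool = [
        Question(1, "apply-continuity-equation", "numerical", 3, "pipe-flow"),
        Question(2, "apply-continuity-equation", "multiple-choice", 2, "pipe-flow"),
        Question(3, "analyse-open-channel-flow", "numerical", 4, "open-channel"),
    ]
    prefs = Preferences("apply-continuity-equation", "numerical", 3, "pipe-flow")
    print(top_candidates(pool, prefs, k=3))
```

In the described system the candidate list with scores would be presented to the instructor through the web interface, with the question pool itself held in the MS SQL Server database on Azure rather than in memory as in this sketch.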