Doing more cool stuff with Maple TA without being a Maple Expert
Earlier I wrote a blog post on my first steps with Maple-graded questions in Maple TA, which helped me grade variations of the correct answer in students' responses.
For the next run of the class we piloted last year, we wanted to improve the grading.
Our goal: provide automatic partial grading of small mistakes, thus boosting student confidence during the exam and saving time on grading it.
I managed to simplify the grading code with help from the Maple TA Community. I used 'wildcards' to capture the essence of a formula, and used the Maple substitution functions (subs and algsubs) to allow for the various notations students might use for the correct response. If you want to see the code I used: visit the Maple TA and Mobius Community (registration required) and search for my posts (@metahofzicht).
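To give a flavour of the approach, here is a minimal sketch in Maple syntax. The names and formulas are illustrative, not my actual exam code; in a Maple-graded question, $RESPONSE holds the student's answer and $ANSWER the model answer.

# Map the different notations students use onto one canonical name:
resp := subs([V__out = Vout, V[out] = Vout], $RESPONSE):
# algsubs matches algebraic patterns (here a sum) rather than literal names:
resp := algsubs(R1 + R2 = Rtot, resp):
# Full marks if the normalized response is equivalent to the model answer;
# for partial credit, return a number between 0 and 1 instead of true/false.
evalb(simplify(resp - $ANSWER) = 0);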
At first I was directed towards using procedures in the algorithm section of Maple TA. The example I got via the forum was brilliant (it even made checking the code easy), but I got stuck when using multiple response fields in my questions. (I still need to ask an expert what went wrong.) Moving the grading to the grading code section of the response field resolved the issue.
Unfortunately it introduced another hurdle: I had to start using Maple to check my code, relying on its debugging functionality, since regrettably TA lacks this essential feature. The disadvantage of debugging in Maple is that the syntax differs between the two programs. Some functions are 'translated' automatically when you copy/paste them from Maple into TA, but not all of them (especially not the parameter notation).
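One concrete (illustrative) example of such a difference: in Maple you test with plain names, while in TA the algorithm variables carry a $ prefix, so a line that runs fine in Maple has to be adapted before pasting it into the grading code.

# Tested in Maple, with plain names for the question parameters:
evalb(simplify(resp - (a*x^2 + b*x)) = 0);
# The same check in the TA grading code, where algorithm variables
# are written with a $ prefix:
# evalb(simplify($RESPONSE - ($a*x^2 + $b*x)) = 0);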
The main advantages of the new way of coding were that it was much easier to read and to correct typing mistakes in the formulas, and, since the code was a lot shorter, it ran faster. In the first exam (June 2016) some questions had taken quite a while to load.
This year's exam and retake went very smoothly. Firstly, our lengthy discussions in the previous year on how to pose the questions (learning goals, texts, scoring and presentation) provided us with a blueprint for this particular course. This helped the instructor create new exam questions more quickly, and in a format that made clear how to do the programming and grading. Secondly, the instructor tested the exam extensively beforehand to minimize the number of mistakes afterwards. Thirdly, the students came in better prepared, since they had learned how to use TA and had practiced its syntax from day 1 of the course; they also had access to three example exams to practice with.
Concluding
Grading the exam was quick, even though the number of participants exceeded expectations (500+). This turned out to be the biggest time saver, despite some small corrections being needed. Being able to regrade a question would certainly have sped up these corrections, but for now we can conclude that digital testing really can save time, albeit only after a big up-front investment of time.
If you want to know more about the pilot and its findings, you can find an article about it on Maplesoft's website in the Technical Research Papers section: Digital Testing in Engineering courses.