Unresolved CANVAS LMS bug in algorithmic quizzes

I have been using the Canvas learning management system (LMS) for algorithmic quizzes for the last two years at the University of South Florida, and more recently in a MOOC for a course in Numerical Methods.

Due to a bug introduced in a Canvas software update, integers are now displayed with a trailing decimal point and zero.  For example, what should be written as 5 now gets written as 5.0.  Some may believe this is a minor hassle, but see what happened to statements within the quizzes.

  • 3 bits becomes 3.0 bits
  • 5 terms becomes 5.0 terms
  • N=5 becomes N=5.0
  • Represent integer 2 becomes Represent integer 2.0
  • Solve 37 simultaneous linear equations becomes Solve 37.0 simultaneous linear equations
  • Number of zeros after 2 steps becomes Number of zeros after 2.0 steps
  • A binary number is 1.001 becomes A binary number is 1.0.0.0.0.1.0
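A plausible mechanism for the main symptom (this is purely my speculation, and the function names below are made up for illustration): if the quiz engine stores every substituted value as a floating-point number and then formats it with a fixed number of decimal places, integers pick up a spurious ".0". A minimal JavaScript sketch of both the broken and the expected formatting:

```javascript
// Speculative sketch of the bug: formatting every number with a fixed
// decimal place turns the integer 5 into the string "5.0".
function naiveFormat(value) {
  return Number(value).toFixed(1); // always emits one decimal place
}

// A safer formatter: JavaScript's default number-to-string conversion
// drops the trailing ".0" from values that are mathematically integers.
function safeFormat(value) {
  return String(Number(value));
}

naiveFormat(5);    // "5.0"  <- the reported behavior
safeFormat(5);     // "5"    <- the expected behavior
safeFormat(1.001); // "1.001" (non-integers keep their decimals)
```

Note that simple float formatting would not explain the last example ("1.001" becoming "1.0.0.0.0.1.0"), which looks more like the question text itself is being mangled digit by digit; so this sketch covers only the integer cases.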

We already have students who come to a course with many misconceptions; it is discouraging that a bug in a learning management system creates even more, and it has led me to stop using the quizzes altogether.

So far, Canvas has acknowledged the bug, but what has driven me up the wall is that they have made no promise to fix it, let alone offered a timeline for doing so.  Our university officials have not been able to make much headway either.  What do you think?

UPDATE (Sunday June 20, 2016):  From our university personnel: “It’s been escalated all the way up the chain. From what I can tell it has been put on the top 10 bug list and it has been assigned an engineer.  I should have more information tomorrow morning.”

UPDATE (Sunday June 27, 2016):  From CANVAS: “We’ve deployed a fix for this issue in our Beta environment. If all goes well, it will make it to the live Production environment on July 16, 2016. This ticket will remain in an “On-Hold” status until then.”

___________


Comments on “Unresolved CANVAS LMS bug in algorithmic quizzes”

  1. I wonder if it’s a third-party integration or some code that was added locally.

    The last example suggests this isn’t a datatype issue (particularly since CoffeeScript uses a single number datatype for ints and floats). Maybe the LMS is trying to parse numbers similar to (although not exactly like) this: https://gist.github.com/lnolte/3794646

    It could also be that any formula-type or numerical-type question now uses decimal for the answer:
    https://canvas.instructure.com/doc/api/quiz_submission_questions.html (Search for “formula”; parameter type is decimal.)

    USF might or might not sync their local installation often, but you can try to get the Canvas developers to acknowledge and repair the issue at: https://github.com/instructure/canvas-lms/issues

    One last thing: is there a way to download the raw quiz data (e.g., in JSON format) from Canvas? If so, you can check whether the correct format appears in the raw data. It would still be a parsing issue, but you could narrow down whether it occurs right after a question is saved or in how Canvas displays the document to the user.

    These are all shots in the dark. Hopefully they get you closer to a solution without wasting too much of your time!
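If you do pull the raw question JSON (for instance via Canvas’s documented quiz-questions endpoint, GET /api/v1/courses/:course_id/quizzes/:quiz_id/questions), a quick script can flag which stored question_text fields already contain a spurious ".0". A hedged sketch — the helper name and sample data below are made up; only the question_text field name comes from the Canvas API docs, and the regex is a rough heuristic (it would also flag a legitimately written "5.0"):

```javascript
// Hypothetical helper: scan quiz questions (shaped like Canvas's
// quiz-questions API response) for an integer followed by ".0".
function findSpuriousZeros(questions) {
  const pattern = /\b\d+\.0\b/; // e.g. matches "2.0" in "integer 2.0"
  return questions.filter((q) => pattern.test(q.question_text));
}

// Made-up sample mimicking the API's shape.
const sample = [
  { id: 1, question_text: "Represent integer 2.0 in binary." },
  { id: 2, question_text: "Solve 37 simultaneous linear equations." },
];

findSpuriousZeros(sample); // flags only question 1
```

If the corrupted text shows up in the raw JSON, the mangling happens at storage time; if the JSON is clean, it happens at display time.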


    1. It is not a third-party issue, as the problem has been duplicated at at least one other university: https://community.canvaslms.com/groups/teaching-math-in-canvas . Canvas has acknowledged the problem but does not want to work on it, as they are building a new quiz engine: https://community.canvaslms.com/docs/DOC-4674 . There is no release date for the new quiz engine, so it is a waiting game. With a new quiz engine, in all probability, there will be bugs and legacy issues. It is frustrating to have to deal with a company like that, and they know that we will not leave them at the drop of a hat.

  3. I hope there is a fix for this soon. I teach chemistry, and significant figures are a part of that. We try to emphasize to our students how important this is. I simply can’t use formula-type questions when I make a quiz on this topic – extra zeros will make the students’ answers wrong.

    1. Hello Debra:
      Thank you! Please do what you can to get Canvas to take action. You can tweet @canvaslms or send an email to support@instructure.com. You can also contact the Canvas support staff at your university and ask them to file it as a bug. I am sure you have done some of the above. STEM instructors do not get much traction with LMS functions and bugs, as we are a minority. If you hear something, please comment here or email me at autarkaw@yahoo.com
      Best Wishes
      Autar
      http://autarkaw.com

