Should we get into a tizzy about students meeting deadlines?

April 7, 2024

In a recent paper on extending deadlines for student assignments, researchers point out that it is not an issue we need to sweat over.

“This study uses evidence to debunk common misconceptions about assignment extensions.”
“The extension without penalty system was used by 78% of the students, but half of them only used it once.”

My two cents: There is always a happy medium between being strict and being lenient. Extending deadlines for everyone is fairer than extending them only for those who ask or who have a reasonable excuse. Many extroverts get ahead simply because they ask: are we rewarding behavior or learning? At the same time, one should give a fixed number of unexcused deadline extensions so that students are not forced to expose private issues.

In a learning management system (LMS), one can set a deadline and then an “open until” date. The two can act as the deadline and the extended deadline, respectively. I did this for a few assignments in a course many years ago, and the “open until” date simply became the deadline. Students catch on fast, and it made no difference: less than 10% of the students submitted by the original deadline. The extended deadline does bug students, though, as the “open until” date does not show up on their calendar, and they must manually keep track of deadlines. Oh, the travesty.

Effect of Multiple Chance Testing on Student Performance and Perception

April 6, 2024

We just published an article in the International Journal of Engineering Education on multiple chance testing in an engineering course.

A. Kaw, R. Clark, “Effects of Standards-Based Testing via Multiple-Chance Testing on Cognitive and Affective Outcomes in an Engineering Course,” International Journal of Engineering Education, Vol. 40, (2), 2024, pp. 303-321. 

The article is behind a paywall, but you can ask me to send a preprint.

In this article, we explore the concept of standards-based grading and its potential benefits for student learning. We also discuss my reservations about adopting standards-based grading in a large-enrollment class and propose an alternative approach: standards-based testing via multiple-chance testing. The findings of our study indicate that implementing multiple-chance testing resulted in higher student performance, more ‘A’ grades, and a more positive classroom environment. Students appreciated the enhanced learning experience, the opportunity for retakes, and the reduced stress associated with standards-based testing. However, some students mentioned the issue of not knowing their ongoing overall grade in the course. I believe the article will interest readers because it provides a practical approach to implementing standards-based grading principles in large-enrollment classes. It also raises important questions about using multiple-chance testing and its potential advantages and drawbacks for students and instructors.

Abstract: Multiple-chance testing was used to conduct standards-based testing in a blended-format numerical methods course for engineering undergraduates. The process involved giving multiple chances on tests and post-class learning management system quizzes. The effectiveness of standards-based testing was evaluated through various forms of assessment, including an analysis of cognitive and affective outcomes, and compared to a blended classroom that did not use standards-based testing. Based on a two-part final exam, a concept inventory, final course grades, a classroom environment inventory, and focus groups, the results showed that standards-based testing had overall positive effects. Standards-based testing was associated with a larger percentage of students (15% vs. 3%) earning a high final exam score, a higher proportion of A grades (36% vs. 27%), and a better classroom environment on dimensions of involvement, cohesiveness, and satisfaction. Focus group discussions revealed that students appreciated the benefits of enhanced learning, second chances, and reduced stress with standards-based testing. The study also included an analysis of the impact of standards-based testing on underrepresented minorities, Pell Grant recipients (low socioeconomic groups), and low-GPA students, as well as an examination of test-retaking behaviors. The methodology and comprehensive results of the study are presented in this paper.

How do I solve a first-order ODE numerically in MATLAB?

The other day, a student came to ask me for help in solving a first-order ordinary differential equation using the ode45 routine of MATLAB. To use ode45, one needs to be familiar with how MATLAB expects the inputs.

Solving a first-order ODE with ode45 is straightforward.

The ODE is given as
3*dy/dx+7*y^1.2=5*x^1.1, y(2)=13.
Find the value of y(19).

Program without comments

clc
clear all
dydx=@(x,y) (5*x^1.1-7*y^1.2)/3
[xx,yy]=ode45(dydx,[2,19],13);
n=length(yy);
yend=yy(n)


Program with comments

clc
clear all
% Assume that you are given a first-order differential equation
% 3*dy/dx+7*y^1.2=5*x^1.1, y(2)=13.
% Find the value of y(19).
% How would you solve it by using the ode45 MATLAB function?
% SOLUTION
% First you would need to rewrite ODE as dy/dx=f(x,y) form
% dy/dx=(5*x^1.1-7*y^1.2)/3, y(2)=13
% Define a variable of your choice and write what dy/dx is.
% @(x,y) means these are the independent and dependent variables in ODE
dydx=@(x,y) (5*x^1.1-7*y^1.2)/3
% Look at the ode45 help in MATLAB
% Left hand side yy vector is where you want the values stored.
% xx is the vector that will be chosen by MATLAB, not you, at which it will
% provide you the value of yy vector.
% Inputs to ode45 are the following.
% 1) dydx is the ODE written in the form dy/dx=f(x,y) as defined above.
% 2) [2,19] is the span of xx values. You can observe x=2 is where the
% initial condition is given and x=19 is the value at which you are
% seeking the y value at. These two inputs can be variables too.
% 3) Last input is value of the initial condition that is given at x=2.
% This input can be a variable too.
[xx,yy]=ode45(dydx,[2,19],13);
% Since last entry of yy vector would be where the xx span ends, and that
% is where you want to find the value of y
n=length(yy);
yend=yy(n)
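For readers without MATLAB, the same computation can be sketched in Python. This is a minimal fixed-step classical Runge-Kutta (RK4) integrator, not ode45's adaptive Dormand-Prince scheme, so treat it as an illustration of the setup rather than an equivalent.

```python
# Solve 3*dy/dx + 7*y^1.2 = 5*x^1.1, y(2) = 13, and estimate y(19).
# A fixed-step RK4 sketch; MATLAB's ode45 uses an adaptive method instead.

def dydx(x, y):
    # Rewrite the ODE in the form dy/dx = f(x, y)
    return (5 * x**1.1 - 7 * y**1.2) / 3

def rk4(f, x0, y0, x_end, n_steps=1700):
    """Classical 4th-order Runge-Kutta with n_steps equal steps."""
    h = (x_end - x0) / n_steps
    x, y = x0, y0
    for _ in range(n_steps):
        k1 = f(x, y)
        k2 = f(x + h / 2, y + h * k1 / 2)
        k3 = f(x + h / 2, y + h * k2 / 2)
        k4 = f(x + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
    return y

yend = rk4(dydx, 2.0, 13.0, 19.0)
print(yend)
```

If SciPy is available, its solve_ivp function with method='RK45' would be the closer counterpart to ode45.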

Journal paper on use of adaptive learning in flipped classrooms published

August 28, 2023

As part of various instructional approaches, adaptive learning platforms are becoming increasingly popular. In this article, we focus specifically on the use of adaptive learning in personalized, preclass learning for a flipped classroom. In the past, data on student engagement with course content generated by adaptive learning platforms was not easily accessible. However, this data is now proving to be invaluable in gaining a deeper understanding of the learning process and improving it.

Our goal is to examine the connection between interactions on adaptive learning platforms and overall student success. We also aim to identify the variables that have the greatest impact on student success. To achieve this, we conducted a comprehensive analysis of adaptive learning platform data gathered from a Numerical Methods course. Our analysis included aggregate statistics, frequency analysis, and Principal Component Analysis. We used this analysis to identify the variables that exhibited the most variability and provided the most information in the data. We then explored naturally occurring clusters of students using the Partitioning Around Medoids clustering approach.

We found that overall performance in the course, as measured by the final course grade, is strongly associated with two factors: (1) the behavioral interactions of students with the adaptive platform and (2) their performance on the adaptive learning assessments. We also discovered distinct student clusters that exhibited different behaviors and success in the course. This information can be used to identify students who require more support and to design evidence-based strategies to support these students.

Reference:
Yalcin, Ali, Autar Kaw, and Renee Clark. “On learning platform metrics as markers for student success in a course.” Computer Applications in Engineering Education (2023).  https://doi.org/10.1002/cae.22653

Asking ChatGPT to look at my student evaluations to give me advice on improving my teaching

I took the student evaluations of my course in Computational Methods taught to juniors in engineering and asked ChatGPT to summarize how I can improve my teaching.

Based on the student evaluations, here are some key points to consider for improving your teaching:

  • Adaptive Lessons: Students found the adaptive lessons to be beneficial as they encouraged independent learning and helped them overcome their fear of failure. Consider keeping or modifying this approach to continue fostering self-directed learning.
  • Course Structure: The course was praised for its structure, but some students felt that it covered too much material. Consider evaluating the course content to determine if any sections can be condensed or removed to enhance learning outcomes.
  • MATLAB Coding: Some students desired more time dedicated to MATLAB coding. Consider incorporating additional coding exercises or providing supplementary resources to further develop students’ programming skills.
  • Pre-Tests and Testing Materials: Students expressed a desire for more comprehensive pre-tests and supplemental testing materials. Consider providing multiple pre-tests and additional practice materials to help students prepare for exams effectively.
  • Real-Life Examples: Students appreciated when the course material was related to real-life examples. Continue incorporating practical applications to enhance student understanding and engagement.
  • Formula Sheet and Exam Format: Students suggested allowing the use of a formula sheet and including more applicable problems in exams rather than focusing solely on memorization. Consider revising exam formats and providing formula sheets to support students’ problem-solving skills.
  • Office Hours and Support: Students appreciated the availability of office hours and additional help sessions. Continue offering these resources to ensure students have ample opportunities to seek clarification and assistance.
  • Flipped Classroom Approach: While some students appreciated the flipped classroom approach, others found it redundant. Consider evaluating the effectiveness of this method and potentially modifying it based on student feedback.
  • Consistency in Interaction: Students mentioned inconsistencies in how you interacted with students inside and outside the class, raising concerns about fairness. Strive to maintain consistency in your interactions with students to ensure a fair and inclusive learning environment.
  • Computer Projects: Some students found the computer projects lengthy and confusing. Consider providing clearer instructions and support for these projects to enhance the learning experience.
  • Student Engagement: Students appreciated your efforts to engage them and create an enjoyable learning environment. Continue employing varied teaching methods, incorporating humor, and making connections with students to foster their interest and participation.
  • Feedback and Improvement: Students appreciated your dedication to improvement and willingness to listen to their opinions. Continue seeking student feedback and implementing changes accordingly to enhance the learning experience.
  • Clarity of Explanation: Some students found your explanations challenging to follow due to speaking too fast. Aim for clarity in your delivery, allowing students to absorb and understand the concepts effectively.
  • Feedback and Availability: Students appreciated your knowledge and availability outside of class. Continue providing clear feedback and being accessible to students, fostering their learning and growth.
  • Cultural Sensitivity: Be mindful of cultural sensitivity and ensure your teaching approach is respectful and unbiased toward all students.

Overall, the evaluations indicate that you are a knowledgeable and caring professor who puts effort into supporting student learning. By addressing the mentioned areas of improvement, you can further enhance the learning experience for your students.

A VBA function for Cohen’s effect size

While many social scientists get excited about reporting p-values of less than 0.05 when comparing two groups, they often forget to report the effect size. One can get a low p-value just by having a very large sample size, but it is the effect size that tells you how much pragmatic difference there is between the two groups. Although there are simple formulas out there for effect size, it is better to have one that handles two groups of unequal size. I wrote a VBA function for Cohen's effect size that you can use readily. You can download it from here. If you are not familiar with how to enter a VBA program, here is a quick tutorial. You just have to save the Excel file as a .xlsm file.

An example of usage is
=EffectSizeCohen(A23:A78,B24:B67)
where A23:A78 has the control group numbers and B24:B67 has the experimental group numbers.

Function EffectSizeCohen(ControlGroup As Range, ExperimentalGroup As Range) As Variant
'This function finds the Cohen's effect size given the numbers from the control
'and experimental group
'INPUTS
'ControlGroup: Excel range of numbers for control group, e.g. A120:A230
'ExperimentalGroup: Excel range of numbers for experimental group, e.g. A120:A230
'OUTPUTS
'Cohen's effect size.
'See formula at
'https://www.statisticssolutions.com/free-resources/directory-of-statistical-analyses/effect-size/
'USAGE
'EffectSizeCohen(A23:A78,B24:B67)

'Putting the ranges in a column vector. Note that Excel stores it as a two-dimensional matrix though
ControlData = ControlGroup.Value
ExperimentalData = ExperimentalGroup.Value

'Number of entries of the two groups
ControlLength = ControlGroup.Count
ExperimentalLength = ExperimentalGroup.Count

'Calculating the average and standard Deviation of control group
ControlSum = 0
For i = 1 To ControlLength
ControlSum = ControlSum + ControlData(i, 1)
Next i
ControlAvg = ControlSum / ControlLength

ControlVar = 0
For i = 1 To ControlLength
ControlVar = ControlVar + (ControlData(i, 1) - ControlAvg) ^ 2
Next i
ControlStd = (ControlVar / (ControlLength - 1)) ^ 0.5

'Calculating the average and standard deviation of experimental group
ExperimentalSum = 0
For i = 1 To ExperimentalLength
ExperimentalSum = ExperimentalSum + ExperimentalData(i, 1)
Next i
ExperimentalAvg = ExperimentalSum / ExperimentalLength
ExperimentalVar = 0
For i = 1 To ExperimentalLength
ExperimentalVar = ExperimentalVar + (ExperimentalData(i, 1) - ExperimentalAvg) ^ 2
Next i
ExperimentalStd = (ExperimentalVar / (ExperimentalLength - 1)) ^ 0.5

'Calculating the Cohen's effect size
'See formula at https://www.statisticssolutions.com/free-resources/directory-of-statistical-analyses/effect-size/
StdMean = (((ControlLength - 1) * (ControlStd) ^ 2 + (ExperimentalLength - 1) * (ExperimentalStd) ^ 2) / (ControlLength + ExperimentalLength)) ^ 0.5
EffectSizeCohen = (ExperimentalAvg - ControlAvg) / StdMean
End Function
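For readers who prefer Python, here is an illustrative translation. It mirrors the VBA function above, including its pooled standard deviation with an n1 + n2 denominator (some texts use n1 + n2 - 2 instead).

```python
# Cohen's effect size for two groups of (possibly) unequal size,
# mirroring the VBA function above (pooled SD uses an n1 + n2 denominator).
from math import sqrt

def effect_size_cohen(control, experimental):
    """Return (experimental mean - control mean) / pooled standard deviation."""
    n1, n2 = len(control), len(experimental)
    mean1 = sum(control) / n1
    mean2 = sum(experimental) / n2
    # Sample variances (n - 1 denominator), as in the VBA code
    var1 = sum((v - mean1) ** 2 for v in control) / (n1 - 1)
    var2 = sum((v - mean2) ** 2 for v in experimental) / (n2 - 1)
    pooled_sd = sqrt(((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2))
    return (mean2 - mean1) / pooled_sd

# Example: a one-unit mean shift with identical spread in both groups
d = effect_size_cohen([1, 2, 3, 4, 5], [2, 3, 4, 5, 6])
print(round(d, 4))  # → 0.7071
```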

Using Poll Everywhere at the University of South Florida

Introduction: I have been using Poll Everywhere in my Numerical Methods course for a year.  As per their website, “Poll Everywhere is a web-based audience response system that lets speakers embed live activities directly into their presentations. Using a variety of activity types, you can turn a series of slides into an effective, interactive experience. Get to know participants, gauge their knowledge of a specific topic, and capture their valuable feedback at scale.”

I use Poll Everywhere to ask mostly conceptual questions in class, implemented as a think-pair active-learning strategy. Students first answer a question by themselves (think); if less than 85% of the class chooses the correct answer, each student pairs up with another student for discussion (pair). The poll is then retaken, and the instructor wraps up by discussing the question.

In my class, student participation is not recorded in any form, nor is it part of the grade. This makes Poll Everywhere quicker for the instructor to learn and implement, while students engage with the content without the pressure of a grade or of being singled out. Do students still participate? Yes. Do all students participate? No. If you want to learn how to collect participation data and integrate it with the Canvas gradebook, click here.

Getting Started at USF: The Getting Started document written specifically for the USF faculty is here.  This document will help you set up the Poll Everywhere account.

Do not forget to change your username to something short; I advise against using a course name or anything course-specific, as last names work well. To change the username, log in to Poll Everywhere, click on Profile, and change the default name. I changed mine to kaw. When students in my class participate in a poll, they simply go to a browser, enter https://pollev.com/kaw, and respond to the presented question.

Although you can use Poll Everywhere directly via a web browser, I use it only through PowerPoint (PPT). This post is thus limited to the PPT implementation.

To use Poll Everywhere in a PPT, you have to download the Poll Everywhere add-in. You can download it here for Windows and here for Mac. Once you execute the downloaded file, Poll Everywhere should appear as an option in the top ribbon of the PPT menu, as shown in the figure below. If it does not show up in the menu, go to File > Options > Add-ins and add it. Microsoft has a step-by-step guide for adding an add-in.

Inserting a question in PPT: The directions to insert a poll audience question in PowerPoint are given in the YouTube video below.

For PCs

For Macs

Start small with only multiple-choice questions, and you can explore other types of questions later.  In the above video, inserting an already-made activity is shown.  The directions on how to make such an already-made activity at the Poll Everywhere website are given here.

Visual Settings: When you make an activity, such as a multiple-choice question, you need to choose “Visual settings” in the right-hand menu of the activity. These are the choices I use to keep it simple.

I only allow website response.  Make sure the activity is unlocked.


I do not place any audience restrictions, and participants are identified only by a screen name of their choosing.


I allow a student to respond only one time and let them choose only one option in multiple-choice questions.  The second option will need to be “as many times as they like” if multiple answers need to be picked.

For more information about activity settings, go here.


Using it in Class: When you wish to ask students a question, present the PPT in presentation mode. Give the polling link to the students so that they can respond (it will show up at the top of the slide; see the figure below). They can use their mobile devices to respond, and a good practice is to count down 5-4-3-2-1 when you think enough time has been given. You can then click on “Responses” when you hover over the slide with your mouse. If you want the students to retake a poll, click More > Clear the responses. You can show the correct answer by clicking “Correctness.”

Other Resources: The video below was highly beneficial to me in getting an overall view of Poll Everywhere. It is an hour long, but you will learn faster by watching it.

There is an advanced video as well if you so want to venture.

And do not hesitate to send an email to support@polleverywhere.com about an issue or a question.  They were most helpful.




Balancing the social mobility index and reputation rankings

Autar Kaw
December 19, 2022

Social mobility is becoming an ever more popular criterion for evaluating university education. However, the reputation of a university matters as well because it attracts high-caliber students, faculty, and staff, and hence, presumably, quality education and access to high-impact practices such as research experiences, internships, cooperative education, and capstone courses.

I wanted to see how well my university, the University of South Florida (USF), balances the two issues of social mobility and reputation. USF is a Carnegie R1 institution, meaning that it is categorized as a university with “very high research activity.” There are 146 R1 universities, of which 106 are public (strictly, there are 107, but one of them, the Graduate Center, CUNY, admits only graduate students).

This post is limited to the Carnegie R1 public universities. I gave the same weight to two rankings: the US News and World Report ranking and the Economic Mobility Index ranking. The US News and World Report ranking mostly uses reputation in its calculations; other factors include graduation and retention rates, selectivity of the incoming class, alumni giving rate, etc., which are substantially influenced by the wealth and income of students' families. The Economic Mobility Index ranking measures social mobility via out-of-pocket expenses, the salary boost due to a college degree, and the time after graduation needed to recoup the money spent on college.

The weighted ranking is not hard science, but it gives us a glimpse of where we stand. We at the University of South Florida are No. 17 out of 106 public Carnegie R1 universities. Seven of the top 10 universities belong to the University of California system, while Florida is not far behind, holding spots 11, 13, 17, 22, and 25 in this combined ranking system.
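The equal-weight combination can be sketched as follows. The school names and ranks here are made up for illustration; they are not the actual rankings behind the list below.

```python
# Combine two rankings with equal weight: average each school's two ranks,
# then re-rank by that average. All data below is hypothetical.

usnews_rank = {"School A": 1, "School B": 5, "School C": 3}
mobility_rank = {"School A": 4, "School B": 2, "School C": 3}

avg = {s: (usnews_rank[s] + mobility_rank[s]) / 2 for s in usnews_rank}
combined = sorted(avg, key=lambda s: avg[s])  # best (lowest) average first

for place, school in enumerate(combined, start=1):
    print(place, school, avg[school])
```

With these made-up numbers, School A (average rank 2.5) edges out School C (3.0) and School B (3.5).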

University of California, Irvine 1
University of California, San Diego 2
University of Illinois Urbana-Champaign 3
University of California, Davis 4
University of California, Los Angeles 5
University of California, Santa Barbara 6
University of Texas at Austin 7
Rutgers University–New Brunswick 8
University of California, Berkeley 9
University of California, Riverside 10
University of Florida 11
University of Illinois Chicago 12
Florida State University 13
Stony Brook University 14
New Jersey Institute of Technology 15
University of North Carolina at Chapel Hill 16
University of South Florida 17
University of California, Santa Cruz 18
University at Buffalo 19
University of Washington 20
University of Connecticut 21
Florida International University 22
Arizona State University Campus Immersion 23
Binghamton University 24
University of Central Florida 25
Ohio State University 26
University of Georgia 27
University of Arizona 28
University of Michigan 29
University of Houston 30
Texas A&M University 31
Michigan State University 32
Georgia State University 33
University at Albany, SUNY 34
University of Colorado Denver 35
Wayne State University 36
University of Texas at Dallas 37
University of Maryland, Baltimore County 38
Temple University 39
George Mason University 40
North Carolina State University 41
University of Utah 42
University of Tennessee 43
University of Maryland, College Park 44
University of Minnesota 45
Indiana University Bloomington 46
University of Virginia 47
University of Alabama at Birmingham 48
University of Wisconsin–Madison 49
University of Iowa 50
University of Texas at El Paso 51
University of Massachusetts Amherst 52
Purdue University 53
University of Oregon 54
Georgia Institute of Technology 55
University of Texas at San Antonio 56
University of Pittsburgh 57
Washington State University 58
Virginia Tech 59
University of North Texas 60
Ohio University 61
University of Nevada, Las Vegas 62
University of Memphis 63
Utah State University 64
University of Hawaii at Manoa 65
Oklahoma State University–Stillwater 66
Pennsylvania State University 67
University of Texas at Arlington 68
University of Kansas 69
University of Kentucky 70
University of Oklahoma 71
University of Missouri 72
University of South Carolina 73
Clemson University 74
Oregon State University 75
University of Louisville 76
Virginia Commonwealth University 77
Old Dominion University 78
Texas Tech University 79
University of Mississippi 80
Iowa State University 81
Colorado School of Mines 82
University of Southern Mississippi 83
University of New Mexico 84
Louisiana State University 85
University of Delaware 86
University of Colorado Boulder 87
University of New Hampshire 88
West Virginia University 89
University of Cincinnati 90
University of Nevada, Reno 91
Auburn University 92
Mississippi State University 93
Colorado State University 94
University of Maine 95
University of Louisiana at Lafayette 96
University of Nebraska–Lincoln 97
University of Wisconsin–Milwaukee 98
Kansas State University 99
University of Arkansas 100
University of Alabama 101
Kent State University 102
University of Alabama in Huntsville 103
University of Montana 104
Montana State University 105
North Dakota State University 106


Autar Kaw is a professor of mechanical engineering at the University of South Florida. He is a recipient of the 2012 U.S. Professor of the Year Award (doctoral and research universities) from the Council for Advancement and Support of Education and the Carnegie Foundation for Advancement of Teaching. His primary scholarly interests are engineering education research, adaptive, blended, and flipped learning, open courseware development, composite materials mechanics, and higher education’s state and future. He has written more than 150 refereed technical papers, and his opinion editorials have appeared in the Tampa Bay Times, the Tampa Tribune, and the Chronicle Vitae.


Integrating functions given at discrete points via MATLAB

When integrating functions given at discrete data points in MATLAB, trapz is the function of choice. But we can get more accurate results by interpolating via cubic splines with the MATLAB spline function. Since the spline is made of piecewise cubics, Simpson's 1/3 rule can be used to integrate each piece exactly. Here is a test program and a function written for you. You can download the m-file here, and a published version here for more readability if you wish.

clc
clear all
% Author: Autar Kaw, AutarKaw.com
% https://creativecommons.org/licenses/by-nc-sa/4.0/ 
% Testing the program with data given at discrete points
% for y=x^6
xx=[1  1.5    2   2.5     3     3.5    4   4.5   5];
yy=[1  1.5^6  2^6 2.5^6   3^6   3.5^6  4^6 4.5^6 5^6];
n=length(xx);
splineintegval=splineintegral(xx,yy);
fprintf('Value of integral using spline =%g',splineintegval)
% Exact value of integral if function was given continuously 
syms x
exact=vpaintegral(x^6,x,xx(1),xx(n));
fprintf('\n Value of integral using exact integration =%g',exact)
%% Function to integrate via spline interpolation
function splineval=splineintegral(x,y)
% This function integrates functions given at discrete data points
% INPUTS
% The x-values are given in ascending order 
% The limits of integration are x(1) to x(n), where n is the length of
% the x-vector.
% OUTPUTS
% Integral of y dx from x(1) to x(n)
% Author: Autar Kaw, AutarKaw.com 
% https://creativecommons.org/licenses/by-nc-sa/4.0/ 
% The function finds the mid-point value of y between the given 
% x-values at the mid-point. Then since the spline is made of cubics,
% it uses the Simpson's 1/3rd rule to integrate the cubics exactly
n=length(x);
m=n-1;
% Calculating mid-points
    for i=1:1:m
        xmid(i)=(x(i)+x(i+1))*0.5;
    end
% Calculating value of y at the midpoints
polyvalmid=spline(x,y,xmid);
% Using Simpson's 1/3rd rule of integration to integrate cubics
splineval=0;
    for i=1:1:m
        splineval=splineval+(y(i)+y(i+1)+4*polyvalmid(i))*(x(i+1)-x(i))/6;
    end
end
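The Simpson's 1/3 assembly at the heart of splineintegral can be sketched in Python. For brevity, this sketch takes the midpoint values from a callable standing in for the fitted spline, so it illustrates the integration step only, not the spline interpolation.

```python
# Per-interval Simpson's 1/3 rule: the integral over [a, b] is approximately
# (b - a)/6 * (f(a) + 4*f(midpoint) + f(b)), and this is exact for cubics.
# Here f supplies the midpoint values, standing in for the cubic spline
# evaluated at the midpoints in the MATLAB code above.

def simpson_over_nodes(x, y, f):
    """Integrate over nodes x (ascending) with node values y,
    using f to supply the midpoint values."""
    total = 0.0
    for i in range(len(x) - 1):
        mid = 0.5 * (x[i] + x[i + 1])
        total += (y[i] + y[i + 1] + 4 * f(mid)) * (x[i + 1] - x[i]) / 6
    return total

# Sanity check: exact for a cubic such as y = x^3 (true integral over [0, 2] is 4)
f = lambda t: t ** 3
x = [0.0, 1.0, 2.0]
y = [f(v) for v in x]
print(simpson_over_nodes(x, y, f))  # → 4.0
```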


Quick Start Guide to OpenMCR Program for a Single Key Exam

Ian Sanders and Autar Kaw


Step 1: Print out the required MCQ sheets
      • 75 Question Variant (Download from here or here)
      • 150 Question Variant (Download from here or here)

Step 2: Conduct the test

Give your printed question paper and a blank MCQ sheet to each student. Ask them to bubble in items such as last name, first name, middle name, student ID, course number, key code (skip this, as it is not necessary), and their answer responses.

A detailed explanation is given here


Step 3: Make the key

To create an answer key, simply print a normal sheet and put 9999999999 in the Student ID field. Fill in the exam with the correct answers.

A detailed explanation is given here

Step 4: Download the program to your PC

Go to https://github.com/iansan5653/open-mcr/releases and download the open-mcr.zip file available under Assets. Extract the zip file to the directory of your choice.

A detailed explanation is given here

Step 5: Run the program

Among the files extracted in Step 4 is open-mcr.exe. Double-click on it to run the program.

Select the input folder where the images of the MCQ sheets are.

Choose the proper Form Variant as 75 Questions or 150 questions.

Under Select Output Folder, click Browse, and select the folder where you would like to save the resulting CSV files.

Press Continue and watch the progress of the program.

          A detailed explanation is given here


Step 6: Output Files

After the program finishes processing, results will be saved as CSV files in your selected output folder. To get the scores, open the scores CSV file with Excel or a text editor.

          A detailed explanation is given here
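If you want to move the scores somewhere else, such as a gradebook, the scores CSV can be parsed with a short script. The column names below are assumptions for illustration only; check the header row of the file OpenMCR actually produces and adjust accordingly.

```python
# Parse an OpenMCR-style scores CSV into (student ID, score) pairs.
# The column names "Student ID" and "Total Score" are ASSUMED for
# illustration; inspect the real file's header row and adjust.
import csv

def read_scores(path, id_col="Student ID", score_col="Total Score"):
    """Return a dict mapping student ID to numeric score, read from a CSV file."""
    scores = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            scores[row[id_col]] = float(row[score_col])
    return scores
```

For example, read_scores("scores.csv") on a file whose header matches the assumed column names would return a dict such as {"U1234": 68.0}.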