Journal paper on use of adaptive learning in flipped classrooms published

August 28, 2023

As part of various instructional approaches, adaptive learning platforms are becoming increasingly popular. In this article, we focus specifically on the use of adaptive learning in personalized, preclass learning for a flipped classroom. In the past, data on student engagement with course content generated by adaptive learning platforms was not easily accessible. However, this data is now proving to be invaluable in gaining a deeper understanding of the learning process and improving it. Our goal is to examine the connection between interactions on adaptive learning platforms and overall student success. We also aim to identify the variables that have the greatest impact on student success. To achieve this, we conducted a comprehensive analysis of adaptive learning platform data gathered from a Numerical Methods course. Our analysis included aggregate statistics, frequency analysis, and Principal Component Analysis. We used this analysis to identify the variables that exhibited the most variability and provided the most information in the data. We then explored naturally occurring clusters of students using the Partitioning Around Medoids clustering approach. We found that overall performance in the course, as measured by the final course grade, is strongly associated with two factors: (1) the behavioral interactions of students with the adaptive platform and (2) their performance on the adaptive learning assessments. We also discovered distinct student clusters that exhibited different behaviors and success in the course. This information can be used to identify students who require more support and to design evidence-based strategies to support these students.
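The clustering step mentioned above can be illustrated with a toy sketch. The snippet below is not the paper's implementation; it is a naive, brute-force Partitioning Around Medoids over a tiny made-up data set, just to show the idea of medoid-based clusters:

```python
import itertools

def pam(points, k):
    """Naive PAM: choose the k medoids (actual data points) that minimize
    the total distance of every point to its nearest medoid.
    Exhaustive search, so only suitable for tiny illustrative data sets."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    def cost(medoids):
        return sum(min(dist(p, m) for m in medoids) for p in points)

    best = min(itertools.combinations(points, k), key=cost)
    clusters = {m: [] for m in best}
    for p in points:
        clusters[min(best, key=lambda m: dist(p, m))].append(p)
    return clusters

# Two well-separated groups of "students" in a hypothetical 2-D feature space
data = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
groups = pam(data, 2)
```

Real implementations use an iterative build-and-swap search instead of exhaustive enumeration, but the objective (minimize total distance to the nearest medoid) is the same.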

Yalcin, Ali, Autar Kaw, and Renee Clark. “On learning platform metrics as markers for student success in a course.” Computer Applications in Engineering Education (2023).

Asking ChatGPT to look at my student evaluations to give me advice on improving my teaching

I took the student evaluations of my course in Computational Methods taught to juniors in engineering and asked ChatGPT to summarize how I can improve my teaching.

Based on the student evaluations, here are some key points to consider for improving your teaching:

  • Adaptive Lessons: Students found the adaptive lessons to be beneficial as they encouraged independent learning and helped them overcome their fear of failure. Consider keeping or modifying this approach to continue fostering self-directed learning.
  • Course Structure: The course was praised for its structure, but some students felt that it covered too much material. Consider evaluating the course content to determine if any sections can be condensed or removed to enhance learning outcomes.
  • MATLAB Coding: Some students desired more time dedicated to MATLAB coding. Consider incorporating additional coding exercises or providing supplementary resources to further develop students’ programming skills.
  • Pre-Tests and Testing Materials: Students expressed a desire for more comprehensive pre-tests and supplemental testing materials. Consider providing multiple pre-tests and additional practice materials to help students prepare for exams effectively.
  • Real-Life Examples: Students appreciated when the course material was related to real-life examples. Continue incorporating practical applications to enhance student understanding and engagement.
  • Formula Sheet and Exam Format: Students suggested allowing the use of a formula sheet and including more applicable problems in exams rather than focusing solely on memorization. Consider revising exam formats and providing formula sheets to support students’ problem-solving skills.
  • Office Hours and Support: Students appreciated the availability of office hours and additional help sessions. Continue offering these resources to ensure students have ample opportunities to seek clarification and assistance.
  • Flipped Classroom Approach: While some students appreciated the flipped classroom approach, others found it redundant. Consider evaluating the effectiveness of this method and potentially modifying it based on student feedback.
  • Consistency in Interaction: Students mentioned inconsistencies in how you interacted with students inside and outside the class, raising concerns about fairness. Strive to maintain consistency in your interactions with students to ensure a fair and inclusive learning environment.
  • Computer Projects: Some students found the computer projects lengthy and confusing. Consider providing clearer instructions and support for these projects to enhance the learning experience.
  • Student Engagement: Students appreciated your efforts to engage them and create an enjoyable learning environment. Continue employing varied teaching methods, incorporating humor, and making connections with students to foster their interest and participation.
  • Feedback and Improvement: Students appreciated your dedication to improvement and willingness to listen to their opinions. Continue seeking student feedback and implementing changes accordingly to enhance the learning experience.
  • Clarity of Explanation: Some students found your explanations challenging to follow due to speaking too fast. Aim for clarity in your delivery, allowing students to absorb and understand the concepts effectively.
  • Feedback and Availability: Students appreciated your knowledge and availability outside of class. Continue providing clear feedback and being accessible to students, fostering their learning and growth.
  • Cultural Sensitivity: Be mindful of cultural sensitivity and ensure your teaching approach is respectful and unbiased toward all students.

Overall, the evaluations indicate that you are a knowledgeable and caring professor who puts effort into supporting student learning. By addressing the mentioned areas of improvement, you can further enhance the learning experience for your students.

A VBA function for Cohen’s effect size

While many social scientists get excited about reporting p-values of less than 0.05 when comparing two groups, they often forget to report the effect size. One can obtain a low p-value simply by having a very large sample size; it is the effect size that tells you how much practical difference there is between the two groups. Although there are simple formulas out there for effect size, it is better to have one that handles two groups of unequal size. I wrote a VBA function for Cohen's effect size that you can use readily.  You can download it from here.  If you are not familiar with how to enter a VBA program, here is a quick tutorial.  You just have to save the Excel file as a .xlsm file.

An example of usage is
=EffectSizeCohen(A23:A78, B24:B67)
where A23:A78 holds the control group numbers and B24:B67 holds the experimental group numbers.

Function EffectSizeCohen(ControlGroup As Range, ExperimentalGroup As Range) As Variant
'This function finds the Cohen's effect size given the numbers from the control
'and experimental groups
'ControlGroup: Excel range of numbers for control group, e.g. A120:A230
'ExperimentalGroup: Excel range of numbers for experimental group, e.g. A120:A230
'Returns Cohen's effect size.
'See formula at

'Putting the ranges in column vectors. Note that Excel stores each as a two-dimensional matrix though
ControlData = ControlGroup.Value
ExperimentalData = ExperimentalGroup.Value

'Number of entries of the two groups
ControlLength = ControlGroup.Count
ExperimentalLength = ExperimentalGroup.Count

'Calculating the average and standard deviation of the control group
ControlSum = 0
For i = 1 To ControlLength
    ControlSum = ControlSum + ControlData(i, 1)
Next i
ControlAvg = ControlSum / ControlLength

ControlVar = 0
For i = 1 To ControlLength
    ControlVar = ControlVar + (ControlData(i, 1) - ControlAvg) ^ 2
Next i
ControlStd = (ControlVar / (ControlLength - 1)) ^ 0.5

'Calculating the average and standard deviation of the experimental group
ExperimentalSum = 0
For i = 1 To ExperimentalLength
    ExperimentalSum = ExperimentalSum + ExperimentalData(i, 1)
Next i
ExperimentalAvg = ExperimentalSum / ExperimentalLength

ExperimentalVar = 0
For i = 1 To ExperimentalLength
    ExperimentalVar = ExperimentalVar + (ExperimentalData(i, 1) - ExperimentalAvg) ^ 2
Next i
ExperimentalStd = (ExperimentalVar / (ExperimentalLength - 1)) ^ 0.5

'Calculating the Cohen's effect size using the pooled standard deviation
'See formula at
StdMean = (((ControlLength - 1) * ControlStd ^ 2 + (ExperimentalLength - 1) * ExperimentalStd ^ 2) / (ControlLength + ExperimentalLength)) ^ 0.5
EffectSizeCohen = (ExperimentalAvg - ControlAvg) / StdMean
End Function
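As a cross-check, here is the same computation sketched in Python (a hypothetical helper, not part of the VBA download). It mirrors the VBA function above, including its pooled standard deviation with an n1+n2 denominator:

```python
import math

def effect_size_cohen(control, experimental):
    """Cohen's effect size for two groups of possibly unequal size.

    Mirrors the VBA function: sample variances are pooled with an
    (n1 + n2) denominator, as in the VBA code above."""
    n1, n2 = len(control), len(experimental)
    mean1 = sum(control) / n1
    mean2 = sum(experimental) / n2
    var1 = sum((x - mean1) ** 2 for x in control) / (n1 - 1)
    var2 = sum((x - mean2) ** 2 for x in experimental) / (n2 - 1)
    std_pooled = math.sqrt(((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2))
    return (mean2 - mean1) / std_pooled
```

A positive value means the experimental group scored higher than the control group.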

Using Poll Everywhere at the University of South Florida

Introduction: I have been using Poll Everywhere in my Numerical Methods course for a year.  As per their website, “Poll Everywhere is a web-based audience response system that lets speakers embed live activities directly into their presentations. Using a variety of activity types, you can turn a series of slides into an effective, interactive experience. Get to know participants, gauge their knowledge of a specific topic, and capture their valuable feedback at scale.”

I use Poll Everywhere to ask mostly conceptual questions in class.  I implement the think-pair active learning strategy: students first answer a question by themselves (think), and if fewer than 85% of the class chooses the correct answer, each student pairs up with another student for discussion (pair).  The poll is then retaken, and the instructor wraps up by discussing the question.

In my class, the participation of students is not recorded in any form, nor is it a part of a grade.  This makes for quicker learning and implementation of Poll Everywhere for the instructor, while students learn the content without the pressure of a grade or of being found out.  Do students still participate?  Yes.  Do all students participate?  No.  If you want to learn how to collect participation data and integrate it with CANVAS Gradebook, click here.

Getting Started at USF: The Getting Started document written specifically for the USF faculty is here.  This document will help you set up the Poll Everywhere account.

Do not forget to change your username to something short, but I advise you not to use a course name or something specific.  Last names are good to use.  To change the username, log in to Poll Everywhere, click on Profile, and change the default name.  I changed mine to kaw.  When students in my class participate in a poll, they simply go to a browser, enter the poll link, and respond to the presented question.

Although you can use Poll Everywhere directly via a web browser, I use it only through PowerPoint (PPT).  This blog is thus limited to the PPT implementation.

To use it in a PPT, you have to download the Poll Everywhere add-in. You can download it here for Windows and here for Mac.  Once you execute the downloaded file, Poll Everywhere should appear as an option in the top ribbon of the PPT menu as shown in the figure below.  If it does not show up in the menu, go to File>Options>Add-ins and add it.  Microsoft has a step-by-step guide for adding an add-in.

Inserting a question in PPT: The directions to insert a poll audience question in PowerPoint are given in the YouTube video below.

For PCs

For Macs

Start small with only multiple-choice questions, and you can explore other types of questions later.  In the above video, inserting an already-made activity is shown.  The directions on how to make such an already-made activity at the Poll Everywhere website are given here.

Visual Settings: When you make an activity, such as a multiple-choice question, you need to choose “Visual settings” in the right-hand menu of the activity.  These are the choices I use to keep it simple.

I only allow website response.  Make sure the activity is unlocked.

I do not place any audience restrictions, and the participants can be identified by their chosen arbitrary screen name.

I allow a student to respond only one time and let them choose only one option in multiple-choice questions.  The second option will need to be “as many times as they like” if multiple answers need to be picked.

For more information about activity settings, go here.

Using it in Class: When you wish to ask students the question, present the PPT in presentation mode.  Give the polling link to the students so that they can respond (it will show up on the top part of the slide – see figure below).  They can use their mobile devices to respond, and a good practice is to count down 5-4-3-2-1 when you think enough time has been given to respond.  You can then click on “Responses” when you hover over the slide with your mouse.  If you want the students to retake a poll, click on More > Clear the responses.  You can show the correct answer by clicking “Correctness.”

Other Resources:  The video below was highly beneficial to me in getting an overall view of Poll Everywhere.  It is an hour-long video, but you will learn faster.

There is an advanced video as well if you want to venture further.

And do not hesitate to send an email about an issue or a question.  The support staff were most helpful.


Balancing the social mobility index and reputation rankings

Autar Kaw
December 19, 2022

Social mobility is becoming an ever more popular criterion for evaluating university education.  However, the reputation of a university matters as well because it attracts high-caliber students, faculty, and staff, and hence, presumably, quality education and opportunities for high-impact practices such as research experiences, internships, cooperative education, and capstone courses.

I wanted to see how well my university, the University of South Florida (USF), balances the two issues of social mobility and reputation.  USF is a Carnegie R1 institution, meaning it is categorized as a university with “very high research activity”.  There are 146 R1 universities, of which 106 are public (actually 107, but one of them, the Graduate Center, CUNY, admits only graduate students).

This blog is limited to the Carnegie R1 public universities.  I gave the same weight to two rankings: the US News and World Report ranking and the Economic Mobility Index ranking.  The US News and World Report ranking mostly uses reputation in its calculations; other factors include graduation and retention rates, selectivity of the incoming class, alumni giving rate, etc., which are substantially influenced by the wealth and income of a student’s family.  The Economic Mobility Index ranking measures social mobility via out-of-pocket expenses, the salary boost due to a college degree, and the time after graduation needed to recoup the money spent to go to college.

The weighted ranking is not hard science, but it gives us a glimpse of where we stand.  We at the University of South Florida are No. 17 out of 106 public Carnegie Research 1 universities.  Seven of the top 10 universities belong to the University of California system, while Florida is not far behind, holding spots 11, 13, 17, 22, and 25 in this combined ranking system.
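The equal-weight combination itself is simple; here is a sketch with hypothetical ranks (made-up universities and numbers, not the actual USNWR or EMI data):

```python
# Hypothetical ranks for three made-up universities
usnwr_rank = {"Univ A": 5, "Univ B": 2, "Univ C": 9}
emi_rank   = {"Univ A": 3, "Univ B": 8, "Univ C": 1}

# Give each ranking the same weight, then re-rank by the weighted average
weighted = {u: 0.5 * usnwr_rank[u] + 0.5 * emi_rank[u] for u in usnwr_rank}
combined = sorted(weighted, key=weighted.get)
```

A university strong on one list and weak on the other can still land in the middle of the combined order, which is the balancing effect the post describes.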

University of California, Irvine 1
University of California, San Diego 2
University of Illinois Urbana-Champaign 3
University of California, Davis 4
University of California, Los Angeles 5
University of California, Santa Barbara 6
University of Texas at Austin 7
Rutgers University–New Brunswick 8
University of California, Berkeley 9
University of California, Riverside 10
University of Florida 11
University of Illinois Chicago 12
Florida State University 13
Stony Brook University 14
New Jersey Institute of Technology 15
University of North Carolina at Chapel Hill 16
University of South Florida 17
University of California, Santa Cruz 18
University at Buffalo 19
University of Washington 20
University of Connecticut 21
Florida International University 22
Arizona State University Campus Immersion 23
Binghamton University 24
University of Central Florida 25
Ohio State University 26
University of Georgia 27
University of Arizona 28
University of Michigan 29
University of Houston 30
Texas A&M University 31
Michigan State University 32
Georgia State University 33
University at Albany, SUNY 34
University of Colorado Denver 35
Wayne State University 36
University of Texas at Dallas 37
University of Maryland, Baltimore County 38
Temple University 39
George Mason University 40
North Carolina State University 41
University of Utah 42
University of Tennessee 43
University of Maryland, College Park 44
University of Minnesota 45
Indiana University Bloomington 46
University of Virginia 47
University of Alabama at Birmingham 48
University of Wisconsin–Madison 49
University of Iowa 50
University of Texas at El Paso 51
University of Massachusetts Amherst 52
Purdue University 53
University of Oregon 54
Georgia Institute of Technology 55
University of Texas at San Antonio 56
University of Pittsburgh 57
Washington State University 58
Virginia Tech 59
University of North Texas 60
Ohio University 61
University of Nevada, Las Vegas 62
University of Memphis 63
Utah State University 64
University of Hawaii at Manoa 65
Oklahoma State University–Stillwater 66
Pennsylvania State University 67
University of Texas at Arlington 68
University of Kansas 69
University of Kentucky 70
University of Oklahoma 71
University of Missouri 72
University of South Carolina 73
Clemson University 74
Oregon State University 75
University of Louisville 76
Virginia Commonwealth University 77
Old Dominion University 78
Texas Tech University 79
University of Mississippi 80
Iowa State University 81
Colorado School of Mines 82
University of Southern Mississippi 83
University of New Mexico 84
Louisiana State University 85
University of Delaware 86
University of Colorado Boulder 87
University of New Hampshire 88
West Virginia University 89
University of Cincinnati 90
University of Nevada, Reno 91
Auburn University 92
Mississippi State University 93
Colorado State University 94
University of Maine 95
University of Louisiana at Lafayette 96
University of Nebraska–Lincoln 97
University of Wisconsin–Milwaukee 98
Kansas State University 99
University of Arkansas 100
University of Alabama 101
Kent State University 102
University of Alabama in Huntsville 103
University of Montana 104
Montana State University 105
North Dakota State University 106

Autar Kaw is a professor of mechanical engineering at the University of South Florida. He is a recipient of the 2012 U.S. Professor of the Year Award (doctoral and research universities) from the Council for Advancement and Support of Education and the Carnegie Foundation for Advancement of Teaching. His primary scholarly interests are engineering education research, adaptive, blended, and flipped learning, open courseware development, composite materials mechanics, and higher education’s state and future. He has written more than 150 refereed technical papers, and his opinion editorials have appeared in the Tampa Bay Times, the Tampa Tribune, and the Chronicle Vitae.



Integrating functions given at discrete points via MATLAB

When integrating functions given at discrete data points in MATLAB, trapz is the function of choice.  But we can get more accurate results by interpolating with cubic splines via the MATLAB spline function.  Since the spline function is made of piecewise cubics, Simpson’s 1/3rd rule can be used to integrate them exactly.  Here is a test program and a function written for you.  You can download the m-file here and a published version here for more readability if you wish.

clear all
% Author: Autar Kaw,
% Testing the program with data given at discrete points
% for y=x^6
xx=[1  1.5    2   2.5     3     3.5    4   4.5   5];
yy=[1  1.5^6  2^6 2.5^6   3^6   3.5^6  4^6 4.5^6 5^6];
splineintegval=splineintegral(xx,yy);
fprintf('Value of integral using spline =%g',splineintegval)
% Exact value of integral if function was given continuously
syms x
exact=double(int(x^6,xx(1),xx(end)));
fprintf('\n Value of integral using exact integration =%g',exact)
%% Function to integrate via spline interpolation
function splineval=splineintegral(x,y)
% This function integrates functions given at discrete data points
% The x-values are given in ascending order
% The limits of integration are x(1) to x(n), where n is the length of
% the x-vector.
% Integral of y dx from x(1) to x(n)
% Author: Autar Kaw,
% The function finds the value of y at the mid-point between the given
% x-values. Then, since the spline is made of cubics, it uses
% Simpson's 1/3rd rule to integrate the cubics exactly.
n=length(x);
m=n-1;
% Calculating mid-points
for i=1:1:m
    xmid(i)=(x(i)+x(i+1))/2;
end
% Calculating value of y at the midpoints
ymid=spline(x,y,xmid);
% Using Simpson's 1/3rd rule of integration to integrate cubics
splineval=0;
for i=1:1:m
    splineval=splineval+(x(i+1)-x(i))/6*(y(i)+4*ymid(i)+y(i+1));
end
end
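The reason a single application of Simpson’s 1/3rd rule per interval suffices is that the rule is exact for cubics.  A quick stdlib Python check on f(x)=x^3 (my own toy example, not part of the m-file):

```python
def simpson(f, a, b):
    # One application of Simpson's 1/3rd rule on [a, b]
    return (b - a) / 6 * (f(a) + 4 * f((a + b) / 2) + f(b))

# Exact integral of x^3 on [0, 2] is 2^4/4 = 4; Simpson reproduces it exactly
approx = simpson(lambda x: x ** 3, 0.0, 2.0)
```

Because each spline piece is a cubic, integrating piece by piece this way introduces no quadrature error beyond the interpolation itself.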



Quick Start Guide to OpenMCR Program for a Single Key Exam

Ian Sanders and Autar Kaw

Step 1: Print out the required MCQ sheets
      • 75 Question Variant (Download from here or here)
      • 150 Question Variant (Download from here or here)

Step 2: Conduct the test

Give your printed question paper and a blank MCQ sheet to each of the students.  Ask them to bubble in items such as last name, first name, middle name, student ID, course number, key code (skip this, as it is not necessary), and their answer responses.

A detailed explanation is given here

Step 3: Make the key

To create an answer key, simply print a normal sheet and put 9999999999 in the Student ID field. Fill in the exam with the correct answers.

A detailed explanation is given here

Step 4: Download the program to your PC

Go here and download the file that is available under Assets.  Extract the zip file to the directory of your choice.

A detailed explanation is given here

Step 5: Run the program

One of the files you will see amongst the extracted files from Step 4 is open-mcr.exe. That is the one you need to double-click on to run the program.

Select the input folder where the images of the MCQ sheets are.

Choose the proper Form Variant as 75 Questions or 150 questions.

Under Select Output Folder, click Browse, and select the folder where you would like to save the resulting CSV files.

Press Continue and watch the progress of the program.

          A detailed explanation is given here

Step 6: Output Files

After the program finishes processing, results will be saved as CSV files in your selected output folder. To get the scores, open the scores CSV file with Excel or a text editor.

          A detailed explanation is given here

A multiple-choice question response reader

Ian Sanders and Autar Kaw

As we are back in face-to-face classes, we may wish to again conduct multiple-choice question examinations in the classroom without the use of computers: to alleviate academic integrity concerns, to not depend on the reliability of laptop batteries and available WiFi, and to not have to make different tests every semester.


Commercially available OMR (optical mark recognition) exam sheets, scanners, and processing software can cost educators and educational institutions thousands of dollars per year. In response to this, OpenMCR has been developed as a free and easy-to-use alternative. The tool includes a multiple-choice exam sheet and works with any scanner and printer.

Here are the steps to use the program.

Step 1: Print out the MCQ sheets

Depending on the number of questions being asked, up to 75 or more (the maximum number of questions that can be asked is 150), you will first choose one of the two PDF files to print.

        • 75 Question Variant (Download the pdf file from here)
        • 150 Question Variant (Download the pdf file from here)

Step 2: Download the program to your PC

Go here and download the file that is available under Assets.  Extract the zip file to the directory of your choice.

For more details about this open-source software, go to

Step 3: Choose your variant of the key

In addition to reading scanned images, the software can also automatically score exam results. It does this by comparing the provided keys with the output. There are three options for this, depending on how you generate your exams:

1. One Exam Variant

This is the most common exam variant.  If you give every exam-taker the exact same exam, you can instruct them to leave the Test Form Code field blank on their sheets. In addition, leave that field blank on the answer key sheet. All exam results will be compared to the single answer key sheet provided. Skip to Step 4 if this is what you chose – no need to create confusion.

2. Shuffled Exam Variants

If you provide the exam-takers with multiple variants of the same exam, and these variants differ only in question order (in other words, each variant is simply shuffled), then you can score all of these with the same key file by providing a file that defines the orders of the shuffled variants.

Each row in this file represents a key, and each column represents the position that that question should be moved to.

For example, if the exam form A has questions 1, 2, and 3, the exam form B might have them in 3, 1, 2 order and C might have them in 3, 2, 1 order. This would result in the following arrangement file:

Test Form Code, Q1, Q2, Q3
             A,  1,  2,  3
             B,  3,  1,  2
             C,  3,  2,  1

If this were the file you upload, then all of the exams with form A would be left untouched while B and C would be rearranged to 1, 2, 3 order. Select the file in the program under the Select Form Arrangement Map.

Note that the first row in this file should always be in 1, 2, 3, … order, and each row after that should only have one instance of each number.
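A sketch of what the rearrangement does (my own illustrative helper, not OpenMCR’s code): row [3, 1, 2] for form B sends B’s first question back to master position 3, and so on:

```python
def rearrange(answers, row):
    # row[i] is the master-exam position of the i-th question on this form
    out = [None] * len(answers)
    for i, pos in enumerate(row):
        out[pos - 1] = answers[i]
    return out

# Form B's answers, recorded in B's own question order,
# mapped back to the master 1, 2, 3 order
master_order = rearrange(["c", "a", "b"], [3, 1, 2])
```

After this mapping, every student’s responses line up with the single master key, which is why one key file can score all shuffled variants.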

If you use this option, only one exam key can be provided or an error will be raised.

3. Distinct Exam Variants

Finally, you can provide the exam-takers with multiple wholly distinct variants of the same exam. In this case, each exam will be scored by selecting the answer key with an exactly matching Test Form Code. No rearrangement will be performed.

Step 4: Conduct the test

Give the question paper and the blank MCQ sheet to the students.  Ask them to bubble items such as last name, first name, middle name, student id, course number, key code, and their answer responses.

Step 5: Scan the MCQ sheets and the key(s)

If you would like to take advantage of the automatic grading feature of the software, you must provide it with one or more answer keys. To create an answer key, simply print a normal sheet and put 9999999999 in the Student ID field. Also, add a Test Form Code which will be used to match students’ exams with the correct answer key, and finally, fill in the exam with the correct answers.

This is optional – you can choose to just have the software read the exams and not score them.

Scan all the student and key MCQ sheets on a copier or a scanner into a single PDF file.  Use Adobe Acrobat DC or any other freely available program to export the pdf file to images.

Step 6: Run the program

One of the files you will see amongst the extracted files from Step 2 is open-mcr.exe. That is the one you need to double-click on to run the program.

Select the input folder where the images of the MCQ sheets are.

If you select the convert multiple answers in a question to ‘F’ option, then if a student selects, for example, A and B for a question, the output file will save that as F instead of [A|B].

If you select the save empty in questions as ‘G’ option, if a student skips a question by leaving it blank, the output file will save that as G instead of as a blank cell.

Choose the proper Form Variant as 75 Questions or 150 questions.

Under Select Output Folder, click Browse, and select the folder where you would like to save the resulting CSV files.  If you select the sort results by name option, results will be sorted by the students’ last, first, and middle names (in that order). Otherwise, results will be saved in the order they were processed.

Press Continue and watch the progress of the program.

Step 7: Output Files

After the program finishes processing, results will be saved as CSV files in your selected output folder. These files can be opened in Excel or in any text editor. Files will be saved with the time of processing to avoid overwriting any existing files.

If you did not include any answer keys, one raw file will be saved with all of the students’ selected answers, and no scoring is performed.

If you did include one or more answer keys, two more files will be saved in addition to the aforementioned raw file. One of these files will have all of the keys that were found, and the other will have the scored results. In the scored file, questions are saved for each student as either 1 (correct) or 0 (incorrect).
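The scored file’s 1/0 convention can be sketched as follows (a hypothetical helper, not OpenMCR’s actual code):

```python
def score(responses, key):
    # 1 for a correct answer, 0 for an incorrect one, per the scored CSV
    return [1 if r == k else 0 for r, k in zip(responses, key)]

marks = score(["A", "B", "C", "D"], ["A", "D", "C", "B"])
```

The row of 1s and 0s is what lands in the scored CSV for each student, so a sum across the row gives the raw score.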

Software License

Copyright (C) 2019-22 Ian Sanders

This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

For the full license text, see license.txt.

Multiple-Choice Sheet License

The multiple-choice sheet distributed with this software is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license (CC BY-NC-SA 4.0). In summary, this means that you are free to distribute and modify the document so long as you share it under the same license, provide attribution, and do not use it for commercial purposes. For the full license, see the Creative Commons website.

Note: You are explicitly allowed to distribute the multiple-choice sheet without attribution if using it unmodified for educational purposes and not in any way implying that it is your own work. This is an exception to the Creative Commons terms.

Getting last name and first name from full name with a delimited comma

Rather than using the “text to columns” procedure in Excel, if you want a function to separate the last name and the first name, for simplicity of use and the ability to copy the functionality anywhere, here are two functions written for you.

I wrote these functions because the CANVAS LMS does not have separate columns for first and last names in the grade book.   However, it does have an option of showing the full name delimited by a comma.  This is preserved when you export the gradebook.

Figure 1: Display name as separated by commas.

To use them, press Alt-F11.  It will open the Microsoft Visual Basic for Applications window.  Choose Insert>Module.  The module will show up as Module1 by default in the VBA Project window.  It is good to rename the module to, say, “LastFirstNameBreak” using the Properties window.  Cut and paste the two functions into the module, and save your Excel file.  You will need to save the Excel file as an .xlsm file though.

Figure 2. Microsoft VBA Window.  Functions are shown below.

Function BreakLastName(FullName)
' This function separates the last name from the
' full name that is delimited by a comma
FullNameTrim = Trim(FullName)
leng = Len(FullNameTrim)
' Loop checks where the comma is
For i = 1 To leng
    If Mid(FullNameTrim, i, 1) = "," Then
        ival = i
        Exit For
    End If
Next i
BreakLastName = Left(FullNameTrim, ival - 1)
End Function

Function BreakFirstName(FullName)
' This function separates the first name from the
' full name that is delimited by a comma
' (assumes a single space follows the comma, as in the Canvas export)
FullNameTrim = Trim(FullName)
leng = Len(FullNameTrim)
' Loop checks where the comma is
For i = 1 To leng
    If Mid(FullNameTrim, i, 1) = "," Then
        ival = i
        Exit For
    End If
Next i
BreakFirstName = Right(FullNameTrim, leng - ival - 1)
End Function

To use the functions, just use them like any other Excel function.  BreakLastName separates the last name, while BreakFirstName separates the first name.

Figure 3. Using the functions in an Excel spreadsheet.
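For comparison, the same split can be sketched in Python (my own equivalent, not part of the VBA module); it assumes the Canvas “Last, First” format:

```python
def break_names(full_name):
    # Assumes the Canvas "Last, First" format with a comma delimiter
    last, first = (part.strip() for part in full_name.strip().split(",", 1))
    return last, first

last, first = break_names("Doe, Jane")
```

Using split with a limit of 1 means a comma inside the first-name portion (unlikely, but possible) would stay with the first name rather than truncating it.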



A JavaScript code for Romberg integration

As I am writing backend JavaScript for simulations used in teaching Numerical Methods, I have also started developing functions for some numerical techniques. Here is a function for Romberg integration.

function auto_integrator_trap_romb_hnm(func,a,b,nmax,tol_ae,tol_rae) {
// func= integrand
// a= lower limit of integration
// b= upper limit of integration
// nmax = maximum level of partitioning, number of segments n=2^nmax
// tol_ae= maximum absolute approximate error acceptable (should be >=0)
// tol_rae= maximum percentage absolute relative approximate error acceptable (should be >0)
// integ_val= estimated value of integral

	//Checking for input errors
	if (typeof a !== 'number')
		throw new TypeError('<a> must be a number');
	if (typeof b !== 'number')
		throw new TypeError('<b> must be a number');
	if ((!Number.isInteger(nmax)) || (nmax<1))
		throw new TypeError('<nmax> must be an integer greater than or equal to one.');
	if ((typeof tol_ae !== 'number') || (tol_ae<0))
		throw new TypeError('<tol_ae> must be a number greater than or equal to zero');
	if ((typeof tol_rae !== 'number') || (tol_rae<=0))
		throw new TypeError('<tol_rae> must be a number greater than zero');
	var h=b-a
	// initialize matrix where the values of integral are stored
	var Romb = []; // rows
	for (var i = 0; i < nmax+1; i++) {
		Romb[i] = [];
		for (var j = 0; j < nmax+1; j++)
			Romb[i][j] = 0;
	}
	//calculating the value with 1-segment trapezoidal rule
	Romb[0][0] = 0.5*h*(func(a)+func(b))
	var integ_val=Romb[0][0]
	for (var i=1; i<=nmax; i++) {
		// updating the value with double the number of segments
		// by only evaluating the integrand at the new midpoints
		// See
		h = h/2
		var integ=0
		for (var j=1; j<=2**i-1; j+=2)
			integ=integ+func(a+j*h)
		Romb[i][0] = 0.5*Romb[i-1][0] + h*integ
		// Using Romberg method to calculate the next extrapolated value
		// See
		for (var k=1; k<=i; k++) {
			var addterm=Romb[i][k-1]-Romb[i-1][k-1]
			Romb[i][k]=Romb[i][k-1]+addterm/(4**k-1)
			//Calculating absolute approximate error
			var Ea=Math.abs(Romb[i][k]-Romb[i][k-1])
			//Calculating percentage absolute relative approximate error
			var epsa=Math.abs(Ea/Romb[i][k])*100.0
			//Assigning most recent value to the return variable
			integ_val=Romb[i][k]
			// returning the value if either tolerance is met
			if ((epsa<tol_rae) || (Ea<tol_ae))
				return integ_val
		}
	}
	// returning the last calculated value of integral whether tolerance is met or not
	return integ_val
}
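As a cross-check on the JavaScript, here is the same trapezoidal-seed-plus-Richardson-extrapolation scheme sketched in stdlib Python (my own condensed version, not a line-by-line port):

```python
import math

def romberg(f, a, b, nmax=20, tol=1e-10):
    # Seed the Romberg table with the 1-segment trapezoidal rule
    R = [[0.5 * (b - a) * (f(a) + f(b))]]
    h = b - a
    for i in range(1, nmax + 1):
        h /= 2.0
        # Trapezoidal estimate with doubled segments, reusing earlier sums
        s = sum(f(a + j * h) for j in range(1, 2 ** i, 2))
        row = [0.5 * R[i - 1][0] + h * s]
        # Richardson extrapolation across the new row
        for k in range(1, i + 1):
            row.append(row[k - 1] + (row[k - 1] - R[i - 1][k - 1]) / (4 ** k - 1))
        R.append(row)
        if abs(R[i][i] - R[i - 1][i - 1]) < tol:
            return R[i][i]
    return R[nmax][nmax]

value = romberg(lambda x: 1.0 / x, 1.0, 2.0)
```

For the smooth integrand 1/x on [1, 2], the extrapolated diagonal converges to ln 2 in just a few levels.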

Here we are testing it for a typical integrand of f(x)=1/x. Take it for a spin and see how well it works. Make it even better.

<!DOCTYPE html>
<html>
<head>
<meta content="text/html;charset=utf-8" http-equiv="Content-Type">
<meta content="utf-8" http-equiv="encoding">
<title>A test for the automatic integrator based on Romberg integration and trapezoidal rule</title>
</head>
<body>
<script>
// This program is written to test the Romberg integration scheme that is used
// as an automatic integrator
// a= lower limit of integration
// b= upper limit of integration
// nmax= maximum level of partitioning, the number of segments is then 2^nmax
// tol_ae= tolerance on absolute approximate error
// tol_rae= tolerance on percentage absolute relative approximate error
var a=0.001
var b=10
var nmax=20
var tol_ae=0.0
var tol_rae=0.0000000005

var abc=auto_integrator_trap_romb_hnm(func,a,b,nmax,tol_ae,tol_rae)
console.log("romberg "+abc)

var exact=Math.log(b)-Math.log(a)
console.log("exact "+exact)

function func(x) {
   //var pi=4*Math.atan(1.0)
   //var val=2/Math.sqrt(pi)*Math.exp(-x*x)
   var val=1/x
   return val
}
</script>
</body>
</html>

