Crowdmark Lessons I Have Learned
This semester I used
Crowdmark
for the first time, to organise the grading of
my students'
midterm exams. I had heard that it makes grading much easier,
especially in multiple-TA classes. But I encountered many
problems and challenges.
Although there is lots of
Crowdmark documentation
to get you
started,
as well as specific instructions at various
other
universities
and faculties
and departments,
and even though I had a very supportive
colleague,
I still made a number of mistakes and poor choices.
So, this document is an attempt to summarise the things I learned from my
Crowdmark experience,
that I wish I had known earlier --
to remind myself plus perhaps assist others.
I will update this document as warranted;
comments and suggestions
are welcome.
Using Crowdmark successfully requires lots of steps, which I summarise
and comment on below
(focusing specifically on in-person midterm tests in
large undergraduate courses).
Before the Midterm:
- Make up the test questions.
This is obviously essential.
But if you are using Crowdmark, then it is just the beginning!
- Format the test paper appropriately.
Leave room at the top of each page for Crowdmark's QR code; in LaTeX
that can be accomplished with "\setlength{\topmargin}{0in}".
Include a separate cover page at the beginning.
Leave plenty of space for each answer (so students don't need
to write in other places which can't easily be scanned).
If you are using Crowdmark's
automated matching (which I did
and it worked fairly well), then leave space from 9 to 18 cm
on the cover page for that.
And it's good to add one or two extra blank
page(s) at the end of your paper, in case students need more space.
[In case it helps, here is a possible
template
and its LaTeX source.]
Fortunately, if you upload your paper to Crowdmark
but aren't happy with the result, then it is fine to replace it
with a new version.
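To illustrate the layout points above, here is a minimal LaTeX sketch of such a template (the title, lengths, and question are placeholders of my own, not official Crowdmark requirements -- always check the generated booklet):

```latex
% Minimal template sketch (title, lengths, and question are my own
% placeholders, not official Crowdmark specifications).
\documentclass[12pt]{article}
\setlength{\topmargin}{0in}   % leave room at the top for the QR code

\begin{document}

% --- Cover page ---
\begin{center} {\Large Midterm Test} \end{center}
\vspace*{10cm}   % keep the 9--18cm region blank for automated matching
\noindent Instructions: write only in the designated answer spaces.
\newpage

% --- One question per page, with generous answer space ---
\noindent {\bf Question 1.} [10 points] \quad (question text here) \vfill
\newpage

% --- Extra blank page in case students need more space ---
\noindent (Extra space.) \vfill

\end{document}
```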
- Login to Crowdmark.
At
my university
you can do this at:
https://app.crowdmark.com/sign-in/utoronto
(Bookmark this page; you will need it often!)
Then choose "Sign in with Quercus", log in with your UTORid (possibly
requiring two-factor authentication), and then click "Authorize".
(Note: Although we use Quercus, similar comments probably apply with
another LMS like Canvas.)
- Import your course into Crowdmark.
Click on "Import a course", then choose the course and follow
the instructions.
(If importing multiple courses/sections, try to import them in a logical
order, since you can't change their order later!)
- Add the student list.
You should probably add the students now,
especially if you plan to use Crowdmark's
exam matcher app,
though if not, then you could choose to wait until later.
The simplest way is to click "Sync", but that annoyingly fails
to get their student numbers (which is a problem for e.g.
automated matching).
Alternatively (or in addition afterwards), first get Quercus to give you a
csv of student first and last names plus student number plus email address;
one way is to go to Settings / Navigation to add "UT Advanced Group
Tool", then go to that and select "Export Group Roster" / "Click here to
download the entire course roster"; an appropriate csv file
should then eventually appear in your email and/or Quercus inbox.
Then, in Crowdmark, you can select "edit the students" and
then manually upload the csv file.
(Note that your csv file has to include the students' emails, otherwise
Crowdmark will not accept it and will just hang forever!
And, make sure it identifies the correct column for student numbers.
Also, bizarrely, if you later want to add more students, you need
to upload a csv file which INCLUDES the previously-enrolled students,
otherwise they will be deleted!!)
[Added later: It seems Quercus has two numbers for each student -- their usual UofT Student Number, plus a six-digit (LMS) ID number which is needed for Quercus syncing. I'm told you can FIRST upload the students by syncing, and then AFTERWARDS upload a csv file containing emails and UofT Student Numbers to get the Student Numbers added too, though I haven't tried this yet.]
[Added even later: Apparently as of 2024, when syncing you now have the option to include the numerical student ID numbers. So this year I will try that!]
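Since a later csv upload replaces the whole roster, one safe approach is to merge any new students into your previous csv before uploading. Here is a small Python sketch of that merge (the column names and students are my own illustrations; use whatever your Quercus export actually contains):

```python
import csv

# Hypothetical column names -- adjust them to match your actual export.
FIELDS = ["Name", "Student Number", "Email"]

def merge_rosters(old_rows, new_rows):
    """Combine the previous roster with newly-added students (keyed by
    email), since an uploaded csv must still INCLUDE everyone who was
    enrolled earlier, or they will be deleted."""
    merged = {row["Email"]: row for row in old_rows}
    for row in new_rows:
        merged[row["Email"]] = row  # add new students, update existing ones
    return list(merged.values())

def write_roster(filename, rows):
    """Write the merged roster (including the required email column)."""
    with open(filename, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)

# Example: one previously-enrolled student plus one late addition.
old = [{"Name": "Alice Ng", "Student Number": "1001234567",
        "Email": "alice.ng@mail.utoronto.ca"}]
new = [{"Name": "Bob Roy", "Student Number": "1007654321",
        "Email": "bob.roy@mail.utoronto.ca"}]
write_roster("roster.csv", merge_rosters(old, new))
```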
- [Added later:
If multiple sections have a common final exam,
then for the exam
it seems best to create a new separate Crowdmark "class"
(under the "Crowdmark (no LMS)" Courses tab, not Quercus!),
and add all students to it manually using a single csv file,
and then later download the exam grades from Crowdmark as a separate
csv file, which you can then merge in using e.g.
VLOOKUP or
my csvlookup.c.]
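That VLOOKUP-style join can also be done in a few lines of Python; a sketch (all column names and values here are made up for illustration, so check your actual Crowdmark export):

```python
def grade_lookup(grade_rows, key_col, score_col):
    """Build a student-number -> score dictionary from the grades csv
    exported by Crowdmark (a VLOOKUP-style join in miniature)."""
    return {row[key_col]: row[score_col] for row in grade_rows}

# Made-up rows standing in for the downloaded Crowdmark grades csv:
grades = [{"Student Number": "1001234567", "Total": "37"},
          {"Student Number": "1007654321", "Total": "42"}]
lookup = grade_lookup(grades, "Student Number", "Total")

# Attach each exam grade to the combined multi-section class list:
classlist = [{"Student Number": "1001234567", "Name": "Alice Ng"},
             {"Student Number": "1007654321", "Name": "Bob Roy"}]
for student in classlist:
    student["Exam"] = lookup.get(student["Student Number"], "")  # "" if missing
```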
- Create the assessment.
Click on "Create assessment", then "New assessment",
and give it an appropriate name.
You can add your grading team if you want,
but for security it's probably better to wait
(just choose "Custom" and only yourself for now).
Then make it an "Administered" assessment for in-person exams.
Then decide if you will use double-sided printing (I will from now on),
and multiple choice (I did not),
and enable automated matching (I did and it worked fairly well).
- Create the test booklets.
Next you should upload your template (i.e. test paper) as a pdf file,
and get Crowdmark to generate the test booklets.
Crowdmark will then give you a huge pdf file, with one unique complete
test booklet for each student in your class (plus a few extras).
Have a look at this file and see if it seems good.
If not, you can upload a new template, and try again.
Once you're happy, you should also specify where each question begins,
and how many points it is worth.
[Be sure to generate plenty of booklets, since you can't add more later.]
- Separate out a few of the test booklets!
Use pdf editing software (e.g. "pdftk" in Linux) to separate
out the last few (say, 4 or even 8) booklets of the giant pdf file into
a separate pdf file, so you can send those to
Accommodated Testing
Services (ATS)
via CIS,
when test.exam@utoronto.ca (inevitably) requests some.
[In fact they sometimes make two different requests, so you might want to
split your few extra booklets into two.]
[But for final exams,
just give the exam office 18% extra booklets,
and then they will supply ATS themselves.]
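Before splitting, it helps to compute exactly which pdf pages the last few booklets occupy; here is a quick Python sketch (the booklet counts are made-up examples), whose output ranges can be fed to pdftk's "cat" command:

```python
def split_ranges(total_pages, pages_per_booklet, num_spare):
    """Return the 1-based page ranges of the main pile and of the last
    `num_spare` booklets (e.g. to send to ATS), in a form suitable for
    a command like:  pdftk booklets.pdf cat FIRST-LAST output part.pdf"""
    assert total_pages % pages_per_booklet == 0, "booklets must be equal length"
    cut = total_pages - num_spare * pages_per_booklet
    return (1, cut), (cut + 1, total_pages)

# Example: 140 booklets of 8 pages each; set aside the last 4 for ATS.
main, spare = split_ranges(140 * 8, 8, 4)
# main  == (1, 1088):    pdftk booklets.pdf cat 1-1088 output main.pdf
# spare == (1089, 1120): pdftk booklets.pdf cat 1089-1120 output ats.pdf
```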
- Print the rest of the test booklets.
Send the pdf file of the REST of the test booklets
(not including the ones you've saved for ATS) to the appropriate
staff person.
If you tell them it's a Crowdmark file, they should be able to get
it printed and stapled into separate booklets.
Make sure they deliver the printed booklets to you well
before your midterm time. Now you're ready to go!
- Prepare the Exam Matcher app (optional).
You can optionally use Crowdmark's
Exam Matcher app
(or mobile version)
to help your TAs match the students.
For this, you need to first add your students to Crowdmark as above.
Then, you should generate sufficient 5-digit tokens
(go to Settings within the assessment), and email them to your TAs.
Your TAs should then download the app and enter the token
on their own phones, within three
days of when you generated it.
This app isn't essential, and I didn't use it the first time.
But it helps with matching students to their papers
later, and it is no more difficult than taking attendance manually,
so I now recommend it.
During the Midterm:
- Administer the midterm.
This is pretty much the same as usual.
Distribute the printed booklets to the individual students.
Instruct the students to write in pen or sharp pencil, so
the scanner can pick up their writing.
Instruct the students to write only in the designated answer spaces
(perhaps including an extra last page),
allowing additional pages only as a last resort.
And in the case of multi-section courses, make sure students write down
their correct section, so they match up with the corresponding class list.
- Use the Exam Matcher app (optional).
If you have prepared Crowdmark's
Exam Matcher app as above,
then your TAs can take attendance during the exam by typing
(the beginning of) each
student's name and then scanning the corresponding paper's QR code.
Note: The TAs should keep track of any student who does not appear
on their app's classlist.
[If not using the app, then you may wish to take attendance manually.]
- Collect the papers.
Carefully collect all of the students' answer booklets at the end.
Pay special attention to any
extra pages where students needed extra space for their answer,
since those pages need to be handled separately as special cases
(which is why it's better to add extra blank
page(s) at the end of your original Crowdmark file instead).
Right After the Midterm:
- Count the papers.
It's best to know now how many papers there "should" be once they
are scanned and uploaded.
- Sort all of the booklets by their booklet number!
In principle this is unnecessary, since
you can upload the answer pages to Crowdmark in any
order to then be automatically sorted online.
But in practice, you will need to inspect some of the
original pages again, due to unreadable scan images,
and this is much easier if they are sorted.
And, they are much easier to sort before the staples are removed
-- trust me!
Scan the papers into Crowdmark:
(Note: It is possible that
our
kind
staff
will scan and upload your test papers themselves, if you add
ug.statistics@utoronto.ca
as an
Uploader or Facilitator for the assessment on Crowdmark.
If so, then this is a great offer which you should definitely accept,
and leave the steps in this section to them!)
- Remove the staples.
Carefully remove the staples from each booklet,
e.g. by cutting off the corners
with large scissors. But don't cut off too much,
to avoid losing actual student work or making scanning more difficult.
And be careful to keep the booklets together in order!
- Scan the answer pages.
Carefully carry the unstapled pages to the scanner -- in
my department
the scanner is in an open hallway
next to room 9122.
Set the resolution (dpi) and contrast to maximum
(thanks),
and specify the page size (8.5 x 11 portrait),
and probably use "batch" mode
(though multiple files are fine too).
Set it to save the scans to your USB memory stick
(or it might be possible to have it emailed instead).
You can then carefully use the automatic feeder to scan the pages.
(If the scanner fails to notice your pages due to missing corners,
then jiggle them or press on the plastic sensor manually.)
The pages might get jammed or fail to feed, and need to be re-scanned.
Be sure to scan everything -- it is better to scan some pages twice
than not at all.
And be sure to preserve the page order!
- Inspect the scan files.
Despite your best efforts, some scans might have come out rotated or
mis-sized, or be illegible, or include blank pages, etc.
Inspect the resulting pdf files carefully to look for errors.
Fix them up as best you can, and re-scan as necessary, before proceeding.
Fixing problems now will save time later!
- Upload the scan files to Crowdmark.
Upload your scan pdf file(s) to your Crowdmark
assessment page, and wait for the processing. Crowdmark will then tell
you how many complete and incomplete booklets it has scanned.
If you're lucky, the number will match your previous count and they will
all be complete. If not, then some pages didn't scan properly. Check
Crowdmark's list of errors to see if you can fix them (by manually
identifying the pages that Crowdmark could not).
Otherwise, find the corresponding physical pages (much easier if your
pile is well sorted and preserved!), and re-scan and re-upload as necessary.
Fortunately, Crowdmark will ignore repeatedly uploaded pages, so it's
okay to re-upload lots of pages just to fix a few if necessary.
- Securely store the physical student papers (in order).
You will probably need to check some of them again later.
The Grading Phase:
- Add the student list.
If you haven't already (e.g. to use the exam matcher app),
then add the students now; see above.
- Match the booklets to the students.
Get Crowdmark to use the automated matching from your
student list and/or the exam matcher app, to identify
which exam booklet is for which student.
Some students might fail to match (my automated matching got 122/138),
in which case you can match them manually in Crowdmark.
- Add your TAs to your assessment team.
You can do this either by syncing with Quercus
or adding them by email address.
Note that you first have to add them to the course team (if you haven't
already), and then to the specific assessment's team.
You can then send them an email "invitation" to join.
- Get your TAs to do the grading.
Email the TAs your usual grading instructions, and tell them about
their Crowdmark invitation.
Then they can
grade on Crowdmark,
including using Crowdmark's
features
and tools
to quickly add comments etc.
As a bonus, you can monitor your TAs' grading online, both to
check their progress and to guide their work (e.g. review their
first few grades and suggest adjustments).
[They can also optionally
tag
answers which require further investigation.]
- Deal with any extra pages.
If a few students needed extra pages for their answers, and you allowed
that, then separately scan and email those pages
to your TAs to take into account in the grading.
(I told the TAs to note on the Crowdmark page that they had seen
the extra page. If you included extra blank page(s) in your
test booklets, then hopefully this will not be an issue.)
- Include the ATS papers.
Some days later, the few papers written in
ATS
will arrive.
Scan those separately, and upload the new pdf file to Crowdmark
(assuming you gave separate booklets to ATS as above, otherwise
this won't work!).
Inform your TAs of the extra papers, so they will grade them too.
- Investigate unreadable pages.
Your TAs should make a list of any answer scans that are
too faint, or cut off, or scanned badly, or otherwise unreadable.
You then need to find the corresponding physical pages (easier if your
pile is well sorted and stored!).
Then, either re-scan them and email them directly to your TAs
(you can't re-upload the same pages to Crowdmark),
or simply grade those few pages yourself (which I did).
And, update the corresponding grades directly in Crowdmark.
Wrapping Up:
- Make sure you are done.
You can monitor the grading on Crowdmark, and hopefully also receive
email reports from your TAs, so you can see when the grading is all
done and no questions are left ungraded.
- Save the grades.
Once all the grading is finally complete, then the grades are
automatically tabulated. They can then be seen
in the "Results" tab,
and saved as a csv file for backup.
(You can also download a huge pdf file of all the students' graded papers,
which is good for backup purposes.)
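As a quick sanity check on the downloaded csv before filing it away, you can tally it in a few lines of Python (the rows and column names below are invented for illustration; the real export's columns may differ):

```python
import statistics

# Invented rows standing in for Crowdmark's "Results" csv export --
# check the column names in your actual downloaded file.
results = [{"Name": "Alice Ng", "Total": "37"},
           {"Name": "Bob Roy", "Total": "42"},
           {"Name": "Carol Wu", "Total": ""}]  # e.g. an absent student

scores = [float(r["Total"]) for r in results if r["Total"] != ""]
print(f"graded {len(scores)} of {len(results)} students;",
      f"mean {statistics.mean(scores):.1f}, max {max(scores):.0f}")
```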
- Share the grades.
Once you are satisfied with the grades, under Crowdmark's "Results" tab
they can be exported to Quercus's gradesheet (after which you can "publish"
the grades there),
and shared with the students (at which point they will receive
an email with a link to view their graded answers; perhaps
first add a message by editing the "Student view settings").
(That all worked fine for me, except for Quercus Sync, which
doesn't always work and sometimes has to be attempted multiple times.
Also, strangely, you cannot sync or export grades while the assessment
is "Locked".)
One bonus is that, since you upload the grades to Quercus
yourself, your TAs'
role
on Quercus could be just "Designer" instead
of "TA" so they cannot adjust the grades there.
- Deal with regrade requests.
After the students receive their grades, some
will inevitably request regrades or higher marks.
Since there is no physical paper to exchange, such requests
can be handled purely by email.
Any Crowdmark-related technical/scanning
issues should of course be investigated directly,
including comparing physical pages to scanned pages as necessary.
Other regrade requests are essentially the same as usual (though
I recommend a firm policy).
Fortunately, since you still have the originals, students
cannot modify their answer before the regrade.
[Note: When reviewing grading, the TA's comments might sometimes
BLOCK some student writing. Seeing what's underneath seems to
require either undesirable cloning/adjusting/replacing of the grading,
or (better) searching for the student and clicking "score" in their
three-dot menu to show the grading in student view
with a (local) "Hide feedback" option at the top.]
Final Thoughts -- Was It Worth It?
It is clear from the above that Crowdmark introduces lots of additional
challenges and hassles into the grading process -- much more than I expected.
Hopefully, these will diminish with more experience.
On the other hand, Crowdmark definitely does provide certain
advantages compared to traditional paper marking, such as:
- Crowdmark avoids the risk of student papers being lost or damaged when
passed from TA to TA -- the scanned files are easily saved and backed
up, and the original booklets remain safe in the instructor's
(or departmental) office.
- Grading is more convenient in some ways: it can be done anywhere, without
the need to pick up or transport physical pages, comments can be typed
online and re-used, etc.
(Though at least one of my TAs said he actually prefers physical grading.)
- The instructor can monitor and review the grading at any time.
- The final grades can be automatically totalled and uploaded to Quercus,
avoiding the tedium and error-risks of manual adding and entering.
- Students cannot alter their paper before requesting a regrade.
Probably I will keep using Crowdmark, especially to avoid the risk
of lost or damaged papers.
But on the question of whether this is really a good advance,
the jury is still out.
-- Jeffrey S. Rosenthal, October 2022
(contact me)
Postscript:
A Crowdmark employee
told me that this web page inspired them to create a new
Crowdmark cheat sheet.
So, that's something!