Spotlight on Startups: teachers can cut exam marking time in half with Crowdmark
Posted October 10, 2013
It’s been several long weeks since the excitement of back-to-school, meaning teachers are starting to drown in their homework and exam marking right about now.
But what if time spent marking could be drastically reduced, while the quality of evaluation and communication between teachers and students could be improved?
University of Toronto (U of T) mathematics professor James Colliander’s startup, Crowdmark, is poised to revolutionize exam marking and eliminate hours of inefficiency through a new tool developed with the help of several of U of T’s entrepreneurship support systems.
Crowdmark’s story is part of a new U of T News series called Spotlight on Startups, which profiles the entrepreneurial efforts behind the hundreds of companies spun out of the research and connections sparked every day at the University of Toronto:
U of T hosts more than 50 enterprise-fostering courses, programs, labs, clubs, contests and speaker series across its faculties, departments and campuses—and that doesn’t even include the innovations developing in informal settings. U of T ranks No. 1 in Canada and No. 3 in North America for number of startups launched. And its roster of spin-off companies driving innovation in Toronto and around the world continues to grow.
Colliander explained how Crowdmark came about.
What is Crowdmark?
There are 28 million teachers worldwide. They spend, on average, two to three hours per week evaluating student work. But marking as it’s generally done now is tremendously inefficient, with excessive paper shuffling, repetitive commenting and difficulty coordinating work among multiple evaluators.
Crowdmark helps teachers grade better and faster. It is a web application that provides a new, streamlined workflow, free of those obstructions.
How did you arrive at the idea?
Crowdmark was inspired by the logistical nightmare of grading the Canadian Open Mathematics Challenge in 2011. There were banker’s boxes upon banker’s boxes stuffed with FedEx envelopes, each envelope holding tens of exams and each exam containing 14 pages of hand-written answers to mathematics contest problems.
We used skilled human volunteers to assess 70,000 pages of hand-written math papers. Some of the volunteers could only mark pages one, two and three. Others would only grade pages 10, 11, 12, etc.
I remember seeing one of my colleagues, a strong research mathematician, waiting for a pile of exams to mark. Frustrated, I muttered “serialization” under my breath, then asked myself, “How can I parallelize the grading workflow?” That’s when the first idea came to me.
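The core of that insight, splitting exams by page so graders never wait on each other, can be sketched in a few lines. This is an illustrative example only, not Crowdmark’s actual implementation; the function name, grader labels, and round-robin scheme are assumptions for the sketch.

```python
# Illustrative sketch (not Crowdmark's actual code): parallelize grading by
# assigning each grader a fixed set of pages, so every grader's queue is
# independent and all queues can be worked through simultaneously.
from collections import defaultdict

def assign_pages(num_exams, pages_per_exam, graders):
    """Map each (exam, page) task to a grader who specializes in that page."""
    assignments = defaultdict(list)  # grader -> list of (exam, page) tasks
    for page in range(pages_per_exam):
        grader = graders[page % len(graders)]  # round-robin pages to graders
        for exam in range(num_exams):
            assignments[grader].append((exam, page))
    return assignments

# 5,000 exams of 14 pages each (roughly the 70,000 pages described above),
# split across three hypothetical graders.
work = assign_pages(num_exams=5000, pages_per_exam=14, graders=["A", "B", "C"])
```

Because each grader only ever sees their own pages, the three queues can be processed concurrently instead of passing whole exams around serially, which is the bottleneck the anecdote describes.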
The opportunities that emerged from that “Aha!” moment unfolded after a long time of thinking and collaborating with others on the Crowdmark team.

How did U of T support you and help you develop the concept?
Why did you choose to work with UTEST as opposed to other U of T incubators?
I chose to work with UTEST because I was convinced they offered a good deal for getting Crowdmark started and that they had access to the business expertise I required as a new entrepreneur. I was also attracted to the program because of the way it changed the U of T patent policy so that the invention underpinning Crowdmark could be owned by the company. Also, Lyssa Neel, formerly of MaRS Innovation, co-founded UTEST and was my principal business mentor. She rocks.
You put Crowdmark to the test with a major math competition. What was that experience like, and what did you learn from it?
The Canadian Open Mathematics Challenge, which inspired the concept for Crowdmark in the first place, was our first live test of the Crowdmark platform. It involved 150 graders from eight Canadian universities working simultaneously over a weekend. We knew there would be bugs, but we also knew this was a great opportunity.
The experience was stressful and exhilarating. Crowdmark has visual indicators showing the percentage of the work that has been completed by the marking team. As these progress bars advanced, the team began to celebrate. The experience was a bit like watching election returns.
We learned a lot from this experiment. The Crowdmark idea and our execution in working code achieved proof-of-concept. We also identified many ways the application had to improve to become a useful tool without requiring a dedicated team of software developers supporting the grading activity.

Any interesting news or new testing scenarios since then that you want to mention?
Yes! Crowdmark has been validated as an exciting technology by teachers in elementary school: it was recently used by groups of teachers at Golf Road Junior Public School to carry out moderated marking of grade 3 and grade 6 mock exams. The experiment was a tremendous success! (Read more about the experiment on the Crowdmark blog.)
Joseph Romano, who led the pilot at Golf Road, told us that Crowdmark cut teachers’ marking time in half and provided a smooth, easy-to-use interface to facilitate both individual and group marking from home. (Read about Joseph Romano’s testimonial here.)
Crowdmark was also invited by Jason Calacanis to demonstrate our product at the Launch Education and Kids event this past June—a conference on education, innovation and kid-centric startups. Our presentation was a success and has generated a lot of interest from educators worldwide. (Watch a video from the event, below.)

What’s next for Crowdmark?
Crowdmark is developing into a business that empowers teams of skilled humans to work together and exponentially improve global education.
More about UTEST:
The UTEST Program is a joint initiative of the University of Toronto, the Connaught Fund and MaRS Innovation to provide early-stage software companies with start-up funding, work space, mentoring and business strategy support. UTEST gives passionate U of T entrepreneurs the opportunity to transform an idea into a product at a stage typically too early for traditional technology incubators.