The SAM Blog

Why EdTech Startups Should Care About Research

Deciding which EdTech solutions best fit the goals of schools, teachers, or parents is a complex and personal process. A big part of that is positioning the EdTech offering in the most helpful way, so users can quickly and easily identify when a product is the right fit for them. Identifying the specific needs each EdTech product meets is one of the most important, yet hardest, parts of a startup’s journey, especially when distinguishing between what the product is designed for and how users adopt the product for their own purposes.

Researchers are experts at formulating questions and finding ways of gathering evidence to answer them, building on sound methodologies to ensure the validity of those answers. And that is why EdTech startups should be interested in research: research means evidence. Whether a startup is looking to measure efficacy, explore how to better align its product with users’ needs, or identify the most relevant questions to ask and answer about the relationship between a product and its users, researchers have been building the toolbox needed to approach these issues.

Working specifically towards this purpose is the EDUCATE program, which brings together startups, educators and researchers to support the evaluation of EdTech products and increase their impact. Founded by Rose Luckin, Professor of Learner Centred Design at the UCL Knowledge Lab, it makes research evidence and practice easily accessible for relevant stakeholders to advance the efficacy of educational technologies.

SAM Labs has a long history of engaging with research, starting with the collaboration between Joachim Horn, SAM Labs’ founder, and Rose Luckin, EDUCATE’s founder and my PhD supervisor. My own research investigates the role of data visualisations in supporting project-based learning using physical computing kits such as SAM Labs.

Project-based learning is rooted in constructivist, student-centred, growth-based pedagogies which emphasise a trial-and-error learning process over perfect end results. The design of SAM Labs kits is well aligned with existing research on what makes construction kits powerful tools for learning. However, planning and guidance in these environments can often be challenging: many educators struggle to effectively support their students without taking away the freedom and benefits inherent in project-based learning activities.

In my research, I explore the potential of using data visualisations to expose the iterative design process students engage in when using construction kits like SAM Labs. The assumption is that a more accurate understanding of what that iterative design process looks like will allow us to support it better. Several prototypes were designed and evaluated with educators who are experienced users of SAM Labs, in a bid to validate their potential.

The first example is an interactive visualisation where teachers can review all the different ways in which students have attempted to implement a specific SAM Labs behaviour. In the figure below, the final implementation of a sensor-triggered flap door is displayed in the bottom right, and each horizontal bar in the top right corner represents an earlier implementation of that flap door. The green and grey colours indicate whether SAM Labs components were used validly: green marks correct usage, while grey marks one or more incorrect connections.

Data visualisation of a SAM Labs sensor-triggered flap door project, with horizontal lines going upwards from left to right.
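To make that colour encoding concrete, here is a minimal sketch in Python. The attempt data and the colour rule are illustrative only, not the actual prototype’s implementation:

```python
# Hypothetical sequence of a student's attempts at one behaviour, oldest
# first; True means every SAM Labs connection in that attempt was valid.
attempt_validity = [False, False, True]  # final attempt works

def bar_colour(all_valid):
    """Green when every connection in the attempt is valid, grey when one
    or more connections are incorrect (the encoding described above)."""
    return "green" if all_valid else "grey"

colours = [bar_colour(v) for v in attempt_validity]
# → ['grey', 'grey', 'green']
```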

In a second prototype, all the SAM Labs components and connections that a student has used throughout a whole project are displayed in a single view. The width of each line represents how many times that connection was used in the student’s experimentation.

Data visualisation of SAM components and connections, connected by lines.
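The underlying idea can be sketched in a few lines, assuming a hypothetical log of (source, target) component pairs rather than real SAM Labs data or identifiers:

```python
from collections import Counter

# Hypothetical log of connections a student made while experimenting;
# component names are made up for illustration.
connection_log = [
    ("light_sensor", "led"),
    ("light_sensor", "led"),
    ("light_sensor", "servo"),
    ("button", "buzzer"),
    ("light_sensor", "led"),
]

def connection_widths(log, base_width=1):
    """Map each unique connection to a line width proportional to how
    many times it appears in the student's experimentation."""
    counts = Counter(log)
    return {conn: base_width * n for conn, n in counts.items()}

widths = connection_widths(connection_log)
# ("light_sensor", "led") was used three times, so it gets the widest line.
```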

A third view exposes the work kept versus the work discarded by each student over the course of the project. In the figure below, the orange bars represent connections between two SAM Labs components that were added and subsequently discarded from the final design, whilst the blue bars represent connections kept in the student’s final SAM Labs implementation.

Data visualisation of SAM components added, discarded, or kept in student implementations.
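The kept-versus-discarded split behind this view can also be sketched simply, again using hypothetical connection data rather than the prototype’s real inputs:

```python
# Hypothetical per-student data: every connection added during the project,
# and the set of connections present in the final implementation.
added = [
    ("light_sensor", "led"),
    ("light_sensor", "servo"),
    ("button", "buzzer"),
    ("tilt_sensor", "servo"),
]
final_design = {("light_sensor", "servo"), ("tilt_sensor", "servo")}

def classify_connections(added, final_design):
    """Split connections into 'kept' (blue bars) and 'discarded'
    (orange bars), preserving the order in which they were added."""
    kept = [c for c in added if c in final_design]
    discarded = [c for c in added if c not in final_design]
    return kept, discarded

kept, discarded = classify_connections(added, final_design)
```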

Using these views, teachers can get closer to students’ thinking processes and intentions, and ground their support in what the students are trying to achieve. For example, teachers can strengthen students’ metacognitive skills by prompting them to articulate their decision-making process between iterations, or intervene based on the volume and nature of their experimentation.

These can lead to more effective guided reflection in the classroom, informed by the students’ goals, that emphasises process independently of end results. In addition, patterns emerge in students’ experimentation behaviours which can lead to better ways of designing the projects in the first place.

As a researcher, I used established methodologies to enquire into students’ iterative design process using the SAM Labs kits, and reached out to teachers to validate my findings. I know other researchers have engaged with the SAM Labs kits in different contexts, from different perspectives, using different methodologies to answer new questions. And that is the beauty of research: it provides a varied array of tools to gather evidence best suited to the context of the questions being asked. We all have something to learn from being a little more research-minded.