In anticipation of Scratch Day we have been experimenting in the Tinkering Studio with the awesome video sensing ability that is built into Scratch 2.0. As usual, we spent some time playing and prototyping ideas that make use of video sensing among ourselves before trying it out with the public. One of the big hurdles in the context of the Tinkering Studio is how to get visitors engaged meaningfully, within a short time, with something quite complex: programming an interactive animation using an unfamiliar environment (Scratch) and making use of advanced tools (video sensing from a webcam). Here is how I approached it today.
I started with a simple program running: a parrot flies across the screen, and when “captured” by the net it disappears. To make it reappear, you have to shake the tree with the net. This is achieved simply by having the sprites respond to the color of the net (off-white). I first encouraged kids to play with the program as it was set up and pointed out its various parts: the camera, the virtual sprites, and the code animating the parrot. After a while I asked if they would like to add their own character to the animation and make it do something. They all enthusiastically said yes! I encouraged them to make their own character using the construction paper available.
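For readers who want to recreate something similar, here is a rough sketch of the starting scripts in Scratch-style blocks. The exact blocks, values, and message names in the original project are my reconstruction, not a verbatim copy:

```
parrot sprite:
  when green flag clicked
  forever
    move 10 steps
    if on edge, bounce
    if <touching color [off-white]?> then
      hide

  when I receive [tree shaken]
  show

tree sprite:
  when green flag clicked
  forever
    if <touching color [off-white]?> then
      broadcast [tree shaken]
```

The key idea is that the physical net is invisible to the program: both sprites simply react to its color appearing in the camera image.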
Once a character was created, I helped them bring it into Scratch using the camera function and the magic wand tool to get rid of the background. Having a camera already mounted pointing straight down at the stage made this super easy and fast. Having a character they created in physical form transported into a virtual world was at once magical and meant kids were immediately invested in what happened to it. I asked a pretty open-ended question: “What would you like your character to do?” and, if necessary, gently nudged them toward thinking about movement first. How should it move?
From there I pointed out the Motion section of Scratch and had them drag a couple of initial blocks (like “move 10 steps” and “turn 15 degrees”) into the scripts area, start clicking on them, and notice what happened to the sprite. I encouraged them to play around with the values and notice how the movement changed, then to snap two or more blocks together and see what happens when you string commands together. Finally I revealed the repeat and forever loops as a way to avoid clicking repeatedly on the code and to make the character move autonomously. This led to a more intentional phase of experimentation with values and blocks to see what kind of movement they could get out of their character. I found it very interesting that every kid had a clear idea of how their character should move, determined by the nature of the character itself, and that led to very different bits of code and behaviors. A butterfly moves very differently than a dragon, naturally!
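As a sketch of what this progression ends up looking like (the specific values are just examples):

```
when green flag clicked
forever
  move 10 steps
  turn 15 degrees
```

Wrapping the same two blocks in a forever loop is all it takes to go from clicking each command by hand to a character that moves on its own.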
Once the visitors were satisfied with the movement of their creature, I re-introduced the net from the beginning, asking them now what should happen to the character when it is captured by the net. Once again, every kid had a different idea in mind for what their character should do in that situation!
Jade’s butterfly moves erratically and very fast on the screen, and when captured by the net it disappears for 10 seconds, then reappears.
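Jade’s script might have looked roughly like this; the random-motion values are my guess at how to get erratic, fast movement, while the 10-second disappearance is hers:

```
when green flag clicked
forever
  turn (pick random -180 to 180) degrees
  move (pick random 10 to 30) steps
  if on edge, bounce
  if <touching color [off-white]?> then
    hide
    wait 10 seconds
    show
```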
Jailen and Jayden’s dragon (the similarity between all the kids’ names is purely coincidental!) glides smoothly on the screen and when captured it breathes fire. Of course a dragon’s fire breath is blue, didn’t you know? In this case it also required a trip to the Costumes tab where the kids duplicated their sprite and hand-drew the flame, then we worked out how to switch costumes based on whether the net was touching the sprite or not.
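The costume-switching logic we worked out can be sketched like this (the costume names are hypothetical):

```
when green flag clicked
forever
  if <touching color [off-white]?> then
    switch costume to [dragon with blue fire]
  else
    switch costume to [dragon]
```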
The most interesting part to me is that when introducing the net as an interactive device, the first thing kids said was something along the lines of “I want it to do X when the net catches it.” When I pointed out that the computer doesn't know about the net, it can only detect either color or motion, everyone autonomously came up with the solution of having the sprite react when touching “white.” I think this is a good example of abstracting a high level goal into a set of instructions that a computer can understand and work with.
All these interactions lasted around 20 to 30 minutes, and I think that for such a short engagement they resulted in meaningful and authentic exploration of programming, Scratch, and a fairly sophisticated technology such as video sensing. This is definitely a more scaffolded and guided approach than we usually adopt in the Tinkering Studio with lower-threshold activities, but perhaps in this case it is the better one. I also noticed that many of the parents who were not previously aware of Scratch were very impressed with how easy it is to introduce programming concepts and practices, and mentioned wanting to continue playing with it at home. The fact that Scratch itself is free and this particular approach only uses a webcam and readily available materials certainly contributed to them feeling they could do so easily.
This work was supported by a grant from Science Sandbox, an initiative of the Simons Foundation.
This project was made possible through the generous support of the LEGO Foundation.