We need to think beyond the screen and find more exciting ways for people to interact with information and ideas, according to MIT Media Lab’s Hiroshi Ishii.
Speaking at The Economist’s Technology Frontiers event, he described the birth of astronomy and the subsequent invention of the orrery, a mechanical representation of the solar system. “It is one of my favourite physical devices in terms of how it presents knowledge and information,” said Ishii. He also talked about the abacus, where information is physically embodied in beads. Children can grab it, and it can become a train, a musical instrument or a tool to scratch your back. “How do we present ideas so people can interact with them and understand?”
For Ishii, we need to involve the body. “These days computers dominate. Everything is pixels, intangible.” He described looking at information on a screen as if it is something at the bottom of the sea — “you can see but you can’t touch”. People must depend on a remote control like a mouse or a keyboard.
Ishii’s work focuses on trying to bring information up to the “surface of the water”, exposing portions to the physical domain. This is what he refers to as “tangible bits”: physical embodiments of digital information. This is necessary to make computation more “legible”. He said: “Today’s computation is so complex — everything is hidden in a black box.”
Some of the projects Ishii showed on stage were extraordinary. One of the first examples was MusicBottles — glass bottles which played classical music when you opened their lids. He also showed the I/O Brush, which looks like a regular paintbrush and can be wiped over real-world objects to “pick up” their colour and movement, which can then be painted back onto a canvas using “digital ink”. The tool inspires children to become creators.
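The I/O Brush’s core idea — sample the colour under the brush tip, then deposit it elsewhere as digital ink — can be sketched in a few lines. This is an illustrative approximation only; the real I/O Brush uses a camera and lights embedded in the brush head, and the function and variable names here are invented for the example.

```python
def pick_up_colour(pixels):
    """Average the RGB pixels under the brush tip.

    `pixels` is a list of (r, g, b) tuples — a stand-in for a
    camera frame; the real device captures texture and movement too.
    """
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (round(r), round(g), round(b))

def paint(canvas, position, colour):
    # Deposit the sampled colour as "digital ink" at a canvas position.
    canvas[position] = colour

# Wipe the brush over a red object, then paint onto the canvas.
canvas = {}
ink = pick_up_colour([(250, 10, 5), (240, 20, 15)])
paint(canvas, (3, 7), ink)
```

The averaging step is the simplest possible model of “picking up” a colour; a richer sketch would store a patch of texture rather than a single value.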
Topobo is a robotic construction kit with “kinetic memory”. Children can assemble robots using a range of different modular components built around a central body. They can then press a record button on the body and manipulate their robot through a sequence of motions using their hands. The robot learns that sequence and can play it back at the touch of a button.
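The record-and-playback loop described above can be sketched as follows. This is a hypothetical model, not Topobo’s actual firmware: while recording, the robot samples its joint angles as the child moves it by hand; on playback, it drives the joints through the same sequence. All class and method names are invented for the example.

```python
class KineticMemory:
    """Minimal sketch of record-and-playback "kinetic memory"."""

    def __init__(self):
        self.frames = []        # list of (timestamp, {joint: angle})
        self.recording = False

    def start_recording(self):
        self.frames = []
        self.recording = True

    def sample(self, t, joint_angles):
        # Called periodically while the child moves the robot by hand.
        if self.recording:
            self.frames.append((t, dict(joint_angles)))

    def stop_recording(self):
        self.recording = False

    def play_back(self, set_joints):
        # Replay the recorded motion in order. Real hardware would also
        # wait out the recorded time gaps between frames.
        for _t, angles in self.frames:
            set_joints(angles)

# A child records a two-step motion, then replays it.
memory = KineticMemory()
memory.start_recording()
memory.sample(0.0, {"hip": 10, "knee": 0})
memory.sample(0.5, {"hip": 45, "knee": 20})
memory.stop_recording()
memory.play_back(print)   # each frame would drive the servos
```

Keeping timestamps alongside the angles is what lets a real implementation preserve the rhythm of the child’s original motion, not just its shape.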
Ishii specialises in vision-driven research, which means not having to think about the cost or the market. He said: “Many research projects are driven by technology, but the technologies of today go into landfill next year. Applications will be obsolete in 10 years. Vision is very strong and survives our lifespan.”
When asked whether he was interested in creating touchscreen devices with haptic feedback, he said that he wasn’t interested. “A touchscreen is still a screen fundamentally. My lifespan is running out so I need to focus on the most crazy, edgy stuff — not touchscreens.”