We grew up interacting with the physical objects around us. There is an enormous number of them that we use every day. Unlike most of our computing devices, these objects are much more fun to use. When you talk about objects, one other thing automatically comes attached to them, and that is gestures: how we manipulate these objects, how we use these objects in everyday life.
We use gestures not only to interact with these objects, but we also use them to interact with each other. A gesture of "Namaste!", maybe, to respect someone; or maybe, in India, I don't need to teach a kid that this means "four runs" in cricket. It comes as a part of our everyday learning. So, I have been very interested, from the beginning, in how our knowledge about everyday objects and gestures, and how we use these objects, can be leveraged in our interactions with the digital world.
Rather than using a keyboard and mouse, why can I not use my computer in the same way that I interact in the physical world? So, I started this exploration around eight years back, and it literally started with a mouse on my desk. Rather than using it for my computer, I actually opened it. Most of you might be aware that, in those days, the mouse used to come with a ball inside, and there were two rollers that actually tell the computer where the ball is moving, and, accordingly, where the mouse is moving.
So, I was interested in these two rollers, and I actually wanted more, so I borrowed another mouse from a friend — never returned it to him — and I now had four rollers. Interestingly, what I did with these rollers is, basically, I took them off of these mice and then put them in one line. It had some strings and pulleys and some springs. What I got was basically a gesture-interface device that actually acts as a motion-sensing device, made for two dollars.
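The idea behind that two-dollar device can be sketched as simple dead reckoning: each roller reports how far string has moved over it, and opposing pairs combine into x/y displacement. The pairing of rollers and the tick-to-millimetre scale below are assumptions for illustration, not details from the talk.

```python
# Sketch of dead-reckoning with four rollers (hypothetical tick counts).
# Assumption: rollers are mounted in opposing pairs along x and y, and
# each tick corresponds to a fixed string displacement (MM_PER_TICK).

MM_PER_TICK = 0.5  # assumed scale; the real value depends on roller diameter

def position_from_ticks(tick_log):
    """Integrate per-frame tick counts (left, right, up, down) into an
    (x, y) position in millimetres."""
    x = y = 0.0
    for left, right, up, down in tick_log:
        x += (right - left) * MM_PER_TICK  # net horizontal string motion
        y += (up - down) * MM_PER_TICK     # net vertical string motion
    return (x, y)

# Moving 4 ticks right and 2 ticks up in one frame:
print(position_from_ticks([(0, 4, 2, 0)]))  # -> (2.0, 1.0)
```

Integrating relative motion like this is exactly what the mouse hardware already did for its ball; the hack simply repurposed the same rollers for free-space movement.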
So, here, whatever movement I do in my physical world is actually replicated inside the digital world, just using this small device that I made around eight years back, in 2000. Because I was interested in integrating these two worlds, I thought of sticky notes. I thought, "Why can I not connect the normal interface of a physical sticky note to the digital world?" A message written on a sticky note to my mom, on paper, can become an SMS, or maybe a meeting reminder automatically syncs with my digital calendar — a to-do list that automatically syncs with you.
But you can also search in the digital world, or maybe you can write a query, saying, "What is Dr. Smith's address?" and this small system actually prints it out — so it actually acts like an input-output system, just made out of paper. In another exploration, I thought of making a pen that can draw in three dimensions. So, I implemented this pen that can help designers and architects not only think in three dimensions, but actually draw, so that it's more intuitive to use that way.
Then I thought, "Why not make a Google Map, but in the physical world?" Rather than typing a keyword to find something, I put my objects on top of it. If I put a boarding pass on it, it will show me where the flight gate is. A coffee cup will show where you can find more coffee, or where you can trash the cup. So, these were some of the earlier explorations I did, because the goal was to connect these two worlds seamlessly.
Among all these experiments, there was one thing in common: I was trying to bring a part of the physical world to the digital world. I was taking some part of the objects, or any of the intuitiveness of real life, and bringing it to the digital world, because the goal was to make our computing interfaces more intuitive. But then I realized that we humans are not actually interested in computing. What we are interested in is information.
We want to know about things. We want to know about dynamic things going on around us. So, around the beginning of last year, I started thinking, "Why can I not take this approach in the reverse way? How about I take my digital world and paint the physical world with that digital information?" Because pixels are, right now, confined to these rectangular devices that fit in our pockets.
Why can I not remove this confinement and take it to my everyday objects, my everyday life, so that I don't need to learn a new language for interacting with those pixels? So, in order to realize this dream, I actually thought of putting a big projector on my head. I think that's why this is called a head-mounted projector, isn't it? I took it very literally: I took my bike helmet and put a little cut over there so that the projector actually fits nicely.
So now, what I can do is augment the world around me with this digital information. But later, I realized that I actually wanted to interact with those digital pixels, too. So I put a small camera over there that acts as a digital eye. Later, we moved to a much better, consumer-oriented pendant version of that, which many of you now know as the SixthSense device.
But the most interesting thing about this particular technology is that you can carry your digital world with you wherever you go. You can start using any surface, any wall around you, as an interface. The camera is actually tracking all your gestures. Whatever you're doing with your hands, it's understanding that gesture. And, actually, if you see, there are some color markers that we were using with the beginning version.
The thrilling potential of 6th Sense technology
You can start painting on any wall. You stop by a wall, and start painting on that wall. But we are not only tracking one finger here. We are giving you the freedom of using both of your hands, so you can actually use both of your hands to zoom into or zoom out of a map, just by pinching. The camera is actually just capturing all the images, doing the edge recognition and also the color recognition, and so many other small algorithms are going on inside.
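A minimal sketch of the two pieces just described: color recognition finds each fingertip marker in a frame, and the pinch-to-zoom gesture follows from how the distance between two fingertips changes over time. The frame format, marker colors, and exact-match test here are simplifying assumptions, not the talk's actual algorithms.

```python
import math

def marker_centroid(frame, color):
    """frame: list of rows of (r, g, b) pixels. Returns the (x, y)
    centroid of all pixels exactly matching `color`, or None if absent.
    (A real system would match a color range, not exact values.)"""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, px in enumerate(row):
            if px == color:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def zoom_factor(p_prev, q_prev, p_now, q_now):
    """Ratio of fingertip separation now vs. before: >1 means zoom in."""
    return math.dist(p_now, q_now) / math.dist(p_prev, q_prev)

# Two red-marker pixels in a tiny 2x2 frame:
frame = [[(0, 0, 0), (255, 0, 0)],
         [(255, 0, 0), (0, 0, 0)]]
print(marker_centroid(frame, (255, 0, 0)))  # -> (0.5, 0.5)

# Fingertips moving apart from 2 px to 4 px means a 2x zoom:
print(zoom_factor((0, 0), (2, 0), (0, 0), (4, 0)))  # -> 2.0
```

Tracking a distinct color per finger is what lets the system distinguish all four fingertips at once with a single cheap camera.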
So, technically, it's a little bit complex, but it gives you an output which is more intuitive to use, in some sense. But I'm more excited that you can actually take it outside. Rather than getting your camera out of your pocket, you can just make the gesture of taking a photo, and it takes a photo for you. (Applause) Thank you. And later I can find a wall, anywhere, and start browsing those photos, or maybe, "OK, I want to modify this photo a little bit and send it as an email to a friend." So, we are looking at an era where computing will actually merge with the physical world.
And, of course, if you don't have any surface, you can start using your palm for simple operations. Here, I'm dialing a phone number just using my hand. The camera is not only understanding your hand movements but, interestingly, is also able to understand what objects you are holding in your hand.
For example, in this case, the book cover is matched against the many thousands, or maybe millions, of books online, to check which book it is. Once it has that information, it finds more reviews about it, or maybe the New York Times has a sound overview on it, so you can actually hear, on a physical book, a review as sound.
"This was Obama's visit last week to MIT. And particularly, I want to thank two outstanding MIT —" Pranav Mistry: So, I was watching his talk live, outside, on just a newspaper. Your newspaper will show you live weather information rather than having it updated — you have to check your computer in order to do that, right? When I'm going back, I can just use my boarding pass to check how much my flight has been delayed, because at that particular time, I don't feel like opening my iPhone and checking out a particular icon.
And I think this technology will not only change the way — yes. It will change the way we interact with people, too, not only the physical world. The fun part is, I'm going to the Boston metro, and playing a pong game inside the train, on the ground, right? And I think imagination is the only limit of what you can think of when this kind of technology merges with real life.
But many of you will argue, actually, that all of our work is not only about physical objects. We actually do lots of accounting and paper editing and all those kinds of things; what about that? And many of you are excited about the next-generation tablet computers to come out in the market. So, rather than waiting for that, I actually made my own, just using a piece of paper. So, what I did here is remove the camera — all the webcam cameras have a microphone inside the camera.
I removed the microphone from that, and then just pinched it — I just made a clip out of the microphone — and clipped that to a piece of paper, any paper that you find around. So now the sound of the touch tells the system exactly when I'm touching the paper, and the camera is actually tracking where my fingers are moving. You can, of course, watch movies.
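The division of labor just described can be sketched in a few lines: the camera says *where* the fingertip is, and a spike in microphone amplitude says *when* the paper is actually touched. The threshold value and the normalized-amplitude sample format are assumptions for illustration.

```python
# Sketch of the clipped-microphone touch sensor. Assumption: amplitudes
# are normalized to [0, 1], and fingertip_positions[i] is the (x, y)
# pixel the camera reports for the same audio frame i.

TOUCH_THRESHOLD = 0.3  # assumed tap-sound threshold

def touch_events(amplitudes, fingertip_positions):
    """Report (frame_index, position) whenever the tap sound first
    crosses the threshold, pairing it with the tracked fingertip."""
    events = []
    touching = False
    for i, (amp, pos) in enumerate(zip(amplitudes, fingertip_positions)):
        if amp >= TOUCH_THRESHOLD and not touching:
            events.append((i, pos))  # touch began at this fingertip position
        touching = amp >= TOUCH_THRESHOLD
    return events

amps = [0.01, 0.02, 0.5, 0.4, 0.02]
tips = [(10, 10), (11, 10), (12, 11), (12, 11), (13, 12)]
print(touch_events(amps, tips))  # -> [(2, (12, 11))]
```

Using sound for the touch event sidesteps the hard vision problem of deciding, from one camera, whether a hovering finger is actually in contact with the page.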
"Good afternoon. My name is Russell, and I am a Wilderness Explorer in Tribe 54." PM: And you can, of course, play games. Here, the camera is actually understanding how you're holding the paper and playing a car-racing game. Many of you must already have thought, "OK, you can browse." Yeah. Of course you can browse to any website, or you can do all sorts of computing on a piece of paper wherever you need it.
So, more interestingly, I'm interested in how we can take that in a more dynamic way. When I come back to my desk, I can just pinch that information back to my desktop, so I can use my full-size computer. (Applause) And why only computers? We can just play with papers. The paper world is interesting to play with. Here, I'm taking a part of a document, and putting over here a second part from a second place, and I'm actually modifying the information that I have over there.
Yeah. And I say, "OK, this looks nice, let me print it out." So I now have a print-out of that thing. So the workflow is more intuitive — the way we used to do it maybe 20 years back — rather than switching between these two worlds. So, as a last thought, I think that integrating information into everyday objects will not only help us get rid of the digital divide, the gap between these two worlds, but will also help us, in some way, to stay human, to be more connected to our physical world.
And it will actually help us not end up being machines sitting in front of other machines. That's all. Thank you. Chris Anderson: So, Pranav, first of all, you're a genius. This is incredible, really. What are you doing with this? Is there a company being planned? Or is this research forever, or what? Pranav Mistry: So, there are lots of companies — sponsor companies of the Media Lab — interested in taking this ahead in one way or another.
Companies like mobile-phone operators want to take this in a different way than the NGOs in India, who are thinking, "Why can we only have a 'Sixth Sense'? We should have a 'Fifth Sense' for people who are missing a sense, who cannot speak. This technology can be used for them to speak out in a different way, with maybe a speaker system."
CA: What are your own plans? Are you staying at MIT, or are you going to do something with this? PM: I'm trying to make this more available to people, so that anyone can develop their own SixthSense device, because the hardware is actually not that hard to manufacture, or to make your own. We will provide all the open source software for them, maybe starting next month. CA: Open source? Wow. Are you going to come back to India with some of this, at some point? PM: Yeah. Yes, yes, of course.
CA: What are your plans? MIT? India? How are you going to split your time going forward? PM: There is a lot of energy here. Lots of learning. All of this work that you have seen is all about my learning in India. And now, if you see, it's more about the cost-effectiveness: this system costs you $300, compared to the $20,000 surface tables, or anything like that.
Or maybe even the $2 mouse gesture system at that time was costing around $5,000. I showed that, at a conference, to President Abdul Kalam, and he said, "OK, we should use this in the Bhabha Atomic Research Centre for some use of that." So I'm excited about how I can bring the technology to the masses rather than just keeping it in the lab environment. CA: Based on the people we've seen at TED, I would say you're truly one of the two or three best inventors in the world right now. It's an honor to have you at TED. Thank you so much. That's fantastic.