In June 2017, when Misty Robotics spun out of Colorado-based startup Sphero — the folks behind the eponymous Sphero toy series, a motorized R2-D2, and a miniature replica of Lightning McQueen from Disney’s Cars franchise — it announced intentions to develop a “mainstream” home robot with the help of hobbyists, enthusiasts, and crowdfunding backers. With $11.5 million in capital from Venrock and Foundry Group in the bank, the team wasted no time in getting to work. And the startup unveiled the first fruits of its 11-month labor — a robotic development platform called Misty I — at the 2018 Consumer Electronics Show.

In May, Misty Robotics took the wraps off the second iteration of its robot — Misty II — and made 1,500 units available for preorder, starting at $1,499. (The first units are expected to ship by December 4.) The robot weighs in at six pounds, stands 14 inches tall, and packs electronics like a 4K Sony camera, a 4.3-inch LCD display, two hi-fi speakers, eight time-of-flight sensors, and three far-field microphone arrays. At launch, it’ll work with third-party services such as Amazon’s Alexa, Microsoft’s Cognitive Services, and the Google Assistant, and it will allow owners to create custom programs and routines — including ones that tap into machine learning frameworks like Google’s TensorFlow and Facebook’s Caffe2.
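
For a sense of what those “custom programs and routines” could look like in practice, here’s a rough Python sketch of scripting a robot over a local REST API. The endpoint path, payload fields, and IP address below are illustrative assumptions, not Misty’s documented interface.

```python
# A minimal sketch of scripting a home robot over a local REST API.
# The route, payload fields, and address are assumptions for illustration,
# not Misty's documented API.
import requests

ROBOT_IP = "192.168.1.42"  # assumption: the robot is reachable on the local network

def change_led(red: int, green: int, blue: int) -> dict:
    """Set the robot's chest LED color via a hypothetical /api/led endpoint."""
    resp = requests.post(
        f"http://{ROBOT_IP}/api/led",  # assumed route
        json={"red": red, "green": green, "blue": blue},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    change_led(0, 255, 0)  # turn the LED green
```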

Misty Robotics isn’t exactly rushing to market. It has a 10-year plan, and it’s taking a hands-on approach to development. While a few preprogrammed skills (like autonomous driving and voice recognition) are available on GitHub, the idea is to let developers come up with use cases that the founding team might not have thought of.

I caught up with Misty Robotics CEO Tim Enwall ahead of IFA 2018 in Berlin to talk progress, the state of the home robotics industry, and what the future might hold.

Here’s an edited transcript of our interview.

VentureBeat: So tell me what you’ve been up to lately. 

Tim Enwall: We went out and raised money. Around July 4 of last year, we moved into our own offices and started building out a team. Then in May, we did a crowdfunding campaign, and we’re super excited about how it went — we raised almost a million dollars. It proved to me, without a shadow of a doubt, that we’ve got a business worth tens of millions of dollars.

Now, we’re driving toward shipping our product in December and doing all the hard work required to create the most advanced, affordable, useful, and easily programmable robots out there.

VentureBeat: Right. Maybe you can talk a bit more about Misty II. From where I’m standing, a lot seems to have changed since you announced it. Anki unveiled a home robot — Vector — and [Bosch-funded startup] Mayfield Robotics announced that it’s closing its doors after years of iterating on its robot, Kuri. So what do you think sets Misty apart from what’s out there and what’s come before? What will make it successful?

Enwall: The short answer to that question is the usefulness of the Misty platform.

When I speak to investors, journalists, and others about Misty, I like to ask them a couple of questions to set the stage. The first is, do you believe that in the relatively near future, there will be robots in every home and every office? Let’s not debate the definition of “relative” for the moment. Yes or no?

VentureBeat: Possibly. I’m not convinced — I haven’t encountered a home robot I really thought was indispensable.

Enwall: But that’s not the question. The question isn’t, is there one that exists today? The question is, do you believe that in the relatively near future, robots will exist in our offices and in our homes in a widespread way? I’m talking the next 10 to 20 years.

The next question is, do you believe that in the future, we’ll be buying 30 to 50 single-use robots like those that exist today? I’m referring to robot vacuum cleaners, lawnmowers, telepresence robots, security robots, et cetera … Are we going to buy all of these, or are we going to buy one that does 30, 50, or even 100 things for us?

There is no company on the planet today — or in the future, for that matter — that can deliver a robot that’ll perform all of the hundreds of things that robots will eventually do in the office or the home. To be clear: There’s no company on the planet that has the resources and the talent to deliver a robot that does 50 things that are different for you in your home than the 50 things I want and the 50 things that somebody else wants. The only way to get there is through crowdsourcing.

So that’s the concept behind Misty. Therefore, we have to deliver something that is powerful enough to create usefulness. And the only other product out there on the market that comes anywhere close to being able to provide a level of usefulness is …  [SoftBank’s] Pepper, and that’s a $15,000 product.

VentureBeat: You’ve talked in the past about all the ways the Misty II platform will remain open and already, you’re integrating with the many third-party services out there [like Alexa and the Google Assistant]. Was that a strategic decision? Do you think it’ll help you gain traction?

Enwall: Yeah, definitely. Again, I’m a huge believer that the consumer who wants a robot in their house wants Rosie [from The Jetsons] … and that class of robot does 100 things for the consumer.

Right now, every robot maker has decided to go the single-purpose route to serve the mass market. As a result, you’ve got this expectation gap between robots that can, say, vacuum the floor and Rosie. We believe that the only way to close it is by building a robot platform that can eventually do physically what Rosie can do, with the help of a software foundation that lets thousands of developers create the skills and the accessories.

VentureBeat: I’m guessing your open approach has guided Misty’s hardware design. I mean, it can’t be easy to figure out which components to include and which not to include. I’m sure price point is top of mind for you, and you probably can’t fit everything you want and maintain affordability. Perhaps you could talk a bit about how the Misty II’s design has evolved throughout the campaign and what your backers are telling you.

Enwall: From a hardware perspective, it’s not going to change much from now to December — I mean, we’re already piloting production runs in China. The decision-making process was guided by this question: How can we pack in the most usefulness and stay within budget?

We wanted a robot that could do a lot of AI locally, and AI at all skill levels. Whether it’s a college student or a professional experimenting with what an independent robot can do for them, we wanted the robot to be very capable from an API perspective. We also wanted it to be autonomously mobile.

That’s why we chose a pretty expensive but super capable 3D depth sensor, so that it can not only navigate its way around but also produce a manipulable 3D map of the world for object detection and other features. We wanted Misty II to have high-fidelity speakers so that when it does leverage natural language processing from Amazon or Google, it sounds great. We wanted the robot to have great microphones so that it’s capable of working with voice.

Those are the basics we thought Misty II needed for it to be useful. It had to be able to navigate by itself, and it had to have great eyes, great ears, and a great brain capable of doing quite a bit of AI.
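
Enwall didn’t walk through the mapping pipeline, but the standard technique behind turning a depth sensor into a “manipulable 3D map” is well established: back-project each depth pixel into 3D space using the camera’s intrinsics, then bin the points into an occupancy grid. Here’s a minimal sketch, using made-up camera parameters rather than Misty II’s actual sensor specs.

```python
# Minimal sketch: project a depth image into 3D points and bin them into a
# coarse occupancy grid. Intrinsics and grid resolution are illustrative
# assumptions, not Misty II's actual sensor parameters.
import numpy as np

FX, FY = 525.0, 525.0  # assumed focal lengths (pixels)
CX, CY = 319.5, 239.5  # assumed principal point
CELL = 0.05            # grid resolution: 5 cm cells

def depth_to_points(depth_m: np.ndarray) -> np.ndarray:
    """Back-project an HxW depth image (meters) into an Nx3 point cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

def occupancy_cells(points: np.ndarray) -> set:
    """Quantize points into discrete grid cells (a crude 3D map)."""
    cells = np.floor(points / CELL).astype(int)
    return set(map(tuple, cells))

if __name__ == "__main__":
    fake_depth = np.full((480, 640), 2.0)  # a flat wall 2 meters away
    cells = occupancy_cells(depth_to_points(fake_depth))
    print(f"{len(cells)} occupied cells")
```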

VentureBeat: So AI is a core part of the platform, I take it. That makes sense — artificial intelligence is what most people think of when they think of a robot. So was that a natural area of emphasis for you? Did you choose the Misty II’s internals with that in mind, and are you going to supply developer tools that make it easy to get things like computer vision applications and natural language processing up and running?

Enwall: Yeah, absolutely. We put two powerful cell phone processors in there so that you can do some AI-related things, and we also opened up the pipeline for developers and gave them frameworks by which to access it.

We believe that with a mobile real-time platform, you’ve got all kinds of auditory and visual data that can feed learning systems. Most learning systems are off in the cloud somewhere, and they’re fed data mainly in a non-real-time manner. When you’re at the edge, you can feed a bunch of these learning systems and have them work collaboratively with the cloud to do additional processing.
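
The edge-first pattern Enwall describes — infer locally, ship data to the cloud opportunistically — might look something like this in skeletal form. The model stub, queue, and commented-out upload endpoint are all assumptions for illustration, not Misty’s actual pipeline.

```python
# Sketch of an edge-first inference loop: classify frames locally in
# milliseconds, while a background thread ships data to the cloud for
# heavier training. Everything here is a placeholder for illustration.
import queue
import threading

upload_queue: "queue.Queue[bytes]" = queue.Queue()

def classify_locally(frame: bytes) -> str:
    """Stub for an on-device model (e.g., a TensorFlow Lite classifier)."""
    return "person" if frame else "unknown"

def cloud_uploader() -> None:
    """Background thread: drain queued frames to a hypothetical training endpoint."""
    while True:
        frame = upload_queue.get()
        try:
            pass  # e.g., requests.post("https://example.com/train", data=frame)
        finally:
            upload_queue.task_done()

def on_new_frame(frame: bytes) -> str:
    label = classify_locally(frame)  # local decision, no network round trip
    upload_queue.put(frame)          # cloud learning happens asynchronously
    return label

if __name__ == "__main__":
    threading.Thread(target=cloud_uploader, daemon=True).start()
    print(on_new_frame(b"\x00" * 10))
```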

VentureBeat: I’m glad you brought that up. There seems to be an ongoing debate about on-device processing versus cloud processing. It makes a difference whether you’re talking about inference or model training, of course. But is it something you considered in the development of the Misty platform? Did you design it to ensure that it’s not incomplete, so to speak, if there’s not a reliable internet connection available?

Enwall: Oh yeah, we’ve done a lot to design it so that it’s independent and autonomous of the internet. Most of our work, actually, concentrates on making this robot robust and effective and useful without being connected to the internet. Stuff like face detection, face recognition, autonomous mapping, and navigation — none of that requires any internet connectivity.

We think that’s a pretty big distinction. When you’re moving around in the real world and you’re mobile, you have to respond in milliseconds. Regardless of what anybody would like to imagine about cloud latency, a round trip to the cloud is always slower than a decision made locally.
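
The interview didn’t get into Misty’s on-board vision stack, but as an illustration of how face detection can run entirely offline, here’s a minimal OpenCV sketch; nothing in it touches the network. It’s a generic example, not Misty’s actual pipeline.

```python
# Minimal sketch of fully local face detection with OpenCV. The Haar cascade
# ships with the library and loads from disk, so no connectivity is needed.
# This illustrates the offline capability described above, not Misty's stack.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detect_faces(frame):
    """Return bounding boxes (x, y, w, h) for faces in a BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # local camera, millisecond-scale processing
    ok, frame = cap.read()
    if ok:
        print(f"{len(detect_faces(frame))} face(s) detected")
    cap.release()
```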

I think there’s a privacy component there, too. People will be even more aware of it with robots moving around offices and homes. It’s certainly a vector that we believe in and spend time thinking about.

VentureBeat: With the remaining time we have, I’d like to ask about the future. When the Misty II launches and it starts to arrive in people’s homes and places of work, where do you go from there? Is there going to be some kind of marketplace where people can download creations from other users? Are there going to be frequent updates that enhance its capabilities?

Enwall: Absolutely. That’s already part of the plan — there’s a mechanism for developers to share their skills and accessories with each other.

You can expect to see some skills emerging that might make your robot more valuable and useful to you, even in the earliest days, and it will continue to get better and smarter.

On our website, we’ve made our product roadmap publicly available. We only want to work on the things that our customer base finds valuable. We hope to issue a new feature, a new capability, every month so that the robot gets more powerful.
