We interviewed GM and CMO Jessica Gilmartin about Lighthouse, the new home security system that thinks of the household (pets included) first

If the words ‘smart home’ still conjure up an image of clunky robotics and remote controls, we don’t blame you. But as designs get sleeker and voice-enabled AI gets exponentially more capable, home tech seems poised to make its big entrance subtle and seamless—and to prompt us, increasingly, to turn away from buttons and screens. That’s the feeling behind Lighthouse, the Palo Alto-based startup set to launch a security system uniting a camera with an AI assistant. The package combines the type of voice commands we’re growing accustomed to through the likes of Alexa and Siri with the computer vision technology found in autonomous vehicles.

Like recent models of Amazon Echo and Google Home, the cylindrical device won’t mar your living room aesthetic. Sure, it’s a security system, but Lighthouse isn’t trying to make you paranoid about strangers breaking into your house. Instead the company has taken a family-oriented approach to the AI camera, enabling it to send alerts via an app for all of the happenings, and non-happenings, you care about. Lighthouse can be asked to ping you when your kids come home from school every weekday or when the dog walker arrives during a weekend away. You might set up a Lighthouse in the home of an elderly parent and create an alert only for an unusual occurrence, like when it doesn’t detect movement on the stairs by 9 a.m. A built-in speaker and microphone enable two-way talk, allowing for a quick conversation between a working parent and a smartphone-less child.

Jessica Gilmartin, who wears the dual hats of General Manager and Chief Marketing Officer, emphasizes Lighthouse’s approachable UX. In a demo using the Lighthouse devices in Gilmartin’s own house—one on a living room table and another mounted near the front door—she asked the app a series of questions. “What did the dog do after I left?” calls up footage of her dog roaming the house and occasionally barking. With “Did you see the kids running this morning?” Lighthouse presents a clip of her young son dashing down the hall. The AI understands both “kids” (as opposed to adults or animals) and the action of running, and scans and selects from hours of footage accordingly. It is also equipped with facial recognition, so Gilmartin can ask about her husband, children and other regular visitors to the house by name. In the videos, color-coded halos appear around pets, adults and kids to indicate that Lighthouse, using its 3D sensor, understands the object moving through space and time.

PSFK chatted with Gilmartin about Lighthouse's origins and where it fits into the broader notion of the connected home. In trying to distill the company’s hopes for the product, which starts shipping to preorder customers next month, she cited a professional baseball player who is already using the device and service. “He’s got a ton of homes, all the money in the world,” Gilmartin said. “When I asked him, ‘What made you interested in using this?’ [He said,] ‘I’m a dad, just like any other dad, and I travel a lot. I want my kids to know that Dad’s still there. I want to see what they’re doing and I want to be part of their lives.’ I thought, that’s exactly why we’re doing this.”

Could you describe how the team went about developing and choosing key features for Lighthouse AI? What can the AI do for Lighthouse customers?

Our founders helped build the first self-driving car. The big insight from self-driving cars, what makes them work, is a combination of deep learning and 3D sensing. Deep learning is a huge topic right now, obviously—everyone is talking about AI, deep learning and machine learning—but very few people understand how important it is to combine deep learning with 3D sensing.

There are lots of different 3D sensors: self-driving cars use LIDAR, which is big and expensive and works a bit differently. Our 3D sensor is a time-of-flight sensor. It emits diffuse light into the environment, and it measures the amount of time for the light to hit an object and reflect back. Basically what this camera is doing is, in real time, creating a 3D structure of the world around it. That feeds the deep learning model infinitely more data points to use to make more accurate predictions. It’s very hard for a traditional camera to be able to look at a 2D scene and to accurately say, “I’m seeing an object; what is that object?” and predict what it will be. If you have a 3D sensor, and you have so many more data points to feed the model, it makes it much more accurate and much easier to be able to say, “That’s a dog; that’s an adult; that’s a child; that’s something I don’t care about.” That’s really the fundamental breakthrough with our technology: taking 3D sensing and combining it with very advanced deep learning.
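The time-of-flight principle Gilmartin describes reduces to a simple relationship: the sensor measures the round-trip travel time of emitted light, and distance is half that time multiplied by the speed of light. A minimal sketch of that calculation (illustrative only, not Lighthouse's actual firmware):

```python
# Time-of-flight ranging: distance = (speed of light x round-trip time) / 2.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a reflecting object, given the measured round-trip time."""
    return C * round_trip_seconds / 2.0

# A light pulse returning after ~20 nanoseconds implies an object roughly 3 m away.
d = tof_distance(20e-9)
```

Repeating this measurement per pixel is what lets the sensor build, in real time, the 3D structure of the room that feeds the deep learning model.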

There are many different AI techniques we use at Lighthouse. Obviously we use computer vision, which is the hardest technique, basically the eyes of AI. We also use Natural Language Processing. We built our own NLP that allows you to ask questions of the camera in a way that doesn’t really exist in any other type of device. We took inspiration from Alexa and from Siri; it’s very clear that people are really interested in using voice because it is so much more efficient, and it’s so much more delightful an experience, and we realized that we can do that with Lighthouse. We started with the technology, and our co-founder and CEO Alex [Teichman]—this really came from his own interest in creating a home security system for himself, and he rigged one up using the 3D sensor and the techniques he learned from self-driving cars. He and Hendrik [Dahlkamp], our co-founder, realized they actually had something pretty amazing, and the thing that they jerry-rigged was already a better camera than anything on the market. So it started from there.

It really started as a security camera. We put it into people’s homes—we had about 100 beta users that were using it for a year—and then we asked them how they used it. Fortunately very few people get [their homes] broken into, so yes, it’s great as a security camera, it’s nice to have that safety and security, but people were actually using it to know what was happening in their home. They’re away at work, they’re traveling, and people care deeply about making sure that they’re connected to the people and the pets that they love at home. That’s really the evolution of all of the features, which were based upon seeing how actual people use it in their home.

A lot of Lighthouse’s abilities feel very personal—for example, how your kids can wave at the camera and Lighthouse will send their ‘hello’ message to your phone.

The wave actually came from me! My 5-year-old loves to wave at the camera just to say hi to me because he knows I’m on the other end. But I don’t always see it, because I’m working, so I mentioned to Alex, it would be amazing—because we have this 3D sensor and we have this ability to understand actions—it would be amazing if we could detect waving, and then I would get a notification and we could just start talking. He said he actually noticed that other people were doing the same thing. People were arming the camera—putting it in an armed state—waving, and then it would send a security alert. We found out it was for the exact same reason: a wife saying goodbye to her husband when she was leaving… It’s just those personal things. We have texts, but people like to show and see things; we’re very visual. So we thought, we should just build it—and that’s what we built. All of it really comes from the inherent ways that people have been using [Lighthouse] for the past year.

I love the UX and UI, everything you’ve done with this product to make it so friendly. Could you tell me a little more about your process?

We have a very collaborative process internally. I acted as the product owner along with our CEO because I’m the one who talks most directly with customers. We figured out our road map first, and a lot of it is based on feedback from customers. We have a head of our beta program—he works with me—and pretty much every week we’re putting out new surveys about things that they’re interested in prioritizing. So we get a lot of features—we might have 100 or 200 features—and then we work internally to prioritize them based on technical feasibility as well as value to the biggest number of customers.

Once we have the list of priorities, we break them down into different sprints, we prioritize the sprints and we go to our UX/UI designer and we ask him to create mock-ups. Once he’s created the mock-up we all meet as an engineering and product team. We critique it, ask questions and he goes back—so it’s probably a one-week iterative process behind the design. Then once we’ve locked the design down, we enter into the sprint building process for two weeks. Then we run it through QA testing, we release it to our beta users and we leave about a week or so for feedback, iteration and bug fixing.

Lighthouse is now available for preorder, with devices shipping to early adopters in September. What does it take behind the scenes to prepare for that transition that finally brings the product to its end user?

The number one most important thing for everyone on the team that we talk about every day is making the customers happy. Every single thing that we’re doing is making sure that we have our customer support ready, that they’re trained and really understanding that process. It’s getting all of our logistics and operations ready to be able to handle when we actually have paying customers. We have a hardware team working incredibly, incredibly hard—they’re all in China right now. They’ve been there for the better part of a month ramping up production. Our software team is putting the final touches on quite a few features that we still need to build out before we’re live. On the marketing side, we’re evaluating our channels: which channels we’re going to be selling in, as well as continuing to understand from our early customers, our early pre-purchasers, what are they looking for? Getting them really excited, making sure that we understand them, why they purchased it and making sure they have a really delightful experience when we launch.

Thinking about smart homes, how do you envision your system integrating with other smart systems in the home? Do you have any plans to connect to other devices or appliances to build a more holistic home ecosystem?

Absolutely. We’ve been very fortunate: we’ve had a lot of the major platforms reach out to us to talk about integration because I think they’re really excited about our technology as well. So we’re having a lot of really interesting conversations right now, and part of what we’ve been doing—because we are a very customer-centric organization—is reaching out to our beta users as well as our early customers and asking them, who would they want us to integrate with? What are the use cases that would be most interesting to them? We’re pretty knee-deep in that process right now. We won’t have them by shipping just because we have such a massive amount to do before we ship, but I would say very shortly thereafter we will have our first set of integrations.

Generally what people are looking for—the smart home is super interesting to people, and generally, the smart home has not been very smart. It’s really been remote controlled. So people have to do all the work. It’s neat that you can turn your lights off when you leave or change them into different colors, have your security system turn on when you leave, but there’s no native intelligence in that. What we want to do is bring intelligence there. Instead of having the thermostat change just because it knows GPS, it sees you and it knows that you like your temperature at 68 and you’re in a specific room, and it turns the lights on in that room and turns the temperature to a specific setting because it sees you. Those are things that we’ve heard. Or, for example, garage doors: people leave their garage doors open all the time. So having a Lighthouse in the garage, if it notices that it hasn’t seen any activity in 15 or 20 minutes, it closes the garage door automatically. You don’t have to think about it.
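The garage-door scenario Gilmartin sketches is essentially an inactivity timer driven by the camera's motion detection. A hypothetical sketch of that rule (the class and method names here are invented for illustration, not part of any Lighthouse API):

```python
import time

INACTIVITY_LIMIT = 15 * 60  # seconds of no motion before closing the door


class GarageDoorRule:
    """Close the garage door after a period with no detected activity."""

    def __init__(self):
        self.last_motion = time.monotonic()
        self.door_open = True

    def on_motion(self):
        # The camera's motion detection would call this on each event.
        self.last_motion = time.monotonic()

    def tick(self):
        """Poll periodically; returns True if this call closed the door."""
        idle = time.monotonic() - self.last_motion
        if self.door_open and idle > INACTIVITY_LIMIT:
            self.door_open = False  # stand-in for a real close-door command
            return True
        return False
```

The point of the example is the inversion Gilmartin describes: the user never issues a command; the system acts on what it observes.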

We’ve also heard really interesting things around using voice, integrating with Alexa potentially, and you could say, for example, “Alexa, when was the last time you saw the dog?” And it could just tell you, “I saw the dog at 2 pm.” That would be an amazing use case. Or, “Alexa, I’m going upstairs to bed, arm the downstairs cameras.” And it would arm it automatically. Those are some use cases we’ve heard that we think could be really valuable for our customers.

Does the company have plans to broaden its scope to commercial spaces in the future?

That’s been the toughest part of my job: evaluating all of the possibilities for our product. We’ve been inundated with people from every possible sector—it’s been crazy. So I spend a lot of my time evaluating for elderly care, for small-medium business, for warehouses and factories, for healthcare—people see this and it sort of widens the possibilities that they never knew existed. It’s actually fairly frustrating for me because there are so many things we could do and it’s really about focusing. Right now we want to focus on having an absolutely amazing consumer launch, making sure we delight our customers, and we’re very laser-focused. Right after that, it is part of my job to think about what are the other opportunities for us. And there are so many.

