Future Of Home+Living 2030: Ambient Dwellings
What will life look like in a decade? This article considers how our future residences will not only automatically adapt to our needs but actually anticipate them, becoming an extension of our bodies and minds.
In this series of articles, PSFK's Piers Fawkes will explore scenarios and signals that suggest the ways in which we will live by the end of this new decade. We’ll look at six key future trends, listen to expert opinions and examine fringe ideas that may be mainstream by 2030. This is the fourth post in the Future of Home+Living series.
But First: Newcastle
At first, a town that once thrived with coal mining and shipbuilding might not seem to exemplify the future of home living. The city of Newcastle upon Tyne in the north of England still has streets of row houses that were constructed as homes for the workers and their children—future workers.
But now, Newcastle is home to one of the world's most progressive research centers exploring the future of home living. Open Lab at Newcastle University is a laboratory where researchers explore shape-changing fabrics, environmental sensors, wellness wearables and the use of IoT to support communities.
One of Open Lab’s lead researchers is Sara Nabil Ahmed, who has released a series of papers and reports on the future of homes. For her research, she has studied interactive and responsive environments. In one project, she researched traditional crafting methods such as stitching, embroidering, dyeing and machine sewing, and how each can embed shape-changing and color-changing actuation into soft fabrics.
While interactive architecture tends to be associated most with walls and the solid edges of buildings, Nabil believes that designers will be able to develop fabrics and soft materials that change their color or shape for an array of in-home objects and decor.
Nabil has coined the phrase “interioraction,” and one of her recent projects was a dining table runner that changed shape and color depending on the setting. In this context, she’s exploring how furnishings might change depending on meal occasion, cuisine and guests.
In my research, I spoke to another conceptual designer, Kiki Goti, who runs a studio called Some People in Brooklyn. She told me that the field of kinetic, adaptive architecture that responds to the environment and to human needs has been widely explored, from the radical architects of the 1960s, like Archigram and Price, to the computational designers of the early 2000s, like Kolarevic and Oosterhuis, but there is still work to be done.
“The challenge today is to think beyond this one-way relationship between users and architectural machines: the user gives a command, the architectural machine executes it by transforming itself,” Goti says. “If your home ‘knows’ you well, then very subtle shape-changing actions can become very meaningful in a specific context. For example, breathing architectural skins that can provide more or less privacy according to users’ feelings or subtle color changes that adapt to users’ moods may communicate with users almost subconsciously.”
Goti’s ideas and Nabil’s research and experimentation provide strong signals about the homes of the future.
So, How Will We Live Tomorrow?
Picture this: Whether we live in communal compounds, parasitic structures or are wealthy enough to reside in what were once thought of as traditional homes, the buildings that house us will interact with the people inside them. They will provide ambient care in four ways: listening, learning, acting and ordering.
Our homes will gather information about inhabitants and visitors through a mix of sensors. They will gather data from traditional methods like sight, sound and device tracking, but they will also pick up on touch, heat, radar and odor. We might provide information and requests to the home through gesture, glances and gasps.
As our dwellings track us, they will learn about us: our habits, the way we shift the furniture and how we rearrange a room for certain visitors. The brains in our walls will connect with other platforms and services to correlate our changing behavior and understand us at an intense, sometimes overwhelming, level.
As we move around our homes, the walls, appliances and furnishings will adapt to help us achieve our chores and tasks—and sometimes, the home will inhibit us from those personal projects which are deemed less appropriate. Walls will move, furniture will shift, rooms will grow or shrink. Lighting will adapt to provide projections and visual stimulus.
The home and its contents will also support and direct us by providing sonic cues, altering temperature zones, sending acute vibrations or even producing an array of scents. Electronics will employ a common language based on sonics and haptics.
As buildings adapt physically to their residents, they will also connect with services beyond the home. Beyond the satisfaction of our simple needs, they will leverage an intimate understanding of our needs to help us get the most out of life. It will revolutionize the way people think of self-care—a kind of tech-care.
The home will also respond when we’re ill—not just ordering an ambulance, but working with hospital systems to ensure the right medical staff and equipment are ready when we arrive.
Around the world, designers, architects and developers are conceiving future living experiences that are similar to these scenarios. In a future forecast from Some People, the team imagines rapid production to build parasitic homes. The designers wonder if a team of fabrication robots would live with each resident, respond to needs and build extensions to their homes. In another concept from Some People, a robot estimates the needs of the community, then builds structures out of lightweight wooden frames with the help of humans.
There are already several more examples and ideas connected to the notion of robots that respond to the resident. iRobot, the maker of the Roomba, has released an update that lets its robotic vacuum cleaner interact with its connected Braava mop. A car concept from designers Bene Distler and Felix Marx comes with its own butler robot and syncs with smart homes. Amazon has reportedly been working on a home robot, codenamed “Vesta,” that would serve as a mobile version of Alexa that follows users around the home. Business Insider reports that prototypes of the device are about waist high and navigate using computer vision.
How might these robots really interact with us? Today, Ori sells robotic furniture, such as cabinets and fold-away beds that shift around a room to adapt to the use the resident wants. In 2020, the company will release a collaboration with IKEA called ROGNAN. The design leader Yves Béhar also spoke recently about his approach to designing ElliQ, a device meant to reflect a healthy relationship with a practical object. “We landed on [a] notion of a beautiful table-top object that reacts when they enter the room but doesn’t intrude or take over the environment,” he told Fast Company.
Because we will share the buildings we call home with other people, they will need to adapt. In Engadget, Google executive Rishi Chandra revealed how IoT hardware—like the tech his company makes today—will help us live together. “We have to understand the context of multiple people in the house and react to it,” Chandra said.
There’s serious experimentation around responsive lighting, too. IKEA has launched a set of connected blinds that rise and fall at sunset and sunrise. Dyson, a company best known for its high-end vacuum cleaners, makes desk and floor lamps called Lightcycle that automatically adjust their light temperature based on ambient light or time of day. The Cactus gym in Colorado has lighting that responds to visitors and also synchronizes with music to boost the experience. Class instructors can also build their own patterns.
Beyond lighting, Netflix has developed a hack that rumbles phones to create a haptic effect as part of a show’s experience. Amazon has developed Alexa Custom Interfaces that can be programmed to trigger a sound effect every time you win a built-in game and even give music lessons when connected to a piano keyboard.
While we’ve been getting familiar with the use of voice commands in the home—through practice and re-practice—we’re seeing other ways to interact with it. Consumers can add a spinning LIDAR to electronics to allow them to be controlled by gestures. Comcast is developing a remote control that allows people to adjust their TV with eye movements.
Slightly outside the home, but still a weak signal to consider in this category, is Honktap, which lets users pay for things like road tolls with the sound of their car horn. The City of Paris is giving out tickets to loud cars identified through a new noise-radar system. Could we one day get tickets for screaming at one another in the home?
Maybe. Amazon also has a unit experimenting with AI to get its smart speaker to detect emotions like happiness, sadness and anger—especially for veterans with PTSD. And an experiment at the University of Washington allows Alexa to recognize when breathing has changed in a way that suggests cardiac arrest.
Health and wellness will be a critical driver for the adoption of the ambient home. A recent Forrester report argues that healthcare organizations need to learn how to navigate the smart-home space and use IoT tech to improve wellness and chronic-condition management as part of an end-to-end connected health experience for consumers. The World Health Organization says that 60% of the factors related to individual health and quality of life correlate with lifestyle choices, such as getting exercise and reducing stress.
Moni Miyashita and Michael Brad, researchers at the innovation consulting firm Innosight, explored this in an article in the Harvard Business Review: “There’s a wide array of non-acute health decisions that consumers make daily. These decisions do not warrant the attention of a skilled clinician but ultimately play a large role in determining patient’s health—and ultimately the cost of healthcare. … Aided by AI-driven models, it is now possible to provide patients with interventions and reminders throughout this day-to-day process based on changes to the patient’s vital signs.”
Today’s developments will inform tomorrow’s tech. One sleep tracker already knows when you’ve had too much to drink, and Apple has introduced decibel monitoring in watchOS to notify users when the surrounding environment is too loud. Meanwhile, Amazon (yet again) has a team called Lab126 that is reportedly working on a wearable device, codenamed Dylan, that reads human emotions and can coach users through social situations.
Connected with the theme of health is the growth of the senior population. Assistive technologies are estimated to become a $26 billion market in the U.S., and this prize is being pursued by startups like B-Temia, Kinova Robotics, Open Bionics, Voiceitt, OrCam Technologies, eSight, and Whill. Design leader Yves Béhar, speaking to Fast Company, puts it best: “The opportunity here is to design robots that really serve people that tend to be not served well by technology–and aging is certainly one of those categories.”
Could our homes eventually know us better than we know ourselves? There are several examples of companies using predictive modeling today to deliver products and services to people just when they need them—even before they realize they do. In my research, Kiki Goti, design lead at Some People Studio, offered the most holistic view of how and why ambient homes will unfold.
“For me, ambient buildings are not just finished buildings that track users and respond to their needs. They are ever-evolving structures capable of adapting to unexpected conditions. These structures are able to evolve by “learning” about their users through constant, long-term, subtle communication processes.
“The communication between users and structures is not a simple sensor-based feedback loop, but a much more complex process that is a natural result of their coexistence. Just like two flatmates know each other and understand their needs, that’s how an intelligent structure and its inhabitants co-exist.
“Future scenarios include buildings that understand complex social and psychological relationships and are able to develop intelligent behaviors. Buildings that understand users’ personality and aesthetics and are able to respond to their feelings and mood, through smart material and tectonic systems.”
Truly adaptive homes may still seem far off, but the tech is already on the way—it’s just a matter of how we apply it.
To continue to follow the series, watch for updates on PSFK or join Piers' personal newsletter.