In computer science, the term “adaptive system” refers to an interactive system that adapts its behavior to individual users based on information acquired about the user, the context of use and the environment. Although adaptive systems have long been discussed in academia and have long been an aspiration for computer scientists and researchers, there has never been a better time than today to realize their potential and shape what future interaction with computer systems will be like.
The abilities of today’s network information technologies to create rich, immersive, personalized experiences, to track interactions, and to aggregate and analyze them in real time, together with the data collected by the sensors we carry in our smart devices, provide an unprecedented opportunity to design adaptivity and ultimately offer a better user experience that is both unobtrusive and transparent.
This article covers the fundamental concepts for using smart device technologies and sensor data to understand context and to introduce “adaptive thinking” into the UX professional’s toolset. I will demonstrate the importance of context when designing adaptive experiences, offer ideas on how to design adaptive systems, and perhaps inspire designers to consider how smart devices and context-aware applications can enhance the user experience with adaptivity.
Examples Of Adaptive Systems

An early example of an adaptive feature can be found in GPS navigational devices. Using one of these devices, a user is able to easily locate and navigate to any location they can drive to. When the sun sets, or while driving through a tunnel, the system automatically changes the interface color to a dark “night mode” so as not to blind the driver with bright light from the device. The system knows the user’s exact location and the position of the sun, and by understanding these two factors it maintains a safe driving environment by adapting to the user’s needs.

The day and night interfaces in the GARMIN Zumo 660 adapt the interface color so the user isn’t blinded with a bright light.
Adaptive design is about listening to the environment and learning user patterns. Combining smart device sensor data, network connectivity and analysis of user behavior is the secret sauce behind creating an adaptive experience. By combining these capabilities, we not only understand the context of use, we can also anticipate what the user needs at a particular moment.
Google Now is an interesting example of an adaptive application that gives users answers to questions they’ve thought of rather than typed. Through a series of smart cards that appear throughout the day on the user’s mobile phone, Google Now tells you the day’s weather before you start your day, how much traffic to expect before you leave for work, when the next train will arrive as you’re standing on the platform, or your favorite team’s score while they’re playing. It does this by recording and analyzing your preferences while you’re using your phone. For example, updates on your favorite sports team are based on your Web browsing and search history. And by analyzing your current location, previous locations and Web history, Google Now presents a card with traffic conditions en route to your next likely destination.
As UX professionals, we understand that some mobile users do not like to use the virtual keyboard and we try to avoid that necessity as much as possible. By utilizing the user’s personal behavior as a sensor together with smart device capabilities and enabling voice commands (similar to iOS’s Siri), Google Now creates an adaptive experience that helps users avoid using the virtual keyboard, thus further adapting to the mobile user’s needs and helping users quickly get the information they require on the go.
Adaptive systems are not only limited to mobile devices. Ubiquitous computing (ubicomp) is the idea of being surrounded by smart devices and networked digital objects that are carefully tuned to offer us unobtrusive assistance as we navigate through our work and personal lives. Similarly, ambient intelligence (AmI) refers to digital environments that are sensitive and responsive to the presence of people.
Nest uses sensors to adapt the temperature to activity in the home.
Nest, The Learning Thermostat, is a great example of an adaptive system integrated into the home environment. Using a variety of sensors for temperature, humidity, touch, near-field activity, far-field activity and even ambient light, it can detect whether there are people home and how active the home is at any time. By adjusting the temperature to adapt to this information, it can automatically cut up to 20% off a home’s heating and cooling bills.
When no one is around, Nest learns to turn the heat down. When you come home from work, it knows that the heat should go back up. After the first few weeks, it learns when you come home from work and can turn the heat up before you arrive so that you come home to a warm house.
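Nest’s schedule learning can be sketched in a few lines. This is a toy model, not Nest’s actual algorithm: it simply averages the arrival times the occupancy sensors have observed for each weekday and starts heating a fixed lead time earlier. The class and parameter names are invented for illustration.

```python
from datetime import time
from statistics import mean

class ScheduleLearner:
    """Learn typical weekday arrival times from occupancy events (toy model)."""

    def __init__(self, preheat_minutes=30):
        self.preheat_minutes = preheat_minutes
        self.arrivals = {}  # weekday (0-6) -> list of arrival minutes past midnight

    def record_arrival(self, weekday, hour, minute):
        """Called when occupancy sensors first detect someone home."""
        self.arrivals.setdefault(weekday, []).append(hour * 60 + minute)

    def preheat_time(self, weekday):
        """Return the time to start heating, or None if no pattern yet."""
        samples = self.arrivals.get(weekday, [])
        if len(samples) < 3:              # not enough data to trust a pattern
            return None
        expected = mean(samples) - self.preheat_minutes
        return time(int(expected) // 60, int(expected) % 60)

learner = ScheduleLearner()
for day in range(3):                      # three Mondays of arriving around 18:00
    learner.record_arrival(0, 18, day * 5)
print(learner.preheat_time(0))            # 17:35:00
```

After a few consistent observations, the thermostat has enough confidence to act before the user arrives; with too little data it stays passive, which mirrors the “first few weeks” learning period described above.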
In 1991, Mark Weiser, widely considered to be the father of ubiquitous computing, wrote:

“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”

Nest is a great example of ubicomp and of how technology can disappear into our surroundings until only the user interface remains perceivable to users.
These devices create contexts of sensor and user data that provide a superior user experience by anticipating what the user might need before the need is expressed. This is the future of UX design.
Adaptive Thinking

In contrast to traditional desktop systems, mobile devices are normally used in many different situations. However, mobile applications today rarely take advantage of information about the context of their use, and hence are usable only for very specific purposes. For example, an application with city maps for local businesses can be used in different contexts: walking through town or at home, with or without network connectivity.

Today’s users can customize their device’s system through preferences and settings, and by choosing the applications that work best for their needs. Yet even user-centered design processes, which assure a certain degree of user acceptance and yield a richer understanding of the context, cannot anticipate the requirements of all users and map those requirements to a single best or optimal system configuration.
“Adaptive thinking” is a mindset that provides the tools necessary to significantly improve the user experience and enhance the intended purpose of the product by utilizing the technology that is readily available in every pocket. It is about learning the environment and the user and adapting to their current needs and situation. Therefore, designers should first design for the context of use and then design the set of functions that are triggered in relevant situations.
Here is an instructive case where adaptive thinking was used to create a mobile application for a bike sharing program. Bicycle sharing systems, also known as bike rental, are becoming more and more common in major cities around the world. Bicycle sharing helps reduce traffic congestion and air pollution, and encourages local residents to maintain a healthy lifestyle.
A user who wants to rent a bike can use a mobile application to look for the nearest bike rental station that has bikes available to rent. If the user is unfamiliar with the city, they can use the application to get directions to the rental station; this is the core functionality of the application.
An adaptive system will realize when the user has arrived at the bike rental station and automatically offer additional options, i.e., adapt to the current situation. For example, it may offer the user a quick way to rent a bike, a feature that was not available in the application before arriving at the rental station. During the rental period, the system will anticipate the user’s needs and offer nearby bike rental stations with available parking spots where the bike can be returned, and show the user the current balance for the rental time.
A bicycle sharing application can adapt to show the user different options depending on location, and whether the user is currently renting a bike.
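The context-dependent behavior described above boils down to a small decision function. Here is a minimal sketch; the action names and the 50-meter threshold are assumptions for illustration, not a real app’s API.

```python
def available_actions(distance_to_station_m, is_renting):
    """Return the actions a bike-sharing app might surface,
    given the rider's current context (illustrative sketch)."""
    if is_renting:
        # mid-ride: help the user finish the rental
        return ["show_stations_with_free_docks", "show_current_balance"]
    if distance_to_station_m <= 50:
        # standing at a station: unlock the quick-rent flow
        return ["quick_rent", "station_availability"]
    # default context: find and navigate to a station
    return ["find_nearest_station", "directions_to_station"]

print(available_actions(2000, False))  # ['find_nearest_station', 'directions_to_station']
print(available_actions(10, False))    # ['quick_rent', 'station_availability']
print(available_actions(10, True))     # ['show_stations_with_free_docks', 'show_current_balance']
```

The point of the sketch is the design discipline: the feature set is derived from the situation first, and the UI simply renders whatever the current context makes relevant.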
By using the device’s assisted GPS capabilities and network connectivity, and by understanding the user’s story at any given point in the product lifecycle, adaptive design provides users of the mobile application with a reliable extension of the bike rental program.

Adaptive and Responsive Design

An adaptive system is one that adapts automatically to its users according to changing conditions. Responsive design (or adaptive layout) is a subset of adaptive design: an approach to Web design in which a website is crafted to provide an optimal viewing experience across a wide range of devices. In my UX Magazine article The Multiscreen Ecosystem, I discuss how responsive websites can also be adaptive by understanding the context of using a mobile device and by designing contextual paths.

Context For Adaptivity

I quote below from the 2007 book The Adaptive Web, which discusses the importance of context for adaptive mobile guides. It explains adaptivity in the scope of mobile systems as context-aware computing, i.e., the ability to use information in the current context to adapt the user interaction and the presentation of information to the current situation of the user.

“Understanding the context is an important prerequisite for the adaption process. Context is not just the location, but encompasses also information like the ambient noise or lighting level, the network connectivity or bandwidth, and even the social circumstances of the user. Furthermore, systems have to anticipate the user’s goals and intentions, which might be inferred from their actions or from physiological sensors and appropriate environmental sensors (e.g. light, pressure and noise sensors).

One prerequisite for adaptive systems is the proper assessment of the user’s situation. For this purpose, systems need to rely on a representation of relevant situations. Depending on the supported task, situations can be characterized by many different attributes. Therefore, designers of suitable adaptation for mobile devices need to look at a variety of spatial, temporal, physical and activity related attributes to provide effective assistance.

For example, a mobile application that assists users in a shop needs to know about the current spatial environment of the users (e.g. which products are nearby), the temporal constraints of the user (e.g. how much time is available for shopping), the general interests of the users and their preferences (e.g. if the user prefers red or white wine with tuna), details about the shopping task itself (e.g. which items are on the shopping list and for which purpose the products are needed) and maybe even about the physiological and the emotional state of users (e.g. whether users are enjoying the shopping or not).”

That said, understanding the locational context and the user story is now easier than ever before. We can take advantage of the fact that we carry our phones wherever we go. A smartphone is packed with technology and with information about the user that designers can use to understand context. The highly sophisticated technology in a user’s pocket not only allows designers to analyze whether the user is walking, standing, or in a loud or quiet environment, but can also help pinpoint a person’s location within a department store, down to a specific aisle.
An application can analyze the user’s precise location within a store to provide information that is adapted to the current context.
AislePhone, an Israeli start-up currently in the beta stage, is developing a platform for precise in-store positioning that can determine the exact position of a person down to the specific aisle. With this technology, shopping with your mobile phone in hand will be a common experience, as mobile apps for supermarkets and other large retail stores will use locational and user data to enhance the shopping experience, much like a personal shopping assistant in your pocket.
Google Indoor Maps allows users to view and navigate floor plans of several kinds of commercial locations such as airports, department stores or malls, all within Google Maps.
This technology not only knows your indoor location, but also what floor you’re on in a building. Depending on what data is available, the map shows notable places in the building you’re currently in, such as stores, restrooms or the nearest food court.
With this type of technology, “you are here” directory maps will no longer be needed in malls or department stores. You will be able to determine your location and orient yourself using a smartphone, and this experience will adapt to your specific needs. For example, apps will offer you relevant discounts as you walk through the mall or highlight shops based on your gender and age.
Designing An Adaptive System

Adaptive design integrates both subtle and obvious features. Often, adaptive qualities can be very subtle and unobtrusive: sometimes a seemingly small adaptive feature can greatly improve the overall experience. For example, have you ever noticed that Google Search can read your mind? When you start typing, Google Instant, using autocomplete, knows what you’re thinking even when you’ve entered only three letters of a search term. It can do this because Google Search considers and records all search queries within a session in order to better understand the user’s intent.

When a user searches for “The Beatles,” Google understands this as part of a research session and will help you quickly discover Ringo Starr or Paul McCartney as you enter the first three letters of their names; it understands the context of your search and compares it with other similar, popular, relevant results.
Google Instant understands the context of your search.
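Session-aware ranking of this kind can be sketched with a toy scoring function. This is not Google’s algorithm, only an illustration of the principle: candidates that share topics with the current session get boosted.

```python
def rank_suggestions(prefix, candidates, session_topics):
    """Rank autocomplete candidates for `prefix`, promoting any whose
    topics overlap the current session's topics (toy model)."""
    matches = [(text, topics) for text, topics in candidates
               if text.startswith(prefix)]
    # more shared topics with the session -> earlier in the list
    matches.sort(key=lambda c: len(c[1] & session_topics), reverse=True)
    return [text for text, _ in matches]

candidates = [
    ("ring sizes", {"jewelry"}),
    ("ringo starr", {"music", "beatles"}),
    ("ringtone maker", {"phones"}),
]
session = {"beatles", "music"}          # the user just searched "The Beatles"
print(rank_suggestions("rin", candidates, session))
# 'ringo starr' is promoted to the top of the list
```

Without the session context, all three completions are equally plausible for “rin”; with it, the adaptive ranking surfaces the one the user almost certainly means.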
Another example of a subtle feature that enhances the user experience is a testing system for students that adjusts the difficulty of test questions according to whether prior questions were answered correctly. Or a music discovery application that looks at your current playlist and adapts to your taste, helping you discover additional music you may like.
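The adaptive-testing idea maps onto a classic one-up/one-down staircase procedure. Here is a minimal sketch; the 1–10 difficulty range is an arbitrary assumption.

```python
def next_difficulty(current, was_correct, lo=1, hi=10):
    """One-up/one-down staircase: raise difficulty after a correct
    answer, lower it after a wrong one, clamped to [lo, hi]."""
    step = 1 if was_correct else -1
    return max(lo, min(hi, current + step))

level = 5
for correct in [True, True, False, True]:
    level = next_difficulty(level, correct)
print(level)  # 5 -> 6 -> 7 -> 6 -> 7
```

The system converges on questions at the edge of the student’s ability, which is exactly the adaptation a fixed test cannot provide.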
Although the experience should always be unobtrusive, adaptive interfaces need to be obvious so users understand the context for the adaptation and always feel in control. For a better experience, applications should also allow users to manage adaptive features. For example, if at nighttime the interface changes to a darker night mode (like in navigational devices), the user should always be able to change it back manually. Or, if entering a shopping mall triggers a different experience, the user must understand the context for this adaptivity and want to embrace the added functionality.
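Keeping the user in control can be as simple as letting a manual choice override the automatic rule until the environment changes again. A sketch of that policy, with illustrative names:

```python
class ThemeController:
    """Auto-switch between day and night themes, but let a manual
    choice override the automatic behavior until conditions change."""

    def __init__(self):
        self.theme = "day"
        self.override = None   # the user's explicit choice, if any

    def on_ambient_light(self, is_dark):
        auto = "night" if is_dark else "day"
        if self.override == auto:
            self.override = None   # environment caught up with the user
        self.theme = self.override or auto

    def user_selects(self, theme):
        self.override = theme
        self.theme = theme

tc = ThemeController()
tc.on_ambient_light(is_dark=True)    # auto night mode
tc.user_selects("day")               # user insists on the day theme
tc.on_ambient_light(is_dark=True)    # still dark: respect the override
print(tc.theme)                      # day
```

The design choice worth noting is when the override expires: here it is cleared once the automatic rule agrees with the user again, so adaptivity quietly resumes without the user ever having to “undo” their choice.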
Charles Darwin wrote:

“It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is the most adaptable to change.”

As human beings, we adapt to our surroundings naturally; it is the key to our survival. As designers, we can use this inherent ability, together with our physical senses and the powers of the brain, to analyze and design what we would do in adaptable situations. For example, to communicate in a loud environment, we adapt by raising our voice to be heard. Similarly, an adaptive system will raise a device’s volume.
In an even louder environment, we use hand gestures to get attention and focus our eyes on the other person’s mouth to try to read their lips. However, unlike computers that can process multiple layers of data, human beings have limited sensory resources and limited cognitive capacity.

In today’s world, a person carries in one pocket more advanced technology than was ever before possible. An intelligent device like a smartphone is embedded with highly sophisticated sensors. These sensors, together with advanced computing power and network connectivity, can help us analyze and understand the context of use. The smart device’s ability to analyze the context of use in real time, together with an understanding of the user’s story, creates opportunities to provide an even greater user experience by adapting to the user’s needs.
I will illustrate some of the key points in using these technologies.
Analyzing User Behavior

Similar to the Google Now example, analyzing user behavior and the user’s interaction with the digital world can yield a great understanding of the user’s context. Analyzing the user’s search patterns or the applications they download can tell us about their preferences and hobbies. Tracking current location and location history can reveal the user’s surroundings and the physical boundaries of their life, so we can understand which subway station they take to work or where they like to eat lunch. Note that when this is done without the knowledge of users, it may be considered a breach of browser security and is illegal in many countries.

Here is a practical example of how analyzing the user’s behavior could help in creating an adaptive system. In the now famous Google Glasses video, we follow the user throughout his morning as he eats his breakfast and then leaves his house heading for the subway. Upon arriving at the subway, he receives a message that subway service is suspended and is offered a walking route. As useful as this may be, a truly adaptive system would analyze the user’s behavior as he gets up and warn him ahead of time that subway service is suspended.
Google Glasses uses information about the user’s location to provide relevant information.
Understanding the user’s behavior (whether he takes the subway or walks to work) and connecting it with information available online allows us to understand and adapt to his needs. Most of the time, one data source is not enough; combining the technologies (network connectivity, user behavior and sensor data) is the only way to understand context. For example, we can gauge the outside weather by combining the user’s current location with online weather information, and then use this data to offer phone numbers for nearby cab companies in addition to a walking route, assuming the user may not wish to walk to work in the rain.
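Combining these data sources amounts to a small rule layer over a weather lookup. In the sketch below, the dictionary stands in for a real weather-service response, and the action names and thresholds are invented for illustration.

```python
def commute_suggestions(walking_minutes, weather):
    """Combine a walking-route estimate with current weather
    (stubbed here; any weather API would do) to adapt suggestions."""
    suggestions = ["walking_route"]
    if weather.get("raining") or weather.get("temp_c", 20) < 0:
        # bad weather: surface alternatives ahead of the walk
        suggestions.insert(0, "nearby_cab_companies")
    if walking_minutes > 45:
        suggestions.append("public_transit_options")
    return suggestions

# stand-in for a real weather-service response for the user's location
weather_now = {"raining": True, "temp_c": 12}
print(commute_suggestions(walking_minutes=25, weather=weather_now))
# ['nearby_cab_companies', 'walking_route']
```

Neither the location nor the forecast is useful on its own; it is the combination that tells the system the user probably wants a cab, not a walk.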
Making Use of the User’s Story

Behavioral targeting, or personalization, refers to a range of technologies that online website publishers and advertisers use to increase the effectiveness of their campaigns by capturing data generated by website and landing-page visitors and then adapting to their needs. Personalization technology enables the dynamic insertion, customization or suggestion of content in any format that is relevant to the individual user, based both on the user’s explicitly provided details and on their implicit behavior and preferences.

Another aspect of personalization is the increasing prevalence of open data on the Web. Many companies make their data available via APIs, Web services and open data standards. For example, Pipl is a search engine designed to locate people’s information across the Web. Pipl uses identity resolution algorithms to aggregate and cross-link information from various sources before delivering an online profile containing a summary of everything that’s publicly available about an individual, and it offers all of that information to developers via an API. One useful application would be to run an API request on an email address; from the results, one can determine the user’s gender, age, location and interests and provide an adaptive experience tailored to the individual user.
Pipl Search aggregates information that’s publicly available on any individual.
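Adapting an experience from such a profile might look like the sketch below. The profile shape and function names here are hypothetical; a real identity-lookup service such as Pipl defines its own request and response formats.

```python
def adapt_homepage(profile):
    """Choose homepage modules from an identity profile. The profile
    shape is hypothetical, standing in for a real API's response."""
    modules = ["generic_welcome"]
    if profile.get("interests"):
        # replace the generic page with feeds for the top interests
        modules = [f"feed:{topic}" for topic in profile["interests"][:3]]
    if profile.get("location"):
        modules.append("local_events")
    return modules

# stand-in for the response of an identity-lookup API keyed by email
profile = {"age": 34, "location": "Tel Aviv", "interests": ["cycling", "music"]}
print(adapt_homepage(profile))  # ['feed:cycling', 'feed:music', 'local_events']
```

The fallback to a generic page is the important part: when no profile data is available, the adaptive layer degrades gracefully instead of guessing.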
Understanding the user story is possible with a network connection. But network connectivity is not only important for understanding the user and their online record; it is the vital instrument that connects all the other technologies together. Cloud computing, local weather, traffic conditions, or even the type of connection itself (Wi-Fi or 3G) can all help us understand context. Ultimately, the possibilities inherent in understanding and designing for the user’s story, their context, are built upon the collection of sensor data and user data via the network.
There are two main scenarios for using sensors. In the first, everyday objects transmit data such as temperature or noise level to other devices; for example, iGrill is a cooking thermometer and application that communicates with smart devices via a secure, long-range Bluetooth connection. In the second, smart device applications use their built-in sensors to receive, process and output data to the user. By using these sensors and combining them with the other technologies discussed above, we can often obtain powerful information about the context of use and use it to create adaptive systems.
iGrill Cooking Thermometer.
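The second scenario, an app reading its own built-in sensors, can be as simple as thresholding microphone readings to classify the environment. The 70 dB threshold below is an assumption for illustration.

```python
def classify_environment(noise_db_samples, loud_threshold_db=70):
    """Classify the ambient context from microphone readings (dB),
    the kind of built-in sensor data a smart device app can use."""
    if not noise_db_samples:
        return "unknown"
    avg = sum(noise_db_samples) / len(noise_db_samples)
    return "loud" if avg >= loud_threshold_db else "quiet"

print(classify_environment([72, 75, 80]))  # loud
print(classify_environment([40, 42, 38]))  # quiet
```

A classification like this is rarely useful by itself; its value comes from feeding it into the kind of adaptations discussed earlier, such as raising the ringer volume in a loud environment.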
Sensors can be a powerful design tool of the future. For example, with the aid of sensors, e-commerce checkout could become as easy as logging into a bank account with no password. Here is an example of how four layers of sensor data could establish the user’s identity with enough certainty to allow passwordless banking, presenting a “light” version of the bank account so the user can quickly check his balance. Imagine a user at home, browsing to his bank account on his tablet computer.

The first layer of security is the username associated with the tablet. The second is the location sensor, which gives us a greater degree of certainty that the user is in his home vicinity, cross-checked against the address registered with the bank. The third layer is the Wi-Fi network the user is surfing on, identified by its MAC address, a unique identifier assigned to the network hardware. For the fourth layer, we can check for other nearby Wi-Fi networks (the neighbors’ access points have unique MAC addresses of their own), which can also be used for verification. If these bits of data remain consistent across several password logins, the system can adapt and allow the user to enter without any password.
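The four layers described above amount to a confidence score. Below is a sketch with illustrative weights and made-up identifiers; a real bank would tune, weight and harden this considerably.

```python
def identity_confidence(signals, known):
    """Score the four signal layers described above; each consistent
    layer adds one point of confidence (weights are illustrative)."""
    score = 0
    if signals["username"] == known["username"]:
        score += 1                                   # layer 1: device account
    if signals["distance_from_home_km"] < 0.5:
        score += 1                                   # layer 2: location
    if signals["wifi_mac"] == known["home_wifi_mac"]:
        score += 1                                   # layer 3: home network
    overlap = set(signals["nearby_macs"]) & set(known["usual_nearby_macs"])
    if len(overlap) >= 2:
        score += 1                                   # layer 4: neighbor networks
    return score

known = {"username": "dana", "home_wifi_mac": "AA:BB:CC:11:22:33",
         "usual_nearby_macs": ["DE:AD:BE:EF:00:01", "DE:AD:BE:EF:00:02",
                               "DE:AD:BE:EF:00:03"]}
signals = {"username": "dana", "distance_from_home_km": 0.1,
           "wifi_mac": "AA:BB:CC:11:22:33",
           "nearby_macs": ["DE:AD:BE:EF:00:01", "DE:AD:BE:EF:00:02"]}

# all four layers agree: allow the passwordless "light" account view
print(identity_confidence(signals, known) == 4)  # True
```

A sensible policy would map a perfect score to the passwordless “light” view and anything lower to the normal password login, so a single inconsistent layer quietly falls back to full authentication.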
To learn more about adaptive design and how to get from sensors to context, I highly recommend reading Albrecht Schmidt’s paper on Context-Aware Computing in the Interaction Design Foundation Encyclopedia.
The abilities of today’s network information technologies to create rich, immersive personalized experiences to track interactions and aggregate and analyze them in real time, together with the data collected by the sensors we carry in our smart devices, provides us an opportunity like never before to design adaptivity in order to ultimately offer a better user experience that is both unobtrusive and transparent.
This article will cover the fundamental concepts for utilizing smart device technologies and sensor data in order to understand context and introduce “adaptive thinking” into the UX professional’s toolset. I will demonstrate the importance of context when designing adaptive experiences, give ideas on how to design adaptive systems, and perhaps inspire designers to consider how smart devices and context aware applications can enhance the user experience with adaptivity.
Examples Of Adaptive Systems
An early example of an adaptive feature can be found in GPS navigational devices. Using one of these devices, a user is able to easily locate and navigate to any location they can drive to. When the sun sets or while driving through a tunnel, the system automatically changes the interface color to a dark “night mode” so as not to blind the driver with a bright light from the device. The system knows the user’s exact location and the position of the sun, and by understanding these two factors, the system maintains a safe driving environment by adapting to the user’s needs.The day and night interfaces in the GARMIN Zumo 660 adapt the interface color so the user isn’t blinded with a bright light.
Adaptive design is about listening to the environment and learning user patterns. Combining smart device sensor data, network connectivity and analysis of user behavior is the secret sauce behind creating an adaptive experience. By combining these capabilities, we not only understand the context of use, we can also anticipate what the user needs at a particular moment.
Google Now is an interesting example of an adaptive application that gives users answer to questions they’ve thought rather than typed. Through a series of smart cards that appear throughout the day on the user’s mobile phone, Google Now tells you the day’s weather before you start your day, how much traffic to expect before you leave for work, when the next train will arrive as you’re standing on the platform or your favorite team’s score while they’re playing. It does this by recording and analyzing your preferences while you’re using your phone. For example, updates on your favorite sports team are based on your Web browsing and search history. And by analyzing your current location, previous locations and Web history, Google Now presents a card with traffic conditions on route to your next likely destination.
As UX professionals, we understand that some mobile users do not like to use the virtual keyboard and we try to avoid that necessity as much as possible. By utilizing the user’s personal behavior as a sensor together with smart device capabilities and enabling voice commands (similar to iOS’s Siri), Google Now creates an adaptive experience that helps users avoid using the virtual keyboard, thus further adapting to the mobile user’s needs and helping users quickly get the information they require on the go.
Adaptive systems are not only limited to mobile devices. Ubiquitous computing (ubicomp) is the idea of being surrounded by smart devices and networked digital objects that are carefully tuned to offer us unobtrusive assistance as we navigate through our work and personal lives. Similarly, ambient intelligence (AmI) refers to digital environments that are sensitive and responsive to the presence of people.
Nest uses sensors to adapt the temperature to activity in the home.
Nest, The Learning Thermostat, is a great example of an adaptive system integrated to home environments. Using a variety of sensors for temperature, humidity, touch, near-field activity, far-field activity and even ambient light, it can detect whether there are people home and how active the home is at any time. By adjusting the temperature to adapt to this information, it can automatically cut up to 20% off a home’s heating and cooling bills.
When no one is around, Nest learns to turn the heat down. When you come home from work, it knows that the heat should go back up. After the first few weeks, it learns when you come home from work and can turn the heat up before you arrive so that you come home to a warm house.
In 1991 Mark Weiser, widely considered to be the father of ubiquitous computing, wrote:
“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”Nest is a great example of ubicomp and how technology can disappear into our surroundings until only the user interface remains perceivable to users.
These devices create contexts of sensor and user data that provide a superior user experience by anticipating what the user might need before the need is expressed. This is the future of UX design.
Adaptive Thinking
In contrast to traditional desktop systems, mobile devices are normally used in many different situations. However, mobile applications nowadays do not often take advantage of information about the context of their use, and hence are only usable for very specific purposes. For example, an application with city maps for local businesses can be used in different contexts: walking through town or at home, and with or without network connectivity.Today’s users can customize their device’s system through preferences and settings, and by choosing what applications work best for their needs. Even after the implementation of user-centered design processes that assure a certain degree of user acceptance and yield a richer understanding of the context, it is impossible to anticipate the requirements of all users and map those requirements to a single best or optimal system configuration.
“Adaptive thinking” is a mindset that provides the tools necessary to significantly improve the user experience and enhance the intended purpose of the product by utilizing the technology that is readily available in every pocket. It is about learning the environment and the user and adapting to their current needs and situation. Therefore, designers should first design for the context of use and then design the set of functions that are triggered in relevant situations.
Here is an instructive case where adaptive thinking was used to create a mobile application for a bike sharing program. Bicycle sharing systems, also known as bike rental, are becoming more and more common in major cities around the world. Bicycle sharing helps reduce traffic congestion and air pollution, and encourages local residents to maintain a healthy lifestyle.
A user who wants to rent a bike can use a mobile application to look for the nearest bike rental station that has bikes available to rent. If the user is unfamiliar with the city, they can use the application to get directions to the rental station; this is the core functionality of the application.
An adaptive system will realize when the user has arrived at the bike rental station and automatically offer additional options, i.e., adapt to the current situation. For example, it may offer the user a quick way to rent a bike, a feature that was not available in the application before arriving at the rental station. During the rental period, the system will anticipate the user’s needs and offer nearby bike rental stations with available parking spots where the bike can be returned, and show the user the current balance for the rental time.
A bicycle sharing application can adapt to show the user different options depending on location, and whether the user is currently renting a bike.
By using the assisted GPS device capabilities, using the network connectivity and understanding the user’s story at any given time through the product lifecycle, adaptive design will provide users of the mobile application a reliable extension to the bike rental program.
Adaptive and Responsive Design
An adaptive system is one that adapts automatically to its users according to changing conditions. Responsive design (or adaptive layout) is a subset of adaptive design, an approach to Web design in which a website is crafted to provide an optimal viewing experience across a wide range of devices. In my UX magazine article The Multiscreen Ecosystem, I discuss how responsive websites can also be adaptive by understanding the context of using a mobile device and by designing contextual paths.Context For Adaptivity
I quote below from the 2007 book The Adaptive Web, which talks about the importance of context for adaptive mobile guides. It explains adaptivity in the scope of mobile systems as context-aware computing, i.e., the ability to use information in the current context to adapt the user interaction and the presentation of information to the current situations of the user.“Understanding the context is an important prerequisite for the adaption process. Context is not just the location, but encompasses also information like the ambient noise or lighting level, the network connectivity or bandwidth, and even the social circumstances of the user. Furthermore, systems have to anticipate the user’s goals and intentions, which might be inferred from their actions or from physiological sensors and appropriate environmental sensors (e.g. light, pressure and noise sensors).That said, understanding the locational context and the user story is now easier than ever before. We can take advantage of the fact that we carry our phones wherever we go. A smartphone is packed with technology and with information about the user that designers can use to understand context. The highly sophisticated advanced technology in a user’s pocket not only allows designers to analyze if the user is walking, standing or in a loud or quiet environment, but also can help designers understand the precise location of a person within a department store, such as a specific aisle.
One prerequisite for adaptive systems is the proper assessment of the user’s situation. For this purpose, systems need to rely on a representation of relevant situations. Depending on the supported task, situations can be characterized by many different attributes. Therefore, designers of suitable adaptation for mobile devices need to look at a variety of spatial, temporal, physical and activity related attributes to provide effective assistance.
For example, a mobile application that assists users in a shop needs to know about the current spatial environment of the users (e.g. which products are nearby), the temporal constraints of the user (e.g. how much time is available for shopping), the general interests of the users and their preferences (e.g. if the user prefers red or white wine with tuna), details about the shopping task itself (e.g. which items are on the shopping list and for which purpose the products are needed) and maybe even about the physiological and the emotional state of users (e.g. whether users are enjoying the shopping or not).”
An application can analyze the user’s precise location within a store to provide information that is adapted to the current content.
AislePhone, an Israeli start-up currently in the beta stage, is developing a platform for precise in-store positioning that can determine the exact position of a person down to the specific aisle. With this technology, shopping with your mobile phone in hand will be a common experience, as mobile apps for supermarkets and other large retail stores will use locational and user data to enhance the shopping experience, much like a personal shopping assistant in your pocket.
Google Indoor Maps allows users to view and navigate floor plans of several kinds of commercial locations such as airports, department stores or malls, all within Google Maps.
This technology not only knows your indoor location, but also what floor you’re on in a building. Depending on what data is available, the map shows notable places in the building you’re currently in, such as stores, restrooms or the nearest food court.
With this type of technology, “you are here” directory maps will no longer be needed in malls or department stores. You will be able to determine your location and orient yourself using a smartphone, and this experience will adapt to your specific needs. For example, apps will offer you relevant discounts as you walk through the mall or highlight shops based on your gender and age.
Designing An Adaptive System
Adaptive design integrates both subtle and obvious features. Often, adaptive qualities can be very subtle and unobtrusive: sometimes a seemingly small adaptive feature can greatly improve the overall experience. For example, did you ever notice that Google Search can read your mind? When you start typing, Google Instant, using autocomplete, knows what you’re thinking even when you enter only three letters of a search term. It does this because Google Search considers and records all search queries within a session in order to have a better understanding of the user’s intent.When a user searches for “The Beatles,” Google understands this as part of a research session and will help you quickly discover Ringo Starr or Paul McCartney as you enter the first three letters of their name; it understands the context of your search and compares it with other similar popular relevant results.
Google Instant understands the context of your search.
Another example of a subtle feature that helps enhance the user experience is a testing system for students that adjusts the difficulty of test questions according to whether prior questions were answered correctly. Or a music discovery application that looks into your current play list and adapts to your taste, helping you discover additional music you may like.
Although the experience should always be unobtrusive, adaptive interfaces need to be obvious so users understand the context for the adaptation and always feel in control. For a better experience, applications should also allow users to manage adaptive features. For example, if at nighttime the interface changes to a darker night mode (like in navigational devices), the user should always be able to change it back manually. Or, if entering a shopping mall triggers a different experience, the user must understand the context for this adaptivity and want to embrace the added functionality.
Charles Darwin wrote:
“It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is the most adaptable to change.”As human beings, we adapt to our surroundings naturally; it is the key to our survival. As designers, we can use this inherent ability and our physical senses and the powers of the brain to analyze and design what we would do in adaptable situations. For example, to communicate in a loud environment, we adapt by raising up our voice up to be heard. Similarly, an adaptive system will raise a device’s volume.
In an even louder environment, we use hand gestures to get attention and focus our eyes on the other person’s mouth to try to read their lips. However, unlike computers that can process multiple layers of data, human beings have limited sensory resources and a limited cognitive workload.
In today’s world, a person carries in one pocket more advanced technologies than ever before possible. An intelligent device like a smartphone is embedded with highly sophisticated sensors. These sensors, together with advanced computing power and network connectivity, can help us analyze and understand the context of use. The smart device’s ability to analyze the context of use in real time, together with understanding the user’s story, allows opportunities to provide an even greater user experience by adapting to the user needs.
I will illustrate some of the key points in using these technologies.
Analyzing User Behavior
Similar to the Google Now example, analyzing user behavior and the user’s interaction with the digital world can yield a great understanding of the user’s context. Analyzing the user’s search patterns or what applications they download can tell us about their preferences and hobbies. Tracking current location and location history can give us the user’s surroundings and the physical boundaries of their life, so we can understand what subway station they take to work or where they like to eat their lunch. Note that when this is done without the knowledge of users, it may be considered a breach of browser security and illegal in many countries.Here is a practical example of how analyzing the user’s behavior could help in creating an adaptive system. In the now famous Google Glasses video, we follow the user throughout his morning as he eats his breakfast and then leaves his house heading for the subway. Upon arriving at the subway, he receives a message that subway service is suspended and is offered a walking route. As useful as this may be, a true adaptive system will analyze the user behavior as the user gets up and will warn the user ahead of time that the subway service is suspended.
Google Glass uses information about the user's location to provide relevant information.
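One way to ground this kind of behavioral analysis is to mine the user's location history for routines. The following is a minimal sketch in Python; the place labels and records are hypothetical stand-ins for what a real system would derive by clustering GPS fixes over weeks of use.

```python
from collections import Counter
from datetime import datetime

# Hypothetical location-history records: (timestamp, place label).
# A real system would derive these labels by clustering raw GPS fixes.
history = [
    (datetime(2013, 4, 1, 8, 10), "subway:14th-st"),
    (datetime(2013, 4, 2, 8, 5), "subway:14th-st"),
    (datetime(2013, 4, 3, 8, 12), "subway:14th-st"),
    (datetime(2013, 4, 3, 12, 30), "cafe:joes"),
]

def usual_place(history, hour, window=1):
    """Return the place the user most often visits around a given hour."""
    nearby = [place for ts, place in history if abs(ts.hour - hour) <= window]
    if not nearby:
        return None
    return Counter(nearby).most_common(1)[0][0]

# Around 8 a.m. this user is usually at the 14th St subway station, so a
# service alert for that station is worth surfacing before he leaves home.
print(usual_place(history, 8))  # subway:14th-st
```

Even a frequency count this simple is enough to decide which transit alerts matter to a given user and when to show them.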
Understanding the user's behavior (whether he takes the subway or walks to work) and connecting it with information that's available online lets us understand and adapt to his needs. Most of the time, using one data source is not enough; combining the technologies (network connectivity, user behavior, and sensor data) is the only way to understand context. For example, we can gauge the outside weather by combining the user's current location with online weather information, then use this data to offer phone numbers for nearby cab companies in addition to a walking route, assuming the user may not wish to walk to work in the rain.
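Combining the two data sources can be sketched in a few lines. Here the weather lookup is a stub standing in for a real weather API call; the decision rule and thresholds are illustrative assumptions, not a production design.

```python
# Sketch: combine the user's current location with (stubbed) online
# weather data to decide whether to offer a walking route or cab numbers.

def fetch_weather(lat, lon):
    # Placeholder for a real weather-API request keyed on location.
    return {"condition": "rain", "temp_c": 9}

def commute_suggestion(lat, lon):
    weather = fetch_weather(lat, lon)
    if weather["condition"] == "rain" or weather["temp_c"] < 0:
        return "cab"      # offer phone numbers for nearby cab companies
    return "walking"      # otherwise, a walking route is reasonable

print(commute_suggestion(40.74, -73.99))  # cab (given the stubbed rain)
```

The point is not the rule itself but the combination: neither the location sensor nor the weather feed alone can tell the system that a walking route is a poor suggestion right now.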
Making Use of the User’s Story
Behavioral targeting, or personalization, refers to a range of technologies that online publishers and advertisers use to increase the effectiveness of their campaigns by capturing data that website and landing-page visitors generate, then adapting to their needs. Personalization technology enables the dynamic insertion, customization, or suggestion of content, in any format, that is relevant to the individual user, based both on the details the user explicitly provides and on their implicit behavior and preferences.

Another aspect of personalization is the increasing prevalence of open data on the Web. Many companies make their data available on the Web via APIs, Web services, and open data standards. For example, Pipl is a search engine designed to locate people's information across the Web. Pipl uses identity-resolution algorithms to aggregate information and cross-link various sources before delivering an online profile containing a summary report of everything that is publicly available about an individual. Pipl offers all of that wealth of information to developers via an API. One useful application would be to run an API request for an email address to determine the user's gender, age, location, and interests, then provide an adaptive experience based on the individual user.
Pipl Search aggregates information that’s publicly available on any individual.
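The pattern of enriching a profile from an email address looks roughly like the following. The lookup function here is a local stub; Pipl's actual API endpoint, parameters, and response format differ, and the field names shown are assumptions for illustration only.

```python
# Sketch: adapt page content using a profile inferred from an email
# address. lookup_person is a stub; a real people-search API call
# (for example, via Pipl) would replace it.

def lookup_person(email):
    # Stand-in for a real identity-resolution API response.
    return {
        "gender": "female",
        "age_range": "25-34",
        "city": "Austin",
        "interests": ["cycling", "photography"],
    }

def adapt_homepage(email):
    profile = lookup_person(email)
    # Start from generic content, then add modules that match the
    # individual user's inferred interests.
    modules = ["generic-news"]
    modules += ["topic:" + t for t in profile.get("interests", [])]
    return modules

print(adapt_homepage("user@example.com"))
```

In other words, a single explicitly provided detail (the email address) can seed an implicit profile that drives which content modules the system chooses to show.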
Understanding the user's story is possible only with a network connection. However, network connectivity is not just important to understanding the user and his online record; it is also the vital instrument that connects all of the other technologies together. Cloud computing, local weather, traffic conditions, or even the type of connection itself (Wi-Fi or 3G) can help us understand context. Ultimately, the possibilities inherent in understanding and designing for the user's story, their context, are built upon the collection of sensor data and user data via the network.
Sensor Data
For adaptive systems, a sensor is any technology that lets a device understand and evaluate context. This includes a smart device's built-in accelerometer, its camera, its clock, or even its microphone. We can use the various sensors embedded in smart devices to better understand the user's environment. For example, the built-in accelerometer can be used to gauge whether a user is walking or running.

There are two main scenarios for using sensors. In the first, everyday objects transmit data such as temperature or noise level to other devices; for example, iGrill is a cooking thermometer and application that communicates with smart devices over a secure, long-range Bluetooth connection. In the second, smart device applications use the built-in sensors to receive, process, and output data to the user. By combining these sensors with the other technologies discussed above, we can often obtain powerful information about the context of use and use it to create adaptive systems.
iGrill Cooking Thermometer.
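To make the accelerometer example concrete, here is a minimal sketch of classifying activity from a short window of accelerometer readings. The variance thresholds are illustrative assumptions; a real classifier would be calibrated against recorded sensor data and would typically look at step frequency as well.

```python
import math

# Sketch: classify "still" vs. "walking" vs. "running" from the
# variance of accelerometer magnitudes. Thresholds are illustrative.

def magnitude(sample):
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def classify_activity(samples):
    """samples: list of (x, y, z) readings in m/s^2 over a short window."""
    mags = [magnitude(s) for s in samples]
    mean = sum(mags) / len(mags)
    variance = sum((m - mean) ** 2 for m in mags) / len(mags)
    if variance < 0.5:
        return "still"          # near-constant gravity vector
    return "running" if variance > 10 else "walking"

# A bouncy, high-variance window of readings reads as running.
burst = [(0, 0, 9.8), (3, 2, 15), (-2, -1, 4), (4, 1, 16), (0, 0, 9.8)]
print(classify_activity(burst))  # running
```

An application could use this signal, for example, to switch a map view from walking directions to a simplified, glanceable layout while the user is running.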
Sensors can be a powerful design tool of the future. For example, with the aid of sensors, checking out on an ecommerce site could become as easy as logging into a bank account with no password. Here is an example of using four layers of sensor data to establish the user's identity with a sufficient degree of certainty to enable passwordless banking, presenting the user a "light" version of his bank account so he could quickly check his account balance. Imagine a user at home, browsing to his bank account on his tablet computer.
The first layer of security is the username that's associated with the tablet. The second is the location sensor, which gives us a greater degree of certainty that the user is in the vicinity of his home, cross-checked against the address he registered with the bank. The third layer is the Wi-Fi connection the user is on (its MAC address, a unique identifier assigned to a network). For the fourth layer, we can check for other nearby Wi-Fi networks (the neighbors are sure to have unique Wi-Fi MAC addresses), which can also serve as security verification. If these bits of data remain consistent across several password logins, the system can adapt and let the user in without any password.
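The four layers described above can be sketched as a simple trust score. All of the enrolled values, field names, and the threshold here are hypothetical; a real system would weight each signal and tune the threshold against actual fraud data rather than counting layers equally.

```python
# Sketch of the four-layer passwordless check. All values are
# hypothetical; a real system would weight and tune these signals.

ENROLLED = {
    "username": "jdoe",
    "home_area": "brooklyn-ny",
    "wifi_mac": "a4:2b:8c:11:22:33",
    "neighbor_macs": {"b0:05:94:44:55:66", "c8:d7:19:77:88:99"},
}

def trust_score(session):
    score = 0
    if session["username"] == ENROLLED["username"]:
        score += 1  # layer 1: username associated with the tablet
    if session["area"] == ENROLLED["home_area"]:
        score += 1  # layer 2: location matches the registered address
    if session["wifi_mac"] == ENROLLED["wifi_mac"]:
        score += 1  # layer 3: same home Wi-Fi network (MAC address)
    if session["nearby_macs"] & ENROLLED["neighbor_macs"]:
        score += 1  # layer 4: familiar neighboring networks visible
    return score

def allow_passwordless(session, threshold=4):
    return trust_score(session) >= threshold

session = {
    "username": "jdoe",
    "area": "brooklyn-ny",
    "wifi_mac": "a4:2b:8c:11:22:33",
    "nearby_macs": {"b0:05:94:44:55:66"},
}
print(allow_passwordless(session))  # True: all four layers agree
```

If any layer fails (say, the user is on an unfamiliar network), the score drops below the threshold and the system falls back to the normal password login.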
To learn more about adaptive design and how to get from sensors to context, I highly recommend reading Albrecht Schmidt's paper on context-aware computing in the Interaction Design Foundation Encyclopedia.