
Ambient intelligence: how well does your phone know you?

[Have you ever wished your handset understood you better, so you didn't have to tap, type and scroll to tell it what you want it to do? That moment may not be far away. VisionMobile Senior Analyst Andreas Pappas discusses the future of ambient intelligence and how handset evolution has brought us closer to that vision.]

“PlaceMe” is not an app; not in the traditional sense. You don’t really interact with it and you don’t tell it what to do. It will sit there quietly, in the background, observing you, learning about your daily routine and keeping a record of everywhere you’ve been.

“Highlight” is another app that sits quietly in the background. It scans for people around you and checks their profiles. If it finds someone interesting, such as a person with whom you share mutual friends, it will tell you everything it knows about them.

PlaceMe and Highlight belong to a new breed of mobile apps that are better described as ambient sensing services. These services require minimal or no user input; they stay constantly aware of your environment by monitoring the sensors on your phone and combining their readings with cloud-based information sources to fill in any missing information and augment your perception of the world.

Heavyweights such as Google and Qualcomm are also investing in services of this type:
- Google Now is a service currently available on Android Jelly Bean that uses your search history to predict and display (or read out) information it thinks will be useful to you, such as local weather or flight info.
- Gimbal, Qualcomm’s context-awareness platform, aims to empower context-aware applications through an SDK that includes geofencing (location-based selection), interest sensing and image recognition, among other features. A minimal sketch of the geofencing idea follows this list.
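
As a rough illustration of that geofencing idea (this is not the Gimbal API, just a generic point-in-radius check with made-up coordinates):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def inside_geofence(position, fence_centre, radius_m):
    """True if the device position falls within the circular geofence."""
    return haversine_m(*position, *fence_centre) <= radius_m

# Hypothetical fence around a shop entrance, 150 m radius
print(inside_geofence((51.5074, -0.1278), (51.5080, -0.1275), 150))
```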

Alohar Mobile, the makers of PlaceMe, also provide an SDK and APIs that developers can use to integrate location- and motion-based features into their apps.

These services are still at an early stage. They are designed to maximise ambient context sensing (where you are, where you're going, what the weather, your mood, your health or the traffic conditions are like) and to minimise your input, so that the services disappear into the background and only interrupt you when they have something interesting to say.
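
A minimal sketch of this "interrupt only when interesting" pattern, with invented sensor-reading and scoring functions standing in for a real SDK:

```python
import time

INTEREST_THRESHOLD = 0.8  # purely illustrative; tuned per service in practice

def read_context():
    """Placeholder for real sensor fusion: location, motion, time of day, weather."""
    return {"place": "office", "moving": False, "hour": 17}

def interest_score(context, user_model):
    """Score how noteworthy the current context is for this user (0..1)."""
    if context["place"] == "office" and context["hour"] >= user_model["usual_leave_hour"]:
        return 0.9  # e.g. "time to head home"
    return 0.1

def ambient_loop(user_model, notify):
    """Run quietly in the background and surface a notification only when warranted."""
    while True:
        context = read_context()
        if interest_score(context, user_model) >= INTEREST_THRESHOLD:
            notify(context)   # surface a card or notification
        time.sleep(60)        # stay silent between samples
```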

Mobile devices take centre stage

Ambient intelligence has been a hot topic in academic circles and research labs for almost two decades: smart walls, screens, fridges and homes, sensors and sensor networks have featured extensively in research papers. These devices were meant to understand and automate people's interaction with their environment. And while some progress has been made, there has been a profound lack of real-world use cases and commercial traction. One reason behind this failure is that such services and devices were disconnected from the user most of the time, i.e. their utility was confined to very limited use cases.

The one component needed to link all these smart devices to the user was, until recently, missing: the sensor-packed smartphone. It is a vital component of the ambient intelligence environment: a device that is with you most of the time, if not always, and has the capacity to monitor, understand, communicate and react to most of your actions and interactions.

The number and quality of sensors on smartphones have been increasing rapidly in recent years (gyroscopes, humidity, temperature) and this trend will certainly continue: in the future we may well see medical sensors integrated within handsets. However, several ingredients besides sensors are necessary to power an ambient intelligence service:

- Cloud-based data stores: online data sources that turn sensor readings into information humans can understand. For example, a Wi-Fi transceiver only works as a location sensor if the hotspot it sees is mapped to a location in a cloud database (a toy version of this lookup is sketched after this list).
- Cheap & ubiquitous data: The utility of ambient services increases with always-on, ubiquitous and real-time data connectivity to cloud-based resources.
- Cheap cloud processing power and inference engines: These cloud-based components do the number crunching and data processing required to combine sensor input with all sorts of data (environmental, geographic, economic etc.) in order to infer intentions and determine appropriate actions.
- Mobile processing power: with quad-core processors running at 1.5GHz, mobile devices now have capabilities comparable to notebooks and are able to do much of the required processing locally.
- Apps: apps act as the front end and user interface, fulfilling and extending the available use cases.
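
To make the Wi-Fi example in the first bullet concrete, here is a toy version of that cloud lookup, with a hard-coded table standing in for a real geolocation database:

```python
# Toy stand-in for a cloud Wi-Fi geolocation database: BSSID -> (lat, lon).
# Real services resolve scanned access points against databases of this kind.
WIFI_DB = {
    "00:1a:2b:3c:4d:5e": (51.5074, -0.1278),
    "66:77:88:99:aa:bb": (48.8566, 2.3522),
}

def locate_from_wifi(scanned_bssids):
    """Estimate position by averaging the known locations of visible hotspots."""
    known = [WIFI_DB[b] for b in scanned_bssids if b in WIFI_DB]
    if not known:
        return None  # no mapped hotspot in range, so the "sensor" yields nothing
    lat = sum(p[0] for p in known) / len(known)
    lon = sum(p[1] for p in known) / len(known)
    return lat, lon

print(locate_from_wifi(["00:1a:2b:3c:4d:5e", "de:ad:be:ef:00:00"]))
```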

Most of the above elements have only recently reached the performance or price level required to enable mobile-based ambient intelligence services. Context-awareness SDKs such as those provided by Qualcomm and Alohar Mobile give developers the tools to enrich their apps with contextual information about the user and their environment. By doing so they enable a more streamlined experience and a more intuitive user-app interaction than traditional apps. For example, by knowing the user's location, daily routine and current traffic conditions, a personal assistant service can infer, and inform the user, that it is time to leave work to pick up the kids from school, and that the best option is to take the long route because there is a traffic jam on the usual one. This process should require no user input: such services learn the user's habits and schedule simply by monitoring their everyday behaviour.
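
A heavily simplified sketch of that inference step, using invented data structures rather than any particular SDK:

```python
from datetime import datetime, timedelta

def pickup_advice(now, routine, travel_times):
    """Decide whether to prompt the user to leave for the school pickup.

    routine      -- learned habit, e.g. {"pickup_time": "15:30", "usual_route": "A"}
    travel_times -- live traffic estimates in minutes per route, e.g. {"A": 45, "B": 25}
    """
    pickup = datetime.combine(
        now.date(), datetime.strptime(routine["pickup_time"], "%H:%M").time())
    best_route = min(travel_times, key=travel_times.get)
    leave_at = pickup - timedelta(minutes=travel_times[best_route])
    if now < leave_at:
        return None  # nothing to say yet; stay silent
    if best_route != routine["usual_route"]:
        return (f"Time to leave for the school pickup; take route {best_route}, "
                f"there is a jam on your usual route.")
    return "Time to leave for the school pickup."

print(pickup_advice(datetime(2012, 9, 3, 15, 10),
                    {"pickup_time": "15:30", "usual_route": "A"},
                    {"A": 45, "B": 25}))
```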

Samsung has integrated some of these concepts into the Galaxy S3: the phone understands that you want to call someone when you lift the handset to your ear while texting them, and it keeps the screen lit as long as you keep looking at it. These emerging features in both handsets and services signal a trend towards ambient intelligence, one that will spark a wave of innovation in this space and open up an even wider range of use cases.
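
One way such a gesture could plausibly be detected (a guess at the general approach, not Samsung's actual implementation) is by fusing the proximity sensor with device orientation while a conversation is on screen:

```python
def looks_like_lift_to_ear(proximity_near, pitch_deg, viewing_contact):
    """Rough heuristic: something covers the proximity sensor, the phone has been
    tilted roughly upright, and a contact's message thread is currently open."""
    return viewing_contact is not None and proximity_near and pitch_deg > 60

if looks_like_lift_to_ear(True, 75, "Alice"):
    print("Dialling Alice...")  # behaviour similar to the S3's direct-call feature
```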

New use-cases are being unlocked

Beyond personal assistant services that are valuable only to individuals, the scope for ambient intelligence services is much larger than the personal level. Harvesting and processing sensor data from thousands or millions of users via services such as PlaceMe can reveal social behaviour and environmental data on an unprecedented scale. The opportunities are immense:

- Commercial: shopping patterns, such as the route a customer took inside a supermarket and how long they spent in each aisle, are valuable information for physical retailers. For example, by knowing an individual's shopping basket, the shortest route and the best deals can be highlighted in a large supermarket. Smart mobile services must offer direct value to users even when their primary purpose is commercial; otherwise it will be difficult to achieve user traction and overcome privacy barriers.
- Public safety: the collective intelligence built from sensor data can highlight traffic patterns in cities and the concentration of large crowds in specific areas. Safety alerts can be triggered when behaviour deviates from normal patterns (a toy illustration follows this list).
- Planning, forecasting and research: the data sets generated by constant observation of location and environmental variables are a unique resource that planners and researchers can tap into to better understand changing social or environmental patterns and respond to them.
- Health monitoring: medical sensors integrated into mobile phones can alert individuals or medical professionals to an emergency and recommend a course of action. While such sensors are currently offered as add-ons, some may become integrated into phones in the future.
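
As a toy illustration of the public-safety bullet above, an alert could be as simple as a deviation test on crowd density per area, using invented numbers:

```python
from statistics import mean, stdev

def crowd_alert(history, current, threshold=3.0):
    """Flag an area when the current device count deviates strongly from its history."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return False
    return abs(current - mu) / sigma > threshold

# Hourly device counts seen in one city square over recent days vs. right now
history = [120, 135, 110, 128, 140, 125, 130]
print(crowd_alert(history, 480))  # unusually dense crowd -> True
```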

The advent of context-aware SDKs, together with increasing levels of handset sensor integration, is bound to open up many more use cases for developers to address.

Beyond the “freaky line” of privacy

As prominent Silicon Valley influencer Robert Scoble puts it, the services described here are way over the “freaky line”, implying a very high level of privacy concern. PlaceMe and Highlight are examples of apps and services that cross the freaky line and go far beyond Facebook in the amount of private data they collect. While PlaceMe currently encrypts this data, most commercial use cases can only be unlocked if users relinquish some control over their personal information.

Sam Liang, founder of PlaceMe, suggests that these services will become so pervasive that, once people start using them, they will not want to be without them. If this is true, it will signify a fundamental shift in the way users perceive their privacy: the utility they receive will be large enough to overcome their concerns about how their data is used. Facebook has led the way here, pushing the “freaky line” further out: it has opened up people's lives (where they've been, who they were with and what they did) to a wide social circle. And all this is usually exposed voluntarily by the user. Despite privacy concerns there is no sign that users are abandoning Facebook, indicating that its utility is worth the privacy trade-offs for most users. PlaceMe and Highlight will push this line even further by collecting data automatically and continuously.

Naturally, sensitivity to privacy varies by age: teenagers are already less sensitive to privacy loss, giving away much control of their personal data to social networks. People's attitudes towards privacy have been evolving, with more and more users relinquishing control. However, a large number of privacy-sensitive users still exist, and when using personal data, marketers should start segmenting consumers by their sensitivity to privacy as well as by price sensitivity.

Where is the opportunity?

In the emerging ambient intelligence world there are several stakeholders and opportunities exist all along the mobile value chain:

Device and component manufacturers can benefit from a shorter handset replacement cycle fuelled by the integration of more, and more diverse, sensors, particularly ones that open up new use cases such as medical and environmental sensors. As today's expensive components (screens, CPUs, memory) become good enough and cheaper, value is likely to shift from these components to integrated sensors in the near future.

Platform providers, cloud services and context-aware API/SDK providers link mobile sensor data to cloud resources and to developers. They will likely act as aggregators and distributors of sensor information and processed data adding significant resale value to the raw sensor data. Operators may be able to capture some value if they manage to leverage their role as communication aggregators: from location data to communication patterns and personal data, operators have access to fine grained user information. Presently, there may be regulatory barriers preventing operators from exploiting such data, but changing user attitudes towards privacy could also push regulators to adopt a lighter-touch approach towards operators.

Developers have demonstrated the capacity to innovate by extending and creating new use-cases that leverage the newest hardware or software capabilities. Developers will provide the front-end that brings ambient intelligence to the user, empowered by these capabilities and driving a new growth cycle for the app economy.

Let us know your thoughts about the future of ambient intelligence services.

- Andreas (@PappasAndreas)

  • Clark Dodsworth (http://www.dodsworth.com)

    Andreas, you've described the handset-centric ambient intelligence & sensor data integration (single-user and aggregate) opportunities, AKA context awareness. I was one of the creators of the Ambient Intelligence strategy for Philips in ~'98 and you can see my more recent talks on CA online.

    GNow and Siri are good steps, but as long as the concrete + semantic user-behavior datastream and the sensor datastream from a user's phone are accumulated and munged by a vendor, it'll be difficult/impossible to provide good privacy and data security. And the CA is largely limited to single-user benefits now.

    There's another step necessary to achieve full context awareness, to provide a ubiquitous, user-centric hyperpersonalized service that uncovers value in the moment. All users' constantly evolving intent and priorities can be intermediated, so you move through the day — situations, complications, and locations — in a re-forming mesh of other related nodes. Then, when priorities and intents converge, this will be integrated, prioritized, and delivered to you.

    It requires Strong AI, distributed among all users and a cloud Salience Engine. Each user owns their own information, making whatever elements available that they choose.

    We've been working on this for a long time, and we're launching our Kickstarter project tomorrow: http://www.kimerasystems.com

