

Title:
ACTIVITY INSIGHT MICRO-ENGINE
Document Type and Number:
WIPO Patent Application WO/2016/029233
Kind Code:
A1
Abstract:
Embodiments relate generally to electrical and electronic hardware, computer software, wired and wireless network communications, and computing devices. More specifically, an insight micro-engine is configured to present insights relating to a detected activity. According to an embodiment, a method includes detecting at a wearable computing device an absence of a communication link and activating an insight micro-engine configured to operate responsive to the absence of the communication link. Further, the method can include receiving a subset of sensor data from a first subset of sensors implemented in the wearable computing device, determining an activity based on a subset of triggers, determining an aggregate value representative of a state of the activity, correlating the aggregate value to a target value, and deriving data representing an insight that associates the aggregate value with a parameter. Also, the method can include presenting at a displayable user interface a representation of the aggregate value.

Inventors:
ROBISON JEREMIAH (US)
THAKUR ADITYA (US)
Application Number:
PCT/US2015/046618
Publication Date:
February 25, 2016
Filing Date:
August 24, 2015
Assignee:
ALIPHCOM (US)
ROBISON JEREMIAH (US)
THAKUR ADITYA (US)
International Classes:
G16H20/30; G08B1/08; G16H40/67
Foreign References:
US20120283855A1 (2012-11-08)
Attorney, Agent or Firm:
KOKKA & BACKUS, PC (Palo Alto, California, US)
Claims:
What is claimed:

1. A method comprising:

detecting at a wearable computing device an absence of a communication link with a first computing device;

activating an insight micro-engine configured to operate in association with a memory responsive to the absence of the communication link, wherein a processor coupled to the insight micro-engine performs a method comprising:

receiving a subset of sensor data from a first subset of sensors implemented in the wearable computing device;

determining an activity based on a subset of triggers;

determining an aggregate value representative of a state of the activity;

correlating the aggregate value to a target value; and

deriving data representing an insight that associates the aggregate value with a parameter; and

presenting at a displayable user interface a representation of the aggregate value.

2. The method of claim 1, further comprising:

presenting a representation of the insight.

3. The method of claim 2, wherein correlating the aggregate value to the target value comprises:

correlating an aggregate number of steps to a target number of steps.

4. The method of claim 2, wherein deriving data representing the insight that associates the target value with the parameter comprises:

deriving data representing the aggregate number of steps at a point in time as the parameter.

5. The method of claim 2, wherein deriving data representing the insight that associates the target value with the parameter comprises:

deriving data representing the aggregate number of steps at a point of time as the parameter;

comparing data representing another aggregate number of steps at another point of time as another parameter;

determining a difference; and

presenting a representation of the difference.

6. The method of claim 1, further comprising: determining loss of sensor data from a second subset of sensors associated with the first computing device.

7. The method of claim 1, further comprising:

detecting presence of the communication link;

terminating at least a portion of the insight micro-engine; and

receiving another insight from the first computing device.

8. The method of claim 7, wherein detecting presence of the communication link comprises: determining the wearable computing device is within a communication boundary at a distance from the first computing device.

9. The method of claim 1, wherein detecting absence of the communication link with the first computing device comprises:

detecting absence of the communication link with a mobile phone.

10. The method of claim 1, further comprising:

detecting presence of the communication link;

establishing another communication link to a second computing device; and

retrieving an insight engine that includes a portion of instructions equivalent to the insight micro-engine.

11. The method of claim 10, wherein the second computing device is a remote server.
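
Purely for illustration, the following Python sketch walks through the steps recited in claim 1 (detecting an absent link, activating the micro-engine, aggregating local sensor data, correlating against a target, and presenting the result). Every function name, data shape, and trigger rule here is a hypothetical assumption made for the sketch, not the claimed implementation.

```python
# Hypothetical illustration only: names, data shapes, and trigger logic are
# assumptions made for this sketch, not the claimed implementation.
import time

def run_insight_micro_engine(link_present, sensors, triggers, target_steps, display):
    """Derive and present an insight locally when the companion device is absent."""
    if link_present():                     # communication link detected: host can derive insights
        return None
    samples = [sensor.read() for sensor in sensors]        # first subset of on-band sensors
    activity = "walking" if any(trigger(samples) for trigger in triggers) else "idle"
    aggregate = sum(sample.get("steps", 0) for sample in samples)  # state of the activity
    progress = aggregate / target_steps if target_steps else 0.0   # correlate to target value
    insight = {                            # insight associates the aggregate with a parameter
        "activity": activity,
        "steps": aggregate,
        "hour_of_day": time.localtime().tm_hour,
        "progress_to_target": progress,
    }
    display(f"{aggregate} steps ({progress:.0%} of goal)")  # displayable user interface
    return insight
```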

Description:
ACTIVITY INSIGHT MICRO-ENGINE

FIELD

The present invention relates generally to electrical and electronic hardware, computer software, wired and wireless network communications, and computing devices. More specifically, an insight micro-engine is configured to present insights relating to a detected activity.

BACKGROUND

With the advent of greater computing capabilities in smaller personal and/or portable form factors and an increasing number of applications (i.e., computer and Internet software or programs) for different uses, consumers (i.e., users) have access to large amounts of personal data. Information and data are often readily available, but poorly captured using conventional data capture devices. Conventional devices typically lack capabilities that can capture, analyze, communicate, or use data in a contextually-meaningful, comprehensive, and efficient manner. Further, conventional solutions are often limited to specific individual purposes or uses, demanding that users invest in multiple devices in order to perform different activities (e.g., a sports watch for tracking time and distance, a GPS receiver for monitoring a hike or run, a cyclometer for gathering cycling data, and others). Although a wide range of data and information is available, conventional devices and applications fail to provide effective solutions that comprehensively capture data for a given user across numerous disparate activities.

Some conventional solutions combine a small number of discrete functions. Functionality for data capture, processing, storage, or communication in conventional devices such as a watch or timer with a heart rate monitor or global positioning system ("GPS") receiver are available conventionally, but are expensive to manufacture and purchase. Other conventional solutions for combining personal data capture facilities often present numerous design and manufacturing problems such as size restrictions, specialized materials requirements, lowered tolerances for defects such as pits or holes in coverings for water-resistant or waterproof devices, unreliability, higher failure rates, increased manufacturing time, and expense. Subsequently, conventional devices such as fitness watches, heart rate monitors, GPS-enabled fitness monitors, health monitors (e.g., diabetic blood sugar testing units), digital voice recorders, pedometers, altimeters, and other conventional personal data capture devices are generally manufactured for conditions that occur in a single or small groupings of activities. Problematically, though, conventional devices do not provide effective solutions to users in terms of providing a comprehensive view of one's overall health or wellness as a result of a combined analysis of data gathered. This is a limiting aspect of the commercial attraction of the various types of conventional devices listed above.

Generally, if the number of activities performed by conventional personal data capture devices increases, there is a corresponding rise in design and manufacturing requirements that results in significant consumer expense, which eventually becomes prohibitive to both investment and commercialization. Further, conventional manufacturing techniques are often limited and ineffective at meeting increased requirements to protect sensitive hardware, circuitry, and other components that are susceptible to damage, but which are required to perform various personal data capture activities. As a conventional example, sensitive electronic components such as printed circuit board assemblies ("PCBA"), sensors, and computer memory (hereafter "memory") can be significantly damaged or destroyed during manufacturing processes where overmoldings or layering of protective material occurs using techniques such as injection molding, cold molding, and others. Damaged or destroyed items subsequently raise the cost of goods sold and can deter not only investment and commercialization, but also innovation in data capture and analysis technologies, which are highly compelling fields of opportunity.

Thus, what is needed is a solution for data capture devices without the limitations of conventional techniques.

BRIEF DESCRIPTION OF THE DRAWINGS

Various embodiments or examples ("examples") of the invention are disclosed in the following detailed description and the accompanying drawings:

FIG. 1 illustrates an exemplary data-capable band system;

FIG. 2 illustrates a block diagram of an exemplary data-capable band;

FIG. 3 illustrates sensors for use with an exemplary data-capable band;

FIG. 4 illustrates an application architecture for an exemplary data-capable band;

FIG. 5A illustrates representative data types for use with an exemplary data-capable band;

FIG. 5B illustrates representative data types for use with an exemplary data-capable band in fitness-related activities;

FIG. 5C illustrates representative data types for use with an exemplary data-capable band in sleep management activities;

FIG. 5D illustrates representative data types for use with an exemplary data-capable band in medical-related activities;

FIG. 5E illustrates representative data types for use with an exemplary data-capable band in social media/networking-related activities;

FIG. 6 illustrates an exemplary communications device system implemented with multiple exemplary data-capable bands;

FIG. 7 illustrates an exemplary wellness tracking system for use with or within a distributed wellness application;

FIG. 8 illustrates representative calculations executed by an exemplary conversion module to determine an aggregate value for producing a graphical representation of a user's wellness;

FIG. 9 illustrates an exemplary process for generating and displaying a graphical representation of a user's wellness based upon the user's activities;

FIG. 10 illustrates an exemplary graphical representation of a user's wellness over a time period;

FIG. 11 illustrates another exemplary graphical representation of a user's wellness over a time period;

FIGS. 12A-12F illustrate exemplary wireframes of exemplary webpages associated with a wellness marketplace portal;

FIG. 13 illustrates an exemplary computer system suitable for implementation of a wellness application and use with a data-capable band;

FIG. 14 depicts an example of an aggregation engine, according to some examples;

FIG. 15 is an example of a flow for presenting insights based on locally-derived sensor data, according to some embodiments;

FIG. 16 is a functional block diagram depicting an example of an insight generated by an insight micro-engine according to some embodiments; and

FIG. 17 is a functional block diagram depicting an interface for transitioning an insight micro-engine into a specific mode of operation, according to some embodiments.

DETAILED DESCRIPTION

Various embodiments or examples may be implemented in numerous ways, including as a system, a process, an apparatus, a user interface, or a series of program instructions on a computer readable medium such as a computer readable storage medium or a computer network where the program instructions are sent over optical, electronic, or wireless communication links. In general, operations of disclosed processes may be performed in an arbitrary order, unless otherwise provided in the claims. A detailed description of one or more examples is provided below along with accompanying figures. The detailed description is provided in connection with such examples, but is not limited to any particular example. The scope is limited only by the claims and numerous alternatives, modifications, and equivalents are encompassed. Numerous specific details are set forth in the following description in order to provide a thorough understanding. These details are provided for the purpose of example and the described techniques may be practiced according to the claims without some or all of these specific details. For clarity, technical material that is known in the technical fields related to the examples has not been described in detail to avoid unnecessarily obscuring the description.

FIG. 1 illustrates an exemplary data-capable band system. Here, system 100 includes network 102, bands 104-112, server 114, mobile computing device 116, mobile communications device 118, computer 120, laptop 122, and distributed sensor 124. Bands 104-112 may be implemented as data-capable devices that may be worn as a strap or band around an arm, leg, ankle, or other bodily appendage or feature. In other examples, bands 104-112 may be attached directly or indirectly to other items, organic or inorganic, animate, or static. In still other examples, bands 104-112 may be used differently.

As described above, bands 104-112 may be implemented as wearable personal data or data capture devices (e.g., data-capable devices) that are worn by a user around a wrist, ankle, arm, ear, or other appendage, or attached to the body or affixed to clothing. One or more facilities, sensing elements, or sensors, both active and passive, may be implemented as part of bands 104-112 in order to capture various types of data from different sources. Temperature, environmental, temporal, motion, electronic, electrical, chemical, or other types of sensors (including those described below in connection with FIG. 3) may be used in order to gather varying amounts of data, which may be configurable by a user, locally (e.g., using user interface facilities such as buttons, switches, motion-activated/detected command structures (e.g., accelerometer-gathered data from user-initiated motion of bands 104-112), and others) or remotely (e.g., entering rules or parameters in a website or graphical user interface ("GUI") that may be used to modify control systems or signals in firmware, circuitry, hardware, and software implemented (i.e., installed) on bands 104-112). Bands 104-112 may also be implemented as data-capable devices that are configured for data communication using various types of communications infrastructure and media, as described in greater detail below. Bands 104-112 may also be wearable, personal, non-intrusive, lightweight devices that are configured to gather large amounts of personally relevant data that can be used to improve user health, fitness levels, medical conditions, athletic performance, sleeping physiology, and physiological conditions, or used as a sensory-based user interface ("UI") to signal social-related notifications specifying the state of the user through vibration, heat, lights or other sensory based notifications. For example, a social-related notification signal indicating a user is on-line can be transmitted to a recipient, who in turn, receives the notification as, for instance, a vibration.

Using data gathered by bands 104-112, applications may be used to perform various analyses and evaluations that can generate information as to a person's physical (e.g., healthy, sick, weakened, or other states, or activity level), emotional, or mental state (e.g., an elevated body temperature or heart rate may indicate stress, a lowered heart rate and skin temperature, or reduced movement (e.g., excessive sleeping), may indicate physiological depression caused by exertion or other factors, chemical data gathered from evaluating outgassing from the skin's surface may be analyzed to determine whether a person's diet is balanced or if various nutrients are lacking, salinity detectors may be evaluated to determine if high, low, or proper blood sugar levels are present for diabetes management, and others). Generally, bands 104-112 may be configured to gather data from sensors locally and remotely.
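
As a loose illustration of the kind of rule-based evaluation described above, the following sketch maps a few readings to a coarse state. The thresholds and labels are invented for the example only; they are not clinical guidance and not the analyses used by the described applications.

```python
def assess_state(heart_rate_bpm, skin_temp_c, movement_index):
    """Illustrative, non-clinical heuristic mapping a few readings to a coarse state."""
    if heart_rate_bpm > 100 and skin_temp_c > 37.5:
        return "possible stress or exertion"
    if heart_rate_bpm < 55 and movement_index < 0.1:
        return "resting or asleep"
    return "typical activity"
```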

As an example, band 104 may capture (i.e., record, store, communicate (i.e., send or receive), process, or the like) data from various sources (i.e., sensors that are organic (i.e., installed, integrated, or otherwise implemented with band 104) or distributed (e.g., microphones on mobile computing device 116, mobile communications device 118, computer 120, laptop 122, distributed sensor 124, global positioning system ("GPS") satellites, or others, without limitation)) and exchange data with one or more of bands 106-112, server 114, mobile computing device 116, mobile communications device 118, computer 120, laptop 122, and distributed sensor 124. As shown here, a local sensor may be one that is incorporated, integrated, or otherwise implemented with bands 104-112. A remote or distributed sensor (e.g., mobile computing device 116, mobile communications device 118, computer 120, laptop 122, or, generally, distributed sensor 124) may be a sensor that can be accessed, controlled, or otherwise used by bands 104-112. For example, band 112 may be configured to control devices that are also controlled by a given user (e.g., mobile computing device 116, mobile communications device 118, computer 120, laptop 122, and distributed sensor 124). For example, a microphone in mobile communications device 118 may be used to detect, for example, ambient audio data that is used to help identify a person's location, or an ear clip (e.g., a headset as described below) affixed to an ear may be used to record pulse or blood oxygen saturation levels. Additionally, a sensor implemented with a screen on mobile computing device 116 may be used to read a user's temperature or obtain a biometric signature while a user is interacting with data. A further example may include using data that is observed on computer 120 or laptop 122 that provides information as to a user's online behavior and the type of content that she is viewing, which may be used by bands 104-112. Regardless of the type or location of sensor used, data may be transferred to bands 104-112 by using, for example, an analog audio jack, a digital adapter (e.g., USB, mini-USB), or another type of plug or connector, without limitation, that may be used to physically couple bands 104-112 to another device or system for transferring data and, in some examples, to provide power to recharge a battery (not shown). Alternatively, a wireless data communication interface or facility (e.g., a wireless radio that is configured to communicate data from bands 104-112 using one or more data communication protocols (e.g., IEEE 802.11a/b/g/n (WiFi), WiMax, ANT™, ZigBee®, Bluetooth®, Near Field Communications ("NFC"), and others)) may be used to receive or transfer data. Further, bands 104-112 may be configured to analyze, evaluate, modify, or otherwise use data gathered, either directly or indirectly.

In some examples, bands 104-112 may be configured to share data with each other or with an intermediary facility, such as a database, website, web service, or the like, which may be implemented by server 114. In some embodiments, server 114 can be operated by a third party providing, for example, social media-related services. Bands 104-112 and other related devices may exchange data with each other directly, or bands 104-112 may exchange data via a third party server, such as a third party like Facebook®, to provide social-media related services. Examples of other third party servers include those implemented by social networking services, including, but not limited to, services such as Yahoo! IM™, GTalk™, MSN Messenger™, Twitter® and other private or public social networks. The exchanged data may include personal physiological data and data derived from sensory-based user interfaces ("UI"). Server 114, in some examples, may be implemented using one or more processor-based computing devices or networks, including computing clouds, storage area networks ("SAN"), or the like. As shown, bands 104-112 may be used as a personal data or area network (e.g., "PDN" or "PAN") in which data relevant to a given user or band (e.g., one or more of bands 104-112) may be shared. As shown here, bands 104 and 112 may be configured to exchange data with each other over network 102 or indirectly using server 114. Users of bands 104 and 112 may direct a web browser hosted on a computer (e.g., computer 120, laptop 122, or the like) in order to access, view, modify, or perform other operations with data captured by bands 104 and 112. For example, two runners using bands 104 and 112 may be geographically remote (e.g., users are not geographically in close proximity locally such that bands being used by each user are in direct data communication), but wish to share data regarding their race times (pre, post, or in-race), personal records (i.e., "PR"), target split times, results, performance characteristics (e.g., target heart rate, target VO2 max, and others), and other information. If both runners (i.e., bands 104 and 112) are engaged in a race on the same day, data can be gathered for comparative analysis and other uses. Further, data can be shared in substantially real-time (taking into account any latencies incurred by data transfer rates, network topologies, or other data network factors) as well as uploaded after a given activity or event has been performed. In other words, data can be captured by the user as it is worn and configured to transfer data using, for example, a wireless network connection (e.g., a wireless network interface card, wireless local area network ("LAN") card, cell phone, or the like). Data may also be shared in a temporally asynchronous manner in which a wired data connection (e.g., an analog audio plug (and associated software or firmware) configured to transfer digitally encoded data to encoded audio data that may be transferred between bands 104-112 and a plug configured to receive, encode/decode, and process data exchanged) may be used to transfer data from one or more bands 104-112 to various destinations (e.g., another of bands 104-112, server 114, mobile computing device 116, mobile communications device 118, computer 120, laptop 122, and distributed sensor 124). Bands 104-112 may be implemented with various types of wired and/or wireless communication facilities and are not intended to be limited to any specific technology.
For example, data may be transferred from bands 104-112 using an analog audio plug (e.g., TRRS, TRS, or others). In other examples, wireless communication facilities using various types of data communication protocols (e.g., WiFi, Bluetooth®, ZigBee®, ANT™, and others) may be implemented as part of bands 104-112, which may include circuitry, firmware, hardware, radios, antennas, processors, microprocessors, memories, or other electrical, electronic, mechanical, or physical elements configured to enable data communication capabilities of various types and characteristics.

As data-capable devices, bands 104-112 may be configured to collect data from a wide range of sources, including onboard (not shown) and distributed sensors (e.g., server 114, mobile computing device 116, mobile communications device 118, computer 120, laptop 122, and distributed sensor 124) or other bands. Some or all data captured may be personal, sensitive, or confidential and various techniques for providing secure storage and access may be implemented. For example, various types of security protocols and algorithms may be used to encode data stored or accessed by bands 104-112. Examples of security protocols and algorithms include authentication, encryption, encoding, private and public key infrastructure, passwords, checksums, hash codes and hash functions (e.g., SHA, SHA-1, MD-5, and the like), or others may be used to prevent undesired access to data captured by bands 104-112. In other examples, data security for bands 104-112 may be implemented differently.
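
One minimal way to make tampering with stored records detectable is sketched below using Python's standard hmac and hashlib modules. The record layout and key handling are assumptions made for the example; the description above lists SHA, SHA-1, and MD-5 among other possible algorithms, and a real design could use encryption in addition to integrity tags.

```python
import hashlib
import hmac
import json

def seal_record(record, key):
    """Attach an HMAC-SHA-256 tag so later tampering with the stored data is detectable."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": record, "tag": tag}

def verify_record(sealed, key):
    """Recompute the tag and compare it in constant time."""
    payload = json.dumps(sealed["payload"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sealed["tag"])
```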

Bands 104-112 may be used as personal wearable, data capture devices that, when worn, are configured to identify a specific, individual user. By evaluating captured data such as motion data from an accelerometer, biometric data such as heart rate, skin galvanic response, and other biometric data, and using long-term analysis techniques (e.g., software packages or modules of any type, without limitation), a user may have a unique pattern of behavior or motion and/or biometric responses that can be used as a signature for identification. For example, bands 104-112 may gather data regarding an individual person's gait or other unique biometric, physiological or behavioral characteristics. Using, for example, distributed sensor 124, a biometric signature (e.g., fingerprint, retinal or iris vascular pattern, or others) may be gathered and transmitted to bands 104-112 that, when combined with other data, determines that a given user has been properly identified and, as such, authenticated. When bands 104-112 are worn, a user may be identified and authenticated to enable a variety of other functions such as accessing or modifying data, enabling wired or wireless data transmission facilities (i.e., allowing the transfer of data from bands 104-112), modifying functionality or functions of bands 104-112, authenticating financial transactions using stored data and information (e.g., credit card, PIN, card security numbers, and the like), running applications that allow for various operations to be performed (e.g., controlling physical security and access by transmitting a security code to a reader that, when authenticated, unlocks a door by turning off current to an electromagnetic lock, and others), and others. Different functions and operations beyond those described may be performed using bands 104-112, which can act as secure, personal, wearable, data-capable devices. The number, type, function, configuration, specifications, structure, or other features of system 100 and the above-described elements may be varied and are not limited to the examples provided.
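
A toy sketch of matching live motion data against an enrolled signature follows. The two-number feature vector and the tolerance are assumptions chosen to keep the example short; they are not the identification technique described above, which could use far richer gait and biometric features.

```python
import math

def gait_features(accel_samples):
    """Reduce raw (x, y, z) samples to a small feature vector: mean and variance of magnitude."""
    magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_samples]
    mean = sum(magnitudes) / len(magnitudes)
    variance = sum((m - mean) ** 2 for m in magnitudes) / len(magnitudes)
    return (mean, variance)

def matches_enrolled_user(accel_samples, enrolled_template, tolerance=0.15):
    """Accept the wearer when live features fall within a tolerance of the enrolled template."""
    live = gait_features(accel_samples)
    return all(abs(a - b) <= tolerance * max(abs(b), 1e-6)
               for a, b in zip(live, enrolled_template))
```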

FIG. 2 illustrates a block diagram of an exemplary data-capable band. Here, band 200 includes bus 202, processor 204, memory 206, notification facility 208, accelerometer 210, sensor 212, battery 214, and communications facility 216. In some examples, the quantity, type, function, structure, and configuration of band 200 and the elements (e.g., bus 202, processor 204, memory 206, notification facility 208, accelerometer 210, sensor 212, battery 214, and communications facility 216) shown may be varied and are not limited to the examples provided. As shown, processor 204 may be implemented as logic to provide control functions and signals to memory 206, notification facility 208, accelerometer 210, sensor 212, battery 214, and communications facility 216. Processor 204 may be implemented using any type of processor or microprocessor suitable for packaging within bands 104-112 (FIG. 1). Various types of microprocessors may be used to provide data processing capabilities for band 200 and are not limited to any specific type or capability. For example, a MSP430F5528-type microprocessor manufactured by Texas Instruments of Dallas, Texas may be configured for data communication using audio tones and enabling the use of an audio plug-and-jack system (e.g., TRRS, TRS, or others) for transferring data captured by band 200. Further, different processors may be desired if other functionality (e.g., the type and number of sensors (e.g., sensor 212)) are varied. Data processed by processor 204 may be stored using, for example, memory 206.
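
For orientation, a minimal sketch grouping the FIG. 2 elements into one object is shown below. The field names simply mirror the reference numerals above, and the capture loop is an assumption made for illustration, not firmware from the band.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Band:
    """Hypothetical grouping of the FIG. 2 elements, for illustration only."""
    memory: Dict[str, list] = field(default_factory=dict)             # memory 206
    sensors: List[Callable[[], float]] = field(default_factory=list)  # accelerometer 210, sensor 212
    notify: Callable[[str], None] = print                             # notification facility 208
    battery_level: float = 1.0                                        # battery 214

    def capture(self):
        """Processor 204 polls each sensor and stores the readings in memory 206."""
        self.memory.setdefault("samples", []).append([read() for read in self.sensors])
```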

In some examples, memory 206 may be implemented using various types of data storage technologies and standards, including, without limitation, read-only memory ("ROM"), random access memory ("RAM"), dynamic random access memory ("DRAM"), static random access memory ("SRAM"), synchronous dynamic random access memory ("SDRAM"), magnetic random access memory ("MRAM"), solid state, two and three-dimensional memories, Flash®, and others. Memory 206 may also be implemented using one or more partitions that are configured for multiple types of data storage technologies to allow for non-modifiable (i.e., by a user) software to be installed (e.g., firmware installed on ROM) while also providing for storage of captured data and applications using, for example, RAM. Once captured and/or stored in memory 206, data may be subjected to various operations performed by other elements of band 200.

Notification facility 208, in some examples, may be implemented to provide vibratory energy, or audio or visual signals, communicated through band 200. As used herein, "facility" refers to any, some, or all of the features and structures that are used to implement a given set of functions. In some examples, the vibratory energy may be implemented using a motor or other mechanical structure. In some examples, the audio signal may be a tone or other audio cue, or it may be implemented using different sounds for different purposes. The audio signals may be emitted directly using notification facility 208, or indirectly by transmission via communications facility 216 to other audio-capable devices (e.g., headphones (not shown), a headset (as described below with regard to FIG. 12), mobile computing device 116, mobile communications device 118, computer 120, laptop 122, distributed sensor 124, etc.). In some examples, the visual signal may be implemented using any available display technology, such as lights, light-emitting diodes (LEDs), interferometric modulator display (IMOD), electrophoretic ink (E Ink), organic light-emitting diode (OLED), or other display technologies. As an example, an application stored on memory 206 may be configured to monitor a clock signal from processor 204 in order to provide timekeeping functions to band 200. For example, if an alarm is set for a desired time, notification facility 208 may be used to provide a vibration or an audio tone, or a series of vibrations or audio tones, when the desired time occurs. As another example, notification facility 208 may be coupled to a framework (not shown) or other structure that is used to translate or communicate vibratory energy throughout the physical structure of band 200. In other examples, notification facility 208 may be implemented differently.
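
The alarm example above might be expressed as follows. The callback names and the durations are assumptions made for the sketch; an implementation would drive the motor and audio hardware through the notification facility.

```python
import datetime

def check_alarm(now, alarm_time, vibrate, beep):
    """Fire a vibration and an audio tone once the desired wake-up time is reached."""
    if (now.hour, now.minute) == (alarm_time.hour, alarm_time.minute):
        vibrate(duration_ms=500)              # vibratory energy from the motor
        beep(freq_hz=880, duration_ms=200)    # audio cue
        return True
    return False

# Usage sketch: check_alarm(datetime.datetime.now(), datetime.time(6, 30), vibrate, beep)
```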

Power may be stored in battery 214, which may be implemented as a battery, battery module, power management module, or the like. Power may also be gathered from local power sources such as solar panels, thermo-electric generators, and kinetic energy generators, among others, which are alternative power sources to external power for a battery. These additional sources can either power the system directly or can charge a battery, which, in turn, is used to power the system (e.g., of a band). In other words, battery 214 may include a rechargeable, expendable, replaceable, or other type of battery, but also circuitry, hardware, or software that may be used in connection with or in lieu of processor 204 in order to provide power management, charge/recharging, sleep, or other functions. Further, battery 214 may be implemented using various types of battery technologies, including Lithium Ion ("LI"), Nickel Metal Hydride ("NiMH"), or others, without limitation. Power drawn as electrical current may be distributed from battery 214 via bus 202, the latter of which may be implemented as deposited or formed circuitry or using other forms of circuits or cabling, including flexible circuitry. Electrical current distributed from battery 214 and managed by processor 204 may be used by one or more of memory 206, notification facility 208, accelerometer 210, sensor 212, or communications facility 216.

As shown, various sensors may be used as input sources for data captured by band 200.

For example, accelerometer 210 may be used to gather data measured across one, two, or three axes of motion. In addition to accelerometer 210, other sensors (i.e., sensor 212) may be implemented to provide temperature, environmental, physical, chemical, electrical, or other types of sensed inputs. As presented here, sensor 212 may include one or multiple sensors and is not intended to be limiting as to the quantity or type of sensor implemented. Data captured by band 200 using accelerometer 210 and sensor 212 or data requested from another source (i.e., outside of band 200) may also be exchanged, transferred, or otherwise communicated using communications facility 216. For example, communications facility 216 may include a wireless radio, control circuit or logic, antenna, transceiver, receiver, transmitter, resistors, diodes, transistors, or other elements that are used to transmit and receive data from band 200. In some examples, communications facility 216 may be implemented to provide a "wired" data communication capability such as an analog or digital attachment, plug, jack, or the like to allow for data to be transferred. In other examples, communications facility 216 may be implemented to provide a wireless data communication capability to transmit digitally encoded data across one or more frequencies using various types of data communication protocols, without limitation. In still other examples, band 200 and the above-described elements may be varied in function, structure, configuration, or implementation and are not limited to those shown and described.
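
To make the data path concrete, a small sketch packing 3-axis samples into a binary frame that a radio or wired link could carry is shown below. The frame layout is an assumption made for the example, not a protocol defined by the band.

```python
import struct

def pack_samples(samples):
    """Pack (x, y, z) accelerometer samples into a compact little-endian frame."""
    frame = struct.pack("<H", len(samples))           # 16-bit sample count header
    for x, y, z in samples:
        frame += struct.pack("<fff", x, y, z)         # one 32-bit float per axis
    return frame

def unpack_samples(frame):
    """Recover the list of (x, y, z) samples from a packed frame."""
    (count,) = struct.unpack_from("<H", frame, 0)
    return [struct.unpack_from("<fff", frame, 2 + 12 * i) for i in range(count)]
```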

FIG. 3 illustrates sensors for use with an exemplary data-capable band. Sensor 212 may be implemented using various types of sensors, some of which are shown. Like-numbered and named elements may describe the same or substantially similar element as those shown in other descriptions. Here, sensor 212 (FIG. 2) may be implemented as accelerometer 302, altimeter/barometer 304, light/infrared ("IR") sensor 306, pulse/heart rate ("HR") monitor 308, audio sensor (e.g., microphone, transducer, or others) 310, pedometer 312, velocimeter 314, GPS receiver 316, location-based service sensor (e.g., sensor for determining location within a cellular or micro-cellular network, which may or may not use GPS or other satellite constellations for fixing a position) 318, motion detection sensor 320, environmental sensor 322, chemical sensor 324, electrical sensor 326, or mechanical sensor 328.

As shown, accelerometer 302 may be used to capture data associated with motion detection along 1, 2, or 3-axes of measurement, without limitation to any specific type or specification of sensor. Accelerometer 302 may also be implemented to measure various types of user motion and may be configured based on the type of sensor, firmware, software, hardware, or circuitry used. As another example, altimeter/barometer 304 may be used to measure environmental pressure, atmospheric or otherwise, and is not limited to any specification or type of pressure-reading device. In some examples, altimeter/barometer 304 may be an altimeter, a barometer, or a combination thereof. For example, altimeter/barometer 304 may be implemented as an altimeter for measuring above ground level ("AGL") pressure in band 200, which has been configured for use by naval or military aviators. As another example, altimeter/barometer 304 may be implemented as a barometer for reading atmospheric pressure for marine-based applications. In other examples, altimeter/barometer 304 may be implemented differently.
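
As a worked example of how a barometer reading maps to altitude, the standard barometric approximation is sketched below; the sea-level reference pressure is an assumption that a device would normally calibrate or receive from a weather source.

```python
def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Standard barometric approximation: pressure in hPa to altitude in meters."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# Usage sketch: pressure_to_altitude_m(900.0) is roughly 988 m above sea level.
```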

Other types of sensors that may be used to measure light or photonic conditions include light/IR sensor 306, motion detection sensor 320, and environmental sensor 322, the latter of which may include any type of sensor for capturing data associated with environmental conditions beyond light. Further, motion detection sensor 320 may be configured to detect motion using a variety of techniques and technologies, including, but not limited to comparative or differential light analysis (e.g., comparing foreground and background lighting), sound monitoring, or others. Audio sensor 310 may be implemented using any type of device configured to record or capture sound.

In some examples, pedometer 312 may be implemented using devices to measure various types of data associated with pedestrian-oriented activities such as running or walking. Footstrikes, stride length or interval, time, and other data may be measured. Velocimeter 314 may be implemented, in some examples, to measure velocity (e.g., speed and directional vectors) without limitation to any particular activity. Further, additional sensors that may be used as sensor 212 include those configured to identify or obtain location-based data. For example, GPS receiver 316 may be used to obtain coordinates of the geographic location of band 200 using, for example, various types of signals transmitted by civilian and/or military satellite constellations in low, medium, or high earth orbit (e.g., "LEO," "MEO," or "GEO"). In other examples, differential GPS algorithms may also be implemented with GPS receiver 316, which may be used to generate more precise or accurate coordinates. Still further, location-based services sensor 318 may be implemented to obtain location-based data including, but not limited to location, nearby services or items of interest, and the like. As an example, location-based services sensor 318 may be configured to detect an electronic signal, encoded or otherwise, that provides information regarding a physical locale as band 200 passes. The electronic signal may include, in some examples, encoded data regarding the location and information associated therewith. Electrical sensor 326 and mechanical sensor 328 may be configured to include other types (e.g., haptic, kinetic, piezoelectric, piezomechanical, pressure, touch, thermal, and others) of sensors for data input to band 200, without limitation. Other types of sensors apart from those shown may also be used, including magnetic flux sensors such as solid-state compasses and the like, including gyroscopic sensors. While the present illustration provides numerous examples of types of sensors that may be used with band 200 (FIG. 2), others not shown or described may be implemented with or as a substitute for any sensor shown or described.
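
A small sketch of how pedometer counts and a calibrated stride length might be turned into distance and pace figures follows; the stride-length calibration itself is outside the example, and the function names are assumptions.

```python
def walk_metrics(step_count, stride_length_m, elapsed_s):
    """Derive distance, speed, and pace from a step count and a calibrated stride length."""
    distance_m = step_count * stride_length_m
    speed_mps = distance_m / elapsed_s if elapsed_s else 0.0
    pace_min_per_km = (1000.0 / speed_mps) / 60.0 if speed_mps else float("inf")
    return {"distance_m": distance_m,
            "speed_mps": speed_mps,
            "pace_min_per_km": pace_min_per_km}

# Usage sketch: 6000 steps at a 0.75 m stride over 3600 s is 4.5 km at about 13.3 min/km.
```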

FIG. 4 illustrates an application architecture for an exemplary data-capable band. Here, application architecture 400 includes bus 402, logic module 404, communications module 406, security module 408, interface module 410, data management 412, audio module 414, motor controller 416, service management module 418, sensor input evaluation module 420, and power management module 422. In some examples, application architecture 400 and the above-listed elements (e.g., bus 402, logic module 404, communications module 406, security module 408, interface module 410, data management 412, audio module 414, motor controller 416, service management module 418, sensor input evaluation module 420, and power management module 422) may be implemented as software using various computer programming and formatting languages such as Java, C++, C, and others. As shown here, logic module 404 may be firmware or application software that is installed in memory 206 (FIG. 2) and executed by processor 204 (FIG. 2). Included with logic module 404 may be program instructions or code (e.g., source, object, binary executables, or others) that, when initiated, called, or instantiated, perform various functions.

For example, logic module 404 may be configured to send control signals to communications module 406 in order to transfer, transmit, or receive data stored in memory 206, the latter of which may be managed by a database management system ("DBMS") or utility in data management module 412. As another example, security module 408 may be controlled by logic module 404 to provide encoding, decoding, encryption, authentication, or other functions to band 200 (FIG. 2). Alternatively, security module 408 may also be implemented as an application that, using data captured from various sensors and stored in memory 206 (and accessed by data management module 412) may be used to provide identification functions that enable band 200 to passively identify a user or wearer of band 200. Still further, various types of security software and applications may be used and are not limited to those shown and described.

Interface module 410, in some examples, may be used to manage user interface controls such as switches, buttons, or other types of controls that enable a user to manage various functions of band 200. For example, a 4-position switch may be turned to a given position that is interpreted by interface module 410 to determine the proper signal or feedback to send to logic module 404 in order to generate a particular result. In other examples, a button (not shown) may be depressed that allows a user to trigger or initiate certain actions by sending another signal to logic module 404. Still further, interface module 410 may be used to interpret data from, for example, accelerometer 210 (FIG. 2) to identify specific movement or motion that initiates or triggers a given response. In other examples, interface module 410 may be used to manage different types of displays (e.g., LED, IMOD, E Ink, OLED, etc.). In other examples, interface module 410 may be implemented differently in function, structure, or configuration and is not limited to those shown and described.

As shown, audio module 414 may be configured to manage encoded or unencoded data gathered from various types of audio sensors. In some examples, audio module 414 may include one or more codecs that are used to encode or decode various types of audio waveforms. For example, analog audio input may be encoded by audio module 414 and, once encoded, sent as a signal or collection of data packets, messages, segments, frames, or the like to logic module 404 for transmission via communications module 406. In other examples, audio module 414 may be implemented differently in function, structure, configuration, or implementation and is not limited to those shown and described.

Other elements that may be used by band 200 include motor controller 416, which may be firmware or an application to control a motor or other vibratory energy source (e.g., notification facility 208 (FIG. 2)). Power used for band 200 may be drawn from battery 214 (FIG. 2) and managed by power management module 422, which may be firmware or an application used to manage, with or without user input, how power is consumed, conserved, or otherwise used by band 200 and the above-described elements, including one or more sensors (e.g., sensor 212 (FIG. 2), sensors 302-328 (FIG. 3)).

With regard to data captured, sensor input evaluation module 420 may be a software engine or module that is used to evaluate and analyze data received from one or more inputs (e.g., sensors 302-328) to band 200. When received, data may be analyzed by sensor input evaluation module 420, which may include custom or "off-the-shelf" analytics packages that are configured to provide application-specific analysis of data to determine trends, patterns, and other useful information. In other examples, sensor input evaluation module 420 may also include firmware or software that enables the generation of various types and formats of reports for presenting data and any analysis performed thereupon.
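
As a toy example of the kind of trend analysis such a module might perform, the following compares two adjacent moving averages over a series of readings. The window and threshold are arbitrary assumptions for the sketch, not parameters of the described analytics packages.

```python
def detect_trend(readings, window=5, threshold=0.05):
    """Flag a rising or falling trend by comparing two adjacent moving averages."""
    if len(readings) < 2 * window:
        return "insufficient data"
    recent = sum(readings[-window:]) / window
    earlier = sum(readings[-2 * window:-window]) / window
    if earlier == 0:
        return "flat"
    change = (recent - earlier) / abs(earlier)
    if change > threshold:
        return "rising"
    if change < -threshold:
        return "falling"
    return "flat"
```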

Another element of application architecture 400 that may be included is service management module 418. In some examples, service management module 418 may be firmware, software, or an application that is configured to manage various aspects and operations associated with executing software-related instructions for band 200. For example, libraries or classes that are used by software or applications on band 200 may be served from an online or networked source. Service management module 418 may be implemented to manage how and when these services are invoked in order to ensure that desired applications are executed properly within application architecture 400. As discrete sets, collections, or groupings of functions, services used by band 200 for various purposes ranging from communications to operating systems to call or document libraries may be managed by service management module 418. Alternatively, service management module 418 may be implemented differently and is not limited to the examples provided herein. Further, application architecture 400 is an example of a software/system/application-level architecture that may be used to implement various software-related aspects of band 200 and may be varied in the quantity, type, configuration, function, structure, or type of programming or formatting languages used, without limitation to any given example.

FIG. 5A illustrates representative data types for use with an exemplary data-capable band. Here, wearable device 502 may capture various types of data, including, but not limited to sensor data 504, manually-entered data 506, application data 508, location data 510, network data 512, system/operating data 514, and user data 516. Various types of data may be captured from sensors, such as those described above in connection with FIG. 3. Manually-entered data, in some examples, may be data or inputs received directly and locally by band 200 (FIG. 2). In other examples, manually-entered data may also be provided through a third-party website that stores the data in a database and may be synchronized from server 114 (FIG. 1) with one or more of bands 104-112. Other types of data that may be captured include application data 508 and system/operating data 514, which may be associated with firmware, software, or hardware installed or implemented on band 200. Further, location data 510 may be used by wearable device 502, as described above. User data 516, in some examples, may be data that includes profile data, preferences, rules, or other information that has been previously entered by a given user of wearable device 502. Further, network data 512 may be data that is captured by wearable device 502 with regard to routing tables, data paths, network or access availability (e.g., wireless network access availability), and the like. Other types of data may be captured by wearable device 502 and are not limited to the examples shown and described. Additional context-specific examples of types of data captured by bands 104-112 (FIG. 1) are provided below.

FIG. 5B illustrates representative data types for use with an exemplary data-capable band in fitness-related activities. Here, band 519 may be configured to capture types (i.e., categories) of data such as heart rate/pulse monitoring data 520, blood oxygen saturation data 522, skin temperature data 524, salinity/emission/outgassing data 526, location/GPS data 528, environmental data 530, and accelerometer data 532. As an example, a runner may use or wear band 519 to obtain data associated with his physiological condition (i.e., heart rate/pulse monitoring data 520, skin temperature, salinity/emission/outgassing data 526, among others), athletic efficiency (i.e., blood oxygen saturation data 522), and performance (i.e., location/GPS data 528 (e.g., distance or laps run), environmental data 530 (e.g., ambient temperature, humidity, pressure, and the like), accelerometer 532 (e.g., biomechanical information, including gait, stride, stride length, among others)). Other or different types of data may be captured by band 519, but the above-described examples are illustrative of some types of data that may be captured by band 519. Further, data captured may be uploaded to a website or online/networked destination for storage and other uses. For example, fitness-related data may be used by applications that are downloaded from a "fitness marketplace" or "wellness marketplace," where athletes, or other users, may find, purchase, or download applications, products, information, etc., for various uses, as well as share information with other users. Some applications may be activity-specific and thus may be used to modify or alter the data capture capabilities of band 519 accordingly. For example, a fitness marketplace may be a website accessible by various types of mobile and non-mobile clients to locate applications for different exercise or fitness categories such as running, swimming, tennis, golf, baseball, football, fencing, and many others. When downloaded, applications from a fitness marketplace may also be used with user-specific accounts to manage the retrieved applications as well as usage with band 519, or to use the data to provide services such as online personal coaching or targeted advertisements. More, fewer, or different types of data may be captured for fitness-related activities.

In some examples, applications may be developed using various types of schema, including using a software development kit or providing requirements in a proprietary or open source software development regime. Applications may also be developed by using an application programming interface to an application marketplace in order for developers to design and build applications that can be downloaded on wearable devices (e.g., bands 104-106 (FIG. 1)). Alternatively, applications can be developed for download and installation on devices that may be in data communication over a shared data link or network connection, wired or wireless. For example, an application may be downloaded onto mobile computing device 116 (FIG. 1) from server 114 (FIG. 1), which may then be installed and executed using data gathered from one or more sensors on band 104. Analysis, evaluation, or other operations performed on data gathered by an application downloaded from server 114 may be presented (i.e., displayed) on a graphical user interface (e.g., a micro web browser, WAP web browser, Java/JavaScript-based web browser, and others, without limitation) on mobile computing device 116 or any other type of client. Users may, in some examples, search, find, retrieve, download, purchase, or otherwise obtain applications for various types of purposes from an application marketplace. Applications may be configured for various types of purposes and categories, without limitation. Examples of types of purposes include running, swimming, trail running, diabetic management, dietary, weight management, sleep management, caloric burn rate tracking, activity tracking, and others, without limitation. Examples of categories of applications may include fitness, wellness, health, medical, and others, without limitation. In other examples, applications for distribution via a marketplace or other download website or source may be implemented differently and are not limited to those described.

FIG. 5C illustrates representative data types for use with an exemplary data-capable band in sleep management activities. Here, band 539 may be used for sleep management purposes to track various types of data, including heart rate monitoring data 540, motion sensor data 542, accelerometer data 544, skin resistivity data 546, user input data 548, clock data 550, and audio data 552. In some examples, heart rate monitor data 540 may be captured to evaluate rest, waking, or various states of sleep. Motion sensor data 542 and accelerometer data 544 may be used to determine whether a user of band 539 is experiencing a restful or fitful sleep. For example, some motion sensor data 542 may be captured by a light sensor that measures ambient or differential light patterns in order to determine whether a user is sleeping on her front, side, or back. Accelerometer data 544 may also be captured to determine whether a user is experiencing gentle or violent disruptions when sleeping, such as those often found in afflictions of sleep apnea or other sleep disorders. Further, skin resistivity data 546 may be captured to determine whether a user is ill (e.g., running a temperature, sweating, experiencing chills, clammy skin, and others). Still further, user input data may include data input by a user as to how and whether band 539 should trigger notification facility 208 (FIG. 2) to wake a user at a given time or whether to use a series of increasing or decreasing vibrations or audio tones to trigger a waking state. Clock data (550) may be used to measure the duration of sleep or a finite period of time in which a user is at rest. Audio data may also be captured to determine whether a user is snoring and, if so, the frequencies and amplitude therein may suggest physical conditions that a user may be interested in knowing (e.g., snoring, breathing interruptions, talking in one's sleep, and the like). More, fewer, or different types of data may be captured for sleep management-related activities.
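
To illustrate one way restful versus fitful sleep might be summarized from motion counts, a minimal sketch follows. The per-minute counts, the restlessness threshold, and the 15% cutoff are assumptions made for the example, not the band's actual sleep scoring.

```python
def score_sleep(motion_counts_per_minute, restless_threshold=20):
    """Classify each minute as still or restless and summarize the night."""
    restless = [count for count in motion_counts_per_minute if count > restless_threshold]
    restless_fraction = len(restless) / max(len(motion_counts_per_minute), 1)
    quality = "restful" if restless_fraction < 0.15 else "fitful"
    return {"restless_minutes": len(restless),
            "restless_fraction": restless_fraction,
            "quality": quality}
```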

FIG. 5D illustrates representative data types for use with an exemplary data-capable band in medical-related activities. Here, band 539 may also be configured for medical purposes and related types of data such as heart rate monitoring data 560, respiratory monitoring data 562, body temperature data 564, blood sugar data 566, chemical protein/analysis data 568, patient medical records data 570, and healthcare professional (e.g., doctor, physician, registered nurse, physician's assistant, dentist, orthopedist, surgeon, and others) data 572. In some examples, data may be captured by band 539 directly from wear by a user. For example, band 539 may be able to sample and analyze sweat through a salinity or moisture detector to identify whether any particular chemicals, proteins, hormones, or other organic or inorganic compounds are present, which can be analyzed by band 539 or communicated to server 114 to perform further analysis. If sent to server 114, further analyses may be performed by a hospital or other medical facility using data captured by band 539. In other examples, more, fewer, or different types of data may be captured for medical-related activities.

FIG. 5E illustrates representative data types for use with an exemplary data-capable band in social media/networking-related activities. Examples of social media/networking-related activities include activities related to Internet-based Social Networking Services ("SNS"), such as Facebook®, Twitter®, etc. Here, band 519, shown with an audio data plug, may be configured to capture data for use with various types of social media and networking-related services, websites, and activities. Accelerometer data 580, manual data 582, other user/friends data 584, location data 586, network data 588, clock/timer data 590, and environmental data 592 are examples of data that may be gathered and shared by, for example, uploading data from band 519 using, for example, an audio plug such as those described herein. As another example, accelerometer data 580 may be captured and shared with other users to share motion, activity, or other movement-oriented data. Manual data 582 may be data that a given user also wishes to share with other users. Likewise, other user/friends data 584 may be from other bands (not shown) that can be shared or aggregated with data captured by band 519. Location data 586 for band 519 may also be shared with other users. In other examples, a user may also enter manual data 582 to prevent other users or friends from receiving updated location data from band 519. Additionally, network data 588 and clock/timer data 590 may be captured and shared with other users to indicate, for example, activities or events that a given user (i.e., wearing band 519) was engaged in at certain locations. Further, if a user of band 519 has friends who are not geographically located in close or near proximity (e.g., the user of band 519 is located in San Francisco and her friend is located in Rome), environmental data can be captured by band 519 (e.g., weather, temperature, humidity, sunny or overcast (as interpreted from data captured by a light sensor and combined with captured data for humidity and temperature), among others). In other examples, more, fewer, or different types of data may be captured for social media/networking-related activities.

FIG. 6 illustrates an exemplary communications device system implemented with multiple exemplary data-capable bands. The exemplary system 600 shows exemplary lines of communication between some of the devices shown in FIG. 1, including network 102, bands 104-110, mobile communications device 118, and laptop 122. In FIG. 6, examples of both peer-to-peer communication and peer-to-hub communication using bands 104-110 are shown. Using these avenues of communication, bands worn by multiple users or wearers (the term "wearer" is used herein to describe a user that is wearing one or more bands) may monitor and compare physical, emotional, or mental states among wearers (e.g., physical competitions, sleep pattern comparisons, resting physical states, etc.).

Peer-to-hub communication may be exemplified by bands 104 and 108, each respectively communicating with mobile communications device 118 or laptop 122, exemplary hub devices. Bands 104 and 108 may communicate with mobile communications device 118 or laptop 122 using any number of known wired communication technologies (e.g., Universal Serial Bus (USB) connections, TRS/TRRS connections, telephone networks, fiber-optic networks, cable networks, etc.). In some examples, bands 104 and 108 may be implemented as lower power or lower energy devices, in which case mobile communications device 118, laptop 122 or other hub devices may act as a gateway to route the data from bands 104 and 108 to software applications on the hub device, or to other devices. For example, mobile communications device 118 may comprise both wired and wireless communication capabilities, and thereby act as a hub to further communicate data received from band 104 to band 110, network 102 or laptop 122, among other devices. Mobile communications device 118 also may comprise software applications that interact with social or professional networking services ("SNS") (e.g., Facebook®, Twitter®, Linkedln®, etc.), for example via network 102, and thereby act also as a hub to further share data received from band 104 with other users of the SNS. Band 104 may communicate with laptop 122, which also may comprise both wired and wireless communication capabilities, and thereby act as a hub to further communicate data received from band 104 to, for example, network 102 or mobile communications device 118, among other devices. Laptop 122 also may comprise software applications that interact with SNS, for example via network 102, and thereby act also as a hub to further share data received from band 104 with other users of the SNS. The software applications on mobile communications device 118 or laptop 122 or other hub devices may further process or analyze the data they receive from bands 104 and 108 in order to present to the wearer, or to other wearers or users of the SNS, useful information associated with the wearer's activities.

In other examples, bands 106 and 110 may also participate in peer-to-hub communications with exemplary hub devices such as mobile communications device 118 and laptop 122. Bands 106 and 110 may communicate with mobile communications device 118 and laptop 122 using any number of wireless communication technologies (e.g., local wireless network, near field communication, Bluetooth®, Bluetooth® low energy, ANT, etc.). Using wireless communication technologies, mobile communications device 118 and laptop 122 may be used as a hub or gateway device to communicate data captured by bands 106 and 110 with other devices, in the same way as described above with respect to bands 104 and 108. Mobile communications device 118 and laptop 122 also may be used as a hub or gateway device to further share data captured by bands 106 and 110 with SNS, in the same way as described above with respect to bands 104 and 108.

Peer-to-peer communication may be exemplified by bands 106 and 110, exemplary peer devices, communicating directly. Band 106 may communicate directly with band 110, and vice versa, using known wireless communication technologies, as described above. Peer-to-peer communication may also be exemplified by communications between bands 104 and 108 and bands 106 and 110 through a hub device, such as mobile communications device 118 or laptop 122.

Alternatively, exemplary system 600 may be implemented with any combination of communication capable devices, such as any of the devices depicted in FIG. 1, communicating with each other using any communication platform, including any of the platforms described above. Persons of ordinary skill in the art will appreciate that the examples of peer-to-hub communication provided herein, and shown in FIG. 6, are only a small subset of the possible implementations of peer-to-hub communications involving the bands described herein.

FIG. 7 illustrates an exemplary wellness tracking system for use with or within a distributed wellness application. System 700 comprises aggregation engine 710, conversion module 720, band 730, band 732, textual input 734, other input 736, and graphical representation 740. Bands 730 and 732 may be implemented as described above. In some examples, aggregation engine 710 may receive input from various sources. For example, aggregation engine 710 may receive sensory input from band 730, band 732, and/or other data-capable bands. This sensory input may include any of the above-described sensory data that may be gathered by data-capable bands. In other examples, aggregation engine 710 may receive other (e.g., manual) input from textual input 734 or other input 736. Textual input 734 and other input 736 may include information that a user types, uploads, or otherwise inputs into an application (e.g., a web application, an iPhone® application, etc.) implemented on any of the data and communications capable devices referenced herein (e.g., computer, laptop, mobile communications device, mobile computing device, etc.). In some examples, aggregation engine 710 may be configured to process (e.g., interpret) the data and information received from band 730, band 732, textual input 734 and other input 736, to determine an aggregate value from which graphical representation 740 may be generated. In an example, system 700 may comprise a conversion module 720, which may be configured to perform calculations to convert the data received from band 730, band 732, textual input 734 and other input 736 into values (e.g., numeric values). Those values may then be aggregated by aggregation engine 710 to generate graphical representation 740. Conversion module 720 may be implemented as part of aggregation engine 710 (as shown), or it may be implemented separately (not shown). In some examples, aggregation engine 710 may be implemented with more or different modules. In other examples, aggregation engine 710 may be implemented with fewer or more input sources. In some examples, graphical representation 740 may be implemented differently, using different facial expressions, or any image or graphic according to any intuitive or predetermined set of graphics indicating various levels and/or aspects of wellness. As described in more detail below, graphical representation 740 may be a richer display comprising more than a single graphic or image (e.g., FIGS. 10 and 11).

In some examples, aggregation engine 710 may receive or gather inputs from one or more sources over a period of time, or over multiple periods of time, and organize those inputs into a database (not shown) or other type of organized form of information storage. In some examples, graphical representation 740 may be a simple representation of a facial expression, as shown. In other examples, graphical representation 740 may be implemented as a richer graphical display comprising inputs gathered over time (e.g., FIGS. 10 and 11 below).

FIG. 8 illustrates representative calculations executed by an exemplary conversion module to determine an aggregate value for producing a graphical representation of a user's wellness. In some examples, conversion module 820 may be configured to process data associated with exercise, data associated with sleep, data associated with eating or food intake, and data associated with other miscellaneous activities (e.g., sending a message to a friend, gifting to a friend, donating, receiving gifts, etc.), and generate values from the data. For example, conversion module 820 may perform calculations using data associated with activities ("activity data") to generate values for types of exercise (e.g., walking, vigorous exercise, not enough exercise, etc.) (810), types of sleep (e.g., deep sleep, no sleep, not enough deep sleep, etc.) (812), types of meals (e.g., a sluggish/heavy meal, a good meal, an energizing meal, etc.) (814), or other miscellaneous activities (e.g., sending a message to a friend, gifting to a friend, donating, receiving gifts, etc.) (816). In some implementations, these values may include positive values for activities that are beneficial to a user's wellness and negative values for activities that are detrimental to a user's wellness, or for lack of activity (e.g., not enough sleep, too many minutes without exercise, etc.). In one example, the values may be calculated using a reference activity. For example, conversion module 820 may equate a step to the numerical value 0.0001, and then equate various other activities to a number of steps (810, 812, 814, 816). Note that while in this example types of sleep 812, types of meals 814, and miscellaneous activities 816 are expressed in numbers of steps, FIG. 8 is not intended to be limiting and shows only one of numerous ways in which to express types of sleep 812, types of meals 814, and miscellaneous activities 816. For example, types of sleep 812, types of meals 814, and miscellaneous activities 816 can correspond to different point values from which one or more scores can be derived to determine aggregate value 830. Similarly, aggregate value 830 can be expressed in terms of points or a score. In some examples, these values may be weighted according to the quality of the activity. For example, each minute of deep sleep equals a higher number of steps than each minute of other sleep (812). As described in more detail below (FIGS. 10 and 11), these values may be modulated by time. For example, positive values for exercise may be modulated by negative values for extended time periods without exercise (810). In another example, positive values for sleep or deep sleep may be modulated by time without sleep or not enough time spent in deep sleep (812). In some examples, conversion module 820 is configured to aggregate these values to generate an aggregate value 830. In some examples, aggregate value 830 may be used by an aggregation engine (e.g., aggregation engine 710 described above) to generate a graphical representation of a user's wellness (e.g., graphical representation 740 described above, FIGS. 10 and 11 described below, or others).
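A minimal, non-limiting sketch of the conversion described above follows (in Python); the step-equivalence constants, weights, and modulation rule are assumptions chosen only to illustrate the arithmetic, not values taken from FIG. 8.

    # Illustrative sketch of a conversion module (cf. FIG. 8). All constants below
    # are hypothetical; a real implementation would tune step equivalences, weights,
    # and modulation rules to the application.

    STEP_VALUE = 0.0001  # numeric value assigned to one step (the reference activity)

    # Assumed step equivalences for non-step activities.
    STEPS_PER_MIN_DEEP_SLEEP = 20    # deep sleep weighted higher than other sleep
    STEPS_PER_MIN_OTHER_SLEEP = 10
    STEPS_PER_GOOD_MEAL = 500
    STEPS_PER_HEAVY_MEAL = -300      # detrimental activities may carry negative values
    STEPS_PER_MISC_ACTIVITY = 100    # e.g., sending a message to a friend

    def convert(steps, deep_sleep_min, other_sleep_min, good_meals, heavy_meals,
                misc_activities, hours_without_exercise):
        """Convert raw activity data into a single aggregate value."""
        step_equivalents = (
            steps
            + deep_sleep_min * STEPS_PER_MIN_DEEP_SLEEP
            + other_sleep_min * STEPS_PER_MIN_OTHER_SLEEP
            + good_meals * STEPS_PER_GOOD_MEAL
            + heavy_meals * STEPS_PER_HEAVY_MEAL
            + misc_activities * STEPS_PER_MISC_ACTIVITY
        )
        # Modulate positive exercise values by extended periods without exercise.
        step_equivalents -= hours_without_exercise * 50
        return step_equivalents * STEP_VALUE  # aggregate value (e.g., 830)

    # Example: 8,123 steps, 90 min deep sleep, 300 min other sleep, one good meal,
    # one heavy meal, two miscellaneous activities, 6 hours without exercise.
    aggregate_value = convert(8123, 90, 300, 1, 1, 2, 6)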

FIG. 9 illustrates an exemplary process for generating and displaying a graphical representation of a user's wellness based upon the user's activities. Process 900 may be implemented as an exemplary process for creating and presenting a graphical representation of a user's wellness. In some examples, process 900 may begin with receiving activity data from a source (902). For example, the source may comprise one of the data-capable bands described herein (e.g., band 730, band 732, etc.). In another example, the source may comprise another type of data and communications capable device, such as those described above (e.g., computer, laptop, mobile communications device, mobile computing device, etc.), which may enable a user to provide activity data via various inputs (e.g., textual input 734, other input 736, etc.). For example, activity data may be received from a data-capable band. In another example, activity data may be received from data manually input using an application user interface via a mobile communications device or a laptop. In other examples, activity data may be received from other sources or combinations of sources. After receiving the activity data, additional activity data is received from another source (904). The other source also may be any of the types of sources described above. Once received, the activity data from the source and the additional activity data from the other source are then used to determine (e.g., by conversion module 720, aggregation engine 710, etc.) an aggregate value (906). Once determined, the aggregate value is used to generate a graphical representation of a user's present wellness (908) (e.g., graphical representation 740 described above, etc.). The aggregate value also may be combined with other information, of the same type or different, to generate a richer graphical representation (e.g., FIGS. 10 and 11 described below, etc.).
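One possible, non-limiting rendering of process 900 is sketched below in Python; the source interface, the conversion function, and the thresholds used to select a facial-expression graphic are hypothetical placeholders.

    # Sketch of process 900: receive activity data from two sources, determine an
    # aggregate value, and select a graphical representation of present wellness.

    def receive_activity_data(source):
        """Placeholder for receiving activity data from a band or manual input (902, 904)."""
        return source.read()  # hypothetical source interface

    def select_graphic(aggregate_value):
        """Map an aggregate value to a simple facial-expression graphic (908)."""
        if aggregate_value >= 1.0:
            return "happy_face.png"    # high wellness (thresholds assumed)
        if aggregate_value >= 0.5:
            return "neutral_face.png"  # moderate wellness
        return "tired_face.png"        # low wellness

    def process_900(source_a, source_b, conversion_module, aggregation_engine):
        data_a = receive_activity_data(source_a)           # 902
        data_b = receive_activity_data(source_b)           # 904
        values = [conversion_module(d) for d in (data_a, data_b)]
        aggregate_value = aggregation_engine(values)       # 906, e.g., sum(values)
        return select_graphic(aggregate_value)             # 908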

In other examples, activity data may be received from multiple sources. These multiple sources may comprise a combination of sources (e.g., a band and a mobile communications device, two bands and a laptop, etc.) (not shown). Such activity data may be accumulated continuously, periodically, or otherwise, over a time period. As activity data is accumulated, the aggregate value may be updated and/or accumulated, and in turn, the graphical representation may be updated. In some examples, as activity data is accumulated and the aggregate value updated and/or accumulated, additional graphical representations may be generated based on the updated or accumulated aggregate value(s). In other examples, the above-described process may be varied in the implementation, order, function, or structure of each or all steps and is not limited to those provided.
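A brief, hypothetical sketch of how an aggregate value and its graphical representation might be updated as activity data accumulates over time:

    # Hypothetical sketch of continuous accumulation: as activity data arrives from
    # any source, the aggregate value is updated and the graphic regenerated.

    class WellnessTracker:
        def __init__(self, conversion_module, select_graphic):
            self.convert = conversion_module
            self.select_graphic = select_graphic
            self.aggregate_value = 0.0

        def on_activity_data(self, activity_data):
            """Called whenever a band or manual input supplies new activity data."""
            self.aggregate_value += self.convert(activity_data)
            return self.select_graphic(self.aggregate_value)  # updated representation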

FIG. 10 illustrates an exemplary graphical representation of a user's wellness over a time period. Here, exemplary graphical representation 1000 shows a user's wellness progress over the course of a partial day. Exemplary graphical representation 1000 may comprise a rich graph displaying multiple vectors of data associated with a user's wellness over time, including a status 1002, a time 1004, alarm graphic 1006, points progress line 1008, points gained for completion of activities 1012-1016, total points accumulated 1010, graphical representations 1030-1034 of a user's wellness at specific times over the time period, activity summary data and analysis over time (1018-1022), and an indication of syncing activity 1024. Here, status 1002 may comprise a brief (e.g., single word) general summary of a user's wellness. In some examples, time 1004 may indicate the current time, or in other examples, it may indicate the time that graphical representation 1000 was generated or last updated. In some other examples, time 1004 may be implemented using different time zones. In still other examples, time 1004 may be implemented differently. In some examples, alarm graphic 1006 may indicate the time that the user's alarm rang, or in other examples, it may indicate the time when a band sensed the user awoke, whether or not an alarm rang. In other examples, alarm graphic 1006 may indicate the time when a user's band began a sequence of notifications to wake up the user (e.g., using notification facility 208, as described above), and in still other examples, alarm graphic 1006 may represent something different. As shown here, graphical representation 1000 may include other graphical representations of the user's wellness at specific times of the day (1030, 1032, 1034), for example, indicating a low level of wellness or low energy level soon after waking up (1030) and a more alert or higher energy or wellness level after some activity (1032, 1034). Graphical representation 1000 may also include displays of various analyses of activity over time. For example, graphical representation 1000 may include graphical representations of the user's sleep (1018), including how many total hours were slept and the quality of sleep (e.g., bars may represent depth of sleep during periods of time). In another example, graphical representation 1000 may include graphical representations of various aspects of a user's exercise level for a particular workout, including the magnitude of the activity level (1020), duration (1020), the number of steps taken (1022), the user's heart rate during the workout (not shown), and still other useful information (e.g., altitude climbed, laps of a pool, number of pitches, etc.). Graphical representation 1000 may further comprise an indication of syncing activity (1024) showing that graphical representation 1000 is being updated to include additional information from a device (e.g., a data-capable band) or application. Graphical representation 1000 may also include indications of a user's total accumulated points 1010, as well as points awarded at certain times for certain activities (1012, 1014, 1016). For example, as shown here, graphical representation 1000 displays that the user has accumulated 2,017 points in total (e.g., over a lifetime, over a set period of time, etc.) (1010).

In some examples, points awarded may be time-dependent or may expire after a period of time. For example, points awarded for eating a good meal may be valid only for a certain period of time. This period of time may be a predetermined period of time, or it may be dynamically determined. In an example where the period of time is dynamically determined, the points may be valid only until the user next feels hunger. In another example where the period of time is dynamically determined, the points may be valid depending on the glycemic load of the meal (e.g., a meal with a low glycemic load may have positive effects that carry over to subsequent meals, whereas a meal with a higher glycemic load may have a positive effect only until the next meal). In some examples, a user's total accumulated points 1010 may reflect that certain points have expired and are no longer valid. In some examples, these points may be used for obtaining various types of rewards, or as virtual or actual currency, for example, in an online wellness marketplace, as described herein (e.g., a fitness marketplace). For example, points may be redeemed for virtual prizes (e.g., for games, challenges, etc.), or physical goods (e.g., products associated with a user's goals or activities, higher level bands, which may be distinguished by different colors, looks and/or features, etc.). In some examples, the points may automatically be tracked by a provider of data-capable bands, such that a prize (e.g., a higher level band) is automatically sent to the user upon reaching a given points threshold without any affirmative action by the user. In other examples, a user may redeem a prize (e.g., a higher level band) from a store. In still other examples, a user may receive deals. These deals or virtual prizes may be received digitally via a data-capable band, a mobile communications device, or otherwise.
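The time-dependent validity of points described above might be modeled as sketched below; the expiry rules, the glycemic-load threshold, and the durations are assumptions for illustration only.

    # Hypothetical sketch of time-dependent points. Each award carries an expiry
    # time; the expiry for a meal may depend on its glycemic load (low-GL meals
    # assumed to carry over longer than high-GL meals).

    import time

    def meal_points(points, glycemic_load, now=None):
        now = now or time.time()
        # Assumed rule: low glycemic load carries over ~8 hours, high ~3 hours.
        validity_seconds = 8 * 3600 if glycemic_load < 55 else 3 * 3600
        return {"points": points, "expires_at": now + validity_seconds}

    def total_valid_points(awards, now=None):
        """Total accumulated points (e.g., 1010), excluding expired awards."""
        now = now or time.time()
        return sum(a["points"] for a in awards if a["expires_at"] > now)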

FIG. 11 illustrates another exemplary graphical representation of a user's wellness over a time period. Here, exemplary graphical representation 1100 shows a summary of a user's wellness progress over the course of a week. Exemplary graphical representation 1100 may comprise a rich graph displaying multiple vectors of data associated with a user's wellness over time, including a status 1102, a time 1104, summary graphical representations 1106-1116 of a user's wellness on each day, points earned each day 1120-1130, total points accumulated 1132, points progress line 1134, an indication of syncing activity 1118, and bars 1136-1140. Here, as with status 1002 in FIG. 10, status 1102 may comprise a brief (e.g., single word) general summary of a user's wellness. In some examples, time 1104 may indicate the current time, or in other examples, it may indicate the time that graphical representation 1100 was generated or last updated. In some other examples, time 1104 may be implemented using different time zones. In still other examples, time 1104 may be implemented differently. As shown here, graphical representation 1100 may include summary graphical representations 1106-1116 of the user's wellness on each day, for example, indicating distress or tiredness on Wednesday (1110) or a positive spike in wellness on Friday (1116). In some examples, summary graphical representations 1106-1116 may indicate a summary wellness for that particular day. In other examples, summary graphical representations 1106-1116 may indicate a cumulative wellness, e.g., at the end of each day. Graphical representation 1100 may further comprise an indication of syncing activity 1118 showing that graphical representation 1100 is being updated to include additional information from a device (e.g., a data-capable band) or application. Graphical representation 1100 may also include indications of a user's total accumulated points 1132, as well as points earned each day 1120-1130. For example, as shown here, graphical representation 1100 displays that the user has accumulated 2,017 points thus far, which includes 325 points earned on Saturday (1130), 263 points earned on Friday (1128), 251 points earned on Thursday (1126), and so on. As described above, these points may be used for obtaining various types of rewards, or as virtual or actual currency, for example, in an online wellness marketplace (e.g., a fitness marketplace as described above). In some examples, graphical representation 1100 also may comprise bars 1136-1140. Each bar may represent an aspect of a user's wellness (e.g., food, exercise, sleep, etc.). In some examples, the bar may display the user's daily progress toward a personal goal for each aspect (e.g., to sleep eight hours, complete sixty minutes of vigorous exercise, etc.). In other examples, the bar may display the user's daily progress toward a standardized goal (e.g., a health and fitness expert's published guidelines, a government agency's published guidelines, etc.), or other types of goals.

FIGs. 12A-12F illustrate exemplary wireframes of exemplary webpages associated with a wellness marketplace. Here, wireframe 1200 comprises navigation 1202, selected page 1204A, sync widget 1216, avatar and goals element 1206, statistics element 1208, information ticker 1210, social feed 1212, check-in/calendar element 1214, deal element 1218, and team summary element 1220. As described above, a wellness marketplace may be implemented as a portal, website or application where users may find, purchase, or download applications, products, information, etc., for various uses, as well as share information with other users (e.g., users with like interests). Here, navigation 1202 comprises buttons and widgets for navigating through various pages of the wellness marketplace, including the selected page 1204A-1204F (e.g., the Home page, Team page, Public page, Move page, Eat page, Live page, etc.) and sync widget 1216. In some examples, sync widget 1216 may be implemented to sync a data-capable band to the user's account on the wellness marketplace. In some examples, the Home page may include avatar and goals element 1206, which may be configured to display a user's avatar and goals. Avatar and goals element 1206 also may enable a user to create an avatar, either by selecting from predetermined avatars, by uploading a user's own picture or graphic, or by other known methods for creating an avatar. Avatar and goals element 1206 also may enable a user to set goals associated with the user's health, eating/drinking habits, exercise, sleep, socializing, or other aspects of the user's wellness. The Home page may further include statistics element 1208, which may be implemented to display statistics associated with the user's wellness (e.g., the graphical representations described above). As shown here, in some examples, statistics element 1208 may be implemented as a dynamic graphical, and even navigable, element (e.g., a video or interactive graphic), wherein a user may view the user's wellness progress over time. In other examples, the statistics element 1208 may be implemented as described above (e.g., FIGS. 10 and 11). The Home page may further include information ticker 1210, which may stream information associated with a user's activities, or other information relevant to the wellness marketplace. The Home page may further include social feed 1212, which may be implemented as a scrolling list of messages or information (e.g., encouragement, news, feedback, recommendations, comments, etc.) from friends, advisors, coaches, or other users. The messages or information may include encouragement, comments, news, recommendations, feedback, achievements, opinions, actions taken by teammates, or other information auto-generated by a wellness application in response to data associated with the user's wellness and activities (e.g., gathered by a data-capable band). In some examples, social feed 1212 may be searchable. In some examples, social feed 1212 may enable a user to filter or select the types of messages or information that show up in the feed (e.g., from the public, only from the team, only from the user, etc.). Social feed 1212 also may be configured to enable a user to select an action associated with each feed message (e.g., cheer, follow, gift, etc.). In some examples, check-in/calendar element 1214 may be configured to allow a user to log their fitness and nutrition. In some examples, check-in/calendar element 1214 also may be configured to enable a user to maintain a calendar. Deal element 1218 may provide a daily deal to the user. The daily deal may be featured for the marketplace, it may be associated with the user's activities, or it may be generated using a variety of known advertising models. Team summary element 1220 may provide summary information about the user's team. As used herein, the term "team" may refer to any group of users that elect to use the wellness marketplace together. In some examples, a user may be part of more than one team. In other examples, a group of users may form different teams for different activities, or they may form a single team that participates in, tracks, and shares information regarding, more than one activity. A Home page may be implemented differently than described here.

Wireframe 1230 comprises an exemplary Team page, which may include a navigation 1202, selected page 1204B, sync widget 1216, team manager element 1228, leaderboard element 1240, comparison element 1242, avatar and goals element 1206A, statistics element 1208A, social feed 1212A, and scrolling member snapshots element 1226. Avatar and goals element 1206A and statistics element 1208A may be implemented as described above with regard to like-numbered or corresponding elements. Navigation 1202, selected page 1204B and sync widget 1216 also may be implemented as described above with regard to like-numbered or corresponding elements. In some examples, team manager element 1228 may be implemented as an area for displaying information, or providing widgets, associated with team management. Access to team manager element 1228 may be restricted, in some examples, or access may be provided to the entire team. Leaderboard element 1240 may be implemented to display leaders in various aspects of an activity in which the team is participating (e.g., various sports, social functions (e.g., clubs), drinking abstinence, etc.). In some examples, leaderboard element 1240 may be implemented to display leaders among various groupings (e.g., site-wide, team only, other users determined to be "like" the user according to certain criteria (e.g., similar activities), etc.). In other examples, leaderboard element 1240 may be organized or filtered by various parameters (e.g., date, demographics, geography, activity level, etc.). Comparison element 1242 may be implemented, in some examples, to provide comparisons regarding a user's performance with respect to an activity, or various aspects of an activity, with the performance of the user's teammates or with the team as a whole (e.g., team average, team median, team favorites, etc.). Scrolling member snapshots element 1226 may be configured to provide brief summary information regarding each of the members of the team in a scrolling fashion. A Team page may be implemented differently than described here.

Wireframe 1250 comprises an exemplary Public page, which may include navigation 1202, selected page 1204C, sync widget 1216, leaderboard element 1240A, social feed 1212B, statistics report engine 1254, comparison element 1242A, and challenge element 1256. Navigation 1202, selected page 1204C and sync widget 1216 may be implemented as described above with regard to like-numbered or corresponding elements. Leaderboard element 1240A also may be implemented as described above with regard to leaderboard element 1240, and in some examples, may display leaders amongst all of the users of the wellness marketplace. Social feed 1212B also may be implemented as described above with regard to social feed 1212 and social feed 1212A. Comparison element 1242A may be implemented as described above with regard to comparison element 1242, and in some examples, may display comparisons of a user's performance of an activity against the performance of all of the other users of the wellness marketplace. Statistics report engine 1254 may generate and display statistical reports associated with various activities being monitored by, and discussed in, the wellness marketplace. In some examples, challenge element 1256 may enable a user to participate in marketplace-wide challenges with other users. In other examples, challenge element 1256 may display the status of, or other information associated with, ongoing challenges among users. A Public page may be implemented differently than described here.

Wireframe 1260 comprises an exemplary Move page, which may include navigation 1202, selected page 1204D, sync widget 1216, leaderboard element 1240B, statistics report engine 1254, comparison element 1242B, search and recommendations element 1272, product sales element 1282, exercise science element 1264, daily movement element 1266, maps element 1280 and titles element 1258. Navigation 1202, selected page 1204D, sync widget 1216, leaderboard element 1240B, statistics report engine 1254, and comparison element 1242B may be implemented as described above with regard to like-numbered or corresponding elements. The Move page may be implemented to include a search and recommendations element 1272, which may be implemented to enable searching of the wellness marketplace. In some examples, in addition to results of the search, recommendations associated with the user's search may be provided to the user. In other examples, recommendations may be provided to the user based on any other data associated with the user's activities, as received by, gathered by, or otherwise input into, the wellness marketplace. Product sales element 1282 may be implemented to display products for sale and provide widgets to enable purchases of products by users. The products may be associated with the user's activities or activity level. Daily movement element 1266 may be implemented to suggest an exercise each day. Maps element 1280 may be implemented to display information associated with the activity of users of the wellness marketplace on a map. In some examples, maps element 1280 may display a percentage of users that are physically active in a geographical region. In other examples, maps element 1280 may display a percentage of users that have eaten well over a particular time period (e.g., currently, today, this week, etc.). In still other examples, maps element 1280 may be implemented differently. In some examples, titles element 1258 may display a list of users and the titles they have earned based on their activities and activity levels (e.g., a most improved user, a hardest working user, etc.). A Move page may be implemented differently than described here.

Wireframe 1270 comprises an exemplary Eat page, which may include navigation 1202, selected page 1204E, sync widget 1216, leaderboard elements 1240C and 1240D, statistics report engine 1254, comparison element 1242C, search and recommendations element 1272, product sales element 1282, maps element 1280A, nutrition science element 1276, and daily food/supplement element 1278. Navigation 1202, selected page 1204E, sync widget 1216, leaderboard elements 1240C and 1240D, statistics report engine 1254, comparison element 1242C, search and recommendations element 1272, product sales element 1282, and maps element 1280A may be implemented as described above with regard to like-numbered or corresponding elements. The Eat page may be implemented to include a nutrition science element 1276, which may display, or provide widgets for accessing, information associated with nutrition science. The Eat page also may be implemented with a daily food/supplement element 1278, which may be implemented to suggest a food and/or supplement each day. An Eat page may be implemented differently than described here.

Wireframe 1280 comprises an exemplary Live page, which may include navigation 1202, selected page 1204F, sync widget 1216, leaderboard element 1240E, search and recommendations element 1272, product sales element 1282, maps element 1280B, social feed 1212C, health research element 1286, and product research element 1290. Navigation 1202, selected page 1204F, sync widget 1216, leaderboard element 1240E, search and recommendations element 1272, product sales element 1282, maps element 1280B and social feed 1212C may be implemented as described above with regard to like-numbered or corresponding elements. In some examples, the Live page may include health research element 1286 configured to display, or to enable a user to research, information regarding health topics. In some examples, the Live page may include product research element 1290 configured to display, or to enable a user to research, information regarding products. In some examples, the products may be associated with a user's particular activities or activity level. In other examples, the products may be associated with any of the activities monitored by, or discussed on, the wellness marketplace. A Live page may be implemented differently than described here.

FIG. 13 illustrates an exemplary computer system suitable for implementation of a wellness application and use with a data-capable band. In some examples, computer system 1300 may be used to implement computer programs, applications, methods, processes, or other software to perform the above-described techniques. Computer system 1300 includes a bus 1302 or other communication mechanism for communicating information, which interconnects subsystems and devices, such as processor 1304, system memory 1306 (e.g., RAM), storage device 1308 (e.g., ROM), disk drive 1310 (e.g., magnetic or optical), communication interface 1312 (e.g., modem or Ethernet card), display 1314 (e.g., CRT or LCD), input device 1316 (e.g., keyboard), and cursor control 1318 (e.g., mouse or trackball).

According to some examples, computer system 1300 performs specific operations by processor 1304 executing one or more sequences of one or more instructions stored in system memory 1306. Such instructions may be read into system memory 1306 from another computer readable medium, such as static storage device 1308 or disk drive 1310. In some examples, hard-wired circuitry may be used in place of or in combination with software instructions for implementation. The term "computer readable medium" refers to any tangible medium that participates in providing instructions to processor 1304 for execution. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as disk drive 1310. Volatile media includes dynamic memory, such as system memory 1306.

Common forms of computer readable media include, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read.

Instructions may further be transmitted or received using a transmission medium. The term "transmission medium" may include any tangible or intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such instructions. Transmission media includes coaxial cables, copper wire, and fiber optics, including wires that comprise bus 1302 for transmitting a computer data signal.

In some examples, execution of the sequences of instructions may be performed by a single computer system 1300. According to some examples, two or more computer systems 1300 coupled by communication link 1320 (e.g., LAN, PSTN, or wireless network) may perform the sequence of instructions in coordination with one another. Computer system 1300 may transmit and receive messages, data, and instructions, including program code, i.e., application code, through communication link 1320 and communication interface 1312. Received program code may be executed by processor 1304 as it is received, and/or stored in disk drive 1310, or other non-volatile storage for later execution.

FIG. 14 depicts an example of an aggregation engine, according to some examples.

Diagram 1400 depicts an aggregation engine 1408 including an insight engine 1430 and a sensor set ("1") 1432 of sensors disposed in a mobile computing device, such as mobile device 1410. Further, diagram 1400 depicts a wearable device 1420a in communication with mobile device 1410 via a communication link 1414. As shown, wearable device 1420a is within a boundary 1460, such as a communications boundary, that demarcates an ability of wearable device 1420a to communicate with or receive data from mobile device 1410. In some examples, boundary 1460 is associated with a distance from mobile device 1410. As such, boundary 1460 can be determined by, for example, a maximum transmit range of a Bluetooth (e.g., BTLE) transmitter, such as a BTLE radio disposed in mobile device 1410. Note, however, boundary 1460 need not represent a distance but can represent any state in which wearable device 1420a transitions from a first mode of operation to a second mode of operation.

In one mode of operation, wearable device 1420a can receive at least a portion of insight data 1412 as determined by insight engine 1430. As such, wearable device 1420a is configured to operate, at least in part, based on data generated by insight engine 1430. Note that insight engine 1430 can receive sensor data 1416 from wearable device 1420a, such as from another sensor set ("2") 1442 disposed in the wearable device 1420a. In some examples, wearable device 1420a also can receive insight data from a remote server 1404 via network 1402.

Further to the example shown, when wearable device 1420a transitions across boundary 1462 to a position shown as wearable device 1420b, wearable device 1420b transitions operation from the first mode to a second mode. In the second mode of operation, wearable device 1420b is configured to implement an insight micro-engine 1440. In this mode of operation, wearable device 1420b operates independently of aggregation engine 1408 and insight engine 1430. Therefore, insights presented to user interface 1424, such as a touch-sensitive display interface, are based on the operation of insight micro-engine 1440.
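A minimal sketch of the mode transition described above, assuming a hypothetical link-detection flag and engine interfaces:

    # Hypothetical sketch: a wearable device selects between a remote insight
    # engine (first mode) and a local insight micro-engine (second mode) based on
    # whether the communication link to the mobile device is present.

    class InsightMicroEngine:
        def __init__(self):
            self.active = False

        def activate(self):
            self.active = True  # begin consuming data from the local sensor set

    def select_insight_source(link_present, remote_insight_engine, insight_micro_engine):
        if link_present:
            # First mode: operate on insight data generated by the remote insight engine.
            return remote_insight_engine
        # Second mode: operate independently using the local insight micro-engine.
        insight_micro_engine.activate()
        return insight_micro_engine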

According to some embodiments, the term "insight" can refer to, for example, data correlated among a state of a user (e.g., number of steps taken, number of hours slept, etc.) and other sets of data representing trends, patterns, and correlations to goals of a user (e.g., a target value of a number of steps per day) and/or supersets of generalized values (e.g., average values) of anonymized data for a population at large. With insight data, the user can understand how an activity (e.g., running, etc.) can affect other aspects of health (e.g., amount of sleep as a parameter). In some embodiments, insight data can include feedback information. For example, insights can include data derived by the structures and/or functions set forth in U.S. Pat. No. 8,446,275, which is herein incorporated by reference to illustrate at least some examples. In some cases, an insight micro-engine (and/or insight engine) can include one or more structures or functions set forth in U.S. Pat. No. 8,446,275.

To illustrate operation of insight micro-engine 1440, consider the following. According to some examples, wearable device 1420b can detect an absence of communication link 1414. Responsive to the loss of communication, insight micro-engine 1440 can be activated to operate in association with the memory (not shown) to continue to provide insights to the user independent of remote server 1404 and insight engine 1430. Insight micro-engine 1440 is configured to receive a subset of sensor data from sensor set 1442 whereby the subset of sensor data can represent a smaller amount of data due to the loss of sensor data from sensor set 1432.

Based on the subset of sensor data, insight micro-engine 1440 can determine an activity and an aggregate value representative of the state of the activity. For example, a user can walk a number of steps per unit time, and have an aggregate value of 8,123 steps in one day. Further, insight micro-engine 1440 can correlate the aggregate value to a target value, whereby the target value may represent a goal. To continue with the previous example, consider that the user wishes to walk 10,000 steps per day. As such, the target value is 10,000 steps. By this correlation, insight micro-engine 1440 can determine a difference (e.g., insight data representing how far above or below the user's progress is relative to a goal). Insight micro-engine 1440 can generate data for presentation at a displayable user interface. In some cases, the data includes the aggregate value and/or a graphical representation of the aggregate value. Further, the data may also include data representing an insight, whereby the insight data provides contextual information that conveys a correlation between the aggregate value and a parameter. For example, a parameter can represent a value or other information associated with another state of a user. To illustrate, consider that insight micro-engine 1440 can correlate an aggregate value of steps to a parameter describing a number of hours slept over a period of time.

In some examples, insight micro-engine 1440 can derive data representing an aggregate number of steps, or any other measurable activity, at a point in time, such as 5 pm every day (e.g., as a parameter). Insight micro-engine 1440 can also compare data representing another aggregate number of steps for a different point in time (e.g., number of steps taken one week ago today). Note that the "number of steps" described above is merely an example of the units of motion or accomplishment for an activity or condition, and the various embodiments are not intended to be limited thereto.
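As a minimal, non-limiting sketch of the correlation and comparison described in the two preceding paragraphs, the following fragment correlates an aggregate step count to a target value, compares it against the same point in time one week earlier, and associates the result with a sleep parameter; the message wording and example figures are illustrative assumptions.

    # Hypothetical sketch of insight derivation in an insight micro-engine:
    # correlate an aggregate value to a target value, compare against the same
    # point in time one week ago, and associate the result with a parameter
    # (e.g., hours slept). Numbers and wording are illustrative only.

    def derive_insight(aggregate_steps, target_steps, steps_week_ago, hours_slept):
        difference_to_goal = aggregate_steps - target_steps          # progress vs. goal
        difference_to_last_week = aggregate_steps - steps_week_ago   # progress vs. history
        return {
            "aggregate": aggregate_steps,
            "to_goal": difference_to_goal,
            "to_last_week": difference_to_last_week,
            "insight": ("You are {} steps from your {}-step goal, {} steps "
                        "{} than this time last week, after {} hours of sleep."
                        ).format(abs(difference_to_goal), target_steps,
                                 abs(difference_to_last_week),
                                 "more" if difference_to_last_week >= 0 else "fewer",
                                 hours_slept),
        }

    # Example from the text: 8,123 steps against a 10,000-step goal.
    insight = derive_insight(8123, 10000, 7500, 6.5)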

In some examples, wearable device 1420b can transition to a position associated with wearable device 1420a. As such, logic in wearable device 1420b can detect a presence of communication link 1414. In some embodiments, insight micro-engine 1440 and its operation can be terminated (e.g., in whole or in part). Accordingly, wearable device 1420a receives data from insight engine 1430 via mobile device 1410. Further, wearable device 1420a can again receive insight data from remote server 1404.

In another mode of operation, wearable device 1420a can be configured to generate and transmit a request 1480 via network 1402 to remote server 1404. Request 1480 can include data requesting an authorized downloading or transfer of data 1482 including insight engine 1430. Therefore, insight micro-engine 1440 can be preinstalled on wearable device 1420b, and upon detection of communication link 1414, insight micro-engine 1440 can cause insight engine 1430 and related applications to be downloaded onto mobile device 1410.

Examples of wearable device 1420a include Gear Live™ by Samsung, Moto 360™ by Motorola, and G Watch™ by LG, among others.

FIG. 15 is an example of a flow for presenting insights based on locally-derived sensor data, according to some embodiments. In flow 1500, an absence of communication with a mobile phone is detected at 1502, and an insight micro-engine on a wearable device is activated at 1504. Further, an insight micro-engine can receive sensor data locally and independently of other sources at 1506. Aggregated data can be determined and correlated against target data at 1508. Based on such a correlation, an insight can be generated that can describe one of a number of conditions or states in which a user may be, based on a parameter. For example, if a number of aggregate steps fails to surpass a minimal threshold, an insight can be generated to classify the user as a "sedentary" person. Further, an insight micro-engine can derive an insight relative to a parameter, such as age or a number of heart attacks per unit population, for such a classification. Thus, an insight can alert the sedentary person to the effects of inactivity on the probability of succumbing to heart disease, as an example. At 1510, the insight is presented via a user interface.
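The classification step of flow 1500 might be sketched as follows; the sedentary threshold and the risk wording are placeholders, not medical guidance.

    # Hypothetical sketch of the classification step in flow 1500: if aggregate
    # steps fail to surpass a minimal threshold, classify the user as sedentary
    # and derive an insight relative to a parameter (e.g., age). The threshold
    # and wording are placeholders only.

    SEDENTARY_THRESHOLD = 3000  # assumed minimal daily step count

    def classify_and_alert(aggregate_steps, age):
        if aggregate_steps >= SEDENTARY_THRESHOLD:
            return "Active: keep it up."
        risk_note = ("Sedentary lifestyles are associated with elevated heart-disease "
                     "risk for your age group ({}).".format(age))
        return "Classified as sedentary today ({} steps). {}".format(
            aggregate_steps, risk_note)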

FIG. 16 is a functional block diagram depicting an example of an insight generated by an insight micro-engine, according to some embodiments. Diagram 1600 depicts an insight micro-engine 1602, data representing an insight 1604, and a user interface 1606. In this case, a user has an aggregate number of steps shown as 12,043. In some examples, an insight micro-engine can compare aggregate values to trigger points that can cause an insight to be generated. In the example shown, the trigger point is a threshold number of steps equivalent to walking across the Golden Gate Bridge. Other trigger points, regardless of distance, time, or other aspects, can also be used.
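One way to express the trigger-point comparison of FIG. 16 is sketched below; the step-equivalent length assigned to the Golden Gate Bridge is an assumed figure for illustration only.

    # Hypothetical sketch of trigger points: when an aggregate value crosses a
    # trigger threshold, generate an insight. The bridge figure below is an
    # assumed step equivalence, for illustration only.

    TRIGGER_POINTS = [
        # (threshold in steps, insight text)
        (3500, "You've walked the equivalent of crossing the Golden Gate Bridge!"),
        (12000, "You've passed 12,000 steps today."),
    ]

    def insights_for(aggregate_steps, already_fired=()):
        """Return insights whose trigger thresholds have been crossed."""
        return [text for threshold, text in TRIGGER_POINTS
                if aggregate_steps >= threshold and text not in already_fired]

    # Example from the text: an aggregate of 12,043 steps fires both triggers.
    print(insights_for(12043))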

FIG. 17 is a functional block diagram depicting an interface for transitioning an insight micro-engine into a specific mode of operation, according to some embodiments. Diagram 1700 depicts an insight micro-engine 1702 causing display of insight representation 1704a. Upon detecting a user input and related data 1706, insight micro-engine 1702 can cause a mode transition in which instructions are presented to the user via interface 1704b. In this case, an insight engine is not detected to be present on an associated mobile device. Should the user wish to experience a richer set of functionality and insights, a request can be generated to download insight engine 1710 to a mobile phone.

Although the foregoing examples have been described in some detail for purposes of clarity of understanding, the above-described inventive techniques are not limited to the details provided. There are many alternative ways of implementing the above-described inventive techniques. The disclosed examples are illustrative and not restrictive.