Title:
SYSTEMS AND METHODS FOR ENHANCING USER EXPERIENCE FOR USERS OF WEARABLE ELECTRONIC DEVICES
Document Type and Number:
WIPO Patent Application WO/2018/009404
Kind Code:
A1
Abstract:
Various aspects enhance the user experience for a wearer/user of a wearable electronic device (e.g., a smart watch) when interacting with the device in different contexts. In one aspect, for example, the wearable electronic device may display a limited amount of information relating to the locations of other users who are "off-screen" (e.g., not at a location that is currently being represented on a digital map displayed by the wearable electronic device), in an unobtrusive and space-efficient manner. In another example aspect, the wearable electronic device may display information relating to traffic along an expected route of the user in an unobtrusive and space-efficient manner. Various other techniques and architectures for improving user experience, and/or providing additional functionality, are also described.

Inventors:
HASABALLAH TAYLAH CHARLOTTE (US)
SUGDEN JR BRIAN KEVIN (US)
LEE JOHN JAESUNG (US)
Application Number:
PCT/US2017/039922
Publication Date:
January 11, 2018
Filing Date:
June 29, 2017
Assignee:
GOOGLE LLC (US)
International Classes:
G06F17/30; G01C21/00; G01S5/14; G06Q50/00
Foreign References:
US20140222339A12014-08-07
EP2194508A12010-06-09
US20150128089A12015-05-07
Other References:
None
Attorney, Agent or Firm:
BATEMAN, Andrew, W. (US)
Claims:
What is claimed is:

1. A method, implemented in an electronic device having a display and one or more processors, for displaying location indicators for one or more off-screen entities to a user of the electronic device, the method comprising:

receiving, from a remote server, map data representing an area that includes a current location of the user;

receiving, from the remote server, shared location data corresponding to a current location of a first entity, wherein the current location of the first entity is not within the area represented by the map data;

using, by the one or more processors, the map data to present on the display a digital map of the area represented by the map data, the digital map having an outer perimeter and a map center; and

using, by the one or more processors, the shared location data to present on the display, contemporaneously with the digital map of the area represented by the map data, a first location indicator corresponding to the current location of the first entity, at least by assigning the first location indicator a first visual property indicative of a first distance, wherein the first distance is a distance between the current location of the first entity and the current location of the user, and wherein the first visual property does not include any text specifying a distance, and

positioning the first location indicator, with the first visual property, (i) at an offset distance from the map center such that the first location indicator is closer to the outer perimeter of the digital map than to the map center, and (ii) in a first direction from the map center, the first direction corresponding to a direction of the current location of the first entity relative to the current location of the user.

2. The method of claim 1, wherein:

receiving shared location data includes receiving data specifying geographic coordinates of the current location of the first entity; and

prior to assigning the first location indicator, using the specified geographic coordinates to calculate the first direction and the first distance.

3. The method of claim 1, wherein:

receiving shared location data includes receiving data specifying (i) a position on the digital map and (ii) the first visual property;

assigning the first visual property includes assigning the specified first visual property; and

positioning the first location indicator includes positioning the first location indicator, with the first visual property, at the specified position on the digital map.

4. The method of claim 1, wherein positioning the first location indicator includes positioning the first location indicator such that the first location indicator is immediately adjacent to the outer perimeter of the digital map.

5. The method of claim 1, wherein assigning the first visual property includes assigning a size of the first location indicator, the size being indicative of the first distance.

6. The method of claim 5, further comprising:

receiving, from the remote server, updated shared location data corresponding to a new current location of the first entity; and

using, by the one or more processors, the updated shared location data to present on the display an updated first location indicator corresponding to the new current location of the first entity, at least by

assigning the updated first location indicator an increased size, the increased size being indicative of a second distance, and the second distance being (i) a distance between the new current location of the first entity and either the current location of the user or a new current location of the user, and (ii) less than the first distance, and positioning the updated first location indicator, with the increased size, in a second direction corresponding to a direction of the new current location of the first entity relative to either the current location of the user or the new current location of the user.

7. The method of claim 6, wherein:

the first location indicator does not include any photographic image or avatar of the first entity; and

the updated first location indicator includes a photographic image or avatar of the first entity.

8. The method of claim 1, wherein assigning the first location indicator a first visual property includes assigning (i) a color of the first location indicator, the color being indicative of the first distance, or (ii) a shading of the first location indicator, the shading being indicative of the first distance.

9. The method of claim 1, further comprising:

assigning a number to be displayed as text on or next to the first location indicator, the number being a metric representing either (i) the first distance, or (ii) a distance between the current location of the first entity and a location corresponding to a point along the outer perimeter of the digital map.

10. The method of claim 1, wherein the first location indicator is a circular icon.

11. The method of claim 1, wherein the first entity is a person who previously agreed to share his or her location with the user.

12. The method of claim 1, further comprising:

receiving, from the remote server, additional shared location data corresponding to a current location of a second entity, wherein the current location of the second entity is not within the area represented by the map data;

using, by the one or more processors, the additional shared location data to present on the display, contemporaneously with both the digital map of the area represented by the map data and the first location indicator, a second location indicator corresponding to the current location of the second entity, at least by

assigning the second location indicator a second visual property indicative of a second distance, wherein the second distance is a distance between the current location of the second entity and the current location of the user, and wherein the second visual property does not include any text specifying a distance, and

positioning the second location indicator, with the second visual property, in a second direction from the map center, the second direction corresponding to a direction of the current location of the second entity relative to the current location of the user.

13. The method of claim 1, wherein the electronic device is a smart watch device, and wherein the method further comprises:

causing, by the one or more processors, both the digital map and the first location indicator to rotate on the display in a synchronized manner as the directional orientation of the electronic device changes.

14. An electronic device comprising:

a display;

one or more processors; and

a memory, the memory storing instructions that, when executed by the one or more processors, cause the electronic device to

receive, from a remote server, map data representing an area that includes a current location of a user of the electronic device,

receive, from the remote server, shared location data corresponding to a current location of a first entity, wherein the current location of the first entity is not within the area represented by the map data,

use the map data to present on the display a digital map of the area represented by the map data, the digital map having an outer perimeter and a map center, and use the shared location data to present on the display, contemporaneously with the digital map of the area represented by the map data, a first location indicator corresponding to the current location of the first entity, at least by

assigning the first location indicator a first visual property indicative of a first distance, wherein the first distance is a distance between the current location of the first entity and the current location of the user, and wherein the first visual property does not include any text specifying a distance, and

positioning the first location indicator, with the first visual property, (i) at an offset distance from the map center such that the first location indicator is closer to the outer perimeter of the digital map than to the map center, and (ii) in a first direction from the map center, the first direction corresponding to a direction of the current location of the first entity relative to the current location of the user.

15. The electronic device of claim 14, wherein the instructions cause the electronic device to:

position the first location indicator such that the first location indicator is immediately adjacent to the outer perimeter of the digital map.

16. The electronic device of claim 14, wherein the first visual property is a size of the first location indicator, the size being indicative of the first distance.

17. The electronic device of claim 16, wherein the first location indicator does not include any photographic image or avatar of the first entity, and wherein the instructions further cause the electronic device to:

receive, from the remote server, updated shared location data corresponding to a new current location of the first entity; and

use the updated shared location data to present on the display an updated first location indicator corresponding to the new current location of the first entity, at least by

assigning the updated first location indicator an increased size, the increased size being indicative of a second distance, and the second distance being (i) a distance between the new current location of the first entity and either the current location of the user or a new current location of the user, and (ii) less than the first distance, and positioning the updated first location indicator, with the increased size, in a second direction corresponding to a direction of the new current location of the first entity relative to either the current location of the user or the new current location of the user, the updated first location indicator including a photographic image or avatar of the first entity.

18. A method, implemented on an electronic device having a display and one or more processors, for displaying location indicators for one or more off-screen entities to a user of the electronic device, the method comprising:

presenting on the display, by the one or more processors, a digital map of an area, the digital map having an outer perimeter and a map center;

presenting on the display, by the one or more processors, one or more location indicators corresponding to one or more respective people currently at one or more respective locations not represented on the digital map, each of the one or more location indicators (i) overlaying the digital map and (ii) being positioned, relative to the map center, in a direction corresponding to a direction of the respective location of the respective person relative to a location represented by the map center; and

for each of the one or more location indicators, causing, by the one or more processors, a size of the location indicator to (i) increase as the respective person moves nearer to the location represented by the map center, and (ii) decrease as the respective person moves further from the location represented by the map center.

19. The method of claim 18, further comprising:

for each of the one or more location indicators, causing, by the one or more processors, a photographic image or avatar of the respective person to appear in the location indicator when the respective person moves within a threshold distance of the location represented by the map center.

20. The method of claim 18, wherein presenting on the display the one or more location indicators includes presenting each of the one or more location indicators at a respective position immediately adjacent to the outer perimeter of the digital map.

Description:
SYSTEMS AND METHODS FOR ENHANCING USER EXPERIENCE FOR USERS OF WEARABLE ELECTRONIC DEVICES

FIELD OF TECHNOLOGY

[0001] The present disclosure relates to wearable electronic devices and, more particularly, to systems and methods for enhancing the user experience when interacting with such devices in various contexts, such as navigation, location sharing and/or the selection and display of transit options.

BACKGROUND

[0002] The background description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventor(s), to the extent it is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.

[0003] As smartphones have become increasingly ubiquitous, the appearance and functionality of user interfaces for such devices have evolved to provide an ever-improving user experience. Increasingly, however, consumers are turning to wearable electronic devices, such as smart watches, to provide certain features (e.g., navigation features) traditionally provided by smartphones. Unfortunately, the user interface displays and functions of smartphones are typically designed in view of the specific form factors and use cases associated with such devices. For example, designers know that a certain amount of display real estate is available with a smartphone. As other examples, designers can typically assume that a user will not object to reorienting his or her smartphone between landscape and portrait display modes as needed, or to interacting with a touch screen of the smartphone on a frequent and/or prolonged basis, etc. However, because a wearable electronic device may have a much different form factor (e.g., a much smaller display) and/or be associated with different use cases, existing smartphone designs may lead to a poor user experience if applied in the wearable device context.

SUMMARY

[0004] Various embodiments/implementations described herein relate to ways in which the user experience may be enhanced for a wearer/user of a wearable electronic device (e.g., a smart watch). As discussed further below, however, some of the techniques herein may be used in connection with other electronic devices, such as smartphones, tablet devices, and so on.

[0005] In one example embodiment, a method for displaying location indicators for one or more off-screen entities to a user is implemented in an electronic device having a display and one or more processors. The method includes receiving, from a remote server, map data representing an area that includes a current location of the user, and receiving, from the remote server, shared location data corresponding to a current location of a first entity. The current location of the first entity is not within the area represented by the map data. The method also includes using, by the one or more processors, the map data to present on the display a digital map of the area represented by the map data. The digital map has an outer perimeter and a map center. The method also includes using, by the one or more processors, the shared location data to present on the display, contemporaneously with the digital map of the area represented by the map data, a first location indicator corresponding to the current location of the first entity. Using the shared location data to present the first location indicator includes assigning the first location indicator a first visual property indicative of a first distance, and positioning the first location indicator, with the first visual property, (1) at an offset distance from the map center such that the first location indicator is closer to the outer perimeter of the digital map than to the map center, and (2) in a first direction from the map center. The first direction corresponds to a direction of the current location of the first entity relative to the current location of the user. The first distance is a distance between the current location of the first entity and the current location of the user, and the first visual property does not include any text specifying a distance.

[0006] In another example embodiment, an electronic device includes a display, one or more processors, and a memory. The memory stores instructions that, when executed by the one or more processors, cause the electronic device to (1) receive, from a remote server, map data representing an area that includes a current location of the user, (2) receive, from the remote server, shared location data corresponding to a current location of a first entity, wherein the current location of the first entity is not within the area represented by the map data, (3) use the map data to present on the display a digital map of the area represented by the map data, the digital map having an outer perimeter and a map center, and (4) use the shared location data to present on the display, contemporaneously with the digital map of the area represented by the map data, a first location indicator corresponding to the current location of the first entity. The instructions cause the electronic device to use the shared location data to present the first location indicator at least by assigning the first location indicator a first visual property indicative of a first distance, and positioning the first location indicator, with the first visual property, at an offset distance from the map center such that the first location indicator is closer to the outer perimeter of the digital map than to the map center, and in a first direction from the map center. The first direction corresponds to a direction of the current location of the first entity relative to the current location of the user. The first distance is a distance between the current location of the first entity and the current location of the user. The first visual property does not include any text specifying a distance.

[0007] In another example embodiment, a method for displaying location indicators for one or more off-screen entities to a user is implemented on an electronic device having a display and one or more processors. The method includes presenting on the display, by the one or more processors, a digital map of an area. The digital map has an outer perimeter and a map center. The method also includes presenting on the display, by the one or more processors, one or more location indicators corresponding to one or more respective people currently at one or more respective locations not represented on the digital map. Each of the one or more location indicators overlays the digital map and is positioned, relative to the map center, in a direction corresponding to a direction of the respective location of the respective person relative to a location represented by the map center. The method also includes, for each of the one or more location indicators, causing, by the one or more processors, a size of the location indicator to increase as the respective person moves nearer to the location represented by the map center, and decrease as the respective person moves further from the location represented by the map center.

[0008] In another example embodiment, a method for displaying traffic-related information in an unobtrusive manner is implemented in an electronic device having a display and one or more processors. The method includes transmitting, to a remote server, data indicative of an expected route of a user of the electronic device, receiving, from the remote server, traffic data indicative of a current degree of traffic along at least a portion of the expected route of the user, and selecting, based on the traffic data and from among a plurality of visual properties corresponding to a plurality of respective traffic levels, a first visual property. Each of the plurality of visual properties includes a respective color, a respective shade, or a respective pattern. The method also includes causing, by the one or more processors, the display to present a user interface screen to the user. At least a portion of the user interface screen has the selected first visual property. The user interface screen does not include a digital map representing any portion of the expected route of the user.

[0009] In another example embodiment, an electronic device includes a display, one or more processors, and a memory. The memory stores instructions that, when executed by the one or more processors, cause the electronic device to transmit, to a remote server, data indicative of an expected route of a user of the electronic device, receive, from the remote server, traffic data indicative of a current degree of traffic along at least a portion of the expected route of the user, and select, based on the traffic data and from among a plurality of visual properties corresponding to a plurality of respective traffic levels, a first visual property. Each of the plurality of visual properties includes a respective color, a respective shade, or a respective pattern. The instructions also cause the electronic device to cause the display to present a user interface screen to the user. At least a portion of the user interface screen has the selected first visual property. The user interface screen does not include a digital map representing any portion of the expected route of the user.

[0010] In another example embodiment, a method for displaying traffic-related information in an unobtrusive manner is implemented in an electronic device having a display and one or more processors. The method includes presenting on the display, by the one or more processors, a user interface screen. The user interface screen includes an estimated time of arrival (ETA) indicating an expected length of time needed for a user of the electronic device to reach a destination, and a transit mode icon indicating a current mode of transportation of the user. The method also includes processing, by the one or more processors, traffic data received from a remote server to monitor a degree of traffic along at least a portion of an expected route of the user, and causing, by the one or more processors, a color of at least a portion of the user interface screen to change in response to a change in the monitored degree of traffic. The portion of the user interface screen includes a background against which the ETA is set and/or at least a portion of the transit mode icon.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] Figure 1 is a block diagram of an example system in which techniques for providing an improved user experience to a user of a wearable electronic device may be implemented.

[0012] Figures 2A and 2B depict an example user interface screen that includes location indicators for off-screen entities, according to one implementation and scenario.

[0013] Figure 3 is a flow diagram of an example method for displaying location indicators for one or more off-screen entities to a user of the electronic device, according to one implementation.

[0014] Figures 4A and 4B depict an example user interface screen that includes age-dependent indicators of recent locations of a user, according to one implementation and scenario.

[0015] Figures 5A and 5B depict alternative user interface screens that may be varied to indicate current traffic levels along an expected route of a user, according to two different implementations.

[0016] Figure 6 is a flow diagram of an example method for displaying traffic-related information in an unobtrusive and space-efficient manner, according to one implementation.

[0017] Figure 7 depicts an example user interface screen that provides an indication of an area of uncertainty with respect to the location of a user, according to one implementation and scenario.

[0018] Figure 8 is a block diagram of an example system in which navigation-related functions of a mobile communications device, such as a smartphone, may be accessed by a wearable electronic device remote from the mobile communications device, according to one implementation.

[0019] Figure 9 depicts a set of example user interface screens that may be used to pin a transit line for user notification purposes, according to one implementation.

[0020] Figure 10 depicts a pair of example user interface screens occurring before and after a user gesture for viewing a location of a selected transit station.

[0021] Figures 11A and 11B depict two example scenarios in which a smart watch display screen is rotated to align with the direction of a user's gaze, according to one implementation.

DETAILED DESCRIPTION OF THE DRAWINGS

Overview

[0022] In one aspect, a digital map on the display of a first user's wearable electronic device shows the shared locations of other users (e.g., contacts that have expressly agreed to share location with the first user). In addition to showing the pinpoint locations of any sharing users who are currently within the area represented by the digital map, the display may show visual indicators that provide information about the locations of sharing users who are not currently within the area represented by the digital map, but are within the general vicinity (e.g., within some threshold distance of the first user). For example, the display may include, for each nearby sharing user, a circular indicator (or other shape or icon, etc.) that is just within the outer perimeter of the digital map. The angular position of the indicator, relative to the map center, is indicative of the direction of the sharing user's location relative to the first user's location. In some implementations, the indicator for another user's off-screen location may vary in size (and/or color, shade, etc.) as that other user comes nearer to, or goes further away from, the area represented by the digital map. For a circular indicator, for instance, the circle radius may decrease as the other user goes further away from the area (e.g., until a threshold distance is crossed and the indicator disappears entirely), and increase as the other user comes closer to the area (e.g., until the indicator is fully on the digital map and pinpoints the precise location of the other user, and/or until a maximum size is reached, etc.). Moreover, in some implementations, the sharing user's photograph or avatar may appear within the indicator whenever the sharing user is sufficiently near to the mapped area (e.g., whenever a circular indicator for the user is at least a threshold size).

[0023] In another aspect, a screen on the display of a wearable electronic device shows the current traffic level along an expected route (e.g., as determined using the destination and the current user location when in a navigation mode, or as predicted based on the current street and/or heading, etc.). The traffic level may be indicated in a manner that is relatively unobtrusive, and requires little or no additional display/screen "real estate." For example, the traffic level may be indicated by changing the color of a portion of the display (e.g., green for light traffic, yellow or orange for medium traffic, red for heavy traffic, etc.). In some implementations, the color of a majority of the display is adjusted according to the current traffic level. For example, the background of all text and/or icons shown on the display may be set to the appropriate color. In other implementations, the color of a smaller portion of the display is adjusted according to the current traffic level. For example, the system may only adjust the color of an icon shown on the display (e.g., an icon showing a current transit mode of the user, such as car, bus, etc.). In some implementations, the color-coding technique is used in connection with an "uncluttered" display screen that shows relatively little information to facilitate convenient, "at a glance" navigation. For example, the color-coded display may show only an estimated time of arrival (ETA) for a destination, a transit mode, a current time, and/or a "next turn" arrow or other indicator, without showing a digital map of any portion of the expected route.
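
By way of illustration only, the color selection just described reduces to a small lookup from traffic level to display color. The following minimal Kotlin sketch shows one such mapping; the enumeration of traffic levels, the specific ARGB color values, and the function name are assumptions introduced for illustration, not details disclosed by this application.

    // Hypothetical traffic levels; the disclosure does not enumerate them exactly.
    enum class TrafficLevel { LIGHT, MEDIUM, HEAVY }

    // Map a traffic level to an ARGB color, following the green/yellow/red
    // convention described above. The specific color values are assumptions.
    fun trafficColor(level: TrafficLevel): Long = when (level) {
        TrafficLevel.LIGHT  -> 0xFF00C853  // green
        TrafficLevel.MEDIUM -> 0xFFFFD600  // yellow/orange
        TrafficLevel.HEAVY  -> 0xFFD50000  // red
    }

    fun main() {
        println("%08X".format(trafficColor(TrafficLevel.MEDIUM)))  // prints FFFFD600
    }

In a real implementation, the returned color might tint the entire screen background or only a single icon, per the two alternatives described above.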

[0024] Various other aspects are also described herein. In one aspect, for example, if the user's smartphone is not near enough to the wearable electronic device to assume co-location of the two devices, and if the wearable electronic device includes self-locating (e.g., GPS) functionality, the wearable device may communicate remotely with the smartphone (e.g., via the Internet and cellular and/or WiFi networks) to make use of navigation features and processing power of the smartphone. Still other aspects relate to displaying past locations of a user of a wearable electronic device, "pinning" transit stations and/or lines using a wearable device display (for purposes of receiving future notifications), automatic rotation of a wearable device display (to provide a useful orientation of the display relative to the perspective of the user), and convenient techniques for changing the information shown on a wearable device display (e.g., without the user having to fully remove either hand from a steering wheel).

Example system

[0025] Figure 1 illustrates an example system 10 in which techniques for improving user experience when using (e.g., viewing and/or otherwise interacting with) a wearable electronic device 12 may be implemented. In addition to the wearable electronic device 12, the example system 10 includes a mobile communications device 14, a map server 16, a traffic server 20, and a transit server 22. The mobile communications device 14, traffic server 20 and transit server 22 may be communicatively coupled with map server 16 via a network 24, and wearable electronic device 12 may be communicatively coupled with mobile communications device 14 via a short range link 26. Generally, the system 10 may include more, fewer and/or different components than are seen in Figure 1. For example, the system 10 may not include traffic server 20 and/or transit server 22.

[0026] Network 24 may include any suitable combination of one or more wired and/or wireless communication networks, such as one or more local area networks (LANs), metropolitan area networks (MANs), and/or wide area networks (WANs). As just one specific example, network 24 may include a cellular network, the Internet, and a server-side LAN. In some implementations, the portion(s) of network 24 used by mobile communications device 14 to communicate with map server 16, by traffic server 20 to communicate with map server 16, and/or by transit server 22 to communicate with map server 16 may be wholly or partially separate from and independent of each other. Wearable electronic device 12 and mobile communications device 14 may both be located on a user who is located remotely from servers 16, 20 and 22.

[0027] Short range link 26 may use any suitable technology that enables wearable electronic device 12 to communicate in a wired or, preferably, a wireless manner with mobile communications device 14. For example, short range link 26 may be a Bluetooth link.

[0028] While Figure 1 shows only one mobile communications device 14, it is understood that map server 16 may also be in communication with numerous other mobile communications devices similar to mobile communications device 14, with some or all of those other mobile communications devices being in communication with respective wearable devices similar to wearable electronic device 12. Moreover, while referred to herein as a "server," map server 16 (as well as traffic server 20 and/or transit server 22) may, in some implementations, include multiple co-located or remotely distributed computing devices.

[0029] While shown in Figure 1 as having the form factor of a wrist watch, wearable electronic device 12 may instead be a different type of wearable electronic device having a display, such as smart glasses, for example. Similarly, while shown in Figure 1 as having a smartphone form factor, mobile communications device 14 may instead be any other mobile or portable computing device with wireless communication capability (e.g., a tablet computer, a laptop computer, a head unit computer installed within a vehicle, etc.).

[0030] In the example implementation of Figure 1, wearable electronic device 12 includes a processor 30, a user interface 32, a network interface 34, one or more orientation sensors 36, and a memory 40. The processor 30 may be a single processor (e.g., a central processing unit (CPU)), or may include a set of processors (e.g., a CPU and a graphics processing unit (GPU)).

[0031] User interface 32 includes hardware, firmware and/or software configured to enable a user to interact with (i.e., both provide inputs to and perceive outputs of) wearable electronic device 12. For example, user interface 32 may include a touchscreen with both display and manual input capabilities. User interface 32 may also include other components, such as a microphone (with associated processing components) that provides voice control/input capabilities to the user, one or more physical input keys or buttons, and so on.

[0032] Network interface 34 includes hardware, firmware and/or software configured to enable wearable electronic device 12 to exchange electronic data with mobile communications device 14 via short range link 26. For example, as shown in Figure 1, network interface 34 may include a Bluetooth ("BT") transceiver. However, other short range technologies may instead be used (e.g., WiFi, RFID, etc.). Orientation sensor(s) 36 include(s) one or more sensors configured to sense physical parameters or states, which in turn allow processor 30 to determine the physical orientation, and/or small-scale movement, of wearable electronic device 12 in space. For example, orientation sensor(s) 36 may include a compass, one or more accelerometers (e.g., in x-, y- and z-directions), a gyrometer, an inclinometer, and so on. In some implementations, mapping/navigation application 42 (discussed below) is configured to use orientation sensor(s) 36 to determine when particular user motions are made, and changes the information displayed via user interface 32 based on the user motion. More specific examples of how orientation sensor(s) 36 may be used are discussed below.

[0033] Memory 40 is a computer-readable, non-transitory storage unit or device, or collection of units/devices, that may include persistent and/or non-persistent memory components. Memory 40 stores instructions that are executable by processor 30 to perform various operations, including the instructions of various software applications and the data generated and/or used by such applications. In the example implementation of Figure 1, memory 40 stores at least mapping/navigation application 42, which may be a single application or multiple applications (e.g., separate mapping and navigation applications). Generally, mapping/navigation application 42 is executed by processor 30 to provide mapping and navigation display screens and features to a user of wearable electronic device 12, and includes a traffic indication unit 44 and an off-screen display unit 46. The traffic indication unit 44 generally provides unobtrusive, space-efficient indications of traffic levels along an expected route of the user (e.g., during navigation), and off-screen display unit 46 generally provides digital map-based indications of locations of other users who have agreed to share their locations but are not currently within the area represented by the digital map. The operation of mapping/navigation application 42, including traffic indication unit 44 and off-screen display unit 46, is discussed further below.

[0034] Mobile communications device 14 includes a processor 50, a user interface 52, a network interface 54, a global positioning system (GPS) unit 56, and a memory 60. The processor 50 may be a single processor (e.g., a CPU), or may include a set of processors (e.g., a CPU and a GPU).

[0035] User interface 52 includes hardware, firmware and/or software configured to enable a user to interact with (i.e., both provide inputs to and perceive outputs of) mobile communications device 14. For example, user interface 52 may include a touchscreen with both display and manual input capabilities. User interface 52 may also include other components, such as a keyboard (with associated processing components) that provides input capabilities to the user, and/or a microphone (with associated processing components) that provides voice control/input capabilities to the user.

[0036] Network interface 54 includes hardware, firmware and/or software configured to enable mobile communications device 14 to exchange electronic data with wearable electronic device 12 via short range link 26, and to exchange electronic data with map server 16 via network 24. For example, as shown in Figure 1, network interface 54 may include a Bluetooth transceiver for communications with wearable electronic device 12, and cellular and WiFi transceivers for communications with map server 16 (e.g., with one or the other of the cellular and WiFi transceivers being utilized depending in part on the proximity of mobile communications device 14 to an available WiFi access point). However, other suitable communication technologies may instead be used.

[0037] GPS unit 56 includes hardware, firmware and/or software configured to enable mobile communications device 14 to self-locate using GPS technology (alone, or in combination with the services of map server 16 and/or another server not shown in Figure 1). Alternatively, or in addition, mobile communications device 14 may include a unit configured to self-locate, or configured to cooperate with a remote server or other device(s) to self-locate, using other, non-GPS technologies. For example, mobile communications device 14 may include a unit configured to self-locate using WiFi positioning technology (e.g., by sending signal strengths detected from nearby access points to map server 16 along with identifiers of the access points, or to another server configured to retrieve access point locations from a database and calculate the position of mobile communications device 14 using trilateration, etc.).
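
As a concrete illustration of the trilateration mentioned above, the Kotlin sketch below locates a device from three access points with known positions and estimated distances. The linearization approach, the coordinate values, and all names are assumptions for illustration; in practice, the distances would first be estimated from signal strengths using a propagation model, which is not shown here.

    import kotlin.math.sqrt

    data class Point(val x: Double, val y: Double)

    // Solve for the device position by subtracting the first circle equation
    // from the other two, yielding two linear equations in (x, y).
    fun trilaterate(p1: Point, r1: Double, p2: Point, r2: Double,
                    p3: Point, r3: Double): Point {
        val a1 = 2 * (p2.x - p1.x); val b1 = 2 * (p2.y - p1.y)
        val c1 = r1 * r1 - r2 * r2 + p2.x * p2.x - p1.x * p1.x + p2.y * p2.y - p1.y * p1.y
        val a2 = 2 * (p3.x - p1.x); val b2 = 2 * (p3.y - p1.y)
        val c2 = r1 * r1 - r3 * r3 + p3.x * p3.x - p1.x * p1.x + p3.y * p3.y - p1.y * p1.y
        val det = a1 * b2 - a2 * b1  // nonzero as long as the access points are not collinear
        return Point((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
    }

    fun main() {
        // Device actually at (3, 4); access-point positions are assumed.
        println(trilaterate(Point(0.0, 0.0), 5.0,
                            Point(10.0, 0.0), sqrt(65.0),
                            Point(0.0, 10.0), sqrt(45.0)))  // approximately Point(x=3.0, y=4.0)
    }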

[0038] Memory 60 is a computer-readable, non-transitory storage unit or device, or collection of units/devices, that may include persistent (e.g., hard disk) and/or non-persistent memory components. Memory 60 stores instructions that are executable on processor 50 to perform various operations, including the instructions of various software applications and the data generated and/or used by such applications. In the example implementation of Figure 1, memory 60 stores at least a mapping/navigation application 62. While not shown in Figure 1, memory 60 may also store a positioning application (e.g., within a mapping application) that utilizes GPS unit 56 to determine the geographic location of the mobile communications device 14. Generally, mapping/navigation application 62 (and any positioning application) is executed by processor 50 to access the mapping/navigation services (and positioning services, if available) provided by map server 16, and to support various mapping/navigation features provided by the mapping/navigation application 42 of wearable electronic device 12.

[0039] Map server 16 may be associated with (e.g., owned and/or maintained by) a mapping/navigation service provider, and includes a network interface 70, a memory 72, a mapping unit 74, a location sharing unit 76, a route identification unit 80, an ETA calculation unit 82, a traffic monitoring unit 84, a transit detection unit 90, and a transit pinning unit 92.

[0040] The network interface 70 includes hardware, firmware and/or software configured to enable map server 16 to exchange electronic data with mobile communications device 14 and other, similar mobile communications devices (not shown in Figure 1) via network 24. For example, network interface 70 may include a wired or wireless router and a modem.

[0041] Memory 72 is a computer-readable, non-transitory storage unit or device, or collection of units/devices, that may include persistent (e.g., hard disk) and/or non-persistent memory components. Memory 72 may store data that is generated and/or used by the various units 74-92, for example.

[0042] Mapping unit 74 may generally be configured to provide mapping-related services to client devices, such as mobile communications device 14. For example, mapping unit 74 may receive geographic locations from mobile communications device 14 (e.g., a location determined by GPS unit 56, or a location entered by the user of mobile communications device 14, etc.), identify the appropriate map tile data to retrieve in view of those locations and the requested zoom levels, and send the map tile data to mobile communications device 14 via network 24.
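
The application does not disclose how mapping unit 74 addresses map tiles. Purely as an illustration, the Kotlin sketch below uses the widely used Web Mercator ("XYZ") tiling scheme, in which a latitude/longitude pair and a zoom level determine a tile index; this scheme, and the function name, are assumptions rather than details of the disclosure.

    import kotlin.math.PI
    import kotlin.math.cos
    import kotlin.math.ln
    import kotlin.math.tan

    // Convert WGS84 coordinates to tile indices at a given zoom level using the
    // standard Web Mercator tiling (assumed here for illustration).
    fun tileForLocation(latDeg: Double, lonDeg: Double, zoom: Int): Pair<Int, Int> {
        val n = 1 shl zoom  // tiles per axis at this zoom level
        val x = ((lonDeg + 180.0) / 360.0 * n).toInt()
        val latRad = Math.toRadians(latDeg)
        val y = ((1.0 - ln(tan(latRad) + 1.0 / cos(latRad)) / PI) / 2.0 * n).toInt()
        return x.coerceIn(0, n - 1) to y.coerceIn(0, n - 1)
    }

    fun main() {
        println(tileForLocation(40.7128, -74.0060, 12))  // (1205, 1539)
    }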

[0043] Location sharing unit 76 may generally be configured to maintain a data record of which users have expressly agreed to share their locations ("sharing users"), and the specific users with whom the sharing users have agreed to share their locations ("monitoring users"). Location sharing unit 76 may also identify when a sharing user is within an area represented by a digital map being presented to (or about to be presented to) a corresponding monitoring user. In such a scenario, map server 16 may send an indication of the sharing user's pinpoint location to a mobile communications device of the monitoring user (e.g., mobile communications device 14), where the pinpoint location may be displayed on the digital map. Location sharing unit 76 may determine or collect locations of sharing users on a periodic basis, only when a corresponding monitoring user requests that a sharing user's location be shown, or according to some other suitable basis. Location sharing unit 76 may also provide monitoring users with information regarding the locations of sharing users who are "off-screen" (e.g., not within the area represented by a digital map presented to the monitoring user), but are nonetheless nearby (e.g., within a threshold distance of) the monitoring user. Various techniques for providing location indicators for such "off-screen" sharing users are described below, in connection with Figures 2A, 2B and 3.
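
Location sharing unit 76 is described only functionally. A minimal sketch of the kind of record it might maintain appears below; every type, field, and method name is hypothetical and introduced solely for illustration.

    // Last reported position of a sharing user. Field names are illustrative.
    data class SharedLocation(val userId: String, val lat: Double, val lon: Double,
                              val timestampMs: Long)

    class LocationSharingRegistry {
        // Sharing user -> monitoring users permitted to see that user's location.
        private val grants = mutableMapOf<String, MutableSet<String>>()
        private val lastKnown = mutableMapOf<String, SharedLocation>()

        fun grant(sharer: String, monitor: String) {
            grants.getOrPut(sharer) { mutableSetOf() }.add(monitor)
        }

        fun report(loc: SharedLocation) { lastKnown[loc.userId] = loc }

        // Locations the given monitoring user is allowed to see.
        fun visibleTo(monitor: String): List<SharedLocation> =
            grants.filterValues { monitor in it }.keys.mapNotNull { lastKnown[it] }
    }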

[0044] Route identification unit 80 may generally be configured to identify specific routes that a user may take. Route identification unit 80 may determine the route based on a request for directions, entered by a user via user interface 32 of wearable electronic device 12 or user interface 52 of mobile communications device 14 when utilizing mapping/navigation application 42 or mapping/navigation application 62, for example. For instance, the expected route may correspond to a "best" (e.g., shortest in terms of time and/or distance) route between the user's current location and a destination (e.g., street address, place, etc.) entered or selected by the user. In other implementations, the route may be a route that is predicted without the benefit of a navigation request. For instance, route identification unit 80 may determine an expected route based on the user's current location, the user's current direction, the street currently being traveled, the user's mode of transit (e.g., car versus bus), and/or other information.

[0045] ETA calculation unit 82 may generally be configured to determine estimated times of arrival (ETAs) for users to arrive at intended destinations. Each destination may be one that was entered or selected by a user (e.g., as discussed above in connection with route identification unit 80). ETA calculation unit 82 may calculate each ETA based on the user's current location (e.g., as determined by GPS unit 56), the destination location, and an expected speed or speeds. Expected speeds may be determined using the user's current speed, and/or using speed limits or average travel times along an expected route between the user's current location and the destination (e.g., a route determined by route identification unit 80), for example. ETA calculation unit 82 may also use other factors when calculating ETA, such as an expected level of traffic along the expected route (e.g., as determined by traffic monitoring unit 84, discussed below), construction along the route, and/or accidents along the route, for example. ETA calculation unit 82 may calculate each ETA as a time of day (e.g., 4:15pm) and/or as a relative amount of time (e.g., 34 minutes), for example.
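
As a back-of-the-envelope illustration of the calculation described above, an ETA expressed as a relative amount of time is simply route distance divided by expected speed; the numbers below are assumptions chosen to match the "34 minutes" style of ETA mentioned in the text.

    // Rough relative ETA. A real implementation would sum per-segment distances
    // and expected speeds, and fold in traffic, construction and accident data;
    // this sketch assumes a single distance and a single expected speed.
    fun etaMinutes(routeKm: Double, expectedKmh: Double): Double =
        routeKm / expectedKmh * 60.0

    fun main() {
        println(etaMinutes(17.0, 30.0))  // 34.0 minutes
    }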

[0046] Traffic monitoring unit 84 may generally be configured to determine current traffic conditions along expected routes identified by route identification unit 80. To this end, traffic monitoring unit 84 may receive data indicative of the expected routes from route identification unit 80 and, for each route (or for each street along each route, etc.), generate a request for traffic conditions. Map server 16 may send the traffic condition requests to traffic server 20 via network 24, and in response receive the corresponding traffic condition data. Traffic server 20 may be a server associated with (e.g., owned or maintained by) a government agency or contracting third party, for example. The traffic condition data may be indicative of traffic density levels (e.g., "high," "moderate" or "low"), traffic delays (e.g., "17 minutes"), and/or other information indicative of the amount of traffic for a portion of a particular street. Traffic monitoring unit 84 may process the traffic condition data, and provide traffic level data (e.g., the traffic condition data itself, or information derived therefrom) to ETA calculation unit 82.

[0047] In some implementations, traffic monitoring unit 84 may cause map server 16 to send notifications of current traffic levels to respective mobile communications devices via network 24. In a scenario where map server 16 sends such a notification to mobile communications device 14, mapping/navigation application 62 may then cause mobile communications device 14 to send the notification, or another notification derived therefrom, to wearable electronic device 12 (via short range link 26). Traffic indication unit 44 of mapping/navigation application 42 may then process the notification, and cause user interface 32 to present an indication of the current traffic level to the user. Various techniques for indicating the current traffic level to the user are described in more detail below in connection with Figures 5A, 5B and 6.

[0048] Transit detection unit 90 may generally be configured to detect when certain transit options are relatively near to the current locations of users. For example, transit detection unit 90 may detect when a user is within a threshold distance of a known train station, subway station or bus station. When a known transit option is nearby, transit detection unit 90 may generate a notification of that proximity, and cause map server 16 to send the notification to the user's mobile communications device via network 24. For instance, if transit detection unit 90 detects that mobile communications device 14 is within a threshold distance of a particular subway station (e.g., using a location of mobile communications device 14 obtained by GPS unit 56 and a known location of the station), map server 16 may send mobile communications device 14 a notification that specifies the name (and/or other identifier) of the station, and possibly also a pinpoint location of the station (and/or a distance to the station, etc.). Mapping/navigation application 62 may then cause mobile communications device 14 to send the notification, or another notification derived therefrom, to wearable electronic device 12 (via short range link 26). Mapping/navigation application 42 of wearable electronic device 12 may then process the notification, and cause user interface 32 to present an indication of the nearby subway station to the user.

[0049] In some implementations, transit detection unit 90 only generates notifications for nearby stations, and/or particular lines at nearby stations, if users have previously "pinned" the station or line to show their interest. Transit pinning unit 92 may generally be configured to maintain a data record specifying which stations and/or lines have been pinned by each user. In some implementations, users can also specify additional requirements for notifications (e.g., pinning a line only when headed in a particular direction or to a particular destination, etc.). Transit pinning unit 92 may also modify the data record when a user "unpins" a pinned station or line (e.g., indicates that the station or line is no longer of interest). Various techniques for pinning and unpinning stations and/or lines (and/or lines headed in particular directions, etc.) are described in more detail below in connection with Figure 9.
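
The pinning data record kept by transit pinning unit 92 is likewise described only functionally. The sketch below shows one hypothetical shape for it, including the optional direction restriction mentioned above; all names are illustrative.

    // A user's pinned station or line, optionally restricted to a direction.
    data class TransitPin(val userId: String, val stationId: String,
                          val lineId: String? = null, val direction: String? = null)

    class TransitPinStore {
        private val pins = mutableSetOf<TransitPin>()
        fun pin(p: TransitPin) = pins.add(p)
        fun unpin(p: TransitPin) = pins.remove(p)  // "unpinning" a station or line
        fun pinsFor(userId: String): List<TransitPin> =
            pins.filter { it.userId == userId }
    }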

[0050] In some implementations, each of mapping unit 74, location sharing unit 76, route identification unit 80, ETA calculation unit 82, traffic monitoring unit 84, transit detection unit 90, and/or transit pinning unit 92 may be a component of software stored in memory 72 (or elsewhere) and executed by one or more processors (not shown in Figure 1) of map server 16 to perform the functions described herein. In some implementations, map server 16 includes more, fewer and/or different units than are shown in Figure 1. For example, map server 16 may include a positioning module that assists mobile communications devices such as mobile communications device 14 when the devices are attempting to self-locate. As another example, map server 16 may not include traffic monitoring unit 84, transit detection unit 90 and/or transit pinning unit 92.

[0051] Moreover, in some implementations, the components shown in wearable electronic device 12 and mobile communications device 14 of Figure 1, and/or the functionality of each, are distributed differently than shown in Figure 1 and described elsewhere herein. In some implementations, for example, wearable electronic device 12 includes additional components and functionality to reduce or remove its dependency upon mobile communications device 14. For instance, wearable electronic device 12 may include a GPS unit similar to GPS unit 56. As another example, network interface 34 may be similar to network interface 54, such that wearable electronic device 12 may communicate more directly with map server 16 via network 24. As yet another example, mapping/navigation application 42 may include substantially all of the functionality described below in connection with both mapping/navigation application 42 and mapping/navigation application 62.

[0052] Operation of the various components of the example system 10 is discussed in further detail below, according to various aspects and implementations. It is understood, however, that the techniques described below may instead be implemented in systems other than the example system 10.

Example techniques for providing location indicators for off-screen entities

[0053] Figures 2A and 2B depict an example user interface screen 100 that includes location indicators for off-screen entities, according to one implementation and scenario. User interface screen 100 may be a screen shown on a display of wearable electronic device 12 (e.g., a display of user interface 32). The outermost circle in broken lines in each of Figures 2A and 2B may illustrate the display panel on which the screen 100 is presented. While shown as a circular screen, other shapes are also possible (e.g., square, rectangle, octagon, etc.). In some implementations, user interface screen 100 is instead a screen shown on a display of mobile communications device 14 (e.g., a display of user interface 52), or another suitable electronic device.

[0054] As seen in Figure 2A, user interface screen 100 depicts a digital map 102 of a geographic area centered on a current location of a "monitoring" user, which corresponds to a monitoring user location indicator 104. The current user location may be a location of mobile communications device 14 as determined by GPS unit 56 (e.g., assuming devices 12 and 14 of Figure 1 are both on the user's person and therefore co-located), or a location of wearable electronic device 12 as determined by a GPS unit therein (not shown in Figure 1), for example. Mobile communications device 14 and/or wearable electronic device 12 may generate the digital map 102 based on map tile data that mobile communications device 14 or wearable electronic device 12 received from map server 16 in response to sending data indicative of the current user location to the map server 16, for example. In one implementation, mobile communications device 14 determines the current user location using GPS unit 56, sends the current user location to map server 16 via network 24, receives map tile data from map server 16 via network 24 in response, and then provides the map tile data (or other map data generated based on the map tile data) to wearable electronic device 12 via short range link 26. Processor 30 of wearable electronic device 12 may then process the received map data to present the user interface screen 100 to the user via a display of user interface 32.

[0055] In one implementation, user interface screen 100 also displays a sharing user location indicator for each sharing user who is within, or sufficiently nearby, the geographic area of digital map 102 (e.g., within 1 mile of the location corresponding to monitoring user location indicator 104, or within 0.5 miles of the outer perimeter of the displayed geographic area, etc.). Location sharing unit 76 of map server 16 may determine which sharing users should be assigned a location indicator and, for each such indicator, generate data that may be used by mobile communications device 14 and/or wearable electronic device 12 to determine the appropriate placement and appearance of the indicator.

[0056] To determine the appropriate number of sharing user location indicators, location sharing unit 76 may compare the current location of the monitoring user (corresponding to monitoring user location indicator 104) to each sharing user's current location (if known), and assign a location indicator to each sharing user who is within a threshold distance of the monitoring user's current location. As an example, location sharing unit 76 may determine that a first sharing user is 250 meters from the monitoring user, that a second sharing user is 450 meters from the monitoring user, and that all other sharing users (if any) are more than a threshold distance of 1 kilometer away from the monitoring user (or have an unknown location). In response, location sharing unit 76 may determine that location indicators are needed only for the first and second sharing users.
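
The selection logic just described can be sketched directly; the distances and the 1 kilometer threshold below mirror the example in the preceding paragraph, while the type and function names are hypothetical.

    // Keep only sharing users within the threshold distance of the monitoring
    // user; users beyond the threshold, or with unknown locations, get no indicator.
    data class Candidate(val name: String, val distanceMeters: Double?)

    fun needIndicators(candidates: List<Candidate>,
                       thresholdMeters: Double = 1000.0): List<Candidate> =
        candidates.filter { (it.distanceMeters ?: Double.MAX_VALUE) <= thresholdMeters }

    fun main() {
        val sharers = listOf(Candidate("first", 250.0), Candidate("second", 450.0),
                             Candidate("third", 1800.0), Candidate("unknown", null))
        println(needIndicators(sharers).map { it.name })  // [first, second]
    }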

[0057] In some implementations, location sharing unit 76 also determines a direction of each sharing user relative to the monitoring user. The direction may be expressed as a compass bearing (e.g., degrees/minutes/seconds), for example. Alternatively, the direction may be ascertained from the previously determined distance information. For example, if location sharing unit 76 determined that a first sharing user is 30 meters south and 240 meters west of the monitoring user, it can be determined that the first sharing user is 7.125 degrees south of due west relative to the monitoring user.
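
The worked example above follows from elementary trigonometry: atan(30/240) is approximately 7.125 degrees. A one-function Kotlin sketch, with the axis convention and function name assumed purely for illustration:

    import kotlin.math.atan2

    // Angle south of due west for a sharer offset the given distances from the
    // monitoring user (both offsets positive, per the example above).
    fun degreesSouthOfWest(southMeters: Double, westMeters: Double): Double =
        Math.toDegrees(atan2(southMeters, westMeters))

    fun main() {
        println(degreesSouthOfWest(30.0, 240.0))  // ~7.125
    }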

[0058] Map server 16 may then send to mobile communications device 14 data that specifies (or from which can be derived) the distance and direction of each sharing user. Mobile communications device 14 may then forward the received data (or other data derived therefrom) to wearable electronic device 12 via short range link 26. At wearable electronic device 12, off-screen display unit 46 may analyze the received data to determine the appropriate placement and size of each sharing user location indicator. In particular, off-screen display unit 46 may use directional data (e.g., a compass bearing, or x and y Cartesian distances, etc.) to determine a placement of each sharing user location indicator relative to monitoring user location indicator 104, and distance data (e.g., the distance itself, or x and y Cartesian distances, etc.) to determine a size of each sharing user location indicator.
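The sketch below illustrates one possible form of these calculations: deriving a compass bearing from Cartesian offsets (reproducing the 7.125-degree example of paragraph [0057]) and converting a bearing and distance into a perimeter position and indicator size. All parameter values and function names are hypothetical.

```python
import math

def bearing_deg(dx_east_m, dy_north_m):
    """Compass bearing (degrees clockwise from north) of an offset.

    For the example of paragraph [0057] -- 240 m west and 30 m south of
    the monitoring user -- dx_east_m = -240 and dy_north_m = -30, giving
    about 262.875 degrees, i.e. 7.125 degrees south of due west.
    """
    return math.degrees(math.atan2(dx_east_m, dy_north_m)) % 360

def indicator_pose(distance_m, bearing, map_radius_px=120,
                   max_size_px=24, min_size_px=8, max_distance_m=1_000):
    """Return ((x_px, y_px), diameter_px) for one off-screen indicator.

    Coordinates: origin at the map center, +x right, +y toward the top
    of the screen. The indicator is placed just inside the outer
    perimeter in the real-world direction of the sharing user, and its
    diameter shrinks monotonically as the distance grows (matching
    Figure 2A, where the nearest user's indicator 106A is largest).
    """
    theta = math.radians(bearing)
    radius = map_radius_px - max_size_px / 2  # stay inside the perimeter
    x, y = radius * math.sin(theta), radius * math.cos(theta)
    fraction = min(distance_m / max_distance_m, 1.0)
    diameter = max_size_px - fraction * (max_size_px - min_size_px)
    return (x, y), diameter
```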

[0059] The direction in which the location indicator of a particular sharing user is offset from the map center (and/or from the monitoring user location indicator 104) corresponds to the direction in which the current location of that sharing user is offset from the current location of the monitoring user in the real world. Put differently, the direction in which a particular sharing user location indicator is offset from the map center may be the same as the direction in which that sharing user location indicator would be offset from the map center if the digital map 102 were enlarged enough (without changing zoom level) to show the precise, pin-point location of the sharing user.

[0060] In the example scenario of Figure 2A, no sharing users are currently within the geographic area represented by digital map 102, but three sharing users are currently within a threshold distance of the monitoring user's location. Thus, user interface screen 100 includes three location indicators 106A, 106B and 106C, each corresponding to one of the three nearby, sharing users. The size of each sharing user location indicator may vary monotonically with the distance from the monitoring user; in this example, indicator size decreases as distance increases. Accordingly, location indicator 106A is larger than location indicator 106B, and location indicator 106B is larger than location indicator 106C, to quickly indicate to the viewer (monitoring user) that the sharing user for location indicator 106C is furthest away from the monitoring user, and that the sharing user for location indicator 106A is nearest to the monitoring user. In some implementations, textual information is displayed as well for each nearby sharing user (e.g., textual bearing and/or distance information within or next to each of indicators 106A, 106B and/or 106C). Preferably, however, none of location indicators 106A through 106C includes any textual information, in order to reduce the amount of visual "clutter" appearing on the user interface screen 100.

[0061] While Figure 2A shows an implementation in which each of location indicators 106A, 106B and 106C is a circle, other shapes, images and/or icons may instead be used. Moreover, while Figure 2A shows an implementation in which indicator size is varied to reflect off-screen distance, other visual properties may instead be varied based on distance. For example, color, shading, pattern, shape, icon image, and/or other visual properties may be varied to indicate distance. Moreover, while Figure 2A shows an implementation in which each of location indicators 106A, 106B and 106C is displayed immediately adjacent to, and within, the outer perimeter of digital map 102, other arrangements are possible. For example, each of location indicators 106A, 106B and 106C may be displayed slightly inside and apart from the outer perimeter or, if the digital map 102 does not occupy the entire user interface screen 100, just outside the outer perimeter, etc.

[0062] Figure 2B shows user interface screen 100 and digital map 102 at a later time, after each of the three sharing users has moved but the monitoring user has remained more or less stationary. As seen in Figure 2B, the sharing user corresponding to location indicator 106C has moved further away from the monitoring user, and therefore location indicator 106C has become smaller. Conversely, the sharing user corresponding to location indicator 106A has moved closer, and therefore the location indicator 106A has increased in size. Moreover, an image (e.g., facial photograph) or avatar of the corresponding sharing user may appear within the location indicator 106A. For example, off-screen display unit 46 may determine whether nearby sharing users are within another, smaller threshold distance of the monitoring user and, if so, generate location indicators that include the image or avatar for the respective sharing user.

[0063] As is also seen in Figure 2B, the sharing user corresponding to location indicator 106B has moved into the geographic area represented by digital map 102, and therefore location indicator 106B is no longer at the outer perimeter of digital map 102. Like location indicator 106A, therefore, location indicator 106B includes an image or avatar of the corresponding sharing user. In some implementations, images or avatars are only shown for sharing users whose location indicators are entirely within the area represented by digital map 102. In still other implementations, no images or avatars are shown for any sharing users, regardless of location or proximity.

[0064] In some implementations, digital map 102 and each of location indicators 106A through 106C rotates about the center point of digital map 102 in a synchronized manner as the monitoring user changes his or her direction or heading. If the monitoring user was facing towards the point represented by the top of digital map 102 (when viewing Figure 2A in the conventional orientation), for example, and turns left by 30 degrees, orientation sensor(s) 36 may detect the change in direction and cause both the digital map 102 and location indicators 106A through 106C to rotate clockwise by 30 degrees. In other implementations, such rotation only occurs when map server 16 (e.g., using location data from GPS unit 56) detects that the monitoring user has traveled in a new direction for at least some threshold distance, or in another suitable manner.
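Such synchronized rotation reduces to applying the same clockwise rotation to the map rendering and to the screen coordinates of every indicator. A minimal sketch, assuming the coordinate convention of the earlier example (origin at the map center, +y toward the top of the screen):

```python
import math

def rotate_cw(x, y, degrees):
    """Rotate a screen point clockwise about the map center.

    With the user's heading rendered toward the top of the screen, a
    30-degree left turn by the user means rotating the map and every
    indicator clockwise by 30 degrees, as in paragraph [0064].
    """
    t = math.radians(degrees)
    return (x * math.cos(t) + y * math.sin(t),
            -x * math.sin(t) + y * math.cos(t))
```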

[0065] In various implementations, the functionality described above may be incorporated in different components of system 10. For example, location sharing unit 76 of map server 16 or mapping/navigation application 62 of mobile communications device 14 may instead determine the appropriate size and/or placement of each sharing user location indicator within digital map 102, in which case wearable electronic device 12 may simply display the location indicators 106A through 106C in accordance with that determination. As another example, off-screen display unit 46 may receive current locations of all sharing users from map server 16 (either directly, or via mobile communications device 14), and from that information determine which location indicators are needed.

[0066] In an alternative implementation, one or more of the off-screen indicators 106A through 106C is/are instead used to indicate off-screen locations of stationary, non-person entities, such as places of interest (e.g., restaurants, bars, museums, theaters, etc., and/or a home and/or other address associated with a user). For example, location sharing unit 76 may determine that a location indicator is needed for each place on a predetermined list of places (e.g., places previously pinned or otherwise selected by the user) so long as that place is within a threshold distance of the monitoring user, even if the place is not yet in the area represented by digital map 102. In one such implementation, an image (e.g., place photograph) and/or other information (e.g., place name and/or website link) is added to each indicator that is within a second, smaller threshold distance of the monitoring user.

[0067] An example method 200 for displaying location indicators for one or more off-screen entities to a user of an electronic device (e.g., wearable electronic device 12 or mobile communications device 14 of Figure 1) is discussed next with reference to Figure 3, according to one implementation. The method 200 may be implemented as instructions stored on a computer-readable medium and executed by one or more processors. With reference to Figure 1, for example, the method 200 may be implemented using mapping/navigation application 42 (e.g., off-screen display unit 46) of wearable electronic device 12, or mapping/navigation application 62 of mobile communications device 14.

[0068] At block 202, map data is received from a remote server (e.g., map server 16 of Figure 1). The map data may be received directly from the remote server (e.g., via network 24), or may be received via another electronic device associated with the user (e.g., from mobile communications device 14 via short range link 26, after mobile communications device 14 received the map data from map server 16 via network 24). The map data represents an area that includes a current location of the user of the electronic device. The map data may be received in response to a request for map data that was sent by the electronic device implementing the method 200, or by another electronic device of the user, to the remote server.

[0069] At block 204, shared location data is received from the remote server. The shared location data may be received in the same manner as the map data (e.g., directly or via another electronic device of the user), for example. The shared location data corresponds to the current location of a first entity that is not within the area represented by the map data. For example, the shared location data may include data specifying geographic coordinates of the current location of the first entity, or data specifying the position on the digital map at which a location indicator of the first entity should be placed. In some implementations, the shared location data may include additional data as well, such as data specifying a visual property (e.g., a particular size, color, shade and/or shape) to be used for a location indicator of the first entity. For example, the first entity may be a person (e.g., a pre-configured "contact") who previously agreed to share his or her location with the user, or a nearby "point of interest" such as a restaurant, a pinned/tagged street address (e.g., the user's home), or another non-mobile place.

[0070] At block 206, the map data received at block 202 is used to present, on a display of the electronic device, a digital map of the area represented by the map data. The digital map has an outer perimeter (e.g., along the perimeter of a circular or rectangular display on a smart watch, or along the rectangular perimeter of a smartphone display, etc.) and a map center (e.g., a point aligned with, or near to, the center of the display). In some implementations, an icon or other marker at the map center corresponds to the current location of the user.

[0071] At block 208, the shared location data received at block 204 is used to present, on the display of the electronic device of the user, a first location indicator corresponding to the current location of the first entity. The first location indicator (e.g., a circular or square icon, etc.) is displayed contemporaneously with the digital map of the area represented by the map data. Block 208 may include multiple operations. For example, as seen in Figure 3, block 208 may include assigning the first location indicator a visual property indicative of a distance between the current location of the first entity and the current location of the user (block 208A), and positioning the first location indicator, with the assigned visual property, at an offset distance from the map center, and in a direction from the map center that corresponds to the direction of the current location of the first entity relative to the current location of the user (block 208B). The visual property does not include any text specifying a distance, but rather is a particular size, color, shading, shape, or other visual property that allows the user to quickly ascertain the approximate distance of the first entity without necessarily reading any text. In some implementations, however, other information is assigned, in addition to the visual property and for display in connection with the first location indicator. For example, the distance between the current location of the first entity and the current location of the user, or the distance between the current location of the first entity and a location corresponding to a point along the outer perimeter of the digital map, etc., may be displayed on or next to the first location indicator.

[0072] If the shared location data received at block 204 included data specifying a visual property (e.g., by including a known field value corresponding to a particular size or other visual property, or by explicitly stating the size or other visual property, etc.), the assigned visual property may simply be the one that is so specified. If the shared location data did not specify a visual property, but did specify geographic coordinates of the first entity, the assigned visual property may be determined by using the specified coordinates to calculate a distance from the monitoring user (or a distance from an outer perimeter of the digital map, etc.), and then locally selecting a visual property that corresponds to the calculated distance.

[0073] Similarly, if the shared location data received at block 204 included data specifying the position on the digital map at which a location indicator of the first entity should be placed, the first location indicator may simply be positioned at the specified position. As another example, if the shared location data specified a direction in which the location indicator of the first entity should be offset from the location indicator of the user, the first location indicator may be positioned such that it is offset from the map center in the specified direction. As yet another example, if the shared location data did not specify a position on the digital map or a direction, but did specify geographic coordinates of the first entity, the position may be determined by locally using the specified coordinates to calculate a direction in which the first location indicator is to be offset from the map center.
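The branching described in paragraphs [0072] and [0073] can be summarized in a few lines. In the sketch below, the dict keys and the helper parameters `distance_fn` and `property_for_distance` are hypothetical stand-ins for whatever representation the shared location data and the device's own helpers actually use.

```python
def choose_visual_property(shared, user_latlon, distance_fn,
                           property_for_distance):
    """Resolve an indicator's visual property per paragraph [0072].

    `shared` may carry an explicit 'visual_property', or 'coords' as a
    (lat, lon) tuple from which a distance-based property is derived
    locally.
    """
    if 'visual_property' in shared:
        # The server already chose a size/color/shade -- use it as-is.
        return shared['visual_property']
    # Otherwise compute the distance locally and map it to a property.
    distance = distance_fn(user_latlon, shared['coords'])
    return property_for_distance(distance)
```

For instance, `property_for_distance` might be as simple as `lambda d: 'small' if d > 500 else 'large'`, with the same pattern applied to resolve the indicator's position from either an explicit screen position, a specified direction, or raw coordinates.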

[0074] The offset distance from the map center may be set such that the first location indicator is closer to the outer perimeter of the digital map than to the map center. For example, the first location indicator may be positioned immediately adjacent to, and within, the outer perimeter, as shown for each of the sharing user location indicators 106A through 106C in Figure 2A. As other examples, the first location indicator may be just outside or partially outside of the outer perimeter (if the digital map does not fill the entire display/screen area), or within the digital map with a small offset/gap from the outer perimeter, etc.

[0075] The method 200 may also include one or more additional blocks not shown in Figure 3. For example, the method 200 may include an additional block at which both the digital map and the first location indicator are caused to rotate on the display, in a synchronized manner, as the directional orientation of the electronic device changes. As another example, in one implementation where the assigned visual property is a particular size of the first location indicator, the method 200 may include a first additional block at which updated shared location data, corresponding to a new current location of the first entity, is received from the remote server, and a second additional block at which the updated shared location data is used to present, on the display, an updated first location indicator corresponding to the new current location of the first entity. The second additional block may include, for example, assigning the updated first location indicator an increased (or decreased) size that is indicative of a new, larger (or smaller) distance measured between the new current location of the first entity and either the current location of the user or a new current location of the user. The second additional block may also include positioning the updated first location indicator, with the increased (or decreased) size, in a new direction corresponding to a direction of the new current location of the first entity relative to either the current location of the user or the new current location of the user. In one such implementation, the updated first location indicator differs by including a photographic image or avatar of the first entity (e.g., a same image or avatar that appears next to the first entity when viewed in a "contacts" list, if the first entity is a person).

[0076] As yet another example, the method 200 may include a first additional block at which additional shared location data, corresponding to a current location of a second entity that is also outside of the area represented by the map data, is received from the remote server, and a second additional block at which the additional shared location data is used to present on the display, contemporaneously with both the digital map of the area represented by the map data and the first location indicator, a second location indicator corresponding to the current location of the second entity. The second additional block may include assigning the second location indicator a second visual property indicative of a distance between the current location of the second entity and the current location of the user, and positioning the second location indicator (with the second visual property) in a direction from the map center that corresponds to a direction of the current location of the second entity relative to the current location of the user.

[0077] Using the techniques described above and shown in Figures 2A, 2B and 3, a user may be provided with useful location information for off-screen entities. For example, while the monitoring user may not learn the precise location of a particular sharing user, the monitoring user may nonetheless learn enough about the sharing user's location to call the sharing user and ask what he or she is doing, or to head towards and meet the sharing user, etc. Moreover, adjusting visual properties of indicators for off-screen sharing users, rather than using other techniques (e.g., requiring the user to zoom out, etc.), may make more effective use of the limited real estate available on electronic devices with relatively small displays (e.g., smart watches), and/or may require less user interaction with the display screen.

Example techniques for providing indications of recent user locations

[0078] Figures 4A and 4B depict an example user interface screen 300 that includes age-dependent indicators of recent locations of a user, according to one implementation and scenario. User interface screen 300 may be a screen shown on a display of wearable electronic device 12 (e.g., a display of user interface 32). The outermost circle in broken lines in each of Figures 4A and 4B may illustrate the display panel on which the screen 300 is presented. While shown as a circular screen, other shapes are also possible (e.g., square, rectangle, octagon, etc.). In some implementations, user interface screen 300 is instead a screen shown on a display of mobile communications device 14 (e.g., a display of user interface 52), or another suitable electronic device.

[0079] As seen in Figure 4A, user interface screen 300 depicts a digital map 302 of a geographic area centered on a current location of a user, which corresponds to a current location indicator 304. The current user location may be a location of mobile communications device 14 as determined by GPS unit 56 (e.g., assuming devices 12 and 14 of Figure 1 are both on the user's person and therefore co-located), or a location of wearable electronic device 12 as determined by a GPS unit therein (not shown in Figure 1), for example. Mobile communications device 14 and/or wearable electronic device 12 may generate the digital map 302 based on map tile data that mobile communications device 14 or wearable electronic device 12 received from map server 16 in response to sending data indicative of the current user location to the map server 16, for example. In one implementation, mobile communications device 14 determines the current user location using GPS unit 56, sends the current user location to map server 16 via network 24, receives map tile data from map server 16 via network 24 in response, and then provides the map tile data (or other map data generated based on the map tile data) to wearable electronic device 12 via short range link 26. Processor 30 of wearable electronic device 12 may then process the received map data to present the user interface screen 300 to the user via a display of user interface 32.

[0080] In one implementation, user interface screen 300 also displays a number of past location indicators 306, corresponding to locations of the user at a number of respective times in the recent past. While Figure 4A shows only four past location indicators 306A through 306D, more or fewer (e.g., none) may be shown at any given time depending on the zoom level, the user's traveling speed, and the rate at which indicators 306 are generated and/or disappear, for example. Each of the past location indicators 306 is associated with a visual property that indicates the age of the indicator (e.g., the amount of time since the user was at the location corresponding to that point on the digital map 302). The visual property is represented in Figure 4A as a density of horizontal lines within the respective indicator, with a lower density corresponding to an older location of the user. In practice, however, the different horizontal line densities may correspond to different patterns (e.g., other than parallel lines), different shadings (e.g., a lighter shading corresponding to an older user location), different colors (e.g., a lower, or higher, frequency in the visible spectrum corresponding to an older user location), different transparency levels (e.g., more transparency, such that the digital map 302 may be seen more clearly, corresponding to an older user location), and/or any other suitable visual properties that do not expressly specify (i.e., in text) the age of the respective indicators. In some implementations, past location indicators expire (i.e., disappear entirely from the digital map 302, even if positioned at a location still shown on digital map 302) when the indicators pass a certain age. If the visual property is a shading or a transparency, for example, each of past location indicators 306 may become lighter and/or clearer over time until the indicator can no longer be seen at all.

[0081] In some implementations, new past location indicators are generated and displayed on digital map 302 on a periodic time basis (e.g., every 30 seconds since the previous indicator was generated, etc.). In other implementations, new past location indicators are generated and displayed on digital map 302 each time some other criteria related to age, or potentially related to age, is met. For example, a new past location indicator may be generated and displayed each time the user has moved at least 50 feet from the location of the most recent past location indicator. In still other implementations, more complex criteria are used, such as generating and displaying a new past location indicator each time that (1) at least 30 seconds have passed since the most recent indicator, and (2) the user has moved at least 20 feet from the location corresponding to the most recent indicator. Each new past location indicator may be displayed on digital map 302 as soon as the user has moved to or past the corresponding location or, alternatively, only after the user has moved some threshold distance away from that location (and/or only after some threshold amount of time has passed since the user arrived at that location, etc.).
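One way to implement the compound criterion above is sketched below. Positions are assumed to be in a local planar frame in meters, the thresholds (30 seconds, 20 feet ≈ 6.1 meters, and a 10-minute expiry for the fade) are illustrative values drawn from this description, and the class and method names are hypothetical.

```python
import math

class BreadcrumbTrail:
    """Drops a new past location indicator when BOTH criteria of
    paragraph [0081] are met: at least `min_interval_s` since the most
    recent indicator AND at least `min_move_m` of movement since it."""

    def __init__(self, min_interval_s=30.0, min_move_m=6.1):
        self.min_interval_s = min_interval_s
        self.min_move_m = min_move_m
        self.crumbs = []  # chronological list of (timestamp_s, (x_m, y_m))

    def update(self, now_s, pos_m):
        """Record a new crumb if warranted; return True if one was added."""
        if not self.crumbs:
            self.crumbs.append((now_s, pos_m))
            return True
        last_t, last_p = self.crumbs[-1]
        if (now_s - last_t >= self.min_interval_s
                and math.dist(last_p, pos_m) >= self.min_move_m):
            self.crumbs.append((now_s, pos_m))
            return True
        return False

    @staticmethod
    def opacity(crumb_time_s, now_s, expiry_s=600.0):
        """Age-dependent visual property: fade linearly until expiry."""
        return max(0.0, 1.0 - (now_s - crumb_time_s) / expiry_s)
```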

[0082] As seen by the densities of horizontal lines in Figure 4A, past location indicator 306A corresponds to the oldest user location, past location indicator 306B corresponds to the second oldest user location, past location indicator 306C corresponds to the third oldest user location, and past location indicator 306D corresponds to the fourth oldest user location. Figure 4B shows a scenario where the user has traveled further along the street. While past location indicators 306A and 306B have moved entirely off the digital map 302 and are therefore no longer displayed, the horizontal line densities of past location indicators 306C and 306D have been reduced in Figure 4B to illustrate that the visual property of each has changed to reflect an older location. Moreover, a new past location indicator 306E has appeared along the user's route due to the time and/or distance since the user was at the location corresponding to past location indicator 306D.

[0083] In some implementations, the number of past location indicators shown, and/or the amount of time and/or distance between each past location indicator, may change depending upon the zoom level (e.g., as set by the user). If a user viewing user interface screen 300 (as seen in Figure 4A) were to zoom out substantially, for example, some of the past location indicators 306A through 306D might cluster together and overlap each other. Merely decreasing the size of each indicator 306 might make the indicators hard to see. Thus, indicators 306A and 306B may instead, in some implementations, be combined to form a single indicator, indicators 306C and 306D might be combined to form a single indicator, indicators 306A through 306D might be combined to form a single indicator, etc. In some implementations, the number of indicators to be combined depends directly or indirectly on the zoom level and/or the distance(s) between past location indicators. In some implementations, the shading, transparency, or other visual property used to indicate the age of a combined indicator is an average of the visual properties of all indicators that have been combined, and/or the position of the combined indicator is a median position calculated from all indicators that have been combined. In other implementations, past location indicators are calculated anew at each zoom level, using new, zoom-level-dependent times and/or distances between locations (and a log of past locations) to determine the position and age-dependent visual property of each past location indicator.
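A greedy merge of this kind might look like the following sketch, which operates on screen-space indicators at the current zoom level and applies the mean-age/median-position rule described above; the tuple layout and separation threshold are illustrative assumptions.

```python
import statistics

def combine_crumbs(crumbs, min_sep_px=10.0):
    """Merge past location indicators that would overlap when zoomed out.

    `crumbs` is a chronological list of (x_px, y_px, age_s) tuples.
    Consecutive indicators closer than `min_sep_px` are grouped; each
    group collapses to a single indicator whose position is the
    member-wise median and whose age property is the mean, per
    paragraph [0083].
    """
    if not crumbs:
        return []
    merged, group = [], [crumbs[0]]
    for crumb in crumbs[1:]:
        gx, gy, _ = group[-1]
        if ((crumb[0] - gx) ** 2 + (crumb[1] - gy) ** 2) ** 0.5 < min_sep_px:
            group.append(crumb)
        else:
            merged.append(_collapse(group))
            group = [crumb]
    merged.append(_collapse(group))
    return merged

def _collapse(group):
    xs, ys, ages = zip(*group)
    return (statistics.median(xs), statistics.median(ys),
            sum(ages) / len(ages))
```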

[0084] Using the techniques described above and shown in Figures 4A and 4B, a user can essentially leave a trail of digital "breadcrumbs" that allow the user to retrace his or her steps, or to realize when he or she is "traveling in circles," etc. Moreover, adjusting visual properties of location indicators at pinpoint locations on a digital map may make effective use of the limited real estate available on electronic devices with relatively small displays (e.g., smart watches), and avoid overcrowding of the displays.

Example techniques for providing indications of traffic levels along an expected user route

[0085] Figures 5A and 5B depict alternative user interface screens that may be varied to indicate current traffic levels along an expected route of a user, according to two different implementations. The user interface screens of Figures 5A and 5B may be screens shown on a display of wearable electronic device 12 (e.g., a display of user interface 32). While shown as circular screens, other shapes are also possible (e.g., square, rectangle, octagon, etc.). In some implementations, each of the user interface screens of Figures 5A and 5B is instead a screen shown on a display of mobile communications device 14 (e.g., a display of user interface 52), or another suitable electronic device.

[0086] As seen in Figure 5A, a user interface screen 320 generally depicts information that may be useful to a user when navigating to a particular destination. Specifically, in the implementation shown, user interface screen 320 includes a transit mode icon 322 set against a background area 324, a current time 326, a relative estimated time of arrival (ETA) 330A, an absolute ETA 330B, and a distance 332. Transit mode icon 322 may indicate a mode of travel that was selected by the user, or a mode of travel that was automatically detected by map server 16 of Figure 1 based on time/location data generated by GPS unit 56 and received from mobile communications device 14, for example. In the example scenario of Figure 5A, the transit mode is a driving mode, and therefore transit mode icon 322 depicts a stylized automobile.

[0087] The relative ETA 330A specifies a predicted amount of time for the user to arrive at his or her intended destination. The relative ETA 330A may be calculated by ETA calculation unit 82 of Figure 1, based on the user's destination (e.g., as entered or selected via an application such as mapping/navigation application 42 or mapping/navigation application 62), expected route (e.g., as determined by route identification unit 80 of Figure 1), and/or current location (e.g., as determined by GPS unit 56). Still other types of information may also, or instead, be used to calculate the relative ETA 330A, such as the user's speed, traffic levels along the expected route (e.g., as determined by traffic monitoring unit 84), and/or other information.

[0088] The absolute ETA 330B is the time of day at which arrival at the destination is expected (e.g., the current time 326 plus the relative ETA 330A). The distance 332 is the distance to the user's destination, relative to the user's current location and in view of the expected route. In various other implementations, user interface screen 320 includes more information (e.g., text specifying the destination), less information (e.g., no absolute ETA or no relative ETA, no distance, no transit mode indicator, etc.), and/or a different layout of information.

[0089] In addition to the textual information shown in Figure 5A, and in addition to transit mode icon 322, user interface screen 320 may indicate the current level of traffic between the user's current location and the expected destination and along the user's expected route (e.g., a traffic level as determined by traffic monitoring unit 84, discussed above in connection with Figure 1). In particular, at least a portion of the user interface screen 320 may change color to reflect the current traffic level. In one implementation, for example, at least a portion of user interface screen 320 turns green for light or no traffic, yellow or orange for moderate traffic, and red for heavy traffic. In other implementations, other colors may be used to represent each traffic level, and/or fewer or more than three different traffic levels may be represented (e.g., only red or green for relatively light versus relatively heavy traffic, or a continuous adjustment of visible frequencies between yellow and red to reflect the traffic level more finely, etc.). Moreover, in some implementations, a different color (e.g., blue) may be used to represent the case where traffic data does not apply (e.g., if the transit mode icon 322 indicates the user is walking or riding on a train, etc.).

[0090] The portions of user interface screen 320 that change to indicate the current traffic level, and the manner of the change, may vary according to different implementations. In one implementation, for example, the portion of circular transit mode icon 322 that surrounds the stylized automobile graphic (but not the automobile graphic itself) turns a moderate shade of the appropriate color, while the background area 324 turns a darker shade of the same color (e.g., dark green for light or no traffic, dark red for heavy traffic, etc.). This is represented in Figure 5A by showing different dot densities for different shades of the same color. As a more specific example, a specific green color may be applied to the portion of transit mode icon 322 that surrounds the stylized automobile graphic, and the same color may be applied to the background area 324 but with a 70% black overlay to darken the green color. In other implementations, either the transit mode icon 322 or the background area 324 does not change color to indicate traffic level.
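A minimal sketch of this color coding follows. The RGB values and the mapping of levels to colors are invented for illustration, and the 70% black overlay mirrors the example above.

```python
TRAFFIC_COLORS = {            # illustrative palette per paragraph [0089]
    'light':    (0x34, 0xA8, 0x53),   # green: light or no traffic
    'moderate': (0xFB, 0xBC, 0x05),   # yellow/orange: moderate traffic
    'heavy':    (0xEA, 0x43, 0x35),   # red: heavy traffic
    'n/a':      (0x42, 0x85, 0xF4),   # blue: traffic data does not apply
}

def darken(rgb, black_overlay=0.70):
    """Apply a black overlay, as for background area 324 in Figure 5A."""
    return tuple(round(channel * (1.0 - black_overlay)) for channel in rgb)

def screen_colors(traffic_level):
    """Return (icon_color, background_color) for a given traffic level."""
    base = TRAFFIC_COLORS[traffic_level]
    return base, darken(base)
```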

[0091] Another possible implementation is shown in Figure 5B, where a user interface screen 340 includes a transit mode icon 342 set against a background area 344, a current time area 346, a relative ETA 350A, an absolute ETA 350B, and a distance 352. Again, the dot density shown may represent the shade of a particular color. As seen in Figure 5B, the stylized automobile graphic within transit mode icon 342 and the background area 344 have the same dot density, representing the same color and shade, while the current time area 346 has a higher dot density to represent a darker shade of the same color. Again, other implementations are possible (e.g., using the same color and shade in current time area 346 as in the background area 344 and the automobile graphic of transit mode icon 342, etc.).

[0092] While user interface screen 320 of Figure 5A may provide a more aesthetically pleasing user interface, and may prolong battery life due to the darker shade of the relatively large background area 324, user interface screen 340 of Figure 5B may allow the user to more quickly ascertain the traffic level. This may be important, for example, if the user is driving and can only spare quick glances at his or her watch. In either of the depicted implementations, however, the user may be presented with information that is highly relevant to his or her arrival time. Moreover, both implementations may conserve the limited real estate available on the display of a wearable or other electronic device (e.g., compared to conventional techniques for showing traffic levels, such as presenting a full digital map with traffic levels indicated separately for each street segment).

[0093] Other implementations are also possible. In one implementation, for example, the background area (e.g., background area 324 of user interface screen 320 or background area 344 of user interface screen 340) may have a fixed color or shade (e.g., black or gray), and only a portion of a transit mode icon (e.g., the portion of icon 322 or icon 342 that surrounds the stylized automobile or other graphic) changes color to indicate traffic level. In still other implementations, one or more other non-textual visual properties, other than color, are used to indicate traffic level. For example, the dotted areas of user interface screen 320 or user interface screen 340 may indicate areas that change shading (rather than color) to indicate traffic level, or change a fill pattern to indicate traffic level, etc.

[0094] Reference is now made to Figure 1 to illustrate possible ways in which the example system 10 may implement the above techniques for indicating traffic levels. In one implementation, traffic monitoring unit 84 determines the current traffic level along the user's expected route (as identified by route identification unit 80), and map server 16 sends data indicating the determined traffic level to mobile communications device 14 via network 24. Mapping/navigation application 62 may then cause mobile communications device 14 to forward the traffic level data (or data derived therefrom) to wearable electronic device 12 via short range link 26. Traffic indication unit 44 of mapping/navigation application 42 may then select the appropriate color coding to indicate the current traffic level, and cause user interface 32 to display a user interface screen in accordance with the color coding (e.g., user interface screen 320 or user interface screen 340). In an alternative implementation, mapping/navigation application 62 includes traffic indication unit 44 and therefore determines the appropriate color coding. In this implementation, mapping/navigation application 62 causes mobile communications device 14 to send an indicator of the determined color coding to wearable electronic device 12 via short range link 26. In yet another implementation, map server 16 includes traffic indication unit 44 to determine the appropriate color coding, and sends an indicator of that color coding to wearable electronic device 12 either directly (if wearable electronic device 12 is capable of such communication), or via mobile communications device 14.

[0095] An example method 400 for displaying traffic-related information in an unobtrusive and space-efficient manner is discussed next with reference to Figure 6, according to one implementation. The method 400 may be implemented as instructions stored on a computer-readable medium and executed by one or more processors. With reference to Figure 1, for example, the method 400 may be implemented using mapping/navigation application 42 (e.g., traffic indication unit 44) of wearable electronic device 12, or mapping/navigation application 62 of mobile communications device 14.

[0096] At block 402, data indicative of an expected route of a user of an electronic device (e.g., wearable electronic device 12 or mobile communications device 14 of Figure 1) is transmitted to a remote server (e.g., to map server 16 of Figure 1, via network 24). The transmitted data may merely allow the remote server (e.g., route identification unit 80 of Figure 1) to identify the expected route, rather than explicitly specifying the route. For example, the data may indicate a current location of the user, and/or a destination that was either previously selected by the user (e.g., by selecting a previously "pinned" place), or previously entered by the user (e.g., as a street address), via a user interface screen presented on a display of the electronic device. In an alternative implementation, the data does not indicate the user's destination, and the remote server predicts the expected route based only on other information (e.g., current location, current direction of travel, and knowledge of roadways in the area).

[0097] At block 404, traffic data that is indicative of a current degree of traffic along at least a portion of the expected route of the user (e.g., between the user's current location and the destination, if provided to the remote server at block 402) is received from the remote server (e.g., via network 24 of Figure 1). The traffic data may specify a metric that measures traffic level (e.g., an average or maximum measured time delay, an estimated average number of vehicles per mile, an estimated average distance between vehicles, etc.), or a traffic level that is derived from such a measurement or observed by an individual (e.g., "low," "medium" or "high"), for example. Alternatively, the traffic data may directly specify a color or other visual property to be applied based on the current level of traffic along the expected route (e.g., "green," "yellow" or "red").

[0098] At block 406, a visual property is selected based on the received traffic data. The visual property is selected from among a plurality of visual properties corresponding to a plurality of respective traffic levels. If the traffic data received at block 404 directly specified the visual property, the selection may simply involve selecting the specified visual property. If not, the selection may involve accessing a table or other data structure to determine which visual property corresponds to the information (measurement, level, etc.) contained in the received traffic data.

[0099] Each of the visual properties available for selection may include a respective color (e.g., green for light or no traffic, yellow or orange for moderate traffic, and red for heavy traffic), a respective shade (e.g., a light shade for light or no traffic, a medium shade for moderate traffic, and a dark shade for heavy traffic) and/or a respective pattern (e.g., a more cluttered/dense background pattern as traffic becomes heavier). In some implementations, the plurality of visual properties includes a very large number of possible properties (e.g., tens or hundreds of different colors and/or shades), such that a virtually continuous range of different traffic levels may be indicated.
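As a sketch of block 406, assuming the traffic data arrives as a simple dict, the lookup might resemble the following; the field names and the delay thresholds are invented for illustration.

```python
# Illustrative thresholds: average delay (seconds per mile) -> level.
DELAY_THRESHOLDS = [(60.0, 'heavy'), (20.0, 'moderate'), (0.0, 'light')]

def select_visual_property(traffic_data):
    """Block 406: choose a visual property from received traffic data.

    If the server directly specified a color, use it as-is; otherwise
    map a measured delay onto a level via the table above.
    """
    if 'color' in traffic_data:
        return traffic_data['color']
    delay = traffic_data['avg_delay_s_per_mile']
    for threshold, level in DELAY_THRESHOLDS:
        if delay >= threshold:
            return level
    return 'light'  # unreachable with a 0.0 floor; kept for safety
```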

[00100] At block 408, a display of the electronic device (e.g., a touchscreen display or other display of user interface 32 or user interface 52 of Figure 1) is caused to present a user interface screen to the user, with at least a portion of the user interface screen having the selected visual property. For example, the user interface screen may have an appearance similar to any of the various implementations described above in connection with Figures 5A and 5B. In at least some implementations, the presented user interface screen does not include a digital map representing any portion of the expected route of the user.

[00101] The method 400 may also include one or more additional blocks not shown in Figure 6. For example, the method 400 may include a first additional block at which updated traffic data, indicative of a new current degree of traffic along at least the same portion of the expected user route, is received from the remote server, a second additional block at which a new visual property is selected from among the available visual properties based on the updated traffic data, and a third additional block at which the display of the electronic device is caused to present an updated user interface screen to the user, with at least a portion of the updated user interface screen having the newly selected visual property (e.g., a different color to reflect a different degree/level of traffic).

Example techniques for setting an initial zoom level based on degree of location uncertainty

[00102] Figure 7 depicts an example user interface screen 500 that provides an indication of an area of uncertainty with respect to the location of a user, according to one implementation and scenario. The user interface screen 500 may be a screen shown on a display of wearable electronic device 12 (e.g., a display of user interface 32). The outermost circle in broken lines in Figure 7 may illustrate the display panel on which the screen 500 is presented. While shown as a circular screen, other shapes are also possible (e.g., square, rectangle, octagon, etc.). In some implementations, user interface screen 500 is instead a screen shown on a display of mobile communications device 14 (e.g., a display of user interface 52), or another suitable electronic device.

[00103] As seen in Figure 7, user interface screen 500 depicts a digital map 502 of a geographic area centered on a current location of a user, which corresponds to a location indicator 504. The current user location may be a location of mobile communications device 14 as determined by GPS unit 56 (e.g., assuming devices 12 and 14 of Figure 1 are both on the user's person and therefore co-located), or a location of wearable electronic device 12 as determined by a GPS unit therein (not shown in Figure 1), for example. Mobile communications device 14 and/or wearable electronic device 12 may generate the digital map 502 based on map tile data that mobile communications device 14 or wearable electronic device 12 received from map server 16 in response to sending data indicative of the current user location to the map server 16, for example. In one implementation, mobile communications device 14 determines the current user location using GPS unit 56, sends the current user location to map server 16 via network 24, receives map tile data from map server 16 via network 24 in response, and then provides the map tile data (or other map data generated based on the map tile data) to wearable electronic device 12 via short range link 26. Processor 30 of wearable electronic device 12 may then process the received map data to present the user interface screen 500 to the user via a display of user interface 32.

[00104] User interface screen 500 also includes an uncertainty indicator 506 indicating a zone/area of uncertainty or imprecision in the current user location. In the example of Figure 7, the uncertainty can be expressed as a radius of uncertainty, and thus the uncertainty indicator 506 defines a circle having that radius. The uncertainty indicator 506 may be a line that defines the outer perimeter of the circle. Preferably, however, the uncertainty indicator 506 is a solid (filled-in), semi-transparent circle (e.g., semi-transparent blue or gray, etc.) through which some or all of the features of digital map 502 are still visible. The zone/area of uncertainty may be determined using techniques known to those skilled in the relevant art. For example, the uncertainty may be based on a known uncertainty inherent to a positioning system (e.g., a known uncertainty or maximum uncertainty when self-locating using GPS unit 56 of Figure 1), or based on one or more factors and/or metrics (e.g., a number of satellite signals for which a lock is obtained, metrics relating to the quality of satellite signals received by mobile communications device 14 of Figure 1 and processed by GPS unit 56, etc.).

[00105] Because Figure 7 corresponds to a scenario where the radius of uncertainty is slightly less than the radius of digital map 502, a non-zero gap 510 exists between the outer perimeter of digital map 502 and the outer perimeter of uncertainty indicator 506. In conventional systems, however, the gap 510 may not exist in some scenarios. That is, the radius of uncertainty may be at least as large as the radius of the digital map 502. In such a situation, a user may not always recognize that the zone of uncertainty encompasses the entire area shown on digital map 502 (or an even larger area). For example, the user may simply think that no zone of uncertainty is being shown, or that the zone of uncertainty is so small as to not be easily seen.

[00106] To prevent such confusion, and/or to avoid misleading the user by presenting a map area that is significantly smaller than the zone of uncertainty, the initial zoom level of digital map 502 may be automatically set such that the radius of the digital map 502 is greater than the radius of the zone of uncertainty (i.e., such that the gap 510 is always greater than zero or some other threshold). For example, the initial zoom level (e.g., the zoom level when a user first requests navigation, first launches a mapping application, selects a control that zooms the map to his or her current location, etc.) may be set to a default value unless the default value would cause the gap 510 to disappear (or fall below some threshold value, etc.), in which case the zoom level may instead be set such that the gap 510 exceeds some minimum acceptable value (e.g., some minimum number of display pixels, or some minimum percentage of the digital map 502, etc.). To avoid subsequent changes in zoom level that may be both undesired and distracting, the zoom level may not be automatically reset, for the purpose of maintaining a sufficient gap 510, if/when the uncertainty level continues to change (e.g., decrease) as the user continues to utilize the mapping and/or navigation application.
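For concreteness, the following sketch picks an initial zoom level this way. It assumes a circular map of fixed pixel radius and the common web-Mercator ground resolution (approximately 156,543 meters per pixel at zoom 0 at the equator, halving with each zoom level); all names and default values are illustrative.

```python
def initial_zoom(uncertainty_m, default_zoom=17, min_gap_frac=0.10,
                 map_radius_px=120, meters_per_px_z0=156_543.0):
    """Pick the initial zoom so the map radius exceeds the uncertainty
    radius by at least `min_gap_frac` of the map radius (gap 510 > 0).

    Starts at the default zoom and zooms out until the gap criterion
    of paragraph [00106] is satisfied.
    """
    zoom = default_zoom
    while zoom > 0:
        meters_per_px = meters_per_px_z0 / (2 ** zoom)
        map_radius_m = map_radius_px * meters_per_px
        if uncertainty_m <= map_radius_m * (1.0 - min_gap_frac):
            return zoom  # gap 510 is large enough at this zoom level
        zoom -= 1        # otherwise zoom out one level and re-check
    return 0
```

For example, with a 120-pixel map radius, a 50-meter uncertainty leaves an adequate gap at the default zoom, while a 300-meter uncertainty forces the function to zoom out before returning.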

[00107] Referring to Figure 1, mapping/navigation application 62 of mobile communications device 14 may determine a location of mobile communications device 14 using GPS unit 56, and determine an uncertainty in the location as discussed above. Mapping/navigation application 62 may then determine the appropriate initial zoom level, according to the technique discussed above, and send data specifying both the location and the initial zoom level to map server 16 via network 24. Mapping unit 74 of map server 16 may then retrieve the map data corresponding to the location and zoom level, and cause map server 16 to send the retrieved map data back to mobile communications device 14 via network 24. Mobile communications device 14 may then forward the map data (or other map data derived therefrom) to wearable electronic device 12, along with data indicating the zone (e.g., radius) of uncertainty, via short range link 26. Wearable electronic device 12 may then present the map (e.g., digital map 502), at the initial zoom level and with an indicator of the zone of uncertainty (e.g., uncertainty indicator 506), on a display of user interface 32. Other implementations are also possible. For example, the digital map 502 and uncertainty indicator 506 may instead be displayed by user interface 52 of mobile communications device 14, or wearable electronic device 12 may determine the location, the location uncertainty and the zoom level, and communicate directly with map server 16 without using mobile communications device 14. As another example, the zone of uncertainty and/or zoom level may be determined by map server 16 rather than mobile communications device 14 or wearable electronic device 12.

Example techniques for accessing navigation-related functionality of a remote mobile communications device

[00108] As seen by way of the systems and techniques described above, wearable electronic devices may communicate with smartphones or other mobile communications devices in order to provide navigation services to users. For example, a wearable electronic device may lack GPS capability, and therefore may rely on the GPS unit of the user's smartphone in order to determine the user's current location. Even for a wearable electronic device that can independently self-locate, however, it may be advantageous to cooperate with the user's mobile communications device. For example, the mobile communications device may provide significantly more processing power than the wearable electronic device, provide additional or enhanced navigation features, and/or contain a battery that can support navigation-related communications and processing operations for a substantially longer period of time.

[00109] Figure 8 is a block diagram of an example system 600 in which a wearable electronic device 612 (e.g., a smart watch, smart glasses, etc.) may access navigation-related functions of a mobile communications device 614 (e.g., a smartphone) even when wearable electronic device 612 is remote from mobile communications device 614, according to one implementation. As seen in Figure 8, the example system 600 includes, in addition to wearable electronic device 612 and mobile communications device 614, a map server 616 and a network 622. Map server 616 may be similar to map server 16 of Figure 1, and/or network 622 may be similar to network 24 of Figure 1, for example.

[00110] Wearable electronic device 612 includes a processor 630, a user interface 632, a network interface 634, a GPS unit 636, and a memory 640 storing a mapping/navigation application 642. Processor 630, user interface 632 and/or memory 640 may be similar to processor 30, user interface 32 and/or memory 40 of Figure 1, respectively. Network interface 634 may be similar to network interface 34 of Figure 1, but further includes (in addition to Bluetooth or some other short range communication transceiver) a cellular and/or WiFi transceiver to enable communications via network 622. GPS unit 636 may be similar to GPS unit 56 (within mobile communications device 14) of Figure 1 but, in some implementations, may be less accurate (e.g., due to a relatively small, simple and/or inexpensive GPS receiver in wearable electronic device 612). Mapping/navigation application 642 may be similar to mapping/navigation application 42 of Figure 1, but includes a remote navigation unit 644 (discussed further below).

[00111] Mobile communications device 614 includes a processor 646, a memory 648 storing a mapping/navigation application 650, a user interface 652, a network interface 654, and a GPS unit 656. Processor 646, memory 648, user interface 652, network interface 654 and/or GPS unit 656 may be similar to processor 50, memory 60, user interface 52, network interface 54 and/or GPS unit 56 of Figure 1, respectively. Mapping/navigation application 650 may be similar to mapping/navigation application 62 of Figure 1, but includes a remote navigation unit 658 (discussed further below). Wearable electronic device 612 and mobile communications device 614 may have been "paired," for navigation purposes, using one or more manual and/or automatic procedures (e.g., using any suitable techniques already known in the art) at a previous time.

[00112] The decision of how and when wearable electronic device 612 and mobile communications device 614 cooperate to provide navigation services to the user may depend on which device was used to initiate navigation, on whether devices 612 and 614 are close enough (and powered on) to enable short range communications (e.g., Bluetooth), and possibly on whether the user has configured devices 612 and 614 to communicate via short range communications (e.g., whether the user has manually disabled Bluetooth on one or both devices for battery conservation or other reasons, or has put one or both of devices 612, 614 in airplane mode, etc.). In a first scenario, the user initiates navigation (e.g., enters a destination and request for directions, etc.) via mapping/navigation application 650 of mobile communications device 614, at a time when wearable electronic device 612 is nearby and powered on. In this scenario, remote navigation unit 658 may cause mobile communications device 614 to automatically detect that wearable electronic device 612 is nearby (e.g., using a protocol of the short range communication portion of network interface 654). In response, mobile communications device 614 may perform certain navigation-related functions (e.g., communicating with map server 616 via network 622, processing map tile data from map server 616, determining user location via GPS unit 656, etc.), and send wearable electronic device 612 messages (e.g., via short range link 26 of Figure 1) that cause wearable electronic device 612 to display navigation information to the user (e.g., ETAs, "next turn" indicators, highlighted routes on a digital map, etc.). In some implementations and scenarios, mobile communications device 614 discovers (e.g., using the short range communication protocol) multiple nearby wearable electronic devices that are both paired with mobile communications device 614 and powered on, and causes each such device to enter a navigation mode and generate a navigation display for the user.

[00113] In a second scenario, the user initiates navigation via mapping/navigation application 650 of mobile communications device 614, at a time when no nearby wearable electronic devices (that have been paired, and are powered-on, etc.) can be detected. In this scenario, mobile communications device 614 may handle all client-side navigation functions, without attempting to communicate with wearable electronic device 612 or any other wearable electronic device (e.g., such that the user views navigation outputs only via user interface 652 of mobile communications device 614, and not via user interface 632 of wearable electronic device 612). Alternatively, mobile communications device 614 may send a message to each paired wearable electronic device to cause a window or other user interface to appear at the remote wearable device, with the user interface presenting an option to enable remote navigation (e.g., in a manner similar to the fourth scenario, described below), despite the fact that the user initiated navigation via mobile communications device 614.

[00114] In a third scenario, the user initiates navigation via mapping/navigation application 642 of wearable electronic device 612, after which wearable electronic device 612 detects that mobile communications device 614 is nearby and powered on (using a protocol of the short range communication portion of network interface 634). In this scenario, wearable electronic device 612 may utilize the short range communication link to make use of the processing capabilities of processor 646 and/or the functionality/features of mapping/navigation application 650. For example, mapping/navigation application 650 may handle communications with map server 616 (e.g., packaging and sending location data to map server 616 via network 622, processing map tile data from map server 616, etc.). Further, user locations may be determined using GPS unit 656 rather than GPS unit 636 in this scenario, because GPS unit 656 may be faster, more accurate, more reliable, etc., and/or may be preferred due to a larger-capacity battery in mobile communications device 614. In some implementations, mobile communications device 614 only operates in concert with the wearable electronic device on which navigation was initiated, regardless of whether any other wearable electronic device(s) is/are turned on and paired with mobile communications device 614, and regardless of the proximity of the other wearable electronic device(s) to mobile communications device 614. Mobile communications device 614 may not send any navigation-related message to the other wearable electronic device(s), for example.
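
By way of illustration only, the scenario selection described in the preceding paragraphs might be expressed as in the following Python sketch. The names used (choose_nav_mode, NavMode, and the enumeration values) are hypothetical assumptions of the sketch and do not correspond to any element numbered above; the sketch simply encodes the decision factors of which device initiated navigation and whether a short range link is available.

    # Hypothetical sketch of the scenario selection of paragraphs [00112]-[00114];
    # all names and enum values are illustrative assumptions, not disclosed elements.
    from enum import Enum, auto

    class NavMode(Enum):
        PHONE_DRIVES_WATCH_DISPLAYS = auto()  # first scenario
        PHONE_ONLY = auto()                   # second scenario
        WATCH_USES_PHONE_RESOURCES = auto()   # third scenario
        REMOTE_NAVIGATION = auto()            # fourth scenario, described below

    def choose_nav_mode(initiated_on_watch: bool,
                        short_range_link_available: bool) -> NavMode:
        """Pick a cooperation mode based on where navigation was initiated and on
        whether a short range (e.g., Bluetooth) link to a paired, powered-on,
        nearby device can be formed."""
        if not initiated_on_watch:
            return (NavMode.PHONE_DRIVES_WATCH_DISPLAYS
                    if short_range_link_available else NavMode.PHONE_ONLY)
        return (NavMode.WATCH_USES_PHONE_RESOURCES
                if short_range_link_available else NavMode.REMOTE_NAVIGATION)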

[00115] In a fourth scenario, the user initiates navigation via mapping/navigation application 642 of wearable electronic device 612, at a time when mobile communications device 614 is powered on, but too far away to form a short range communication link with wearable electronic device 612. For example, the user may have accidentally left mobile communications device 614 in his or her car, or may have purposely left mobile communications device 614 in a docking station at his or her home, etc. In scenarios such as this, remote navigation unit 644 may generally enable wearable electronic device 612 to remotely access certain navigation-related features and/or components of mobile communications device 614. To this end, when a user initiates navigation, remote navigation unit 644 may cause wearable electronic device 612 to detect that mobile communications device 614 is not nearby and powered on (e.g., using a protocol of the short range communication portion of network interface 634). Remote navigation unit 644 may then generate a request to initiate a remote navigation mode, and cause network interface 634 (e.g., a cellular transceiver, or possibly a WiFi transceiver, etc.) to send the request to mobile communications device 614 via network 622. In particular, the request may be sent via the Internet using a known IP address of mobile communications device 614 (e.g., an IP address that was manually entered by the user, or automatically identified during an earlier discovery or navigation pairing process, etc.).

[00116] If no response is received (e.g., if mobile communications device 614 is powered down), remote navigation unit 644 may cause wearable electronic device 612 to communicate directly with map server 616 for navigation purposes. If mobile communications device 614 receives the request, however, remote navigation unit 658 may generate a response message indicating that mobile communications device 614 is available, and cause network interface 654 to send the response to wearable electronic device 612 via network 622 (e.g., via the Internet). Thereafter, mapping/navigation application 642 may operate in concert with mapping/navigation application 650 to provide navigation services to the user (e.g., as discussed above in connection with Figure 1, but with GPS unit 636 providing the user's location rather than GPS unit 656). For example, GPS unit 636 may periodically update the user's location, and remote navigation unit 644 may cause network interface 634 to send the updated location to mobile communications device 614 via network 622 (e.g., via the Internet). Mobile communications device 614 may receive the updated locations via network interface 654, and provide the locations to mapping/navigation application 650. Mapping/navigation application 650 may then cause network interface 654 to send the updated locations to map server 616 via network 622 (possibly along with other information, such as a user-selected zoom level received from wearable electronic device 612, etc.). In response, map server 616 may send to mobile communications device 614 map tile data and, if needed, updated navigation information (e.g., next turn information) and/or other information (e.g., ETA, traffic level information, etc.). Mapping/navigation application 650 may process the data received from map server 616 and cause corresponding messages for wearable electronic device 612 to be generated (e.g., messages providing updated map and/or navigation information). Remote navigation unit 658 may cause the messages to be sent to the remote wearable electronic device 612 via network 622 (e.g., via the Internet). Remote navigation unit 644 may receive and process the messages, and provide the contents to mapping/navigation application 642. Mapping/navigation application 642 may then cause user interface 632 to generate navigation displays based on the received information (e.g., displays with ETAs, next turn information, etc.). It is understood that other suitable protocols and/or messages may instead be used to enable wearable electronic device 612 to cooperate with mobile communications device 614 while the two devices are remote from each other.
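
The request/response exchange just described might be realized in many ways; the following Python sketch shows one minimal possibility, assuming a simple JSON message over a TCP connection to the phone's known IP address. The port number, message fields, and function name are assumptions of the sketch, not a protocol defined by this disclosure.

    # Hypothetical sketch of the remote navigation request of paragraph [00116].
    import json
    import socket

    def request_remote_navigation(phone_ip: str, port: int = 5005,
                                  timeout_s: float = 5.0) -> bool:
        """Send a remote navigation request over the Internet to the phone's
        known IP address; return True if the phone replies that it is available."""
        try:
            with socket.create_connection((phone_ip, port), timeout=timeout_s) as sock:
                sock.sendall(json.dumps({"type": "remote_nav_request"}).encode())
                reply = json.loads(sock.recv(4096).decode())
                return reply.get("type") == "remote_nav_available"
        except (OSError, ValueError):
            # No (valid) response, e.g., the phone is powered down: the caller
            # may fall back to communicating directly with the map server.
            return False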

[00117] The remote navigation mode may be terminated in different ways, depending upon the implementation. For example, wearable electronic device 612 may, during remote navigation, send updated locations at regular time intervals (or each time the user has moved at least a certain distance, etc.). If mobile communications device 614 determines that a predetermined timeout period (e.g., 60 seconds, three minutes, etc.) has passed since the last location update, mobile communications device 614 may exit the remote navigation mode (e.g., by no longer sending navigation-related messages to wearable electronic device 612). Mobile communications device 614 may then either use its own location, as determined by GPS unit 656, for navigation, or may exit navigation completely.
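
For purposes of illustration, the timeout-based termination described above might be sketched as follows in Python; the timeout value and all names are assumptions of the sketch.

    # Hypothetical sketch of the timeout logic of paragraph [00117].
    import time

    TIMEOUT_S = 60.0  # e.g., 60 seconds; three minutes is equally plausible

    class RemoteNavSession:
        def __init__(self) -> None:
            self.last_update = time.monotonic()

        def on_location_update(self, lat: float, lon: float) -> None:
            """Called when the watch reports an updated location."""
            self.last_update = time.monotonic()
            # ...forward the location to the mapping/navigation application...

        def expired(self) -> bool:
            """True if no location update arrived within the timeout period, in
            which case the phone may exit the remote navigation mode."""
            return time.monotonic() - self.last_update > TIMEOUT_S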

[00118] If wearable electronic device 612 comes near to mobile communications device 614 while remote navigation is enabled, and that proximity is detected by wearable electronic device 612 and/or mobile communications device 614, remote navigation units 644 and 658 may terminate the remote navigation mode and return to the mode described above in connection with the first scenario. Conversely, if wearable electronic device 612 is initially near to mobile communications device 614 per the first scenario described above, but the two devices 612, 614 then move apart (e.g., too far for the short range link), then remote navigation unit 644 and/or 658 may attempt to determine which device is currently with the user. If an accelerometer (or other sensor) of wearable electronic device 612 detects movement (or a particular type of movement consistent with human use rather than mere transport in a car, etc.), for example, remote navigation unit 644 may send mobile communications device 614 a message via network 622 indicating that remote navigation should be initiated. If a similar sensor of mobile communications device 614 detects movement, or a particular type of movement, mobile communications device 614 may begin independent navigation (e.g., per the second scenario described above) without entering the remote navigation mode.

Example techniques for pinning and unpinning transit stations, lines and/or headsigns

[00119] As with smartphones, wearable electronic devices may be used to provide notifications to a user when the user is near a location of potential interest. For example, notifications may be provided when a user draws near a transit station (e.g., bus stop, train or subway station, etc.). To avoid overwhelming the user with too much information, however, it may be desirable to limit and/or prioritize such notifications based on the stations and/or lines in which the user has previously indicated interest (e.g., by previously selecting, or "pinning," the station/line). This may be especially important in the context of a wearable electronic device, which may have a very limited display screen area that can easily become crowded with information. Moreover, users of smart watches and other wearable electronic devices with small displays may prefer to avoid extensive manual inputs. Thus, it may be advantageous to provide an intuitive and elegant interface for pinning and unpinning stations, lines, and/or "headsigns" (i.e., particular destinations and/or directions associated with particular lines).

[00120] Figure 9 depicts a set of example user interface screens 700A through 700D that may be used to pin a transit line, according to one implementation. The user interface screens of Figure 9 may be screens shown on a display of wearable electronic device 12 (e.g., a display of user interface 32 of Figure 1). The outermost circle in broken lines for each screen may illustrate the display panel on which the respective screen is presented. While each screen is shown as circular, other display shapes are also possible (e.g., square, rectangle, octagon, etc.). In some implementations, each of the user interface screens of Figure 9 is instead a screen shown on a display of mobile communications device 14 (e.g., a display of user interface 52 of Figure 1), or another electronic device.

[00121] As seen in Figure 9, the user interface screen 700A includes an area 702 showing the current time, and an indication ("Nearby") that the transit stations being shown are those that are in the general vicinity of the user (e.g., within some threshold distance, as determined using a GPS unit such as GPS unit 56 of Figure 1 or GPS unit 636 of Figure 8). Each of a number of areas 704 corresponds to a different nearby station. As seen in Figure 9, for example, an area 704A indicates that a first transit station ("8 Av/W 25 St") is 50 meters away from the user, an area 704B indicates that a second transit station ("28 St") is 0.8 kilometers away from the user, and an area 704C indicates that a third transit station (not entirely shown on the display) is at another distance (also not entirely shown) from the user. The user may look at additional stations, if any, by swiping down (i.e., making an upward motion on the user interface screen 700A with a finger). Icons under each station (e.g., the "N," "Q" and "R" icons in the area 704B) may indicate particular lines that utilize that station.

[00122] The user interface screen 700B is a screen that may be presented to the user in response to the user selecting the "28 St" station (e.g., by touching area 704B and swiping right, i.e., making a leftward motion with a finger). The area 702 now shows the selected station ("28 St"), and the screen includes a number of areas 706. In some implementations and/or scenarios, each of areas 706 corresponds to a specific headsign (e.g., destination and/or direction) for a particular line that utilizes the selected station. Alternatively, each of areas 706 may correspond to a different line without being broken out further based on headsign (e.g., such that the user must again swipe right to see individual headsigns for a line). An ETA may be provided for each headsign. As seen in Figure 9, for example, an area 706A indicates that an "N" line train, with the destination "Astoria - Ditmars Blvd," is expected to arrive in 1 minute (with a subsequent train being expected in 6 minutes), an area 706B indicates that an "N" line train, with the destination "Coney Island - Stillwell Ave," is expected to arrive in 2 minutes (with a subsequent train being expected in 6 minutes), and another area 706C is only partially shown on the display. The user may look at additional lines and/or headsigns, if any, by swiping down (i.e., making an upward motion on user interface screen 700B with a finger). The user may return to user interface screen 700A by swiping left (i.e., making a rightward motion with a finger).

[00123] The user interface screen 700C is a screen that may be presented to the user in response to the user selecting the "Astoria - Ditmars Blvd" headsign (e.g., by touching area 706A and swiping right). A new area 710 now shows the current time as well as the selected line and headsign, and a new area 712 shows the next train ETA ("1 min"), the subsequent train ETA ("6 min"), and an icon 714 that enables the user to pin the line/headsign. The user interface screen 700D is a screen that may be presented to the user in response to the user selecting the icon 714 (e.g., by touching the icon 714). In response to the user selection, the icon 714 may change to an icon 716 indicating that the line/headsign has been successfully pinned. The user may return to user interface screen 700B by swiping left.

[00124] Once a line/headsign has been pinned, future lists of lines/headsigns, such as that shown in user interface screen 700B, may prioritize the pinned line/headsign by placing it at or near the top of the list. Moreover, in some implementations, pinning a line and/or headsign may cause the corresponding station, or all stations utilized by the pinned line and/or headsign, to be automatically pinned. In some implementations, stations may be pinned independently of lines/headsigns (e.g., by selecting a pin icon similar to icon 714, but appearing in a dedicated area at the bottom of user interface screen 700B, not shown in Figure 9, or in another suitable location). Regardless of how a station is pinned, a pinned station may be prioritized (e.g., within a station list such as that shown in user interface screen 700A), or may be exempted from a general rule that no automatic notifications are provided for nearby stations. In some implementations, a small pin icon (or other suitable icon) appears in each of the areas 704 that corresponds to a previously pinned station, and/or appears in each of the areas 706 that corresponds to a previously pinned line/headsign.
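
A list prioritization of the kind just described could be implemented in many ways; one minimal Python sketch follows, in which pinned entries float to the top of an otherwise arrival-ordered list. The Headsign type and field names are assumptions of the sketch.

    # Hypothetical sketch of the prioritization of paragraph [00124].
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Headsign:
        line: str         # e.g., "N"
        destination: str  # e.g., "Astoria - Ditmars Blvd"
        eta_min: int      # minutes until the next arrival

    def ordered_headsigns(headsigns, pinned):
        """Return the list with pinned (line, destination) pairs first, and
        entries within each group ordered by soonest arrival."""
        return sorted(headsigns,
                      key=lambda h: ((h.line, h.destination) not in pinned,
                                     h.eta_min))

With ("N", "Astoria - Ditmars Blvd") in the pinned set, for example, that headsign would be listed first regardless of its ETA.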

[00125] Unpinning a station, line and/or headsign may be similar to the pinning process. For example, a new icon (not shown in Figure 9), such as an "X," may appear in user interface screen 700D after the line/headsign has been pinned, and the user may select the new icon to unpin the line/headsign such that the line/headsign is no longer prioritized (or such that no future notifications are provided when the user draws near the line/headsign, etc.). In one implementation, unpinning a line and/or headsign also unpins the corresponding station or stations if, and only if, no other lines and/or headsigns are currently pinned for the corresponding station or stations.
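
The if-and-only-if unpinning rule of the preceding paragraph might be sketched as follows in Python; the set-based bookkeeping and the stations_of callback are assumptions of the sketch.

    # Hypothetical sketch of the cascading unpin rule of paragraph [00125].
    def unpin_headsign(pinned_headsigns, pinned_stations, headsign, stations_of):
        """pinned_headsigns: set of (line, destination) pairs; pinned_stations:
        set of station names; stations_of(headsign): stations served by the
        headsign. A station is auto-unpinned if, and only if, no other pinned
        headsign still uses it."""
        pinned_headsigns.discard(headsign)
        for station in stations_of(headsign):
            still_used = any(station in stations_of(h) for h in pinned_headsigns)
            if not still_used:
                pinned_stations.discard(station)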

[00126] It is understood that other processes and/or user interface screens may be used to pin and/or unpin stations, lines and/or headsigns. Moreover, additional features may be provided to the user. For example, links to a transit service provider web page may be presented to the user on user interface screen 700C and/or 700D. Another example feature is illustrated in Figure 10, which depicts a pair of example user interface screens 720A and 720B, according to one implementation and scenario. As with the user interface screens of Figure 9, the user interface screens of Figure 10 may be screens shown on a display of wearable electronic device 12 (e.g., a display of user interface 32 of Figure 1), the outermost circle in broken lines for each screen may illustrate the display panel on which the respective screen is presented, and other, non-circular screen shapes are possible (e.g., square, rectangle, octagon, etc.). Also like the user interface screens of Figure 9, the user interface screens of Figure 10 may, in some implementations, instead be screens shown on a display of mobile communications device 14 (e.g., a display of user interface 52 of Figure 1), or another electronic device.

[00127] The user interface screen 720A may be the same as the user interface screen 700B of Figure 9, and may appear as a part of the user-directed sequence discussed above in connection with Figure 9. For example, areas 722 and 726A through 726C of user interface screen 720A may be the same as areas 702 and 706A through 706C, respectively, of user interface screen 700B, and the user interface screen 720A may (like user interface screen 700B) appear in response to swiping right (or left) on the "28 St" station in user interface screen 700A.

[00128] If the user touches the area 722 and swipes up (i.e., moves his or her finger rapidly downwards relative to user interface screen 720A), a map 730 may be presented to the user within user interface screen 720B. The map 730 may depict the user's current location with an indicator 732, and a location of the selected "28 St" station with another indicator 734. The zoom level may initially be set such that the user can immediately see the indicator 734. Alternatively, techniques may be used to indicate the direction in which the station is located, and/or the approximate distance, if a default or current zoom level causes the station to currently be off-screen. For example, techniques similar to those described in connection with Figures 2A and 2B may be used to show an off-screen location of the station.

[00129] The data representing the map 730 (e.g., map tile data), and the positions of indicators 732 and 734 on the map 730, may be provided by the map server 16. In one implementation, for example, map server 16 may have previously sent map tile data for the vicinity of the user's current location to mobile communications device 14, and wearable electronic device 12 may send to mobile communications device 14 (via short range link 26) data indicating that the user swiped up on the area 722. Mobile communications device 14 may, in response, send map server 16 a request for the location of the "28 St" station. Upon receiving the location, mobile communications device 14 may cause wearable electronic device 12 to display the indicator 734 at the appropriate location. Map server 16 may locally store the locations of a number of stations, or may retrieve station locations from transit server 22, for example.
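
One minimal Python sketch of the exchange described above follows, written from the perspective of the mobile communications device; the URL scheme, JSON fields, and the send_to_watch callback (standing in for the short range link) are assumptions of the sketch, not an interface defined by this disclosure.

    # Hypothetical sketch of the station-location request of paragraph [00129].
    import json
    from urllib.parse import quote
    from urllib.request import urlopen

    def on_station_swipe(station_name: str, map_server_url: str, send_to_watch) -> None:
        """Runs on the phone when the watch reports a swipe-up on a station area:
        ask the map server for the station's location, then tell the watch where
        to draw the indicator."""
        with urlopen(f"{map_server_url}/station?name={quote(station_name)}") as resp:
            loc = json.load(resp)  # e.g., {"lat": 40.7454, "lon": -73.9887}
        send_to_watch({"type": "show_station_indicator",
                       "lat": loc["lat"], "lon": loc["lon"]})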

[00130] To return to the user interface screen 720A, the user may touch an area 736 and swipe down (i.e., move his or her finger rapidly upwards relative to user interface screen 720B). In other implementations, alternative user inputs may cause user interface screen 720B to be displayed, and/or the map 730 may be presented within a user interface having a different appearance and/or including different elements. For example, the user interface screen 720B may instead appear when the user swipes up on any location within user interface screen 720A (not just when touching area 722), and/or the user interface screen 720A may reappear when the user swipes down on any location within user interface screen 720B (not just when touching area 736). As other examples, the area 722 may instead be located at the bottom of user interface screen 720A (and the user may cause user interface screen 720B to appear by touching area 722 and swiping down rather than up), and/or the area 736 may instead be located at the top of user interface screen 720B (and the user may cause user interface screen 720A to reappear by touching area 736 and swiping up rather than down).

Other example techniques for improving user experience with wearable electronic devices

[00131] Still other techniques enable a wearable electronic device, such as wearable electronic device 12 of Figure 1 or wearable electronic device 612 of Figure 8, to enhance the user experience in other ways. In one implementation, for example, a smart watch may detect when the user "flicks" his or her wrist (e.g., causes a rapid movement of the watch face in one rotational direction about the axis of his or her wrist/forearm, followed quickly by a rapid movement in the opposite rotational direction). The direction and speed (or direction and acceleration, etc.) of the watch face may be determined by processor 30 using information from orientation sensor(s) 36 of Figure 1, for example, and processor 30 may further detect the time between the two rotational movements of the wrist flick. Processor 30 may then compare the direction, speed, acceleration, and/or time to threshold values (e.g., a minimum speed and/or acceleration, a maximum time between a change of directions, etc.), or evaluate those and/or other parameters under a more complex algorithm, to determine whether to trigger a particular action. For example, processor 30 may cause a display of user interface 32 to present a digital map showing the user's current location when the user flicks his or her wrist while in a navigation mode. The display may change to the map view from an initial view such as that shown in Figure 5A or 5B, for example.
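
By way of illustration only, a simple thresholding detector of the kind described above might be sketched as follows in Python; the threshold values, sensor units, and sample format are assumptions of the sketch rather than parameters of the disclosed device.

    # Hypothetical sketch of the wrist-flick detection of paragraph [00131]:
    # a fast rotation about the wrist/forearm axis, followed within a maximum
    # interval by a fast rotation in the opposite direction.
    MIN_SPEED_DEG_S = 300.0  # minimum angular speed to count as "rapid"
    MAX_GAP_S = 0.3          # maximum time between the two opposite rotations

    def detect_flick(samples) -> bool:
        """samples: iterable of (timestamp_s, angular_velocity_deg_s) pairs from
        an orientation sensor, with sign indicating rotational direction."""
        first = None  # (time, sign) of the most recent fast rotation
        for t, w in samples:
            if abs(w) < MIN_SPEED_DEG_S:
                continue  # not fast enough to be part of a flick
            sign = 1 if w > 0 else -1
            if first is not None and sign != first[1] and (t - first[0]) <= MAX_GAP_S:
                return True  # opposite fast rotation within the allowed window
            first = (t, sign)  # start (or restart) from this fast rotation
        return False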

[00132] In another example implementation, a smart watch may automatically adjust the rotation of its user interface screen (e.g., a screen showing a digital map, such as screen 100 of Figure 2A, or another type of screen such as screen 320 of Figure 5A, etc.) to maintain a good viewing angle for the user as the watch moves about. For example, a person who is driving may wish to view a map or other information on a smart watch by merely rotating his or her wrist inward slightly, without completely removing a hand from the steering wheel, and a person who is cycling may wish to view information on the smart watch without removing a hand from the handlebars, etc., while a person who is standing or walking may be accustomed to moving the entire arm, and rotating the wrist, to view the watch face in the conventional manner (e.g., in front of the chest and angled up toward the eyes). Moreover, different people may have different ways of positioning the arm and wrist relative to the eyes and head when viewing a watch face. Thus, the conventional smartphone approach of turning the device to change between two orientations corresponding to two different, orthogonal views (i.e., landscape or portrait mode) may be unsatisfactory for smart watch users.

[00133] To understand how the display screen may be rotated, a reference "upward direction vector" that passes through the center of the display screen may be defined. If a digital map is being viewed, for example, the upward direction vector may be defined as a vector that aligns with the user's direction of travel and passes through the center of the screen/map. If textual information (e.g., an ETA, etc.) is being viewed, the upward direction vector may be defined as a vector that, if the text is viewed in the conventional manner (e.g., as shown in Figure 5A with the characters being oriented right-side up), points from the bottom of the screen to the top of the screen, and again passes through the center of the screen.

[00134] In one implementation, the smart watch automatically adjusts the user interface screen in response to user movement of the watch, such that the upward direction vector of the screen aligns, at least approximately, with the direction of the user's gaze (assuming the user is looking in the direction of the smart watch display). One such implementation is depicted in Figures 11A and 11B. As seen in Figure 11A, in a first scenario 750, a user 752 wearing a smart watch 754 is driving a car and gripping the steering wheel. A display screen of the smart watch 754 (shown as presenting the letter "A" merely for illustration purposes) is rotated such that the upward direction vector of the screen aligns with a vector 756 that extends from a vertical line 760 roughly aligned with the center of the head of user 752 (appearing as a single point in Figure 11A due to the overhead view) through the center of the screen of smart watch 754.

[00135] In a second scenario 770, shown in Figure 11B, the user 752 is viewing his or her watch in a more conventional manner (in front of his or her chest), and therefore the upward direction vector of the screen is aligned with a new vector 772 that extends from the same vertical line 760 through the new location of the center of the screen. As seen in Figures 11A and 11B, the net result is that the letter "A" has rotated slightly in a counterclockwise direction on the display screen. Of course, the physical constraints of the watch display will not, in most scenarios, allow the upward direction vector to be perfectly aligned, in all three dimensions, with the vector 756 or 772. Thus, if the display/face of smart watch 754 is tilted such that the plane in which the upward direction vector of the screen resides is different than the plane in which the vector 756 or 772 resides, the vector resulting from the projection of the upward direction vector onto the plane of the vector 756 or 772 - rather than the upward direction vector itself - may be made to align with the vector 756 or 772. In some implementations, the upward direction vector (or the vector resulting from its projection onto another plane) may be aligned with a different vector instead of the vector 756 or 772. For example, the upward direction vector or its projection may be aligned with a vector that begins near the front of the user's head (e.g., near the eyes) rather than the vertical line 760.

[00136] The manner in which the amount of rotation is determined, according to one implementation, is now described in connection with Figure 1 (e.g., with smart watch 754 corresponding to wearable electronic device 12). In this implementation, processor 30 may interact with one or more orientation sensor(s) 36 (e.g., one or more accelerometers, a gyrometer, an inclinometer, etc.) to determine the likely orientation of the watch face relative to the user's head and/or eyes. In the scenarios of Figures 11A and 11B, for example, mapping/navigation application 42 may request information such as acceleration, compass bearings, tilt metrics, and/or other information from orientation sensor(s) 36, and process the information to determine the vector 756 or 772. Thereafter, mapping/navigation application 42 may cause the user interface screen provided on a display of user interface 32 to rotate about the display screen center such that the upward direction vector of the screen aligns with the vector 756 or 772. The amount of rotation needed may depend on the current rotation of the display screen, and the manner in which the display screen has moved. In some implementations, a calibration process is used (e.g., once, or each time the watch is powered up, etc.) in order to estimate the location of the vertical line 760 (relative to the initial location of the smart watch 754) and/or to estimate an initial vector such as vector 756 or 772. The user may be prompted to view his or her watch in a position such as that shown in Figure 11B during the calibration process, for example. Once a frame of reference has been determined, mapping/navigation application 42 may rotate the screen as needed based on relative movements of the smart watch 754. It is understood that, while various vectors have been described above in order to describe how the display screen may be automatically rotated, those vectors are not necessarily calculated or utilized in some implementations. For example, the rotation may instead be accomplished by rotating the display screen in lock step with the reading of a compass within orientation sensor(s) 36, but with some rotations being avoided or canceled out using GPS data (e.g., to avoid rotating the display screen as the user steers his or her car in a new direction, or begins walking in a new direction, etc.).
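
For purposes of illustration, the alignment just described might be computed as in the following Python sketch, which projects both the upward direction vector and the head-to-screen vector into the plane of the display and returns the signed angle between the projections. The 3-D vector representation, shared reference frame, and all names are assumptions of the sketch, and, as noted above, an implementation need not compute these vectors at all.

    # Hypothetical sketch of the rotation computation of paragraphs [00133]-[00136].
    import math

    def _sub(a, b):   return tuple(x - y for x, y in zip(a, b))
    def _dot(a, b):   return sum(x * y for x, y in zip(a, b))
    def _scale(a, s): return tuple(x * s for x in a)
    def _norm(a):     return math.sqrt(_dot(a, a))

    def rotation_angle(up_vec, head_point, screen_center, screen_normal) -> float:
        """Signed angle (radians) to rotate the screen contents about the screen
        center so that the in-plane projection of up_vec aligns with the
        in-plane projection of the head-to-screen-center vector. Assumes
        non-degenerate inputs (neither projection has zero length)."""
        gaze = _sub(screen_center, head_point)

        def project(v):  # remove the component along the display normal
            n2 = _dot(screen_normal, screen_normal)
            return _sub(v, _scale(screen_normal, _dot(v, screen_normal) / n2))

        u, g = project(up_vec), project(gaze)
        cos_a = _dot(u, g) / (_norm(u) * _norm(g))
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        # Sign taken from the component of u x g along the display normal.
        cross = (u[1] * g[2] - u[2] * g[1],
                 u[2] * g[0] - u[0] * g[2],
                 u[0] * g[1] - u[1] * g[0])
        return angle if _dot(cross, screen_normal) >= 0 else -angle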

[00137] In some implementations, the display screen is only rotated in certain modes of operation. For example, a smart watch may trigger screen rotation when a user selects a driving navigation mode, or when the user's location (e.g., GPS) information indicates that he or she is moving at a high speed and therefore is likely driving, etc. Further, in some of these implementations, the display screen is simply rotated by a fixed, predetermined amount (e.g., 40 degrees clockwise, etc.) when the user is determined to be driving (or cycling, etc.), without calculating a rotation.

Example aspects of the invention

[00138] Although the foregoing text sets forth a detailed description of numerous different aspects and embodiments of the invention, it should be understood that the scope of the patent is defined by the words of the claims set forth at the end of this patent. The detailed description is to be construed as exemplary only and does not describe every possible embodiment because describing every possible embodiment would be impractical, if not impossible. Numerous alternative embodiments could be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims. By way of example, and not limitation, the disclosure herein contemplates at least the following two sets of aspects.

[00139] A first set of aspects includes:

[00140] Aspect 1 - A method, implemented in an electronic device having a display and one or more processors, for displaying location indicators for one or more off-screen entities to a user of the electronic device, the method comprising: receiving, from a remote server, map data representing an area that includes a current location of the user; receiving, from the remote server, shared location data corresponding to a current location of a first entity, wherein the current location of the first entity is not within the area represented by the map data; using, by the one or more processors, the map data to present on the display a digital map of the area represented by the map data, the digital map having an outer perimeter and a map center; and using, by the one or more processors, the shared location data to present on the display, contemporaneously with the digital map of the area represented by the map data, a first location indicator corresponding to the current location of the first entity, at least by (1) assigning the first location indicator a first visual property indicative of a first distance, wherein the first distance is a distance between the current location of the first entity and the current location of the user, and wherein the first visual property does not include any text specifying a distance, and (2) positioning the first location indicator, with the first visual property, (i) at an offset distance from the map center such that the first location indicator is closer to the outer perimeter of the digital map than to the map center, and (ii) in a first direction from the map center, the first direction corresponding to a direction of the current location of the first entity relative to the current location of the user.
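
For purposes of illustration, the positioning and non-textual distance cue of Aspect 1 might be realized as in the following Python sketch for a circular digital map; the pixel constants, distance-to-size mapping, and function names are assumptions of the sketch, not limitations of the aspect.

    # Hypothetical sketch of off-screen indicator placement per Aspect 1.
    import math

    MAP_RADIUS_PX = 160       # radius of the circular digital map
    PERIMETER_MARGIN_PX = 12  # keeps the indicator just inside the perimeter

    def indicator_position(bearing_deg: float):
        """Offset (dx, dy) from the map center for an entity whose bearing from
        the user is bearing_deg (0 = north/up, clockwise positive); the offset
        places the indicator closer to the perimeter than to the center."""
        r = MAP_RADIUS_PX - PERIMETER_MARGIN_PX
        theta = math.radians(bearing_deg)
        return (r * math.sin(theta), -r * math.cos(theta))  # screen y grows downward

    def indicator_radius(distance_m: float, near_m: float = 500.0,
                         far_m: float = 5000.0, r_min: float = 4.0,
                         r_max: float = 12.0) -> float:
        """Indicator size as a visual property indicative of distance (no text):
        larger when the entity is nearer, smaller when it is farther away."""
        d = max(near_m, min(far_m, distance_m))
        frac = (far_m - d) / (far_m - near_m)  # 1.0 when near, 0.0 when far
        return r_min + frac * (r_max - r_min)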

[00141] Aspect 2 - The method of aspect 1, wherein: receiving shared location data includes receiving data specifying geographic coordinates of the current location of the first entity; and prior to assigning the first location indicator, using the specified geographic coordinates to calculate the first direction and the first distance.

[00142] Aspect 3 - The method of aspect 1, wherein: receiving shared location data includes receiving data specifying (i) a position on the digital map and (ii) the first visual property; assigning the first visual property includes assigning the specified first visual property; and positioning the first location indicator includes positioning the first location indicator, with the first visual property, at the specified position on the digital map.

[00143] Aspect 4 - The method of any one of aspects 1-3, wherein positioning the first location indicator includes positioning the first location indicator such that the first location indicator is immediately adjacent to the outer perimeter of the digital map.

[00144] Aspect 5 - The method of any one of aspects 1-4, wherein assigning the first visual property includes assigning a size of the first location indicator, the size being indicative of the first distance.

[00145] Aspect 6 - The method of aspect 5, further comprising: receiving, from the remote server, updated shared location data corresponding to a new current location of the first entity; and using, by the one or more processors, the updated shared location data to present on the display an updated first location indicator corresponding to the new current location of the first entity, at least by (1) assigning the updated first location indicator an increased size, the increased size being indicative of a second distance, and the second distance being (i) a distance between the new current location of the first entity and either the current location of the user or a new current location of the user, and (ii) less than the first distance, and (2) positioning the updated first location indicator, with the increased size, in a second direction corresponding to a direction of the new current location of the first entity relative to either the current location of the user or the new current location of the user.

[00146] Aspect 7 - The method of aspect 6, wherein: the first location indicator does not include any photographic image or avatar of the first entity; and the updated first location indicator includes a photographic image or avatar of the first entity.

[00147] Aspect 8 - The method of any one of aspects 1-7, wherein assigning the first location indicator a first visual property includes assigning (i) a color of the first location indicator, the color being indicative of the first distance, or (ii) a shading of the first location indicator, the shading being indicative of the first distance.

[00148] Aspect 9 - The method of any one of aspects 1-8, further comprising: assigning a number to be displayed as text on or next to the first location indicator, the number being a metric representing either (i) the first distance, or (ii) a distance between the current location of the first entity and a location corresponding to a point along the outer perimeter of the digital map.

[00149] Aspect 10 - The method of any one of aspects 1-9, wherein the first location indicator is a circular icon.

[00150] Aspect 11 - The method of any one of aspects 1-10, wherein the first entity is a person who previously agreed to share his or her location with the user.

[00151] Aspect 12 - The method of any one of aspects 1-11, further comprising: receiving, from the remote server, additional shared location data corresponding to a current location of a second entity, wherein the current location of the second entity is not within the area represented by the map data; using, by the one or more processors, the additional shared location data to present on the display, contemporaneously with both the digital map of the area represented by the map data and the first location indicator, a second location indicator corresponding to the current location of the second entity, at least by (1) assigning the second location indicator a second visual property indicative of a second distance, wherein the second distance is a distance between the current location of the second entity and the current location of the user, and wherein the second visual property does not include any text specifying a distance, and (2) positioning the second location indicator, with the second visual property, in a second direction from the map center, the second direction corresponding to a direction of the current location of the second entity relative to the current location of the user.

[00152] Aspect 13 - The method of any one of aspects 1-12, wherein the electronic device is a smart watch device, and wherein the method further comprises: causing, by the one or more processors, both the digital map and the first location indicator to rotate on the display in a synchronized manner as the directional orientation of the electronic device changes.

[00153] Aspect 14 - An electronic device comprising: a display; one or more processors; and a memory, the memory storing instructions that, when executed by the one or more processors, cause the electronic device to (1) receive, from a remote server, map data representing an area that includes a current location of the user, (2) receive, from the remote server, shared location data corresponding to a current location of a first entity, wherein the current location of the first entity is not within the area represented by the map data, (3) use the map data to present on the display a digital map of the area represented by the map data, the digital map having an outer perimeter and a map center, and (4) use the shared location data to present on the display, contemporaneously with the digital map of the area represented by the map data, a first location indicator corresponding to the current location of the first entity, at least by (A) assigning the first location indicator a first visual property indicative of a first distance, wherein the first distance is a distance between the current location of the first entity and the current location of the user, and wherein the first visual property does not include any text specifying a distance, and (B) positioning the first location indicator, with the first visual property, (i) at an offset distance from the map center such that the first location indicator is closer to the outer perimeter of the digital map than to the map center, and (ii) in a first direction from the map center, the first direction corresponding to a direction of the current location of the first entity relative to the current location of the user.

[00154] Aspect 15 - The electronic device of aspect 14, wherein the instructions cause the electronic device to: position the first location indicator such that the first location indicator is immediately adjacent to the outer perimeter of the digital map.

[00155] Aspect 16 - The electronic device of aspect 14 or 15, wherein the first visual property is a size of the first location indicator, the size being indicative of the first distance.

[00156] Aspect 17 - The electronic device of any one of aspects 14-16, wherein the first location indicator does not include any photographic image or avatar of the first entity, and wherein the instructions further cause the electronic device to: receive, from the remote server, updated shared location data corresponding to a new current location of the first entity; and use the updated shared location data to present on the display an updated first location indicator corresponding to the new current location of the first entity, at least by (1) assigning the updated first location indicator an increased size, the increased size being indicative of a second distance, and the second distance being (i) a distance between the new current location of the first entity and either the current location of the user or a new current location of the user, and (ii) less than the first distance, and (2) positioning the updated first location indicator, with the increased size, in a second direction corresponding to a direction of the new current location of the first entity relative to either the current location of the user or the new current location of the user, the updated first location indicator including a photographic image or avatar of the first entity.

[00157] Aspect 18 - A method, implemented on an electronic device having a display and one or more processors, for displaying location indicators for one or more off-screen entities to a user of the electronic device, the method comprising: presenting on the display, by the one or more processors, a digital map of an area, the digital map having an outer perimeter and a map center; presenting on the display, by the one or more processors, one or more location indicators corresponding to one or more respective people currently at one or more respective locations not represented on the digital map, each of the one or more location indicators (i) overlaying the digital map and (ii) being positioned, relative to the map center, in a direction corresponding to a direction of the respective location of the respective person relative to a location represented by the map center; and for each of the one or more location indicators, causing, by the one or more processors, a size of the location indicator to (i) increase as the respective person moves nearer to the location represented by the map center, and (ii) decrease as the respective person moves further from the location represented by the map center.

[00158] Aspect 19 - The method of aspect 18, further comprising: for each of the one or more location indicators, causing, by the one or more processors, a photographic image or avatar of the respective person to appear in the location indicator when the respective person moves within a threshold distance of the location represented by the map center.

[00159] Aspect 20 - The method of aspect 18 or 19, wherein presenting on the display the one or more location indicators includes presenting each of the one or more location indicators at a respective position immediately adjacent to the outer perimeter of the digital map.

[00160] A second set of aspects includes:

[00161] Aspect 1 - A method, implemented in an electronic device having a display and one or more processors, for displaying traffic-related information in an unobtrusive manner, the method comprising: transmitting, to a remote server, data indicative of an expected route of a user of the electronic device; receiving, from the remote server, traffic data indicative of a current degree of traffic along at least a portion of the expected route of the user; selecting, based on the traffic data and from among a plurality of visual properties corresponding to a plurality of respective traffic levels, a first visual property, wherein each of the plurality of visual properties includes (i) a respective color, (ii) a respective shade, or (iii) a respective pattern; and causing, by the one or more processors, the display to present a user interface screen to the user, wherein at least a portion of the user interface screen has the selected first visual property, and wherein the user interface screen does not include a digital map representing any portion of the expected route of the user.

[00162] Aspect 2 - The method of aspect 1, wherein transmitting data indicative of the expected route of the user includes transmitting data indicative of a destination selected or entered by the user via another user interface screen presented on the display.

[00163] Aspect 3 - The method of aspect 2, wherein: transmitting data indicative of the expected route of the user further includes transmitting data indicative of a current location of the user; and receiving traffic data indicative of a current degree of traffic along at least a portion of the expected route of the user includes receiving traffic data indicative of a current degree of traffic along a route between the current location of the user and the destination.

[00164] Aspect 4 - The method of aspect 3, wherein: causing the display to present the user interface screen to the user includes causing the display to present an estimated time of arrival (ETA) indicating an expected length of time needed for the user to reach the destination; and at least the portion of the user interface screen having the selected first visual property includes a background against which the ETA is set.

[00165] Aspect 5 - The method of aspect 4, wherein causing the display to present an ETA includes causing the display to present an ETA that accounts for the current degree of traffic.

[00166] Aspect 6 - The method of aspect 1, wherein transmitting data indicative of the expected route of the user includes transmitting location data indicative of one or more current locations of the user, each of the one or more current locations corresponding to a roadway along which the user is currently traveling.

[00167] Aspect 7 - The method of any one of aspects 1-6, wherein each of the plurality of visual properties includes a respective one of a plurality of colors.

[00168] Aspect 8 - The method of aspect 7, wherein the plurality of colors includes: green to indicate a relatively light traffic level; yellow or orange to indicate a moderate traffic level; and red to indicate a relatively heavy traffic level.
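
By way of illustration only, the color selection of Aspects 7 and 8 might be sketched as follows in Python; the level names and hexadecimal values are assumptions of the sketch.

    # Hypothetical sketch of the traffic-level-to-color mapping of Aspect 8.
    TRAFFIC_COLORS = {
        "light": "#0F9D58",     # green: relatively light traffic
        "moderate": "#F4B400",  # yellow/orange: moderate traffic
        "heavy": "#DB4437",     # red: relatively heavy traffic
    }

    def eta_background(traffic_level: str) -> str:
        """Pick the background color for the ETA screen given the traffic level
        reported by the remote server; fall back to a neutral gray."""
        return TRAFFIC_COLORS.get(traffic_level, "#9E9E9E")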

[00169] Aspect 9 - The method of any one of aspects 1-8, wherein: causing the display to present the user interface screen to the user includes causing the display to present a transit mode icon to the user, the transit mode icon indicating a current mode of transportation of the user; and at least the portion of the user interface screen having the selected first visual property includes at least a portion of the transit mode icon.

[00170] Aspect 10 - The method of aspect 9, wherein: each of the plurality of visual properties includes a respective one of a plurality of colors; the first visual property includes a first color of the plurality of colors; at least the portion of the user interface screen having the selected first visual property further includes a background against which the transit mode icon is set; and the background against which the transit mode icon is set is shaded a darker shade of the first color than is the transit mode icon.

[00171] Aspect 11 - The method of any one of aspects 1-10, further comprising: receiving, from the remote server, updated traffic data indicative of a new current degree of traffic along at least the portion of the expected route of the user; selecting, based on the updated traffic data and from among the plurality of visual properties corresponding to the plurality of respective traffic levels, a second visual property different than the first visual property; and causing, by the one or more processors, the display to present an updated user interface screen to the user, wherein at least a portion of the updated user interface screen has the selected second visual property.

[00172] Aspect 12 - An electronic device comprising: a display; one or more processors; and a memory, the memory storing instructions that, when executed by the one or more processors, cause the electronic device to (1) transmit, to a remote server, data indicative of an expected route of a user of the electronic device, (2) receive, from the remote server, traffic data indicative of a current degree of traffic along at least a portion of the expected route of the user, (3) select, based on the traffic data and from among a plurality of visual properties corresponding to a plurality of respective traffic levels, a first visual property, wherein each of the plurality of visual properties includes (i) a respective color, (ii) a respective shade, or (iii) a respective pattern, and (4) cause the display to present a user interface screen to the user, wherein at least a portion of the user interface screen has the selected first visual property, and wherein the user interface screen does not include a digital map representing any portion of the expected route of the user.

[00173] Aspect 13 - The electronic device of aspect 12, wherein: the user interface screen includes an estimated time of arrival (ETA) indicating an expected length of time needed for the user to reach a destination; and at least the portion of the user interface screen having the selected first visual property includes a background against which the ETA is set.

[00174] Aspect 14 - The electronic device of aspect 13, wherein each of the plurality of visual properties includes a respective one of a plurality of colors.

[00175] Aspect 15 - The electronic device of any one of aspects 12-14, wherein: the user interface screen includes a transit mode icon indicating a current mode of transportation of the user; and at least the portion of the user interface screen having the selected first visual property includes at least a portion of the transit mode icon.

[00176] Aspect 16 - The electronic device of aspect 15, wherein: each of the plurality of visual properties includes a respective one of a plurality of colors; the first visual property includes a first color of the plurality of colors; at least the portion of the user interface screen having the selected first visual property further includes a background against which the transit mode icon is set; and the background against which the transit mode icon is set is shaded a darker shade of the first color than is the transit mode icon.

[00177] Aspect 17 - The electronic device of any one of aspects 12-16, wherein the electronic device is a wearable electronic device.

[00178] Aspect 18 - A method, implemented in an electronic device having a display and one or more processors, for displaying traffic-related information in an unobtrusive manner, the method comprising: presenting on the display, by the one or more processors, a user interface screen including (i) an estimated time of arrival (ETA) indicating an expected length of time needed for a user of the electronic device to reach a destination, and (ii) a transit mode icon indicating a current mode of transportation of the user; processing, by the one or more processors, traffic data received from a remote server to monitor a degree of traffic along at least a portion of an expected route of the user; and causing, by the one or more processors, a color of at least a portion of the user interface screen to change in response to a change in the monitored degree of traffic, wherein the portion of the user interface screen includes one or both of (i) a background against which the ETA is set and (ii) at least a portion of the transit mode icon.

[00179] Aspect 19 - The method of aspect 18, wherein causing the color of at least the portion of the user interface screen to change includes: causing both (i) the background against which the ETA is set, and (ii) at least the portion of the transit mode icon, to change in response to the change in the monitored degree of traffic; and causing the background to have a darker shade of the color than the portion of the transit mode icon.

[00180] Aspect 20 - The method of aspect 18 or 19, further comprising: causing, by the one or more processors, the color of at least the portion of the user interface screen to change to a new color in response to determining that the current mode of transportation of the user has changed.

Other considerations

[00181] The following additional considerations apply to the foregoing discussion. Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter of the present disclosure.

[00182] Unless specifically stated otherwise, discussions in the present disclosure using words such as "processing," "computing," "calculating," "determining," "presenting," "displaying," or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.

[00183] As used in the present disclosure, any reference to "one implementation" or "an implementation" means that a particular element, feature, structure, or characteristic described in connection with the implementation is included in at least one implementation or embodiment. The appearances of the phrase "in one implementation" in various places in the specification are not necessarily all referring to the same implementation.

[00184] As used in the present disclosure, the terms "comprises," "comprising," "includes," "including," "has," "having" or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, "or" refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).

[00185] Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for improving user experience for users of wearable electronic devices through the disclosed principles in the present disclosure. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed in the present disclosure. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed in the present disclosure without departing from the spirit and scope defined in the appended claims. Additionally, while particularly preferred embodiments are illustrated in the drawings of the present disclosure, such as the user interface screens of Figures 2A, 2B, 4A, 4B, 5A, 5B, 7 and 9, it is understood that the functional features disclosed and claimed herein can be accomplished using user interface screens that differ ornamentally from these drawings, and that ornamental features of the drawings are not dictated by function.