

Title:
SYSTEMS AND METHODS FOR PROVIDING AUGMENTING REALITY INFORMATION ASSOCIATED WITH SIGNAGE
Document Type and Number:
WIPO Patent Application WO/2015/077766
Kind Code:
A1
Abstract:
Systems and/or methods may be provided for augmenting reality. For example, a real-world sign along a route being navigated or traversed may be recognized or identified (e.g., based on directions or navigation instructions for the route and/or an expected location). A determination may be made as to whether the real-world sign may be relevant. An appearance of the real-world sign may be adapted, based, at least in part, on the real-world sign being relevant. The appearance of the sign may be adapted by augmenting a view of a real-world scene that includes the real-world sign with augmentation information (e.g., a virtual object). The view of the real-world scene that includes the real-world sign may be augmented with augmentation information by emphasizing or de-emphasizing the appearance of the real-world sign in the view with the augmentation information.

Inventors:
SINGH MONA (US)
Application Number:
PCT/US2014/067377
Publication Date:
May 28, 2015
Filing Date:
November 25, 2014
Assignee:
PCMS HOLDINGS INC (US)
International Classes:
G06K9/00; G01C21/00; G06T19/00
Domestic Patent References:
WO2010005285A1 2010-01-14
Foreign References:
US20120224060A1 2012-09-06
US20100253541A1 2010-10-07
US20100198488A1 2010-08-05
US20080056535A1 2008-03-06
US20120223845A1 2012-09-06
Other References:
ANDREAS MOGELMOSE ET AL: "Vision-Based Traffic Sign Detection and Analysis for Intelligent Driver Assistance Systems: Perspectives and Survey", IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, IEEE, PISCATAWAY, NJ, USA, vol. 13, no. 4, 1 December 2012 (2012-12-01), pages 1484 - 1497, XP011474105, ISSN: 1524-9050, DOI: 10.1109/TITS.2012.2209421
Attorney, Agent or Firm:
ROCCIA, Vincent, J. et al. (Suite 1700, Philadelphia, PA, US)
Claims:
What is claimed is:

1. A method for augmenting reality, the method comprising:

recognizing a real-world sign along a route being navigated or traversed;

determining whether the real-world sign is relevant; and

adapting, based, at least in part, on the real-world sign being relevant, an appearance of the real-world sign, wherein the appearance of the sign is adapted by augmenting a view of a real-world scene that includes the real-world sign with augmentation information.

2. The method of claim 1, wherein the augmentation information comprises a virtual object.

3. The method of claim 2, wherein the virtual object comprises at least one of the following: a highlight, a box, a circle, or a lowlight.

4. The method of claim 1, wherein augmenting the view of the real-world scene that includes the real-world sign with augmentation information comprises emphasizing the appearance of the real-world sign in the view with the augmentation information.

5. The method of claim 4, wherein emphasizing the appearance of the real-world sign in the view with the augmentation information comprises highlighting the appearance of the real-world sign in the view.

6. The method of claim 1, wherein augmenting the view of the real-world scene that includes the real-world sign with augmentation information comprises de-emphasizing the appearance of the real-world sign in the view with the augmentation information.

7. The method of claim 6, wherein de-emphasizing the appearance of the real-world sign in the view with the augmentation information comprises lowlighting the appearance of the real-world sign in the view.

8. A method for augmenting reality, the method comprising:

obtaining directions or navigation instructions for a route to be navigated;

determining, based, at least in part, on the directions or navigation instructions, a real-world sign expected to be disposed along, or in connection with, the route to be navigated;

determining whether the real-world sign is relevant;

recognizing the real-world sign along the route as the route is being navigated; and

adapting, based, at least in part, on the real-world sign being relevant and recognized, an appearance of the real-world sign, wherein the appearance of the sign is adapted by augmenting a view of a real-world scene that includes the real-world sign with augmentation information.

9. The method of claim 8, wherein the augmentation information comprises a virtual object.

10. The method of claim 8, wherein augmenting the view of the real-world scene that includes the real-world sign with augmentation information comprises emphasizing the appearance of the real-world sign in the view with the augmentation information.

11. The method of claim 10, wherein emphasizing the appearance of the real-world sign in the view with the augmentation information comprises highlighting the appearance of the real-world sign in the view.

12. The method of claim 8, wherein augmenting the view of the real-world scene that includes the real-world sign with augmentation information comprises de-emphasizing the appearance of the real-world sign in the view with the augmentation information.

13. The method of claim 12, wherein de-emphasizing the appearance of the real-world sign in the view with the augmentation information comprises lowlighting the appearance of the real-world sign in the view.

14. A method for augmenting reality, the method comprising:

obtaining directions or navigation instructions for a route to be navigated;

obtaining, based, at least in part, on the directions or navigation instructions, an expected location for a real-world sign associated with the route to be navigated;

determining whether the real-world sign is relevant;

recognizing the real-world sign along the route as the route is being navigated based on the expected location; and

adapting, based, at least in part, on the real-world sign being relevant and recognized, an appearance of the real-world sign, wherein the appearance of the sign is adapted by augmenting a view of a real-world scene that includes the real-world sign with augmentation information.

15. The method of claim 14, wherein the augmentation information comprises a virtual object.

16. The method of claim 14, wherein augmenting the view of the real-world scene that includes the real-world sign with augmentation information comprises emphasizing the appearance of the real-world sign in the view with the augmentation information.

17. The method of claim 16, wherein emphasizing the appearance of the real-world sign in the view with the augmentation information comprises highlighting the appearance of the real-world sign in the view.

18. The method of claim 14, wherein augmenting the view of the real-world scene that includes the real-world sign with augmentation information comprises de-emphasizing the appearance of the real-world sign in the view with the augmentation information.

19. The method of claim 18, wherein de-emphasizing the appearance of the real-world sign in the view with the augmentation information comprises lowlighting the appearance of the real-world sign in the view.

Description:
SYSTEMS AND METHODS FOR PROVIDING AUGMENTING REALITY

INFORMATION ASSOCIATED WITH SIGNAGE

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of United States Provisional Application No. 61/908,406 filed November 25, 2013, which is hereby incorporated by reference herein.

BACKGROUND

[0002] Augmented reality (AR) may focus on combining real-world and computer-generated data, for example, by blending augmentation information and real-world footage for display to an end user, generally in real or near-real time. Today, the scope of AR may be expanded to broad application areas, such as advertising, navigation, and entertainment, to name a few. As such, there may be increasing interest in providing seamless integration of augmentation information into real-world scenes.

[0003] However, AR may present new challenges for the end-user experience, in particular for appropriately displaying augmentation information, especially in view of its use with wearable devices or computers, navigation devices, smartphones, and/or the like, and/or the display footprint limitations associated with such devices. Further, current methods or techniques for displaying data on such devices, unfortunately, may not be suitable or well thought out. For example, current methods or techniques for displaying augmentation information on devices may be arbitrary, may display or provide an excessive amount of augmentation information that may overwhelm a user, and/or the like. Such current methods or techniques for displaying augmentation information may be particularly problematic in driving situations, in which a large number of real-world signs compete for a driver's attention and too much information may be distracting, overwhelming, and/or the like.

SUMMARY

Systems and/or methods may be provided for augmenting reality. For example, a real-world sign along a route being navigated or traversed may be recognized or identified (e.g., based on directions or navigation instructions for the route and/or an expected location). A determination may be made as to whether the real-world sign may be relevant. An appearance of the real-world sign may be adapted, based, at least in part, on the real-world sign being relevant. The appearance of the sign may be adapted by augmenting a view of a real-world scene that includes the real-world sign with augmentation information such as a virtual object. The view of the real-world scene that includes the real-world sign may be augmented with augmentation information by emphasizing or de-emphasizing the appearance of the real-world sign in the view with the augmentation information.

BRIEF DESCRIPTION OF THE DRAWINGS

[0004] A more detailed understanding may be had from the detailed description below, given by way of example in conjunction with drawings appended hereto. Figures in such drawings, like the detailed description, are examples. As such, the Figures and the detailed description are not to be considered limiting, and other equally effective examples are possible and likely. Furthermore, like reference numerals in the Figures indicate like elements, and wherein:

[0005] Figure 1 is a block diagram illustrating an example of an augmented reality system;

[0006] Figure 2 is a flow diagram illustrating example flow directed to augmenting reality via a presentation unit;

[0007] Figures 3-5 are images illustrating examples of real-world views showing augmented reality in connection with real-world signage;

[0008] Figure 6 is a block diagram illustrating an example of an augmented reality system;

[0009] Figure 7 is a flow diagram illustrating example flow directed to augmenting reality via a presentation unit;

[0010] Figure 8 is a block diagram illustrating an example of an augmented reality system;

[0011] Figure 9 is a flow diagram illustrating example flow directed to augmenting reality via a presentation unit;

[0012] Figure 10 is a flow diagram illustrating example flow directed to augmenting reality via a presentation unit;

[0013] Figure 11 is a flow diagram illustrating example flow directed to augmenting reality via a presentation unit;

[0014] Figure 12 is a flow diagram illustrating example flow directed to augmenting reality via a presentation unit;

[0015] Figure 13 is a flow diagram illustrating example flow directed to augmenting reality via a presentation unit;

[0016] Figure 14 is a flow diagram illustrating example flow directed to augmenting reality via a presentation unit;

[0017] Figure 15 is a flow diagram illustrating example flow directed to augmenting reality via a presentation unit;

[0018] Figure 16 is a flow diagram illustrating example flow directed to augmenting reality via a presentation unit;

[0019] Figure 17 is a block diagram illustrating an example of an augmented reality system;

[0020] Figures 18-28 are flow diagrams illustrating example flows directed to augmenting reality via a presentation unit;

[0021] Figures 29-39 are flow diagrams illustrating example flows directed to using alerts for emphasizing real-world signage;

[0022] Figure 40A is a system diagram of an example communications system in which one or more disclosed embodiments may be implemented;

[0023] Figure 40B is a system diagram of an example wireless transmit/receive unit (WTRU) that may be used within the communications system illustrated in Figure 40A; and

[0024] Figures 40C, 40D, and 40E are system diagrams of example radio access networks and example core networks that may be used within the communications system illustrated in Figure 40A.

DETAILED DESCRIPTION

[0025] In the following detailed description, numerous specific details are set forth to provide a thorough understanding of embodiments and/or examples disclosed herein. However, it will be understood that such embodiments and examples may be practiced without some or all of the specific details set forth herein. In other instances, well-known methods, procedures, components and circuits have not been described in detail, so as not to obscure the following description. Further, embodiments and examples not specifically described herein may be practiced in lieu of, or in combination with, the embodiments and other examples described, disclosed or otherwise provided explicitly, implicitly and/or inherently (collectively "provided") herein.

[0026] As described herein, systems, methods, apparatuses, devices, and/or the like may be used to provide or display augmented reality information. For example, systems, methods, apparatuses, devices, and/or the like may provide augmented reality with respect to real-world signs, including a view of real-world scenes that include real-world signs (e.g., by way of an augmented-reality presentation and/or user interface).

[0027] As described herein, in examples, a view and/or real-world view (e.g., collectively, a real-world view) may include or may be any view of a physical space. The real-world view may be viewable via (e.g., on, through, etc.) the presentation unit. As an example, the real-world view may be, or include, a view of a physical space that includes any of the real-world signs and/or the real-world scenes having real-world signs, and/or that includes augmentation information in connection with any of the real-world signs and/or the real-world scenes. The augmentation information may be projected into the physical space (e.g., using holographic techniques and/or the like) or otherwise presented via the presentation unit so as to appear to be located or otherwise disposed within the physical space. Alternatively and/or additionally, the augmentation information may be provided or presented (e.g., displayed) so as to appear to be located or otherwise disposed on a display screen of the presentation unit. In various embodiments, some of the augmentation information may be projected into (or otherwise displayed to appear in) the physical space, and some of the augmentation information may be presented so as to appear to be located or otherwise disposed on the display screen.

[0028] In examples herein, a real-world sign may include text. A real-world sign may have one or more characteristics, features, and/or attributes (e.g., sign characteristics). The sign characteristics may include, for example, a shape (e.g., a geometric shape), aspect ratio (e.g., length-to-height ratio), orientation, whether corners may be rounded, fill color, text color, size range, and expected location (e.g., above a section of a path, road, and/or the like; on a pole disposed adjacent to a section of a path, road, etc.; and/or the like).
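
By way of a non-limiting editorial sketch, such sign characteristics might be represented in software roughly as follows; the field names, types, and default values here are illustrative assumptions rather than part of the disclosure:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SignCharacteristics:
    """Illustrative container for the sign characteristics described above."""
    shape: str = "rectangle"            # e.g., "octagon" for a stop sign
    aspect_ratio: float = 1.0           # length-to-height ratio
    orientation_deg: float = 0.0        # rotation relative to the viewer
    rounded_corners: bool = False
    fill_color: Tuple[int, int, int] = (255, 255, 255)  # RGB
    text_color: Tuple[int, int, int] = (0, 0, 0)        # RGB
    size_range_m: Tuple[float, float] = (0.3, 3.0)      # plausible physical sizes
    expected_location: Optional[str] = None             # e.g., "above roadway"

# Example: a stop sign is an octagon with a red fill and white text.
stop_sign = SignCharacteristics(shape="octagon",
                                fill_color=(204, 0, 0),
                                text_color=(255, 255, 255))
```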

[0029] The real-world sign may convey or otherwise provide a notice such as an informational notice (e.g., a notice that may include names, instructions, reminders, warnings, and/or the like), an instructional notice, and/or the like. For example, a notice may include (e.g., the real-world sign may bear) any of a name, instruction, direction, warning, advertisement (e.g., for a product and/or a service), and/or the like. In an example, the real-world sign may include a stop sign, yield sign, safety sign, a road or exit sign, a mile marker sign, and/or the like. A stop sign, for example, may provide notice to stop. A yield sign provides notice to yield. A configurable safety sign (e.g., a dynamic and/or updatable sign) may provide notice, for example, of upcoming construction, reduced speed, congestion ahead, etc. An exit sign provides notice of an exit. Other types of signs provide other types of notices.

[0030] The notice conveyed by the real-world sign may be determined, garnered, and/or otherwise obtained from the text, sign characteristics, and/or the like that may be disposed on the real-world sign.

[0031] In examples herein, a method may be provided for augmenting reality or displaying or providing augmented reality information using a presentation component, unit, or device. For example, a real-world sign may be determined or recognized along a route being navigated and/or being traversed. Further, an appearance of the real-world sign (e.g., real-world-sign appearance) may be adapted or modified by augmenting a real-world view of a real-world scene that includes the real-world sign. For example, upon determining a type of real-world sign, augmented reality information such as information associated with the real-world sign including route information, and/or the like may be provided, portions of the real-world sign may be changed (e.g., emphasized and/or de-emphasized as described herein), and/or the like such that the appearance of the real-world sign may be adapted or modified.

[0032] In an example herein, adapting the real-world-sign appearance may include, for example, emphasizing or de-emphasizing the real-world-sign appearance. Emphasizing and/or de-emphasizing the real-world-sign appearance may be carried out or performed by augmenting one or more portions of the real-world view associated with, or otherwise having connection to, the real-world sign and/or the real-world scene (e.g., portions neighboring the real-world sign). Emphasizing the real-world-sign appearance may draw attention to the real-world sign and/or to some portion of the real-world sign. De-emphasizing the real-world-sign appearance may obscure the real-world sign (e.g., may make it inconspicuous and/or unnoticeable).

[0033] According to examples herein, a method may be provided that may be directed to augmenting reality (e.g., via the presentation unit). For example, a real-world sign along a route being navigated and/or being traversed may be determined or recognized. A determination of whether the real-world sign may be relevant (e.g., a relevancy determination) may be made. In an example, the real-world-sign appearance may be adapted and/or modified by augmenting a real-world view of a real-world scene that may include the real-world sign based, at least in part, on the relevancy determination.

[0034] In one or more examples, adapting the real-world-sign appearance may be based, and/or conditioned, on the real-world sign being (e.g., determined to be) relevant and/or being (determined to be) not relevant. As described herein, in an example, the real-world-sign appearance may be adapted and/or modified by emphasizing and/or de-emphasizing its appearance. For example, the real-world-sign appearance may be (i) de-emphasized based, and/or conditioned, on the real-world sign being relevant; and/or (ii) emphasized based, and/or conditioned, on the real-world sign being not relevant. According to examples herein, adapting the real-world-sign appearance may include (i) emphasizing the real-world-sign appearance for embodiments in which adapting the real-world-sign appearance may be based, and/or conditioned on, the real-world sign being relevant; (ii) de-emphasizing the real-world-sign appearance for embodiments in which adapting the real-world-sign appearance may be based, and/or conditioned on, the real-world sign being not relevant; and/or the like. Other forms of adapting may be used and/or provided, including those in which the real-world-sign appearance may be (i) de-emphasized based, and/or conditioned, on the real-world sign being relevant; (ii) emphasized based, and/or conditioned, on the real-world sign being not relevant; and/or the like.

[0035] According to an example, adapting the real-world-sign appearance by augmenting the real-world view of the real-world scene may include displaying, via the presentation unit, a virtual object (e.g., augmentation information) in connection with the real-world sign and/or with the real-world scene disposed in the real-world view. For example, displaying the virtual object in the real-world view may include projecting or otherwise superimposing the virtual object into (onto) the real-world view of the real-world scene.

[0036] The virtual object may include, or may be, any of an overlay, text, animation, and the like. Further, in examples, the virtual object may define a shape, and this shape may be, for example, a geometric shape. The geometric shape may appear as a two-dimensional geometrical shape or as a three-dimensional geometrical shape.

[0037] The virtual object may emphasize and/or de-emphasize the real-world-sign appearance. In examples in which the virtual object may emphasize the real-world-sign appearance, the virtual object may appear as one or more of the following: (i) an outline of the real-world sign; (ii) a frame positioned about, near, or around at least a portion of (e.g., some or all of) the real-world sign; (iii) a frame positioned about, near, or around a region of interest of the real-world sign; and/or the like. The virtual object may emphasize the real-world-sign appearance by one or more of the following: (i) highlighting at least a portion of the real-world sign (e.g., some or the entire real-world sign); (ii) lowlighting at least a portion of the real-world sign; (iii) lowlighting at least a portion of the view of a real-world scene neighboring the real-world sign (e.g., some or the entire view of the real-world scene); (iv) lowlighting the real-world view of the real-world scene, for example, except for the real-world sign (e.g., highlighting the real-world sign); (v) lowlighting the view of the real-world scene except, for example, for the real-world sign and a portion of the real-world view of the real-world scene in a vicinity of the real-world sign (e.g., highlighting the real-world sign and the portion of the real-world view of the real-world scene in the vicinity of the real-world sign); and/or the like.

[0038] As described herein, in examples, the virtual object may de-emphasize the real-world-sign appearance by one or more of the following: (i) lowlighting at least a portion of the real-world sign (e.g., some or all of the real-world sign); (ii) highlighting at least a portion of the view of a real-world scene neighboring the real-world sign (e.g., some of or the entire view of the real-world scene); (iii) highlighting the real-world view of the real-world scene except, for example, for the real-world sign (e.g., lowlighting the real-world sign); (iv) highlighting the view of the real-world scene except, for example, for the real-world sign and a portion of the real-world view of the real-world scene in a vicinity of the real-world sign (e.g., lowlighting the real-world sign and the portion of the real-world view of the real-world scene in the vicinity of the real-world sign); and/or the like.
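
A minimal editorial sketch of two of the lowlighting operations above, assuming the sign has already been located as an axis-aligned bounding box within a NumPy image array (the function names and the dimming factor are assumptions, not taken from the disclosure):

```python
import numpy as np

def lowlight_except_sign(frame_rgb, sign_bbox, dim_factor=0.35):
    """Darken the scene everywhere but the sign's bounding box, which in
    effect highlights the sign (item (iv) of paragraph [0037])."""
    x, y, w, h = sign_bbox
    out = (frame_rgb.astype(np.float32) * dim_factor).astype(np.uint8)
    out[y:y + h, x:x + w] = frame_rgb[y:y + h, x:x + w]  # sign at full brightness
    return out

def lowlight_sign(frame_rgb, sign_bbox, dim_factor=0.35):
    """Darken only the sign itself, de-emphasizing it (item (i) of [0038])."""
    x, y, w, h = sign_bbox
    out = frame_rgb.copy()
    out[y:y + h, x:x + w] = (out[y:y + h, x:x + w].astype(np.float32)
                             * dim_factor).astype(np.uint8)
    return out
```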

[0039] According to examples, the virtual object may be colored, and/or have a level of transparency (e.g., a selectable level of transparency). The color of the virtual object may be visually distinct (e.g., from its surroundings). Further, in an example, the virtual object may be visually weighted. There may be one or more rules (e.g., general rules) that may determine how much weight may be assigned to a visual object. In an example, a larger size, darker color, higher contrast, greater brightness, and/or the like may make an object such as the virtual object visually weighted (e.g., these may be the one or more rules for visual weighting). For example, a part of a sign that appears in bold (e.g., via a virtual object) may be visually weighted and may be likely to grab the user's attention.

[0040] In an example, making the relevancy determination may include evaluating a set of rules (e.g., relevancy rules) that may be configured for making the relevancy determination. Making the relevancy determination, according to an example, may include determining, considering, establishing, concluding, indicating, and/or the like that the recognized real-world sign may be relevant (i.e., may be a relevant real-world sign) on condition that one or more of the relevancy rules may be satisfied.

[0041] The relevancy rules may include a default rule. The default rule may specify that the real-world sign may be relevant, for example, unless (or conditioned on) the notice conveyed by the real-world sign may be (being) otherwise specified as being not relevant (e.g., matches or corresponds to a notice specified as being not relevant). Alternatively or additionally, the default rule may specify that the real-world sign may not be relevant, for example, unless, or conditioned on, the notice conveyed by the real-world sign being otherwise specified as being relevant (e.g., may match or correspond to a notice specified as being relevant).

[0042] One or more of the relevancy rules may specify that the real-world sign may be relevant based on or conditioned on the notice being related to, and/or falling within, one or more specified classes of signage (e.g., signage classes). The specified signage classes may include, for example, one or more of the following: (i) a signage class for signage and/or attendant notices that may be related to or classified as related to safety (e.g., safety-class signage); (ii) a signage class for signage and/or attendant notices that may be related to or classified as related to directions and/or navigation (e.g., way-forward-class signage); (iii) a signage class for signage and/or attendant notices that may be related to or classified as related to advertisements (e.g., advertisement-class signage); (iv) a signage class for signage and/or attendant notices that may be related to or classified as related to a path, street, route, lane, and/or the like (e.g., passageway-class signage); (v) a signage class for signage and/or attendant notices that may be related to or classified as related to (a) having or specifying temporal conditions, and/or (b) being related to signage having or specifying temporal conditions (e.g., collectively, temporal-class signage); (vi) a signage class for signage and/or attendant notices that may be related to or classified as related to a configuration, characteristic, feature, attribute and/or accouterments of a vehicle (e.g., vehicle-configuration-class signage); and (vii) a signage class for signage and/or attendant notices that may be related to or classified as related to one or more preferences of a user (e.g., user-preference-class signage). According to an example, the specified signage classes may include other signage classes, as well. Additionally, in examples, the signage classes may be dynamically created and/or may only be limited by the entire universe of signs and/or attendant notices.
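
By way of a further non-limiting sketch, a class-based relevancy rule of the kind described in this paragraph might be evaluated roughly as follows (the enum members mirror the classes listed above; the function name and signature are assumptions for illustration):

```python
from enum import Enum, auto

class SignageClass(Enum):
    SAFETY = auto()
    WAY_FORWARD = auto()
    ADVERTISEMENT = auto()
    PASSAGEWAY = auto()
    TEMPORAL = auto()
    VEHICLE_CONFIGURATION = auto()
    USER_PREFERENCE = auto()

def sign_relevant_by_class(notice_classes, required=frozenset(),
                           excluded=frozenset()):
    """Relevant when the notice falls within at least one required class
    and within none of the excluded classes."""
    classes = set(notice_classes)
    if classes & set(excluded):
        return False
    return bool(classes & set(required))

# Example rule: relevant when way-forward-class but not advertisement-class.
print(sign_relevant_by_class({SignageClass.WAY_FORWARD},
                             required={SignageClass.WAY_FORWARD},
                             excluded={SignageClass.ADVERTISEMENT}))  # True
```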

[0043] The way-forward-class signage may include signage and/or attendant notices that may be related to one or more (e.g., turn-by-turn) directions or navigation instructions for the route being navigated and/or being traversed. The vehicle-configuration-class signage, for example, may include signage and/or attendant notices that may be related to a configuration, characteristic, feature, attribute and/or accouterments of a vehicle that may be undergoing the route being navigated and/or being traversed. The user-preference-class signage may include signage and/or attendant notices that may be selected and/or specified by a user of the vehicle undergoing the route being navigated and/or being traversed.

[0044] According to an example, the signage and/or attendant notices of a signage class may not be related in kind, type, and/or the like. The signage and/or attendant notices of a signage class may not have a common relationship. The signage and/or attendant notices of a signage class may be related to one another by belonging to such signage class. Further, in examples, the signage classes may not be mutually exclusive. For example, one or more of the signs and/or attendant notices belonging to one signage class may belong to one or more other signage classes. Alternatively or additionally, the signage classes may be mutually exclusive such that a signage class may not include signage and/or attendant notices from another signage class.

[0045] In examples herein, the signage classes may include one or more signage classes for road signs (e.g., road-signage classes). The road-signage classes may include one or more of the following: (i) a regulatory class; (ii) a warning class; (iii) a guide class; (iv) a services class; (v) a construction class; (vi) a recreation class; (vii) a school zone class; (viii) an incident management class; and/or the like.

[0046] One or more of the relevancy rules may specify that the real-world sign may be relevant conditioned on the notice (i) being associated with one or more of the specified signage classes and/or (ii) not being associated with one or more of the other specified signage classes. For example, one or more of the relevancy rules may specify that the real-world sign may be relevant based on or conditioned on the notice not being related to, and/or falling within, one or more of the specified signage classes. One or more of the relevancy rules, for example, may specify that the real-world sign may be relevant based on or conditioned on the notice not being associated with advertisement-class signage. In an example, one or more of the relevancy rules may specify the real-world sign may be relevant based on or conditioned on the notice (i) not being associated with the advertisement-class signage, but (ii) being associated with one or more of the other specified signage classes such as with the way-forward-class signage.

[0047] According to an example, one or more of the relevancy rules may specify that the real-world sign may be relevant based on or conditioned on one or more conditions (e.g., notice conditions) for complying with the notice being satisfied. The notice conditions may be specified by the notice in an example. For example, the notice conditions may be specified explicitly, such as a sign posting one or more temporal conditions (e.g., in conjunction with a speed-limit sign disposed in a school zone). Alternatively and/or additionally, the notice conditions may be specified implicitly. By way of example, the notice conditions may be one or more conditions for complying with a notice that may be instructional (e.g., as different from informational). An example of an instructional notice may include "Use Tire Chains," "Men at Work, Drive Slowly," and/or the like that may provide instruction to a user.

[0048] Such notice may be found on a sign that is a member of a signage class, such as, for example, a road-signage class. By way of example, the notice conditions for complying with a speed-limit sign (i.e., whether or not the speed limit is being satisfied) are implicitly specified by such sign. One or more of the relevancy rules, for example, may specify that a speed-limit sign (i.e., a sign posting a speed limit) may be relevant based on or conditioned on the posted speed limit (e.g., the notice condition) not being satisfied. Such relevancy rules may specify that the speed-limit sign may be relevant if, for instance, the posted speed limit may be a maximum speed limit and the speed limit may be exceeded (e.g., or exceeded by more than a threshold or tolerance). Alternatively and/or additionally, the relevancy rules may specify that the speed-limit sign may be relevant if the posted speed limit may be a minimum speed limit and the speed limit may not be exceeded (e.g., or may not be exceeded by more than a threshold or tolerance).

[0049] The notice conditions may be one or more temporal conditions. The temporal conditions may indicate one or more days of a week and/or one or more times of day, for example, for when the notice may be applicable. One or more of the relevancy rules may specify that a sign posting one or more temporal conditions (e.g., in conjunction with a speed-limit sign disposed in a school zone) may be relevant conditioned on one or more of the temporal conditions being satisfied. One or more of the relevancy rules may specify that a sign posting one or more temporal conditions is not, or might not be, relevant conditioned on one or more of the temporal conditions not being satisfied. One or more of the relevancy rules may specify that a sign posting one or more temporal conditions in conjunction with a speed-limit sign (e.g., disposed in a school zone) posting a maximum speed limit may not be relevant conditioned on the posted speed limit not being exceeded (or not being exceeded by more than a threshold or tolerance). One or more of the relevancy rules may specify that a sign posting one or more temporal conditions in conjunction with a speed-limit sign (e.g., disposed in a school zone) posting a maximum speed limit may be relevant conditioned on the posted speed limit being exceeded (e.g., or exceeded by more than a threshold or tolerance).
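
A minimal sketch of the speed-limit and temporal notice conditions discussed in this and the preceding paragraph, assuming illustrative threshold values and a weekday/time-window encoding of the temporal conditions:

```python
import datetime

def speed_limit_sign_relevant(posted_limit_mph, current_speed_mph,
                              tolerance_mph=2.0, is_minimum=False):
    """A maximum-limit sign becomes relevant when the posted limit is exceeded
    by more than a tolerance; a minimum-limit sign when speed falls below it."""
    if is_minimum:
        return current_speed_mph < posted_limit_mph - tolerance_mph
    return current_speed_mph > posted_limit_mph + tolerance_mph

def temporal_condition_active(now, weekdays, start, end):
    """True when `now` falls on one of the posted weekdays (Mon=0 .. Sun=6)
    and within the posted time-of-day window."""
    return now.weekday() in weekdays and start <= now.time() <= end

# A school-zone speed-limit sign posting "20 MPH, 7AM-4PM, SCHOOL DAYS":
now = datetime.datetime(2014, 11, 25, 8, 30)  # a Tuesday morning
zone_active = temporal_condition_active(now, weekdays={0, 1, 2, 3, 4},
                                        start=datetime.time(7, 0),
                                        end=datetime.time(16, 0))
print(zone_active and speed_limit_sign_relevant(20, 27))  # True: limit exceeded
```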

[0050] As another or additional example, one or more of the relevancy rules may specify that a sign having a requirement for tire chains (e.g., a tire-chain requirement) may be relevant based on or conditioned on the tire-chain requirement not being satisfied. Alternatively or additionally, one or more of the relevancy rules may specify that a sign having the tire-chain requirement may be relevant based on or conditioned on the requirement being satisfied. In an example, the notice conditions for the tire-chain requirement may include temporal conditions (e.g., during winter months) under which compliance with the tire-chain requirement may be applicable. Further, according to an example, one or more of the relevancy rules may specify that a sign having the tire-chain requirement along with the temporal conditions may be relevant based on or conditioned on the tire-chain requirement not being satisfied and the temporal conditions being satisfied. Alternatively or additionally, one or more of the relevancy rules may specify that a sign having the tire-chain requirement along with the temporal conditions may be relevant based on or conditioned on the tire-chain requirement being satisfied and/or the temporal conditions not being satisfied.

[0051] One or more of the relevancy rules may specify that the real-world sign may be relevant conditioned on and/or based, at least in part, on driving and/or environmental conditions along the route. As an example, one or more of the relevancy rules may specify that a speed-limit sign having a notice condition indicating a 55 mph speed limit may be relevant based on or conditioned on traffic along the route not being able to support the 55 mph. Alternatively or additionally, in an example, one or more of the relevancy rules may specify that a speed-limit sign having a notice condition indicating a 55 mph speed limit may be relevant based on or conditioned on weather conditions along the route not being able to support the 55 mph.

[0052] In another or additional example, one or more of the relevancy rules may specify that a crosswalk sign may be relevant based on or conditioned on a pedestrian being in or approaching a crosswalk. One or more of the relevancy rules may specify that a stop sign may be relevant based on or conditioned on a prediction that a timely stop is in question. Further, in an example, one or more of the relevancy rules may specify that the real-world sign may be relevant based, at least in part, on one or more directions and/or navigation instructions for navigating the route being navigated and/or being traversed.

[0053] Adapting the real-world-sign appearance may include adapting the real-world-sign appearance on condition that the real-world sign is within the field of view. In examples (e.g., in one or more of the methods described herein), a determination may be made on whether the real-world sign disposed in the real-world view may be within a field of view. For example, a device or system described herein may determine whether the real-world sign that may be disposed in the real-world view may be within the field of view. According to an example, the adapted or modified real-world-sign appearance may be viewable within the field of view. The field of view may be determined (e.g., determinable) from and/or based on user input, such as input associated with a user gaze.

[0054] Figure 1 is a block diagram illustrating an example of an augmented reality system 10 in accordance with at least some embodiments described herein. The augmented reality system 10 may be used and/or implemented in a device such as a computing device, mobile device, and/or any other suitable device that can receive, process, and present (e.g., display) information. For example, the device may be a wearable computer; a smartphone; a wireless transmit/receive unit (WTRU), such as described with reference to Figures 40A-40E (below); another type of user equipment (UE); or the like. Other examples of the computing device include a mobile device, a personal digital assistant (PDA), a cellular phone, a portable multimedia player (PMP), a digital camera, a notebook, a tablet computer, and a vehicle navigation computer (e.g., with a heads-up display). In general, the device may include a processor-based platform that operates on a suitable operating system, and that may be capable of executing software (e.g., including the methods described herein).

[0055] The augmented reality system 10 may include an image capture unit 100, a sign-recognition unit 110, an augmented reality unit 120, a presentation controller 130, and a presentation unit 140. The image capture unit 100 may capture real-world views of real-world scenes and may provide the captured real-world views to the sign-recognition unit 110 and/or the augmented reality unit 120. The image capture unit 100 may be, or include, any of a digital camera, a camera embedded in a mobile device, a head mounted display (HMD), an optical sensor, an electronic sensor, and/or the like.

[0056] The sign-recognition unit 110 may receive the captured real-world views from the image capture unit 100, and may recognize (e.g., carry out a recognition process or routine to recognize) real-world signage disposed in the real-world scenes depicted in the captured real-world views. In an example, the sign-recognition unit 110 may include an object recognition unit 112 and a notice recognition unit 114.

[0057] The object recognition unit 112 may identify real-world signage disposed in the real-world scenes depicted in the real-world views. In an example (e.g., to facilitate identifying the real-world signage), the object recognition unit 112 may perform object detection on real-world views. Using object detection, the object recognition unit 112 may detect and/or differentiate the real-world signage from other objects disposed within the real-world views. The object recognition unit 112 may use any of various known technical methodologies for performing the object detection, including, for example, edge detection, primal sketch, change(s) in viewing direction, changes in luminosity and color, and/or the like.
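
As a non-limiting illustration of one such methodology, an edge-detection pass might locate sign-like shapes roughly as follows using OpenCV (the thresholds, vertex counts, and minimum area below are assumed values, not taken from the disclosure):

```python
import cv2

def detect_sign_candidates(frame_bgr, min_area=400):
    """Find sign-like shapes via edge detection and contour approximation."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue  # ignore specks far smaller than plausible signage
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        # Rectangular (4-vertex) and octagonal (8-vertex) outlines are
        # typical of road signage.
        if len(approx) in (4, 8):
            candidates.append(cv2.boundingRect(approx))
    return candidates  # list of (x, y, w, h) bounding boxes
```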

[0058] According to an example, the object recognition unit 112 may include a depth recognition unit 116. The depth recognition unit 116, for example, may determine real-world and/or localized map locations for the detected real-world signage. The depth recognition unit 116 may use a location recognition algorithm. The location recognition algorithm used may be an algorithm such as a Parallel Tracking and Mapping (PTAM) method, a Simultaneous Localization and Mapping (SLAM) method, and/or the like. The depth recognition unit 116 may obtain and use positioning information (e.g., latitude, longitude, altitude, etc.) for determining the real-world and/or localized map location for the detected real-world signage according to an example. The positioning information may be obtained from a global positioning system (GPS) receiver (not shown) communicatively coupled to the augmented reality system 10, the sign-recognition unit 110, and/or the depth recognition unit 116, and/or via network assistance (e.g., from any type of network node of a network (self-organizing or otherwise)).
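
By way of illustration, positioning information might be compared against an expected sign location using a great-circle distance; the matching radius and function names below are assumptions for the sketch:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude fixes."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def matches_expected_location(device_fix, expected_sign_fix, radius_m=75.0):
    """True when the device's fix is within radius_m of the location at
    which the sign is expected to be disposed."""
    return haversine_m(*device_fix, *expected_sign_fix) <= radius_m

print(matches_expected_location((40.7128, -74.0060), (40.7131, -74.0055)))
```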

[0059] The notice recognition unit 114 may obtain and/or receive notices conveyed by the real-world signage (e.g., based on the real-world sign detected by the sign-recognition unit 110). For example, the notice recognition unit 114 may obtain information associated with the detected or recognized real-world signage, including an indication of the real-world signage such as a type, kind, and/or the like of the sign and/or the text, characteristics, and/or the like included in the real-world signage, from the sign-recognition unit 110. The notice recognition unit 114 may determine notices conveyed or provided by the real-world signage based on the information such as the type, kind, text, characteristics, and/or the like (e.g., determined by the sign-recognition unit 110). The notices conveyed by the real-world signage may be garnered and/or determined from text disposed on the real-world signage. Alternatively and/or additionally, the notices may be garnered and/or determined from the signage characteristics.
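
Garnering a notice from text disposed on the signage might, for example, employ optical character recognition. A minimal sketch, assuming an OpenCV frame, a bounding box from the object recognition stage, and the Tesseract engine via the pytesseract wrapper (one OCR option among many; a local Tesseract install is assumed):

```python
import cv2
import pytesseract  # thin wrapper around a Tesseract OCR installation

def read_notice_text(frame_bgr, sign_bbox):
    """Extract the notice text disposed on a detected sign region."""
    x, y, w, h = sign_bbox
    roi = frame_bgr[y:y + h, x:x + w]
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding separates the sign text from its fill color.
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return pytesseract.image_to_string(binary).strip()
```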

[0060] The augmented reality unit 120 may generate augmented reality presentations, and/or may provide the augmented reality presentations to the presentation controller 130. The augmented reality unit 120 may adapt an appearance of the real-world signage (e.g., the real-world-signage appearance). The augmented reality unit 120 may adapt the real-world-signage appearance by augmenting real-world views of real-world scenes that include the real-world signage according to an example. In an example, the augmented reality unit 120 may adapt or modify the real-world-signage appearance based on the information associated with the real-world signage recognized and/or determined or detected by the sign-recognition unit 110 and/or the notices conveyed thereby determined by the notice recognition unit 114.

[0061] The augmented reality unit 120 may augment (e.g., overlay additional information on) the real-world views of the real-world scenes by, for example, generating, for display via the presentation unit 140, virtual objects (e.g., augmentation information) for use in connection with the real-world signage (e.g., that may be determined by the sign-recognition unit 110) and/or with the real-world scenes disposed in the real-world views (e.g., and/or the notices that may be determined by the notice recognition unit 114). The augmented reality unit 120 may generate the virtual objects, for example, to enable projection or otherwise superimposition of the virtual objects into or onto the real-world views of the real-world scenes.

[0062] According to examples, a virtual object may include, or may be, any of an overlay, text, animation, and the like. In an embodiment, a sound file may be associated with the virtual object. The virtual object may define a shape, and this shape may be, for example, a geometric shape. The geometric shape may appear as a two-dimensional geometrical shape or as a three-dimensional geometrical shape.

[0063] The virtual object may emphasize and/or de-emphasize the real-world-sign appearance. In the examples in which the virtual object may emphasize (and/or de-emphasize) the real-world-sign appearance, the virtual object may appear as one or more of the following: (i) an outline of the real-world sign; (ii) a frame positioned about at least a portion of (e.g., some or all of) the real-world sign; and (iii) a frame positioned about a region of interest of the real-world sign. The virtual object may emphasize the real-world-sign appearance by one or more of the following: (i) highlighting at least a portion of (e.g., some or the entire) real-world sign; (ii) lowlighting some of the real-world sign; (iii) lowlighting at least a portion (e.g., some of or the entire view) of a real-world scene neighboring the real-world sign; (iv) lowlighting the real-world view of the real-world scene, for example, except for the real-world sign (e.g., highlighting the real-world sign); and/or (v) lowlighting the view of the real-world scene, for example, except for the real-world sign and a portion of the real-world view of the real-world scene in a vicinity of the real-world sign (e.g., highlighting the real-world sign and the portion of the real-world view of the real-world scene in the vicinity of the real-world sign).

[0064] The virtual object may de-emphasize the real-world-sign appearance by one or more of the following: (i) lowlighting at least a portion (e.g., some or all) of the real-world sign; (ii) highlighting at least a portion (e.g., some of or the entire view) of a real-world scene neighboring the real-world sign; (iii) highlighting the real-world view of the real-world scene, for example, except for the real-world sign (e.g., lowlighting the real-world sign); and/or (iv) highlighting the view of the real-world scene, for example, except for the real-world sign and a portion of the real-world view of the real-world scene in a vicinity of the real-world sign (e.g., lowlighting the real-world sign and the portion of the real-world view of the real-world scene in the vicinity of the real-world sign).

[0065] The virtual object may be colored, and/or have some (e.g., a selectable) level of transparency. In examples, the color of the virtual object may be visually distinct (e.g., from its surroundings). The virtual object may also be visually weighted with a high contrast outline, bolding of text, and/or the like (e.g., as described herein).

[0066] According to an example, the augmented reality unit 120 may generate the virtual objects in real time (e.g., on the fly or in response to a request). Alternatively, the augmented reality unit 120 may obtain the virtual objects from a virtual object repository (not shown). The virtual object repository may store the virtual objects (augmentation information) for use in connection with the real-world signage disposed in the real-world scenes. The virtual object repository may store the virtual objects in association with (e.g., indexed by) real-world and/or localized map locations commensurate with the real-world signage.

[0067] The virtual object repository may provide the virtual objects using real-world and/or localized map locations commensurate with the real-world signage, which locations may be passed to it in a query (e.g., that may be determined by the system as described herein). The virtual object repository may provide the retrieved virtual objects to the augmented reality unit 120 in response to the query according to one example.
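
A non-limiting sketch of such a location-indexed repository, assuming a simple quantized latitude/longitude key (the class, method names, and precision are illustrative assumptions):

```python
def _location_key(lat, lon, precision=4):
    """Quantize a fix so nearby queries index the same repository bucket
    (4 decimal places is roughly 11 m of latitude)."""
    return (round(lat, precision), round(lon, precision))

class VirtualObjectRepository:
    """Illustrative store of augmentation information indexed by location."""
    def __init__(self):
        self._store = {}

    def put(self, lat, lon, virtual_object):
        self._store.setdefault(_location_key(lat, lon), []).append(virtual_object)

    def query(self, lat, lon):
        """Return the virtual objects commensurate with the queried fix."""
        return self._store.get(_location_key(lat, lon), [])

repo = VirtualObjectRepository()
repo.put(40.7128, -74.0060, {"type": "frame", "color": (255, 215, 0)})
print(repo.query(40.71281, -74.00601))  # same bucket after quantization
```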

[0068] The augmented reality unit 120 may provide generated and/or obtained augmentation information (e.g., the virtual objects) for adapting the real-world signage to the presentation controller 130. The presentation controller 130 may obtain the augmentation information for adapting the real-world signage from the augmented reality unit 120. The presentation controller 130 may translate the obtained augmentation information for presentation via the presentation unit 140. The presentation controller 130 may provide the translated augmentation information to the presentation unit 140.

[0069] The presentation unit 140 may present real-world views and/or an augmented reality presentation and/or a presentation portion of an augmented reality user interface (e.g., collectively, an augmented reality presentation). The presentation unit 140, for example, may receive the translated augmentation information (e.g., from the presentation controller 130) and may adapt and/or modify the real-world-signage appearances by applying the augmentation information to the real-world views.

[0070] The presentation unit 140 may be any type of device for presenting visual and/or audio presentations, such as a display controller, an audio controller, a display, a speaker, and/or the like. In an example, the presentation unit 140 may include a screen of a device. The presentation unit 140 may be or may include any type of display, including, for example, a windshield display, a wearable device or computer (e.g., glasses such as smart glasses, watches such as a smart watch, and/or the like), a smartphone screen, a navigation system, and/or the like. One or more user inputs may be received by, through, and/or in connection with user interaction with the presentation unit 140. For example, a user may input a user input or selection by and/or through touching, clicking, drag-and-dropping, gazing at, voice/speech recognition, and/or other interaction in connection with real-world views and/or augmented reality presentations presented via the presentation unit 140. For example, the presentation unit 140 may include a touch screen associated with the display and/or a microphone that may receive gestures and/or audio commands as user input. The user input may then be used to adjust one or more of the real-world views or scenes, real-world signage, virtual objects, augmentation information, and/or the like that may be provided by the augmented reality system 10 and/or the units therein.

[0071] Figure 2 is a flow diagram illustrating example method 200 for augmenting reality (e.g., via the augmented reality system 10 including a presentation unit). As described herein, the method 200 may be described with reference to the augmented reality system of Figure 1, and to the illustrative example images or visual depictions shown in Figures 3-5. The method 200 may be carried out using other architectures in additional or alternative examples.

[0072] At 202, the sign-recognition unit 110 may recognize and/or determine a real-world sign along a route being navigated and/or being traversed as described herein. For example, the sign-recognition unit 110 may recognize and/or determine a type or kind of a real-world sign, characteristics of the real-world sign, one or more notices associated with the real-world sign, and/or the like. According to an example, the sign-recognition unit 110 may recognize exit sign 300 as shown in Figure 3, a sign 400 as shown in Figure 4 posting temporal conditions in connection with a speed limit sign located in a school zone, or a sign 500 as shown in Figure 5 posting "NO TRUCKS ... IN LEFT LANE." As described herein, the sign-recognition unit 110 may recognize and/or determine the real-world sign by carrying out a recognition process or routine to recognize real-world signage disposed in the real-world scenes depicted in the captured real-world views (e.g., that may be received from the image capture unit 100). In an example, the sign-recognition unit 110 may use the object recognition unit 112 and the notice recognition unit 114 to recognize and/or determine the real-world sign at 202 as described herein.

[0073] At 204, the augmented reality unit 120, in connection with the presentation controller 130 and/or the presentation unit 140, may adapt the real-world-sign appearance by augmenting a real-world view of a real-world scene that includes the real-world sign as described herein. For example, the augmented reality unit 120 may receive an indication or information associated with the real-world sign determined and/or recognized at 202 and may adapt or modify, at 204, the appearance of the real-world sign determined or recognized. The augmented reality unit 120 may, for example, generate a virtual object to emphasize the real-world sign. As shown in Figure 3, the virtual object may be a frame 302 positioned about the exit sign 300, for instance. Alternatively or additionally, as shown in Figure 4, the virtual object may highlight 402 at least a portion (e.g., one or more of the temporal conditions) or the entire sign 400. The virtual object may highlight the sign posting "NO TRUCKS ... IN LEFT LANE" by obscuring a portion 502 of the real-world scene that includes such sign, as shown in Figure 5. The augmented reality unit 120 may provide the generated virtual object to the presentation controller 130 and/or the presentation unit 140, which, in turn, may adapt the real-world-sign appearance by using the virtual object to augment the real-world view of the real-world scene that includes the real-world sign.

[0074] Although not shown in Figure 1, the augmented reality system 10 may include a field-of-view determining unit. The field-of-view determining unit may interface with the image capture unit 100 and/or a user tracking unit (not shown) to determine whether the real-world sign disposed in the real-world view is within a field of view of a user (e.g., as part of the method 200). The user tracking unit may be, for example, an eye tracking unit.

[0075] The eye tracking unit may use or employ eye tracking technology to gather data about eye movement from one or more optical sensors, and based on such data, track where the user may be gazing and/or may make user input determinations based on various eye movement behaviors. The eye tracking unit may use any of various known techniques to monitor and track the user's eye movements.

[0076] The eye tracking unit may receive inputs from optical sensors that face the user, such as, for example, the image capture unit 100, a camera (not shown) capable of monitoring eye movement as the user views the presentation unit 140, or the like. The eye tracking unit may detect the eye position and the movement of the iris of each eye of the user. Based on the movement of the iris, the eye tracking unit may make various observations about the user's gaze. For example, the eye tracking unit may observe saccadic eye movement (the rapid movement of the user's eyes), and/or fixations (dwelling of eye movement at a particular point or area for a certain amount of time).

[0077] The eye tracking unit may generate one or more inputs by employing an inference that a fixation on a point or area (e.g., collectively, a focus region) on the screen of the presentation unit 140 may be indicative of interest in a portion of the real-world view underlying the focus region. The eye tracking unit, for example, may detect or determine a fixation at a focus region on the screen of the presentation unit 140 and may generate the field of view based on the inference that fixation on the focus region may be a user expression of designation of the real-world sign.
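
By way of a non-limiting sketch, a dwell-based fixation inference of the kind described above might look roughly as follows (the radius and dwell thresholds are assumed values, and the sample format is an assumption for illustration):

```python
import math

def detect_fixation(gaze_samples, radius_px=30.0, min_dwell_s=0.2):
    """Infer a fixation: consecutive gaze samples dwelling within a small
    radius for at least min_dwell_s. Each sample is (timestamp_s, x_px, y_px).
    Returns the focus-region center, or None if no fixation is found."""
    if not gaze_samples:
        return None
    t0, x0, y0 = gaze_samples[0]
    for t, x, y in gaze_samples[1:]:
        if math.hypot(x - x0, y - y0) > radius_px:
            t0, x0, y0 = t, x, y      # gaze moved (saccade): restart the window
        elif t - t0 >= min_dwell_s:
            return (x0, y0)           # dwelled long enough: report a fixation
    return None
```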

[0078] The eye tracking unit may also generate one or more of the inputs by employing an inference that the user's gaze toward, and/or fixation on a focus region corresponding to, one of the virtual objects may be indicative of the user's interest (or a user expression of interest) in the corresponding virtual object. One or more inputs indicating an interest in the real-world sign may include a location (e.g., one or more sets of coordinates) associated with the real-world view according to an example.

[0079] With reference to Figure 2, the augmented reality unit 120 (in connection with the presentation controller 130 and/or the presentation unit 140) may adapt or modify (e.g., at 204) the real-world-sign appearance on condition that the real-world sign is within the field of view. Alternatively or additionally, the augmented reality unit 120 (in connection with the presentation controller 130 and/or the presentation unit 140) may adapt the real-world-sign appearance for a field of view that may be determined (e.g., determinable) from, and/or based on, user input. In various embodiments, the augmented reality unit 120 (in connection with the presentation controller 130 and/or the presentation unit 140) may adapt the real-world-sign appearance for a field of view that is determinable from, and/or based on, input associated with a user gaze.

[0100] Figure 6 is a block diagram illustrating an example of an augmented reality system 20 in accordance with at least some embodiments described herein. The augmented reality system 20 may be used and/or implemented in a device such as a computing device as described herein. The augmented reality system 20 of Figure 6 may be similar to the augmented reality system 10 of Figure 1 (e.g., except as described herein below). The augmented reality system 20 may include the image capture unit 100, the sign-recognition unit 110, a relevancy-determining unit 210, an augmented reality unit 220, the presentation controller 130 and the presentation unit 140.

[0101] The image capture unit 110 may capture an image as described herein and may provide the image to the sign-recognition unit 110. According to examples, the sign- recognition unit 110 may recognize and/or determine a real- world sign in the image and/or notices associated with the sign as described herein (e.g., above with respect to Figure 1). The sign-recognition unit 110 may provide information and/or an indication of the recognized and/or determined sign to the relevancy- determining unit 210 in one example.

[0102] The relevancy-determining unit 210 may make determinations of whether real-world sign or signage may be relevant (e.g., relevancy determinations). The relevancy- determining unit 210 may include a relevancy- determining engine 212. The relevancy-determining unit 210 and/or the relevancy- determining engine 212 may be communicatively coupled with a relevancy-rule repository 214. The relevancy-rule repository 214 may store a set of rules (e.g., relevancy rules) that may be configured for making the relevancy determination as described herein.

[0103] The relevancy rules may include a default rule. The default rule may specify that the real-world sign may be relevant, unless (or conditioned on) the notice (e.g., that may be determined from the notice recognition unit 114 of the sign-recognition unit 110) conveyed by the real-world sign may be otherwise specified as being not relevant (e.g., may match or correspond to a notice specified or indicated as being not relevant). Alternatively or additionally, the default rule may specify that the real- world sign may not be relevant unless, or conditioned on, the notice conveyed by the real-world sign being otherwise specified as being relevant (e.g., may matches or corresponds to a notice specified or indicated as being relevant).

[0104] In examples herein, one or more of the relevancy rules may specify that the real-world sign may be relevant conditioned on the notice being related to, and/or falling within, one or more specified signage classes. The specified signage classes may include, for example, one or more of the following: (i) a signage class for safety-class signage; (ii) a signage class for way- forward-class signage; (iii) a signage class for advertisement-class signage; (iii) a signage class for passageway-class signage; (iv) a signage class for temporal-class signage; (v) a signage class for vehicle-configuration-class signage; (vi) a signage class for user-preference-class signage. The specified signage classes may include other signage classes as well.

[0105] The way-forward-class signage may include signage, and/or attendant notices, related to one or more (e.g., turn-by-turn) directions or navigation instructions for the route being navigated and/or being traversed. The vehicle-configuration-class signage, in various examples, may include signage, and/or attendant notices, related to a configuration, characteristic, feature, attribute and/or accouterments of a vehicle that may be undergoing the route being navigated and/or being traversed. The user-preference-class signage, in examples, may include signage, and/or attendant notices, selected and/or specified by a user of the vehicle undergoing the route being navigated and/or being traversed.

[0106] The signage, and/or attendant notices, of a signage class may not be related in kind, type, and/or the like. Further, in an example, the signage and/or attendant notices of a signage class may not have a common relationship. The signage, and/or attendant notices, of a signage class may be related to one another by belonging to such signage class. The signage classes may not be mutually exclusive in one example as described herein. For example, one or more of the signs, and/or attendant notices, belonging to one signage class may belong to one or more other signage classes. Alternatively or additionally, the signage classes may be mutually exclusive such that no signage class includes signage and/or attendant notices from another signage class.

[0107] As described herein, the signage classes may include road-signage classes. The road- signage classes may include one or more of the following: (i) a regulatory class; (ii) a warning class; (iii) guide class; (iv) a services class; (v) a construction class; (vi) recreation class; (vii) a school zone class; (viii) incident management class; and/or the like. [0108] According to an example, one or more of the relevancy rules may specify that the real- world sign may be relevant based on or conditioned on the notice (i) being associated with one or more of the specified signage classes and/or (ii) not being associated with one or more of the other specified signage classes. One or more of the relevancy rules may specify that the real- world sign may be relevant based on or conditioned on the notice not being related to and/or falling within, one or more of the specified signage classes. One or more of the relevancy rules, for example, may specify that the real-world sign may be relevant based on or conditioned on the notice not being associated with advertisement-class signage. One or more of the relevancy rules may specify the real-world sign may be relevant conditioned on the notice (i) not being associated with the advertisement-class signage, but (ii) being associated with one or more of the other specified signage classes, such as with the way-forward-class signage.

[0109] As described herein, according to an example, one or more of the relevancy rules may specify that the real- world sign may be relevant based on or conditioned on notice conditions for complying with the notice being satisfied. The notice conditions may be specified by the notice according to an example. The notice conditions may be specified explicitly. Alternatively and/or additionally, the notice conditions may be specified implicitly. The notice conditions may be one or more conditions for complying with a notice that may be instructional (e.g., as different from being merely informational). Such notice may be found on a sign that may be a member of a signage class, such as, for example, a road-signage class. By way of example, the notice conditions for complying with a speed-limit sign (i.e., whether or not the speed limit is being satisfied) may be implicitly specified by such sign.

[0110] One or more of the relevancy rules, for example, may specify that a speed-limit sign (i.e., a sign positing a speed limit) may be relevant based on or conditioned on the posted speed limit (notice condition) not being satisfied. Such relevancy rules may specify that the speed- limit sign may be relevant if, for example, the posted speed limit may be a maximum speed limit and the speed limit may be exceeded (e.g., or exceeded by more than a threshold or tolerance). Alternatively and/or additionally, the relevancy rules may specify that the speed- limit sign may be relevant if the posted speed limit may be a minimum speed limit and the speed limit may not be exceeded (e.g., or not being exceeded by more than a threshold or tolerance).

[0111] According to one example, the notice conditions may be one or more temporal conditions. The temporal conditions may indicate one or more days of a week and/or one or more times of day, for example, for when the notice may be applicable. One or more of the relevancy rules may specify that a sign posting one or more temporal conditions (e.g., in conjunction with a speed-limit sign disposed in a school zone) may be relevant based on or conditioned on one or more of the temporal conditions being satisfied. One or more of the relevancy rules may specify that a sign posting one or more temporal conditions may not be relevant based on or conditioned on one or more of the temporal conditions not being satisfied. One or more of the relevancy rules may specify that a sign posting one or more temporal conditions in conjunction with a speed-limit sign (e.g., disposed in a school zone) posting a maximum speed limit may not be relevant based on or conditioned on the posted speed limit not being exceeded (e.g., or not being exceeded by more than a threshold or tolerance). Further, in an example, one or more of the relevancy rules may specify that a sign posting one or more temporal conditions in conjunction with a speed-limit sign (e.g., disposed in a school zone) posting a maximum speed limit may be relevant conditioned on the posted speed limit being exceeded (e.g., or exceeded by more than a threshold or tolerance).

[0112] In examples herein, one or more of the relevancy rules may specify that a sign having a tire-chain requirement may be relevant based on or conditioned on the tire-chain requirement not being satisfied. Alternatively or additionally, one or more of the relevancy rules may specify that a sign having the tire-chain requirement may be relevant based on or conditioned on the requirement being satisfied. The notice conditions for the tire-chain requirement may include temporal conditions (e.g., during winter months) under which compliance with the tire-chain requirement may be applicable. Further, one or more of the relevancy rules may specify that a sign having the tire-chain requirement along with the temporal conditions may be relevant based on or conditioned on the tire-chain requirement not being satisfied and the temporal conditions being satisfied. Alternatively or additionally, one or more of the relevancy rules may specify that a sign having the tire-chain requirement along with the temporal conditions may be relevant based on or conditioned on the tire-chain requirement being satisfied and/or the temporal conditions not being satisfied.

[0113] One or more of the relevancy rules may specify that the real-world sign may be relevant based on or conditioned on and/or based, at least in part, on driving and/or environmental conditions along the route. As an example, one or more of the relevancy rules may specify that a speed-limit sign having a notice condition indicating a 55 mph speed limit may b, relevant based on or conditioned on traffic along the route not being able support the 55 mph. Alternatively or additionally, one or more of the relevancy rules may specify that a speed-limit sign having a notice condition indicating a 55 mph speed limit may be relevant conditioned on weather conditions along the route not being able support the 55 mph.

[0114] One or more of the relevancy rules may specify that a cross-walk sign may be relevant based on or conditioned on a pedestrian being in or approaching a cross walk. One or more of the relevancy rules may specify that a stop sign may be relevant conditioned on a prediction that a timely stop may be in question (e.g., that the vehicle and/or user may be traveling too fast to stop at the upcoming stop sign). According to additional examples, one or more of the relevancy rules may specify that the real- world sign may be relevant based, at least in part, on one or more directions and/or navigation instructions for navigating the route being navigated and/or being traversed as described herein. Other relevancy rules may be provided and/or used as well that may indicate whether or not a sign, notice, and/or the like may or may not be relevant. Further, as describe herein, the relevancy rules may indicate whether a sign, notice, and/or the like may or may not be relevant based on characteristics such as a speed of £1 Cell", £1 location of £1 Cell", £1 distance of a car from, for example, a sign, a route the car may be traveling, and/or the like.

[0115] The relevancy-rule repository 214 may provide the one or more of the relevancy rules to the relevancy-determining engine 212. The relevancy-determining engine 212 may evaluate the relevancy rules to make the relevancy determination (e.g., to determine whether to apply the rule and/or adapt real-world sign and/or view as described herein, for example, with a virtual object and/or information). Making the relevancy determination may include determining, considering, establishing, concluding, indicating, and/or that the recognized real- world sign may be relevant (i.e., may be a relevant real- world sign) based on or on condition that one or more of the relevancy rules may be satisfied.

[0116] The augmented reality unit 220 may generate augmented reality presentations, and provide the generated augmented reality presentations to the presentation controller 130. The augmented reality unit 220 may adapt a real-world-signage appearance (e.g., as described herein with a virtual object and/or information) based on the relevancy determination. In examples, the augmented reality unit 220 may adapt the real- world- sign appearance based, and/or conditioned, on the real-world sign being (e.g., determined to be) relevant. The augmented reality unit 220 may adapt the real-world-sign appearance based, and/or conditioned, on the real-world sign being (e.g., determined to be) not relevant. As described herein, the augmented reality unit 220 may adapt or modify the real-world-sign appearance by emphasizing and/or de- emphasizing the real-world-sign appearance. As examples, the augmented reality unit 220 may (i) de-emphasize the real-world-sign appearance based, and/or conditioned, on the real- world sign being relevant; and/or (ii) emphasize the real- world- sign appearance based, and/or conditioned, on the real- world sign being not relevant. As noted supra, for simplicity of exposition in the present description, adapting the real-world-sign appearance may include (i) emphasizing the real- world- sign appearance for embodiments in which adapting the real- world- sign appearance may be based, and/or conditioned on, the real- world sign being relevant; and (ii) de- emphasizing the real-world-sign appearance for embodiments in which adapting the real-world-sign appearance may be based, and/or conditioned on, the real-world sign being not relevant. Other forms of adapting may be provided and/or used including those in which the real-world-sign appearance may be (i) de- emphasized based, and/or conditioned, on the real-world sign being relevant; and/or (ii) emphasized based, and/or conditioned, on the real-world sign being not relevant.

[0117] The augmented reality unit 220 may augment the real-world views of the real-world scenes by, for example, generating for display via the presentation unit 140, virtual objects (e.g., augmentation information) for use in connection with the real- world signage and/or with the real-world scenes disposed in the real-world views as described herein. The augmented reality unit 220 may generate the virtual objects to provide or enable projection or otherwise superimposition of the virtual objects into (or onto) the real-world views of the real-world scenes.

[0118] The augmented reality unit 220 may generate the virtual objects in real-time (e.g., on the fly) as described herein. Alternatively or additionally, the augmented reality unit 220 may obtain the virtual objects from a virtual object repository (not shown). The virtual object repository may store the virtual objects (e.g., augmentation information) for use in connection with the real-world signage disposed in the real-world scenes. The virtual object repository may store the virtual objects in association with (e.g., indexed by) real-world and/or localized map locations commensurate with the real-world signage.

[0119] The virtual object repository may retrieve the virtual objects using real- world and/or localized map locations commensurate with the real-world signage, which locations may be passed to it in a query. The virtual object repository may provide the retrieved the virtual objects to the augmented reality unit 120 in response to the query as described herein.

[0120] The augmented reality unit 220 may provide generated and/or obtained augmentation information (e.g., the virtual objects) for adapting the real-world sign or signage to the presentation controller 130. The presentation controller 130 may obtain the augmentation information for adapting the real-world signage from the augmented reality unit 220. The presentation controller 130 may translate the obtained augmentation information for display via the presentation unit 140. For example, a road sign that may include a symbol may be converted into a natural language. As an example, a symbol with a line across a circle may be translated into natural language such as "Do not Enter." The presentation controller 130 may provide the translated augmentation information to the presentation unit 140 such that the augmentation information may be provided on or near the real-world sign in the real-world view or scene as described herein.

[0121] As described herein, in examples, the presentation unit 140 may present real-world views and/or an augmented reality presentation. The presentation unit 140, for example, may receive the translated augmentation information, and adapt the real-world-signage appearances by applying the augmentation information to the real-world views as described herein. [0122] Figure 7 is a flow diagram illustrating example method 700 for augmenting reality according to examples herein. The method 700 maybe described with reference to the augmented reality system of Figure 6, and to the illustrative example images shown in Figures 3-5. The method 700 may be carried out using other architectures as well (e.g., such as with other systems that may implement augmented reality).

[0080] At 702, the sign-recognition unit 110 may recognize and/or determine a real- world sign along a route being navigated and/or being traversed. The sign-recognition unit 110 may recognize, for example, an exit sign 300 as shown in Figure 3, a sign 400 as shown in Figure 4) posting temporal conditions in connection with a speed limit sign located in a school zone, and/or a sign 500 as shown in Figure 5 posting "NO TRUCKS ... IN LEFT LANE" (e.g., as described herein, for example, similar to 202). For example, the sign-recognition unit 110 may recognize and/or determine the real-world sign by carrying out a recognition process or routine to recognize real-world signage disposed in the real-world scenes depicted in the captured real-world views (e.g., that may be received from the image capture unit 100). In an example, the sign-recognition unit 110 may use the object recognition unit 112 and the notice recognition unit 114 to recognize and/or determine the real-world sign at 702 as described herein.

[0123] At 704, the relevancy-determining unit 210 may make a relevancy determination. The relevancy-determining engine 212, for example, may obtain one or more of the relevancy rules from the relevancy-rule repository 214 at 704. The relevancy- determining engine 212 may evaluate the relevancy rules with respect to the recognized real-world sign to make the relevancy determination at 704. The relevancy-determining engine 212 may determine, consider, establish, conclude, indicate, and/or the like that the recognized real- world sign may be relevant (i.e., maybe a relevant real-world sign) on condition that one or more of the relevancy rules may be satisfied as described herein.

[0124] At 706, the augmented reality unit 220 in connection with the presentation controller 130 and/or the presentation unit 140 may adapt, based and/or conditioned on the relevancy determination, the real-world-sign appearance by augmenting a real-world view of a real- world scene that includes the real- world sign (e.g., similar to 204 and further using the relevancy determination). The augmented reality unit 220 may, for example, generate a virtual object to emphasize the real-world sign based on or on conditioned on the real-world sign being relevant (e.g., as determined by the relevancy-determining unit 210 at 204). In an example, the relevancy- determining unit 210 may determine that the exit sign 300 (Figure 3) may be relevant, at 204 (e.g., based on one or more of the relevancy rules such as a rule the vehicle should proceed to exit off the road associated with the exit sign 300 as it may be associated with a route the vehicle may be traveling on or should be traveling on, it may have a good and/or service desired by a user or operator of the vehicle, and/or the like). The virtual object of a frame 302 may be generated and output or provided on (e.g., to adapt) the exit sign 300 at 206 by the augmented reality unit 220 in connection with the presentation controller 130 and/or the presentation unit 140 as described. Similarly, according to an example, the relevancy-determining unit 210 may determine that the sign 400 (Figure 4) may be relevant at 204 (e.g., based on one or more of the relevancy rules associated with the sign 400). The virtual object of a highlight 402 may be generated by the augmented reality unit 220 in connection with the presentation controller 130 and/or the presentation unit 140 as described to highlight (e.g., to adapt) some (e.g., the temporal conditions) or the entire sign 400 at 206. In an example, the relevancy- determining unit 210 may determine sign 500 (Figure 5) may be relevant at 204 (e.g., based on one or more of the relevancy rules associated with the sign 450). The virtual object of a highlight or lowlight of the sign 500 may be generated by the augmented reality unit 220 in connection with the presentation controller 130 and/or the presentation unit 140 as described at 206 to obscure a portion 502 of the real- world scene that includes such sign 500. As such, in an example, at 206, the augmented reality unit 220 may provide the generated virtual object to the presentation controller 130 and/or the presentation unit 140, which, in turn, may adapt the real- world-sign appearance by using the virtual object to augment the real-world view of a real- world scene that includes the real-world sign where the virtual object may be determined or generated based on a sign being relevant.

[0125] Although not shown, similar to the augmented reality system 10 of Figure 1 described above, the augmented reality system 20 may include a field-of-view determining unit. The field-of-view determining unit may interface with the image capture unit 100 and/or a user tracking unit to determine whether the real-world sign disposed in real- world view is within a field of view of a user. As described herein, the user tracking unit may be, for example, an eye tracking unit.

[0126] With reference again to Figure 7, the augmented reality unit 220 (in connection with the presentation controller 130 and/or the presentation unit 140) may adapt the real-world- sign appearance on condition that the real- world sign may be within the field of view (e.g., similar to the augmented reality unit 120 of Figure 1). Alternatively or additionally, the augmented reality unit 220 (in connection with the presentation controller 130 and/or the presentation unit 140) may adapt the real-world-sign appearance for a field of view that may be determinable from and/or based on user input. In various embodiments, the augmented reality unit 220 (in connection with the presentation controller 130 and/or the presentation unit 140) may adapt the real-world-sign appearance for a field of view that may be determinable from and/or based on input associated with a user gaze. [0127] Figure 8 is a block diagram illustrating an example of an augmented reality system 30 in accordance with at least some embodiments described herein. The augmented reality system 30 may be used and/or implemented a device such as a computing device. The augmented reality system 30 of Figure 8 may be similar to the augmented reality system 20 of Figure 6 and/or the augmented reality system 10 of Figure 1 (e.g., except as described herein below). The augmented reality system 30 may include an image capture unit 100, a navigation unit 800, a sign-determining unit 802, a sign-recognition unit 110, a relevancy- determining unit 810, an augmented reality unit 220, a presentation controller 130 and a presentation unit 140 according to an example.

[0128] According to an example, the navigation unit 800 may generate directions and/or navigations instructions (e.g., collectively navigation instructions) for a route to be navigated. This navigation unit may track progress along the route, and/or may make adjustments to the route. The adjustments to the route may be based, and/or condition, on current position, traffic environmental conditions (e.g., snow or rainfall), updates received about the knowledge of route (e.g., destination or different way points). The navigation unit 800 may provide the navigation instructions to the sign -determining unit 802. The sign- determining unit 802 may receive or obtain one or more (e.g., a set and/or list of) real- world signs associated with the route to be navigated, based, at least in part, on the navigations instructions obtained from the navigation unit 800. The sign -determining unit 802 may obtain the real-world signs associated with the route to be navigated from a sign repository (not shown). The sign- determining unit 802 may, for example, query the sign repository using the navigation instructions. The sign repository may provide the real-world signs associated with the route to be navigated to the sign- determining unit 802 in response to the query.

[0129] The sign repository may be, or include a repository or a collection of repositories that may include geo-references to (e.g., locations and/or real-world geographic positions of), and/or details of, real-world signs disposed in connection with one or more spatial area of the earth. In examples, the sign repository may be, or include, a point cloud, point cloud library, and/or the like that may include geo-references to, and/or details of, real-world signs disposed in connection with one or more spatial area of the earth.

[0130] The details of a real- world sign may include, for example, an indication that a real- world sign may exist at the particular geo-reference to such sign; an indication of type of sign, such as, for example, a code indicating the particular type of sign and/or notice conveyed by the sign; elevation of the sign; and/or the like. In some examples, the details of a real-world sign may be limited to an indication that a real-world sign exists at the particular geo- reference to such sign. In such embodiments, additional details of the real- world signs may be determined based on (e.g., deduced, inferred, and/or the like from) other data and/or corresponding geo-references in the sign repository. For example, one or more details of a real- world sign may be deduced from the geo-reference to a real-world sign being near (e.g., in close proximity to) a corner at a four-way intersection between two roads, an exit off a highway, an entrance onto a highway, and/or the like; and/or from the geo-reference to the real-world sign being in a particular jurisdiction (e.g., country, municipality, and/or the like). Alternatively and/or additionally, additional details of the real-world sign may be obtained from one or more repositories having details of the real-world sign populated therein. The details may be populated into these repositories, for example, responsive to (e.g., the sign-recognition unit 110) recognizing or determining as described herein (e.g., above) the real-world during or otherwise in connection with a previous navigation, and/or traversal of, locations and/or real- world geographic positions corresponding to the geo-reference to the sign. Alternatively and/or additionally, the details may be populated into the repositories responsive to user input. The user input may be entered in connection with a previous navigation, and/or traversal of, locations and/or real-world geographic positions corresponding to the geo-reference to the sign. Alternatively or additionally, the user input may be entered responsive to viewing the real- world sign in one or more images (e.g., captured by the image capture unit 100). According to an example, the details may be populated into the repositories responsive to recognizing the real-world sign depicted in one or more images, and/or from one or more sources from which to garner or provide the details (e.g., web pages).

[0131] The sign repository may be stored locally in memory of the device, and may be accessible to (e.g., readable and/or writable by) a processor of the device. Alternatively and/or additionally, the sign repository may be stored remotely from the computing device, such as, for example, in connection with a server remotely located from the computing device (e.g., that may be accessed via a network connection with the device). Such server may be available and/or accessible to the device via wired and/or wireless communication, and the server may serve (e.g., provide a web service for obtaining) the real-world signs associated with the route to be navigated. The server may also receive from the device (e.g., the sign-recognition unit 110), and/or populate the sign repository with, details of the real- world signs.

[0132] The sign-determining unit 802 may pass the real-world signs associated with the route to be navigated obtained from the sign repository to the relevancy- determining unit 810. The relevancy- determining unit 810 may obtain the real-world signs associated with the route to be navigated from the sign-recognition unit 110, and/or may make relevancy determinations for the obtained real-world signs. The relevancy- determining unit 810 may include a relevancy- determining engine 812. The relevancy-determining unit 810 and/or the relevancy- determining engine 812 may communicatively couple with a relevancy-rule repository 814. The relevancy- rule repository 814 may store relevancy rules (e.g., as described herein, for example, above). These relevancy rules may be similar or the same as, and/or include some or all of, the relevancy rules stored in the relevancy- rule repository 214 (Figure 6), for example, except that the relevancy rules stored in the relevancy-rule repository 814 may be for determining whether the real-world signs associated with the route to be navigated may be relevant (e.g., as different from determining whether real- world signs recognized along a route being navigated and/or traversed may be relevant). The relevancy rules for determining whether the real- world signs associated with the route to be navigated (e.g., which may be relevant) may include other rules, as well.

[0133] The relevancy-rule repository 814 may provide the one or more of the relevancy rules to the relevancy-determining engine 812. The relevancy-determining engine 812 may evaluate the relevancy rules with respect to the real-world signs associated with the route to be navigated to make the relevancy determination. Making the relevancy determination may include determining, considering, establishing, concluding, indicating, and/or the like that the real-world signs associated with the route to be navigated may be relevant based on or on condition that one or more of the relevancy rules may be satisfied.

[0134] As described herein (e.g., above), the image capture unit 100 may capture real-world views of real-world scenes, for example, in connection with the locations and/or real-world geographic positions of the real-world signs that may be relevant (e.g., relevant-real-world signs) and may provide the captured real-world views to the sign-recognition unit 110 and/or the augmented reality unit 220.

[0135] The sign-recognition unit 110 may receive the captured real- world views from the image capture unit 100 and may recognize the real-world signage disposed in the real-world scenes depicted in the captured real-world views as described herein. The sign-recognition unit 110 may recognize or determine the relevant-real- world signs. The sign-recognition unit 110 may also recognize real-world signage other than the relevant-real-world signs, including real-world signage determined to be not relevant. The sign-recognition unit 110 may provide the recognized real-world signage to the sign repository for inclusion therein. The sign- recognition unit 110 may provide the recognized relevant-real-world signs to the augmented reality unit 220 such that the sign may be adapted and/or modified (e.g., with a virtual object or augmentation information as described herein).

[0136] The augmented reality unit 220 may generate augmented reality presentations and may provide the generated augmented reality presentations to the presentation controller 130. The augmented reality unit 220 may adapt a real-world-signage appearance based on the relevancy determination as described herein. In examples, the augmented reality unit 220 may adapt the real-world-sign appearance based, and/or conditioned, on the real-world sign being (e.g., determined to be) relevant. Further, the augmented reality unit 220 may adapt the real- world- sign appearance based, and/or conditioned, on the real-world sign being (e.g., determined to be) not relevant. The augmented reality unit 220 may augment the real-world views of the real-world scenes by, for example, generating for display via the presentation unit 140, virtual objects (e.g., augmentation information) for use in connection with the relevant- real-world sign or signage and/or with the real-world scenes disposed in the real-world views. The augmented reality unit 220 may generate the virtual objects to provide or enable projection or otherwise superimposition of the virtual objects into (onto) the real-world views of the real- world scenes.

[0137] The augmented reality unit 220 may generate the virtual objects on the fly. Alternatively, the augmented reality unit 220 may obtain the virtual objects from a virtual object repository. The virtual object repository may store the virtual objects (augmentation information) for use in connection with the real-world signage disposed in the real-world scenes. The virtual object repository may store the virtual objects in association with (e.g., indexed by) real-world and/or localized map locations commensurate with the real-world signage.

[0138] The virtual object repository may retrieve the virtual objects using real-world and/or localized map locations commensurate with the real-world signage, which locations may be passed to it in a query. The virtual object repository may provide the retrieved the virtual objects to the augmented reality unit 220 in response to the query.

[0139] The augmented reality unit 220 may provide generated and/or obtained augmentation information (e.g., the virtual objects) for adapting the real-world signage to the presentation controller 130. The presentation controller 130 may obtain the augmentation information for adapting the real-world signage from the augmented reality unit 220. The presentation controller 130 may translate the obtained augmentation information for display via the presentation unit 140. The presentation controller 140 may provide the translated augmentation information to the presentation unit 140.

[0140] The presentation unit 140 may present real-world views and/or an augmented reality presentation. The presentation unit 140, for example, may receive the translated augmentation information, and adapt the real-world-signage appearances by applying the augmentation information to the real-world views.

[0141] Figure 9 is a flow diagram illustrating example method 900 for augmenting reality according to one or more examples described herein. The method 900 may be described with reference to the augmented reality system of Figure 8, and to the illustrative example images shown in Figures 3-5. The flow 900 may be carried out using other architectures as well (e.g., such using as the augmented reality unit 10 and/or 20 described herein). [0142] At 902, the sign- determining unit 810 may obtain navigations instructions for a route to be navigated. The sign- determining unit 810 may obtain the navigation instructions, for example, from the navigation unit 800. At 904, the sign- determining unit 810 may obtain, based, at least in part, on the navigations instructions, a real-world sign associated with the route to be or being navigated. The sign-determining unit 810, for example, may obtain the real-world sign associated with the route to be navigated from the sign repository. The obtained real- world sign may be, for example, the exit sign 300 (Figure 3), the sign 400 (Figure 4) posting temporal conditions in connection with a speed limit sign located in a school zone, or the sign 500 (Figure 5) posting "NO TRUCKS ... IN LEFT LANE."

[0143] At block 906, the relevancy-determining unit 810 may make a relevancy determination. For example, after the sign- determining unit 810 may obtain the real-world sign associated with the rout to be navigated or being navigated at 904, the relevancy-determining unit 810 may determine a relevancy or may make a relevancy determination at 906. According to an example, the relevancy- determining engine 812, for example, may obtain one or more of the relevancy rules from the relevancy-rule repository 814. The relevancy-determining engine 812 may evaluate the relevancy rules with respect to the real-world sign associated with the route to be navigated to make the relevancy determination. The relevancy- determining engine 812 may determine, consider, establish, conclude, indicate, and/or the like that the real- world sign associated with the route to be navigated is relevant on condition that one or more of the relevancy rules is satisfied. The relevancy- determining unit 810 may determine that the real- world sign associated with the route to be navigated (e.g., the exit sign 300 (Figure 3), the sign 400 (Figure 4) or the sign 500 (Figure 5)) may be a relevant-real- world sign, for example.

[0144] At 908, the sign-recognition unit 110 may recognize and/or determine the relevant real- world sign along the route as the route may be being navigated. In an example, the sign- recognition unit 110 may recognize and/or determine the relevant real- world sign along the route as the route may be being navigated at 908, for example, after the relevancy- determining unit 810 may determine the real-world sign may be relevant at 906. The sign- recognition unit 110 may also recognize real-world signage other than the relevant real-world sign. The sign-recognition unit 110 may provide the recognized real- world signage, including the recognized relevant-real- world sign, to the sign repositories for incorporation therein.

[0145] At 910, the augmented reality unit 220 in connection with the presentation controller 130 and/or the presentation unit 140 may adapt, based and/or conditioned on the relevancy determination, the real-world-sign appearance by augmenting a real-world view of a real- world scene that includes the real- world sign (e.g., after recognizing and/or determining the relevant sign at 908). The augmented reality unit 220 may, for example, generate a virtual object to emphasize the real-world sign on condition that the recognized real-world sign may be relevant. In an example (e.g., when or if the exit sign 300 (Figure 3) may be relevant as determined herein), the virtual object may be a frame 302 positioned about the exit sign 300 (Figure 3). Assuming that the sign 400 (Figure 4) is relevant, the virtual object may highlight 402 some (e.g., the temporal conditions) or the entire sign 400. Assuming that the sign 500 (Figure 5) is relevant, the virtual object may highlight the sign 500 " by obscuring a portion 502 of the real-world scene that includes such sign. The augmented reality unit 220 may provide the generated virtual object to the presentation controller 130 and/or the presentation unit 140, which, in turn, may adapt the relevant-real-world-sign appearance by using the virtual object to augment the real-world view of a real-world scene that includes the real- world sign.

[0146] Figure 10 is a flow diagram illustrating an example method 1000 for augmenting reality according to examples herein. The method 1000 may be described with reference to the augmented reality system of Figure 8. The method 1000 may be carried out using other architectures, as well (e.g., such as the augmented reality system 10 and/or 20). The method 1000 of Figure 10 may be similar to the method 900 of Figure 9 (e.g., except that the relevancy-determining unit 810 may be configured to make a determination that the real- world sign associated with a route to be navigated may be relevant). For example, the method 1000 may include a subset of embodiments of method 900 with respect to determining whether the real-world sign associated with a route to be navigated may be relevant. For example, as shown, at 1002, the relevancy-determining unit 810 may determine or make a determination that the real world sign may be relevant (e.g., rather than whether it may be relevant as described in 906) based on a route to be navigated as described herein. In an example, the relevancy-determining unit may make such a determination at 1003 after obtaining the navigation instructions at 904 (e.g., as described above) and/or before recognition the real- world sign at 908 (e.g. as described above).

[0147] Figure 11 is a flow diagram illustrating example method 1100 for augmenting reality according to examples herein. The method 1100 may be described with reference to the augmented reality system of Figure 8. The method 1100 may be carried out using other architectures as well (e.g., such as the augmented reality system 10 and/or 20).

[0148] At 1102, the sign-determining unit 810 may obtain navigations instructions for a route to be navigated. The sign- determining unit 810 may obtain the navigation instructions, for example, from the navigation unit 800 as described herein (e.g., above).

[0149] At 1104, the sign- determining unit 810 may obtain, based, at least in part, on the navigations instructions, a real-world sign expected to be disposed along, or in connection, with the route to be navigated. For example, the sign- determining unit 810, for example, may obtain an indication that a real- world sign may be disposed along the route to be navigated, but the indication lacks an indication of (or information indicating) the type of sign and/or notice conveyed by such real-world sign. The sign-determining unit 810 may deduce or determine (e.g., at 1104) the type of sign and/or notice from other data and/or corresponding geo-references in the sign repository, and/or from one or more repositories having details of the real-world sign populated therein. The sign- determining unit 810 may determine, based, on the deduction, the real-world sign expected to be disposed along, or in connection, with the route to be navigated.

[0150] At 1106, the relevancy-determining unit 810 may make a relevancy determination (e.g., after obtaining and/or determining the type of sign along the route at 1104). The relevancy-determining engine 812, for example, may obtain one or more of the relevancy rules from the relevancy-rule repository 814. The relevancy-determining engine 812 may evaluate the relevancy rules with respect to the real-world sign expected to be disposed along, or in connection, with the route to be navigated. The relevancy-determining engine 812 may determine, consider, establish, conclude, indicate, etc. that the real-world sign expected to be disposed along, or in connection, with the route to be navigated is relevant on condition that one or more of the relevancy rules may be satisfied. The relevancy- determining unit 810 may determine that the real-world sign is a relevant-real-world sign, for example.

[0151] At 1108, the sign-recognition unit 110 may recognize the relevant real-world sign along the route as the route may be being navigated (e.g., after the relevancy determination may be made at 1106). The sign-recognition unit 110 may also recognize real-world signage other than the relevant real- world sign. The sign-recognition unit 110 may provide the recognized real- world signage, including the recognized relevant-real-world sign, to the sign repositories for incorporation therein.

[0152] At 1110, the augmented reality unit 220 in connection with the presentation controller 130 and/or the presentation unit 140 may adapt, based and/or conditioned on the relevancy determination, the real-world-sign appearance by augmenting a real-world view of a real- world scene that includes the real-world sign (e.g., after the sign may be recognized at 1108). The augmented reality unit 220 may, for example, generate a virtual object to emphasize the real-world sign on condition that the recognized real-world sign is relevant. The augmented reality unit 220 may provide the generated virtual object to the presentation controller 130 and/or the presentation unit 140, which, in turn, may adapt the relevant-real-world-sign appearance by using the virtual object to augment the real-world view of a real-world scene that includes the real-world sign.

[0153] Figure 12 is a flow diagram illustrating an example method 1200 for augmenting reality according to examples herein. The method 1200 may be described with reference to the augmented reality system of Figure 8. The method 1200 may be carried out using other architectures as well (e.g., such as the augmented reality system 10 and/or 20). The method 1200 of Figure 12 maybe similar to the flow 1100 of Figure 11 (e.g., except that the relevancy- determining unit 810 may be configured to make a determination that the real-world sign expected to be disposed along, or in connection, with the route to be navigated may be relevant). For example, the method 1200 may include a subset of embodiments of method 1100 with respect to determining whether the real- world sign expected to be disposed along, or in connection, with the route to be navigated may be relevant. For example, as shown, at 1202, the relevancy- determining unit 810 may determine or make a determination that the real world sign may be relevant (e.g., rather than whether it may be relevant as described in 1106). In an example, the relevancy- determining unit may make such a determination at 1202 after determining a sign expected to be on the route at 1104 (e.g., as described above) and/or before recognition the real- world sign at 1108 (e.g. as described above).

[0154] Figure 13 is a flow diagram illustrating example method 1300 for augmenting reality according to examples herein. The method 1300 may be described with reference to the augmented reality system of Figure 8. The method 1300 may be carried out using other architectures as well (e.g., such as the augmented reality system 10 and/or 20).

[0155] At 1302, the sign-determining unit 810 may obtain navigations instructions for a route to be navigated. The sign- determining unit 810 may obtain the navigation instructions, for example, from the navigation unit 80.

[0156] At 1304, the sign- determining unit 810 may obtain, based, at least in part, on the navigations instructions, an expected location and/or range of locations for a real-world sign associated with the route to be navigated (e.g., after obtaining the navigation instructions at 1302). The sign-determining unit 810, for example, may obtain expected location and/or range of locations from the sign repository.

[0157] At 1306, the relevancy-determining unit 810 may make a relevancy determination (e.g., obtaining the expected location and/or range of locations at 1304). The relevancy- determining engine 812, for example, may obtain one or more of the relevancy rules from the relevancy-rule repository 814. The relevancy-determining engine 812 may evaluate the relevancy rules with respect to the real-world sign associated with the route to be navigated. The relevancy- determining engine 812 may determine, consider, establish, conclude, indicate, and/or the like that the real-world sign associated with the route to be navigated may be relevant on condition that one or more of the relevancy rules may be satisfied. The relevancy- determining unit 810 may determine that the real-world sign is a relevant-real-world sign.

[0158] At 1308, the sign-recognition unit 110 may recognize the relevant-real-world sign along the route as the route maybe being navigated based, at least in part, on the expected location and/or range of locations (e.g. after making the relevancy determination at 1306). The sign- recognition unit 110 may also recognize real-world signage other than the relevant real-world sign. The sign-recognition unit 110 may provide the recognized real- world signage, including the recognized relevant-real- world sign, to the sign repositories for incorporation therein.

[0159] At 1310, the augmented reality unit 220 in connection with the presentation controller 130 and/or the presentation unit 140 may adapt, based and/or conditioned on the relevancy determination, the real-world-sign appearance by augmenting a real-world view of a real- world scene that includes the real- world sign (e.g., after recognizing the real world sign at 1308). The augmented reality unit 220 may, for example, generate a virtual object to emphasize the real-world sign on condition that the recognized real-world sign may be relevant. The augmented reality unit 220 may provide the generated virtual object to the presentation controller 130 and/or the presentation unit 140, which, in turn, may adapt the relevant-real-world-sign appearance by using the virtual object to augment the real-world view of a real- world scene that includes the real-world sign.

[0160] In various embodiments, the sign-recognition unit 110 may recognize the real-world sign along the route as the route may be being navigated and then the relevancy- determining unit 810 may make a relevancy determination for the recognized real-world sign. For example (e.g., instead of block 1306), the sign-recognition unit 110 may recognize the real-world sign along the route as the route may be being navigated based, at least in part, on the expected location and/or range of locations. The sign-recognition unit 110 may also recognize real- world signage other than the relevant real-world sign. The sign-recognition unit 110 may provide the recognized real-world signage, including the recognized relevant-real-world sign, to the sign repositories for incorporation therein. According to an example (e.g., instead of 1308), the relevancy- determining unit 810 may then make a relevancy determination and/or recognize the real-world sign. The relevancy- determining engine 812, for example, may obtain one or more of the relevancy rules from the relevancy- rule repository 814. The relevancy-determining engine 812 may evaluate the relevancy rules with respect to the recognized real-world sign (e.g., that may be made). The relevancy- determining engine 812 may determine, consider, establish, conclude, indicate, etc. that the real-world sign associated with the route to be navigated may be relevant on condition that one or more of the relevancy rules may be satisfied. The relevancy- determining unit 810 may determine that the recognized real-world sign may be a relevant-real- world sign.

[0161] Figure 14 is a flow diagram illustrating an example method 1400 for augmenting according to examples herein. The method 1400 may be described with reference to the augmented reality system of Figure 8. The method 1400 may be carried out using other architectures as well (e.g., such as the augmented reality system 10 and/or 20). The method 1400 of Figure 14 may be similar to the method 1300 of Figure 13 (e.g., except that the relevancy- determining unit 810 may be configured to make a determination that the real- world sign associated with the route to be navigated may be relevant. For example, the method 1400 may include a subset of embodiments of method 1300 with respect to determining whether the real-world sign associated with a route to be navigated may be relevant. For example, as shown, at 1402, the relevancy-determining unit 810 may determine or make a determination that the real world sign may be relevant (e.g., rather than whether it may be relevant as described in 1106). In an example, the relevancy- determining unit may make such a determination at 1402 after obtaining the navigation instructions at 1304 (e.g., as described above) and/or before recognition the real-world sign at 1308 (e.g. as described above).

[0162] Figure 15 is a flow diagram illustrating example method 1500 for augmenting reality according to examples herein. The method 1500 may be described with reference to the augmented reality system of Figure 8. The method 1500 may be carried out using other architectures as well (e.g., such as the augmented reality system 10 and/or 20).

[0163] At 1502, the sign-determining unit 810 may obtain navigations instructions for a route to be navigated. The sign- determining unit 810 may obtain the navigation instructions, for example, from the navigation unit 800.

[0164] At 1504, the sign- determining unit 810 may obtain, based, at least in part, on the navigations instructions, an expected location and/or range of locations for a real-world sign expected to be disposed along, or in connection, with the route to be navigated (e.g., after obtaining the navigation instructions at 1502). The sign- determining unit 810, for example, may obtain expected location and/or range of locations from based on evaluating (e.g., deducing from) the navigation instructions.

[0165] At 1506, the relevancy-determining unit 810 may make a relevancy determination (e.g., after obtaining an expected location and/or range of locations at 1504). The relevancy- determining engine 812 may obtain one or more of the relevancy rules from the relevancy- rule repository 814. The relevancy- determining engine 812 may evaluate the relevancy rules with respect to the real-world sign expected to be disposed along, or in connection, with the route to be navigated. The relevancy-determining engine 812 may determine, consider, establish, conclude, indicate, and/or the like that the real-world sign expected to be disposed along, or in connection, with the route to be navigated is relevant on condition that one or more of the relevancy rules is satisfied. The relevancy- determining unit 810 may determine that the real- world sign may be a relevant-real-world sign, for example.

[0166] At 1508, the sign-recognition unit 110 may recognize the relevant-real-world sign along the route as the route may be being navigated based, at least in part, on the expected location and/or range of locations (e.g., after making the relevancy determination at 1506). The sign- recognition unit 110 may also recognize real-world signage other than the relevant real-world sign. The sign-recognition unit 110 may provide the recognized real- world signage, including the recognized relevant-real- world sign, to the sign repositories for incorporation therein.

[0167] At 1510, the augmented reality unit 220 in connection with the presentation controller 130 and/or the presentation unit 140 may adapt, based and/or conditioned on the relevancy determination, the real-world-sign appearance by augmenting a real-world view of a real- world scene that includes the real-world sign (e.g., after recognizing the sign at 1508). The augmented reality unit 220 may generate a virtual object to, for example, emphasize the real- world sign on condition that the recognized real- world sign is relevant. The augmented reality unit 220 may provide the generated virtual object to the presentation controller 130 and/or the presentation unit 140, which, in turn, may adapt the relevant-real-world-sign appearance by using the virtual object to augment the real-world view of a real- world scene that includes the real-world sign.

[0168] In examples, the sign-recognition unit 110 may recognize the real-world sign along the route as the route may be being navigated, and then the relevancy- determining unit 810 may make a relevancy determination for the recognized real-world sign. For example (e.g., instead of 1506), the sign-recognition unit 110 may recognize the real-world sign along the route as the route may be being navigated based, at least in part, on the expected location and/or range of locations. The sign-recognition unit 110 may also recognize real- world signage other than the relevant real- world sign. The sign-recognition unit 110 may provide the recognized real- world signage, including the recognized relevant-real-world sign, to the sign repositories for incorporation therein. In an example (e.g., instead of 1508), the relevancy-determining unit 810 may then make a relevancy determination. The relevancy- determining engine 812, for example, may obtain one or more of the relevancy rules from the relevancy- rule repository 814. The relevancy- determining engine 812 may evaluate the relevancy rules with respect to the recognized real-world sign. The relevancy- determining engine 812 may determine, consider, establish, conclude, indicate, etc. that the real-world sign associated with the route to be navigated is relevant on condition that one or more of the relevancy rules may be satisfied. The relevancy- determining unit 810 may determine that the recognized real-world sign may be a relevant-real- world sign.

[0169] Figure 16 is a flow diagram illustrating an example method 1600 for augmenting reality according to examples herein. The method 1600 may be described with reference to the augmented reality system of Figure 8. The method 1600 may be carried out using other architectures, as well. The method 1600 of Figure 16 may be similar to the flow 1500 of Figure 15 (e.g., except that the relevancy-determining unit 810 may be configured to make a determination that the real-world sign expected to be disposed along, or in connection with, the route to be navigated may be relevant). For example, the method 1600 may include a subset of embodiments of the method 1500 with respect to determining whether the real-world sign expected to be disposed along, or in connection with, the route to be navigated may be relevant. For example, as shown, at 1602, the relevancy-determining unit 810 may determine or make a determination that the real-world sign may be relevant (e.g., rather than whether it may be relevant as described in 1506) based on a route to be navigated as described herein. In an example, the relevancy-determining unit may make such a determination at 1602 after determining an expected location at 1504 (e.g., as described above) and/or before recognizing the real-world sign at 1508 (e.g., as described above).

[0170] Figure 17 is a block diagram illustrating an example of an augmented reality system 40 in accordance with at least some examples described herein. The augmented reality system 40 may be used and/or implemented in a device such as a computing device. The augmented reality system 40 may include an image capture unit 100, a navigation unit 800, a sign-determining unit 802, a sign-recognition unit 110, a relevancy-determining unit 810, an augmented reality unit 220, a presentation controller 130, and a presentation unit 140, for example, as shown.

[0171] Operations that may be carried out in connection with the augmented reality system 40 of Figure 17 include those in the description that follows. Other operations, including those described herein (e.g., above with respect to the augmented reality systems 10, 20, and/or 30), may be carried out by the augmented reality system 40 of Figure 17 as well.

[0172] The sign-determining unit 802 may determine real-world signs, along with their positions, along a route being navigated and/or traversed (e.g., for real-world signs a vehicle or user is approaching). The sign-determining unit 802 may carry out such determination whenever new directions for a route are determined, and/or for all real-world signs within a specified distance of the vehicle, and/or for all road signs within a specified distance of the vehicle in the current direction of its travel, including up to a specified number of (e.g., three) turns from its current direction of travel.

[0173] The sign-determining unit 802 may determine the real-world signs based on navigation instructions provided from the navigation unit 800. The sign-determining unit 802 may obtain the real-world signs from a web service, such as, for example, a database of street sign assets accessible via a web service. In various embodiments, the sign-determining unit 802 may determine the real-world signs based on a characterization of the road signs to be considered. The sign-determining unit 802 may carry out determining the real-world signs in this way, for example, every minute and/or whenever the vehicle or user crosses a jurisdiction boundary. The sign-determining unit 802 may determine the user's (or the vehicle's) current jurisdiction (e.g., national or state boundary). The current jurisdiction may be based on measurements provided from the navigation unit 800. The sign-determining unit 802 may obtain the characterization from a web service (e.g., road signs vary across jurisdictions).
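
As a concrete illustration of the distance-based filtering described above, the following hypothetical Python sketch keeps only the sign records within a given distance of the vehicle; the sign record layout, the 500 m default, and the equirectangular distance approximation are assumptions made for the example.

    import math

    def approx_distance_m(a, b):
        # Rough equirectangular approximation; adequate over a few kilometres.
        lat1, lon1 = map(math.radians, a)
        lat2, lon2 = map(math.radians, b)
        x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
        y = lat2 - lat1
        return math.hypot(x, y) * 6371000  # mean Earth radius in metres

    def signs_ahead(sign_records, vehicle_pos, max_distance_m=500):
        return [s for s in sign_records
                if approx_distance_m(s["position"], vehicle_pos) <= max_distance_m]

    signs = [{"text": "SPEED LIMIT 55", "position": (35.9001, -78.9001)}]
    print(signs_ahead(signs, (35.9000, -78.9000)))  # within ~14 m, so it is kept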

[0174] In examples, the user tracking unit (e.g., an eye tracking unit as described herein) in connection with the image capture unit 100 may determine which of the real-world signs are present in a field of view. The user tracking unit in connection with the image capture unit 100 may carry out such determination, for example, whenever the user's position changes by a given number of (e.g., 10) meters and/or every second.

[0175] The sign-recognition unit 110 may, based on features corresponding to the real-world signs, identify the real-world signs within the field of view and/or recognize the notices conveyed by such real-world signs. This may include the sign-recognition unit 110 determining features of the real-world signs (e.g., including real-world signs that would be relevant to the user). The sign-recognition unit 110 may carry out determining the features of the real-world signs whenever the user begins traveling, determines fresh directions (e.g., explicitly, or implicitly by deviating from previous directions), changes jurisdiction, and/or the like.

[0176] The relevancy-determining unit 810 may make a relevancy determination for the real-world signs recognized by the sign-recognition unit 110 and/or identified by the sign-determining unit 802. To assist in making the relevancy determination, the relevancy-determining unit 810 may receive from the sign-recognition unit 110 and/or the sign-determining unit 802 information for evaluating one or more of the relevancy rules. This information may include one or more of the following: (i) names of streets and freeway exit numbers (e.g., from the navigation instructions); (ii) information from user inputs to an onboard (vehicle) service or a navigation service, including, for example, points and/or places of interest the user specified or searched for; (iii) features of the real-world signs from an onboard (vehicle) service, for providing attributes of the vehicle, such as whether it may be a passenger car, its number of axles, whether it may have all-wheel drive, etc.; (iv) features of the real-world signs, for example, from a web service, including attributes from the laws of the land, such as that a sign showing silhouettes of children is significant; (v) information, for example, from a web service, indicating (and/or for determining) attributes relating to the present conditions (e.g., not specific to the user), such as whether the present day may be a holiday in the current geographical region, whether the weather conditions suggest the possibility of ice on the roads, and/or the like; (vi) information, for example, from a web service or a personal information service (e.g., a personal calendar such as Google Calendar), indicating (and/or for determining) attributes relating to the present conditions (specific to the user), such as whether the user may be on lunch break or may be looking for a bar or for musical entertainment; and/or (vii) information, for example, from a web service, indicating (and/or for determining) other relevant attributes such as time range patterns in which school zones are applicable, such as (H)H:MM - (H)H:MM (these patterns may be captured as regular expressions), and/or the like.
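
The time-range patterns mentioned in item (vii) lend themselves to a simple regular expression. The following hypothetical Python sketch captures an "(H)H:MM - (H)H:MM" range and tests whether the current time falls inside it; the exact pattern and the sample sign text are assumptions for illustration.

    import re
    from datetime import time

    # Capture "(H)H:MM - (H)H:MM" ranges such as those posted for school zones.
    TIME_RANGE = re.compile(r"(\d{1,2}):(\d{2})\s*-\s*(\d{1,2}):(\d{2})")

    def in_time_range(text, now):
        m = TIME_RANGE.search(text)
        if not m:
            return False
        h1, m1, h2, m2 = map(int, m.groups())
        return time(h1, m1) <= now <= time(h2, m2)

    print(in_time_range("SCHOOL ZONE 7:30 - 16:00", time(8, 15)))  # True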

[0177] The relevancy-determining unit 810 may carry out making the relevancy determination whenever a new sign is recognized. With reference to the exit sign 300 (Figure 3) as an example, the relevancy-determining unit 810 may determine that the exit sign 300 may be relevant on condition that the navigation instructions call for Exit 299 to be taken.

[0178] In examples herein, the relevancy-determining unit 810 may compute a distance metric (e.g., Levenshtein Distance) between the text identified from the real-world sign and the text specified by the relevancy rules. In various embodiments, the relevancy-determining unit 810 may compute whether a regular expression pattern matches a given real-world sign. In various embodiments, the relevancy-determining unit 810 may make the relevancy determination without regard to text on the signs and/or without combining with other real-world signs. For example, a school crossing sign might be deemed relevant throughout or during some fixed hours (e.g., 6:00 AM to 5:00 PM) on a weekday, or on a holiday (as determined from a web service, but not from another real-world sign).
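
A Levenshtein comparison of the kind described above can be sketched in a few lines of Python. The dynamic-programming implementation below is standard; accepting a match when the distance falls under a small threshold (the value 2 here is an assumption) tolerates minor recognition errors in the identified text.

    def levenshtein(a, b):
        # Classic dynamic-programming edit distance, computed one row at a time.
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1,                  # deletion
                               cur[j - 1] + 1,               # insertion
                               prev[j - 1] + (ca != cb)))    # substitution
            prev = cur
        return prev[-1]

    def text_matches_rule(sign_text, rule_text, max_distance=2):
        # Tolerate small recognition errors in the sign's identified text.
        return levenshtein(sign_text.upper(), rule_text.upper()) <= max_distance

    print(text_matches_rule("EXlT 299", "EXIT 299"))  # True: one substitution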

[0179] In various embodiments, the relevancy-determining unit 810 may make the relevancy determination without regard to any text on the sign, and in combination with other real-world signs. For example, a school crossing sign might be deemed relevant in combination with another sign specifying the hours within which the school zone may be active.

[0180] The augmented reality unit 220 in connection with the presentation controller 130 and/or the presentation unit 140 may adapt, based and/or conditioned on the relevancy determination, the real-world-sign appearance by augmenting the real-world view of a real-world scene that includes the real-world sign (e.g., within the field of view). The augmented reality unit 220 may determine and generate the augmentation information for application to the real-world view of a real-world scene that includes the real-world sign. The augmented reality unit 220 may determine and generate the augmentation information whenever a new sign is detected and/or determined to be relevant.

[0181] The augmented reality unit 220 in connection with the presentation controller 130 and/or the presentation unit 140 may adapt the real-world-sign appearance by using the augmentation information to augment the real-world view of a real-world scene that includes the real-world sign. This may include applying emphasis to the real-world sign. The augmented reality unit 220, presentation controller 130, and presentation unit 140 may carry out adapting the real-world-sign appearance whenever a new sign is detected and/or a position of the projection of the sign in the field of view changes significantly (e.g., by a given number of (e.g., 5) angular degrees). To facilitate carrying out the adaptation of the real-world-sign appearance, the augmented reality unit 220, presentation controller 130, and/or presentation unit 140 may determine where in the field of view the real-world sign would appear based on tracking the user's eye gaze, including, for example, relative to one or more of the following: (i) a specific part of the user's glasses, and/or (ii) a specific part of the windshield of the vehicle that the user is driving.
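
The "changes significantly" trigger can be made concrete as an angular threshold test. The hypothetical Python sketch below compares the old and new viewing directions toward the sign and re-adapts only past a threshold; the unit-vector representation is an assumption, and the 5-degree default simply mirrors the example value above.

    import math

    def angular_change_deg(old_dir, new_dir):
        # Directions are unit vectors from the eye toward the sign's projection.
        dot = max(-1.0, min(1.0, sum(o * n for o, n in zip(old_dir, new_dir))))
        return math.degrees(math.acos(dot))

    def needs_readaptation(old_dir, new_dir, threshold_deg=5.0):
        return angular_change_deg(old_dir, new_dir) > threshold_deg

    print(needs_readaptation((1.0, 0.0, 0.0), (0.99, 0.14, 0.0)))  # ~8 deg: True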

[0081] The augmented reality unit 120 and/or the augmented reality unit 220 may generate the augmentation information (e.g., virtual objects) in connection with the real-world signs. The augmentation information, for example, the virtual objects, may be presented in connection with the real-world signs in different states. For example, a virtual object may have a plurality of states for presenting respective presentation types of the augmentation information. The virtual object, for example, may be in a first (e.g., a compact) state for presenting a summary representation of the augmentation information ("summary"). Alternatively and/or additionally, the virtual object may be in a second (e.g., a non-compact, enlarged, extended, expanded, and/or the like) state for presenting fuller detail of the augmentation information ("fuller augmentation details"). The summary may include, for example, one or more of the following: an icon, an image, text, a concise representation of the augmentation information, and/or the like. The fuller augmentation details may include any augmentation information in addition to, and/or supplementary to, the summary. The virtual object may transition from one state to another state, and back again. For example, the virtual object may transition from the first state to the second state, and from the second state to the first state. The state change may be continuous or discontinuous. For example, the virtual object may transition from the compact state to the non-compact state by expanding (e.g., growing in size) from the compact state, and/or may transition from the non-compact state to the compact state by reducing (e.g., shrinking in size) back to the compact state. Alternatively and/or additionally, the virtual object may transition from the compact state to the non-compact state by switching to a partially or fully enlarged state, and/or may transition from the non-compact state to the compact state by switching back to the compact state. In some embodiments, the virtual object may transition from the compact state to the non-compact state by appending or otherwise adding a supplementary virtual object, and/or may transition from the non-compact state to the compact state by returning back to (e.g., removing the supplementary virtual object from) the compact state.
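
The compact/non-compact behavior described above is essentially a two-state object. A minimal, hypothetical Python sketch follows; the class shape and the content strings are invented for illustration.

    class VirtualObject:
        def __init__(self, summary, detail):
            self.summary = summary      # compact state: icon/text summary
            self.detail = detail        # non-compact state: fuller details
            self.state = "compact"

        def expand(self):
            # Compact -> non-compact (e.g., by growing or by switching states).
            self.state = "non-compact"

        def collapse(self):
            # Non-compact -> back to the compact state.
            self.state = "compact"

        def render(self):
            return self.summary if self.state == "compact" else self.detail

    obj = VirtualObject("Exit 299", "Exit 299: gas, food, lodging in 0.5 mi")
    obj.expand()
    print(obj.render())  # fuller augmentation details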

[0182] Figure 18 is a flow diagram illustrating an example flow 1800 directed to augmenting reality via a presentation unit in accordance with an embodiment. The flow 1800 is described with reference to the augmented reality system of Figure 1. The flow 1800 may be carried out using other architectures as well (e.g., such as the augmented reality systems 20, 30, and/or 40).

[0183] In method 1800, the augmented reality unit 120 (in connection with the presentation controller 130 and/or the presentation unit 140) may augment a view of a real-world scene that includes the real-world sign so as to emphasize or de-emphasize the real-world sign. The augmented reality unit 120 (in connection with the presentation controller 130 and/or the presentation unit 140) may do so by augmenting one or more portions of the real-world view associated with, or otherwise having connection to, the real-world sign and/or the real-world scene.

[0184] The augmented reality unit 120 may generate a virtual object to emphasize the real-world sign. The augmented reality unit 120 may provide the generated virtual object to the presentation controller 130 and/or the presentation unit 140. The presentation unit 140, in turn, may display the virtual object in connection with the real-world sign. In various embodiments, the presentation unit 140 (e.g., a screen of a navigation unit) may be disposed within the view of the real-world scene. For example, this view may include a viewpoint, and the viewpoint may be chosen so that the view of the real-world scene includes the real-world sign and the presentation unit 140.

[0185] Figures 19-28 are flow diagrams illustrating example methods 1900-2800 for augmenting reality according to examples herein. The methods 1900-2800 may be carried out by the augmented reality systems 30 and 40 of Figures 8 and 17, respectively, and/or other architectures (e.g., such as the augmented reality system 10 and/or 20). The methods in Figures 19-28 may include a subset of the methods 900-1600 described in Figures 9-16 including, for example, a subset of the functions and/or actions described herein (e.g., above) for such methods.

[0186] Although not shown, for example, in the methods or flows 1800-2800, the augmented reality unit 120 and/or 220 may generate a virtual object to emphasize the real-world sign. The augmented reality unit 120 and/or 220 in connection with the presentation controller 130 and/or the presentation unit 140 may display the virtual object via a presentation unit disposed within the view of the real-world scene as described herein.

[0187] Figure 29 is a flow diagram illustrating example method 2900 for using alerts for emphasizing real-world signage. The flow 2900 may be described with reference to the augmented reality systems 30 and 40 of Figures 8 and 17, respectively. The flow 2900 may be carried out using other architectures as well (e.g., such as the augmented reality system 10 and/or 20).

[0188] In method 2900, the sign-recognition unit 110 may recognize a real-world sign along a route being navigated and/or being traversed. The augmented reality unit 220 may generate an alert for emphasizing the real-world sign. The augmented reality unit 220 may provide the alert for rendering in connection with a view of a real-world scene that includes the real-world sign. The augmented reality unit 220 may provide the alert to any of the presentation controller 130 and the presentation unit 140. The presentation controller 130 and/or the presentation unit 140 may render the alert in connection with the view of a real-world scene that includes the real-world sign.

[0189] The augmented reality unit 220 may generate and/or provide the alert for rendering for audio presentation and/or for display (e.g., visual presentation). In various embodiments, the augmented reality unit 220 may generate and/or provide the alert for rendering when the real-world sign is viewable (e.g., when the real-world sign is in a field of view). In some embodiments, the sign-recognition unit 110 may determine whether the real-world sign is viewable.

[0190] In various examples, the augmented reality unit 220 may generate first and second instances of the alert. The first instance of the alert may be rendered, for example, when located at a first distance from the real-world sign. The second instance of the alert may be rendered, for example, when located at a second distance from the real-world sign. The second distance may be closer to the real-world sign than the first distance.

[0191] Further, according to examples, the second instance of the alert may add additional emphasis to the real-world sign as compared to the first instance of the alert. This additional emphasis may be in the form of audio. In some embodiments, the audio increases in volume, tone, or another attribute based on a third distance from the real-world sign (e.g., as approaching the sign). In some examples, the third distance may be the second distance. The third distance may be based on any of a rate of approach to the real-world sign, a rate for arrival at the real-world sign, and a time to arrive at the real-world sign.
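
One way to realize the two alert instances and the distance-scaled audio emphasis is sketched below in Python. The trigger distances, the volume curve, and the returned structure are all invented for illustration; the description above only requires that the second, closer instance carry added emphasis.

    def alert_for_distance(distance_m, first_at_m=300.0, second_at_m=100.0):
        if distance_m <= second_at_m:
            # Second instance: closer to the sign, display plus audio emphasis
            # whose volume grows as the sign is approached.
            volume = min(1.0, second_at_m / max(distance_m, 1.0) * 0.5)
            return {"render": ("display", "audio"), "volume": round(volume, 2)}
        if distance_m <= first_at_m:
            # First instance: farther from the sign, display only.
            return {"render": ("display",), "volume": 0.0}
        return None  # too far away: no alert yet

    for d in (400, 250, 80, 20):
        print(d, alert_for_distance(d))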

[0192] The second instance of the alert may emphasize the real-world sign differently from the first instance of the alert. The augmented reality unit 220 may generate and/or provide the first instance of the alert for rendering for audio presentation only. The augmented reality unit 220 may generate and/or provide the second instance of the alert for rendering for display. Alternatively or additionally, the augmented reality unit 220 may generate and/or provide the second instance of the alert for rendering for display and audio presentation.

[0193] In examples herein, the augmented reality unit 220 may generate and/or provide the first instance of the alert for rendering for display. The augmented reality unit 220 may generate and/or provide the second instance of the alert for rendering for audio presentation only. Alternatively or additionally, the augmented reality unit 220 may generate and/or provide the second instance of the alert for rendering for display and audio presentation.

[0194] Figures 30-39 are flow diagrams illustrating example methods 3000-3900 directed to using alerts for emphasizing real-world signage. The methods 3000-3900 may be carried out by the augmented reality systems 30 and 40 of Figures 8 and 17, respectively, and/or other architectures (e.g., such as the augmented reality system 10 and/or 20). Although not shown in the methods 2900-3900, the augmented reality unit 120 and/or 220 may generate a virtual object to emphasize the real-world sign. The augmented reality unit 120 and/or 220 in connection with the presentation controller 130 and/or the presentation unit 140 may display the virtual object via a presentation unit disposed within the view of the real-world scene. Additionally, the methods in Figures 29-39 may include a subset of the methods 900-1600 and 1800-2800 described in Figures 9-16 and 18-28, respectively, to provide the alerts. For example, the methods 2900-3900 may perform a subset of the functions and/or actions described herein (e.g., above) for such methods to provide alerts as described above.

[0195] According to examples, a priority may be provided for information that may be used to augment a real-world sign and/or scene, including alerts, and/or for characteristic types of signs, services (such as restaurants, gas stations, and/or the like) that may be on a route, notices, and/or the like. The priority may be used to determine whether a real-world sign may be relevant and/or whether to highlight or lowlight a sign as described herein. In an example, a user may change the priority of such information. For example, a user may change a priority such that a speed limit sign may be more important and may be highlighted over another sign according to an example. Further, context of a user and/or a vehicle of a user may be used to augment reality as described herein (e.g., to provide augmentation information associated with a real-world sign and/or scene). For example, a vehicle may be running low on fuel such that the augmented reality systems described herein may use such a context of the vehicle to highlight an exit sign that may have a gas station. Similarly, the augmented reality systems described herein may highlight an exit sign with restaurants based on the context of the user being hungry, including a particular type of restaurant or food the user may be craving.
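
Priority- and context-based highlighting of this kind can be sketched as a scoring pass over candidate signs. In the hypothetical Python example below, the priority values, the context keys (such as fuel_low), and the sign records are all assumptions made for illustration.

    DEFAULT_PRIORITY = {"speed_limit": 2, "exit_services": 1, "billboard": 0}

    def signs_to_highlight(signs, priorities=DEFAULT_PRIORITY, context=None):
        context = context or {}
        scored = []
        for sign in signs:
            score = priorities.get(sign["kind"], 0)
            # Context boost: a low-fuel vehicle raises exits listing gas stations.
            if context.get("fuel_low") and "gas" in sign.get("services", ()):
                score += 3
            if score > 0:
                scored.append((score, sign["text"]))
        return [text for _, text in sorted(scored, reverse=True)]

    signs = [{"kind": "exit_services", "services": ("gas",), "text": "EXIT 12"},
             {"kind": "speed_limit", "text": "SPEED LIMIT 55"}]
    print(signs_to_highlight(signs, context={"fuel_low": True}))
    # ['EXIT 12', 'SPEED LIMIT 55']: the low-fuel context outranks the default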

[0196] The methods, apparatus, systems, devices, and computer program products provided herein are well-suited for communications involving both wired and wireless networks. Wired networks are well-known. An overview of various types of wireless devices and infrastructure is provided with respect to Figures 40A-40E, where various elements of the network may utilize, perform, be arranged in accordance with and/or be adapted and/or configured for the methods, apparatuses and systems provided herein.

[0197] Figures 40A-40E (e.g., collectively Figure 40) are block diagrams illustrating an example communications system 4000 in which one or more disclosed embodiments may be implemented. In general, the communications system 4000 defines an architecture that supports multiple access systems over which multiple wireless users may access and/or exchange (e.g., send and/or receive) content, such as voice, data, video, messaging, broadcast, etc. The architecture also supports having two or more of the multiple access systems use and/or be configured in accordance with different access technologies. In this way, the communications system 4000 may service both wireless users capable of using a single access technology and wireless users capable of using multiple access technologies.

[0198] The multiple access systems may include respective accesses, each of which may be, for example, an access network, an access point, and the like. In various embodiments, all of the multiple accesses may be configured with and/or employ the same radio access technologies ("RATs"). Some or all of such accesses ("single-RAT accesses") may be owned, managed, controlled, operated, etc. by either (i) a single mobile network operator and/or carrier (collectively "MNO") or (ii) multiple MNOs. In various embodiments, some or all of the multiple accesses may be configured with and/or employ different RATs. These multiple accesses ("multi-RAT accesses") may be owned, managed, controlled, operated, etc. by either a single MNO or multiple MNOs.

[0199] The communications system 4000 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth. For example, the communications system 4000 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and the like.

[0200] As shown in Figure 40A, the communications system 4000 may include wireless transmit/receive units (WTRUs) 4002a, 4002b, 4002c, 4002d, a radio access network (RAN) 4004, a core network 4006, a public switched telephone network (PSTN) 4008, the Internet 4010, and other networks 4012, though it will be appreciated that the disclosed embodiments contemplate any number of WTRUs, base stations, networks, and/or network elements. Each of the WTRUs 4002a, 4002b, 4002c, 4002d may be any type of device configured to operate and/or communicate in a wireless environment. By way of example, the WTRUs 4002a, 4002b, 4002c, 4002d may be configured to transmit and/or receive wireless signals, and may include user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, consumer electronics, a terminal, or a like-type device capable of receiving and processing compressed video communications.

[0201] The communications systems 4000 may also include a base station 4014a and a base station 4014b. Each of the base stations 4014a, 4014b may be any type of device configured to wirelessly interface with at least one of the WTRUs 4002a, 4002b, 4002c, 4002d to facilitate access to one or more communication networks, such as the core network 4006, the Internet 4010, and/or the networks 4012. By way of example, the base stations 4014a, 4014b may be a base transceiver station (BTS), Node-B (NB), evolved NB (eNB), Home NB (HNB), Home eNB (HeNB), enterprise NB ("ENT-NB"), enterprise eNB ("ENT-eNB"), a site controller, an access point (AP), a wireless router, a media aware network element (MANE) and the like. While the base stations 4014a, 4014b are each depicted as a single element, it will be appreciated that the base stations 4014a, 4014b may include any number of interconnected base stations and/or network elements.

[0202] The base station 4014a may be part of the RAN 4004, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc. The base station 4014a and/or the base station 4014b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown). The cell may further be divided into cell sectors. For example, the cell associated with the base station 4014a may be divided into three sectors. Thus, in one embodiment, the base station 4014a may include three transceivers, i.e., one for each sector of the cell. In another embodiment, the base station 4014a may employ multiple-input multiple output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell.

[0203] The base stations 4014a, 4014b may communicate with one or more of the WTRUs 4002a, 4002b, 4002c, 4002d over an air interface 4016, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, etc.). The air interface 4016 may be established using any suitable radio access technology (RAT).

[0204] More specifically, as noted above, the communications system 4000 may be a multiple access system and may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like. For example, the base station 4014a in the RAN 4004 and the WTRUs 4002a, 4002b, 4002c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 4016 using wideband CDMA (WCDMA). WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+). HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).

[0205] In another embodiment, the base station 4014a and the WTRUs 4002a, 4002b, 4002c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 4016 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).

[0206] In other embodiments, the base station 4014a and the WTRUs 4002a, 4002b, 4002c may implement radio technologies such as IEEE 802.16 (i.e., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.

[0207] The base station 4014b in Figure 40A may be a wireless router, Home Node B, Home eNode B, or access point, for example, and may utilize any suitable RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and the like. In one embodiment, the base station 4014b and the WTRUs 4002c, 4002d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN). In another embodiment, the base station 4014b and the WTRUs 4002c, 4002d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN). In yet another embodiment, the base station 4014b and the WTRUs 4002c, 4002d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, etc.) to establish a picocell or femtocell. As shown in Figure 40A, the base station 4014b may have a direct connection to the Internet 4010. Thus, the base station 4014b may not be required to access the Internet 4010 via the core network 4006.

[0208] The RAN 4004 may be in communication with the core network 4006, which may be any type of network configured to provide voice, data, applications, and/or voice over internet protocol (VoIP) services to one or more of the WTRUs 4002a, 4002b, 4002c, 4002d. For example, the core network 4006 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication. Although not shown in Figure 40A, it will be appreciated that the RAN 4004 and/or the core network 4006 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 4004 or a different RAT. For example, in addition to being connected to the RAN 4004, which may be utilizing an E-UTRA radio technology, the core network 4006 may also be in communication with another RAN (not shown) employing a GSM radio technology.

[0209] The core network 4006 may also serve as a gateway for the WTRUs 4002a, 4002b, 4002c, 4002d to access the PSTN 4008, the Internet 4010, and/or other networks 4012. The PSTN 4008 may include circuit-switched telephone networks that provide plain old telephone service (POTS). The Internet 4010 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and the internet protocol (IP) in the TCP/IP internet protocol suite. The networks 4012 may include wired or wireless communications networks owned and/or operated by other service providers. For example, the networks 4012 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 4004 or a different RAT.

[0210] Some or all of the WTRUs 4002a, 4002b, 4002c, 4002d in the communications system 4000 may include multi-mode capabilities, i.e., the WTRUs 4002a, 4002b, 4002c, 4002d may include multiple transceivers for communicating with different wireless networks over different wireless links. For example, the WTRU 4002c shown in Figure 40A may be configured to communicate with the base station 4014a, which may employ a cellular-based radio technology, and with the base station 4014b, which may employ an IEEE 802 radio technology.

[0211] Figure 40B is a system diagram of an example WTRU 4002. As shown in Figure 40B, the WTRU 4002 may include a processor 4040, a transceiver 4020, a transmit/receive element 4022, a speaker/microphone 4024, a keypad 4026, a presentation unit (e.g., display/touchpad) 4028, non-removable memory 4006, removable memory 4032, a power source 4034, a global positioning system (GPS) chipset 4036, and other peripherals 4038 (e.g., a camera or other optical capturing device). It will be appreciated that the WTRU 4002 may include any subcombination of the foregoing elements while remaining consistent with an embodiment.

[0212] The processor 4040 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a graphics processing unit (GPU), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 4040 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 4002 to operate in a wireless environment. The processor 4040 may be coupled to the transceiver 4020, which may be coupled to the transmit/receive element 4022. While Figure 40B depicts the processor 4040 and the transceiver 4020 as separate components, it will be appreciated that the processor 4040 and the transceiver 4020 may be integrated together in an electronic package or chip.

[0213] The transmit/receive element 4022 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 4014a) over the air interface 4016. For example, in one embodiment, the transmit/receive element 4022 may be an antenna configured to transmit and/or receive RF signals. In another embodiment, the transmit/receive element 4022 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example. In yet another embodiment, the transmit/receive element 4022 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 4022 may be configured to transmit and/or receive any combination of wireless signals.

[0214] In addition, although the transmit/receive element 4022 is depicted in Figure 40B as a single element, the WTRU 4002 may include any number of transmit/receive elements 4022. More specifically, the WTRU 4002 may employ MIMO technology. Thus, in one embodiment, the WTRU 4002 may include two or more transmit/receive elements 4022 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 4016.

[0215] The transceiver 4020 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 4022 and to demodulate the signals that are received by the transmit/receive element 4022. As noted above, the WTRU 4002 may have multi-mode capabilities. Thus, the transceiver 4020 may include multiple transceivers for enabling the WTRU 4002 to communicate via multiple RATs, such as UTRA and IEEE 802.11, for example.

[0216] The processor 4040 of the WTRU 4002 may be coupled to, and may receive user input data from, the speaker/microphone 4024, the keypad 4026, and/or the presentation unit 4028 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 4040 may also output user data to the speaker/microphone 4024, the keypad 4026, and/or the presentation unit 4028. In addition, the processor 4040 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 4006 and/or the removable memory 4032. The non-removable memory 4006 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 4032 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 4040 may access information from, and store data in, memory that is not physically located on the WTRU 4002, such as on a server or a home computer (not shown).

[0217] The processor 4040 may receive power from the power source 4034, and may be configured to distribute and/or control the power to the other components in the WTRU 4002. The power source 4034 may be any suitable device for powering the WTRU 4002. For example, the power source 4034 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.

[0218] The processor 4040 may also be coupled to the GPS chipset 4036, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 4002. In addition to, or in lieu of, the information from the GPS chipset 4036, the WTRU 4002 may receive location information over the air interface 4016 from a base station (e.g., base stations 4014a, 4014b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 4002 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.

[0219] The processor 4040 may further be coupled to other peripherals 4038, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 4038 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.

[0220] Figure 40C is a system diagram of the RAN 4004 and the core network 4006 according to an embodiment. As noted above, the RAN 4004 may employ a UTRA radio technology to communicate with the WTRUs 4002a, 4002b, 4002c over the air interface 4016. The RAN 4004 may also be in communication with the core network 4006. As shown in Figure 40C, the RAN 4004 may include Node-Bs 4040a, 4040b, 4040c, which may each include one or more transceivers for communicating with the WTRUs 4002a, 4002b, 4002c over the air interface 4016. The Node-Bs 4040a, 4040b, 4040c may each be associated with a particular cell (not shown) within the RAN 4004. The RAN 4004 may also include RNCs 4042a, 4042b. It will be appreciated that the RAN 4004 may include any number of Node-Bs and RNCs while remaining consistent with an embodiment.

[0221] As shown in Figure 40C, the Node-Bs 4040a, 4040b may be in communication with the RNC 4042a. Additionally, the Node-B 4040c may be in communication with the RNC 4042b. The Node-Bs 4040a, 4040b, 4040c may communicate with the respective RNCs 4042a, 4042b via an Iub interface. The RNCs 4042a, 4042b may be in communication with one another via an Iur interface. Each of the RNCs 4042a, 4042b may be configured to control the respective Node-Bs 4040a, 4040b, 4040c to which it is connected. In addition, each of the RNCs 4042a, 4042b may be configured to carry out or support other functionality, such as outer loop power control, load control, admission control, packet scheduling, handover control, macrodiversity, security functions, data encryption, and the like.

[0222] The core network 4006 shown in Figure 40C may include a media gateway (MGW) 4044, a mobile switching center (MSC) 4046, a serving GPRS support node (SGSN) 4048, and/or a gateway GPRS support node (GGSN) 4050. While each of the foregoing elements is depicted as part of the core network 4006, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.

[0223] The RNC 4042a in the RAN 4004 may be connected to the MSC 4046 in the core network 4006 via an IuCS interface. The MSC 4046 may be connected to the MGW 4044. The MSC 4046 and the MGW 4044 may provide the WTRUs 4002a, 4002b, 4002c with access to circuit-switched networks, such as the PSTN 4008, to facilitate communications between the WTRUs 4002a, 4002b, 4002c and traditional land-line communications devices.

[0224] The RNC 4042a in the RAN 4004 may also be connected to the SGSN 4048 in the core network 4006 via an IuPS interface. The SGSN 4048 may be connected to the GGSN 4050. The SGSN 4048 and the GGSN 4050 may provide the WTRUs 4002a, 4002b, 4002c with access to packet-switched networks, such as the Internet 4010, to facilitate communications between the WTRUs 4002a, 4002b, 4002c and IP-enabled devices.

[0225] As noted above, the core network 4006 may also be connected to the networks 4012, which may include other wired or wireless networks that are owned and/or operated by other service providers.

[0226] Figure 40D is a system diagram of the RAN 4004 and the core network 4006 according to another embodiment. As noted above, the RAN 4004 may employ an E-UTRA radio technology to communicate with the WTRUs 4002a, 4002b, 4002c over the air interface 4016. The RAN 4004 may also be in communication with the core network 4006.

[0227] The RAN 4004 may include eNode Bs 4060a, 4060b, 4060c, though it will be appreciated that the RAN 4004 may include any number of eNode Bs while remaining consistent with an embodiment. The eNode Bs 4060a, 4060b, 4060c may each include one or more transceivers for communicating with the WTRUs 4002a, 4002b, 4002c over the air interface 4016. In one embodiment, the eNode Bs 4060a, 4060b, 4060c may implement MIMO technology. Thus, the eNode B 4060a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 4002a.

[0228] Each of the eNode Bs 4060a, 4060b, 4060c may be associated with a particular cell (not shown) and may be configured to handle radio resource management decisions, handover decisions, scheduling of users in the uplink and/or downlink, and the like. As shown in Figure 40D, the eNode Bs 4060a, 4060b, 4060c may communicate with one another over an X2 interface.

[0229] The core network 4006 shown in Figure 40D may include a mobility management gateway (MME) 4062, a serving gateway (SGW) 4064, and a packet data network (PDN) gateway (PGW) 4066. While each of the foregoing elements is depicted as part of the core network 4006, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.

[0230] The MME 4062 may be connected to each of the eNode Bs 4060a, 4060b, 4060c in the RAN 4004 via an S1 interface and may serve as a control node. For example, the MME 4062 may be responsible for authenticating users of the WTRUs 4002a, 4002b, 4002c, bearer activation/deactivation, selecting a particular SGW during an initial attach of the WTRUs 4002a, 4002b, 4002c, and the like. The MME 4062 may also provide a control plane function for switching between the RAN 4004 and other RANs (not shown) that employ other radio technologies, such as GSM or WCDMA.

[0231] The SGW 4064 may be connected to each of the eNode Bs 4060a, 4060b, 4060c in the RAN 4004 via the S1 interface. The SGW 4064 may generally route and forward user data packets to/from the WTRUs 4002a, 4002b, 4002c. The SGW 4064 may also perform other functions, such as anchoring user planes during inter-eNode B handovers, triggering paging when downlink data is available for the WTRUs 4002a, 4002b, 4002c, managing and storing contexts of the WTRUs 4002a, 4002b, 4002c, and the like.

[0232] The SGW 4064 may also be connected to the PGW 4066, which may provide the WTRUs 4002a, 4002b, 4002c with access to packet-switched networks, such as the Internet 4010, to facilitate communications between the WTRUs 4002a, 4002b, 4002c and IP-enabled devices.

[0233] The core network 4006 may facilitate communications with other networks. For example, the core network 4006 may provide the WTRUs 4002a, 4002b, 4002c with access to circuit-switched networks, such as the PSTN 4008, to facilitate communications between the WTRUs 4002a, 4002b, 4002c and traditional land-line communications devices. For example, the core network 4006 may include, or may communicate with, an IP gateway (e.g., an IP multimedia subsystem (IMS) server) that serves as an interface between the core network 4006 and the PSTN 4008. In addition, the core network 4006 may provide the WTRUs 4002a, 4002b, 4002c with access to the networks 4012, which may include other wired or wireless networks that are owned and/or operated by other service providers.

[0234] Figure 40E is a system diagram of the RAN 4004 and the core network 4006 according to another embodiment. The RAN 4004 may be an access service network (ASN) that employs IEEE 802.16 radio technology to communicate with the WTRUs 4002a, 4002b, 4002c over the air interface 4016. As will be further discussed below, the communication links between the different functional entities of the WTRUs 4002a, 4002b, 4002c, the RAN 4004, and the core network 4006 may be defined as reference points.

[0235] As shown in Figure 40E, the RAN 4004 may include base stations 4070a, 4070b, 4070c, and an ASN gateway 4072, though it will be appreciated that the RAN 4004 may include any number of base stations and ASN gateways while remaining consistent with an embodiment. The base stations 4070a, 4070b, 4070c may each be associated with a particular cell (not shown) in the RAN 4004 and may each include one or more transceivers for communicating with the WTRUs 4002a, 4002b, 4002c over the air interface 4016. In one embodiment, the base stations 4070a, 4070b, 4070c may implement MIMO technology. Thus, the base station 4070a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 4002a. The base stations 4070a, 4070b, 4070c may also provide mobility management functions, such as handoff triggering, tunnel establishment, radio resource management, traffic classification, quality of service (QoS) policy enforcement, and the like. The ASN gateway 4072 may serve as a traffic aggregation point and may be responsible for paging, caching of subscriber profiles, routing to the core network 4006, and the like.

[0236] The air interface 4016 between the WTRUs 4002a, 4002b, 4002c and the RAN 4004 may be defined as an R1 reference point that implements the IEEE 802.16 specification. In addition, each of the WTRUs 4002a, 4002b, 4002c may establish a logical interface (not shown) with the core network 4006. The logical interface between the WTRUs 4002a, 4002b, 4002c and the core network 4006 may be defined as an R2 reference point, which may be used for authentication, authorization, IP host configuration management, and/or mobility management.

[0237] The communication link between each of the base stations 4070a, 4070b, 4070c may be defined as an R8 reference point that includes protocols for facilitating WTRU handovers and the transfer of data between base stations. The communication link between the base stations 4070a, 4070b, 4070c and the ASN gateway 4072 may be defined as an R6 reference point. The R6 reference point may include protocols for facilitating mobility management based on mobility events associated with each of the WTRUs 4002a, 4002b, 4002c.

[0238] As shown in Figure 40E, the RAN 4004 may be connected to the core network 4006. The communication link between the RAN 4004 and the core network 4006 may be defined as an R3 reference point that includes protocols for facilitating data transfer and mobility management capabilities, for example. The core network 4006 may include a mobile IP home agent (MIP-HA) 4074, an authentication, authorization, accounting (AAA) server 4076, and a gateway 4078. While each of the foregoing elements is depicted as part of the core network 4006, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.

[0239] The MIP-HA 4074 may be responsible for IP address management, and may enable the WTRUs 4002a, 4002b, 4002c to roam between different ASNs and/or different core networks. The MIP-HA 4074 may provide the WTRUs 4002a, 4002b, 4002c with access to packet-switched networks, such as the Internet 4010, to facilitate communications between the WTRUs 4002a, 4002b, 4002c and IP-enabled devices. The AAA server 4076 may be responsible for user authentication and for supporting user services. The gateway 4078 may facilitate interworking with other networks. For example, the gateway 4078 may provide the WTRUs 4002a, 4002b, 4002c with access to circuit-switched networks, such as the PSTN 4008, to facilitate communications between the WTRUs 4002a, 4002b, 4002c and traditional land-line communications devices. In addition, the gateway 4078 may provide the WTRUs 4002a, 4002b, 4002c with access to the networks 4012, which may include other wired or wireless networks that are owned and/or operated by other service providers.

[0240] Although not shown in Figure 40E, it will be appreciated that the RAN 4004 may be connected to other ASNs and the core network 4006 may be connected to other core networks. The communication link between the RAN 4004 and the other ASNs may be defined as an R4 reference point, which may include protocols for coordinating the mobility of the WTRUs 4002a, 4002b, 4002c between the RAN 4004 and the other ASNs. The communication link between the core network 4006 and the other core networks may be defined as an R5 reference point, which may include protocols for facilitating interworking between home core networks and visited core networks.

[0241] Although features and elements are provided above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. The present disclosure is not to be limited in terms of the particular embodiments described in this application, which are intended as illustrations of various aspects. Many modifications and variations may be made without departing from its spirit and scope, as will be apparent to those skilled in the art. No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly provided as such. Functionally equivalent methods and apparatuses within the scope of the disclosure, in addition to those enumerated herein, will be apparent to those skilled in the art from the foregoing descriptions. Such modifications and variations are intended to fall within the scope of the appended claims. The present disclosure is to be limited only by the terms of the appended claims, along with the full scope of equivalents to which such claims are entitled. It is to be understood that this disclosure is not limited to particular methods or systems.

[0242] It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting. As used herein, the term "view" may include video. As used herein, the term "video" may mean any of a snapshot, single image and/or multiple images displayed over a time basis. As another example, when referred to herein, the terms "user equipment" and its abbreviation "UE" may mean (i) a wireless transmit and/or receive unit (WTRU), such as described infra; (ii) any of a number of embodiments of a WTRU, such as described infra; (iii) a wireless-capable and/or wired-capable (e.g., tetherable) device configured with, inter alia, some or all structures and functionality of a WTRU, such as described infra; (iv) a wireless-capable and/or wired-capable device configured with less than all structures and functionality of a WTRU, such as described infra; or (v) the like. Details of an example WTRU, which may be representative of any UE recited herein, may be provided herein.

[0243] In addition, the methods provided herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable media include electronic signals (transmitted over wired or wireless connections) and computer-readable storage media. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.

[0244] Variations of the method, apparatus and system provided above are possible without departing from the scope of the invention. In view of the wide variety of embodiments that can be applied, it should be understood that the illustrated embodiments are examples only, and should not be taken as limiting the scope of the following claims. For instance, the embodiments provided herein include handheld devices, which may include or be utilized with any appropriate voltage source, such as a battery and the like, providing any appropriate voltage.

[0245] Moreover, in the embodiments provided above, processing platforms, computing systems, controllers, and other devices containing processors are noted. These devices may contain at least one Central Processing Unit ("CPU") and memory. In accordance with the practices of persons skilled in the art of computer programming, reference to acts and symbolic representations of operations or instructions may be performed by the various CPUs and memories. Such acts and operations or instructions may be referred to as being "executed," "computer executed" or "CPU executed."

[0246] One of ordinary skill in the art will appreciate that the acts and symbolically represented operations or instructions include the manipulation of electrical signals by the CPU. An electrical system represents data bits that can cause a resulting transformation or reduction of the electrical signals and the maintenance of data bits at memory locations in a memory system to thereby reconfigure or otherwise alter the CPU's operation, as well as other processing of signals. The memory locations where data bits are maintained are physical locations that have particular electrical, magnetic, optical, or organic properties corresponding to or representative of the data bits. It should be understood that the exemplary embodiments are not limited to the above-mentioned platforms or CPUs and that other platforms and CPUs may support the provided methods.

[0247] The data bits may also be maintained on a computer readable medium including magnetic disks, optical disks, and any other volatile (e.g., Random Access Memory ("RAM")) or non-volatile (e.g., Read-Only Memory ("ROM")) mass storage system readable by the CPU. The computer readable medium may include cooperating or interconnected computer readable media, which exist exclusively on the processing system or are distributed among multiple interconnected processing systems that may be local or remote to the processing system. It should be understood that the exemplary embodiments are not limited to the above-mentioned memories and that other platforms and memories may support the provided methods.

[0248] In an illustrative embodiment, any of the operations, processes, and/or the like described herein may be implemented as computer-readable instructions stored on a computer-readable medium. The computer-readable instructions may be executed by a processor of a mobile unit, a network element, and/or any other computing device.

[0249] There is little distinction left between hardware and software implementations of aspects of systems. The use of hardware or software is generally (but not always, in that in certain contexts the choice between hardware and software may become significant) a design choice representing cost vs. efficiency tradeoffs. There may be various vehicles by which processes and/or systems and/or other technologies described herein may be effected (e.g., hardware, software, and/or firmware), and the preferred vehicle may vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle. If flexibility is paramount, the implementer may opt for a mainly software implementation. Alternatively, the implementer may opt for some combination of hardware, software, and/or firmware.

[0250] The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples may be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In an embodiment, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), and/or other integrated formats. However, those skilled in the art will recognize that some aspects of the embodiments disclosed herein, in whole or in part, may be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein may be distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution. Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a CD, a DVD, a digital tape, a computer memory, etc., and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).

[0251] Those skilled in the art will recognize that it is common within the art to describe devices and/or processes in the fashion set forth herein, and thereafter use engineering practices to integrate such described devices and/or processes into data processing systems. That is, at least a portion of the devices and/or processes described herein may be integrated into a data processing system via a reasonable amount of experimentation. Those having skill in the art will recognize that a typical data processing system may generally include one or more of a system unit housing, a video display device, a memory such as volatile and nonvolatile memory, processors such as microprocessors and digital signal processors, computational entities such as operating systems, drivers, graphical user interfaces, and application programs, one or more interaction devices, such as a touch pad or screen, and/or control systems including feedback loops and control motors (e.g., feedback for sensing position and/or velocity, control motors for moving and/or adjusting components and/or quantities). A typical data processing system may be implemented utilizing any suitable commercially available components, such as those typically found in data computing/communication and/or network computing/communication systems.

[0252] The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely examples, and that in fact many other architectures may be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively "associated" such that the desired functionality may be achieved. Hence, any two components herein combined to achieve a particular functionality may be seen as "associated with" each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated may also be viewed as being "operably connected", or "operably coupled", to each other to achieve the desired functionality, and any two components capable of being so associated may also be viewed as being "operably couplable" to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.
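As a purely illustrative sketch (the class and message names below are hypothetical), the same functionality can be achieved whether two components are coupled directly or through an intermedial component:

    # Sketch only (hypothetical names): the same functionality achieved
    # with and without an intermedial component; either way, the two
    # components remain "operably coupled".
    class Display:
        def show(self, message: str) -> None:
            print(message)

    class DirectController:
        """Controller coupled straight to the display."""
        def __init__(self, display: Display) -> None:
            self.display = display
        def alert(self) -> None:
            self.display.show("alert")

    class QueueCoupledController:
        """Controller coupled to the display through an intermediary queue."""
        def __init__(self, queue: list) -> None:
            self.queue = queue
        def alert(self) -> None:
            self.queue.append("alert")

    # Both arrangements achieve the desired functionality.
    display = Display()
    DirectController(display).alert()

    queue: list = []
    QueueCoupledController(queue).alert()
    for message in queue:  # the intermediary is drained by the display side
        display.show(message)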

[0253] With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for the sake of clarity.