

Title:
APPARATUS AND METHOD FOR EXCHANGING AND DISPLAYING DATA BETWEEN ELECTRONIC EYEWEAR, VEHICLES AND OTHER DEVICES
Document Type and Number:
WIPO Patent Application WO/2017/131814
Kind Code:
A1
Abstract:
Disclosed are systems that allow for data to be shared between vehicles, locking mechanisms, and electronic eyewear. In an embodiment, a system includes a head-worn electronic eyewear device comprising a wireless communication module and an audio or visual system configured to communicate information received via the wireless module to a wearer of the head-worn electronic eyewear device. A vehicle module is configured to communicate wirelessly with the head-worn electronic eyewear device, either directly or through a third-party device, such that vehicle data is communicated to the wireless module of the head-worn electronic eyewear device for communication to the wearer of the head-worn electronic eyewear device. Systems for using a head-worn device to communicate settings data and to authenticate a user are also disclosed. A wireless-enabled device configured to utilize data from three or more sensors in a trilateration function to locate a second wireless-enabled device is further disclosed.

Inventors:
MACK COREY (US)
KOKONASKI WILLIAM (US)
Application Number:
PCT/US2016/042105
Publication Date:
August 03, 2017
Filing Date:
July 13, 2016
Assignee:
LAFORGE OPTICAL INC (US)
International Classes:
B60R1/00; B60R16/023; B60R16/037; G02B27/01; H04N5/232
Foreign References:
US20140098008A12014-04-10
US20130281141A12013-10-24
CN103870738A2014-06-18
US20120006611A12012-01-12
JPH07294842A1995-11-10
Attorney, Agent or Firm:
KURTZ, Richard et al. (US)
Claims:

1. A system for sharing data between a vehicle and electronic eyewear, comprising: a head-worn electronic eyewear device comprising a wireless communication module and an audio or visual system configured to communicate information received via the wireless module to a wearer of the head-worn electronic eyewear device; a vehicle module associated with and in communication with a vehicle, said vehicle module being configured to communicate wirelessly with said head-worn electronic eyewear device, either directly or through a third-party device, such that vehicle data is communicated to the wireless module of the head-worn electronic eyewear device for communication to the wearer of the head-worn electronic eyewear device.

2. The system of claim 1 where the vehicle module comprises the third-party device and the third-party device is configured to access the vehicle's OBD bus or CAN system.

3. The system of claim 1 where the vehicle module comprises the third-party device and the third-party device is configured to access a vehicle's or home's security or access system.

4. The system of claim 1, where the vehicle module comprises the third-party device and the third-party device is configured to access the vehicle's infotainment system.

5. The system of claim 1, where the head-worn electronic eyewear device comprises a display.

6. The system of claim 1, where the head-worn electronic eyewear device and the vehicle module are configured such that the wearer of the head-worn electronic eyewear device sees information from a rear camera or front camera of the vehicle.

7. The system of claim 1, where the vehicle module is configured to send visual or audio output from a park assist, collision warning or avoidance system associated with the vehicle to the head-worn electronic eyewear device.

8. The system of claim 1, where the vehicle module is configured to send data from a vehicle telematics system or GPS system to the wearer of the head-worn electronic eyewear device.

9. The system of claim 1, where vehicle settings are stored in the head-worn electronic eyewear device.

10. The system of claim 9, where the vehicle settings comprise at least one setting selected from the set consisting of: radio station settings, audio playlists, suspension settings, transmission settings, light settings, seating position, or mirror settings.

11. A system, comprising: a head-worn device comprising a wireless communication module; a first vehicle module associated with a first vehicle and configured to communicate vehicle settings data wirelessly to the head-worn device either directly or through a third-party device; said head-worn device being configured to store said vehicle settings data and later communicate said vehicle settings data to a second vehicle module associated with a second vehicle, said second vehicle module being configured to receive said vehicle settings data wirelessly either directly or through a third-party device and to utilize said vehicle settings data in operation of at least one vehicle system onboard said second vehicle.

12. The system of claim 11, where the vehicle settings data comprises at least one data type selected from the set consisting of: radio station data, audio playlist data, suspension settings data, transmission settings data, light settings data, seating position data, or mirror settings data.

13. The system of claim 11, where said vehicle settings data comprises data from a telematics system or GPS system associated with the first vehicle and where the system is configured to send said vehicle settings data to the head-worn device and later upload said vehicle settings data to said second vehicle's telematics or GPS system.

14. The system of claim 11, where the head-worn device is a head-worn display.

15. A system for authenticating a user, comprising: a head-worn device comprising an on-board imaging system configured to capture and store a current image of at least one of a wearer's eyes to be compared to an original image or video of the wearer's eye as a form of authentication; a second device configured to communicate with said head-worn device and permit access upon matching of said current image to said original image.

16. The system of claim 15, where the current image comprises a still image.

17. The system of claim 15, where the current image comprises a video.

18. The system of claim 15, where the original image is stored in the second device.

19. The system of claim 15, where the original image is stored in the head-worn device.

20. The system of claim 15, where the original image is stored in a third-party device.

21. A system comprising: a first wireless-enabled device, the device having three or more sensors on board; a second wireless-enabled device; wherein the first wireless-enabled device is configured to utilize data from the three or more sensors in a trilateration function to locate the second wireless-enabled device.

22. The system of claim 21, where the first wireless-enabled device comprises electronic eyewear.

23. The system of claim 21, where the first wireless-enabled device comprises a device configured to provide an augmented reality environment.

24. The system of claim 21, where the first wireless-enabled device is a vehicle.

25. The system of claim 21 where the data is plotted on a virtual plane in front of the user.

26. The system of claim 25 where a waypoint, symbol, marker or other character is mapped to said virtual plane.

27. The system in accordance with claim 25, where the first wireless-enabled device is configured to utilize a mini map to indicate a position of the second wireless-enabled device from a perspective that is above the user.

Description:
APPARATUS AND METHOD FOR EXCHANGING AND DISPLAYING DATA BETWEEN ELECTRONIC EYEWEAR, VEHICLES AND OTHER DEVICES

[0001] This application is a non-provisional of, and claims priority to, U.S. Provisional Application No. 62/191,752 filed July 13, 2015, the entire disclosure of which is incorporated herein by reference.

Field

[0002] The present invention relates in general to the field of mediated reality and in particular to a system and method that allows for data to be shared between vehicles, locking mechanisms, and electronic eyewear.

Brief Description of Drawings

[0003] Objects, features, and advantages of the invention will be apparent from the following more particular description of preferred embodiments as illustrated in the accompanying drawings, in which reference characters refer to the same parts throughout the various views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating principles of the invention.

[0004] Figure 1 shows an illustration of an embodiment of the system of the invention, a vehicle and its subsystems.

[0005] Figure 2 shows an alternate view of the system illustrating the invention, a vehicle and its subsystems.

[0006] Figure 3 shows an illustration of the system in an embodiment wherein the invention interacts with more than one wireless device.

[0007] Figure 4 shows a view of the components in the invention and components in other systems.

[0008] Figure 5 shows an illustration of an embodiment wherein an image of the eye is used to authenticate.

[0009] Figures 6 and 6A show illustrations of an application wherein a distance is calculated using an embodiment of the invention.

[0010] Figure 7 shows an illustration of the variables of a distance-finding application.

[0011] Figures 8 and 8A show illustrations of an operation of a distance-finding application.

[0012] Figure 9 shows an illustration of an output of a distance-finding operation from the perspective of the user.

[0013] Figure 10 shows an illustration of a communication method.

[0014] Figure 11 shows an illustration of an alternate communication method.

[0015] Figure 12 illustrates a system in accordance with an embodiment of the invention.

Detailed Description

[0016] Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding. However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description. References to "one embodiment" or "an embodiment" in the present disclosure are not necessarily references to the same embodiment, and such references mean at least one embodiment.

[0017] Reference in this specification to "an embodiment" or "the embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least an embodiment of the disclosure. The appearances of the phrase "in an embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not other embodiments.

[0018] The present invention is described below with reference to block diagrams and operational illustrations of methods and devices for exchanging and displaying data between electronic eyewear, vehicles and other devices. It is understood that each block of the block diagrams or operational illustrations, and combinations of blocks in the block diagrams or operational illustrations, may be implemented by means of analog or digital hardware and computer program instructions. These computer program instructions may be stored on computer-readable media and provided to a processor of a general purpose computer, special purpose computer, ASIC, or other programmable data processing apparatus, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, implement the functions/acts specified in the block diagrams or operational block or blocks. In some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.

[0019] Figure 1 shows an embodiment of the invention wherein a wirelessly enabled vehicle has external or external-facing sensors that are used to alert the driver to certain hazards. Electronic eyewear 101 is wirelessly connected to a vehicle 301 and has the ability to transmit and receive signals to and from other devices or mechanisms. In this embodiment the vehicle may include front and rear parking sensors 202. A front-facing camera system 201 and a rear-facing camera system 203 are provided. The data from 201, 202, and 203 may be obtained from an onboard telematics system 435 that engages with other on-board electronics 439 (figure 4). The outputs from the vehicle's telematics system are output via an audio system and/or through one or more display systems 431 (figure 4). These types of displays may comprise, but are not limited to, an onboard head-up display system 103 (also 432 in figure 4). 103 is typically a system in or on the dashboard that displays certain bits of telematics and navigation information in front of the driver via a virtual image that reflects off the windshield or, in the case of vehicles such as the 2015 Mini Cooper, off a flip-up reflective element between the windshield and the steering wheel. The information may also be displayed in the instrument panel 104 that is behind the steering wheel and below the windshield. In some cars today, outputs from GPS 438 (figure 4) are also displayed in 104, as seen in the MMI system in vehicles such as the 2016 model year Audi TT, where traditional instrument cluster information such as speed, engine RPM, and warning lights, among other outputs, may be displayed interchangeably or simultaneously with GPS data. Additional data can also be output via the infotainment/climate system 102. 102 is usually located in the dashboard between the driver and passenger. 102 accepts inputs from occupants in the vehicle and can store settings and start functions such as vehicle settings 124, telephony settings 125, GPS location 123, comfort settings 122 such as HVAC and seat position, radio settings 121, and playlists 120.

[0020] In an embodiment, the system including the eyewear can interface with the above system and allow the wearer of the eyewear to not only view but also interact with this data wirelessly in a way that does not avert the driver's eyes downward or otherwise away from the road. Additionally, the system can relay other, simpler forms of visual or audible alert to the driver exclusively. For example, Volvo's CitySafe system is able to detect pedestrians, cyclists, and other vehicles and apply the brakes to avoid or lessen the severity of the impact. The interface to the driver (in addition to the sudden jerk of the vehicle coming to a stop) is an audible alert coupled with an array of flashing red lights below the windshield. In accordance with the invention, however, the system can reroute the audio signal from the vehicle's audio out port 441 to the electronic eyewear 101 so that the driver may hear it via an audio out port 417, such as a piezo element mounted in the frame, or via an aux port onboard 101. Similarly, the visible alert may be expanded from just a series of warning lights visible in system 411 of the electronic eyewear to a higher-fidelity alert where the hazard has a shape placed around it so that the driver may be even more informed.
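As a rough illustration of this alert rerouting, the sketch below models a collision warning being forwarded from the vehicle module and rendered by the eyewear as both an audio cue and a highlighted shape. The message fields, class names, and output methods are hypothetical placeholders for whatever transport and rendering path a given embodiment actually uses.

```python
from dataclasses import dataclass

@dataclass
class HazardAlert:
    """Hypothetical alert message a vehicle module might forward to the eyewear."""
    kind: str            # e.g. "pedestrian", "cyclist", "vehicle"
    bearing_deg: float   # direction of the hazard relative to the driver
    distance_m: float
    audio_clip: bytes    # audio payload rerouted from the vehicle's audio out port

class EyewearOutput:
    """Stand-in for the eyewear's audio output 417 and display system 411."""
    def play_audio(self, clip: bytes) -> None:
        print(f"[eyewear audio] playing {len(clip)} bytes")

    def draw_outline(self, kind: str, bearing_deg: float, distance_m: float) -> None:
        print(f"[eyewear display] outlining {kind} at {bearing_deg:.0f} deg, {distance_m:.1f} m")

def route_alert(alert: HazardAlert, eyewear: EyewearOutput) -> None:
    # Reroute the audible warning to the frame-mounted audio output, and upgrade
    # the dashboard warning lights to a shape drawn around the hazard.
    eyewear.play_audio(alert.audio_clip)
    eyewear.draw_outline(alert.kind, alert.bearing_deg, alert.distance_m)

if __name__ == "__main__":
    route_alert(HazardAlert("pedestrian", bearing_deg=-15.0, distance_m=12.5,
                            audio_clip=b"\x00" * 480), EyewearOutput())
```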

[0021] With reference to figure 2, one can see that there are other ways for the system to connect to a vehicle. Some of these ways include connecting a module to the OBD port 105, Bluetooth 106, Wi-Fi 107, and a cellular network 108. In the cases of 106, 107, and 108 there is often a modem that has been placed in most modern cars so that a mobile device such as a pair of electronic eyewear 101 can interface with the vehicle. The OBD system 105, however, has typically been accessible only via an OBD scanner or through a closed system operated by the manufacturer, such as OnStar by General Motors. However, in the future these systems may be opened to developers who would like to access the OBD system wirelessly. Currently a third-party device 420 may be added to the vehicle that allows one to interface with the vehicle's OBD system or telematics system. One such system is the Automatic module by Automatic Labs. The Automatic module plugs into a vehicle's OBD port 105 (also shown at 437 in figure 4) and is able to wirelessly output or log data such as vehicle speed and engine temperature, or to perform more sophisticated functions such as moving a phone to a 'do not disturb' mode when the vehicle is in motion or calling emergency services when an airbag deployment sensor has been activated.
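For readers who want to experiment with this kind of OBD access, the sketch below polls a few values through a generic ELM327-style dongle using the open-source python-OBD package. It is only a sketch of the general idea, not the Automatic module's actual interface, and assumes the dongle is already paired and exposed as a serial port.

```python
# pip install obd  (python-OBD, for ELM327-style adapters plugged into OBD port 105)
import obd

def read_basic_vehicle_data(port: str | None = None) -> dict:
    """Poll a few OBD-II values; returns an empty dict if no adapter is found."""
    connection = obd.OBD(port)  # auto-detects the adapter if port is None
    if not connection.is_connected():
        return {}

    speed = connection.query(obd.commands.SPEED)            # km/h
    coolant = connection.query(obd.commands.COOLANT_TEMP)   # degrees C
    rpm = connection.query(obd.commands.RPM)

    return {
        "speed": None if speed.is_null() else speed.value,
        "coolant_temp": None if coolant.is_null() else coolant.value,
        "rpm": None if rpm.is_null() else rpm.value,
    }

if __name__ == "__main__":
    print(read_basic_vehicle_data())
```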

[0022] With reference to figure 3, the system can also be used to transmit data between vehicle modules of vehicles that are not otherwise capable of vehicle-to-vehicle communication. In this example the electronic eyewear acts as a storage device on a wireless-enabled 'sneaker net'. In this embodiment, settings such as seating position, radio presets, or navigational waypoints can be uploaded from a first vehicle, stored in 101, and downloaded to a second vehicle 302.
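A minimal sketch of this 'sneaker net' transfer, assuming a JSON settings bundle and illustrative field names, is shown below; a real system would ride on whatever wireless link and vehicle API the modules expose.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class VehicleSettings:
    # Settings types of the kind recited in claim 12: presets, positions, waypoints.
    radio_presets: list[str]
    seat_position: dict[str, float]
    mirror_settings: dict[str, float]
    nav_waypoints: list[list[float]]

def upload_from_vehicle(raw_json: str) -> VehicleSettings:
    """First vehicle -> eyewear 101: parse the settings bundle it sends."""
    return VehicleSettings(**json.loads(raw_json))

def store_on_eyewear(settings: VehicleSettings, eyewear_memory: dict) -> None:
    """Persist the bundle in the eyewear's memory until a second vehicle is in range."""
    eyewear_memory["vehicle_settings"] = asdict(settings)

def download_to_vehicle(eyewear_memory: dict) -> str:
    """Eyewear 101 -> second vehicle 302: emit the stored bundle for its module to apply."""
    return json.dumps(eyewear_memory["vehicle_settings"])
```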

[0023] Figure 4 illustrates an example of an electronic system of the invention where the electronic eyewear hardware 410, comprising memory 414, a processor 415, and a display system 411 (further comprising a display 412 and a driver 413), can communicate directly with a vehicle module such as a vehicle's telematics system 430 wirelessly via a wireless module 416 comprising a wireless antenna. Figure 4 also illustrates an embodiment wherein the vehicle module includes a third-party device, such as a phone or module that includes memory 421, along with the vehicle's telematics system 430. In this embodiment, the electronic eyewear hardware 410 can communicate with the vehicle through the third-party device 420. The third-party device 420 can plug in directly to 430 or communicate with 430 wirelessly via a wireless module 422 and a wireless module 436. Data from memory 440 of the telematics system 435, such as data from the instruments 433 or infotainment/climate systems 434, can be communicated wirelessly to the electronic eyewear hardware 410.

[0024] In certain applications, a software developer may choose to use 101 with a secured third-party device. In this case, the invention has an onboard authentication system that scans the eye. As every eye is different, this adds a primary level of security. For individuals who are in the public eye (such as celebrities and politicians) and have numerous photos available, there may be a concern that someone may be able to lift an 'eye print' from a high-resolution photo. An additional level of security is that the images used in this system can have a very high resolution and a proprietary aspect ratio, and the system can use a comparison of infrared images and conventional digital photos in order to authenticate. This system also may use a series of images or a video analysis of a person's eye to authenticate the user.

[0025] Figure 5 is an illustration of hardware that may be needed in accordance with such an embodiment. A reflective surface 501 redirects light through an optical element 502, such as a lens, waveguide, or fluid, and into an image sensor 504 of a biometric matching system 503. From there the image from the sensor is processed in a processor 505 and is either stored in memory 506 or compared to an image that is stored in memory 506. If the match is positive, a wireless antenna 507 will transmit a security credential. This credential may be sent to any third-party device, but by way of example only figure 5 shows one credential being sent to a vehicle's telematics system 510 (having a security module 511, lock mechanism 513, and wireless module 512) and another being sent to a home's access system 520 (having memory 521, lock mechanism 523, and wireless module 522). In both of the illustrated cases the goal is to lock or unlock a device. Note that both 510 and 520 have a wireless modem to transmit and receive data such as security credentials.
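The sketch below captures the flow of figure 5 in code: extract a feature vector from the current eye image, compare it to the enrolled template in memory 506, and, on a sufficiently close match, hand a credential to the wireless antenna 507. The feature extractor, threshold, and credential format are all placeholder assumptions; real iris matching would use a dedicated biometric algorithm.

```python
import math
from typing import Sequence

MATCH_THRESHOLD = 0.95  # illustrative similarity cutoff, not a tuned value

def extract_iris_features(image: bytes) -> list[float]:
    """Placeholder for a real iris/retina feature extractor operating on sensor 504 output."""
    # Crude stand-in: normalized byte histogram of the image.
    hist = [0.0] * 16
    for b in image:
        hist[b >> 4] += 1.0
    total = sum(hist) or 1.0
    return [h / total for h in hist]

def cosine_similarity(a: Sequence[float], b: Sequence[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def authenticate(current_image: bytes, enrolled_template: list[float]) -> bytes | None:
    """Compare the current eye image to the original template stored in memory 506."""
    features = extract_iris_features(current_image)
    if cosine_similarity(features, enrolled_template) >= MATCH_THRESHOLD:
        # On a positive match, antenna 507 would transmit this credential to 510 or 520.
        return b"SIGNED-CREDENTIAL-PLACEHOLDER"
    return None
```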

[0026] Another function of the electronic eyewear 101 is its ability to convey distances and waypoints to a user in real time. For example, figure 6 shows a trilateration function being performed with the goal of assisting a user to find the location of a wireless-enabled object 601, which by way of example only is illustrated as a car that is out of view because a second car is in the user's line of sight. In figure 6, the user is wearing an embodiment of 101 that features three on-board wireless sensors 620A, 620B, and 620C. Initially, one of these three sensors will send a first signal to 601 to determine if the user is in range. If the user is, 601 will send back a signal confirming that it is 'awake'. At that point 620A, 620B, and 620C will simultaneously send a signal to 601, and 601 will send the signal back to 101. 101 will then calculate the amount of time that has passed and perform additional calculations to determine the distances 621A, 621B, and 621C, also illustrated as radii r1, r2, and r3. By looping this process, the system software can simply output prompts that let the user know whether they are going in the correct direction. For example, looking at figure 6 again, one can see that 621B has the shortest radius. Assuming that the direction of travel is from right to left on the illustration, one can deduce that 601 is in front of and to the right of the user (quadrant 1 in figure 6a). If 621A were shortest, one would deduce that 601 is in front of and to the left of the user (quadrant 2 in figure 6a). If 621C were shortest, it would mean that 601 is behind the user (in either quadrant 3 or 4 of figure 6a).
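A simple sketch of this coarse guidance loop follows. It assumes the ranging is based on RF round-trip time (hence the speed-of-light conversion) and reproduces the quadrant reasoning of figure 6a; the prompt wording is illustrative.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def radius_from_round_trip(seconds: float) -> float:
    """Convert a measured round-trip time into a one-way distance (radius)."""
    return SPEED_OF_LIGHT * seconds / 2.0

def coarse_direction(r1: float, r2: float, r3: float) -> str:
    """r1, r2, r3 are the radii 621A, 621B, 621C measured by sensors 620A/B/C."""
    shortest = min(r1, r2, r3)
    if shortest == r2:
        return "Object 601 is ahead and to the right (quadrant 1)."
    if shortest == r1:
        return "Object 601 is ahead and to the left (quadrant 2)."
    return "Object 601 is behind you (quadrant 3 or 4)."

# Looping this check as the user moves yields prompts indicating whether
# they are heading in the correct direction.
```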

[0027] Figure 7 shows a more advanced version of the function of figure 6 that determines the coordinates of 601 with respect to the user. In this case the known coordinates of 620A, 620B, and 620C would be preloaded into the system, with 620A residing at the origin, 620B at (d, 0, 0), and 620C at (d, j, 0). The distance between 620A and 620B is "d", or 622, and the distance between 620B and 620C is "j", or 623. If one considers the points associated with 620A, 620B, and 620C as the center points of three spheres, they may be described by the following equations:

$$r_1^2 = x^2 + y^2 + z^2$$
$$r_2^2 = (x - d)^2 + y^2 + z^2$$
$$r_3^2 = (x - d)^2 + (y - j)^2 + z^2$$

The wireless-enabled object 601 has a coordinate (x, y, z) associated with it that will satisfy all three equations. In order to find said coordinate, the system first solves for x by subtracting the equations for r_1 and r_2:

$$r_1^2 - r_2^2 = x^2 - (x - d)^2$$

Simplifying the above equation and solving for x yields the equation:

$$x = \frac{r_1^2 - r_2^2 + d^2}{2d}$$

In order to solve for y, one must solve for z^2 in the first equation and substitute it into the third equation:

$$z^2 = r_1^2 - x^2 - y^2$$
$$r_3^2 = (x^2 - 2xd + d^2) + (y^2 - 2yj + j^2) + r_1^2 - x^2 - y^2$$

Simplifying and solving for y:

$$y = \frac{r_1^2 - r_3^2 + d^2 + j^2 - 2xd}{2j}$$

At this point x and y are known, so the equation for z may simply be rewritten as:

$$z = \pm\sqrt{r_1^2 - x^2 - y^2}$$

[0028] Since this is not an absolute value, it is possible for there to be more than one solution. In order to find the correct solution, the candidate coordinates can be matched to the expected quadrant; whichever coordinate does not match the expected quadrant is thrown out. Figure 12 illustrates how the above operations may be looped with software.
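The derivation above reduces to a few lines of code. The sketch below assumes the sensor layout used in the reconstructed equations (620A at the origin, 620B at (d, 0, 0), 620C at (d, j, 0)) and returns both candidate solutions so that the quadrant check of paragraph [0028] can discard the wrong one.

```python
import math

def trilaterate(r1: float, r2: float, r3: float, d: float, j: float):
    """Solve for (x, y, z) of object 601 given radii measured by sensors
    620A=(0,0,0), 620B=(d,0,0), 620C=(d,j,0)."""
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + d**2 + j**2 - 2 * x * d) / (2 * j)
    z_sq = r1**2 - x**2 - y**2
    if z_sq < 0:
        # Measurement noise can push the radicand slightly negative; clamp to zero.
        z_sq = 0.0
    z = math.sqrt(z_sq)
    # Two candidate solutions; the one in the expected quadrant is kept.
    return (x, y, z), (x, y, -z)

if __name__ == "__main__":
    # Example: sensors a few centimeters apart on the frame, radii in meters.
    print(trilaterate(r1=10.0, r2=10.05, r3=10.02, d=0.15, j=0.15))
```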

[0029] Figures 8 and 9 illustrate how a virtual plane 630 can be projected out into space and used as a surface on which to draw graphics. In this case 630 is a projected x-z plane a distance y in front of the user. Now turning to figure 9, since 630 is now being projected as if it were coincident with 610, one may choose to draw a waypoint 632 and some character-based data to aid a person in finding 601. To aid the user further, a mini map 633 may be displayed that shows, via 631, where 601 is located relative to the user.
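To make the geometry of figures 8 and 9 concrete, the sketch below projects the located object onto a virtual x-z plane a fixed distance in front of the user (for waypoint 632) and also produces the top-down offsets a mini map 633 would plot. The coordinate convention (y pointing forward from the user) and the plane distance are assumptions for illustration only.

```python
def project_to_virtual_plane(x: float, y: float, z: float, plane_distance: float):
    """Pinhole-style projection of object 601 onto plane 630, which sits
    plane_distance ahead of the user along the +y (forward) axis."""
    if y <= 0:
        return None  # object is beside or behind the user; show it on the mini map instead
    scale = plane_distance / y
    return (x * scale, z * scale)  # horizontal and vertical offsets of waypoint 632

def mini_map_position(x: float, y: float, map_scale: float = 0.05):
    """Top-down position (as seen from above the user) for marker 631 on mini map 633."""
    return (x * map_scale, y * map_scale)
```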

[0030] There may be a time when a user 710 is out of range of 601 but may be in range 610 of another wireless-enabled device. Figure 10 illustrates how a type of mesh network may be used to indicate to a user where 601 is located. By way of example only, the case illustrated in figure 10 shows multiple vehicles in a parking lot that are wirelessly enabled. In this case, a first car 703 communicates with a first intermediate vehicle 702, which then communicates with a second intermediate vehicle 702 that is in communication with the desired car 701. In this case the electronic eyewear can process this information, stating "your vehicle is on the right four vehicles away". Figure 11 shows a similar application of the invention where it is being used in an environment where there are multiple people using wireless devices such as smartphones, wearables or laptops. In this application 710 is sharing data with multiple first devices 711. The 711's are in communication with multiple secondary devices 712 that are also in contact with the desired device 713. Some of the methods described above can be used to display where the desired device is located with a waypoint or prompts such as "ahead about 10 steps and to the right". It should also be noted that the techniques described above do not inherently rely on a satellite-based GPS system; rather, the system can create a localized positioning system using Wi-Fi, Bluetooth, Zigbee or other ad hoc networks, as this plots coordinates relative to the user 710, whereas most satellite-based GPS assigns coordinates to the user 710 relative to the earth.
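The relaying of figures 10 and 11 can be treated as a shortest-path search over whichever devices are currently in radio range of one another. The sketch below runs a breadth-first search from the user's device to the desired device and turns the hop count into a spoken prompt; the adjacency data, device labels, and prompt wording are illustrative only.

```python
from collections import deque

def hops_to_target(links: dict[str, set[str]], start: str, target: str) -> int | None:
    """Breadth-first search over the ad hoc mesh; returns hop count or None if unreachable."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, hops = queue.popleft()
        if node == target:
            return hops
        for neighbor in links.get(node, set()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, hops + 1))
    return None

# Example mesh loosely following figure 10: user 710 -> car 703 -> two intermediate
# vehicles (both labeled 702 in the figure) -> desired car 701.
mesh = {
    "710": {"703"},
    "703": {"710", "702a"},
    "702a": {"703", "702b"},
    "702b": {"702a", "701"},
    "701": {"702b"},
}
hops = hops_to_target(mesh, "710", "701")
if hops is not None:
    print(f"Your vehicle is about {hops} vehicles away.")
```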

[0031] In some embodiments, the eyewear 101 and the camera on the eyewear can be used in conjunction with one or more cameras located outside of the eyewear. For example, a set of security cameras in a building, or cameras on one or more smartphones, could provide additional images to the one produced by the eyewear 101 camera, which in combination may be used to examine a scene to find an object of a known shape or size. The information about the scene could then be displayed on the display systems of the eyewear. This could include complex 3D images, or simple text instructions regarding work to be done or performed in the scene. Information regarding known hazards in a scene may also be provided.

[0032] The cameras can be used to produce 3D images of the objects in the scene for later rendering. The images from multiple cameras might also be used in triangulation algorithms to locate objects in a scene relative to stored information regarding the scene and the objects in that scene.

[0033] At least some aspects disclosed can be embodied, at least in part, in software. That is, the techniques may be carried out in a special purpose or general purpose computer system or other data processing system in response to its processor, such as a microprocessor, executing sequences of instructions contained in a memory, such as ROM, volatile RAM, non-volatile memory, cache or a remote storage device. Functions expressed in the claims may be performed by a processor in combination with memory storing code and should not be interpreted as means-plus-function limitations.

[0034] Routines executed to implement the embodiments may be implemented as part of an operating system, firmware, ROM, middleware, service delivery platform, SDK (Software Development Kit) component, web services, or other specific application, component, program, object, module or sequence of instructions referred to as "computer programs." Invocation interfaces to these routines can be exposed to a software development community as an API (Application Programming Interface). The computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computer that, when read and executed by one or more processors in a computer, cause the computer to perform operations necessary to execute elements involving the various aspects.

[0035] A machine-readable medium can be used to store software and data which, when executed by a data processing system, cause the system to perform various methods. The executable software and data may be stored in various places including for example ROM, volatile RAM, non-volatile memory and/or cache. Portions of this software and/or data may be stored in any one of these storage devices. Further, the data and instructions can be obtained from centralized servers or peer-to-peer networks. Different portions of the data and instructions can be obtained from different centralized servers and/or peer-to-peer networks at different times and in different communication sessions or in a same communication session. The data and instructions can be obtained in entirety prior to the execution of the applications. Alternatively, portions of the data and instructions can be obtained dynamically, just in time, when needed for execution. Thus, it is not required that the data and instructions be on a machine-readable medium in entirety at a particular instance of time.

[0036] Examples of computer-readable media include but are not limited to recordable and non- recordable type media such as volatile and non-volatile memory devices, read only memory (ROM), random access memory (RAM), flash memory devices, floppy and other removable disks, magnetic disk storage media, optical storage media (e.g., Compact Disk Read-Only Memory (CD ROMS), Digital Versatile Disks (DVDs), etc.), among others.

[0037] In general, a machine readable medium includes any mechanism that provides (e.g., stores) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.).

[0038] In various embodiments, hardwired circuitry may be used in combination with software instructions to implement the techniques. Thus, the techniques are neither limited to any specific combination of hardware circuitry and software nor to any particular source for the instructions executed by the data processing system.

[0039] The above embodiments and preferences are illustrative of the present invention. It is neither necessary nor intended for this patent to outline or define every possible combination or embodiment. The inventor has disclosed sufficient information to permit one skilled in the art to practice at least one embodiment of the invention. The above description and drawings are merely illustrative of the present invention, and changes in components, structure and procedure are possible without departing from the scope of the present invention as defined in the following claims. For example, elements and/or steps described above and/or in the following claims in a particular order may be practiced in a different order without departing from the invention. Thus, while the invention has been particularly shown and described with reference to embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.