

Title:
IMMERSIVE DISPLAY AND AUDIO EXPERIENCE
Document Type and Number:
WIPO Patent Application WO/2017/192134
Kind Code:
A1
Abstract:
An example system to provide an immersive experience using a projector bundle and an audio unit is described. The system comprises an audio unit to allow sound in an environment, a projector bundle attachable to the audio unit and to project an image to a projection surface in the environment, and a computing unit to synchronize and control the sound and the projected image. The projector bundle receives the image from the computing unit and instructions to project the image onto the projection surface. The audio unit receives the audio from the computing unit and instructions to provide the audio in the environment.

Inventors:
MEHANDJIYSKY DIMITRE (US)
HOGGARTH MARCUS (GB)
GODFREYWOOD JACK (GB)
COOK SUSAN (GB)
LOUGHLIN KYLE (GB)
MASSARO KEVIN L (US)
CAO ZHENG (US)
Application Number:
PCT/US2016/030780
Publication Date:
November 09, 2017
Filing Date:
May 04, 2016
Assignee:
HEWLETT PACKARD DEVELOPMENT CO LP (US)
International Classes:
G03B31/08; G03B31/06
Domestic Patent References:
WO2013078280A2 (2013-05-30)
Foreign References:
US20100309390A1 (2010-12-09)
US20130044184A1 (2013-02-21)
US20120297428A1 (2012-11-22)
US20130181901A1 (2013-07-18)
Attorney, Agent or Firm:
MAISAMI, Ceyda Azakli (US)
Claims:
CLAIMS

What is claimed is:

1. A system to provide an immersive experience using a projector bundle and an audio unit, comprising:

an audio unit to allow sound in an environment;

a projector bundle attachable to the audio unit and to project an image to a projection surface in the environment; and

a computing unit to synchronize and control the sound and the projected image,

wherein the projector bundle receives the image from the computing unit and instructions to project the image onto the projection surface, and wherein the audio unit receives the audio from the computing unit and instructions to provide the audio in the environment.

2. The system of claim 1, further comprising a communication unit to connect the projector bundle and the audio unit to the computing unit.

3. The system of claim 1, wherein the projector bundle comprises a plurality of projector units, and each projector unit within the projector bundle is controlled by a separate computing unit.

4. The system of claim 1, wherein the projector bundle provides an immersive display experience within the environment.

5. The system of claim 1, wherein the audio unit includes an ambisonic spatial sound system to provide audio anywhere within the environment.

6. The system of claim 1, wherein each projector unit within the projector bundle is placed adjacent to walls of the environment.

7. The system of claim 1, wherein each projector unit within the projector bundle projects a portion of the image.

8. The system of claim 1, further comprising a camera to scan the environment and identify the projection surface in the environment.

9. The system of claim 1, wherein the projector bundle and the audio unit are controlled based on input from a user.

10. A processor-implemented method for providing an immersive experience with audio in an environment, the method comprising:

instructing a projector bundle to project a plurality of portions of an image on a projection surface;

instructing an audio unit to provide audio; and

synchronizing the audio and the plurality of portions of the image, wherein each portion in the plurality of portions of the image appears as an extension of an adjacent portion on the projection surface.

11. The method of claim 10, further comprising receiving a selection of the image by a user.

12. The method of claim 10, wherein the projection surface is a white wall having reflective properties.

13. The method of claim 10, wherein the image may be photos, videos, or interactive virtual reality content.

14. The method of claim 10, further comprising instructing a camera to map the environment and identify the projection surface.

15. A non-transitory machine-readable storage medium comprising instructions executable by a processing resource of a computing system to provide an immersive experience within an environment, the instructions executable to:

instruct to project an image on a projection surface;

instruct to provide audio; and

synchronize the image and the audio based on a criterion, the criterion defined by user preferences.

Description:
IMMERSIVE DISPLAY AND AUDIO EXPERIENCE

BACKGROUND

[0001] Computer systems typically employ a display or multiple displays which are mounted on a support stand and/or are incorporated into some other component of the computer system. For displays consumed by a group of people at the same time, it is often desirable for users to view such displays from various angles and to be able to interact with them if desired. However, optimum ergonomic placement of a display for simply viewing an image thereon is often at odds with such placement for engaging in touch interaction therewith. Moreover, the display size is rarely large enough, and various ways of displaying images on nontraditional surfaces and viewing such images may require external wearable devices.

BRIEF DESCRIPTION OF THE DRAWINGS

[0002] For a detailed description of various examples, reference will now be made to the accompanying drawings in which:

[0003] Figure 1 is a block diagram of an example system in accordance with the principles disclosed herein;

[0004] Figure 2 is a schematic view of the system of Figure 1 in accordance with the principles disclosed herein; and

[0005] Figure 3 is a flowchart of an example method executable by a system of Figure 1 in accordance with the principles disclosed herein.

NOTATION AND NOMENCLATURE

[0006] Certain terms are used throughout the following description and claims to refer to particular system components. As one skilled in the art will appreciate, computer companies may refer to a component by different names. This document does not intend to distinguish between components that differ in name but not function. In the following discussion and in the claims, the terms "including" and "comprising" are used in an open-ended fashion, and thus should be interpreted to mean "including, but not limited to..." Also, the term "couple" or "couples" is intended to mean either an indirect or direct connection. Thus, if a first device couples to a second device, that connection may be through a direct electrical or mechanical connection, through an indirect electrical or mechanical connection via other devices and connections, through an optical electrical connection, or through a wireless electrical connection. As used herein, the term "approximately" means plus or minus 10%. In addition, as used herein, the phrase "user input device" refers to any suitable device for providing an input, by a user, into an electrical system such as, for example, a mouse, keyboard, a hand (or any finger thereof), a stylus, a pointing device, etc.

DETAILED DESCRIPTION

[0007] The following discussion is directed to various examples of the disclosure. Although one or more of these examples may be preferred, the examples disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. In addition, one skilled in the art will understand that the following description has broad application, and the discussion of any example is meant only to be descriptive of that example, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that example.

[0008] Referring now to Figure 1, a system 100 in accordance with the principles disclosed herein is shown. In this example, the system 100 generally comprises a subsystem 110 including a projector bundle 120, an audio unit 130, and a computing device 150. The subsystem 110 is connected to the computing device 150, which may comprise any suitable computing device while still complying with the principles disclosed herein. In an alternative example, the subsystem 110 comprises a camera, which scans the surrounding environment (e.g., the room that the system 100 is located in) and identifies a projection surface onto which the projector bundle 120 may project an image.

[0009] In some implementations, the device 150 may comprise a smartphone, a tablet, a phablet, an all-in-one computer (i.e., a display that also houses the computer's board), a smart watch, or some combination thereof. The computing device 150 may include at least one processing resource. In examples described herein, a processing resource may include, for example, one processor or multiple processors included in a single computing device or distributed across multiple computing devices. As used herein, a "processor" may be at least one of a central processing unit (CPU), a semiconductor-based microprocessor, a graphics processing unit (GPU), a field-programmable gate array (FPGA) to retrieve and execute instructions, other electronic circuitry suitable for the retrieval and execution of instructions stored on a machine-readable storage medium, or a combination thereof. As used herein, a "machine-readable storage medium" may be any electronic, magnetic, optical, or other physical storage apparatus to contain or store information such as executable instructions, data, and the like. For example, any machine-readable storage medium described herein may be any of a storage drive (e.g., a hard drive), flash memory, Random Access Memory (RAM), any type of storage disc (e.g., a compact disc, a DVD, etc.), and the like, or a combination thereof. Further, any machine-readable storage medium described herein may be non-transitory.

[0010] In one implementation, the subsystem 110 and the computing device 150 may be connected (as shown in Figure 1). In another implementation, the subsystem 110 may be located in the computing device 150. Further, in one implementation, the system 100 may comprise a display. For example, a user of the system 100 may use the display in the system 100 to interact with the system 100. In other implementations, the display may be in the computing device 150. For example, the user may use the display in the computing device 150 to control the projector bundle 120 and the audio unit 130.

[0011] The computing device 150 comprises a communications unit, such as a transmitter/receiver chip, that allows the computing device 150 to communicate with the subsystem 110 and/or another computing device (e.g., laptop, tablet). In some embodiments, the subsystem 110 may send information to and/or receive information from the device 150 through any suitable type of connection while still complying with the principles disclosed herein. For example, in some implementations, the subsystem 110 is electrically coupled to the device 150 through an electric conductor, WI-FI, BLUETOOTH®, WiGig, an optical connection, an ultrasonic connection, or some combination thereof. Any suitable wireless (or wired electrical coupling) connection may be used between the subsystem 110 and the device 150 (if they are not in one unit) such as, for example, WI-FI, BLUETOOTH®, ultrasonic, electrical cables, electrical leads, electrical spring-loaded pogo pins with magnetic holding force, or some combination thereof, while still complying with the principles disclosed herein.

[0012] The projector bundle 120 may comprise any suitable digital light projector assembly for receiving data from a computing device (e.g., device 150) and projecting an image or images that correspond with that received data. For example, in some implementations, the projector bundle 120 comprises at least one short throw projector, a digital light processing (DLP) projector, or a liquid crystal on silicon (LCoS) projector, which are advantageously compact and power efficient projection engines capable of multiple display resolutions and sizes, such as, for example, standard XGA (1024 x 768) resolution 4:3 aspect ratio or standard WXGA (1280 x 800) resolution 16:10 aspect ratio. In one example, the projector bundle 120 comprises one projector unit. In another example, the projector bundle 120 comprises a plurality of projector units. In such an example, each projector unit may project a portion of the image, and that portion of the image may appear as an extension of the portion projected by another projector unit.
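
To make the multi-unit projection concrete, the following is a minimal sketch of how one image might be cut into strips so that each projector unit projects a portion that extends its neighbor's. It assumes the Pillow imaging library; the function name and unit count are illustrative, not from the patent.

    # Split one image into vertical strips, one per projector unit, so that
    # each projected portion appears as an extension of the adjacent portion.
    from PIL import Image

    def split_for_bundle(image_path: str, num_units: int) -> list:
        """Cut an image into num_units strips of (nearly) equal width."""
        image = Image.open(image_path)
        width, height = image.size
        strip_width = width // num_units
        strips = []
        for i in range(num_units):
            left = i * strip_width
            # The last strip absorbs any remainder so no pixels are dropped.
            right = width if i == num_units - 1 else left + strip_width
            strips.append(image.crop((left, 0, right, height)))
        return strips

    # Each strip would then be sent to one projector unit, for example:
    # for unit, strip in zip(projector_units, split_for_bundle("scene.png", 3)):
    #     unit.project(strip)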

[0013] The image projected by the projector bundle 120 may comprise still images, interfaces, games, videos (e.g., dynamic images), interactive virtual reality content, and the like. Further, the projector bundle 120 projects the image onto a projection surface. In one example, the image projected by the projector bundle 120 may comprise information and/or images produced by software executing within the computing device 150. A user (not shown) may then interact with the image displayed on the projection surface by physically engaging with the projection surface. In another example, the user may interact with the image through an application (e.g., software) on the computing device 150. Such interaction may take place through any suitable method such as direct interaction with a user's hand, through a stylus, or other suitable user input device(s).

[0014] The audio unit 130 comprises an ambisonic sound system, providing three-dimensional (3D) sound in the environment. More specifically, the audio unit 130 sends a sound signal with spatial information that enables the user to perceive the sound as originating from distinct spatial locations and different directions. In one example, the audio unit 130 may target one user. That is, the audio unit 130 may provide an effect of stereo sound when a single user is positioned within the direction of the speaker. In another example, the audio unit 130 may provide 3D sound for multiple users regardless of the users' positions.
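
As an illustration of the spatial information an ambisonic audio unit encodes, the sketch below implements standard first-order B-format encoding, one conventional way to tag a mono signal with a direction of arrival. It is not taken from the patent, and it assumes NumPy.

    # First-order ambisonic (B-format) encoding: a mono signal plus a source
    # direction yields four channels (W, X, Y, Z) that a decoder can render
    # as sound arriving from that direction.
    import numpy as np

    def encode_b_format(mono, azimuth, elevation):
        """Encode a mono signal for a source direction given in radians."""
        w = mono / np.sqrt(2.0)                         # omnidirectional part
        x = mono * np.cos(azimuth) * np.cos(elevation)  # front-back
        y = mono * np.sin(azimuth) * np.cos(elevation)  # left-right
        z = mono * np.sin(elevation)                    # up-down
        return w, x, y, z

    # Example: place a 440 Hz tone 90 degrees to the listener's left.
    t = np.linspace(0.0, 1.0, 48000)
    tone = np.sin(2 * np.pi * 440 * t)
    w, x, y, z = encode_b_format(tone, azimuth=np.pi / 2, elevation=0.0)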

[0015] In one implementation, the computing device 150 comprises a controller unit. The controller unit analyzes data from the projector bundle and the audio unit. More specifically, the controller unit synchronizes the image data from the projector bundle with the audio data from the audio unit. The controller unit may include a programmable logic controller, microprocessor, application specific integrated circuit, or the like having suitable programming code for performing the methods described herein. More specifically, the controller unit may be implemented using any suitable type of processing system where at least one processor executes computer-readable instructions stored in a memory. As discussed earlier, the processor may be, for example, a central processing unit (CPU), a semiconductor-based microprocessor, an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) configured to retrieve and execute instructions, other electronic circuitry suitable for the retrieval and execution of instructions stored on a computer-readable storage medium (e.g., the memory), or a combination thereof. The computer-readable medium may be a non-transitory computer-readable medium that stores machine-readable instructions, codes, data, and/or other information. The instructions, when executed by the processor (e.g., via one processing element or multiple processing elements of the processor) can cause the processor to perform processes described herein. The computer-readable medium may be one or more of a non-volatile memory, a volatile memory, and/or one or more storage devices. Examples of non-volatile memory include, but are not limited to, electronically erasable programmable read only memory (EEPROM) and read only memory (ROM). Examples of volatile memory include, but are not limited to, static random access memory (SRAM) and dynamic random access memory (DRAM).
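
The following is a minimal sketch of the kind of synchronization such a controller unit could perform: pairing image frames with audio blocks by presentation timestamp and releasing only pairs that agree within a tolerance. The two-stream layout, tolerance, and names are assumptions for illustration, not the patent's implementation.

    # Pair (timestamp, payload) entries from the projector and audio streams.
    def synchronize(frames, audio_blocks, tolerance=0.010):
        """Return (timestamp, frame, audio) triples matched within tolerance seconds."""
        frames = sorted(frames, key=lambda item: item[0])
        audio_blocks = sorted(audio_blocks, key=lambda item: item[0])
        pairs = []
        i = j = 0
        while i < len(frames) and j < len(audio_blocks):
            ft, frame = frames[i]
            at, audio = audio_blocks[j]
            if abs(ft - at) <= tolerance:
                pairs.append((ft, frame, audio))
                i += 1
                j += 1
            elif ft < at:
                i += 1  # frame has no close audio block; skip it
            else:
                j += 1  # audio block has no close frame; skip it
        return pairs

    # synchronize([(0.00, "f0"), (0.04, "f1")], [(0.005, "a0"), (0.041, "a1")])
    # -> [(0.0, "f0", "a0"), (0.04, "f1", "a1")]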

[0016] In one implementation, the controller unit may perform the synchronization process based on certain criteria. For example, such criteria may include information about the type of image being projected, the type of audio being provided, the description of the environment and/or the projection surface, user preferences, and/or the like. These criteria may be defined by the user or predefined in the system 100. In the case where the user defines the criteria, the analyzer prompts the user to enter information regarding the criteria. Such criteria are applied to synchronize the image data with the audio data in a desired manner.
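
One way the user-defined criteria above might be represented and collected is sketched below; the field names simply mirror the examples in the paragraph (image type, audio type, environment, user preferences) and are not the patent's data model.

    # A small container for synchronization criteria, plus the prompt the
    # text says the analyzer issues when the user defines the criteria.
    from dataclasses import dataclass, field

    @dataclass
    class SyncCriteria:
        image_type: str = "still"      # e.g., "still", "dynamic", "vr"
        audio_type: str = "stereo"     # e.g., "stereo", "ambisonic"
        environment: str = "room"      # description of the projection space
        preferences: dict = field(default_factory=dict)  # user overrides

    def prompt_for_criteria() -> SyncCriteria:
        image_type = input("Image type (still/dynamic/vr): ") or "still"
        audio_type = input("Audio type (stereo/ambisonic): ") or "stereo"
        return SyncCriteria(image_type=image_type, audio_type=audio_type)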

[0017] Now turning to Figure 2, similar to the system 100 discussed in reference to Figure 1, the system 200 generally comprises a projector unit 210, an audio unit 220, and a computing device (not shown). In this example, the projector unit 210 is shown as a floor system installed along the walls of the room. However, it should be appreciated that in other examples, other suitable alignment methods or devices may be used while still complying with the principles disclosed herein. For example, the projector unit may be placed directly on the ground, adjacent or parallel to the walls. Moreover, in this example, three projector units 210 (e.g., a projector bundle) are shown. However, it should be noted that more or fewer units may be used while still complying with the principles disclosed herein.

[0018] In this implementation, the projector unit is placed in a housing unit, which is shown as a portable rectangular box. However, it should be appreciated that in other examples, other suitable shapes and sizes may be used. The housing unit may be any suitable structure for supporting the components while still complying with the principles disclosed herein. Further, in this example, the audio unit 220 is outside the housing unit for the projector unit 210. In another example, such a housing unit may host the audio unit 220, and the projector unit 210 and the audio unit 220 may both be located in the portable rectangular box. In that case, the projector unit and the audio unit are substantially hidden inside the housing unit when the subsystem is viewed from a viewing surface. In addition, in this implementation, the system 200 comprises a plurality of projector units 210 and one audio unit 220. In another implementation, there may be as many audio units as there are projector units. Moreover, in some implementations, the housing unit comprises a tilt mechanism (e.g., a hinge) that includes an axis of rotation such that the projector unit 210 and the audio unit 220 may rotate by up to a certain number of degrees. In one example, the subsystem rotates to attain an optimal projection angle for the projector unit. The projector unit may rotate to project images, interfaces, games, videos, interactive virtual reality content, and the like onto surrounding surfaces.
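
The geometry behind rotating to an optimal projection angle can be illustrated with simple trigonometry: the tilt a floor-level unit would need so its optical axis hits a chosen point on a wall. The scenario and numbers below are illustrative only, not from the patent.

    # Tilt (from horizontal) that aims a floor-level projector's optical axis
    # at a target point on a wall a given distance away.
    import math

    def tilt_angle_degrees(wall_distance_m, target_height_m):
        return math.degrees(math.atan2(target_height_m, wall_distance_m))

    # A unit 2 m from the wall aiming 1.5 m up must tilt about 36.9 degrees.
    print(tilt_angle_degrees(2.0, 1.5))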

[0019] In some implementations, the housing unit may be portable. For example, the user may choose to carry the housing unit as a wearable accessory. More specifically, the user may choose to carry the system around in an environment (e.g., a home). Further, the computing unit of the system 200 may be a portable device that moves with the user (e.g., a mobile device, tablet, smart watch, and the like). In other examples, the housing unit may have a permanent location in an environment (e.g., a room in a house). In either example, the projector unit 210 and the audio unit 220 maintain a connection with the computing unit. When the operation of the projector unit 210 or the audio unit 220 is initiated, the system 200 may confirm that the projector unit 210 or the audio unit 220 has an active connection with the computing device. In some examples, the system 200 may comprise a camera, which scans the surroundings and identifies a projection surface onto which the projector projects the image from the computing device when provided.
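
A minimal sketch of the start-up confirmation described above follows: before projection or playback begins, each unit is polled until it reports an active connection to the computing device. The unit interface, retry count, and delay are assumptions for illustration.

    import time

    def confirm_connections(units, retries=3, delay_s=1.0):
        """Return True only when every unit reports an active connection."""
        for unit in units:
            for _ in range(retries):
                if unit.is_connected():  # hypothetical method on each unit
                    break
                time.sleep(delay_s)      # give a wireless link time to recover
            else:
                return False             # this unit never responded: abort
        return True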

[0020] In some examples, the projector units 210 may project a plurality of images onto a plurality of projection surfaces. Each image on each surface may be controlled by one user or by different users. These systems may communicate with one another either directly or via the computing units that they are communicatively connected to. More specifically, the system 200 may be used to create a shared digital workspace or gaming space for remote collaboration between one or more users. Another system (with a projector unit and an audio unit) may be communicatively linked to the system 200 through any suitable connection such as, for example, an electric conductor, WI-FI, BLUETOOTH®, an optical connection, an ultrasonic connection, or some combination thereof, such that information and/or data may pass freely between the systems. During collaboration between users, images may be projected on projection surfaces associated with the systems. Further, one user may interact with the projected image on the projection surface of a first system by pointing with a hand (or other object).

[0021] Referring now to Figure 3, a flowchart of an example method executable by a system similar to the systems 100-200 described in reference to Figures 1-2 is shown in accordance with the principles disclosed herein. At block 310, a computing device instructs a projector to project an image. In one example, the image may be selected by a user. More specifically, the selection of the image may comprise identifying a category of images (e.g., still image, dynamic image) in addition to the image itself (e.g., photo, video, interactive virtual reality content). Further, the selection of the image may comprise choosing a theme (e.g., game, movie, book, etc.) for the image. Such an image is projected onto a projection surface in the environment that the system is located in.
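
The selection described at block 310 (a category of image, the content itself, and an optional theme) might be modeled as sketched below; the enum values and field names come from the examples in the paragraph and are otherwise hypothetical.

    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class Category(Enum):
        STILL = "still"        # e.g., a photo
        DYNAMIC = "dynamic"    # e.g., a video
        VR = "vr"              # interactive virtual reality content

    @dataclass
    class ImageSelection:
        category: Category
        source: str                   # path or URL of the chosen content
        theme: Optional[str] = None   # e.g., "game", "movie", "book"

    selection = ImageSelection(Category.DYNAMIC, "movies/clip.mp4", theme="movie")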

[0022] In an alternative example, the system comprises a camera, and the camera scans the surrounding area for location awareness. More specifically, the camera maps the environment (e.g., the room, the surroundings up to a certain distance) and determines unoccupied (empty) and occupied areas, including the objects within the occupied areas. Further, the camera may determine the location, size, and color of the scanned objects. Based on all of this information, in one example, the system identifies a projection surface based on at least one criterion related to the mapping of the area and the image selected by the user.
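
One plausible way to turn the camera's map into a choice of projection surface is to score candidates on the attributes the paragraph names (size, color, and whether the area is unoccupied); the weighting and all names below are assumptions, not the patent's method.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Surface:
        area_m2: float      # size of the flat region found by the camera
        whiteness: float    # 0..1, how close the surface color is to white
        unobstructed: bool  # nothing occupies the space in front of it

    def best_surface(candidates) -> Optional[Surface]:
        usable = [s for s in candidates if s.unobstructed]
        if not usable:
            return None
        # Favor large, near-white surfaces; the weights are arbitrary.
        return max(usable, key=lambda s: s.area_m2 * (0.5 + 0.5 * s.whiteness))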

[0023] At block 320, the device instructs an audio unit to provide audio in the environment. In one example, the audio is selected by the user through the computing device, and the selection is communicated to the audio unit by the computing device. At block 330, the computing device synchronizes the image with the audio based on certain criteria. In one example, the criteria may be based on user preference. For example, the user may prefer the audio to be loud when the image is a video game, or the user may prefer the audio to be soft when the image being projected is a photo slide show.
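
The user-preference example above (loud for a video game, soft for a slide show) reduces to a small mapping from content kind to audio level; the sketch below is illustrative, with defaults taken from the text's two examples.

    DEFAULT_LEVELS = {
        "game": 0.9,       # "loud when the image is a video game"
        "slideshow": 0.3,  # "soft when ... a photo slide show"
    }

    def audio_level(content_kind, user_overrides=None):
        """Volume in 0..1 for a content kind, honoring user overrides first."""
        prefs = {**DEFAULT_LEVELS, **(user_overrides or {})}
        return prefs.get(content_kind, 0.5)  # neutral default otherwise

    print(audio_level("game"))                           # 0.9
    print(audio_level("slideshow", {"slideshow": 0.5}))  # 0.5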

[0024] The above discussion is meant to be illustrative of the principles and various embodiments of the present invention. Numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.

[0025] Although the flowchart of Fig. 3 shows a specific order of performance of certain functionalities, method 300 is not limited to that order. For example, the functionalities shown in succession in the flowchart may be performed in a different order, may be executed concurrently or with partial concurrence, or a combination thereof. In some examples, features and functionalities described herein in relation to Fig. 3 may be provided in combination with features and functionalities described herein in relation to any of Figs. 1-2.
