Title:
MULTI-DEVICE VIDEO AND AUDIO
Document Type and Number:
WIPO Patent Application WO/2018/141064
Kind Code:
A1
Abstract:
A portion of a complete image is displayed on a mobile device, which includes a display and a processor, and may include connectors at defined locations for interconnecting with other devices. The method includes determining a geometry of a composite display, for display of the complete image, based on a spatial location of other device displays relative to the mobile device display, for example, based on the location of the connectors. Bounding coordinates of the mobile device display are determined within the composite display, based on the composite display geometry. The portion of the complete image is determined from the composite display geometry, the bounding coordinates, and the complete image dimensions. The complete image is generated contemporaneously with each of the other devices generating the complete image, and the portion of the complete image is displayed on the mobile device display.

Inventors:
SZETO TIMOTHY JING YIN (CA)
REYES DAVID MICHAEL LOPEZ (CA)
KUSCIK ZOLTAN (CA)
Application Number:
PCT/CA2018/050119
Publication Date:
August 09, 2018
Filing Date:
February 02, 2018
Assignee:
NANOPORT TECH INC (CA)
International Classes:
G09G5/12; G06F3/14; G10L19/008; H04W88/02
Domestic Patent References:
WO 2015/070321 A1, 2015-05-21
Foreign References:
US 9,158,333 B1, 2015-10-13
US 8,810,533 B2, 2014-08-19
US 2016/0210257 A1, 2016-07-21
US 9,684,612 B1, 2017-06-20
US 2015/0065069 A1, 2015-03-05
Other References:
OHTA ET AL.: "MovieTile: Interactively Adjustable Free Shape Multi-Display of Mobile Devices", SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications, 2 November 2015, pages 1-7, XP058075508
TAKASHI OHTA; JUN TANAKA: "Pinch: An Interface That Relates Applications on Multiple Touch-Screen by 'Pinching' Gesture", in ANTON NIJHOLT; TERESA ROMÃO; DENNIS REIDSMA (eds.): Advances in Computer Entertainment, Lecture Notes in Computer Science, vol. 7624, Springer, 2012, ISSN: 15443574, pages 320-335, XP009176351
Attorney, Agent or Firm:
SMART & BIGGAR (CA)
Claims:
WHAT IS CLAIMED IS:

1. A mobile device comprising: a display; a processor; and a memory storing processor executable instructions that when executed cause said processor to: determine a geometry of a composite display, for display of a complete image, based on at least a spatial location of each display of one or more other devices relative to the display of the mobile device; determine bounding coordinates of the display of the mobile device within the composite display, based on the geometry of the composite display; determine a portion of the complete image that is to be displayed on the display of the mobile device from the geometry of the composite display, the bounding coordinates, and dimensions of the complete image; and display the portion of the complete image on the display of the mobile device, contemporaneously with each of the one or more other devices displaying a determined portion of the complete image for display on that other device.

2. The mobile device of claim 1, further comprising: a plurality of connectors each for interconnecting the mobile device with at least one of the one or more other devices, each of the plurality of connectors located in a defined location on the mobile device and configured to provide an indication detectable by the processor of when a connection to one of the other devices is made or lost, wherein the one or more other devices are detected as interconnected with the mobile device by way of at least one of the plurality of connectors, and the spatial location of each display of the one or more other devices relative to the display of the mobile device is determined based on at least the defined location of the at least one of the plurality of connectors.

3. The mobile device of claim 1, wherein the memory further stores processor executable instructions that when executed cause the processor to: synchronize the mobile device to display the portion of the complete image in synchronism with the portion of the complete image for display on each of the one or more other devices being displayed on each of the one or more other devices.

4. The mobile device of claim 1, wherein the memory further stores processor executable instructions that when executed cause the processor to: synchronize the mobile device to generate the complete image at the mobile device in synchronism with generating the complete image at each of the one or more other devices.

5. The mobile device of claim 1, wherein the memory further stores processor executable instructions that when executed cause the processor to: maintain a software state consistent with a software state at each of the one or more other devices, the software state defining the complete image that is to be displayed across the display of the mobile device and the displays of each of the one or more other devices on the composite display.

6. The mobile device of claim 5, wherein the memory further stores processor executable instructions that when executed cause the processor to: send notification of a user input to each of the one or more other devices for simulation at the one or more other devices, the user input defining the software state.

7. A computer-implemented method for displaying a portion of a complete image on a display of a mobile device, the mobile device comprising a processor, said method comprising: determining a geometry of a composite display, for display of the complete image, based at least on a spatial location of each display of one or more other devices relative to the display of the mobile device; determining bounding coordinates of the display of the mobile device within the composite display, based on the geometry of the composite display; determining the portion of the complete image that is to be displayed on the display of the mobile device from the geometry of the composite display, the bounding coordinates, and dimensions of the complete image; generating the complete image, contemporaneously with each of the one or more other devices generating the complete image; and displaying the portion of the complete image on the display of the mobile device.

8. The computer-implemented method of claim 7, wherein: the mobile device further comprises a plurality of connectors each for interconnecting the mobile device with the one or more other devices, each of the plurality of connectors at a defined location on the mobile device and configured to provide an indication detectable by the processor of when a connection to one of the other devices is made or lost; the one or more other devices are detected as interconnected with the mobile device by way of at least one of the plurality of connectors; and the spatial location of each display of the one or more other devices relative to the display of the mobile device is determined based on at least the defined location of the at least one of the plurality of connectors.

9. The computer-implemented method of claim 7, wherein displaying the portion of the complete image on the display of the mobile device is contemporaneous with each of the one or more other devices displaying a determined portion of the complete image for display on that other device.

10. The computer-implemented method of claim 9, further comprising: synchronizing the mobile device to display the portion of the complete image in synchronism with the determined portion of the complete image for display on each of the one or more other devices being displayed on each of the one or more other devices.

11. The computer-implemented method of claim 7, further comprising: synchronizing the mobile device to generate the complete image in synchronism with generating the complete image at each of the one or more other devices.

12. The computer-implemented method of claim 7, further comprising: maintaining a software state consistent with a software state at each of the one or more other devices, the software state defining the complete image that is to be displayed across the display of the mobile device and the display of each of the one or more other devices on the composite display.

13. The computer-implemented method of claim 12, further comprising: sending notification of a user input to each of the one or more other devices for simulation at each of the one or more other devices, the user input defining the software state.

14. The method of any one of claims 7 to 13, further comprising: from the geometry of the composite device, determining a stream of a multi-stream audio that is to be reproduced at the mobile device with a complementary stream of the multi-stream audio to be reproduced at another device of the composite device; and contemporaneously reproducing at the device the stream of the multi-stream audio, as determined, in synchronism with the complementary stream at the another device to provide multi-stream audio to a user.

15. A method of displaying a complete image on a display surface formed by placing multiple displays of multiple mobile devices in proximity to each other, said method comprising: determining a geometry of the display surface, based at least on the relative locations of the displays of the multiple devices and the dimensions of each of the displays; determining, for each device, a bounding region of the complete image that delineates a portion of the complete image that is to be displayed on the display of that device, based on the geometry of the display surface and the dimensions of the complete image; generating, at each of the multiple mobile devices, the complete image; and at each of the multiple mobile devices, displaying the portion of the complete image within the bounding region for that device, thereby forming the complete image on the display surface and across the multiple displays.

16. The method of claim 15, further comprising: synchronizing each of the multiple mobile devices to generate the complete image in synchronism.

17. The method of claim 15, further comprising: synchronizing each of the multiple mobile devices to display the portion of the complete image within the bounding region for that device in synchronism.

18. A computer-implemented method for reproducing multi-channel audio at a composite device comprising first and second audio capable devices interconnected with each other, each device comprising a processor, said method comprising: determining a geometry of a composite device, based at least on a spatial location of each of the first and second audio capable devices relative to each other; determining a stream of the multi-stream audio that is to be reproduced at each of the first and second devices, from the geometry of the composite device; and contemporaneously reproducing at each of the first and second devices the stream of the multi-stream audio in synchronism, as determined, to provide multi-stream audio to a user.

19. The computer-implemented method of claim 18, wherein each of the first and second devices comprises magnetic connectors, and wherein said first and second devices are interconnected by said magnetic connectors.

20. The computer-implemented method of claim 18, wherein said multi-stream audio comprises left and right channels, and wherein said left channel is reproduced on the left-most of said first and second audio capable devices in said composite device, and wherein said right channel is reproduced on the right-most of said first and second devices in said composite device.

21. The computer-implemented method of claim 20, wherein each of said right and left channels is reproduced in two speakers at each of said first and second devices, respectively.

22. The computer-implemented method of claim 20, wherein each of said right and left channels is reproduced in one of at least two available speakers at each of said first and second devices, respectively.

23. The method of claim 18, further comprising: synchronizing each of the first and second devices to generate the multi-channel audio in synchronism.

24. A mobile device comprising: a processor; an audio decoder, for decoding multi-stream audio; and a memory storing processor executable instructions that when executed cause said processor to: determine a geometry of a composite device including the mobile device and other devices, based on at least a spatial location of the other devices relative to the location of the mobile device; determine a selected stream of the multi-stream audio that is to be reproduced at the mobile device, from the geometry of the composite device, with remaining streams of the multi-stream audio to be reproduced at other devices of the composite device; and reproduce at the mobile device the selected stream of the multi-stream audio in synchronism with complementary streams at the other devices, to provide multi-channel audio to a user.

25. The mobile device of claim 24, further comprising two speakers, and wherein the selected stream is reproduced at both said two speakers.

Description:
MULTI-DEVICE VIDEO AND AUDIO

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] The present application relates to, and claims priority from, U.S. provisional application no. 62/454,477, filed February 3, 2017; U.S. provisional application no. 62/516,153, filed June 7, 2017; and U.S. provisional application no. 62/563,018, filed September 25, 2017, the contents of which are hereby incorporated by reference.

TECHNICAL FIELD

[0002] The following relates to mobile devices that may interact with each other and are capable of determining spatial relationships to interconnected devices, and to related methods.

BACKGROUND

[0003] Mobile computing devices (e.g., mobile phones, tablets, laptop computers, etc.) are usually provided with a plurality of connection options which allow the devices to communicate with other devices electronically, or to receive or supply energy to the other devices (including obtaining energy from a power supply), or to add functionality to the devices, such as to connect the device to a peripheral device (e.g., a keyboard, a mouse, speakers, etc.).

[0004] Conventional devices allow for content to be presented that spans the displays of interconnected devices, but may require significant system resources. For example, a fully rendered image displayed by a rectangular array of pixels arranged in 800 rows by 480 columns, with the color of each pixel represented by twenty-four bits per pixel, requires over 9.2 megabits (or 1152 kilobytes) of storage in memory.
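For concreteness, the arithmetic in the preceding paragraph can be checked directly; the values below are taken from the text itself, and the snippet is merely a verification, not part of the described method:

```python
# Check of the framebuffer arithmetic above: an 800 x 480 pixel image
# at 24 bits per pixel.
rows, columns, bits_per_pixel = 800, 480, 24

total_bits = rows * columns * bits_per_pixel   # 9,216,000 bits (> 9.2 Mbit)
total_bytes = total_bits // 8                  # 1,152,000 bytes
print(total_bits, total_bytes // 1000)         # 9216000 1152 (kilobytes)
```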

[0005] While conventional communication channels such as Universal Serial Bus (USB) release 3.0 can support data transfer rates of up to 5 Gbit/s, transmission of fully rendered image files across interconnected devices may strain system resources, in particular the bandwidth needed to support transmission between interconnected devices, for example, in real time.

[0006] Accordingly, there is a need for improved display of content, such as an image or a frame of a video, spanning the displays of multiple interconnected devices without consuming unnecessary bandwidth between the interconnected devices.

[0007] Likewise, there is a need for improved presentation of audio across the multiple interconnected devices.

SUMMARY

[0008] According to an aspect, there is provided a mobile device comprising: a display; a processor; and a memory storing processor executable instructions that when executed cause said processor to: determine a geometry of a composite display, for display of a complete image, based on at least a spatial location of each display of one or more other devices relative to the display of the mobile device; determine bounding coordinates of the display of the mobile device within the composite display, based on the geometry of the composite display; determine a portion of the complete image that is to be displayed on the display of the mobile device from the geometry of the composite display, the bounding coordinates, and dimensions of the complete image; and display the portion of the complete image on the display of the mobile device, contemporaneously with each of the one or more other devices displaying a determined portion of the complete image for display on that other device.

[0009] According to another aspect, there is provided a computer-implemented method for displaying a portion of a complete image on a display of a mobile device, the mobile device comprising a processor, said method comprising: determining a geometry of a composite display, for display of the complete image, based at least on a spatial location of each display of one or more other devices relative to the display of the mobile device; determining bounding coordinates of the display of the mobile device within the composite display, based on the geometry of the composite display; determining the portion of the complete image that is to be displayed on the display of the mobile device from the geometry of the composite display, the bounding coordinates, and dimensions of the complete image; generating the complete image, contemporaneously with each of the one or more other devices generating the complete image; and displaying the portion of the complete image on the display of the mobile device.

[0010] According to another aspect, there is provided a method of displaying a complete image on a display surface formed by placing multiple displays of multiple mobile devices in proximity to each other, said method comprising: determining a geometry of the display surface, based at least on the relative locations of the displays of the multiple devices and the dimensions of each of the displays; determining, for each device, a bounding region of the complete image that delineates a portion of the complete image that is to be displayed on the display of that device, based on the geometry of the display surface and the dimensions of the complete image;

generating, at each of the multiple mobile devices, the complete image; and at each of the multiple mobile devices, displaying the portion of the complete image within the bounding region for that device, thereby forming the complete image on the display surface and across the multiple displays.

[0011] According to another aspect, there is provided a computer-implemented method for reproducing multi-channel audio at a composite device comprising first and second audio capable devices interconnected with each other, each device comprising a processor, said method comprising: determining a geometry of a composite device, based at least on a spatial location of each of the first and second audio capable devices relative to each other; determining a stream of the multi-stream audio that is to be reproduced at each of the first and second devices, from the geometry of the composite device; and contemporaneously reproducing at each of the first and second devices the stream of the multi-stream audio in synchronism, as determined, to provide multi-stream audio to a user.

[0012] According to another aspect, there is provided a mobile device comprising: a processor; an audio decoder, for decoding multi-stream audio; and memory storing processor executable instructions that when executed cause the processor to: determine a geometry of a composite device including the mobile device and other devices, based on at least a spatial location of the other devices relative to the location of the mobile device; determine a selected stream of the multi-stream audio that is to be reproduced at the mobile device, from the geometry of the composite device, with remaining streams of the multi-stream audio to be reproduced at other devices of the composite device; and reproduce at the mobile device the selected stream of the multi-stream audio in synchronism with complementary streams at the other devices, to provide multi-channel audio to a user.
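As a minimal sketch of the channel-selection idea in this aspect (and in claims 18 to 25 above), the simple two-device case might be expressed as follows; the function and parameter names are illustrative assumptions, not the application's own code:

```python
# Assumed sketch: the left-most device of a composite device reproduces
# the left channel and the right-most device the right channel, per the
# geometry-based stream selection described above.
def select_stream(own_x_offset: float, peer_x_offset: float) -> str:
    """Pick this device's channel from its horizontal position
    relative to the peer within the composite device."""
    return "left" if own_x_offset < peer_x_offset else "right"

# A device at x = 0 mm sits left of a peer at x = 70 mm, so it takes "left".
print(select_stream(0.0, 70.0))  # left
```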

[0013] Other features will become apparent from the drawings in conjunction with the following description.

BRIEF DESCRIPTION OF THE DRAWINGS

[0014] In the figures which illustrate example embodiments,

[0015] FIG. 1 is a schematic block diagram of two interconnected mobile computing devices located in proximity to one another that each displays a portion of a complete image;

[0016] FIG. 2 is a block diagram of example hardware components of a first mobile computing device of FIG. 1, according to an embodiment;

[0017] FIG. 3 is a block diagram of example software components in the first mobile computing device of FIG. 1, according to an embodiment;

[0018] FIG. 4 depicts an example of a connectivity data store at the first mobile computing device of FIG. 1, according to an embodiment;

[0019] FIG. 5 depicts an example of a display coordinates data store at the first mobile computing device of FIG. 1, according to an embodiment;

[0020] FIG. 6 is a flow chart illustrating determination of a portion of an image to display on a device in a multi-device display, according to an embodiment;

[0021] FIG. 7 is a flow chart illustrating determination of a portion of a next complete image to display on a device in a multi-device display, according to an embodiment;

[0022] FIG. 8A is a schematic block diagram of two interconnected mobile computing devices of FIG. 1, cooperatively displaying, across the devices, a visual representation of a calculator application in an initial state;

[0023] FIG. 8B is a schematic block diagram of two interconnected mobile computing devices of FIG. 8A, cooperatively displaying, across the devices, a visual representation of the calculator application in a second state;

[0024] FIG. 9 is a diagram of a complete image and an image portion for display, stored on each of the respective interconnected computing devices in the second state of FIG. 8B;

[0025] FIG. 10 is a schematic block diagram of two mobile computing devices in a calibration mode, in preparation for cooperatively displaying, across the devices, a complete image;

[0026] FIG. 11 is a schematic block diagram of two interconnected devices located in proximity to one another that each reproduce a channel of multi-stream audio;

[0027] FIG. 12 is a schematic block diagram of an audio splitter/decoder and channel selector of the devices of FIG. 11;

[0028] FIG. 13 is a schematic block diagram of three interconnected devices located in proximity to one another that each reproduce a channel of multi-stream audio; and

[0029] FIGS. 14 and 15 are schematic block diagrams of two multi-speaker devices located in proximity to one another that may each reproduce a channel of multi-stream audio.

DETAILED DESCRIPTION

[0030] For convenience, like reference numerals in the description refer to like elements in the drawings.

[0031] FIG. 1 depicts two devices 100 and 102, each including a housing 104 defined by respective external surfaces 106. Devices 100, 102 can be any suitable electronic devices that interface with one another to provide complementary functions as described herein. At least one of the devices 100, 102 may be a mobile computing device. For clarity in the discussion below, mobile computing devices are commonly referred to as "mobile devices" or "devices" for brevity.

[0032] Example mobile devices include, without limitation, cellular phones, cellular smart-phones, wireless organizers, pagers, personal digital assistants, computers, laptops, handheld wireless communication devices, wirelessly enabled notebook computers, portable gaming devices, tablet computers, or any other portable electronic device with processing and communication capabilities. In at least some embodiments, mobile devices as referred to herein can also include, without limitation, peripheral devices such as displays, printers, touchscreens, projectors, digital watches, cameras, digital scanners and other types of auxiliary devices that may communicate with another computing device.

[0033] In one example, each of devices 100, 102 may be a smartphone, or one may be a smartphone and the other a peripheral device (e.g., a speaker, a keyboard, a display screen, a camera). In another example, one device may be a touchscreen enabled device and the other a type of communication device (e.g., a router) for connecting to other devices. As will be apparent, other types of computing devices 100 and 102 can be envisaged that benefit from interconnection and interoperability.

[0034] Further, in some embodiments, for example as depicted in FIG. 1, devices 100 and 102 may be of the same type - generally identical in structure and components. In other embodiments exemplified below, device 100 (or a similar device) may interoperate with other different yet compatible devices, in a manner exemplified herein.

[0035] Each of devices 100, 102 may include a user interface or input interface such as a touch display 110A, 110B, respectively, that cooperates with another when the spatial locations of the devices are established relative to one another (e.g., to provide one larger touch screen), collectively forming a composite display 110. Each device may further be capable of producing audio for listening by a user, as further detailed below.

[0036] Touch displays 110A, 110B may, for example, be capacitive display screens that include a touch sensing surface. These may be integrated as a single component. Alternatively, touch displays 110A, 110B may include suitably arranged separate display and touch components. Touch displays 110A, 110B may be adapted for sensing a single touch, or alternatively, multiple touches simultaneously. Touch displays 110A, 110B may sense touch by, for example, fingers, a stylus, or the like. Touch displays 110A, 110B may return the coordinates of any touch or touches for use by a processor of device 100 or 102. Likewise, touch displays 110A, 110B may be used to display pixelated graphics - in the form of computer rendered graphics, video and the like.

[0037] In some embodiments, composite display 110, as a larger interconnected screen, allows input to be received on either one of touch displays 110A, 110B of devices 100 and 102.

[0038] Each of mobile devices 100 and 102 includes respective connectors 120 and 122 for allowing interconnection of devices 100 and 102. In the example illustrated in FIG. 1, device 100 includes four connectors 120A, 120B, 120C, 120D (individually and collectively connector(s) 120) and device 102 includes four connectors 122A, 122B, 122C, 122D (individually and collectively connector(s) 122).

[0039] Connectors 120 and connectors 122 may for example be physical connectors to a serial communications port, such as a USB port, or the like. In a particular embodiment, connectors 120 and 122 may be magnetic connectors, as detailed in PCT Publication No. WO 2015/070321, the contents of which are hereby incorporated by reference.

[0040] Although connectors 120 and 122 have been shown at the corners of each edge of devices 100 and 102, other locations of connectors 120 and 122 may be envisaged. For example, connectors on each of devices 100 and 102 can be located at the centre of the top, bottom, left and right edges of the devices, as for example illustrated in US Patent Application No. 15/013,750, the contents of which are hereby incorporated by reference. Additionally, although four connectors have been shown, the number of connectors provided on devices 100 and 102 may vary from device to device, and may depend on the type of device 100, 102.

[0041] Devices 100 and 102 shown in FIG. 1 have been illustrated in combination, with particular exemplary connector and device form factor and geometry. Of course, alternate configurations, layout, and positioning for the connectors and alternate size and layout of the devices are possible.

[0042] Similarly, although two interconnected devices 100, 102 are shown in FIG. 1 , multiple (e.g., three or more) interconnected devices can be envisaged having alternate connector configurations, layout, and position and alternate size and layout of device 100. Other geometries of the devices, for example, generally rectangular with rounded corners, oval, or rounded in shape, may be contemplated by a person skilled in the art. Example devices having different geometries are for example illustrated in US Patent Application No. 15/013,750.

[0043] A combined device coordinate system may be associated with the combination of interconnected devices 100, 102. For example, each rectangular device may have a width and a length, the dimensions of which can be expressed in millimetres. Using a rectangular system of coordinates, and defining an origin at a point (e.g., the bottom-left corner of device 100, designated as O in FIG. 1), various points in a combined device coordinate system may be represented by a coordinate defined by values along an x-axis and y-axis, for example, in millimetres, extending from the origin O. As would be understood by a person skilled in the art, other two-dimensional coordinate systems and other units of distance may be used to define points on the combination of interconnected devices 100, 102, forming a composite device.

[0044] In the discussion herein, in reference to a rectangular coordinate system with an x-axis and y-axis originating at the bottom-left corner of device 100, as illustrated in FIG. 1, movement may be characterized as "rightward" in a positive direction along the x-axis, and "upward" or "vertically" in a positive direction along the y-axis, akin to the layout shown, for example, in FIG. 1.

[0045] Composite display 110 and touch displays 110A, 110B shown in FIG. 1 have been illustrated with particular exemplary connector and device form factor and geometry. Of course, alternate configurations, layout, and positioning for the connectors and alternate size and layout of the devices are possible.

[0046] Composite display 110 may be defined at particular coordinates within a combined device coordinate system of devices 100, 102. For example, composite display 110 may be designated at minimum and maximum extents in the x and y axes within the combined device coordinate system of devices 100, 102.

[0047] Composite display 110 may have its own associated coordinate system, using the visual display of composite display 110 and its pixels as a frame of reference. For example, composite display 110 may have a width and a length, the dimensions of which can be expressed in pixels. Using a rectangular system of coordinates, and defining an origin at a point (e.g., the bottom-left corner of the display), various points on the display may be represented by a coordinate defined by values along an x-axis and y-axis, for example, in pixels, extending from the origin. As would be understood by a person skilled in the art, other two-dimensional coordinate systems may be used to define points on a display.

[0048] Each of touch displays 110A, 110B may be defined at particular coordinates within the coordinate system of display 110. For example, a bottom-left corner of display 110A on display 110 may be designated at (x,y) coordinates on device 100 of (0,0), namely, the bottom-left corner of display 110A is at the origin point at the bottom-left corner of composite display 110.

[0049] Each touch display 110A, 110B may have its own associated coordinate system, using the visual display of each display 110A, 110B and its pixels as a frame of reference. For example, a display may have a width and a length, the dimensions of which can be expressed in pixels. Using a rectangular system of coordinates, and defining an origin at a point (e.g., the bottom-left corner of the display), various points on the display may be represented by a coordinate defined by values along an x-axis and y-axis, for example, in pixels, extending from the origin.

[0050] As disclosed in US Patent Application No. 15/013,750, device 100 may maintain connectivity information for each of its connectors 120 in a data store that may exist in memory, as discussed in further detail below, and that may be used to determine the spatial relationship of devices (e.g., device 102) that are interconnected (e.g., mechanically and/or electrically and/or wirelessly) to device 100.

[0051] The connectivity information for mobile device 100 can include information about whether a connection exists for each physical connector 120 on mobile device 100 with another device (e.g., device 102), and the defined relative physical location of each of connectors 120 on device 100 (e.g., x, y parameters relative to the device, general location descriptors such as top, bottom, left, right).

[0052] Based on knowledge of the location of connectors 120, the spatial location of device 102 relative to other devices (e.g., device 100), for example within a combined device coordinate system, may be deduced. For example, interconnection with connector 120B may allow deduction that device 102 is connected to the right of device 100. Additionally, this connectivity information may optionally be augmented with more specific information about interconnected devices (e.g., size of any interconnected device, type of device, device identification information, location of connectors on an interconnected device, and devices interconnected with an interconnected device, etc.). Furthermore, knowledge of the location of components such as user interfaces on devices 100, 102 may be used to deduce the relative spatial locations of the user interfaces, for example, touch displays 110A, 110B of devices 100, 102.
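A rough sketch of this deduction follows; the connector-to-side table is an assumption for a layout like FIG. 1, not data taken from the application:

```python
# Hypothetical mapping from the defined location of a local connector to
# the side of device 100 on which an interconnected device must sit.
CONNECTOR_SIDE = {
    "120A": "left", "120B": "right",
    "120C": "right", "120D": "left",
}

def side_of_connection(connector_id: str) -> str:
    """Return the side of device 100 on which the peer is attached."""
    return CONNECTOR_SIDE[connector_id]

# A connection sensed at connector 120B implies device 102 is to the right.
print(side_of_connection("120B"))  # right
```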

[0053] In the example of FIG. 1, connectors 120B and 122A, as well as 120C and 122D, are physically (e.g., mechanically) connected to one another in a side by side arrangement. In addition to the physical/mechanical connection, devices 100 and 102 are in data communication with one another.

[0054] Such data communication may occur through a communication channel established through electrical conduction of signals between electrical contacts of the respective interconnected connectors (e.g., connectors 120B and 122A, and/or connectors 120C and 122D). This type of connection may be provided when a USB compatible bus is established through the interconnected device connectors (e.g., connectors 120B and 122A). Alternatively, data communication may be made through suitable wireless interfaces at devices 100, 102 - for example, established as a result of the proximity of device 100 to device 102. Possible wireless interfaces include WiFi interfaces; Bluetooth interfaces; NFC interfaces; and the like. Extremely high frequency (EHF) communication is also contemplated. An example of such EHF communications is described in http://keyssa.com and U.S. Patent Publication No. 2015/0065069, both of which are hereby incorporated by reference in their entirety. Other forms of wireless interfaces/communication will be apparent to those of ordinary skill in the art.

[0055] Once a mechanical/physical connection is established between respective connectors (e.g., connectors 120B, 122A), devices 100, 102 can sense the physical interconnection (e.g., directly via the connectors and/or with external sensors), as for example disclosed in International PCT Application No. PCT/CA2017/050055, the contents of which are hereby incorporated by reference. In embodiments in which connectors 120, 122 provide mechanical connection and data connectivity, a change in the electrical characteristics at the electrical contacts of the respective interconnected connectors (e.g., connectors 120B and 122A), such as but not limited to a change in voltage, impedance, etc., can be used to indicate a physical coupling of the respective connectors (e.g., 120B and 122A).

[0056] In other embodiments, devices 100, 102 may communicate using extremely short range wireless communication, and devices 100, 102 can detect an EHF signal (e.g., received from an interconnected device 102 at device 100) which can be used to indicate that the electronic connector elements (e.g., as contained within connectors 120B, 122A) are located within a few millimetres of one another.

[0057] In some embodiments, connectors 120 and 122 include magnets utilized to physically connect devices 100 and 102 both mechanically and electrically (as discussed in PCT Publication No. WO 2015/070321). In other embodiments, at least some of connectors 120 may be adapted to physically mate with particular ones of respective connectors 122 such that when mated, connectors 120 and 122 allow interconnected devices 100 and 102 to connect both mechanically and/or electrically. In this embodiment, connectors 120 may optionally allow device 100 to transfer or receive power and/or data to or from interconnected devices such as device 102.

[0058] In some embodiments, sensors (e.g., Hall Effect sensors) on devices 100, 102 can be used to detect a magnetic field of one or more magnets in a proximate connector 120, 122. Such sensors may be integrated within each of connectors 120, 122 or provided as a separate external component. Other mechanical sensors may alternatively be used. For example, if a connector (e.g., connector 120B) includes a moveable magnetic element, a pressure sensor (not shown) can be used to detect attractive force of another connector (e.g., connector 122A) on that element and thereby detect a mechanical connection of the connectors 120B and 122A, as for example disclosed in International PCT Application No. PCT/CA2017/050055.

[0059] An indication of the physical/mechanical connectivity of devices 100 and 102 by way of one or more connectors 120, 122 can trigger a first device 100 to determine the relative spatial location of an interconnected device 102 relative to the first device 100, as for example detailed in US Patent Application No. 15/013,750. Likewise, device 102 may perform a similar method, and also determine the relative spatial location of interconnected device 100. As noted above, such relative spatial location information may be stored in a data store.

[0060] Displays 110A, 110B, collectively composite display 110, may be operable to display content spanning displays 110A, 110B of both devices 100, 102. Such content may include, for example, an image, or image frames in sequence forming a video stream, such images representing an operating system user interface or an application user interface. Other content will be contemplated by a person skilled in the art.

[0061] As shown in FIG. 1, touch display 110A on device 100 may display an image portion 130A, and touch display 110B on device 102 may display an image portion 130B. Collectively, image portions 130A, 130B form a complete image 130 as displayed across touch displays 110A, 110B.

[0062] As would be understood by a person skilled in the art, complete image 130, as well as image portions 130A and 130B, may be represented as a fully drawn or rendered image, namely a rectangular array of picture elements (pixels) arranged in rows and columns, with a bitmap stored in memory defining the value of each pixel in a drawn image. The rectangular array may represent a rectangular coordinate system. As would be understood by a person skilled in the art, other two-dimensional coordinate systems may be used.

[0063] Complete image 130 may be, for example, a representation of a single image, or a single frame of a video stream containing multiple image frames, the video stream having a next complete image in a subsequent frame.

[0064] Complete image 130 may be an image generated by a software application running on device 100, representing a software state, and contain user interface elements that a user may interact with. Other images that may be illustrated by complete image 130 will be understood by a person skilled in the art. [0065] The dimensions for a complete image 130 available to a software application may be governed by the size and characteristics of composite display 110.

[0066] The relationship between complete image 130 and each of image portions 130A, 130B, and their relative positions, sizes and configurations, may be defined as described below with reference to FIG. 6.

[0067] FIG. 2 is a simplified block diagram of a mobile device 100 (an example mobile computing device), according to an example embodiment. Mobile device 100 includes a processor 202, display 110A, an I/O interface 208, connectors 120, a communication subsystem and network interface 210 which allows communication with external devices (e.g., interconnected devices such as device 102), and a memory 212.

[0068] Processor 202 controls the overall operation of mobile device 100. Communication functions, including data and voice communications, are performed through communication subsystem and network interface 210.

[0069] Processor 202 is exemplified in the embodiments described herein as a central processing unit (CPU) with "integrated graphics" processing to render data into an image in pixels for display and, for example, for carrying out the instructions of the modules as described below with reference to FIG. 3.

[0070] As would be understood by a person skilled in the art, in some embodiments processor 202 may also comprise a separate and discrete graphics processing unit (GPU), for example as part of a graphics card (not shown) in an alternate hardware layout, designed to perform the mathematical and geometrical calculations necessary for graphics rendering, and typically handling computations only for graphics.

[0071] Processor 202 may include an integrated Digital Signal Processor (DSP) adapted for video encoding and decoding.

[0072] As would further be understood by a person skilled in the art, in some embodiments processor 202, I/O interface 208, communication subsystem and network interface 210 and memory 212 may be part of a system-on-a-chip, such as the Qualcomm™ Snapdragon 810 processor, that integrates a number of hardware components on a single integrated circuit in an alternate hardware layout.

[0073] Communication subsystem and network interface 210 enables device 100 to communicate with other devices (e.g., device 102). In some embodiments, device 100 may communicate with device 102 via connectors 120 by way of a bus or point to point communications (as shown in FIG. 2). Additionally, device 100 may further communicate with device 102 via communication subsystem and network interface 210.

[0074] In other embodiments, connectors 120 provide a mechanical/physical connection and the data connection between devices 100 and 102 is established instead via the communication subsystem and network interface 210 (e.g., using wireless communications such as WiFi, Bluetooth, Wireless USB, capacitive coupling communications). In such embodiments, connectors 120 may not be connected to I/O interface 208 (not shown). In addition to establishing data communication between devices 100, 102 and communicating regarding whether device 100 is interconnected to device 102, wireless data communication can also be used to share connectivity information (e.g., for establishing data communications) prior to any mechanical connections being made.

[0075] In one example, connectors 120 of device 100 may utilize communication subsystem 210 to receive messages from and send messages to interconnected devices (e.g., request and receive additional spatial information from interconnected devices, such as from device 102). Accordingly, in one embodiment, device 100 can communicate with other interconnected devices using a USB or other direct connection, as may be established through connectors 120, 122. In another embodiment, device 100 communicates with interconnected devices (e.g., device 102) using Bluetooth, NFC, or other types of wireless communications as envisaged by a person skilled in the art.

[0076] Memory 212 may include a suitable combination of any type of electronic memory that is located either internally or externally such as, for example, flash memory, random-access memory (RAM), read-only memory (ROM), compact disc read-only memory (CD-ROM), electro-optical memory, magneto-optical memory, erasable programmable read-only memory (EPROM), and electrically-erasable programmable read-only memory (EEPROM), or the like.

[0077] I/O interface 208 enables device 100 to communicate via connectors 120, e.g., to exchange data and establish communication with other devices 102. I/O interface 208 may also enable device 100 to interconnect with various input and output peripheral devices. As such, device 100 may include one or more input devices, such as a keyboard, mouse, camera, touch screen (e.g., display 110A), and a microphone, and may also include one or more output devices such as a display screen (e.g., display 110A) and a speaker.

[0078] Device 100 may be adapted to operate in concert with one or more interconnected devices (e.g., device 102). In particular, device 100 includes an operating system and software components, which are described in more detail below with reference to FIG. 3. Device 100 may store the operating system and software code in memory 212 and execute that software code at processor 202 to adapt it to operate in concert with one or more interconnected devices (e.g., device 102). The software code may be implemented in a high level procedural or object oriented programming or scripting language, or a combination thereof. The software code may also be implemented in assembly or machine language.

[0079] As exemplified in PCT Publication No. WO 2015/070321, device 100 and an interconnected device (e.g., device 102) may each store software code which, when executed, provides a coordinator at each of devices 100, 102 which performs various functions, including detection and registration of devices connected to each of devices 100, 102. Additionally, a coordinator of each device 100, 102 may coordinate task sharing between devices and task assignment from one device (e.g., device 100) to another (e.g., device 102). The coordinator may also coordinate data transfer between the devices 100, 102. Thus, a coordinator at a first device 100 can communicate with a coordinator at other devices (e.g., device 102) by way of a bus or a network or both (not shown). By way of these communications, the respective coordinators of devices 100, 102 may establish a peer-to-peer relationship or a master-slave relationship, depending on the nature of the desired communication as may be established between device 100 and/or interconnected devices 102.

[0080] Those skilled in the art will appreciate that portions of an operating system, for example operating system 300 described below, and remaining software components, such as specific device applications, or parts thereof, may be temporarily loaded into a volatile store forming part of memory 212. Memory 212 or a portion thereof may be on processor 202. Other software components can also be included, as is well known to those skilled in the art.

[0081] FIG. 3 illustrates an organizational block diagram of software components at device 100/102 as stored within the memory of FIG. 2 for allowing detection of spatial relationships of other interconnected mobile devices (e.g., device 102). As illustrated, software components include an operating system 300, connectivity module 302, a device identification module 304, a communication module 306, a spatial relationship synthesizer module 308, a connectivity data store 312, a synchronizing module 314, and a display coordinates data store 322. Connectivity data store 312 includes information related to one or more of: connectivity, device and connector information for device 100. The operating system and components may be loaded from persistent computer readable memory onto device 100/102.

[0082] Operating system 300 may allow basic communication and application operations related to the mobile device. Generally, operating system 300 is responsible for determining the functions and features available at device 100, such as keyboards, touch screen, synchronization with applications, email, text messaging and other communication features as will be envisaged by a person skilled in the art. In an embodiment, operating system 300 may be Android™ operating system software, Linux operating system software, BSD derivative operating system software, iOS™ operating system software, or any other suitable operating system software. In embodiments in which an Android operating system platform is in use, software components described herein may be implemented using features of a framework API (Application Programming Interface) for the Android platform.

[0083] In some embodiments, device 100 and device 102 may have different operating systems that are capable of displaying content in manners contemplated herein. For example, an operating system on one device could be Android and the other iOS.

[0084] Connectivity module 302 operates in conjunction with connectors 120, and coordinates detection of when a connection is made or lost at each of the connectors 120 on device 100. Connectivity module 302 further maintains data store 312 which includes connectivity information that indicates whether a connection exists for each of the connectors 120 on the mobile device 100. Data store 312 may have any suitable format within memory 212. Further, in response to sensing that a new connection has been made or lost with a particular connector 120, connectivity module 302 updates the connectivity information in data store 312. Examples of such connectivity information are shown within data store 312 in FIG. 4.

[0085] Device identification module 304 causes processor 202 to store connector information including a pre-defined physical location of each of connectors 120 relative to the device (e.g., x-y parameters indicating location; general location parameters TOP-LEFT, TOP-RIGHT, BOTTOM-RIGHT, BOTTOM-LEFT) within memory 212. The pre-defined physical location of each of the connectors may be defined upon fabrication and/or programming of device 100 and/or connectors 120.

[0086] Device identification module 304 further maintains and/or updates device information including, for example, the type of connectors 120 and potential types of devices that can be coupled to each connector 120 (e.g., smartphone, peripheral devices, etc.) within memory 212. The relative physical location of each connector 120 is typically known with reference to a coordinate system attached to device 100 (e.g., extending in millimetres from a defined corner). Examples of connector information indicating the relative location of connectors 120 are also shown in connectivity data store 312 of FIG. 4.

[0087] Additionally, in an embodiment, device identification module 304 further includes device information, such as but not limited to: size of device 100 (e.g., 100 mm x 200 mm), type of device (e.g., model), display 110 characteristics (e.g., pixel size, pixel colour depth, pitch, etc.) and other device information that may be used to derive spatial information. In another exemplary embodiment, device identification module 304 further includes information about the location of touch sensors on device 100 (e.g., relative to the device's coordinate system). The device information may be stored in memory 212. The location information of the touch sensors may be predefined (e.g., upon fabrication and/or programming of device 100) and stored within memory 212.

[0088] Thus, based on connector information provided by device identification module 304 (e.g., connector locations on the device, device type, device size, and touch screen information), connectivity module 302 can determine the relative spatial location of each of the other devices interconnected to mobile device 100. In the example configuration of FIG. 1, connectivity module 302 indicates, by way of the information in connectivity data store 312 shown in FIG. 4, that interconnected device 102 is located on the right side of device 100. By default, connectivity module 302 may assume that interconnected device 102 has the same characteristics, for example, device type, device size, touch display, as device 100. Additional information (e.g., device type, device size, and user interface or touch display information) can be provided by interconnected devices via communication module 306, and used by connectivity module 302 to further refine the determined relative spatial location of each of the other devices interconnected to mobile device 100 and for use by software applications of the devices for processing input/output display operations (e.g., determining merging of the multiple display screens for screen stitching).

[0089] Mobile device 100 may receive additional information on an interconnected device related to device size and/or display size of the interconnected device. For example, a device interconnected to mobile device 100 that is larger than device 100 may be interconnected with connectors 120B and 120C of device 100, which would initially allow deduction that the interconnected device is connected to the right of device 100. However, the additional information relating to device size may allow further refinements to the determined relative spatial location, for example, by indicating that device 102 extends in length beyond the length of device 100 and is perhaps centred upwards of display 110 of device 100. Additional information on an interconnected device may be stored in memory 212, for example, information indicating that the interconnected device extends beyond the length of device 100.

[0090] Communication module 306 is configured to establish a communication channel between device 100 and each interconnected device, such as device 102, using known techniques, for example, via communication subsystem and network interface 210, as described above.

[0091] Device 100 may further include a spatial relationship synthesizer module 308 stored in the memory 212. The synthesizer module 308 consolidates connectivity and other information received by one or more of modules 302, 304, and 306 to determine how to process input and output received on the device 100 relative to multiple input and output screens or touch displays provided by the interconnected device(s) (e.g., device 102) relative to the first device 100. This information can be useful for stitching together multiple displays (e.g., determining how to divide image data, for example a complete image 130, to span displays 110A and 110B on devices 100 and 102).
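One way to picture this division of a complete image across stitched displays is the following minimal sketch; the identifiers and the simple pixel-aligned layout are assumptions for illustration, not the module's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class DisplayBounds:
    x_min: int  # left edge within the composite display, in pixels
    y_min: int  # bottom edge within the composite display, in pixels
    x_max: int
    y_max: int

def image_portion(display: DisplayBounds,
                  composite_w: int, composite_h: int,
                  image_w: int, image_h: int) -> DisplayBounds:
    """Scale a display's bounds in composite-display pixels onto the
    coordinate system of the complete image."""
    sx = image_w / composite_w
    sy = image_h / composite_h
    return DisplayBounds(round(display.x_min * sx), round(display.y_min * sy),
                         round(display.x_max * sx), round(display.y_max * sy))

# Two 480 x 800 portrait displays side by side form a 960 x 800 composite;
# each device renders the full image but shows only its own half.
left = DisplayBounds(0, 0, 480, 800)
right = DisplayBounds(480, 0, 960, 800)
print(image_portion(left, 960, 800, 960, 800))   # (0, 0, 480, 800)
print(image_portion(right, 960, 800, 960, 800))  # (480, 0, 960, 800)
```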

[0092] In one example, module 308 is configured to collect information regarding the location of displays on each device (e.g., devices 100, 102) and display parameters (e.g., resolution, pixel pitch, and display dimensions) in order to synthesize outputs onto multiple interconnected displays (e.g., displays 110A and 110B of devices 100 and 102) and/or to process the inputs obtained via an interconnected display based on the display parameters and the location of the displays on each device 100, 102 and/or inputs obtained via each of the interconnected displays.

[0093] Other functionalities of the relationship synthesizer module 308 can include processing gestures across multiple devices or spanning an output display, or multi-device display, across a selected number of interconnected device displays (e.g., displays 110A, 110B), to allow rendering of graphics on a larger display surface (e.g., composite display 110) as described in further detail below.

[0094] Spatial relationship synthesizer module 308 further maintains display coordinates data store 322. Display coordinates data store 322 includes display coordinate information that indicates: the dimensions x, y of combined devices 100, 102; coordinates of the minimum and maximum extents of composite display 110 within the combined devices 100, 102 coordinate system defined by an x-axis and a y-axis; the geometry of composite display 110 expressed by its number of pixels having a width x' and length y'; coordinates of the minimum and maximum extents of display 110A within the composite display 110 coordinate system defined by an x'-axis and a y'-axis; dimensions of complete image 130 defined by width a and length b; coordinates of the minimum and maximum extents of image portion 130A within complete image 130 defined by an a-axis and b-axis; and coordinates of the minimum and maximum extents of image portion 130A within the display 110A coordinate system defined by an x''-axis and y''-axis.
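A minimal sketch of how such a data store might be laid out in memory follows; the field names are assumptions, while the quantities mirror the list above:

```python
from dataclasses import dataclass
from typing import Tuple

Extents = Tuple[int, int, int, int]  # (min_x, min_y, max_x, max_y)

@dataclass
class DisplayCoordinatesStore:
    combined_size_mm: Tuple[int, int]   # dimensions x, y of combined devices
    composite_extents: Extents          # display 110 within combined devices
    composite_size_px: Tuple[int, int]  # width x', length y' of display 110
    local_display_extents: Extents      # display 110A within display 110
    image_size_px: Tuple[int, int]      # width a, length b of image 130
    portion_in_image: Extents           # portion 130A within image 130
    portion_on_display: Extents         # portion 130A within display 110A
```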

[0095] Display coordinates data store 322 may have any suitable format within memory 212. Display coordinates data store 322 may, for example, have the form shown in FIG. 5. Other fields may be included in display coordinates data store 322, as will be appreciated by those of ordinary skill.

[0096] Display coordinates data store 322 may also store information related to interconnected devices such as device 102, for example, the definition of image portion 130B of complete image 130, as well as the coordinates of display 110B at which to display image portion 130B.

[0097] Synchronizing module 314 is configured to cause processor 202 to synchronize devices 100, 102, for example, such that both devices 100, 102 determine and display their respective image portions 130A, 130B of complete image 130 contemporaneously. The synchronization between the devices may take a variety of forms, for example, synchronization to the same state of an application running on each of devices 100, 102, which would be represented by the same complete image 130. Synchronization may be achieved by signalling between devices 100, 102 containing time-stamps or initialization data for the state of an application being run on both devices 100, 102, or for other processes such as rendering. Other synchronization techniques would be understood by a person skilled in the art.
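
By way of example, without limitation, a synchronization exchange of the kind described might carry a shared start time-stamp. The sketch below assumes the device clocks are roughly synchronized; the message format and names are illustrative.

    import json
    import time

    def make_sync_message(app_state_id: str, start_delay_s: float = 0.5) -> bytes:
        """Build an illustrative synchronization message: both devices begin
        rendering the same application state at an agreed wall-clock time."""
        msg = {
            "state": app_state_id,                    # shared application state
            "start_at": time.time() + start_delay_s,  # agreed start time
        }
        return json.dumps(msg).encode("utf-8")

    def wait_for_start(msg_bytes: bytes) -> str:
        """On the receiving device, sleep until the agreed start time."""
        msg = json.loads(msg_bytes)
        delay = msg["start_at"] - time.time()
        if delay > 0:
            time.sleep(delay)
        return msg["state"]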

[0098] In use, based on connectivity information, device 100 can determine the relative spatial location of one or more interconnected devices (e.g. device 102 is connected on the right side of device 100) as well as the relative spatial locations of user interfaces of one or more interconnected devices (e.g., displays 110A and 110B of devices 100 and 102).

[0099] As well, once devices 100 and 102 are interconnected with one another, devices 100 and 102 can communicate with one another and exchange information as needed over a communication channel established between the devices, for example by way of a USB or other connection known to a person skilled in the art.

[00100] In one embodiment, once two devices are proximate each other and a connection is established between respective connectors 120, 122 of devices 100, 102 and the relative spatial location of interconnected mobile device 102 is determined relative to mobile device 100, the spatial information may be stored in connectivity data store 312.

[00101] As illustrated in FIG. 6, by way of example, device 100 may initiate determination of a portion of an image in a multi-device display. Blocks 600 may be implemented by modules 302, 304, 306, 308 and 314 and may operate on connectivity data store 312 and display coordinates data store 322.

[00102] At block S602, device 100 may sense whether a connection has been made for one of connectors 120 with one or more interconnected devices (e.g., via the connectivity module 302), as detailed in PCT Publication No. WO 2015/070321. If a connection to an interconnected device is detected, device 100 proceeds to block S604.

[00103] At block S604, spatial relationship synthesizer module 308 at device 100 may retrieve spatial location data corresponding to interconnected device 102, as described above and detailed in PCT Publication No. WO 2015/070321.

[00104] At block S605, synchronizing module 314 is configured to synchronize devices 100, 102 such that image portions 130A, 130B are displayed on devices 100, 102 contemporaneously to form a display of complete image 130, for example, by exchanging signalling instructions with device 102.

[00105] A synchronization signal may be transmitted via an established communication channel. Devices 100, 102 may be paired to form an established communication channel before synchronization at block S605. The communication channel may be provided as a USB-compatible bus established through the interconnected device connectors (e.g., connectors 120B and 122A), with, for example, device 100 serving as a USB host and device 102 serving as a USB slave, or by other connection techniques as described above. In some embodiments, a communication channel may be established through a wireless connection. As would be understood by a person skilled in the art, other types of communication channel may be established, as appropriate.

[00106] Signalling between devices 100, 102 may contain, for example, time-stamps or initialization data for processes to be run on both devices 100, 102, such as a particular software state, rendering, etc.

[00107] In some embodiments, synchronization may be provided such that device 100 performs determination of a portion of an image for multi-device display, as described in blocks S602 to S612 herein, contemporaneously with device 102 determining a portion of an image for multi-device display on device 102, namely, by performing steps substantially similar to blocks S602 to S612. In particular, block S610, described below, in which a portion of complete image 130 to display is determined, may occur contemporaneously with device 102 determining a portion of complete image 130 for display on display 110B of device 102.

[00108] In some embodiments, synchronization may be provided to generate or render complete image 130, or display each of image portions 130A, 130B on devices 100, 102 concurrently.

[00109] In some embodiments, synchronization may be provided to initialize the state of selected software to be the same on both devices 100, 102.

[00110] In some embodiments, synchronization may be provided by a time-stamp to synchronize run-time, for example, of a state of applications running on both devices 100, 102, represented by a particular image, or of rendering of content such as an image, video stream or other media content.

[00111] In some embodiments, synchronization may occur only for a complete image of the first frame in a video stream, following which each of devices 100, 102 continues rendering and displaying the remaining image frames in the video stream in sequence.

[00112] At block S606, spatial relationship synthesizer module 308 at device 100 may adapt processor 202 of device 100 to assess the spatial location of device 102 based on the retrieved spatial location data corresponding to device 102 to determine coordinates, area and geometry of composite display 110. Spatial relationship synthesizer module 308 may then define the coordinates of image portion 130A for display on display 110A, as further detailed in blocks S608 to S616, based at least on the coordinates, area and geometry of composite display 110; the coordinates, area and geometry of each of displays 110A, 110B; and the dimensions of complete image 130 to be displayed.

[00113] In the example shown in FIG. 1, device 100 may determine that device 102 is connected to the right side of device 100.

[00114] Dimensions of combined devices 100, 102 may be determined based on the determined relative spatial location of interconnected mobile device 102 relative to device 100, for example as determined by connectivity module 302 and stored in connectivity data store 312, in combination with additional information on device 100 and/or interconnected device 102, such as dimensions of device 100 and device 102.

[00115] Dimensions of device 100 may be determined from device information, such as the size of device 100, obtained from device information module 304. Dimensions of device 102 may be determined from additional spatial information communicated from interconnected device 102.

[00116] In the layout shown in FIG. 1, based on the locations of devices 100, 102, the dimensions of the combined devices may be determined as the combined widths of devices 100, 102 and the length of device 100.

[00117] Composite display 110 may be considered as a stitching of each of displays 110A, 110B. Composite display 110 may have a geometry, defined by the shape, size and relative position of composite display 110, for example, within the coordinate system of combined devices 100, 102.

[00118] By identifying the spatial location and display attributes of each display within a device, for example display 110A within device 100 and display 110B within device 102, a geometry of all of the displays in combination may be constructed, namely, a geometry of composite display 110.

[00119] In embodiments in which an Android platform is in use, using the framework API, a Display object may represent information on the geometry of a logical display, namely composite display 110, including the size, density, or other characteristics (for example, normalized resolution) of composite display 110.

[00120] One way in which the geometry of a rectangular composite display 110 may be defined is by the minimum and maximum extents of composite display 110 within the combined devices 100, 102 coordinate system. The coordinates of the minimum and maximum extents of composite display 110 may be determined within the combined devices 100, 102 coordinate system defined by an x-axis and a y-axis, based on the determined relative spatial locations of the interconnected devices and additional information on the devices such as the location of displays 110A, 110B within respective devices 100, 102.

[00121] For example, a bottom-left corner of composite display 110 within the coordinate system of combined devices 100, 102 may be designated at (x,y) coordinates on device 100 of (1 mm, 5 mm), namely, the bottom-left corner of display 110 is offset 1 mm to the right of and 5 mm above the bottom-left corner of device 100, and display 110 may extend a distance in millimetres along the x-axis from a minimum position min(x) to a maximum position max(x) and a distance in millimetres along the y-axis from a minimum position min(y) to a maximum position max(y). This information may be stored in display coordinates data store 322, for example, in the format shown in FIG. 5.

[00122] While the geometry of composite display 110 may be expressed by the extents in the x and y axes of combined devices 100, 102, as described above, the geometry of composite display 110 may also be expressed by its number of pixels along a width x' and length y', and this information may be stored in display coordinates data store 322, for example, in the format shown in FIG. 5. The width x' and length y' of composite display 110 may define a composite display coordinate system having x' and y' axes.

[00123] In some embodiments, the geometry of composite display 110 may be stored in display coordinates data store 322 in other formats that define the geometry, for example, as a rectangle having an origin, width and length, a polygon defined by a number of vertices that have connecting lines between them, or parametric equations that can express the coordinates of the points that make up the geometric form of a composite display. Other formats for defining a geometry would be understood by a person skilled in the art.

[00124] Displays that are located at spatial locations adjacent one another may together form a composite display shape having an overall length and width in a rectangular geometry. As such, the overall length and width of a composite display, for example, composite display 110, may be representative of the total number of pixels available for display, as between display 110A and display 110B.

[00125] In the layout shown in FIG. 1, displays 110A and 110B are adjacent each other, forming a rectangular shape having four vertices. In an example, displays 110A, 110B may each have a display resolution, namely, the physical number of columns and rows of pixels creating the display, of 720 x 1200 pixels, resulting in a composite display 110 geometry of 1440 x 1200 pixels, where 1440 pixels is the width, or number of columns, and 1200 pixels is the height, or number of rows.
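
By way of illustration only, this computation for two equal-height displays placed side by side may be sketched as follows (function and variable names are illustrative):

    def composite_geometry(left_px, right_px):
        """Combine two side-by-side displays, each given as (width, height)
        in pixels, into a composite display geometry (width, height)."""
        (wl, hl), (wr, hr) = left_px, right_px
        return (wl + wr, max(hl, hr))  # widths add; height is the taller display

    # Example from the text: two 720 x 1200 displays form a 1440 x 1200 composite.
    assert composite_geometry((720, 1200), (720, 1200)) == (1440, 1200)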

[00126] In embodiments having multiple (e.g., three or more) interconnected devices, the interconnected devices having various geometries, sizes and positions, the composite display geometry may not be rectangular, and may be defined by more than four vertices.

[00127] In some embodiments, the geometry of composite display 110 may be used by a software application running on device 100 to modify the content, size, or other attributes of complete image 130. For example, given the availability of composite display 110 upon which to display content, instead of merely a single display 110A, a software application may be prompted to generate a complete image, or replace an existing complete image with one containing additional content or pixel data, that can fully span all of composite display 110, namely, both displays 110A, 110B. From the perspective of the software application, the size of the display at device 100 may be the size of composite display 110. For example, if a software application running on device 100 generates a complete image having dimensions of 720 x 1200 pixels, upon determining a composite display 110 geometry of 1440 x 1200 pixels (as described below), the software application may, treating the entire composite display 110 as the display on which to generate a complete image, revise the complete image to an image having dimensions of 1440 x 1200 pixels. As such, the methods described herein may operate in conjunction with existing software applications without modification to any of the software applications.

[00128] At block S608, spatial relationship synthesizer module 308 at device 100 may adapt processor 202 of device 100 to determine the bounding coordinates of a display portion of device 100, for example, display 110A, within composite display 110.

[00129] A coordinate system formed by composite display 110 is used as the frame of reference within which the bounding coordinates of display 110A are defined. For example, coordinates of display 110A may be defined in conjunction with determination of composite display at block S606, namely, based on the determined relative spatial locations of the interconnected devices and additional information on the devices such as the location, sizes and other attributes of displays 110A, 110B within respective devices 100, 102. Coordinates of the minimum and maximum extents of display 110A can be expressed within the composite display 110 coordinate system defined by an x'-axis and a y'-axis.

[00130] For example, in the layout shown in FIG. 1, display 110A may be designated as the left half of composite display 110. Similarly, display 110B may be designated as the right half of composite display 110.

[00131] In an example in which composite display 110 has a coordinate system having an x'-axis and a y'-axis defined by a geometry of 1440 x 1200 pixels, bounding coordinates for display 110A on device 100 would be 0 <= x' <= 720 and 0 <= y' <= 1200, meaning that display 110A extends from pixel column 0 to pixel column 720 and from pixel row 0 to pixel row 1200 within composite display 110. Bounding coordinates for display 110B on device 102 would be 720 <= x' <= 1440 and 0 <= y' <= 1200, meaning that display 110B extends from pixel column 720 to pixel column 1440, and from pixel row 0 to pixel row 1200 within composite display 110.

[00132] In some embodiments, the dot pitch of the displays of each of the interconnected devices may not be the same, and the determination of the geometry of the composite display, as well as the bounding coordinates for the individual device displays, may be adjusted to account for it. In an example, a device having a display with a resolution of 1920 x 1080 pixels may be located to the left of a device having a display with a resolution of 640 x 480 pixels. In one example, the geometry of the resultant composite display may be defined by the resulting maximum width and length, namely by adding the pixel widths, resulting in a composite display of 2560 x 1080 pixels in an x'-y' coordinate system, the bounding coordinates of the left display being defined as 0 <= x' <= 1920, 0 <= y' <= 1080, and the bounding coordinates of the right display being defined as 1920 <= x' <= 2560, 0 <= y' <= 1080. In another example, the pixels occupied by a display in each device may be normalized to the lowest resolution, and the geometry of the resultant composite display may be defined by the resulting maximum width and length. In the example resolutions above, this would cause the 1920 x 1080 display to be normalized to 640 x 480 pixels, and result in a composite display geometry of 1280 x 480 pixels in an x'-y' coordinate system, the bounding coordinates of the left display being defined as 0 <= x' <= 640, 0 <= y' <= 480, and the bounding coordinates of the right display being defined as 640 <= x' <= 1280, 0 <= y' <= 480.
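
A minimal sketch of the two approaches described above follows, without limitation; the handling of unequal display heights is simplified, and all names are illustrative.

    def bounding_coordinates(displays_px):
        """Given left-to-right display resolutions [(w, h), ...], return each
        display's bounding coordinates (x_min, x_max, y_min, y_max) within
        the composite x'-y' coordinate system."""
        bounds, x = [], 0
        for w, h in displays_px:
            bounds.append((x, x + w, 0, h))
            x += w
        return bounds

    def normalize_to_lowest(displays_px):
        """Normalize all displays to the lowest resolution before combining,
        per the second example above."""
        w_min = min(w for w, _ in displays_px)
        h_min = min(h for _, h in displays_px)
        return [(w_min, h_min) for _ in displays_px]

    # First example: pixel widths simply add.
    left, right = bounding_coordinates([(1920, 1080), (640, 480)])
    assert left == (0, 1920, 0, 1080) and right[0:2] == (1920, 2560)

    # Second example: both displays are normalized to 640 x 480 first.
    assert bounding_coordinates(normalize_to_lowest([(1920, 1080), (640, 480)])) == [
        (0, 640, 0, 480), (640, 1280, 0, 480)]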

[00133] Similar modifications may be applied to the geometry of the composite display and the bounding coordinates of the individual displays in embodiments containing devices with different display densities, e.g., dots per inch.

[00134] Similarly, if the size of the interconnected devices is not the same, the interconnected devices are not located directly adjacent each other, or the interconnected devices are offset from each other, the geometry of the composite display and bounding coordinates of the individual displays may be adjusted to account for these attributes. For example, null pixels may be added within a composite display to represent an offset between displays or devices, for example, a bezel between two device displays, with the bounding coordinates of the devices meeting at a midway point within the null space.

[00135] At block S610, spatial relationship synthesizer module 308 at device 100 may adapt processor 202 of device 100 to determine a portion of a complete image to display. This may translate, for example, to a portion 130A of complete image 130 to display.

[00136] The dimensions of complete image 130 may be known or previously determined, or determined from a fully rendered complete image 130.

[00137] Complete image 130 may be generated by a software application running on device 100.

[00138] Coordinates of image portion 130A may be defined within a coordinate system defined by the dimensions of complete image 130, namely, representing what portion of complete image 130 to display on display 110A. The coordinates of image portion 130A may also be defined within the coordinate system of display 110A, namely, at what position within display 110A to display image portion 130A. In some embodiments, the proportions of image portion 130A to complete image 130 may be correlated to the proportions of display 110A to composite display 110.
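
One way such a proportional mapping might be computed is sketched below, without limitation (names are illustrative):

    def image_portion(display_bounds, composite_px, image_px):
        """Map a display's bounding coordinates within the composite display
        (x'_min, x'_max, y'_min, y'_max) to the corresponding portion of the
        complete image, expressed as extents on the a-axis and b-axis."""
        x0, x1, y0, y1 = display_bounds
        cw, ch = composite_px  # composite display geometry in pixels
        a, b = image_px        # complete image dimensions
        return (x0 * a // cw, x1 * a // cw,   # a-axis extents
                y0 * b // ch, y1 * b // ch)   # b-axis extents

    # FIG. 1 example: display 110A is the left half of a 1440 x 1200 composite,
    # so it maps to the left half of a 1440 x 1200 complete image.
    assert image_portion((0, 720, 0, 1200), (1440, 1200), (1440, 1200)) == (0, 720, 0, 1200)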

[00139] Complete image 130 is a fully rendered image containing data about each pixel's colour and location for display, stored in memory 212. The pixel data for complete image 130 may be stored, for example, in a section of memory 212 designated as a frame buffer, containing, for example, a bitmap of complete image 130.

[00140] Image portion 130A is stored in memory 212 as a subset of the frame buffer for display at display 110A, and its location in the frame buffer may be mapped to the coordinates of image portion 130A that have been defined with respect to complete image 130.

[00141] Since the image portion 130A of the frame buffer is a subset of the complete image 130 in the frame buffer, in some embodiments, there may not be a need to store pixel data for image portion 130A anywhere else in memory 212, and it may not need to be moved from the frame buffer to elsewhere in memory 212 for display at display 110A.
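
This subset relationship can be illustrated with a zero-copy array view; the sketch below uses NumPy and illustrative dimensions.

    import numpy as np

    # An illustrative frame buffer holding complete image 130: 1200 rows,
    # 1440 columns, 4 bytes per pixel (RGBA).
    frame_buffer = np.zeros((1200, 1440, 4), dtype=np.uint8)

    # Image portion 130A is the left half. NumPy slicing returns a view
    # into the same memory, so no pixel data is copied or moved.
    portion_130a = frame_buffer[:, 0:720, :]
    assert portion_130a.base is frame_buffer  # shares the frame buffer's memory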

[00142] In some embodiments, image portion 130A may be defined as a portion of complete image 130 that has had further effects applied to it. These effects may include dimensional effects such as scaling and stretching, and visual effects such as brightness and contrast. Other possible effects would be contemplated by a person skilled in the art.

[00143] Defining the coordinates of image portion 130A, both within the coordinate system of complete image 130 and within the coordinate system of display 110A, may take into account various factors including the display configurations of device 100, as well as device 102, for example: the physical display size, as measured across the display diagonal; display density, namely the number of pixels within a physical area of the display, commonly referred to as dots per inch ("dpi"); orientation of the display from the user's point of view, for example landscape or portrait; and resolution, namely the total number of physical pixels on the display.

[00144] Dimensions of complete image 130 may be defined by width a and length b; coordinates of the minimum and maximum extents of image portion 130A within complete image 130 may be defined by an a-axis and a b-axis; and coordinates of the minimum and maximum extents of image portion 130A within the display 110A coordinate system may be defined by an x"-axis and a y"-axis.

[00145] Using the example of FIG. 1, image portion 130A may be defined as encompassing the left half of complete image 130, correlating to display 110A encompassing the left half of composite display 110. Similarly, image portion 130B may be defined as encompassing the right half of complete image 130, correlating to display 110B encompassing the right half of composite display 110.

[00146] In the example shown in FIG. 1, image portion 130A may be displayed on a portion of display 110A. Alternatively, image portion 130A may be displayed across the entirety of display 110A, as appropriate.

[00147] In some embodiments, the bounding coordinates of display 110A may extend beyond the resolution of complete image 130. For example, if composite display 110 has a coordinate system having an x'-axis and a y'-axis defined by a geometry of 1440 x 1200 pixels and a complete image has a resolution of 720 x 1200 pixels, bounding coordinates for display 110A of device 100 may be defined as 0 <= x' <= 720 and 0 <= y' <= 1200. Bounding coordinates for display 110B on device 102 would be 720 <= x' <= 1440 and 0 <= y' <= 1200. In such an example, image portion 130A may be scaled horizontally for display across all pixels of display 110A, and image portion 130B may be similarly scaled horizontally across display 110B at device 102.
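
Such horizontal scaling might be sketched as follows, using nearest-neighbour column sampling (NumPy; dimensions as in the example above; names are illustrative):

    import numpy as np

    def scale_horizontally(portion, target_width):
        """Stretch an image portion (rows x cols x channels) to target_width
        columns by nearest-neighbour column sampling."""
        rows, cols, _ = portion.shape
        src_cols = (np.arange(target_width) * cols) // target_width
        return portion[:, src_cols, :]

    # The left half (360 columns) of a 720 x 1200 complete image is stretched
    # to fill the 720 columns of display 110A.
    complete = np.zeros((1200, 720, 4), dtype=np.uint8)
    portion_130a = scale_horizontally(complete[:, 0:360, :], 720)
    assert portion_130a.shape == (1200, 720, 4)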

[00148] The above embodiments for defining the coordinates of an image portion of a complete image are provided as examples, without limitation.

[00149] At block S612, display coordinates data store 322 is maintained to indicate display coordinates information for a portion of complete image 130 to display on display 110A of device 100.

[00150] The display coordinate information determined in blocks S606 to S610 is stored in display coordinates data store 322, the form of which is shown in FIG. 5 by way of example. Display coordinates data store 322 may include information that indicates: the dimensions x, y of combined devices 100, 102; coordinates of the minimum and maximum extents of composite display 110 within the combined devices 100, 102 coordinate system defined by an x-axis and a y-axis; geometry of composite display 110 expressed by its number of pixels having a width x' and length y'; coordinates of the minimum and maximum extents of display 110A within the composite display 110 coordinate system defined by an x'-axis and a y'-axis; dimensions of complete image 130 defined by width a and length b; coordinates of the minimum and maximum extents of image portion 130A within complete image 130 defined by an a-axis and a b-axis; and coordinates of the minimum and maximum extents of image portion 130A within the display 110A coordinate system defined by an x"-axis and a y"-axis.

[00151] Spatial relationship synthesizer module 308 may update relevant display coordinates information in display coordinates data store 322 upon determining new connectivity information or new complete image dimensions. Interestingly, because the physical display image buffer is a subset of the logical display image buffer, there is no need to move rendered image data from the logical display to the physical display.

[00152] At block S616, device 100 may instruct processor 202 to display image portion 130A, where determined, on display 110A.

[00153] The defined coordinates of image portion 130A with reference to complete image 130, as stored in display coordinates data store 322, may be used to identify the portion of the frame buffer in memory 212 containing pixel data of image portion 130A for display on display 110A by a display controller. The data on the remaining pixels (i.e., the remainder of complete image 130) may be unused, or transmitted to another device as described below. In some embodiments, device 100 executes an image viewing application to display the content of image portion 130A from the frame buffer.

[00154] In some embodiments, for example, using the Android operating system and an associated framework API, applications and the operating system may render images to intermediate buffers called Surface objects. Image portion 130A may be designated as an "application surface". An Android system service called SurfaceFlinger, with the help of a Hardware Composer service, is responsible for compositing multiple Surfaces, for example, an application surface, status bar, navigation bar, etc., into a resulting frame to be displayed and stored in the frame buffer. This resulting frame buffer for display on display 110A may therefore contain more than simply image portion 130A.

[00155] As will be appreciated, determination of a portion of an image as part of a multi-device display may occur on each of devices 100 and 102, for example, concurrently, in particular for synchronism of rendering images.

[00156] In some embodiments, the visual depiction of image portions 130A, 130B may not fully form complete image 130. For example, in certain device or display layouts, some of complete image 130 may be omitted from display due to, for example, size and geometric constraints of the interconnected devices.

[00157] FIG. 7 illustrates determination of a portion of a next complete image to display on a device (for example, device 100 of devices 100, 102) in a multi-device display, for example, a next complete image frame from a video stream. As illustrated in FIG. 7, it may not be necessary to perform blocks S606 to S612 if the dimensions of a complete image are the same as the last complete image and connectivity information as between the devices has not changed. The next image portion can be rendered based on the coordinates from display coordinates data store 322 as applied to the next complete image, as necessary.

[00158] At block S702, device 100 may determine whether the dimensions of the next complete image are the same as the last complete image. This may be known or previously determined, or determined following rendering of the next complete image.

[00159] At block S704, device 100 may sense whether the connectivity information in data store 312 is the same as that for processing of the last image. Connectivity information may change, for example, if a new connection has been made or lost with a particular connector 120.

[00160] If both the complete image dimensions and the connectivity information are the same as last, device 100 proceeds directly to block S616, as shown in FIG. 6 and described above, to display an appropriate portion of the next complete image on display 110A.
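
The decision at blocks S702 and S704 might be sketched as follows, without limitation (names and data shapes are illustrative):

    def next_frame_path(next_image_dims, last_image_dims,
                        connectivity, last_connectivity):
        """Decide whether cached display coordinates can be reused for the
        next complete image (blocks S702/S704) or whether the composite
        geometry must be recomputed (blocks S606 to S612)."""
        if next_image_dims == last_image_dims and connectivity == last_connectivity:
            return "S616"  # reuse display coordinates data store 322; just display
        return "S606"      # recompute geometry and bounding coordinates first

    assert next_frame_path((1440, 1200), (1440, 1200),
                           {"right": 102}, {"right": 102}) == "S616"
    assert next_frame_path((1440, 1200), (720, 1200),
                           {"right": 102}, {"right": 102}) == "S606"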

[00161] In some embodiments, synchronization with device 102 may be unnecessary, for example, if devices 100, 102 were synchronized at a point during processing of the last complete image, and device 100 may proceed directly to block S616 to display the portion of the next complete image. In other embodiments, devices 100, 102 may synchronize again before the next image portion is rendered, for example, by using any of the techniques described above.

[00162] If complete image dimensions have changed since last and/or connectivity information has changed, device 100 proceeds to block S606 as shown in FIG. 6. The process proceeds as described above with reference to FIG. 6. At block S612, display coordinates data store 322 is updated to reflect new display coordinates information based on the newly determined complete image dimensions and/or connectivity information, which is used at block S616 to display a portion of the next complete image.

[00163] In some embodiments, a next complete image from a video stream may be completely rendered, display coordinates data store 322 updated only if the complete image dimensions or connectivity information has changed, and device 100 may proceed directly to block S616 to display a portion of the next complete image on display 110A on the basis of the information in display coordinates data store 322.

[00164] In some embodiments, multi-device display as described above may occur in the context of the execution of selected software at both devices 100, 102.

[00165] For example, upon connecting devices 100, 102, one of devices 100, 102 may select software for execution at both devices 100, 102. This may be communicated between devices 100, 102, for example, over a communication channel. This may cause the same (or compatible) application to be executed at both devices 100, 102. In this instance, compatible may refer to different versions of the same application, or applications having similar functionality (e.g., two different image viewing applications).

[00166] Optionally, one of devices 100, 102 may transmit the selected software to the other device over a communication channel. The software may include code and data for installation and execution of an application, and may take, for example, the form of an Android application package (an APK file). The software may be accompanied by user-specific data, e.g., photos, documents, etc., for use in conjunction with the application. This may include, for example, a bitmap containing full pixel data of a complete image such as complete image 130 for display in an image viewing application.

[00167] At each device 100, 102, the selected software may be initialized to a pre-defined initial state, for example, using the user-specific data. The initial state of the selected application may be initialized to be the same on both devices 100, 102. As such, the image viewing applications on both devices 100, 102 may be initialized to display the same complete image, processed as described above with reference to FIG. 6 for multi-device display such that the complete image spans the displays 110A, 110B of both devices 100, 102.

[00168] In some embodiments, devices 100, 102 may be configured to maintain a software state consistent across the two devices, for example, when a user provides input at one of devices 100, 102. In particular, when user input is received at one of devices 100, 102, that device may send a notification of the user input to the other device. That user input may then be simulated at the other device.

[00169] This allows a user interface capable of receiving user inputs, such as displays 110A, 110B or mouse or keyboard input, to span displays 110A, 110B of devices 100, 102.

[00170] Conveniently, the running application at each device 100, 102 need not have any knowledge that the image it renders is not displayed in its entirety at that device. Modification to each running application may therefore be small or non-existent, while allowing cross-device display of the image.

[00171] An example of handling user input for the purpose of multi-device display is illustrated in FIGS. 8A, 8B and 9.

[00172] FIGS. 8A, 8B show devices 100, 102 cooperatively displaying, across devices 100, 102, a calculator application in an initial state and a second state, respectively. More specifically, each of devices 100, 102 executes its own instance of the calculator application, and each of displays 110A, 110B shows a portion of the calculator application, represented in an instance as complete image 130' spanning image portion 130A' displayed on display 110A and image portion 130B' displayed on display 110B.

[00173] As depicted in FIGS. 8A, 8B, the application includes several user interface elements (also referred to as "controls"), for example buttons 800, and a textbox 810.

[00174] FIG. 8B depicts a touch input 820 at button "3" on display 110B of device 102. Input 820 may be handled at device 102 in a conventional manner, namely, with inputs passed from input sensors through drivers and operating system to the calculator application. In this example, input 820 causes the application's state to change at device 102, and complete image 130' is updated to complete image 130", with the number "3" in textbox 810 as shown in FIG. 8B.

[00175] Additionally, device 102 may transmit a notification of input 820 to device 100. Upon receiving this notification, application state also changes at device 100, so that complete image 130' on device 100 is also updated to complete image 130", with the number "3" in textbox 810. In some embodiments, the number "3" in textbox 810 may not be displayed on display 110A if only image portion 130A" is displayed.

[00176] In some embodiments, a user input notification may be in the form of x,y coordinates of input 820, for example, pixel position within the coordinate system of one of displays 110A, 110B. In this case, device 100 or device 102, upon receiving the notification, may simulate a touch input at those x,y coordinates, with reference to a coordinate system associated with complete image 130' or composite display 110, which in turn may trigger a change in application state.
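
Remapping such a coordinate-based notification into the composite display coordinate system might be sketched as follows, without limitation (names are illustrative):

    def to_composite_coords(local_xy, display_bounds):
        """Translate a touch point from a device display's local coordinate
        system into the composite display's x'-y' coordinate system, given
        that display's bounding coordinates within the composite display."""
        x, y = local_xy
        x_min, _, y_min, _ = display_bounds
        return (x_min + x, y_min + y)

    # A touch at (100, 300) on display 110B, whose bounds within a 1440 x 1200
    # composite display are 720 <= x' <= 1440 and 0 <= y' <= 1200.
    assert to_composite_coords((100, 300), (720, 1440, 0, 1200)) == (820, 300)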

[00177] In some embodiments, a user input notification may be in the form of a unique identifier for the user interface element and an action, for example, button 800 "3" pressed. In this case, device 100, upon receiving the notification may simulate that action on the identified user interface element.

[00178] Complete image 130" and image portion 130A" on device 100, as well as complete image 130" and image portion 130B" on device 102 are shown in FIG. 9.

[00179] As will be appreciated, touch input may originate at display 110A or 110B, on device 100 or 102 respectively.

[00180] In alternate embodiments, generating and rendering a complete image may occur on a single device of the interconnected devices. In an example, device 100, serving in a host role (for example, as a USB host, with device 102 serving as a USB slave), performs rendering and then transmits a selected portion of the rendered frame to device 102 for display. For example, device 100 may render the entirety of complete image 130. Device 100 may then divide the rendered complete image 130 into portions, and select a portion for display on display 110A, namely image portion 130A, and a portion for display on display 110B of device 102. Image portions 130A, 130B may be defined as described above. Device 100 may then transmit image portion 130B to device 102, for example, by way of a communication channel established through magnetic connectors or through a wireless connection. Image portion 130B may be encoded or compressed for transmission, using techniques known by a person skilled in the art. For example, compression or decompression functions may be performed by operation of a DSP included in processor 202.

[00181] In alternate embodiments, the relative spatial location of another device that may or may not be interconnected to a mobile device may be determined without use of connectors such as connectors 120.

[00182] For example, as illustrated in FIG. 10, two mobile computing devices 1000, 1002 may be placed adjacent to each other and set to a calibration mode, in preparation for cooperatively displaying a complete image on a composite display 1010, or display surface, across devices 1000, 1002.

[00183] In calibration mode, each device may display a Quick Response Code (QR Code™) 1050, 1052. The QR Code may encode information that identifies the device. The QR Code may also encode additional information such as device type, device size, and user interface or touch display information.

[00184] A separate calibration device (not shown) may then take a photograph of devices 1000, 1002 in calibration mode, and locate QR Codes 1050, 1052 using conventional image processing techniques.

[00185] The calibration device may be any suitable electronic device, for example, a mobile computing device such as a smartphone including a camera, and having the ability to connect with devices 1000, 1002. Connection between the calibration device and devices 1000, 1002 may allow for data communication between the calibration device and devices 1000, 1002. In some embodiments, the calibration device may be of the same type as devices 1000, 1002 - generally identical in structure and components.

[00186] The locations of QR Codes 1050, 1052 may then be used to establish the relative spatial locations and geometries of devices 1000, 1002. QR Codes 1050, 1052 may also be decoded to obtain an identifier or additional information of each of the devices.

[00187] Further image processing may be applied to the photograph to determine the extents of displays 1010A, 1010B, refine the relative spatial locations of devices 1000, 1002, and estimate the geometries of displays 1010A, 1010B. Geometry information, for example, relating to devices 1000, 1002 and displays 1010A, 1010B, may also be obtained from additional information encoded in the QR Code itself.

[00188] Through image processing of QR Codes 1050, 1052 and additional information encoded in the QR Codes themselves, the relative locations of displays 1010A, 1010B and the dimensions of displays 1010A, 1010B may be determined, and from these a geometry of composite display 1010 may be determined, the geometry being defined by the shape, size and relative position of composite display 1010, for example, within combined devices 1000, 1002.
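
For illustration only, locating the two QR Codes in the calibration photograph might be sketched as follows. The sketch assumes OpenCV's QRCodeDetector is available; the file name and left-to-right ordering logic are illustrative.

    import cv2  # OpenCV

    def locate_devices(photo_path):
        """Detect and decode all QR Codes in a calibration photograph, and
        return (decoded_payload, corner_points) pairs ordered left to right,
        giving the relative spatial order of the devices."""
        img = cv2.imread(photo_path)
        detector = cv2.QRCodeDetector()
        ok, payloads, points, _ = detector.detectAndDecodeMulti(img)
        if not ok:
            return []
        codes = list(zip(payloads, points))
        # Order by the leftmost x coordinate of each code's corner points.
        codes.sort(key=lambda c: c[1][:, 0].min())
        return codes

    # e.g., locate_devices("calibration.jpg") might yield the payloads of
    # QR Codes 1050 and 1052, with device 1000 (leftmost) first.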

[00189] In embodiments in which displays are located at spatial locations adjacent one another, as shown for example in FIG. 10, displays 1010A, 1010B may together form a composite display shape having an overall length and width in a rectangular geometry, and the overall length and width of the composite display 1010 may be representative of the total number of pixels available for display, as between display 1010A and display 1010B. The geometry of a composite display 1010 may also be defined, for example, as described above with reference to block S606.

[00190] Based on the geometry of composite display 1010, bounding coordinates or a bounding region of display 1010A of device 1000 may be determined within composite display 1010, for example, as described above with reference to block S608. For example, in the layout shown in FIG. 10, display 1010A may be designated as the left half of composite display 1010. Similarly, display 1010B may be designated as the right half of composite display 1010.

[00191] A portion of a complete image that is to be displayed on display 1010A may be determined based on the geometry of composite display 1010, the bounding coordinates, and dimensions of the complete image, for example, as described above with reference to block S610. For example, in the layout shown in FIG. 10, the portion of a complete image to be displayed on display 1010A may be defined as encompassing the left half of the complete image, correlating to display 1010A encompassing the left half of composite display 1010.

[00192] In some embodiments, the calibration device may send to devices 1000, 1002 the photograph of devices 1000, 1002 and/or one or more of the following determinations: locations of QR Codes 1050, 1052; relative locations of displays 1010A, 1010B; dimensions of displays 1010A, 1010B; geometry of composite display 1010; bounding coordinates of displays 1010A, 1010B; and/or the portion of a complete image to be displayed on each of displays 1010A, 1010B. As such, in various embodiments, any of the above determinations may be performed on the calibration device and/or one or more of devices 1000, 1002. In some embodiments, the photograph of devices 1000, 1002 may be sent to a remote server (not shown), and the remote server may perform one or more of the above determinations and communicate to devices 1000, 1002 the relative locations of devices 1000, 1002. As such, communication between the calibration device and devices 1000, 1002 may be optional.

[00193] Following a calibration mode, the mobile computing devices may be set to a display mode for cooperatively displaying a complete image across devices 1000, 1002. A complete image may be generated contemporaneously at devices 1000, 1002, and the portion of the complete image displayed on display 1010A of device 1000.

[00194] Examples are described above with reference to two connected devices from the perspective of one of those devices. In some embodiments, the steps as described above may be applied to each of the multiple devices, for example, by determining relative locations of the displays of all of the devices and dimensions of all of the device displays, which when placed in proximity to each other form a display surface. From there, a geometry of the display surface may be determined based on the relative locations of all of the device displays and dimensions of all of the device displays. At each of the multiple devices, a bounding region may then be determined for a portion of a complete image to be displayed on that device display based on the geometry of the display surface and dimensions of the complete image. Each of the multiple devices may then generate the complete image and display the portion of the complete image. The generation of the complete image and/or displaying of the respective portions of the complete image may occur in synchronism across the multiple devices.

[00195] As will now be appreciated, multichannel audio may be similarly distributed across multiple proximate devices, such as devices 100, 102 (FIGS. 1-4).

[00196] To that end, stereophonic audio playback may, for example, be played by interconnected devices 100 and 102. FIG. 11 further shows devices 100 and 102, each having a speaker 150, in accordance with some embodiments. Speaker 150 may be a conventional speaker for a mobile device configured for single channel (monaural) sound. Accordingly, each of devices 100 and 102 is at least capable of producing monaural sound. To that end, software stored within memory 212 (FIG. 2) of each device 100, 102 may further include audio processing software capable of processing digital multi-channel audio. Each device 100, 102 may include a suitable audio codec that may, for example, be able to decode multistream audio, whether lossy or lossless. Example codecs may be able to decode WAV, AIFF, MP3, MPEG-4, FLAC, AAC, WMA or other formats known to those of ordinary skill.

[00197] In example embodiments, as illustrated in FIG. 11, each of devices 100 and 102 is configured for playback of a particular channel of audio of an audio stream. Device 100, for example, plays left channel audio through its speaker 150, while device 102, on the right, plays right channel audio through its speaker 150. The left and right streams may be played in synchronism, allowing a user/listener to experience stereo sound.

[00198] In the depicted embodiment, each device 100, 102 has access to data reflective of the multi-channel audio. This data may be stored within computer readable memory 212 at each of the devices, or may be transmitted (e.g., streamed) from one of the devices 100, 102 to the other device 102, 100, or transmitted from a third device (e.g., a remote server - not shown - over a computer network) to each of devices 100, 102. In a typical embodiment, multi-stream audio may be decoded and demultiplexed into constituent, complementary streams at each device 100, 102. The data reflective of multi-channel audio may also be extracted from a video stream comprising both image and audio data.

[00199] Specifically, multi (e.g., dual) channel audio is processed by a channel splitter/decoder 160 at each device 100, 102 that divides the audio data into two separate channels: a right audio channel and a left audio channel, as illustrated in FIG. 12. Channel splitter/decoder 160 may take the form of a standard audio decoder that further demultiplexes multiple received and decoded audio channels. The multiple (e.g., two) audio channels are received at channel selector 162, which selects one of the channels according to a right/left channel select signal (which may be outputted by spatial relationship synthesizer module 308 described above). The selected audio channel may be selected based on the relative spatial location of device 100 to device 102, as for example determined by spatial relationship synthesizer module 308. The leftmost of the two devices (e.g., device 100 in FIG. 11) may output the left audio channel, and the rightmost, the right audio channel.
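
Channel splitting and selection on decoded PCM samples might be sketched as follows, without limitation (NumPy; interleaved stereo assumed; names are illustrative):

    import numpy as np

    def split_channels(stereo_pcm):
        """Split interleaved stereo PCM (shape: frames x 2) into left and
        right channels, as channel splitter/decoder 160 might."""
        return stereo_pcm[:, 0], stereo_pcm[:, 1]

    def select_channel(stereo_pcm, position):
        """Select the channel matching this device's relative spatial
        location ('left' or 'right'), as channel selector 162 might."""
        left, right = split_channels(stereo_pcm)
        return left if position == "left" else right

    # The leftmost device (device 100 in FIG. 11) keeps the left channel.
    pcm = np.array([[1, 2], [3, 4]], dtype=np.int16)
    assert select_channel(pcm, "left").tolist() == [1, 3]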

[00200] Channel selector 162 outputs the selected single channel audio for output at the device's speaker 150 (which may pass through appropriate amplifiers, DSPs, DACs, etc.). Channel selector 162 may be formed in hardware at each device 100, 102, or as a software component within memory 212.

[00201] Playback of the respective audio channels at the two devices 100, 102 may be synchronized in manners described above, using for example synchronization module 314. Specifically, one of the two devices 100, 102 may provide the other (i.e., device 102, 100) with a synchronization cue or synchronization data that allows the other device 102, 100 to commence playback of a stream at that device 102, 100 at the same timepoint within a stream as playback of a complementary stream at the other device. Possibly, synchronization cues/data may be exchanged over a communications or signalling channel, as described above, between the two devices 100, 102. The synchronization cue/data may, for example, explicitly identify the location within a played stream.

[00202] In an alternate embodiment, one of devices 100, 102 splits the audio data into right and left audio channels (as for example performed by a codec forming channel splitter/decoder 160 within memory 212), and then selects one of the channels (left or right), for example by way of channel selector 162, for playback at that device and transmits the unselected channel to the other device.

[00203] FIG. 13 shows a composite system formed of three devices 100, 102 and 104. Device 104 may be substantially the same as devices 100 and 102, as described above. In this system, device 104 determines that it is in the center position (e.g., when all of its connectors are engaged) between devices 100 and 102, and thus plays no audio in the case of a stereo audio signal. Meanwhile, the left and right devices 100, 102 respectively reproduce the left and right audio channels, as described above. Spatial separation of left and right channels is increased compared to the system shown in FIG. 12.

[00204] Optionally, if the decoded audio stream includes a third, center channel (as, for example, a Dolby 5.1 or similar audio signal might), device 104 may play that center channel in the arrangement of FIG. 13. Each of devices 100, 102, 104 may decode the left, right and center channels, and each of devices 100, 102, 104 may select the respective channel for playback at speaker 150 of that device.

[00205] FIGS. 14-15 show devices 100' and 102', which each include two speakers: a left speaker 152a and a right speaker 152b. Devices 100' and 102' are otherwise substantially similar to devices 100 and 102, respectively. When devices 100' and 102' are not interconnected, each device uses its two speakers for local stereophonic sound production at that device. Specifically, at each device 100', 102', the left speaker produces left channel audio while the right speaker produces right channel audio at the device. Again, software within memory may adapt devices 100' and 102' to so play audio.

[00206] FIG. 15 shows devices 100' and 102' connected by way of magnetic connectors 120 and 122. When so connected, devices 100' and 102' determine their relative spatial locations and audio channel selection as described above. Thereafter, devices 100' and 102' cooperate to produce stereophonic sound. The right device 102' plays right channel audio on both its left and right speakers. Meanwhile, the left device 100' plays left channel audio on both its left and right speakers 152a, 152b.

[00207] In one particular embodiment, channel selector 162 (FIG. 13) may be configured to output dual channel audio to drive speakers 152a, 152b, with both channels containing right channel audio, or both channels containing left channel audio. Namely, compared to the dual channel audio that is inputted to the channel splitter, one of the channels has been duplicated and used to replace the other one of the channels.

[00208] Optionally, devices 100' and 102' may cooperate to produce stereophonic sound with only the leftmost speaker of the left device playing left channel audio and only the rightmost speaker of the right device playing right channel audio. Spatial separation of left and right channels is increased when compared to having both speakers of both devices active, as described above.
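
The channel duplication described in paragraph [00207] might be sketched as follows, without limitation (NumPy; names are illustrative):

    import numpy as np

    def duplicate_selected(stereo_pcm, position):
        """Keep the channel matching the device's position and copy it onto
        both output channels driving speakers 152a and 152b."""
        col = 0 if position == "left" else 1
        selected = stereo_pcm[:, col]
        return np.stack([selected, selected], axis=1)

    # The right device 102' drives both its speakers with right channel audio.
    pcm = np.array([[1, 2], [3, 4]], dtype=np.int16)
    assert duplicate_selected(pcm, "right").tolist() == [[2, 2], [4, 4]]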

[00209] In an embodiment, forming a composite display in manners described above may be combined with producing stereophonic sound in manners disclosed herein. Of course, embodiments that only distribute multi-channel audio, as described, without displaying video (or even having the capacity to display video), are possible.

[00210] Examples are largely described above with reference to two connected devices, for example devices 100, 102 or devices 1000, 1002. In some embodiments, more than two devices can similarly be connected to show content spanning the displays of each of the devices. Given an array of connected devices, on each device, the portion of the complete image displayed on that device's physical display is determined according to that device's position in the array. Each device in the array may exchange position information with the other devices to determine its position in the array.

[00211] Of course, the above-described embodiments are intended to be illustrative only and in no way limiting. The described embodiments are susceptible to many modifications of form, arrangement of parts, details and order of operation. The disclosure is intended to encompass all such modifications within its scope, as defined by the claims.