


Title:
METHODS, UNITS AND SYSTEMS FOR RENDERING DATA STREAMS
Document Type and Number:
WIPO Patent Application WO/2019/043716
Kind Code:
A1
Abstract:
Methods, units and systems for rendering data streams in devices, for example, toys. In one aspect, the method for rendering a time-dependent data stream in a device may comprise: storing in the device at least one time-dependent data stream, the data stream comprising at least one segment of data, each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; starting rendering by the device a stored at least one segment, following an event, the event related to information in the at least one segment descriptor; and rendering at least one segment in accordance to information in the descriptor.

Inventors:
LI-HOD EHUD (IL)
ALJADEFF DANIEL (IL)
Application Number:
PCT/IL2018/050975
Publication Date:
March 07, 2019
Filing Date:
September 03, 2018
Export Citation:
Assignee:
LI HOD EHUD (IL)
ALJADEFF DANIEL (IL)
International Classes:
A63F13/00; H04N21/83; A63F13/30; A63H3/28; A63H3/33; A63H30/02; A63H30/04; G06F3/0484; G09B5/00; H04N21/845; H04W56/00
Domestic Patent References:
WO2004061850A12004-07-22
WO2003100748A12003-12-04
Foreign References:
DE102015117503A12017-04-20
DE202004020667U12006-01-19
DE102007045129A12009-04-02
US20070253581A12007-11-01
EP2486557A12012-08-15
Attorney, Agent or Firm:
CHECHIK, Haim et al. (IL)
Claims:
CLAIMS:

1. A method for rendering a time-dependent data stream in a device, comprising:

a) storing in the device at least one time-dependent data stream,

the data stream comprising at least one segment of data,

each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; b) starting rendering by the device a stored at least one segment, following an event, the event related to information in the at least one segment descriptor; and

c) rendering at least one segment in accordance to information in the descriptor.

2. A method for rendering a time-dependent data stream in a device according to claim 1, wherein the device is a mobile device embedded in one of the following: a toy, a book, a video console, a game console, a smart TV and the like.

3. A method for rendering a time-dependent data stream in a device according to claim 1, wherein the device is an application implemented in one of the following: a smartphone, a video console, a game console, a smart TV and the like.

4. A method for rendering a time-dependent data stream in a device according to claim 1, wherein the device comprises means to receive commands and transmit data and wherein the commands may cause the device to perform one or more of the following: activation, deactivation, record audio, record video, activate sensors, record sensor data and transmit data.

5. A method for rendering a time-dependent data stream in a device according to claim 1, wherein the time-dependent data stream comprises one or more of the following: audio, video, images and textual playback data, motion activation commands, light activation commands, sound activation commands and digital control.

6. A method for rendering a time-dependent data stream in a device according to claim 1, wherein the time-dependent data stream is formatted in accordance with EPUB ver. 3.0 format.

7. A method for rendering a time-dependent data stream in a device according to claim 1, wherein the segment descriptor further comprises a time offset to start the rendering of the time-dependent data stream segment, parameters related to the rendering of the data stream segment and the description of an event to start rendering the data stream segment.

8. A method for rendering a time-dependent data stream in a device according to claim 1, wherein the event to start rendering by the device a stored at least one segment is triggered by one of the following: receiving a wireless message from another device, receiving a trigger from a sensor, receiving a trigger from a switch, receiving an input from a barcode reader and a trigger from an internal timer or clock.

9. A method for rendering a time-dependent data stream in a device according to claim 2, wherein the device communicates with one or more of the following: a smartphone, a tablet, a smart TV, a video console, a game console, virtual video platforms and the like.

10. A method for rendering a time-dependent data stream in a device according to claim 2, wherein the device is a toy in the field of view of an augmented or mixed reality unit, and wherein the augmented reality unit recognizes the toy and communicates with it.

11. A method for rendering a time-dependent data stream in a plurality of devices, comprising: a) storing in each of the plurality of devices at least one time-dependent data stream;

the data stream in each device comprising at least one segment of data, each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; b) transmitting wireless periodic messages;

receiving in the devices a plurality of the wireless periodic messages;

c) synchronizing by each of the devices an internal time base in accordance with the time of arrival and the contents of the wireless periodic messages;

d) starting rendering by the plurality of devices, a respective stored at least one segment, following an event, the event related to information in the respective at least one segment descriptor; and

e) rendering by the plurality of devices a respective at least one segment in accordance to information in the descriptor and the respective time base in each of the plurality of devices.

12. A method for rendering a time-dependent data stream in a plurality of devices according to claim 11, wherein the time-dependent data stream is either the same or different in the plurality of devices.

13. A method for rendering a time-dependent data stream in a plurality of devices according to claim 11, wherein the data stream descriptors are either the same or different in the plurality of devices.

14. A method for rendering a time-dependent data stream in a plurality of devices according to claim 11, wherein the data streams in the plurality of devices are rendered at the same or different times.

15. A method for rendering a time-dependent data stream in a plurality of devices, comprising:

a) providing a plurality of devices comprising a first device and at least a second device,

b) storing in each of the plurality of devices at least one time-dependent data stream;

the data stream in each device comprising at least one segment of data, each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; c) starting rendering by the first device, a stored at least one segment, following an event, the event related to information in the at least one segment descriptor; d) rendering by the first device the at least one segment in accordance to information in the descriptor;

e) transmitting by the first mobile device a wireless message, the transmission of the wireless message synchronized to the rendering of the at least one segment;

f) receiving in at least one second mobile device the wireless message; and g) rendering by a second mobile device, a respective at least one segment in accordance to information in the descriptor and the received wireless message.

16. A method for rendering a time-dependent data stream in a plurality of devices according to claim 11, wherein the wireless message transmitted by the first device can modify a descriptor in the data stream of the second device.

17. A method for rendering a time-dependent data stream in a device, comprising:

a) transmitting at least one time-dependent data stream,

the data stream comprising at least one segment of data,

each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; b) receiving by the device at least one time-dependent data stream; c) starting rendering by the device at least one segment of the received time-dependent data stream, following an event, the event related to information in the at least one segment descriptor; and

d) rendering at least one segment in accordance to information in the descriptor.

18. A device for rendering a time-dependent data stream, comprising:

a) means for storing at least one time-dependent data stream,

the data stream comprising at least one segment of data,

each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; and means for rendering a stored at least one segment;

b) means for detecting an event;

wherein the device starts rendering at least one segment of the time-dependent data stream following an event, the event related to information in the at least one segment descriptor; and

wherein the device renders at least one segment of the time-dependent data stream in accordance to information in the segment descriptor.

19. A device for rendering a time-dependent data stream according to claim 18, wherein the device is battery powered.

20. A device for rendering a time-dependent data stream according to claim 18, wherein the device further comprises wireless communication means, a processor, memory means to store instructions and data and audio reproduction means.

21. A computer-readable medium containing instructions for controlling a processor in at least one device to perform a method of rendering at least one time-dependent data stream, the method comprising the steps of:

a) storing in the device at least one time-dependent data stream,

the data stream comprising at least one segment of data,

each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; b) starting rendering by the device a stored at least one segment, following an event, the event related to information in the at least one segment descriptor; and

c) rendering at least one segment in accordance to information in the descriptor.

Description:
METHODS, UNITS AND SYSTEMS FOR RENDERING DATA STREAMS

Field of the Invention

[001] The present invention relates to rendering data streams in devices including methods, units and systems for rendering data streams in toys.

Background of the Invention

[002] The fast growth of technologies intended for mobile devices is driving the development of smart toys which are changing the face of the toy industry.

[001] Toys incorporating smart technology, very attractive to both kids and adults, are enabling a new era of play. Smart toys which connect to the internet, to mobile apps and to game consoles are emerging as the key market for toy vendors.

[002] However, development of smart toys presents high barriers for the traditional toy makers:

• Technological - usually requires outsourcing/partnership/acquisition

• Time consuming - long time to develop the infrastructure

• Costly - high expenses, high risk

• Existing platforms are few and provide a very partial solution comprising functions such as:

• Collecting and sending sensor data to mobile apps

• Limited capabilities of remote HW activation on the toy

• Play is coupled with a mobile app, hence a mobile device is mandatory for play

[003] Toy connectivity (including to the Internet) which a few years ago was considered a novelty is now a fundamental feature of play.

[004] Accordingly, methods, units and systems consistent with the disclosed embodiments provide an improvement for the development and creation of smart toys including rendering of data streams.

[005] The disclosed embodiments comprise a method of rendering time-dependent data streams in a device (e.g. a toy).

In one aspect, the method for rendering a time-dependent data stream in a device may comprise: storing in the device at least one time-dependent data stream, the data stream comprising at least one segment of data, each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; starting rendering by the device a stored at least one segment, following an event, the event related to information in the at least one segment descriptor; and rendering at least one segment in accordance to information in the descriptor.

[006] The device according to the disclosed embodiment may be a mobile or static device, in some cases implemented as an embedded module in a toy (e.g. doll, puppet), in a construction toy, educational toy, in a book, etc. In other cases, it may be an object implemented as an application in a smartphone, video console, game console, smart TV, mobile computer, etc. [007] Storing the data stream in the device may be performed using a cable, or through wireless messages. The data stream may be preloaded or loaded (e.g. using USB, wireless, etc.) to the device from a computer (e.g. desktop, laptop, notebook, tablet, etc.), from a smartphone, directly from the Internet, from a storage device (e.g. disk-on-key, SD card, etc.) or from any other device suitable for that purpose.

[008] According to one embodiment, the time-dependent data stream may comprise audio (voice, music), video, images and textual playback data, motion activation commands, light activation commands, sound activation commands, control commands (e.g. actuators, relays, capture audio\image\video, track movements etc.), external communication commands etc.

[009] Further according to the disclosed embodiment, the time dependent data stream may be stored in the device in an internal flash memory or RAM.

[010] According to one embodiment the time-dependent data stream format may be a proprietary format or in accordance to standard formats such as the EPUB ver. 3.0 format.

[011] According to certain embodiments, the segment descriptor may comprise:

a. Time or time offset to start the rendering of the time-dependent data stream segment

b. Parameters related to the rendering of the data stream segment: volume, pitch, intensity, motion parameters, etc.

c. Event required to start the rendering of the data stream segment:

i. End of rendering of a previous segment

1. Segment descriptors may point to the next segment to be rendered

2. Segment descriptors may point to multiple next segments to be rendered while the selection is performed by an event

ii. Specification of an event (e.g. receiving a wireless message or trigger from a sensor)

iii. Specification of an action triggered by an event

[012] According to another embodiment, the information in the segment descriptor can be modified in real time by:

a. Other segment descriptors

b. Occurrence of an event

[013] The descriptor format may be binary, text, in accordance with XML or SMIL, or any other format. Optionally the time-dependent data stream may comprise a descriptor which may describe features and contents of the data stream.
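
By way of illustration only, the following Python sketch shows one possible in-memory representation of such a data stream, its segments and their descriptors. The field names (start_offset_s, params, trigger, next_segments) are hypothetical and do not appear in the application; they merely mirror the descriptor contents listed above.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class SegmentDescriptor:
        # Time or time offset (in seconds) at which rendering of the segment may start
        start_offset_s: Optional[float] = None
        # Rendering parameters: volume, pitch, intensity, motion parameters, etc.
        params: dict = field(default_factory=dict)
        # Event required to start rendering, e.g. "wireless_message", "sensor",
        # or "end_of_previous_segment"
        trigger: str = "end_of_previous_segment"
        # Index (or indices) of the segment(s) that may be rendered next; when several
        # candidates are listed, the selection is resolved by an event at run time
        next_segments: List[int] = field(default_factory=list)

    @dataclass
    class Segment:
        descriptor: SegmentDescriptor
        data: bytes  # audio, video, motion/light commands, etc.

    @dataclass
    class DataStream:
        stream_descriptor: dict  # optional descriptor of the stream's features/contents
        segments: List[Segment] = field(default_factory=list)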

[014] Many types of events may be defined to start rendering by the device a stored segment from the data stream.

[015] According to certain embodiments, the event may comprise:

a. Receiving a wireless message (e.g. from another fixed/mobile device, network, etc.). The wireless message may have the following characteristics:

i. Encrypted, protected, requiring authentication

ii. In accordance to IEEE802.11a/b/g/n/ac/ad, IEEE802.15.4, IEEE802.15.4a, Bluetooth 4.0/5.0, Wireless HART, NB-IoT, LTE Cat0 for M2M, LTE-M Rel. Cat-0, Cat-1 and Cat-4, LoRaWAN™, Low Power Wide Area Network (LPWAN), Weightless-N, Weightless-P, Weightless-W, Z-Wave, or with any other wireless transmission protocol.

iii. Using audio, ultrasound, near field communication (NFC), RFID, diffused infrared, or any other wireless technology.

iv. Comprise information to a specific device (i.e. singlecast), group of devices (i.e. multicast) or all the devices in communication range (i.e. broadcast).

v. Transmitted from a wireless sensor which may be embedded in a toy or any other device.

vi. Transmitted from an augmented or mixed reality device (e.g. smartphones, glasses) that may be pointing to the rendering device (e.g. devices in the center of the field of view of the augmented reality glasses, or smartphone camera) or may comprise devices in the field of view of the augmented reality device.

vii. Transmitted from a multimedia device (e.g. tablets, smart TV, game console, smartphones, etc.) or from virtual video platforms (e.g. web-based ones like YouTube, etc.).

b. Trigger from a sensor (e.g. internal or external) in the device or connected to the device. The sensor may comprise:

i. Proximity, distance measurement sensors

ii. Voice/music recognition (e.g. command, song, etc.) and sound detection sensors

iii. Image recognition sensors

iv. Motion, accelerometer, inclinometer and shock sensors

v. Light sensor

vi. Touch sensor

c. Trigger from a switch, actuator, relay, etc.

d. Trigger from a received input read with a barcode reader

e. Trigger from an internal timer or clock in the device
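
As a minimal sketch only, the event sources listed above could be funnelled into a single dispatcher that decides when a stored segment starts rendering; the event names, the dictionary-based descriptors and the start_rendering callback are illustrative assumptions and not part of the application.

    from typing import Callable, Dict, List

    # Hypothetical event source names, mirroring the list above
    EVENT_SOURCES = {"wireless_message", "sensor", "switch", "barcode", "timer"}

    def make_dispatcher(start_rendering: Callable[[int], None],
                        descriptors: List[Dict]) -> Callable[[str], None]:
        """Return a handler that starts rendering every segment whose trigger matches."""
        def on_event(source: str) -> None:
            if source not in EVENT_SOURCES:
                return
            for index, descriptor in enumerate(descriptors):
                if descriptor.get("trigger") == source:
                    start_rendering(index)
        return on_event

    # Usage (names hypothetical):
    # dispatcher = make_dispatcher(device.start_rendering, [{"trigger": "timer"}])
    # dispatcher("timer")   # starts rendering segment 0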

[016] According to another embodiment, the event may be waiting for an external decision and/or processing of an external device as follows:

a. An external processor may process data transmitted from the device

b. The external processor may transmit back to the device data related to the rendering of a segment in the device.

[017] The method may further comprise devices which are activated by events or timers and deactivated by events, timers, end of rendering of a data stream, information in a segment descriptor, etc. In one embodiment the device may receive commands to record audio/video data, record sensor data, transmit the recorded data to another unit, while the specified recording of data may start following an event.

[018] According to one embodiment, the method may further comprise devices which are activated to exchange information (e.g. using a wireless link, audio link, infrared/ultrasound link, etc.) with external units in communication range with the device. According to one embodiment, those external units may be smart multimedia units (e.g. smart TV, game consoles, tablets, smartphones, etc.), virtual video platforms (e.g. web based ones like YouTube, etc.) or any other object (e.g. smart devices) comprising suitable wireless communication means. [019] According to one embodiment, the device may interact with a smart multimedia unit (e.g. smart TV, game consoles, tablets, smartphones, etc.) or with virtual video platforms (e.g. web based ones like YouTube, etc.) in different ways.

[020] For example, and according to one embodiment, this communication may be used to create new interactivity patterns between toys and TV series, where manipulation of a smart toy comprising a device as disclosed in the present invention (e.g. button press, speech recognition) may interactively alter the TV show (e.g. different evolution of the plot) and vice versa where toys can act based on content displayed on screen.

[021] In terms of patterns of interactivity, and according to one embodiment, toys may be used to extend content that was broadcast either in parallel or just with a reference to a certain piece of content, for example to discover more details about the characters, to extend the plot with content that was not broadcast, etc. This may be similar to the experience of a "second screen" typically implemented by a mobile application usually running on a tablet, wherein according to embodiments of the present invention, this experience is physically achieved using toys and through patterns of play.

[022] Further in accordance to the described embodiment, messages or sounds transmitted by the device may be received by the smart multimedia unit (e.g. smart TV) and trigger an action in the multimedia unit (e.g. start/pause/stop displaying a movie or clip, change the sequence of a movie, change setup in the multimedia unit, etc.). According to another embodiment, the device may receive messages or sounds transmitted by a smart multimedia unit. Further according to this embodiment, the transmitted messages may be generated by the multimedia unit during the display of a movie/video clip or game and generate various actions in the toy.

[023] The disclosed embodiments also comprise a method for rendering a time-dependent data stream in a plurality of devices which may comprise: storing in each of the plurality of devices at least one time-dependent data stream; the data stream in each device comprising at least one segment of data, each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; transmitting wireless periodic messages; receiving in the devices a plurality of the wireless periodic messages; synchronizing by each of the devices an internal time base in accordance with the time of arrival and the contents of the wireless periodic messages; starting rendering by the plurality of devices, a respective stored at least one segment, following an event, the event related to information in the respective at least one segment descriptor; and rendering by the plurality of devices a respective at least one segment in accordance to information in the descriptor and the respective time base in each of the plurality of devices.

[024] According to one embodiment of the disclosed method, the periodic message may be a Bluetooth Low Energy (BLE) beacon, an ultrasound message, a Near Field Communication (NFC) or Low Frequency (e.g. 125 kHz) message, a diffused infrared (IR) message, a Wi-Fi beacon, etc. The wireless periodic message may address a specific device, group of devices or all the devices in communication range.

[025] According to another embodiment, the time base in the rendering device may be implemented by hardware (e.g. a real time clock), by SW, use a crystal or in accordance to any other method well known to those skilled in the art.

[026] Since the disclosed embodiment comprises a plurality of devices, the time-dependent data stream may be identical or different in each of the devices, the descriptors may be identical or different in each of the devices and the rendering by the devices may be simultaneous or at different times.

[027] The disclosed embodiments also comprise a method for rendering a time-dependent data stream in a plurality of devices, the plurality of devices comprising a first device and at least one second device: storing in each of the plurality of devices at least one time-dependent data stream; the data stream in each device comprising at least one segment of data, each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; starting rendering by the first device, a stored at least one segment, following an event, the event related to information in the at least one segment descriptor; rendering by the first device the at least one segment in accordance to information in the descriptor; transmitting by the first mobile device a wireless message, the transmission of the wireless message synchronized to the rendering of the at least one segment; receiving in at least one second mobile device the wireless message; and rendering by a second mobile device, a respective at least one segment in accordance to information in the descriptor and the received wireless message.

[028] According to one embodiment of the disclosed method the wireless message may have characteristics as previously described. It also may:

a. Modify a descriptor in a second rendering device of the plurality of devices

b. Include information for the rendering of one or more data stream segments. For example, this information may comprise:

i. Time offset to start the rendering of the time-dependent data stream segment

ii. Parameters related to the rendering of the data stream segment: volume, pitch, intensity, motion parameters, etc.

iii. Trigger type to start the rendering of a specific data stream segment

[029] The disclosed embodiments also comprise a method for rendering a time-dependent data stream in a device, comprising: transmitting at least one time-dependent data stream, the data stream comprising at least one segment of data, each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; receiving by the device at least one time-dependent data stream; starting rendering by the device at least one segment of the received time-dependent data stream, following an event, the event related to information in the at least one segment descriptor; and rendering at least one segment in accordance to information in the descriptor.

[030] Another disclosed embodiment may comprise a device for rendering a time-dependent data stream, comprising: means for storing at least one time-dependent data stream, the data stream comprising at least one segment of data, each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; means for rendering a stored at least one segment; and means for detecting an event; wherein the device starts rendering at least one segment of the time-dependent data stream following an event, the event related to information in the at least one segment descriptor; and wherein the device renders at least one segment of the time-dependent data stream in accordance to information in the segment descriptor.

[031] In one embodiment, the device comprises a processor which may be used to perform functions consistent with disclosed embodiments. The processor may be a low-power high performance processor (e.g. ARM® Cortex®-M4 32b) able to perform the required tasks. The processor may comprise memory to store instructions and data (e.g. program parameters, measurement results, system configuration, system identification, time-dependent data streams comprising video and audio data, etc.). The memory may comprise random access memory (RAM), flash memory, ROM, SDRAM, DRAM, and the like.

[032] The device processor may also comprise interfaces to communicate with other functions and modules in the device.

[033] According to certain embodiments of the disclosed device, the device may comprise:

a. Power from an internal battery, capacitor or be powered from an external power source

b. Long/local range wireless communication and antennas

c. Short range wireless communication (e.g. NFC, BLE, etc.)

d. Wired communication (e.g. USB, Ethernet, etc.)

e. USB/wireless charging

f. Memory (RAM, flash memory, etc.) and/or other data storage components

g. A processor optionally including additional digital, analog and RF HW

h. Sensors (accelerometer, inclinometer, compass, light, sound, etc.)

i. Audio/video reproduction and/or recording means, including codecs, speaker, microphone, video camera, screen, etc.

j. Lights, actuators, relays, motors and similar components or modules

[034] According to another embodiment of the disclosed methods, the time-dependent data stream may comprise the voice and/or music related to a book which is read by a single or plurality of rendering devices (e.g. puppets). [035] According to one embodiment, the book may be on a tablet (e.g. iPad), smartphone, laptop, notebook, or printed as a paper book, for example comprising sensors to sense an open page or with a touch button on each page, the paper book also comprising a wireless transmitter.

[036] Further in accordance with this embodiment, opening the book to a certain page or displaying that page in an electronic book may trigger the book to transmit wireless messages that may be received by the rendering devices (e.g. puppets) to render a time-dependent data stream related to that page in the book.

[037] According to one embodiment, the device is a compact unit which may be inserted into a toy also by a third party (i.e. not the toy or device manufacturer). Further in accordance with this embodiment, the device is a generic device which can fit many different toys and/or different characters. Each toy may have an RFID tag which may be read by the device embedded in the toy and based on the read information, the device can program one or more of the following:

a. Toy's character (e.g. Mickey Mouse, Donald Duck, etc.)

b. Toy's HW and interface (e.g. this may be an option if the toy has HW interfaced to the device)

c. Enabled HW (e.g. voice, LEDs, sensors, etc.)

d. Device activity characteristics (e.g. communication with other toys or objects, sound volume, etc.)

e. Status of play progress. The toy may retrieve from the RFID tags previously stored information comprising the status or progress of a play (e.g. points achieved, game level completed, goals completed, weapons or accessories acquired, etc.). The RFID tags may comprise communication means in accordance to NFC, BLE, ISO/IEC 18000, etc.

[038] According to one embodiment, the toys may store back in the RFID tags information comprising the status or progress of a play (e.g. points achieved, game level completed, goals completed, weapons or accessories acquired, etc.). That way, a player playing with a specific toy may continue playing with that toy from the point at which the game was interrupted or continue the game with another toy. In another embodiment, the status or progress information may be stored in a memory (e.g. flash memory) inside the toys.
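
Purely as an illustrative sketch of this idea, a generic embedded device could read a small structured payload from the toy's RFID tag, configure itself accordingly and later write the play progress back. The JSON payload and all field names (character, enabled_hw, progress, etc.) are assumptions for illustration only.

    import json

    def configure_from_tag(device, tag_payload: bytes) -> dict:
        """Program a generic embedded device from an RFID tag read (illustrative only)."""
        info = json.loads(tag_payload.decode("utf-8"))
        device.character = info.get("character")        # toy's character
        device.enabled_hw = info.get("enabled_hw", [])  # e.g. ["voice", "leds", "sensors"]
        device.volume = info.get("volume", 5)           # activity characteristics
        return info.get("progress", {})                 # previously stored play status

    def store_progress_to_tag(write_tag, progress: dict) -> None:
        """Write the current play status back to the tag so play can resume later."""
        write_tag(json.dumps({"progress": progress}).encode("utf-8"))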

[039] According to another embodiment, it may be possible to change the RFID tag in the toy to let the toy behave differently in accordance with the information stored in the RFID tag (i.e. by reading the information in the RFID tag).

[040] According to another embodiment, the RFID reader inside the device may communicate with RFID tags attached to accessories in the proximity of the toy. For example, after assembling or attaching to a toy different accessories (e.g. tools, weapons, dresses, etc.), the toy will recognize those accessories and behave accordingly.

[041] According to one embodiment, a toy may also have direct communication with an object comprising wireless communication means (e.g. an object adapted to IoT). Further according to this embodiment, a toy may interact with smart furniture, appliances and other devices inside smart rooms, such as lights, to create a new playing experience.

[042] According to one embodiment, a system comprising one or a plurality of devices, the devices comprising means for self-location in an area or means to enable other units to locate them in an area.

[043] Further in accordance with this embodiment, the location of a device or a plurality of devices may trigger the rendering of a data stream in one or more rendering devices.

[044] According to one embodiment, locating the devices may use measuring parameters related to the wireless signals (e.g. Received Signal Strength (RSS), Time of Arrival (TOA), Angle of Arrival (AoA), etc.) transmitted by one or more devices.

[045] According to another embodiment, some of the devices in a plurality of devices may estimate their relative distance by performing ranging based on Received Signal Strength Indication (RSSI) measurement, two-way ranging (TWR), symmetrical double-sided two-way ranging (SDS-TWR), or measurements based on the TOA (Time of Arrival) and TOT (Time of Transmission) of wireless signals.
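
For illustration, the following sketch shows the standard two-way ranging computation from TOT/TOA timestamps; the timestamp names are hypothetical, and a symmetrical double-sided exchange would simply repeat the procedure in the opposite direction and average the results.

    SPEED_OF_LIGHT = 299_792_458.0  # metres per second

    def twr_distance(t1: float, t2: float, t3: float, t4: float) -> float:
        """Estimate the distance between two devices from a two-way ranging exchange.

        t1: device A transmits the ranging frame (TOT at A)
        t2: device B receives it (TOA at B)
        t3: device B transmits its reply (TOT at B)
        t4: device A receives the reply (TOA at A)
        """
        round_trip = t4 - t1            # measured at device A
        reply_delay = t3 - t2           # measured at device B and reported back to A
        time_of_flight = (round_trip - reply_delay) / 2.0
        return time_of_flight * SPEED_OF_LIGHT

    # Example: a one-way flight of 10 ns corresponds to roughly 3 metres
    # print(twr_distance(0.0, 10e-9, 100e-6 + 10e-9, 100e-6 + 20e-9))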

[046] According to another embodiment, some of the devices may comprise a GPS receiver to provide a global location. [047] Another disclosed embodiment may comprise a computer-readable medium containing instructions for controlling a processor in at least one device to perform a method of rendering at least one time-dependent data stream, the method comprising the steps of: storing in the device at least one time-dependent data stream, the data stream comprising at least one segment of data, each at least one data segment comprising a data stream to be rendered by the device and at least one segment descriptor, each at least one segment descriptor providing information for rendering the data stream of the respective segment; starting rendering by the device a stored at least one segment, following an event, the event related to information in the at least one segment descriptor; and rendering at least one segment in accordance to information in the descriptor.

[048] According to one embodiment the instructions for controlling a processor may consist of a computer program which:

a. May be executed on a plurality of different devices, comprising different HW functions and embedded in different toys or modules.

b. May render different time-dependent data streams: the computer program may be a generic program able to render a data stream in accordance to the data embedded in the data stream and other data provided from sensors, messages received through a wireless link, etc.

[049] For example, the computer program in two devices embedded in different toys (e.g. a puppet and a book) may be the same while the stored time-dependent data stream, the sensors, the audio/video HW and other functions in the device may be different. [050] This characteristic of the computer program may provide to toys the ability to change their behavior (e.g. represent two different features in a comic story) in accordance to the stored time-dependent data stream and without modifying the computer program itself.

[051] Other embodiments are disclosed that are configured to perform similar and other aspects than those exemplified above. It is to be understood that the disclosed embodiments are not limited to the details of construction and to the arrangements set forth in the following description or illustrated in the drawings. The disclosed embodiments may comprise additional aspects in addition to those described and are capable of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein, as well as in the abstract, are for the purpose of description and should not be regarded as limiting.

[052] The accompanying drawings, which are incorporated and constitute part of the specification, illustrate certain embodiments of the disclosure, and together with the description, serve to explain exemplary principles of the disclosed embodiments.

[053] As such, those skilled in the art will appreciate that the conception upon which this disclosure is based may readily be utilized as a basis for designing other structures, methods, and/or systems for carrying out the several purposes of the present disclosure. It is important, therefore, to recognize that the claims should be regarded as including such equivalent constructions insofar as they do not depart from the spirit and scope of the present disclosure.

Brief Description of the Drawings

[054] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate various exemplary aspects of the disclosure and, together with the description, explain disclosed principles.

[055] FIG. 1 is a diagram illustrating an exemplary system that may be used to implement certain aspects of the disclosed embodiments;

[056] FIG. 2 is a diagram illustrating an exemplary structure of a time-dependent data stream consistent with certain disclosed embodiments;

[057] FIG. 3 is a diagram illustrating an exemplary system that may be used to implement certain aspects of the disclosed embodiments;

[058] FIG. 4 is a diagram illustrating an exemplary system comprising a plurality of devices that may be used to implement certain aspects of the disclosed embodiments;

[059] FIG. 5 is a diagram illustrating an exemplary system that may be used to implement certain aspects of the disclosed embodiments;

[060] FIG. 6 is a diagram illustrating an exemplary system comprising a book and a toy consistent with certain disclosed embodiments;

[061] FIG. 7 is a diagram illustrating an exemplary system comprising an electronic tablet and a toy consistent with certain disclosed embodiments; [062] FIG. 8 is a diagram illustrating an exemplary system that may be used to implement certain aspects of the disclosed embodiments; and

[063] FIG. 9 is a block diagram illustrating exemplary functions comprised in a data stream rendering device consistent with disclosed embodiments.

Detailed Description of Preferred Embodiments

[064] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well as the singular forms, unless the context clearly indicates otherwise.

[065] It will be further understood that the terms "comprises" and/or "comprising" and/or "includes" and/or "including" when used in this specification, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.

[066] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one having ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present disclosure and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.

[067] In describing embodiments of the invention, it will be understood that a number of techniques and steps are disclosed. Each of these has individual benefit and each can also be used in conjunction with one or more, or in some cases all, of the other disclosed techniques. Accordingly, for the sake of clarity, this description will refrain from repeating every possible combination of the individual steps in an unnecessary fashion.

[068] Nevertheless, the specification and claims should be read with the understanding that such combinations are entirely within the scope of the invention.

[069] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be evident, however, to one skilled in the art that the present invention may be practiced without these specific details.

[070] The present disclosure is to be considered as an exemplification of the invention, and is not intended to limit the invention to the specific embodiments illustrated by the figures or description below.

[071] Reference will now be made in detail to the present embodiments of the disclosure, certain examples of which are illustrated in the accompanying drawings. [072] FIG. 1 is a diagram illustrating an exemplary system 1 that may be used to implement certain aspects of the disclosed embodiments. The type, number, and arrangement of devices, components, and elements illustrated in FIG. 1 may vary consistent with the disclosed embodiments.

[073] In one embodiment, system 1 may comprise at least one toy 10 which may comprise an embedded device 13 with wireless communication means to communicate 6 with a wireless router 2. According to one embodiment the wireless router 2 may be also connected 3 to the Internet 5 through a communication network 4.

[074] The wireless communication 6 between the device 13 (embedded in toy 10) and the network 4 may be either unidirectional, from the network 4 to the toy 10, or bidirectional, and in accordance to one or more wireless communication technologies including IEEE802.11a/b/g/n/ac/ad, IEEE802.15.4, IEEE802.15.4a, Bluetooth 4.0/5.0, Wireless HART, NB-IoT, LTE Cat0 for M2M, LTE-M Rel. Cat-0, Cat-1 and Cat-4, LoRaWAN™, Low Power Wide Area Network (LPWAN), Weightless-N, Weightless-P, Weightless-W, Z-Wave, Low frequency near field, or any other known wireless network that facilitates wireless communications between elements.

[075] In accordance to this embodiment, the toy 10 may receive wireless messages comprising information (e.g. time-dependent data streams 21) originated in the Internet 5. According to another embodiment, the same wireless communication link 6 may be used to send data from the toy 10 to the Internet 5 and/or other devices connected to the communication network 4. [076] According to one embodiment, the wireless communication link 6 may be used to transfer to the toy 10 a time-dependent data stream 21 to be stored in a memory of a device 13 embedded in the toy 10. Further according to this embodiment, the time-dependent data stream 21 may be stored in the device in an internal flash memory or RAM.

[077] The embedded device 13 may comprise means to render data streams 21. For example, according to one embodiment, the time-dependent data stream 21 may comprise music and/or voice 14 which may be reproduced with an audio decoder, an audio amplifier and a speaker 15.

[078] According to one embodiment the time-dependent data stream 21 format may be a proprietary format or in accordance to standard formats such as the EPUB ver. 3.0 format.

[079] According to one embodiment, the embedded device 13 may comprise means to move the toy legs 11 or arms 12 in accordance to information provided in the stored data stream 21. Further according to another embodiment, the device 13 may comprise means to activate lights 16 which may be in the toy's head.

[080] According to one embodiment, the instructions for rendering a data stream 21 which may also include the movement of legs 11 or arms 12, activation of lights 16 and other functions may be specified in segment descriptors 22b-25b which are part of the data stream 21. [081] According to certain embodiments, the embedded device 13 may start rendering a data stream 21 after the occurrence of an event. For example, referring to Fig. 1, the event may be the reception of a wireless message 6 or a trigger from a motion sensor in the device 13.

[082] Referring now to FIG. 2, a diagram illustrating an exemplary structure of a time-dependent data stream 21 consistent with certain disclosed embodiments is shown.

[083] According to one embodiment the time-dependent data stream may comprise at least one or a plurality of segments 22a-25a. Further according to one embodiment each segment 22a-25a may comprise a respective segment descriptor 22b-25b and a segment data 22c-25c. Optionally the data stream may also comprise a data stream descriptor 26.

[084] For the sake of simplicity, in this application the term "rendering a data stream segment 22a-25a" means rendering the data 22c-25c embedded in the respective segment 22a-25a in accordance with information in the respective segment descriptor 22b-25b. Such a rendering may comprise playing a sound or voice clip, recording a voice, activating an actuator, etc.

[085] According to certain embodiments, the segment descriptor 22b-25b may comprise:

a. Time or time offset to start the rendering of a time-dependent data stream segment 22a-25a.

b. Parameters related to the rendering of a data stream segment 22a-25a: volume, pitch, intensity, motion parameters, etc.

c. Description of an event required to start the rendering of the data stream segment 22a-25a:

i. End of rendering of a previous segment

1. Segment descriptors 22b-25b may point to the next segment 22a-25a to be rendered

2. Segment descriptors 22b-25b may point to multiple next segments 22a-25a to be rendered while the selection is performed by an event

ii. Specification of an event (e.g. receiving a wireless message 6 or trigger from a sensor in the device 13)

[086] According to another embodiment, the information in the segment descriptor 22b-25b can be modified in real time by:

a. Other segment descriptors 22b-25b

b. Occurrence of an event

[087] The descriptor 22b-25b format may be binary, text, in accordance with XML or SMIL, or any other format. Optionally the time-dependent data stream 21 may comprise a descriptor 26 which may describe features and contents of the data stream 21.
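
As an illustrative sketch only, a rendering loop in the device 13 might walk the stored segments, wait for each segment's start condition, apply its rendering parameters and follow the descriptor's pointer to the next segment. The dictionary keys and callables used below are assumptions, not the application's format.

    import time

    def render_stream(segments, render_data, wait_for_event):
        """Walk a stored data stream segment by segment (illustrative only).

        segments: list of dicts {"descriptor": {...}, "data": bytes}
        render_data: callable(data, params) that plays or executes the segment data
        wait_for_event: callable(event_name) that blocks until the named event occurs
        """
        index = 0
        while index is not None and 0 <= index < len(segments):
            descriptor = segments[index]["descriptor"]
            trigger = descriptor.get("trigger")
            if trigger and trigger != "end_of_previous_segment":
                wait_for_event(trigger)             # e.g. wireless message 6, sensor trigger
            offset = descriptor.get("start_offset_s", 0)
            if offset:
                time.sleep(offset)                  # honour the descriptor's time offset
            render_data(segments[index]["data"], descriptor.get("params", {}))
            index = descriptor.get("next_segment")  # None ends the rendering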

[088] Many types of events may be defined to start rendering by the device 13 a stored segment 22a-25a from the data stream 21.

[089] According to certain embodiments, the event may comprise:

a. Receiving a wireless message 6 (e.g. from another fixed/mobile device, network, etc.). The wireless message 6 may have the following characteristics:

i. Encrypted, protected, requiring authentication

ii. In accordance to IEEE802.11a/b/g/n/ac/ad, IEEE802.15.4, IEEE802.15.4a, Bluetooth 4.0/5.0, Wireless HART, NB-IoT, LTE Cat0 for M2M, LTE-M Rel. Cat-0, Cat-1 and Cat-4, LoRaWAN™, Low Power Wide Area Network (LPWAN), Weightless-N, Weightless-P, Weightless-W, Z-Wave, or with any other wireless transmission protocol.

iii. Using ultrasound, near field communication (NFC), RFID, diffused infrared, or any other wireless technology.

iv. Comprise information to a specific device (i.e. singlecast), group of devices (i.e. multicast) or all the devices in communication range (i.e. broadcast).

v. Transmitted from a wireless sensor which may be embedded in a toy 10 or any other device 13.

vi. Transmitted from an augmented reality device (e.g. glasses) that may be pointing to the rendering device (e.g. devices in the center of the field of view of the augmented reality glasses) or may comprise devices in the field of view of the augmented reality glasses.

b. Trigger from a sensor (e.g. internal or external) in the device 13 or connected to the device 13. The sensor may comprise:

i. Proximity, distance measurement sensors

ii. Voice/music recognition (e.g. command, song, etc.) and sound detection sensors

iii. Image recognition sensors

iv. Motion, accelerometer, inclinometer and shock sensors

v. Light sensor

vi. Touch sensor

c. Trigger from a switch, actuator, relay, etc.

d. Trigger from a received input read with a barcode reader

e. Trigger from an internal timer in the device 13

[090] According to another embodiment, the event may be waiting for an external decision and/or processing of an external device as follows:

a. An external processor may process data transmitted from the device 13

b. The external processor may transmit back to the device data related to the rendering of a segment 22a-25a in the device 13.

[091] According to certain embodiments, the time-dependent data streams are created by a special computer program.

[092] According to one embodiment, a PC program is used to compile streams of data 21 to be rendered by a device 13 including the insertion of the segment descriptors 22b-25b. The PC program may allow the user to create, edit and compile different data streams for different devices and/or applications.
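
The sketch below illustrates, under assumptions only, how such an authoring tool might pack segment payloads and their descriptors into a single stream file; the JSON/Base64 container shown is hypothetical and not the format used by the application.

    import base64
    import json
    import pathlib

    def compile_stream(segments, out_path: str) -> None:
        """Pack (descriptor, payload) pairs into one stream file (illustrative container).

        segments: iterable of (descriptor_dict, payload_bytes) pairs
        """
        packed = {"stream_descriptor": {"version": 1, "segment_count": 0}, "segments": []}
        for descriptor, payload in segments:
            packed["segments"].append({
                "descriptor": descriptor,
                "data": base64.b64encode(payload).decode("ascii"),
            })
        packed["stream_descriptor"]["segment_count"] = len(packed["segments"])
        pathlib.Path(out_path).write_text(json.dumps(packed, indent=2))

    # Example: one audio segment that starts on a wireless message
    # compile_stream([({"trigger": "wireless_message"}, b"audio-bytes")], "story.stream")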

[093] FIG. 3 is a diagram illustrating an exemplary system 40 that may be used to implement certain aspects of the disclosed embodiments. The type, number, and arrangement of devices, components, and elements illustrated in FIG. 3 may vary consistent with the disclosed embodiments.

[094] In one embodiment, a construction toy 41 (e.g. a construction toy comprising plastic bricks, small vehicles, animals, mini figures, etc.) may comprise a brick 42 in which a device 46 (similar to the device 13 described in Fig. 1) is embedded. [095] This brick 42 with the embedded device 46 may also comprise a few connectors required to provide power to the device 46 and to connect to the device electrical/electronic elements like LEDs 43, a small speaker 44, etc.

[096] The shape of the brick 42 may allow kids to assemble buildings, cars or other construction shapes while assembling the brick 42 with the embedded device 46 as part of the whole construction shape.

[097] In another embodiment, the construction toy 41 may comprise a plurality of different bricks 42 with embedded devices 46, each brick comprising interfaces to different functions (e.g. small motors, speaker, microphone, sensors, etc.).

[098] In one embodiment, construction toy 41 may comprise at least one brick 42 which may comprise an embedded device 46 with wireless communication means to communicate 6 with a wireless router 2. According to one embodiment the wireless router 2 may be also connected 3 to the Internet 5 through a communication network 4.

[099] The wireless communication 6 between the device 46 and the network 4 may be either unidirectional, from the network 4 to the toy 10, or bidirectional, similarly to the wireless communication described for device 13 in Fig. 1.

[0100] In accordance to this embodiment, the device 46 may receive wireless messages comprising information (e.g. time-dependent data streams 21) originated in the Internet 5. According to another embodiment, the same wireless communication link 6 may be used to send data from the device 46 to the Internet 5 and/or other devices connected to the communication network 4.

[0101] According to one embodiment, the wireless communication link 6 may be used to transfer to the device 46 a time-dependent data stream 21 to be stored in a memory in the device 46 embedded in the brick 42. Further according to this embodiment, the time dependent data stream 21 may be stored in the device in an internal flash memory or RAM.

[0102] The embedded device 46 may comprise means to render data streams 21. For example, according to one embodiment, the time-dependent data stream 21 may comprise music and/or voice 45 which may be reproduced with an audio decoder, an audio amplifier and a speaker 44.

[0103] According to one embodiment, the instructions for rendering a data stream 21 which may also include the activation of motors (not shown), activation of lights 43 and other functions, may be specified in segment descriptors 22b-25b which are part of the data stream 21.

[0104] Similarly to the device 13 in Fig. 1, the embedded device 46 may start rendering a data stream 21 after the occurrence of an event. For example, referring to Fig. 3, the event may be the reception of a wireless message 6 or a trigger from a switch (not shown) connected to the device 46.

[0105] FIG. 4 is a diagram illustrating an exemplary system 60 that may be used to implement certain aspects of the disclosed embodiments. The type, number, and arrangement of devices, components, and elements illustrated in FIG. 4 may vary consistent with the disclosed embodiments.

[0106] According to one embodiment, system 60 comprises a plurality of toys 10, 61 and 62, each of the toys 10, 61 and 62 comprising an embedded device 13, 69 and 70. The embedded devices 10, 61 and 62 may comprise wireless communication means to communicate 81 with a smartphone 72.

[0107] According to certain embodiments, the smartphone 72 may also comprise communication means 80 with a wireless router 2. According to one embodiment the wireless router 2 may be also connected 3 to the Internet 5 through a communication network 4.

[0108] The wireless communication 81 between the devices 13, 69 and 70 (embedded in their respective toys 10, 61 and 62) and the smartphone 72 may be either unidirectional, from the smartphone 72 to the toys 10, 61 and 62, or bidirectional, and in accordance to one or more wireless communication technologies including IEEE802.11a/b/g/n/ac/ad, IEEE802.15.4, IEEE802.15.4a, Bluetooth 4.0/5.0, NB-IoT, Low frequency near field, or any other known wireless communication protocols that facilitate wireless communications between smartphones 72 and other devices.

[0109] Further according to the described embodiment, time-dependent data streams are stored in a plurality of devices 13, 69 and 70, which could have been previously transferred 81 to the devices 13, 69 and 70 from the smartphone 72 or transferred using other interfaces (e.g. USB). [0110] According to one embodiment, the smartphone 72 may transmit periodic wireless beacons 81. Those wireless beacons 81 may be received by the plurality of devices 13, 69 and 70 which may synchronize their internal time base (e.g. real time clock) in accordance to the received beacons 81. These techniques to synchronize wireless devices are well known in local area networks (e.g. IEEE802.11x) and beyond the scope of this application. Those synchronization techniques may allow synchronizing a plurality of devices 13, 69 and 70 to an accuracy of a few microseconds or even less.

[0111] Having a synchronized plurality of devices 13, 69 and 70 is important when rendering in the devices 13, 69 and 70 time-dependent data streams which require accurate mutual synchronization. For example, and according to one embodiment, toys 10, 61 and 62 may simulate a conversation in which each toy 10, 61 and 62 may render at a specific time a segment 22a-25a of a data stream 21 thus providing the effect of a conversation between the toys. In another embodiment, the toys 10, 61 and 62 may render simultaneously a data stream segment 22a-25a comprising voice and music thus creating the effect of a chorus composed by the toys 10, 61 and 62. The level of time synchronization between the toys required in this last embodiment is high.
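
As an illustrative sketch under simple assumptions (the beacon 81 carries the sender's transmit timestamp and the propagation delay is negligible or estimated separately), each device could correct a local time base as follows; the class and method names are hypothetical.

    class TimeBase:
        """Minimal local time base corrected by periodic wireless beacons (illustrative)."""

        def __init__(self) -> None:
            self.offset = 0.0  # correction applied to the local clock, in seconds

        def on_beacon(self, beacon_tx_time: float, local_rx_time: float,
                      propagation_delay: float = 0.0) -> None:
            # Align the local clock with the beacon sender's clock
            self.offset = (beacon_tx_time + propagation_delay) - local_rx_time

        def now(self, local_time: float) -> float:
            return local_time + self.offset

    # Devices receiving the same beacons converge to a common time base, so their
    # respective segments can be started at agreed instants (conversation or chorus).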

[0112] According to one embodiment, a user of smartphone 72 may activate an application to control one or more of the toys 10, 61 and 62 in communication range 81. This application may show on the screen 73 one or more of the connected toys and may provide means (e.g. icons on the screen 74) to activate and/or perform a variety of functions with the toys 10, 61 and 62.

[0113] According to another embodiment, smartphone 72 may get from the Internet 5 files which may be used in relation to the connected toys 10, 61 and 62. Those files may include different applications, time dependent data streams, etc.

[0114] According to one embodiment, the devices 69-70 in toys 61-62 may also communicate directly. Further according to this embodiment, a data stream in a device 69 may comprise transmitting a wireless message 82 to another device 70. This received wireless message 82 may be used by device 70 as a trigger to start rendering a specific segment in its stored data stream 21. This direct communication mechanism allows the creation of a large variety of applications in which the toys interact, thus creating a real-life effect.

[0115] According to another embodiment, the direct communication 82 between the toys 69-70 may be used to transfer messages between the toys but also to measure the proximity between the toys 69-70. According to one embodiment, the RSSI of the wireless message 82 received by device 70 may be used to estimate the proximity between toys 69 and 70. This proximity may be used to trigger an event in device 70.

[0116] According to one embodiment, a toy 69 may also have direct communication 83 with an object comprising wireless communication means (e.g. an object adapted to IoT). Further according to this embodiment, a toy 69 may interact with furniture and other devices 75 inside smart rooms, such as lights, to create a new playing experience.

[0117] In another embodiment, toys 69-70 may be able to perform two-way ranging (TWR) by exchanging wireless signals 82 and measuring the Time of Arrival (TOA) and measuring or setting the Time of Transmission (TOT).

[0118] For example, a typical scenario may be the following: Device 69 in toy 61 may start rendering a data stream which comprises transmitting messages 82 once the toy 61 was moved (e.g. sensed by a motion sensor). When toy 61 is close enough to toy 62, the RSSI in device 70 may exceed a threshold which may trigger the rendering of a segment in device 70 comprising the reproduction (using a speaker 67) of a voice clip 68 saying "Hello".
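The scenario above can be summarized end to end in a short sketch; the message format, threshold and clip name are illustrative assumptions.

```python
# End-to-end sketch of the "Hello" scenario: toy 61 starts transmitting when
# its motion sensor fires, and toy 62 plays its greeting segment once the
# received RSSI crosses the proximity threshold.
HELLO_THRESHOLD_DBM = -55.0

def device_69_on_motion() -> dict:
    """Toy 61 was moved: begin the stream whose segments are radio messages."""
    return {"src": "device_69", "segment_id": "greet"}

def device_70_on_message(message: dict, rssi_dbm: float) -> None:
    if message["segment_id"] == "greet" and rssi_dbm > HELLO_THRESHOLD_DBM:
        print('speaker 67: "Hello"')

msg = device_69_on_motion()
device_70_on_message(msg, rssi_dbm=-48.0)   # toys are close enough
```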

[0119] Additional embodiments based on the principles described in Fig. 4 may comprise more complex scenarios in which toys communicate directly and/or with a smartphone 72 to create real-life scenes.

[0120] In other embodiments, smartphone 72 may be replaced by a laptop, notebook, tablet, smart TV or similar devices comprising suitable wireless communication means.

[0121] FIG. 5 is another diagram illustrating an exemplary system 90 that may be used to implement certain aspects of the disclosed embodiments. The type, number, and arrangement of devices, components, and elements illustrated in FIG. 5 may vary consistent with the disclosed embodiments.

[0122] According to one embodiment, the toys 10, 61-62 described in Fig. 4 are also part of system 90. Further according to this embodiment, a remote control unit 85 is used to transmit wireless messages 87 to the toys 10, 61-62 and provide the user (e.g. a kid) a certain level of control 86 over the toys 10, 61-62. Using a remote control 85 may be preferable when the users are small kids. In those cases, the parents may want to allow the execution of a limited set of functions which are suitable for the kids (e.g. activating/deactivating the toys).

[0123] In another embodiment, the remote control may be further used to transmit wireless beacons 87 that may be used to time-synchronize the devices 13, 69 and 70 embedded in the toys 10, 61-62, as previously described in Fig. 4.

[0124] According to one embodiment, the remote control unit 85 may be a small portable unit able to transmit wireless signals 87 at periodic intervals and/or when a specific key 86 is pressed. The wireless signals may be:

a. In accordance to IEEE802.11a/b/g/n/ac/ad, IEEE802.15.4, IEEE802.15.4a, Bluetooth 4.0/5.0, Wireless HART, NB-IoT, LoRaWAN™, Low Power Wide Area Network (LPWAN), Weightless-N, Weightless-P, Weightless-W, Z-Wave, or with any other wireless transmission protocol.

b. Using ultrasound, near field communication (NFC), RFID, diffused infrared, or any other wireless technology.

[0125] FIG. 6 is a diagram illustrating an exemplary system 100 that may be used to implement certain aspects of the disclosed embodiments. The type, number, and arrangement of devices, components, and elements illustrated in FIG. 6 may vary consistent with the disclosed embodiments.

[0126] In one embodiment, system 100 comprises at least one toy 62 (similar to other toys described in previous figures) and a book 95 which may comprise a device 98 with wireless communication means 96.

[0127] According to one embodiment, the device 98 in the book 95 may communicate with the embedded device 70 in the toy 62.

[0128] Further according to this embodiment, the device 98 in the book 95 may transmit messages 96 which may comprise information related to an open page in the book (e.g. page ID).

[0129] This information may be provided to the device 98 in the book by sensors embedded in the book 95 or from a touch button 97 available in certain or all pages of the book 95.

[0130] According to certain embodiments of system 100, the device 70 in toy 62 may receive the messages 96 and trigger the rendering of a specific segment 22a-25a in a data stream 21. The contents of the specific segment may correlate to the contents of the open page in the book 95.

[0131] For example, a small kid may open page #1 in a book 95, which may trigger the transmission of a message 96 comprising the page number. The device 70 may receive this message 96 and start rendering an audio clip that relates 68 to the kid (using a speaker 67) a story in accordance to page #1 in the book 95.
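For illustration, the page-triggered playback may be reduced to a lookup from the received page ID to a stored segment; the message fields and the page-to-clip mapping below are assumptions made for the example.

```python
# Sketch of page-triggered playback in device 70: a message 96 carrying the
# open page's ID selects the matching audio segment.
PAGE_TO_CLIP = {1: "page1_story.wav", 2: "page2_story.wav"}

def on_book_message(message: dict) -> None:
    clip = PAGE_TO_CLIP.get(message.get("page_id"))
    if clip:
        print(f"playing {clip} through the speaker")

on_book_message({"src": "book_95", "page_id": 1})
```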

[0132] FIG. 7 is another diagram illustrating an exemplary system 110 (similar to system 100 depicted in Fig. 6) that may be used to implement certain aspects of the disclosed embodiments. The type, number, and arrangement of devices, components, and elements illustrated in FIG. 7 may vary consistent with the disclosed embodiments.

[0133] In one embodiment, system 110 comprises at least one toy 61-62 (similar to other toys described in previous figures) and an electronic tablet 111 with a book reader application 113. The tablet 111 may comprise wireless communication means which may communicate 112 with the embedded devices 69-70 in toys 61-62 respectively.

[0134] Further according to this embodiment, the tablet 111 may transmit messages 112 which may comprise information related to an open page in the book reader application (e.g. page ID).

[0135] According to certain embodiments of system 110, the device 70 in toy 62 may receive the messages 112 and trigger the rendering of a specific segment 22a-25a in a data stream 21. The contents of the specific segment may correlate to a specific line of text of the open page in the book reader application 113. Similarly, device 69 in toy 61 may receive at a different time messages 112 from the tablet 111 and trigger the rendering of a specific segment 22a-25a in a data stream 21. The contents of that specific segment rendered in toy 61 may correlate to another line of text of the open page in the book reader application 113. That way, the effect of a story being told by several characters (i.e. the toys) may be achieved. According to another embodiment, in some cases the devices 69-70 in toys 61-62 may render segments 22a-25a of a data stream 21 simultaneously or with an overlap.

[0136] In another embodiment, the tablet 111 provides to the user additional means to control toys 61-62. For example, wireless messages 112 may be used to change the volume of the voices 65, 68 in toys 61-62. In certain embodiments, the tablet may synchronize (using wireless messages 112) the rendering of segments in devices 69-70 to the highlighting of the corresponding text in the displayed page by the book reader 113.
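One illustrative way to realize this line-by-line casting is a table mapping (page, line) pairs to a target device and segment, which the tablet uses both to address the toy and to highlight the same line; the mapping and message format below are assumptions for the example.

```python
# Sketch of multi-toy storytelling driven by a book reader: each line of the
# open page is mapped to the toy that should "speak" it, so the tablet can
# address the corresponding device and highlight the same line.
LINE_CAST = {
    (1, 1): ("device_69", "line1.wav"),   # page 1, line 1 -> toy 61
    (1, 2): ("device_70", "line2.wav"),   # page 1, line 2 -> toy 62
}

def dispatch_line(page: int, line: int) -> None:
    target, clip = LINE_CAST[(page, line)]
    message = {"dst": target, "segment": clip}
    print(f"send {message}, highlight page {page} line {line}")

dispatch_line(1, 1)
dispatch_line(1, 2)
```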

[0137] According to one embodiment, toys 61-62 may also comprise an RFID tag (e.g. NFC tag) 115-116, embedded in the toy itself (e.g. hidden in the toy's clothes). Once activated, the toy devices 69-70 can read the information in the RFID tag 115-116 and, based on that information, program the activities of the toy device 69-70. In one embodiment, the information in the RFID tag 115-116 can program one or more of the following:

a. Toy's character (e.g. Mickey Mouse, Donald Duck, etc.)

b. Toy's HW and interface (e.g. this may be an option if the toy has HW interfaced to the device 69-70)

c. Enabled HW (e.g. voice, LEDs, sensors, etc.)

d. Device activity characteristics (e.g. communication with other toys, sound volume, etc.)

e. Status of play progress. The toy 61-62 may retrieve from the RFID tags 115-116 previously stored information comprising the status or progress of a play (e.g. points achieved, game level completed, goals completed, weapons or accessories acquired, etc.).

[0138] According to one embodiment, the toys 61-62 and the devices 69-70 are purchased separately. The devices 69-70 are generic devices which can behave differently in accordance to information read from an RFID tag 115-116. Further according to such an embodiment, the user assembles one of the generic devices 69-70 inside each of the toys 61-62, and at the moment the devices 69-70 are activated they read the information in the RFID tags 115-116. As may be apparent to those skilled in the art, this feature allows a significant degree of flexibility in the smart-toy market, since those toys can just be sold with a small compartment for an embedded device 69-70 and with an RFID tag 115-116 specifying the characteristics of the toy, as illustrated in the sketch below.
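For illustration, the activation-time configuration of a generic device from its toy's tag may be sketched as merging tag-supplied values over generic defaults; the tag layout below is hypothetical and simply mirrors the categories listed above.

```python
# Sketch of configuring a generic device from its toy's RFID/NFC tag at
# power-up; the key/value layout is hypothetical.
DEFAULTS = {"character": "generic", "enabled_hw": ["voice"], "volume": 5}

def read_tag() -> dict:
    # Stand-in for an actual NFC read; returns the tag's key/value payload.
    return {"character": "pirate", "enabled_hw": ["voice", "leds"], "volume": 7}

def configure_device() -> dict:
    config = dict(DEFAULTS)
    config.update(read_tag())   # tag values override the generic defaults
    return config

print(configure_device())
```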

[0139] According to one embodiment, the toys 61-62 may store back in the RFID tags 115-116 information comprising the status or progress of a play (e.g. points achieved, game level completed, goals completed, weapons or accessories acquired, etc.). That way, a player may continue playing with a specific toy 61-62 from the point at which the game was interrupted, or continue the game with another toy. In another embodiment, the status or progress information may be stored in a memory (e.g. flash memory) inside the toys 61-62.
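A minimal sketch of saving and restoring such progress, whether to the RFID tag 115-116 or to internal flash, might look as follows; the progress fields are illustrative.

```python
# Sketch of saving and restoring play progress to a small key/value store
# standing in for the toy's RFID tag or internal flash.
import json

def save_progress(storage: dict, progress: dict) -> None:
    storage["progress"] = json.dumps(progress)   # serialize before writing

def load_progress(storage: dict) -> dict:
    return json.loads(storage.get("progress", "{}"))

tag_memory = {}                                   # stand-in for tag/flash storage
save_progress(tag_memory, {"points": 120, "level": 3, "accessories": ["sword"]})
print(load_progress(tag_memory))                  # resume from here, on any toy
```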

[0140] According to another embodiment, it may be possible to change the RFID tags 115-116 in the toy to let the toy behave differently in accordance with the information stored in the RFID tags 115-116 (i.e. by reading the information in the RFID tag).

[0141] According to another embodiment, the RFID reader inside the toys 61-62 may communicate with other RFID tags attached to accessories (not shown) in the proximity of the toy. For example, after assembling or attaching to a toy different accessories (e.g. tools, weapons, dresses, etc.), the toys 61-62 will recognize those accessories and behave accordingly.

[0142] The toy's devices 69-70 can be sold separately and by many manufacturers, without the need for a link (during the manufacturing process) between the toy manufacturer and the device manufacturer (i.e. if they are not the same).

[0143] According to one embodiment, the device may communicate 114 with a smart multimedia unit 117 (e.g. smart TV, game consoles, tablets, smartphones, etc.) or with virtual video platforms (e.g. web-based ones like YouTube, etc.) in different ways.

[0144] According to one embodiment, messages or sounds transmitted 114 by the toy 62 may be received by the smart multimedia unit 117 (e.g. smart TV) and trigger an action in the multimedia unit (e.g. start/pause/stop displaying a movie or clip, change the sequence of a movie, change setup in the multimedia unit, etc.). According to another embodiment, the toy 62 may receive messages or sounds transmitted 114 by a smart multimedia unit 117. Further according to this embodiment, the transmitted messages or sounds 114 may be generated by the multimedia unit 117 during the display of a movie/video clip or game and trigger in the toy 62 different actions.

[0145] FIG. 8 is a diagram illustrating an exemplary system 120 that may be used to implement certain aspects of the disclosed embodiments. The type, number, and arrangement of devices, components, and elements illustrated in FIG. 8 may vary consistent with the disclosed embodiments.

[0146] In one embodiment, the system 120 comprises one or more toys 61-62, each of them with embedded devices 69-70 able to render stored data streams 21.

[0147] In addition to the toys 61-62, an augmented reality device 121 (e.g. glasses) may comprise wireless communication means 122 able to communicate 123 with the embedded devices 69-70.

[0148] In one embodiment, the image of one of the toys 62 located in the field of view 124 may be recognized by the augmented reality device 121 and as a result the augmented reality device 121 may transmit a wireless message 123 addressed to that specific toy 62.

[0149] The reception of the wireless message 123 by the device 70 may trigger the rendering of a specific segment in a data stream 21 stored in the device 70.

[0150] Now, moving the field of view 124 of the augmented reality device 121 from toy 62 to toy 61 may stop the rendering of segments in toy 62 and start the rendering of segments in toy 61.

[0151] Very creative and amusing scenes may be created using the principles described. For example, system 120 may comprise a plurality of toys 61-62 and a plurality of augmented reality devices, each worn by a different kid.
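The field-of-view hand-over described above amounts to stopping the previously addressed toy and starting the newly recognized one; the following sketch is illustrative, and the message contents are assumed for the example.

```python
# Sketch of the field-of-view hand-over: when the AR device recognizes a new
# toy, it stops the previously addressed toy and starts the new one.
class ARController:
    def __init__(self):
        self.active_toy = None

    def on_toy_recognized(self, toy_id: str) -> None:
        if toy_id == self.active_toy:
            return
        if self.active_toy:
            print(f"send {{'dst': '{self.active_toy}', 'cmd': 'stop'}}")
        print(f"send {{'dst': '{toy_id}', 'cmd': 'start_segment'}}")
        self.active_toy = toy_id

ar = ARController()
ar.on_toy_recognized("toy_62")   # toy 62 enters the field of view
ar.on_toy_recognized("toy_61")   # view moves to toy 61: stop 62, start 61
```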

[0152] FIG. 9 is a block diagram illustrating exemplary functions comprised in a data stream rendering device 200 that may be used to implement certain aspects of the disclosed embodiments. The type, number, and arrangement of devices, components, and elements illustrated in FIG. 9 may vary consistent with the disclosed embodiments.

[0153] In one embodiment, the data stream rendering device 200 may comprise a processor 201 which may be used to control and monitor the device 200 functions consistent with disclosed embodiments. The device processor 201 may be a low-power, high-performance processor (e.g. a 32-bit ARM® Cortex®-M4) able to perform the device tasks. The processor 201 may also comprise a timer (e.g. RTC) needed to synchronize the rendering of certain segments in a data stream.

[0154] The device 200 may comprise memory 202 to store instructions and data (e.g. program parameters, device configuration, device identification, data streams with video and audio files, etc.). The memory 202 may comprise random access memory (RAM), flash memory, ROM, SDRAM, DRAM, and the like.

[0155] According to certain embodiments, the device 200 may also comprise:

a. Power from an internal battery or capacitor, or power from an external power source 206

b. Means (including antenna) 204 for long/local range wireless communication 220

c. Means (including antenna) 205 for short range wireless communication 221 (e.g. NFC, BLE, etc.)

d. Means 203 for wired communication 225 (e.g. USB, Ethernet, etc.)

e. USB/wireless charging

f. Sensors (accelerometer, inclinometer, compass, light, sound, etc.) 207

g. Audio/video reproduction and/or recording means, including codecs 208-209. Those functions may interface 222-223 with speakers, microphones, video cameras, screens, etc.

h. Drivers 210 to interface 224 with lights, external sensors, switches, actuators, relays, motors and similar components or modules.

[0156] The processor 201 may control its peripheral functions using control lines 217-218. In addition, certain peripheral functions may directly access 215-216 the device memory 202 to provide better device performance.

[0157] In certain aspects of the disclosed embodiments, the long/local range communication means 204 is in accordance to one or more wireless communication technologies including IEEE802.11a/b/g/n/ac/ad, IEEE802.15.4, IEEE802.15.4a, Bluetooth, Wireless HART, NB-IoT, LTE Cat-0 for M2M, LTE-M Rel. Cat-0, Cat-1 and Cat-4, Weightless-N, Weightless-P, Weightless-W and 3G and 4G cellular.

[0158] The short range communication means 205 is in accordance to one or more wireless communication technologies including infrared, ultrasonic communication, low frequency (e.g. 125 kHz) or high frequency (e.g. 13.56 MHz) RFID, NFC or any other suitable short range wireless communication technology. According to one embodiment, the short range communication may be bidirectional and allow transfer of information in both directions.

[0159] In accordance to certain embodiments, the device processor 201 may interface with a GPS receiver (not shown), compass and navigation sensors.

[0160] Processor 201 may also comprise one or more memory devices 202 that store software and/or program instructions that, when executed by the processor 201, perform one or more processes consistent with aspects of the disclosure.

[0161] Processor 201 may comprise other known computing components for performing known computing functions, such as executing software, storing and accessing information from memory, processing information, and generating, sending and receiving information over a communication link (e.g., communication links 220, 221 and 225).

[0162] The foregoing descriptions have been presented for purposes of illustration and description. They are not exhaustive and do not limit the disclosed embodiments to the precise form disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practicing the disclosed embodiments. For example, the described implementation comprises software, but the disclosed embodiments may be implemented as a combination of hardware and software. Additionally, although disclosed aspects are described as being stored in a memory on a computer, one skilled in the art will appreciate that these aspects can also be stored on one or more other types of tangible computer-readable media, such as secondary storage devices, like hard disks, CD-ROMs, or other forms of RAM, ROM, SDRAM, DRAM, Flash memory and the like.

[0163] Computer programs based on the written description and disclosed methods are within the capabilities of one of ordinary skill in the art. The various programs or program modules may be created using any of the techniques known to one skilled in the art, or may be designed in connection with existing software. For example, program sections or program modules can be designed in or by means of DirectX, .Net Framework, .Net Compact Framework, Visual Basic, C, XML, Java, C++, JavaScript, HTML, HTML/AJAX, or any other now known or later created programming language. One or more of such software sections or modules may be integrated into a computer system.

[0164] Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. The recitations in the claims are to be interpreted broadly based on the language employed in the claims and are not limited to the examples described in the present specification or during the prosecution of the application, which examples are to be construed as non-exclusive.

[0165] Further, the steps of the disclosed methods may be modified in any manner, including by reordering steps and/or inserting or deleting steps.

[0166] Further, since numerous modifications and variations will readily occur to those skilled in the art, it is not desired to limit the disclosure to the exact construction and operation illustrated and described, and accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope of the disclosure.