

Title:
SOUND SYSTEM FOR HUMANOID ROBOT
Document Type and Number:
WIPO Patent Application WO/2014/162162
Kind Code:
A1
Abstract:
A robot is configured to play an audio data file. The robot includes a torso section, a first leg, and a second leg. The first leg is coupled to the torso section through a first hip joint, wherein the first leg is configured to rotate about a first hip axis. The second leg is coupled to the torso section through a second hip joint, wherein the second leg is configured to rotate about a second hip axis. The robot further includes a first speaker disposed on a front area of the first leg. The robot includes a second speaker disposed on a front area of the second leg.

Inventors:
HO VINH HOANG (VN)
Application Number:
PCT/IB2013/000994
Publication Date:
October 09, 2014
Filing Date:
April 01, 2013
Assignee:
TOSY ROBOTICS JOINT STOCK COMPANY (VN)
HO VINH HOANG (VN)
International Classes:
A63H11/00; A63H3/04; A63H3/33; A63H3/36; A63H3/46; B25J11/00; B25J13/00
Foreign References:
JP2006289508A (2006-10-26)
JP2011140096A (2011-07-21)
JP2003159681A (2003-06-03)
JP2006247780A (2006-09-21)
JP2011000681A (2011-01-06)
JP2004236758A (2004-08-26)
JPH0316709Y2 (1991-04-10)
Claims:
WHAT IS CLAIMED IS:

1. A robot configured to play an audio data file, the robot comprising:

a torso section;

a first leg coupled to the torso section through a first hip joint, wherein the first leg is configured to rotate about a first hip axis;

a second leg coupled to the torso section through a second hip joint, wherein the second leg is configured to rotate about a second hip axis, and wherein the second leg is coupled to an opposite side of the torso section than the first leg;

a first speaker disposed on a first front area of the first leg; and a second speaker disposed on a second front area of the second leg.

2. The robot of claim 1, further comprising a third speaker coupled to the torso.

3. The robot of claim 2, wherein the third speaker is a subwoofer and the first and second speakers are tweeters.

4. The robot of claim 2, wherein the first speaker, the second speaker, and the third speaker are all configured to project sound in a same general direction.

5. The robot of claim 1, further comprising a first servo motor configured to rotate the first leg about the first hip axis with respect to the torso, wherein when the first leg rotates with respect to the torso, the first speaker also rotates about the first hip axis.

6. The robot of claim 5, further comprising a second servo motor configured to rotate the second leg about the second hip axis with respect to the torso, wherein when the second leg rotates with respect to the torso, the second speaker also rotates about the second hip axis.

7. The robot of claim 6, further comprising a third speaker coupled to the torso, wherein when either of the first leg or the second leg rotates, the position of the first speaker and the second speaker with respect to the position of the third speaker generally remains the same.

8. The robot of claim 1, further comprising one or more additional speakers disposed on one or more of the first front area or the second front area.

9. The robot of claim 1, further comprising a first rim coupled to the first speaker and a channel formed in the first leg, wherein the channel is designed to receive the first rim.

10. The robot of claim 9, further comprising a second rim coupled to the second speaker and a channel formed in the second leg, wherein the channel is designed to receive the second rim.

11. A robot configured to play an audio data file, the robot comprising:

a torso section;

a first leg coupled to the torso section through a first hip joint, wherein the first leg is configured to rotate about a first hip axis;

a second leg coupled to the torso section through a second hip joint, wherein the second leg is configured to rotate about a second hip axis, and wherein the second leg is coupled to an opposite side of the torso section than the first leg;

a first speaker having a generally circular front face, wherein the first speaker is coupled to the first leg;

a second speaker having a generally circular front face, wherein the second speaker is coupled to the second leg;

a third speaker coupled to the torso section;

a first arm coupled to the torso section through a first shoulder joint;

a second arm coupled to the torso section through a second shoulder joint; and

a head coupled to the torso through a neck, wherein the head is configured to rotate about a first head axis and a second head axis, wherein the first head axis and the second head axis are generally perpendicular;

wherein the first hip axis intersects a center point of the generally circular front face of the first speaker;

wherein the second hip axis intersects a center point of the generally circular front face of the second speaker; and

wherein the first speaker, the second speaker, and the third speaker are configured to project sound in a same general direction.

12. The robot of claim 11, wherein the third speaker is a subwoofer and the first and second speakers are tweeters.

13. The robot of claim 11, further comprising a first servo motor configured to rotate the first leg about the first hip axis with respect to the torso, wherein when the first leg rotates with respect to the torso, the first speaker also rotates about the first hip axis.

14. The robot of claim 13, further comprising a second servo motor configured to rotate the second leg about the second hip axis with respect to the torso, wherein when the second leg rotates with respect to the torso, the second speaker also rotates about the second hip axis.

15. The robot of claim 14, further comprising a servo controller configured to control the operation of the first servo motor and the second servo motor based on a received input.

16. The robot of claim 11, further comprising a first rim coupled to the first speaker and a channel formed in the first leg, wherein the channel is designed to receive the first rim.

17. The robot of claim 16, further comprising a second rim coupled to the second speaker and a channel formed in the second leg, wherein the channel is designed to receive the second rim.

18. The robot of claim 11, wherein the robot is a humanoid robot.

19. A method in a robot configured to play an audio data file, the method comprising:

receiving an audio data file through an interface of the robot;

storing the audio data file in a memory of the robot;

receiving a routine data file through the interface;

storing the routine data file in the memory;

associating the audio data file with the routine data file through a controller of the robot;

playing the audio data file through a speaker system of the robot, wherein the speaker system includes a first speaker located in a first leg of the robot, a second speaker located in a second leg of the robot, and a third speaker located in a torso of the robot; and

performing a routine during playing of the audio data file, wherein the routine includes moving a first arm of the robot, a second arm of the robot, the first leg, and the second leg with a plurality of motors in accordance with a routine sequence defined by instructions included in the routine data file.

20. The method of claim 19, wherein the robot is configured to transform to and from a folded position and an unfolded position;

wherein the first arm, the second arm, the first leg, and the second leg are folded against the torso in the folded position; and

wherein the first arm, the second arm, the first leg, and the second leg are extended away from the torso in the unfolded position.

21. The method of claim 20, further comprising transforming from the folded position to the unfolded position prior to playing the audio data file.

22. The method of claim 21, further comprising transforming from the unfolded position to the folded position after the playing of the audio data file is complete.

Description:
SOUND SYSTEM FOR HUMANOID ROBOT

BACKGROUND

[0001] Robots are electromechanical devices capable of performing at least partially automated tasks. Humanoid robots are types of robots that are designed to aesthetically resemble a human being. The actions of humanoid robots may also be programmed such that the humanoid robot performs actions similar to those of humans. Humanoid robots may be used for performing tasks, for research purposes, or for entertainment.

SUMMARY

[0002] One exemplary embodiment relates to a robot configured to play an audio data file. The robot includes a torso section. The robot further includes a first leg coupled to the torso section through a first hip joint, wherein the first leg is configured to rotate about a first hip axis. The robot includes a second leg coupled to the torso section through a second hip joint, wherein the second leg is configured to rotate about a second hip axis, and wherein the second leg is coupled to an opposite side of the torso section than the first leg. The robot further includes a first speaker having a generally circular front face, wherein the first speaker is coupled to the first leg. The robot further includes a second speaker having a generally circular front face, wherein the second speaker is coupled to the second leg. The first hip axis intersects a center point of the generally circular front face of the first speaker. The second hip axis intersects a center point of the generally circular front face of the second speaker.

[0003] Another exemplary embodiment relates to a robot configured to play an audio data file. The robot includes a torso section. The robot further includes a first leg coupled to the torso section through a first hip joint, wherein the first leg is configured to rotate about a first hip axis. The robot includes a second leg coupled to the torso section through a second hip joint, wherein the second leg is configured to rotate about a second hip axis, and wherein the second leg is coupled to an opposite side of the torso section than the first leg. The robot further includes a first speaker having a generally circular front face, wherein the first speaker is coupled to the first leg. The robot includes a second speaker having a generally circular front face, wherein the second speaker is coupled to the second leg. The robot further includes a third speaker coupled to the torso section. The robot includes a first arm coupled to the torso section through a first shoulder joint. The robot further includes a second arm coupled to the torso section through a second shoulder joint. The robot includes a head coupled to the torso through a neck, wherein the head is configured to rotate about a first head axis and a second head axis, wherein the first head axis and the second head axis are generally perpendicular. The first hip axis intersects a center point of the generally circular front face of the first speaker. The second hip axis intersects a center point of the generally circular front face of the second speaker. The first speaker, the second speaker, and the third speaker are configured to project sound in a same general direction.

[0004] Yet another exemplary embodiment relates to a method in a robot configured to play an audio data file. The method includes receiving an audio data file through an interface of the robot. The method further includes storing the audio data file in a memory of the robot. The method includes receiving a routine data file through the interface. The method further includes storing the routine data file in the memory. The method includes associating the audio data file with the routine data file through a controller of the robot. The method further includes playing the audio data file through a speaker system of the robot, wherein the speaker system includes a first speaker located in a first leg of the robot, a second speaker located in a second leg of the robot, and a third speaker located in a torso of the robot. The method further includes performing a routine during playing of the audio data file, wherein the routine includes moving a first arm of the robot, a second arm of the robot, the first leg, and the second leg with a plurality of motors in accordance with a routine sequence defined by instructions included in the routine data file.
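The receive-store-associate-play-perform flow summarized above can be sketched in a few lines of code. This is an illustrative sketch only; the class, method names, and file formats are hypothetical and are not defined by the application.

```python
# Illustrative sketch of the method of paragraph [0004]; class, method, and
# file names are hypothetical, not taken from the application.
class RobotController:
    def __init__(self):
        self.memory = {}          # stores received data files by name
        self.associations = {}    # maps audio file name -> routine file name
        self.log = []             # records playback and motion actions

    def receive_file(self, name, payload):
        """Receive a data file through an interface and store it in memory."""
        self.memory[name] = payload

    def associate(self, audio_name, routine_name):
        """Associate an audio data file with a routine data file."""
        self.associations[audio_name] = routine_name

    def perform(self, audio_name):
        """Play the audio file while executing the associated routine."""
        routine = self.memory[self.associations[audio_name]]
        self.log.append(("play", audio_name))
        for step in routine:                   # e.g. ("first_arm", 45)
            self.log.append(("move",) + step)  # drive motors per the routine

bot = RobotController()
bot.receive_file("song.mp3", b"...audio bytes...")
bot.receive_file("dance.rtn", [("first_arm", 45), ("second_leg", -30)])
bot.associate("song.mp3", "dance.rtn")
bot.perform("song.mp3")
print(bot.log)
```

The key point the sketch captures is the association step: the routine is not embedded in the audio file but linked to it, so either file can be replaced independently.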

[0005] Still another exemplary embodiment relates to a robot configured to play an audio data file. The robot includes a torso section. The robot further includes a first leg coupled to the torso section through a first hip joint, wherein the first leg is configured to rotate about a first hip axis. The robot includes a second leg coupled to the torso section through a second hip joint, wherein the second leg is configured to rotate about a second hip axis, and wherein the second leg is coupled to an opposite side of the torso section than the first leg. The robot further includes a first speaker disposed on a first front area of the first leg and second speaker disposed on a second front area of the second leg. [0006] The invention is capable of other embodiments and of being carried out in various ways. Alternative exemplary embodiments relate to other features and combinations of features as may be generally recited in the claims.

[0007] The foregoing is a summary, and thus, by necessity, contains simplifications, generalizations, and omissions of detail. Consequently, those skilled in the art will appreciate that the summary is illustrative only and is not intended to be in any way limiting. Other aspects, inventive features, and advantages of the devices and/or processes described herein, as defined solely by the claims, will become apparent in the detailed description set forth herein and taken in conjunction with the accompanying drawings.

BRIEF DESCRIPTION OF THE FIGURES

[0008] FIG. 1A is a perspective view schematic drawing of a humanoid robot according to an exemplary embodiment.

[0009] FIG. 1B is a front view schematic drawing of the humanoid robot of FIG. 1A according to the exemplary embodiment.

[0010] FIG. 2 is a perspective view schematic drawing of a leg of a humanoid robot having a portion of its body removed according to an exemplary embodiment.

[0011] FIG. 3 is a front view schematic drawing of a humanoid robot according to an exemplary embodiment.

[0012] FIG. 4 is a general block diagram of a system for controlling a humanoid robot according to an exemplary embodiment.

[0013] FIG. 5 is a front view schematic drawing of a humanoid robot according to an exemplary embodiment.

[0014] FIG. 6 is a front view schematic drawing of a humanoid robot according to an exemplary embodiment.

[0015] FIG. 7 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to an exemplary embodiment.

[0016] FIG. 8 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.

[0017] FIG. 9 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.

[0018] FIG. 10 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.

[0019] FIG. 11 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.

[0020] FIG. 12 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process according to the exemplary embodiment of FIG. 7.

[0021] FIG. 13 is a perspective view schematic drawing of a humanoid robot during a stage of a transformation process having a portion of its body removed according to the exemplary embodiment of FIG. 7.

[0022] FIG. 14 is a flow diagram of a method of uploading an audio data file and a routine data file to a server according to an exemplary embodiment.

[0023] FIG. 15 is a flow diagram of a method of downloading an audio data file and a routine data file to a robot controller according to an exemplary embodiment.

[0024] FIG. 16 is a flow diagram of a method of a robot playing an audio data file and performing instructions contained within a routine data file according to an exemplary embodiment.

DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS

[0025] Before turning to the figures, which illustrate the exemplary embodiments in detail, it should be understood that the application is not limited to the details or methodology set forth in the description or illustrated in the figures. It should also be understood that the terminology is for the purpose of description only and should not be regarded as limiting. Unless otherwise specified, "a" or "an" means "one or more." Related PCT application entitled "Shoulder and Arm Arrangement For a Humanoid Robot," Docket No. 103675-0107, filed on an even date herewith by inventor Ho Vinh Hoang, is incorporated herein by reference in its entirety.

[0026] Referring to FIG. 1A and FIG. 1B, robot 100 is shown according to an exemplary embodiment. Robot 100 is a humanoid robot. Robot 100 is shaped to approximately resemble a human body when in an unfolded position. Robot 100 includes head 101, torso 102, arms 103 and 104, and legs 105 and 106. Head 101 is coupled to torso 102 through neck 107. Arms 103 and 104 are respectively coupled to torso 102 via shoulder joints 108 and 109. Leg 105 is coupled to torso 102 through hip joint 111, and leg 106 is coupled to torso 102 through hip joint 110. Arms 103 and 104 include elbow joints 112 and 113. Arms 103 and 104 further include wrist joints 114 and 115. Arms 103 and 104 may be rotated and/or bent about any of shoulder joints 108 and 109, elbow joints 112 and 113, and wrist joints 114 and 115. Legs 105 and 106 may be rotated about hip joints 111 and 110, respectively.

[0027] Arm 103 includes hand 116, forearm 118, and upper arm 120. Hand 116 is connected to forearm 118 through wrist joint 114. Wrist joint 114 allows hand 116 to rotate with respect to forearm 118. Forearm 118 is connected to upper arm 120 through elbow joint 112. Elbow joint 112 allows forearm 118 to rotate with respect to upper arm 120. Upper arm 120 is connected to torso 102 through shoulder joint 108. Shoulder joint 108 allows upper arm 120 to rotate with respect to torso 102. Similarly, arm 104 includes hand 117, forearm 119, and upper arm 121. Hand 117 is connected to forearm 119 through wrist joint 115. Wrist joint 115 allows hand 117 to rotate with respect to forearm 119. Forearm 119 is connected to upper arm 121 through elbow joint 113. Elbow joint 113 allows forearm 119 to rotate with respect to upper arm 121. Upper arm 121 is connected to torso 102 through shoulder joint 109. Shoulder joint 109 allows upper arm 121 to rotate with respect to torso 102.

[0028] Leg 106 includes foot 122. Similarly, leg 105 includes foot 123. Foot 122 and foot 123 provide support to robot 100 when robot 100 is in a standing position (as shown in FIG. 1A and FIG. 1B). As discussed in more detail later, foot 122 and foot 123 may be comprised of multiple parts such that foot 122 can fold and retract into leg 106 and such that foot 123 can fold and retract into leg 105. The details of feet 122 and 123 are discussed below with respect to FIG. 2 and FIG. 4.

[0029] Referring to FIG. 1B, robot 100 is shown according to the exemplary embodiment. Robot 100 includes speakers 131, 132, and 133. Speaker 131 is contained within and coupled to torso 102. Speaker 132 is coupled to torso 102 and leg 106 at or near hip joint 110. Speaker 133 is coupled to torso 102 and leg 105 at or near hip joint 111. Together, speakers 131, 132, and 133 form a 2.1 channel stereo speaker system for audio playback. Speaker 131 may be a subwoofer. Speakers 132 and 133 can be tweeters. In one embodiment, speakers 131, 132, and 133 are arranged on robot 100 to face generally in the same direction such that sound coming from each of speakers 131, 132, and 133 is projected in generally the same direction (e.g., towards a listener of audio being emitted from speakers 131, 132, and 133). In one embodiment, speakers 132 and 133 are disposed on a front face of legs 106 and 105, respectively, and speaker 131 is disposed on a front face of torso 102. In one alternative embodiment, additional speakers can be provided on the front, side, or rear faces of legs 106 and 105. In some embodiments, the speakers may be covered by a membrane and/or a speaker cover.

[0030] Referring to FIG. 2, a close-up perspective view of leg 105 is shown according to an exemplary embodiment. As shown in FIG. 2, leg 105 is removed from torso 102, and a portion of the exterior shell of leg 105 is removed to show the internal components of leg 105. Leg 105 ordinarily is coupled to torso 102 through hip joint 111. Hip joint 111 allows leg 105 to rotate about axis LH ("left hip"). Leg 105 may be manually rotated about axis LH (e.g., by a listener of music being played through speakers 131, 132, and 133) or may be rotated by motor 301. Motor 301 rotates leg 105 about axis LH through gearbox 302. Motor 301 may be a DC servo motor. Gearbox 302 operates to rotate leg 105 about axis LH with respect to torso 102 at a lower speed than the output speed of motor 301. Gearbox 302 includes exposed gear 303. Exposed gear 303 meshes with static mating teeth coupled to torso 102 such that when exposed gear 303 rotates, motion of leg 105 with respect to torso 102 is effectuated. In an alternative arrangement, motor 301 rotates leg 105 about torso 102 without a gearbox (e.g., motor 301 is directly attached to hip joint 111).

[0031] Speaker 133 is mounted within the outer shell of leg 105. Speaker 133 may be mounted to the outer shell of leg 105 such that rim 305 of speaker 133 is received in corresponding channel 306 of the outer shell of leg 105. Channel 306 secures speaker 133 in the outer shell of leg 105. The front of speaker 133 is generally circular in shape. Axis LH intersects the center of the generally circular front of speaker 133. Accordingly, during rotation of leg 105 about axis LH, speaker 133 also rotates about axis LH. Speaker 133 is positioned with the center of its front on axis LH to minimize motion of speaker 133 relative to torso 102 and speaker 131 during motion of leg 105. By maintaining the position of speaker 133 with respect to speaker 131 during operation, sound distortion may be minimized. The outer shell of leg 105 includes a generally circular cutout such that speaker 133 can project sound away from robot 100. Speaker 133 projects sound in generally the same direction as speakers 131 and 132.
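The geometric rationale in the preceding paragraph — a point lying on the rotation axis does not translate when the body rotates about that axis — can be checked with a short numeric sketch. The coordinates and units below are hypothetical, chosen only to illustrate the principle.

```python
import math

def rotate_about_axis(point, axis_point, angle_rad):
    """Rotate a 2D point about an axis point (axis viewed end-on,
    normal to the plane of the page)."""
    px, py = point[0] - axis_point[0], point[1] - axis_point[1]
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (axis_point[0] + c * px - s * py,
            axis_point[1] + s * px + c * py)

hip_axis = (0.0, 0.0)          # axis LH, viewed end-on (hypothetical frame)
speaker_center = (0.0, 0.0)    # speaker 133 centered on the axis
knee_point = (0.0, -30.0)      # an off-axis point lower on the leg

for angle_deg in (0.0, 45.0, 90.0):
    rad = math.radians(angle_deg)
    print(rotate_about_axis(speaker_center, hip_axis, rad),
          rotate_about_axis(knee_point, hip_axis, rad))
```

The speaker center stays at the origin for every leg angle while the off-axis point sweeps an arc, which is why centering speaker 133 on axis LH keeps it essentially stationary relative to speaker 131 as leg 105 moves.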

[0032] Outer shell of leg 105 forms internal compartment 310. Internal compartment 310 is configured to receive foot 123. As briefly discussed above and as later described in further detail, foot 123 may be configured to fold and retract inside of leg 105 when robot 100 is in the folded position (e.g., as discussed below with respect to FIG. 4, FIG. 9, and FIG. 10). Accordingly, when foot 123 is folded, foot 123 may retract inside leg 105 and is received in internal compartment 310 of leg 105.

[0033] It should be appreciated that the internal components and the arrangement of the internal components of leg 106 are generally the same as those described above with respect to leg 105 in FIG. 2. However, the internal components and the arrangement of the internal components within leg 106 are mirrored for proper placement on the opposite side of torso 102.

[0034] Referring to FIG. 3, robot 100 is shown according to an exemplary embodiment. Robot 100 may include various lighting features. The lighting features may include torso lighting features 501, arm lighting features 502, leg lighting features 503, and face lighting features 504. Torso lighting features 501, arm lighting features 502, and leg lighting features 503 are shown as linear lighting features. In an alternative arrangement, torso lighting features 501, arm lighting features 502, and leg lighting features 503 may be curves, dots, or other geometric shapes. Face lighting features 504 are drawn to represent eyes of robot 100. Face lighting features 504 may further include lighted areas representing other facial features of a human (e.g., nose, mouth, ears, hair, etc.). Robot 100 may include a greater or lesser number of lighting features than disclosed in FIG. 3. The lighting features include LED lights located behind opaque or transparent portions of the body shell of robot 100. The LEDs used in creating the lighting features may be multicolored LEDs such that the lighting elements can light up in different colors during different points in a choreographed dance routine (as discussed below with respect to FIG. 5 and FIG. 6). Each of torso lighting features 501, arm lighting features 502, leg lighting features 503, and face lighting features 504 is independently controllable by a controller in terms of activation (being on or off), brightness level, and emitted color (as discussed below with respect to FIG. 4).

[0035] Referring to FIG. 4, a block diagram of a system for interacting with robot 100 is shown according to an exemplary embodiment. Robot 100 includes central processing unit 601 and memory 602. Memory 602 stores various programming modules and code that, when executed by central processing unit 601, control the operation of robot 100. Memory 602 may include random access memory, read-only memory (e.g., non-transitory memory for storing programming modules), and flash memory storage. Memory 602 may further include a removable memory media reader for enabling a user of robot 100 to provide removable memory media (e.g., a MicroSD card, an SD card, etc.) containing additional instructions and/or data files. Memory 602 is configured to store data files (e.g., audio data files, routine data files, etc.). Central processing unit 601 communicates data and electrical signals with various periphery devices, including network interface 603, USB controller 604, auxiliary audio input 605, user inputs 606, audio amplifier 607, servo controller 608, and lighting elements 609. Any of the above mentioned periphery components along with central processing unit 601 and memory 602 may be part of a system-on-chip integrated circuit or may be stand-alone circuit components. Power supply 610 provides operational power to robot 100 and all of the components of robot 100.

[0036] Network interface 603 enables robot 100 to communicate with user device 615, either directly (e.g., through an ad-hoc network connection) or through network 616 (e.g., through the Internet or a local area network). User device 615 may be a computer, a laptop, a tablet computing device, a smart phone, a media player, or a PDA. User device 615 can communicate with robot 100 through system software loaded on user device 615 (e.g., a stand-alone program running on user device 615 such as a smart phone application, a system program loaded by accessing a system website, etc.). User device 615 may send robot 100 operating instructions and various data files. Network interface 603 further enables robot 100 to communicate with server 617 through network 616. Server 617 may include audio data files and/or dance routine data files. Network interface 603 may include a wireless transceiver and a wireless radio antenna. Network interface 603 may communicate over a standard wireless networking protocol (e.g., 802.11a, 802.11b, 802.11g, 802.11n, Bluetooth, ZigBee, 802.15, CDMA, LTE, GSM, etc.). Network interface 603 may be able to communicate over multiple wireless frequencies and networking protocols at the same time (e.g., over an 802.11 WiFi connection and a Bluetooth connection at the same time). In an alternative arrangement, network interface 603 includes a separate WiFi transceiver and antenna (e.g., an 802.11 wireless transceiver and antenna) and a separate Bluetooth transceiver and antenna. Through network interface 603, central processing unit 601 may receive audio and dance routine data files and store them in memory 602 for later playback and performance. Central processing unit 601 may also transmit data to user device 615 and/or server 617 (e.g., status reports, remaining memory 602 capacity, the identity of data files stored on memory 602, any detected errors associated with any periphery devices, etc.).
Central processing unit 601 may also receive system updates (e.g., updated robot 100 operating system or programming modules).
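The status report mentioned above — remaining memory capacity and the identity of stored data files — can be pictured as a small structured payload. The field names below are purely illustrative; the application does not define a report format.

```python
# Hypothetical sketch of the status report described in paragraph [0036];
# the field names and report structure are illustrative, not from the
# application.
def build_status_report(stored_files, capacity_bytes, errors):
    """Summarize stored file identities, remaining capacity, and errors."""
    used = sum(size for _, size in stored_files)
    return {
        "files": [name for name, _ in stored_files],
        "remaining_bytes": capacity_bytes - used,
        "errors": list(errors),
    }

report = build_status_report(
    stored_files=[("song.mp3", 4_000_000), ("dance.rtn", 20_000)],
    capacity_bytes=16_000_000,
    errors=[],
)
print(report["remaining_bytes"])  # capacity left for further downloads
```

A report like this is what would let user device 615 decide whether another audio or routine file fits before initiating a transfer.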

[0037] USB controller 604 is a version 1.0, 2.0, or 3.0 USB controller. USB controller 604 includes a female USB port on the exterior of the robot 100 body. The USB port may be a standard size USB port (e.g., a mini-USB port, a micro-USB port, etc.) or the USB port may be a proprietary USB port. USB controller 604 is configured to establish a data connection between user device 615 and central processing unit 601 when user device 615 is plugged into USB controller 604 via a USB cable. Once a data connection is established between user device 615 and central processing unit 601, central processing unit 601 may receive audio and dance routine data files and store them in memory 602 for later playback and performance. Central processing unit 601 may also transmit data to user device 615 (e.g., status reports, remaining memory 602 capacity, the identity of data files stored on memory 602, any detected errors associated with any periphery devices, etc.). Central processing unit 601 may also receive system updates (e.g., updated robot 100 operating system or programming modules). In some arrangements, USB controller 604 is also configured to provide electrical power to robot 100 to charge power supply 610 when an external power source (e.g., grid power, an external battery pack, etc.) is connected to robot 100 through USB controller 604 and/or to provide operational power to robot 100.

[0038] Auxiliary audio input 605 is configured to allow a user of robot 100 to provide an audio stream to central processing unit 601 for playback over speakers 620 (e.g., speakers 131, 132, and 133 of FIG. 1B in one embodiment). Auxiliary audio input 605 may be a standard 3.5 mm stereo input. Although auxiliary audio input 605 is drawn as providing an audio stream to central processing unit 601, auxiliary audio input 605 may be directly connected to audio amplifier 607 for direct playback through speakers 620.

[0039] User inputs 606 may include any number of buttons (e.g., spring-loaded mechanical buttons, touch-sensing buttons, etc.), switches, and dials enabling a user of robot 100 to provide input to central processing unit 601 without the use of an exterior computing system (e.g., user device 615). User inputs 606 include at least a power switch (i.e., an on/off button), audio playback controls (e.g., a play button, a stop button, a pause button, a fast forward button, a rewind button, a next button, a previous button, a volume up button, and a volume down button), a fold button, and an unfold button. User inputs 606 may also include a remote control. The remote control may communicate with robot 100 through network interface 603 (e.g., over a Bluetooth connection), over an infrared communication link, over another type of wireless radio communication link, or through a wired connection with robot 100. The remote control may include any of the buttons, switches, and/or dials discussed above.

[0040] Audio amplifier 607 amplifies low-power audio signals received from central processing unit 601 and/or auxiliary audio input 605 to a higher power level suitable for driving speakers 620. Audio amplifier 607 may be a 2.1 channel audio amplifier. Speakers 620 include speakers 131, 132, and 133 in one embodiment. However, it should be appreciated that alternate configurations of robot 100 may include a fewer number of speakers (e.g., two speakers or one speaker) or a greater number of speakers (e.g., four speakers, five speakers, etc.).

[0041] Servo controller 608 is in communication with motors 630. Motors 630 include the above discussed motors associated with each joint of robot 100 and associated with wheels of feet 122 and 123. Accordingly, servo controller 608 can individually control an individual motor by sending instructions and/or routing power to the individual motor. Servo controller 608 sends and receives data signals to and from central processing unit 601. The data received by servo controller 608 generally relates to an instruction to turn an individual motor to a certain angular position at a designated angular speed. Accordingly, servo controller 608 parses the instruction and executes the instruction to effectuate the instructed motor motion. Each motor of robot 100 is a DC servo motor and may include a servo motor encoder. The servo motor encoder provides feedback relating to an angular position of the associated servo motor with respect to a reference position. Servo controller 608 may report the various angular positions of individual motors among motors 630 to central processing unit 601.
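The instruction flow described in paragraph [0041] can be sketched as follows. This is an illustrative Python model only, not the patented implementation; the class name, instruction fields, and motor identifiers are assumptions for the sake of the example.

```python
# Sketch of the servo-control flow: the controller parses an instruction
# naming a motor, a target angular position, and an angular speed, and
# tracks each motor's encoder-reported position for feedback to the
# central processing unit. All names are illustrative assumptions.

class ServoController:
    def __init__(self, motor_ids):
        # Encoder feedback: angular position of each motor relative to a
        # reference position, in degrees.
        self.positions = {motor_id: 0.0 for motor_id in motor_ids}

    def execute(self, instruction):
        """Parse and execute one movement instruction, e.g.
        {"motor": "left_hip", "angle": 45.0, "speed": 30.0}."""
        motor = instruction["motor"]
        if motor not in self.positions:
            raise ValueError(f"unknown motor: {motor}")
        # In hardware this would ramp the motor at `speed` deg/s; here we
        # simply record the commanded end position.
        self.positions[motor] = instruction["angle"]

    def report(self):
        # Positions reported back to the central processing unit.
        return dict(self.positions)


controller = ServoController(["left_hip", "right_hip"])
controller.execute({"motor": "left_hip", "angle": 45.0, "speed": 30.0})
```

In this sketch, `report()` stands in for the encoder feedback path from servo controller 608 to central processing unit 601.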

[0042] Central processing unit 601 also controls the operation of lighting elements 609. Lighting elements include any lighted features of robot 100 (e.g., lighting features 501, 502, 503, and 504). Central processing unit 601 is configured to activate and deactivate individual lighting features of lighting elements 609 in response to an executed program. For example, the eyes of robot 100 (e.g., lighting features 504) may light up independently of lighting elements on torso 102 of robot 100 (e.g., lighting features 501). Central processing unit 601 is configured to independently adjust the brightness and/or color of individual lighting features of lighting elements 609 in response to an executed routine program.

[0043] Power supply 610 provides operational power to robot 100 and all components of robot 100. Power supply 610 may include a battery. The battery may be a rechargeable battery (e.g., a lithium-polymer battery). Power supply 610 may be charged through power routed from USB controller 604 when USB controller 604 is connected to an external power source (e.g., when connected to user device 615). Alternatively, power supply 610 may be charged through a separate power port located on robot 100. When robot 100 is connected to an external power supply (e.g., grid power, an external battery pack, etc.), operational power for robot 100 may be provided by the external power supply while power supply 610 is receiving a charge.

[0044] Referring to FIG. 5 and FIG. 6, robot 100 is shown according to exemplary embodiments. During operation of robot 100, central processing unit 601 controls robot 100 movement, audio output, and lighting output when executing a routine data file stored in memory 602. Central processing unit 601 may instruct servo controller 608 to move robot 100 in a predefined sequence. Additionally, central processing unit 601 may send audio signals to audio amplifier 607 for playback over speakers 620 during the predefined movement sequence. Accordingly, robot 100 may be configured to move according to a predefined choreography in synchronization with audio playing over speakers 620. Such movement during audio playback gives the appearance that robot 100 is dancing to the audio being emitted from speakers 620. Depending on the program executed by central processing unit 601, robot 100 may move arms 103 and 104, legs 105 and 106, and head 101 about the various axes discussed above. Central processing unit 601 may also move robot 100 by controlling the motors associated with the wheels of feet 122 and 123. Central processing unit 601 may also control lighting elements 609 for added visual effect during the audio playback and movement of robot 100.
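One way to picture the synchronization of choreography to audio described above is as a list of timestamped keyframes that are dispatched as the audio clock advances. The sketch below is a simplified illustration under that assumption; the keyframe fields and pose names are hypothetical.

```python
# Illustrative sketch (not the patented implementation): a routine is a
# list of timestamped keyframes; during playback, keyframes whose
# timestamps have elapsed are dispatched so motion stays in step with
# the audio clock.

def due_keyframes(routine, audio_time_s):
    """Return the keyframes that should have been dispatched by
    `audio_time_s` seconds into the song, in order."""
    return [kf for kf in routine if kf["t"] <= audio_time_s]

routine = [
    {"t": 0.0, "pose": "arms_up"},
    {"t": 1.5, "pose": "arms_down"},
    {"t": 3.0, "pose": "spin"},
]
```

In practice each keyframe would translate into servo controller and lighting commands rather than a pose label.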

[0045] Referring to FIG. 7 through FIG. 13, robot 100 is shown progressing through folding steps from a humanoid robot to a speaker box according to an exemplary embodiment. Central processing unit 601 is configured to transform robot 100 from an unfolded position wherein robot 100 resembles a human (as shown in FIG. 1) to a folded position wherein robot 100 resembles a speaker box (as shown in FIG. 12) in one embodiment. Central processing unit 601 executes the folding routine program or unfolding routine program stored in memory 602 when instructed to by the user of robot 100. The instruction to fold or unfold can be provided through user inputs 606 (e.g., a fold button, an unfold button, a transform button, etc.), through a user remote, or from a command received from user device 615. The instruction to fold or unfold may also be embedded in a dance routine. For example, when robot 100 is performing a dance routine, robot 100 may initially transform from the folded position to the unfolded position at the start of a song, and then retransform from the unfolded position to the folded position at the conclusion of a song.
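The fold/unfold behavior described in paragraph [0045] can be modeled as a small state machine that accepts commands from any of the input sources (buttons, remote, user device, or an embedded routine step). The state and command names below are assumptions for illustration only.

```python
# Minimal state-machine sketch of the fold/unfold transformation: the
# robot is either folded (speaker-box form) or unfolded (humanoid form),
# and a fold/unfold command that matches the current state is ignored.

class TransformState:
    FOLDED = "folded"
    UNFOLDED = "unfolded"

    def __init__(self, state=UNFOLDED):
        self.state = state

    def handle(self, command):
        """Handle a 'fold' or 'unfold' command from a button, remote,
        user device, or an embedded routine step."""
        if command == "fold" and self.state == self.UNFOLDED:
            self.state = self.FOLDED      # run the folding routine program
        elif command == "unfold" and self.state == self.FOLDED:
            self.state = self.UNFOLDED    # run the unfolding routine program
        return self.state
```

A dance routine embedding the transformation would simply issue `handle("unfold")` at the start of a song and `handle("fold")` at its conclusion.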

[0046] Referring to FIG. 7, the first stage of the folding process is shown according to an exemplary embodiment. During the first stage of the folding process, central processing unit 601 instructs arms 103 and 104 to fold in adjacent to torso 102 and speaker 131. Once arms 103 and 104 are in position, hands 116 and 117 rotate up and underneath speaker 131. Accordingly, arms 103 and 104 along with hands 116 and 117 form a generally "U" shape around torso 102 and speaker 131. Further, arm 104 folds up against a front portion of torso 102 because the arm's longitudinal center axis (when extended parallel to the floor) is offset from and parallel to the shoulder's axis. This folding arrangement leaves space 902 for receiving leg 106. A similarly shaped space exists on the opposite side of torso 102 behind arm 103.

[0047] Referring to FIG. 8 through FIG. 10, the second stage of the folding process is shown according to an exemplary embodiment. During the second stage of the folding process, central processing unit 601 instructs legs 106 and 105 to begin rotation up towards torso 102 and behind arms 103 and 104. In FIG. 10, legs 106 and 105 are folded up against torso 102. Leg 106 fits into space 902 against torso 102 and just behind arm 104. Feet 122 and 123 retract into their respective legs. Each foot includes two parts that are configured to fold along an axis such that each foot can retract into its respective leg (e.g., into compartment 310 as shown in FIG. 2). When legs 105 and 106 fold up against the body and feet 122 and 123 retract into their respective legs, robot 100 begins to take on the appearance of a speaker box instead of a humanoid robot.

[0048] Referring to FIG. 11 through FIG. 13, the final stage of the folding process is shown according to an exemplary embodiment. During the final stage of the folding process, central processing unit 601 instructs head 101 to fold into a cavity within torso 102. In order to enable head 101 to rotate along an axis parallel to the chin of head 101 and into torso 102, chest plate 1301 pivots away from torso 102. When in the fully folded position (as shown in FIG. 12), robot 100 no longer looks like a human, and instead resembles a speaker box. As shown in FIG. 13, head 101 sits in a cavity of torso 102 above speaker 131 in the folded position.

[0049] Robot 100 can transform from the folded position to the unfolded position by performing the above steps in the reverse order. Central processing unit 601 begins the process of transforming robot 100 from the folded position to the unfolded position when it receives a command from user inputs 606, a user remote control, or from user device 615.

[0050] Referring to FIG. 14, a method 1600 of creating a dance and lighting routine to be executed by robot 100 is shown according to an exemplary embodiment. In order to provide a dance and lighting routine ("the routine") to robot 100, the routine is first created and uploaded to server 617. Method 1600 begins when system software is loaded onto user device 615 (step 1601). System software enables user device 615 to communicate with server 617 through network 616 and to communicate with robot 100 through network interface 603 and/or USB controller 604. System software may be a stand-alone application (e.g., a program or application running natively on user device 615). Alternatively, system software may be executed remotely and accessed through a website. During the loading of system software onto user device 615, the user may be prompted to register for a system account or log into an already registered system account (i.e., provide login credentials including a username and password) such that later uploaded data files can be associated with the user's account. System software presents interactive graphical user interfaces to the user through user device 615 such that the user may perform any tasks related to robot 100, including creating and uploading the routine.

[0051] After loading the system software on user device 615, the user selects an audio data file that will correspond to the later defined routine (step 1602). The audio data file may correspond to a song. The audio data file may be stored on a memory of user device 615. Alternatively, the audio data file may be stored on server 617 (e.g., the system may provide sample songs from which a user may create routines). The audio data files on server 617 may be freely offered or offered for a fee. In yet another alternative arrangement, the system software may access another remote service of the user to locate an audio data file stored on another device. For example, the user may provide the system access to another user-owned account (e.g., a cloud music storage service), such that an audio data file not already on server 617 or in memory of user device 615 may be used with robot 100. In each situation, the user must first select an audio data file to be used with the routine. After selection, the audio data file is provided to server 617 through network 616.

[0052] After selection of the audio data file, the user may then program the routine (step 1603). The user may choose from preprogrammed routines such as a free sample routine provided by the system, a premium routine available for purchase by the user, or a routine shared by another user of the system. If the user selects a preprogrammed routine, the routine may first be uploaded to server 617 through network 616 if the routine is not already on server 617, and then the routine may be associated with the already provided audio data file.

[0053] In an alternative arrangement, a user may provide inputs to choreograph the routine from scratch. In such an alternative arrangement, system software may present an interactive graphical user interface that allows the user to program a sequence of robot 100 movements and lighting effects that correlate to the provided audio data file. For example, the system software may present the user a graphical user interface displaying an interactive model of robot 100 above a waveform display representing the uploaded audio data file. Accordingly, at various points of the waveform display, the user can manipulate the model of robot 100 into the desired positions at the various points through user inputs of user device 615 (e.g., a mouse or a touch screen). Additionally, the user can arrange a lighting sequence specifying which lights are activated and deactivated at designated times during playback of the audio data file, as well as indicating the emitted light color and brightness.
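A user-authored routine of the kind described above can be thought of as a set of entries anchoring a pose and a lighting state to points on the song's waveform. The representation below is a hypothetical sketch; the field names and the joint/lighting identifiers are not from the patent.

```python
# Hypothetical routine representation: each keyframe ties joint angles
# (degrees) and a lighting state (feature -> (color, brightness)) to a
# timestamp on the audio waveform. Kept sorted so playback can walk the
# list in time order.

def add_keyframe(routine, time_s, joints, lights):
    """Record the pose and lighting state at `time_s` seconds."""
    routine.append({"t": time_s, "joints": dict(joints), "lights": dict(lights)})
    routine.sort(key=lambda kf: kf["t"])  # keep keyframes in time order
    return routine

routine = []
add_keyframe(routine, 4.0, {"head": 10.0}, {"eyes": ("blue", 0.8)})
add_keyframe(routine, 1.0, {"head": 0.0}, {"eyes": ("red", 1.0)})
```

The graphical editor described in paragraph [0053] would populate such a structure as the user drags the model and sets lighting cues along the waveform.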

[0054] After the user programs the routine, the user can preview the routine by selecting a preview feature of the system software. The system software will then play the audio file through an audio output of user device 615 (e.g., a speaker) and move the model in accordance with the user programmed routine. During this process, the user may make adjustments to the routine and repeat the preview until satisfied with the routine.

[0055] After the user is satisfied with the routine, the user names the routine (step 1604) and uploads the routine to server 617 for later retrieval and programming to robot 100 (step 1605). Once uploaded, server 617 stores the routine data file and associates the routine data file with the audio data file. The associated files are then associated with the user's account for later retrieval, downloading, or sharing with another user of the system.

[0056] Referring to FIG. 15, a method 1700 of programming robot 100 is shown according to an exemplary embodiment. As discussed above, robot 100 includes memory 602. Memory 602 can store audio data files and routine data files. Memory 602 may store audio data files in a way that they are associated with routine data files (such that when the audio data file is later played, the routine is automatically performed) or may store audio data files without an associated routine data file (such that when the audio data file is later played, no routine is performed). Method 1700 begins when a user loads system software on user device 615 (step 1701). System software enables user device 615 to communicate with server 617 and to communicate with robot 100 through network interface 603 and/or USB controller 604. System software may be a stand-alone application (e.g., a program or application running natively on user device 615). Alternatively, system software may be executed remotely and accessed through a website. During the loading of system software onto user device 615, the user may be prompted to register for a system account or log into an already registered system account (e.g., provide login credentials including a username and password) such that later uploaded data files can be associated with the user's account. System software presents interactive graphical user interfaces to the user through user device 615 such that the user may perform any tasks related to robot 100, including creating and uploading the routine.

[0057] After loading the system software and logging into the user's account, the user searches for audio data files and routine data files that the user would like to send to robot 100 (step 1702). The user may search for audio data files and routine data files that the user previously created and/or uploaded to server 617 (e.g., through method 1600). Alternatively, the user may search a system store for audio data files and routine data files (e.g., a store hosted on server 617). Data files in the system store may be available for free or for purchase.

[0058] When the user locates audio data files and/or routine data files that the user wishes to upload to the robot, the user adds the audio and/or routine data files to a playlist (step 1703). The user may continue to add audio data files and/or routine data files to the playlist until the desired number of audio files and associated routine files are selected. The playlist must have at least one audio data file. The playlist may include any number of audio files as long as the playlist can fit within memory 602.
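The playlist-size constraint in paragraph [0058] amounts to a simple running-total check against the robot's memory capacity. The sketch below illustrates that check; the file names, sizes, and capacity figure are illustrative assumptions.

```python
# Sketch of the playlist constraint: files may be added only while the
# running total of their sizes fits within the robot's memory capacity.

def build_playlist(files, memory_capacity_bytes):
    """Add (name, size) entries in order, skipping any file that would
    overflow the available memory."""
    playlist, used = [], 0
    for name, size in files:
        if used + size <= memory_capacity_bytes:
            playlist.append(name)
            used += size
    return playlist

songs = [("song_a", 4_000_000), ("song_b", 5_000_000), ("song_c", 3_000_000)]
```

A real implementation would also enforce the requirement that the playlist contain at least one audio data file before download.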

[0059] After the user is finished setting the playlist, the user downloads the files in the playlist (step 1704). Server 617 packages the files and transmits the files to user device 615 through network 616. User device 615 saves the files in a memory of user device 615 for later transmission to robot 100.

[0060] After saving the files in memory, a connection is established between robot 100 and user device 615 (step 1705). The connection is established either through network interface 603 or USB controller 604. In some arrangements, the connection may only be established if robot 100 is powered on. In such an arrangement, robot 100 must be powered on before a connection can be established (e.g., the user may turn a power switch to the "on" position).

[0061] If connecting via network interface 603, the user may instruct system software to scan for available robots via WiFi or Bluetooth, depending on the capabilities of user device 615 and any previously set user preferences indicating a user preferred mode of connection between user device 615 and robot 100. The system software will then look for all robots broadcasting a signal through a compatible wireless networking protocol, and present a list of available robots to the user for selection. The list may indicate a robot name and an available method of connection (e.g., WiFi or Bluetooth). The user can then select the desired robot (e.g., robot 100), and system software will then establish a connection with robot 100. The user may be required to provide a password or PIN prior to robot 100 granting access to user device 615. If connecting via USB controller 604, the user must initially connect a USB cable between USB controller 604 and user device 615. After the cable is connected, user device 615 will recognize that robot 100 has been connected. The user may need to install appropriate robot 100 USB drivers on user device 615 prior to being able to establish a USB connection with robot 100. If robot 100 is connected via USB to user device 615, power supply 610 may also receive a trickle charge from the power source of user device 615.

[0062] After the connection is established, the data files may be transmitted from user device 615 and received by robot 100 (step 1706). Upon establishing a connection with robot 100, the user is granted access to various functions of robot 100 through the system software being executed on user device 615. The user may delete data files (e.g., audio data files and routine data files) stored in memory 602. The user may provide updated operating software to robot 100 if the system software determines that the software running on robot 100 is out of date. Additionally, the user can send audio and routine data files to robot 100 for storage in memory 602. To do so, the user may select the downloaded playlist, which includes audio data files and associated routine data files, for sending to robot 100. The system software then transmits the data files to robot 100 where the files are received at either network interface 603 or USB controller 604. Central processing unit 601 analyzes the files and stores the files in memory 602 for later playback and performance.

[0063] In an alternative arrangement, the data files may be provided directly to robot 100 through the insertion of a removable memory media into a removable memory media receiving port of robot 100. In such an arrangement, robot 100 is configured to receive and read data stored on removable memory media (e.g., SD card, MicroSD card, etc.). Thus, any audio data files and routine data files stored on the provided removable memory media are accessible by central processing unit 601, and no connection is required to provide the data files to robot 100. Further, the data files may not need to be copied into memory 602 because central processing unit 601 is provided access to the files on the removable memory media.

[0064] After the file transfer is complete, the connection between user device 615 and robot 100 is terminated (step 1707). This may be accomplished by powering down robot 100 (e.g., by toggling a power switch), by powering down user device 615, by unplugging a connecting USB cable from either robot 100 or user device 615, and/or by severing a wireless connection between user device 615 and network interface 603. Once the connection between user device 615 and robot 100 is terminated, method 1700 ends.

[0065] Referring to FIG. 16, a method 1800 of performing a programmed dance routine to music stored on robot 100 (e.g., in memory 602 or on a provided removable memory media) is shown according to an exemplary embodiment. Method 1800 begins when robot 100 is activated (step 1801). Robot 100 is activated when robot 100 is turned on (e.g., by a user interacting with a power switch or button).

[0066] After activating robot 100, the user determines if robot 100 needs to be transformed (step 1802). Some routines require robot 100 to be in the unfolded position at the start of the routine. Accordingly, if robot 100 is in the folded position and is required to be in the unfolded position, the user provides an input indicating that robot 100 is to unfold (step 1803). The input may be provided through interaction with user inputs 606 (e.g., a button on the body of robot 100 or a button on a remote in wireless communication with robot 100). The input is received by central processing unit 601, and central processing unit 601 executes a program contained in memory 602 to begin the unfolding process (as discussed above). Alternatively, some routines may purposefully start in the folded position and perform a transformation from the folded position to the unfolded position as part of the routine. Still further, it may be desirable to play an audio data file without causing robot 100 to transform (e.g., if no routine data file is associated with the audio data file to be played).

[0067] After activating robot 100 and ensuring that robot 100 is in the proper position, an audio data file is selected (step 1804). The audio data file is selected by the user interacting with user inputs 606. For example, the user may press a next button or a previous button to select the song. Robot 100 may include a display indicating the name or title of the audio data file selected for playback. The name may be the file name, a name stored in metadata of the file, or a number indicating the audio file's order in the uploaded playlist. The user may interact with buttons coupled to the body of robot 100. Alternatively, the user may interact with buttons located on a remote in wireless communication with robot 100.

[0068] After selecting the appropriate audio data file for playback, the user provides an input to robot 100 indicating that the audio data file is to be played (step 1805). Central processing unit 601 processes the audio data file and determines whether there is an associated routine data file. If there is an associated routine data file, the routine data file is processed simultaneously by central processing unit 601. Accordingly, central processing unit 601 sends audio signals to audio amplifier 607 for playback on speakers. At the same time, central processing unit 601 sends movement commands to servo controller 608 to effectuate a dance routine performed by robot 100. Still further, at the same time, central processing unit 601 controls lighting elements if the routine data file includes a lighting routine. The end result of the processing of the audio and routine data files is a dance performance by robot 100 synchronized to an audio file played through speakers 620 of robot 100.
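The playback decision in paragraph [0068] can be summarized as a lookup: if the selected audio data file has an associated routine data file, audio, movement, and (when present) lighting are dispatched together; otherwise only audio plays. The function and file names below are illustrative assumptions, not the patented implementation.

```python
# Sketch of the playback decision: given a mapping from audio file names
# to routine data files, determine which subsystems to engage for a
# selected file.

def play(audio_file, routines):
    """Return the subsystems engaged for `audio_file`."""
    actions = ["audio"]                   # audio always plays
    routine = routines.get(audio_file)    # associated routine, if any
    if routine is not None:
        actions.append("movement")        # servo commands dispatched in sync
        if routine.get("lighting"):
            actions.append("lighting")    # lighting cues, when the routine has them
    return actions

routines = {"dance_song": {"movement": ["arms_up", "spin"], "lighting": ["eyes_blue"]}}
```

For example, `play("dance_song", routines)` engages all three subsystems, while an audio file with no entry in `routines` plays as audio only.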

[0069] After the playback of the data files is complete, the user may indicate that the robot is to transform back into the folded position (step 1806). In some arrangements, this step may be performed automatically by central processing unit 601 as part of the routine data file. Still further, this step may be skipped if the user wishes robot 100 to remain in the unfolded position. Method 1800 then ends when the user deactivates robot 100 (step 1807). The user deactivates robot 100 by interacting with a power switch or button. In an alternative configuration, robot 100 may automatically deactivate after a designated period of inactivity (e.g., after five minutes with no playback or user interaction). Step 1807 is optional as the user may wish to continue playing another audio file located within memory 602 of robot 100 after the first file is completed. If the user wishes to play another audio data file, the method returns to step 1804.

[0070] Robot 100 may also be configured to play music without performing a programmed dance routine. For example, if a user instructs robot 100 to play an audio file that does not have a corresponding dance routine stored in memory 602, central processing unit 601 will merely provide an audio signal to audio amplifier 607 for playback over speakers 620. Still further, for audio streams received from user device 615 through either network interface 603 (e.g., via Bluetooth) or USB controller 604 (e.g., when robot 100 is arranged as a speaker for an audio stream as opposed to performing a file transfer of audio files and corresponding dance files), or through auxiliary audio input 605, central processing unit 601 sends the audio stream to audio amplifier 607 for playback over speakers 620 without executing a dance routine.

[0071] It is important to note that the construction and arrangement of the elements of the systems and methods as shown in the exemplary embodiments are illustrative only. Although only a few embodiments of the present disclosure have been described in detail, those skilled in the art who review this disclosure will readily appreciate that many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.) without materially departing from the novel teachings and advantages of the subject matter recited. For example, elements shown as integrally formed may be constructed of multiple parts or elements. It should be noted that the elements and/or assemblies of the enclosure may be constructed from any of a wide variety of materials that provide sufficient strength or durability, in any of a wide variety of colors, textures, and combinations. Additionally, in the subject description, the word "exemplary" is used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" is intended to present concepts in a concrete manner. Accordingly, all such modifications are intended to be included within the scope of the present inventions. The order or sequence of any process or method steps may be varied or re-sequenced according to alternative embodiments. Any means-plus-function clause is intended to cover the structures described herein as performing the recited function and not only structural equivalents but also equivalent structures. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions, and arrangement of the preferred and other exemplary embodiments without departing from the scope of the present disclosure or from the spirit of the appended claims.

[0072] The present disclosure contemplates methods, systems, and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a machine, the machine properly views the connection as a machine-readable medium. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.

[0073] Although the figures may show a specific order of method steps, the order of the steps may differ from what is depicted. Also, two or more steps may be performed concurrently or with partial concurrence. Such variation will depend on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations could be accomplished with standard programming techniques with rule based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.