

Title:
ENHANCED GAMING SYSTEMS AND METHODS
Document Type and Number:
WIPO Patent Application WO/2019/143766
Kind Code:
A1
Abstract:
Disclosed are systems and methods for enhanced game play, such as a game of billiards. In many games or other activities, it is important to determine a sequence of contact events because the order of contact events can make a difference, and rule decisions may be made based on the sequence and types of contact events. An example aspect of the disclosed systems and methods is a system for determining a sequence of contact events on a surface (e.g., a billiard table). The example system includes a camera positioned to view objects (e.g., billiard balls) on the surface, accelerometers in the surface that sense contact events on the surface, and a processor configured to determine a sequence of contact events involving at least one of the objects based on information received by the camera and the accelerometers.

Inventors:
LEE IRA (US)
Application Number:
PCT/US2019/013932
Publication Date:
July 25, 2019
Filing Date:
January 17, 2019
Assignee:
ESB LABS INC (US)
International Classes:
A63D15/20
Foreign References:
US9827483B2 (2017-11-28)
US5026053A (1991-06-25)
US9795865B2 (2017-10-24)
US4882676A (1989-11-21)
Other References:
None
Attorney, Agent or Firm:
GLOVSKY, Susan G. L. et al. (US)
Claims:
CLAIMS

What is claimed is:

1. A system for determining a sequence of contact events on a billiard table, the system comprising:

a camera positioned to view billiard balls on the billiard table;

accelerometers in the table configured to sense contact events on the billiard table; and

a processor configured to determine a sequence of contact events involving at least one of the billiard balls on the billiard table based on information received by the camera and the accelerometers.

2. A system as in claim 1 wherein the accelerometers are built into a playing surface of the billiard table and rails of the billiard table.

3. A system as in claim 2 further including at least one vibration transfer bar built into at least one rail of the billiard table and coupled to at least one accelerometer, the vibration transfer bar extending a substantial length of the rail.

4. A system as in claim 2 wherein a contact event includes contact between (i) two of the billiard balls, (ii) one of the billiard balls and a rail, (iii) one of the billiard balls and the playing surface, (iv) one of the billiard balls and a cue stick, (v) a player and the ball, or (vi) a ball and something other than the table surface.

5. A system as in claim 4 wherein the processor is configured to determine whether a contact event between a billiard ball and a cue stick is a legal contact, a foul, a double-contact, a miscue, or indeterminate.

6. A system as in claim 2 wherein a plurality of accelerometers are affixed to the bottom of the playing surface, and wherein at least one accelerometer is affixed to each of the rails.

7. A system as in claim 1 further comprising at least one microphone to receive sounds made by contact events.

8. A system as in claim 7 wherein the processor is configured to determine a sequence of contact events based on information received by the camera, the accelerometers, and the at least one microphone.

9. A system as in claim 8 wherein the processor is configured to account for latencies of the sounds received by the at least one microphone to synchronize the received sound with the information received by the camera and the accelerometers.

10. A system as in claim 7 wherein the at least one microphone is suspended above the billiard table.

11. A system as in claim 7 wherein the at least one microphone includes four microphones built into the corners of the billiard table.

12. A system as in claim 1 further comprising an infrared sensor to sense heat caused by contact events, and wherein the processor is configured to determine a sequence of contact events based on information received by the camera, the accelerometers, and the infrared sensor.

13. A system for determining a sequence of contact events on a surface, the system comprising:

a camera positioned to view the objects on the surface;

accelerometers in the surface configured to sense contact events on the surface; and

a processor configured to determine a sequence of contact events of the objects on the surface based on information received by the camera and the accelerometers.

14. A system as in claim 13 further comprising at least one microphone to receive sounds made by contact events, and wherein the processor is configured to determine a sequence of contact events based on information received by the camera, the accelerometers, and the at least one microphone.

15. A system as in claim 13 further comprising an infrared sensor to sense heat caused by contact events, and wherein the processor is configured to determine a sequence of contact events based on information received by the camera, the accelerometers, and the infrared sensor.

16. A system as in claim 14 further comprising an infrared sensor to sense heat caused by contact events, and wherein the processor is configured to determine a sequence of contact events based on information received by the camera, the accelerometers, the infrared sensor, and the at least one microphone.

17. A lighting system comprising:

a plurality of light sources arranged to project light toward a subject area;

a camera positioned to view the subject area and to sense lighting characteristics of the subject area; and

a controller communicatively coupled to the light sources and the camera and configured to receive the lighting characteristics sensed by the camera and to adjust at least a portion of the light sources based on the lighting characteristics received from the camera.

18. A lighting system as in claim 17 wherein the plurality of light sources includes a grid of light-emitting diodes.

19. A lighting system as in claim 17 wherein the controller is configured to adjust brightness of individual light sources of the plurality of light sources.

20. A lighting system as in claim 17 wherein the controller is configured to adjust an amount of light projected on at least a portion of the subject area by adjusting brightness of individual light sources of the plurality of light sources.

21. A lighting system as in claim 17 wherein the controller is configured to adjust color of light projected on at least a portion of the subject area by adjusting brightness of individual light sources of the plurality of light sources.

22. A lighting system as in claim 17 wherein the subject area is a billiard table.

23. A lighting system as in claim 22 wherein the controller is configured to adjust brightness of individual light sources of the plurality of light sources to create an even distribution of light around the perimeter of billiard balls on the billiard table.

24. A lighting system as in claim 17 wherein the subject area is a sports playing field.

25. A lighting system as in claim 17 wherein the subject area is a scene to be photographed or video recorded.

26. A method of determining expected motion of billiard balls on a billiard table, the method comprising:

receiving billiard game play data from a plurality of billiard games;

receiving image data of a given billiard game, the image data depicting a billiard table and billiard balls on the billiard table;

processing the received image data to determine a location of each billiard ball on the billiard table; and

determining expected motion of one or more balls on the billiard table by processing the determined location of the one or more balls and the received billiard game play data.

27. The method of claim 26 wherein the game play data from a plurality of billiard games comprises at least one of:

motion capture data of game play;

accelerometer data of game play;

sound data of game play;

video image data of game play; and

still image data of game play.

28. The method of claim 27 wherein at least one of the video image data of game play and the still image data of game play depict one or more persons playing billiards.

29. The method of claim 28 further comprising building a play style profile for each of the one or more persons.

30. The method of claim 26 wherein the processing to determine expected motion of one or more balls on the billiard table comprises processing the determined location of the one or more balls and the received billiard game play data via image processing and probabilistic or statistics-based models.

31. The method of claim 30 wherein the image processing includes processing the determined location of the one or more balls and the received billiard game play data through a neural network.

32. The method of claim 26 further comprising processing the received game play data to determine a location of each billiard ball in the received game play data.

33. The method of claim 32 wherein the game play data is image data of game play from a billiard table configured with a color variation on a perimeter of a playing surface of the billiard table and the method further comprising:

processing the image data of game play to produce color changed image data where the color changed image data has modified the color variation on the perimeter to improve contrast between the perimeter and the playing surface, the improved contrast used in the processing to determine a location of each billiard ball in the received game play data.

34. The method of claim 26 wherein the billiard table is configured with a color variation on a perimeter of a playing surface on the billiard table, and where processing the received image data to determine a location of each billiard ball on the billiard table comprises:

processing the received image data to modify the color variation on the perimeter of the play surface depicted in the received image data, the color variation modified to improve contrast between the perimeter and the playing surface, the improved contrast used in the processing to determine a location of each billiard ball on the billiard table.

35. The method of claim 26 further comprising:

receiving person location information for the given billiard game; and

wherein determining expected motion of one or more balls on the billiard table further includes processing the person location information.

36. The method of claim 26 further comprising receiving image data of players of the given billiard game.

37. The method of claim 36 further comprising:

identifying an active shooting player in the given billiard game through processing of the received image data of the players; and

wherein determining expected motion of one or more balls on the billiard table further includes processing a player profile associated with the identified active shooting player.

38. The method of claim 36 further comprising:

identifying an active shooting player in the given billiard game through processing of the received image data of the players;

determining an identity of the active shooting player through use of facial recognition processing;

accessing a player profile of the active shooting player using the determined identity; and

wherein determining expected motion of one or more balls on the billiard table further includes processing the accessed player profile.

39. The method of claim 26 further comprising:

processing the received image data to identify chalk deposits on the billiard table; and

wherein determining expected motion of one or more balls further comprises processing the identified chalk deposits.

40. The method of claim 26 wherein processing the received billiard game play data comprises:

identifying spin and trajectory of billiard balls depicted in the billiard game play data.

41. The method of claim 26 wherein the location of each billiard ball on the billiard table is determined to a sub-pixel level of granularity.

42. The method of claim 26 wherein processing the image data received to determine a location of each billiard ball utilizes image processing.

43. The method of claim 42 wherein the image processing utilizes a Hough algorithm.

44. The method of claim 26 wherein processing the received image data to determine a location of each billiard ball utilizes one or more aliasing patterns identified in pixels of the image data received.

45. The method of claim 26 wherein determining the expected motion of one or more balls includes utilizing any of the systems of claims 1-16.

46. A system of cameras for capturing billiard game play, the system comprising:

an overhead camera positionable above a billiard table and configurable to have billiard balls on the billiard table in a field of view of the camera;

one or more robotically controllable cameras positioned around the billiard table; and

a control unit communicatively coupled to the overhead camera and the one or more robotically controllable cameras and configured to control at least one of the overhead and the one or more robotically controllable cameras to capture image data of billiard game play based on an expected billiard shot to be played.

47. The system of claim 46 further comprising:

a simulation unit communicatively coupled to the control unit, the simulation unit configured to:

determine the expected billiard shot to be played; and

forward the determined expected shot to the control unit.

48. The system of claim 47 wherein the simulation unit is configured to operate in accordance with any of the methods of claims 26-45.

49. The system of claim 47 further comprising:

a stationary camera positionable to have a playing surface of the billiard table in a field of view of the stationary camera, the stationary camera configured to forward image data to the simulation unit.

50. The system of claim 46 wherein the expected billiard shot to be played is used by the control unit to control the cameras to capture an initial impact of the shot to be played and a final perspective of motion of the billiard balls.

51. The system of claim 46 wherein the one or more robotically controllable cameras are configured to be moved to any position around the billiard table.

52. The system of claim 46 wherein the one or more robotically controllable cameras is controlled by the control unit to follow a given player, the given player identified through use of facial recognition processing.

53. The system of claim 46 wherein the one or more robotically controllable cameras is controlled by the control unit to follow a person based on a role of the person.

54. The system of claim 46 wherein the control unit controls pan and zoom of the overhead camera.

55. The system of claim 46 wherein the control unit controls pan, tilt, zoom, height, and position with regard to the billiard table for the one or more robotically controllable cameras.

56. The system of claim 46 wherein the control unit modifies a speed at which the one or more robotically controllable cameras are moved based upon a speed of a player at the billiard table.

57. A lighting system for a billiard table, the lighting system comprising:

surface lights in a playing surface of the billiard table;

marker lights in upper surfaces of rails of the billiard table;

cushion lights along rail cushions of the billiard table; and

a controller coupled to the surface lights, marker lights, and cushion lights to control lighting characteristics of the surface lights, marker lights, and cushion lights.

58. A lighting system as in claim 57 wherein the surface lights, marker lights, and cushion lights include light emitting diodes.

59. A lighting system as in claim 57 wherein the surface lights, marker lights, and cushion lights include end glow fiber optic strands.

60. A lighting system as in claim 59 wherein the end glow fiber optic strands are coupled to light emitting diodes.

61. A lighting system as in any of claims 57-60 wherein the cushion lights are located in a gap between a rail cushion of the table and the playing surface of the table.

62. A lighting system as in any of claims 57-61 wherein at least one of the marker lights includes a plano-convex lens mounted between a light source and a spot diffuser to produce an evenly lit spot.

63. A lighting system as in any of claims 57-62 wherein the lighting characteristics of the surface lights, marker lights, and cushion lights include at least one of color, brightness, and flashing pattern, and the controller sets the lighting characteristics of any of the surface lights, marker lights, and cushion lights to indicate states of billiard play.

64. A lighting system as in any of claims 57-63 wherein the surface lights include a head spot, a foot spot, a center spot, and an opening break spot.

65. A lighting system as in any of claims 57-64 wherein the controller causes the marker lights to (a) illuminate white or yellow to designate a turn of a corresponding billiard player, (b) flash to warn of an impending time-out foul, (c) illuminate red to indicate an expired shot clock, (d) indicate a start or stop of billiard play, (e) illuminate to provide a count of points, or (f) a combination of results (a) through (e).

66. A lighting system as in any of claims 57-65 wherein the controller causes the cushion lights to illuminate selectively to indicate aiming points for billiard play or to indicate contact points of billiard play that has occurred.

67. A lighting system as in any of claims 57-66 further including floor lights coupled underneath the billiard table to project light below the billiard table.

68. A lighting system as in claim 67 wherein the controller causes the floor lights to (a) illuminate red to indicate that a foul has been committed by a player or (b) illuminate white or yellow to designate a turn of a corresponding billiard player.

69. A system for capturing, analyzing, and displaying billiard play comprising:

a billiard table;

a lighting system to project light on the billiard table; and

a camera system including:

an image capture camera to capture a plurality of images of the billiard table;

a position capture camera to capture a position of each object on the billiard table;

accelerometers to capture a sequence of contact events of billiard play; and

a processor configured to analyze the captured images, position, and sequence of contact events to create an output.

70. A system as in claim 69 wherein the output is sent by the processor to a device that communicates the billiard play to an audience.

71. A system as in claim 70 wherein the device that communicates the billiard play broadcasts the billiard play.

72. A system as in claim 69 further comprising a simulator that receives the output from the processor and simulates the billiard play.

73. A system as in claim 72 wherein the simulator provides instructions to the processor for controlling at least one of the image capture camera and the position capture camera, the instructions based upon the simulated billiard play.

74. A system as in claim 72 wherein the simulator is configured to operate in accordance with any of the methods of claims 26-45.

75. A system as in claim 69 wherein the output is at least one of:

a video stream of the billiard play;

an instruction to a billiard player;

a video stream of the billiard play clipped and tagged for a plurality of events in the billiard play;

scoring of the billiard play;

remote play instructions;

image or video projection of proposed billiard play onto the billiard table;

instant-replay video stream of the billiard play;

3D rendered views from a given perspective of the billiard play;

augmented reality views to enhance video stream of the billiard play;

virtual reality views of the billiard play;

mixed reality holographic views of the billiard play;

summary views of the billiard play generated using images of object positions recorded when the objects stop moving;

a trigger for starting or stopping a timer for the billiard play;

a trigger for an anticipated high-speed video capture event for the billiard play;

metrics for the billiard play to be layered on a video stream; and

synthesized sounds to provide real-time audio feedback of the billiard play.

76. A system as in claim 75 wherein the anticipated high-speed video capture event includes any of a cue stick hitting a cue ball, a cue ball hitting another ball near a rail, and a final motion of a cue ball into a line toward a point scored.

77. A system as in claim 75 wherein the metrics for the billiard play include any of object speed, cut, and spin speed.

78. A system as in claim 75 wherein the synthesized sounds include any of an amplified collision sound effect, a whirring sound effect corresponding to spinning objects on the billiard table, real-time sound feedback indicators for fouls and miscues, sound effects indicating pending billiard play phases, and voice synthesis announcing scores or pre/post inning billiard play announcements.

79. A system as in claim 75 wherein the remote play instructions include directions for controlling at least one robot at a second billiard table, the second billiard table being separate from the billiard table.

80. A system as in claim 79 wherein the directions for controlling the at least one robot cause the at least one robot to move billiard balls on the separate billiard table to correspond to the billiard play on the billiard table.

81. A system as in claim 79 wherein the at least one robot is a cable driven parallel robot (CDPR).

82. A system as in claim 79 wherein the at least one robot is a plurality of wheeled micro robots.

83. A system as in claim 79 wherein the separate billiard table is remotely located relative to the billiard table.

84. A system as in claim 69 further including a projector arranged to project images on the billiard table.

85. A system as in claim 84 wherein the processor and at least one of the image capture camera and the position capture camera are configured to track a tip of a billiard cue and cause the projector to draw reference lines and points on the billiard table based on tracked motion of the tip of the billiard cue.

86. A system as in claim 84 wherein the processor causes the projector to draw visual trails behind the objects as the objects move on the billiard table.

87. A system as in claim 69 further including lights integrated into the billiard table and controlled by the processor.

88. A system as in claim 87 wherein the lights are LED lights.

89. A system as in claim 87 wherein the lights are integrated (a) in a playing surface of the billiard table to project light through the playing surface or (b) in rails of the billiard table to project light indicating states of billiard play.

90. A system as in claim 87 wherein at least a portion of the lights are configured to be controlled by a player to turn on, turn off, or change color to keep track of billiard play score.

91. A system as in claim 69 further including a microphone to receive voice prompts from a player and forward the voice prompts to the processor, and wherein the processor is configured to interpret the voice prompts received by the microphone to control operation of the system.

92. A system as in claim 69 wherein the camera system includes an infrared camera to sense heat caused by an object bouncing on the billiard table, and wherein the processor is configured to determine a height that the object bounced based on the heat sensed by the infrared camera.

93. A system as in claim 69 wherein the lighting system is configured to operate in accordance with any of the systems of claims 17-25.

94. A system as in claim 69 wherein the camera system is configured to operate in accordance with any of the systems of claims 46-56.

95. A system as in claim 69 wherein the system is configured to operate in accordance with any of claims 1-16.

96. A system as in claim 69 wherein the system is configured to operate in accordance with any of claims 57-68.

97. A billiard ball comprising:

a plurality of markers on a surface of the ball, the markers arranged such that at least two of the markers are visible for any orientation of the ball.

98. A billiard ball as in claim 97 wherein the markers are arranged on the surface of the ball at points at which an inscribed cube would intersect the surface of the ball.

99. A billiard ball as in claim 97 wherein the markers are oriented with respect to one another such that each pair of markers visible at one time is a unique combination of marker orientations.

100. A billiard ball as in claim 97 wherein the markers are substantially equilateral triangles.

101. A system of cameras for capturing game play, the system comprising:

an overhead camera positionable above an arena for game play and configurable to have objects in the arena in a field of view of the camera;

one or more robotically controllable cameras positioned around the arena for game play; and

a control unit communicatively coupled to the overhead camera and the one or more robotically controllable cameras and configured to control at least one of the overhead and the one or more robotically controllable cameras to capture image data of game play based on expected motion of the objects in the arena.

Description:
ENHANCED GAMING SYSTEMS AND METHODS

RELATED APPLICATIONS

[0001] This application claims the benefit of U.S. Provisional Application No. 62/671,578, filed on May 15, 2018, U.S. Provisional Application No. 62/619,353, filed on January 19, 2018, and U.S. Provisional Application No. 62/619,404, filed on January 19, 2018. The entire teachings of the above applications are incorporated herein by reference.

BACKGROUND

[0002] Competitive games include ball games (e.g., billiards, baseball, soccer, football, tennis, table tennis, and golf), and non-ball games (e.g., curling), among others. The position of one or more objects during play can be an important aspect of such games.

[0003] Billiards is an example of such a game involving balls. Billiards refers to a variety of games played with a cue stick that is used to strike billiard balls, projecting the balls along a cloth-covered table bounded by rails (also referred to as “cushions” or “rail cushions”). Example categories of billiards include carom billiards, pool, and snooker.

[0004] Carom billiards is played on tables that are typically ten feet long and that do not include pockets (e.g., the table 105 illustrated in FIG. 1A having four rail cushions 110a-d). The object of carom billiards is to score points by “caroming” one’s own cue ball off both the opponent’s cue ball and an object ball in a single shot. Variations of carom billiards include straight rail, balkline, one-cushion billiards, three-cushion billiards, and artistic billiards. In straight rail, one point is scored each time a player’s cue ball contacts both object balls (i.e., the opponent’s cue ball and a third ball) in a single shot. In balkline, the table is divided into marked regions, called balk spaces, in which a player may only score up to a certain number of points while the object balls are within those regions. This limitation on scoring reduces “rail nursing” techniques that players might use in the straight rail variety. In one-cushion billiards, a carom off of both object balls with at least one rail being struck before the second object ball is hit is necessary to score a point. In three-cushion billiards, a point is scored when a player caroms the cue ball off both object balls, with at least three rail cushions and one object ball being contacted before the player’s cue ball contacts the second object ball. In artistic billiards, players perform a number of predetermined shots of varying difficulty, each shot having a maximum point value.

[0005] Pool is played on tables (e.g., the table 115 illustrated in FIG. 1B) with six rail cushions 120a-f and six pockets 125a-f. Pool tables are typically seven to nine feet long with cushions extending between the pockets. Common variations of pool include eight-ball, nine-ball, one pocket, and bank pool. Eight-ball is played with fifteen balls and a cue ball. To win, a player claims a suit of balls (e.g., stripes or solids), deposits all balls of the suit in the pockets, and then deposits the “8” ball into a pocket, according to the rules of the particular game. Nine-ball uses nine balls. To win, a player must deposit the “9” ball into a pocket, but first must make contact with the lowest numbered ball on the table. In one-pocket, each player is assigned a corner pocket of the table. A player wins by depositing a majority of the balls in the player’s assigned pocket. In bank pool, a player wins by depositing a majority of the balls into pockets by “banking” the balls off of a cushion; that is, the player’s cue ball must contact an object ball, then the object ball must contact one or more cushions on the way to a pocket.

[0006] Snooker is played on a table with six pockets, similar to a pool table, that is about twelve by six feet in dimension. Snooker is played using a white cue ball, fifteen red balls worth one point each, a yellow ball worth two points, a green ball worth three points, a brown ball worth four points, a blue ball worth five points, a pink ball worth six points, and a black ball worth seven points. Points are scored by depositing the colored balls into the pockets in a certain order.

SUMMARY

[0007] In the various types of billiards described above, it can be appreciated that it can be important to determine a sequence of contact events, such as, for example, whether a certain ball first comes into contact with another ball or a cushion. In a particular example, in three-cushion billiards a player’s cue ball must contact three cushions and one object ball before contacting the second object ball. In other billiards games, after the cue ball contacts an object ball, a ball should contact a cushion to constitute a legal stroke. In a case in which the second object ball is positioned very close to a cushion, it may not be possible to detect with typical human senses whether the cue ball contacted the object ball or the cushion first.

[0008] In many such games or other activities, it is important to determine a sequence of contact events because the order of contact events can make a difference, and rule decisions may be made based on the sequence and types of contact events. In some cases, not only is it important to determine the sequence of contact events, but also to determine the nature of contact events (e.g., in billiards, determining whether a contact event is a foul, double-hit, or contact with a foreign object, such as a ball hopping off of a billiards table and hitting the wood of the table; in baseball, whether the ball hits a wall in a location that results in a home run or a double; or in football, whether a receiver’s foot lands in-bounds before falling out-of-bounds).

[0009] One aspect of the disclosed systems and methods is a system for determining a sequence of contact events on a billiard table. The system includes a camera positioned to view billiard balls on the billiard table, accelerometers in the table that sense contact events on the billiard table, and a processor that determines a sequence of contact events involving at least one of the billiard balls on the billiard table based on information received by the camera and the accelerometers. Contact events can include, for example, legal or illegal contact between (i) two of the billiard balls, (ii) one of the billiard balls and a rail cushion, (iii) one of the billiard balls and the playing surface, (iv) one of the billiard balls and a cue stick, (v) a player and a ball, and/or (vi) a ball and something other than the table surface, a ball, or a cushion. The processor can be configured to categorize a contact event between a billiard ball and a cue stick as being a legal contact, a foul, a double-contact, a miscue, or indeterminate, for example. In many embodiments, the accelerometers can be built into a playing surface and rail cushions of the billiard table. A vibration transfer bar may be coupled to one or more accelerometers in a given rail. In such embodiments, a plurality of accelerometers may be affixed to the bottom of the playing surface, and at least one accelerometer may be affixed to a surface behind each of the rail cushions. Some embodiments may also include at least one microphone to obtain sounds made by contact events. The microphone(s) may be suspended above, or embedded within, the billiard table, for example. In embodiments including at least one microphone, the processor can determine a sequence of contact events based on information obtained by the camera, the accelerometers, and the microphone(s). The processor can be configured to account for latencies of the sounds received by the microphone(s) and latencies of vibrations received by the accelerometers to synchronize the received sound, the information received by the camera, and the vibrations received by the accelerometers. Some embodiments may also include an infrared sensor to sense heat caused by contact events. In such embodiments, the processor can be configured to determine a sequence of contact events based on information received by any combination of the system's sensors, e.g., the camera, accelerometers, infrared sensor, and microphone(s).
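The latency accounting described above can be illustrated with a small sketch. This is not the disclosed implementation; the propagation speeds, sensor distances, and timestamps below are assumptions chosen only to show the principle of correcting each sensor's arrival time by its propagation delay before ordering events.

```python
# Illustrative sketch of latency-corrected event ordering (assumed values;
# not the disclosed implementation). Sound reaches a microphone at roughly
# 343 m/s, while vibrations reach an accelerometer through the table much
# faster (a slate-like speed is assumed here).

SPEED_SOUND_AIR = 343.0          # m/s (approximate)
SPEED_VIBRATION_SLATE = 3000.0   # m/s (assumed for illustration)

def corrected_event_time(arrival_time_s, distance_m, wave_speed_mps):
    """Estimate when the contact occurred by removing propagation delay."""
    return arrival_time_s - distance_m / wave_speed_mps

def order_events(readings):
    """readings: (label, arrival_time_s, distance_m, wave_speed_mps) tuples.
    Returns event labels sorted by estimated true contact time."""
    corrected = [(corrected_event_time(t, d, v), label)
                 for label, t, d, v in readings]
    return [label for _, label in sorted(corrected)]

# The microphone's sound arrives later than the accelerometer's vibration,
# yet after latency correction the cushion contact is found to be first.
readings = [
    ("ball-to-cushion (microphone)", 0.0061, 2.0, SPEED_SOUND_AIR),
    ("ball-to-ball (accelerometer)", 0.0052, 0.5, SPEED_VIBRATION_SLATE),
]
print(order_events(readings))
```

In this hypothetical, the raw arrival order (accelerometer first) reverses once each sensor's path delay is subtracted, which is why synchronization matters when contact events are only milliseconds apart.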

[0010] Another aspect of the disclosed systems and methods is a system for determining a sequence of contact events on a surface. The system includes at least one camera positioned to view objects on the surface, accelerometers in the surface that sense contact events on the surface, and a processor that determines a sequence of contact events of the objects on the surface based on information received by the camera(s) and the accelerometers.

[0011] Another aspect of the disclosed systems and methods is a lighting system that includes a plurality of light sources arranged to project light toward a subject area, a camera positioned to view the subject area and to sense lighting characteristics of the subject area, and a controller communicatively coupled to the light sources and the camera. The controller adjusts at least a portion of the light sources based on lighting characteristics received from the camera. The subject area can be, for example, a billiard table, sports playing field, or scene to be photographed or filmed. In many embodiments, the light sources can include a grid of light-emitting diodes. The controller can be configured to adjust brightness of individual light sources of the plurality of light sources. In such embodiments, the controller can adjust an amount of light or color of light projected on a portion of the subject area by adjusting brightness of individual light sources of the plurality of light sources. The controller can also be configured to adjust brightness of individual light sources of the plurality of light sources to create an even distribution of light around the perimeter of billiard balls on a billiard table.
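One way such a controller could adjust individual sources toward uniform illumination is a simple proportional feedback step. The sketch below is an assumed control scheme for illustration; the 0-255 brightness range, the target value, and the gain are not from the disclosure.

```python
# Hypothetical proportional-feedback adjustment of individual light sources.
# 'measured' holds camera-sensed brightness for the region each LED lights;
# each LED's level is nudged toward the target and clamped to an assumed
# 0-255 range.

def adjust_leds(measured, target, levels, gain=0.5):
    """Return new per-LED brightness levels after one feedback step."""
    new_levels = []
    for m, lvl in zip(measured, levels):
        lvl = lvl + gain * (target - m)        # proportional correction
        new_levels.append(max(0, min(255, round(lvl))))
    return new_levels

# Regions measured too dim, on target, and too bright, respectively.
print(adjust_leds([100, 128, 160], 128, [120, 120, 120]))  # [134, 120, 104]
```

Running this step repeatedly as new camera frames arrive would converge each region toward the target brightness, which is the kind of closed-loop behavior the controller described above implies.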

[0012] Another aspect of the disclosed systems and methods is a method of determining expected motion of billiard balls on a billiard table. The example method includes receiving billiard game play data from a plurality of billiard games and receiving image data of a given billiard game. The image data depicts a billiard table and billiard balls on the billiard table. The method further includes processing the received image data to determine a location of one or more billiard balls on the billiard table, and determining expected motion of the ball(s) on the billiard table by processing the determined location of the ball(s) and the received billiard game play data.

[0013] In some embodiments, the game play data includes at least one of: motion capture data of game play, accelerometer data of game play, sound data of game play, video image data of game play, and still image data of game play. The video image data or the still image data may depict one or more persons playing billiards. In such embodiments, the method can include building a play style profile for each of the one or more persons. In some embodiments, the example method includes processing the determined location of one or more balls and the received billiard game play data via image processing and probabilistic or statistics-based models. In such embodiments, the method can process the determined location of the ball(s) and the received billiard game play data through a neural network. In some embodiments, processing the image data to determine a location of each billiard ball utilizes image processing, such as, for example, a Hough algorithm, or utilizes one or more aliasing patterns identified in pixels of the image data received.
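To illustrate the kind of Hough-based ball localization mentioned above, the following toy sketch votes for circle centers at a single known radius over a synthetic edge image. It is an illustration of the general technique, not the disclosed image-processing pipeline; the grid size, radius, and angular step are assumptions.

```python
import math

# Toy circular Hough transform for one known radius (illustrative only).
def hough_circle_center(edge_pixels, radius, shape):
    """Vote for candidate circle centers; return the highest-scoring (y, x)."""
    h, w = shape
    acc = [[0] * w for _ in range(h)]
    for y, x in edge_pixels:
        # Each edge pixel votes for every center a circle of this radius
        # passing through it could have.
        for theta_deg in range(0, 360, 5):
            theta = math.radians(theta_deg)
            cy = round(y - radius * math.sin(theta))
            cx = round(x - radius * math.cos(theta))
            if 0 <= cy < h and 0 <= cx < w:
                acc[cy][cx] += 1
    return max(((acc[y][x], (y, x)) for y in range(h) for x in range(w)))[1]

# Synthetic "ball edge": pixels on a circle of radius 5 centered at (20, 20).
edges = [(round(20 + 5 * math.sin(math.radians(a))),
          round(20 + 5 * math.cos(math.radians(a))))
         for a in range(0, 360, 10)]
print(hough_circle_center(edges, 5, (40, 40)))  # at or near (20, 20)
```

A production system would typically run this over a range of radii on camera-derived edge maps (library implementations such as OpenCV's exist), but the voting principle is the same.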

[0014] In some embodiments, the example method includes processing the received game play data to determine a location of one or more billiard balls based on the received game play data. In such embodiments, the game play data can be image data of game play from a billiard table that is configured with a color variation on a perimeter of a playing surface of the billiard table. In such embodiments, the method can include processing the image data to produce color changed image data, where the color changed image data includes a modified color variation on the perimeter to improve contrast between the playing surface perimeter and a cushion of the table. The improved contrast can be used to determine a location of one or more billiard balls in the received game play data, such as whether a ball is in contact with a cushion.

[0015] In some embodiments, the example method includes receiving person location information for the given billiard game. In such embodiments, determining expected motion of balls on the billiard table can include processing the person location information. In some embodiments, the example method includes receiving image data of players of the given billiard game. In such embodiments, the method includes identifying an active shooting player in the given billiard game through processing of the received image data of the players, and determining expected motion of one or more balls on the billiard table can further include processing a player profile associated with the identified active shooting player. The method can include determining an identity of the active shooting player through use of facial recognition processing and accessing a player profile of the active shooting player using the determined identity.

[0016] In some embodiments, processing the received billiard game play data includes identifying spin and trajectory of billiard balls depicted in the billiard game play data. In some embodiments, the example method includes determining the location of one or more billiard balls on the billiard table to a sub-pixel level of granularity.

[0017] In some embodiments, the example method can include processing the received image data to predict or identify the location and amount of chalk deposits on the billiard table. In such embodiments, determining expected motion of the ball(s) further comprises processing the identified chalk deposits. This can involve a simulation of chalk dust particles (e.g., throwing up a cloud of dust and simulating how the dust particles disperse in the region).

[0018] Another aspect of the disclosed systems and methods is a system of cameras for capturing billiard game play. The example system includes an overhead camera, one or more robotically controllable cameras, and a control unit. The overhead camera is positionable above a billiard table and configurable to have billiard balls on the billiard table in a field of view of the camera. The robotically controllable camera(s) are positioned surrounding the billiard table. The control unit is communicatively coupled to the overhead camera and the robotically controllable camera(s) and is configured to control at least one of the overhead camera and the robotically controllable camera(s) to capture image data of billiard game play based on an expected billiard shot to be played.

[0019] The robotically controllable camera(s) can be configured to be moved to any position around the billiard table, can be controlled by the control unit to follow a given player, where the given player is identifiable through use of facial recognition processing, and can be controlled by the control unit to follow a person based on a role of the person. The control unit may control pan, tilt, zoom, position, focus, frame rate, and timing of the camera(s). The control unit may modify a speed at which the overhead camera and the robotically controllable camera(s) are controlled based upon a speed of a player at the billiard table. It should be appreciated that the robotically controllable camera(s) can be used in an environment other than a billiard table, such as a stage or arena, for example.

[0020] In some embodiments, the example system includes a simulation unit communicatively coupled to the control unit. The simulation unit can be configured to determine the expected billiard shot to be played and forward the determined expected shot to the control unit. Such embodiments can further include a stationary camera positionable to have a playing surface of the billiard table in a field of view of the stationary camera. The stationary camera can be configured to forward image data to the simulation unit. In some embodiments, the expected billiard shot to be played can be used by the control unit to control the cameras to capture an initial impact of the shot to be played and a final perspective of motion of the billiard balls.

[0021] Another aspect of the disclosed systems and methods is a lighting system for a billiard table. The lighting system can include surface lights in a playing surface of the billiard table, marker lights in upper surfaces of rails of the billiard table, cushion lights along rail cushions of the billiard table, and a controller coupled to the surface lights, marker lights, and cushion lights to control lighting characteristics of the surface lights, marker lights, and cushion lights. The surface lights, marker lights, and cushion lights can include light emitting diodes or end glow fiber optic strands. End glow fiber optic strands can be coupled to light emitting diodes. The lighting characteristics of the surface lights, marker lights, and cushion lights can include at least one of color, brightness, and flashing pattern. The controller can set the lighting characteristics of any of the surface lights, marker lights, and cushion lights to indicate states of billiard play. In some embodiments, at least one of the marker lights can include a plano-convex lens mounted between a light source and a spot diffuser to produce an evenly lit spot. The surface lights can include a head spot, a foot spot, a center spot, and an opening break spot. In some embodiments, the controller can cause the marker lights to: (a) illuminate white or yellow to designate a turn of a corresponding billiard player, (b) flash to warn of an impending time-out foul, (c) illuminate red to indicate an expired shot clock, (d) indicate a start or stop of billiard play, (e) illuminate to provide a count of points, or (f) a combination of two or more of results (a) through (e). It should be understood that other colors can be used. The colors that correspond to a billiard player can be the color of the player’s cue ball in billiard games that include different color cue balls for each player. 
The controller can cause the cushion lights to illuminate selectively to indicate aiming points for billiard play or to indicate contact points of billiard play that has occurred. In some embodiments, the lighting system includes floor lights coupled underneath the billiard table to project light below the billiard table. In such embodiments, the controller can cause the floor lights to: (a) illuminate red to indicate that a foul has been committed by a player or (b) illuminate white or yellow to designate a turn of a corresponding billiard player.

[0022] Another aspect of the disclosed systems and methods is a system for capturing, analyzing, and displaying billiard play. The system includes a billiard table, a lighting system to project light on the billiard table, a camera system, accelerometers to capture a sequence of contact events of billiard play, and a processor configured to analyze the captured images, position data, and sequence of contact events to create an output. The camera system includes an image capture camera to capture a plurality of images of the billiard table, and a position capture camera to capture a position of each object on the billiard table.

[0023] In some embodiments, the output can be sent by the processor to a device that communicates the billiard play to an audience. Such a device may broadcast the billiard play. In some embodiments, the system further includes a simulator that receives the output from the processor and simulates the billiard play. The simulator can provide instructions to the processor for controlling at least one of the image capture camera and the position capture camera, where the instructions are based on the simulated billiard play.

[0024] The output by the processor can be at least one of: (i) a video stream of the billiard play, (ii) an instruction to a billiard player, (iii) a video stream of the billiard play clipped and tagged for a plurality of events in the billiard play (e.g., based on timing, type, or magnitude of audience feedback), (iv) scoring of the billiard play, (v) remote play instructions, (vi) image or video projection of proposed billiard play onto the billiard table, (vii) instant-replay video stream of the billiard play, (viii) 3D rendered views from a given perspective of the billiard play, (ix) augmented reality views to enhance a video stream of the billiard play, (x) virtual reality views of the billiard play, (xi) mixed reality holographic views of the billiard play, (xii) summary views of the billiard play generated using images of object positions recorded when the objects stop moving, (xiii) a trigger for starting or stopping a timer for the billiard play, (xiv) a trigger for an anticipated high-speed video capture event for the billiard play, (xv) metrics for the billiard play to be layered on a video stream, and (xvi) synthesized sounds to provide real-time audio feedback of the billiard play. The anticipated high-speed video capture event can include any of a cue stick hitting a cue ball, a cue ball hitting a rail or another ball near a rail, and a final motion of a cue ball into a line toward a point scored. Metrics for the billiard play can include any of object speed, cut, and spin speed. Synthesized sounds can include any of an amplified collision sound effect, a whirring sound effect corresponding to spinning objects on the billiard table, real-time sound feedback indicators for fouls and miscues, sound effects indicating pending billiard play phases, and voice synthesis announcing scores or pre/post inning billiard play announcements. 
Remote play instructions can include directions for controlling at least one robot at a second billiard table, the second billiard table being separate from the billiard table. In such embodiments, the directions for controlling the robot(s) cause the robot(s) to move billiard balls on the separate billiard table to correspond to the billiard play on the billiard table. The robot(s) can be a cable driven parallel robot (CDPR), a plurality of wheeled micro robots, or spherical robots. The separate billiard table may be remotely located relative to the billiard table.

[0025] In some embodiments, the system includes a projector (e.g., LED or laser) arranged to project images on a billiard table. In such embodiments, the processor and at least one of the image capture camera and the position capture camera can be configured to track a tip of a billiard cue and cause the projector to draw reference lines and points on the billiard table based on tracked motion of the tip of the billiard cue. In such or other embodiments, the processor may cause the projector to draw visual trails behind the objects as the objects move on the billiard table.

[0026] In some embodiments, the system includes lights (e.g., LED lights) integrated into the billiard table and controlled by the processor. In such embodiments, the lights can be integrated (a) in a playing surface of the billiard table to project light through the playing surface, or (b) in rails of the billiard table to project light indicating states of billiard play. In such or other embodiments, at least a portion of the lights can be configured to be controlled by a player to turn on, turn off, or change color to keep track of billiard play score.

[0027] In some embodiments, the system can include a microphone to receive voice prompts from a player and forward the voice prompts to the processor. In such an embodiment, the processor can be configured to interpret the voice prompts received from the microphone and use the interpretation to control operation of the system.

[0028] In some embodiments, the camera system can include an infrared camera to sense heat caused by an object bouncing on the billiard table. In such an embodiment, the processor can be configured to determine (1) locations of the bounces based on the heat sensed by the infrared camera and (2) the timing of the bounces based on vibration data. This location and timing information can be used to determine a height that the object bounced.
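The height determination described here follows from projectile motion: a ball that leaves the slate and lands again after an interval t rises for t/2, so its peak height is h = g(t/2)^2/2 = g*t^2/8. A minimal sketch of that calculation (the interval value used below is illustrative):

```python
# Bounce height from the time between two successive bounces, using
# h = g * t^2 / 8 from projectile motion (t could be measured, e.g.,
# between vibration-detected slate impacts).

G = 9.81  # m/s^2, gravitational acceleration

def bounce_height(interval_s):
    """Peak height (m) reached between two bounces separated by interval_s."""
    # The ball rises for half the interval: h = 0.5 * G * (t/2)**2 = G*t**2/8
    return G * interval_s ** 2 / 8.0

print(round(bounce_height(0.4), 4))  # 0.4 s between bounces -> 0.1962 m
```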

[0029] Another aspect of the disclosed systems and methods is a billiard ball including a plurality of markers on a surface of the ball. The markers are arranged such that at least two of the markers are visible for any orientation of the ball. The markers can be arranged on the surface of the ball at points at which an inscribed cube would intersect the surface of the ball, and the markers can be oriented with respect to one another such that each pair of markers visible at one time is a unique combination of marker orientations.

[0030] While many aspects have been described in the context of a billiard table, they may equally be used in other environments, such as, for example, sporting arenas, outdoor events and competitions, and performance stages.

BRIEF DESCRIPTION OF THE DRAWINGS

[0031] The foregoing will be apparent from the following more particular description of example embodiments, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating embodiments.

[0032] FIG. 1A is a top view of a carom billiard table.

[0033] FIG. 1B is a top view of a pool table.

[0034] FIG. 2A is an elevation view schematic diagram illustrating a system for determining a sequence of contact events on a billiard table, according to an example embodiment.

[0035] FIG. 2B is a top view schematic diagram illustrating example accelerometer placement in a system for determining a sequence of contact events on a billiard table, according to an example embodiment.

[0036] FIG. 3 is a cross-sectional view of a billiard table showing an accelerometer positioned near a cushion of the billiard table, according to an example embodiment.

[0037] FIG. 4A is a cross-sectional view of a billiard table showing an accelerometer coupled to a vibration transfer bar mounted within a cushion of the billiard table, according to an example embodiment.

[0038] FIG. 4B illustrates an example vibration transfer bar with an accelerometer and associated electronic communication line.

[0039] FIG. 4C illustrates an example vibration transfer bar mounted within a cushion of a billiard table and coupled to an accelerometer.

[0040] FIG. 5A is an elevation view schematic diagram illustrating a system for determining a sequence of contact events on a billiard table, according to an example embodiment.

[0041] FIG. 5B is an elevation view schematic diagram illustrating a system for determining a sequence of contact events on a billiard table, according to an example embodiment.

[0042] FIG. 5C is an elevation view schematic diagram illustrating a system for determining a sequence of contact events on a billiard table, according to an example embodiment.

[0043] FIG. 6A is a schematic diagram illustrating a system for determining a sequence of contact events on a billiard table, according to an example embodiment.

[0044] FIG. 6B is a screen shot of an interface to a system for determining a sequence of contact events on a billiard table, according to an example embodiment.

[0045] FIG. 7A is a schematic diagram illustrating an example cartridge with sensor components that can be inserted into a receptacle in a corner of a billiard table, according to an example embodiment.

[0046] FIG. 7B is a set of schematic diagrams illustrating an example cartridge that can be inserted into a receptacle in a corner of a billiard table, according to an example embodiment.

[0047] FIGS. 7C and 7D illustrate an example cartridge that can be inserted into a receptacle in a corner of a billiard table, according to an example embodiment.

[0048] FIG. 8A is a flow chart illustrating determining the occurrence of a contact event, according to an example embodiment.

[0049] FIG. 8B is a graph illustrating an example frequency signature of a cue-stick-on-ball contact event as sensed through slate vibration sensors, according to an example embodiment.

[0050] FIG. 9A is a flow chart illustrating an example workflow for the system illustrated in FIG. 6, according to an example embodiment.

[0051] FIG. 9B is a flow chart illustrating determining whether a contact event has occurred in the context of FIG. 9A, according to an example embodiment.

[0052] FIG. 10 illustrates example calculations of heights of a ball bouncing on a surface.

[0053] FIG. 11 is a schematic diagram illustrating a lighting system, according to an example embodiment.

[0054] FIG. 12 is a schematic diagram illustrating a lighting system, according to an example embodiment.

[0055] FIG. 13 is a schematic diagram illustrating a lighting system, according to an example embodiment.

[0056] FIG. 14 is a schematic diagram illustrating a lighting system, according to an example embodiment.

[0057] FIG. 15 is a schematic diagram illustrating a grid of white light-emitting diodes (LEDs), according to an example embodiment.

[0058] FIG. 16 is a schematic diagram illustrating a grid of multi-colored light-emitting diodes (LEDs), according to an example embodiment.

[0059] FIG. 17 is a flow diagram illustrating calibrating the lighting system, according to an example embodiment.

[0060] FIGS. 18A and 18B are flow diagrams illustrating adjusting light sources of the lighting system, according to example embodiments.

[0061] FIG. 19 is a schematic diagram illustrating a lighting system, according to an example embodiment.

[0062] FIGS. 20A-D are schematic diagrams illustrating calibration of a lighting system, according to an example embodiment.

[0063] FIG. 21 is a schematic diagram illustrating adjusting light sources of a lighting system, according to an example embodiment.

[0064] FIGS. 22A-F are schematic diagrams illustrating adjusting light sources of a lighting system, according to an example embodiment.

[0065] FIG. 23A is a schematic diagram illustrating a portion of a billiard table with a color variation on a perimeter of a playing surface of the billiard table, according to an example embodiment.

[0066] FIGS. 23B and 23C illustrate a portion of a billiard table with a color variation on a perimeter of a playing surface of the billiard table, according to an example embodiment.

[0067] FIG. 24 is a top view schematic diagram illustrating a chalk deposit on a billiard table caused by a cue stick having chalk on its tip striking a cue ball.

[0068] FIG. 25A is a top view schematic diagram illustrating a system of cameras for capturing billiard game play, according to an example embodiment.

[0069] FIG. 25B is an elevation view schematic diagram illustrating a system of cameras for capturing billiard game play, according to an example embodiment.

[0070] FIG. 26A is an elevation view schematic diagram illustrating an example angle at which a contact event between a ball and a rail cushion can be viewed by a camera.

[0071] FIG. 26B is a perspective view schematic diagram illustrating an example angle at which a contact event between a ball and a rail cushion can be viewed by a camera.

[0072] FIG. 27 is a flow chart illustrating capturing images of a contact event, according to an example embodiment.

[0073] FIG. 28 is a top view schematic diagram illustrating an integrated lighting system for a billiard table, according to an example embodiment.

[0074] FIG. 29 is a schematic diagram illustrating lights arranged between the cushion structure of a billiard table and the playing surface of the table.

[0075] FIG. 30 is an elevation view schematic diagram illustrating floor lights coupled underneath a billiard table, according to an example embodiment.

[0076] FIGS. 31A-D are schematic diagrams illustrating configurations for producing spots of light on a billiard table, according to example embodiments.

[0077] FIG. 32 is a schematic diagram illustrating a billiard table and a cable driven parallel robot to move billiard balls on the table.

[0078] FIG. 33A is a schematic diagram illustrating a billiard table and a plurality of wheeled micro robots to move billiard balls on the table.

[0079] FIG. 33B is a schematic diagram illustrating a close-up of a corner of the billiard table of FIG. 33A.

[0080] FIG. 34A is an elevation view schematic diagram illustrating a projector arranged to project images on a billiard table, according to an example embodiment.

[0081] FIG. 34B is a top view schematic diagram illustrating visual trails projected onto a billiard table by a projector, according to an example embodiment.

[0082] FIG. 35 is a flow diagram illustrating aspects of remote billiard play, according to an example embodiment.

[0083] FIG. 36 is a flow diagram illustrating aspects of remote billiard play, according to an example embodiment.

[0084] FIG. 37 illustrates an example billiard ball spot pattern design that enables detection of movement of a billiard ball, according to an example embodiment.

[0085] FIGS. 38A and 38B illustrate an example billiard ball marker pattern design that enables a determination of an orientation of a billiard ball.

[0086] FIG. 39 illustrates a pattern that enables a computer vision system to determine the orientation of a ball, according to an example embodiment.

[0087] FIG. 40 is a set of schematic diagrams illustrating a pattern that enables a computer vision system to determine the orientation of a ball, according to an example embodiment.

DETAILED DESCRIPTION

[0088] A description of example embodiments follows.

[0089] FIG. 2A is an elevation view schematic diagram illustrating a system 200 for determining a sequence of contact events on a billiard table 205, according to an example embodiment. The table 205 of FIG. 2A could be, for example, either of the tables shown in FIGS. 1A and 1B. The system 200 includes a camera 210 positioned to view billiard balls 215a-c on the billiard table 205, accelerometers 220a-l (FIGS. 2A and 2B) in or on the table 205 that sense contact events on the billiard table 205, and a processor 225 that determines, based on information received by the camera 210 and the accelerometers 220a-l, a sequence of contact events involving at least one of the billiard balls 215a-c on the billiard table 205. The table 205 includes a playing surface 235 (FIG. 2B) of a substrate 240 (e.g., slate) on which the balls 215a-c sit and over which the balls 215a-c can roll. The table 205 also includes rail cushions 230a-d (FIG. 2B) along its perimeter. Example contact events can include contact between (i) two of the billiard balls 215a-c, (ii) one of the billiard balls 215a-c and one of the rail cushions 230a-d, (iii) one of the billiard balls 215a-c and the playing surface 235, and/or (iv) one of the billiard balls 215a-c and a cue stick (not shown). The processor 225 can determine whether a contact event between a billiard ball and a cue stick, for example, is a legal contact, a foul, a double-contact, a miscue, or indeterminate. As shown in FIGS. 2A and 2B, accelerometers 220a-h are associated with the playing surface 235 of the billiard table 205. The accelerometers 220a-h may be affixed to the bottom of the substrate 240 that includes the playing surface 235, or may be mounted within the substrate 240 that includes the playing surface 235. As shown in FIGS. 2A and 2B, accelerometers 220i-l can be built into the rail cushions 230a-d of the billiard table 205. While four accelerometers are shown as being built into the cushions, it should be appreciated that any number of accelerometers can be used.

[0090] FIG. 3 is a cross-sectional view of the billiard table 205 showing accelerometer 220i mounted within a structure (e.g., made of wood) of the billiard table 205 and positioned near cushion 230b of the billiard table. The accelerometer 220i may be a PCB Piezotronics ceramic shear ICP accelerometer (1000 mV/g) mounted behind wood facing in the rail upon which a rubber cushion is mounted.

[0091] FIG. 4A is a cross-sectional view of a billiard table 205 showing an accelerometer 220i coupled to a vibration transfer bar 405 mounted within a cushion 230b of the billiard table 205, according to an example embodiment. FIG. 4B illustrates an example vibration transfer bar 405 with an accelerometer 220i and associated electronic communication line 410. FIG. 4C illustrates an example vibration transfer bar 405 mounted within a cushion 230b of the billiard table and coupled to an accelerometer 220i. In the embodiment of FIGS. 4A-4C, the range of the accelerometer 220i is extended with the use of a vibration transfer bar ("contact rod") 405 to transfer vibrations from the cushion 230b to the accelerometer 220i. The transfer bar 405 can be mounted so that it does not touch the facing of the table, and a layer of rubber (e.g., 60 duro) or other dampening material can be mounted between the bar 405 and the facing of the table to allow measurable displacement as well as to dampen or protect the accelerometer 220i from being damaged during harsh compressions. The transfer bar 405 can be machined with a flat surface to mount (e.g., via stud mount) the accelerometer device (e.g., PCB Piezotronics 333B52 or 393B12). The configuration shown in FIGS. 4A-4C can transfer displacements measured from a collision anywhere along the length of the vibration transfer bar 405 to an accelerometer, e.g., 220i. The transfer bar 405 can be made from a rigid material (e.g., aluminum or beryllium) capable of transmitting vibrations with minimal absorption. Aluminum transfers vibrations at about 6,000 m/s, and beryllium transfers vibrations at about 13,000 m/s. The transfer bar 405 can alternatively be made of carbon fiber, for example, and may be at least a portion of a carbon fiber tube with an aluminum (or other material) component coupled to the carbon fiber to mount an accelerometer. The configuration shown in FIGS. 4A-4C extends the range of motion sensing, and can enable a system to detect even the slightest touches of the rail over 355 mm away from an accelerometer with a high signal-to-noise ratio.

[0092] FIG. 5A is an elevation view schematic diagram illustrating a system 500 for determining a sequence of contact events on a billiard table 205, according to an example embodiment. The system 500 includes a camera 210 positioned to view billiard balls 215a-c on the billiard table 205, accelerometers 220a-l in the table 205 that sense contact events on the billiard table 205, microphones 505a and 505b to receive sounds made by contact events, and a processor 225 that determines a sequence of contact events involving at least one of the billiard balls 215a-c on the billiard table 205 based on information received by the camera 210, the accelerometers 220a-l, and the microphones 505a and 505b. Additional information may also be considered in the determination. As shown, the microphones 505a and 505b can be suspended above the billiard table 205. Alternatively, microphones (not shown in FIG. 5A) can be built into the corners of the table 205. The processor 225 can be configured to account for latencies of the sounds received by the microphones 505a and 505b to synchronize the received sounds with the information received by the camera 210 and the accelerometers 220a-l.

[0093] FIG. 5B is an elevation view schematic diagram illustrating a system 510 for determining a sequence of contact events on a billiard table, according to an example embodiment. The system 510 includes a camera 210 positioned to view billiard balls 215a-c on the billiard table 205, accelerometers 220a-l in the table 205 that sense contact events on the billiard table 205, an infrared sensor 515 to sense heat caused by contact events, and a processor 225 that determines a sequence of contact events involving at least one of the billiard balls 215a-c on the billiard table 205 based on information received by the camera 210, the accelerometers 220a-l, and the infrared sensor 515.

[0094] FIG. 5C is an elevation view schematic diagram illustrating a system 520 for determining a sequence of contact events on a billiard table, according to an example embodiment. The system 520 includes a camera 210 positioned to view billiard balls 215a-c on the billiard table 205, accelerometers 220a-l in the table 205 that sense contact events on the billiard table 205, microphones 505a and 505b to receive sounds made by contact events, an infrared sensor 515 to sense heat caused by contact events, and a processor 225 that determines a sequence of contact events involving at least one of the billiard balls 215a-c on the billiard table 205 based on information received by the camera 210, the accelerometers 220a-l, the microphones 505a and 505b, and the infrared sensor 515.

[0095] FIG. 6A is a schematic diagram illustrating a system 600 for determining a sequence of contact events on a billiard table 605, according to an example embodiment. The billiard table 605 may be the billiard table 205 described above. Awareness of the nature and order of a series of contact events (e.g., ball collisions) is desired, for example, in order to score or account for gameplay automatically. There are many different types of contact events, including collisions of stick-to-ball (when a player’s cue stick contacts a cue ball), ball-to-ball (when two balls collide), ball-to-cushion (when a ball collides with a cushion), and ball-to-slate (a series of predictably-timed collisions as a ball bounces on the slate; see FIG. 10), and non-collision ball-rolling (not a collision event, but still a contact event: a sustained roll of the ball or the ball spinning in place can generate a predictable frequency).

[0096] According to the example system 600, a distributed set of sensing subsystems can be coordinated to acquire data, which can be used to detect and categorize the location and timing of contact events on the billiard table 605. Each subsystem may be implemented using a separate computer or server. The internal clocks of all subsystems can be synchronized (e.g., to the nanosecond), and each subsystem can output collected measurements using data feeds or streams. The streams can be fixed to a theoretical model of detected objects as well as their specific relation to the various sensors (e.g., accelerometers). This coordination of subsystems improves the quality of measurements by using all known information to correct for distortions of measurement, providing improved determination of the location and timing of motions and collisions that no one system can precisely achieve alone. The sensing subsystems can include camera(s) 610, infrared camera(s) 615, accelerometers 620, microphones 625a and 625b, temperature and barometric pressure sensors 665, a geometric model of the space (including objects and sensors), and a physics model of the environment. Processing subsystems can include a vision system 630, a motion server 635, and a vibration acquisition and analysis system (vibration server) 640. Such subsystems 630, 635, and 640 can be connected to a nanosecond time server (not shown) to synchronize all clocks. Data collected by the subsystems can be broadcast to the network 655 so that any other subsystem can archive and utilize the information acquired. This architecture also allows for distributed performance of various tasks and a reduction of processing bottlenecks throughout the system.
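The coordination described above can be sketched as a minimal publish/subscribe bus in which every measurement carries a timestamp from the synchronized clock. This is an illustrative sketch only; the `Measurement` and `EventBus` names and fields are invented for illustration, not taken from the patent.

```python
import time
from dataclasses import dataclass, field

# Hypothetical sketch: each sensing subsystem stamps its measurements with a
# shared (synchronized) nanosecond clock and publishes them to a stream that
# any other subsystem can consume or archive.

@dataclass
class Measurement:
    subsystem: str      # e.g., "vision", "vibration"
    timestamp_ns: int   # synchronized clock, nanosecond resolution
    payload: dict

@dataclass
class EventBus:
    subscribers: list = field(default_factory=list)

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, m: Measurement):
        # Broadcast the measurement to every subscribed subsystem.
        for cb in self.subscribers:
            cb(m)

bus = EventBus()
archive = []
bus.subscribe(archive.append)   # an archiving subsystem

bus.publish(Measurement("vision", time.monotonic_ns(),
                        {"ball": "cue", "xyz": (0.5, 1.2, 0.028)}))
```

In a real deployment the bus would be a network multicast rather than an in-process list, but the contract is the same: timestamped records that any subsystem can correlate against its own clock.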

[0097] The vision system 630 can be configured to interpret camera images against an expected model of the billiard playing environment. Image distortions must be taken into account if the positions of balls and other objects are to be determined precisely. Images read directly from a camera sensor 610 suffer from distortion (e.g., lens and perspective distortion). Distortions must normally be accounted for before generating a feed of precise positions for balls located in each video frame. An infrared camera 615 is able to measure changes in heat, which enables the measurement of the timing and location of collisions (e.g., ball-to-slate bounces).

[0098] Previous methods of locating and tracking the motion of objects on a visual field using computer vision generally involve massive amounts of image processing across the entire field of view. Dealing with high-resolution pictures involves un-distorting the entire image, and using image processing techniques to filter colors/shapes prior to performing edge detection to locate objects. Such a traditional process requires moving large amounts of memory (graphics buffers) around (from a capture board, to user and GPU space) and processing millions of pixels exhaustively before a result can be computed. If accurate locations must be achieved using a high-frame-rate camera, the performance demands increase tremendously.

[0099] Using a precise theoretical model of the field of search (e.g., a billiard table) and the objects being searched for (e.g., billiard balls), the problem can be “flipped around” completely, which achieves significant precision and speed performance gains. Rather than wasting valuable processing power on un-distorting large areas of the camera view, pixel by pixel, only to find that the objects being sought are not present in many areas, the system can instead operate on a mathematically distorted model, searching for synthesized theoretical objects as they are predicted to appear in the real world. This enables the system 600 to apply computationally expensive edge-detection algorithms narrowly, on hundreds (rather than millions) of pixels, and essentially reduces the search to efficiently scanning smaller buffers of localized raw (distorted) sensor data. Using a stream of position data for the balls on the table, the system can employ physics-based calculations to model expected motions (e.g., velocity, trajectory, acceleration, and spin) and then use predictive motion models to validate these positions with data collected by other subsystems.
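The “flipped around” idea can be sketched as follows: instead of un-distorting a whole frame, project a predicted ball position forward through a simple lens-distortion model and examine only a small window of raw pixels around it. The one-term radial model and the camera constants below are assumed values for illustration, not parameters from the patent.

```python
# Illustrative sketch (not the patent's implementation): map a predicted,
# undistorted normalized point into raw (distorted) pixel coordinates, then
# define a tiny region of interest around it for edge detection.

K1 = -0.12             # radial distortion coefficient (assumed)
FX = FY = 900.0        # focal lengths in pixels (assumed)
CX, CY = 640.0, 360.0  # principal point (assumed)

def project_distorted(x, y):
    """Map a normalized, undistorted point to raw (distorted) pixel coords."""
    r2 = x * x + y * y
    scale = 1.0 + K1 * r2           # one-term radial distortion model
    xd, yd = x * scale, y * scale
    return (FX * xd + CX, FY * yd + CY)

def search_window(pred_xy, radius_px=12):
    """Small pixel window around the predicted (distorted) ball location."""
    u, v = project_distorted(*pred_xy)
    return (int(u - radius_px), int(v - radius_px),
            int(u + radius_px), int(v + radius_px))

# A predicted cue-ball position requires scanning only a ~24x24 pixel window
# of raw sensor data instead of the full frame.
roi = search_window((0.25, -0.10))
```

Edge detection then runs only inside `roi`, which is how the search drops from millions of pixels to hundreds.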

[00100] The vibration acquisition and analysis system 640 can include an array of motion sensors 620 (e.g., eight slate accelerometers and twelve cushion accelerometers) and two microphones 625a and 625b to collect vibration data and archive significant portions of the data (e.g., amplitudes) as time-series data. On-demand Fast Fourier Transform (FFT) analysis and analysis logic, for example, can be used to determine the nature of collision peaks in vibrations. Due to differences in the speed of sound through various materials (e.g., wood, rubber, slate, and air), vibration events are picked up by the sensors 620 at various times for any particular physical collision event. The subsystem 640 may take into consideration the position of each sensor, what the materials are, and the known speeds of sound through the various materials. The timing for the vibration data can take latency information into account before an on-demand analysis occurs. Such latency information can be provided by the camera/vision subsystem 630.
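The per-sensor timing offsets described above amount to distance divided by the speed of sound in the intervening material. A minimal sketch, with approximate published sound speeds (the values are illustrative, not from the patent):

```python
import math

# Sketch: estimate the latency between a suspected contact point and a sensor
# as distance / speed-of-sound-in-material. Speeds are rough textbook figures.

SPEED_M_PER_S = {
    "slate": 4500.0,    # approximate
    "wood": 3800.0,     # approximate
    "air": 343.0,       # at about 20 degrees C
}

def sensor_latency_us(contact_xy, sensor_xy, material):
    """Latency, in microseconds, for a vibration to reach a sensor."""
    dx = contact_xy[0] - sensor_xy[0]
    dy = contact_xy[1] - sensor_xy[1]
    distance_m = math.hypot(dx, dy)
    return 1e6 * distance_m / SPEED_M_PER_S[material]

# A contact at the table center and a corner microphone ~1.4 m away through
# air: the sound arrives thousands of microseconds after the event.
latency = sensor_latency_us((1.27, 0.635), (0.0, 0.0), "air")
```

The same contact produces much smaller offsets for slate-mounted accelerometers, which is why each sensor needs its own latency before vibration windows can be aligned.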

[00101] The vibration acquisition and analysis system 640 can constantly receive vibration (including sound) data from a vibration analog-to-digital converter 660 and store the data in a vibration data store. The frequency of vibration data collection can be high (e.g., 25.6 kHz to 96 kHz) compared to the frequency of image collection (e.g., 30 or 60 frames per second). Vibration data can be obtained from a plurality of sensors embedded in the slate or rails of the table as described herein. An example sensitivity for such vibration measurement can be 10,000 mV per g for slate sensors. Microphones (e.g., four free-field microphones) can be mounted in the corners of the table and pointed toward the middle of the table. Sound-dampening material (e.g., a visco-elastic polymer, which converts physical vibrations to heat) can be used to mount the microphones to make sure that only air-borne sounds/vibrations are measured by the microphones. As used herein, the term “vibrations” can include sound waves, which are a type of vibration.

[00102] FIG. 6B is a screen shot of an interface 670 to a system for determining a sequence of contact events on a billiard table, according to an example embodiment. The example interface includes a representation 675 of shots recorded by the system. An animation of the shots can be viewed from multiple perspectives (e.g., a bird’s-eye view 680 and a perspective view 685). Sounds and vibrations associated with a shot can be viewed in graphical form 690.

[00103] FIG. 7A is a schematic diagram illustrating a cartridge 700 with sensor components. The cartridge 700 can be inserted into a receptacle in a corner of a billiard table. The sensors of cartridge 700 can include, for example, camera(s), microphone(s), heat sensor(s), and air pressure gauge(s). For example, the cartridge illustrated in FIG. 7A includes a camera 705, a microphone 710, and a temperature and barometric pressure sensor 715. The cartridge 700 and receptacle can be positioned in the corner of a table such that a billiard ball cannot make contact with the cartridge or receptacle given the dimensions of the ball and the corner of the table. The cartridge 700 and receptacle can be used to correctly position the sensors of the cartridge 700 in the environment in which they are being used. Correct positioning of the sensors, facilitated by the cartridge 700 and receptacle, improves the performance of the sensors themselves and, thus, improves the system’s ability to analyze potential contact events. The cartridge 700 or receptacle can also provide shielding (e.g., a visco-elastic polymer) to isolate unwanted vibrations from reaching the sensors 705, 710, and 715 in the cartridge 700. The cartridge can connect to a power source located in the receptacle behind the cartridge 700, as well as to data communication lines to other components of the system. Using such cartridges having different configurations of sensors makes a billiard table highly configurable. FIG. 7B is a set of schematic diagrams illustrating an example cartridge that can be inserted into a receptacle in a corner of a billiard table. FIGS. 7C and 7D illustrate an example cartridge that can be inserted into a receptacle in a corner of a billiard table, according to an example embodiment. FIG. 7C shows the example cartridge positioned in front of the corner of the billiard table. FIG. 7D shows the example cartridge positioned above the corner of the billiard table.

[00104] FIG. 8A is a flow chart illustrating a method 800 of determining the occurrence of a contact event, according to an example embodiment. The example method 800 includes obtaining 805 images of a billiard table and analyzing 810 the images to determine a potential contact event between two objects. The method 800 further includes determining 815 a location of the potential contact event based on locations of the objects, and determining 820 a type and time of the potential contact event based on vibration/sound data. For example, video frames can be obtained from an overhead camera and used to detect motion and find locations of potential contact events. If two billiard balls, for example, are seen as touching in a given frame, then there exists visual evidence that a contact event has occurred, and the location (X-Y-Z position) of that contact event can be calculated from the locations (X-Y-Z positions) of the billiard balls. It may be the case that the time when two objects are touching is very short, and visual evidence may not exist because images recorded by a camera may only show the balls before and after the contact occurred. In such a case, previous frames and model-based information can be used to determine an implied contact (e.g., from frames of balls before contact and after contact), and a collision location and time can be calculated based on those frames. Another form of contact evidence can be obtained from heat signatures recorded by an infrared camera because a ball bouncing on the surface of the table leaves relatively bright heat marks that can be sensed by an infrared camera.
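The visual-evidence check above reduces to a distance test: two balls are candidates for a contact event when their centers are at most two radii apart. A minimal sketch, assuming a standard 2.25-inch ball and an invented measurement tolerance:

```python
import math

# Sketch of the "balls seen as touching" test: flag a potential ball-to-ball
# contact when the center-to-center distance is at most two radii (plus a
# small tolerance for measurement noise), and report the contact location as
# the midpoint between the centers. The tolerance value is assumed.

BALL_RADIUS_M = 0.028575  # half of a standard 2.25-inch billiard ball
TOLERANCE_M = 0.002       # assumed measurement tolerance

def potential_contact(ball_a_xyz, ball_b_xyz):
    """Return the midpoint of a potential ball-to-ball contact, or None."""
    d = math.dist(ball_a_xyz, ball_b_xyz)
    if d <= 2 * BALL_RADIUS_M + TOLERANCE_M:
        return tuple((a + b) / 2 for a, b in zip(ball_a_xyz, ball_b_xyz))
    return None

# Two balls exactly touching along the x-axis:
hit = potential_contact((0.0, 0.0, 0.028575), (0.05715, 0.0, 0.028575))
```

When no frame shows the balls touching, the same test can be run on interpolated positions between consecutive frames to find the implied contact time.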

[00105] A motion server 635 can be used to analyze the location information for each object on the table to determine locations of potential contact events. The motion server 635 can use a virtual model of the table (representing locations of sensors and sensor types or other identifying information) to calculate distances between the sensors (e.g., accelerometers 620 and microphones 625a and 625b) and the possible contact event. These distances can be important because of latencies between a contact event and the time that evidence of that contact (e.g., vibrations) reaches a sensor. Thus, the sensed vibrations can be normalized based on time delays due to the time it takes for vibrations to travel through the various media of the table (e.g., slate, wood, air). For example, the center of a table is often the furthest point from all microphones, and it can take the sound of a contact event thousands of microseconds to travel to the microphones. These sensed sounds can be normalized by accounting for air temperature and pressure to determine the speed of sound through air at a given time.
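The temperature correction mentioned above can be sketched with the standard linear approximation for the speed of sound in dry air (about 331.3 m/s at 0 °C, rising roughly 0.606 m/s per degree). The formula is a textbook approximation, not taken from the patent:

```python
# Sketch of the normalization step: a microphone's latency for a given
# distance depends on the current speed of sound in air, which varies with
# temperature. The linear formula below is a standard approximation.

def speed_of_sound_air(temp_c):
    """Approximate speed of sound in dry air, in m/s."""
    return 331.3 + 0.606 * temp_c

def mic_latency_us(distance_m, temp_c):
    """Microseconds for a contact sound to reach a microphone."""
    return 1e6 * distance_m / speed_of_sound_air(temp_c)

# Sound from the table center reaching a corner microphone ~1.4 m away:
cold = mic_latency_us(1.4, 10.0)   # cooler room, slower sound
warm = mic_latency_us(1.4, 30.0)   # warmer room, faster sound
```

Over a 20-degree swing the difference is on the order of a hundred microseconds, which matters at the system's vibration sample interval of roughly 10 microseconds.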

[00106] Determining the type and accurate time of the potential contact event based on vibration/sound data can be performed by another component (e.g., a vibration server 640). The motion server 635 can send a request to the vibration server 640 to validate the suspected contact event. The request can include the approximate time (based on visual evidence) of the potential contact event, a time window, a list of sensors, and latencies of those sensors. For example, in the case of a potential contact event at a corner of the table, the motion server 635 can determine the latencies of the closest sensors (e.g., nearest microphone, nearest slate sensor, and nearest cushion sensor) and provide a list of those sensors and their latencies to the vibration server 640.

[00107] Based on collected vibrations (including sounds), the vibration server 640 can determine the type of contact event in a number of ways. One example way is for the vibration server 640 to compare the frequency of the contact event with a number of frequency signatures corresponding to different types of contact events sensed by the sensors. FIG. 8B is a graph illustrating an example frequency signature 825 of a cue-stick-on-ball contact event as sensed through slate vibration sensors. Based on sample frequency data corresponding to known cue-stick-on-ball contact events, the frequency signature can be generated by determining minimum 830 and maximum 835 frequencies of the sample frequency data. Frequency signatures can be determined for each contact event type (ball-ball, stick-ball, etc.) and for each sensor type (slate sensor, microphone, etc.). The contact event type can be classified by comparing the vibration data received by the sensors for the particular time/window with the signatures for each corresponding sensor type. In many cases, the system may perform at least one comparison for each type of sensor (e.g., microphone, slate sensor, and cushion sensor). Slate vibration sensor values can be represented in G-forces (e.g., 0.0001 Gs, where 10,000 mV from the sensor equals one G), and microphone values can be represented in decibels. The time of the contact event can be determined based on the first evidence of collision, that is, the beginning of a curve of vibration amplitude.
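The min/max signature comparison above can be sketched as a band check: an observed dominant frequency matches an event type when it falls inside that type's [minimum, maximum] band for the recording sensor type. The bands below are invented placeholders, not measured values from the patent:

```python
# Sketch of signature-based classification for one sensor type. Each contact
# event type gets a frequency band (min 830, max 835) learned from sample
# data; an observed dominant frequency matches the bands that contain it.
# All band values here are assumed for illustration.

SIGNATURES_HZ = {                 # per-slate-sensor bands (assumed numbers)
    "stick-ball": (2000.0, 3500.0),
    "ball-ball": (4000.0, 6500.0),
    "ball-slate": (300.0, 900.0),
}

def classify(dominant_freq_hz):
    """Return contact-event types whose band contains the observed frequency."""
    return [event for event, (lo, hi) in SIGNATURES_HZ.items()
            if lo <= dominant_freq_hz <= hi]

matches = classify(2800.0)
```

In practice the server would run this check per sensor type (microphone, slate, cushion) and combine the results, since overlapping bands can yield more than one candidate type.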

[00108] Another way for the vibration server 640 to determine the type of contact event is to use machine learning. For example, a convolutional neural network can be trained using training data of sensor outputs and contact types. Once the neural network is trained using a sufficient amount of data, sensor outputs can be provided as input to the neural network, which can return a classification (e.g., a type) of the contact event.

[00109] The vibration server 640 can return one or more contact event classifications and corresponding times, and can even return a more precise location of the contact event based on the difference between sensor locations on the table. The optional location data can be used to replace the location data previously determined by the motion server 635, or for validation purposes.

[00110] FIG. 9A is a flow chart illustrating an example workflow 900 for the system 600 illustrated in FIG. 6A. The system 600 can track 905 locations of billiard balls on the billiard table and collect 910 data regarding vibrations and sounds made by billiard balls or other objects using sensors embedded in or associated with the billiard table. For example, the video camera 610, connected to a frame-grabber of the vision system 630, can track locations of objects and generate a multicast User Datagram Protocol (UDP) network data stream of positions (X-Y-Z) of the balls for each frame of video (according to the above-described model-centric approach). The vibration acquisition and analysis system 640 can collect vibration/sound data from, for example, twenty sensors and two microphones and archive the data into a vibration data store. The system 640 can sample at a rate of, for example, 96 kHz, which provides a sample interval of about 10 microseconds, a greater level of granularity than a camera alone can achieve.

[00111] Locations of possible contact events involving the billiard balls can be determined 915 based on the tracked locations of the billiard balls. For a possible contact event, the system can determine 920, based on the vibration or sound data, whether a contact event has occurred, including a time and type of the contact event. For example, the vibration acquisition and analysis system 640 can wait for remote procedure call (RPC) requests and latency information from the vision system 630 to qualify the type and timing of collision events. A dedicated motion server 635 can use the position data streams to compute smooth trajectories (e.g., directional velocities and accelerations) to predict locations of possible collisions.
The motion server 635 can make RPC requests to the vibration acquisition and analysis system 640 for expected collision events (e.g., any time an object has changed direction or may have changed direction) to determine a time and type of collision event, if any. Analysis of vibration data is computationally expensive, so reference to vibration data can be limited to determining collisions (e.g., ball-to-ball, ball-to-slate, ball-to-cushion, and cue-to-ball collisions). The motion server 635 can predict collision events, compute theoretical latencies for collision vibration data, and then send a query (with the computed sensor latencies) to the vibration acquisition and analysis system 640, which can apply the latencies to the data before computing a result: a time and type of collision event. The result can be sent back to the motion server 635, which can store 925 the contact event as a discrete event with its timing into a database for review using a user interface system 650. The user interface system 650 can be a web-based user interface used to analyze the billiards data by synchronizing the data into, for example, videos, 3D models, vibration data, or discrete events.

[00112] FIG. 9B is a flow chart illustrating determining 920 whether a contact event has occurred. The system 600 can determine 930 distances between a location of a possible contact event and locations of a plurality of vibration sensors (including, e.g., sound sensors). For each of the plurality of vibration sensors, the system can determine 935 a latency for the sensor based on a speed of vibration through the material between the location of the possible contact event and the sensor. The system can determine 940 an approximate time of the possible contact event, a list of sensors to use, and latency information for the sensors. Based on vibration/sound data from the list of sensors to use and the latency information for the sensors, the system can determine 945 whether a collision event has occurred at about the approximate time.

[00113] It should be appreciated that while many of the above aspects have been described in the context of a billiard table, they may equally be used in other environments, such as, for example, in sporting arenas and on performance stages.

[00114] FIG. 10 illustrates example calculations of heights of a ball bouncing on a surface based on reaction phenomena. A reaction phenomenon can refer to a sequence of contact events that can provide additional information. For example, the timing between ball-slate collision events, when a billiard ball bounces on the table, can be used to calculate a height (Z-value) of each bounce of the ball, without regard to the ball’s X-Y position. Once bouncing is detected, additional bounces will continue until the ball comes to rest. The system can use such reaction phenomenon information to determine height values to construct true X-Y-Z positions of the billiard balls, which can then be rendered graphically.
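The height-from-timing calculation follows from standard projectile kinematics: between two successive slate impacts the ball is in free flight for t seconds, spending t/2 rising and t/2 falling, so the bounce peaks at h = g·t²/8. A minimal sketch of that relationship:

```python
# Sketch of the bounce-height calculation: if successive ball-to-slate
# impacts are separated by t seconds, the peak height of that bounce is
# h = g * t^2 / 8 (standard free-fall kinematics; no X-Y position needed).

G = 9.81  # gravitational acceleration, m/s^2

def bounce_height_m(t_between_impacts_s):
    """Peak height of a bounce from the time between slate impacts."""
    return G * t_between_impacts_s ** 2 / 8.0

# Impacts 0.2 s apart imply a bounce peaking at about 4.9 cm:
h = bounce_height_m(0.2)
```

Because each successive bounce loses energy, the inter-impact intervals shrink in a predictable pattern until the ball comes to rest, giving the system a Z-value for every bounce from timing alone.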

[00115] Another aspect of the disclosed systems and methods is a lighting system. FIG. 11 is a schematic diagram illustrating a lighting system 1100, according to an example embodiment. The lighting system 1100 includes a plurality of light sources 1105, which can be arranged to project light toward a subject area, such as a billiard table playing surface. The plurality of light sources 1105 can be an arbitrary number of light sources, which can include LED light sources. While the light sources 1105 shown in FIG. 11 are arranged in a grid, the light sources 1105 may be differently arranged. For example, rows may be offset from each other, or the light sources may be arranged in a circular pattern. The light sources may be referred to herein as “luxels.” A luxel may be considered to be a pixel of light, but, unlike a pixel, a luxel does not have to be part of a display screen and may be of any arbitrary size. The system 1100 also includes a camera 1110, positioned to view the subject area and to sense lighting characteristics (e.g., brightness and color) of the subject area, and a controller 1115, communicatively coupled to the light sources 1105 and the camera 1110. The controller 1115 receives lighting characteristics sensed by the camera 1110 and adjusts at least a portion of the light sources 1105 based on the lighting characteristics received from the camera 1110 to achieve desired lighting conditions for an area illuminated by the system 1100. The system 1100 may be configured to provide any variety of desired lighting conditions, such as uniform lighting of an area.

[00116] FIG. 12 is a schematic diagram illustrating a lighting system 1200, according to an example embodiment. Similar to the system 1100 of FIG. 11, the lighting system 1200 includes a plurality of light sources 1205, camera 1210, and controller 1215. In FIG. 12, the light sources 1205 are arranged in a grid with an opening in the middle of the grid through which the camera 1210 can be positioned to view a subject area.

[00117] FIG. 13 is a schematic diagram illustrating a lighting system 1300, according to an example embodiment. The lighting system 1300 includes a plurality of light sources 1305a-f arranged to project light toward a subject area 1320, shown as an athletic field of a stadium. The system 1300 also includes a camera 1310, positioned to view the athletic field 1320 and to sense lighting characteristics of the field 1320, and a controller 1315, communicatively coupled to the light sources 1305a-f and the camera 1310. The controller 1315 receives the lighting characteristics sensed by the camera 1310 and adjusts at least a portion of the light sources 1305a-f based on the lighting characteristics received from the camera 1310.

[00118] FIG. 14 is a schematic diagram illustrating a lighting system 1400, according to an example embodiment. The lighting system 1400 includes a plurality of light sources 1405a,b arranged to project light toward a subject area 1420, shown as a stage. The system 1400 also includes a camera 1410, positioned to view the stage 1420 and to sense lighting characteristics of the stage 1420, and a controller 1415, communicatively coupled to the light sources 1405a,b and the camera 1410. The controller 1415 receives the lighting characteristics sensed by the camera 1410 and adjusts at least a portion of the light sources 1405a,b based on the lighting characteristics received from the camera 1410.

[00119] FIG. 15 is a schematic diagram illustrating a grid of white LEDs, according to an example embodiment. The letter “W” designates white LEDs. FIG. 16 is a schematic diagram illustrating a grid of multi-colored LEDs, according to an example embodiment. The grid of LEDs includes alternating rows of white and multi-colored LEDs (e.g., row 1605 is a row of white LEDs and row 1610 is a row of multi-colored LEDs). Where they appear, the letter “W” designates a white LED, the letter “R” designates a red LED, the letter “G” designates a green LED, and the letter “B” designates a blue LED (e.g., LED 1615 is white, LED 1620 is red, LED 1625 is green, and LED 1630 is blue). In addition to adjusting the brightness of a subject area, a lighting system using multi-colored lighting sources (such as the lighting sources of FIG. 16) can adjust the color of light being projected on the subject area.

[00120] FIG. 17 is a flow chart 1700 illustrating calibrating a lighting system, e.g., 1100, according to an example embodiment. The calibration can be carried out by a controller (e.g., controller 1115, 1215, 1315, or 1415 of systems 1100, 1200, 1300, or 1400). Calibration of the lighting system includes capturing and storing a map of each luxel’s influence within a region of interest. The region of interest can be defined as a set of arbitrary (e.g., four or more) points in a subject area. The calibration process captures (1705) a Base Reference Image of the region of interest. The Base Reference Image can be represented by an arbitrary brightness value for each of the plurality of points in the region of interest (e.g., a brightness value of “0” can represent a minimum brightness, and a brightness value of “255” can represent a maximum brightness). The process continues by determining the influence on the region of interest of each luxel of the lighting system. A specific luxel may be referred to as “Luxel K,” where K is the number of the specific luxel. For each Luxel K (where K increments from 1 to N, N being the number of luxels) (1710), the process sets (1715) Luxel K to its maximum brightness, captures and stores (1720) an Image K (e.g., in a format similar to the Base Reference Image) of the region of interest, computes (1725) the influence of Luxel K based on the difference between Image K and the Base Reference Image, and stores (1730) the influence of Luxel K as Influence K (e.g., in a format similar to the Base Reference Image). Influence K can be considered an influence map of Luxel K. The result of the calibration is an influence map for each luxel of the lighting system.
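The calibration loop can be sketched with images reduced to flat lists of per-point brightness values. The `Luxel` class and the simulated capture function below are stand-ins for real luxel-control and camera APIs (assumed for illustration, not from the patent); the simulation also ignores brightness scaling, since calibration only ever lights a luxel at maximum.

```python
# Sketch of the FIG. 17 calibration: capture a Base Reference Image with all
# luxels off, then for each Luxel K light it at maximum, capture Image K, and
# store Influence K = Image K minus the Base Reference Image.

class Luxel:
    def __init__(self, influence):
        self.influence = influence     # brightness this luxel adds per point
        self.max_brightness = 255
        self.level = 0

    def set_brightness(self, level):
        self.level = level

def make_capture(luxels, ambient):
    """Simulated camera: ambient light plus each lit luxel's contribution."""
    def capture_image():
        img = list(ambient)
        for lux in luxels:
            if lux.level > 0:
                img = [p + i for p, i in zip(img, lux.influence)]
        return img
    return capture_image

def calibrate(luxels, capture_image):
    """Return one influence map per luxel (per-point added brightness)."""
    for lux in luxels:
        lux.set_brightness(0)
    base = capture_image()                        # Base Reference Image (1705)
    influences = []
    for lux in luxels:                            # for each Luxel K (1710)
        lux.set_brightness(lux.max_brightness)    # 1715
        image_k = capture_image()                 # 1720
        influences.append([a - b for a, b in zip(image_k, base)])  # 1725/1730
        lux.set_brightness(0)
    return influences

luxels = [Luxel([5, 0]), Luxel([0, 7])]
maps = calibrate(luxels, make_capture(luxels, ambient=[10, 10]))
```

Subtracting the Base Reference Image removes ambient light, so each stored map isolates exactly what that one luxel contributes to each point of the region of interest.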

[00121] FIG. 18A is a flow chart illustrating a method 1800 of adjusting light sources (e.g., luxels) of the lighting system, according to an example embodiment. The adjustment of light sources may be carried out by a controller (e.g., controller 1115, 1215, 1315, or 1415 of systems 1100, 1200, 1300, or 1400). The goal of luxel adjustment is to achieve, within a certain error threshold or close to a local error minimum, a Target Image 1805, which may be represented by a brightness value for each point in the region of interest (e.g., in a format similar to the Base Reference Image format). The Target Image and error threshold may be defined manually or dynamically (automatically). According to the example process, all luxels are set 1810 to an initial brightness value. For example, on a scale of 0 to 4 (where “0” represents the minimum amount of brightness that can be produced by a luxel and “4” represents the maximum amount of brightness that can be produced by a luxel), each luxel can be set to an initial brightness value of 2, which can represent about half of the amount of brightness that can be produced by a luxel. The process then iterates through the following actions until the Target Image is achieved within the error threshold or close to the error minimum. Capture 1815 a Current Image of the region of interest, and determine 1820 an Image Difference based on the difference between the Target Image and the Current Image (e.g., using a sum-of-squares difference measurement). If the Image Difference is greater than the error threshold 1825, then for each Luxel K (where K increments from 1 to N, N being the number of luxels) and for each Brightness Value P (where P increments from 0 to Q, Q being the number of brightness values) 1830, combine 1835 the Current Image with the Influence K for Luxel K at the Brightness Value P to determine an Anticipated Image Difference K,P. The Anticipated Image Difference K,P may be represented, for example, by a numerical value computed using a sum-of-squares measurement. Considering all brightness values for all luxels results in N x Q anticipated image differences. The Luxel K and Brightness Value P corresponding to the Difference K,P with the lowest value is selected 1840, and Luxel K is set 1845 to Brightness Value P. When the Image Difference is less than or equal to the error threshold or the error minimum is met 1825, the process ends 1850.
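One iteration of this greedy search can be sketched as follows. This is a simplified illustration: it assumes influence maps scale linearly with the brightness setting and ignores each luxel's existing contribution to the Current Image, neither of which the patent specifies.

```python
# Sketch of one greedy step of FIG. 18A: for every (Luxel K, Brightness P)
# pair, anticipate the resulting image and its sum-of-squares difference from
# the Target Image, then pick the pair with the lowest anticipated difference.

def sum_sq_diff(a, b):
    """Image Difference: sum of squared per-point differences."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def best_adjustment(current, target, influences, levels=range(5)):
    """Return the (luxel index, brightness) minimizing anticipated difference."""
    best = None
    for k, inf in enumerate(influences):          # for each Luxel K...
        for p in levels:                          # ...and Brightness Value P
            scale = p / (len(levels) - 1)         # fraction of max brightness
            anticipated = [c + scale * i for c, i in zip(current, inf)]
            d = sum_sq_diff(anticipated, target)  # Anticipated Difference K,P
            if best is None or d < best[0]:
                best = (d, k, p)
    return best[1], best[2]

# One dark point that luxel 0 influences strongly: the greedy step should
# choose luxel 0 at full brightness.
k, p = best_adjustment(current=[10, 50], target=[50, 50],
                       influences=[[40, 0], [0, 40]])
```

The full method repeats this step, re-capturing the Current Image each time, until the Image Difference falls within the error threshold or stops improving (a local minimum).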

[00122] FIG. 18B is a flow chart illustrating a method 1855 of adjusting light sources (e.g, luxels) of the lighting system, according to an example embodiment. Another goal of luxel adjustment is to achieve, either within a certain error threshold or a local error minimum, a Policy, which may be represented by a certain rules or measurements. Example policies can include, for example, creating a uniform brightness on a surface, or minimizing shadows created by objects on the surface. Another example of a policy can be to optimize the contrast of ball edges, which is useful for computer vision edge detection purposes. A high contrast along the edges of ball tends to be more computer-vision friendly. In addition, the system can have multiple policies, in which case the system can find a balance between satisfying the policies (e.g, average threshold value or greatest number of rules satisfied for the polices). According to the example process, a Policy is determined (e.g., set) 1860 and all luxels are set 1865 to an initial brightness value. For example, on a scale of 0 to 4 (where“0” represents a minimum amount of brightness that can be produced by a luxel and“4” represents a maximum amount of brightness that can produced by a luxel), each luxel can be set to an initial brightness value of 2, which can represent about half of the amount of brightness that can be produced by a luxel. The process then iterates through the following actions until the Policy is achieved within an error measurement threshold or achieved by reaching a local error minimum. Capture 1870 a Current Image of the region of interest, and determine 1875 an Error Measure based on the Policy and the Current Image (e.g, in the case of a uniform brightness policy, a standard deviation among pixels of the Current Image can be used). 
If the Error Measure is greater than the error threshold or minimum 1880, then for each Luxel K (where K increments from 1 to N, N being the number of luxels), compute 1885 the influence of Luxel K (Influence K), and combine 1887 Influence K with the Current Image to determine an Anticipated Error Measure K. The Luxel K corresponding to the Anticipated Error Measure K with the lowest value is selected 1890, and Luxel K is adjusted 1892 according to the Policy. When the Error Measure is less than or equal to the error threshold or has reached a local error minimum 1880, the process ends 1895.
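As an illustration of the Error Measure for the uniform-brightness policy mentioned above, a minimal sketch (the function names are assumptions, not from the application) using the standard deviation among pixels of the Current Image:

```python
import statistics

# Illustrative Error Measure for the uniform-brightness policy: the standard
# deviation among pixels of the Current Image, as suggested in the text.
# Function names are assumptions, not from the application.

def uniformity_error(current_image):
    """Lower is better; 0.0 means perfectly uniform brightness."""
    return statistics.pstdev(current_image)

def policy_met(current_image, error_threshold):
    """True when the Policy is achieved within the error threshold."""
    return uniformity_error(current_image) <= error_threshold

print(uniformity_error([3, 3, 3, 3]))  # 0.0: perfectly uniform
print(uniformity_error([0, 4, 0, 4]))  # 2.0: strongly non-uniform
```

Other policies (e.g., shadow minimization or edge contrast) would substitute a different error function while the surrounding iteration stays the same.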

[00123] FIG. 19 is a schematic diagram illustrating a lighting system 1900, according to an example embodiment. The lighting system 1900 includes a luminaire 1930 having a plurality of lighting sources (“luxels”) 1905 and an opening in the middle to position the camera 1910 of the system 1900. The luminaire 1930 is arranged to project light toward a subject area (region of interest) 1925 of a surface 1920, and the camera 1910 is positioned to view the region of interest 1925. As shown in FIG. 19, each luxel of the luminaire 1930 projects light substantially on a discrete number of points in the region of interest. For example, luxel 1905a projects light substantially on location 1935a on the surface 1920, and luxel 1905b projects light substantially on location 1935b on the surface 1920.

[00124] FIGS. 20A-D are schematic diagrams illustrating calibration of a lighting system, according to an example embodiment. The number of luxels shown in the figures is relatively small, and the figures show only a subset of the calibration steps described above for illustration purposes. FIG. 20A illustrates an example influence of one luxel 2005a on a region of interest 2010. Luxel 2005a projects varying amounts of light on different points in one location 2015a of the region of interest 2010. The brightness value of luxel 2005a is represented by a numerical value (e.g., the number “4” representing the brightest value of the luxel in this example), and the amounts of light projected on the points in location 2015a are represented by numerical values. The numerical values of each point in the region of interest 2010 can be used to represent the influence of luxel 2005a. Similarly, luxel 2005b projects varying amounts of light on different points in another location 2015b of the region of interest 2010, luxel 2005c projects varying amounts of light on different points in another location 2015c of the region of interest 2010, and luxel 2005d projects varying amounts of light on different points in another location 2015d of the region of interest 2010. The locations (e.g., 2015a-d) can overlap. The calibration results in an influence map for each of the luxels, as described above in connection with FIG. 17.

[00125] FIG. 21 is a schematic diagram illustrating adjusting light sources of a lighting system, according to an example embodiment. FIG. 21 illustrates how setting the brightness values of the luxels of a luminaire 2105 to the same level can result in different amounts of light being projected on various points in a region of interest 2110. The numerical values shown in the region of interest 2110 can be used to represent the Current Image 2115, as described above in connection with FIG. 18A.
Image 2120 is shown as an example representation of a Target Image as described above in connection with FIG. 18A. Image 2125 is shown as an example representation of an Image Difference as described above in connection with FIG. 18A, calculated using a sum-of-squares calculation.

[00126] FIGS. 22A-F are schematic diagrams illustrating adjusting light sources of a lighting system, according to an example embodiment. FIGS. 22A-F show a subset of steps of the process 1800 described above in connection with FIG. 18A. FIGS. 22A-C show the computation of three example Anticipated Difference Images 2225a-c, and corresponding difference values, when adjusting luxel 2205a (also denoted “L(0,0)”) to three example brightness values (e.g., 3, 4, and 5). FIGS. 22D-F show the computation of three other example Anticipated Difference Images 2225d-f, and corresponding difference values, when adjusting luxel 2205b (also denoted “L(2,0)”) to three example brightness values (e.g., 1, 3, and 4). For the sake of illustration, considering just these six difference values associated with the six example Anticipated Difference Images 2225a-f, the scenario of FIG. 22B would be selected as it is associated with the smallest difference value (a value of 40). As such, luxel 2205a would be set to brightness value “4”, and the process 1800 described in connection with FIG. 18A would continue until a target image (within a certain threshold) is reached.

[00127] Another aspect of the disclosed systems and methods is a color variation along a perimeter of the billiard table’s surface. FIG. 23A is a schematic diagram illustrating a portion of a billiard table 2300 with a color variation 2305 on a perimeter of a playing surface 2310 of the billiard table, according to an example embodiment. The color variation 2305 can be located under an overhang of the cushion 2315 of the table. The cushion 2315 typically casts a shadow along the perimeter of the playing surface 2310. As such, the color variation 2305 can be selected to blend with the shadow. For example, if the color of the playing surface 2310 is green, the color of the color variation 2305 can be blue, which can help reduce its noticeability by a user of the table. A system, such as the system 200 illustrated in FIG. 2A, can process an image of the table (e.g., obtained by camera 210 or cameras 2505 or 2510 (FIGS. 25A and 25B)) to increase contrast between the color variation 2305 on the perimeter and the color of the playing surface 2310. For example, a color variation 2305 that is blue can be digitally converted to white in an image of the table, which increases the contrast (e.g., white against green). The increased contrast in the digitally-altered image allows for a better visual determination of a location of a billiard ball with respect to a cushion. Such a determination is important in many billiard games in which it must be determined whether a ball has made contact with a cushion. FIGS. 23B and 23C illustrate a portion of a billiard table with a color variation 2305 on a perimeter of a playing surface 2310 of the billiard table. The color variation 2305 is shown in FIG. 23B as digitally converted to a high-contrast color (e.g., blue to white) compared to the playing surface 2310. Thus, it can more easily be determined whether a ball 2320 is in contact with a cushion 2315. In FIGS. 23B and 23C, the ball 2320 is not in contact with cushion 2315. FIG. 23B shows this by the color variation 2305 being visible between the ball 2320 and the cushion 2315. The improved contrast can help make automated determinations more accurate as it improves processing of an image of the ball and cushion. The color variation can be achieved, for example, by using a different color cloth in place of the existing cloth, changing the color of the existing cloth, or affixing an additional layer (cloth tape or other material) on top of the existing cloth.

[00128] Another aspect of the disclosed systems and methods is simulation of expected trajectories of objects (e.g., billiard balls). Such simulation can be used for predicting where billiard balls will be at a future time, for training, for solving problems, for gaming, and for anticipating future shots and camera positions. Various levels of simulation can be performed, including (1) simulating where a ball will be from time t to time t+1, (2) simulating where all balls will be from time t0 to tn for a given billiards shot, and (3) determining an “optimal” shot to take from all possible shots, taking into account, for example, chances of scoring, chances of error (which can be based on the player’s margin of error), and what is likely the best next shot for the shooting player and the worst shot for the opponent player. An optimal shot can be determined based on how the balls’ positions and actions meet a set of criteria (e.g., ending ball positions, contacts with cushions). Determining an optimal shot may require thousands of simulations, and the optimal shot is generally one that is error tolerant (e.g., has a shorter path, fewer changes of direction, and appropriate speed). The optimal shot can be customized for individual players based on a player style profile. For example, some players are more offense-minded while others are more defense-minded. It should be appreciated that such simulations are useful outside the context of billiards, and may be applied to other activities, such as golf (e.g., a golf ball rolling on a putting green), and can be used to build other systems, such as a robotic golfer.
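The multi-criteria ranking described above could be sketched as a weighted score over simulated candidate shots. The weights, criteria, and candidate names below are purely illustrative assumptions, not values from the application:

```python
# Hypothetical weighted-score sketch for ranking candidate shots by the
# criteria mentioned above: chance of scoring, chance of error, and quality
# of the likely next shot. All weights and names are illustrative assumptions.

def shot_score(p_score, p_error, next_shot_quality,
               w_score=1.0, w_error=0.8, w_next=0.5):
    """Higher is better; error-prone shots are penalized."""
    return w_score * p_score - w_error * p_error + w_next * next_shot_quality

# Two simulated candidates (probabilities would come from the simulations):
candidates = {
    "short_rail_first": shot_score(0.6, 0.2, 0.7),  # error tolerant
    "long_bank": shot_score(0.7, 0.5, 0.3),         # higher risk
}
print(max(candidates, key=candidates.get))  # short_rail_first
```

Player style profiles could be accommodated by adjusting the weights, e.g., raising w_score for offense-minded players.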

[00129] As described above, the system can track the location of each object on the billiard table. Based on this information, the system can determine when shots start and stop, which can be used for scoring and for clipping of video, for example. The system can also use this information to catalogue and queue-up video from multiple cameras. This location information can also be used to anticipate where balls will move in the future based on a physics-based model of the billiard table. Movement that can be predicted by the system includes ball location, spin, speed, and direction. While a physics-based model of the table can be used to make these predictions, additional information may be needed. For example, the table may not be exactly level, and chalk from each shot accumulates on the surface of the table, which can result in sticky residue that changes the friction of the table surface. To account for such anomalies, the system can observe what happens when a ball is in motion and can use such information to inform predictions of how the balls will move.

[00130] The system can catalogue reactions of balls with other objects (with respect to speed and direction of the balls, as well as rotational speed and direction) and represent those reactions in a table that the system can reference. When interacting with cushions, the system can catalogue, for example, spin, speed, angle of reflection of the ball, and amount of chalk near the cushion and on the ball. Different cushions (or parts thereof) may act differently, so the system can keep a map of the whole table to account for the different reactions. A neural network can be used to predict movement of a ball when interacting with a cushion. Inputs to the neural network can include location, speed, spin, and incidence angle of a ball, and the output can include angle of reflection, speed, and spin of the ball. Training data for the neural network can include all previous actions on the table, and can be constantly updated during gameplay (on-line training).
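The catalogued reaction table could be represented as sketched below, with a simple nearest-neighbor lookup standing in for the trained neural network described above. All function names and numeric values are hypothetical:

```python
# A minimal sketch of the cushion-reaction catalogue described above, with a
# nearest-neighbor lookup standing in for the neural network. All names and
# numeric values are hypothetical.

def predict_cushion_reaction(table, spin, speed, angle_in):
    """Return the catalogued outcome whose inputs are closest to the query."""
    def distance(entry):
        s, v, a = entry
        return (s - spin) ** 2 + (v - speed) ** 2 + (a - angle_in) ** 2
    return table[min(table, key=distance)]

# Catalogued observations:
# (spin, speed, incidence angle) -> (reflection angle, exit speed, exit spin)
reactions = {
    (0.0, 1.0, 30.0): (30.0, 0.80, 0.0),   # no spin: near-mirror reflection
    (1.0, 1.0, 30.0): (38.0, 0.75, 0.4),   # topspin widens the reflection
}

print(predict_cushion_reaction(reactions, 0.1, 1.0, 29.0))
# -> (30.0, 0.8, 0.0): the no-spin entry has the closest catalogued inputs
```

A neural network trained on the same (input, output) pairs would interpolate between observations rather than snapping to the nearest one, and could be updated on-line during gameplay as the text describes.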

[00131] Another neural network can also be used to predict the movement of rolling balls. Inputs to the neural network can include spin, speed, direction, and location of a ball, and the output can include spin, speed, and direction of the ball. During roll of the ball, the system can continually access the neural network to determine how the ball will continue to roll. Using neural networks, or similar techniques, to learn how balls react on a table is advantageous because each table can be unique. The learned reactions enable more accurate predictions to be made.

[00132] If sufficient training data does not exist (based on a threshold, which may be set manually or automatically calculated), the system can use a physics-based model until enough training data is collected to make the neural network(s) useful. Alternatively, training data from another table can be used as a starting point. It should be appreciated that if conditions of the area for which object trajectories are being analyzed change, e.g., a billiard table is reconditioned (cleaned, felt replaced, cushions replaced, etc.), the neural network may need to be retrained with new training data.

[00133] Both neural networks (for rolling balls and interactions with cushions) can use a map (e.g., contour map) of chalk deposits in reaching their determinations. FIG. 24 is a top view schematic diagram illustrating a chalk deposit 2410 on the surface 2405 of a billiard table. Such a chalk deposit 2410 can be left on the surface 2405 when a cue stick 2420 (having chalk on its tip) strikes a ball 2415. Tracking the location of such chalk deposits can be useful for determining how a billiard ball will roll over the surface of the table, as chalk deposits can affect a rolling ball (e.g., changing rolling friction of the ball and, thus, the speed of the ball). A system, such as the system 200 illustrated in FIG. 2A for example, can process images of the table (e.g., obtained by camera 210) to identify the location of the chalk deposits. Alternatively, chalk deposit locations and quantities can be predicted based, for example, on the location, sound, or other data of a cue ball strike. The chalk deposit location information can then be used to determine expected motion of the ball(s) on the table. The chalk map may be represented as a table with X-Y locations and accumulations of chalk as an index of chalk density, which can correspond to a coefficient of friction, at those locations.

[00134] Another aspect of the disclosed systems and methods is a system of cameras for capturing game play. FIG. 25A is a top view schematic diagram illustrating a system 2500 of cameras 2505, 2510, 2515 for capturing billiard game play, according to an example embodiment. FIG. 25B is an elevation view of the system 2500 of FIG. 25A. The example system includes an overhead camera 2515, two robotically controllable cameras 2505, 2510, and a control unit 2520. The overhead camera 2515 can be positioned above a billiard table 2525. Billiard balls on the billiard table 2525 can be in a field of view of the overhead camera 2515. The robotically controllable cameras 2505, 2510 can be positioned in locations surrounding the billiard table 2525, and can be mounted, for example, to a curved track 2530 suspended above the table 2525, or to robotic arms. The control unit 2520 is communicatively coupled to the overhead camera 2515 and the robotically controllable cameras 2505, 2510. The control unit 2520 can control the overhead camera 2515 or robotically controllable cameras 2505, 2510 to capture image data of billiard game play. The robotically controllable cameras 2505, 2510 can be moved to any position around the billiard table 2525. The control unit 2520 can cause one or more of the robotically controllable cameras 2505, 2510 to follow a player as the player moves with respect to the table 2525. The player can be identified through use of facial recognition processing, and can be assigned a role (e.g., shooter) by the system. The control unit 2520 can cause the robotically controllable camera(s) to follow the player based on the assigned role. The control unit 2520 can control pan and zoom of the overhead camera 2515, and can control pan, tilt, zoom, height, and position with regard to the billiard table 2525 for the robotically controllable cameras 2505, 2510. The control unit 2520 can modify a speed at which the robotically controllable cameras 2505, 2510 are moved based upon a speed of a player at the billiard table 2525.

[00135] The system 2500 can also include a simulation unit (not shown in FIGS. 25A and 25B) communicatively coupled to the control unit 2520. The simulation unit may be part of a motion server (e.g., 635 from FIG. 6A), as described above, or may be a separate component. As described above, the simulation unit can determine an expected billiard shot to be played and forward the determined expected shot to the control unit 2520. The system 2500 can further include a stationary camera positionable to have the playing surface of the billiard table 2525 in a field of view of the stationary camera. The stationary camera can obtain and forward image data of the playing surface of the billiard table 2525 to the simulation unit. The expected billiard shot can be used by the control unit 2520 to control at least one of the cameras 2505, 2510, 2515 to capture an initial impact of the shot and a final perspective of the resulting motion of the billiard balls.

[00136] It is preferable to have the camera(s) in position to capture a next shot or contact event. Ideally, the camera is moved to a position where it can obtain footage that is along a tangent between the surfaces of a contact event. For example, a ball-on-ball contact event is likely to be best viewed along a vertical plane between the two balls. A ball-on-cushion contact event (ball 2610 contacting cushion 2615) is likely to be best viewed along a plane 2605 as shown in FIGS. 26A and 26B. The system can calculate the anticipated point of contact to determine the plane location, based on X-Y-Z locations. The plane 2605 can be referred to as a “tangent plane.” Viewing a potential contact event along the tangent plane 2605 is useful for any two objects, such as determining whether a player touches a ball (e.g., when a billiard player “cues over” a ball). Viewing along the tangent plane can also be useful in non-billiard sports, such as in baseball when determining whether a player touches a base before being tagged by another player, or in football when determining whether the football crosses the goal line.
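A minimal 2-D sketch of locating the tangent plane for a ball-on-ball contact follows (names are assumptions; equal-radius balls assumed): the plane passes through the anticipated contact point, perpendicular to the line between the two ball centers, so a camera placed anywhere in the plane views the contact edge-on.

```python
import math

# A minimal 2-D sketch (names are assumptions) of locating the tangent plane
# for a ball-on-ball contact between equal-radius balls: the vertical plane
# through the anticipated contact point, perpendicular to the line between
# the two ball centers.

def tangent_plane(center_a, center_b):
    """Return (point, normal): the anticipated contact point and the unit
    normal of the vertical tangent plane, from two ball centers (x, y)."""
    ax, ay = center_a
    bx, by = center_b
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    point = ((ax + bx) / 2, (ay + by) / 2)   # midway for equal-radius balls
    normal = (dx / length, dy / length)      # plane is perpendicular to this
    return point, normal

print(tangent_plane((0.0, 0.0), (2.0, 0.0)))
# -> ((1.0, 0.0), (1.0, 0.0)): contact at x=1, plane normal along +x
```

For a ball-on-cushion contact the same idea applies, with the cushion face supplying the plane orientation instead of a second ball center.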

[00137] For an anticipated contact event (e.g., an interesting or critical contact event), the camera(s) can be moved to capture video along the tangent plane. Some cameras may be limited to certain positions, so such a camera can move to the closest position on the tangent plane. FIG. 27 is a flow chart illustrating an example method 2700 of capturing images of a contact event. The example method 2700 includes determining 2705 a probable next contact event (e.g., shot, ball contact) and determining 2710 a plane from which to view the probable contact event. A camera (e.g., a high-speed camera) can be moved 2715 to a position to view the probable contact event along the plane and can then capture 2720 video relevant to the probable contact event. As described above, an anticipated contact event can be determined based on a simulation run earlier as to a probable next shot. Once camera(s) are moved into place, a trigger for capture can be set to occur (e.g., for capturing high-speed video). While video from a regular camera may be constantly stored in system memory, high-speed cameras generate large amounts of data, so the system can use a trigger to download only the frames that are relevant to the anticipated contact event. High-speed cameras use a large circular buffer (e.g., totaling 2-10 seconds of high-speed video); a high-speed camera recording at 8,000 fps results in 80,000 frames in a 10-second buffer. Because the video frames that are most relevant to a contact event may be only about fifty frames (or some similar relatively small number), the contact evidence determined by the motion and vibration servers can be used to specify which frames to obtain from the high-speed camera. Many high-speed cameras require that specific frame numbers be requested. Such frame numbers can be calculated based on the time of a contact event, the frame rate of the camera, and the current frame number of the camera.
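This frame-number calculation can be sketched as follows. The helper name and the 50-frame window are assumptions for illustration, not from the application:

```python
# Hypothetical sketch of the frame-number calculation described above: which
# frames to request from a high-speed camera's circular buffer, given the
# contact-event time, the camera's frame rate, and its current frame number.
# The helper name and the 50-frame window are assumptions.

def frames_for_event(event_time, now, frame_rate, current_frame, window=50):
    """Return (first, last) frame numbers covering `window` frames centered
    on the contact event."""
    frames_ago = round((now - event_time) * frame_rate)
    center = current_frame - frames_ago
    return center - window // 2, center + window // 2

# Event 1 second ago on a 10,000 fps camera currently at frame 400,000:
print(frames_for_event(9.0, 10.0, 10000, 400000))  # (389975, 390025)
```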
For example, if a camera is recording at 10,000 fps, and the frames from 1 second ago are desired, then the system can request the frames that are 10,000 frames back.

[00138] One interesting aspect of the cameras of the disclosed system is that a human can intervene with the camera control at a granular level. For example, the overall movement of a camera can be automatically controlled, and a human can override a subset of camera functions to tweak certain aspects of the camera (e.g., focus, framing, creative touch). This allows a camera to move into a general position for an anticipated shot, and a human can fine-tune focus/zoom/depth-of-field for aesthetic purposes.

[00139] In addition to the cameras described above, two cameras, for example, can be used to analyze the body postures and faces of the billiard players. Facial features and limbs/joints of the players can be tracked using commercially-available software. Analyzing the players’ postures (“pose estimation”), movements, and facial expressions can be performed in order to keep those cameras recording the players (e.g., keep a player in the camera frame and follow the player), to avoid collision with a player (e.g., keep track of player locations), to identify who is currently shooting, and to anticipate how much time there is before a player is going to shoot. For example, a player standing up likely means that there is time before the player will shoot, and cameras can be moved into position during that time. When a player moves down to shoot, the system can determine based on a player profile (past shooting history) and body movement how much time there will likely be until a shot is taken. If there is likely not enough time to move a camera into a desired position, the system can use a sub-optimal position for viewing a shot. If enough time is available, the camera can move into a better position.

[00140] Another aspect of the disclosed systems and methods is an integrated lighting system for a billiard table. FIG. 28 is a top view schematic diagram illustrating an integrated lighting system 2800 for a billiard table, according to an example embodiment. The integrated lighting system 2800 includes surface lights 2805 in a playing surface 2825 of the billiard table, marker lights 2810 in upper surfaces 2830 of rails of the billiard table, cushion lights 2815 along rail cushions 2835 of the billiard table, and a controller 2820 coupled to the surface lights 2805, marker lights 2810, and cushion lights 2815 to control lighting characteristics of the surface lights 2805, marker lights 2810, and cushion lights 2815. The surface lights 2805, marker lights 2810, and cushion lights 2815 can include, for example, light emitting diodes or end-glow fiber optic strands. End-glow fiber optic strands can be coupled to light emitting diodes. The lighting characteristics of the surface lights 2805, marker lights 2810, and cushion lights 2815 can include color, brightness, or lighting patterns (e.g., flashing). The controller 2820 can set the lighting characteristics of any of the surface lights 2805, marker lights 2810, and cushion lights 2815 to indicate states of billiard play. The integrated table light system 2800 can be interactive and dynamic (contextual) via a game-intelligent computer controlled mechanism (e.g., controller 2820). As an example of interactive lighting, the controller 2820 can cause the marker lights 2810 to (a) illuminate white or yellow to designate a turn of a billiard player designated as corresponding to white or yellow, (b) flash to warn of an impending time-out foul, (c) illuminate red to indicate an expired shot clock, (d) indicate a start or stop of billiard play, or (e) illuminate to provide a count of points. It should be understood that the specific color assignments described above are examples, and any color can be used for any role.
As a further example, the controller 2820 can cause the cushion lights 2815 to illuminate selectively to indicate aiming points for billiard play or to indicate contact points of billiard play that has occurred. The surface lights 2805 can include one or more of a head spot, a foot spot, a center spot, and an opening break spot for indicating placement of balls on the table. Any of the above lights may also be used to indicate a type or magnitude of audience feedback (e.g., by changing the color or brightness of the lights).

[00141] Surface Lights 2805 - For game play, billiard tables require official markings to be made on the table surface 2825 that are relevant during certain phases of gameplay. The location of the markings should be precise. Traditionally, such markings are placed permanently on the surface using stickers or using a pen or marker. Stickers are problematic because they can come off of the surface. Additionally, the stickers have a thickness, which can interfere with a ball rolling on the surface. Pen or marker markings on stretched wool cloth (a typical billiard table surface covering) are difficult to place accurately and may actually shift slightly over time. They also wear away and have to be re-marked from time to time. These types of permanent markings are not contextually relevant or “smart.” Use of overhead laser pointers is one implementation that facilitates contextual placement; however, lasers cannot be as precise since the size of the projected spot depends on the distance between the laser and the table surface. Also, laser surface markings can become occluded by a ball passing over them, or another object passing between the laser and the table surface. The disclosed integrated light system 2800 can include end-glow fiber optic strands embedded in the slate just under the surface 2825, which can offer precise positioning and be integrated with a camera or other vision system that is configured to monitor or control the state of a billiard game being played. The system can make the spots appear or disappear or appear in various colors depending on the context of the game. An example color scheme for the surface lights 2805 is as follows:

Head Spot (white/yellow, depending on context); Foot Spot (red); Center Spot (white/yellow, depending on context); Opening break spots (white/yellow, depending on context).

[00142] The surface lights 2805 can also, or alternatively, include side-glow fiber optic elements, light-pipes, or similar elements to produce lines on the surface 2825 of the table. Lines are often useful in billiard games to delineate regions or thresholds. Such lines may be straight (e.g., balklines in carom or lines used to show where to line-up spotted balls in pool), or curved as in the snooker “D” behind the head string. A portion of the side-glow fiber optic elements, light-pipes, or similar components can be made to be flat on one side so that it can be mounted flush with the table surface 2825 (e.g., flush with the slate of the table surface 2825, just under a felt coating).

[00143] The surface lights 2805 can flash if the camera system, for example, detects that a ball is being manually handled (i.e., not rolling naturally on the table). According to the rules and context of the game being played, the appropriate surface light or lights can illuminate to indicate candidate placement locations for the ball being handled. In some cases, the closer the ball approaches a designated spot, the faster the light can blink until it is a solidly lit spot and the ball can be placed on the spot. If, given the state of the game being played, there is no rule that requires the ball to be placed at a given location, the surface light at that location may not illuminate.

[00144] The controller 2820 can be configured to be aware of various phases of the game in play and the rules associated with ball placement. For example, at the start of a three-cushion billiards game, the red ball is placed on the foot spot, the opponent’s cue ball is placed on the head spot, and the cue ball is either placed on the spot to the left or right of the head spot. If the player (or official) holds the ball near a valid spot, the surface light at that spot can blink to indicate that the spot is a good candidate position. Surface lights at invalid spots (such as the center spot, which would not be in play at that time) can turn off.

[00145] As a further example, the surface lights can act as guides for resetting balls during the course of a game, which is an option when a ball becomes frozen (i.e., stops in contact with a player’s cue ball). The correct rules and options are not always well known to even experienced players. Thus, the surface lights 2805 can be used to guide the resetting of balls to their proper locations. For example, if the red ball is frozen to the player’s cue ball, then it stands to be re-spotted to the foot spot, while the cue ball (yellow or white ball) is re-spotted to the head spot. However, if the opponent’s cue ball is frozen to the player’s cue ball, the opponent’s cue ball should be re-spotted to the center spot. Using the contextually smart surface light system, if the official attempts to move the cue ball near the center spot (where it should not be placed), the controller 2820 can be configured not to illuminate the surface light near the center spot, which would misguide the act. The controller 2820 can be configured to illuminate the surface light at the head spot, which is the correct position for the cue ball in that situation. An attempt to bring the opponent’s cue ball near the head spot would not result in illumination of the surface light at the head spot. A proper attempt to bring the opponent’s cue ball near the center spot would cause the surface light at the center spot to illuminate, indicating the proper place to re-spot the ball.

[00146] In an example surface light configuration, the surface light system can include seven spots (including two alternate spots shown in FIG. 28), where only five spots are used at a time. In such a configuration, the alternate two spots can allow the table to operate in either direction at will. In other words, the players can decide which side of the table is the head and which side is the foot. Normally, when spots are marked permanently (with a marker), the head and foot designations are predetermined and cannot be changed. The example surface light configuration having extra, alternate spots allows the head and foot of the table to be dynamically reversible.

[00147] Marker Lights 2810 - The system 2800 can include a series of, for example, twenty-eight visual markers located around the perimeter of the table along each side of the rails, or as another example, 240 lights spaced out at approximately ten lights per typical diamond distance along the perimeter of the table. In one embodiment, there can be twenty-eight main marker lights, and additional marker lights at one-half or one-quarter points between pairs of main marker lights. Such marker lights can be selectively illuminated. The marker lights 2810 can be referred to as “diamonds” although they can be round or another shape. Marker lights 2810 can serve as visual aids, aiming devices, rulers to measure shots, landmarks to calculate angles, and as cues for spatial orientation with respect to the playing surface 2825. The marker lights 2810 can change color, lux, and flashing pattern to indicate various game states to players and spectators. Example applications of the marker lights 2810 include: illuminating the marker lights 2810 white or yellow to designate which player’s turn it is at the table; flashing the marker lights 2810 to warn of an impending time-out foul; illuminating the marker lights 2810 red to indicate that a shot-clock has expired; using the marker lights 2810 as a timer to indicate start and stop of fixed practice sessions prior to match start; and using the marker lights 2810 to provide count indications about a series of points scored by a player, or some combination thereof. When not lit, the marker lights 2810 can be colored the same as the color of the table surface in which the lights are mounted, providing a way to simulate removal of the markers from the table for training or other purposes.

[00148] Cushion Lights 2815 - The system 2800 can include cushion lights 2815 that can include fiber optic strands embedded within the cushions 2835 at contact points along the edge of the table. The purpose of the cushion lights 2815 can be to indicate where to aim and provide points of gaze, to display which contact points were previously hit on a given shot, or to provide for instruction by displaying one or more solutions for a subsequent shot, indicating, for example, where to aim and what spin strategy to use (e.g., by using different colors of light). The cushion lights 2815 can be controlled by the controller 2820 and color-coordinated, for example, according to which ball made, or is anticipated to make, contact.

[00149] Instead of being embedded within cushions 2835 at contact points along the edge of the table, the cushion lights 2815 can be arranged at the base of the cushions 2835, in a gap 2845 between the cushions 2835 and the playing surface 2825 of the table. FIG. 29 illustrates lights 2815 arranged between the cushion structure 2835 of a billiard table and the playing surface 2825 of the table. A billiard table rail can be constructed with a cavity 2855 to mount light sources 2815 underneath a cushion structure 2835 of a billiard table. If not specifically constructed with a cavity, the rails 2830 of a table may be removed and a cavity 2855 can be created in the rails underneath the cushion structure 2835. Light sources (e.g., an LED array) 2815 can be mounted within the cavity 2855, and a light carrying medium 2850 can be mounted between the light sources 2815 and the gap 2845 where the cushion 2835 meets the playing surface 2825. Such a light carrying medium 2850 can be, for example, a plurality of light pipes corresponding to the light sources. A light pipe is typically made of a hard plastic material capable of carrying light waves with minimal loss. An array of light pipes 2850 can be constructed as a single unit and coupled to the light sources 2815. Light projected by the light sources 2815 can be carried by the light pipes 2850 to the gap 2845 where the base of the cushion 2835 meets the playing surface 2825. The light sources 2815 can then be used to illuminate the gap 2845 under the cushions 2835 to act as indicators that are useful for training or billiard game play effects. One example use of the lights is to aid in repositioning of a ball after it has been removed from the table (e.g., for cleaning purposes). Because the system accurately tracks the location of each ball, the system can record the position of a ball before it is removed from the table.
Lights can be used to aid in replacing the ball at that position by tracking the position of the ball being replaced and providing visual feedback regarding the desired position. For example, if the ball needs to be moved toward the head side of the table, lights near the head side of the table can flash to indicate that the ball needs to be moved in that direction. The lights can flash faster and faster until the ball is in the correct spot, at which time the lights can stay illuminated.
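One way to implement this speed-up of the flashing as the ball approaches the recorded position is a simple distance-to-interval ramp. The helper below is a hypothetical sketch; the thresholds, units, and parameter names are assumptions for illustration:

```python
def flash_interval_s(distance_mm: float,
                     max_interval_s: float = 1.0,
                     min_interval_s: float = 0.1,
                     capture_radius_mm: float = 2.0,
                     far_mm: float = 300.0):
    """Return seconds between flashes, or None for steady illumination.

    Flashing speeds up (interval shrinks) as the tracked ball approaches
    the recorded position; within the capture radius the lights stay lit.
    """
    if distance_mm <= capture_radius_mm:
        return None  # ball is in the correct spot: lights stay illuminated
    # Linear ramp: far away -> slow flash, close -> fast flash.
    frac = min(distance_mm, far_mm) / far_mm
    return min_interval_s + frac * (max_interval_s - min_interval_s)
```

The controller would call this each frame with the MTS-measured distance between the ball and its recorded position, and drive the lights nearest the required direction of motion.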

[00150] The lighting system 2800 can also include floor lights 2840a-h coupled underneath the billiard table to project light below the billiard table. FIG. 30 is an elevation view schematic diagram illustrating such floor lights 2840a-h arranged underneath a billiard table. The controller 2820 can cause the floor lights 2840a-h to, for example, (a) illuminate red to indicate that a foul has been committed by a player, or (b) illuminate white or yellow to designate a turn of a billiard player. Floor lighting can help spectators and players see which ball is in play next.

[00151] FIGS. 31A-D are schematic diagrams illustrating configurations for producing spots of light on a billiard table, according to example embodiments. The configurations illustrated in FIGS. 31A-D can serve as marker lights 2810 of the disclosed system 2800.

FIG. 31A illustrates a non-illuminated spot 3115 on a billiard table rail 3105. The spot 3115 is created by an insert 3110 of a color different from the color of the rail 3105. The material under the surface layer can be wood, and the surface layer can be lacquer. FIG. 31B illustrates a light source 3125 (e.g., an LED) mounted within a cavity 3120 of the rail 3105. A light-diffusing material 3135 is used to diffuse the light produced by the light source 3125. The configuration of FIG. 31B produces an illuminated spot 3130 with a bright interior and a darker perimeter. FIG. 31C illustrates a light source 3125 (e.g., an LED) mounted within a cavity 3120 of the rail 3105. A light-diffusing material 3135 is used to diffuse the light produced by the light source 3125. The light source 3125 is positioned adjacent to the light-diffusing material 3135. The configuration of FIG. 31C produces an illuminated spot 3140 having a small, bright interior and a larger, darker perimeter. FIG. 31D illustrates a light source 3125 (e.g., an LED) mounted within a cavity 3120 of the rail 3105. A light-diffusing material 3135 is used to diffuse the light produced by the light source 3125. A plano-convex lens 3145 is mounted between the light source 3125 and the light-diffusing material 3135 to produce an evenly-lit spot 3150.

Alternatively, light pipes may be used to transfer light from the light source to the surface of the rail. In some embodiments, the light sources can be any number of LEDs, including an array or matrix of LEDs. In some embodiments, the spots of light may be LED displays embedded in the surface of the rails.

[00152] Another aspect of the disclosure is a system of at least one robot to move billiard balls, or other objects, on a billiard table. FIG. 32 is a schematic diagram illustrating a billiard table 3205 and a cable-driven parallel robot (CDPR) 3215 to move billiard balls 3210a-c on the table 3205. The CDPR 3215 may be capable of moving to any location above or on the table 3205 using techniques known in the CDPR art. The CDPR 3215 also includes a mechanism for grasping a billiard ball, such as a claw. The CDPR 3215 is shown in FIG. 32 as grasping billiard ball 3210b. The CDPR 3215 can pick up billiard balls 3210a-c and move them anywhere on the table 3205 to put the balls 3210a-c into a desired arrangement on the table 3205.

[00153] FIG. 33A is a schematic diagram illustrating a billiard table 3305 and a plurality of wheeled micro-robots 3320a-d to move billiard balls 3310a-c on the table. While four micro-robots 3320a-d are shown, any number can be used. The micro-robots 3320a-d can interface with a control system. The control system can control each individual micro-robot 3320a-d via a wireless interface (e.g., WiFi or Bluetooth). The micro-robots 3320a-d do not need to be aware of their locations on the table 3305, as the control system can track the locations of the micro-robots 3320a-d using, for example, an overhead camera system as in FIG. 2A. The micro-robots 3320a-d can be used to move the billiard balls 3310a-c by, for example, pushing the balls. To exert more control over the direction of ball movement, two or more micro-robots can be used to move one ball. When not in use, the micro-robots 3320a-d can be kept in the corners of the billiard table 3305, where they will not interfere with the billiard balls. FIG. 33B is a schematic diagram illustrating a close-up of a corner of the billiard table 3305. As shown, a micro-robot 3320a in the corner of the table 3305 will not contact a billiard ball 3310a in the corner of the table because the micro-robot 3320a can fit in the space that is created between the curve of the billiard ball 3310a and the corner of the table. If a player takes a shot before the micro-robots 3320a-d are able to move to the corners of the table 3305, the controller can cause the micro-robots 3320a-d to move to avoid the moving balls.
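The corner-parking geometry described above can be checked with a short calculation: a ball touching both rails in a square corner leaves a wedge of free space behind it. The sketch below assumes a standard 61.5 mm carom ball diameter; a micro-robot whose footprint fits within the resulting clearance will not contact the ball:

```python
import math

def corner_clearance_mm(ball_diameter_mm: float = 61.5) -> float:
    """Gap between a ball resting in a square corner and the corner itself.

    The ball touches both rails, so its center sits at (r, r) from the
    corner; the nearest point of the ball to the corner lies along the
    diagonal, a distance r * (sqrt(2) - 1) from the corner.
    """
    r = ball_diameter_mm / 2.0
    return r * (math.sqrt(2.0) - 1.0)
```

For a 61.5 mm ball this clearance is roughly 12.7 mm along the diagonal, which bounds how compact the corner-parked micro-robot (or the exposed part of it) must be.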

[00154] Another aspect of the disclosure is the ability to enable remote billiard play between two parties using their own billiard tables and associated systems, such as the tables and systems disclosed herein. Two similarly equipped playing fields at different locations can each include:

[00155] • a billiard table with a set of balls;

[00156] • a video camera;

[00157] • a motion tracking system (MTS) employing computer vision for tracking balls (e.g., a motion tracking system disclosed herein, such as, for example, the system 600 illustrated in FIG. 6) using a dedicated camera or a camera used for other aspects of the system;

[00158] • a display (e.g., monitor display, overhead projector, laser projector, or augmented reality display, such as, for example, the embodiments illustrated in FIGS. 34A and 34B showing a projector 3410 arranged to project images (e.g., ball paths 3415) on a playing surface of a billiard table 3405);

[00159] • indicator lights (e.g., an LED table lighting system as disclosed herein, such as, for example, the embodiment illustrated in FIG. 28) around or above the table (e.g., white or yellow lights to indicate ball in play by shooter-player, red lights to indicate cease shooting, or blue lights to indicate positioning of balls);

[00160] • an optional cable-driven parallel robot (CDPR), wheeled micro-robots, or spherical robots for ball placement (e.g., a CDPR as illustrated in FIG. 32 or micro-robots as illustrated in FIGS. 33A and 33B);

[00161] • a controller for controlling the system and play between the players; and

[00162] • network connectivity for communication between players and their respective systems.

[00163] FIG. 35 is a flow diagram illustrating an aspect of remote billiard play, according to an example embodiment 3500. Before regular billiard play commences, players often conduct an “opening lag” to determine which player starts first. An opening lag typically involves each player shooting a ball (e.g., a cue ball) toward the foot cushion of a billiard table and back to the head end of the table. The player whose ball is the closest to the head cushion wins the lag. According to the example embodiment 3500 of FIG. 35, each player’s controller prompts 3505 the player to conduct the opening lag. The lag shots may be conducted simultaneously or in succession. Each player’s motion tracking system (MTS) tracks 3510 the motion of the player’s lag shot, and the controllers of the respective systems cooperate to determine a winner of the lag. A winner is determined after both balls roll to a halt, at which time the MTSs measure the distance of each cue ball to its respective head rail (the closest ball to the head rail wins). If a winner cannot be determined 3515, then the controllers can prompt 3505 the players to conduct the opening lag again. If there is a winner 3515, then the controllers assign 3520 the roles of “shooter” and “observer” to the players.
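The lag decision in embodiment 3500 reduces to comparing the two measured distances with a tie threshold. A minimal sketch, where the tolerance parameter and player labels are assumptions for illustration:

```python
def lag_winner(dist_a_mm: float, dist_b_mm: float,
               tie_tolerance_mm: float = 1.0):
    """Return 'A' or 'B' for the player whose cue ball stopped closest to
    the head rail, or None when the distances are within the tie tolerance
    and the lag must be repeated (the "winner cannot be determined" branch).
    """
    if abs(dist_a_mm - dist_b_mm) <= tie_tolerance_mm:
        return None  # no winner: prompt both players to lag again
    return "A" if dist_a_mm < dist_b_mm else "B"
```

In the cooperative scheme described above, each controller would submit its own MTS measurement and both would run this comparison to agree on the shooter/observer assignment.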

[00164] FIG. 36 is a flow diagram illustrating an aspect of remote billiard play, according to an example embodiment 3600. The following description is an example game play scenario of carom billiards using the remote play system. It should be understood that different color balls or lights can be used depending on player preference or the game being played. At the beginning of game play, a “Current Position” of the balls is initialized 3605 to be the starting position of the balls for the particular billiard game (e.g., carom billiards). The Current Position is a representation of positions for each ball involved in the game, for example, an X-Y position for each ball with respect to the table. The controller associated with the shooter player sets 3610 the shooter’s table indicator lights to blue, representing a ball positioning phase of game play. If a ball positioning robot (e.g., a CDPR) is used 3615, then the robot at the shooter’s location positions 3620 the balls to spots on the shooter’s table that correspond to the Current Position. If a robot is not used 3615, then the shooter’s controller directs 3625 the shooter player to position the balls to the Current Position by, for example, projecting images on the shooter’s table. The images may be made, for example, by a video projector, laser projector, or augmented reality display (e.g., virtual reality glasses), and can include images of each ball in play at its respective location on the table, or a representation of the ball, such as a colored spot. Alternatively, the shooter’s controller can direct the shooter player to position the balls via a video monitor showing the balls in their respective locations on the table. As the shooter player places the balls on the table, the video display may help guide the placement of the balls by indicating to the shooter player a direction to move a ball if not placed correctly. The image may be zoomed for better detail.
Once the shooter’s MTS determines that the balls are placed correctly within a certain error threshold, the shooter’s MTS notifies 3630 the shooter’s controller that positioning is complete.

[00165] When positioning is complete, game play is ready to continue and the shooter’s controller can provide 3635 video of the shooter’s location to the observer player, though in some embodiments video may be provided continually to the observer player. In some embodiments, the video can be projected on the table, or graphics representing what occurred on the shooter’s table can be projected on the observer player’s table. When it is time for the shooter player to perform a billiard shot, the shooter’s controller sets 3640 the indicator lights to yellow or white, depending on the color of the shooter player’s cue ball or other designation of color corresponding to the shooter player. If any ball is moved before the shooter player performs a billiard shot, the shooter’s MTS can recognize such movement and notify the shooter’s controller of a foul. Video of the shooter player’s shot can be recorded and stored with pre/post pad times (extra video). After the shooter player performs a shot, the shooter’s MTS determines 3645 whether the shooter player has scored a point (e.g., tracking whether a ball contacts a number of cushions or other balls) and updates the Current Position. If a point is scored 3650, then the shooter’s controller registers the point with both controllers (plus any additional information, such as the time the point was scored) and determines whether a point goal has been reached. If a point goal has been reached 3655, then the shooter controller determines 3660 that the shooter player is the winner. If the goal has not been reached 3655, then the shooter player performs another billiard shot and the shooter’s MTS again determines 3645 whether the shooter player has scored a point and updates the Current Position.
If the shooter player did not score a point 3650, then the shooter’s controller sets 3665 the indicator lights to red, indicating that the shooter player’s turn is over, and the controllers switch 3670 the roles of the shooter and the observer. The controller associated with the previous observer player, now the shooter player, sets 3610 the now-shooter’s table indicator lights to blue, representing the ball positioning phase of game play, and game play continues until a winner is determined. When game play is complete, video clips from both locations can automatically be time-sorted, combined, and rendered with relevant score information, to form one coherent video file of the game, which can be shared with both parties.
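The scoring and role-switching logic of embodiment 3600 can be condensed into one decision function evaluated after each shot. This is an illustrative sketch of the flow above; the return tuple shape and color strings are assumptions:

```python
def advance_turn(scored: bool, shooter_points: int, point_goal: int,
                 shooter: str, observer: str):
    """One decision step from the flow above, evaluated after a shot.

    Returns (winner_or_None, next_shooter, next_observer, indicator_color):
    declare a winner if the goal is reached, let the shooter continue on a
    scored point, or swap roles on a miss.
    """
    if scored:
        if shooter_points >= point_goal:
            return (shooter, shooter, observer, "white")   # goal reached
        return (None, shooter, observer, "white")          # shooter shoots again
    # Missed: turn over (red lights), then roles swap and the new shooter
    # enters the blue ball-positioning phase.
    return (None, observer, shooter, "blue")
```

In a full implementation both controllers would run this step on the shared, MTS-updated Current Position and score so their views of the game stay synchronized.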

[00166] FIG. 37 illustrates an example billiard ball spot pattern design that enables detection of movement of a billiard ball 3705. Current state-of-the-art billiard balls include six spots for visual detection of a spin of a ball. Typically, these spots are on the circumference of the billiard ball, at the points where three axes (X, Y, and Z axes) intersect the surface of the ball. A ball with such a spot arrangement can rotate around one of the axes such that only one spot will be visible by a viewer (the spot on the axis directly facing the viewer), and the ball appears as though it is not rotating. The example ball spot pattern shown in FIG. 37 includes eight spots 3710a-h arranged on a billiard ball 3705 at points where an inscribed cube 3715 would intersect the surface of the billiard ball (at the vertices of the inscribed cube). Spots 3710a-d are shown in solid lines representing that they are visible on the front side of the ball 3705. Spots 3710e-h are shown in dashed lines representing that they are on the back side of the ball 3705. An imaginary inscribed cube 3715 is shown in FIG. 37 in dotted lines. The spots can be a color that is a darker shade than the color of the billiard ball, providing for better, more consistent optical tracking and better aesthetics.

Alternatively, the spots can be invisible to the human eye, but readable by a machine (e.g., using infrared or ultraviolet light). The spot pattern design shown in FIG. 37 is such that at least two spots are visible and at least one spot will always be observed as moving when the ball is spinning.

[00167] FIGS. 38A and 38B illustrate an example billiard ball marker pattern design that enables a determination of an orientation of a billiard ball 3805. FIG. 38A shows one side of the ball 3805, and FIG. 38B shows the opposite side of the ball 3805. The pattern may be referred to as an “orientation detection pattern.” Similar to the pattern illustrated in FIG. 37, eight markers 3810a-h can be arranged on a billiard ball 3805 at points where an inscribed cube would intersect the surface of the billiard ball (at the vertices of the inscribed cube).

The markers 3810a-h may be a variety of shapes (e.g., triangles as illustrated in FIGS. 38A and 38B). Triangles, for example, are useful in computer vision systems for purposes of improving edge detection. Each of the markers 3810a-h can be oriented in a particular way such that a combination of markers 3810a-h that are visible at any one time (e.g., markers 3810a-d as shown in FIG. 38A) uniquely identifies an orientation of the ball 3805 based on the particular markers that are visible. In the configuration of FIG. 37, with spots at each location, if the ball 3705 were to rotate about one of its primary axes exactly 90 degrees, the pattern of visible spots may look identical, as though the ball 3705 did not rotate. A ball as illustrated in FIGS. 38A and 38B can be used to determine whether the ball 3805 has rotated, for example, about one of its primary axes exactly 90 degrees, due to the shapes of the markers 3810a-h and the different orientation of each of the markers 3810a-h on the ball 3805. For example, at least two or more of the markers 3810a-h will be visible at any one time. Each visible pair of markers is oriented with respect to each other in a unique way compared to the other pairs of markers. As an example, markers 3810a and 3810b would have one combination of orientations, and markers 3810a and 3810c would have another combination of orientations. A combination of orientations can be thought of as an imaginary line between the two triangles, where the orientation of each triangle with respect to that line would be a unique combination of orientations. For eight markers, there can be twelve non-symmetrical combinations of orientations. Thus, every presentation of at least two markers uniquely identifies a particular orientation of the ball 3805. An observer (e.g., a computer) of two of the markers 3810a-h (e.g., triangles) is able to determine which two markers on the ball are being observed.
It should be appreciated that the described marker arrangements can be used with other objects, such as, for example, soccer or golf balls.

Further, other shapes and arrangements can be determined and used to obtain the orientation determination.
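The twelve non-symmetrical pair combinations discussed above correspond to the twelve edges of the inscribed cube carrying the markers, which can be verified by enumeration (a sketch; vertex coordinates use the conventional (±1, ±1, ±1) labeling):

```python
import itertools

def cube_edges():
    """Edges of the inscribed cube whose vertices carry the eight markers.

    Two vertices of (+/-1, +/-1, +/-1) share a cube edge exactly when they
    differ in a single coordinate; there are twelve such pairs, matching the
    twelve distinguishable marker-pair combinations.
    """
    vertices = list(itertools.product((-1, 1), repeat=3))
    return [(a, b) for a, b in itertools.combinations(vertices, 2)
            if sum(x != y for x, y in zip(a, b)) == 1]
```

A vision system that identifies which edge (marker pair) it is looking at, plus the in-image rotation of that pair, can therefore resolve the ball's full orientation.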

[00168] FIG. 39 illustrates a pattern 3900 that enables a computer vision system, for example, to determine, using any one image frame, the orientation of a ball. As with the pattern of FIGS. 38A and 38B, the pattern may be referred to as an “orientation detection pattern.” The example pattern, including eight triangle markings 3910a-h, whose positions coincide with the vertices of an inscribed cube 3915 within a spherical ball (not shown), allows any arbitrary camera view of the ball to always be able to at least detect a pair of (two) triangles, or, put another way, one of the twelve edges of the cube 3915. Detection of the ball orientation is made possible due to a particular rotational arrangement of the triangles 3910a-h, which uniquely qualifies the position and orientation of any edge of the configuration. Triangle-shaped markers 3910a-h have an added benefit of providing error correction when dealing with low resolution or out-of-focus images. Using the example pattern on a ball, a computer vision system can detect the exact orientation of the ball.

[00169] FIG. 40 is a set of schematic diagrams illustrating a pattern that enables a computer vision system to determine the orientation of a ball. FIG. 40 shows a particular rotational arrangement of eight triangles, which uniquely qualifies the position and orientation of any edge of the configuration.

[00170] The example configuration has applications in motion tracking, sports video analysis, spin-tracking, etc. The pattern can be used in any application in which tracking the orientation, pose, or rotation of an object (e.g., a ball) or a ball attached to an object is desired. For example, the pattern can be used in a computer vision system that tracks a location and orientation of a round object. A sporting ball is one example application where an aesthetically simple look is desired, and precise computer-vision tracking is also desired for precise spin modeling. Such sporting balls can include billiard balls, soccer balls, baseballs, table tennis (ping-pong) balls, golf balls, etc. The pattern can be used in a point motion tracking system so that balls used for tracking motion (e.g., balls mounted to a person’s joints to track motion of the person’s joints) can not only help determine the balls’ locations but also their orientations. Another example use is on a cap or helmet where it may be useful to know the way the wearer’s head is facing. In such a use, the pattern can be printed on the surface of the cap or helmet. Another example use is to track the orientation of a flying drone, where a ball shape, with the pattern, can be incorporated into the drone design. The pattern may also be used with a ball balancing robot (a robot sitting upon a ball and using the ball for balancing and movement). The robot sitting upon the ball can reference the pattern printed on the ball to verify the orientation and movement of the ball. The pattern may also be used in a trackball application for controlling a computer cursor or pointer, or a three-dimensional object in modeling software. Traditional trackballs use embedded rollers and optical sensors. The disclosed orientation detection pattern can be used on a ball. A camera, such as a computer’s integrated camera (e.g., a web cam), can be used to track the orientation of the ball.
Such a ball alone, sitting on a flat surface or held in a hand, for example, can be used to control a computer’s cursor, pointer, etc., using no additional moving parts or electronics. The pattern can also be used in conjunction with ball positioning robots, such as the micro-robots 3320a-d shown in FIGS. 33A and 33B. Precision movements of the positioning robots can be enhanced using the orientation detection pattern.
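For the trackball application, the pattern yields a per-frame ball rotation (e.g., as an axis and angle), which maps to a cursor displacement much as a mechanical trackball's rollers would report. The mapping below is a hypothetical sketch; the axis convention, ball radius, and gain are assumptions:

```python
import math

def cursor_delta(axis, angle_rad, ball_radius_mm: float = 25.0,
                 gain: float = 1.0):
    """Convert an observed ball rotation (unit axis vector, angle) into a
    2-D cursor displacement in millimeters of surface travel.

    Rolling about the y-axis moves the cursor in x, rolling about the
    x-axis moves it in y, and spin about the vertical z-axis produces no
    cursor motion (as with a physical trackball).
    """
    arc_mm = angle_rad * ball_radius_mm  # surface distance rolled
    dx = gain * arc_mm * axis[1]
    dy = -gain * arc_mm * axis[0]
    return (dx, dy)
```

A quarter-turn roll about the y-axis of a 25 mm-radius ball thus moves the cursor about 39 mm in x, while pure z-axis spin is ignored.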

[00171] While example embodiments have been particularly shown and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the embodiments encompassed by the appended claims. While many aspects have been described in the context of a billiard table, they may equally be used in other environments, such as sporting arenas and performance stages. For example, aspects disclosed herein can be used in the context of a baseball or football game to detect a sequence of collisions between objects.