Title:
POSITION DETERMINATION OF MOVING OBJECT TRAVERSING A TARGET ZONE
Document Type and Number:
WIPO Patent Application WO/2001/054781
Kind Code:
A2
Abstract:
An object position measurement system iteratively records images of an object as it moves along a path, updates a sequence of object path measurements in response to the recorded images, and produces an accurate decision through precise estimation of the object position based on the recorded images. In the case of a baseball umpire decision task, the object is a baseball and the object path is the flight of the baseball from the pitcher to the batter at home plate. An image capture subsystem records the flight of the baseball as it is thrown by the pitcher and travels toward the batter. A control subsystem processes the images collected by the image capture subsystem and produces an accurate decision about whether the pitch is a strike or a ball, based on the processed images.

Inventors:
LIKES DENNIS W (US)
DETIENNE DAVID H (US)
PEIRCE WILLIAM S (US)
JOLLEY D KENT (US)
WHITE W DAVE (US)
Application Number:
PCT/US2001/002756
Publication Date:
August 02, 2001
Filing Date:
January 26, 2001
Assignee:
SCS TECHNOLOGIES LLC (US)
LIKES DENNIS W (US)
DETIENNE DAVID H (US)
PEIRCE WILLIAM S (US)
JOLLEY D KENT (US)
WHITE W DAVE (US)
International Classes:
A63B71/06; A63B63/00; A63B69/00; G01S5/16; (IPC1-7): A63F13/00
Foreign References:
US4545576A1985-10-08
Attorney, Agent or Firm:
Hall, David A. (CA, US)
Claims:
CLAIMS
1. A method of determining object position, the method comprising: iteratively recording images of an object in a sequence of recording intervals as the object moves along a path; generating an initial measurement of the object position along the path, based on one or more of the recorded images; repeatedly updating a sequence of next measurements of the object path movement based on recorded images taken subsequent to the initial measurement; and producing an accurate decision concerning an object characteristic in response to reference markers in the recorded images of the object.
2. A method as defined in claim 1, further including: receiving a test subject decision about the object characteristic; and comparing the test subject decision and the accurate decision concerning the object characteristic to thereby assess the test subject decisions.
3. A method as defined in claim 1, further including determining parameters for producing the accurate decision concerning the object characteristic prior to the initial image recording based on one or more of the recorded images.
4. A method as defined in claim 3, wherein the object comprises a thrown baseball, the parameters relate to a strike zone defined by a baseball home plate and a batter, and the accurate decision relates to whether the baseball thrown was a strike or a ball.
5. A method as defined in claim 4, wherein the strike zone parameters defined by the batter are retrieved from a database of stored information relating to particular players.
6. A method as defined in claim 1, further including determining parameters for producing the accurate decision concerning the object characteristic with respect to reference markers in the recorded images.
7. A method as defined in claim 6, wherein iteratively recording images comprises video recording of the object movement.
8. A method as defined in claim 7, wherein the recorded images comprise video images and the reference markers are in the object environment and are visible in the recorded images.
9. A method as defined in claim 7, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter, and the object decision relates to whether the baseball thrown was a strike or a ball.
10. A method as defined in claim 6, wherein the object comprises a thrown baseball, the image reference markers relate to a strike zone defined by a baseball home plate and a batter, and the object decision relates to whether the baseball thrown was a strike or a ball.
11. A method as defined in claim 10, wherein iteratively recording images comprises video recording of the baseball and batter when the baseball is thrown.
12. A method as defined in claim 11, wherein the reference markers are visible in the recorded images.
13. A method as defined in claim 1, further including performing a static calibration process prior to generating the initial approximation of object position, and further including performing a dynamic calibration process during the sequential processing of the images to generate the final trajectory of the object.
14. A method as defined in claim 13, wherein the static and dynamic calibration processes are performed with respect to external calibration points that relate to the object environment and that are visible in the recorded images.
15. A method as defined in claim 1, further including providing feedback to the test subject on the outcome of the decision assessment prior to a next sequence of iteratively recorded images.
16. A method as defined in claim 1, wherein iteratively recording images comprises video recording of the object from multiple viewing positions as the object moves along the path.
17. A method as defined in claim 16, wherein reference markers are in the object environment and are visible in the recorded images.
18. A method as defined in claim 17, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter, and the decision relates to whether the baseball thrown was a strike or a ball.
19. A method as defined in claim 18, wherein the strike zone is determined after a static calibration process performed prior to generating the initial approximation of object position and a dynamic calibration process can be performed during the sequential processing of the images.
20. A method as defined in claim 19, wherein both the static and dynamic calibration processes are performed with respect to calibration points that relate to points in the playing field.
21. A method as defined in claim 1, wherein repeatedly updating a next measurement comprises updating the object position with a Kalman filter approximation.
22. A method as defined in claim 1, further including storing image data, a test subject decision, and accurate decision data in a database.
23. A method as defined in claim 1, further including measurement of distance to the reference markers and accounting for the direction of the gravity field.
24. A method as defined in claim 1, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter's body, and the object decision relates to whether the baseball thrown traversed the strike zone, wherein upper and lower strike zone boundaries are defined by the batter's body, and wherein the upper and lower boundaries are determined by image processing with stereo vision.
25. A method as defined in claim 1, wherein the images are recorded from a plurality of image recording devices, and the method further comprises accounting for time shifts between images from different image recording devices.
26. A method as defined in claim 1, further including a calibration process wherein the object position measurements are generated using only data from the recorded images and the locations of the reference markers, such that ellipses are assumed to be circles distorted by a view angle, the path field is assumed flat, and detected lines are assumed to be either parallel or orthogonal.
27. A method as defined in claim 1, further including measurement and calculation of physical trajectory parameters of the object movement, including object speed, spin axis, and effective spin rate to thereby determine object characteristics indicating how the object was propelled.
28. A method as defined in claim 27, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter, and the object characteristics relate to the type of pitch thrown.
29. A method of determining object position as the object moves along a path, and determining an object path characteristic, the method comprising: recording multiple data images of the object with respect to a set of reference markers as the object moves along the path; determining an initial measurement of a next object location based on the multiple data images; predicting a next data image of the object; updating the measurement of the next object location; repeating the steps of recording a next data image and updating the measurement until the object reaches the end of the path; and producing an accurate decision concerning the object path characteristic in response to the reference markers in the data images.
30. A method as defined in claim 29, further including: receiving a test subject decision about the object path characteristic; and comparing the test subject decision and the produced accurate decision to assess the performance of the test subject.
31. A method as defined in claim 29, further including determining parameters concerning the object path characteristic approximately at a time when the object is beginning movement along the path based on one or more of the data images.
32. A method as defined in claim 31, wherein the object comprises a thrown baseball, the parameters relate to a strike zone defined by a baseball home plate and a batter, and the object path characteristic relates to whether the baseball thrown was a strike or a ball.
33. A method as defined in claim 32, wherein the strike zone is determined with respect to reference markers in the data images.
34. A method as defined in claim 29, wherein parameters relating to the correct decision are determined after a calibration process that is performed prior to determining an initial approximation of a next object location.
35. A method as defined in claim 34, wherein the calibration process is performed with respect to calibration points external to the object path.
36. A method as defined in claim 29, further including determining parameters concerning the object characteristic, wherein the object comprises a baseball and the parameters relate to a strike zone defined by a baseball home plate and a batter.
37. A method as defined in claim 29, further including storing image data, a test subject decision, and accurate decision data in a database.
38. A method of operating a computer system to determine object position as the object moves along a path, the method comprising: recording multiple data images of the object into computer storage with respect to a set of reference markers as the object moves along the path; predicting the measurement of object location on the path, based on the multiple data images; recording a next data image of the object into the computer storage; updating the measurement of the object location in accordance with the next data image; repeating the steps of recording a next data image and updating until the object reaches the end of the path; and producing an accurate estimate concerning movement of the object along the path.
39. A method as defined in claim 38, further including: receiving a test subject judgment about the object movement; and comparing the test subject judgment and the accurate decision relating to the object movement, and assessing the performance of the test subject.
40. A method as defined in claim 38, further including determining parameters concerning movement of the object along the path approximately at a time when the object is beginning movement along the path, based on one or more of the data images.
41. A method as defined in claim 40, wherein the object comprises a baseball, and the parameters relate to a strike zone defined by a baseball home plate and a batter.
42. A method as defined in claim 41, wherein the strike zone is determined with respect to the reference markers in the data images.
43. A method as defined in claim 38, wherein the computer system determines parameters relating to the accurate decision after a static calibration process that is performed prior to determining an initial approximation of a next object location and a dynamic calibration process that can be performed during the sequential processing of the images.
44. A method as defined in claim 43, wherein both the static and dynamic calibration processes are performed by the computer system with respect to calibration points external to the object path.
45. A method as defined in claim 38, further including the computer system receiving an operator input and responding by determining parameters concerning the object characteristic, wherein the object comprises a baseball and the parameters relate to a strike zone defined by a baseball home plate and a batter.
46. A method as defined in claim 38, further including storing image data, a test subject decision, and correct decision data in a database.
47. A method as defined in claim 38, further including measurement of distance to the reference markers and accounting for the direction of the gravity field.
48. A method as defined in claim 38, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter's body, and the object decision relates to whether the baseball thrown traversed the strike zone, wherein upper and lower strike zone boundaries are defined by the batter's body, and wherein the upper and lower boundaries are determined by image processing with stereo vision.
49. A method as defined in claim 38, wherein the images are recorded from a plurality of image recording devices, and the method further comprises accounting for time shifts between images from different image recording devices.
50. A method as defined in claim 38, further including a calibration process wherein the object position measurements are generated using only data from the recorded images and the locations of the reference markers, such that ellipses are assumed to be circles distorted by a view angle, the path field is assumed flat, and detected lines are assumed to be either parallel or orthogonal.
51. A method as defined in claim 38, further including measurement and calculation of physical trajectory parameters of the object movement, including object speed, spin axis, and effective spin rate to thereby determine object characteristics indicating how the object was propelled.
52. A method as defined in claim 51, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter, and the object characteristics relate to the type of pitch thrown.
53. A system that determines object position as the object moves along a path, the system comprising: an image capture subsystem that iteratively records images of the object in a sequence of recording intervals as the object moves along the path; and a control subsystem that generates an initial measurement of the position of the object on the path, based on one or more of the recorded images, repeatedly updates measurements of the object path movement based on the recorded images taken subsequent to the initial measurement, and produces an accurate decision concerning an object characteristic in response to reference markers in the recorded images of the object.
54. A system as defined in claim 53, further including an operator interface that receives a test subject decision about the object characteristic wherein the control subsystem compares the test subject decision and the accurate decision concerning the object characteristic to thereby assess the test subject decision.
55. A system as defined in claim 53, wherein the control subsystem determines parameters for producing the correct decision concerning the object path characteristic prior to the initial image recording based on one or more of the recorded images.
56. A system as defined in claim 55, wherein the object comprises a thrown baseball, the parameters relate to a strike zone defined by a baseball home plate and a batter, and the object decision relates to whether the baseball thrown was a strike or a ball.
57. A system as defined in claim 56, wherein the strike zone parameters defined by the batter are retrieved by the control subsystem from a database of stored information relating to particular players.
58. A system as defined in claim 53, wherein the control subsystem determines parameters for producing the accurate decision concerning the object characteristic with respect to reference markers in the recorded images.
59. A system as defined in claim 58, wherein the image capture subsystem iteratively records images by video recording of the object movement.
60. A system as defined in claim 59, wherein the recorded images comprise video images and the reference markers are in the object environment and are visible in the recorded images.
61. A system as defined in claim 59, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter, and the object decision relates to whether the baseball thrown was a strike or a ball.
62. A system as defined in claim 58, wherein the object comprises a thrown baseball, the image reference markers relate to a strike zone defined by a baseball home plate and a batter, and the object decision relates to whether the baseball thrown was a strike or a ball.
63. A system as defined in claim 62, wherein the image capture subsystem iteratively records images by video recording of the baseball and batter when the baseball is thrown.
64. A system as defined in claim 63, wherein the reference markers are visible in the recorded images.
65. A system as defined in claim 53, wherein the control subsystem determines parameters relating to the accurate decision after a static calibration process that is performed prior to determining an initial approximation of a next object location and a dynamic calibration process that can be performed during the sequential processing of the images.
66. A system as defined in claim 65, wherein both the static and dynamic calibration processes are performed by the control subsystem with respect to calibration points external to the object path.
67. A system as defined in claim 53, wherein the control subsystem provides feedback to a test subject on the outcome of the decision assessment.
68. A system as defined in claim 53, wherein the image capture subsystem iteratively records images by video recording of the object from multiple viewing positions as the object moves along the path.
69. A system as defined in claim 68, wherein reference markers are in the object environment and are visible in the recorded images.
70. A system as defined in claim 69, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter, and the decision relates to whether the baseball thrown was a strike or a ball.
71. A system as defined in claim 70, wherein the control subsystem determines parameters relating to the accurate decision after a static calibration process that is performed prior to determining an initial approximation of a next object location and a dynamic calibration process that can be performed during the sequential processing of the images.
72. A system as defined in claim 71, wherein both the static and dynamic calibration processes are performed by the control subsystem with respect to calibration points external to the object path.
73. A system as defined in claim 53, wherein repeatedly updating a next approximation comprises updating the object position with a Kalman filter approximation.
74. A system as defined in claim 53, further including storing image data, a test subject decision, and correct decision data in a database.
75. A system as defined in claim 53, wherein the system further measures distance to the reference markers and accounts for the direction of the gravity field.
76. A system as defined in claim 53, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter's body, and the object decision relates to whether the baseball thrown traversed the strike zone, wherein upper and lower strike zone boundaries are defined by the batter's body, and wherein the system determines upper and lower boundaries by image processing with stereo vision.
77. A system as defined in claim 53, wherein the images are recorded from a plurality of image recording devices, and the system further accounts for time shifts between images from different image recording devices.
78. A system as defined in claim 53, wherein the system performs a calibration process in which it generates object position measurements using only data from the recorded images and the locations of the reference markers, such that ellipses are assumed to be circles distorted by a view angle, the path field is assumed flat, and detected lines are assumed to be either parallel or orthogonal.
79. A system as defined in claim 53, wherein the system further measures and calculates physical trajectory parameters of the object movement, including object speed, spin axis, and effective spin rate, to thereby determine object characteristics indicating how the object was propelled.
80. A system as defined in claim 79, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter, and the object characteristics relate to the type of pitch thrown.
81. A program product for use in a computer system that executes program steps recorded in a computer-readable medium to perform a method for assessing a decision by a test subject, the program product comprising: a recordable medium; and a program of computer-readable instructions executable by the computer system to perform method steps comprising: generating an initial measurement of the position of the object along the path, based on one or more of the recorded images; repeatedly updating a sequence of next measurements of the object path movement based on recorded images taken subsequent to the initial measurement; and producing an accurate decision concerning an object characteristic in response to reference markers in the recorded images of the object.
82. A program product as defined in claim 81, wherein the performed method further comprises: receiving a test subject decision about the object characteristic; and comparing the test subject decision and the accurate decision concerning the object characteristic to thereby assess the test subject decision.
83. A program product as defined in claim 81, further including determining parameters for producing the accurate decision concerning the object path characteristic prior to the initial image recording based on one or more of the recorded images.
84. A program product as defined in claim 83, wherein the object comprises a thrown baseball, the parameters relate to a strike zone defined by a baseball home plate and a batter, and the object decision relates to whether the baseball thrown was a strike or a ball.
85. A program product as defined in claim 84, wherein the strike zone parameters defined by the batter are retrieved from a database of stored information relating to particular players.
86. A program product as defined in claim 81, further including determining parameters for producing the accurate decision concerning the object characteristic with respect to reference markers in the recorded images.
87. A program product as defined in claim 86, wherein iteratively recording images comprises video recording of the object movement.
88. A program product as defined in claim 87, wherein the recorded images comprise video images and the reference markers are in the object environment and are visible in the recorded images.
89. A program product as defined in claim 87, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter, and the object decision relates to whether the baseball thrown was a strike or a ball.
90. A program product as defined in claim 86, wherein the object comprises a thrown baseball, the image reference markers relate to a strike zone defined by a baseball home plate and a batter, and the object decision relates to whether the baseball thrown was a strike or a ball.
91. A program product as defined in claim 90, wherein iteratively recording images comprises video recording of the baseball and batter when the baseball is thrown.
92. A program product as defined in claim 91, wherein the reference markers are visible in the recorded images.
93. A program product as defined in claim 92, further including performing a static calibration process that is performed prior to determining an initial approximation of a next object location and a dynamic calibration process that can be performed during the sequential processing of the images.
94. A program product as defined in claim 93, wherein both the static and dynamic calibration processes are performed by the computer system with respect to calibration points external to the object path.
95. A program product as defined in claim 92, further including providing feedback to a test subject on the outcome of the decision assessment.
96. A program product as defined in claim 92, wherein iteratively recording images comprises video recording of the object from multiple viewing positions as the object moves along the path.
97. A program product as defined in claim 96, wherein reference markers are in the object environment and are visible in each recorded image.
98. A program product as defined in claim 97, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter, and the decision relates to whether the baseball thrown was a strike or a ball.
99. A program product as defined in claim 98, wherein the computer system determines parameters relating to the accurate decision after a static calibration process that is performed prior to determining an initial approximation of a next object location and a dynamic calibration process that can be performed during the sequential processing of the images.
100. A program product as defined in claim 99, wherein both the static and dynamic calibration processes are performed by the computer system with respect to calibration points external to the object path.
101. A program product as defined in claim 92, wherein repeatedly updating a next measurement comprises updating the object position with a Kalman filter approximation.
102. A program product as defined in claim 92, further including storing image data, a test subject decision, and accurate decision data in a database.
103. A program product as defined in claim 81, wherein the performed method further includes measuring distance to the reference markers and accounting for the direction of the gravity field.
104. A program product as defined in claim 81, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter's body, and the object decision relates to whether the baseball thrown traversed the strike zone, wherein upper and lower strike zone boundaries are defined by the batter's body, and wherein the upper and lower boundaries are determined by image processing with stereo vision.
105. A program product as defined in claim 81, wherein the images are recorded from a plurality of image recording devices, and the method further comprises accounting for time shifts between images from different image recording devices.
106. A program product as defined in claim 81, wherein the performed method further includes a calibration process wherein the object position measurements are generated using only data from the recorded images and the locations of the reference markers, such that ellipses are assumed to be circles distorted by a view angle, the path field is assumed flat, and detected lines are assumed to be either parallel or orthogonal.
107. A program product as defined in claim 81, wherein the performed method further includes measurement and calculation of physical trajectory parameters of the object movement, including object speed, spin axis, and effective spin rate to thereby determine object characteristics indicating how the object was propelled.
108. A program product as defined in claim 107, wherein the object comprises a thrown baseball and the reference markers relate to a strike zone defined by a baseball home plate and a batter, and the object characteristics relate to the type of pitch thrown.
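The iterative "repeatedly updating a sequence of next measurements" step, which claims 21, 73, and 101 recite as a Kalman filter approximation, can be illustrated with a minimal sketch. This is not the patent's implementation: the constant-velocity state model, frame rate, and noise levels below are illustrative assumptions, and the filter tracks a single position coordinate derived from the recorded images.

```python
import numpy as np

def kalman_track(measurements, dt=1 / 60, meas_var=0.05 ** 2, accel_var=1.0):
    """Refine 1-D position estimates from noisy image-derived measurements.

    State is (position, velocity); each new measurement triggers a
    predict step followed by an update step, as in claims 21/73/101.
    All numeric defaults are illustrative assumptions.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])            # constant-velocity transition
    H = np.array([[1.0, 0.0]])                        # only position is observed
    Q = accel_var * np.array([[dt ** 4 / 4, dt ** 3 / 2],   # process noise
                              [dt ** 3 / 2, dt ** 2]])
    R = np.array([[meas_var]])                        # measurement noise
    x = np.array([[measurements[0]], [0.0]])          # initial state estimate
    P = np.eye(2)                                     # initial covariance
    filtered = []
    for z in measurements:
        # predict the next object location from the motion model
        x = F @ x
        P = F @ P @ F.T + Q
        # update the measurement with the newly recorded image datum
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        filtered.append(float(x[0, 0]))
    return filtered
```

In use, each recorded image contributes one measurement, and the filtered sequence converges toward the object's true path even though individual image measurements are noisy.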
Description:
PRECISION MEASUREMENT OF A MOVING OBJECT TRAVERSING A TARGET ZONE

TECHNICAL FIELD

This invention relates generally to trajectory measurement systems and, more particularly, to accurate measurement of object movement in proximity to a target zone.

BACKGROUND ART

There are many well-known techniques for determining the location of an object.

The problem of tracking a moving object during the play of a game, however, places special constraints on the permissible range of these technologies. The ideal technique should in no way interfere with the play of the game, modify objects used to play the game, or encumber the players. The preferred methods for this type of passive location measurement are usually stereo vision techniques. For object location measurements to be useful, the measurements should relate to some external reference, or target zone.
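The passive stereo-vision measurement mentioned above rests on a standard result: for a rectified pair of cameras, depth is recovered from the horizontal disparity between the two images of the same point. The sketch below is illustrative only; the focal length, baseline, and pixel coordinates are assumed values, not parameters from this application.

```python
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """Depth of a point from a rectified stereo camera pair.

    x_left, x_right: horizontal pixel coordinates of the same point
    in the left and right images.
    focal_px: camera focal length in pixels (assumed known from calibration).
    baseline_m: separation between the two cameras in meters.
    Returns depth in meters via Z = f * B / disparity.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity
```

For example, a 10-pixel disparity seen by 800-pixel-focal-length cameras half a meter apart places the point 40 m away, which is roughly the scale of a pitched-baseball measurement.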

For example, in pitching a baseball, the ball location is measured relative to home plate and the strike zone, while in tennis, the measurement of interest is where the ball strikes the ground relative to the court lines.

Any measurement system can be no more accurate than its base reference measurements. Most systems either assume reference points were placed accurately (such as bases on a baseball field or lines on a tennis court), or locate them with measuring tape or standard surveying equipment. Therefore, accuracy of object measurement will be improved with more precise placement and location of target zone reference points.

Sometimes the target zone for measurement in a sport is relative to a human player's body. This zone is typically determined in a subjective manner by a human.

There are no known commercially available techniques to passively measure a target zone that is defined by a human body. Current active measurement techniques attach reflective balls to the body and perform stereo vision measurements to detect the ball location. Another technique is to scan a human body with a laser.

Calibration of measurement systems places additional constraints on practical techniques. A calibration solely based on images would be a desirable feature, as a separate measurement of reference point positions, or camera positions, would not need to be done. Although there is some research in this area, there are no known commercially available systems or methods for exclusively image-based calibration.

Typically, there are constraints on where the measurement sensors can be placed.

Existing systems use ad hoc sensor placement, or heuristics to place the sensors.

For determining the location of an object comprising a baseball, there are many systems that can be used to subjectively record features of a pitch, such as ball or strike, in or out, high or low, inside or outside, curve or slider, top spin or back spin, fast or slow, and so forth. Unfortunately, it is a human-intensive undertaking to record this data. The data is typically not scientifically measured, but rather subjectively estimated.

It would be advantageous if such measurement and compilation of physical features could be made automatic. This would also allow for the quick and simple generation of statistics and for accurate and efficient searches of the resulting database.

Thus, it is often desirable to accurately determine the position of an object, such as a baseball, as it moves along a path, and this is especially useful where the object movement determines the outcome of a decision task. For example, during a baseball game, umpires must observe the movement of a baseball thrown by a pitcher and must decide whether the pitch was a ball or a strike. This decision task involves an umpire standing directly behind a batter and catcher, observing a baseball thrown by a pitcher toward the batter and traveling up to 100 or more miles per hour. The umpire must quickly judge the baseball location in a visually defined three-dimensional target space near the batter (known as the strike zone) and, based on that judgment, must decide whether the particular pitch was a ball (passing outside the strike zone) or a strike (intersecting the strike zone). If the movement of the thrown ball were precisely known, along with the "edges" of the three-dimensional target zone, it would be possible to accurately determine if the pitch was a ball or strike, and it would be possible to assess the correctness of the umpire call. The determination of the correct call has application, for example, to training (umpire, catcher, coach, etc.) and viewer entertainment. The ball movement, however, must be precisely determined with respect to the target zone.

Strike zone parameters include certain fixed vertical markers (the edges of home plate) and variable horizontal markers (the batter's waist, shoulders, and knees). In accordance with Major League Baseball rules, the strike zone is defined to be the area over home plate, the upper limit of which is a horizontal line across the batter's body at the midpoint between the top of the shoulders and the top of the uniform pants, the lower limit of which is a line at the hollow beneath the batter's knee cap (See, for example, web page at www.majorleaguebaseball.com/mlbcom/headquarters/rule2.htm).

This definition is applied while the batter assumes the usual batting stance. The correct ball/strike decision is not known until the baseball is thrown and travels to the batter, and the decision must be made within a fraction of a second.

It is desirable to provide an accurate, objective assessment of whether a particular pitch is a ball or a strike. Such an assessment is useful, for example, for increasing viewer enjoyment of a televised baseball game, to permit monitoring of a pitcher's game performance, as well as the game performance of the umpire. With respect to umpire performance, such accurate assessments of balls and strikes can be particularly useful in training umpires to make correct calls. For purposes of both entertainment broadcasting and training, the accuracy of the ball/strike assessments should be available for later study and comparison.

Other tasks exist for which it may be desirable to provide an accurate measurement of object movement and a resulting assessment of whether a particular decision task is correctly performed. For example, other sports situations of interest involve referee decisions such as calling a tennis ball in play or out, or judging field goals in U.S. football. In addition to supporting referee decision making, accurate measurement of the ball trajectory could make it possible to establish the curvature of the trajectory. This, in turn, could permit measurement of the spin speed and axis of rotation. For baseball, this would provide a physical, results-based characterization of whether a pitch was a fastball, curve ball, slider, and so on. The entertainment value of many sporting events watched by viewers is increased by the accurate depiction of ball movement and resulting decision tasks. In addition, production processes may accept or reject a manufactured article based on measurement of object movement.

Appropriate training can dramatically improve the performance of persons who must perform minimal-reaction-time decision tasks based on object movement, such as the referee or umpire tasks described above. During training, it is important to provide test subjects (such as umpires) with feedback on their decision-making performance. To provide reasonable feedback, it is important to observe test subjects making decisions in their respective tasks, accurately determine the movement of the ball and the correct decision responses, identify decision errors, and provide the subjects with information on their performance.

In the case of Major League Baseball umpires, for example, current training regimes typically involve a potential Major League umpire performing first as an umpire in minor league play and then, only after successful completion with satisfactory performance, assignment to Major League play. The assignment to Major League umpiring usually occurs only after several years of minor league play, and only after extensive performance reviews by Major League Baseball umpires. Such training is valuable, but is necessarily slow to provide feedback and is time-consuming to complete.

In addition, it requires the involvement of Major League umpires or other personnel dedicated to the training task. The collective effort to provide an assessment of each trainee umpire's performance, for each pitch the umpire calls, is very great.

There is no precision measurement system for accurately determining the movement of a ball relative to the strike zone, and no system for collecting such data for entertainment or training purposes. Although a training process could benefit from some degree of automation, there currently are no training schemes that systematize the collection of accurate performance data for umpires undergoing training and that provide useful feedback. In particular, there is no training system that precisely determines whether each observed pitch is a ball or a strike and provides umpires with feedback on their performance in calling balls and strikes, with storage of such performance data for careful review, study, and assessment.

There are visualization systems that can help depict the flight of a thrown ball in the game of baseball and can graphically illustrate the ball path. These systems typically store video images, then replay such images with additional imagery to help visualize the path of the ball. Thus, these systems provide a two-dimensional (video) representation of the three-dimensional (actual) event. One such visualization system is called "SuperVision" and is a product of QuesTec Imaging, Inc. of Deer Park, New York, USA. Other systems have application in other sports, such as basketball, golf, tennis, and racing events. Most visualization systems, however, are intended primarily for viewer entertainment, as an enhancement to observing a baseball game or other sporting event, and are not capable of precision measurement of object movement.

Such conventional visualization systems do not typically perform measurements of the ball in flight. Nor do they measure the upper and lower bounds of the target zone, i.e., the strike zone. In addition, other systems cannot characterize the type of ball movement, e.g., curve ball, slider, and so forth. Rather, such systems typically do no more than graphically illustrate the flight of a ball in a sequence of visual re-creations of the ball path, using replay and overlay techniques. The accuracy of such systems is limited to what can be approximated from review of the replayed video images, and therefore such systems typically cannot place a baseball relative to a strike zone with an accuracy better than several inches. It should be apparent that such visualization systems are not capable of accurately determining whether a thrown baseball is a ball or a strike. In addition, such systems do not typically produce and accumulate data on the performance of a contestant, referee, or umpire.

From the discussion above, it should be apparent that there is a need for a system that can track movement of an object with sufficient accuracy to reliably determine the location of the object, such as a ball, to provide an accurate determination about the object's movement and its proximity to a target zone, and to accumulate data relating to such object's characteristics during a real-time event, such as a game. The present invention fulfills this need.

DISCLOSURE OF INVENTION The present invention tracks and measures movement of an object by iteratively recording images of the object as it moves along a path toward a target and updates a sequence of object path measurements in response to the recorded images. The invention also determines the extent of a target (its edges) in three-dimensional space.

The invention can thereby produce an accurate decision about the object's movement from a precision measurement estimation concerning the object position, based on the recorded images. In this way, the invention provides a system for accurately measuring the path of an object from image data and for producing an accurate decision event about the object's path and its intersection with a target zone in three-dimensional space.

A system constructed in accordance with the invention includes an image capture subsystem that captures image data of the decision event, and includes a control subsystem that receives the decision event image data and determines the correct decision. The image capture subsystem includes image capture devices, such as video cameras, that capture multiple views of the movement with respect to reference markers and permit accurate measurement of the actual decision event. The control subsystem processes the image data, computes the changes in object position, calculates the extent of the target zone, and produces a decision about the object movement with respect to the target.

In the case of a baseball umpire calling pitches, the object is a baseball and the object path is the flight of the baseball from the pitcher to the batter at home plate, the target zone is the three-dimensional volume of the strike zone, and the object path characteristic decision event is whether the thrown baseball was a strike or a ball. The system measures the player-dependent upper and lower strike zone bounds via image processing techniques. The system captures the flight of the baseball in a series of images after the baseball is thrown by the pitcher and travels toward the batter. The control subsystem processes the images collected by the image capture subsystem and processes the data from the images to calculate the trajectory of the baseball to within an accuracy of approximately one-half inch (1 cm). The control subsystem then calculates the extent of the strike zone, and produces a correct decision about whether the umpire call should be a strike or a ball, and compares the umpire decision with the produced correct decision. The control subsystem is also able to evaluate the moving physics of the ball to determine spin axis, pitch type (fast ball, slider), and pitch speed.

Other features and advantages of the present invention should be apparent from the following description of the preferred embodiment, which illustrates, by way of example, the principles of the invention.

BRIEF DESCRIPTION OF DRAWING The objects, advantages and features of this invention will be more readily appreciated from the following detailed description, when read in conjunction with the accompanying drawing, in which: Figure 1 is a block diagram representation of a performance evaluation system constructed in accordance with the invention.

Figure 2 is a block diagram representation of the Image Capture Subsystem shown in Figure 1.

Figure 3A is a side elevation view of a baseball park with the Figure 1 system, showing the system components in relation to a hypothetical pitcher and batter.

Figure 3B is a detail side elevational view of a baseball park with the Figure 1 system, showing a method for accurate measurement of the locations for reference markers and cameras.

Figure 4 is a plan view of the Image Capture Subsystem components deployed in the baseball park shown in Figure 3A.

Figure 5 is a block diagram representation of the Control Subsystem shown in Figure 1.

Figure 6 is a block diagram of a computer constructed in accordance with the present invention.

Figure 7 is a representation of a display of the computer processing system shown in Figure 6, illustrating the operator initialization screen home display window.

Figure 8 is a flow diagram representation of the processing steps executed by the computer system shown in Figure 6.

Figure 9 is a representation of the Calibrate display window of the computer shown in Figure 6.

Figure 10 is a representation of the Setup display window of the computer shown in Figure 6.

Figure 11 is a representation of the Pitch Processing display window of the computer shown in Figure 6.

Figure 12 is a representation of the Feedback display window of the computer shown in Figure 6.

Figure 13 is a representation of the Store display window of the computer shown in Figure 6.

BEST MODE FOR CARRYING OUT THE INVENTION Figure 1 is a representation of a precision object measurement and performance assessment system 100 constructed in accordance with the present invention. The system iteratively records images of an object as it moves along a path and calculates measurements of the object position. From the accurate position measurements, the system can accurately produce a determination about object characteristics, along with determining the extent of a target zone toward which the object is moving. If desired, the system determination can be compared to a decision by a test subject who observes the object movement. For example, in the case of a baseball pitch-calling umpire decision task, the object is a baseball, the baseball umpire is the test subject, and the determination about an object characteristic concerns whether the thrown baseball should be called either a strike or a ball. The object path being recorded is the flight of the thrown baseball from the pitcher to the catcher standing behind a batter at home plate.

Thus, to produce an accurate determination of strike or ball, the system 100 must accurately determine the position of the baseball relative to the strike zone. The strike zone is defined to be the area over home plate that is no higher than a horizontal line across the batter's body at the midpoint between the top of the batter's shoulders and the top of the uniform pants, and is no lower than the hollow beneath the batter's knee cap.

It should be appreciated that the boundaries of the strike zone must be adjusted for each batter, complicating the measurement and assessment task.
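The rule quoted above reduces to a simple geometric test once the batter-dependent bounds are measured. A minimal Python sketch follows; the function names and sample heights are assumptions for illustration, not values from the specification:

```python
# Hypothetical sketch: compute a batter-specific strike zone from measured
# body landmarks (heights in feet above field level). Per the rule, the
# upper limit is the midpoint between the top of the shoulders and the top
# of the uniform pants; the lower limit is the hollow beneath the kneecap.

PLATE_WIDTH_FT = 17.0 / 12.0  # home plate is 17 inches wide

def strike_zone(shoulder_top_ft, pants_top_ft, knee_hollow_ft):
    """Return (bottom, top, half_width) of the strike zone in feet."""
    top = (shoulder_top_ft + pants_top_ft) / 2.0
    bottom = knee_hollow_ft
    return bottom, top, PLATE_WIDTH_FT / 2.0

def is_strike(ball_x_ft, ball_z_ft, zone):
    """ball_x: horizontal offset from plate center; ball_z: height."""
    bottom, top, half_w = zone
    return abs(ball_x_ft) <= half_w and bottom <= ball_z_ft <= top

zone = strike_zone(shoulder_top_ft=4.6, pants_top_ft=3.4, knee_hollow_ft=1.5)
print(is_strike(0.2, 2.5, zone))   # pitch over the plate at mid height: True
print(is_strike(1.0, 2.5, zone))   # well outside the plate edge: False
```

Note that because the bounds depend on the batter's stance, `strike_zone` would be re-evaluated for each at-bat from the measured landmark heights.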

To accurately determine the baseball position with respect to the strike zone, the system 100 repeatedly updates a sequence of position data of the object path movement based on the images. That is, the system precisely calculates the location of the baseball at regular time intervals based on processing of the image data. In this way, the object position is accurately measured. In particular, the baseball locations can be measured to within 0.5 inch (approximately 1 cm) relative to home plate and any other fixed reference.

The system ultimately generates a strike or ball object characteristic decision relative to the batter's strike zone based on the measurements, thereby producing a presumed correct decision in response to the images. The system can receive the umpire's decision and can compare the umpire decision with the correct decision concerning the object characteristic. In this way, the system can assess the umpire's decision.

Thus, the system provides an automated objective second opinion concerning decisions based on the proximity of an object (a baseball) to a target (strike zone).

Umpires undergoing training or evaluation, for example, thereby have an objective tool to grade their performance. The system accurately measures or specifies the target zone, and then measures and records physics characteristics of the object trajectory. The system then records a human decision about the object position while traversing the target zone and compares it with the measured object position at the target zone. The system allows for the retrieval of the information by many characteristics, such as speed, location, velocity, axis and speed of spin, participants, types of decisions, and decisions deviating from a given threshold. Statistics on the decisions and physics parameters of the trajectory are compiled. Finally, a display allows the human to compare their decisions to the measured data.
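The retrieval capability described above amounts to filtering stored pitch records by measured characteristics. A small illustrative sketch follows; the field names, values, and predicates are invented for the example:

```python
# Illustrative sketch of record retrieval by measured characteristics.
# Field names and data are assumptions, not taken from the patent.

pitches = [
    {"speed_mph": 94.0, "type": "fastball", "system_call": "strike", "umpire_call": "strike"},
    {"speed_mph": 82.5, "type": "curve",    "system_call": "ball",   "umpire_call": "strike"},
    {"speed_mph": 88.0, "type": "slider",   "system_call": "strike", "umpire_call": "ball"},
]

def missed_calls(records):
    """Pitches where the umpire call deviates from the system decision."""
    return [p for p in records if p["umpire_call"] != p["system_call"]]

def faster_than(records, mph):
    """Pitches whose measured speed exceeds the given threshold."""
    return [p for p in records if p["speed_mph"] > mph]

print(len(missed_calls(pitches)))                        # 2 disagreements
print([p["type"] for p in faster_than(pitches, 85.0)])   # ['fastball', 'slider']
```

In a deployed system the same queries would run against a persistent database rather than an in-memory list, but the filtering logic is the same.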

In the preferred embodiment, the system 100 can locate the ball relative to reference points via image processing techniques. In this way, even if the cameras are inadvertently moved during play, no loss of accuracy will occur. For the preferred embodiment, the reference points and sensor (camera) locations are measured using a laser range finder. This provides, with the current state of the art in laser range finders, about 1 mm reference point location accuracy. The technique of this invention accurately aligns the coordinate system with the earth's gravitational field. Alignment with the gravitational field is important for physics modeling of the trajectory. No assumptions are made that the playing field is level or flat, or that any given pair of lines are either parallel or orthogonal.
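The gravity-alignment step described above can be illustrated with a short sketch. The following Python fragment is an assumed construction, not the patent's algorithm: it builds a right-handed, gravity-aligned coordinate frame from a plumb (gravity) direction and one surveyed baseline between reference points, without assuming the field is level.

```python
# A minimal sketch (assumed, not the patent's method) of constructing a
# right-handed coordinate frame whose z-axis opposes gravity. A plumb
# direction plus one surveyed baseline is enough to fix the frame.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gravity_frame(gravity_dir, baseline):
    """Return orthonormal axes (x, y, z) with z opposing gravity."""
    z = normalize(tuple(-c for c in gravity_dir))      # "up" axis
    # Remove the component of the baseline along z (Gram-Schmidt).
    b_perp = tuple(b - dot(baseline, z) * zc for b, zc in zip(baseline, z))
    x = normalize(b_perp)
    y = cross(z, x)
    return x, y, z

# A slightly tilted field: gravity is not exactly along -z of the survey frame.
x, y, z = gravity_frame(gravity_dir=(0.01, 0.0, -1.0), baseline=(1.0, 0.2, 0.0))
print(abs(dot(x, z)) < 1e-9, abs(dot(x, y)) < 1e-9)  # axes are orthogonal
```

Ball positions measured in camera coordinates would then be expressed in this frame so that the gravity term in the physics model acts purely along z.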

In addition, the preferred embodiment uses a neural network and passive imaging techniques. Images of players, scored by static measurement, are used for training. An example of this would be defining the upper and lower limits of a baseball player's strike zone. The rules define these limits relative to the player's knees, belt, and shoulders. The system then defines these limits in terms of camera pixels, for conversion to the "real-world" physical coordinates of the particular playing venue. The system also does a full error analysis, consistent with the physical facilities and view, to find the optimal placement for the sensors, thereby returning the most accurate possible result.

The system measurements return a series of ball positions. Finding the trajectory of the ball involves some form of curve fitting to these positions. The Kalman filter is a well-known technique for doing this. The system modifies the curve fitting technique by demanding a smoothness criterion to match the time shifts between the sensors, and by using accurate physics to fit the trajectory. Fitting and thereby measuring the physical parameters allows the speed, direction, spin axis, and spin rate of the ball to be determined. It is noteworthy that this technique does not require the very high resolution, very high frame rate cameras that would be needed to image the ball in enough detail to optically track the actual spinning.
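The physics-based fitting described above can be reduced, for intuition, to a much simpler stand-in: if gravity is known and drag is ignored, subtracting the gravity term from the vertical samples leaves a model that is linear in time, so release position and velocity follow from ordinary least squares. This sketch is a deliberate simplification of, not a substitute for, the Kalman/physics fitter the patent describes:

```python
# Simplified stand-in for physics-based trajectory fitting: with gravity
# known, z(t) = z0 + vz0*t + 0.5*g*t^2, so z(t) - 0.5*g*t^2 is linear in t
# and an ordinary least-squares line fit recovers (z0, vz0).
G_FTPS2 = -32.174  # gravitational acceleration, ft/s^2 (vertical axis)

def line_fit(ts, ys):
    """Least-squares fit y = a + b*t; returns (a, b)."""
    n = len(ts)
    st, sy = sum(ts), sum(ys)
    stt = sum(t * t for t in ts)
    sty = sum(t * y for t, y in zip(ts, ys))
    b = (n * sty - st * sy) / (n * stt - st * st)
    a = (sy - b * st) / n
    return a, b

def fit_pitch(ts, zs):
    """Fit vertical samples zs (ft) at times ts (s); return (z0, vz0)."""
    residuals = [z - 0.5 * G_FTPS2 * t * t for t, z in zip(ts, zs)]
    return line_fit(ts, residuals)

# Synthetic pitch: released at 6 ft with vertical velocity -2 ft/s,
# sampled at 100 frames per second over 0.4 seconds.
ts = [i * 0.01 for i in range(40)]
zs = [6.0 - 2.0 * t + 0.5 * G_FTPS2 * t * t for t in ts]
z0, vz0 = fit_pitch(ts, zs)
print(round(z0, 3), round(vz0, 3))  # recovers 6.0 and -2.0
```

The full system would fit all three axes, include drag and Magnus (spin) forces, and weight samples by per-camera timing, which is where the smoothness criterion mentioned above comes in.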

Dovetailing with this database is a sports training capability. Currently there are no systems that provide scientifically measured feedback for many sports training issues.

The system allows a player, umpire, or referee to see the actual measured trajectory, and compare it to the call made, or technique used to propel the ball. Also, general trends could be reported. For example, an umpire could be told most of his missed calls were on outside pitches. A player's pitches that progressively slow down throughout the season may indicate a steadily worsening injury.

Returning to the preferred embodiment shown in Figure 1, the system 100 includes an Image Capture subsystem 102 that captures the object movement and produces the image data, and a Control subsystem 104 that receives the image data and produces the correct object characteristic decision. The Image Capture subsystem 102 includes multiple image capture devices, such as video cameras. Each image capture device records the movement of the object along the path from a different viewing perspective, but with respect to the same set of reference markers. The reference markers can comprise, for example, visual markers such as a foul ball pole, or may be special high contrast edges in the images or reflective markers positioned in a spectator area. Preferably, the entire set of markers should be visible in every recorded image, to permit accurate isolation of the baseball position irrespective of stadium reverberation, wind velocity, or other very small variations in camera position from video frame to video frame. This ensures that each image capture device 202 will record the entire observed event, such as the flight of the baseball from the pitcher all the way to the batter, and permits accurate comparison of images from event to event.

The Control subsystem 104 uses the image data from the Image Capture subsystem 102 to perform real-time computation of the event outcome. Thus, performance assessment occurs between each sequence of image capture. The computation may involve, for example, sophisticated processing of data from the successive video images of a pitch, showing the baseball in flight from the pitcher toward home plate. The Control subsystem 104 receives the test subject (umpire) decision and compares the test subject decision about the object characteristic (strike or ball) with the actual event outcome. The Control subsystem also assesses the test subject performance. In this way, the test subject can promptly receive performance feedback.
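The comparison and feedback step described above can be sketched as a small scoring routine; the data and names below are invented for illustration:

```python
# Illustrative sketch of the performance-assessment step: compare each
# umpire call against the system's measured decision and summarize.

def score_umpire(calls):
    """calls: list of (umpire_call, system_call) pairs.
    Returns (correct, incorrect, accuracy)."""
    correct = sum(1 for u, s in calls if u == s)
    return correct, len(calls) - correct, correct / len(calls)

calls = [("strike", "strike"), ("ball", "strike"),
         ("ball", "ball"), ("strike", "strike")]
right, wrong, rate = score_umpire(calls)
print(right, wrong, rate)  # 3 correct, 1 missed, 75% accuracy
```

A deployed system would attach each pair to the full pitch record so the feedback display can show the trajectory behind each disagreement.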

Such information can be shared with designated members of a viewing audience. If desired, the viewing audience can include observers of a game broadcast, to permit monitoring of game performance by both the pitcher and the umpire. Such information can enhance the entertainment experience of the viewing audience.

Image Capture Subsystem Figure 2 is a block diagram representation of the Image Capture subsystem 102 shown in Figure 1. The Image Capture subsystem is the part of the overall system 100 that is responsible for recording the flight of the baseball with sufficient accuracy to permit the measurement processing of the Control subsystem 104 (Figure 1). The Image Capture subsystem shown in Figure 2 includes image capture devices 202, such as video cameras that record successive frames of images viewable on a screen, and reference markers 204, such as reflectors or other distinct features visible in the video frames.

The Image Capture subsystem 102 also includes an Image Data Processing subsystem 206 that includes a computational device, such as a Personal Computer (PC), that performs data processing functions on the image data.

In the preferred embodiment, the PC of the Image Data Processing subsystem 206 receives digital image data from the video cameras 202. The digital image data may be communicated over a variety of data transmission techniques between the components, including wiring and radio frequency (RF) links. The RF link enables data transfer without the amounts of wiring that would be necessary if cabling were used for data transmission. The Image Capture subsystem also includes a Storage and Security Processing component 208, which stores the data and also encrypts the data for transmission from the cameras to the PC. Encryption is especially useful if an RF link is used for communication, and prevents theft of the image and performance data, which is confidential and proprietary.

Figure 3A is a side elevation view of a baseball park equipped with the Figure 1 system, showing implementation of the system in relation to a hypothetical pitcher, batter, catcher, and umpire. Figure 3A shows a pitcher 302 located on a pitcher's mound 304 (with a baseball 305) at a specified distance from a batter 306. A catcher 307 crouches behind the batter, behind home plate 308, and an umpire 309 stands behind the catcher, judging each baseball thrown by the pitcher. The batter assumes a batting stance relative to home plate 308, in anticipation of receiving the baseball pitch. The pitcher, batter, catcher, and umpire are located in a ballpark or stadium, with a ballpark wall 310 defining the edge of the playing field 312 on which the players and umpire are standing.

Figure 3A also shows a camera 314 that has a view of the entire baseball trajectory, from the pitcher 302 to the batter 306. The camera captures images at regular intervals as the baseball travels from the pitcher to the batter. A second camera 315 also is positioned along the wall 310. The cameras 314,315 may or may not be elevated above the playing field, as required to avoid interfering with play and to have a good view of the ball path. Because any single camera in the system may have its view of the baseball trajectory temporarily blocked, it should be understood that multiple cameras will be positioned around the stadium, with each camera having a mostly unobstructed view of the baseball trajectory from pitcher to batter. If any single camera has an obstructed view, the entire path of the baseball will still be captured with the other video cameras to produce sufficient data for location processing.

In general, it has been found necessary to obtain sufficient data from video images to triangulate the object position at every point along the path. A minimum of two cameras is required to find the three-dimensional position of the ball. Using more than two cameras economically provides a sufficiently high system pixel count for the desired accuracy. It has also been found that more cameras may be needed for optimal system performance, to ensure sufficient data to obtain measurement accuracy to within one-half inch. Such multiple camera deployment overcomes a situation, for example, where the images from one of the cameras may not be usable due to crowd movement or persons in the playing field standing between the camera and the baseball trajectory. In that case, data from the other cameras will permit computation that locates the baseball. The additional cameras also provide redundancy that improves system reliability; if one camera goes down, the others can continue with slightly reduced accuracy. A calibration process described further below is performed prior to image capture and further ensures the desired accuracy.
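The two-camera minimum mentioned above follows from simple ray geometry. As an illustrative sketch (a standard construction, not necessarily the patent's specific algorithm), each calibrated camera contributes a ray toward the ball, and the closest-point-between-rays construction recovers the 3-D position:

```python
# Standard two-ray triangulation sketch: each calibrated camera yields a
# ray toward the ball; the 3-D estimate is the midpoint of the shortest
# segment connecting the two rays (they rarely intersect exactly due to
# pixel noise).

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def triangulate(o1, d1, o2, d2):
    """Rays: origin o, direction d (need not be unit). Returns midpoint
    of the closest-approach segment between the two rays."""
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b          # zero only for parallel rays
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = add(o1, scale(d1, t1))
    p2 = add(o2, scale(d2, t2))
    return scale(add(p1, p2), 0.5)

# Two cameras on opposite base lines observing a ball at (0, 30, 5) feet.
ball = triangulate((60.0, 0.0, 10.0), (-60.0, 30.0, -5.0),
                   (-60.0, 0.0, 10.0), (60.0, 30.0, -5.0))
print(tuple(round(c, 6) for c in ball))  # → (0.0, 30.0, 5.0)
```

With more than two cameras, the same idea generalizes to a least-squares intersection of all available rays, which is how extra cameras buy accuracy as well as redundancy.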

Camera placement affects system performance and accuracy. The cameras need a clear view of the ball trajectory. Some additional constraints on camera placement occur due to the physical limitations of the ballpark, and the need to not interfere with the play or with observer (fan) viewing. Even so, there will be some latitude in where cameras are placed. For example, a camera may be attached under either the second or third tier of seats. Once the tier has been decided, there is still a degree of freedom in that the camera can be moved back and forth along a line parallel to the path from home plate to first base or third base. The system 100 determines the contributions to error and the resulting error in the measured ball position. In view of the description herein, those skilled in the art will understand how to determine where to optimally place the cameras so as to minimize the final error in the ball position.

Figure 3A also shows multiple reference markers 316,318,320,322 located around the playing field. The reference markers may comprise small reflective devices or other markers that will be easily detected in recorded image data, sufficient that conventional vision detection systems can detect their presence in the captured images and align the baseball images from video frame to video frame and obtain consistent data. Such vision detection and recognition systems will be familiar to those skilled in the art. It should be understood that a variable number of reference markers may be used, including some that may already be present in the video images, such as home plate 308 and other bases, including third base 323 and first base 324. The number of reference markers deployed will depend on the environment, but in any case should be sufficient to permit triangulation of the baseball location relative to the markers from the data in the captured images. In addition to having the baseball trajectory in each camera's field of view, a sufficient number of the reference markers should also be in each camera's field of view.

Suitable video cameras for capturing the images may comprise commercially available cameras that provide at least 640 x 480 pixel resolution and operate at approximately twenty-five or thirty frames per second (fps) with at least 1/1000 second shutter speed. In the preferred embodiment, cameras such as the Model TRV 900 camera from the Sony Corporation of Japan can provide a stream of digital data. Such equipment provides an adequate signal-to-noise ratio for the necessary image computations. The cameras may be located in spectator viewing areas, adjacent the playing field 312, as illustrated in Figure 3A. As noted above, even if the view of the baseball in the images from one camera may be obscured, the use of multiple cameras and their scattered placement ensures that the baseball will be viewable in the captured images from a different camera. Thus, at least two cameras must be used, such that the reference markers are viewable in every captured image. In this way, the best images from among the cameras will be used for computation of the baseball position.

Optimal camera positioning for measurement accuracy is achieved when the camera view angles are orthogonal to the ball path and to each other. Thus, it is advisable to position the cameras such that all their view directions are not in the same plane, or level. For example, if four cameras are used, two may be placed along the third base line and two along the first base line, but one or both of the cameras along each base path should be elevated with respect to the other camera along the same base path.
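The placement guidance above can be checked numerically: the angle between two cameras' view directions toward the target point should be as close to 90 degrees as the venue allows, since near-parallel rays make triangulation ill-conditioned. The coordinates below are invented for the example:

```python
# Illustrative check of triangulation geometry: the angle between two
# cameras' view directions toward a target point. Angles near 90 degrees
# give the strongest geometry. Coordinates are assumed (feet, plate at
# the origin), not taken from the patent.
import math

def view_angle_deg(cam1, cam2, target):
    """Angle between the two camera-to-target view directions, in degrees."""
    v1 = tuple(t - c for t, c in zip(target, cam1))
    v2 = tuple(t - c for t, c in zip(target, cam2))
    d = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    return math.degrees(math.acos(d / (n1 * n2)))

# Cameras on the first- and third-base sides, slightly elevated.
angle = view_angle_deg((80.0, 80.0, 20.0), (-80.0, 80.0, 20.0), (0.0, 0.0, 2.5))
print(round(angle, 1))  # close to the ideal 90 degrees for this geometry
```

Evaluating this angle (and the resulting position error) over the feasible mounting locations is one simple way to carry out the placement optimization described above.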

Figure 4 is a plan view 400 of the Image Capture subsystem components deployed in the baseball park shown in Figure 3A. Thus, Figure 4 shows the pitcher's mound 304, and the location of the pitcher is marked by the "x". In Figure 4, the pitcher, batter, and umpire are not shown, for simplicity of illustration. Also visible in Figure 4 is the field wall 310 that divides the ballpark stands from the playing field 312.

The cameras 314,315 are shown adjacent to the wall, in the spectator area. The reference markers 316,318,320,322,323,324 are shown in Figure 4, but it should be understood that alternate locations for the markers may be used, so long as the reference markers are visible in the cameras' fields of view and can be reliably positioned in the ballpark so they will not be moved during play, while image capture is occurring.

Similarly, the cameras are shown along the field wall 310, but also may be positioned elsewhere, so long as they have an unobstructed view of the baseball trajectory and are not moved during play. It also should be understood that the cameras must be positioned so as not to interfere with play.

Control Subsystem Figure 5 is a block diagram representation of the Control subsystem 104 shown in Figure 1. The Control subsystem 104 is the part of the overall performance measurement system 100 (Figure 1) that receives the processed image data from the Image Capture Subsystem 102 and determines the accurate decision regarding a pitch being a ball or a strike. Figure 5 shows that the Control subsystem 104 includes a Decision Processing component 502, a Performance Feedback Processing component 504, an Operator Interface component 506, and a Security Processing component 508.

The Decision Processing component 502 comprises a software application program executing on a personal computer (PC) that may have the same construction as the PC of the Image Data Processing component 206. The Decision Processing component processes the image data to determine the trajectory of the ball, and then determines whether the observed trajectory intersects the batter's strike zone and hence whether the correct call is a strike or a ball. The output of the Decision Processing component consists of a sequence of estimated, time-tagged ball positions (the position is estimated sequentially in time, as each new camera image or images becomes available); a smoothed continuous trajectory of the ball derived from the sequentially estimated measurements; and a calculation of the target zone "miss distance" (for a ball) or, conversely, the "penetration distance" (for a strike) of the trajectory (enlarged to the size of a baseball) with respect to the three-dimensional "strike zone". The estimated accuracy of the "miss distance" or "penetration distance" is one-half inch. The functioning of the Image Data Processing component 206 and the Decision Processing component 502 may be implemented in the same PC. Likewise, the Performance Feedback Processing component 504 is a software application program that may be implemented in the same PC as the Decision Processing component, or in the same PC as both the Decision Processing and Image Data Processing components. The Operator Interface 506 provides a graphical user interface through which a system operator controls operation of the system.
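As an illustration of the miss-distance/penetration-distance computation described above, the following sketch models the strike zone as an axis-aligned box and the ball as a sphere. The function names, units, and coordinate conventions are assumptions for the example, not taken from the disclosure.

```python
import math

BALL_RADIUS_IN = 1.45  # approximate regulation baseball radius, in inches

def clearance_to_zone(ball_center, zone_min, zone_max, radius=BALL_RADIUS_IN):
    """Signed clearance between the ball surface and the strike-zone box.

    Negative values mean the ball penetrates the zone (a strike); positive
    values are the "miss distance" for a called ball.  All arguments are
    (x, y, z) coordinates, in inches, in the field coordinate system.
    """
    # Point-to-box distance: accumulate the squared distance by which the
    # ball center falls outside the box along each axis (zero if inside).
    d2 = 0.0
    for c, lo, hi in zip(ball_center, zone_min, zone_max):
        if c < lo:
            d2 += (lo - c) ** 2
        elif c > hi:
            d2 += (c - hi) ** 2
    return math.sqrt(d2) - radius

def call_pitch(ball_center, zone_min, zone_max):
    """Return ("strike", penetration_distance) or ("ball", miss_distance)."""
    clearance = clearance_to_zone(ball_center, zone_min, zone_max)
    if clearance <= 0.0:
        return "strike", -clearance
    return "ball", clearance
```

Because the ball is modeled as a sphere, a trajectory whose center passes within one ball radius of the zone boundary still registers as a strike, consistent with the enlarged-trajectory test described above.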

As with the Image Capture subsystem (Figure 2), the Control subsystem 104 also includes a Security Processing component 508, which can decrypt the image data upon receipt from the cameras and/or the PC of the Image Capture subsystem. This prevents theft of the image and performance data, which is kept confidential and proprietary. In addition, the Security Processing component 508 may include encryption facilities for any data transmissions that may take place from the component to other devices. In the preferred embodiment, the processing components of the Image Capture subsystem and of the Control subsystem are implemented as applications executing on the same computer.

Computer Configuration

Figure 6 is a block diagram of a computer constructed in accordance with the present invention. As noted above, the processing performed by the Image Capture subsystem 102 and the Control subsystem 104 may be performed on computers having a similar construction or may be performed by a single computer. In either arrangement, the computer performing the processing may have a construction as illustrated in Figure 6.

Figure 6 is a block diagram of an exemplary computer 600 such as might comprise any of the PC computers of the Image Data Processing component 206 or of the Decision Processing component 502. Each computer 600 operates under control of a central processor unit (CPU) 602, such as a "Pentium" microprocessor and associated integrated circuit chips, available from Intel Corporation of Santa Clara, California, USA. A computer user can input commands and data from a keyboard and mouse 604 and can view inputs and computer output at a display 606. The display is typically a video monitor or flat panel display device. The computer 600 also includes a direct access storage device (DASD) 607, such as a fixed hard disk drive, and a memory 608, which typically comprises volatile semiconductor random access memory (RAM). Each computer preferably includes a program product reader 610 that accepts a program product storage device 612, from which the program product reader can read data (and to which it can optionally write data). The program product reader can comprise, for example, a disk drive, and the program product storage device can comprise removable storage media such as a floppy disk, an optical CD-ROM disc, a CD-R disc, a CD-RW disc, a DVD disc, or the like. Each computer 600 can communicate with the other connected computers over the network 613 through a network interface 614 that enables communication over a connection 616 between the network and the computer.

The CPU 602 operates under control of programming steps that are temporarily stored in the memory 608 of the computer 600. When the programming steps are executed, the pertinent system component performs its functions. Thus, the programming steps implement the functionality of the system modules 102, 104 illustrated in Figure 1. The programming steps can be received from the DASD 607, through the program product storage device 612, or through the network connection 616. The storage drive 610 can receive a program product, read programming steps recorded thereon, and transfer the programming steps into the memory 608 for execution by the CPU 602. As noted above, the program product storage device can comprise any one of multiple removable media having recorded computer-readable instructions, including magnetic floppy disks, CD-ROM, and DVD storage discs. Other suitable program product storage devices can include magnetic tape and semiconductor memory chips. In this way, the processing steps necessary for operation in accordance with the invention can be embodied on a program product.

Alternatively, the program steps can be received into the operating memory 608 over the network 613. In the network method, the computer receives data including program steps into the memory 608 through the network interface 614 after network communication has been established over the network connection 616 by well-known methods that will be understood by those skilled in the art without further explanation. The program steps are then executed by the CPU 602 to implement the processing of the performance measurement system.

It should be understood that all of the computers of the system 100 illustrated in Figure 1 preferably have a construction similar to that shown in Figure 6, so that details described with respect to the Figure 6 computer 600 will be understood to apply to all computers of the system 100. Any of the computers can have an alternative construction, so long as they can communicate with the other computers and support the functionality described herein.

Operator Interface Home Display

Figure 7 is a representation of a display of the computer processing system shown in Figure 6, illustrating the operator initialization "Home" presentation that is produced by the graphical user interface provided by the Operator Interface component and shown in a window of the display 606 of the computer 600. In the Figure 7 representation, the display window 700 includes a title bar 702 along the top edge indicating the window title, "Strike System - Home Page". Five menu selection display buttons are shown, comprising "Calibrate" 704, "Setup" 706, "Pitch" 708, "Feedback" 709, and "Store" 710, corresponding to processes that may be performed at the designation of the system operator using the display device mouse 604. An "Exit" button 712 permits the operator to halt execution of the performance measuring system.

System Operation

Figure 8 is a flow diagram representation of the processing steps executed by the computer system shown in Figure 6. In the first operating step, indicated by the flow diagram box numbered 802, the system receives an operator input command. The input may come from an operator through, for example, the keyboard or mouse of the computer. After the input command is received, the system responds by checking the menu selection to which it corresponds. If the command is for a static calibration operation, an affirmative outcome at the decision box numbered 804, then the system performs a location calibration operation as indicated at the flow diagram box numbered 806. The static calibration operation of step 806 is performed upon camera placement, using techniques such as laser range finding and gravity field alignment, before starting the system for any performance measuring session, such as any one game or training session. This is described in greater detail below. After the static location calibration, the system waits for the next operator command, represented by the return of processing to the flow diagram box numbered 802.

If the operator command is not for a calibration process, a negative outcome at the decision box 804, then the system next checks for a command to perform a pre-game setup operation, as represented by the flow diagram box numbered 808. The setup processing is performed if applicable, as indicated by the affirmative outcome of the decision box 808 and the flow diagram box numbered 810. After setup, processing then returns to receive the next operator command at the flow diagram box numbered 802.

If the pre-game setup was not selected, a negative outcome at the decision box 808, then the system checks for selection of pitch processing, indicated by the decision box numbered 812.

If pitch processing was selected, an affirmative outcome at the decision box numbered 812, then the system receives pitch identification information, performs any necessary dynamic pitch calibration operations, and processes the pitch data to determine balls and strikes, as described above in connection with the Decision Processing component 502 of the Control subsystem. This processing is represented by the flow diagram box numbered 814. The dynamic pitch calibration operations are described in greater detail in conjunction with Figure 9. The pitch processing and associated options are described in greater detail in conjunction with Figure 11. The pitch identification processing 814 may occur in a post-processing step. After data from a pitch has been processed, the system checks to determine whether a continuation of the pitch processing is called for, with no change to pre-game setup parameters. This checking is represented by the decision box numbered 816. If such continuation is requested, an affirmative outcome at the decision box 816, then processing returns to the flow diagram box 814, where the system receives information for the next pitch and continues with the pitch processing.

If no more pitches are to be processed, a negative outcome at the decision box 816, then system processing returns to box 802.

Every time a pitch is measured, images from different cameras need to be processed and correctly time-tagged for later retrieval. One problem in processing is that the frames from different cameras generally will not be taken at exactly the same time and therefore will not be synchronized. This lack of synchronization can throw off measurement calculations and result in processing errors. The lack of synchronization arises because the cameras will have slightly different start times and because the frame-to-frame time intervals will be slightly different for each camera. Most cameras do not have a clock with sufficient time resolution to measure the time shifts between cameras, even if all the clocks of the cameras could be perfectly synchronized.

In the preferred embodiment, frame synchronization is achieved by shifting the pitch recording sequence for each camera in time, based on the recorded images, until maximum smoothness for the pitch is achieved. This resolves the time shift among all the cameras. Another problem is that a given camera or controlling computer may have a slightly faster clock than the others. Even if the camera clocks were accurately synchronized at the start, they might drift apart during the course of play or from pitch to pitch. The system 100 solves this problem by time-tagging the images according to a central clock, with the time shifted by the amount determined by the maximum smoothness criterion.

More particularly, the system 100 designates one camera as a "master" camera or time standard to which all the cameras will be synchronized. In the synchronization technique of the preferred embodiment, multiple images of a thrown baseball are collected from a camera and superimposed on each other to generate a single image in the form of a ballistic trajectory. The trajectory from the master camera is compared with the trajectory from each of the other cameras. With a suitable computer processing routine, such as video processing techniques or linear transformations, the frame-to-frame spacing of the images from each camera can be adjusted so as to synchronize with the spacing (timing) of the master camera. Thus, all of the cameras are synchronized to the master camera.
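One way to realize the time-shift search described above can be sketched as follows. This is an illustrative simplification, not the patented implementation: it uses one spatial coordinate, a quadratic ballistic model for the master trajectory, a brute-force offset search, and hypothetical function names.

```python
import numpy as np

def best_time_offset(master_t, master_y, cam_t, cam_y,
                     search=(-0.02, 0.02), step=0.0005):
    """Find the clock offset (in seconds) for one camera's samples that
    best matches the master camera's trajectory -- a stand-in for the
    "maximum smoothness" criterion described above."""
    # Fit a quadratic (ballistic) trajectory to the master camera's samples.
    coeffs = np.polyfit(master_t, master_y, 2)
    best, best_err = 0.0, float("inf")
    # Brute-force search over candidate offsets within the search window.
    for dt in np.arange(search[0], search[1] + step, step):
        pred = np.polyval(coeffs, cam_t + dt)
        err = float(np.sum((pred - cam_y) ** 2))
        if err < best_err:
            best, best_err = dt, err
    return best
```

In practice the fit residual plays the role of the smoothness measure: the correct offset is the one at which the shifted camera samples fall onto the master's smooth ballistic arc.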

Because the differences in camera frame rates will cause differences in image interval spacing over time as the cameras are operated, the preferred embodiment performs the synchronization processing with each pitch. The processing may be performed on the fly, as images are received from the cameras, so as to make the image corrections and determine the correct pitch call within a reasonable time interval after a pitch. Alternatively, the processing may be performed in the interval between pitches during play, or the processing may be performed after all pitches from a game have been recorded, in a batch processing mode. In any one of these scenarios, the processing ensures that the image data from all the cameras will be properly and automatically synchronized for every pitch.

Returning to the flow diagram of Figure 8, if the initial operator command was not a request for pitch processing, a negative outcome at the decision box numbered 812, then the system next checks for a command to save the session data. This processing is represented by the decision box numbered 818. A command to store the session data, an affirmative outcome at the decision box 818, causes the system to execute the save processing. The save processing is represented by the flow diagram box numbered 820 and involves storing the umpire performance, as well as a record of the corresponding pitches. The save processing is described in greater detail below.

If neither the pitch processing nor any other available operator command for further processing was selected, a negative outcome at the decision box numbered 818, then the "Exit" command was selected by the operator, and at the flow diagram box numbered 822 the exit processing is performed. Other operation of the computer then continues.

Calibration Display

As noted above, one of the operator commands available from the home page display is a Calibrate command. Figure 9 is a representation of the Calibrate display window 900 of the computer shown in Figure 6. The Calibrate display is used to initiate the processing represented by the flow diagram box 806 of Figure 8 upon selection of the Start button 902. The calibration operation of Figure 9 is performed once upon placement of the cameras, before starting the image capture operation.

Thereafter, it should not be necessary to perform a similar calibration before starting the performance measuring system for any measurement session, so long as camera position is not changed. Thus, calibrating the system precisely determines the camera perspective in relation to the baseball playing field and to the fixed references.

Those skilled in the art will appreciate that an initial camera calibration operation will be performed on an individual camera before the camera is placed at the playing field. This laboratory calibration operation will be known to those familiar with the digital video image cameras described above, which preferably utilize CCD arrays for imaging. The laboratory calibration involves measuring and correcting for what are known as camera intrinsic parameters, such as the effective pinhole focal length, the center of the camera CCD array, the x and y pixel spacing, and the r² and r⁴ radial distortion.

These laboratory calibration operations ensure that points in the camera field of view will be mapped to corresponding points in the camera CCD array, so that objects in the center of the camera field of view are mapped to the center of the CCD array, objects in the upper left corner of the field of view are mapped to the upper left corner of the final image, and so forth. Such mapping may be accomplished with the aid of test patterns or other special images viewed through the camera lens. The laboratory calibration process applies any needed correction factors to the camera video information to provide the correct mapping. After the laboratory calibration, the camera is ready for use at the playing field.

The Figure 9 calibration process at the playing field is necessary because the relative positions of the cameras will be different when the system is deployed at different playing fields. Therefore, to achieve the required baseball location accuracy to within one-half inch, the system must be calibrated once at each new playing field location and camera deployment. This location calibration process comprises precisely determining the location of each camera with respect to a fixed coordinate system that can define a fixed point in space. The coordinate system must describe known, precisely discernible positions. In the context of the baseball implementation of the preferred embodiment, the origin of the fixed coordinate system is defined by the back corner of home plate, which occurs at the intersection of the playing field foul lines.

At the playing field, the location calibration process involves carefully measuring the locations of the reference points and cameras. For example, these measurements may be obtained with precision laser range finders or global positioning satellite (GPS) system receivers. Those skilled in the art will appreciate that a variety of laboratory calibration and location calibration techniques may be used. One such technique is described in "An Efficient and Accurate Camera Calibration Technique for 3D Machine Vision" by Roger Y. Tsai, published in the Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (1986) at pages 364-374. As described further below, a further pitch calibration process is performed at least between every pitch to confirm the view direction of the cameras and, if necessary, perform correction calculations.

The location calibration may be performed using geometrical models of the object environment, or taking into account intrinsic camera parameters including lens distortion and other video transmission artifacts. Such techniques may incorporate the 3-D machine vision processing techniques described in the Tsai article mentioned above. To compute both the translational and rotational matrices that will be necessary for complete calibration of the system with the one-half-inch accuracy described above, seven or more non-coplanar field calibration points may be required from each camera to establish camera position relative to the fixed coordinate system for an in-field calibration. By prior laboratory calibration of the cameras' effective pinhole focal length, CCD center, x and y pixel spacing, and r² and r⁴ radial distortion, and if the camera locations are known, the number of required reference points drops to two. Using more reference points provides redundancy and better accuracy. The calibration points may comprise, for example, identifiable locations in the spectator stands and locations in the playing field. More than one camera can use the same set of seven points for calibration. For example, if two cameras are located adjacent a wall along the third-base line, then they both should be able to generate images in which the same seven non-coplanar calibration points are visible. Another set of cameras might be located along the first-base line, and therefore may not be able to use the same set of calibration points as the cameras of the third-base line. Thus, it may be necessary to establish multiple sets of calibration points.
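For readers unfamiliar with the cited Tsai technique, the following simplified direct-linear-transform (DLT) sketch shows how six or more non-coplanar reference points determine a camera's projection. Unlike the Tsai method it ignores lens distortion, and all names and the test geometry are illustrative assumptions.

```python
import numpy as np

def dlt_projection_matrix(world_pts, image_pts):
    """Estimate a 3x4 camera projection matrix P from six or more
    non-coplanar world points and their pixel coordinates -- a simplified
    stand-in for the Tsai-style field calibration described above."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        # Each correspondence contributes two linear constraints on P.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # P (up to scale) is the right singular vector of A with the smallest
    # singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, X):
    """Project a world point X = (x, y, z) to pixel coordinates."""
    x = P @ np.append(np.asarray(X, dtype=float), 1.0)
    return x[:2] / x[2]
```

Once P is known for a camera, its position and pointing vector relative to the field coordinate system can be factored out of P by standard decomposition.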

Accurate knowledge of the location of the reference markers in "real world" coordinates is necessary for accurate ball location. That is, the physical distance between the reference markers must be accurately known. In addition, the direction of the gravity field needs to be known to accurately measure the pitch trajectory physics and to account for gravitational forces (ballistics) and object characteristics (such as ball spin). That is, with the direction of the gravitational field known, and the distances between the reference markers accurately known, it is possible to determine whether ball movement is due to ballistic forces or is due to spin on the ball, thereby categorizing whether a pitch was a "fastball", a curve ball, or some other type of throw. This processing is most conveniently done by first aligning a real-world coordinate system axis with the gravity field. Figure 3B depicts a method to accomplish this goal.
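The separation of ballistic drop from spin-induced movement can be sketched as a residual-acceleration test. This toy version assumes a gravity-aligned z-axis (as produced by the plumb-bob procedure described below) and a simple quadratic fit; the function name and units are illustrative.

```python
import numpy as np

G_FTPS2 = 32.174  # standard gravity, ft/s^2

def vertical_spin_acceleration(t, z):
    """Fit z(t) to a quadratic and report the acceleration in excess of
    gravity.  With the z-axis aligned to the gravity field, any residual
    acceleration indicates aerodynamic (spin/Magnus) forces rather than
    pure ballistics."""
    a2, _, _ = np.polyfit(t, z, 2)   # z(t) = a2*t^2 + a1*t + a0
    total_accel = 2.0 * a2           # second derivative of the fit
    return total_accel + G_FTPS2     # zero for a purely ballistic flight
```

A purely ballistic pitch returns approximately zero, while a pitch with spin-induced lift or drop returns the extra acceleration, which can then be related to the spin axis and rate.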

Figure 3B shows a plumb bob 350 that provides the direction of the gravity field and a known length. The lower end of the plumb bob points to a corner of home plate 308, to be used as the first reference point in this illustration. The other end of the plumb bob is attached to a reference point of a laser range finder 352. A laser beam 354 shines onto a reflector 356, which marks the distance from the laser range finder to the second reference point, which is illustrated as a corner of first base 324. The reflected beam is received back at the laser range finder, and the distance from the range finder 352 to first base 324 is accurately determined. The height of the range finder above home plate is known. The reflector 356 is then moved to the next reference point, such as third base. The laser range finder is used to accurately determine the distance from home plate (the first reference point) to third base. This procedure is repeated to determine the distance from home plate to every other reference point and camera being used. The laser range finder 352 and plumb bob 350 are then positioned at the next reference point, such as third base. The reflector 356 is then moved to each of the other reference points and cameras (home plate, first base, etc.) to determine the distance from third base to every other reference point and camera being used. With the height of the laser range finder above the playing field known, and the distances between the reference points and cameras known, an accurate mapping of distances will be obtained, ensuring the extremely high positioning accuracy available with the system.

Thus, the reflector 356 is moved to each reference marker and each camera until the distance to every reference point, and to each camera, is measured. This process is repeated so that a minimum of three reference points is visible in the viewing frame of each one of the cameras. Those skilled in the art will recognize that this provides a very accurate way to locate each reference point in real-world coordinates, and to accurately calculate the camera positions. It is noteworthy that this method does not require the playing field to be flat or level, or that the geometry of the base paths (or any other feature of the playing field) be perfectly parallel or orthogonal. This procedure provides about 1 mm accuracy in locating reference points and cameras. An alternative to this method would be to use differential GPS, and integrate over a sufficient time span to get the desired accuracy. Those skilled in the art will readily understand how to perform this operation.
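A planar simplification of the distance-mapping survey above can be sketched as classical two-circle trilateration: given two surveyed stations and a measured range from each, the unknown marker's position follows from circle intersection. The full procedure also carries the range-finder height from the plumb bob and uses a third reference for redundancy; the function below is illustrative only.

```python
import math

def trilaterate_2d(p1, r1, p2, r2, prefer_positive_side=True):
    """Locate a point in the plane from its measured distances r1, r2 to
    two known reference points p1, p2 (a planar simplification of the
    laser range-finder survey)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(dx, dy)
    # 'a' is the distance from p1, along the p1->p2 baseline, to the foot
    # of the chord joining the two circle intersections; 'h' is the
    # half-chord offset perpendicular to the baseline.
    a = (r1 * r1 - r2 * r2 + d * d) / (2.0 * d)
    h = math.sqrt(max(r1 * r1 - a * a, 0.0))
    xm = p1[0] + a * dx / d
    ym = p1[1] + a * dy / d
    # Two mirror-image solutions exist; pick one side of the baseline.
    sign = 1.0 if prefer_positive_side else -1.0
    return (xm + sign * h * -dy / d, ym + sign * h * dx / d)
```

The two-fold ambiguity is resolved in practice by knowing which side of the baseline the marker lies on, or by a third range measurement.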

A second method that can be used to calibrate the positions of the reference points and the cameras is an all-optical method that uses only images of the field for calibration, with no active measurements of the positions as described above with the laser range finder. This method would have the advantages of ease of setup and added portability for the system, with the tradeoff of being less accurate than actively measuring the reference point and camera positions. This method would require building a virtual model of the playing field that would minimally specify home plate, first base, and third base. Additional assumptions would include the field being flat and level. Lines detected by the image processing would be assumed to be parallel or orthogonal. Ellipses would be assumed to be circles distorted into ellipses by the viewing angle. Those skilled in the art will recognize that the virtual position, pointing, and zoom of the camera can be mathematically varied to produce the best match with the images taken, thereby calibrating the system.

For performing the location calibration, an image is received from each camera. The image may comprise, for example, a digital data video frame produced with the camera locked in position and viewing a test pattern. With each image generated from a given camera, the Image Processing component finds the calibration markers, measures the pixel location of the markers in the image frame, and provides the x, y pixel coordinates to a calibration routine. The calibration routine determines the location and pointing vector of the camera with respect to the fixed coordinate system using known techniques, such as those described in the Tsai article mentioned above. Thus, for the baseball implementation, the calibration routine will establish the precise location and pointing of the camera relative to the origin of the coordinate system, which is located at the back point of home plate.

In between pitches, the system checks the location of the reference points relative to the positions noted at the pre-game location calibration operation to determine if there has been a change in the camera pointing vector. If there is any discrepancy, the system determines corrective operations, such as shifting and/or rotating the image. This pitch calibration operation is preferably performed automatically between pitches. If desired, this pitch calibration may be performed after each video frame is processed.
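The between-pitch shift/rotation correction can be sketched as a least-squares rigid alignment of the currently observed marker pixels onto their calibrated positions, a standard Kabsch/Procrustes computation. The function name and interface are assumptions for the example.

```python
import numpy as np

def estimate_frame_correction(ref_pts, cur_pts):
    """Estimate the 2-D rotation R and shift t that map the reference-marker
    pixel positions observed now (cur_pts) back onto the positions recorded
    during the pre-game calibration (ref_pts)."""
    ref = np.asarray(ref_pts, dtype=float)
    cur = np.asarray(cur_pts, dtype=float)
    rc, cc = ref.mean(axis=0), cur.mean(axis=0)
    # Kabsch: SVD of the cross-covariance of the centered point sets.
    H = (cur - cc).T @ (ref - rc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:         # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = rc - R @ cc
    return R, t                      # corrected pixel = R @ pixel + t
```

Applying the returned R and t to every pixel coordinate in a frame restores the calibrated alignment, so that the reference markers again occupy their pre-game positions before the ball is located.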

The display window 900 of Figure 9 shows that, if desired, a window dialogue box 904 can be provided for designating the number of calibration points available. This may be useful, for example, in troubleshooting the system, so that if the calibration routine detects a number of calibration points that does not match the specified number, the system will indicate an error. The window 900 also permits entry of "real world" data in the form of x, y, z point data through a dialogue box 912 that receives distance information for reference markers. This type of distance calibration information also can be received through x-y pixel data in a pixel dialogue box 914, whose data also can be automatically entered by means of designating points in a display area 916 of a camera view. Camera views in the box 916 can be selected with the camera dialogue box 918.

After the cameras are calibrated, the Image Processing component receives the image data and processes it to determine the location of the object relative to the fixed coordinate system. Thus, in the baseball implementation, the Image Processing component receives the image data frame by frame, showing the path of a baseball from the pitcher to the catcher. For each video frame, the Image Processing component locates the baseball in the frame and determines the pixel location of the ball in the frame, after considering any adjustment data produced from the calibration process.

When the location calibration process is completed, a message will be shown in the Calibration display page of Figure 9, such as with a pop-up display (not shown).

The system operator can then select the "Previous" button 906 to return to the Home (Figure 7) display, whereupon the system awaits the next operator command and the computer display again shows the home page of Figure 7. At any time during calibration, the system operator may select the "Cancel" button 908 to halt the process, and may select "Help" 910 for instructions.

Pre-Game Setup Display

After the location calibration has been completed for a particular playing field installation, the next processing step to be performed is typically that of pre-game setup, which is selected from the "Home" display of Figure 7 and then accessed through the Setup display screen 1000 of Figure 10. The Setup screen permits an operator to select the teams involved, the pitchers who will be performing, the batter sequence, and the umpire. Each of the display boxes in Figure 10 designates a drop-down menu or dialogue box that permits selection of predetermined choices and also permits input of operator choices. This could conceivably be automated by connection to the Internet "world wide web" or another database.

For example, the Figure 10 Setup screen 1000 shows a display box 1002 for a "Visiting" team and a display box 1004 for a "Home" team. The respective team boxes 1002, 1004 include a drop-down menu 1006, 1008 that permits an operator to view a list of all Major League teams for selection. In the preferred embodiment, the operator also may move a display pointer into the "Team" menu box 1006, 1008 and input a particular team name by typing. A "Pitcher" box 1010, 1012 will, upon selection or input of a team name in the corresponding Team box, include a listing of pitchers on the roster of the entered Team name. Such information may be available from the Control subsystem or the system database. Those skilled in the art will understand that the up and down arrows of the respective boxes may be used to scroll up and down the roster list of pitcher names. The system operator also may input a pitcher's name directly into the Pitcher box 1010, 1012 by typing the name.

The Figure 10 Setup display 1000 also is used to specify a batting lineup. For example, upon selection of a visiting team name, the "Batters" display box 1014, 1016 will include a list of batters on the roster of the selected team. Using the selection arrows 1018, 1020, the operator may move selected batters into and out of the "Lineup" box 1022, 1024. As a batter's name is selected, the batter is moved into the Lineup box to be the next sequential batter. Up and down arrows in the Lineup box can be used to scroll up and down the lineup list, and the selection arrows can be used to move batter names back and forth between the Batters display box 1014, 1016 and the Lineup display box 1022, 1024 until the desired batting lineup is achieved. An "Umpire" display box 1030 is provided for designation of the umpire who will be performing or undergoing training. A "Date" box 1032 permits entry of date information, to identify the data recording session. An "Update" button 1034 permits automatic updating of information, such as where the system communicates with a database that can automatically retrieve information such as teams, pitchers, lineup, and the like.

The display buttons marked "Previous" 1040, "Cancel" 1042, and "Help" 1044 have the same actions as described above for the corresponding display buttons of Figure 9.

Pitch Processing Display

Figure 11 shows the Pitch Processing display screen 1100, which the system operator uses to initiate calculation of the baseball trajectory and determination of whether the baseball passes through the strike zone. A thrown baseball is considered a strike once any part of the baseball touches the boundary of the strike zone. The preferred embodiment determines the position of the baseball as it passes home plate by using a trajectory reconstruction technique, such as a Kalman filter. Those skilled in the art will be familiar with Kalman filter techniques and will understand that such techniques iteratively estimate the position of an object along a path, producing an updated position estimate as each new measurement is integrated with the most recent position estimate. In addition, parameters measured via the Kalman filter or other fitting process can give the pitch's curvature, and hence determine the spin axis and effective spin rate of the ball. This indicates whether the pitch is a curve ball, a fastball, a slider, or some other pitch. It is noteworthy that this method does not require the high-resolution, high-frame-rate cameras that would otherwise be needed to image the ball well enough to see and measure the spinning directly.
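For illustration, a minimal per-axis Kalman filter of the kind alluded to above might look as follows: the state comprises position and velocity, and gravity enters the prediction as a known acceleration. The noise values, units, and interface are assumptions for the sketch, not parameters from the disclosure.

```python
import numpy as np

def kalman_track(times, measurements, meas_var=0.25, accel=-32.174):
    """Track one coordinate of the ball with a [position, velocity] state.

    times        -- frame timestamps (seconds)
    measurements -- per-frame position measurements for this axis (feet)
    accel        -- known constant acceleration (gravity, ft/s^2)
    Returns the filtered position estimate at each frame.
    """
    x = np.array([measurements[0], 0.0])     # initial state estimate
    P = np.diag([meas_var, 100.0])           # generous initial uncertainty
    H = np.array([[1.0, 0.0]])               # we observe position only
    R = np.array([[meas_var]])
    est = [float(x[0])]
    for k in range(1, len(times)):
        dt = times[k] - times[k - 1]
        F = np.array([[1.0, dt], [0.0, 1.0]])
        u = np.array([0.5 * accel * dt * dt, accel * dt])
        Q = np.diag([1e-4, 1e-2])            # small process noise
        x = F @ x + u                        # predict forward one frame
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                  # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + K @ (np.array([measurements[k]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        est.append(float(x[0]))
    return est
```

A full implementation would run one such filter per coordinate (or a coupled 6-state filter), and the fitted velocity and curvature terms are what yield the spin-axis and spin-rate inferences mentioned above.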

More particularly, upon selection of the "Start" display button 1102, the Control subsystem 104 (Figure 1) receives the position data from each video frame and produces an estimate of the baseball position that will be observed in the next video frame. Thus, multiple prior position data points are used to generate a position estimate, and each new position data point is used to produce an updated estimate. In this way, the latest video data of the actual position is used to generate an improved position estimate.

Those skilled in the art will recognize such computational processes as an implementation of a Kalman filter. In this way, the position of the baseball is predicted forward with each new frame of data and the position updating process is repeated again and again until the baseball passes home plate.

The pitch processing of the Control subsystem Decision Processing Component 502 uses conventional image processing techniques to isolate object movement and identify the data of interest. For example, in the baseball implementation, the change in location of the baseball from frame to frame should be easily identified, because the location of all other reference markers in any two time-sequential frames from the same camera should be unchanged. Thus, the system can adjust any two frames of image data so that the reference markers effectively occupy the same pixels in each frame. The only significant difference between the two frames will then be the change in the location of the baseball. The image processing may then subtract the video information in the two frames from one another to highlight the baseball location. In this way, the movement is accurately identified and the change in magnitude and direction of the baseball movement is quantified. In particular, with the pixel coordinates of the baseball identified, the Image Processing component can determine the location of the baseball relative to the fixed coordinate system; that is, home plate. Accuracy can also be increased by real-time adjustments to the calibration process, which those skilled in the art will understand can be provided using convolution filter techniques. Other image processing techniques may be employed to isolate object movement and identify the data of interest.
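The frame-differencing operation described above might be sketched as follows, under the assumption that the two frames have already been registered so that the reference markers occupy the same pixels; the threshold value and the use of NumPy are illustrative choices, not taken from the patent.

```python
# Illustrative sketch of two-frame differencing to isolate the moving ball.
import numpy as np

def locate_moving_object(prev_frame, curr_frame, threshold=30):
    """Return the (row, col) centroid of pixels that changed between two
    registered grayscale frames, or None if no motion is detected.

    threshold is an assumed value separating sensor noise from real change.
    """
    # Widen to a signed type so the subtraction cannot wrap around.
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = diff > threshold           # static background cancels out
    if not changed.any():
        return None                      # no motion between the frames
    rows, cols = np.nonzero(changed)
    return rows.mean(), cols.mean()      # pixel coordinates of the change
```

Note that a simple two-frame difference highlights both the old and new ball positions; a three-frame difference, or a sign test on the difference, can be used to keep only the newer one.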

Those skilled in the art will also recognize that Fourier filtering and color filtering techniques may be used to improve the signal-to-noise ratio of the baseball image data with respect to the background image data in a video frame. Such Fourier techniques can eliminate some of the undesired image data prior to the image subtraction operation described above, thereby improving the accuracy of the object location determination. It has been found, however, that video frame subtraction as described above is sufficient to provide the desired accuracy.

As noted above, the pitch processing routine of the Decision Processing Component makes a determination of whether the thrown baseball is a ball or a strike, depending on whether the projected ball path is determined to intersect the strike zone.

Although the vertical boundaries of the strike zone are determined by imaginary planes extending upward from the edges of home plate, the upper and lower horizontal boundaries of the strike zone are hitter-dependent. Therefore, the system of the invention determines each batter's strike zone immediately after the batter moves from a standing position to a batting stance, by viewing real-time video frame data and using image processing techniques. In the preferred embodiment, this is achieved with neural network processing, refined by input from a player database. Alternatively, the system can determine a batter's strike zone from a developed player database of strike zone data, if the particular batter is included in the database. This processing scheme results in highly accurate strike zone measurement and pitch call decision reporting, which can be used to provide feedback to umpires and improve their mental imagery of the strike zone, thereby improving the accuracy of their calls.

The pitch processing routine is initiated when the system operator selects the Figure 11 display "Start" button 1102. The system operator will select "Start" when the batter moves to a batting stance, and will thereby start the image capture (recording) and pitch data processing. All the Figure 11 information concerning inning or training session, batter, pitch number, and umpire pitch call is filled in by the system operator either before or after the actual pitch is thrown. Alternatively, this information may be filled in automatically by a web connection or connection to another database. The system automatically records all the data in the PC database, along with the actual pitch outcome as determined by the pitch processing routine described above, at or before the next "Start" activation.

In the Pitch Call box 1110, the system operator indicates whether the pitch was a strike, ball, or other event. So long as the system operator continues selecting the "Start" display button, the operator will continue to see the Figure 11 display screen and the system will continue operation in the Pitch Processing mode. At the conclusion of the game or training session, the system operator may select the "Previous" display button 1120 to return to the Figure 7 home display, where "Exit" can be selected to halt the strike analysis system processing. The "Cancel" button 1122 and "Help" button 1124 have the same actions as described above for Figure 9. The "Update" button provides the same functionality as with Figure 10, whereby lineup information can be automatically inserted from a database with which the system is in communication.

In the pitch processing, after the system operator has selected the "Start" button 1102, each of the cameras begins recording the pitch at a rate of twenty-five or thirty or more frames per second. The flight of a baseball will typically require between fifteen and forty frames to record. The pitch processing executed by the Control subsystem computer subtracts each new frame from the previously received frame to highlight and identify the position of the baseball. This processing removes the constant background features, such as the playing field wall and spectators, that are visible in the video images, and accentuates the change in baseball position from frame to frame. The processing results are received by a decision element that calculates the precise position of the baseball in relation to the strike zone and batter, and determines whether the thrown baseball was a ball or a strike.

To make the ball/strike decision, a smoothed trajectory is created from the measured baseball positions and a determination is made as to whether any part of the smoothed trajectory, including allowance for the baseball itself, intersects the strike zone.

If there is an intersection, then the thrown baseball is determined to be a strike. If there is no intersection, the throw is determined to be a ball. The calibration information is used to precisely determine the relationship between the baseball and the strike zone with respect to the baseball field reference markers and ensure the accuracy of the analysis process.
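As a hedged sketch of this intersection test (not the patent's exact geometry), the smoothed trajectory can be sampled where it reaches the front of home plate and compared against a strike zone expanded by the ball radius, so that a pitch catching the corner still registers as a strike. The coordinate convention and dimensions below are assumptions for illustration.

```python
# Assumed units: feet. x = horizontal offset from the plate center,
# y = distance remaining to the front of the plate, z = height.
BALL_RADIUS = 0.12  # roughly a regulation baseball radius, in feet

def is_strike(trajectory, zone_left, zone_right, zone_bottom, zone_top,
              radius=BALL_RADIUS):
    """Decide ball/strike from a smoothed trajectory.

    trajectory is a list of (x, y, z) samples. The first sample at or
    past the front edge of the plate (y <= 0) is tested against the
    strike zone, expanded on every side by the ball radius so that any
    part of the ball touching the zone counts as a strike.
    """
    for x, y, z in trajectory:
        if y <= 0:  # at or past the front edge of home plate
            return (zone_left - radius <= x <= zone_right + radius and
                    zone_bottom - radius <= z <= zone_top + radius)
    return False  # trajectory never reached the plate
```

A fuller treatment would test the trajectory over the entire depth of the plate rather than a single crossing sample, but the expanded-zone test above captures the "allowance for the baseball itself" described in the text.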

More particularly, the baseball is tracked to home plate and the ball/strike decision is made using a sequential estimation process, such as a Kalman filter. The baseball location in each image is quantified using the pixel location and is then used by the Kalman filter processing routine to update a next approximation of the baseball location. Although triangulation of the image data from multiple camera angles permits great accuracy, the inherent random noise and finite pixel count in the measurement process always limit the accuracy of the result.

As noted above, the system operator will select the "Start" display button 1102 at the beginning of the pitching sequence, at the pitcher's wind-up, to start the recording of images. The flight of a baseball from the pitcher's wind-up to crossing home plate, recorded at a rate of twenty-five or thirty or more frames per second, will typically provide a total of twelve to forty video frame data intervals to the Kalman filter processing routine. Thus, the Kalman filter routine will be able to use twelve to forty data points to make a final predicted baseball location. The associated error of the prediction is a function of the number of images received by the Kalman filter routine, and therefore the error is reduced by providing as many baseball location measurement data points as practical, using stereo viewing geometries. In the preferred embodiment, four cameras are positioned to collect data.

For example, one pair of cameras may be located along the third base line, and the second pair may be located along the first base line.
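One illustrative way such a multi-camera geometry can yield a three-dimensional ball position (a sketch, not necessarily the patent's method) is ray triangulation: each calibrated camera converts the ball's pixel coordinates into a viewing ray from the camera position, and the ball position is estimated as the midpoint of the closest approach between two such rays.

```python
# Illustrative midpoint triangulation of two camera viewing rays.
import numpy as np

def triangulate(cam1, dir1, cam2, dir2):
    """Midpoint of the closest approach between two viewing rays.

    cam1, cam2: camera positions; dir1, dir2: ray direction vectors
    (all NumPy arrays in the shared field coordinate system).
    """
    d1 = dir1 / np.linalg.norm(dir1)
    d2 = dir2 / np.linalg.norm(dir2)
    w = cam1 - cam2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; no unique intersection")
    t1 = (b * e - c * d) / denom         # parameter along ray 1
    t2 = (a * e - b * d) / denom         # parameter along ray 2
    p1 = cam1 + t1 * d1                  # closest point on ray 1
    p2 = cam2 + t2 * d2                  # closest point on ray 2
    return (p1 + p2) / 2.0
```

With four cameras, the pairwise triangulations (or a joint least-squares solution over all four rays) provide redundant position estimates that the Kalman filter can fuse, which is one reason additional measurement data points reduce the prediction error.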

Feedback Display

Figure 12 is a representation of the Feedback display window 1200 of the computer shown in Figure 6. The Figure 12 display provides a visualization of the pitches thrown to a named batter, the ball/strike determination calculated by the system, and the call made by the umpire. The display provides information on the performance measures, the test subject decision accuracy and timing, and the extent and magnitude of decision errors.

In the preferred display embodiment shown in Figure 12, the batter is identified by name in a "Batter" window dialogue box 1202 placed above a performance display area 1204 containing representations of the pitch calls relative to the strike zone. The performance display area preferably shows separate strike zones that represent the umpire calls that were correct 1220, incorrect 1222, and that were close calls 1224. The displayed strike zones are intended to represent an area having the width of the strike zone (home plate) and the height of the strike zone (dependent on the batter), from the perspective of the umpire, looking forward toward the pitcher. Circles represent correct calls and squares represent incorrect calls, with numbers optionally included to indicate the pitch sequence. A "View" box 1206 permits the system operator to select a different perspective, such as pitcher, first base line, third base line, or plan view (overhead).

In the Feedback window 1200, the location of each thrown baseball relative to the strike zones 1220, 1222, and 1224 is indicated by placement of a display icon, either a circle or a square. The circles represent pitches correctly called by the umpire and the squares represent pitches incorrectly called by the umpire. A number within each circle and square indicates the number of the throw in the pitching sequence to the batter who is identified in the name box 1202.

Thus, in Figure 12, the first pitch is indicated as a ball, thrown high and to the batter's left shoulder, with the square icon indicating that the umpire incorrectly called it as a strike. The second thrown ball is indicated as passing within the strike zone 1204, with the umpire making a correct call, indicated by the circular icon. It should be apparent that the third and fifth baseballs thrown were determined by the system to be balls, passing outside the strike zone 1204, with the umpire incorrectly calling the third ball and correctly calling the fifth ball. The third ball is indicated as a "close call", meaning that the distance of the ball's traversal from the target zone (strike zone) was within a predetermined "close call" tolerance band. It also should be apparent that the sequence of balls shown in Figure 12 does not correspond to an actual game, as any actual game sequence of thrown baseballs will terminate after the umpire calls three strikes, whether or not the system determines they were correctly called. A score of the test subject performance is shown in a window 1208 to indicate the percentage of correct calls, or some other performance measurement. In addition, the circles and squares can be color coded to further aid the visualization. As an arbitrary example, green could indicate a correct call, red an incorrect call, and blue a close call. In the example of Figure 12, the placement shows an umpire who tends to call pitches to the left correctly, but has more trouble with those on the right side.

Thus, the Feedback display 1200 shows decision accuracy over a series of batters, pitches, pitch types, and locations relative to the strike zone. The displayed information can be selected and organized by pitcher, type and sequence number of pitch, series of pitches, batter, inning, or game. In addition, multiple viewing perspectives can be selected, to show the strike zone and baseball placement from the perspective of the home plate umpire, batter, or pitcher. Each viewing perspective will show the entire trajectory of the thrown baseball from leaving the pitcher's hand to arrival at the home plate area. The Feedback visualization will also provide a statistical summary by pitch, batter, inning, or game. The "Previous" button 1210, "Cancel" button 1212, and "Help" button 1214 have the same actions as described above for Figure 9.

Store Display

Figure 13 is a representation of the Store display window 1300 of the computer shown in Figure 6. The Figure 13 display provides a menu of store parameters that will identify a data collecting session for later retrieval. Thus, the Store window receives batter identification information 1302, the date of the recording session (or game) 1304, umpire identification 1306, and an identification number 1308 that may be automatically assigned or provided by the system operator. The "Previous" button 1310, "Cancel" button 1312, and "Help" button 1314 have the same actions as described above for Figure 9.

As described above, the invention provides a system that iteratively records images of an object as it moves along a path and is observed by a test subject who makes a decision concerning an object characteristic. The system repeatedly updates a sequence of approximations of the object path movement based on the recorded images, and then produces an accurate decision concerning the object characteristic in response to the recorded images. The system also can receive the test subject decision about the object characteristic and can compare the test subject decision with the accurate decision to thereby assess the test subject decision.

Alternatives

The preferred embodiment has been described with respect to a baseball umpire decision task. The invention, however, may be applied to a wide variety of situations where a test subject must quickly make a decision after observing and tracking movement of one or more objects, and where it is necessary to judge the test subject performance in real time with a high degree of accuracy. For example, the system could be used to assess the judging of athletic events, such as calling tennis balls in or out of play, or judging whether basketball players were within a three-point zone before shooting.

Other implementations of the invention may involve commercial, laboratory, or other settings where precise, real-time assessment is needed and decisions must be made regarding movement of an object. For example, products may be accepted or rejected depending on their movement characteristics, such as how balls bounce or objects rebound.

The present invention has been described above in terms of a presently preferred embodiment so that an understanding of the present invention can be conveyed. There are, however, many configurations for performance measuring systems not specifically described herein but with which the present invention is applicable. The present invention should therefore not be seen as limited to the particular embodiments described herein, but rather, it should be understood that the present invention has wide applicability with respect to performance measuring analysis generally. All modifications, variations, or equivalent arrangements and implementations that are within the scope of the attached claims should therefore be considered within the scope of the invention.