

Title:
LENGTH MEASUREMENT AND HEIGHT MEASUREMENT APPARATUS AND METHOD
Document Type and Number:
WIPO Patent Application WO/2023/118775
Kind Code:
A1
Abstract:
A length measurement apparatus for measuring the length of an object visible in an image, the apparatus comprising: a line generation section adapted to generate a line extending along the length of the object, the line including the front end and/or the rear end of the object; a portion selection section adapted to select a portion of the generated line including the front end and/or the rear end of the object; a colour extraction section adapted to extract from each pixel along the selected portion an intensity value representing the intensity of a colour in the pixel; a relative intensity section adapted to calculate for each of the pixels the extracted intensity value relative to that of a reference pixel of the pixels; an end determination section adapted to determine as a possible endpoint of the object a pixel for which the calculated relative intensity value is within a predetermined range; and a length calculation section adapted to calculate the length of the object based on the determined possible endpoint.

Inventors:
HUMPHRIES GRANT (GB)
Application Number:
PCT/GB2022/050481
Publication Date:
June 29, 2023
Filing Date:
February 22, 2022
Assignee:
HIDEF AERIAL SURVEYING LTD (GB)
International Classes:
G06T7/60; G01C11/06; G06V20/17
Domestic Patent References:
WO2013088175A12013-06-20
Other References:
HUMPHRIES GRANT RICHARD WOODROW ET AL: "Aerial photogrammetry of seabirds from digital aerial video images using relative change in size to estimate flight height", RESEARCH SQUARE, 3 November 2021 (2021-11-03), pages 1 - 23, XP055961949, Retrieved from the Internet [retrieved on 20220917], DOI: 10.21203/rs.3.rs-1032839/v1
Attorney, Agent or Firm:
J A KEMP LLP (GB)
Claims:
CLAIMS

1. A length measurement apparatus for measuring the length of an object visible in an image, the apparatus comprising: a line generation section adapted to generate a line extending along the length of the object, the line including the front end and/or the rear end of the object; a portion selection section adapted to select a portion of the generated line including the front end and/or the rear end of the object; a colour extraction section adapted to extract from each pixel along the selected portion an intensity value representing the intensity of a colour in the pixel; a relative intensity section adapted to calculate for each of the pixels the extracted intensity value relative to that of a reference pixel of the pixels; an end determination section adapted to determine as a possible endpoint of the object a pixel for which the calculated relative intensity value is within a predetermined range; and a length calculation section adapted to calculate the length of the object based on the determined possible endpoint.

2. The length measurement apparatus of claim 1, wherein: the generated line includes the front end and the rear end of the object; the portion selection section is adapted to select a front portion including the front end and a rear portion including the rear end; the end determination section is adapted to determine a possible front endpoint from the front portion and a possible rear endpoint from the rear portion; and the length calculation section is adapted to calculate the length of the object between the determined possible front endpoint and the determined possible rear endpoint.

3. The length measurement apparatus of claim 1 or 2, wherein: the image is an aerial image captured from a moving platform; and the length calculation section is adapted to calculate the length of the object by measuring the Euclidean distance from the number of pixels in the vertical and horizontal directions between the endpoints of the object based on the ground sample distance of those pixels.

4. The length measurement apparatus of any preceding claim, wherein: the length calculation section is adapted to calculate the length of the object based on each of a plurality of determined possible endpoints.

5. The length measurement apparatus of any preceding claim, wherein: the colour extraction section is adapted to extract the intensity value for each of a plurality of colours in the pixel; and the length calculation section is adapted to calculate the length of the object based on each of a plurality of determined possible endpoints corresponding to each of the plurality of colours.

6. The length measurement apparatus of any preceding claim, wherein: the image is a frame of a series of images; and the length calculation section is adapted to calculate the length of the object based on each of a plurality of determined possible endpoints corresponding to each of a plurality of frames in which the object is visible.

7. The length measurement apparatus of any preceding claim, wherein the image is a frame of a series of images and the apparatus comprises: a frame counting section adapted to count the number of consecutive frames in which the object is visible; and an object filter section adapted to exclude from length measurement any object for which the counted number is below a predetermined threshold.

8. The length measurement apparatus of any preceding claim, comprising: a line input section adapted to allow a user to define a user-defined line along the length of the object from approximately the front end to approximately the rear end; wherein the line generation section is adapted to generate the line by extending both ends of the user-defined line.

9. The length measurement apparatus of any preceding claim, wherein the reference pixel is at an end of the selected portion.

10. A height measurement apparatus for measuring the height of the object, the apparatus comprising: the length measurement apparatus of any preceding claim; and a height calculation section adapted to calculate the height of the object using the calculated length of the object, a reference length for the object and the height of a moving platform from which the image is captured.

11. The height measurement apparatus of claim 10, comprising: a classification apparatus for classifying an object visible in an aerial image captured from a moving platform, the classification apparatus comprising: an imaging section adapted to provide an aerial image captured from a moving platform in which an object above a water surface is visible; a reflection determination section adapted to allow a user to input whether the image includes a reflection of the object on the water surface; and a height classification section adapted to classify the object as being a reference object at less than a predetermined height above the water surface if it is input that the image includes such a reflection, and to classify the object as being at least the predetermined height above the water surface if it is input that the image does not include such a reflection; wherein the reference length for the object is a measured length of the object classified as being a reference object.

12. The apparatus of any preceding claim, wherein the object is an airborne bird.

13. A length measurement method for measuring the length of an object visible in an image, the method comprising: generating a line extending along the length of the object, the line including the front end and/or the rear end of the object; selecting a portion of the generated line including the front end and/or the rear end of the object; extracting from each pixel along the selected portion an intensity value representing the intensity of a colour in the pixel; calculating for each of the pixels the extracted intensity value relative to that of a reference pixel of the pixels; determining as a possible endpoint of the object a pixel for which the calculated relative intensity value is within a predetermined range; and calculating the length of the object based on the determined possible endpoint.

14. The length measurement method of claim 13, wherein: the generated line includes the front end and the rear end of the object; a front portion including the front end and a rear portion including the rear end are selected; a possible front endpoint is determined from the front portion and a possible rear endpoint is determined from the rear portion; and the length of the object is calculated between the determined possible front endpoint and the determined possible rear endpoint.

15. The length measurement method of claim 13 or 14, wherein: the image is an aerial image captured from a moving platform; and the length of the object is calculated by measuring the Euclidean distance from the number of pixels in the vertical and horizontal directions between the endpoints of the object based on the ground sample distance of those pixels.

16. The length measurement method of any of claims 13-15, wherein: the length of the object is calculated based on each of a plurality of determined possible endpoints.

17. The length measurement method of any of claims 13-16, wherein: the intensity value is extracted for each of a plurality of colours in the pixel; and the length of the object is calculated based on each of a plurality of determined possible endpoints corresponding to each of the plurality of colours.

18. The length measurement method of any of claims 13-17, wherein: the image is a frame of a series of images; and the length of the object is calculated based on each of a plurality of determined possible endpoints corresponding to each of a plurality of frames in which the object is visible.

19. The length measurement method of any of claims 13-18, wherein the image is a frame of a series of images and the method comprises: counting the number of consecutive frames in which the object is visible; and excluding from length measurement any object for which the counted number is below a predetermined threshold.

20. The length measurement method of any of claims 13-19, comprising: allowing a user to define a user-defined line along the length of the object from approximately the front end to approximately the rear end; wherein the line is generated by extending both ends of the user-defined line.

21. The length measurement method of any of claims 13-20, wherein the reference pixel is at an end of the selected portion.

22. A height measurement method for measuring the height of the object, the method comprising: the length measurement method of any of claims 13-21; and calculating the height of the object using the calculated length of the object, a reference length for the object and the height of a moving platform from which the image is captured.

23. The height measurement method of claim 22, comprising: a classification method for classifying an object visible in an aerial image captured from a moving platform, the classification method comprising: providing an aerial image captured from a moving platform in which an object above a water surface is visible; determining whether the image includes a reflection of the object on the water surface, based on an input from a user; and classifying the object as being a reference object at less than a predetermined height above the water surface if it is determined that the image includes such a reflection, and classifying the object as being at more than the predetermined height above the water surface if it is determined that the image does not include such a reflection; wherein the reference length for the object is a measured length of the object classified as being a reference object.

24. The method of any of claims 13-23, wherein the object is an airborne bird.

25. A computer program comprising computer-executable code that when executed on a computer system causes the computer system to perform the method of any of claims 13-24.

26. A computer-readable medium storing a computer program of claim 25.

Description:
LENGTH MEASUREMENT AND HEIGHT MEASUREMENT APPARATUS

AND METHOD

The present invention relates to a classification apparatus and method, a length measurement apparatus and method and a height measurement apparatus and method, particularly for the purpose of measuring the height of objects such as airborne birds.

There are many instances where it is desirable to be able to measure the height of an object from imagery. For example, in the field of environmental surveys it is desirable to be able to estimate the altitude of a flying bird from aerial imagery.

It is well known that this can be achieved using two or more images of the same object taken from different viewpoints, a process often referred to as photogrammetry.

Photogrammetry works by looking at the change in image position of an object as the camera is moved. Objects of different heights will move differently, and if enough information about camera positions is available the height of each object can be calculated.

Photogrammetry is hard to apply to moving objects, because it is necessary to ensure that both images being compared are from exactly the same instant in time. If this is not the case, the height calculations can be subject to large errors.

Johnston A, Cook AS (2016) How high do birds fly? Development of methods and analysis of digital aerial data of seabird flight heights - BTO Research Report Number 676 discloses attempts to estimate flight height using a size-based technique by comparing the size of birds in images to the size of birds from published data or measured bird skins. However, this method relies on measurements of wing chord or body length being comparable between aerial images of birds and those derived from bird skins (which would change in size during the preparation process, typically due to shrinkage) or published measurements.

One aspect of the present invention provides length measurement apparatus for measuring the length of an object visible in an image, the apparatus comprising: a line generation section adapted to generate a line extending along the length of the object, the line including the front end and/or the rear end of the object; a portion selection section adapted to select a portion of the generated line including the front end and/or the rear end of the object; a colour extraction section adapted to extract from each pixel along the selected portion an intensity value representing the intensity of a colour in the pixel; a relative intensity section adapted to calculate for each of the pixels the extracted intensity value relative to that of a reference pixel of the pixels; an end determination section adapted to determine as a possible endpoint of the object a pixel for which the calculated relative intensity value is within a predetermined range; and a length calculation section adapted to calculate the length of the object based on the determined possible endpoint.

Another aspect of the invention provides a length measurement method for measuring the length of an object visible in an image, the method comprising: generating a line extending along the length of the object, the line including the front end and/or the rear end of the object; selecting a portion of the generated line including the front end and/or the rear end of the object; extracting from each pixel along the selected portion an intensity value representing the intensity of a colour in the pixel; calculating for each of the pixels the extracted intensity value relative to that of a reference pixel of the pixels; determining as a possible endpoint of the object a pixel for which the calculated relative intensity value is within a predetermined range; and calculating the length of the object based on the determined possible endpoint.
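The endpoint determination described in this method might be sketched as follows. This is a minimal illustration only: the choice of the first pixel of the selected portion as the reference pixel (one option contemplated in claim 9), the relative-intensity range of 0.0-0.6 and the function name are assumptions, not details taken from the application.

```python
def find_endpoint(intensities, rel_range=(0.0, 0.6)):
    """Scan colour intensity values sampled along the selected portion
    of the line and return the index of the first pixel whose intensity,
    relative to a reference pixel, falls within the predetermined
    range -- a possible endpoint of the object.

    The reference pixel is taken here as the first pixel of the
    selected portion (an end of the portion, as in claim 9).
    """
    reference = float(intensities[0])
    for i, value in enumerate(intensities):
        relative = value / reference  # intensity relative to the reference pixel
        if rel_range[0] <= relative <= rel_range[1]:
            return i  # possible endpoint of the object
    return None  # no pixel fell within the predetermined range

# Bright background pixels give relative values near 1.0; the darker
# object body drops below the assumed threshold at index 2.
print(find_endpoint([200, 198, 120, 60, 55]))  # 2
```

The length calculation step would then map the detected index back to image coordinates along the generated line before computing the object length.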

The present invention enables reference objects, against which the size of objects in images can be compared in the size-based technique, to be classified more reliably. The present invention enables the length of objects visible in images to be measured more accurately. The present invention enables the altitude of moving objects to be measured more reliably.

Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:

Fig. 1 illustrates image capture of a moving airborne object from a moving aerial platform;

Fig. 2 is a side view illustrating schematically the fields of view of an imaging array when mounted on a flying aircraft;

Fig. 3 is a schematic illustration of a height measurement apparatus embodying the present invention;

Fig. 4 shows sample RGB traces for portions of a line along the length of an object, and the change of the normalised colour values; and

Fig. 5 shows an example of the reflection of a bird, used to identify whether it was flying close to the sea surface.

The present invention relates to the analysis of images. Optionally, the images may be aerial images. The images may be captured from a moving platform 1. Fig. 1 and Fig. 2 schematically illustrate apparatus that may be used to capture the images.

Referring to Fig. 1, the apparatus comprises a moving platform 1, e.g. a vehicle such as an aeroplane, or a satellite, with an imaging device 2 that captures a sequence of still images. Preferably, the movement of the platform, indicated by the arrow 4, is constant and is known either by design or by measurement, and is a pure translation with no rotation component. Preferably, the motion of the platform is parallel to an approximately flat background 5, such as the ground or sea, at a fixed height h.

The imaging device 2 has a constant field of view 3, with respect to the moving platform 1. Sequential images are captured with the imaging device so that there is significant overlap between successive images. As a minimum, the overlap should be such that every point on the background is visible in at least two images. Preferably, each point in the background should appear in 5 - 25 images. This can be achieved, for example, using a video camera as the imaging device 2.
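The number of frames in which a given ground point appears follows from the along-track image footprint, the platform speed and the frame rate. A sketch, where all numeric values are illustrative assumptions rather than figures from the application:

```python
def frames_per_point(footprint_m, speed_m_s, fps):
    """Number of consecutive frames in which a fixed ground point is
    visible: the along-track footprint length divided by the distance
    the platform travels between successive frames."""
    return footprint_m / (speed_m_s / fps)

# Assumed values: a 100 m along-track footprint, a platform moving at
# 50 m/s and a 10 Hz frame rate put each ground point in about 20
# frames, within the preferred 5-25 range.
print(frames_per_point(100, 50, 10))  # 20.0
```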

Referring to Fig. 2, in the illustrated embodiment of the invention, the imaging devices 2 are positioned in a mount fixed to the moving platform 1 such that they do not look directly downwards, but at a forward or backward looking angle a. In this way, the images represent a partially side-on view of the animals rather than a top down view; this makes many creatures (animals/birds) easier to recognise. This can also help to reduce the effect of sun glare. This can also allow the reflection of objects to be seen on the water surface.

According to this embodiment, the cameras are tilted forwards by an angle a of between 10 and 45 degrees, preferably 30 degrees. The cameras are orientated such that the projections of their centrelines on the ground are parallel. The cameras have fields of view in which the distance on the ground between the projections of the centrelines is greater than the image width.

Optionally, four cameras are provided on the moving platform 1. The approximate ground sample distance (GSD; the distance between the centroids of two adjacent pixels measured at ground) may be 2 cm from an altitude of 550 m above sea level.
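The length calculation of claims 3 and 15 (the Euclidean distance between the determined endpoints, scaled by the ground sample distance) might be sketched as follows; the 2 cm GSD default reflects the example above, and the function name is illustrative:

```python
import math

def object_length(front_px, rear_px, gsd_m=0.02):
    """Length of the object in metres: the Euclidean distance, in
    pixels, between the front and rear endpoints, scaled by the
    ground sample distance (2 cm per pixel in the example above)."""
    dx = front_px[0] - rear_px[0]  # horizontal pixel separation
    dy = front_px[1] - rear_px[1]  # vertical pixel separation
    return math.hypot(dx, dy) * gsd_m

# A 30 x 40 pixel separation is a 50-pixel Euclidean distance,
# i.e. 1.0 m at a 2 cm ground sample distance.
print(object_length((10, 10), (40, 50)))  # 1.0
```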

Fig. 3 schematically depicts a classification apparatus 30. The classification apparatus 30 is for classifying an object 6. The object 6 is visible in an aerial image captured from a moving platform 1.

As shown in Fig. 3, optionally the classification apparatus 30 comprises an imaging section 31. The imaging section 31 is adapted to provide an aerial image captured from a moving platform 1. An object 6 is visible in the image. In the image, the object 6 is above a water surface. The water surface may be, for example, the surface of a sea, ocean or lake. Optionally, the imaging section 31 is in communication with the imaging device 2. The imaging device 2 is configured to output the image data to the imaging section 31. Optionally, the imaging section is configured to output the image onto a display for displaying the image to a user. This allows a user to analyse the image and input information about the image into the classification apparatus 30.

For example, as shown in Fig. 3 in an embodiment the classification apparatus 30 comprises a reflection determination section 32. The reflection determination section 32 is adapted to allow a user to input whether the image includes a reflection of the object 6 on the water surface. For example, the reflection determination section 32 may be configured to provide the user with a user interface through which the user can input information indicating that a reflection of the object 6 is visible on the water surface in the image or alternatively that no such reflection of the object 6 is visible on the water surface in the image.

As shown in Fig. 3, optionally the classification apparatus 30 comprises a height classification section 33. The height classification section 33 is adapted to classify the object 6 as being a reference object if it is input that the image includes a reflection of the object 6 on the water surface. The reference object is an object 6 that is determined as being at less than a predetermined height above the water surface. If it is input that the image does not include such a reflection, then the height classification section 33 is adapted to classify the object 6 as being at least the predetermined height above the water surface. When an object 6 is classified as being a reference object, parameters of the object 6 may be used in downstream calculations, as explained in more detail below.

Optionally, the height of an object 6 is estimated using a size-based technique by comparing the size of objects (of unknown height) in images to the size of reference objects (of known or approximately known height). Optionally, to use the size-based method, the lengths of reference objects at or near the surface 5 are determined, with which to compare objects of unknown height. Optionally, to determine if an object (e.g. a bird) is at or near the surface 5, users identify whether a reflection of the object 6 is visible on the surface of the water. When an object 6 is close enough to the sea surface in calm, usually overcast conditions, its reflection would be expected to be visible on the water. The present inventors have determined that objects (e.g. birds) with a reflection visible in the image usually have a maximum mean height of approximately one metre above the water surface. It is therefore reasonable to assume that objects that have a visible reflection are effectively at the water surface. These objects can be used as reference objects for comparison against higher objects of unknown height.

According to the present invention, reflections of flying birds may be used to denote that they are flying close to the sea surface. This ensures that a baseline of length measurements can be established for birds at a known height close to the water surface. As is described in more detail below, optionally the length measurements of birds at unknown height are compared with those of birds at known height in order to help determine the height of the birds at unknown height.

Optionally, the mount attached to the moving platform 1 is provided with a rotation facility, such as a ‘lazy-susan’ bearing that enables the imaging devices 2 to be pointed consistently in the same direction (with respect to the ground) throughout a survey. For example, the imaging devices 2 can point forward when the aircraft is flying in a first direction; the aircraft then turns through 180 degrees to make a pass flying in the opposite direction, but the mount is also rotated through 180 degrees so that the imaging devices 2 point backwards relative to the aircraft, but still in the same direction relative to the ground as when flying in the first direction.

Optionally, a feature in an image is determined as being a reflection only if the feature is below the object in the image. Because the camera equipment can be rotated to face away from the sun, reflections and shadows can easily be differentiated. A shadow can be identified in an image if it occurs above the bird (i.e., towards the top of the image in reference to the location of the bird) because the sun has to be behind the camera from the bird. A reflection always occurs below a bird in an image because light travelling to the camera from the bird always reflects off the sea surface in a position between the bird and the camera. Fig. 5 shows an example of the reflection of a bird used to identify whether it was flying close to the sea surface.
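In image coordinates where y increases downwards, the positional rule above can be sketched as follows. This is a simplified illustration only; the application relies on user input rather than an automated classifier, and the function name is hypothetical.

```python
def classify_feature(bird_y, feature_y):
    """A feature below the bird in the image (larger y in image
    coordinates, i.e. towards the bottom of the image) is a candidate
    reflection; a feature above the bird is a candidate shadow."""
    return "reflection" if feature_y > bird_y else "shadow"

print(classify_feature(bird_y=120, feature_y=180))  # reflection
print(classify_feature(bird_y=120, feature_y=60))   # shadow
```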

The classification apparatus 30 allows objects of known height to be classified more efficiently. In turn, the increased efficiency of classifying objects near the water surface 5 leads to more efficient measurement of the lengths of the reference objects, and improved efficiency in measuring the height of an object.

As shown in Fig. 3, optionally a length measurement apparatus 20 comprises the classification apparatus 30. The classification apparatus 30 may be adapted to classify the object 6 as being a reference object at less than a predetermined height above the water surface. The length measurement apparatus 20 may be for measuring the length of the reference object. The length of the reference object may then be used to help determine the height of an object of unknown height. Further features of the length measurement apparatus 20 are described in more detail below.

As shown in Fig. 3, optionally a height measurement apparatus 10 comprises the length measurement apparatus 20. The height measurement apparatus 10 is for measuring the height of a second object (i.e. different from the reference object) visible in an aerial image captured from a moving platform. The image in which the second object is visible may be the same or different from the image in which the reference object is visible. The image in which the second object is visible may have been captured in the same survey or in a different survey from the image in which the reference object is visible. The image in which the second object is visible may have been captured from the same moving platform 1 or from a different moving platform 1 from that used to capture the image in which the reference object is visible.

Optionally, the length measurement apparatus 20 of the height measurement apparatus 10 is for measuring the length of the reference object as a reference length for the second object. Optionally, the reference object and the second object are of the same type of object. For example, the reference object and the second object may be the same type (e.g. species) of bird. The type of object may have been identified by the user analysing the images. Of course, it is possible that there may be some difference in length of two objects of the same type (e.g. two birds of the same species). Nevertheless, the present inventors have found that use of the length of the reference object as a comparator provides good results in determining the heights of other flying objects of the same type. Optionally, the length measurement apparatus 20 is further for measuring the length of the second object. The length measurement apparatus 20 may be configured to measure the length of both the reference objects and the second object. Alternatively, the length of the reference object may be already known. In this case, the length measurement apparatus 20 may not be required to measure the length of the reference object.

As shown in Fig. 3, optionally the height measurement apparatus 10 comprises a height calculation section 11. The height calculation section 11 is adapted to calculate the height of the second object using the measured length of the second object, the reference length for the second object and the height of the moving platform 1 from which the aerial image in which the second object is visible is captured. The apparent body length of any one individual object increases linearly with increasing proximity to the imaging device 2. For example, a bird at half the distance between the seawater surface 5 and the imaging device 2 appears twice as large as a bird of the same size at the water surface 5. It is possible to use the proportional size of an object at height to an object at the water surface multiplied against the altitude of the moving platform 1 so as to determine the distance of the object at height from the water surface 5 along the camera angle. Optionally, the height of the second object is calculated using the following equation, where h_o is the height of the second object, h_p is the height of the moving platform 1, l_r is the reference length for the second object (e.g. the length of the reference object) and l_o is the length of the second object:

h_o = h_p × (1 − l_r / l_o)
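The linear scaling described above (an object at half the camera distance appears twice as large) implies a height relation of the form h_o = h_p × (1 − l_r / l_o). A sketch follows; the equation form is reconstructed from the worked example in the text rather than quoted from the application, and the function name is illustrative:

```python
def object_height(platform_height_m, reference_length_m, measured_length_m):
    """Height of the second object above the surface, assuming its
    apparent length scales linearly with proximity to the camera:
    h_o = h_p * (1 - l_r / l_o)."""
    return platform_height_m * (1 - reference_length_m / measured_length_m)

# A bird appearing twice as long as the reference length is at half
# the platform altitude, matching the worked example in the text.
print(object_height(550, 0.5, 1.0))  # 275.0

# A bird whose measured length equals the reference length is at the
# water surface (height zero).
print(object_height(550, 0.5, 0.5))  # 0.0
```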

Optionally, the lengths of a plurality of reference objects may be measured and l_r may be the mean length of the reference objects. Alternatively, l_r may be the upper 95% confidence level of the mean length of the reference objects, or the lower 95% confidence level of the mean length of the reference objects. Optionally, a plurality of calculations may be made of the length of the second object and l_o may be the maximum of the calculated lengths or the mean of the calculated lengths.

Optionally, a plurality of calculations for the height of the second object are made. For example, a maximum height of the second object may be calculated by taking l_r as the upper 95% confidence level of the mean length of the reference objects and l_o as the maximum of the calculated lengths of the second object. A mean height of the second object may be calculated by taking l_r as the mean length of the reference objects and l_o as the maximum of the calculated lengths of the second object. A minimum height of the second object may be calculated by taking l_r as the lower 95% confidence level of the mean length of the reference objects and l_o as the maximum of the calculated lengths of the second object.
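The mean and 95% confidence levels of the reference lengths might be computed as follows. The normal approximation and the 1.96 factor are assumptions for this sketch; the application does not state how the confidence level is derived.

```python
import statistics

def reference_length_levels(ref_lengths):
    """Return (lower 95% CL, mean, upper 95% CL) of the mean reference
    length, using a normal approximation: mean +/- 1.96 standard
    errors of the mean (the 1.96 factor is an assumption)."""
    n = len(ref_lengths)
    mean = statistics.mean(ref_lengths)
    sem = statistics.stdev(ref_lengths) / n ** 0.5  # standard error of the mean
    return mean - 1.96 * sem, mean, mean + 1.96 * sem

# Illustrative reference lengths in metres.
lower, mean, upper = reference_length_levels([0.4, 0.5, 0.6])
print(round(mean, 3))  # 0.5
```

The three l_r values would then feed the height equation to yield the maximum, mean and minimum height estimates described above.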

As shown in Fig. 3, optionally the height measurement apparatus 10 comprises an object direction section 12. The object direction section 12 is adapted to allow a user to input the direction of an object 6 in an image. Optionally, the object direction section comprises a user interface through which the user can input information indicative of the direction of movement of the object. For example, the user may input a direction from a plurality of options such as left, right, up, down, left-up, left-down, right-up and right-down. The number of possible options that the user can input may be fewer than eight or more than eight.

As shown in Fig. 3, optionally the height measurement apparatus 10 comprises a direction classification section 13. The direction classification section 13 is adapted to classify the object 6 as one of a plurality of direction classes. In one example, there are two direction classes, namely parallel and nonparallel. The direction classification section 13 is adapted to classify the object 6 as one of (a) moving parallel to the moving platform 1 and (b) moving nonparallel to the moving platform 1, based on the direction input by the user. The class (a) moving parallel to the moving platform 1 may correspond to one or more of the possible direction inputs of the user. The class (b) moving nonparallel to the moving platform 1 may correspond to one or more of the possible direction inputs of the user. For example, if the direction input by the user is up or down, then the direction classification section may classify the object 6 as moving parallel to the moving platform 1. If the direction input by the user is left, right, left-up, left-down, right-up or right-down, then the direction classification section 13 may classify the object 6 as moving nonparallel to the moving platform 1.

In another example, there are three direction classes, namely parallel, oblique and perpendicular. The direction classification section 13 may be adapted to classify the object 6 as one of (a) moving parallel to the moving platform 1, (b) moving obliquely to the moving platform 1 and (c) moving perpendicularly to the moving platform 1, based on the direction input by the user. The class (a) moving parallel to the moving platform 1 may correspond to one or more of the possible direction inputs of the user. The class (b) moving obliquely to the moving platform 1 may correspond to one or more of the possible direction inputs of the user. The class (c) moving perpendicularly to the moving platform 1 may correspond to one or more of the possible direction inputs of the user. For example, if the direction input by the user is up or down, then the direction classification section may classify the object 6 as moving parallel to the moving platform 1. If the direction input by the user is left-up, left-down, right-up or right-down, then the direction classification section 13 may classify the object 6 as moving obliquely to the moving platform 1. If the direction input by the user is left or right, then the direction classification section 13 may classify the object 6 as moving perpendicularly to the moving platform 1. In a further alternative, the number of direction classes may be more than three.
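The three-class example above maps the eight possible user inputs to direction classes; it can be sketched as a simple lookup (the mapping follows the worked example in the text, with up/down taken as along-track directions):

```python
# Mapping from the eight user direction inputs to the three direction
# classes, following the example given in the text.
DIRECTION_CLASS = {
    "up": "parallel", "down": "parallel",
    "left-up": "oblique", "left-down": "oblique",
    "right-up": "oblique", "right-down": "oblique",
    "left": "perpendicular", "right": "perpendicular",
}

def classify_direction(user_input):
    """Classify the user-input direction of the object as parallel,
    oblique or perpendicular to the moving platform."""
    return DIRECTION_CLASS[user_input]

print(classify_direction("left-up"))  # oblique
```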

Optionally, the object direction section 12 and the direction classification section 13 may be replaced by a single direction section which is adapted to allow the user to classify the object 6 as one of the plurality of direction classes, for example as one of (a) moving parallel to the moving platform 1 and (b) moving nonparallel to the moving platform 1. This reduces the number of processing steps.

As shown in Fig. 3, optionally the height measurement apparatus 10 comprises a direction matching section 14. The direction matching section 14 is adapted to determine whether the reference object and the second object are both classified as the same one of the plurality of direction classes. For example, when there are two direction classes, the direction matching section 14 is adapted to determine whether the reference object and the second object are both classified as the same one of (a) moving parallel to the moving platform 1 and (b) moving nonparallel to the moving platform 1. The height calculation section 11 is adapted to calculate the height of the second object only if the direction matching section 14 determines that the reference object and the second object are both classified as the same one of the plurality of direction classes, for example the same one of (a) moving parallel to the moving platform 1 and (b) moving nonparallel to the moving platform 1.

As mentioned above and shown in Fig. 2, for example, the imaging device 2 is angled. As a result, the direction of movement of the object 6 can affect the apparent size of the object 6 in the image. For example, as birds fly towards or away from the imaging device 2, they could appear shorter, which could undesirably bias flight height measurements. Optionally, the reference objects classed as moving parallel or nonparallel to the moving platform 1 are compared to objects at height moving parallel or nonparallel to the moving platform 1, respectively. This helps to account for the issue of foreshortening in the length measurements.

As shown in Fig. 3, optionally the height measurement apparatus 10 comprises an image coordinate section 15. The image coordinate section 15 is adapted to determine the position of an object within an image. For example, the positions within the image may be defined by x-y coordinates. The image coordinate section may determine the x-y location of the object 6 within the image.

Optionally, the height calculation section 11 is adapted to calculate the height of the second object further using the determined position of the reference object and the determined position of the second object. This can help to account for the influence of position of an object within the image on its measured length. It is possible that the position in the image frame may be related to the apparent length of the object. This location is accounted for by the image coordinate section 15 and the use of the determined positions by the height calculation section 11.

Further details of the length measurement apparatus 20 are set out below. As shown in Fig. 3, optionally the length measurement apparatus 20 comprises a line generation section 21. The line generation section 21 is adapted to generate a line extending along the length of the object 6 within the image. The line includes the front end and/or the rear end of the object 6. Preferably the line includes both the front end and the rear end of the object 6. The line runs through a plurality of pixels of the image in which the object 6 is visible. The line may extend beyond the front end and the rear end of the object 6. The length of the line may be longer than the length of the object 6 in the image.

As shown in Fig. 3, optionally the length measurement apparatus 20 comprises a portion selection section 22. The portion selection section 22 is adapted to select a portion of the generated line. The portion selection section 22 is adapted to select the portion such that the selected portion includes the front end and/or the rear end of the object 6. For example, the portion selection section 22 may be adapted to select two portions: a front portion that includes the front end of the object 6 and a rear portion that includes the rear end of the object 6. The front portion includes the true front end of the object 6 somewhere between the extreme points of the portion. Similarly, the rear portion includes the true rear end of the object 6 somewhere between the extremes of the rear portion.

As shown in Fig. 3, optionally the length measurement apparatus 20 comprises a colour extraction section 23. The colour extraction section 23 is adapted to extract from each pixel along the selected portion an intensity value representing the intensity of a colour in the pixel. Optionally, the colour extraction section 23 is adapted to perform the extraction for each of a plurality of colours in the pixel. For example, the red, green and blue pixel values (e.g. values from 0 to 255 which represent the intensity of each colour in a pixel of the image) along the front portion and the rear portion of the generated line may be extracted from the underlying image. Optionally, the portion selection section 22 is adapted to select a portion having a predetermined length relative to the total length of the generated line. For example, the front portion may be selected to be the first third of pixels along the generated line. The rear portion may be the last third of pixels along the generated line. By selecting a portion of the generated line in which the true front end and/or true rear end of the object 6 is expected to lie, the possibility of inadvertently determining a position in the middle of the generated line (e.g. corresponding to the back of a bird) as being the end point of the object is reduced. For example, some birds have colour changes across their back. By selecting a portion of the generated line, the confusion with colour changes across the back of the birds is reduced.
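The portion selection and colour extraction described above can be sketched as follows. The image representation (nested lists of (R, G, B) tuples) and the function names are illustrative assumptions, not the patent's implementation.

```python
def select_portions(line_pixels):
    """Return the first third and last third of the (x, y) pixel positions
    along the generated line, as the front and rear portions."""
    third = len(line_pixels) // 3
    return line_pixels[:third], line_pixels[-third:]

def extract_channel(image, pixels, channel):
    """Extract one colour channel's 0-255 intensity at each (x, y) pixel.
    `image` is assumed to be a row-major grid of (R, G, B) tuples."""
    return [image[y][x][channel] for (x, y) in pixels]
```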

The top two graphs of Fig. 4 show the extracted intensity values for each of red, blue and green along the ten pixels that form the front portion and the rear portion selected from the generated line. The x-axis shows the pixel index, which are numbers used to distinguish the pixels from each other. The pixel index numbers are sequential for pixels positioned sequentially along the selected portion. For example, pixel 1 is at one extreme end of the portion and pixel 2 is adjacent to pixel 1.

As shown in Fig. 3, optionally the length measurement apparatus 20 comprises a relative intensity section 24. The relative intensity section 24 is adapted to calculate for each of the pixels the extracted intensity value relative to that of a reference pixel of the pixels. For example, optionally pixel 1 or pixel 10 is used as a reference pixel. The relative intensity of each other pixel is calculated relative to the reference pixel. The results for the three colours, red, green and blue are shown in the bottom two graphs of Fig. 4. Optionally, the reference pixels are the pixels that correspond to the extreme ends of the generated line. As shown in Fig. 4, the reference pixel may generally be expected to have lower colour intensity than the other pixels.

The bottom two graphs in Fig. 4 show normalised channel values. The normalised channel value is the difference in intensity between a given pixel and the reference pixel, normalised relative to the maximum difference in pixel intensity. For example, in the graphs on the left hand side of Fig. 4, the reference pixel is pixel 1. The maximum intensity difference for red is shown in pixels 7-10. As a result, in the bottom-left graph, the normalised channel value for pixel 1 is 0, by definition because pixel 1 is the reference pixel. The normalised channel value for pixels 7-10 is 1, by definition because these pixels share the maximum red intensity. Meanwhile, pixels 2-6 have intermediate normalised channel values. The normalised channel value is calculated as the difference between the channel value for that pixel and the channel value for the reference pixel, divided by the difference between the channel value for the pixel with the maximum channel value and the channel value for the reference pixel.
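The normalisation described above, (value - reference) / (maximum - reference), can be sketched as follows. The function name and the guard against a zero denominator are illustrative assumptions.

```python
def normalised_channel(values, ref_index=0):
    """Normalise channel intensities relative to a reference pixel:
    (v - ref) / (max - ref). The reference pixel maps to 0 and the
    pixel(s) with the maximum intensity map to 1."""
    ref = values[ref_index]
    span = max(values) - ref
    if span == 0:  # all pixels equal to the reference; avoid division by zero
        return [0.0] * len(values)
    return [(v - ref) / span for v in values]
```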

As shown in Fig. 3, optionally the length measurement apparatus 20 comprises an end determination section 25. The end determination section 25 is adapted to determine as a possible end point of the object 6 a pixel for which the calculated relative intensity value is within a predetermined range. The possible end point is intended to be the true end (e.g. front end or rear end) of the object 6. For example, the present inventors have found that when the normalised channel values are on a 0 - 1.0 scale, then pixels with normalised channel values within the range of 0.2 - 0.4 represent the most logical true end points of the object 6. Any pixel with a normalised RGB value between 0.2 and 0.4 is determined as a possible end point (e.g. possible front end point or possible rear end point) of the object 6. Pixels that have a pixel colour value that is greater than that of the reference pixel (which may be the pixel having the minimum pixel value) by at least 20% and at most 40% of the difference between the pixel colour value of the pixel with the highest pixel value and the pixel colour value of the reference pixel may be considered as either a start or end point of the object.
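Given normalised channel values on a 0 - 1.0 scale, the end determination described above reduces to a threshold test. A minimal sketch, assuming the 0.2 - 0.4 range from the description and a hypothetical function name:

```python
def possible_endpoints(normalised_values, lo=0.2, hi=0.4):
    """Return the indices of pixels whose normalised channel value lies
    within the predetermined range [lo, hi]; these are the candidate
    end points of the object."""
    return [i for i, v in enumerate(normalised_values) if lo <= v <= hi]
```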

The length measurement apparatus 20 allows the length of objects such as airborne birds to be measured more accurately. In the images that are analysed by experts, the start and end points of a bird may not be immediately obvious. This may be due to pixel softness or angle, for example. The use of the pixel colour intensity values allows the endpoints to be determined more accurately, thereby increasing the accuracy of measuring the length of the object. In turn, the increased accuracy of measuring the length of the object leads to an improved accuracy in the height measurement for an object.

It may be that more than one pixel falls within the predetermined range. This may lead to a plurality of combinations of possible front end points and possible rear end points. These combinations may be used to generate a series of possible lengths for the object 6. As shown in Fig. 3, optionally the length measurement apparatus 20 comprises a length calculation section 26. The length calculation section 26 is adapted to calculate the length of the object 6 based on the determined possible endpoint(s). The length calculation section 26 may calculate the length of the object 6 between the determined possible front endpoint and the determined possible rear endpoint.

Optionally, the length calculation section 26 is adapted to calculate the length of the object by multiplying the Euclidean distance, derived from the number of pixels in the vertical and horizontal directions between the endpoints of the object, by the ground sample distance of those pixels. The length calculation section 26 may be adapted to measure the Euclidean distance between the possible endpoints. The ground sample distance may be known or may be calculated based on the height of the moving platform 1.
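The length calculation above can be sketched as follows, assuming endpoints given as (x, y) pixel coordinates and a known ground sample distance in metres per pixel; the function and parameter names are illustrative.

```python
import math

def object_length_m(front, rear, gsd_m_per_px):
    """Length of the object: the Euclidean distance in pixels between the
    front and rear endpoints, scaled by the ground sample distance."""
    dx = rear[0] - front[0]
    dy = rear[1] - front[1]
    return math.hypot(dx, dy) * gsd_m_per_px
```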

The length calculation section 26 may be adapted to calculate the length of the object 6 based on each of a plurality of determined possible endpoints. As mentioned above, there may be a plurality of pixels for which the relative intensity is within the predetermined range. This may give rise to a plurality of pairs of possible endpoints. Different sets of possible endpoints may be determined by using different colours and/or different frames of a series of images (e.g. a video).

Optionally, the colour extraction section 23 is adapted to extract the intensity value for each of a plurality of colours in the pixel. For example each of the red, green and blue channels may be used to determine three sets of possible endpoints. The length calculation section 26 may be adapted to calculate the length of the object 6 based on each of a plurality of determined possible endpoints corresponding to each of the plurality of colours.

Optionally, the image is a frame of a series of images (e.g. a video or a plurality of still images). The length calculation section 26 is optionally adapted to calculate the length of the object 6 based on each of a plurality of determined possible endpoints corresponding to each of a plurality of frames in which the object is visible.

Optionally, the length measurement apparatus 20 comprises a frame counting section 27. The frame counting section 27 is adapted to count the number of consecutive frames in which the object 6 is visible. Optionally, the length measurement apparatus 20 comprises an object filter section 28. The object filter section 28 is adapted to exclude from length measurement any object 6 for which the counted number is below a predetermined threshold. In one example, only birds that are visible in a minimum of three frames are measured. This can help to limit the number of birds that only appear briefly in the imagery (and are thus less likely to be at their maximum length).
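The frame-count filtering described above can be sketched as follows, assuming per-object counts of consecutive frames are already available; the data structure and names are illustrative.

```python
def filter_by_frame_count(frame_counts, min_frames=3):
    """Keep only objects visible in at least `min_frames` consecutive
    frames; briefly visible objects are excluded from length measurement."""
    return [obj for obj, n in frame_counts.items() if n >= min_frames]
```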

Optionally, the length measurement apparatus 20 comprises a line input section 29. The line input section 29 is adapted to allow a user to define a user-defined line along the length of the object from approximately the front end to approximately the rear end. The line generation section 21 is adapted to generate the line by extending both ends of the user-defined line. In one example, images of birds that have already been positively identified by experts are measured using a bespoke measurement tool. Using the line input section 29, observers draw lines along the body length of birds from approximately the front of the head to the tip of the tail of birds in flight. The line generation section 21 then automatically extends the length of the line to ensure the front and tail of the bird are captured. Optionally, the line generation section 21 is configured to extend the user-drawn line by a predetermined percentage, for example 25%.
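One way to extend a user-drawn line symmetrically beyond both ends, as described above, is sketched below; the 25% default follows the example in the description, while the function name and point representation are assumptions.

```python
def extend_line(p1, p2, fraction=0.25):
    """Extend a user-drawn line beyond both endpoints by `fraction` of its
    length, so the true front and rear of the object lie within the line."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    new_p1 = (p1[0] - fraction * dx, p1[1] - fraction * dy)
    new_p2 = (p2[0] + fraction * dx, p2[1] + fraction * dy)
    return new_p1, new_p2
```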

Where a flying object is visible in multiple images, multiple image pairs can be defined. This enables multiple estimates of the height to be obtained. The mean and variance of these heights can be calculated. The mean gives a more robust and precise estimate of the height than any of the individual estimates and it is preferable to use this value as the estimate of the object’s height. The variance gives a quantification of the robustness of the height estimates. This can be used to discard poor height estimates, which may occur when the object’s posture may result in a short measurement.
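The aggregation of repeated height estimates described above can be sketched as follows; the variance threshold for discarding poor estimates is a hypothetical parameter, and the function name is an assumption.

```python
import statistics

def summarise_heights(heights, max_variance=None):
    """Return the mean and sample variance of repeated height estimates,
    plus a flag indicating whether the estimate is considered reliable
    (variance at or below an optional threshold)."""
    mean = statistics.mean(heights)
    var = statistics.variance(heights) if len(heights) > 1 else 0.0
    reliable = max_variance is None or var <= max_variance
    return mean, var, reliable
```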

It is possible to implement each of the various items in Fig. 3 as dedicated hard-wired electronic circuits; however the various items do not have to be separate from each other, and could all be integrated onto a single electronic chip. Furthermore, the items can be embodied as a combination of hardware and software, and the software can be executed by any suitable general-purpose microprocessor, such that in one embodiment the apparatus can be a conventional personal computer (PC), such as a standard desktop or laptop computer with an attached monitor. The computer can be connected to an imaging device, such as a digital video camera, or can input a video file captured by a separate imaging device and transferred to the computer. Alternatively, the apparatus can be a dedicated device.

The invention can also be embodied as a computer program stored on any suitable computer-readable storage medium, such as a solid-state computer memory, a hard drive, or a removable disc-shaped medium in which information is stored magnetically, optically or magneto-optically. The computer program comprises computer-executable code that when executed on a computer system causes the computer system to perform a method embodying the invention.