Title:
SYSTEM FOR PERFORMING AN ANIMAL RELATED ACTION
Document Type and Number:
WIPO Patent Application WO/2015/041517
Kind Code:
A1
Abstract:
A system for performing an animal related action on an animal, the system comprising: a 3D sensor system to determine 3D spatial information of at least one animal part of said animal, an animal related device for performing said animal related action on said animal, a system control for controlling the animal related device to perform said action on the basis of said determined 3D spatial information, wherein said 3D sensor system comprises a source for emitting optical electromagnetic radiation, and a sensor housing with a window and a 3D sensor having a sensor device to detect optical electromagnetic radiation that has been reflected off said animal part through said window onto the sensor device, wherein said 3D sensor system has a sensor control to derive said 3D spatial information from said detected optical electromagnetic radiation, wherein said sensor housing further comprises a wall that is at least partially transparent between said window and said sensor device, and such that there are at least a first space that is limited by the window, the sensor housing and the wall, and a second space opposite the first space with respect to the wall that is limited by the wall and the sensor housing, wherein at least the sensor device is arranged in the second space. This set-up improves thermal properties and extends the working range.

Inventors:
MOSTERT GERARD (NL)
EPEMA DAVID (NL)
YADIN GIDEON (NL)
Application Number:
PCT/NL2014/050532
Publication Date:
March 26, 2015
Filing Date:
July 31, 2014
Assignee:
LELY PATENT NV (NL)
International Classes:
A01J5/017; G01S17/88; G01S7/481
Domestic Patent References:
WO2008033008A22008-03-20
Foreign References:
EP1253440A12002-10-30
DE202005020282U12007-05-03
US20100013984A12010-01-21
EP2059834A12009-05-20
EP0360354A11990-03-28
NL2010213A2013-01-31
Attorney, Agent or Firm:
JENNEN, Peter Leonardus Hendricus (PB Maassluis, NL)
Claims:
CLAIMS

1. System (1) for performing an animal related action on an animal (2), the system comprising:

- a 3D sensor system (11) arranged to determine 3D spatial information of at least one animal part (8, 9, 10) of said animal,

- an animal related device (7) for performing said animal related action on said animal,

- a system control (12) for controlling the animal related device to perform said action on the basis of said determined 3D spatial information,

wherein said 3D sensor system comprises a source (24) for emitting optical electromagnetic radiation (25), and a sensor housing (13) of a first material and provided with a window (16),

wherein there is provided in said sensor housing a 3D sensor having a sensor device (20) that is arranged to detect optical electromagnetic radiation that has been reflected off said animal part through said window onto the sensor device,

wherein the 3D sensor system has a sensor control (30) that is operatively connected to the sensor device and is arranged to derive said 3D spatial information from said detected optical electromagnetic radiation,

wherein said sensor housing further comprises a wall (15) that is at least partially transparent between said window and said sensor device, the wall being positioned and extending such within said sensor housing that there are at least a first space (17) that is limited by the window, the sensor housing and the wall, and a second space (18) opposite the first space with respect to the wall that is limited by the wall and the sensor housing,

wherein at least the sensor device is arranged in the second space.

2. System according to claim 1, wherein the first material has a first thermal conductance per area and the wall is made of a second material having a second thermal conductance per area that is lower than the first thermal conductance per area, more in particular wherein the first material has a first thermal conductivity and the wall is made of a second material having a second thermal conductivity that is lower than the first thermal conductivity.

3. System according to any one preceding claim, wherein said second material is opaque for the optical electromagnetic radiation and the wall has a second window (16, 26), or said second material is transparent for the optical electromagnetic radiation.

4. System according to any one preceding claim, wherein said first material is substantially a metal material, in particular aluminium.

5. System according to any one preceding claim, wherein said second material comprises or substantially is (stainless) steel, a glass or a plastic material.

6. System according to any one preceding claim, wherein the second space comprises an air guide means (22, 28) arranged to guide air that rises in the second space towards the sensor device, and in particular away from the first space, more in particular wherein the air guide means is provided on the wall.

7. System according to any one preceding claim, wherein said source is also mounted in said sensor housing.

8. System according to any one preceding claim, wherein said sensor device, and preferably at least one of said source and said sensor control, is/are mounted in a position below the centre of the housing when the 3D sensor system is in an orientation intended for use.

9. System according to any one preceding claim, comprising a bracket device (19, 23) mounted in the sensor housing, preferably onto the wall and preferably horizontally, onto which bracket device there is mounted at least the sensor device.

10. System according to any one preceding claim, wherein at least one of said system control and said sensor control is arranged to detect an animal part, in particular a teat, in said determined 3D spatial information.

11. System according to claim 10, wherein said animal related device comprises a robot arm (5, 6) arranged to be moved to a detected animal part, in particular said teat, on the basis of said determined 3D spatial information.

Description:
System for performing an animal related action

The invention relates to a system for performing an animal related action on an animal, comprising a 3D sensor system arranged to determine 3D spatial information of at least one animal part of said animal, an animal related device for performing said animal related action on said animal, and a system control for controlling the animal related device to perform said action on the basis of said determined 3D spatial information. The 3D sensor system comprises a source for emitting optical electromagnetic radiation, and a sensor housing of a first material and provided with a window. In the sensor housing there is provided a 3D sensor having a sensor device that is arranged to detect optical electromagnetic radiation that has been reflected off said animal part through said window onto the sensor device. The 3D sensor system has a sensor control that is operatively connected to the sensor device and is arranged to derive said 3D spatial information from said detected optical electromagnetic radiation.

Such systems are known, for example from EP2059834. Herein, it is to be noted that the 3D sensor system determining 3D spatial information in the present invention relates to those systems that collect depth information, i.e. distance information, for a plurality of directions substantially simultaneously (i.e. in one image), by means of emitted and reflected optical electromagnetic radiation, and are able to provide a 2D image with distance information for each part (pixel) thereof. Examples of such 3D sensor systems per se include 3D cameras of the time-of-flight/phase mixing device type, from companies such as ifm electronics GmbH, Mesa Imaging AG (Swiss Ranger), Canesta et cetera, or 3D cameras of the structured-light type, from companies such as PrimeSense, Ltd. (sensor for the Kinect™). In particular not included are sensor systems that work by different means, such as ultrasound, or sensor systems that are not able to provide a 2D image with distance/depth information for each part thereof, such as a laser scanner according to EP0360354, or its equivalents, which are not able to provide distance information for each part simultaneously.
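
Purely by way of illustration, and not as part of the application, the output of such a 3D sensor system, i.e. a 2D image with distance information for each pixel, can be thought of as a small array of distance values; the minimal Python sketch below uses made-up numbers:

    import numpy as np

    # Illustrative only: a "2.5D image" as produced by a time-of-flight or
    # structured-light camera is simply a 2D array in which every pixel holds
    # a distance value (here in metres) rather than a brightness value.
    depth_image = np.array([
        [1.20, 1.21, 1.19],
        [0.85, 0.84, 0.86],   # a nearer object shows up as smaller distances
        [1.22, 1.20, 1.21],
    ])

    # Unlike a scanning laser, all pixels are acquired substantially
    # simultaneously, so the whole array represents one moment in time.
    nearest = depth_image.min()
    print(f"nearest surface at {nearest:.2f} m")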

The known system has a disadvantage in that the working range of operating conditions is sometimes too limited. In particular, the range of allowed ambient temperatures is e.g. from 0 °C up to only about 40 °C for the sensor of PrimeSense, Ltd. Such a range is often too narrow for use in an environment where animal related actions are to be performed. After all, animals may be kept in warm countries, such as Mediterranean countries, e.g. Israel, where in animal environments the temperature may rise above said 40 °C, or stay at an elevated level for prolonged times, but they may also be kept in much colder circumstances, down to below freezing point, compromising reliability of the 3D sensor system, and thus of the system for performing the animal related action as a whole. In particular, temperatures below freezing point could give rise to sometimes heavy fogging or even window frost. This is undesirable, as an improperly performed animal related action may cause discomfort or even danger for the animal, and may give rise to a reduced performance and thus higher costs and the like for a user of the system. Furthermore, since these systems use optical electromagnetic radiation, i.e. having a wavelength between 100 nm and 1 mm, and more in particular visual or near-infrared radiation, with a wavelength between about 400 and 1400 nm, they are heavily attenuated by fogging of any part in the optical path. And since the system according to the invention is to be used in an environment that could not only be very warm, but also rather cold and/or humid, such as sub-zero temperatures during winter in cold climates, there is a much larger risk of such fogging than for the known camera/sensor systems, which are to be used in private homes and are almost always kept within a much narrower temperature range.

An object of the invention is to provide a system of the kind mentioned in the introduction, that has an extended working range, and thus an improved reliability and performance.

The invention achieves this object by means of a system according to claim 1, in particular a system for performing an animal related action on an animal, the system comprising a 3D sensor system arranged to determine 3D spatial information of at least one animal part of said animal, an animal related device for performing said animal related action on said animal, a system control for controlling the animal related device to perform said action on the basis of said determined 3D spatial information, wherein said 3D sensor system comprises a source for emitting optical electromagnetic radiation, and a sensor housing of a first material and provided with a window, wherein there is provided in said sensor housing a 3D sensor having a sensor device that is arranged to detect optical electromagnetic radiation that has been reflected off said animal part through said window onto the sensor device, wherein the 3D sensor system has a sensor control that is operatively connected to the sensor device and is arranged to derive said 3D spatial information from said detected optical electromagnetic radiation, wherein said sensor housing further comprises a wall that is at least partially transparent between said window and said sensor device, the wall being positioned and extending such within said sensor housing that there are at least a first space that is limited by the window, the sensor housing and the wall, and a second space opposite the first space with respect to the wall that is limited by the wall and the sensor housing, wherein at least the sensor device is arranged in the second space.

It has been found by the inventors that the system according to the present invention is able to work in an extended temperature range, in particular down to -5 °C and up to 50 °C, with a reliability comparable to existing, known systems. This holds in particular for structured light cameras, more in particular those of the company PrimeSense, Ltd, although other 3D camera types can show similar improvements in working temperature range.

Although the inventors do not wish to be held to an explanation, the effect is believed to be caused by the following. The air inside the housing is warmed by the sensor. Any water vapour content of that air could be given off to a much colder window. Therefore, it is desirable to keep the air close to the window as cool as possible, or better, to keep the warmed air away from the window as much as possible. This ensures that the temperature gradient between the window and the (inside) air is as low as possible. Furthermore, the 3D sensor and/or the source produce heat that needs to be removed lest those parts overheat. However, simply increasing heat conduction, or improving cooling of the sensor system as a whole, does not always lead to the desired result, as this makes the system more susceptible to condensation of water on e.g. the window, in particular during colder or wetter weather. To prevent this, cooling of at least the sensor device has been improved when compared to cooling of the window. Thereto, said wall has been provided, which separates, to a certain degree, the air surrounding the sensor device from the air that is in contact with the window, in each case within the housing. Hereby, it is possible to allow the former air to better cool (at least) the sensor device by moving past the sensor device, while the latter air is more stationary, and so is kept from mixing with the hotter air that could contain more water vapour. The stationary air keeps the window at a (relatively) lower temperature, thus reducing the risk of fogging.

Special embodiments of the invention, as defined in the dependent claims, are described in the following.
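
As a rough, illustrative model of this fogging mechanism, the following sketch estimates the dew point of the warmed inside air with the well-known Magnus approximation; the temperature and humidity figures are assumed for the sake of the example and are not taken from the application:

    import math

    def dew_point_celsius(temp_c, rel_humidity_pct):
        """Magnus approximation of the dew point; adequate for illustration."""
        a, b = 17.62, 243.12
        gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
        return (b * gamma) / (a - gamma)

    # Hypothetical figures: air inside the housing warmed to 35 °C at 40 % RH.
    inside_dew_point = dew_point_celsius(35.0, 40.0)
    window_temp = 5.0   # window tracking a cold ambient temperature

    # Fogging is expected as soon as the window is colder than the dew point of
    # the air touching it, which is why the warmed, more humid air is kept away
    # from the window and the air in the first space is kept cool and stationary.
    print(f"dew point of inside air: {inside_dew_point:.1f} °C")
    print("window fogs" if window_temp < inside_dew_point else "window stays clear")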

In the invention, the window is to be understood as anything that is designed to pass the optical radiation, such as an opening covered with an optically transparent material. The latter may simply be a foil or plate, but may also comprise a lens or other optically active element. Note that there is at least one covered opening, as the housing should be gastight to prevent the ingress of air. Within the housing proper, there may be additional windows, which may be similar, or may be simple openings.

In particular, the first material has a first thermal conductance per area and the wall is made of a second material having a second thermal conductance per area that is lower than the first thermal conductance per area. This makes it possible to optimize the heat dissipation for the sensor proper, while increasing thermal insulation with respect to the window, so that the latter can better be kept cool. A difference in thermal conductance per area may simply be achieved by providing the same material with a different thickness, a larger thickness leading to a lower conductance. However, more in particular, the first material has a first thermal conductivity and the second material has a second thermal conductivity that is lower than the first thermal conductivity. In that case the materials differ in specific thermal conductivity, allowing for example a uniform wall thickness.
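
For illustration, the thermal conductance per unit area can be written as U = k / t, with k the thermal conductivity and t the wall thickness; the sketch below uses typical handbook conductivity values and an assumed 2 mm thickness, not figures from the application:

    # Thermal conductance per unit area U = k / t  [W/(m^2*K)],
    # with k the thermal conductivity and t the wall thickness.
    # Conductivity values are typical handbook figures, for illustration only.
    materials = {
        "aluminium (housing)":    237.0,   # W/(m*K)
        "stainless steel (wall)":  16.0,
        "glass (wall)":             1.0,
        "plastic (wall)":           0.2,
    }

    thickness = 0.002  # 2 mm, an assumed uniform wall thickness
    for name, k in materials.items():
        print(f"{name:24s} U = {k / thickness:10.1f} W/(m^2*K)")

    # The same effect can also be reached with one material and two thicknesses:
    # doubling the thickness halves the conductance per area.
    k_al = 237.0
    print(k_al / 0.002, "vs", k_al / 0.004)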

In embodiments, the first material is substantially a metal material. This type of material has, on average, a high thermal conductivity, allowing for good cooling of the sensor device. Furthermore, it has a very good gas tightness, in particular also for water vapour. This is to be compared with the often used plastic materials, such as in the sensor of PrimeSense, which have a lower thermal conductivity and worse gas/water vapour tightness. It is to be noted that the latter sensors are often used in private homes, which are almost never subject to such low or elevated temperatures or to wet conditions, and thus do not require the measures according to the present invention. In particular embodiments, the first material is substantially aluminium. This is a light and easily machined metal with good oxidation resistance and a high thermal conductivity.

In attractive embodiments, said second material is opaque for the optical electromagnetic radiation and the wall has a second window, or said second material is transparent for the optical electromagnetic radiation. In the former case, many materials are available for use, according to e.g. constructional needs. Even though a second window, i.e. an opening in the material, is required in order not to block the optical path of the radiation, this still allows to some extent a separation of the air in the first space from the air in the second space. After all, the second window need only be as big as the required cross-sectional area of the optical electromagnetic radiation. Furthermore, the second window may be covered with a third material that is transparent to the optical electromagnetic radiation. Alternatively, the wall is made of a second material that is transparent for the optical electromagnetic radiation. In such a case, the first space and the second space may be closed off from each other completely, although that is by no means necessary. Closing off completely allows a maximum thermal insulation between the air in the first space and the air in the second space.

As attractive examples, the second material comprises or substantially is (stainless) steel, a glass or a plastic material. Herein, (stainless) steel has good constructional properties and a much lower thermal conductivity. Glass has a thermal conductivity that is two orders of magnitude lower than that of aluminium. It is furthermore transparent, so that this allows the wall to close off the first space from the second space without obstructing the optical path. The plastic material (or plastic) may be, but need not be, transparent. Plastics have advantages in that they can be light, cheap and easily mouldable. In the above, "transparent" means that a useful amount of the radiation is transmitted, such as 50% or more. Of course, the more radiation is transmitted, the better, but scattering or (specular or diffuse) reflection would be more problematic than mere absorption. Note in particular that transmission may often be improved by means of an antireflection coating. It is also possible to block unwanted radiation, such as radiation outside the band of emitted optical electromagnetic radiation, in order to improve the signal-to-noise ratio. Thereto, it may be advantageous to select an optical filter material for the second material, or to apply e.g. an absorption or antireflection coating, in particular only on the material of the wall or the second window therein, since this coating would then be protected against touching, dirt, moisture and so on. The first window could then be made of a scratch-resistant transparent material, such as quartz, sapphire, borosilicate glass or the like, or be covered with a scratch-resistant coating. This allows more design freedom, as the optical properties relating to absorption/reflection need not be taken into account to such a high degree.

In advantageous embodiments, the second space comprises an air guide means arranged to guide air that rises in the second space towards the sensor device, and in particular away from the first space. More in particular, the air guide means is provided on the wall. Herein, it is noted that air that is warmed by the sensor device will rise within the second space, dragging along cooler air from below. The more the cooling air is concentrated on the part(s) to be cooled, the better the cooling will be. Furthermore, concentrating the path of the cooling air onto the part(s) to be cooled keeps warm air away from the wall, thus keeping the air in the first space at a more stable and relatively lower temperature, as desired. The air guide means may comprise a simple spoiler or other separate part mounted onto the wall, or may be shaped as a protruding part of the wall, such as a thickened wall part or the like.

In all of the above it was stated that "at least the sensor device" was mounted in the second space, was cooled and so on. All of this may also hold for the source of the optical electromagnetic radiation. In embodiments, said source is also mounted in said sensor housing. The source may be provided in the same second space, but is preferably provided in yet another, third space, also thermally insulated from the first and second space. Similarly, the sensor control may be provided in the housing, although it could be possible to provide it in a separate housing and provide a wired or wireless communication between the sensor control and the sensor device and/or source.

Advantageously, said sensor device, and preferably at least one of said source and said sensor control, is/are mounted in a position below the centre of the housing when the 3D sensor system is in an orientation intended for use. The reference frame of the 3D sensor system, which is used to calculate the 3D spatial information, such as position in space, determines the upright position and thus the orientation intended for use. This in turn determines where and how any air guides should be mounted, because the air will of course rise with respect to the vertical. Furthermore, mounting of a part beneath the centre means that the centre (of gravity or geometrical) of the part is below a horizontal plane through the centre (of gravity or geometrical) of the housing. This ensures that there is enough space for air to pass along the part, to ensure good cooling, and enough space to give off the collected heat to the surroundings. It is also advantageous to increase the volume of the housing with respect to the volume of the known devices, in particular the sensor for the Kinect™, which sensor is produced by PrimeSense, Ltd. This also ensures that there is a larger volume of air available, for improved cooling. Thereto, in particular the height is increased.

In embodiments, the system according to the invention comprises a bracket device mounted in the sensor housing, onto which bracket device there is mounted at least the sensor device. This ensures not only that the sensor device is suspended optimally with respect to passing air for cooling, but also improves vibrational decoupling from the environment, such that the sensor, and thus its function of determining the position of an animal or part thereof, is less influenced by any vibrations in the environment, such as kicks against the system as a whole. The bracket may be mounted onto the housing, such as to an outer wall thereof. Preferably, the bracket is mounted onto the wall, i.e. onto the wall between the first and second space. More preferably, the bracket is mounted horizontally. In this way, the flow of air is obstructed to a minimum, and the vibrational decoupling from the environment is at a maximum. In some cases, not only the sensor is mounted on the bracket, but also the source of the electromagnetic radiation. In such a case, the position and orientation of the light source with respect to the sensor device is well-defined, which is beneficial for the accuracy.

In particular embodiments, the bracket device with at least the sensor device mounted thereon is arranged such that it has a resonance frequency within the range of 0.25-20 Hz, preferably within the range of 0.5-5 Hz. This means that the stiffness and dimensions of the bracket are selected such that, in combination with the total weight of the sensor device and everything else on the bracket, there is a vibrational frequency that is less damped than others, causing the bracket device to vibrate for some time after the system as a whole is e.g. kicked. It turns out that this improves the image quality and thus the accuracy of the sensor device even in cases where there actually is a vibration of the bracket. For details, reference is made to our co-pending but non-prepublished application NL2010213, which is incorporated herein by reference.
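
As an illustration only, the resonance frequency of such a bracket-plus-sensor combination can be estimated with the undamped mass-spring relation f = sqrt(k/m)/(2π); the mass and stiffness values below are assumed for the sake of the example and are not taken from the application or from NL2010213:

    import math

    def resonance_frequency_hz(stiffness_n_per_m, mass_kg):
        """Natural frequency of an undamped mass-spring system: f = sqrt(k/m) / (2*pi)."""
        return math.sqrt(stiffness_n_per_m / mass_kg) / (2.0 * math.pi)

    mass = 0.5            # assumed combined mass of sensor device and bracket, kg
    for stiffness in (5.0, 50.0, 500.0):   # assumed effective bracket stiffness, N/m
        f = resonance_frequency_hz(stiffness, mass)
        in_band = 0.5 <= f <= 5.0
        print(f"k = {stiffness:6.1f} N/m -> f = {f:5.2f} Hz  (in preferred 0.5-5 Hz band: {in_band})")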

Furthermore, the invention also relates to the 3D sensor system per se, in particular a 3D sensor system arranged to determine 3D spatial information, comprising a source for emitting optical electromagnetic radiation and a sensor housing of a first material and provided with a window, wherein there is provided in said sensor housing a 3D sensor having a sensor device that is arranged to detect optical electromagnetic radiation that has been reflected through said window onto the sensor device, wherein the 3D sensor system has a sensor control that is operatively connected to the sensor device and is arranged to derive said 3D spatial information from said detected optical electromagnetic radiation, wherein said sensor housing further comprises a wall that is at least partially transparent between said window and said sensor device, the wall being positioned and extending such within said sensor housing that there are at least a first space that is limited by the window, the sensor housing and the wall, and a second space opposite the first space with respect to the wall that is limited by the wall and the sensor housing, wherein at least the sensor device is arranged in the second space. Such a 3D sensor system has of course the same advantages as are provided by the system according to the invention, as described above. Similarly, all special features and embodiments described thus far, and those as claimed in claims 1-9, also apply to the 3D sensor system per se, and thus provide similar special embodiments of the 3D sensor system per se.

Advantageously, at least one of said system control and said sensor control is arranged to detect an animal part, in particular a teat, in said determined 3D spatial information. This makes the system useful in applications like a milking robot or the like. A correct determination of the position of e.g. a teat starts with detecting whether something is a teat. In such a determination, use can be made of any known technique in image processing, such as edge or curvature detection, size comparison and so on. The skilled person will have no trouble in finding such techniques per se. Other animal parts, such as a leg or even the whole body or the position of the back part thereof, may provide useful 3D spatial information. It is noted that it is not necessary to have the 3D sensor system determine 3D spatial information of an animal part. It could also determine positions of a non-animal part, such as a child, an obstacle or a moving vehicle. The inventive features still apply when the spatial information is determined of such non-animal parts, and the determined information is used to control the animal related device.

Furthermore, besides detecting the animal part, at least one of the said controls is preferably arranged to determine a position of said animal part such as a teat from the 3D spatial information. Based on such information, the control can arrange for an animal related action to be performed, such as connecting a teat cup. Therefore, in embodiments, the animal related device comprises a robot arm arranged to be moved to a detected animal part, in particular said teat, on the basis of said determined 3D spatial information.

The invention will now be elucidated by means of a number of embodiments shown in the drawings, in which:

Fig. 1 shows very diagrammatically in a perspective view a system according to the invention;

Fig. 2 shows in a diagrammatic cross-sectional side view a 3D sensor system according to the invention;

and

Fig. 3 shows in a diagrammatic cross-sectional top view another 3D sensor system according to the invention.

Fig. 1 diagrammatically shows a perspective view of a system 1 for performing an animal related action on an animal 2 in a milking box 3, according to the invention.

The system 1 comprises a milking robot 4 with a robot arm 5 with a gripper 6 for connecting a teat cup 7 to a teat 8 of an udder 9 of the animal 2. A leg of the animal is denoted by reference numeral 10.

The robot arm 5 carries a 3D sensor system 11, and the system 1 also comprises a system control 12.

The system 1 as shown here is arranged to perform a milking action on the animal 2. Thereto, the milking robot 4 connects one or more teat cups 7 to the teats 8 of the animal 2 in a milking box 3 by means of the robot arm 5 with the gripper 6. To be able to do so, the system 1 has a teat detection system, in this case a 3D sensor system 11, which is mounted on the robot arm 5. Note that the 3D sensor system 11 could also be mounted on any other part of the system 1, such as the housing of the milking robot 4, the gripper 6 or the milking box 3.

The 3D sensor system 11 will form a 3D-image of at least the part of the milking box 3 containing the relevant part(s) of the animal 2, in particular the udder 9 and/or the teats 8. Also important are the legs 10, in order to prevent a collision therewith. The 3D-image as obtained by the 3D sensor system 11 is in fact in most cases a 2D collection of distance information, i.e. the image is built up as a 2D array of distance values as determined with the 3D sensor system 11. All this is per se known in the art, and reference is made, among others, to 3D imaging handbooks. On the basis of the obtained 3D-image (also known as a 2,5D-image), the system control 12 is able to control the robot arm 5 and the gripper 6 to connect the teat cup 7 to the teats 8.
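
As a minimal sketch of how such a 2D array of distance values can be converted into a target position for a robot arm, the following back-projects one pixel with a generic pinhole camera model; the intrinsic parameters and the detected pixel are assumed values, and this is not the actual processing performed by the system control 12:

    import numpy as np

    def pixel_to_camera_coords(u, v, depth_m, fx, fy, cx, cy):
        """Back-project one pixel of a 2.5D depth image to a 3D point in the
        camera frame, using a simple pinhole model (illustration only)."""
        x = (u - cx) * depth_m / fx
        y = (v - cy) * depth_m / fy
        return np.array([x, y, depth_m])

    # Hypothetical intrinsics and a detected teat pixel with its measured distance.
    fx = fy = 580.0
    cx, cy = 320.0, 240.0
    teat_pixel = (350, 200)
    teat_depth = 0.45   # metres, taken from the depth array at that pixel

    target = pixel_to_camera_coords(*teat_pixel, teat_depth, fx, fy, cx, cy)
    print("target for the robot arm, in camera coordinates [m]:", target)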

Of course, other animal related actions could also be performed on the basis of a 3D-image from the 3D sensor system 11, such as teat preparation, cleaning and stimulation, the application of a treatment fluid, inspection of a teat or another part of the animal, and so on.

After the image has been obtained by the 3D sensor system 11, it is processed in order to detect relevant objects in the image, and to determine the position of such objects with respect to the 3D sensor system 11. Thereto, the image is analyzed by means of techniques that are well known per se, such as averaging and other noise reducing techniques, edge detection and comparison of curves and dimensions with predetermined reference values. All such techniques are deemed to be known to the skilled person, and details thereof will not be mentioned here as they are not part of the invention proper.
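
A minimal sketch of such a processing chain is given below, on synthetic data and with arbitrary thresholds and reference values; it only illustrates the kind of averaging, segmentation and size comparison mentioned above and is not the actual detection algorithm of the system:

    import numpy as np

    # Illustration only: smooth a synthetic 2.5D image, segment pixels that are
    # markedly nearer than the background, and accept a candidate object only
    # if its apparent width matches an assumed reference range.
    depth = np.full((60, 80), 1.20)          # background at 1.20 m
    depth[20:40, 35:45] = 0.80               # a nearer, roughly teat-sized object
    depth += np.random.normal(0, 0.005, depth.shape)   # sensor noise

    # Noise reduction by simple 3x3 averaging.
    smooth = sum(np.roll(np.roll(depth, i, 0), j, 1)
                 for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0

    # Segment everything at least 0.2 m nearer than the median background depth.
    mask = smooth < (np.median(smooth) - 0.2)

    # Crude size check against predetermined reference values (in pixels).
    cols = np.where(mask.any(axis=0))[0]
    width = cols.max() - cols.min() + 1 if cols.size else 0
    print("candidate accepted" if 5 <= width <= 20 else "rejected", f"(width {width} px)")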

Fig. 2 diagrammatically shows a cross-sectional side view of a 3D sensor system according to the invention.

The system 11 is mounted on a robot arm 5, and comprises a sensor housing 13 with a first window 14 and a wall 15, having an upper wall part 15-1 and a lower wall part 15-2 with a second window 16 therebetween, separating a first space 17 from a second space 18. A bracket is indicated with reference numeral 19, onto which a sensor device 20 is mounted, that is connected to a sensor control 30, from which a cable 21 leads to the outside. A first air guide has been indicated by reference numeral 22.

The sensor device 20 has a field of view 12. This is determined by the dimensions and relative position of the first window 14 as well as the second window 16, and by optics, not shown here. Note that possible filters or the like are not shown here either, nor is the source of electromagnetic radiation. This will be elucidated in connection with Fig. 3.

The sensor device 20 is shown mounted on a bracket 19. This allows air to move freely from below the sensor device 20 along the sensor device and upward in the second space 18. The sensor device 20 is mounted below the centre of the housing 13 in order to provide it in a relatively cool part of the second space 18. Furthermore, the lower air is guided towards the sensor device 20 by means of the first air guide 22, which is formed as a kind of spoiler. This also ensures that the rising air will not, or at least to a much lower extent, enter the first space 17. Hereby, it is ensured that the first space 17 will not be flushed by the hotter air that circulates in the second space 18. Thus, the average temperature in the first space 17, and also of the wall part surrounding it, will be relatively lower. As a result, the air that is present in the first space 17 will have a relatively low temperature with respect to the window 14, preventing condensation of water vapour as much as possible. To aid this further, the wall 15, which is shown here as an upper wall part 15-1 and a lower wall part 15-2, which may however be connected around the second window 16 into a unitary wall, is preferably made of a material with a low thermal conductivity, such as stainless steel or a plastic. Note furthermore that the second window 16 shown here is a simple opening. However, the opening may also be covered by a transparent material, such as glass or a suitable plastic material. The latter possibility would further improve the thermal insulation between the first space 17 and the second space 18.

The housing 13 is preferably made of a metal such as aluminium, which has a high thermal conductivity and is able to remove heat that is transported by the rising air as much as possible. The first window 14 can be made from any transparent material, which is preferably scratch proof, such as quartz or sapphire. However, cheaper solutions, such as suitably coated glass are also possible.

The sensor device 20 is operatively connected to the sensor control 30 in order to supply a 3D-image (better: 2,5D-image), which image contains depth information that can be processed by the sensor control 30 into relevant information for any device for performing an animal related action, such as a robot arm carrying a teat cup or teat brush or the like.

As a sidestep, it is noted that it is possible to design the bracket 19 with the sensor device 20 mounted thereon such that the combination has a resonance frequency of between about 0.25 and 20 Hz. Surprisingly, this aids in achieving a better image quality, at least with respect to certain types of the sensor device 20, in particular devices that use so-called structured-light cameras, such as 3D cameras marketed by the company PrimeSense, Ltd., in particular those as used in the Kinect™. Note, however, that other types of 3D sensor devices, such as time-of-flight cameras or phase mixing devices, are also possible.

Fig. 3 diagrammatically shows a cross-sectional top view of another 3D sensor system according to the invention. Herein, similar parts are denoted by the same reference numerals.

In this embodiment, the 3D sensor system 11 comprises a housing 13 in which there are provided a first window 14 and a third window 27. Furthermore, a wall 15 partitions the internal space of the housing 13 into a first space 17 and a second space 18. The wall 15 comprises a second window 16 and a fourth window 26.

Also shown are a sensor device 20 on a first bracket 19 and a source 24 on a second bracket 23, emitting a beam of electromagnetic radiation 25. Furthermore, a first air guide 22 and a second air guide 28 are also mounted on the wall 15. A sensor control is denoted 30, and a cable is denoted 21.

The 3D sensor system as shown here comprises a source 24 that is able to emit electromagnetic radiation, such as visual optical radiation or infrared optical radiation, in particular near infrared (NIR) radiation. Possible sources are incandescent lamps or, in particular, LEDs. Not shown are any optical devices such as mirrors or lenses to form a beam of radiation 25. The radiation is emitted through the fourth window 26 and the third window 27, which are both made of a material that is transparent to the radiation used. Note that, in case the radiation is near infrared radiation, the material need not be visually transparent. Furthermore, it is also possible to provide the fourth window 26 as an opening in the wall 15.

The emitted radiation 25 will be reflected by objects in the beam. Part of this reflected radiation will be detected by the sensor device 20, after having passed the first window 14 and the second window 16. The sensor device 20 can be a 2D array of radiation detectors, such as photodiodes, CMOS detectors or a CCD camera. The control and synchronization of the source 24 and the sensor device 20 depend on the type of the distance detection/3D technology used, but could e.g. be of the structured-light type, such as is used in cameras marketed by the company PrimeSense, Ltd. Alternatively, the 3D sensor system could be of the time-of-flight or phase mixing device type. The control of the source 24 and the sensor device 20 is performed by means of the sensor control 30. Sensor control 30 may also be arranged to receive and process the 3D-image as detected by the sensor device 20. Processing is according to rules known per se, always adapted to the intended use of the system 11. For example, object detection could be focused on the detection of specific parts of an animal, such as a teat. The information obtained from analyzing the 3D spatial information as received by the sensor device 20, e.g. in the form of control instructions, may be sent to another part of a system for performing an animal related action, not shown further here, via the cable 21 or a similar device, such as a wireless link.

The source 24 and the sensor device 20 are shown here as mounted on respective brackets 23 and 19, in a side-by-side fashion. Alternatively, the parts may be mounted one above the other, although this is less advantageous in view of heat control and rising air currents. In this case, respective air guides 22 and 28 are provided to direct air to the respective part to be cooled, viz. the sensor device 20 and the source 24. Note that it is not necessary to provide both parts with a respective air guide. Furthermore, it is noted that both the source and the sensor device are provided in the same second space 18. It is also possible to provide an additional wall between the source 24 and the sensor device 20. In all cases, it can be advantageous to assemble the system 11 in an environment with a relatively low humidity, more in particular a low water content in the air. After sealing the housing 13 of the system 11, a relatively low amount of water vapour will then be present in the internal space, such as the first space 17 and the second space 18. Such an environment with low humidity can be achieved with low temperatures and dry air, or even with a specific climate room for assembling the system 11.

The embodiments shown and described here are not intended to be limiting. Rather, the scope of the present invention is determined by the attached claims.