

Title:
PROXIMITY SENSOR AND CORRESPONDING DISTANCE MEASURING METHOD
Document Type and Number:
WIPO Patent Application WO/2018/065757
Kind Code:
A1
Abstract:
A proximity sensor device for measuring a distance to an object, the device comprising: a light emission unit configured to direct measurement light onto a surface of the object; a light detection unit configured to detect light reflected off the surface and generate a detection signal; an image sensor configured to obtain an image of the surface; a control unit configured to determine a reflection characteristic of the surface from the image and to calculate the distance to the object on the basis of the reflection characteristic and the detection signal.

Inventors:
ALTHOEFER KASPAR (GB)
KONSTANTINOVA JELIZAVETA (GB)
Application Number:
PCT/GB2017/052935
Publication Date:
April 12, 2018
Filing Date:
September 29, 2017
Assignee:
UNIV LONDON QUEEN MARY (GB)
KING'S COLLEGE LONDON (GB)
International Classes:
G01B11/02; B25J15/00; G01S17/08
Foreign References:
US20120119091A1 (2012-05-17)
US20150204652A1 (2015-07-23)
Other References:
HSIAO, Kaijen et al.: "Reactive grasping using optical proximity sensors", 2009 IEEE International Conference on Robotics and Automation (ICRA), Kobe, Japan, 12-17 May 2009, pages 2098-2105, XP031510083, ISBN: 978-1-4244-2788-8
BIMBO, Joao; RODRIGUEZ-JIMENEZ, Silvia; LIU, Hongbin; SONG, Xiaojing; BURRUS, Nicolas: "Object pose estimation and tracking by fusing visual and tactile information", IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, 2012, pages 65-70, XP032470254, DOI: 10.1109/MFI.2012.6343019
GOEGER, Dirk; BLANKERTZ, Matthias; WOERN, Heinz: "A tactile proximity sensor", Proceedings of IEEE Sensors, 2010, pages 589-594, XP031978014, DOI: 10.1109/ICSENS.2010.5690450
HASEGAWA, Hiroaki; MIZOGUCHI, Yoshitomo; TADAKUMA, Kenjiro; ISHIKAWA, Masatoshi; SHIMOJO, Makoto: "Development of intelligent robot hand using proximity, contact and slip sensing", IEEE International Conference on Robotics and Automation, 2010, pages 777-784
HSIAO, Kaijen; NANGERONI, Paul; HUBER, Manfred; SAXENA, Ashutosh; NG, Andrew Y.: "Reactive grasping using optical proximity sensors", IEEE International Conference on Robotics and Automation, 2009, pages 2098-2105, XP031510083
KONSTANTINOVA, J.; STILLI, A.; ALTHOEFER, K.: "Force and Proximity Fingertip Sensor to Enhance Grasping Perception", IEEE/RSJ International Conference on Intelligent Robots and Systems, 2015, pages 1
LEE, Hyung-Kew; CHANG, Sun-Il; YOON, Euisik: "A Capacitive Proximity Sensor in Dual Implementation with Tactile Imaging Capability on a Single Flexible Platform for Robot Assistant Applications", 19th IEEE International Conference on Micro Electro Mechanical Systems, January 2006, pages 606-609, XP010914321, DOI: 10.1109/MEMSYS.2006.1627872
LIN, Tzu-Yang; CHAO, Paul C.-P.; CHEN, Wei-Par; TSAI, Che-Hung: "A novel 3D optical proximity sensor panel and its readout circuit", IEEE Sensors, 2010, pages 108-113, XP031978212, DOI: 10.1109/ICSENS.2010.5690835
MALDONADO, Alexis; ALVAREZ, Humberto; BEETZ, Michael: "Improving robot manipulation through fingertip perception", IEEE/RSJ International Conference on Intelligent Robots and Systems, 2012, pages 2947-2954, XP032287406, DOI: 10.1109/IROS.2012.6385560
TEGIN, Johan; WIKANDER, Jan: "Tactile sensing in intelligent robotic manipulation - a review", Industrial Robot: An International Journal, vol. 32, 2005, pages 64-70
WON, Jae-Yeon; RYU, Hyunsurk; DELBRUCK, Tobi; LEE, Jun Haeng; HU, Jiang: "Proximity Sensing Based on Dynamic Vision Sensor for Mobile Devices", IEEE Transactions on Industrial Electronics, vol. 62, 2014, pages 536-544, XP011568371, DOI: 10.1109/TIE.2014.2334667
Attorney, Agent or Firm:
J A KEMP & CO (GB)
Claims:
CLAIMS

1. A proximity sensor device for measuring a distance to an object, the device comprising:

a light emission unit configured to direct measurement light onto a surface of the object;

a light detection unit configured to detect light reflected off the surface and generate a detection signal that varies as a function of the distance to the surface;

an image sensor configured to obtain an image of the surface;

a control unit configured to determine a reflection characteristic of the surface from the image and to calculate distance to the object on the basis of the reflection characteristic and the detection signal.

2. A device according to claim 1 wherein the reflection characteristic is color.

3. A device according to claim 1 or 2 wherein the reflection characteristic is reflectiveness.

4. A device according to claim 1, 2 or 3 wherein the control unit comprises a memory configured to store a plurality of calibration curves relating value of the detection signal to distance and a selection unit configured to select one of the calibration curves on the basis of the reflection characteristic.

5. A device according to claim 4 wherein at least one of the calibration curves is a second order power model.

6. A device according to any one of the preceding claims wherein the light emission unit comprises a light source and an emission optical fiber configured to guide light emitted by the light source to an emission point.

7. A device according to any one of the preceding claims wherein the light detection unit comprises a light detector and a detection optical fiber configured to guide light from a reception point to the light detector.

8. A robotic handling system comprising a grasper configured to grasp an object and a proximity sensor device according to any one of the preceding claims and configured to measure a distance to the object.

9. A system according to claim 8 wherein the proximity sensor device is according to claim 6 or 7 and the emission point and/or the reception point are located on the grasper.

10. A system according to claim 9 wherein the grasper is a robotic hand having a plurality of fingers and the emission point and/or the reception point are located on one of the fingers.

11. A system according to claim 10 wherein there are a plurality of proximity sensor devices each having an emission point and/or a reception point on a respective one of the fingers.

12. A system according to claim 11 wherein a single camera is the camera of the plurality of proximity sensor devices.

13. A system according to claim 11 or 12 wherein a single control unit is the control unit of the plurality of proximity sensors.

14. A robotic handling system comprising:

a grasper configured to grasp an object, the grasper having at least one finger;

a proximity sensor having a light source, a sending optical fiber to couple light from the light source to an emission point on the finger, a light sensor to generate a sensor signal and a receiving optical fiber to couple light from a receive point on the finger to the light sensor;

a camera configured to image the object; and

a controller comprising a memory storing a plurality of calibration curves and a processor configured to determine a color of the object from the image, to select a calibration curve on the basis of the color and to determine a distance to the object on the basis of the sensor signal and the selected calibration curve.

15. A method of measuring the distance to an object, the method comprising:

emitting light towards the object;

receiving light reflected by the object;

measuring the intensity of the received light;

obtaining an image of the object;

determining a reflective characteristic of the object from the image; and calculating a distance to the object on the basis of the intensity and the reflective characteristic.

Description:
PROXIMITY SENSOR AND CORRESPONDING DISTANCE MEASURING METHOD

FIELD OF THE INVENTION

[ 0001 ] The present invention relates to proximity sensors, in particular to proximity sensors that can be integrated in robotic handling devices and to methods of calibration of proximity sensors.

BACKGROUND

[ 0002 ] Proximity and distance estimation sensors can be used in robotic hands to improve grasping during grasp planning, grasp correction and in-hand manipulation.

[ 0003 ] Multi-fingered robotic hands capable of sensing proprioceptive information from an object are being developed [Tegin2005]. Proprioceptive information and haptic perception enable an expanded understanding of an object to be grasped. Consequently, object grasping and handling can be performed more efficiently by adjusting the finger posture according to the physical characteristics, position or orientation of an object. Therefore, there is a pressing need to develop sensing solutions for robotic hands.

[ 0004 ] Tactile perception during grasping leads to improved quality of manipulation for the cases when the sensing device is in contact with an object. Therefore, grasp configuration can be adjusted based on haptic information and dynamic characteristics, such as contact force, slip and friction.

[ 0005 ] Grasp planning is often performed relying on the information obtained from the external vision system of the robot. However, such an approach can lead to problems in scenarios where the fingers of the robot partially or fully occlude a target object. Therefore, it might be challenging to execute a stable grasp and further to perform in-hand manipulation because of incomplete spatial information.

[ 0006] Therefore, to solve the problem of grasp planning, tactile data is often combined with visual information. For instance, in [Bimbo2012] the tactile information from 6-dimensional force and torque sensors is fused with visual information from an external camera, and the estimated shape of an object is reconstructed. Proximity sensing can be used to improve the efficiency of manipulation by a multi-fingered robotic hand. It can be used during initial pre-shaping of the hand; for posture readjustment, when the grasp is almost complete; or during in-hand manipulation of an object. Proximity information can help to enhance the information about the position of an object and to combine position information with the data from the external vision system.

[ 0007 ] In general, proximity sensors can be based on various measurement principles, such as magnetic, capacitive, inductive or optical. The use of inductive sensors is limited to metal objects only. Similarly, magnetic proximity sensors can be applied to magnetic materials only. Capacitive proximity sensors are suitable for use with various types of materials, and are broadly used in industrial applications. Capacitive sensors can be used to combine proximity measurement with tactile sensing, as in [Lee2006]. Proximity and tactile sensors based on the capacitive sensing principle were developed in [Goeger2010], where an electrode made from conductive silicone rubber is used for proximity sensing. However, that sensor does not measure the exact distance to the object and detects proximity only; in addition, the size of the prototype is not suitable for integration in robotic grippers.

[ 0008 ] Alternatively, optical proximity sensors are used in various robotic applications. A combined optical proximity and hardness sensor employing a combination of light-emitting diodes (LEDs) and photodiodes was proposed in [Lin2010]. The use of LEDs improves the sensor's robustness to external light sources; however, such a design might be difficult to miniaturize and can be used to detect the proximity of an object, but not the exact distance.

[ 0009] One of the common problems of optical proximity and distance sensors is the dependence of measurements on environmental factors, such as ambient light brightness. To solve this problem, a proximity sensor for smartphones was developed in [Won2014]. It uses a dynamic vision system and has the capability to compensate for the interference from external light sources. However, due to the intended application to detect skin presence, the sensor does not take into account the reflective properties or color of an object.

[ 0010 ] Proximity sensors have been integrated into robotic hands. The fingertips of a Barrett Hand were enhanced with optical proximity sensors in [Hsiao2009] to implement reactive grasping using probabilistic modeling. The object's orientation and distance were estimated; however, the average error of the estimation was around 4 mm. In addition, the sensors were based on an emitter and photo-receiver that are quite large, so their number per fingertip is limited. In [Maldonado2012], the manipulator TUM-Rosie was equipped with proximity sensors integrated into its fingertips. In addition, an integrated camera for close-distance surface texture recognition and optical-flow detection with an external 3D camera were used to recognize slip. Work described in [Hasegawa2010] presents another good example of a fingertip that detects several types of information - tactile data, slip and proximity - with the help of photo-reflectors and pressure-sensitive conductive rubber.

SUMMARY

[0011] An aim of the invention is to provide an optical proximity sensor having improved performance, in particular by taking into account surface properties of an object, and decreasing the error during estimation of the distance.

[0012] According to the present invention, there is provided a proximity sensor device for measuring a distance to an object, the device comprising:

a light emission unit configured to direct measurement light onto a surface of the object;

a light detection unit configured to detect light reflected off the surface and generate a detection signal that varies as a function of the distance to the surface;

an image sensor configured to obtain an image of the surface;

a control unit configured to determine a reflection characteristic of the surface from the image and to calculate distance to the object on the basis of the reflection characteristic and the detection signal.

[0013] In an embodiment, the reflection characteristic is color and/or reflectiveness.

[0014] In an embodiment, the control unit comprises a memory configured to store a plurality of calibration curves relating value of the detection signal to distance and a selection unit configured to select one of the calibration curves on the basis of the reflection characteristic.

[0015] In an embodiment, at least one of the calibration curves is a second order power model.

[0016] In an embodiment, the light emission unit comprises a light source and an emission optical fiber configured to guide light emitted by the light source to an emission point.

[0017] In an embodiment, the light detection unit comprises a light detector and a detection optical fiber configured to guide light from a reception point to the light detector.

[0018] According to the present invention, there is provided a robotic handling system comprising a grasper configured to grasp an object and a proximity sensor device as described above and configured to measure a distance to the object.

[0019] In an embodiment, the emission point and/or the reception point are located on the grasper.

[0020] In an embodiment, the grasper is a robotic hand having a plurality of fingers and the emission point and/or the reception point are located on one of the fingers.

[0021] In an embodiment, there are a plurality of proximity sensor devices each having an emission point and/or a reception point on a respective one of the fingers.

[ 0022 ] In an embodiment, a single camera is the camera of the plurality of proximity sensor devices.

[ 0023 ] In an embodiment, a single control unit is the control unit of the plurality of proximity sensors.

[ 0024 ] According to the present invention, there is provided a robotic handling system comprising:

a grasper configured to grasp an object, the grasper having at least one finger;

a proximity sensor having a light source, a sending optical fiber to couple light from the light source to an emission point on the finger, a light sensor to generate a sensor signal and a receiving optical fiber to couple light from a receive point on the finger to the light sensor;

a camera configured to image the object; and

a controller comprising a memory storing a plurality of calibration curves and a processor configured to determine a color of the object from the image, to select a calibration curve on the basis of the color and to determine a distance to the object on the basis of the sensor signal and the selected calibration curve.

[ 0025 ] According to the present invention, there is provided a method of measuring the distance to an object, the method comprising:

emitting light towards the object;

receiving light reflected by the object;

measuring the intensity of the received light;

obtaining an image of the object;

determining a reflective characteristic of the object from the image; and

calculating a distance to the object on the basis of the intensity and the reflective characteristic.

[ 0026] The sensing system and the algorithm of an embodiment of the invention are based on the following components:

• a fiber optical proximity sensor integrated in the fingertip;

• an external vision system used to obtain color of an object; and

• a calibration algorithm that uses the color information from the camera to select an appropriate calibration.

[ 0027 ] A particularly advantageous embodiment of the invention provides a fiber optical proximity sensor that is integrated with a tactile sensing fingertip of a robotic hand of a mobile robot. The distance estimation of a proximity sensor can typically be influenced by the reflective properties of an object, such as color or surface roughness. With the approach of an embodiment, the accuracy of the proximity sensor is enhanced using the information collected by the vision system of the robot. A camera is employed to obtain the RGB values of the object to be grasped. Then, the data obtained from the camera is used to select the correct calibration for the proximity sensor. Experimental evidence shows that the invention can be effectively used to reduce the distance estimation error.
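The selection-then-evaluation approach described above can be sketched as follows. The data layout, assigned colors and power-model coefficients are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the color-selected calibration pipeline: each
# calibration pairs an assigned RGB with coefficients of a power model
# D = a * V**b + c. All values below are illustrative only.

CALIBRATIONS = [
    {"rgb": (255, 255, 255), "coeffs": (30.0, -0.5, 0.0)},  # bright surface
    {"rgb": (0, 0, 0), "coeffs": (60.0, -0.5, 0.0)},        # dark surface
]

def estimate_distance(voltage, rgb, calibrations=CALIBRATIONS):
    """Select the calibration whose assigned RGB is nearest to the
    measured RGB (Euclidean), then evaluate its curve on the voltage."""
    nearest = min(calibrations,
                  key=lambda cal: sum((m - a) ** 2
                                      for m, a in zip(rgb, cal["rgb"])))
    a, b, c = nearest["coeffs"]
    return a * voltage ** b + c
```

The same voltage reading thus yields different distance estimates depending on the color the camera reports, which is the core of the method.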

BRIEF DESCRIPTION OF THE DRAWINGS

[ 0028 ] The invention is described below with reference to exemplary embodiments and the accompanying drawings, in which:

Figure 1 illustrates an embodiment of the invention based on a mobile robotic platform;

Figure 2 illustrates a fingertip of a robotic hand with integrated proximity and tactile sensors;

Figure 3 illustrates the working principle of a fiber optical proximity sensor;

Figure 4 is a flowchart of the distance estimation algorithm using a fiber optical proximity sensor and an external camera;

Figure 5 is a graph of calibration curves obtained using average sensor response;

Figure 6 is a graph of selected voltage responses used for calibration in the calibration space;

Figure 7 is a graph of obtained color code mapped to voltage responses from the proximity sensor;

Figure 8 is a graph of the results of distance estimation using the calibration algorithm relative to a linear reference line (x=y); and

Figure 9 is a graph of the results of distance estimation without the use of the calibration algorithm and without an external camera relative to the linear reference line (x=y).

[ 0029] In the various figures, like parts are denoted by like references.

DETAILED DESCRIPTION OF EMBODIMENTS

[ 0030 ] Figure 1 illustrates an exemplary embodiment of the invention, comprising a mobile robotic platform 1 that makes use of a camera 30 to detect the color of an object 2 to be grasped with its robotic hand 11. The hand 11 is endowed with a fiber optic proximity sensor 20 to estimate the distance between the hand 11 and the object 2. The color information collected by the camera is used to adjust in real time the calibration curve of the proximity sensor 20. The fiber optical proximity sensor 20 is integrated in the fingertip 12 of the robotic hand 11 as shown in Figure 2.

[ 0031 ] The proximity sensor 20 (Fig. 2) includes a pair of optical fibers 21, 22. In an embodiment of the fingertip sensor, the terminals 21a, 22a of the fibers are located in the center of the surface of the fingertip 12. The proximity sensor is conveniently located between tactile sensors 12a, but other locations are possible. The measurements are performed based on the principle of light intensity modulation. The surface of a target object 2 is used as a reflector. No material, even a dark one, absorbs light completely; therefore, part of the light beam sent by the emitting fiber 21 is reflected and can be collected by the receiving fiber 22. This is the basic principle used to detect the presence of objects in proximity to the fingertip, as well as to estimate the distance from the sensor to the object's surface. A fiber optical converter is used to generate the light beam and to measure the intensity of the reflected light beam. The light intensity measurement is converted into a voltage in the range 0 V to 4 V. Any suitable light source can be used, for example a laser diode or light emitting diode. The light source can be monochrome, multi-colored or broadband.
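As a toy numerical illustration of the intensity-modulation principle (not the patent's actual optics), one can model the received intensity as scaling with surface reflectivity and falling with distance, with the converter clamping the result to its 0 V to 4 V output range. The inverse-square falloff and the gain constant are assumptions for illustration only.

```python
# Toy model of light intensity modulation: reflected intensity grows with
# surface reflectivity and falls with distance; the fiber optical converter
# reports it as a voltage clamped to 0-4 V. The falloff law and gain are
# illustrative assumptions, not taken from the patent.

def converter_voltage(distance_mm, reflectivity, gain=100.0):
    intensity = reflectivity * gain / distance_mm ** 2
    return max(0.0, min(4.0, intensity))  # clamp to the converter's range
```

Note that the same voltage can arise from a near dark object or a far bright one, which is why a single calibration curve is insufficient and surface information from the camera is needed.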

[ 0032 ] Each optical fiber is 1 mm in diameter and there is a gap of 0.4 mm between the sending and receiving fibers due to the external fiber coating. The sent light is projected onto a small point-like region of the target object located at a short distance. It is assumed that the target surface of the object is orthogonal to the finger. Further details of the structure of the proximity sensor and its function are given in [Konstantinova2015], which document is hereby incorporated by reference.

[ 0033 ] To improve the estimation of the distance to an object using the fiber optical proximity sensor, color detection is used. It is assumed that the location of the target object is fixed or known, and is defined using an external vision system of the robot by means of a segmentation algorithm.

[ 0034 ] As shown in Figure 4, the proximity sensor 20 detects the intensity of light reflected by the object 2 whilst the camera 30 obtains a representation of the color of the object. A calibration matrix 41, which has been obtained from a set of calibration objects, is used to determine the estimate 42 of the distance to the object using the intensity and the representation of the color of the object.

[ 0035 ] In order to select the correct calibration curve of the proposed proximity sensor, the color information is used as a visual descriptor, as it is the easiest way to detect and segment an object from an image. There are different color models, such as RGB, YUV, CMY, CMYK and HSV. The choice of color model is described below.

[ 0036] A real-time image processing algorithm takes as input raw images from a commercially available USB camera that has a VGA resolution of 640×480, a frame rate of 30 fps, a viewing angle of 56° and a focal length from 2 cm to 10 cm. This USB camera was selected due to its affordability and broad use in robotic applications. It is compact, measuring only 3.6 cm in length and 0.75 cm in diameter. As the camera is used only to obtain visual information in RGB, the approach can be generalized to other image sensors, including more expensive HD or even 4K solutions, which provide even more accurate results.

[ 0037 ] To perform the calibration of the proximity sensor for the target object, our algorithm uses as input the color coefficients obtained from the external camera (Fig. 3). The RGB color code is determined prior to the grasping action and to the occlusion of the object. In addition, the RGB values are obtained recurrently after the first measurement. This is done to compensate for sudden changes of lighting that might influence the reflective properties of an object.

[ 0038 ] This section describes the calibration procedures carried out for the fiber optical proximity sensor. As a first step, the voltage response caused by light reflection from selected objects with different properties is recorded.

[ 0039] As the reflective properties of an object also depend on its surface features, 21 calibration objects with rough, glossy and medium-reflectiveness surfaces were selected. In particular, three different materials were used - carton paper, shiny plastic and rough plastic. For each group of materials, objects of seven colors were selected. The calibration objects were chosen to representatively cover the wavelengths of the visible color spectrum, from 390 nm to 800 nm. The calibration procedure was performed using a motorized linear slide. The calibration object was fixed 50 mm from the proximity sensor, and was moved towards the sensor at a speed of 1 mm/s. Voltage from the converter was recorded along with the displacement information. In total, five trials were executed for each object.

[ 0040 ] In order to perform calibration according to the obtained color code, RGB values were obtained for the calibration set of objects. During this test, each object was fixed in front of the camera. The colored surface of the object to be analyzed was located parallel to the lens surface of the camera at a distance of 10 cm, chosen based on the maximum focal distance of the camera used. The algorithm uses a region of interest (RoI) of 60×60 pixels, equal to 1 cm of the real surface. The centroid (the center of the pixel area) was located in the center of the image with boundaries within the area of the displayed object in the image. The number of pixels was chosen to provide significant color information and to have a surface image wide enough to reject local color imperfections of the real surface, like scratches or non-uniform color distribution on the material surface. Hence, 3600 pixels were evaluated. The pixels of the RoI were then used to compute the coefficients of the color model representing the color mean descriptor of the object.
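The RoI averaging step described above might be sketched as follows, assuming the image is a row-major list of rows of (R, G, B) tuples; the layout and function name are illustrative, not from the patent.

```python
# Mean color descriptor over a centred 60x60 region of interest (RoI),
# as described in the text. The image is assumed to be a row-major list
# of rows of (R, G, B) tuples; this layout is an illustrative assumption.

def mean_rgb_roi(image, roi=60):
    h, w = len(image), len(image[0])
    top, left = (h - roi) // 2, (w - roi) // 2
    pixels = [image[r][c]
              for r in range(top, top + roi)
              for c in range(left, left + roi)]
    n = len(pixels)  # 3600 pixels for a 60x60 RoI
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))
```

Averaging over 3600 pixels rejects local imperfections such as scratches, as the text notes, at the cost of blurring any genuine color variation inside the RoI.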

[ 0041 ] This section presents the outline of the calibration method. The previous design of the fingertip sensor estimated proximity information based on an average calibration curve obtained from controlled distance measurements of only 10 different objects. In this part we show the results obtained using the average curve calibration method; further on, this data is used for comparison with the algorithm that uses color recognition. Fig. 5 shows the voltage responses obtained from the calibration set of objects and the calculated average curve. A second order power model is fitted to this curve and is used to describe the performance of the proximity sensor:

D = 0.74 * V^(-0.1808) - 17.28,    (1)

where D is the estimated distance to an object, and V is the voltage response obtained from the sensor. Based on this calibration, the maximum deviation between the fitted line and the average response corresponds to a distance of 2 mm. The measurement range of the sensor is 20 cm. However, the average calibration method produces larger errors for extreme voltage responses. Our proposed method reduces this error, as multiple calibration curves are used, as described in the next part.
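A fit of this form could be obtained with a general-purpose least-squares routine such as scipy.optimize.curve_fit. As a dependency-free sketch (an assumption, not the patent's actual fitting procedure), one can fix the offset c and log-linearise the model:

```python
import math

# Least-squares fit of D = a * V**b + c with the offset c fixed, by
# log-linearising: log(D - c) = log(a) + b * log(V). A sketch only;
# the patent does not specify its fitting procedure.

def fit_power_model(voltages, distances, c=0.0):
    xs = [math.log(v) for v in voltages]
    ys = [math.log(d - c) for d in distances]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Ordinary least squares for slope b and intercept log(a)
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b
```

On noiseless data generated from a known power law, the routine recovers the coefficients; on real sensor data, c would itself have to be estimated, e.g. by a one-dimensional search over candidate offsets.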

[ 0042 ] Table 1: Calibration equations used for fiber optical proximity sensor.

[ 0043 ] Table 2: RGB values assigned to each calibration curve.

[ 0044 ] The voltage readings obtained from the calibration set of objects are displayed in Fig. 5. From the voltage curves obtained during approach and retraction of each object, hysteresis loops were calculated. As there are no mechanical components involved in the design of the proximity sensor, the hysteresis loop is as small as 0.1%.

[ 0045 ] It can be observed that objects of different color and reflectiveness produce relatively high variability of response. In particular, the maximum standard deviation from the calculated mean response is 2.2 V, which is above 50% of the maximum value. Therefore, in order to estimate the correct distance to a broad range of objects with different surfaces it is desirable to use several calibration curves. The voltage responses measured during calibration are unevenly spaced and do not cover the whole calibration space. Therefore, we create an evenly distributed calibration grid to cover a wider range of surfaces with less estimation error. The steps required to produce the grid are as follows:

1. The original voltage responses (Fig. 5) are sorted and interpolated to cover the whole calibration space (shown shaded in Fig. 6). The calibration space represents the area within which the voltage response from the proximity sensor falls.

2. The spacing between the target curves in the calibration grid is calculated based on the maximum distance estimation error. The value of the error should not be too small, in order not to increase the complexity of the algorithm. The empirically evaluated maximum distance error is 1.8 mm, which corresponds to four evenly spaced calibration curves. Fig. 6 displays the voltage responses that are used for the calibration. It can be observed that the distance estimation range varies between objects. In particular, the proximity sensor can be used to estimate the distance to a target object with a maximum range from 10 mm (Response V1) to 30 mm (Response V4).

3. Finally, the calibration curves are described with equations (Table 1). Second order power equations of the form D = a * V^b + C best describe the voltage responses from the proximity sensor, with a minimum R-squared value of 0.93.
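The goodness of fit reported in step 3 can be checked with a standard R-squared computation over the measured and predicted responses. The function below is a generic sketch, not the patent's procedure.

```python
# Coefficient of determination (R-squared) of a fitted curve against
# measured responses; the text reports a minimum R-squared of 0.93 for
# the grid curves. Generic formula: 1 - SS_res / SS_tot.

def r_squared(measured, predicted):
    n = len(measured)
    mean = sum(measured) / n
    ss_res = sum((m - p) ** 2 for m, p in zip(measured, predicted))
    ss_tot = sum((m - mean) ** 2 for m in measured)
    return 1.0 - ss_res / ss_tot
```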

The next step of the calibration algorithm involves the association of the curves in the calibration grid with the color code coefficients obtained using the external camera. Fig. 7 demonstrates the distribution of color for the training set of objects along with the corresponding voltage response curves recorded from the proximity sensor. Black lines show calibration curves, and colored lines correspond to voltage responses from the proximity sensor; the color of each line reflects the RGB color value from the camera.

[ 0046] The detected color coefficients from the camera correspond to the original colors of the training objects. The color values are interpolated and assigned to each of the four calibration curves. Each of the four calibration curves corresponds to certain color information that can be expressed in any color code.

[ 0047 ] In order to select the best color representation, several color models were tested. HSV or HSL are usually used as more robust color detection models in computer graphics. However, some information, such as the reflective properties of an object, is suppressed during data capture. Therefore, it might be more difficult to distinguish between shiny and matte objects, such as polished and rough materials. The principle of distance estimation for the fiber optical proximity sensor is based on the reflective properties of an object. Thus, the disturbance of color that is caused by the reflective surface of a shiny object is an important feature for calibration. It was found that, for our case, the RGB color representation provides better color estimation. The assigned RGB color values are shown in Table 2.

[ 0048 ] RGB values obtained from the center of the camera image were compared with values from multiple samples located within a field of view of 30°. This was done in order to evaluate the influence of the angular distance from the optical axis of the camera on the color detection algorithm, and hence potential errors in the choice of the calibration curves. This field of view was chosen based on the maximum field of view of the camera (56°) and on the size of the object in the image plane, in order to avoid edge effects, like blurring, and to obtain data only from the colored surface of the samples. The data showed that the mean difference between RGB coefficients is (55, 22, 13), which does not affect the calibration results.

[0049] Finally, in order to select the calibration for a novel object, the measured RGB value is associated with the most suitable calibration curve and its assigned RGB value. Measured RGB values are compared with the assigned ones using the three-dimensional Euclidean distance. The RGB values from an object are obtained in real time, and if environmental conditions, such as lighting, change, the calibration curve in use is replaced with another curve from the grid.
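The curve selection described in this paragraph can be sketched as follows. The assigned RGB values here are placeholders, since the actual values of Table 2 are not reproduced in this sketch; only the nearest-neighbor logic is what the text specifies.

```python
import math

# Hypothetical assigned RGB values for the four calibration curves
# V1..V4 (placeholders; the actual values are those of Table 2).
ASSIGNED_RGB = {
    "V1": (40, 40, 40),
    "V2": (128, 128, 128),
    "V3": (200, 60, 60),
    "V4": (240, 240, 240),
}

def euclidean_rgb(a, b):
    """Three-dimensional Euclidean distance between two RGB triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def select_calibration_curve(measured_rgb):
    """Return the curve whose assigned RGB is closest to the measurement."""
    return min(ASSIGNED_RGB,
               key=lambda c: euclidean_rgb(ASSIGNED_RGB[c], measured_rgb))

# The measurement is re-evaluated in real time, so a lighting change
# that shifts the measured RGB simply selects a different curve.
print(select_calibration_curve((120, 125, 130)))  # → "V2"
```

Because the selection runs on every frame, the update under changing light falls out of the same nearest-neighbor comparison rather than requiring a separate mechanism.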

[0050] To evaluate the performance of the proximity sensor and the calibration algorithm, experiments were conducted using objects that were not used during the calibration. Seven objects were used: a light-colored wooden block, a black plastic box, white paper, a purple fabric toy, gray expanded polyurethane, green cellophane, and red rubber tape. The objects were positioned on the linear slide and moved towards the proximity sensor while the camera recorded color values in real time.

[0051] The results of the distance estimation using the proposed calibration algorithm are shown in Fig. 8. The calibration curves have different sensing ranges, as mentioned in the section above; therefore, the length of the distance estimation for each object varies. Distance estimation is displayed relative to the reference curve (x = y), which represents perfect estimation. The distance estimation errors for the test objects are shown in Table 3.

[0052] Table 3: Comparison between color calibration and average calibration for novel objects.

[0053] To evaluate the improvement in sensor performance, the distance estimation is compared with the previous version, which is based on the average calibration curve with no use of the external camera, as described above. The responses are shown in Fig. 9 and in Table 3. The results show that in this case there is a comparatively large distance estimation error. In addition, the previous approach does not take into account the range of distance estimation for the calibration, which produces large noise at larger distances. The proposed algorithm takes this feature into account, and distance estimation is performed only for the calibrated range: 10, 15, 25 and 35 cm for responses V1 to V4, respectively. It was therefore shown that the proposed color calibration algorithm reduces the distance estimation error for the fiber optical proximity sensor in the majority of cases. For some objects, the distance estimation error does not improve much, as the chosen calibration curve (V2) is close to the average calibration.
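The range-limited estimation described above might be sketched as follows. Here `curve_model`, a mapping from a voltage reading to a distance for the selected curve, is a hypothetical stand-in for the actual calibration curves; only the per-curve range limits are taken from the text.

```python
# Calibrated maximum ranges (cm) for the four calibration curves, as
# stated above: V1 to V4 are calibrated up to 10, 15, 25 and 35 cm.
CALIBRATED_RANGE_CM = {"V1": 10, "V2": 15, "V3": 25, "V4": 35}

def estimate_distance(curve, voltage, curve_model):
    """Estimate distance (cm) from a voltage reading.

    `curve_model` maps the voltage to a distance for the selected
    calibration curve. Readings that fall beyond the curve's calibrated
    range are rejected, since noise dominates at larger distances.
    """
    distance = curve_model(voltage)
    if distance > CALIBRATED_RANGE_CM[curve]:
        return None  # outside the calibrated range: too noisy to trust
    return distance

# Usage with a toy inverse model (illustrative only):
toy_model = lambda v: 50.0 / v
print(estimate_distance("V2", 5.0, toy_model))  # → 10.0
print(estimate_distance("V1", 2.0, toy_model))  # → None (25 cm > 10 cm range)
```

Clamping the output to the calibrated range is what removes the large-distance noise that the single average-curve approach suffers from.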

[0054] Thus an embodiment of the present invention can provide a fingertip proximity sensor that uses a real-time vision-based calibration algorithm. The invention provides a distance sensing system that is capable of taking into account the surface properties of a target object, such as color, and of adjusting the calibration accordingly in real time. If the environmental light, and consequently the reflective properties of an object, change, the algorithm updates the calibration accordingly. The proposed algorithm works particularly well with uniformly colored objects. It uses the mean color information from the RoI (region of interest) and can partially reduce the distance estimation error.

[0055] The proposed fiber optical proximity sensor calibration can easily be integrated into a broad range of robotic platforms that use an external camera. A low-resolution vision system with a short focal length is sufficient. The use of more advanced cameras with a larger pixel matrix can improve the color estimation and hence the calibration. The comparison with a method that uses only one calibration curve demonstrates a significant improvement in the distance estimation error for different materials. In particular, the mean error can be reduced to 2 mm.

[0056] The current algorithm assumes that the camera views the object from the same angle as at least one finger. It is possible to estimate the orientation of the fingertip using the forward kinematics of the robotic platform, and then to apply the corresponding calibration for the fingertip.

[0057] The present invention has been described above by reference to exemplary embodiments; however, it is to be understood that the present invention is not to be limited by the preceding description but rather by the appended claims. Although an embodiment has been described in the context of grasping applications, the invention can also be used for any other robotic system where proximity and distance detection of unknown objects is required. In the described embodiments visible light is used, but infrared and/or ultraviolet light can be used instead or in addition, and the term "color" should be construed accordingly. The algorithm of the present invention can be implemented on general purpose computing devices or on special purpose hardware, such as an ASIC. The algorithm can be embodied in a computer program and recorded on non-transitory computer-readable storage media.

[0058] BIBLIOGRAPHY

[Bimbo2012] Bimbo, Joao and Rodriguez-Jimenez, Silvia and Liu, Hongbin and Song, Xiaojing and Burrus, Nicolas and Seneviratne, Lakmal D. and Abderrahim, Mohamed and Althoefer, Kaspar, "Object pose estimation and tracking by fusing visual and tactile information", IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, pages 65-70, 2012

[Goeger2010] Goeger, Dirk and Blankertz, Matthias and Woern, Heinz, "A tactile proximity sensor", Proceedings of IEEE Sensors, pages 589-594, 2010

[Hasegawa2010] Hasegawa, Hiroaki and Mizoguchi, Yoshitomo and Tadakuma, Kenjiro and Ishikawa, Masatoshi and Shimojo, Makoto, "Development of intelligent robot hand using proximity, contact and slip sensing", 2010 IEEE International Conference on Robotics and Automation, pages 777-784

[Hsiao2009] Hsiao, Kaijen and Nangeroni, Paul and Huber, Manfred and Saxena, Ashutosh and Ng, Andrew Y., "Reactive grasping using optical proximity sensors", 2009 IEEE International Conference on Robotics and Automation, pages 2098-2105

[Konstantinova2015] J. Konstantinova, A. Stilli, and K. Althoefer, "Force and Proximity Fingertip Sensor to Enhance Grasping Perception", IEEE/RSJ International Conference on Intelligent Robots and Systems, 2015, p. 1

[Lee2006] Lee, Hyung-Kew and Chang, Sun-Il and Yoon, Euisik, "A Capacitive Proximity Sensor in Dual Implementation with Tactile Imaging Capability on a Single Flexible Platform For Robot Assistant Applications", 19th IEEE International Conference on Micro Electro Mechanical Systems, January 2006, pages 606-609

[Lin2010] Lin, Tzu-Yang and Chao, P.C.-P. and Chen, Wei-Par and Tsai, Che-Hung, "A novel 3D optical proximity sensor panel and its readout circuit", IEEE Sensors, pages 108-113, 2010

[Maldonado2012] Maldonado, Alexis and Alvarez, Humberto and Beetz, Michael, "Improving robot manipulation through fingertip perception", 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 2947-2954

[Tegin2005] Tegin, Johan and Wikander, Jan, "Tactile sensing in intelligent robotic manipulation - a review", Industrial Robot: An International Journal, pages 64-70, volume 32, 2005

[Won2014] Won, Jae-yeon and Ryu, Hyunsurk and Delbruck, Tobi and Lee, Jun Haeng and Hu, Jiang, "Proximity Sensing Based on Dynamic Vision Sensor for Mobile Devices", IEEE Transactions on Industrial Electronics, pages 536-544, volume 62, 2014