

Title:
APPARATUS FOR JOINT CALIBRATION OF RADAR AND CAMERA SYSTEMS FOR AUTONOMOUS VEHICLE APPLICATIONS
Document Type and Number:
WIPO Patent Application WO/2020/214426
Kind Code:
A1
Abstract:
An apparatus and method for joint calibration of vision and radar sensors for an autonomous device is disclosed. The apparatus may include a spherical portion and a cutout portion. The cutout portion may be formed within the spherical portion and have three equal surfaces. Additionally, the apparatus may include a trihedral reflector positioned within the cutout portion.

Inventors:
VU DUC (US)
REDDY DIKPAL (US)
HARTMAN COLE (US)
Application Number:
PCT/US2020/026336
Publication Date:
October 22, 2020
Filing Date:
April 02, 2020
Assignee:
ARGO AI LLC (US)
International Classes:
G01S7/497; G01S17/86; G01S17/931; G02B5/08; G05D1/02
Foreign References:
US20060164295A1 (2006-07-27)
US20180372841A1 (2018-12-27)
US20160161602A1 (2016-06-09)
US20180252798A1 (2018-09-06)
US4531128A (1985-07-23)
DE2308701A1 (1974-09-05)
Other References:
DORING, BJORN J., SCHWERDT, MARCO, BAUER, ROBERT: "TerraSAR-X Calibration Ground Equipment", 30 June 2007 (2007-06-30), pages 1-5, XP055744036, retrieved from the Internet
JULIUS et al.: "Automatic Calibration of Multiple Cameras and Depth Sensors with a Spherical Target", 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 1 October 2018, IEEE
COHEN, M. et al.: "A dual-standard for radar echo measurements", IRE Transactions on Antennas and Propagation, vol. 10, 1 July 1955, IEEE
See also references of EP 3948336A4
Attorney, Agent or Firm:
SINGER, James M. et al. (US)
Claims:
CLAIMS

1. An apparatus for joint calibration of vision and radar sensors for an autonomous device, the apparatus comprising:

a spherical portion;

a cutout portion, wherein the cutout portion is formed within the spherical portion and comprises three equal surfaces; and

a trihedral reflector positioned within the cutout portion.

2. The apparatus of claim 1, wherein the spherical portion comprises one or more non-metallic materials.

3. The apparatus of claim 2, wherein the material is one or more of polystyrene, polypropylene, polyvinyl chloride (PVC), polyamide, polycarbonate (PC), polycarbonate and polybutylene terephthalate blend (PC-PBT), acrylonitrile butadiene styrene (ABS), and acrylonitrile styrene acrylate (ASA).

4. The apparatus of claim 1, wherein the spherical portion is visually opaque.

5. The apparatus of claim 1, wherein the trihedral reflector comprises a metallic material.

6. The apparatus of claim 5, wherein the trihedral reflector comprises a preformed metallic insert having three surfaces of equal size and shape.

7. The apparatus of claim 5, wherein the trihedral reflector comprises a metallic coating applied to the three surfaces of the cutout portion.

8. The apparatus of claim 1, wherein a volume of the cutout portion is 1/8 of a volume of spherical portion.

9. The apparatus of claim 1, wherein a length of each opposite and adjacent side of each triangular surface of the trihedral reflector is equal to the radius of the spherical portion.

10. The apparatus of claim 1, further comprising a base portion coupled to the spherical portion.

11. The apparatus of claim 10, wherein the base portion comprises one or more of a polystyrene, polypropylene, polyvinyl chloride (PVC), polyamide, polycarbonate (PC), polycarbonate and polybutylene terephthalate blend (PC-PBT), acrylonitrile butadiene styrene (ABS), or acrylonitrile styrene acrylate (ASA) material.

12. A system for joint calibration of vision and radar sensors comprising:

an autonomous device having at least one camera sensor and at least one radar sensor; and

a calibration target, wherein the calibration target comprises:

a spherical portion;

a cutout portion, wherein the cutout portion is formed within the spherical portion and comprises three surfaces; and

a trihedral reflector positioned within the cutout portion.

13. The system of claim 12, wherein the autonomous device comprises an autonomous vehicle.

14. The system of claim 12, wherein the spherical portion of the calibration target is formed of one or more of a polystyrene, polypropylene, polyvinyl chloride (PVC), polyamide, polycarbonate (PC), polycarbonate and polybutylene terephthalate blend (PC-PBT), acrylonitrile butadiene styrene (ABS), or acrylonitrile styrene acrylate (ASA) material.

15. The system of claim 12, wherein the trihedral reflector of the calibration target comprises a metallic material.

16. The system of claim 15, wherein the trihedral reflector of the calibration target comprises a preformed metallic insert having three surfaces of equal size and shape.

17. The system of claim 15, wherein the trihedral reflector of the calibration target comprises a metallic coating applied to the three surfaces of the cutout portion.

18. A method of forming a calibration target for the joint calibration of vision and radar sensors for an autonomous device, the method comprising:

forming a spherical portion;

forming a cutout portion within the spherical portion, wherein the cutout portion is formed so as to comprise three surfaces of equal size and shape; and

providing a metallic trihedral reflector positioned within the cutout portion.

19. The method of claim 18, wherein forming the spherical portion comprises forming the spherical portion from one or more of a polystyrene, polypropylene, polyvinyl chloride (PVC), polyamide, polycarbonate (PC), polycarbonate and polybutylene terephthalate blend (PC-PBT), acrylonitrile butadiene styrene (ABS), or acrylonitrile styrene acrylate (ASA) material.

20. The method of claim 18, wherein forming the cutout portion comprises removing a portion of the spherical portion equal to 1/8 of the volume of spherical portion.

Description:
TITLE: APPARATUS FOR JOINT CALIBRATION OF RADAR AND CAMERA SYSTEMS FOR AUTONOMOUS VEHICLE APPLICATIONS

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This patent document claims priority to U.S. Patent Application No. 16/383,829, filed April 15, 2019, the disclosure of which is fully incorporated into this document by reference.

BACKGROUND

[0002] Accurate and consistent obstacle detection and navigation are key elements of autonomous driving applications. Typically, an autonomous vehicle utilizes various on-board sensors to detect obstacles, other aspects of the roadway, and/or other aspects of an environment around the vehicle. Examples of such sensors include one or more of vision sensors (e.g., camera(s)), radio detection and ranging (i.e., radar) sensors, and/or light detection and ranging (i.e., LiDAR) sensors. While it is possible for only one type of sensor to be utilized, it is far more preferable to fuse data from different types of sensors so as to provide the autonomous vehicle's control systems with more accurate, complete, and dependable information.

[0003] In order for sensor fusion to provide desired outputs, the individual sensors must be calibrated prior to usage of the autonomous vehicle and, over time, recalibrated to assure accurate results. In the past, each sensor modality has been calibrated separately, often using separate calibration targets optimized for each modality. Due to the low-resolution, high-variance nature of radar, radar calibration tolerances are much larger than those of other modalities. When fusion of low-level sensor data is desired, the larger variance of radar may lead to a mismatch in detection pairing with other sensor types, such as vision sensors. Thus, in order to utilize low-level sensor data for pairing, radar and vision sensors should ideally be calibrated simultaneously using a common target. However, vision sensor calibration in autonomous vehicles has often relied upon the use of a large, planar checkerboard pattern as the calibration target. These large checkerboard patterns are not suitable for calibration of radar sensors, as they result in a high-variance radar signature and may create multipath patterns, leading to inaccurate calibration of the radar sensors.

[0004] Accordingly, there is a need for a calibration target capable of simultaneously providing both radar and vision sensor calibration, with neither sensing modality interfering with the other. This document describes systems that are directed to addressing the problems described above, and/or other issues.

SUMMARY

[0005] In accordance with an aspect of the disclosure, an apparatus for joint calibration of vision and radar sensors for an autonomous device is disclosed. The apparatus may include a spherical portion and a cutout portion. The cutout portion may be formed within the spherical portion and have three equal surfaces. Additionally, the apparatus may include a trihedral reflector positioned within the cutout portion.

[0006] According to another aspect of the disclosure, a system for joint calibration of vision and radar sensors is disclosed. The system may include an autonomous device having at least one camera sensor and at least one radar sensor. The system may further include a calibration target. The calibration target may include a spherical portion and a cutout portion. The cutout portion may be formed within the spherical portion and may include three surfaces. Additionally, the calibration target may include a trihedral reflector positioned within the cutout portion.

[0007] In accordance with another aspect of the disclosure, a method of forming a calibration target for the joint calibration of vision and radar sensors for an autonomous device is disclosed. The method may include forming a spherical portion, and forming a cutout portion within the spherical portion. The cutout portion may be formed so as to include three surfaces of equal size and shape. Additionally, the method may include providing a metallic trihedral reflector positioned within the cutout portion.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] FIG. 1 illustrates a calibration system for an autonomous device in accordance with an aspect of the disclosure;

[0009] FIG. 2 illustrates a perspective view of a calibration target for use in vision sensor and radar sensor calibration in accordance with an aspect of the disclosure;

[0010] FIG. 3 illustrates a perspective view of a calibration target for use in vision sensor and radar sensor calibration in accordance with another aspect of the disclosure;

[0011] FIG. 4 illustrates a cross-sectional view of the calibration target of FIG. 2; and

[0012] FIG. 5 is a block diagram of elements of a computing device on which the various systems and methods in this document could be implemented.

DETAILED DESCRIPTION

[0013] As used in this document, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Unless defined otherwise, all technical and scientific terms used herein have the same meanings as commonly understood by one of ordinary skill in the art. As used in this document, the term "comprising" means "including, but not limited to." Definitions for additional terms that are relevant to this document are included at the end of this Detailed Description.

[0014] Referring to FIG. 1, an autonomous vehicle calibration system 10 in accordance with an aspect of the disclosure is shown. Calibration system 10 includes an autonomous vehicle 20, which may be capable of fully-autonomous or semi-autonomous operation. While illustrated as a conventional passenger car, it is to be understood that autonomous vehicle 20 may be configured as any appropriate automated device, e.g., a car, truck, van, train, aircraft, aerial drone and the like.

[0015] Autonomous vehicle 20 includes a plurality of sensor types used to gather information for vehicle navigation, obstacle avoidance, etc. Specifically, one or more cameras 23 may be provided, as well as one or more radar sensors 24. Additionally, in some embodiments, one or more LiDAR sensors 25 may also be present. While not shown in FIG. 1, it is to be understood that the camera(s) 23, radar sensor(s) 24, and/or LiDAR sensor(s) 25 are electrically coupled to an on-board processor configured to perform calculations and logic operations required to execute programming instructions. In this way, data from the camera(s) 23, radar sensor(s) 24, and/or LiDAR sensor(s) 25 may be combined so as to provide the autonomous vehicle's control systems with more accurate, complete, and dependable information.

[0016] Calibration system 10 also includes a calibration target 30. As will be described in further detail below, calibration target 30 is configured as a single target capable of providing joint calibration of the camera(s) 23 and radar sensor(s) 24. Calibration target 30 may be mounted upon a post 31 or other structure capable of elevating calibration target 30 above the ground surface 21 and within the field-of-view of both the camera(s) 23 and radar sensor(s) 24. However, it is to be understood that calibration target 30 may be elevated above the ground surface 21 using any appropriate means and/or at any appropriate height. Furthermore, in some embodiments, during a calibration process, the autonomous vehicle 20 may be positioned such that both the camera(s) 23 and radar sensor(s) 24 are located at a distance of 4-to-6 meters from the calibration target 30. However, it is to be understood that the distance between the camera(s) 23 and radar sensor(s) 24 and the calibration target 30 is not limited to 4-to-6 meters, and may vary based on, e.g., the size of the calibration target 30, the position of the calibration target 30, the positions of the camera(s) 23 and radar sensor(s) 24 on the autonomous vehicle 20, etc.
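Editor's note: the patent does not set out the calibration math itself. As a hedged illustration only, one common way to use paired observations of a single target placed at several positions is a rigid point-set alignment (Kabsch/Procrustes) between the camera's sphere-center estimates and the radar's corner-reflector returns. All variable names and values below are hypothetical, and full 3-D radar points are assumed for simplicity (a real automotive radar typically reports range and azimuth only, so the actual solve would be constrained accordingly).

```python
# Hedged sketch (not from the patent): given N placements of the common target,
# the camera yields sphere-center points p_cam (Nx3) and the radar yields
# corner-reflector points p_rad (Nx3), each in its own sensor frame. A rigid
# radar-to-camera transform can then be estimated with the Kabsch method.
import numpy as np

def estimate_extrinsics(p_rad: np.ndarray, p_cam: np.ndarray):
    """Return rotation R and translation t such that p_cam ~= R @ p_rad + t."""
    mu_r, mu_c = p_rad.mean(axis=0), p_cam.mean(axis=0)
    H = (p_rad - mu_r).T @ (p_cam - mu_c)                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = mu_c - R @ mu_r
    return R, t

# Synthetic example: several non-collinear target placements 3-7 m from the sensors.
rng = np.random.default_rng(0)
p_rad = rng.uniform(3.0, 7.0, size=(6, 3))                     # hypothetical radar detections (m)
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([0.5, 0.1, -0.2])
p_cam = p_rad @ R_true.T + t_true                              # corresponding camera detections
R_est, t_est = estimate_extrinsics(p_rad, p_cam)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```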

[0017] Next, referring to FIG. 2, a detailed view of calibration target 30 in accordance with an aspect of the disclosure is shown. As noted above, calibration target 30 is configured as a single target capable of aiding the calibration of both vision and radar sensors. First, calibration target 30 includes a substantially spherical portion 32, which may be formed of one or more appropriate non-metallic material(s) having a low radar signature (e.g., less than 0 dB/m²). Such appropriate non-metallic material(s) may be, e.g., polystyrene, polypropylene, polyvinyl chloride (PVC), polyamide, polycarbonate (PC), polycarbonate and polybutylene terephthalate blend (PC-PBT), acrylonitrile butadiene styrene (ABS), acrylonitrile styrene acrylate (ASA), etc. Additionally, spherical portion 32 is visually opaque, may be painted or otherwise colored, and may have any suitable diameter, such as, e.g., 12 inches. In this way, spherical portion 32 provides an ideal target for the calibration of camera(s) 23, as the spherical shape and/or colored surface enables a low-variance vision angle and range estimate for the camera(s) 23. Additionally, the spherical portion 32 does not interfere in any substantial respect with returns from the radar sensor(s) 24, as the material (e.g., polystyrene, PVC, PC, etc.) has a low radar signature, and the spherical shape and relatively small diameter resists reflection of the electromagnetic waves from the radar sensor(s) 24.
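Editor's note: the patent does not describe how the camera derives its estimates from the sphere. As a hedged aside, a sphere of known diameter admits a simple range and bearing estimate under a pinhole camera model; the focal length and pixel measurements below are hypothetical.

```python
# Hedged illustration (not from the patent): pinhole-model range and bearing from a
# sphere of known diameter. All numeric values are placeholders.
import math

def sphere_range_m(focal_px: float, diameter_m: float, apparent_px: float) -> float:
    """Approximate range to the sphere center: Z ~= f * D / d (valid when Z >> D)."""
    return focal_px * diameter_m / apparent_px

def bearing_rad(cx_px: float, u_px: float, focal_px: float) -> float:
    """Horizontal bearing of the sphere centroid relative to the optical axis."""
    return math.atan2(u_px - cx_px, focal_px)

# A 12-inch (0.3048 m) sphere imaged by a camera with a 1400 px focal length:
print(sphere_range_m(focal_px=1400.0, diameter_m=0.3048, apparent_px=85.0))   # ~5.0 m
print(math.degrees(bearing_rad(cx_px=960.0, u_px=1100.0, focal_px=1400.0)))   # ~5.7 degrees
```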

[0018] Within substantially spherical portion 32 is a cutout portion 34, with cutout portion 34 accounting for approximately 1/8 of the overall volume of the calibration target 30. A trihedral reflector 35 is provided within the cutout portion 34, wherein the trihedral reflector 35 acts as a comer reflector capable of generating a strong radar echo for use in calibration of the radar sensor(s) 24. More specifically, the trihedral reflector 35 includes three electrically-conductive surfaces 36A, 36B, 36C mounted at a 90° angle relative to one another, allowing incoming electromagnetic waves from the radar sensor(s) 24 shown in FIG. 1 to be accurately backscattered in the direction from which they came.
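Editor's note: the retroreflecting behavior described above can be seen with a short, hedged sketch. Mirror-reflecting a ray direction off three mutually perpendicular planes flips one vector component per bounce, so the ray exits antiparallel to its arrival direction regardless of the angle of arrival.

```python
# Hedged illustration: three orthogonal conducting faces send an incoming ray back
# toward its source. Each specular reflection flips one component of the direction.
import numpy as np

def reflect(d: np.ndarray, n: np.ndarray) -> np.ndarray:
    """Specular reflection of direction d off a plane with unit normal n."""
    return d - 2.0 * np.dot(d, n) * n

normals = np.eye(3)                      # the three mutually perpendicular faces
d_in = np.array([0.48, -0.60, 0.64])     # arbitrary incoming unit direction
d_out = d_in
for n in normals:
    d_out = reflect(d_out, n)
print(np.allclose(d_out, -d_in))         # True: the ray is returned straight back
```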

[0019] Accordingly, even with a relatively small size, the trihedral reflector 35 provides a strong radar echo, particularly when compared with other surfaces (e.g., spheres, planar surfaces, etc.) of similar size. For example, a trihedral reflector 35 in which the length R of each opposite and adjacent side of triangular surfaces 36A, 36B, 36C is approximately 6 inches may provide for a stable ~20 dB/m² radar signal, thereby allowing for low-variance angle detection and a large radar field-of-view, particularly given the relatively small overall size of the trihedral reflector 35. Additionally, the cutout portion 34 and trihedral reflector 35 are sized and positioned such that they do not greatly interfere with the ability of the camera(s) 23 to calibrate using the spherical portion 32.
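Editor's note: the ~20 dB/m² figure is consistent with the standard peak radar-cross-section formula for a triangular trihedral corner reflector, σ = 4πL⁴/(3λ²), where L is the leg length. The sketch below assumes a 77 GHz automotive radar, which the patent does not specify.

```python
# Hedged sanity check (assumptions: standard triangular-trihedral peak-RCS formula
# sigma = 4*pi*L**4 / (3*lambda**2), and an assumed 77 GHz carrier).
import math

def trihedral_peak_rcs_dbsm(edge_m: float, freq_hz: float) -> float:
    wavelength = 299_792_458.0 / freq_hz
    sigma = 4.0 * math.pi * edge_m**4 / (3.0 * wavelength**2)   # peak RCS in m^2
    return 10.0 * math.log10(sigma)

# 6-inch (0.1524 m) leg length at 77 GHz:
print(trihedral_peak_rcs_dbsm(0.1524, 77e9))   # ~21.7 dBsm, in line with the ~20 dB/m^2 cited
```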

[0020] Referring still to FIG. 2, the trihedral reflector 35 is shown as an insert secured within the cutout portion 34. In some embodiments, trihedral reflector 35 is a solid metallic insert, preformed and then placed into (i.e., completely embedded within) cutout portion 34, leaving portions of the internal surfaces 37A, 37B, 37C of the cutout portion 34 uncovered. Each triangular surface 36A, 36B, 36C may be of equal size and shape.

[0021] However, as shown in FIG. 3, and in accordance with another aspect of the disclosure, it is to be understood that the trihedral reflector may instead be formed of a metallic coating applied to the internal surfaces of the cutout portion 34, thereby leaving no surfaces of the cutout portion 34 uncovered. Specifically, referring to FIG. 3, a calibration target 42 in accordance with one embodiment is shown, with the cutout portion of spherical portion 32 having metallically coated surfaces 46A, 46B, 46C to form a trihedral reflector 45. The metallic coating may be applied via any appropriate method, such as spray coating, adhesive coating, etc. Each surface 46A, 46B, 46C of the trihedral reflector 45 may have any appropriate thickness (e.g., about 50 microns) capable of reflecting electromagnetic waves. Furthermore, each surface 46A, 46B, 46C should be non-porous, thereby allowing for accurate backscatter of the electromagnetic waves.
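Editor's note: as a hedged check (the patent specifies neither the coating metal nor the radar frequency), a ~50 micron coating is orders of magnitude thicker than the electromagnetic skin depth at automotive radar frequencies, so it behaves essentially like a solid conductor for reflection purposes. The resistivity and frequency below are assumptions.

```python
# Hedged check: skin depth delta = sqrt(2*rho / (omega * mu)) for an assumed
# aluminum-like coating (rho ~ 2.65e-8 ohm*m) at an assumed 77 GHz carrier.
import math

def skin_depth_m(resistivity_ohm_m: float, freq_hz: float, mu_r: float = 1.0) -> float:
    mu = mu_r * 4.0e-7 * math.pi          # permeability
    omega = 2.0 * math.pi * freq_hz       # angular frequency
    return math.sqrt(2.0 * resistivity_ohm_m / (omega * mu))

print(skin_depth_m(2.65e-8, 77e9) * 1e6)  # ~0.3 micrometres, far below a 50 micron coating
```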

[0022] Referring again to FIG. 2, while the length R of each opposite and adjacent side of triangular surfaces 36A, 36B, 36C is described above as being 6 inches (i.e., equal to the radius of the spherical portion 32), it is to be understood that the length of each surface 36A, 36B, 36C may be longer or shorter. In such an instance, the changed x-y-z position of each surface 36A, 36B, 36C relative to the sphere center would need to be accounted for such that the calibration of the radar sensor(s) and the calibration of the camera(s) remain consistent.
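Editor's note: the patent does not describe how that position change would be accounted for. One hedged possibility is to apply a fixed, construction-known offset between the trihedral apex (the radar reference point) and the sphere center (the camera reference point) before pairing detections; the offset vector and frames below are purely illustrative.

```python
# Hedged sketch: shift the radar's detected corner point by a construction-known
# offset so that it maps onto the sphere center seen by the camera. All values are
# illustrative placeholders, not taken from the disclosure.
import numpy as np

apex_to_center_target = np.array([0.02, 0.02, 0.02])   # offset in the target frame (m), hypothetical

def radar_point_to_sphere_center(p_radar: np.ndarray, R_target_in_radar: np.ndarray) -> np.ndarray:
    """Apply the construction offset, expressed in the radar frame, to the corner detection."""
    return p_radar + R_target_in_radar @ apex_to_center_target

p_corner = np.array([5.0, 0.3, 0.9])   # hypothetical radar detection of the trihedral apex
R_target = np.eye(3)                   # target orientation w.r.t. the radar (assumed known)
print(radar_point_to_sphere_center(p_corner, R_target))
```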

[0023] Next, referring to both FIG. 2 and FIG. 4, calibration target 30 further includes a base member 38, which allows calibration target 30 to be mounted in a desired location (e.g., upon post 31, as shown in FIG. 1). As shown in FIG. 4, base member 38 may include an inwardly-directed stem portion 39, along with a circular face portion 40, wherein the stem portion 39 is configured to extend within the spherical portion 32 so as to secure the base member 38 and allow for coupling at a mounting location. In some embodiments, the base member 38 is formed of a material (or materials) having a low radar signature, such as, e.g., polypropylene, polyamide, PVC, PC, PC-PBT, ABS, ASA, etc. In this way, the base member 38 is essentially transparent to the radar signal and does not alter the calibration of the radar sensor(s). At the very least, the radar signature of the base member 38 should be low enough that it is essentially rendered moot relative to the high radar signal provided by the trihedral reflector 35.

[0024] Additionally, in accordance with another aspect of the disclosure, the trihedral reflectors 35, 45 shown and described above with respect to FIGS. 2-4 may be angularly positioned within a cutout of the spherical portion 32 relative to the base member 38 such that each surface of the trihedral reflector 35, 45 has equal exposure from a line-of-sight aligned with the shared corner of the three surfaces and parallel to the ground, similar to that which is shown in FIG. 1. With such a configuration, the calibration target may be more tolerant of misalignment during calibration set-up.

[0025] FIG. 5 depicts an example of internal hardware that may be included in any of the electronic components of the calibration system, such as internal processing systems, external monitoring and reporting systems, or remote servers. An electrical bus 50 serves as an information highway interconnecting the other illustrated components of the hardware. Processor 52 is a central processing device of the system, configured to perform calculations and logic operations required to execute programming instructions. As used in this document and in the claims, the terms "processor" and "processing device" may refer to a single processor or any number of processors in a set of processors that collectively perform a set of operations, such as a central processing unit (CPU), a graphics processing unit (GPU), a remote server, or a combination of these. Read only memory (ROM), random access memory (RAM), flash memory, hard drives and other devices capable of storing electronic data constitute examples of memory devices 56. A memory device may include a single device or a collection of devices across which data and/or instructions are stored. Various embodiments of the invention may include a computer-readable medium containing programming instructions that are configured to cause one or more processors, print devices and/or scanning devices to perform the functions described in the context of the previous figures.

[0026] An optional display interface 59 may permit information from the bus 50 to be displayed on a display device 58 in visual, graphic or alphanumeric format. An audio interface and audio output (such as a speaker) also may be provided. Communication with external devices may occur using various communication devices 60 such as a wireless antenna, an RFID tag and/or short-range or near-field communication transceiver, each of which may optionally communicatively connect with other components of the device via one or more communication systems. The communication device(s) 60 may be configured to be communicatively connected to a communications network, such as the Internet, a local area network or a cellular telephone data network.

[0027] The hardware may also include a user interface sensor 62 that allows for receipt of data from input devices 64 such as a keyboard, a mouse, a joystick, a touchscreen, a touch pad, a remote control, a pointing device and/or microphone. Digital image frames also may be received from one or more cameras 54 that can capture video and/or still images. The system also may receive data from a motion and/or position sensor 70 such as an accelerometer, gyroscope or inertial measurement unit. The system also may receive data from a radar system 68 and/or a LiDAR system 66 such as that which was described above.

[0028] The above-disclosed features and functions, as well as alternatives, may be combined into many other different systems or applications. Various components may be implemented in hardware or software or embedded software. Various presently unforeseen or unanticipated alternatives, modifications, variations or improvements may be made by those skilled in the art, each of which is also intended to be encompassed by the disclosed embodiments.

[0029] Terminology that is relevant to the disclosure provided above includes:

[0030] The term "vehicle" refers to any moving form of conveyance that is capable of carrying one or more human occupants and/or cargo and is powered by any form of energy. The term "vehicle" includes, but is not limited to, cars, trucks, vans, trains, autonomous vehicles, aircraft, aerial drones and the like. An "autonomous vehicle" is a vehicle having a processor, programming instructions and drivetrain components that are controllable by the processor without requiring a human operator. An autonomous vehicle may be fully autonomous in that it does not require a human operator for most or all driving conditions and functions, or it may be semi-autonomous in that a human operator may be required in certain conditions or for certain operations, or that a human operator may override the vehicle's autonomous system and may take control of the vehicle.

[0031] In this document, when terms such as "first" and "second" are used to modify a noun, such use is simply intended to distinguish one item from another, and is not intended to require a sequential order unless specifically stated. In addition, terms of relative position such as "vertical" and "horizontal", or "front" and "rear", when used, are intended to be relative to each other and need not be absolute, and only refer to one possible position of the device associated with those terms depending on the device's orientation.