

Title:
AUTOMATIC NAVIGATION SYSTEM FOR FIRE FIGHTING ROBOT
Document Type and Number:
WIPO Patent Application WO/2021/213737
Kind Code:
A1
Abstract:
The present invention provides a navigation method for a fire fighting robot, the navigation method comprising: acquiring a first environment image and a second environment image of an environment in which the robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by the attitude sensor; wherein the first environment image and the second environment image are binocular vision images captured by the binocular vision camera; sending the first environment image to the display to be displayed, and determining a destination selected in the first environment image by an operator via the input part; calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image; controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance.

Inventors:
YANG ZHAN BIN (CN)
LIU JIANG BO (CN)
SUN ZHAO JUN (CN)
YU QI (CN)
Application Number:
PCT/EP2021/056767
Publication Date:
October 28, 2021
Filing Date:
March 17, 2021
Assignee:
SIEMENS AG (DE)
International Classes:
G05D1/00; A62C27/00
Domestic Patent References:
WO2020076610A1 (2020-04-16)
WO2001084260A2 (2001-11-08)
Foreign References:
US20190011921A1 (2019-01-10)
CN108815754A (2018-11-16)
CN110860057A (2020-03-06)
Attorney, Agent or Firm:
ISARPATENT - PATENT- UND RECHTSANWÄLTE BARTH CHARLES HASSA PECKMANN UND PARTNER MBB (DE)
Claims:

1. A navigation method (300) for a fire fighting robot, the fire fighting robot comprising a robot main body capable of moving to the vicinity of a region where a fire has broken out to perform a fire extinguishing operation, and a remote controller wirelessly connected to the robot main body, the robot main body having a binocular vision camera and an attitude sensor, and the remote controller having an input part and a display; the navigation method (300) comprising: acquiring a first environment image and a second environment image of an environment in which the robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by the attitude sensor (310); wherein the first environment image and the second environment image are binocular vision images captured by the binocular vision camera; sending the first environment image to the display to be displayed, and determining a destination selected in the first environment image by an operator via the input part (320); calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image (330); controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance (340).

2. The navigation method as claimed in claim 1, wherein the step of calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image (330) comprises: calculating the target yaw angle and the distance on the basis of the principle of binocular vision according to screen coordinates of the destination, the first environment image and the second environment image.

3. The navigation method as claimed in claim 1 or 2, wherein the step of controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance (340) comprises: acquiring an initial yaw angle of the robot main body sensed at an initial moment by the attitude sensor; determining the difference value between the initial yaw angle and the target yaw angle; and controlling movement of the robot main body to the destination according to the difference value and the real-time yaw angle.

4. The navigation method as claimed in claim 1, wherein the navigation method (300) further comprises: detecting whether an obstacle is present in the environment in which the robot main body is currently travelling; and causing the robot main body to stop moving when it is detected that an obstacle is present in the environment in which the robot main body is currently travelling.

5. The navigation method as claimed in claim 1 or 4, wherein the navigation method (300) further comprises: detecting the ambient temperature of the environment in which the robot main body is currently travelling, and causing the robot main body to stop moving when it is detected that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature.

6. A navigation apparatus (400) for a fire fighting robot, the fire fighting robot comprising a robot main body capable of moving to the vicinity of a region where a fire has broken out to perform a fire extinguishing operation, and a remote controller wirelessly connected to the robot main body, the robot main body having a binocular vision camera and an attitude sensor, and the remote controller having an input part and a display; the navigation apparatus (400) comprising: an acquisition unit (410), for acquiring a first environment image and a second environment image of an environment in which the robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by the attitude sensor; wherein the first environment image and the second environment image are binocular vision images captured by the binocular vision camera; an input determining unit (420), for sending the first environment image to the display to be displayed, and determining a destination selected in the first environment image by an operator via the input part; a calculating unit (430), for calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image; a control unit (440), for controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance.

7. The navigation apparatus as claimed in claim 6, wherein the step of the calculating unit (430) calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image comprises: calculating the target yaw angle and the distance on the basis of the principle of binocular vision according to screen coordinates of the destination, the first environment image and the second environment image.

8. The navigation apparatus as claimed in claim 6 or 7, wherein the step of the control unit (440) controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance comprises: acquiring an initial yaw angle of the robot main body sensed at an initial moment by the attitude sensor; determining the difference value between the initial yaw angle and the target yaw angle; and controlling movement of the robot main body to the destination according to the difference value and the real-time yaw angle.

9. The navigation apparatus as claimed in claim 6, wherein the navigation apparatus (400) further comprises an obstacle detection unit (450); the obstacle detection unit (450) detects whether an obstacle is present in the environment in which the robot main body is currently travelling, and upon detecting that an obstacle is present in the environment in which the robot main body is currently travelling, causes the robot main body to stop moving.

10. The navigation apparatus as claimed in claim 6 or 9, wherein the navigation apparatus (400) further comprises a temperature detection unit (460); the temperature detection unit (460) detects the ambient temperature of the environment in which the robot main body is currently travelling, and upon detecting that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature, causes the robot main body to stop moving.

11. A fire fighting robot, comprising a processor, a memory, and a computer program that is stored on the memory and capable of being run on the processor; when executed by the processor, the computer program implements the navigation method as claimed in any one of claims 1 - 5.

12. A computer readable storage medium, wherein a computer program is stored on the computer readable storage medium; when executed by a processor, the computer program implements the navigation method as claimed in any one of claims 1 - 5.

Description:

AUTOMATIC NAVIGATION SYSTEM FOR FIRE FIGHTING ROBOT

Technical field

The present invention mainly relates to the field of robots, in particular to a navigation method and navigation apparatus for a fire fighting robot.

Background art

Fire fighting robots can assist fire fighting personnel in the process of extinguishing fires, and are generally deployed close to the site of a fire to perform a fire extinguishing operation. When a fire occurs, a fire fighter must operate a remote controller manually in order to move the robot to the best fire extinguishing position. The best fire extinguishing position is generally quite far from the operator, and the efficiency of the fire fighter's operation falls significantly in manual control mode.

Robots having automatic navigation functionality have been developed in the industry; in these robots having automatic navigation functionality, an outdoor mobile platform thereof generally carries a series of sensors, such as a binocular vision camera, radar and a global positioning system (GPS), etc., in order to realize navigation based on SLAM (simultaneous localization and mapping).

However, SLAM-based navigation needs a map to be constructed in advance for each scene where a fire might start, the positioning accuracy of civil GPS is not high, and on top of this come the explosion-proof level limitations of fire fighting robot hardware; consequently, existing navigation methods are unable to meet the automatic navigation needs of fire fighting robots.

Summary of the invention

In order to solve the abovementioned technical problem, the present invention provides a navigation method and navigation apparatus for a fire fighting robot, to realize automatic navigation of a fire fighting robot, adapt to rapidly changing fire situations, and improve fire extinguishing efficiency.

To achieve the above object, the present invention proposes a navigation method for a fire fighting robot, the fire fighting robot comprising a robot main body capable of moving to the vicinity of a region where a fire has broken out to perform a fire extinguishing operation, and a remote controller wirelessly connected to the robot main body, the robot main body having a binocular vision camera and an attitude sensor, and the remote controller having an input part and a display; the navigation method comprising: acquiring a first environment image and a second environment image of an environment in which the robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by the attitude sensor; wherein the first environment image and the second environment image are binocular vision images captured by the binocular vision camera; sending the first environment image to the display to be displayed, and determining a destination selected in the first environment image by an operator via the input part; calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image; controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance.

Thus, automatic navigation of the fire fighting robot is achieved by an operator selecting a destination in an environment image acquired by a binocular vision camera, in conjunction with a yaw angle sensed by an attitude sensor, without the need to construct a map in advance; it is thus possible to adapt to a rapidly changing fire situation, to improve fire extinguishing efficiency.

In an embodiment of the present invention, the step of calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image comprises: calculating the target yaw angle and the distance on the basis of the principle of binocular vision according to screen coordinates of the destination, the first environment image and the second environment image.

Thus, a target yaw angle of the destination can be calculated by the principle of binocular vision of the binocular vision camera, thereby realizing control of movement of the robot main body to the destination, without the need to construct a map in advance; it is thus possible to adapt to a rapidly changing fire situation, to improve fire extinguishing efficiency.

In an embodiment of the present invention, the step of controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance comprises: acquiring an initial yaw angle of the robot main body sensed at an initial moment by the attitude sensor; determining the difference value between the initial yaw angle and the target yaw angle; and controlling movement of the robot main body to the destination according to the difference value and the real-time yaw angle.

Thus, it is possible to realize control of movement of the robot main body to the destination according to the target yaw angle and the real-time yaw angle, without the need to construct a map in advance; it is thus possible to adapt to a rapidly changing fire situation, to improve fire extinguishing efficiency.

In an embodiment of the present invention, the navigation method further comprises: detecting whether an obstacle is present in the environment in which the robot main body is currently travelling; and causing the robot main body to stop moving when it is detected that an obstacle is present in the environment in which the robot main body is currently travelling.

Thus, the robot main body is caused to stop moving when it is detected that an obstacle is present in the environment in which the robot main body is currently travelling; it is thus possible to prevent the robot main body from striking obstacles and being damaged, to improve the stability of the fire fighting robot.

In an embodiment of the present invention, the navigation method further comprises: detecting the ambient temperature of the environment in which the robot main body is currently travelling, and causing the robot main body to stop moving when it is detected that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature.

Thus, when it is detected that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature, the robot main body is caused to stop moving; it is thus possible to prevent damage to the robot main body due to an excessively high temperature, to improve the stability of the fire fighting robot.

The present invention also proposes a navigation apparatus for a fire fighting robot, the fire fighting robot comprising a robot main body capable of moving to the vicinity of a region where a fire has broken out to perform a fire extinguishing operation, and a remote controller wirelessly connected to the robot main body, the robot main body having a binocular vision camera and an attitude sensor, and the remote controller having an input part and a display; the navigation apparatus comprising: an acquisition unit, for acquiring a first environment image and a second environment image of an environment in which the robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by the attitude sensor; wherein the first environment image and the second environment image are binocular vision images captured by the binocular vision camera; an input determining unit, for sending the first environment image to the display to be displayed, and determining a destination selected in the first environment image by an operator via the input part; a calculating unit, for calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image; and a control unit, for controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance.

In an embodiment of the present invention, the step of the calculating unit calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image comprises: calculating the target yaw angle and the distance on the basis of the principle of binocular vision according to screen coordinates of the destination, the first environment image and the second environment image.

In an embodiment of the present invention, the step of the control unit controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance comprises: acquiring an initial yaw angle of the robot main body sensed at an initial moment by the attitude sensor; determining the difference value between the initial yaw angle and the target yaw angle; and controlling movement of the robot main body to the destination according to the difference value and the real-time yaw angle.

In an embodiment of the present invention, the navigation apparatus further comprises an obstacle detection unit; the obstacle detection unit detects whether an obstacle is present in the environment in which the robot main body is currently travelling, and upon detecting that an obstacle is present in the environment in which the robot main body is currently travelling, causes the robot main body to stop moving.

In an embodiment of the present invention, the navigation apparatus further comprises a temperature detection unit; the temperature detection unit detects the ambient temperature of the environment in which the robot main body is currently travelling, and upon detecting that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature, causes the robot main body to stop moving.

The present invention also proposes a fire fighting robot, comprising a processor, a memory, and a computer program that is stored on the memory and capable of being run on the processor; when executed by the processor, the computer program implements the fire fighting robot navigation method described above.

The present invention also proposes a computer readable medium, wherein a computer program is stored on the computer readable storage medium; when executed by a processor, the computer program implements the fire fighting robot navigation method described above.

Brief description of the drawings

The accompanying drawings below are merely intended to illustrate and explain the present invention schematically, without limiting the scope thereof, wherein:

Fig. 1 is a three-dimensional structural schematic diagram of a robot main body according to an embodiment of the present invention.

Fig. 2 is a functional block diagram of a fire fighting robot according to an embodiment of the present invention.

Fig. 3 is a flow chart of a fire fighting robot navigation method according to an embodiment of the present invention.

Fig. 4 is a block diagram of a fire fighting robot navigation apparatus according to an embodiment of the present invention.

Key to labels used in the drawings:

100 robot main body

101 support platform

102 motion part

103 first support

104 water cannon

105 second support

106 binocular vision camera

107 obstacle sensor

108 infrared temperature sensor

109 processor

110 transceiver

111 attitude sensor

112 electric motor driver

113 electric motor

200 remote controller

201 input part

202 display

300 fire fighting robot navigation method

310 - 340 steps

400 fire fighting robot navigation apparatus

410 acquisition unit

420 input determining unit

430 calculating unit

440 control unit

450 obstacle detection unit

460 temperature detection unit

Detailed description of the invention

To enable clearer understanding of the technical features, objectives and effects of the invention, particular embodiments of the present invention are now explained with reference to the accompanying drawings.

Many specific details are expounded in the description below to facilitate full understanding of the present invention, but the present invention may also be implemented in other ways different from those described here; thus the present invention is not limited by the particular embodiments disclosed below.

As shown in the present application and the claims, unless the context clearly indicates an exception, words such as "a", "one", "a type of" and/or "the" do not specifically mean the singular, but may also include the plural. In general, the terms "comprise" and "include" only indicate inclusion of steps and elements that have been clearly marked, but these steps and elements do not constitute an exclusive list; methods or devices might also include other steps or elements.

Fig. 1 is a three-dimensional structural schematic diagram of a robot main body 100 according to an embodiment of the present invention. The robot main body 100 can move to the vicinity of a region where a fire has broken out to perform a fire extinguishing operation. As shown in Fig. 1, the robot main body 100 comprises a support platform 101, a motion part 102, a first support 103, a water cannon 104, a second support 105, a binocular vision camera 106, an obstacle sensor 107 and an infrared temperature sensor 108.

The support platform 101 is configured to provide support for part of the structure of the robot main body 100, e.g. the water cannon 104 and the binocular vision camera 106, etc. The motion part 102 is configured to cause the robot main body 100 to move; the motion part 102 may be a continuous track driven by a drive electric motor as shown in Fig. 1, and may also be a motion wheel driven by a drive electric motor. The water cannon 104 is disposed on the support platform 101 by means of the first support 103; the first support 103 can move under the control of a water cannon controller (not shown in the figure), thereby adjusting the direction of the water cannon 104, so that the water cannon sprays water towards different fire sites. The binocular vision camera 106 is disposed on the support platform 101 by means of the second support 105. The binocular vision camera 106 can capture a first environment image and a second environment image of an environment in which the robot main body 100 is currently travelling; the first environment image and second environment image are binocular vision images. The obstacle sensor 107 is disposed at a front end of the support platform 101, and is configured to detect whether an obstacle is present in the environment in which the robot main body 100 is currently travelling. The infrared temperature sensor 108 is disposed at the front end of the support platform 101, above the obstacle sensor 107, and is configured to detect the ambient temperature of the environment in which the robot main body 100 is currently travelling.

Fig. 2 is a functional block diagram of a fire fighting robot according to an embodiment of the present invention. The robot comprises the robot main body 100 and a remote controller 200. The remote controller 200 and the robot main body 100 are separated by a certain distance, and can communicate via a wireless connection. The distance can be such that the robot main body 100 is within the range of a visual field of the remote controller 200. The wireless connection may be a Bluetooth connection, infrared connection or near field communication connection, etc. The robot main body 100 may have the three-dimensional structure shown in Fig. 1.

As shown in Fig. 2, the robot main body 100 also has a processor 109, a transceiver 110, an attitude sensor 111, an electric motor driver 112 and an electric motor 113. The processor 109 may be a single-core processor, a multi-core processor, or a processor group formed of multiple processors, with the multiple processors being connected to each other via a bus. The processor 109 may further comprise a graphics processing unit, for processing images and video. The transceiver 110 is configured to send image data to the remote controller 200 and receive instructions from the remote controller 200. The attitude sensor 111 is configured to sense a yaw angle of the robot main body 100. As an example, the attitude sensor 111 may be an inertial measurement unit. The electric motor driver 112 is configured to control the electric motor 113 according to an output signal of the processor 109, so that the robot main body 100 moves in different directions.

The remote controller 200 has an input part 201; an operator can input instructions via the input part 201. The input part 201 may be a physical button or a virtual button for receiving an action of the operator, or a microphone for receiving a speech sound of the operator. The remote controller 200 also has a display 202, configured to display received image data. The display 202 may be a liquid crystal display, a light emitting diode display or an organic light emitting diode display, etc.

Fig. 3 is a flow chart of a fire fighting robot navigation method 300 according to an embodiment of the present invention. The navigation method 300 can control the robot shown in Figs. 1 and 2. As shown in Fig. 3, the fire fighting robot navigation method 300 in this embodiment comprises:

Step 310, acquiring a first environment image and a second environment image of an environment in which a robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by an attitude sensor.

A binocular vision camera 106 is provided on the robot main body 100; the binocular vision camera 106 captures a first environment image and a second environment image of the environment in which the robot main body 100 is currently travelling. This step acquires the first environment image and second environment image. The first environment image and second environment image are binocular vision images captured by the binocular vision camera 106; the first environment image and second environment image have regions which overlap each other. Preferably, the first environment image and second environment image of the environment in which the robot main body 100 is currently travelling are views of a region in front of the robot main body 100.

The binocular vision camera 106 may be a high-definition binocular vision camera, and correspondingly can acquire high-definition images captured by the high-definition binocular vision camera. The step of acquiring the first environment image and second environment image, captured by the binocular vision camera 106, of the environment in which the robot main body 100 is currently travelling may further comprise subjecting the first environment image and second environment image to image processing. As an example, the image processing to which the first environment image and second environment image are subjected may be noise reduction, enhancement, sharpening or stitching, etc.
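The patent leaves the choice of image processing open. As an illustration only, two of the operations named above — noise reduction and sharpening — could be sketched in NumPy as a box blur and an unsharp mask; the function names and parameters below are assumptions for this sketch, not part of the patent:

```python
import numpy as np

def box_blur(img: np.ndarray) -> np.ndarray:
    """Simple 3x3 box blur for noise reduction (borders handled by edge padding)."""
    padded = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + h, 1 + dx : 1 + dx + w]
    return out / 9.0

def unsharp_mask(img: np.ndarray, amount: float = 1.0) -> np.ndarray:
    """Sharpen by adding back the detail removed by the blur."""
    blurred = box_blur(img)
    return img + amount * (img - blurred)
```

In practice a production system would more likely use an optimized image-processing library for these steps; the sketch only shows the kind of per-pixel operation involved.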

An attitude sensor 111 is also provided on the robot main body 100; the attitude sensor 111 senses the real-time yaw angle of the robot main body 100. This step acquires the real-time yaw angle. As an example, the attitude sensor 111 may be an inertial measurement unit (IMU).

Step 320, sending the first environment image to a display to be displayed, and determining a destination selected in the first environment image by an operator via an input part.

After acquiring the first environment image and second environment image, captured by the binocular vision camera 106, of the environment in which the robot main body 100 is currently travelling in step 310, the first environment image is sent via a wireless connection to a remote controller 200, and displayed on a display 202 of the remote controller 200. After viewing the first environment image displayed on the display 202, the operator selects in the first environment image the destination to which the robot main body is to move.

The input part 201 may be a physical button, a virtual button or a microphone; correspondingly, the operator can select the destination to which the robot main body is to move by pressing the physical button, touching the virtual button or issuing a speech sound. In an embodiment of the present invention, the operator may be a fire fighter.

Step 330, calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image.

In step 320, the destination of the robot main body 100 was determined; in this step, the target yaw angle of the destination and the distance from the robot main body to the destination are calculated according to the screen coordinates of the destination, the first environment image and the second environment image.

In an optional scenario, the step of calculating the target yaw angle of the destination and the distance from the robot main body to the destination according to the screen coordinates of the destination, the first environment image and the second environment image comprises: calculating the target yaw angle and the distance on the basis of the principle of binocular vision.

In this embodiment, the operator has selected the destination in the first image via the input part 201, the remote controller 200 sends the screen coordinates of the destination to the robot main body 100, and the robot main body 100 calculates a target yaw angle A of the destination and the distance from the robot main body 100 to the destination according to the screen coordinates of the destination, the first environment image and the second environment image, based on the principle of binocular vision. For example, the target yaw angle of the destination may be obtained as 30 degrees east of north, and the distance from the robot main body 100 to the destination may be obtained as 20 metres.
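The patent does not give the formulas for "the principle of binocular vision". A standard textbook reading, sketched below under assumptions the patent does not state (a rectified stereo pair, focal length expressed in pixels, a known camera baseline), recovers depth from the horizontal disparity between the two images and a bearing from the pixel offset relative to the principal point; the function name and parameters are hypothetical:

```python
import math

def destination_from_stereo(x_left: float, x_right: float,
                            cx: float, focal_px: float,
                            baseline_m: float) -> tuple:
    """Return (target_yaw_rad, distance_m) for a destination clicked on the
    left image of a rectified stereo pair.

    x_left / x_right: horizontal pixel coordinate of the destination in each
    image; cx: principal point of the left camera; focal_px: focal length in
    pixels; baseline_m: separation between the two cameras in metres.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("destination must have positive disparity")
    depth = focal_px * baseline_m / disparity      # depth along the optical axis
    yaw = math.atan2(x_left - cx, focal_px)        # bearing relative to the axis
    lateral = depth * math.tan(yaw)                # sideways offset of the point
    distance = math.hypot(depth, lateral)          # straight-line range
    return yaw, distance
```

For instance, with a 500-pixel focal length and a 0.1 m baseline, a 10-pixel disparity corresponds to a depth of 5 m; a destination at the image centre then yields a zero target yaw.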

Step 340, controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance.

The target yaw angle and distance having been determined in step 330, movement of the robot main body to the destination is controlled in conjunction with the real-time yaw angle acquired in step 310.

In an optional scenario, the step of controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance comprises: acquiring an initial yaw angle of the robot main body sensed at an initial moment by the attitude sensor; determining the difference value between the initial yaw angle and the target yaw angle; and controlling movement of the robot main body to the destination according to the difference value and the real-time yaw angle.

In this embodiment, the yaw angle sensed by the attitude sensor 111 at the initial moment is an initial yaw angle B. For example, the attitude sensor 111 senses the initial yaw angle B as due east. After the target yaw angle A and the initial yaw angle B have been obtained, the difference value between them is determined; the difference value is sent to an electric motor driver 112, and the electric motor driver 112 drives an electric motor 113 to rotate left or right so as to turn the robot main body through the difference value. For example, when the target yaw angle is 30 degrees east of north and the initial yaw angle is due east, the electric motor driver 112 drives the electric motor 113 to yaw 60 degrees to the left, and travelling begins.

As the robot main body 100 travels, a real-time yaw angle C of the robot main body 100 sensed by the attitude sensor 111 is acquired, and the difference value C - B between the real-time yaw angle C and the initial yaw angle B is calculated in real time. If the difference value C - B is equal to the target yaw angle A, the robot main body 100 is kept travelling in a straight line; if not, the electric motor driver 112 drives the electric motor 113 to perform adjustment until the difference value C - B is equal to the target yaw angle A, whereupon the robot main body 100 is again kept travelling in a straight line. This continues until the robot main body 100 reaches the vicinity of the destination, i.e. until the distance between the robot main body 100 and the destination is less than a preset value.
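The heading control rule described above (keep the difference value C - B equal to the target yaw angle A, otherwise adjust) can be sketched as a simple decision function. The function name and the dead-band parameter are hypothetical; the sketch assumes yaw increases in the clockwise (compass) direction, so a positive error is corrected by turning right.

```python
def steering_command(initial_yaw_deg, target_yaw_deg, realtime_yaw_deg,
                     tolerance_deg=1.0):
    """Return a turn command that keeps (C - B) equal to the target yaw A.

    initial_yaw_deg  (B): yaw sensed by the attitude sensor at the initial moment
    target_yaw_deg   (A): target yaw angle of the destination
    realtime_yaw_deg (C): yaw sensed while travelling
    tolerance_deg:        hypothetical dead band so the robot does not
                          oscillate around the set point
    """
    # Error between the desired heading change A and the achieved change C - B,
    # wrapped into (-180, 180] so the robot always takes the shorter turn
    error = (target_yaw_deg - (realtime_yaw_deg - initial_yaw_deg) + 180) % 360 - 180
    if abs(error) <= tolerance_deg:
        return "straight"          # C - B matches A: keep travelling straight
    return "turn_right" if error > 0 else "turn_left"
```

A real controller would feed this decision to the electric motor driver 112 on every sensor update until the distance to the destination drops below the preset value.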

An optional scenario comprises detecting whether an obstacle is present in the environment in which the robot main body is currently travelling, and causing the robot main body to stop moving when an obstacle is detected. The detection may be performed by means of an obstacle sensor 107 disposed on the robot main body 100; stopping the robot main body when an obstacle is detected prevents the robot main body 100 from striking the obstacle and being damaged.

Optionally, when it is detected that the obstacle has been removed, the robot main body is caused to begin moving again. The obstacle may be removed automatically or manually. Resuming movement as soon as the obstacle has been removed increases the response speed and flexibility of the fire fighting robot.

In an optional scenario, the ambient temperature of the environment in which the robot main body is currently travelling is detected, and when the ambient temperature exceeds an alarm temperature, the robot main body is caused to stop moving. The ambient temperature may be detected by means of an infrared temperature sensor 108 disposed on the robot main body 100; stopping the robot main body when the alarm temperature is exceeded prevents damage to the robot main body 100 due to an excessively high temperature.

Optionally, when it is detected that the ambient temperature has fallen below the alarm temperature, the robot main body is caused to begin moving again. The ambient temperature may fall below the alarm temperature of its own accord, or as a result of manual intervention. Resuming movement as soon as the temperature has fallen below the alarm temperature increases the response speed and flexibility of the fire fighting robot.
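The stop-and-resume behaviour for both the obstacle check and the temperature check can be summarised in a single predicate. This is a hypothetical sketch: in a real system the two Boolean inputs would come from the obstacle sensor 107 and the infrared temperature sensor 108, and the alarm threshold shown is an assumed value, not one specified in the present description.

```python
def drive_permitted(obstacle_present, ambient_temp_c, alarm_temp_c=80.0):
    """Decide whether the robot main body may keep moving.

    obstacle_present: reading from the obstacle sensor (True if blocked)
    ambient_temp_c:   reading from the infrared temperature sensor, in Celsius
    alarm_temp_c:     hypothetical alarm threshold in degrees Celsius

    Movement stops while an obstacle is present or the ambient temperature
    exceeds the alarm value, and resumes automatically once both conditions
    have cleared, matching the stop/resume behaviour described above.
    """
    return (not obstacle_present) and ambient_temp_c <= alarm_temp_c

drive_permitted(False, 45.0)   # True: path clear, temperature normal
drive_permitted(True, 45.0)    # False: obstacle ahead, stop
drive_permitted(False, 95.0)   # False: above alarm temperature, stop
```

Polling this predicate on every control cycle reproduces the described behaviour without any stored state: the robot resumes as soon as the blocking condition disappears.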

This embodiment of the present invention provides a fire fighting robot navigation method in which automatic navigation of the fire fighting robot is achieved by an operator selecting a destination in an environment image acquired by a binocular vision camera, in conjunction with a yaw angle sensed by an attitude sensor, without the need to construct a map in advance; it is thus possible to adapt to a rapidly changing fire situation and to improve fire extinguishing efficiency.

A flow chart has been used here to explain the operations performed in the method according to an embodiment of the present application. It should be understood that the abovementioned operations are not necessarily performed precisely in the order shown; on the contrary, the various steps may be processed in reverse order or simultaneously. Moreover, other operations may be added to these processes, or one or more operation steps may be removed from these processes.

Fig. 4 is a block diagram of a fire fighting robot navigation apparatus 400 according to an embodiment of the present invention. The navigation apparatus 400 can control the fire fighting robot shown in Figs. 1 and 2.

As shown in Fig. 4, the navigation apparatus 400 in this embodiment comprises: an acquisition unit 410, for acquiring a captured first environment image and a captured second environment image of an environment in which a robot main body is currently travelling, and a real-time yaw angle of the robot main body that is sensed by an attitude sensor; wherein the first environment image and second environment image are binocular vision images captured by a binocular vision camera; an input determining unit 420, for sending the first environment image to a display to be displayed, and determining a destination selected in the first environment image by an operator via an input part; a calculating unit 430, for calculating a target yaw angle of the destination and a distance from the robot main body to the destination according to screen coordinates of the destination, the first environment image and the second environment image; and a control unit 440, for controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance.

In an optional scenario, the step of the calculating unit 430 calculating the target yaw angle of the destination and the distance from the robot main body to the destination according to the screen coordinates of the destination, the first environment image and the second environment image comprises: calculating the target yaw angle of the destination and the distance from the robot main body to the destination on the basis of the principle of binocular vision according to the screen coordinates of the destination, the first environment image and the second environment image.

In an optional scenario, the step of the control unit 440 controlling movement of the robot main body to the destination according to the target yaw angle, the real-time yaw angle and the distance comprises: acquiring an initial yaw angle of the robot main body sensed at an initial moment by the attitude sensor; determining the difference value between the initial yaw angle and the target yaw angle; and controlling movement of the robot main body to the destination according to the difference value and the real-time yaw angle.

In an optional scenario, the navigation apparatus 400 further comprises an obstacle detection unit 450; the obstacle detection unit 450 detects whether an obstacle is present in the environment in which the robot main body is currently travelling, and upon detecting that an obstacle is present in the environment in which the robot main body is currently travelling, causes the robot main body to stop moving.

In an optional scenario, the navigation apparatus 400 further comprises a temperature detection unit 460; the temperature detection unit 460 detects the ambient temperature of the environment in which the robot main body is currently travelling, and upon detecting that the ambient temperature of the environment in which the robot main body is currently travelling exceeds an alarm temperature, causes the robot main body to stop moving.

For the manner of implementation and specific process of the navigation apparatus 400, reference may be made to the navigation method 300; no further description is given here.

The present invention also proposes a fire fighting robot, comprising a processor, a memory, and a computer program that is stored on the memory and capable of being run on the processor; when executed by the processor, the computer program implements the fire fighting robot navigation method described above.

The present invention also proposes a computer readable medium, with a computer program being stored on the computer readable storage medium; when executed by a processor, the computer program implements the fire fighting robot navigation method described above.

Some aspects of the method and apparatus of the present invention may be implemented wholly by hardware, wholly by software (including firmware, terminate-and-stay-resident software, microcode, etc.), or by a combination of hardware and software. All of the above hardware or software may be referred to as a "data block", "module", "engine", "unit", "component" or "system". The processor may be one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, or a combination thereof. In addition, the various aspects of the present invention might be manifested as a computer product located in one or more computer readable media, the product comprising computer readable program code. For example, computer readable media may comprise, but are not limited to, magnetic storage devices (e.g. hard disks, floppy disks, magnetic tapes), optical disks (e.g. compact disks (CD), digital versatile disks (DVD)), smart cards and flash memory devices (e.g. cards, sticks, key drives).

The computer readable medium might comprise a propagated data signal containing computer program code, e.g. on a baseband or as part of a carrier wave. The propagated signal might be manifested in many forms, including electromagnetic and optical forms, etc., or a suitable combined form. The computer readable medium may be any computer readable medium other than a computer readable storage medium, and may be connected to an instruction execution system, apparatus or device to realize communication, propagation or transmission of a program for use. Program code located on the computer readable medium may be propagated via any suitable medium, including radio, electrical cable, optical fibre cable, RF signal or a similar medium, or a combination of any of the above media.

It should be understood that although the description herein is based on various embodiments, it is by no means the case that each embodiment contains just one independent technical solution. Such a method of presentation is adopted herein purely for the sake of clarity. Those skilled in the art should consider the description in its entirety. The technical solutions in the various embodiments could also be suitably combined to form other embodiments capable of being understood by those skilled in the art.

The embodiments above are merely particular schematic embodiments of the present invention, which are not intended to limit the scope thereof. All equivalent changes, amendments and combinations made by any person skilled in the art without departing from the concept and principles of the present invention shall fall within the scope of protection thereof.