

Title:
REMOTE CONTROL SYSTEM AND METHOD
Document Type and Number:
WIPO Patent Application WO/2010/067091
Kind Code:
A1
Abstract:
An object remote control system comprises a remote imaging system comprising a plurality of sensors (2); a central processor (9); a user display (11); an object locator and communication links between the sensors, the object locator and the central processor. The sensors (2) are deployed in a predetermined volume (1) and the central processor (9) derives a three dimensional representation of the volume from data received from the sensors. The object locator determines the location of an object (7) within the volume (1) and a view as seen from the determined location of the object is derived in the central processor from the three dimensional representation and displayed on the display. The system further comprises a user input (12) to the central processor; and a communication link (16) between the central processor and the object and commands are input to the user input and communicated to the object, to move the object within the volume. A corresponding method of operation is also provided.

Inventors:
SPICER JOHN JOSEPH (GB)
Application Number:
PCT/GB2009/051578
Publication Date:
June 17, 2010
Filing Date:
November 20, 2009
Assignee:
ROKE MANOR RESEARCH (GB)
SPICER JOHN JOSEPH (GB)
International Classes:
G08G5/00
Domestic Patent References:
WO2008112148A12008-09-18
Foreign References:
US20080180523A12008-07-31
EP1798691A22007-06-20
Attorney, Agent or Firm:
PAYNE, Janice Julia et al. (Postfach 22 16 34, Munich, DE)
Claims:
CLAIMS

1. A method of remotely controlling an object, the method comprising deriving a view from an object at a location within a volume by deploying a plurality of sensors in a predetermined volume; receiving sensor data from the plurality of sensors at a central processor; processing the received sensor data to generate a three dimensional representation of the volume; determining a location of an object within the volume; deriving a view of the volume as seen from the location of the object; and displaying the view; wherein the view is displayed to a remote controller; and wherein commands are sent from the remote controller to the object to move the object within the volume in response to the displayed view.

2. A method according to claim 1, wherein the step of deriving the location of the object includes deriving one or more of position, velocity and attitude of the object.

3. A method according to claim 1 or claim 2, wherein the object is a vehicle.

4. A method according to claim 3, wherein the vehicle is an aircraft.

5. A method according to claim 3 or claim 4, wherein the location of the vehicle is determined by the vehicle and communicated to the central processor.

6. A method according to any preceding claim, wherein the volume is an airfield and associated airspace.

7. A method according to any preceding claim, wherein the sensors communicate with the central processor via fixed links.

8. A method according to any of claims 1 to 6, wherein the sensors are mobile sensors adapted to be redeployed within the volume.

9. A method according to claim 8, wherein the sensors send both location and sensor data to the central processor.

10. A method according to any preceding claim, wherein real time data from the sensors is superimposed upon a previously generated representation of the volume.

11. A method according to any preceding claim, wherein the sensors comprise video cameras, infra-red imagers, sonar or radar transceivers.

12. An object remote control system, the system comprising a remote imaging system comprising a plurality of sensors; a central processor; a user display; an object locator and communication links between the sensors, the object locator and the central processor; wherein the sensors are deployed in a predetermined volume; wherein the central processor derives a three dimensional representation of the volume from data received from the sensors; wherein the object locator determines the location of an object within the volume; and wherein a view as seen from the determined location of the object is derived in the central processor from the three dimensional representation and displayed on the display; wherein the system further comprises a user input to the central processor; and a communication link between the central processor and the object; and wherein commands are input to the user input and communicated to the object, to move the object within the volume.

13. A system according to claim 12, wherein the object locator derives one or more of position, velocity and attitude of the object.

14. A system according to claim 12 or claim 13, wherein the object locator is installed on the object.

15. A system according to any of claims 12 to 14, wherein the object is a vehicle.

16. A system according to any of claims 12 to 15, wherein the vehicle is an aircraft.

17. A system according to any of claims 12 to 16, wherein the volume is an airfield and associated airspace.

18. A system according to any of claims 12 to 17, wherein the sensors are mobile sensors.

19. A system according to claim 18, wherein the mobile sensors further comprise location data sources.

20. A system according to any of claims 12 to 19, wherein the sensors are omnidirectional.

21. A system according to any of claims 12 to 20, wherein the sensors comprise video cameras, infra-red imagers, sonar sensors, or radar transceivers.

Description:
REMOTE CONTROL SYSTEM AND METHOD

This invention relates to a remote object imaging and control system and a method of deriving a view from an object at a location within a volume and controlling the object. The system and method have particular application for use in remote control of vehicles, such as aircraft, ships, or other vessels, but may also be used to control other objects remotely.

There are occasions when it is convenient to be able to see a view as seen by an object, without actually being in the same position as the object. Although this has been done in the past, for example with remote surgery using robots, such systems rely on a camera positioned on the robot, with the image transmitted, for example by a broadband link, to a location where the surgeon is able to move controls to direct the corresponding movement in the robot. However, there are cases where it is not practical to have cables attached to a remote object, such as undersea applications, and there are other situations where there is not sufficient spectrum available in public areas to use wireless communication for sending back video images, such as in communication with aircraft or ships, or when carrying out underwater maintenance or survey work.

In accordance with a first aspect of the present invention, a method of remotely controlling an object comprises deriving a view from an object at a location within a volume by deploying a plurality of sensors in a predetermined volume; receiving sensor data from the plurality of sensors at a central processor; processing the received sensor data to generate a three dimensional representation of the volume; determining a location of an object within the volume; deriving a view of the volume as seen from the location of the object; and displaying the view; wherein the view is displayed to a remote controller; and wherein commands are sent from the remote controller to the object to move the object within the volume in response to the displayed view.

The present invention derives a three dimensional representation of a volume, determines the location of an object and uses these in combination to display a view as seen from the location of the object. Having obtained a view as seen from the object, this can be used as the basis for controlling that object. The displayed image enables a controller to move the object remotely, for example to avoid any obstacles shown in the view. Preferably, the step of deriving the location of the object includes deriving one or more of position, velocity and attitude of the object. Preferably, the object is a vehicle.
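The combination of steps summarised above can be sketched in code. The following is a minimal illustration only, not part of the invention as claimed: the names (ObjectState, build_volume_model, render_view_from) and the grid-cell representation of the volume are assumptions made purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class ObjectState:
    position: tuple   # (x, y, z) within the volume
    velocity: tuple
    attitude: tuple   # (roll, pitch, yaw)

def build_volume_model(sensor_frames):
    """Fuse frames from all deployed sensors into one representation.
    Here the 'model' is simply a dict of occupied cells keyed by position."""
    model = {}
    for frame in sensor_frames:
        model.update(frame)
    return model

def render_view_from(model, state):
    """Extract the cells visible from the object's location (placeholder:
    everything within a fixed range, ignoring occlusion entirely)."""
    x, y, z = state.position
    return {c: v for c, v in model.items()
            if abs(c[0] - x) < 100 and abs(c[1] - y) < 100 and abs(c[2] - z) < 100}

# One pass of the loop: fuse sensor data, then derive the object's view.
frames = [{(10, 0, 5): "runway edge"}, {(200, 0, 0): "hangar"}]
model = build_volume_model(frames)
view = render_view_from(model, ObjectState((0, 0, 0), (0, 0, 0), (0, 0, 0)))
```

A real implementation would replace the range test with proper visibility rendering from the three dimensional representation; the structure of the loop is the point here.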

This invention is most suited to controlling movement of vehicles, although it could be used for controlling other objects. Preferably, the vehicle is an aircraft.

This invention is particularly applicable to aircraft, but can also be used on other types of vehicle operating in constrained areas, such as for docking ships, for carrying out underwater surveys, or for maintaining underwater structures, such as oil rigs.

The location of the vehicle may be determined by conventional ground based tracking systems, such as radar, but preferably, the location of the vehicle is determined in the vehicle and communicated to the central processor.

An example of this is using data from a global navigation satellite system (GNSS) terminal on the vehicle, or a long range aid to navigation (LORAN) terminal. Preferably, the volume is an airfield and associated airspace. In order to reduce issues with spectrum allocation, preferably, the sensors communicate with the central processor via fixed links.
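As an illustration of how a GNSS terminal's position might reach the central processor, the sketch below parses a standard NMEA 0183 $GPGGA fix sentence into decimal degrees. The patent does not specify any message format; NMEA is used here only as a widely known example.

```python
def nmea_to_decimal(coord: str, hemisphere: str) -> float:
    """Convert NMEA ddmm.mmmm / dddmm.mmmm to signed decimal degrees."""
    dot = coord.index('.')
    degrees = float(coord[:dot - 2])   # everything before the minutes field
    minutes = float(coord[dot - 2:])
    value = degrees + minutes / 60.0
    return -value if hemisphere in ('S', 'W') else value

def parse_gga(sentence: str):
    """Extract latitude and longitude from a $GPGGA fix sentence."""
    fields = sentence.split(',')
    lat = nmea_to_decimal(fields[2], fields[3])
    lon = nmea_to_decimal(fields[4], fields[5])
    return lat, lon

lat, lon = parse_gga(
    "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
```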

The sensors may be installed at fixed locations, or alternatively, the sensors are mobile sensors adapted to be redeployed within the volume. These may be combined with a communication infrastructure, allowing the mobile sensors to be plugged into a wired network.

In the case of mobile sensors, preferably, the sensors send both location and sensor data to the central processor. Preferably, real time data from the sensors is superimposed upon a previously generated representation of the volume.
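The superimposition of real time sensor data on a previously generated representation can be illustrated very simply: treat both as maps from volume cells to contents and let the live data take precedence. The cell-map representation and the name compose_view are assumptions for illustration only.

```python
def compose_view(static_model, live_updates):
    """Overlay real-time sensor cells on the pre-stored model;
    live data wins wherever both report the same cell."""
    merged = dict(static_model)
    merged.update(live_updates)
    return merged

base = {(0, 0): "grass", (1, 0): "runway"}
live = {(1, 0): "vehicle"}   # a vehicle has appeared on the runway cell
merged = compose_view(base, live)
```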

Any suitable sensor may be used, but preferably, the sensors comprise video cameras, infra-red imagers, sonar or radar transceivers.

In accordance with a second aspect of the present invention, an object remote control system comprises a remote imaging system comprising a plurality of sensors; a central processor; a user display; an object locator and communication links between the sensors, the object locator and the central processor; wherein the sensors are deployed in a predetermined volume; wherein the central processor derives a three dimensional representation of the volume from data received from the sensors; wherein the object locator determines the location of an object within the volume; and wherein a view as seen from the determined location of the object is derived in the central processor from the three dimensional representation and displayed on the display; wherein the system further comprises a user input to the central processor; and a communication link between the central processor and the object; and wherein commands are input to the user input and communicated to the object, to move the object within the volume.

Preferably, the object locator derives one or more of position, velocity and attitude of the object.

Preferably, the object locator is installed on the object. Preferably, the object is a vehicle. Preferably, the vehicle is an aircraft.

Preferably, the volume is an airfield and associated airspace. Preferably, the sensors are mobile sensors.

Preferably, the mobile sensors further comprise location data sources. Preferably, the sensors are omnidirectional.

Preferably, the sensors comprise video cameras, infra-red imagers, sonar sensors, or radar transceivers. An example of a method and system according to the present invention will now be described with reference to the accompanying drawings in which:

Figure 1 illustrates a first example of deriving a view as seen from an object in accordance with the present invention;

Figure 2 shows a control centre for use in the systems of Figs. 1 and 3;

Figure 3 illustrates a first example of a vehicle remote control system and method according to the present invention, applied to aircraft, deriving the view in accordance with the example of Fig. 1;

Figure 4 is a flow diagram illustrating the steps involved using the example of Fig. 3; and

Figure 5 illustrates an example of using the present invention for undersea surveying.

Fig. 1 illustrates a method of deriving a view in accordance with the present invention. A volume 1 is defined, for which a remote imaging system is provided. The remote imaging system comprises a plurality of sensors 2, mounted on poles 3, arranged along sides 4, 5 forming a perimeter of the volume 1. In this particular example, the sensors are fixed sensors, mounted on poles along the perimeter of the volume, but different arrangements of mounting and location are equally possible. The sensors may be mobile sensors and they may be located at any position within the volume, not just along the perimeter, or, with suitable construction, the sensors may be suspended from a structure above the volume, hanging down into the volume.

Data received from the sensors 2 is communicated to a control centre 6, shown in more detail in Fig. 2, typically via fixed links to a data input 8. This data is processed by a processor 9 and used to generate a three dimensional representation of the volume. This representation may be stored in a store 10. A location of an object 7 within the volume 1 is determined from an object locator, and the view as seen from the object at the determined location is extracted from the 3-D model derived from the sensor data and displayed on a display 11. The object locator may be remote from, or mounted on, the object. For example, the location of the object may be determined using additional sensors in the system which detect the object and report its position, or the object itself may include some kind of location device, such as GNSS technology or, for above water applications, LORAN, and transmit its location at intervals to the control centre 6, e.g. via a wireless link.
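As a sketch of the kind of periodic position report the object might transmit over the wireless link, the fragment below packs and unpacks a small fixed-size message. The field layout (timestamp, position, heading) is entirely hypothetical; the patent does not define any message format.

```python
import struct

# Hypothetical report layout: uint32 time in ms, three float32 coordinates
# in metres, uint16 heading in centidegrees. Little-endian, no padding.
REPORT = struct.Struct('<I3fH')

def pack_report(time_ms, x, y, z, heading_deg):
    return REPORT.pack(time_ms, x, y, z, int(round(heading_deg * 100)) % 36000)

def unpack_report(payload):
    time_ms, x, y, z, heading = REPORT.unpack(payload)
    return time_ms, (x, y, z), heading / 100.0

payload = pack_report(1000, 1.5, -2.0, 30.0, 123.45)
time_ms, position, heading_deg = unpack_report(payload)
```

An 18-byte report of this kind, sent at intervals, uses a tiny fraction of the bandwidth that transmitting video from the object would require, which is the point the text makes about spectrum usage.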

Having derived the view as seen from the object, without actually having to put sensors on the object and transmit the sensor data over wireless communication from the object, it is then possible to use the displayed view as part of a remote control system. In the particular example of aircraft, such as cargo, traffic or shipping surveillance, or crop monitoring aircraft, ground based pilots are likely to be required for regulatory approval of remote controlled aircraft, whether unmanned air vehicles, unmanned autonomous systems, or piloted vehicles requiring outside assistance, when these vehicles are using commercial airfields for take off and landing. Although current remotely operated vehicles can be controlled by a ground based pilot, this usually involves the use of satellite communication links and equipment on the vehicle, which is very expensive. Even if local terrestrial radio communications were used, a problem arises: if the ground based pilot were to use video images transmitted from the aircraft, there is a very high spectrum requirement, due to the large radio bandwidth needed to transmit the video back to the ground, which would limit the number of aircraft which can be controlled at any one time. An example of applying the method of the present invention is described with respect to controlling a remotely piloted aircraft, as shown in Fig. 3; however, the principles described herein may be applied to the remote control of any vehicle within a constrained volume. The aircraft may be an unmanned aircraft, or a manned aircraft where the pilot requires external assistance, such as needing a remote pilot to take control of landing when the aircraft pilot does not have suitable instruments for landing at night or in restricted visibility.

The defined volume 1 includes a runway 13 and airspace 14 surrounding an airfield 15. Along the perimeter of the volume, a number of sensors 2 are provided. In this example, the airfield is provided with the sensors mounted on poles 3 to monitor the runway and other parts of the airfield. However, these sensors do not need to be only at the perimeter of the volume 1, nor do they need to be mounted on fixed mountings, provided that their location can be determined and associated with any data generated by those sensors, e.g. using a co-located global positioning system. Mobile sensors may be put in place for a fixed period of time to derive an initial 3-D representation, then moved within the volume to be positioned at locations previously determined to be most likely to have changing circumstances that require regular updates to enable an accurate real time view to be generated. A fixed network may be installed with connections at the desired locations to enable mobile sensors to send data back over fixed links. The control centre 6 may be within, or outside, the volume 1 and is connected to the plurality of sensors 2 within the volume. The type of sensor depends upon the desired output for the remote pilot in the control centre, and a number of different types of sensor can be used. For example, where a video image is required, the sensors may be cameras, mounted on the poles 3 at intervals along opposite edges 4 of the perimeter of the volume. Multiple cameras facing up, down and in different horizontal directions may be used to cover as large a part of the volume as possible, or omni-directional sensors may be used. Fixed parts of the image, such as the runway, grass, buildings etc., may be pre-stored and images from the video cameras then superimposed over the stored image to show any change in the view from a particular location. Another set of cameras, or other types of sensors, e.g. radar, may be provided to observe an aircraft 17 and to determine its position, velocity and attitude. Alternatively, the aircraft can transmit the necessary information to the control centre from the aircraft's own navigation systems.

The video feed from the ground based sensors is reconstructed to generate the 3-D representation. Although video is particularly useful, other sources of an image, or view, may be used, such as high frequency radio, for example radar imaging, or infra-red. This might be essential in conditions of restricted visibility, such as in fog or at night. The reconstruction uses data from the airfield cameras and sensors and may also use stored data. From each sensor, data is related to its position within the volume and a full representation of the volume is built up. Where there are gaps in the basic data, for example due to a lack of sensor coverage, these can be completed by extrapolation, or from a pre-stored representation. Either the full representation may be pre-stored, or just those features which do not change frequently, such as buildings. When in use, any pre-stored representation is enhanced by real time data, so that the remote pilot can take suitable action if an obstruction appears in an area that needs to be clear for the aircraft to pass safely, such as a vehicle crossing the runway. Having determined the location and direction of travel of the object of interest, the view showing what would have been seen if there had been a camera mounted on the aircraft, given its current position, velocity and attitude, is derived.
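Deriving the view in the direction of travel amounts to turning the object's attitude into a viewing direction and selecting the parts of the model that fall within a field of view about that direction. A minimal sketch follows, assuming a yaw/pitch attitude in an east-north-up frame and a simple conical field of view; both are illustrative assumptions, since the patent does not prescribe a rendering method.

```python
import math

def view_direction(yaw_deg, pitch_deg):
    """Unit vector in the direction of travel, yaw measured from north
    clockwise, in an east-north-up frame."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.sin(yaw) * math.cos(pitch),   # east
            math.cos(yaw) * math.cos(pitch),   # north
            math.sin(pitch))                   # up

def in_field_of_view(position, direction, point, half_angle_deg=45.0):
    """True if 'point' lies within a cone of view from 'position'."""
    offset = [p - q for p, q in zip(point, position)]
    distance = math.sqrt(sum(d * d for d in offset))
    if distance == 0.0:
        return False
    cos_angle = sum(d * v for d, v in zip(offset, direction)) / distance
    return cos_angle >= math.cos(math.radians(half_angle_deg))

heading_east = view_direction(90.0, 0.0)   # level flight, heading due east
```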

In the control centre 6, as shown in Fig. 2, the received sensor data is processed in the processor 9 and displayed on a display 11 to the remote pilot. Based on the pilot's skill, he is able to enter instructions, via the user input 12, to the processor, which are transmitted from a transceiver 16 to an aircraft 17. The user input 12 may take any convenient form, such as a joystick, to which the processor responds by generating instructions to the aircraft which will cause corresponding movement of the aircraft control surfaces. Alternatively, the remote pilot may operate by entering discrete instructions to change altitude or speed, or to move onto a new heading. The aircraft may have knowledge of its own position, which it transmits to the control centre 6 on the ground, such as from an internal navigation system, or communication with the ground may be avoided completely by using ground based air traffic control radar to determine the location of the aircraft. Attitude information enables the view which is generated at a particular location to be the one seen in the direction of travel of the aircraft. In this example, the ground based pilot communicates with the aircraft via wireless communication to send control instructions, but wireless transmission of control data uses far less bandwidth than video, so this usage is not a significant factor in determining the spectrum requirements for the system. All data heavy communication can be done via fixed links on the ground from each camera or sensor.
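The mapping from a joystick input to control-surface commands could be as simple as scaling normalized stick deflections to surface travel limits. The function below is a hypothetical sketch; the actual mapping and travel limits would depend on the aircraft.

```python
def stick_to_surfaces(roll_input, pitch_input, limit_deg=25.0):
    """Map normalized stick deflections (-1..1) to aileron and elevator
    angles in degrees, clamping out-of-range inputs to the travel limit."""
    def clamp(v):
        return max(-1.0, min(1.0, v))
    return {'aileron_deg': clamp(roll_input) * limit_deg,
            'elevator_deg': clamp(pitch_input) * limit_deg}

# Half right roll, full nose-down (the -1.2 input is clamped to -1.0).
command = stick_to_surfaces(0.5, -1.2)
```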

The location of the aircraft 17 may be determined using known ground based methods, such as multilateration as described in GB2250154. The image generation from the sensor data may be carried out using various techniques, such as those applied in virtual studios, as described in GB2413720 A or GB2400513 B.
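As an illustration of the multilateration idea referred to above (a toy stand-in, not the method of GB2250154), the sketch below locates a transmitter from time-difference-of-arrival measurements at three ground stations by brute-force grid search:

```python
import math

C = 299_792_458.0  # propagation speed, m/s

def tdoa_locate(stations, tdoas, span=1000, step=10):
    """stations: [(x, y)] ground receivers; tdoas: measured arrival-time
    differences relative to station 0. Returns the grid point whose
    predicted differences best match the measurements."""
    def predicted(p):
        d0 = math.dist(p, stations[0])
        return [(math.dist(p, s) - d0) / C for s in stations[1:]]
    best, best_err = None, float('inf')
    for x in range(-span, span + 1, step):
        for y in range(-span, span + 1, step):
            err = sum((a - b) ** 2 for a, b in zip(predicted((x, y)), tdoas))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Synthesize measurements from a known position, then recover it.
stations = [(0, 0), (1000, 0), (0, 1000)]
true_position = (300, 400)
tdoas = [(math.dist(true_position, s) - math.dist(true_position, stations[0])) / C
         for s in stations[1:]]
estimate = tdoa_locate(stations, tdoas)
```

Practical systems solve the hyperbolic equations in closed form or by least squares rather than grid search; the search here just keeps the geometry visible.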

Fig. 4 is a flow chart of the stages involved in the method of the present invention applied to remote pilotage of a vehicle. As a vehicle comes within range of a volume, defining a control area where remote pilotage is available, this is detected 20 and a signal to the control centre 6 indicates 21 whether or not the vehicle requires remote pilotage. This may occur where the vehicle is arriving from elsewhere, e.g. an aircraft waiting to land, or a ship coming into dock, or where the vehicle is at a parking bay, or berth, waiting to move off. If no remote pilotage is required, the control centre continues to monitor 22 for other vehicles which may require the service. If remote pilotage is required, then the control centre sets up communication 23 with the vehicle and allocates a pilot 24. If, for some reason, no pilot is available, then the vehicle either remains in its parking bay, or, if in flight, can be directed into an automated circling routine 25, outside the controlled volume, using the same sense and avoid systems which have brought it to its current position. When a pilot has been allocated 24, the pilot may send a test transmission 26 to ensure that he is able to communicate with the vehicle, and then takes over control as the vehicle reaches the volume edge. The pilot uses the display 11 in the control centre 6 to determine what obstructions may be in the way of the vehicle that he is remotely piloting and provides suitable inputs 27 to move the vehicle to avoid these, whether by changing height, speed, direction or any other suitable option. The display may be of an image, taking data from cameras, or a representation, such as a radar display showing other radar targets to be avoided. Infra-red imaging may be superimposed to show otherwise invisible targets, such as deer or birds, when operating in poor visibility. At intervals, a check 28 may be made to see if the vehicle has exited the control area. If not, then the remote pilotage continues 29, but once the vehicle has exited the defined control area, the pilot provides a termination communication 30 to allow the vehicle to take over control of its movement from that point.

Although most useful in dealing with aircraft, the invention may also be applied to docking ships in busy harbours, so increasing the number of vessels that a pilot can deal with by avoiding the pilot having to join and leave each individual one. The steps involved are very similar to those described with respect to Fig. 4. The control commands may be sent using wireless communications, without taking up too much of the available spectrum.
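The decision flow of Fig. 4 can be summarised as a small state machine. The state and event names below are invented for illustration; they correspond only loosely to steps 20 to 30 of the flow chart.

```python
# Hypothetical encoding of the Fig. 4 flow: (state, event) -> next state.
TRANSITIONS = {
    ('MONITORING', 'vehicle_detected'): 'QUERY_PILOTAGE',  # steps 20/21
    ('QUERY_PILOTAGE', 'not_required'): 'MONITORING',      # step 22
    ('QUERY_PILOTAGE', 'required'): 'ALLOCATE_PILOT',      # steps 23/24
    ('ALLOCATE_PILOT', 'pilot_available'): 'PILOTING',     # steps 26/27
    ('ALLOCATE_PILOT', 'no_pilot'): 'HOLDING',             # step 25 (circling)
    ('HOLDING', 'pilot_available'): 'PILOTING',
    ('PILOTING', 'exited_volume'): 'MONITORING',           # steps 28-30
}

def next_state(state, event):
    """Advance the pilotage flow; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

# Walk one vehicle through: detected, needs pilotage, holds, then is piloted out.
state = 'MONITORING'
for event in ['vehicle_detected', 'required', 'no_pilot',
              'pilot_available', 'exited_volume']:
    state = next_state(state, event)
```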

The invention may also be used for recreational remote controlled aircraft, for which the basic principles are the same, though possibly over a smaller area. In this case, the control centre 6 may superimpose specific virtual objects which the pilot must avoid whilst flying within the volume 1, as well as, or instead of, using real time images on top of the base view created from the sensor data.

Another application is the use of the system in underwater surveys, as shown in Fig. 5. The plurality of sensors may be provided as towed arrays 31 on multiple vessels 32, 33, positioned in a coordinated manner so that an image of the underwater volume which the arrays define can be built up. An untethered remote controlled underwater vehicle 34 may be operated within the volume, making use of the remotely generated view as seen from its location to enable the vehicle to be directed to move safely within the volume. Alternatively, to cover a smaller area, a single vessel may have a rig with sensors at predetermined locations which can be lowered into an area of interest; the untethered vehicle is then controlled remotely using the remotely generated view as the basis for instructions to move the vehicle and avoid objects, or reach a desired location.




 