

Title:
AERIAL DEVICES, ROTOR ASSEMBLIES FOR AERIAL DEVICES, AND DEVICE FRAMEWORKS AND METHODOLOGIES CONFIGURED TO ENABLE CONTROL OF AERIAL DEVICES
Document Type and Number:
WIPO Patent Application WO/2017/173502
Kind Code:
A1
Abstract:
The present disclosure relates to drone technology, and in particular to aerial drone technology. This includes both drone hardware/hardware configuration, and drone software/software configuration. Embodiments are described by reference to a "selfie drone", being an aerial drone device that is configured to take photos (and/or video) of a user (for example by positioning itself in a defined location relative to the user). However, it will be appreciated that various aspects of technology described herein have wider application.

Inventors:
ZAMMIT ANTHONY (AU)
NAMANN SIVAAN (AU)
LIPKIN EDUARD (AU)
YELLACHICH ALEXANDER (AU)
CASSISI SAM ANTHONY (AU)
LIPSKI MATTHEW (AU)
Application Number:
PCT/AU2017/050307
Publication Date:
October 12, 2017
Filing Date:
April 07, 2017
Assignee:
IOT GROUP TECH PTY LTD (AU)
International Classes:
B64C11/28; B64C27/10; B64C11/48; B64C27/39; B64C27/48; B64C39/02
Domestic Patent References:
WO2014025444A2, 2014-02-13
WO2009005875A2, 2009-01-08
WO2015138217A1, 2015-09-17
WO2016012790A1, 2016-01-28
WO2016077278A1, 2016-05-19
WO2016078056A1, 2016-05-26
Foreign References:
ES2549365A1, 2015-10-27
US20140299708A1, 2014-10-09
US5628620A, 1997-05-13
Attorney, Agent or Firm:
SHELSTON IP PTY LTD (AU)
CLAIMS:

1. A rotor assembly for an aerial device, the rotor assembly including: a first blade assembly, wherein the first blade assembly includes a central blade mounting member securely mounted to a first axial shaft, wherein: the first blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member is mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the first axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the first axial shaft; and each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection; a second blade assembly, wherein the second blade assembly includes a central blade mounting member securely mounted to a second axial shaft, wherein: the second blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member is mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the second axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the second axial shaft; and each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection; wherein the first and second axial shafts are coaxially mounted with respect to one another, thereby to allow individual rotational driving of the shafts and their respective blade assemblies via respective driving mechanisms.

2. A rotor assembly according to claim 1 wherein, in the storage position, the blades of the first and second blade assemblies are enabled to contact with an elongate housing body for the aerial device.

3. A rotor assembly according to claim 1 wherein the limited rotation of the rotor blades relative to the blade supporting members about the vertical axis defined by the secondary hinge connections is configured to reduce the effect of an impact of a rotor blade against an external object during flight.

4. A rotor assembly according to claim 1 wherein a control component is configured to enable control over the pair of blade assemblies, the control component including: a printed circuit board comprising a single electronic speed control (ESC) unit; and two MOSFET components, wherein each MOSFET component is configured to provide direct control signalling to a respective set of two motors; wherein each motor is coupled to a respective drive shaft, the drive shafts each being mounted to respective blade assemblies.

5. An aerial device having a rotor assembly including: a first blade assembly, wherein the first blade assembly includes a central blade mounting member securely mounted to a first axial shaft, wherein: the first blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member is mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the first axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the first axial shaft; and each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection; a second blade assembly, wherein the second blade assembly includes a central blade mounting member securely mounted to a second axial shaft, wherein: the second blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member is mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the second axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the second axial shaft; and each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection; wherein the first and second axial shafts are coaxially mounted with respect to one another, thereby to allow individual rotational driving of the shafts and their respective blade assemblies via respective driving mechanisms.

6. An aerial device according to claim 5 wherein, in the storage position, the blades of the first and second blade assemblies are enabled to contact with an elongate housing body for the aerial device.

7. An aerial device according to claim 5 wherein the limited rotation of the rotor blades relative to the blade supporting members about the vertical axis defined by the secondary hinge connections is configured to reduce the effect of an impact of a rotor blade against an external object during flight.

8. An aerial device according to claim 5 wherein a control component is configured to enable control over the pair of blade assemblies, the control component including: a printed circuit board comprising a single electronic speed control (ESC) unit; and two MOSFET components, wherein each MOSFET component is configured to provide direct control signalling to a respective set of two motors; wherein each motor is coupled to a respective drive shaft, the drive shafts each being mounted to respective blade assemblies.

9. A control component configured to enable control over a pair of rotor assemblies for an aerial device, the control component including: a printed circuit board comprising a single electronic speed control (ESC) unit; and two MOSFET components, wherein each MOSFET component is configured to provide direct control signalling to a respective set of two motors; wherein each motor is coupled to a respective drive shaft, the drive shafts each being mounted to respective rotor assemblies.

Description:
AERIAL DEVICES, ROTOR ASSEMBLIES FOR AERIAL DEVICES, AND DEVICE FRAMEWORKS AND METHODOLOGIES CONFIGURED TO ENABLE CONTROL OF AERIAL DEVICES

FIELD OF THE INVENTION

[0001] The present invention relates to aerial devices, rotor assemblies for aerial devices, and device frameworks and methodologies configured to enable control of aerial devices. While some embodiments will be described herein with particular reference to that application, it will be appreciated that the invention is not limited to such a field of use, and is applicable in broader contexts.

BACKGROUND

[0002] Any discussion of the background art throughout the specification should in no way be considered as an admission that such art is widely known or forms part of common general knowledge in the field.

[0003] Aerial drones have become increasingly popular in recent times. In particular, there has been a dramatic increase in the availability of consumer-level aerial drone devices, which are used primarily for photography purposes (including videography). As part of this, drone control algorithms continue to advance, for example allowing smart autonomous control thereby to enable photography-related flight control with minimal user input. For example, some drones are configured to adopt predefined flight paths, and/or move in a defined manner relative to a defined object (for example a trackable tether device). This, combined with a desire to miniaturize drones for personal use, leads to a range of technical challenges.

SUMMARY OF THE INVENTION

[0004] It is an object of the present invention to overcome or ameliorate at least one of the disadvantages of the prior art, or to provide a useful alternative.

[0005] One embodiment provides a rotor assembly for an aerial device, the rotor assembly including:

[0006] a first blade assembly, wherein the first blade assembly includes a central blade mounting member securely mounted to a first axial shaft, wherein:

[0007] the first blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member is mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the first axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the first axial shaft; and

[0008] each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection;

[0009] a second blade assembly, wherein the second blade assembly includes a central blade mounting member securely mounted to a second axial shaft, wherein:

[0010] the second blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member is mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the second axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the second axial shaft; and

[0011] each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection;

[0012] wherein the first and second axial shafts are coaxially mounted with respect to one another, thereby to allow individual rotational driving of the shafts and their respective blade assemblies via respective driving mechanisms.

[0013] One embodiment provides a rotor assembly for an aerial device wherein, in the storage position, the blades of the first and second blade assemblies are enabled to contact with an elongate housing body for the aerial device.

[0014] One embodiment provides a rotor assembly for an aerial device wherein the limited rotation of the rotor blades relative to the blade supporting members about the vertical axis defined by the secondary hinge connections is configured to reduce the effect of an impact of a rotor blade against an external object during flight.

[0015] One embodiment provides a rotor assembly for an aerial device wherein a control component is configured to enable control over the pair of blade assemblies, the control component including:

[0016] a printed circuit board comprising a single electronic speed control (ESC) unit; and

[0017] two MOSFET components, wherein each MOSFET component is configured to provide direct control signalling to a respective set of two motors;

[0018] wherein each motor is coupled to a respective drive shaft, the drive shafts each being mounted to respective blade assemblies.

[0019] One embodiment provides an aerial device having a rotor assembly including:

[0020] a first blade assembly, wherein the first blade assembly includes a central blade mounting member securely mounted to a first axial shaft, wherein:

[0021] the first blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member is mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the first axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the first axial shaft; and

[0022] each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection;

[0023] a second blade assembly, wherein the second blade assembly includes a central blade mounting member securely mounted to a second axial shaft, wherein:

[0024] the second blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member is mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the second axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the second axial shaft; and

[0025] each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection;

[0026] wherein the first and second axial shafts are coaxially mounted with respect to one another, thereby to allow individual rotational driving of the shafts and their respective blade assemblies via respective driving mechanisms.

[0027] One embodiment provides an aerial device wherein, in the storage position, the blades of the first and second blade assemblies are enabled to contact with an elongate housing body for the aerial device.

[0028] One embodiment provides an aerial device wherein the limited rotation of the rotor blades relative to the blade supporting members about the vertical axis defined by the secondary hinge connections is configured to reduce the effect of an impact of a rotor blade against an external object during flight.

[0029] One embodiment provides an aerial device wherein a control component is configured to enable control over the pair of blade assemblies, the control component including:

[0030] a printed circuit board comprising a single electronic speed control (ESC) unit; and

[0031] two MOSFET components, wherein each MOSFET component is configured to provide direct control signalling to a respective set of two motors;

[0032] wherein each motor is coupled to a respective drive shaft, the drive shafts each being mounted to respective blade assemblies.

[0033] One embodiment provides a control component configured to enable control over a pair of rotor assemblies for an aerial device, the control component including:

[0034] a printed circuit board comprising a single electronic speed control (ESC) unit; and

[0035] two MOSFET components, wherein each MOSFET component is configured to provide direct control signalling to a respective set of two motors;

[0036] wherein each motor is coupled to a respective drive shaft, the drive shafts each being mounted to respective rotor assemblies.
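By way of illustration only, the control-component layout above (a single ESC on one printed circuit board, with two MOSFET channels each switching a respective set of two motors on a respective drive shaft) might be modelled as follows. This Python sketch is not taken from the specification; the class, method and channel names are illustrative assumptions:

```python
class MotorController:
    """Illustrative model of one ESC driving two MOSFET channels,
    each channel powering the motors of one coaxial rotor assembly."""

    def __init__(self):
        # Duty cycle per MOSFET channel: 0.0 (off) .. 1.0 (full power).
        self.channels = {"upper_shaft": 0.0, "lower_shaft": 0.0}

    def set_thrust(self, channel, duty):
        """Set the switching duty cycle for one MOSFET channel."""
        if not 0.0 <= duty <= 1.0:
            raise ValueError("duty cycle must be within [0, 1]")
        self.channels[channel] = duty

    def yaw_torque(self):
        """With counter-rotating coaxial rotors, net yaw torque is roughly
        proportional to the difference between the two channels."""
        return self.channels["upper_shaft"] - self.channels["lower_shaft"]
```

Driving both channels equally varies total lift, while driving them unequally produces a net yaw torque, which is how a coaxial layout can be steered without a tail rotor.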

[0037] One embodiment provides a computer implemented method for controlling an aerial drone system thereby to autonomously follow a human face, the method including:

[0038] operating the aerial drone system to capture image data via an RGB camera;

[0039] processing the image data thereby to identify a target region predicted to contain a human face, wherein the processing includes:

[0040] (a) comparing captured image data with predefined stored images contained in an image library, wherein the image library includes data representative of predefined image data for faces and/or facial features;

[0041] (b) based on the comparing, identifying in the captured image data the target region predicted to contain the human face;

[0042] extracting from the captured image data one or more reference images representative of the human face;

[0043] operating a tracking and control algorithm that is configured to:

[0044] (i) based on comparison of a captured frame of image data to the one or more reference images, determine error data;

[0045] (ii) based on the error data, define a control instruction based on a predefined control protocol, wherein the control instruction is configured to apply a predefined degree of three-dimensional movement to the aerial drone thereby to reduce an error defined by the error data; and

[0046] (iii) periodically repeat (i) and (ii), thereby to continuously apply error reduction.
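Steps (i) to (iii) describe a conventional closed-loop error-reduction scheme. A minimal Python sketch of one iteration is given below; it assumes a simple proportional control law, and all names, data shapes and gain values are illustrative rather than taken from the specification:

```python
from dataclasses import dataclass

@dataclass
class Error:
    dx: float      # horizontal offset of face centre from frame centre (pixels)
    dy: float      # vertical offset (pixels)
    dscale: float  # change in apparent face size, a proxy for distance

def compute_error(face_box, frame_w, frame_h, ref_size):
    """Step (i): compare the detected face in the current frame against
    the reference, yielding offset ("error") data."""
    x, y, w, h = face_box
    cx, cy = x + w / 2, y + h / 2
    return Error(cx - frame_w / 2, cy - frame_h / 2, w - ref_size)

def control_instruction(err, k=0.002):
    """Step (ii): a simple proportional law mapping error to a small
    three-dimensional movement that reduces the error."""
    return {
        "yaw":      -k * err.dx,      # turn toward the face horizontally
        "altitude": -k * err.dy,      # climb/descend to centre vertically
        "forward":  -k * err.dscale,  # move to restore apparent face size
    }
```

Repeating these two functions on each captured frame (step (iii)) continuously drives the offsets toward zero, keeping the face centred and at a constant apparent size.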

[0047] One embodiment provides a method according to claim 10 including configuring the tracking and control algorithm to:

[0048] (i) cause drone control instructions to be implemented in respect of a first set of facial movement conditions; and

[0049] (ii) not cause drone control instructions to be implemented in respect of a second set of facial movement conditions.

[0050] One embodiment provides a method wherein the first and second sets of facial movement conditions are defined to enable a tracked face to adopt a facial pose without affecting aerial drone position.

[0051] One embodiment provides a method wherein the first and second sets of facial movement conditions are temporally variable, such that at least one facial movement condition is transitioned from the first set to the second set during a defined time period.

[0052] One embodiment provides a method wherein the first and second sets of facial movement conditions include one or more of the following movement conditions: horizontal movement conditions; vertical axis facial rotation conditions; face-normal horizontal axis facial rotation conditions; and face parallel horizontal axis facial rotation conditions.

[0053] One embodiment provides a method wherein the tracking and control algorithm is configured with data that correlates each pixel in image data to a distance, thereby to enable tuning of the algorithm to operate with different camera modules.
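The per-pixel distance correlation described above can be illustrated with a pinhole-camera approximation, in which a focal length expressed in pixels is the per-camera-module tuning constant. The function below is an illustrative sketch, not part of the specification:

```python
def pixels_to_metres(pixel_offset, distance_m, focal_px):
    """Convert an offset measured in image pixels to a lateral distance in
    metres at a known subject distance, under a pinhole-camera model.
    focal_px (focal length in pixels) is the constant that would be
    re-tuned when a different camera module is fitted."""
    return pixel_offset * distance_m / focal_px
```

For example, with a focal length of 600 px, a 100 px offset at a subject distance of 1.5 m corresponds to a lateral offset of 0.25 m; swapping in a camera module with a different focal length changes only the constant, not the algorithm.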

[0054] One embodiment provides a method wherein the aerial drone is configured with a modular design, whereby a first camera module is removable and replaceable with a second camera module, wherein the second camera module has different image capture properties.

[0055] One embodiment provides a method wherein the tracking and control algorithm is configured to implement a preliminary positional determination process whereby the aerial drone is assumed to be a defined distance from the human face.

[0056] One embodiment provides a method wherein the defined distance is an arm's length approximation distance.

[0057] One embodiment provides a method including performing a tuning function based on a combination of: the assumed distance; and statistical averages of facial feature layouts.
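The combination of an arm's-length start assumption and statistical facial averages can be sketched as follows under a pinhole-camera model: the assumed start distance and the observed inter-pupillary pixel distance yield an effective focal length, from which distances in later frames are estimated. The 0.6 m arm's-length figure, the function names and the choice of facial feature are illustrative assumptions; the ~63 mm mean inter-pupillary distance is a widely cited anthropometric average:

```python
MEAN_IPD_M = 0.063  # statistical average inter-pupillary distance (metres)

def calibrate_focal_px(ipd_px_at_start, assumed_distance_m=0.6):
    """Preliminary positional determination: assume the drone starts at
    roughly arm's length from the face, and back out an effective focal
    length in pixels from the observed inter-pupillary pixel distance."""
    return ipd_px_at_start * assumed_distance_m / MEAN_IPD_M

def estimate_distance_m(ipd_px, focal_px):
    """Later frames: estimate subject distance from the same feature."""
    return MEAN_IPD_M * focal_px / ipd_px
```

For example, if the inter-pupillary distance in the image halves from 60 px to 30 px, the estimated distance doubles from the assumed 0.6 m to 1.2 m.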

[0058] One embodiment provides a method wherein the error data includes pose offset and/or distance offset data based on a predefined start position.

[0059] One embodiment provides a method wherein the tracking and control algorithm additionally includes: (ii) based on the error data, defining a camera control instruction thereby to apply a predefined degree of movement to a camera positioning module, thereby to reduce an error defined by the error data.

[0060] One embodiment provides a computer implemented method for enabling gesture-driven control over a drone device, the method including:

[0061] receiving data representative of images captured by a camera device of the aerial drone device;

[0062] operating an image processing algorithm that is configured to detect the presence of one or more defined gesture-driven commands, wherein the gesture-driven commands relate to human body movements;

[0063] operating a facial detection and identification algorithm, wherein the facial detection and identification algorithm is configured to: (i) identify the presence of a human face; and (ii) determine whether the human face corresponds to a prescribed known human face; and

[0064] performing analysis thereby to determine whether a detected gesture-driven command corresponds to a prescribed known human face; and

[0065] in the case that a detected gesture-driven command corresponds to a prescribed known human face, implementing a control instruction in respect of the drone device corresponding to the detected gesture-driven command.
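The gating described above (acting on a gesture only when it can be attributed to a prescribed known face) might be sketched as follows. The function names and data shapes are illustrative assumptions, not part of the specification:

```python
def handle_frame(gestures, faces, known_faces, execute):
    """Act on a detected gesture command only when the face it is
    attributed to has been identified as a prescribed known face.
    gestures: list of {"command": str, "owner_face_id": int}
    faces: {face_id: {"identity": str}} from the detection/identification step
    known_faces: set of prescribed identities
    execute: callback implementing the drone control instruction."""
    for gesture in gestures:
        face = faces.get(gesture["owner_face_id"])
        if face is not None and face["identity"] in known_faces:
            execute(gesture["command"])
```

A gesture detected near an unknown face, or with no accompanying face, is simply ignored, which prevents bystanders from commanding the drone.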

[0066] One embodiment provides a method for enabling gesture-driven control over a drone device, the method including:

[0067] receiving data representative of images captured by a camera device of the aerial drone device;

[0068] presenting the images on a touchscreen display;

[0069] receiving data representative of a touch command at a location on the touchscreen display;

[0070] identifying a trackable object in the images corresponding to the location on the touchscreen display at a time corresponding to the touch command; and

[0071] configuring the drone to perform an object tracking method for the identified trackable object, wherein the object tracking method provides input to a drone trajectory control system.
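The touch-to-track steps above reduce to a hit test: the tap location is matched against the bounding boxes of objects detected in the frame shown at the time of the tap. An illustrative Python sketch (names and data shapes assumed) follows:

```python
def select_trackable(touch_xy, detections):
    """Map a touchscreen tap to the detected object whose bounding box
    contains the touch point in the frame shown at the time of the tap.
    detections: list of {"id": ..., "box": (x, y, w, h)} in screen
    coordinates. Returns the object's id, or None when the tap hits
    no detection."""
    tx, ty = touch_xy
    for det in detections:
        x, y, w, h = det["box"]
        if x <= tx <= x + w and y <= ty <= y + h:
            return det["id"]
    return None
```

The returned identifier would then seed the object tracking method that provides input to the drone trajectory control system.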

[0072] One embodiment provides a method for enabling user control over a drone device, the method including:

[0073] executing a control application at a handheld device, wherein the handheld device includes an IMU module;

[0074] processing data representative of handheld device motion derived from the IMU module, thereby to define a control instruction;

[0075] based on the control instruction, causing implementation of a variation to either or both of a pair of motors of the drone device, wherein the motors are respectively connected to rotor blade assemblies.
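The IMU-driven control path above might be sketched as a mapping from handheld-device tilt to speed variations for the two motors. This is a crude illustrative sketch; the sign conventions, gain and function name are assumptions, not part of the specification:

```python
def imu_to_motor_deltas(roll_deg, pitch_deg, gain=0.01):
    """Map handheld-device tilt (derived from its IMU) to speed variations
    for the drone's pair of motors, each driving one coaxial rotor
    assembly. Pitch varies common thrust; roll varies the differential
    between the counter-rotating rotors (and hence yaw)."""
    common = -gain * pitch_deg        # tilt forward -> increase thrust
    differential = gain * roll_deg    # tilt sideways -> yaw
    return (common + differential, common - differential)
```

Tilting the handset forward raises both motor speeds equally (more thrust), while tilting it sideways changes only the differential between the two counter-rotating rotors, producing yaw without changing net lift.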

[0076] One embodiment provides a method wherein the motors are each connected to a respective one of a pair of coaxially mounted drive shafts.

[0077] One embodiment provides a method including displaying, via a display screen of the mobile device, a live video feed derived from an image capture device mounted to the drone device.

[0078] One embodiment provides a method wherein left and right based movements of the handheld device are mirrored in terms of their resulting control instructions.

[0079] One embodiment provides a method including implementing two modes of operation:

[0080] a first mode of operation wherein left and right based movements of the handheld device are mirrored in terms of their resulting control instructions; and

[0081] a second mode of operation wherein left and right based movements of the handheld device are not mirrored in terms of their resulting control instructions.
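The two modes reduce to a sign convention on lateral commands, as in the illustrative sketch below (the function name and sign convention are assumptions):

```python
def map_lateral_command(device_roll_deg, mirrored):
    """Map a left/right handheld-device movement to a lateral drone
    command under the two claimed modes: in mirrored mode (user facing
    the drone) the sign is flipped so a rightward tilt moves the drone
    to the user's right; in non-mirrored mode the sign is preserved."""
    return -device_roll_deg if mirrored else device_roll_deg
```

Mirrored mode suits a user facing the drone directly, whereas the non-mirrored mode suits flying from the drone's point of view via the live video feed described in the embodiment above.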

[0082] One embodiment provides a method wherein, in the second mode of operation, the method includes displaying, via a display screen of the mobile device, a live video feed derived from an image capture device mounted to the drone device.

[0083] One embodiment provides a computer program product for performing a method as described herein.

[0084] One embodiment provides a non-transitory carrier medium for carrying computer executable code that, when executed on a processor, causes the processor to perform a method as described herein.

[0085] One embodiment provides a system configured for performing a method as described herein.

[0086] Reference throughout this specification to "one embodiment", "some embodiments" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment", "in some embodiments" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment, but may. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.

[0087] As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.

[0088] In the claims below and the description herein, any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others. Thus, the term comprising, when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter. For example, the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B. Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.

[0089] As used herein, the term "exemplary" is used in the sense of providing examples, as opposed to indicating quality. That is, an "exemplary embodiment" is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality.

BRIEF DESCRIPTION OF THE DRAWINGS

[0090] Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:

[0091] FIG. 1A to FIG. 1N illustrate aerial drones and rotor configurations according to embodiments.

[0092] FIG. 2A, FIG. 2B and FIG. 2C schematically illustrate modular hardware configurations according to embodiments.

[0093] FIG. 3 illustrates a communications arrangement according to an embodiment.

[0094] FIG. 4 illustrates component connections according to one embodiment.

[0095] FIG. 5A illustrates a method according to one embodiment.

[0096] FIG. 5B illustrates a method according to one embodiment.

[0097] FIG. 6 illustrates an exemplary motor controller board.

DETAILED DESCRIPTION

[0098] The present disclosure relates to drone technology, and in particular to aerial drone technology. This includes both drone hardware/hardware configuration, and drone software/software configuration. Embodiments are described by reference to a "selfie drone", being an aerial drone device that is configured to take photos (and/or video) of a user (for example by positioning itself in a defined location relative to the user). However, it will be appreciated that various aspects of technology described herein have wider application.

Overview

[0099] FIG. 1A to FIG. 1N illustrate aerial drone devices according to embodiments, referred to collectively as "drone 100". Various aspects of drone technology are described by reference to the example of drone 100, and/or variations thereof. It will be appreciated that various technological aspects described herein, including the likes of tracking/control functionalities, physical configurations, and the like, are applicable across a wide range of aerial and other drone devices, and as such are not in any way limited to the specific example of drone 100.

[00100] Drone 100 includes a body 101, which contains a plurality of modular components. In general terms, the components include:

• A motor and drive control assembly 110. Assembly 110 includes a pair of motors, configured to drive a respective pair of rotor drive shafts, which are connected to rotor blade assemblies 160 and 170. These rotor blade assemblies are mounted in a vertically-spaced configuration on coaxially mounted drive shafts. The rotor blade assemblies are oppositely configured, thereby to enable control over drone aerial positioning by way of coordinated driving of the two motors (preferably in combination with control over blade rotation axis, for example via one or more servomotors).

• A camera module 120. The camera module includes a camera (i.e. a digital image capture device) mounted to a positioning assembly that is optionally variable via one or more powered servomotors. In the illustrated embodiment the servomotors enable control over rotation about a horizontal axis, thereby to enable up-down positioning of camera point of view via such rotation. In further embodiments additional degrees of freedom may be incorporated into the camera module, thereby to enhance PTZ capabilities. In some embodiments this additionally provides stabilization to a video feed. In further embodiments the camera module provides a dedicated stabilization system. In the illustrated embodiment the camera module is mounted to body 101 via dampers 121, which are configured to reduce the impact of drone vibrations (and other movements) on the camera module, to enhance the quality of captured images and/or video.

• One or more circuit boards 130. These provide processing capabilities to drone 100, in particular enabling algorithms which process input from the camera module (and optionally other input sources) thereby to define output used in the context of defining control instructions to drive the motors (although in some embodiments processing is shared with a tethered device, such as a smartphone). For example, as described in more detail further below, in some embodiments this enables drone control based on facial recognition and facial tracking. The circuit board(s) also provide additional functionalities, including one or more IMUs (thereby to derive input regarding drone acceleration and orientation) and a WiFi module (thereby to enable interaction with one or more control devices, such as smartphones executing control software applications). In some embodiments the circuit boards additionally include one or more of: GPS for positioning outdoors; a barometer for altitude calculations; and an optical flow sensor with an ultrasonic sensor for positioning indoors.

• A battery module 140. In the illustrated embodiment the battery module is configured to enable external access to a rechargeable battery component, such that the battery component is able to be removed and replaced in a quick and convenient manner.
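The coordinated driving of the two oppositely-configured rotors described in the motor assembly bullet above lends itself to simple differential mixing for yaw control. The following is a minimal sketch of that idea; the function name, the linear mixing law, and the command ranges are illustrative assumptions rather than the patented control scheme.

```python
def mix_coaxial(throttle, yaw):
    """Map a throttle command (0..1) and a yaw command (-1..1) into
    speed commands for two counter-rotating coaxial rotors.

    Because the rotors spin in opposite directions, slowing one while
    speeding the other changes net torque (yaw) while total lift is
    roughly preserved. The linear mixing here is an assumed example.
    """
    upper = throttle * (1.0 - 0.5 * yaw)
    lower = throttle * (1.0 + 0.5 * yaw)
    # Clamp each command to the valid range [0, 1].
    clamp = lambda v: max(0.0, min(1.0, v))
    return clamp(upper), clamp(lower)
```

With zero yaw both rotors receive equal commands; a positive yaw command slows the upper rotor and speeds the lower one, producing a net reaction torque on the body.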

[00101] Body 101 provides a housing, which is able to be opened thereby to provide access to internal components. This, combined with the modular configuration (which is discussed in more detail further below), enables convenient removal and replacement of various components. For example, this may include replacement for maintenance issues and/or modular upgrades.

[00102] Body 101 also provides one or more user input devices, in the present embodiment including a primary button 180. Button 180 is in some embodiments configured to enable multiple forms of user input, for example "on", "off" and "fly". In some embodiments the "on" and "fly" commands are synonymous, in that upon activating drone 100 it adopts a default autonomous fly mode.

[00103] The default autonomous mode may be a stable hover mode, in which the drone is configured to maintain a position substantially the same as a position in which it is released (preferably by a manual hand release). In some cases, the default mode includes implementation of an image-processing based tracking mode, for example a "follow-me" mode. In this regard, a "follow-me" mode based on facial recognition is described in more detail further below. In overview, drone 100 is configured to initially identify a human face (preferably a face belonging to a human user who releases drone 100 from an extended arm with the camera pointing in their direction), and implement a tracking and control algorithm which tracks the position and/or pose of the human face (in some cases with defined restraints) and controls the drone thereby to maintain a defined orientation and/or position with respect to that human face.

[00104] In some embodiments the default mode is controlled by a user, for example by settings defined in a mobile application. However, it will be appreciated that a mobile device connection is not necessary for various functionalities of drone 100, including facial tracking and image capture. Specifically, drone 100 need not be tethered to any form of electronic device for the purpose of tracking; the tracking is based on image processing as opposed to identification of relative position of an electronic tethered device.

Example Rotor Assembly

[00105] As noted, drone 100 is able to fly due to a rotor assembly including a pair of rotor blade assemblies mounted to co-axial drive shafts. The configuration of this rotor assembly has been primarily developed to provide a compact drone, which is well-suited to consumer-level applications (for instance in terms of safety).

[00106] The illustrated rotor assembly includes a first blade assembly, being an upper blade assembly 160, and a second blade assembly, being a lower blade assembly 170. Components defining blade assemblies 160 and 170 are generally similar, particularly in terms of a dual-axis hinge arrangement (described below). However, it will be appreciated that rotor blades are mirrored between the assemblies, such that the assemblies are configured to rotate in opposite directions for the purposes of flight. The rotor assemblies are provided on drive shafts that are coaxially mounted with respect to one another, thereby to allow individual rotational driving of the shafts and their respective blade assemblies via respective driving mechanisms.

[00107] Blade assembly 160 includes a central blade mounting member 161 securely mounted to a first axial shaft 162. Shaft 162 is mounted to a drive motor provided by motor and drive control assembly 110. A plurality of blade supporting members 163 are mounted at spaced apart peripheral locations to the central blade mounting member (in the illustrated example there are two diametrically opposed blade supporting members separated by 180 degrees; in other embodiments there may be three or more). Each blade supporting member is mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection. This enables upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the first axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the first axial shaft. In some embodiments the blade supporting members are configured to lock into place in the operational position (for example via an over-centring arrangement, snap lock formations, or the like). It will be appreciated that the upward and outward rotation occurs automatically in response to forces caused by rotation of the drive shaft. However, in some embodiments the drone is configured (or intended) to be used in a manner whereby the upward and outward rotation into a locked operational position is performed by a user prior to motor activation.

[00108] Each blade supporting member is mounted to a respective rotor blade 165 via a respective secondary hinge connection. The secondary hinge connection allows limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection. It will be appreciated that, in response to forces caused by rotation of the drive shaft during flight, the blades remain diametrically opposed at the centre of the hinge rotation. However, upon a collision, the rotation serves to reduce the moment of impact. This is particularly relevant in the context of providing a degree of safety to a consumer-level proximity drone as described herein. Moreover, the use of a dual horizontal and vertical hinge arrangement as described and illustrated here allows for a drone that is both compact and low-hazard.

[00109] As noted, blade assembly 170 is generally similar to blade assembly 160 in terms of a dual horizontal and vertical hinge arrangement. In some embodiments, blade assembly 170 includes additional rotational joints, thereby to enable adjustment of the alignment of the axis of blade rotation (for example about one or two axes). This is in some embodiments controlled by a pair of servomotors, mounted within the central blade mounting member. In some examples, the blade mounting member 171 is mounted to the drive shaft by a pair of axles (which mount to a connection block 171a), upon which the servomotors act (as best shown in FIG. 1H to FIG. 1N). This allows limited controllable rotation of the central blade mounting member about two orthogonal horizontal axes, allowing advanced directional control and/or stabilization of the aerial drone. It will be observed that some diagrams, including FIG. 1F, show an alternate arrangement for members 163.

[001 10] It will be appreciated that other approaches to enabling controlled adjustment of rotor blade assembly rotational axis are used in further embodiments. These may act on either or both of the upper and lower assemblies.

Example Modular Configuration

[00111] FIG. 2A schematically illustrates a modular configuration according to one embodiment. For example, this modular configuration is implemented in respect of drone 100.

[00112] In this embodiment, the drone includes three discrete modules (modules 200, 210 and 230), which are able to be removed and replaced at a consumer level. For example, the drone body provides a communications infrastructure that allows each module to be, in essence, un-plugged and plugged-in without a need to perform electrical wiring or the like. This preferably enables replacement of modules by end-users, or by technical personnel, with minimal effort. Such replacement of modules allows for convenient repair of devices, and in some cases upgrading of devices (for example to improve control/tracking capabilities, camera quality, and so on).

[00113] Module 200 includes a GPS module 201, drive modules 202, and a motor controller 203. GPS module 201 is configured to calculate GPS position data, which is transmitted to a flight controller module thereby to enable waypoint navigation and stability. Drive modules 202 include components that are configured to convert electrical energy into mechanical energy, thereby to cause the drone to fly. This includes all of the shafts, gears, motors, etc. Motor controller 203 includes a circuit board which provides current and voltage monitoring and drives the main rotor motors. In a preferred embodiment, this board includes two separate MOSFET devices, which are each configured to control a respective motor, thereby to effect control over a dual rotor assembly drone such as drone 100.

[00114] Module 210 includes a battery module 211, a main controller 212, a flight controller 213, and landing gear 214. Battery module 211 provides power to fly the drone, and preferably is configured to retain a removable battery unit (that is, a battery unit is replaceable without replacing the entire battery module, which includes battery connection contacts and the like). Main controller 212 acts as a high-level control system managing interactions between a user and the flight controller. In this regard, it includes a WiFi module which provides a WiFi hotspot for connectivity to a user (i.e. a user connects to that hotspot thereby to control and otherwise interact with the drone). In further embodiments alternate communications protocols are used. Main controller 212 additionally provides vision system processing configured for detecting and tracking targets (as discussed in additional detail further below).

[00115] FIG. 2B illustrates a variation of module 210, wherein battery module 211 and main controller 212 are vertically elongate and positioned horizontally adjacent one another, rather than being stacked vertically as shown in FIG. 2A.

[00116] Flight controller 213 includes a circuit board containing an IMU. In this regard, it is configured to manage stable flight of the drone using on-board sensor data from the IMU, optionally in combination with data derived from GPS module 201, and/or distance and motion sensors based on vision system processing provided by main controller 212. In the present embodiment, flight controller 213 is additionally configured to navigate waypoints based on GPS data.

[00117] Landing gear 214 provides a mechanical system that allows the drone to land on a flat surface, and yet still pack compactly. It is preferably electrically released. In some embodiments the landing gear is omitted, in favour of a limited hand-launched and hand-landed configuration.

[00118] In relation to module 230, a camera module 231 includes a digital camera mounted on a tilt-angling servomotor. Various other forms of controllable camera mounts may also be used. Camera module 231 preferably allows replacement of camera components, including the likes of lenses and sensors.

[00119] FIG. 2C illustrates inter-module connections according to one embodiment:

• Connector 251 is a low current connector, which provides GPS serial and power cables via 5-pin connectors at each of GPS module 201 and flight controller 213.

• Connector 252 is a high current connector between drive module 202 and motor controller 203 (which in this embodiment includes two servo motors and two drive motors). At the drive module end it provides 10 pins, which terminate at two-pin connections at each drive motor, and three-pin connections at each servo motor.

• Connector 253 is a high current connector between the motor controller and the battery module, thereby to provide power to the motor controller (via a 2-pin connector arrangement).

• Connector 254 is a low current connector between main controller 212 and flight controller 213. This provides a serial connection between the main controller and flight controller for data transfer, in this embodiment via a two-pin connection.

• Connector 255 is a low current connector, providing high speed hardware interfacing (e.g. a CSI port) using a ribbon cable. In this embodiment a 16-pin connection is used, for example a 15mm by 1mm ribbon cable.

• Connector 256 is a low current connector (for example 5V, 2A), from the motor controller to the main controller, configured for providing power to the main controller (via a 2-pin connector arrangement).

• Connector 257 is a low current connector between the main controller and the flight controller, enabling motor driver serial communications thereby to facilitate drone control. A 6-pin connector arrangement is used in this embodiment.

• Connector 258 is a low current connector configured to provide WiFi capabilities.

• Connector 259 is a low current connector between the flight controller and the camera module, configured to provide power to the camera module and transmit drive signals to the tilt servo drive provided by the camera module. A 3-pin arrangement is used in this embodiment.

[00120] A key design principle of the above modular design is to incorporate tracking and control algorithms in an on-board manner, thereby to simplify the overall hardware make-up and enable reductions in device costs.

Example Motor Controller Board

[00121] FIG. 6 schematically illustrates a motor controller board according to one embodiment. This shows a microcontroller coupled to a single MOSFET gate driver, which is responsible for controlling two motors. In this embodiment, components are selected as follows:

• MOSFET: Low Rds(on) enables low conduction losses, negating the need for a heatsink. The Rds(on) is a balance between cost and heat generation under load. The MOSFET should be sized so that cost is minimised while keeping board temperature under load below 85°C.

• Freewheeling diode: Diode with low forward voltage drop for high efficiency and an average current capacity of approximately 5A.

• 5V regulator: Switching regulator with up to 4A of current output.

• Gate Driver: Minimises switching losses with the MOSFETs.

[00122] In relation to current measurement, high side current sensing is applied to minimise EMI issues compared to low side sensing. A shunt resistor with a power rating of 3W handles the current, and a high side current shunt amplifier is provided for compatibility with the ADC of the microcontroller. In relation to voltage measurement, a resistor divider is used, with its maximum output set with an extra margin below the maximum ADC input voltage to allow for voltage spikes. Alternately, a rail-to-rail voltage divider with a TVS diode is applied to handle voltage spikes.
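The resistor-divider sizing rule described above (maximum output set with a margin below the maximum ADC input voltage) can be illustrated numerically. The battery voltage, ADC reference, margin, and resistor values below are assumed example figures, not taken from the patent.

```python
def divider_output(v_in, r_top, r_bottom):
    """Output voltage of a simple resistor divider."""
    return v_in * r_bottom / (r_top + r_bottom)

# Assumed example values: a 2S LiPo peaking at 8.4 V, a 3.3 V ADC
# reference, and a 15% margin reserved for voltage spikes.
V_BATT_MAX = 8.4
V_ADC_MAX = 3.3
MARGIN = 0.15

v_target = V_ADC_MAX * (1.0 - MARGIN)  # divider output at peak battery voltage
ratio = v_target / V_BATT_MAX          # required r_bottom / (r_top + r_bottom)

# One standard-value pair whose ratio sits just under the requirement:
r_top, r_bottom = 20_000, 10_000       # 20k/10k gives a ratio of 1/3
assert r_bottom / (r_top + r_bottom) <= ratio
assert divider_output(V_BATT_MAX, r_top, r_bottom) < V_ADC_MAX
```

With these assumed values the divider outputs 2.8 V at peak battery voltage, leaving headroom below the 3.3 V ADC limit for transient spikes.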

Example Communications Arrangement

[00123] FIG. 3 schematically illustrates an example communications arrangement according to an embodiment. This is in some embodiments implemented using the modular configuration of FIG. 2A/2B/2C (although components are labelled via a different schema here).

[00124] In the example of FIG. 3, a drone device 300 (which may be drone device 100 of FIG. 1A) includes a WiFi module 301 (for example on a main controller board). WiFi module 301 enables drone 300 to provide a WiFi hotspot, to which a third party WiFi enabled device can connect. For example, in FIG. 3 this takes the form of an example client device 310, which may take the form of (for instance) a smartphone, tablet, smartwatch, and so on. The client device executes a software application that is configured to interact with drone 300, to provide functionalities including (but not necessarily limited to) providing control instructions to drone 300 and viewing images captured by a camera module of drone 300. In some embodiments a browser-based interface is used as an alternative to a proprietary application (such as an Android or iOS app). In any case, a drone control user interface is rendered on a display screen of device 310 (for example a touchscreen display). Connector 321 represents wireless WiFi communications between drone 300 and device 310.

[00125] A high-level vision/control system 302 provides a central processing hub to interface user instructions (from device 310), input/output from/to a camera module 304 (such as image data which is used for image-based object identification and tracking, and instructions to a camera control servomotor), and output to a flight controller module 303. System 302 preferably also receives additional inputs, for example GPS and/or IMU data (in some cases an IMU is provided on a circuit board that provides system 302). In the illustrated embodiment, communications are achieved as follows:

• Between system 302 and WiFi module 301: serial over I2C interfacing.

• Between system 302 and flight controller 303: serial via UART.

• Between system 302 and camera module 304: a direct hardware interface over CSI.

[00126] This provides a streamlined and efficient mechanism for defining flight controls (which are implemented via motors coupled to the rotor blade assemblies) based on inputs including user inputs from device 310 and sensed inputs derived from the camera module and other modules. For example, hardware-interfaced connections (WiFi and camera) are used for high speed data transfer to minimise latency of the system. In some embodiments the flight controller and a GPS unit also communicate over I2C.

[00127] System 302 also communicates with an external micro SD storage reader over UART for on-board storage of videos and photos.

Example Component Connection Arrangement

[00128] FIG. 4 illustrates an example component connection arrangement, providing an alternate (and more detailed) view of the configuration of hardware components discussed in preceding sections.

Example Selfie Drone Operation

[00129] As noted above, embodiments are described by reference to a "selfie drone", being an aerial drone device that is configured to take photos of a user (for example by positioning itself in a defined location relative to the user). Aspects of tracking and control are discussed in this section, and it will be appreciated that these in some cases have wider application than purely for facial tracking and/or selfie generation. However, in the context of the compact collapsible drone 100 of FIG. 1, selfie generation is a core functionality.

[00130] One embodiment provides a computer implemented method for controlling an aerial drone system thereby to autonomously follow a human face. The method includes operating the aerial drone system to capture image data via an RGB camera. The use of a single RGB camera (as opposed to multiple cameras, depth-field cameras, and the like) is relevant in the context of increasing device simplicity and reducing costs. The image data is fed into a main controller module, which is a circuit board with on-board image processing capabilities. The image data preferably includes sequential image frames captured at a frame rate, which may be autonomously or manually defined.

[00131] The method then includes processing the image data thereby to identify a target region predicted to contain a human face. This processing includes:

• Comparing captured image data with predefined stored images contained in an image library, wherein the image library includes data representative of predefined image data for faces and/or facial features. Those skilled in the art will appreciate that there are various publicly available technologies, along with associated image libraries, which provide this functionality in an effective manner (for example via OpenCV).

• Based on the comparing, identifying in the captured image data a target region predicted to contain the human face.

• Extracting from the captured image data one or more reference images representative of the human face.

[00132] In relation to the latter step, the image processing that takes place is in some embodiments based on an open source library (for example derived from OpenCV). However, rather than just utilising general object detection processing steps and a machine learning approach to facial detection, the present vision system utilises an adaptive approach to face detection. In particular, the adaptive method allows for reference images of the target to be taken and stored during initialisation, thereby to create a more confident vision system tracking algorithm. Thus, rather than just utilising a large library of stored reference images for facial features, this system integrates with reference images developed for each target, enabling more reliable tracking of the actual face being considered.
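The adaptive reference-image approach described above can be sketched structurally as follows. The class and its interfaces are illustrative assumptions; in practice the detector callable would be e.g. an OpenCV cascade detector, and the stored crops would feed downstream tracking functions.

```python
class AdaptiveFaceTracker:
    """Sketch of the adaptive approach: a generic detector finds a
    candidate face region, and reference crops of that specific face
    are accumulated during initialisation, thereby to make later
    tracking of that particular face more confident. This is an
    illustrative structure, not the patented implementation.
    """

    def __init__(self, detector, max_references=10):
        self.detector = detector        # frame -> (x, y, w, h) or None
        self.references = []            # stored crops of the target face
        self.max_references = max_references

    def initialise(self, frame):
        """Detect a face in the frame and, if found, store a reference
        crop of it (frame modelled here as a list of pixel rows)."""
        region = self.detector(frame)
        if region is None:
            return None
        x, y, w, h = region
        crop = [row[x:x + w] for row in frame[y:y + h]]
        if len(self.references) < self.max_references:
            self.references.append(crop)
        return region
```

For example, calling `initialise` repeatedly during the start-up "follow-me" phase accumulates per-target reference crops, which a matcher can then prefer over a generic feature library.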

[00133] In a preferred embodiment, the process starts up in a primary "follow-me" mode, in which reference images of the target are gathered to optimise the vision system. This occurs at a defined start pose and distance. For example, the start pose may be defined as a frontal view by default, and/or as a user selected pose (selected by facial positioning), such as a partial profile view or the like. The distance is in some cases a known approximate distance, for example an "arm's length" approximation (such as 0.5m) based upon a distance at which the drone is released from a user's hand and begins to fly.

[00134] A first tuning function is configured to correlate the distance that each pixel covers in a captured image. This enables tuning of each camera module used with the drone, for calculating error data used for the purpose of defining control signals (discussed below).

[00135] A second tuning function is incorporated during the primary "follow me" mode, whereby the drone is held at arm's length, and the detected face is assumed to be 0.5m away. This is then compared against statistical averages of facial feature layouts (i.e. how far apart eyes are, and how far eyes are from the mouth) for improved facial recognition tuning.
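The two tuning functions above can be sketched together under a simple pinhole-camera assumption: the arm's-length initialisation fixes a known distance, and a statistical facial-feature average (here an assumed 63 mm eye spacing) calibrates how much distance each pixel covers. All constants and function names are illustrative, not taken from the patent.

```python
# Sketch of the two tuning steps under a pinhole-camera assumption.
# The 63 mm average eye spacing and 0.5 m arm's length are assumed
# example constants for illustration.

AVG_EYE_SPACING_M = 0.063   # assumed average interpupillary distance
ARMS_LENGTH_M = 0.5         # "arm's length" initialisation distance

def calibrate_focal_px(eye_spacing_px, known_distance_m=ARMS_LENGTH_M):
    """With the face assumed 0.5 m away at start-up, the measured pixel
    spacing of the eyes yields an effective focal length in pixels,
    i.e. the per-pixel distance scale for this camera module."""
    return eye_spacing_px * known_distance_m / AVG_EYE_SPACING_M

def estimate_distance_m(eye_spacing_px, focal_px):
    """On later frames, invert the model to estimate face distance."""
    return focal_px * AVG_EYE_SPACING_M / eye_spacing_px
```

For example, if the eyes span 63 pixels at arm's length, the calibrated focal length is 500 pixels; when the spacing later shrinks to 31.5 pixels, the estimated distance is 1.0 m, and the difference from the target distance becomes an error term for control.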

[00136] The method then includes continued processing of image data thereby to enable tracking of the human face, and the defining of control instructions to cause movement of the drone and/or camera module (i.e. camera control servomotor) thereby to retain the tracked human face in a defined portion and/or pose relative to captured image data. In this regard, the control system utilises a vision system that is able to detect a target that is assumed to be a human face. Once the target is detected and confirmed, the vision system continues to run several image processing functions to maintain lock onto the target face. As the face moves, the vision system provides error data to the control system, which then responds with an output signal to the drive system to follow the target. This in some embodiments includes operating a tracking and control algorithm that is configured to:

(i) based on comparison of a captured frame of image data to the one or more reference images, determine error data (being an error based on pose and/or distance);

(ii) based on the error data, define a control instruction based on a predefined control protocol, wherein the control instruction is configured to apply a predefined degree of three dimensional movement to the aerial drone thereby to reduce an error defined by the error data; and

(iii) periodically repeat (i) and (ii) thereby to continuously apply error reduction.

[00137] In some embodiments, once a pose and distance error is known between the target face and drone, these values feed into a PID tuning algorithm before being sent to the flight controller as a control command. The flight controller receives a control command as a numerical value, for example a numerical value within a range that represents a position of the joystick controls in a traditional RC controller. This signal is processed further with the addition of stabilisation control commands and further PID tuning before it is output as a voltage to the drive system motors.
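The error-to-command path described in paragraph [00137] can be sketched as a per-axis PID step whose output is mapped into an RC-joystick-style numeric range. The gains, the 1000-2000 command range, and the function names are assumptions for illustration, not values from the patent.

```python
class PID:
    """Minimal PID step for one control axis (e.g. distance error)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        """Advance one control cycle and return the PID output."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None \
            else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def to_stick_command(output, out_min=-1.0, out_max=1.0, lo=1000, hi=2000):
    """Map a PID output into an RC-style numeric command range.
    The 1000-2000 joystick-like range is an assumed example."""
    clipped = max(out_min, min(out_max, output))
    return int(lo + (clipped - out_min) / (out_max - out_min) * (hi - lo))
```

Each control cycle the measured pose/distance error is stepped through the PID, and the result is quantised into the numeric range the flight controller expects, mirroring a joystick position.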

[00138] The approaches described above enable an aerial drone device to identify a human face, and track that human face as it moves, thereby to enable hands-free selfie photography. In some embodiments there are multiple levels of control instructions based on the calculated position and/or pose errors. This is preferably implemented to selectively restrict drone control in response to facial tracking for certain facial movements. By way of example, it allows a person to move their face, for example to adopt different "looks", without causing the drone to re-position. As a practical example, a person may wish to take a "selfie" with their head tilted about a horizontal axis a little to the left or right; restricting control based on tracking is used to prevent the drone from continuously moving to capture a front-on view of the face in such a situation.

[00139] Following on from the preceding paragraph, in some embodiments the drone control method includes configuring the tracking and control algorithm to:

(i) cause drone control instructions to be implemented in respect of a first set of facial movement conditions; and

(ii) not cause drone control instructions to be implemented in respect of a second set of facial movement conditions.

[00140] The first and second sets of facial movement conditions are defined to enable a tracked face to adopt a facial pose without affecting aerial drone position. By way of example, the first and second sets of facial movement conditions include one or more of the following movement conditions: horizontal movement conditions; vertical axis facial rotation conditions; face-normal horizontal axis facial rotation conditions; and face parallel horizontal axis facial rotation conditions (and combinations thereof).

[00141] In some cases, the first and second sets of facial movement conditions are temporally variable, such that at least one facial movement condition is transitioned from the first set to the second set (or vice versa) during a defined time period. This is optionally implemented such that the drone initially adopts a "follow me" mode whereby a face is tracked in a centred neutral pose, and then transitions into a "pose for photo" mode whereby one or more facial movement conditions (such as looking upwards, downwards, or sideways) no longer result in drone control instructions. So, in practice, the user positions the drone to take a photo based on a neutral pose, then instructs the drone to adopt the "pose for photo" mode and is free to move his/her face angle without causing drone movement. In some cases threshold degrees of head movement remain in the first set of facial movement conditions, in effect allowing differentiation between facial tracking and general head position tracking. An example of this is illustrated in the method of FIG. 5A.
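The selective restriction of tracking-driven control described above can be sketched as a gate over per-axis tracking errors; the condition names and the contents of the two sets are illustrative assumptions.

```python
# Sketch: gate which tracked facial-movement errors are allowed to
# produce drone control instructions. Condition names are illustrative.

FOLLOW_ME = {"horizontal", "vertical", "distance", "yaw_pose", "tilt_pose"}
POSE_FOR_PHOTO = {"horizontal", "vertical", "distance"}  # pose changes ignored

def gate_errors(errors, active_set):
    """Zero out errors for movement conditions outside the active set,
    so that, e.g., tilting the head in "pose for photo" mode causes no
    repositioning while gross position is still tracked."""
    return {axis: (err if axis in active_set else 0.0)
            for axis, err in errors.items()}
```

Switching the active set from `FOLLOW_ME` to `POSE_FOR_PHOTO` corresponds to the temporal transition between the two modes: the same error pipeline runs, but pose-related errors simply stop generating control output.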

[00142] In FIG. 5A, a user holds a drone at about arm's length from their body, initiates flight (for example by pressing a button) and releases the drone to fly at 501. The drone then executes a facial detection process, thereby to detect the user's face at 502. Facial tracking is then configured, in some cases including the capture of reference images of the identified face thereby to enhance tracking algorithms at 503. Then, at 504, the drone (e.g. flight and camera position) is controlled in response to facial tracking. This commences based on a first set of constraints (optionally a null set of constraints). For the sake of this example, we shall assume that the control is such that the drone identifies errors in pose and distance offset, and is controlled thereby to keep the tracked face centred in the image frame in a front-on neutral pose at a distance of about 0.5m. Then, at 505, the user provides an instruction to implement a second set of tracking constraints. For example, this may cause the drone to stay in position, or to only track certain forms of movement (such as face height, but not facial pose direction). Tracking then continues at 506 based on that second set of tracking constraints, with images captured at 507. In some embodiments the shift between tracking protocols at 505 is initiated by hand gestures, which may be additionally used to control other aspects of drone position, such as distance, height, and lateral position. In some embodiments such commands are provided by way of a tethered IMU-enabled device.

[00143] In the example of FIG. 5B, an alternate approach for initiating facial tracking is disclosed. In this example, a user initially holds a drone at arm's length at 511. The drone then commences facial detection at 512, and a face is detected at 513. For example, in one embodiment the user holds the drone in one hand as if taking a selfie, and holds a smartphone in the other hand. Once a face is detected, an event occurs thereby to inform the user of successful facial detection. For example, this may include:

• A change in the status of the drone, such as a changing light colour or the like, and/or rotor assemblies accelerating to flight speed.

• Where the user is holding a smartphone, the smartphone beeps and/or vibrates to inform the user that facial detection has been completed, and the user then looks at the smartphone screen to confirm that the correct face has indeed been detected. The user then provides an instruction, which allows initiation of flight based on tracking of that face (see 513).

[00144] At this stage, the drone accelerates rotors to a flight (hover) speed, and is released by the user (preferably a visual or audible signal informs a user that they can release the drone). From there, this example follows blocks 504-507 from FIG. 5A.

[00145] An additional/alternate functionality to restrict facial tracking based control in certain situations is to pre-program known photo poses. For example, the drone is programmed with popular photo pose positions (for a general user case and/or a specific user), and recognises transition into those poses as being separate from control-effecting facial movements. In some embodiments this is used to cause the drone to track a user based on a pose position other than a front-on neutral pose, for example where a user wishes to be photographed from a predefined offset relative to the front-on position.

[00146] In some embodiments processing in the context of facial recognition and tracking is shared between the drone and a connected device (such as a smartphone). For example, in one embodiment software to enable facial recognition and tracking is installed on a smartphone (along with a library of known faces and facial features), and processing at the drone device is focussed upon procuring and delivering video data (i.e. image frames) to the smartphone, with the more complex facial detection and tracking being performed at the smartphone. The smartphone then defines and provides signals which are configured to enable flight control functionalities based on the facial recognition and tracking. The drone remains self-sufficient in terms of other stabilization control and position control functionalities.

Gesture-Driven Control Using Facial Recognition

[00147] Some embodiments implement gesture-driven control, whereby processing algorithms are configured to identify predefined human movements (such as hand movements, arm movements, and the like), and interpret those as control commands (for example "move up", "move down", "come closer", "move back", "take photos", and so on). Preferably, this is implemented in combination with facial detection, such that only human movements belonging to the same person as a tracked face are identified as being gesture-driven controls.

[00148] In one embodiment a method includes receiving data representative of images captured by a camera device of the aerial drone device. The method then includes operating an image processing algorithm that is configured to detect the presence of one or more defined gesture-driven commands, wherein the gesture-driven commands relate to human body movements. A facial detection and identification algorithm is operated, the facial detection and identification algorithm being configured to: (i) identify the presence of a human face; and (ii) determine whether the human face corresponds to a prescribed known human face. The method then includes performing analysis thereby to determine whether a detected gesture-driven command corresponds to a prescribed known human face. In the case that a detected gesture-driven command corresponds to a prescribed known human face, the method includes implementing a control instruction in respect of the drone device corresponding to the detected gesture-driven command.

[00149] Such an approach is especially useful in a situation where multiple moving objects (for example different people) are identifiable in image data, and there is a desire to prevent unauthorised/accidental gesture-driven controls from being recognised.
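
The gating described in the method above may be sketched as follows. The gesture vocabulary, the known-face set, and the `interpret` function are illustrative assumptions; in practice, upstream detectors would supply the (gesture, face) pairs:

```python
KNOWN_FACES = {"owner"}                 # prescribed known faces (assumed)
GESTURE_COMMANDS = {                    # illustrative gesture vocabulary
    "palm_up": "move up", "palm_down": "move down",
    "beckon": "come closer", "push_away": "move back",
}

def interpret(detections):
    """detections: (gesture, face_id) pairs found in the image data.
    Only gestures attributable to a prescribed known face yield commands."""
    return [GESTURE_COMMANDS[g] for g, face_id in detections
            if face_id in KNOWN_FACES and g in GESTURE_COMMANDS]

# A bystander's gesture is ignored; only the known face's gesture is acted on.
cmds = interpret([("palm_up", "bystander"), ("beckon", "owner")])
```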

IMU-Driven Control Using Connected Device

[00150] In some embodiments, a WiFi connected device (such as a smartphone) has an on-board IMU, which is used to deliver flight control commands to the drone device. In particular, the connected device's IMU is used to translate the device's relative motion into a control command. For example, by holding the smartphone or tablet in a landscape orientation and gripping it with thumbs free (similar to a steering wheel), a user is enabled to adjust the angle of the device to send a control command. These control commands are, in a preferred embodiment:

• Tilt device forward = move forward (positive pitch)

• Tilt device backward = move backward (negative pitch)

• Tilt device left = move left (negative roll)

• Tilt device right = move right (positive roll)

• Turn device left = turn left (negative yaw)

• Turn device right = turn right (positive yaw)
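
The mapping in the bullet list above may be sketched as a function from IMU attitude deltas to drone commands. The deadband value and function name are illustrative assumptions; the sign conventions follow the list above:

```python
DEADBAND = 5.0  # degrees of tilt ignored, so a roughly level device sends nothing

def imu_to_commands(pitch, roll, yaw):
    """Translate device attitude (in degrees) into drone commands."""
    commands = []
    if pitch > DEADBAND:
        commands.append("move forward")    # tilt forward = positive pitch
    elif pitch < -DEADBAND:
        commands.append("move backward")   # tilt backward = negative pitch
    if roll > DEADBAND:
        commands.append("move right")      # tilt right = positive roll
    elif roll < -DEADBAND:
        commands.append("move left")       # tilt left = negative roll
    if yaw > DEADBAND:
        commands.append("turn right")      # turn right = positive yaw
    elif yaw < -DEADBAND:
        commands.append("turn left")       # turn left = negative yaw
    return commands

# Tilted forward and to the left: two simultaneous commands.
cmds = imu_to_commands(pitch=12.0, roll=-8.0, yaw=0.0)
```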

[00151] In a preferred embodiment, the connected device is configured to provide a live video feed using image data captured by the drone's camera device. For example, that image data is captured and transmitted over WiFi, optionally using compression and/or frame rate reduction for the purpose of experience optimisation.

[00152] In a further preferred embodiment, the control commands are reversed with respect to those shown above for left and right, thereby to provide intuitive control over a drone device that faces the user (for example when operating as a "selfie drone"). That is, left and right commands are mirrored, such that a tilt to the right instructs the drone to move left.

[00153] In some embodiments the drone device is configured to shift between regular and mirrored command schemas. That is optionally controlled by any one or more of:

• User input indicative of a preferred mode of control.

• A mode of operation, for example a "selfie" mode or a "pilot" mode.

• An autonomous determination of relative position of the user and the drone device.
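
The shift between the regular and mirrored schemas described above might be sketched as follows; the mode names and the `apply_schema` function are illustrative assumptions:

```python
MIRROR = {"move left": "move right", "move right": "move left",
          "turn left": "turn right", "turn right": "turn left"}

def apply_schema(command, mode):
    """Mirror lateral commands in 'selfie' mode (drone facing the user);
    pass commands through unchanged in 'pilot' mode (drone facing away)."""
    return MIRROR.get(command, command) if mode == "selfie" else command

# In selfie mode a tilt to the right instructs the drone to move left.
mirrored = apply_schema("move right", mode="selfie")
```

Forward/backward commands are left unmirrored in this sketch, consistent with the description above, which mirrors only the left and right commands.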

[00154] In some embodiments, an algorithm is configured to translate a degree of IMU-detected motion into a degree of control command responsive to a mode of drone operation. For example, this is in some embodiments implemented as a reduced proportional effect of IMU-initiated commands where a drone is locked onto a tracking target, as opposed to when the drone is being flown freely.
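
This mode-dependent scaling can be sketched as a simple gain applied to the detected motion. The gain values and names below are illustrative assumptions only:

```python
GAINS = {"free_flight": 1.0, "target_locked": 0.25}  # illustrative gain values

def command_magnitude(imu_tilt_degrees, mode):
    """Translate a degree of IMU-detected motion into a degree of command,
    attenuated while the drone is locked onto a tracking target."""
    return imu_tilt_degrees * GAINS[mode]

free = command_magnitude(10.0, "free_flight")     # full response
locked = command_magnitude(10.0, "target_locked")  # reduced response
```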

Conclusions and Interpretation

[00155] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing," "computing," "calculating," "determining," "analyzing" or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities into other data similarly represented as physical quantities.

[00156] In a similar manner, the term "processor" may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory. A "computer" or a "computing machine" or a "computing platform" may include one or more processors.

[00157] The methodologies described herein are, in one embodiment, performable by one or more processors that accept computer-readable (also called machine-readable) code containing a set of instructions that when executed by one or more of the processors carry out at least one of the methods described herein. Any processor capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken is included. Thus, one example is a typical processing system that includes one or more processors. Each processor may include one or more of a CPU, a graphics processing unit, and a programmable DSP unit. The processing system further may include a memory subsystem including main RAM and/or a static RAM, and/or ROM. A bus subsystem may be included for communicating between the components. The processing system further may be a distributed processing system with processors coupled by a network. If the processing system requires a display, such a display may be included, e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT) display. If manual data entry is required, the processing system also includes an input device such as one or more of an alphanumeric input unit such as a keyboard, a pointing control device such as a mouse, and so forth. Input devices may also include audio/video input devices, and/or devices configured to derive information relating to characteristics/attributes of a human user. The term memory unit as used herein, if clear from the context and unless explicitly stated otherwise, also encompasses a storage system such as a disk drive unit. The processing system in some configurations may include a sound output device, and a network interface device. The memory subsystem thus includes a computer-readable carrier medium that carries computer-readable code (e.g., software) including a set of instructions to cause performing, when executed by one or more processors, one or more of the methods described herein.
Note that when the method includes several elements, e.g., several steps, no ordering of such elements is implied, unless specifically stated. The software may reside in the hard disk, or may also reside, completely or at least partially, within the RAM and/or within the processor during execution thereof by the computer system. Thus, the memory and the processor also constitute a computer-readable carrier medium carrying computer-readable code.

[00158] Furthermore, a computer-readable carrier medium may form, or be included in, a computer program product.

[00159] In alternative embodiments, the one or more processors operate as a standalone device or may be connected, e.g., networked to other processor(s), in a networked deployment; the one or more processors may operate in the capacity of a server or a user machine in a server-user network environment, or as a peer machine in a peer-to-peer or distributed network environment. The one or more processors may form a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.

[00160] Note that while diagrams only show a single processor and a single memory that carries the computer-readable code, those in the art will understand that many of the components described above are included, but not explicitly shown or described in order not to obscure the inventive aspect. For example, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.

[00161] Thus, one embodiment of each of the methods described herein is in the form of a computer-readable carrier medium carrying a set of instructions, e.g., a computer program that is for execution on one or more processors, e.g., one or more processors that are part of a web server arrangement. Thus, as will be appreciated by those skilled in the art, embodiments of the present invention may be embodied as a method, an apparatus such as a special purpose apparatus, an apparatus such as a data processing system, or a computer-readable carrier medium, e.g., a computer program product. The computer-readable carrier medium carries computer-readable code including a set of instructions that when executed on one or more processors cause the processor or processors to implement a method. Accordingly, aspects of the present invention may take the form of a method, an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a carrier medium (e.g., a computer program product on a computer-readable storage medium) carrying computer-readable program code embodied in the medium.

[00162] The software may further be transmitted or received over a network via a network interface device. While the carrier medium is shown in an exemplary embodiment to be a single medium, the term "carrier medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "carrier medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by one or more of the processors and that cause the one or more processors to perform any one or more of the methodologies of the present invention. A carrier medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, optical disks, magnetic disks, and magneto-optical disks. Volatile media includes dynamic memory, such as main memory. Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise a bus subsystem. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications. For example, the term "carrier medium" shall accordingly be taken to include, but not be limited to, solid-state memories, a computer product embodied in optical and magnetic media; a medium bearing a propagated signal detectable by at least one processor of one or more processors and representing a set of instructions that, when executed, implement a method; and a transmission medium in a network bearing a propagated signal detectable by at least one processor of the one or more processors and representing the set of instructions.

[00163] It will be understood that the steps of methods discussed are performed in one embodiment by an appropriate processor (or processors) of a processing (i.e., computer) system executing instructions (computer-readable code) stored in storage. It will also be understood that the invention is not limited to any particular implementation or programming technique and that the invention may be implemented using any appropriate techniques for implementing the functionality described herein. The invention is not limited to any particular programming language or operating system.

[00164] It should be appreciated that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, FIG., or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this invention.

[00165] Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention, and form different embodiments, as would be understood by those skilled in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.

[00166] Furthermore, some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a computer system or by other means of carrying out the function. Thus, a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method. Furthermore, an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.

[00167] In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.

[00168] Similarly, it is to be noticed that the term coupled, when used in the claims, should not be interpreted as being limited to direct connections only. The terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means. "Coupled" may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.

[00169] Thus, while there has been described what are believed to be the preferred embodiments of the invention, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention, and it is intended to claim all such changes and modifications as falling within the scope of the invention. For example, any formulas given above are merely representative of procedures that may be used. Functionality may be added or deleted from the block diagrams and operations may be interchanged among functional blocks. Steps may be added or deleted to methods described within the scope of the present invention.