


Title:
A COMPUTER IMPLEMENTED SYSTEM AND METHOD FOR CONTROLLING FLIGHT OF A DRONE HAVING A DRONE CAMERA
Document Type and Number:
WIPO Patent Application WO/2020/161693
Kind Code:
A1
Abstract:
A computer implemented method of controlling flight of a drone having a drone camera, the method including: in response to a user input, defining coordinates of a field of interest to be imaged by the drone camera and of a fly zone adjacent to the field of interest; defining plural portions of the field of interest corresponding to plural activities expected to take place at the plural portions; defining plural drone positions within the fly zone and corresponding plural drone camera orientations corresponding to the plural portions of the field of interest; and flying the drone sequentially to plural ones of the plural drone positions within the fly zone and orienting the drone camera to corresponding ones of the plural drone camera orientations in order to image corresponding ones of the plural activities at the plural portions of the field of interest.

Inventors:
ASSA JACKIE (IL)
LERNER ALON (IL)
STIEFEL YEHUDA (IL)
KUTLIROFF GERSHOM (IL)
Application Number:
PCT/IL2019/050528
Publication Date:
August 13, 2020
Filing Date:
May 08, 2019
Assignee:
CLEARVUZE LTD (IL)
International Classes:
G05D1/02; B64C39/02; G05D1/08; G05D1/10; G05D1/12
Foreign References:
US20140316616A1 (2014-10-23)
US20150350614A1 (2015-12-03)
US20170334559A1 (2017-11-23)
Attorney, Agent or Firm:
COLB, Sanford T. et al. (IL)
Claims:
CLAIMS

1. A computer implemented method of controlling flight of a drone having a drone camera, the method comprising:

in response to a user input, defining coordinates of a field of interest to be imaged by said drone camera and of a fly zone adjacent to said field of interest;

in response to a user input selecting a type of scene to be imaged in said field of interest by said drone camera, defining plural portions of said field of interest corresponding to plural activities expected to take place at said plural portions;

defining plural drone positions within said fly zone and corresponding plural drone camera orientations corresponding to said plural portions of said field of interest; and

flying said drone sequentially to plural ones of said plural drone positions within said fly zone and orienting said drone camera to corresponding ones of said plural drone camera orientations in order to image corresponding ones of said plural activities at said plural portions of said field of interest.
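The four steps recited in claim 1 can be summarised as a minimal control-flow sketch. All names below (Waypoint, plan_mission, fly_mission, and the drone's goto/point_camera interface) are hypothetical illustrations and not part of the claims:

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    position: tuple     # (x, y, z) drone position inside the fly zone
    orientation: tuple  # camera (pitch, yaw) aimed at a field-of-interest portion

def plan_mission(portions, position_for, orientation_for):
    """Steps 2-3: map each field-of-interest portion (where an activity is
    expected) to a drone position in the fly zone and a camera orientation."""
    return [Waypoint(position_for(p), orientation_for(p)) for p in portions]

def fly_mission(drone, waypoints):
    """Step 4: fly sequentially to each position and orient the camera."""
    for wp in waypoints:
        drone.goto(wp.position)
        drone.point_camera(wp.orientation)
```

The position_for and orientation_for callbacks stand in for whatever planner supplies positions and orientations per portion; the claim does not constrain how they are computed.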

2. A computer implemented method of controlling flight of a drone having a drone camera according to claim 1 and wherein said defining plural drone positions within said fly zone and corresponding plural drone camera orientations takes into account cinematographic considerations.

3. A computer implemented method of controlling flight of a drone having a drone camera according to claim 1 or claim 2 and wherein said coordinates include GPS coordinates.

4. A computer implemented method of controlling flight of a drone having a drone camera according to claim 3 and also comprising converting said GPS coordinates to Cartesian coordinates.

5. A computer implemented method of controlling flight of a drone having a drone camera according to any of the preceding claims and wherein said coordinates of said fly zone are GPS coordinates and height.

6. A computer implemented method of controlling flight of a drone having a drone camera according to any of the preceding claims and also comprising obtaining coordinates of a home point outside of said fly zone in response to a user input.

7. A computer implemented method of controlling flight of a drone having a drone camera according to any of the preceding claims and wherein said flying said drone sequentially to plural ones of said plural drone positions within said fly zone and orienting said drone camera to corresponding ones of said plural drone camera orientations comprises selecting one of said plurality of discrete fly zone positions and a rotational orientation to be assumed by said drone camera.

8. A computer implemented method of controlling flight of a drone having a drone camera according to claim 7 and wherein said selecting one of said plurality of discrete fly zone positions and rotational orientations to be assumed by said drone comprises optimizing said rotational orientation of said drone to enable said drone camera to view a user-selected one or more plural discrete two-dimensional portions of said field of interest under optimal cinematographic conditions.

9. A computer implemented method of controlling flight of a drone having a drone camera according to claim 8 and wherein said optimizing said rotational orientation of said drone camera to enable said drone camera to view a user-selected one or more plural discrete two-dimensional portions of said field of interest under optimal cinematographic conditions comprises maximization of an extent of overlap between an image plane of said drone camera and a projection onto said image plane of said user-selected one or more plural discrete two-dimensional portions of said field of interest.

10. A computer implemented method of controlling flight of a drone having a drone camera according to claim 9 and wherein said maximization of an extent of overlap between an image plane of said drone camera and a projection onto said image plane of said user-selected one or more plural discrete two-dimensional portions of said field of interest comprises:

1. Projecting vertices of said user-selected one or more 2-dimensional portions of said field of interest onto said image plane for each of a plurality of discrete fly zone positions using a camera pinhole model;

2. Drawing a rectangle subsuming said vertices;

3. Bisecting said image plane in mutually perpendicular directions along bisecting lines;

4. Computing distances between a center of said image plane and intersections of said bisecting lines to derive XLEFT, XRIGHT, YTOP, YBOTTOM distances;

5. Computing ratios of said XLEFT/XRIGHT and YTOP/YBOTTOM distances;

6. Representing said extent of overlap between said image plane and said projection as being proportional to a difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1;

7. Repeating above steps 1 - 6 for each of a plurality of rotational orientations each preferably arrived at by making one rotation step in either pitch or yaw until there is no longer any convergence of the difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1;

8. For each of said plural discrete fly zone positions, selecting a preferred rotational orientation based on the above steps 1 - 7, which rotational orientation has the least difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1; and

9. Assigning a score to that one of said plural discrete fly zone positions at its optimal rotational orientation as established by above steps 1 - 8.
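Steps 1 - 9 of claim 10 amount to a greedy search over camera pitch and yaw: the projected region is centred in the image plane when the XLEFT/XRIGHT and YTOP/YBOTTOM ratios both equal 1. A rough sketch under illustrative assumptions (unit-focal-length pinhole camera, a yaw-then-pitch rotation parameterisation, a fixed one-degree rotation step); none of these details are specified by the claim:

```python
import numpy as np

def project(vertices, cam_pos, R, focal=1.0):
    """Step 1: pinhole projection of 3-D vertices onto the image plane."""
    pts = (R @ (np.asarray(vertices, float) - cam_pos).T).T  # world -> camera
    return pts[:, :2] * focal / pts[:, 2:3]                  # perspective divide

def centering_error(vertices, cam_pos, R):
    """Steps 2-6: bound the projections with a rectangle, measure its edge
    distances from the image-plane centre (XLEFT, XRIGHT, YTOP, YBOTTOM),
    and return how far the XLEFT/XRIGHT and YTOP/YBOTTOM ratios are from 1."""
    uv = project(vertices, cam_pos, R)
    x_left, x_right = -uv[:, 0].min(), uv[:, 0].max()
    y_top, y_bottom = uv[:, 1].max(), -uv[:, 1].min()
    return abs(x_left / x_right - 1.0) + abs(y_top / y_bottom - 1.0)

def rotation(pitch, yaw):
    """Illustrative camera rotation: pitch about x, then yaw about y."""
    cp, sp, cy, sy = np.cos(pitch), np.sin(pitch), np.cos(yaw), np.sin(yaw)
    r_yaw = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    r_pitch = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
    return r_yaw @ r_pitch

def best_orientation(vertices, cam_pos, step=np.radians(1.0)):
    """Steps 7-9: take single pitch or yaw steps while the error keeps
    shrinking; the final error serves as the score for this fly zone position."""
    pitch = yaw = 0.0
    err = centering_error(vertices, cam_pos, rotation(pitch, yaw))
    while True:
        moves = [(centering_error(vertices, cam_pos,
                                  rotation(pitch + dp, yaw + dy)), dp, dy)
                 for dp, dy in ((step, 0.0), (-step, 0.0),
                                (0.0, step), (0.0, -step))]
        best, dp, dy = min(moves)
        if best >= err:          # no further convergence: stop (step 7)
            return (pitch, yaw), err
        err, pitch, yaw = best, pitch + dp, yaw + dy
```

A lower final error corresponds to a better-centred view, so one candidate position's score (step 9) can simply be the negated error returned here.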

11. A computer implemented system for controlling flight of a drone having a drone camera, the system comprising:

a fly zone and field of interest coordinate generation module, operative in response to a user input, defining coordinates of a field of interest to be imaged by said drone camera and of a fly zone adjacent to said field of interest;

a plural field of interest portions designator, operative in response to a user input selecting a type of scene to be imaged in said field of interest by said drone camera, defining plural portions of said field of interest corresponding to plural activities expected to take place at said plural portions;

a plural fly zone positions and orientations designator, defining plural drone positions within said fly zone and corresponding plural drone camera orientations corresponding to said plural portions of said field of interest; and

a fly control output generator operative to fly said drone sequentially to plural ones of said plural drone positions within said fly zone and orienting said drone camera to corresponding ones of said plural drone camera orientations in order to image corresponding ones of said plural activities at said plural portions of said field of interest.

12. A computer implemented system for controlling flight of a drone having a drone camera according to claim 11 and wherein said plural fly zone positions and orientations designator takes into account cinematographic considerations.

13. A computer implemented system for controlling flight of a drone having a drone camera according to claim 11 or claim 12 and wherein said coordinates include GPS coordinates.

14. A computer implemented system for controlling flight of a drone having a drone camera according to claim 13 and also comprising a coordinate converter converting said GPS coordinates to Cartesian coordinates.

15. A computer implemented system for controlling flight of a drone having a drone camera according to any of the preceding claims 11 - 14 and wherein said coordinates of said fly zone are GPS coordinates and height.

16. A computer implemented system for controlling flight of a drone having a drone camera according to any of the preceding claims 11 - 15 and also comprising a home point designator operative for obtaining coordinates of a home point outside of said fly zone in response to a user input.

17. A computer implemented system for controlling flight of a drone having a drone camera according to any of the preceding claims 11 - 16 and wherein said fly control output generator comprises a selector operative for selecting one of said plurality of discrete fly zone positions and a rotational orientation to be assumed by said drone camera.

18. A computer implemented system for controlling flight of a drone having a drone camera according to claim 17 and wherein said selector is operative to select a rotational orientation of said drone to enable said drone camera to view a user-selected one or more plural discrete two-dimensional portions of said field of interest under desired cinematographic conditions.

19. A computer implemented system for controlling flight of a drone having a drone camera according to claim 18 and wherein said selector takes into account maximization of an extent of overlap between an image plane of said drone camera and a projection onto said image plane of said user-selected one or more plural discrete two-dimensional portions of said field of interest.

20. A computer implemented system for controlling flight of a drone having a drone camera according to claim 19 and wherein said maximization of an extent of overlap between an image plane of said drone camera and a projection onto said image plane of said user-selected one or more plural discrete two-dimensional portions of said field of interest comprises:

1. Projecting vertices of said user-selected one or more 2-dimensional portions of said field of interest onto said image plane for each of a plurality of discrete fly zone positions using a camera pinhole model;

2. Drawing a rectangle subsuming said vertices;

3. Bisecting said image plane in mutually perpendicular directions along bisecting lines;

4. Computing distances between a center of said image plane and intersections of said bisecting lines to derive XLEFT, XRIGHT, YTOP, YBOTTOM distances;

5. Computing ratios of said XLEFT/XRIGHT and YTOP/YBOTTOM distances;

6. Representing said extent of overlap between said image plane and said projection as being proportional to a difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1;

7. Repeating above steps 1 - 6 for each of a plurality of rotational orientations each preferably arrived at by making one rotation step in either pitch or yaw until there is no longer any convergence of the difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1;

8. For each of said plural discrete fly zone positions, selecting a preferred rotational orientation based on the above steps 1 - 7, which rotational orientation has the least difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1; and

9. Assigning a score to that one of said plural discrete fly zone positions at its optimal rotational orientation as established by above steps 1 - 8.

21. A method of controlling flight of a drone having a cinematographic payload, the method comprising:

presenting to a user, a representation of a three-dimensional scene to be filmed;

receiving from the user an indication of a three-dimensional portion of interest in said scene; and

controlling flight of said drone in order to position said drone to enable said cinematographic payload to provide a desired moving image of said portion of interest, said controlling including cinematographic analysis of the three-dimensional structure of said portion of interest of said scene.

22. A method of controlling flight of a drone having a cinematographic payload according to claim 21 and wherein said controlling takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

23. A method of controlling flight of a drone having a cinematographic payload according to claim 21 or claim 22 and wherein said controlling takes into account expected interactions of a plurality of objects in said three-dimensional portion of interest of said scene.

24. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 21-23 and also comprising:

prior to flying of said drone, defining a permitted fly-zone of permitted drone travel; and

automatically restricting flying of said drone to be within said permitted fly zone.

25. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 21-24 and also comprising:

prior to flying of said drone, defining a home base for said drone, which home base may be outside of a permitted fly-zone of permitted drone travel.

26. A method of controlling flight of a drone having a cinematographic payload according to claim 25 and also comprising:

in cases where said home base for said drone is outside said permitted fly-zone of permitted drone travel, defining a safe fly path between said home base and said permitted fly zone.

27. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 21-26 and also comprising:

receiving from said user a different indication of a three-dimensional portion of interest in said scene and wherein said controlling comprises changing a flight path of said drone in mid-flight responsive to said different indication of a three-dimensional portion of interest in said scene.

28. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 21-27 and also comprising:

automatically generating a plurality of candidate flight paths for said drone based on a cinematographic analysis of said portion of interest of said scene; and

automatically selecting at least one of said plurality of candidate flight paths for said drone based on predetermined criteria.

29. A method of controlling flight of a drone having a cinematographic payload according to claim 28 and wherein said predetermined criteria include:

distance of said drone from said portion of interest of said scene;

expected presence of at least one predetermined object in said three-dimensional scene to be filmed; and

expected compliance of a moving video of said portion of interest of said scene with predetermined cinematographic quality criteria.

30. A method of controlling flight of a drone having a cinematographic payload according to claim 28 and wherein at least one of said generating and said selecting takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

31. A method of controlling operation of a cinematographic payload on a drone, the method comprising:

presenting to a user, a representation of a three-dimensional scene to be filmed;

receiving from the user an indication of a three-dimensional portion of interest of said scene;

automatically generating a plurality of candidate flight paths for said drone based on a cinematographic analysis of said portion of interest of said scene; and

automatically selecting at least one of said plurality of candidate flight paths for said drone based on predetermined criteria.

32. A method of controlling operation of a cinematographic payload on a drone according to claim 31 and wherein said predetermined criteria include:

distance of said drone from said portion of interest of said scene;

expected presence of at least one predetermined object in said three-dimensional scene to be filmed; and

expected compliance of a moving video of said portion of interest of said scene with predetermined cinematographic quality criteria.
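Claims 31 - 32 describe generating candidate flight paths and selecting among them by predetermined criteria (distance to the portion of interest, expected presence of a predetermined object, and expected cinematographic quality). A minimal sketch of such a selector; the weighting scheme and the candidate-path representation are purely illustrative assumptions, not taken from the claims:

```python
def select_flight_path(candidates, criteria):
    """Score every candidate path against weighted criteria and keep the best.

    candidates: any sequence of path descriptions
    criteria:   list of (scoring_function, weight) pairs; higher score = better
    """
    def score(path):
        return sum(weight * fn(path) for fn, weight in criteria)
    return max(candidates, key=score)

# Illustrative criteria mirroring claim 32: distance to the portion of
# interest, presence of an expected object, and a predicted quality estimate.
example_criteria = [
    (lambda p: -p["distance_m"] / 100.0, 1.0),    # closer is better
    (lambda p: 1.0 if p["sees_object"] else 0.0, 1.0),
    (lambda p: p["quality"], 1.0),                # predicted cinematographic quality
]
```

How each scoring function is computed (e.g. how "expected cinematographic quality" is predicted for a path) is left open by the claims; only the selection-by-criteria structure is shown here.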

33. A method of controlling flight of a drone having a cinematographic payload according to claim 31 or claim 32 and wherein said selecting takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

34. A method of controlling flight of a drone having a cinematographic payload according to any of claims 31 - 33 and wherein said generating takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

35. A method of controlling flight of a drone having a cinematographic payload according to any of claims 31 - 34 and wherein said selecting takes into account expected interactions of a plurality of objects in said three-dimensional portion of interest of said scene.

36. A method of controlling flight of a drone having a cinematographic payload according to any of claims 31 - 35 and wherein said generating takes into account expected interactions of a plurality of objects in said three-dimensional portion of interest of said scene.

37. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 31 - 36 and also comprising:

prior to flying of said drone, defining a permitted fly-zone of permitted drone travel; and

automatically restricting flying of said drone to be within said permitted fly zone.

38. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 31 - 37 and also comprising:

prior to flying of said drone, defining a home base for said drone, which home base may be outside of a permitted fly-zone of permitted drone travel.

39. A method of controlling flight of a drone having a cinematographic payload according to claim 38 and also comprising:

in cases where said home base for said drone is outside said permitted fly-zone of permitted drone travel, defining a safe fly path between said home base and said permitted fly zone.

40. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 31-39 and also comprising:

receiving from said user a different indication of a three-dimensional portion of interest in said scene and wherein said controlling comprises changing a flight path of said drone in mid-flight responsive to said different indication of a three-dimensional portion of interest in said scene.

41. A method of controlling flight of a drone having a cinematographic payload, the method comprising:

presenting to a user, a representation of a three-dimensional scene to be filmed;

receiving from the user an indication of a three-dimensional portion of interest in said scene; and

controlling flight of said drone in order to position said drone to enable said cinematographic payload to provide a desired moving image of said portion of interest, said controlling including cinematographic analysis of the three-dimensional structure of said portion of interest of said scene.

42. A method of controlling flight of a drone having a cinematographic payload according to claim 41 and wherein said controlling takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

43. A method of controlling flight of a drone having a cinematographic payload according to claim 41 or claim 42 and wherein said controlling takes into account expected interactions of a plurality of objects in said three-dimensional portion of interest of said scene.

44. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 41 - 43 and also comprising:

prior to flying of said drone, defining a permitted fly-zone of permitted drone travel; and

automatically restricting flying of said drone to be within said permitted fly zone.

45. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 41 - 44 and also comprising:

prior to flying of said drone, defining a home base for said drone, which home base may be outside of a permitted fly-zone of permitted drone travel.

46. A method of controlling flight of a drone having a cinematographic payload according to claim 45 and also comprising:

in cases where said home base for said drone is outside said permitted fly-zone of permitted drone travel, defining a safe fly path between said home base and said permitted fly zone.

47. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 41 - 46 and also comprising:

receiving from said user a different indication of a three-dimensional portion of interest in said scene and wherein said controlling comprises changing a flight path of said drone in mid-flight responsive to said different indication of a three-dimensional portion of interest in said scene.

48. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 41 - 47 and also comprising:

automatically generating a plurality of candidate flight paths for said drone based on a cinematographic analysis of said portion of interest of said scene; and

automatically selecting at least one of said plurality of candidate flight paths for said drone based on predetermined criteria.

49. A method of controlling flight of a drone having a cinematographic payload according to claim 48 and wherein said predetermined criteria include:

distance of said drone from said portion of interest of said scene;

expected presence of at least one predetermined object in said three-dimensional scene to be filmed; and

expected compliance of a moving video of said portion of interest of said scene with predetermined cinematographic quality criteria.

50. A method of controlling flight of a drone having a cinematographic payload according to claim 48 and wherein at least one of said generating and said selecting takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

51. A method of controlling operation of a cinematographic payload on a drone, the method comprising:

presenting to a user, a representation of a three-dimensional scene to be filmed;

receiving from the user an indication of a three-dimensional portion of interest of said scene;

automatically generating a plurality of candidate flight paths for said drone based on a cinematographic analysis of said portion of interest of said scene; and

automatically selecting at least one of said plurality of candidate flight paths for said drone based on predetermined criteria.

52. A method of controlling operation of a cinematographic payload on a drone according to claim 51 and wherein said predetermined criteria include:

distance of said drone from said portion of interest of said scene;

expected presence of at least one predetermined object in said three-dimensional scene to be filmed; and

expected compliance of a moving video of said portion of interest of said scene with predetermined cinematographic quality criteria.

53. A method of controlling flight of a drone having a cinematographic payload according to claim 51 or claim 52 and wherein said selecting takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

54. A method of controlling flight of a drone having a cinematographic payload according to any of claims 51 - 53 and wherein said generating takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

55. A method of controlling flight of a drone having a cinematographic payload according to any of claims 51 - 54 and wherein said selecting takes into account expected interactions of a plurality of objects in said three-dimensional portion of interest of said scene.

56. A method of controlling flight of a drone having a cinematographic payload according to any of claims 51 - 55 and wherein said generating takes into account expected interactions of a plurality of objects in said three-dimensional portion of interest of said scene.

57. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 51 - 56 and also comprising:

prior to flying of said drone, defining a permitted fly-zone of permitted drone travel; and

automatically restricting flying of said drone to be within said permitted fly zone.

58. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 51 - 57 and also comprising:

prior to flying of said drone, defining a home base for said drone, which home base may be outside of a permitted fly-zone of permitted drone travel.

59. A method of controlling flight of a drone having a cinematographic payload according to claim 58 and also comprising:

in cases where said home base for said drone is outside said permitted fly-zone of permitted drone travel, defining a safe fly path between said home base and said permitted fly zone.

60. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 51-59 and also comprising:

receiving from said user a different indication of a three-dimensional portion of interest in said scene and wherein said controlling comprises changing a flight path of said drone in mid-flight responsive to said different indication of a three-dimensional portion of interest in said scene.

61. A method of controlling operation of a cinematographic payload on a drone, the method comprising:

presenting to a user, a representation of a three-dimensional scene to be filmed;

receiving from the user an indication of a three-dimensional portion of interest of said scene; and

automatically generating a drone flight path based on a cinematographic analysis of said portion of interest of said scene.

62. A method of controlling flight of a drone having a cinematographic payload according to claim 61 and wherein said controlling takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

63. A method of controlling flight of a drone having a cinematographic payload according to claim 61 or claim 62 and wherein said automatically generating takes into account expected interactions of a plurality of objects in said three-dimensional portion of interest of said scene.

64. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 61 - 63 and also comprising:

prior to flying of said drone, defining a permitted fly-zone of permitted drone travel; and

automatically restricting flying of said drone to be within said permitted fly zone.

65. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 61 - 64 and also comprising:

prior to flying of said drone, defining a home base for said drone, which home base may be outside of a permitted fly-zone of permitted drone travel.

66. A method of controlling flight of a drone having a cinematographic payload according to claim 65 and also comprising:

in cases where said home base for said drone is outside said permitted fly-zone of permitted drone travel, defining a safe fly path between said home base and said permitted fly zone.

67. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 61 - 66 and also comprising:

receiving from said user a different indication of a three-dimensional portion of interest in said scene and wherein said controlling comprises changing a flight path of said drone in mid-flight responsive to said different indication of a three-dimensional portion of interest in said scene.

68. A method of controlling flight of a drone having a cinematographic payload according to any of the preceding claims 61 - 67 and also comprising:

automatically generating a plurality of candidate flight paths for said drone based on a cinematographic analysis of said portion of interest of said scene; and

automatically selecting at least one of said plurality of candidate flight paths for said drone based on predetermined criteria.

69. A method of controlling flight of a drone having a cinematographic payload according to claim 68 and wherein said predetermined criteria include:

distance of said drone from said portion of interest of said scene;

expected presence of at least one predetermined object in said three-dimensional scene to be filmed; and

expected compliance of a moving video of said portion of interest of said scene with predetermined cinematographic quality criteria.

70. A method of controlling flight of a drone having a cinematographic payload according to claim 68 and wherein at least one of said generating and said selecting takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

71. A system for controlling flight of a drone having a cinematographic payload, the system comprising:

a user interface presenting to a user, a representation of a three-dimensional scene to be filmed and receiving from the user an indication of a three-dimensional portion of interest in said scene; and

a controller, controlling flight of said drone in order to position said drone to enable said cinematographic payload to provide a desired moving image of said portion of interest, said controlling including cinematographic analysis of the three-dimensional structure of said portion of interest of said scene.

72. A system for controlling flight of a drone having a cinematographic payload according to claim 71 and wherein said controlling takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

73. A system for controlling flight of a drone having a cinematographic payload according to claim 71 or claim 72 and wherein said controlling takes into account expected interactions of a plurality of objects in said three-dimensional portion of interest of said scene.

74. A system for controlling flight of a drone having a cinematographic payload according to any of the preceding claims 71 - 73 and also comprising:

a zone definer, prior to flying of said drone, defining a permitted fly-zone of permitted drone travel; and

a flight restrictor, automatically restricting flying of said drone to be within said permitted fly zone.

75. A system for controlling flight of a drone having a cinematographic payload according to any of the preceding claims 71 - 74 and also comprising:

a home base definer, operative prior to flying of said drone, defining a home base for said drone, which home base may be outside of a permitted fly-zone of permitted drone travel.

76. A system for controlling flight of a drone having a cinematographic payload according to claim 75 and also comprising:

a fly path definer, operative in cases where said home base for said drone is outside said permitted fly-zone of permitted drone travel, defining a safe fly path between said home base and said permitted fly zone.

77. A system for controlling flight of a drone having a cinematographic payload according to any of the preceding claims 71 - 76 and wherein said user interface is also operative to receive from said user a different indication of a three-dimensional portion of interest in said scene and wherein said controlling comprises changing a flight path of said drone in mid-flight responsive to said different indication of a three-dimensional portion of interest in said scene.

78. A system for controlling flight of a drone having a cinematographic payload according to any of the preceding claims 71 - 77 and also comprising:

a generator automatically generating a plurality of candidate flight paths for said drone based on a cinematographic analysis of said portion of interest of said scene; and

a selector automatically selecting at least one of said plurality of candidate flight paths for said drone based on predetermined criteria.

79. A system for controlling flight of a drone having a cinematographic payload according to claim 78 and wherein said predetermined criteria include:

distance of said drone from said portion of interest of said scene;

expected presence of at least one predetermined object in said three-dimensional scene to be filmed; and

expected compliance of a moving video of said portion of interest of said scene with predetermined cinematographic quality criteria.

80. A system for controlling flight of a drone having a cinematographic payload according to claim 78 and wherein at least one of said generating and said selecting takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

81. A system for controlling operation of a cinematographic payload on a drone, the system comprising:

a user interface presenting to a user, a representation of a three-dimensional scene to be filmed and receiving from the user an indication of a three-dimensional portion of interest of said scene;

a generator automatically generating a plurality of candidate flight paths for said drone based on a cinematographic analysis of said portion of interest of said scene; and

a selector automatically selecting at least one of said plurality of candidate flight paths for said drone based on predetermined criteria.

82. A system for controlling operation of a cinematographic payload on a drone according to claim 81 and wherein said predetermined criteria include:

distance of said drone from said portion of interest of said scene;

expected presence of at least one predetermined object in said three-dimensional scene to be filmed; and

expected compliance of a moving video of said portion of interest of said scene with predetermined cinematographic quality criteria.

83. A system for controlling flight of a drone having a cinematographic payload according to claim 81 or claim 82 and wherein said selecting takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

84. A system for controlling flight of a drone having a cinematographic payload according to any of claims 81 - 83 and wherein said generating takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

85. A system for controlling flight of a drone having a cinematographic payload according to any of claims 81 - 84 and wherein said selecting takes into account expected interactions of a plurality of objects in said three-dimensional portion of interest of said scene.

86. A system for controlling flight of a drone having a cinematographic payload according to any of claims 81 - 85 and wherein said generating takes into account expected interactions of a plurality of objects in said three-dimensional portion of interest of said scene.

87. A system for controlling flight of a drone having a cinematographic payload according to any of the preceding claims 81 - 86 and also comprising:

a fly zone definer operative prior to flying of said drone, defining a permitted fly-zone of permitted drone travel; and

a flight restrictor automatically restricting flying of said drone to be within said permitted fly zone.

88. A system for controlling flight of a drone having a cinematographic payload according to any of the preceding claims 81 - 87 and also comprising:

a home base definer, operative prior to flying of said drone, defining a home base for said drone, which home base may be outside of a permitted fly-zone of permitted drone travel.

89. A system for controlling flight of a drone having a cinematographic payload according to claim 88 and also comprising:

a fly path definer, operative in cases where said home base for said drone is outside said permitted fly-zone of permitted drone travel, defining a safe fly path between said home base and said permitted fly zone.

90. A system for controlling flight of a drone having a cinematographic payload according to any of the preceding claims 81 - 88 and wherein said user interface is operative to receive from said user a different indication of a three-dimensional portion of interest in said scene and wherein said controlling comprises changing a flight path of said drone in mid-flight responsive to said different indication of a three-dimensional portion of interest in said scene.

91. A system for controlling operation of a cinematographic payload on a drone, the system comprising:

a user interface presenting to a user, a representation of a three-dimensional scene to be filmed and receiving from the user an indication of a three-dimensional portion of interest of said scene; and

a generator automatically generating a drone flight path based on a cinematographic analysis of said portion of interest of said scene.

92. A system for controlling flight of a drone having a cinematographic payload according to claim 91 and wherein said controlling takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

93. A system for controlling flight of a drone having a cinematographic payload according to claim 91 or claim 92 and wherein said automatically generating takes into account expected interactions of a plurality of objects in said three-dimensional portion of interest of said scene.

94. A system for controlling flight of a drone having a cinematographic payload according to any of the preceding claims 91-93 and also comprising:

a fly zone definer operative, prior to flying of said drone, defining a permitted fly-zone of permitted drone travel; and

a fly zone restrictor, operative for automatically restricting flying of said drone to be within said permitted fly zone.

95. A system for controlling flight of a drone having a cinematographic payload according to any of the preceding claims 91-94 and also comprising: a home base definer, operative prior to flying of said drone, defining a home base for said drone, which home base may be outside of a permitted fly-zone of permitted drone travel.

96. A system for controlling flight of a drone having a cinematographic payload according to claim 95 and also comprising:

a safe fly path definer, operative in cases where said home base for said drone is outside said permitted fly-zone of permitted drone travel, defining a safe fly path between said home base and said permitted fly zone.

97. A system for controlling flight of a drone having a cinematographic payload according to any of the preceding claims 91 - 96 and wherein said user interface is also operative for receiving from said user a different indication of a three-dimensional portion of interest in said scene and wherein said controlling comprises changing a flight path of said drone in mid-flight responsive to said different indication of a three-dimensional portion of interest in said scene.

98. A system for controlling flight of a drone having a cinematographic payload according to any of the preceding claims 91 - 97 and also comprising:

a generator operative for automatically generating a plurality of candidate flight paths for said drone based on a cinematographic analysis of said portion of interest of said scene; and

a selector operative for automatically selecting at least one of said plurality of candidate flight paths for said drone based on predetermined criteria.

99. A system for controlling flight of a drone having a cinematographic payload according to claim 98 and wherein said predetermined criteria include:

distance of said drone from said portion of interest of said scene;

expected presence of at least one predetermined object in said three-dimensional scene to be filmed; and

expected compliance of a moving video of said portion of interest of said scene with predetermined cinematographic quality criteria.

100. A system for controlling flight of a drone having a cinematographic payload according to claim 98 or claim 99 and wherein at least one of said generating and said selecting takes into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

101. A system for controlling flight of a drone having a cinematographic payload according to any of claims 98 - 100 and wherein said generating and said selecting take into account at least one of the identity and characteristics of at least one object in said three-dimensional portion of interest of said scene.

Description:
A COMPUTER IMPLEMENTED SYSTEM AND METHOD

FOR CONTROLLING FLIGHT OF A DRONE HAVING A DRONE CAMERA

REFERENCE TO RELATED APPLICATIONS

Reference is hereby made to U.S. Provisional Patent Application Serial No. 62/802,309, filed February 7, 2019 and entitled CINEMATOGRAPHIC DRONE FLIGHT CONTROL, the disclosure of which is hereby incorporated by reference and priority of which is hereby claimed.

FIELD OF THE INVENTION

The present invention relates to systems and methods for controlling the flight of a drone generally and more particularly to systems and methods for controlling cinematographic drone flight.

BACKGROUND OF THE INVENTION

Various types of drone controllers are known in the art.

The following references, the contents of which are hereby incorporated by reference, describe cinematographic issues relevant to cinematographic drone flight control:

Quentin Galvane, Christophe Lino, Marc Christie, Julien Fleureau, Fabien Servant, Francois-Louis Tariolle, and Philippe Guillotel; Directing Cinematographic Drones; ACM Transactions on Graphics, Volume 37, Issue 3, Article No. 34 (August 2018);

Rogerio Bonatti, Yanfu Zhang, Sanjiban Choudhury, Wenshan Wang, Sebastian Scherer; Autonomous drone cinematographer: Using artistic principles to create smooth, safe, occlusion-free trajectories for aerial filming; submitted on 28 Aug 2018 to arXiv.org (https://arxiv.org/abs/1808.09563);

Rogerio Bonatti, Cherie Ho, Wenshan Wang, Sanjiban Choudhury, Sebastian Scherer; Towards a Robust Aerial Cinematography Platform: Localizing and Tracking Moving Targets in Unstructured Environments; submitted on 4 Apr 2019 to arXiv.org (https://arxiv.org/abs/1904.02319);

I. Mademlis, V. Mygdalis, N. Nikolaidis, M. Montagnuolo, F. Negro, A. Messina, I. Pitas; High-level multiple-UAV cinematography tools for covering outdoor events; IEEE Transactions on Broadcasting, 2019; and

Learning to Film from Professional Human Motion Videos; https://chuanenl.in/papers/CVPR2019.pdf.

SUMMARY OF THE INVENTION

The present invention seeks to provide improved systems and methods for controlling the flight of a drone.

There is thus provided in accordance with a preferred embodiment of the present invention a computer implemented method of controlling flight of a drone having a drone camera, the method including, in response to a user input, defining coordinates of a field of interest to be imaged by the drone camera and of a fly zone adjacent to the field of interest, in response to a user input selecting a type of scene to be imaged in the field of interest by the drone camera, defining plural portions of the field of interest corresponding to plural activities expected to take place at the plural portions, defining plural drone positions within the fly zone and corresponding plural drone camera orientations corresponding to the plural portions of the field of interest and flying the drone sequentially to plural ones of the plural drone positions within the fly zone and orienting the drone camera to corresponding ones of the plural drone camera orientations in order to image corresponding ones of the plural activities at the plural portions of the field of interest.

In accordance with a preferred embodiment of the present invention the defining plural drone positions within the fly zone and corresponding plural drone camera orientations takes into account cinematographic considerations, such as those described in the above-quoted references, the disclosures of which are hereby incorporated by reference.

In accordance with a preferred embodiment of the present invention the coordinates include GPS coordinates. Additionally, the method also includes converting the GPS coordinates to Cartesian coordinates. Preferably, the coordinates of the fly zone are GPS coordinates and height.
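The GPS-to-Cartesian conversion is not detailed further in the application; over a field-sized area it may be sketched as a local tangent-plane approximation. The function and constant names below are illustrative rather than taken from the disclosure:

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius, meters

def gps_to_local_cartesian(lat_deg, lon_deg, height_m,
                           origin_lat_deg, origin_lon_deg):
    """Convert a GPS fix (latitude, longitude, height) to meters
    east/north/up of a chosen local origin, using an equirectangular
    approximation adequate over a small field of interest."""
    d_lat = math.radians(lat_deg - origin_lat_deg)
    d_lon = math.radians(lon_deg - origin_lon_deg)
    x_east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(origin_lat_deg))
    y_north = EARTH_RADIUS_M * d_lat
    return (x_east, y_north, height_m)
```

Over the few hundred meters spanned by a typical field of interest the error of this flat-earth approximation is negligible; a production system might instead use a full geodetic transformation library.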

In accordance with a preferred embodiment of the present invention the method also includes obtaining coordinates of a home point outside of the fly zone in response to a user input.

Preferably, the flying the drone sequentially to plural ones of the plural drone positions within the fly zone and orienting the drone camera to corresponding ones of the plural drone camera orientations includes selecting one of the plurality of discrete fly zone positions and a rotational orientation to be assumed by the drone camera. Additionally, the selecting one of the plurality of discrete fly zone positions and rotational orientations to be assumed by the drone includes optimizing the rotational orientation of the drone to enable the drone camera to view a user-selected one or more plural discrete two-dimensional portions of the field of interest under optimal cinematographic conditions.

Preferably, the optimizing the rotational orientation of the drone camera to enable the drone camera to view a user-selected one or more plural discrete two-dimensional portions of the field of interest under optimal cinematographic conditions includes maximization of an extent of overlap between an image plane of the drone camera and a projection onto the image plane of the user-selected one or more plural discrete two-dimensional portions of the field of interest. Additionally, the maximization of an extent of overlap between an image plane of the drone camera and a projection onto the image plane of the user-selected one or more plural discrete two-dimensional portions of the field of interest includes 1. Projecting vertices of the user-selected one or more two-dimensional portions of the field of interest onto the image plane for each of a plurality of discrete fly zone positions using a conventional camera pinhole model, 2. Drawing a rectangle subsuming the vertices, 3. Bisecting the image plane in mutually perpendicular directions along bisecting lines, 4. Computing distances between a center of the image plane and intersections of the bisecting lines to derive XLEFT, XRIGHT, YTOP, YBOTTOM distances, 5. Computing ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances, 6. Representing the extent of overlap between the image plane and the projection as being proportional to a difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1, 7. Repeating above steps 1 - 6 for each of a plurality of rotational orientations, each preferably arrived at by making one rotation step in either pitch or yaw, until there is no longer any convergence of the difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1, 8. For each of the plural discrete fly zone positions, selecting a preferred rotational orientation based on the above steps 1 - 7, which rotational orientation has the least difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1 and 9. Assigning a score to that one of the plural discrete fly zone positions at its optimal rotational orientation as established by above steps 1 - 8.
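Steps 1 - 6 above may be sketched as follows. This is only one possible reading of the text, assuming a unit focal length, zero camera roll and a y-forward camera frame; all function names here are hypothetical:

```python
import math

def project_vertices(vertices, cam_pos, yaw, pitch, f=1.0):
    """Step 1: project world-space vertices onto the image plane of a
    pinhole camera at cam_pos with the given yaw and pitch (radians).
    Returns None if any vertex falls behind the camera."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    image_pts = []
    for vx, vy, vz in vertices:
        dx, dy, dz = vx - cam_pos[0], vy - cam_pos[1], vz - cam_pos[2]
        # rotate the offset into the camera frame: yaw about the vertical
        # axis, then pitch about the camera's lateral axis
        x1 = cy * dx + sy * dy
        d1 = -sy * dx + cy * dy        # depth before pitch
        d2 = cp * d1 + sp * dz         # depth after pitch
        z2 = -sp * d1 + cp * dz        # vertical image coordinate
        if d2 <= 0.0:
            return None                # vertex behind the image plane
        image_pts.append((f * x1 / d2, f * z2 / d2))
    return image_pts

def centering_difference(image_pts):
    """Steps 2 - 6: bound the projected vertices with a rectangle, measure
    the XLEFT, XRIGHT, YTOP, YBOTTOM distances from the image-plane center,
    and return |XLEFT/XRIGHT - 1| + |YTOP/YBOTTOM - 1| (0 = centered)."""
    xs = [p[0] for p in image_pts]
    ys = [p[1] for p in image_pts]
    x_left, x_right = -min(xs), max(xs)
    y_bottom, y_top = -min(ys), max(ys)
    if min(x_left, x_right, y_bottom, y_top) <= 0.0:
        return float("inf")            # rectangle does not straddle the center
    return abs(x_left / x_right - 1.0) + abs(y_top / y_bottom - 1.0)
```

Steps 7 - 9 would then step yaw and pitch one increment at a time, keep the orientation whose difference is smallest, and record that value as the score of the fly zone position.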

There is also provided in accordance with another preferred embodiment of the present invention a computer implemented system for controlling flight of a drone having a drone camera, the system including a fly zone and field of interest coordinate generation module, operative in response to a user input, defining coordinates of a field of interest to be imaged by the drone camera and of a fly zone adjacent to the field of interest, a plural field of interest portions designator, operative in response to a user input selecting a type of scene to be imaged in the field of interest by the drone camera, defining plural portions of the field of interest corresponding to plural activities expected to take place at the plural portions, a plural fly zone positions and orientations designator, defining plural drone positions within the fly zone and corresponding plural drone camera orientations corresponding to the plural portions of the field of interest and a fly control output generator operative to fly the drone sequentially to plural ones of the plural drone positions within the fly zone and to orient the drone camera to corresponding ones of the plural drone camera orientations in order to image corresponding ones of the plural activities at the plural portions of the field of interest.

In accordance with a preferred embodiment of the present invention the plural fly zone positions and orientations designator takes into account cinematographic considerations, such as those described in the above-quoted references, the disclosures of which are hereby incorporated by reference.

In accordance with a preferred embodiment of the present invention the coordinates include GPS coordinates. Additionally, the system also includes a coordinate converter converting the GPS coordinates to Cartesian coordinates. Preferably, the coordinates of the fly zone are GPS coordinates and height.

In accordance with a preferred embodiment of the present invention the system also includes a home point designator operative for obtaining coordinates of a home point outside of the fly zone in response to a user input.

Preferably, the fly control output generator includes a selector operative for selecting one of the plurality of discrete fly zone positions and a rotational orientation to be assumed by the drone camera. Additionally, the selector is operative to select a rotational orientation of the drone to enable the drone camera to view a user-selected one or more plural discrete two-dimensional portions of the field of interest under desired cinematographic conditions.

Preferably, the selector takes into account maximization of an extent of overlap between an image plane of the drone camera and a projection onto the image plane of the user-selected one or more plural discrete two-dimensional portions of the field of interest. Additionally, the maximization of an extent of overlap between an image plane of the drone camera and a projection onto the image plane of the user-selected one or more plural discrete two-dimensional portions of the field of interest includes 1. Projecting vertices of the user-selected one or more two-dimensional portions of the field of interest onto the image plane for each of a plurality of discrete fly zone positions using a conventional camera pinhole model, 2. Drawing a rectangle subsuming the vertices, 3. Bisecting the image plane in mutually perpendicular directions along bisecting lines, 4. Computing distances between a center of the image plane and intersections of the bisecting lines to derive XLEFT, XRIGHT, YTOP, YBOTTOM distances, 5. Computing ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances, 6. Representing the extent of overlap between the image plane and the projection as being proportional to a difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1, 7. Repeating above steps 1 - 6 for each of a plurality of rotational orientations, each preferably arrived at by making one rotation step in either pitch or yaw, until there is no longer any convergence of the difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1, 8. For each of the plural discrete fly zone positions, selecting a preferred rotational orientation based on the above steps 1 - 7, which rotational orientation has the least difference between the ratios of the XLEFT/XRIGHT and YTOP/YBOTTOM distances and 1; and 9. Assigning a score to that one of the plural discrete fly zone positions at its optimal rotational orientation as established by above steps 1 - 8.

There is further provided in accordance with still another preferred embodiment of the present invention a method of controlling flight of a drone having a cinematographic payload, the method including presenting to a user, a representation of a three-dimensional scene to be filmed, receiving from the user an indication of a three-dimensional portion of interest in the scene and controlling flight of the drone in order to position the drone to enable the cinematographic payload to provide a desired moving image of the portion of interest, the controlling including cinematographic analysis of the three-dimensional structure of the portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the controlling takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene. Additionally or alternatively, the controlling takes into account expected interactions of a plurality of objects in the three-dimensional portion of interest of the scene.

Preferably, the method also includes, prior to flying of the drone, defining a permitted fly-zone of permitted drone travel and automatically restricting flying of the drone to be within the permitted fly zone. Additionally or alternatively, the method also includes, prior to flying of the drone, defining a home base for the drone, which home base may be outside of a permitted fly-zone of permitted drone travel.
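The automatic restriction to the permitted fly zone may be sketched, under the simplifying assumption that the fly zone is an axis-aligned box of Cartesian coordinates, as a per-waypoint clamp; the names below are illustrative only:

```python
def restrict_to_fly_zone(waypoint, zone_min, zone_max):
    """Clamp a requested waypoint (x, y, height) into the permitted
    fly zone, modelled here as an axis-aligned box. A production flight
    restrictor would also validate the path segments between waypoints."""
    return tuple(min(max(coord, lo), hi)
                 for coord, lo, hi in zip(waypoint, zone_min, zone_max))
```

A waypoint already inside the zone is returned unchanged; one outside is pulled to the nearest boundary point of the box.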

In accordance with a preferred embodiment of the present invention the method also includes, in cases where the home base for the drone is outside the permitted fly-zone of permitted drone travel, defining a safe fly path between the home base and the permitted fly zone.
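A safe fly path between a home base outside the fly zone and the zone itself could, in the simplest case, be a straight sequence of evenly spaced waypoints; the sketch below ignores obstacles and no-fly areas and is purely illustrative:

```python
def safe_fly_path(home_base, zone_entry, step_m=5.0):
    """Return a list of (x, y, height) waypoints from the home base to a
    chosen entry point on the permitted fly-zone boundary, spaced roughly
    step_m apart. A real planner would also route around obstacles."""
    hx, hy, hz = home_base
    ex, ey, ez = zone_entry
    dist = ((ex - hx) ** 2 + (ey - hy) ** 2 + (ez - hz) ** 2) ** 0.5
    n = max(1, int(dist // step_m))
    return [(hx + (ex - hx) * i / n,
             hy + (ey - hy) * i / n,
             hz + (ez - hz) * i / n) for i in range(n + 1)]
```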

In accordance with a preferred embodiment of the present invention the method also includes receiving from the user a different indication of a three-dimensional portion of interest in the scene and the controlling includes changing a flight path of the drone in mid-flight responsive to the different indication of a three-dimensional portion of interest in the scene. Additionally, the method also includes automatically generating a plurality of candidate flight paths for the drone based on a cinematographic analysis of the portion of interest of the scene and automatically selecting at least one of the plurality of candidate flight paths for the drone based on predetermined criteria.

Preferably, the predetermined criteria include distance of the drone from the portion of interest of the scene, expected presence of at least one predetermined object in the three-dimensional scene to be filmed and expected compliance of a moving video of the portion of interest of the scene with predetermined cinematographic quality criteria.
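A selector applying these three criteria might, for illustration only, combine them into a single weighted score per candidate path; the field names and weights below are hypothetical placeholders, not part of the disclosure:

```python
def select_flight_path(candidates):
    """Pick the candidate flight path maximizing a weighted combination
    of the three predetermined criteria: distance to the portion of
    interest, expected object presence, and expected cinematographic
    quality. Each candidate is a dict with hypothetical fields."""
    def score(c):
        return (-1.0 * c["distance_m"]         # nearer is better
                + 10.0 * c["object_presence"]  # expected object in frame (0..1)
                + 10.0 * c["quality"])         # expected quality (0..1)
    return max(candidates, key=score)
```

In practice the weights would be tuned, and hard constraints (such as the permitted fly zone) would be enforced before scoring rather than folded into the score.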

In accordance with a preferred embodiment of the present invention at least one of the generating and the selecting takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene.

There is still further provided in accordance with yet another preferred embodiment of the present invention a method of controlling operation of a cinematographic payload on a drone, the method including presenting to a user, a representation of a three-dimensional scene to be filmed, receiving from the user an indication of a three-dimensional portion of interest of the scene, automatically generating a plurality of candidate flight paths for the drone based on a cinematographic analysis of the portion of interest of the scene and automatically selecting at least one of the plurality of candidate flight paths for the drone based on predetermined criteria.

In accordance with a preferred embodiment of the present invention the predetermined criteria include distance of the drone from the portion of interest of the scene, expected presence of at least one predetermined object in the three-dimensional scene to be filmed and expected compliance of a moving video of the portion of interest of the scene with predetermined cinematographic quality criteria. Additionally or alternatively, the selecting takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene. Additionally or alternatively, the generating takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the selecting takes into account expected interactions of a plurality of objects in the three-dimensional portion of interest of the scene. Additionally or alternatively, the generating takes into account expected interactions of a plurality of objects in the three-dimensional portion of interest of the scene.

Preferably, the method also includes, prior to flying of the drone, defining a permitted fly-zone of permitted drone travel and automatically restricting flying of the drone to be within the permitted fly zone. Additionally or alternatively, the method also includes, prior to flying of the drone, defining a home base for the drone, which home base may be outside of a permitted fly-zone of permitted drone travel. Additionally, the method also includes, in cases where the home base for the drone is outside the permitted fly-zone of permitted drone travel, defining a safe fly path between the home base and the permitted fly zone.

Preferably, the method also includes receiving from the user a different indication of a three-dimensional portion of interest in the scene and wherein the controlling includes changing a flight path of the drone in mid-flight responsive to the different indication of a three-dimensional portion of interest in the scene.

There is also provided in accordance with another preferred embodiment of the present invention a method of controlling flight of a drone having a cinematographic payload, the method including presenting to a user, a representation of a three-dimensional scene to be filmed, receiving from the user an indication of a three-dimensional portion of interest in the scene and controlling flight of the drone in order to position the drone to enable the cinematographic payload to provide a desired moving image of the portion of interest, the controlling including cinematographic analysis of the three-dimensional structure of the portion of interest of the scene.

Preferably, the controlling takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene. Additionally or alternatively, the controlling takes into account expected interactions of a plurality of objects in the three-dimensional portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the method also includes, prior to flying of the drone, defining a permitted fly-zone of permitted drone travel and automatically restricting flying of the drone to be within the permitted fly zone. Additionally or alternatively, the method also includes, prior to flying of the drone, defining a home base for the drone, which home base may be outside of a permitted fly-zone of permitted drone travel.

Preferably, the method also includes, in cases where the home base for the drone is outside the permitted fly-zone of permitted drone travel, defining a safe fly path between the home base and the permitted fly zone.

In accordance with a preferred embodiment of the present invention the method also includes receiving from the user a different indication of a three-dimensional portion of interest in the scene and wherein the controlling includes changing a flight path of the drone in mid-flight responsive to the different indication of a three-dimensional portion of interest in the scene. Additionally or alternatively, the method also includes automatically generating a plurality of candidate flight paths for the drone based on a cinematographic analysis of the portion of interest of the scene and automatically selecting at least one of the plurality of candidate flight paths for the drone based on predetermined criteria.

Preferably, the predetermined criteria include distance of the drone from the portion of interest of the scene, expected presence of at least one predetermined object in the three-dimensional scene to be filmed and expected compliance of a moving video of the portion of interest of the scene with predetermined cinematographic quality criteria.

In accordance with a preferred embodiment of the present invention at least one of the generating and the selecting takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene.

There is even further provided in accordance with still another preferred embodiment of the present invention a method of controlling operation of a cinematographic payload on a drone, the method including presenting to a user, a representation of a three-dimensional scene to be filmed, receiving from the user an indication of a three-dimensional portion of interest of the scene, automatically generating a plurality of candidate flight paths for the drone based on a cinematographic analysis of the portion of interest of the scene and automatically selecting at least one of the plurality of candidate flight paths for the drone based on predetermined criteria.

Preferably, the predetermined criteria include distance of the drone from the portion of interest of the scene, expected presence of at least one predetermined object in the three-dimensional scene to be filmed and expected compliance of a moving video of the portion of interest of the scene with predetermined cinematographic quality criteria.

In accordance with a preferred embodiment of the present invention the selecting takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene. Additionally or alternatively, the generating takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the selecting takes into account expected interactions of a plurality of objects in the three-dimensional portion of interest of the scene. Additionally or alternatively, the generating takes into account expected interactions of a plurality of objects in the three-dimensional portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the method also includes, prior to flying of the drone, defining a permitted fly-zone of permitted drone travel and automatically restricting flying of the drone to be within the permitted fly zone. Additionally or alternatively, the method also includes, prior to flying of the drone, defining a home base for the drone, which home base may be outside of a permitted fly-zone of permitted drone travel. Preferably, the method also includes, in cases where the home base for the drone is outside the permitted fly-zone of permitted drone travel, defining a safe fly path between the home base and the permitted fly zone.

In accordance with a preferred embodiment of the present invention the method also includes receiving from the user a different indication of a three-dimensional portion of interest in the scene and wherein the controlling includes changing a flight path of the drone in mid-flight responsive to the different indication of a three-dimensional portion of interest in the scene.

There is yet further provided in accordance with another preferred embodiment of the present invention a method of controlling operation of a cinematographic payload on a drone, the method including, presenting to a user, a representation of a three-dimensional scene to be filmed, receiving from the user an indication of a three-dimensional portion of interest of the scene and automatically generating a drone flight path based on a cinematographic analysis of the portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the controlling takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene. Additionally or alternatively, the automatically generating takes into account expected interactions of a plurality of objects in the three-dimensional portion of interest of the scene.

Preferably, the method also includes, prior to flying of the drone, defining a permitted fly-zone of permitted drone travel and automatically restricting flying of the drone to be within the permitted fly zone. Additionally or alternatively, the method also includes, prior to flying of the drone, defining a home base for the drone, which home base may be outside of a permitted fly-zone of permitted drone travel. Additionally, the method also includes, in cases where the home base for the drone is outside the permitted fly-zone of permitted drone travel, defining a safe fly path between the home base and the permitted fly zone.

In accordance with a preferred embodiment of the present invention the method also includes receiving from the user a different indication of a three-dimensional portion of interest in the scene and wherein the controlling includes changing a flight path of the drone in mid-flight responsive to the different indication of a three-dimensional portion of interest in the scene. Additionally or alternatively, the method also includes automatically generating a plurality of candidate flight paths for the drone based on a cinematographic analysis of the portion of interest of the scene and automatically selecting at least one of the plurality of candidate flight paths for the drone based on predetermined criteria.

In accordance with a preferred embodiment of the present invention the predetermined criteria include distance of the drone from the portion of interest of the scene, expected presence of at least one predetermined object in the three-dimensional scene to be filmed and expected compliance of a moving video of the portion of interest of the scene with predetermined cinematographic quality criteria.

Preferably, at least one of the generating and the selecting takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene.

There is still further provided in accordance with still another preferred embodiment of the present invention a system for controlling flight of a drone having a cinematographic payload, the system including a user interface presenting to a user, a representation of a three-dimensional scene to be filmed and receiving from the user an indication of a three-dimensional portion of interest in the scene and a controller, controlling flight of the drone in order to position the drone to enable the cinematographic payload to provide a desired moving image of the portion of interest, the controlling including cinematographic analysis of the three-dimensional structure of the portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the controlling takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene. Additionally or alternatively, the controlling takes into account expected interactions of a plurality of objects in the three-dimensional portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the system also includes a zone definer, prior to flying of the drone, defining a permitted fly-zone of permitted drone travel and a flight restrictor, automatically restricting flying of the drone to be within the permitted fly zone. Additionally or alternatively, the system also includes a home base definer, operative prior to flying of the drone, defining a home base for the drone, which home base may be outside of a permitted fly-zone of permitted drone travel. Additionally, the system also includes a fly path definer, operative in cases where the home base for the drone is outside the permitted fly-zone of permitted drone travel, defining a safe fly path between the home base and the permitted fly zone.

Preferably, the user interface is also operative to receive from the user a different indication of a three-dimensional portion of interest in the scene and the controlling includes changing a flight path of the drone in mid-flight responsive to the different indication of a three-dimensional portion of interest in the scene.

In accordance with a preferred embodiment of the present invention the system also includes a generator automatically generating a plurality of candidate flight paths for the drone based on a cinematographic analysis of the portion of interest of the scene and a selector automatically selecting at least one of the plurality of candidate flight paths for the drone based on predetermined criteria.

Preferably, the predetermined criteria include distance of the drone from the portion of interest of the scene, expected presence of at least one predetermined object in the three-dimensional scene to be filmed and expected compliance of a moving video of the portion of interest of the scene with predetermined cinematographic quality criteria.

In accordance with a preferred embodiment of the present invention at least one of the generating and the selecting takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene.

There is also provided in accordance with still another preferred embodiment of the present invention a system for controlling operation of a cinematographic payload on a drone, the system including a user interface presenting to a user, a representation of a three-dimensional scene to be filmed and receiving from the user an indication of a three-dimensional portion of interest of the scene, a generator automatically generating a plurality of candidate flight paths for the drone based on a cinematographic analysis of the portion of interest of the scene and a selector automatically selecting at least one of the plurality of candidate flight paths for the drone based on predetermined criteria.

In accordance with a preferred embodiment of the present invention the predetermined criteria include distance of the drone from the portion of interest of the scene, expected presence of at least one predetermined object in the three-dimensional scene to be filmed and expected compliance of a moving video of the portion of interest of the scene with predetermined cinematographic quality criteria.

Preferably, the selecting takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene. Additionally or alternatively, the generating takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the selecting takes into account expected interactions of a plurality of objects in the three-dimensional portion of interest of the scene. Additionally or alternatively, the generating takes into account expected interactions of a plurality of objects in the three-dimensional portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the system also includes a fly zone definer operative prior to flying of the drone, defining a permitted fly-zone of permitted drone travel and a flight restrictor automatically restricting flying of the drone to be within the permitted fly zone. Additionally or alternatively, the system also includes a home base definer, operative prior to flying of the drone, defining a home base for the drone, which home base may be outside of a permitted fly-zone of permitted drone travel.

In accordance with a preferred embodiment of the present invention the system also includes a fly path definer, operative in cases where the home base for the drone is outside the permitted fly-zone of permitted drone travel, defining a safe fly path between the home base and the permitted fly zone.

Preferably, the user interface is operative to receive from the user a different indication of a three-dimensional portion of interest in the scene and wherein the controlling includes changing a flight path of the drone in mid-flight responsive to the different indication of a three-dimensional portion of interest in the scene.

There is yet further provided in accordance with yet another preferred embodiment of the present invention a system for controlling operation of a cinematographic payload on a drone, the system including a user interface presenting to a user, a representation of a three-dimensional scene to be filmed and receiving from the user an indication of a three-dimensional portion of interest of the scene and a generator automatically generating a drone flight path based on a cinematographic analysis of the portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the controlling takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene. Additionally or alternatively, the automatically generating takes into account expected interactions of a plurality of objects in the three-dimensional portion of interest of the scene.

In accordance with a preferred embodiment of the present invention the system also includes a fly zone definer operative, prior to flying of the drone, defining a permitted fly-zone of permitted drone travel and a fly zone restrictor, operative for automatically restricting flying of the drone to be within the permitted fly zone.

Preferably, the system also includes a home base definer, operative prior to flying of the drone, defining a home base for the drone, which home base may be outside of a permitted fly-zone of permitted drone travel. Additionally, the system also includes a safe fly path definer, operative in cases where the home base for the drone is outside the permitted fly-zone of permitted drone travel, defining a safe fly path between the home base and the permitted fly zone.

In accordance with a preferred embodiment of the present invention the user interface is also operative for receiving from the user a different indication of a three-dimensional portion of interest in the scene and wherein the controlling includes changing a flight path of the drone in mid-flight responsive to the different indication of a three-dimensional portion of interest in the scene.

In accordance with a preferred embodiment of the present invention the system also includes a generator operative for automatically generating a plurality of candidate flight paths for the drone based on a cinematographic analysis of the portion of interest of the scene and a selector operative for automatically selecting at least one of the plurality of candidate flight paths for the drone based on predetermined criteria.

Preferably, the predetermined criteria include distance of the drone from the portion of interest of the scene, expected presence of at least one predetermined object in the three-dimensional scene to be filmed and expected compliance of a moving video of the portion of interest of the scene with predetermined cinematographic quality criteria.

In accordance with a preferred embodiment of the present invention at least one of the generating and the selecting takes into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene. Alternatively, the generating and the selecting take into account at least one of the identity and characteristics of at least one object in the three-dimensional portion of interest of the scene.

BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be understood and appreciated more fully from the following detailed description, taken in conjunction with the drawings in which:

Fig. 1 is a simplified illustration of a system for controlling the flight of a drone constructed and operative in accordance with a preferred embodiment of the present invention and shows an imaging drone, a drone controller and a smartphone docked to the drone controller and having a graphical user interface;

Fig. 2 is a simplified top level flow chart illustrating pre-flight set up and flight operation of the system of Fig. 1 in accordance with a preferred embodiment of the present invention;

Fig. 3 is a simplified flow chart illustrating a portion of the pre-flight set up of Fig. 2;

Fig. 4 is a simplified flow chart illustrating another portion of the pre-flight set up of Fig. 2;

Fig. 5 is a simplified flow chart illustrating a further portion of the pre-flight set up of Fig. 2;

Fig. 6 is a simplified flow chart illustrating yet another portion of the pre-flight set up of Fig. 2;

Fig. 7 is a simplified flow chart illustrating yet a further portion of the pre-flight set up of Fig. 2;

Fig. 8 is a simplified flow chart illustrating a still further portion of the pre-flight set up of Fig. 2;

Fig. 9 is a simplified flow chart illustrating a portion of the flight operation of Fig. 2;

Fig. 10 is a simplified flow chart illustrating another portion of the flight operation of Fig. 2;

Fig. 11 is a simplified flow chart illustrating a further portion of the flight operation of Fig. 2; and

Figs. 12 and 13 are simplified respective pictorial and planar illustrations useful in understanding the portion of the flight operation referenced in Fig. 11.

DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT

Reference is now made to Fig. 1, which is a simplified illustration of a system for controlling the flight of a drone constructed and operative in accordance with a preferred embodiment of the present invention and shows an imaging drone 100, a drone controller 102 and a smartphone 104, such as an I-PHONE®, docked to the drone controller 102, preferably via a USB connector of the drone controller 102, and having a graphical user interface. The imaging drone is equipped with a camera 106. In the illustrated embodiment of Fig. 1, the illustrated drone and drone controller are a DJI Mavic Drone and Remote Controller as described in the following link:

https://www.ebay.com/itm/DJI-Mavic-Pro-GL200A-Radio-Remote-Control-Controller-Transmitter-NO-Cables-/163374430879?ef_id=Cj0KCQjwl9DlBRCSARIsAOnfRejKGWqWvvUlSU5IO9Sjj4AAT85g2_VaGZLaCvedJlZpU9Xwpd8YDMsaAiUpEALw_wcB:G:s

It is appreciated that any other suitable drone 100, drone controller 102 and smartphone 104, or other mobile communicator, may be employed.

The functionality of the present invention, as described hereinbelow, is preferably embodied in software loaded as an application onto smartphone 104. The software enables the smartphone 104 to have a graphical user interface displaying information to a user and enabling control of the drone 100 via the drone controller 102 using touch screen functionality of the smartphone 104.

Four simplified typical screen shots illustrating graphical user interfaces employed in some of the novel functions provided in accordance with an embodiment of the present invention are shown in blocks A, B, C and D of Fig. 1. Block A illustrates a map display showing a Field of Interest 120 in a geographical context, such as that available on GOOGLE MAPS®. Field of Interest 120 is delineated by vertices 121. Block B illustrates a touch screen which shows a Fly Zone 122 relative to the Field of Interest and enables a user to modify the Fly Zone 122. Block C illustrates a touch screen, which enables a user to select a 2-dimensional portion of the Field of Interest 120, which the user wishes to film, and also shows the current view as seen by the drone camera 106 at the current position and rotational orientation of the drone 100. A drone Home Point 124, from which the drone 100 is intended to take off and at which the drone 100 is intended to land, is also shown in Block C. Block D illustrates a graphical representation of the Field of Interest 120, the Fly Zone 122 and its vertices 126, the Home Point 124 and a location 128 in Fly Zone 122, which is closest to Home Point 124, as well as a real time video output of the drone camera 106.

A preferred implementation of an embodiment of the present invention incorporates a plurality of proportional integral derivative (PID) controllers, typically four in number. The PID controllers are preferably implemented in software and interact with functionality in the drone controller 102.

In this preferred implementation, the functions of the four PID controllers may be summarized as follows:

PID controller 130 receives, as inputs from drone controller 102, the current position and orientation of drone 100 and the desired position and orientation of drone 100 and provides as outputs to drone controller 102 desired pitch and roll velocities of drone 100.

PID controller 132 receives, as inputs, from the drone controller 102, the current yaw of drone 100 and the desired yaw of drone 100 and provides as outputs to drone controller 102, the desired yaw velocity of drone 100.

PID controller 134 receives, as an input, from the system, the desired height of drone 100 and provides as an output to drone controller 102, the desired height of drone 100. PID controller 134 may be obviated when employing certain types of drones, whose drone controllers include this functionality.

PID controller 136 receives, as inputs, from the drone controller 102, the current camera gimbal pitch and yaw of drone 100 and, from the system, the desired camera gimbal pitch and yaw of drone 100 and provides, as outputs to drone controller 102, the desired camera pitch and yaw of drone 100.
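The general form of the PID controllers described above can be sketched as follows. This is an illustrative Python sketch, not part of the patent disclosure; the gain values and the scalar yaw example are assumptions chosen only for demonstration.

```python
# Minimal sketch of a single-axis PID controller of the kind described above.
# Gains kp, ki and kd are illustrative placeholders, not values from the patent.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return a commanded velocity for the given error and time step dt."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: drive yaw toward a desired heading, as PID controller 132 does.
yaw_pid = PID(kp=0.8, ki=0.05, kd=0.1)
current_yaw, desired_yaw = 10.0, 90.0
yaw_velocity = yaw_pid.update(desired_yaw - current_yaw, dt=0.1)
```

In the system described above, such an output would be passed to drone controller 102 as the desired yaw velocity; controllers 130 and 136 would run analogous loops over pitch, roll and gimbal axes.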

Reference is now made to Figs. 2 - 13, which illustrate operation of a preferred embodiment of the present invention. Referring initially to Fig. 2, it is seen that Fig. 2 provides a simplified top level flow chart illustrating pre-flight set up and flight operation of the system of Fig. 1 in accordance with a preferred embodiment of the present invention.

The pre-flight set up preferably includes the following stages: Obtaining GPS Coordinates of Boundaries of a Field of Interest, such as Field of Interest 120.

The Field of Interest may be any suitable geometrical area which it is sought to film using drone 100 and drone controller 102. In one example, used throughout the present specification, the Field of Interest 120 is a playing field, such as a football field, as seen, for example in blocks A, B, C and D in Fig. 1. It is appreciated that, alternatively, the Field of Interest may be any other suitable geometrical area, such as, for example, a baseball field, a basketball court, a soccer field, a back yard, a wedding venue, a theme park ride and a theater.

One relatively straightforward way of obtaining GPS coordinates of the boundaries of the Field of Interest 120 is summarized in the flow chart of Fig. 3. As referenced generally in Fig. 3, the user may physically place the drone 100 or the drone controller 102 or even only the smartphone 104 at vertices 121 of the Field of Interest and record the GPS coordinates of each of such vertices, using a GPS sensor which forms part of the drone 100, the drone controller 102 or the smartphone 104. Preferably, the GPS coordinates of each vertex 121 are recorded multiple times and are subsequently averaged, filtered and verified by the system for correctness. Preferably, for the sake of ease of computation, but not necessarily, a Cartesian coordinate system is associated with the Field of Interest 120 and Cartesian coordinates of all of the vertices 121 are calculated by the system. Alternatively, the GPS coordinates of the boundaries of the Field of Interest 120 may be obtained by the system in any other suitable manner.
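The averaging and filtering of repeated vertex readings described above can be sketched as follows. This is an illustrative Python sketch, not part of the patent disclosure; the two-sigma outlier threshold is an assumption, since the text does not specify a filtering rule.

```python
# Sketch: average several GPS readings of one vertex, dropping outliers first.
from statistics import mean, stdev

def average_vertex(samples, max_sigma=2.0):
    """samples: list of (lat, lon) GPS readings for a single vertex 121.
    Drops readings more than max_sigma standard deviations from the mean
    on either axis, then returns the mean of the remaining readings."""
    lats = [s[0] for s in samples]
    lons = [s[1] for s in samples]
    if len(samples) < 3:                      # too few readings to filter
        return (mean(lats), mean(lons))
    m_lat, m_lon = mean(lats), mean(lons)
    s_lat = stdev(lats) or 1e-12              # avoid zero-width acceptance band
    s_lon = stdev(lons) or 1e-12
    kept = [s for s in samples
            if abs(s[0] - m_lat) <= max_sigma * s_lat
            and abs(s[1] - m_lon) <= max_sigma * s_lon]
    return (mean(p[0] for p in kept), mean(p[1] for p in kept))

# Example: five consistent readings of the same vertex.
vertex = average_vertex([(32.0, 34.8)] * 5)
```

A real implementation would additionally verify the filtered result for correctness, for example against the expected geometry of the Field of Interest, as the text indicates.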

Returning now to Fig. 2, following obtaining GPS coordinates of the boundaries of the Field of Interest 120, the system defines the GPS coordinates and height of the Fly Zone 122 in three dimensions, as described hereinbelow with reference to Fig. 4.

As indicated in Fig. 4, the user, preferably using the touch screen of the smartphone 104, selects the type of scene to be filmed in the Field of Interest 120 from a menu, which may include various types of scenes, such as a soccer game, a baseball game, a wedding, a family gathering, a theme park visit and a performance. The system then retrieves, from a pre-existing database, a Fly Zone proposal which is appropriate to the user-selected type of scene and to the user-selected Field of Interest 120. The user is preferably afforded an opportunity to modify the location and the configuration of the Fly Zone 122, preferably by using the touch screen of smartphone 104, which displays a screen of the type shown in block B of Fig. 1. The system defines the GPS coordinates and the height of vertices 126 of the Fly Zone 122.

Returning again to Fig. 2, it is seen that following definition of the GPS coordinates and the height of the vertices 126 of the user-confirmed Fly Zone, the user defines the GPS coordinates of the Home Point 124 from which the drone 100 is intended to take off and at which the drone 100 is intended to land, as described hereinbelow with reference to Fig. 5.

As indicated in Fig. 5, the user preferably physically positions drone 100 at a desired Home Point 124, which is normally on the ground, and records the GPS coordinates of the Home Point 124. Preferably, the GPS coordinates of the Home Point 124 are recorded multiple times and are subsequently averaged, filtered and verified for correctness. Preferably, for the sake of ease of computation, but not necessarily, a Cartesian coordinate system is associated with the Home Point 124 and Cartesian coordinates of the Home Point 124 are calculated. The Cartesian coordinate system associated with the Home Point 124 is preferably identical to that associated with the Field of Interest 120.

Returning once again to Fig. 2, it is seen that following definition of the GPS coordinates of the Home Point 124, the system calculates an optimal flight path between Home Point 124 and Fly Zone 122. As described below with reference to Fig. 6, the optimal flight path between Home Point 124 and Fly Zone 122 is validated by the system and/or the user.

As indicated in Fig. 6, the system preferably calculates the shortest line of sight path between the Home Point 124 and the location 128 in the Fly Zone 122 which is closest to the Home Point 124. This system-calculated path is hereinafter termed the “Fly Zone Flight Path”. The user, using the system, may fly the drone 100 along the Fly Zone Flight Path and validate the Fly Zone Flight Path by ascertaining that there are no obstacles preventing drone flight along the Fly Zone Flight Path.
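Finding location 128, the point of the Fly Zone closest to the Home Point, can be sketched as follows. This is an illustrative Python sketch, not part of the patent disclosure, working in the local Cartesian coordinates described above and assuming the Fly Zone footprint is a simple polygon treated in two dimensions.

```python
# Sketch: the segment from the Home Point to the nearest point on the Fly Zone
# boundary corresponds to the "Fly Zone Flight Path" described above.
def closest_point_on_segment(p, a, b):
    """Closest point to p on segment a-b, all given as (x, y) tuples."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return a
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))                 # clamp onto the segment
    return (ax + t * dx, ay + t * dy)

def closest_boundary_point(home, polygon):
    """polygon: list of (x, y) Fly Zone vertices 126 in order."""
    candidates = [
        closest_point_on_segment(home, polygon[i], polygon[(i + 1) % len(polygon)])
        for i in range(len(polygon))
    ]
    return min(candidates,
               key=lambda c: (c[0] - home[0]) ** 2 + (c[1] - home[1]) ** 2)

# Example: square Fly Zone, Home Point to its left.
path_end = closest_boundary_point((-10.0, 5.0), [(0, 0), (10, 0), (10, 10), (0, 10)])
```

A full implementation would also account for the height dimension of the Fly Zone and, as the text notes, leave final validation of the path against obstacles to the user.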

Returning yet again to Fig. 2, it is seen that following calculation and validation of the Fly Zone Flight Path, the system divides the Field of Interest 120 into plural discrete 2-dimensional portions 140, as described below with reference to Fig. 7. As indicated in Fig. 7, the system preferably divides the Field of Interest 120 into a plurality of Field of Interest Portions 140, which are not necessarily of the same size or configuration. The system preferably carries out this division based on the user’s selection of the type of scene and the coordinates of the Field of Interest 120. For example, if the scene type is a soccer game, the Field of Interest Portions 140 preferably will include the regions surrounding each of the goals, the center of the field and the corner kick zones. Preferably, all of the Field of Interest 120 lies within at least one Field of Interest Portion 140.

Returning still again to Fig. 2, it is seen that following division of the Field of Interest 120 into a plurality of Field of Interest Portions 140, the system defines plural discrete Fly Zone positions 150 for filming the plurality of Field of Interest Portions 140, as described below with reference to Fig. 8. The plural discrete Fly Zone positions 150 may be uniformly distributed throughout the Fly Zone 122 or may be located at particular regions thereof which provide a cinematographically acceptable view of various ones of the Field of Interest Portions 140, referenced in Fig. 7 above.

As indicated in Fig. 8, the system typically calculates plural discrete Fly Zone positions 150 by initially selecting multiple lines of sight to various ones of the Field of Interest Portions 140 taking into account cinematographic considerations, such as those described in the above-quoted references, the disclosures of which are hereby incorporated by reference. The system then ascertains whether there are present any occlusions of portions of the Field of Interest Portions 140 along each of the multiple lines of sight and then selects one or more lines of sight to each of the Field of Interest Portions 140 having no or minimal occlusions. Based on the selected lines of sight, the system calculates the plural discrete Fly Zone positions 150.
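The occlusion-based selection of lines of sight described above can be sketched as follows. This is an illustrative Python sketch, not part of the patent disclosure; `is_occluded` is a hypothetical placeholder for a real visibility test against the scene geometry, and the sampling of portion points is an assumption.

```python
# Sketch: rank candidate Fly Zone positions by how little of a Field of
# Interest Portion is occluded along their lines of sight, keeping the best.
def select_positions(candidates, portion_samples, is_occluded, keep=3):
    """candidates: list of candidate 3-D drone positions.
    portion_samples: sample points within a Field of Interest Portion 140.
    is_occluded(pos, target): True if the line of sight pos->target is blocked.
    Returns the `keep` candidates with the fewest occluded sample points."""
    def occlusion_count(pos):
        return sum(1 for target in portion_samples if is_occluded(pos, target))
    return sorted(candidates, key=occlusion_count)[:keep]

# Example with a toy occlusion test that blocks everything from one position.
picked = select_positions(
    candidates=[(0, 0, 10), (5, 5, 10)],
    portion_samples=[(1, 1, 0), (2, 2, 0)],
    is_occluded=lambda pos, target: pos == (0, 0, 10),
    keep=1,
)
```

In the system described above, the surviving positions would additionally be weighted by cinematographic considerations before the discrete Fly Zone positions 150 are fixed.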

Turning now to drone flight operation, reference is again made to Fig. 2 and it is seen that an initial stage is for the drone 100 to be flown to Fly Zone 122 along the Fly Zone Flight Path. This flight is initiated by the user and controlled automatically by the drone controller 102 in accordance with the pre-established coordinates of the Home Point 124, the Fly Zone 122 and the location 128 in the Fly Zone 122 which is closest to the Home Point 124. Preferred stages in the flight of the drone 100 along the Fly Zone Flight Path to location 128 are indicated in Fig. 9. As seen in Fig. 9, the user selects the Field of Interest 120 preferably by using the touch screen on smartphone 104, which may appear as illustrated in Block A of Fig. 1. Having selected the Field of Interest 120, the user sees a graphical representation of the Field of Interest 120, the Fly Zone 122, the Home Point 124 and location 128 and the real time video output of the drone camera 106, as illustrated in Block D of Fig. 1.

The user then physically places the drone at Home Point 124 and by using the touch screen on smartphone 104, instructs the drone 100 to fly from the Home Point 124 to location 128. When the drone 100 arrives at location 128, the filming process can begin.

Returning to Fig. 2, the user selects one or more of the plural discrete 2-dimensional portions 140 of the Field of Interest 120 that the user wishes to film. As indicated in Fig. 10, the user is typically looking at a screen similar to that illustrated in Block D of Fig. 1 wherein the graphical representation of the Field of Interest 120 is overlaid with representations of the plural 2-dimensional portions 140 of the Field of Interest 120. Using this touch screen, the user selects one or more of the plural 2-dimensional portions 140 to be filmed.

Thereafter, as indicated in Fig. 2, the system selects one of the plurality of discrete fly zone positions 150 and rotational orientations to be assumed by the drone 100 during this stage of filming. This selection is carried out as described below with reference to Figs. 11 - 13.

As seen in Fig. 11, for each of the plural discrete fly zone positions 150, the system optimizes the rotational orientation of the drone 100 to enable the drone camera 106 to view the user-selected one or more plural discrete two-dimensional portions 140 of the Field of Interest 120 under optimal cinematographic conditions. Referring additionally to the diagrams in Figs. 12 and 13, it is appreciated that optimization of the rotational orientation of the drone 100 for each of the plural discrete fly zone positions 150 includes maximization of the extent of overlap between the image plane 300 of the drone camera 106 and a projection 310 of the user-selected one or more 2-dimensional portions 140 of the Field of Interest 120, indicated by reference numeral 320 in Figs. 12 and 13. Maximization of the extent of overlap between the image plane 300 of the drone camera 106 and a projection 310 of the user-selected one or more 2-dimensional portions 320 of the Field of Interest 120 is preferably carried out by the following steps:

1. Projecting vertices 330 of the user-selected one or more 2-dimensional portions 320 of the Field of Interest 120 onto the image plane 300 for each of the plurality of discrete fly zone positions 150 initially for a somewhat arbitrary rotational orientation of the drone camera 106 using a conventional camera pinhole model;

2. Drawing a rectangle 350 subsuming the vertices 340 of the projection 310 of the user-selected one or more 2-dimensional portions 320 of the Field of Interest 120;

3. Bisecting the image plane in mutually perpendicular directions as indicated by lines 360 and 370;

4. Computing distances between a center 380 of the image plane 300 and the intersections 390 of the bisecting lines 360 and 370 with rectangle 350 to derive the X LEFT, X RIGHT, Y TOP and Y BOTTOM distances 400, 410, 420 and 430 respectively;

5. Computing the ratios of the X LEFT/X RIGHT and Y TOP/Y BOTTOM distances;

6. Representing the extent of overlap between the image plane 300 of the drone camera 106 and the projection 310 of the user-selected one or more 2-dimensional portions 320 of the Field of Interest 120 as being proportional to the difference between the ratios of the X LEFT/X RIGHT and Y TOP/Y BOTTOM distances and 1;

7. Repeating above steps 1 - 6 for each of a plurality of rotational orientations, each preferably arrived at by making one rotation step in either pitch or yaw, until there is no longer any convergence of the difference between the ratios of the X LEFT/X RIGHT and Y TOP/Y BOTTOM distances and 1;

8. For each of the plural discrete fly zone positions 150, selecting a preferred rotational orientation based on the above steps 1 - 7, which rotational orientation has the least difference between the ratios of the X LEFT/X RIGHT and Y TOP/Y BOTTOM distances and 1;

9. Assigning a score to the one of the plural discrete fly zone positions 150 at its optimal rotational orientation as established by above steps 1 - 8. The score may take into account cinematographic considerations as well as extent of overlap.
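The projection and centering metric of steps 1 - 6 above can be sketched as follows. This is an illustrative Python sketch, not part of the patent disclosure; it assumes the image-plane center lies at the origin and that the bounding rectangle straddles both bisecting lines, and the camera pose and focal length used in the example are arbitrary.

```python
# Sketch: pinhole projection of portion vertices (step 1), bounding rectangle
# (step 2) and deviation of the left/right and top/bottom distance ratios
# from 1 (steps 3-6). A metric near 0 means the portion is well centered.
import numpy as np

def project(points_world, R, t, f):
    """Pinhole model: R, t map world to camera coordinates; f is focal length."""
    cam = (np.asarray(R) @ (np.asarray(points_world, dtype=float).T
                            - np.asarray(t, dtype=float).reshape(3, 1))).T
    return np.column_stack((f * cam[:, 0] / cam[:, 2],
                            f * cam[:, 1] / cam[:, 2]))

def centering_metric(image_points):
    """Bounding rectangle of the projected vertices, then the deviation of the
    X LEFT/X RIGHT and Y TOP/Y BOTTOM ratios from 1."""
    x_min, y_min = image_points.min(axis=0)
    x_max, y_max = image_points.max(axis=0)
    eps = 1e-9                                  # guard against division by zero
    ratio_x = abs(x_min) / (abs(x_max) + eps)   # X LEFT / X RIGHT
    ratio_y = abs(y_max) / (abs(y_min) + eps)   # Y TOP / Y BOTTOM
    return abs(ratio_x - 1.0) + abs(ratio_y - 1.0)

# Example: a square portion centered in front of a forward-looking camera.
pts = project([[-1, -1, 5], [1, -1, 5], [1, 1, 5], [-1, 1, 5]],
              np.eye(3), [0, 0, 0], f=1.0)
metric = centering_metric(pts)
```

Steps 7 - 9 would wrap this metric in a search over pitch and yaw steps for each fly zone position, stopping when the metric no longer converges and scoring the best orientation found.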

The system then selects that one of the plural discrete Fly Zone positions 150 having the best score and selects the optimal rotational orientation for that Fly Zone position in accordance with steps 1 - 8 above.

Returning once again to Fig. 2, it is seen that the system provides control input instructions to the drone controller 102 to cause the drone 100 to arrive at the system selected one of the plural discrete Fly Zone positions 150 and to cause the drone camera 106 to assume the system-selected rotational orientation.

Filming takes place until the user or the system initiates a return flight from the Fly Zone 122 to the Home Point 124.

It will be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described hereinabove. Rather the scope of the present invention includes both combinations and subcombinations of features described hereinabove and modifications thereof, which are not in the prior art.