
Title:
METHOD TO CONTROL A MOVEMENT OF A ROBOT AND A CONTROLLER
Document Type and Number:
WIPO Patent Application WO/2024/002479
Kind Code:
A1
Abstract:
A method to control a movement of a robot (100), in particular a legged robot, comprising a controller (1) with a user interface unit configured to receive a user command for controlling a robot movement by applying an actuator command c = (v, w) to an actuator of the robot (100). The user interface unit comprises a touch sensitive surface (10). The method comprises the steps of touching the touch sensitive surface (10) and thereby setting a new point of origin (x0, y0) of a user reference frame (x, y); while staying in touch with the touch sensitive surface (10), generating a user command (x1, y1) within the user reference frame (x, y); and translating the user command (x1, y1) into the actuator command c, with v = f1(y1) and w = f2(x1), and therefore into the robot movement, wherein v and w each correspond to one velocity out of the velocities of a robot reference frame (xR, yR, zR).

Inventors:
MENNECHET FLORENT (CH)
GEHRING CHRISTIAN (CH)
Application Number:
PCT/EP2022/067936
Publication Date:
January 04, 2024
Filing Date:
June 29, 2022
Assignee:
ANYBOTICS AG (CH)
International Classes:
G05D1/00; B25J9/16; G05B19/427
Foreign References:
US20110221692A1 (2011-09-15)
US20190094850A1 (2019-03-28)
US20150253771A1 (2015-09-10)
US20200089302A1 (2020-03-19)
Attorney, Agent or Firm:
E. BLUM & CO. AG (CH)
Claims:
Claims

1. A method to control a movement of a robot (100), in particular a legged robot, comprising

• a controller (1) with a user interface unit configured to receive a user command for controlling a robot movement by applying an actuator command c = (v, w) to an actuator of the robot (100),

• wherein the user interface unit comprises a touch sensitive surface (10), the method comprising the steps of

• touching the touch sensitive surface (10) and thereby setting a new point of origin (x0, y0) of a user reference frame (x, y),

• while staying in touch with the touch sensitive surface (10), generating a user command (x1, y1) within the user reference frame (x, y),

• translating the user command (x1, y1) into the actuator command c, with

• v = f1(y1)

• w = f2(x1)

and therefore into the robot movement, wherein v and w each correspond to one velocity selected out of the velocities of a robot reference frame (xR, yR, zR).

2. The method according to claim 1, wherein v corresponds to a translational velocity tx in heading direction xR of a robot reference frame (xR, yR, zR) and w corresponds to one of the remaining velocities in the dimensions (yR, zR) of the robot reference frame (xR, yR, zR).

3. The method according to one of the preceding claims, wherein the method comprises the step of

• translating the user command (x1, y1) into the actuator command c, wherein w corresponds to an angular velocity az, and therefore into the robot movement of yawing around the vertical axis zR of the robot reference frame (xR, yR, zR).

4. The method according to claim 1 or 2, wherein the method comprises the step of

• translating the user command (x1, y1) into the actuator command c, and therefore into the robot movement, wherein w corresponds to a translational velocity ty in lateral direction yR of the robot reference frame (xR, yR, zR).

5. The method according to one of the preceding claims, comprising the step of stopping the robot movement, as soon as a touch of the touch sensitive surface (10) is released.

6. The method according to one of the preceding claims, comprising the step of ignoring any further functionalities of the touch sensitive surface (10) as long as the touch is not released.

7. The method according to one of the preceding claims, wherein the user command (x1, y1) is only translated into an actuator command c if a movement of the cursor on the touch sensitive surface (10) extends beyond a buffer zone in x-direction and/or in y-direction of the user reference frame (x, y).

8. The method according to claim 7, wherein the buffer zone

• for the user command on a ±x-axis of the user reference frame is: 0.005 cm ≤ |x1| ≤ 0.5 cm,

• for the user command on a ±y-axis of the user reference frame is: 0.005 cm ≤ |y1| ≤ 0.5 cm.

9. The method according to one of the preceding claims, wherein the sensitivity of the user reference frame (x, y) is adaptable such that the user command on the ±x-axis is 10 times more sensitive than the user command on the ±y-axis, in particular wherein the user command on the ±x-axis is 2 times more sensitive than the user command on the ±y-axis.

10. The method according to one of the preceding claims, wherein a range of an absolute scale of the ±x-axis and/or the ±y-axis of the user reference frame (x, y) is adaptable in size.

11. The method according to one of the preceding claims, wherein the user reference frame (x, y) is invertible, such that the user command (x1, y1) is translated into the actuator command c with v = -f1(y1) and w = -f2(x1).

12. The method according to one of the preceding claims, wherein the touch sensitive surface (10) comprises a pre-defined area to set the new point of origin (2).

13. The method according to one of the preceding claims, wherein switching between alternative method steps, in particular between the method steps of claims 3, 4 and/or 9, is done by

• touching the touch sensitive surface (10) twice and staying in touch with the touch sensitive surface (10) at the second touch to switch method steps, and/or

• releasing the touch to the touch sensitive surface (10) to return to the previous method step.

14. Method according to claim 13, wherein the touch sensitive surface (10) is a touch screen and wherein the background colour of the touch screen (10) and/or the user reference frame (x, y) changes according to the respective method step, to provide feedback about the respective method step.

15. Controller (1) to conduct the method according to one of claims 1 to 14.

16. Controller (1) according to claim 15, wherein the controller is integrated into a tablet application, a smartphone application and/or a pc application.

17. A computer program for carrying out the method according to one of claims 1 to 14.

18. Use of the controller according to claim 15 or 16 or computer program according to claim 17 with solely one finger or solely one input device.

19. Use according to claim 18, wherein a colour change provides a visual feedback about the present method step.

Description:
Method to Control a Movement of a Robot and a

Controller

Technical Field

The invention refers to a method to control a movement of a robot, a controller therefor, and a use of the controller.

Background Art

Robots are used for various tasks, in particular for supporting human work in various environments. The robot movements are thereby controlled and steered by a controller.

In particular, if a robot is acting in a hazardous environment, the operator steering the robot needs to pay full attention to the robot's movements to prevent damage to the robot or its environment.

Steering a robot in a hazardous environment can therefore be very demanding and requires a controller that can be handled easily. Preferably, such a controller can be operated without the operator even having visual control of the controller itself. This functionality is implemented by having joystick-like elements on the controller that allow the operator to steer the robot by just moving the joystick-like elements with the fingers, with no need for visual control of the elements due to the haptic feedback provided by the joystick elements.

The disadvantage of such a controller is of course its single-purpose application. In a world where devices are expected to have multi-purpose applications, a classic controller with joysticks dedicated to only one functionality is not convenient anymore. A more convenient way is to integrate such a controller as an application in a smartphone, tablet or pc device, using the touch sensitive surface functionality or touch screen of the respective device. Such integration requires operating the controller via a touch sensitive surface or touch screen.

A further disadvantage of a known controller is that steering with joystick elements often requires both hands and/or multiple fingers to be on the controller. In particular, since such a controller might be heavy to carry, it is very disadvantageous if both hands are needed to control it, since it then basically has to be carried by means of a neck holder or similar. In addition, in a hazardous environment, a user might wear gloves while steering the robot. A joystick-like element can be difficult to steer with gloves.

Therefore, the steering should advantageously be possible with only one hand, and in particular with only one finger, so that the controller can be carried with the other hand. In particular, if the user is wearing gloves, it is advantageous if the controller can be operated using only a single tactile device, such as a pen-like input device that can also be used with gloves.

However, known methods or applications to control the robot via a touch sensitive surface or touch screen require the operator to have visual control of the touch sensitive surface or touch screen, since in contrast to a joystick-like controller, there is no haptic feedback anymore.

Disclosure of the Invention

The problem to be solved by the present invention is therefore to provide a method and application that allow an operator to control a robot in a hazardous environment with a tablet, smartphone or pc having a touch sensitive surface or touch screen, without the operator needing visual control of the touch sensitive surface or touch screen.

In addition, the present invention further solves the problem of providing a method wherein the controller can be operated by means of one finger or one input device (tactile device).

The problem is solved by a first aspect of the invention referring to a method to control a movement of a robot, a second aspect referring to a controller to conduct the method, a third aspect referring to a computer program for carrying out the method, and a further aspect referring to a use of the controller or computer program.

Unless otherwise stated, the following definitions shall apply in this specification:

The terms "a", "an", "the" and similar terms used in the context of the present invention are to be construed to cover both the singular and plural unless otherwise indicated herein or clearly contradicted by the context. Further, the terms "including", "containing" and "comprising" are used herein in their open, non-limiting sense. The term "containing" shall include both "comprising" and "consisting of".

Advantageously, the term "torso of a robot" or "torso" refers to a main body of a robot comprising the logic components for controlling the robot, to which the limb section or limb is attached. In particular, the torso might comprise multiple limbs, e.g. for a quadruped robot.

Advantageously, the term "actuator command c = (v, w)" refers to a command that can be received by an actuator of a robot. In particular, if the robot is a legged robot, the one or more actuators that can receive the actuator command c are integrated in the one or more legs of the robot.

Advantageously, a legged robot can comprise various actuators integrated in each of its legs:

• a hip abduction/adduction (HAA) actuator adapted to connect to the robot torso and connecting to a hip flexion/extension (HFE) joint,

• the hip flexion/extension (HFE) joint connecting to a hip flexion/extension (HFE) actuator,

• the hip flexion/extension (HFE) actuator connecting to the upper leg,

• the upper leg connecting to a knee flexion/extension (KFE) actuator,

• the knee flexion/extension (KFE) actuator connecting to a shank,

• the shank connecting to a shank tube, and

• the shank tube being adapted to connect to the robot foot, or to a robot foot adapter which is adapted to connect to the robot foot.

Advantageously, the actuator command c is a general term for all the actuator commands that are sent to the, in particular legged, robot. Therefore, the actuator command c describes a signal that is sent to the robot to trigger an action of the robot, respectively a coordinated action of its actuators.

Advantageously, the "robot movement" corresponds to a movement direction of the robot that comprises the actuator receiving the actuator command c.

Advantageously, the robot reference frame (xR, yR, zR) is a coordinate frame related to a robot main body or robot torso. Within the robot reference frame (xR, yR, zR), the robot can move with six velocities defined in the following:

• a translational velocity tx in heading direction xR of the robot reference frame,

• a translational velocity ty in lateral direction yR of the robot reference frame,

• a translational velocity tz in vertical direction zR of the robot reference frame,

• an angular velocity ax around the xR-axis (roll-axis),

• an angular velocity ay around the yR-axis (pitch-axis),

• an angular velocity az around the zR-axis (yaw-axis).

Advantageously, the heading direction xR refers to the direction in which the heading of the robot is pointing. More particularly, the heading direction xR is the direction in which the legged robot is walking, corresponding to the longitudinal axis of its robot torso.

Further advantageously, the robot movement in lateral direction is lateral to the heading direction and corresponds to the robot movement in yR-direction of the robot frame.

The first aspect of the invention refers to a method to control a movement of a robot, in particular a legged robot, comprising a controller. The controller has a user interface unit to receive a user command for controlling a robot movement by applying an actuator command c = (v, w) to an actuator of the robot.

Such an actuator is advantageously an actuator of a leg in a legged robot. Alternatively, such an actuator can be an actuator in any other type of robot.

The user interface comprises a touch sensitive surface, in particular a touch screen or touch pad. The touch sensitive surface can advantageously receive input via the touch of a finger or an input device, like a pen-like input device or any other tactile device or similar.

Advantageously, the method is related to a computer program running on a device like a tablet, smartphone or pc. Therefore, if the computer program is activated, the method runs on the respective device.

The method comprises the steps of

• touching the touch sensitive surface and thereby setting a new point of origin (x0, y0) of a user reference frame (x, y), in particular for a cursor,

• while staying in touch with the touch sensitive surface, generating a user command (x1, y1) within the user reference frame (x, y); in particular, while staying in touch with the touch sensitive surface, moving the cursor within the user reference frame (x, y) to input a user command (x1, y1),

• translating the user command (x1, y1) into the actuator command c, with

• v = f1(y1)

• w = f2(x1)

and therefore into the robot movement, wherein v and w each correspond to one velocity, in particular to different velocities, selected out of the (six) velocities that the robot can take within the robot reference frame (xR, yR, zR).
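The mapping above can be sketched in a few lines of code. This is an illustrative sketch only, not part of the patent: the patent requires just v = f1(y1) and w = f2(x1), and the choice of linear functions with gains `k_v` and `k_w` is an assumption.

```python
# Hypothetical sketch: translate a user command (x1, y1), given relative
# to the touch point of origin, into an actuator command c = (v, w).
# Linear f1, f2 with gains k_v, k_w are assumptions for illustration.

def translate(x1: float, y1: float,
              k_v: float = 1.0, k_w: float = 1.0) -> tuple[float, float]:
    """Map a displacement in the user reference frame to c = (v, w)."""
    v = k_v * y1   # f1: e.g. forward velocity from the y-displacement
    w = k_w * x1   # f2: a second velocity (e.g. yaw rate) from x
    return v, w

# A drag of 1 cm in x and 2 cm in y:
v, w = translate(x1=1.0, y1=2.0)   # -> (2.0, 1.0)
```

Which of the six robot velocities v and w actually drive is the subject of the embodiments below.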

Advantageously, the user reference frame is a coordinate frame related to the touch sensitive surface, in which the user can input commands. The commands are input by touching the touch sensitive surface and moving a finger or input device over the touch sensitive surface.

When a finger or input device first touches the touch sensitive surface, this position is referred to as the point of origin (x0, y0) of the user reference frame. The position might be marked with a cursor. While staying in touch with the touch sensitive surface and moving the finger or input device over it, user commands (x1, y1) are generated that translate into an actuator command c.

Advantageously, the term "translate" refers to a conversion of the signal by means of a computing unit.

Advantageously, a user command (x1, y1) is generated at a rate of 20 Hz to 40 Hz while moving the finger over the touch sensitive surface. Further advantageously, it can be defined that after each interval, a new user command (x1, y1) is defined.

In an advantageous embodiment of the invention, v corresponds to the translational velocity tx in heading direction xR of the robot reference frame and w corresponds to one out of the (five) remaining velocities in another dimension (yR, zR) of the robot reference frame.

In a further advantageous embodiment of the invention, the method comprises the step of translating the user command (x1, y1) into the actuator command c, wherein w corresponds to the angular velocity az, such that the movement of the robot results in yawing around the vertical axis zR of the robot frame.

In an advantageous embodiment, the finger or input device is moved over the touch sensitive surface to generate a user command (x1, y1), which is translated into an actuator command c = (v, w), with v = f1(y1) corresponding to tx and w = f2(x1) corresponding to az.

This means in particular that an actuator of the robot is steered to move in xR-direction of the robot reference frame if the finger or input device is moved over the touch sensitive surface in y-direction of the user reference frame. If the finger or input device is moved over the touch sensitive surface in x-direction of the user reference frame, the actuator of the robot is steered to yaw around the vertical axis zR with an angular velocity az.

In particular, for an advantageous method, if the finger or input device is moved from (x0, y0) to (x1, y1), this might result in a combination, in particular a vector addition, of the two velocities tx and az.

In a further advantageous embodiment of the invention, the user command (x1, y1) is translated into an actuator command c, and therefore into a robot movement, wherein w corresponds to a translational velocity ty in lateral direction yR of the robot frame. In a further advantageous embodiment, the finger or input device is moved over the touch sensitive surface to generate a user command (x1, y1), which is translated into an actuator command c = (v, w), with v = f1(y1) corresponding to tx and w = f2(x1) corresponding to ty.

This means in particular that an actuator of the robot is steered to move in xR-direction of the robot reference frame if the finger or input device is moved over the touch sensitive surface in y-direction of the user reference frame. If the finger or input device is moved over the touch sensitive surface in x-direction of the user reference frame, the actuator of the robot is steered to move in the lateral yR-direction, lateral to the heading direction xR.

In particular, if the finger or input device is moved from (x0, y0) to (x1, y1), this results in a combination of the two velocities tx and ty.

In a further advantageous embodiment of the invention, the robot movement stops as soon as the finger or input device is removed from the touch sensitive surface, i.e. as soon as the touch of the touch sensitive surface is released. This functionality makes it very intuitive for a user to stop the robot movement.
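The touch lifecycle described so far (touch sets the origin, dragging generates commands, release stops the robot) can be sketched as a small event handler. The event names `on_touch_down`/`on_touch_move`/`on_touch_up` are illustrative assumptions, not from the patent; the simple difference-based f1, f2 are placeholders.

```python
# Hypothetical touch handler sketch: the first touch sets the point of
# origin (x0, y0); dragging updates c = (v, w); releasing the touch
# immediately zeroes the command so the robot stops.

class TouchController:
    def __init__(self):
        self.origin = None          # (x0, y0), set on first touch
        self.command = (0.0, 0.0)   # current actuator command c = (v, w)

    def on_touch_down(self, x, y):
        self.origin = (x, y)        # new point of origin of the user frame

    def on_touch_move(self, x, y):
        if self.origin is None:
            return
        x0, y0 = self.origin
        # placeholder linear mappings: v = f1(y1), w = f2(x1)
        self.command = (y - y0, x - x0)

    def on_touch_up(self):
        self.origin = None
        self.command = (0.0, 0.0)   # robot stops as soon as touch is released
```

A real controller would publish `self.command` to the robot at the 20 Hz to 40 Hz rate mentioned above.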

In a further advantageous embodiment of the invention, the method comprises the step of protecting the commands indicated by the touch of the touch sensitive surface. As long as a finger or input device moves on the touch sensitive surface, no other functionality is triggered by touching the touch sensitive surface.

Advantageously, as long as a finger or input device moves on the touch sensitive surface, no other functionality is triggered by a touch of the touch sensitive surface from this finger or input device.

Advantageously, even if a second finger or second input device touches the sensitive surface, this does not trigger another functionality than what the first touch of the surface started. Further advantageously, the protection against the involuntary triggering of other functions only involves the finger or input device which is first in touch with the sensitive surface and gives control commands; in particular, there is no deactivation of other functions.

Therefore, advantageously, if the first finger or input device slides over the stop button during the steering, the stop button is protected. But the stop can still be triggered by a direct tap from a second finger or input device.

To simplify: the finger in continuous contact that is involved in the robot control cannot trigger anything else.

In a further advantageous embodiment of the invention, the user command (x1, y1) is only translated into an actuator command c if a movement of the cursor on the touch sensitive surface (10) extends beyond a buffer zone in x-direction and/or in y-direction. The buffer zone prevents unintentional touches of the touch sensitive surface from generating an actuator command c.

Advantageously, the buffer zone

• for the user command on a ±x-axis of the user reference frame is: 0.005 cm ≤ |x1| ≤ 0.5 cm,

• for the user command on a ±y-axis of the user reference frame is: 0.005 cm ≤ |y1| ≤ 0.5 cm.
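The buffer zone behaves like a per-axis dead zone. The sketch below uses the 0.005 cm lower bound from the description; treating the 0.5 cm upper bound as a saturation (clamp) is an assumption made for illustration.

```python
# Hypothetical dead-zone filter per axis of the user reference frame:
# displacements below the lower bound are ignored (no command), and
# displacements beyond the upper bound are clamped (assumed saturation).

DEAD_ZONE = 0.005    # cm: below this, no actuator command is generated
SATURATION = 0.5     # cm: displacements beyond this are clamped

def filter_axis(d: float) -> float:
    if abs(d) < DEAD_ZONE:
        return 0.0                                   # inside buffer zone
    return max(-SATURATION, min(SATURATION, d))      # clamp to the scale

x1 = filter_axis(0.003)   # suppressed: unintentional micro-touch
y1 = filter_axis(0.8)     # clamped to 0.5 cm
```

Applying `filter_axis` to both x1 and y1 before the translation into c = (v, w) realizes the behaviour of claims 7 and 8.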

In a further advantageous embodiment of the invention, the sensitivity of the user reference frame (x, y) is adaptable. In particular, the sensitivity refers to the sensitivity, respectively the interval of the scale, of the x- and y-axes of the user reference frame. Alternatively, the variation in sensitivity might be implemented by having differently sized buffer zones in x- and y-direction of the user reference frame.

The user command on the ±x-axis is 10 times more sensitive than the user command on the ±y-axis; in particular, the user command on the ±x-axis is 2 times more sensitive than the user command on the ±y-axis. This feature allows the sensitivity of the robot movements to be adapted to the environment.

In a further advantageous embodiment of the invention, the scale on the ±x-axis of the user reference frame is not proportional to the scale on the ±y-axis of the user reference frame. In particular, this feature also has an influence on the sensitivity of the user reference frame. By having a different proportionality between the x-axis and the y-axis, the sensitivity in x-direction and y-direction of the user reference frame might vary. In particular, the different proportionality means that the scales have different intervals, and therefore a movement of the finger or input device over a defined distance in x-direction of the user reference frame does not result in the same increase of the actuator command c as the same touch movement over the defined distance in y-direction of the user reference frame.
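Per-axis sensitivity can be sketched as two independent gains. The factor 2 for the x-axis is one of the values named in the description; the function and parameter names are illustrative assumptions.

```python
# Hypothetical per-axis sensitivity sketch: the x- and y-axes of the
# user frame use different scale factors, so equal finger travel
# produces commands of different magnitude.

def translate_scaled(x1: float, y1: float,
                     gain_x: float = 2.0,   # x-axis twice as sensitive
                     gain_y: float = 1.0) -> tuple[float, float]:
    v = gain_y * y1      # forward command from the y-displacement
    w = gain_x * x1      # second command from the more sensitive x-axis
    return v, w
```

With `gain_x=2.0`, the same 0.5 cm drag yields twice the command on the x-axis as on the y-axis, matching the "2 times more sensitive" variant above.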

In a further advantageous embodiment of the invention, an absolute scale of the ±x-axis and/or the ±y-axis of the user reference frame (x, y) is adaptable in size. This means that the size of the x-axis and/or y-axis of the user reference frame might vary on the touch sensitive surface.

In a further advantageous embodiment of the invention, the user reference frame (x, y) is invertible, such that the user command (x1, y1) is translated into the actuator command c with v = -f1(y1) and w = -f2(x1).

This has the effect that the steering of the robot actuator, respectively the robot, is more intuitive. If the robot changes its movement direction, in particular has its heading direction directed towards the person controlling the robot, it might be more intuitive to invert the steering directions, meaning that a touch movement of the finger or input device in -y-direction of the user reference frame on the touch sensitive surface results in a robot movement in heading direction xR of the robot reference frame (xR, yR, zR).
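The inversion amounts to flipping the sign of both mappings. A minimal sketch, assuming linear f1 and f2 as before (the `inverted` flag is an illustrative assumption):

```python
# Hypothetical sketch of the invertible user frame: when the robot
# faces the operator, both mappings change sign, i.e.
# v = -f1(y1) and w = -f2(x1).

def translate_invertible(x1: float, y1: float,
                         inverted: bool = False) -> tuple[float, float]:
    sign = -1.0 if inverted else 1.0
    return sign * y1, sign * x1    # (v, w)
```

A drag towards the operator (negative y1) then still moves the robot in its own heading direction xR.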

In a further advantageous embodiment of the invention, the touch sensitive surface comprises a predefined area to set the new point of origin .

In a further advantageous embodiment of the invention, the switching between alternative method steps might be done by

• touching the touch sensitive surface twice and staying in touch with the touch sensitive surface at the second touch to switch method steps, and/or

• releasing the touch to the touch sensitive surface to return to the previous method step.

The technical effect of this feature is in particular that the user can switch easily between alternative method steps, without visual control of the touch sensitive surface and without having a look at it. Therefore, the focus of the user can still stay with the robot movement and the robot.
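The hold-to-switch behaviour can be sketched as a small state machine. The mode names "yaw" and "lateral" and the event names are illustrative assumptions; the patent only specifies holding a second touch to switch and releasing it to return.

```python
# Hypothetical mode-switch sketch: a second touch, held down, switches
# to an alternative method step (here: w = ty instead of w = az);
# releasing the second touch returns to the previous method step.

class ModeSwitch:
    def __init__(self):
        self.mode = "yaw"          # default: w is the yaw rate az
        self._previous = "yaw"

    def on_second_touch_down(self):
        self._previous = self.mode
        self.mode = "lateral"      # hold to stay in the lateral mode

    def on_second_touch_up(self):
        self.mode = self._previous # return to the previous method step
```

The current `mode` would then select which velocity of the robot reference frame the mapping w = f2(x1) drives.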

The switching between alternative method steps refers in particular to switching between the method steps of:

• translating the user command (x1, y1) into the actuator command c, wherein w corresponds to an angular velocity az, and therefore into the robot movement of yawing around the vertical axis zR of the robot frame, and/or

• translating the user command (x1, y1) into the actuator command c, and therefore into the robot movement, wherein w corresponds to a translational velocity ty in lateral direction yR of the robot frame, and/or

• adapting the sensitivity of the user reference frame (x, y) such that the user command on the ±x-axis is 10 times more sensitive than the user command on the ±y-axis, or vice versa, in particular wherein the user command on the ±x-axis is 2 times more sensitive than the user command on the ±y-axis, or vice versa.

In a further advantageous embodiment of the invention, if the touch sensitive surface is a touch screen, the background colour of the touch screen changes according to the respective method step.

Such a colour change of the touch screen or user reference frame can be noticed by the user out of the corner of the eye. Therefore, even though the user does not look at the touch screen but still focuses on the robot, the user might notice the colour change of the touch screen or user reference frame. Since a specific colour change might refer to a specific method step, the user always knows which method step the action taken on the touch screen currently refers to.

In particular, if the colour is blue, this might be a signal that the present method step refers to the method step wherein w corresponds to an angular velocity az (yawing). If the background colour is red, this might be a signal that the present method step refers to the method step wherein w corresponds to a translational velocity ty.

In particular, the colour change relies on the peripheral vision of the user. The user, even though not looking directly at the touch screen, notices a colour change at the periphery of his or her visual perception of the environment. Therefore, the colour change basically replaces a haptic feedback, e.g. from a joystick-like control element, as known from prior art.
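The mode-to-colour mapping can be sketched as a simple lookup. Blue for yawing and red for lateral motion are the examples given in the description; the fallback colour is an assumption.

```python
# Hypothetical sketch of the peripheral colour feedback: each method
# step maps to a background colour so the operator notices mode changes
# without looking at the screen.

MODE_COLOURS = {
    "yaw": "blue",       # w corresponds to the angular velocity az
    "lateral": "red",    # w corresponds to the translational velocity ty
}

def background_colour(mode: str) -> str:
    # neutral fallback for unknown modes (assumption, not in the patent)
    return MODE_COLOURS.get(mode, "grey")
```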

A second aspect of the invention refers to a controller to conduct the method according to the first aspect of the invention.

Advantageously, the controller is integrated into a tablet application, a smartphone application and/or a pc application.

A third aspect of the invention refers to a computer program for carrying out the method according to the first aspect, in particular with a controller according to the second aspect of the invention.

A fourth aspect of the invention refers to a use of the controller according to the second aspect or of a computer program according to the third aspect of the invention with solely one finger or input device.

Further advantageously, the use of the controller for the method according to the first aspect of the invention is adapted to not require any visual control of the touch sensitive surface or of the controller.

Other advantageous embodiments are listed in the dependent claims as well as in the description below.

Brief Description of the Drawings

The invention will be better understood and objects other than those set forth above will become apparent from the following detailed description thereof. Such description makes reference to the annexed drawings, wherein:

Fig. 1a shows a schematic of an embodiment of a controller according to a second aspect of the invention to conduct a method according to a first aspect of the invention;

Fig. 1b shows a controller for a robot according to a second aspect of the invention to perform the method according to the first aspect of the invention;

Fig. 2a shows a schematic of how a user command (x1, y1) on a touch sensitive screen gets translated into a translational velocity tx in heading direction xR of a robot reference frame (xR, yR, zR);

Fig. 2b shows a schematic of how a user command (x1, y1) on a touch sensitive screen gets translated into an angular velocity az, and therefore into the robot movement of yawing around the vertical axis zR of the robot reference frame (xR, yR, zR);

Fig. 2c shows a schematic of how a user command (x1, y1) on a touch sensitive screen gets translated into a combination of tx and az;

Fig. 3a shows a coordinate system of an embodiment of a user reference frame (x, y);

Fig. 3b shows a coordinate system of an embodiment of a robot reference frame (xR, yR, zR); and

Fig. 4 shows an embodiment of a user reference frame (x, y) with buffer zones.

Modes for Carrying Out the Invention

Fig. 1a shows a schematic of an embodiment of a controller according to a second aspect of the invention to conduct the method according to a first aspect of the invention.

The controller can conduct a method to control a movement of a robot 100, in particular a legged robot. The controller has a user interface unit to receive a user command for controlling a robot movement by applying an actuator command c = (v, w) to an actuator of the robot 100. The user interface unit comprises a touch sensitive surface 10, in particular, as shown in the figure, a touch screen.

In particular, the controller is integrated into a tablet application, a smartphone application and/or a pc application, as shown in Fig. 1a and 1b.

In particular, a computer program for carrying out the method according to the first aspect of the invention runs on the controller.

The method to control a movement of a robot 100 comprises the steps of

• touching the touch sensitive surface 10 and thereby setting a new point of origin (x0, y0) of a user reference frame (x, y),

• while staying in touch with the touch sensitive surface 10, generating a user command (x1, y1) within the user reference frame (x, y),

• translating the user command (x1, y1) into the actuator command c, with

• v = f1(y1)

• w = f2(x1)

and therefore into the robot movement, wherein v and w each correspond to one velocity, in particular to different velocities, selected out of the velocities of the robot reference frame (xR, yR, zR).

As shown in Fig . lb, for an advantageous embodiment of the invention, a cursor 2 can be set at the point of origin (xo, yo ) of a user reference frame (x, y) . I f a finger or input device is moving on the touch sensitive surface 10 , the robot moves accordingly from the point of origin (xo, yo ) to a position of the user command (xl , yl ) .

Advantageously, the method might comprise the step that the movement of the robot 100 stops , as soon as a touch, respectively the finger or input device , is released or removed from the touch sensitive surface 10 . This allows very easy control over the robot 100 .

Further advantageously, the method might comprise the step of ignoring any further functionalities of the touch sensitive surface 10 as long as the touch, i.e. the finger or input device, is not released.
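The touch lifecycle described above can be sketched as a small handler. The class and the linear gains are hypothetical and not part of the application; the sketch only illustrates the described behaviour: the first touch sets the origin, further touches are ignored while one is held, and release stops the robot.

```python
class TouchController:
    """Minimal, hypothetical sketch of the touch lifecycle on the
    touch sensitive surface 10."""

    def __init__(self):
        self.origin = None          # (x0, y0), set on the first touch
        self.command = (0.0, 0.0)   # actuator command c = (v, w)

    def on_touch_down(self, x, y):
        # Ignore further touches while one touch is already held.
        if self.origin is None:
            self.origin = (x, y)

    def on_touch_move(self, x, y):
        if self.origin is None:
            return
        x0, y0 = self.origin
        x1, y1 = x - x0, y - y0     # user command in the user reference frame
        # v = f1(y1), w = f2(x1); linear gains of 0.5 assumed for illustration.
        self.command = (0.5 * y1, 0.5 * x1)

    def on_touch_up(self):
        # The robot stops as soon as the touch is released.
        self.origin = None
        self.command = (0.0, 0.0)
```

A usage pattern would be to feed the platform's touch events into this handler and periodically send `command` to the robot's actuator interface.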

Further advantageously, the touch sensitive surface might comprise a pre-defined area for setting the new point of origin (x0, y0). In Fig. 1a and 1b, this area corresponds to the touch sensitive surface 10 as shown in the figures.

In a further advantageous embodiment of the invention, the controller 1 or computer program can be controlled with only one finger or input device that moves over the touch sensitive surface 10, as shown in Fig. 1a and 1b.

Therefore, no visual control of the touch sensitive surface 10 or of the controller 1 is required to interact with the robot 100.

For Fig. 2a, 2b and 2c, the input device (e.g. a tactile pen) is positioned within the user reference frame (x, y) on the touch sensitive surface 10, here a touch screen of a tablet device.

The first touch of the input device on the touch screen sets a new point of origin (x0, y0) in the user reference frame (x, y). The point of origin is indicated as a dotted circle in Fig. 2a, 2b and 2c.

The input device is then moved from the point of origin (x0, y0) to the position of the user command (x1, y1).

Fig. 2a shows a schematic of how a user command (x1, y1) on a touch sensitive screen 10 gets translated into a translational velocity tx in heading direction xR of a robot reference frame (xR, yR, zR).

Fig. 2b shows a schematic of how the user command (x1, y1) input on a touch sensitive screen 10 gets translated into an angular velocity ω, and therefore into the robot movement of yawing around the vertical axis zR of the robot reference frame (xR, yR, zR).

Fig. 2c shows a schematic of how the user command (x1, y1) input on a touch sensitive screen gets translated into a robot movement that corresponds to a combination of tx and ω.

In a further advantageous embodiment of the method, not shown in the figures, the user command (x1, y1) on a touch sensitive screen gets translated into the actuator command c, and therefore into the robot movement, wherein w corresponds to a translational velocity tY in lateral direction yR of the robot reference frame (xR, yR, zR).

Fig. 3a shows a user reference frame (x, y). The user reference frame is advantageously integrated into the controller. In particular, the user reference frame (x, y) is displayed on a touch sensitive surface, in particular a touch screen of a tablet, a smartphone and/or a PC. The user reference frame (x, y) comprises an x-axis and a y-axis.

Fig. 3b shows a robot reference frame (xR, yR, zR). Advantageously, the robot reference frame has a yR-axis that corresponds to the lateral moving direction of the robot. The xR-axis of the robot reference frame corresponds to the heading direction of the robot movement. The zR-axis corresponds to a vertical axis, wherein a yawing of the robot would move the robot around this vertical axis.

In a further advantageous embodiment of the invention, wherein the controller is used to control a yawing movement of the robot, the method further comprises the step of translating the user command (x1, y1) into the actuator command c, wherein w corresponds to an angular velocity, and therefore into the robot movement of yawing around the vertical axis zR of the robot frame.

In an advantageous embodiment of the invention, the controller is used to control a translational velocity tx in heading direction xR of a robot reference frame (xR, yR, zR).

In an alternative advantageous method of the controller, the controller is used to control a translational movement of the robot, wherein the translation is lateral to the heading direction. This method comprises the step of translating the user command (x1, y1) into the actuator command c, and therefore into the robot movement, wherein w corresponds to a translational velocity tY in lateral direction yR of the robot frame.

In a further advantageous embodiment of the invention, the user reference frame (x, y) is invertible, such that the user command (x1, y1) is translated into the actuator command c with v = -f1(y1) and w = -f2(x1).

In a further advantageous embodiment of the invention, the method comprises the step of switching between different method steps, in particular switching between alternative method steps for translating the user command (x1, y1) into the actuator command c, as described above. The switching between alternative method steps might be done by

• touching the touch sensitive surface twice and staying in touch with the touch sensitive surface at the second touch to switch method steps, and/or

• releasing the touch to the touch sensitive surface to return to the previous method step.
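The switching behaviour above can be sketched as a small state holder. The class and the mode names are hypothetical, chosen only to illustrate the second-touch/release logic described in the text.

```python
class ModeSwitcher:
    """Hypothetical sketch of switching between alternative method steps:
    a second touch (while staying in touch) switches the mode, releasing
    that second touch returns to the previous mode."""

    def __init__(self, modes):
        self.modes = modes      # e.g. alternative translation steps
        self.index = 0
        self.previous = 0

    def on_second_touch(self):
        # Switch to the next alternative method step, remembering the current one.
        self.previous = self.index
        self.index = (self.index + 1) % len(self.modes)

    def on_release(self):
        # Releasing the second touch returns to the previous method step.
        self.index = self.previous

    @property
    def mode(self):
        return self.modes[self.index]
```

If the touch sensitive surface 10 is a touch screen, the active mode could additionally drive a background colour change, as described below.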

Further advantageously, if the touch sensitive surface 10 is a touch screen, the background colour of the touch screen and/or user reference frame (x, y) might change according to the respective alternative method step.

Fig. 4 shows an advantageous embodiment of a user reference frame (x, y), with a buffer zone a in ±y-direction along the x-axis, with a buffer zone b in ±x-direction along the y-axis, and a buffer zone c radially around the point of origin (x0, y0). There might be further embodiments of the user reference frame, where the user reference frame (x, y) comprises solely a buffer zone a and/or a buffer zone b and/or a buffer zone c.

An advantageous method step for the method according to the first aspect is therefore that the user command (x1, y1) is only implemented into an actuator command c if a movement of the cursor on the touch sensitive surface 10 extends over a buffer zone in x-direction and/or in y-direction.

Advantageously, the buffer zone

• b for the user command on the ±x-axis along the y-axis of the user reference frame is: 0.005 cm < |x1| < 0.5 cm,

• a for the user command on the ±y-axis along the x-axis of the user reference frame is: 0.005 cm < |y1| < 0.5 cm.
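The buffer-zone filtering can be sketched as a simple dead-zone check. The concrete widths a and b used below are assumptions chosen from within the stated range of 0.005 cm to 0.5 cm; the function itself is illustrative, not part of the application.

```python
def apply_buffer_zones(x1, y1, a=0.1, b=0.1):
    """Suppress the user command inside the buffer zones: the command is
    only passed on if the cursor movement extends beyond buffer zone b in
    x-direction and/or buffer zone a in y-direction.

    a, b: assumed zone half-widths in cm (0.005 cm < width < 0.5 cm)."""
    x_out = x1 if abs(x1) > b else 0.0
    y_out = y1 if abs(y1) > a else 0.0
    return x_out, y_out
```

A small jitter of the finger near the point of origin then produces no actuator command, while a deliberate movement beyond the zone does.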

In a further advantageous embodiment of the invention, the sensitivity of the user reference frame (x, y) is adaptable such that the user command in the buffer zone b on the ±x-axis is 10 times more sensitive than the user command in the buffer zone a on the ±y-axis or vice versa, in particular wherein the user command in the buffer zone b on the ±x-axis is 2 times more sensitive than the user command in the buffer zone a on the ±y-axis or vice versa.
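Such a per-axis sensitivity can be sketched as independent gains on the two axes. The gain values below are assumptions picked only to reproduce the 10:1 ratio mentioned in the embodiment.

```python
def scaled_command(x1, y1, x_gain=1.0, y_gain=0.1):
    """Hypothetical per-axis sensitivity scaling: with these assumed gains
    the user command on the ±x-axis is 10 times more sensitive than the
    user command on the ±y-axis."""
    return x_gain * x1, y_gain * y1
```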

Further advantageously, a range of an absolute scale of the buffer zone b of the ±x-axis and/or the buffer zone a of the ±y-axis of the user reference frame (x, y) is adaptable in size.