Title:
APPARATUS, SYSTEM AND METHOD FOR CONTROLLING MOTION OF A ROBOTIC MANIPULATOR
Document Type and Number:
WIPO Patent Application WO/2017/020081
Kind Code:
A1
Abstract:
An apparatus including a probe attachment point configured to attach a sensing probe having a contact end for contacting a target, a support structure configured for attachment to a final joint of a robotic manipulator, a compliant apparatus, connected to the probe attachment point and the support structure, that is configured to allow movement of the probe, relative to the robotic manipulator, due to an applied force at the contact end, and to resist the movement.

Inventors:
FIELDING MICHAEL (AU)
NAJDOVSKI ZORAN (AU)
MULLINS JAMES (AU)
NAHAVANDI SAEID (AU)
ABDI HAMID (AU)
Application Number:
PCT/AU2016/050698
Publication Date:
February 09, 2017
Filing Date:
August 03, 2016
Assignee:
UNIV DEAKIN (AU)
TELSTRA CORP LTD (AU)
International Classes:
A61B34/30
Foreign References:
US 4669483 A (1987-06-02)
US 2009/0088639 A1 (2009-04-02)
US 2012/0022552 A1 (2012-01-26)
US 2014/0213939 A1 (2014-07-31)
US 4489729 A (1984-12-25)
US 6932089 B1 (2005-08-23)
Attorney, Agent or Firm:
DAVIES COLLISON CAVE PTY LTD (AU)
Claims:
THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:

1. An apparatus including:

a probe attachment point configured to attach a sensing probe having a contact end for contacting a target;

a support structure configured for attachment to a final joint of a robotic manipulator;

a compliant apparatus, connected to the probe attachment point and the support structure, that is configured:

to allow movement of the probe, relative to the robotic manipulator, due to an applied force at the contact end, and

to resist the movement.

2. The apparatus of claim 1, wherein the compliant apparatus includes a passive compliant element that allows the movement and resists the movement.

3. The apparatus of claim 1 or 2, wherein the probe includes at least one medical imaging probe, and/or includes at least one ultrasound transducer.

4. The apparatus of any one of claims 1 to 3, including a limit detector configured to detect when the movement due to the applied force reaches a preselected applied-force limit and/or a preselected distance limit, and to provide an emergency applied-force signal in response to the detection.

5. A system including:

the apparatus of any one of claims 1 to 4; and

one or more emergency stop switches configured to stop the robotic manipulator, due to the emergency applied-force signal.

6. A method including steps of:

applying force at a contact end of a sensing probe using a robotic manipulator;

allowing movement of the probe relative to the robotic manipulator due to the applied force; and

resisting the movement using a compliant apparatus.

7. An apparatus including:

at least one input transducer configured to:

receive a discomfort level input from a patient representing a discomfort level of the patient,

generate a discomfort level signal, representing the discomfort level, from the discomfort level input, and

generate an emergency discomfort signal when the discomfort level input reaches a pre-defined maximum discomfort level value.

8. The apparatus of claim 7, wherein the at least one transducer includes a level transducer and an emergency discomfort transducer.

9. The apparatus of claim 7 or 8, including a housing with a hand grip.

10. A system including: the apparatus of any one of claims 7 to 9; a robotic manipulator; and one or more emergency switches that are configured to stop the robotic manipulator based on the emergency discomfort signal.

11. A method including steps of:

receiving a discomfort level input from a patient representing a discomfort level of the patient;

generating a discomfort level signal, representing the discomfort level, from the discomfort level input; and

generating an emergency discomfort signal when the discomfort level input reaches a pre-defined maximum discomfort level value.

12. A method including steps of:

receiving force and/or torque (F/T) signals from a plurality of respective F/T sensors in a respective plurality of joints of a robotic manipulator;

determining a total force or total torque applied to the robotic manipulator based on a sum of force values and/or torque values represented by the F/T signals; and

generating an emergency over-force signal when the total force or total torque exceeds a pre-selected over-force threshold.

13. The method of claim 12, including a step of stopping the robotic manipulator in response to the emergency over-force signal.

14. Computer-readable storage, having stored thereon machine-readable instructions configured to control at least one computer processor to execute the method of claim 12 or 13.

15. A system including at least one computer processor and the computer-readable storage of claim 14.

16. The system of claim 15, including the robotic manipulator, the F/T sensors, and one or more emergency stop switches configured to stop the robotic manipulator due to the emergency over-force signal.

17. A method including steps of:

accessing boundaries data representing workspace boundaries for an end effector on a robotic manipulator;

receiving new requested position data representing a new requested position for the end effector based on a user input;

processing the boundaries data and the new requested position data to determine whether the new requested position is in the workspace boundaries;

generating new movement data to control the robotic manipulator to move the end effector to the new requested position if the new requested position is in the workspace boundaries; and

generating no movement data to control the robotic manipulator to move the end effector if the new requested position is out of the workspace boundaries.

18. The method of claim 17, wherein the boundaries data represent the workspace boundaries in at least three dimensions; and the new requested position data represent the new requested position in the at least three dimensions.

19. The method of claim 17 or 18, wherein the boundaries data represent the workspace boundaries for a probe of the end effector.

20. The method of any one of claims 17 to 19, including steps of:

receiving current position data representing a current position and a current pointing direction of the end effector;

receiving applied force data representing an applied force at a contact end of the end effector; and

processing the current position data and the applied force data to generate the boundaries data representing a dynamic boundary that excludes at least one point along the current pointing direction from the current position.

21. The method of claim 20, wherein the current position data represent the current position and the current pointing direction in three dimensions.

22. The method of claim 20 or 21, wherein the boundaries data representing the dynamic boundary are generated when the applied force is above a predefined force value.

23. The method of any one of claims 20 to 22, wherein the dynamic boundary excludes further movement of the contact end along the current pointing direction.

24. The method of any one of claims 20 to 23, including steps of:

receiving force and/or torque (F/T) signals from at least one F/T sensor in the end effector; and

generating the applied force data from the F/T signals by transforming the F/T signals using predefined calibration data.

25. Computer-readable storage, having stored thereon machine-readable instructions configured to control at least one computer processor to execute the method of any one of claims 17 to 24.

26. A system including at least one computer processor and the computer-readable storage of claim 25.

27. The system of claim 26, including the robotic manipulator, the F/T sensor, and a user-input apparatus configured to generate the new requested position for the new requested position data.

28. The system of claim 27, wherein the user-input apparatus is a haptic apparatus.

29. A system including:

a robotic manipulator;

a probe attachment point, on the final joint of the robotic manipulator, configured to attach a sensing probe for sensing a target; and

an emergency control, on the final joint, opposite the probe attachment point, configured to generate an emergency control signal when activated manually by a person attending the target.

30. The system of claim 5, 10, 26 and/or 29, including at least one emergency stop switch configured to stop the robotic manipulator due to any one of a plurality of emergency signals, including two or more of:

the emergency applied-force signal;

the emergency discomfort signal;

the emergency control signal; and

the emergency over-force signal.

31. A method including:

a robotic controller receiving a desired contact force signal representing a selected contact force to be applied by an end effector controlled by the robotic controller;

the robotic controller receiving a motion control signal representing a motion command for the end effector;

processing the desired contact force signal and the motion control signal to generate a control signal to control motion of the end effector following the motion command while maintaining the selected contact force applied by the end effector.

32. The method of claim 31, including receiving the desired contact force signal in remote- control instructions from a master controller, wherein the remote-control instructions represent the selected contact force selected by a user.

33. The method of claim 31 or 32, including:

accessing applied-force signals representing an applied force between the end effector and a target; and

processing the applied-force signals with the desired contact force signal and the motion control signal to generate the control signal.

34. The method of any one of claims 31 to 33, including:

receiving force and/or torque (F/T) signals from at least one F/T sensor in the end effector; and

generating the applied-force signals from the F/T signals by transforming the F/T signals using predefined calibration data.

35. The method of any one of claims 31 to 34, including the robotic controller generating the motion control signal based on movement data representing remote-control instructions from a master controller based on user inputs.

36. The method of any one of claims 31 to 35, including controlling the robotic manipulator to move the end effector.

Description:
APPARATUS, SYSTEM AND METHOD FOR CONTROLLING MOTION OF A ROBOTIC MANIPULATOR

[0001] This application is related to Australian Provisional Patent Application

Number 2015903086, entitled "Apparatus, system and method for controlling motion of a robotic manipulator", filed on 3 August 2015, the entire specification of which is hereby incorporated as if set forth herein in its entirety by reference.

TECHNICAL FIELD

[0002] The present invention relates generally to apparatuses, systems and methods for controlling motion of a robotic manipulator.

BACKGROUND

[0003] Robotic systems are crucial in the developing fields of machine-assisted non-destructive testing and medicine, including telemedicine.

[0004] However, existing robotic systems are generally designed for industrial applications rather than medical applications or sensitive testing applications, and are therefore generally unsuitable for working on patients (including humans) or sensitive targets (including aircraft parts undergoing testing).

[0005] It is desired to address or ameliorate one or more disadvantages or limitations associated with the prior art, or to at least provide a useful alternative.

SUMMARY

[0006] In accordance with the present invention, there is provided an apparatus including:

a probe attachment point configured to attach a sensing probe having a contact end for contacting a target;

a support structure configured for attachment to a final joint of a robotic manipulator;

a compliant apparatus, connected to the probe attachment point and the support structure, that is configured:

to allow movement of the probe, relative to the robotic manipulator, due to an applied force at the contact end, and

to resist the movement.

[0007] The compliant apparatus includes a passive compliant element, which can be a spring, that allows the movement and resists the movement. The probe includes at least one medical imaging probe, and/or includes at least one ultrasound transducer.

[0008] The apparatus includes a limit detector configured to detect when the movement due to the applied force reaches a preselected applied-force limit and/or a preselected distance limit, and to provide an emergency applied-force signal in response to the detection.

[0009] The present invention also provides a system including:

the apparatus above; and

one or more emergency stop switches configured to stop the robotic manipulator, due to the emergency applied-force signal.

[0010] The present invention also provides a method including steps of:

applying force at a contact end of a sensing probe using a robotic manipulator;

allowing movement of the probe relative to the robotic manipulator due to the applied force; and

resisting the movement using a compliant apparatus.

[0011] The present invention also provides an apparatus including:

at least one input transducer configured to:

receive a discomfort level input from a patient representing a discomfort level of the patient,

generate a discomfort level signal, representing the discomfort level, from the discomfort level input, and generate an emergency discomfort signal when the discomfort level input reaches a pre-defined maximum discomfort level value.

[0012] The at least one transducer includes a level transducer and an emergency discomfort transducer. The apparatus includes a housing with a hand grip.

[0013] The present invention also provides a system including: the apparatus above; a robotic manipulator; and one or more emergency switches that are configured to stop the robotic manipulator based on the emergency discomfort signal.

[0014] The present invention also provides a method including steps of:

receiving a discomfort level input from a patient representing a discomfort level of the patient;

generating a discomfort level signal, representing the discomfort level, from the discomfort level input; and

generating an emergency discomfort signal when the discomfort level input reaches a pre-defined maximum discomfort level value.
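The discomfort-monitoring method above can be sketched in a few lines. This is a minimal illustration only: the 0-100 scale, the threshold of 80, and the function name are assumptions not specified in the text.

```python
# Sketch of the discomfort-monitoring method of paragraph [0014].
# The 0-100 scale, threshold value, and names are illustrative
# assumptions, not part of the specification.

MAX_DISCOMFORT = 80  # pre-defined maximum discomfort level value (assumed 0-100 scale)

def process_discomfort_input(level_input):
    """Generate a discomfort level signal and, if warranted, an emergency signal."""
    discomfort_signal = level_input              # signal representing the discomfort level
    emergency = level_input >= MAX_DISCOMFORT    # emergency discomfort signal
    return discomfort_signal, emergency
```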

[0015] The present invention also provides a method including steps of:

receiving force and/or torque (F/T) signals from a plurality of respective F/T sensors in a respective plurality of joints of a robotic manipulator;

determining a total force or total torque applied to the robotic manipulator based on a sum of force values and/or torque values represented by the F/T signals; and

generating an emergency over-force signal when the total force or total torque exceeds a pre-selected over-force threshold.

[0016] The method includes stopping the robotic manipulator in response to the emergency over-force signal.
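The over-force method of paragraphs [0015] and [0016] amounts to summing the per-joint sensor values and comparing the sum against a threshold. A minimal sketch; the 100 N threshold, the function names, and the callback-style stop hook are illustrative assumptions.

```python
# Sketch of the over-force method of paragraphs [0015]-[0016]: force values
# represented by the joint F/T signals are summed and compared against a
# pre-selected over-force threshold. Threshold and names are assumptions.

OVER_FORCE_THRESHOLD = 100.0  # pre-selected over-force threshold in newtons (assumed)

def check_over_force(joint_forces):
    """Return True (emergency over-force signal) if the summed force exceeds the threshold."""
    total_force = sum(joint_forces)
    return total_force > OVER_FORCE_THRESHOLD

def on_over_force(joint_forces, stop_manipulator):
    """Stop the robotic manipulator in response to the emergency over-force signal."""
    if check_over_force(joint_forces):
        stop_manipulator()
```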

[0017] The present invention also provides computer-readable storage, having stored thereon machine-readable instructions configured to control at least one computer processor to execute the method above.

[0018] The present invention also provides a system including at least one computer processor and the computer-readable storage above.

[0019] The system above can include the robotic manipulator, the F/T sensors, and one or more emergency stop switches configured to stop the robotic manipulator due to the emergency over-force signal.

[0020] The present invention also provides a method including steps of:

accessing boundaries data representing workspace boundaries for an end effector on a robotic manipulator;

receiving new requested position data representing a new requested position for the end effector based on a user input;

processing the boundaries data and the new requested position data to determine whether the new requested position is in the workspace boundaries;

generating new movement data to control the robotic manipulator to move the end effector to the new requested position if the new requested position is in the workspace boundaries; and

generating no movement data to control the robotic manipulator to move the end effector if the new requested position is out of the workspace boundaries.

[0021] The boundaries data can represent the workspace boundaries in at least three dimensions, and the new requested position data can represent the new requested position in the at least three dimensions.

[0022] The boundaries data can represent the workspace boundaries for a probe of the end effector.

[0023] The method includes steps of:

receiving current position data representing a current position and a current pointing direction of the end effector;

receiving applied force data representing an applied force at a contact end of the end effector; and

processing the current position data and the applied force data to generate the boundaries data representing a dynamic boundary that excludes at least one point along the current pointing direction from the current position.

[0024] The current position data represent the current position and the current pointing direction in three dimensions. The boundaries data representing the dynamic boundary can be generated when the applied force is above a predefined force value. The dynamic boundary excludes further movement of the contact end along the current pointing direction.

[0025] The method includes steps of:

receiving force and/or torque (F/T) signals from at least one F/T sensor in the end effector; and

generating the applied force data from the F/T signals by transforming the F/T signals using predefined calibration data.
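The calibration step described above can be illustrated as a linear transformation of the raw sensor channels. This sketch assumes the predefined calibration data take the form of a gain matrix and an offset vector, a common convention that the specification does not prescribe.

```python
# Sketch of paragraph [0025]: raw F/T sensor readings are transformed into
# applied-force data using predefined calibration data. Modelling the
# calibration as a gain matrix plus an offset is an assumption; real
# sensors ship with their own calibration format.

def apply_calibration(raw_ft, gain_matrix, offset):
    """Transform raw F/T channel readings into calibrated force/torque values."""
    calibrated = []
    for row in gain_matrix:
        # Dot product of one calibration row with the raw channel vector.
        calibrated.append(sum(g * r for g, r in zip(row, raw_ft)))
    # Subtract the per-channel offset (e.g., sensor bias at zero load).
    return [c - o for c, o in zip(calibrated, offset)]
```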

[0026] The present invention also provides computer-readable storage, having stored thereon machine-readable instructions configured to control at least one computer processor to execute the method above.

[0027] The present invention also provides a system including at least one computer processor and the computer-readable storage above.

[0028] The system above includes the robotic manipulator, the F/T sensor, and a user-input apparatus configured to generate the new requested position for the new requested position data. The user-input apparatus can be a haptic apparatus.

[0029] In accordance with the present invention, there is provided a system including:

a robotic manipulator;

a probe attachment point, on the final joint of the robotic manipulator, configured to attach a sensing probe for sensing a target; and

an emergency control, on the final joint, opposite the probe attachment point, configured to generate an emergency control signal when activated manually by a person attending the target.

[0030] The system includes at least one emergency stop switch configured to stop the robotic manipulator due to any one of a plurality of emergency signals, including two or more of:

the emergency applied-force signal;

the emergency discomfort signal;

the emergency control signal; and

the emergency over-force signal.

[0031] The present invention also provides a method including:

a robotic controller receiving a desired contact force signal representing a selected contact force to be applied by an end effector controlled by the robotic controller;

the robotic controller receiving a motion control signal representing a motion command for the end effector;

processing the desired contact force signal and the motion control signal to generate a control signal to control motion of the end effector following the motion command while maintaining the selected contact force applied by the end effector.

[0032] The method includes receiving the desired contact force signal in remote-control instructions from a master controller, wherein the remote-control instructions represent the selected contact force selected by a user.

[0033] The method can include:

accessing applied-force signals representing an applied force between the end effector and a target; and

processing the applied-force signals with the desired contact force signal and the motion control signal to generate the control signal.

[0034] The method can include:

receiving force and/or torque (F/T) signals from at least one F/T sensor in the end effector; and

generating the applied-force signals from the F/T signals by transforming the F/T signals using predefined calibration data.

[0035] The method can include the robotic controller generating the motion control signal based on movement data representing remote-control instructions from a master controller based on user inputs.

[0036] The method includes controlling the robotic manipulator to move the end effector.
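The constant-contact-force behaviour of paragraphs [0031] to [0036] can be illustrated with a one-dimensional proportional correction: the motion command is adjusted along the contact direction so that the measured contact force is driven toward the selected force. The control law, gain value, and function names are assumptions for illustration only; the specification does not prescribe a particular controller.

```python
# One-dimensional sketch of the constant-contact-force loop of
# paragraphs [0031]-[0036]. The proportional law and gain are assumed.

K_FORCE = 0.001  # proportional gain, metres per newton (assumed)

def control_step(motion_command, desired_force, measured_force):
    """Combine a motion command with a force-error correction (1-D sketch)."""
    force_error = desired_force - measured_force
    # Too little contact force -> push further toward the target;
    # too much -> back off, while still following the motion command.
    return motion_command + K_FORCE * force_error
```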

[0037] The present invention also provides computer-readable storage, having stored thereon machine-readable instructions configured to control at least one computer processor to execute the method above.

[0038] The present invention also provides a system including at least one computer processor and the computer-readable storage above.

[0039] The system above includes a robotic manipulator, the F/T sensor, and the end effector. The system can include a user-input apparatus configured to generate the remote-control instructions. The user-input apparatus can be a haptic apparatus. The system can include a telecommunications link for transmitting the remote-control instructions.

BRIEF DESCRIPTION OF THE DRAWINGS

[0040] Preferred embodiments of the present invention are hereinafter further described, by way of example only, with reference to the accompanying drawings, in which:

[0041] Figure 1 is a block diagram of a system for controlling motion of a robotic manipulator;

[0042] Figure 2 is a perspective view of a final joint and an end effector of the robotic manipulator;

[0043] Figure 3 is a left-side view of the end effector;

[0044] Figure 4 is a front view of the end effector and the final joint;

[0045] Figure 5 is a right-side view of the end effector and the final joint;

[0046] Figure 6 is a rear view of the end effector and the final joint;

[0047] Figure 7 is a perspective view of an apparatus for monitoring patient discomfort in the system;

[0048] Figure 8 is a right-side view of the apparatus;

[0049] Figure 9 is a top view of the apparatus;

[0050] Figure 10 is a left-side view of the apparatus;

[0051] Figure 11 is a left-side cross-sectional view of the apparatus, cut away through its centre; and

[0052] Figures 12 and 13 are schematic diagrams of components of the system that provide a user-selectable constant contact pressure.

DETAILED DESCRIPTION

Overview

[0053] Described herein are apparatuses, systems and methods for controlling motion of a robotic manipulator, which can be a robotic arm. The described apparatuses, systems and methods allow for improved operation of robotic systems used by teleoperators (i.e., remote-control robotic systems, also known as "telemanipulator systems"). The described apparatuses, systems and methods can make robotic systems safer for operation on and with humans, including for operation on patients in medical applications, and on and with sensitive parts, including for non-destructive testing. The described apparatuses, systems and methods can be used with haptic (i.e., force feedback) teleoperator systems, including medical telepresence robotic systems (also known as "telerobots").

System

[0054] As depicted in Figure 1, a system 100 includes a robotic manipulator 102 in the form of a robotic arm. The system 100 includes a robot controller 104 that is connected to the robotic manipulator 102 by a control link 106 for controlling the robotic manipulator 102, and for receiving sensor signals from the robotic manipulator 102, including force and/or torque (F/T) signals from F/T sensors in the robotic manipulator 102.

[0055] The system 100 includes a master controller 108 (also referred to as a "tele-operation master") that is connected to the robot controller 104 by a remote link 110 (which is a telecommunications link including at least one electronic communications connection, wired and/or wireless) for sending control instructions to the robot controller 104, and for receiving position information and force information (based on positions and forces/torques of the robotic manipulator 102) from the robot controller 104.

[0056] The system 100 includes a user input apparatus 112 that is configured to receive user inputs from an operator 114 (who may be referred to as a "specialist") using the system 100. The user input apparatus 112 is connected to the master controller 108 by an input link 116 for sending operator inputs from the user input apparatus 112 to the master controller 108, and for receiving haptic information from the master controller 108 to provide force feedback to the operator 114. The system 100 includes at least one display 118 that is configured to display user interface (UI) information for the operator 114. The display 118 is connected to the master controller 108 by a display link 120 for receiving the UI information. The at least one display 118 can include a touch-sensitive display configured to receive additional inputs from the operator 114, and to send these additional inputs from the display 118 to the master controller 108 using the display link 120.

[0057] The robotic manipulator 102 and the robot controller 104 are in a robot side 122 of the system 100. The master controller 108, user input apparatus 112 and display 118 are in an operator side 124 of the system 100. The operator side 124 can be referred to as a "master side", and the robot side 122 can be referred to as a "slave side". The robot side 122 can be substantially remote from the operator side 124, and the remote link 110 can include one or more long-distance telecommunications links, including for teleoperation and telemedicine applications.

[0058] The system 100 includes an end effector 126 on the last joint 128 (that also serves as the "end effector attachment point" of the robotic manipulator 102). The end effector 126 includes one or more devices for interacting with the environment, as described hereinafter, including sensing equipment (including devices and probes) for interacting with a target 130 (including a patient, an animal, or a sensitive inanimate object). The sensing equipment includes imaging equipment, and includes sensing devices with ultrasound transducers (in ultrasound probes). The sensing equipment generates electronic signals and/or data, representing sensor measurements from sensors in the sensing equipment, for transmission over the control link 106 and the remote link 110 to the master controller 108. Alternatively or additionally, the signals and data from the sensing equipment can be sent using a separate electronic connection between the end effector 126 and the master controller 108. The master controller 108 generates sensor display data for the display 118 that represent the sensor measurements, or are based on the sensor measurements, according to established data processing protocols for diagnostics using the sensing equipment (including ultrasound processing routines), thus allowing the operator 114 to observe outputs from the sensing equipment of the end effector 126.

[0059] The target 130 is supported in the system 100 by a supporting apparatus in the form of a chair, bench or bed 132. The supporting apparatus is held in a fixed location relative to the robotic manipulator 102, and assists the target 130 to be motionless in relation to a fixed location of the robotic manipulator 102, which can be defined by a base or support 134 of the robotic manipulator 102. The support 134 and the bed 132 can be held in fixed relation, e.g., by resting on or attachment to a floor or other structure on the robot side 122. The support 134 may also be directly coupled to the bed 132 using suitable fixtures.

[0060] The remote link 110 and the input link 116 can be based on commercially available protocols, e.g., Ethernet and/or Internet protocols, and FireWire protocols. The operator inputs from the user input apparatus 112 include movements with six degrees of freedom, and the robotic manipulator 102 allows movement of the end effector 126 with a corresponding six degrees of freedom.

[0061] The system 100 includes transformation matrices to transform the three-dimensional (3D) position and 3D rotation, and six F/T measurements from the robotic manipulator 102, into the six positional locations/rotations and six F/T values in the virtual workspace in the master controller 108.
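The positional part of such a transformation can be illustrated with a homogeneous transform mapping a point in the robot's base frame into the master controller's virtual workspace. The 4x4 matrix form and the pure-Python implementation are illustrative assumptions; the specification does not give the transformation explicitly.

```python
# Sketch of paragraph [0061]: a fixed homogeneous transformation maps a
# 3-D position in the robot base frame into the virtual workspace of the
# master controller. The particular matrix used below is illustrative.

def transform_point(T, p):
    """Apply a 4x4 homogeneous transform T (row-major nested lists) to point p."""
    x, y, z = p
    v = (x, y, z, 1.0)  # homogeneous coordinates
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))
```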

[0062] The system 100 includes a 3D vision system (including cameras and screens) that allows the specialist 114 to see the scene (including the target 130, robotic manipulator 102 and bed 132) on the robot side 122.

Motion-Control and Workspace Limiting

[0063] The system 100 is configured to monitor the motion of the robotic manipulator 102, and to control this motion. The motion control of the robotic manipulator 102 differs for different procedural scenarios that correspond to different modes (or "states") of the system 100, including a freedrive mode (for manually handling the robotic manipulator 102 into a position), a boundaries mode (for enforcing workspace boundaries on the robotic manipulator 102), and a constant contact pressure mode (for maintaining a constant contact pressure on the target 130 as the end effector 126 is moved).

[0064] When in the freedrive mode, the robotic manipulator 102 can be guided manually to allow a person on the robot side 122 (who can be a sonographer in ultrasound applications) to position the robotic manipulator 102, e.g., while positioning the target 130 on the bed 132.

[0065] In contrast, when in the boundaries mode, the robotic manipulator 102 is controlled remotely based on remote-control instructions from the master controller 108; however, virtual boundaries are enforced to keep the end effector 126 within defined workspace boundaries, e.g., to stop the robotic manipulator 102 crashing into the bed 132 or the target 130, or moving too far from the expected working area. The virtual boundaries can make the system 100 safer for the target 130, and more efficient for the operator 114, because potentially dangerous and/or undesirable motions of the robotic manipulator 102 are restricted or prohibited.

[0066] The workspace boundaries are defined by the master controller 108 on the operator side 124 before the remote-control instructions are sent to the robot controller 104.

[0067] The master controller 108 performs computational routines to enforce the workspace boundaries. These routines are executed by at least one computer processor of the master controller 108, following machine-readable instructions stored in the master controller 108 (in computer-readable storage memory), and accessed by the computer processor. Following these routines, when in the boundaries mode, the master controller 108 continuously monitors the position of the end effector 126 based on the position information that represents the position of the end effector 126 from the robot controller 104. Following the routines, the master controller 108 also receives the operator inputs from the user input apparatus 112, and determines a proposed new position for the end effector 126 based on the current position and the proposed movement represented by the operator inputs: the master controller 108 then compares the proposed new position to one or more virtual boundaries defined by boundaries data stored in the master controller 108, defining boundaries or regions selected to limit the movement of the end effector 126. The boundaries data define the boundaries in a base coordinate system, using Cartesian coordinates. The boundaries include rotational limits for individual joints of the robotic manipulator 102. If the master controller 108 determines that the proposed new position lies within one of these restricted regions, the master controller 108 applies the corresponding restriction (defined in the boundaries data for the respective regions) to the proposed movement. The restrictions can include prohibitions, thus not allowing movement over the boundary, speed restrictions that restrict movement speed in the region, and force restrictions that restrict applied forces by the end effector 126 in the region. The boundary region ("cube") is geometrically defined in the boundaries data relative to a coordinate system of the robotic arm 102.
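The boundary-checking routine described in paragraph [0067] can be sketched as a test of the proposed position against axis-aligned regions, each carrying its restriction. The region data structure and the restriction labels used here are illustrative assumptions, not the specification's own format.

```python
# Sketch of the boundary-enforcement routine of paragraph [0067]: the
# proposed new position is tested against Cartesian boundary regions and
# the matching restrictions ("prohibit", a speed limit, a force limit)
# are collected. Region structure and labels are assumptions.

def in_region(position, region):
    """True if the 3-D position lies inside the region's Cartesian bounds."""
    return all(lo <= c <= hi for c, (lo, hi) in zip(position, region["bounds"]))

def check_move(proposed_position, regions):
    """Return the restrictions that apply to the proposed new position."""
    return [r["restriction"] for r in regions if in_region(proposed_position, r)]
```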

[0068] The virtual boundaries can include static planes (i.e., static relative to the bed 132 and robot support 134) in horizontal and vertical orientations, including in X, Y and Z Cartesian directions surrounding the bed 132. These static planes are determined for each individual configuration of the robot side 122, e.g., in each particular room where an embodiment of the robot side 122 is installed. The virtual planes are geometrically defined relative to a coordinate system of the robotic arm 102.

[0069] The virtual boundaries include at least one dynamic adaptive boundary that is generated by the master controller 108 during operation of the system 100 in the boundaries mode, i.e., while the robotic manipulator 102 is operating with a target 130. The dynamic boundary is generated based on the force information from the robot controller 104, including forces and/or torques measured by the robotic manipulator 102, including the force measured between the end effector 126 and the target 130. The force between the target 130 and the end effector 126 is measured by an F/T sensor in the end effector 126, which generates F/T signals for the robot controller 104, which then sends the F/T measurements to the master controller 108 in the force information. The dynamic boundary is generated by the routines in the master controller 108 when the force applied by the end effector 126 to any body (including the target 130) exceeds a selected limit, which can be a force limit of between 5 and 40 Newtons, including 20 Newtons. The master controller 108 receives force and/or torque (F/T) signals or data from the F/T sensor 316 in the end effector 126, and generates applied force data (representing the applied force) from the F/T signals by transforming the F/T signals using predefined calibration data for the particular setup of the robotic manipulator 102. The dynamic boundary can then restrict and/or prohibit motion of the end effector 126 that would increase the applied force. The dynamic boundary can define a plane perpendicular to the direction of the applied force, that includes the location of the end point of the end effector 126 (or a location incrementally beyond that location along the applied-force direction of the end effector 126) when the maximum applicable force is reached or exceeded.

Alternatively and/or additionally, the dynamic boundary can have a non-planar shape passing through, or incrementally beyond, the end point: in this case, the dynamic boundary can include a conical shape or a truncated conical shape (a frustoconical shape) with an axis aligned with the direction of the applied force, and extending back along the direction of the applied force, therefore surrounding the position of the end effector 126 in the virtual workspace defined by the master controller 108, and restricting or prohibiting movement of the end effector 126, or at least the end of the end effector, in directions perpendicular to the direction of the applied force, at least to an extent defined by the taper of the conical boundary. The conical boundary can define an operating region of generally increasing area as the end effector 126 is retreated from the end contact point, without necessarily being a strict conical shape. The dynamic boundary can limit the motion of the end effector 126 when it is in contact with a target to avoid the end point of the end effector 126 being dragged across a target's body while it is applying pressure or significant pressure to the target's body. The boundaries mode thus improves the safety of the system 100.
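A minimal sketch of the frustoconical dynamic boundary test follows, assuming a cone whose axis is the retreat direction (opposite the applied force). The parameter names baseRadius and taper, and this exact parameterisation, are assumptions; the specification does not prescribe them.

```cpp
#include <array>
#include <cassert>

using Vec3 = std::array<double, 3>;

static double dot(const Vec3& a, const Vec3& b) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

// Hypothetical test of the conical dynamic boundary of [0069]: a candidate
// position is acceptable only if it lies behind the contact point along the
// retreat axis, and its perpendicular offset from that axis stays within the
// cone, which widens (by "taper") as the end effector retreats.
bool insideConicalBoundary(const Vec3& candidate, const Vec3& contact,
                           const Vec3& axis,      // unit retreat direction
                           double baseRadius, double taper) {
    Vec3 rel{candidate[0] - contact[0],
             candidate[1] - contact[1],
             candidate[2] - contact[2]};
    double along = dot(rel, axis);      // retreat distance from contact point
    if (along < 0.0) return false;      // would press further into the body
    double radial2 = dot(rel, rel) - along * along; // squared lateral offset
    double allowed = baseRadius + taper * along;    // cone widens with retreat
    return radial2 <= allowed * allowed;
}
```

A motion command whose next position fails this test would be disregarded, preventing the probe tip from being dragged laterally while under pressure.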

[0070] The boundaries data can define force, velocity and/or acceleration limitations on the end effector 126. The boundaries data can define different values for the limitations in different respective regions defined by the boundaries in the boundaries data. The force, velocity and acceleration limits are defined in 3D zones enclosed within the robot's virtual operational workspace. Each zone is defined using a 3D shape (e.g., rectangular prisms, cones, other geometric shapes, or irregular shapes), and pre-defined force, velocity and acceleration values are applied to the robot control instructions as the end effector 126 enters each zone.
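The zone-specific limits of [0070] can be sketched with axis-aligned box zones; the structures Zone, MotionLimits and limitsFor below are hypothetical, and real boundaries data can also use cones or irregular shapes.

```cpp
#include <array>
#include <cassert>
#include <vector>

// Hypothetical sketch of [0070]: each 3D zone carries pre-defined force,
// velocity and acceleration caps applied to the robot control instructions
// while the end effector is inside that zone. Rectangular-prism zones are
// used here for simplicity.
struct MotionLimits { double maxForceN, maxVelocity, maxAcceleration; };

struct Zone {
    std::array<double, 3> min, max; // axis-aligned corners, base coordinates
    MotionLimits limits;
    bool contains(const std::array<double, 3>& p) const {
        for (int i = 0; i < 3; ++i)
            if (p[i] < min[i] || p[i] > max[i]) return false;
        return true;
    }
};

// Returns the limits of the first zone containing the position, else defaults.
MotionLimits limitsFor(const std::vector<Zone>& zones,
                       const std::array<double, 3>& position,
                       const MotionLimits& defaults) {
    for (const Zone& z : zones)
        if (z.contains(position)) return z.limits;
    return defaults;
}
```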

[0071] A method of operating the system 100 in the boundaries mode includes steps of:

the robot controller 104 receiving position measurements from the robotic manipulator 102 as a person manually moves the end effector 126, in the freedrive mode, to map and define the acceptable workspace (in the coordinate system of the robotic arm 102) relative to the robot support 134 (step 1);

the master controller 108 receiving the workspace positions to define the virtual workspace during the mapping in step 1 (step 2);

the master controller 108 using the virtual workspace, with the virtual boundaries, to restrict and limit the available motion that is sent in the remote-control instructions to the robot controller 104 (step 3).

[0072] The boundary-monitoring routines performed by the master controller 108 in the boundaries mode form a method for motion control and workspace limiting. The method can be implemented using machine-readable instructions generated from computer code written in C++, for example the C++ code in Appendix A. The method includes the following steps performed by the master controller 108 (which are implemented based on respective portions of the C++ code referred to by line numbers in the following steps):

receiving a request to move the end effector 126 to a next position (also referred to as a "next commanded position" or a "new requested position") from the user input apparatus 112 (step 101), e.g., based on code line 584;

receiving force data representing the applied force by the end effector 126 on any body, including a force value, represented by current applied force data, and a force direction (or "current pointing direction") in three dimensions, represented by current position data (step 102);

determining a force vector at the contact point from the force data (step 103), e.g., based on code lines 589-595;

determining whether the robot's next position is out of the static boundaries in the boundaries data, e.g., based on code line 598 and equivalent code for implementing y-axis limits (step 104);

if the robot's next position is out of the static boundaries, disregarding the motion command (step 105), e.g., based on code lines 650-672 and equivalent code for implementing y-axis limits;

if the robot's next position is within the static boundaries, determining if the contact force is less than the predefined contact force limit defined for the current region in the boundaries data (step 106), e.g., based on code lines 598-600 that define a selected force limit of 5 Newtons for testing (20 Newtons could be preferable for some applications); and

if the contact force is less than the contact force limit, allowing the motion command to be sent to the robot controller 104 by generating and sending new movement data, representing the motion command, to the robot controller 104 (step 107), and ending the method, e.g., based on code lines 639-645; or

else, if the contact force is equal to or greater than the contact force limit, performing the dynamic boundary process described hereinafter (step 108), and ending the method, e.g., based on code lines 600-636.

[0073] The dynamic boundary process includes steps of the master controller 108:

normalising the force vector of the contact force to determine a plane orthogonal vector (step 201), e.g., based on code lines 589-590;

saving the force vector as the plane vector variable intersecting an XYZ point in the coordinate system (step 202), e.g., based on code line 595;

using the contact position, represented by the current position data, as the contact point variable (step 203), e.g., based on code lines 610-612;

determining the dynamic boundary using the plane vector value and the contact point value (step 204), e.g., based on code line 615;

for any next commanded position of the next position value, the plane state value is determined by transposing the plane vector value with a difference between the next position value and the contact point value (step 205), e.g., based on code line 621; and

if the plane state value is greater than the previous plane state, the motion is performed, else, the motion is disregarded (step 206), e.g., based on code lines 628-634.
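Steps 201 to 206 above can be sketched as follows, interpreting the "transposing" of step 205 as a vector dot product. This is an illustrative reconstruction, not the Appendix A code, and the function names are hypothetical.

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Vec3 = std::array<double, 3>;

static double dot(const Vec3& a, const Vec3& b) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

// Step 201: normalise the contact-force vector to get the plane normal.
Vec3 normalise(const Vec3& v) {
    double n = std::sqrt(dot(v, v));
    return {v[0] / n, v[1] / n, v[2] / n};
}

// Step 205: the plane state of a candidate position is its signed distance
// from the dynamic boundary plane through the contact point.
double planeState(const Vec3& planeNormal, const Vec3& contactPoint,
                  const Vec3& nextPosition) {
    Vec3 diff{nextPosition[0] - contactPoint[0],
              nextPosition[1] - contactPoint[1],
              nextPosition[2] - contactPoint[2]};
    return dot(planeNormal, diff);
}

// Step 206: perform the motion only if it increases the plane state,
// i.e., moves the end effector away from the applied force.
bool allowMotion(double newPlaneState, double previousPlaneState) {
    return newPlaneState > previousPlaneState;
}
```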

[0074] The boundary routines may be written using standard programming languages to generate the machine-readable code in the master controller 108. The boundary routines are called continuously by an operating routine that receives the operator inputs from the user input apparatus 112, and directs these to the robot controller 104 using standard communications formats. The routines on the operator side 124 can be developed using standard commercially available software development kits (SDKs) provided with commercially available forms of the user input apparatus 112 (e.g., commercially available haptic controllers). The master controller 108 includes a commercially available personal computer (e.g., a desktop computer) with appropriate data communication ports supporting standard communications protocols for the remote link 110, the input link 116 and the display link 120. The operating system is a commercially available operating system for executing the routines using the computer processors of the master controller 108.

[0075] In embodiments, the boundary routines in the master controller 108 can adjust the boundaries based on the level of the discomfort level signals (described hereinafter), e.g., by reducing the speed or applied force of the end effector 126 monotonically in relation to the increased level of the discomfort level signals, and/or increasing the haptic feedback in proportion to the level of the discomfort level signals. As the patient discomfort level increases, the maximum contact force threshold is lowered, thus limiting the amount of force the robotic arm 102 can apply to the target 130, and/or providing the discomfort level rating to the specialist 114, who can then interpret this information and manually limit the force that they control the robotic arm 102 to apply through the user input apparatus 112.
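The paragraph above only requires that the contact-force threshold fall monotonically with the discomfort level; a hypothetical linear mapping is sketched below, where the function name, the 0-to-1 discomfort scale, and the 20% floor are all assumptions for illustration.

```cpp
#include <cassert>

// Hypothetical sketch of [0075]: lower the maximum contact-force threshold
// monotonically as the patient's reported discomfort level rises. A linear
// ramp down to an assumed floor of 20% of the base limit is used here.
double adjustedForceLimit(double baseLimitN, double discomfortLevel) {
    if (discomfortLevel < 0.0) discomfortLevel = 0.0; // clamp to [0, 1]
    if (discomfortLevel > 1.0) discomfortLevel = 1.0;
    const double floorN = 0.2 * baseLimitN; // assumed minimum force limit
    return baseLimitN - (baseLimitN - floorN) * discomfortLevel;
}
```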

[0076] Having the boundary routines executed on the operator side 124 can reduce the required bandwidth for the connection 110 by excluding commands that are prohibited or limited entirely on the operator side 124; however, in embodiments, the boundary routines can be executed by a computing system (or "controller") on the robot side 122, which can be a computing system separate from, and connected to, the robot controller 104. Such a robot-side boundary controller can receive the control information (as data) from the master controller 108, and limit any consequent movements of the robotic manipulator 102 based on the regions defined in the boundaries data, by executing the boundary routines.

Passive Compliant Probe Mount

[0077] As shown in Figure 2, the end effector 126 includes: a sensing probe 202; and a compliant probe mount 204 that compliantly connects the probe 202 to the last joint of the robotic manipulator 102. The probe 202 can be a medical imaging probe, and can be an ultrasound probe. The probe 202 includes a contact end that senses and/or contacts the target 130 during operation of the probe 202 (i.e., while the probe 202 is generating sensing measurements or information), generally under the control of the operator 114. The mount 204 is passive in the sense that it receives no electrical power or power from a motor to provide its compliant operation. The mount 204 assists the contact end of the probe 202 to maintain contact with a body, including selected regions of the target 130 as required by the operator 114 during use, and to limit the applied force of the probe 202 to the body, and further to provide a safe limit on possible applied forces by the probe 202. The compliant mount 204 can assist in imaging by following contours of the target's body through its compliance (i.e., elasticity) along the axis of the probe 202, thus allowing the probe 202 to be swept across the body without the operator 114 having to follow every body contour manually, which can be difficult in telemedicine applications.

[0078] As shown in Figure 3, the compliant probe mount 204 includes a fixture 302 permanently attached or affixed, using commercially available glue, to the probe 202. The probe fixture 302 allows a commercially available probe to be adapted to fit the compliant probe mount 204, without any requirement to re-engineer the probe itself. The compliant probe mount 204 includes a probe attachment point 304 (which is also referred to as a "probe attachment piece") that holds the probe fixture 302 in a fixed relation when the probe fixture 302 has been locked onto the probe attachment point 304.
The compliant probe mount 204 includes a detent pin 306 that clamps or holds the probe fixture 302 in the fixed relation to the probe attachment point 304 when in place, and allows for manual release of the probe fixture 302 from the probe attachment point 304 when the detent pin 306 is released manually. The detent pin is a commercially available detent pin, including a detent release button 308 that is manually compressible to release the detent pin 306 from the probe attachment point 304, after which the detent pin 306 can be pulled from the probe attachment point 304 thus releasing the probe fixture 302 from the probe attachment point 304. The probe fixture 302 includes four apertures aligned with the insertion direction of the detent pin 306, and these apertures fit removably onto four respective pins of the probe attachment point 304, thus securing the probe fixture 302 relative to the probe attachment point 304 in directions perpendicular to the insertion axis of the probe fixture 302 (and the detent pin 306). The pins and the apertures that connect the probe fixture 302 and the probe attachment point 304 are positioned in such a way that they sufficiently reduce compliance between the probe fixture 302 and the probe attachment point 304 when they are engaged. At the proximal ends of the pins in the probe attachment point 304, the probe attachment point includes interference surfaces to hold corresponding interference surfaces of the probe fixture 302 at the ends of the respective apertures. The detent pin 306, when in position, forces the probe fixture 302 against the probe attachment point 304, thus forcing the interfering surfaces against each other to hold the probe fixture 302 in fixed relation to the probe attachment point 304 along the insertion axis of the probe fixture 302. 
The detent pin 306 can include a detent washer 312 (also referred to as a detent spacer) that ensures the interfering surfaces are pushed together when the detent pin 306 is in place: the thickness of the detent washer 312 is selected based on the length of the commercially available detent pin 306. This allows different commercially available pins to be used with the compliant probe mount 204. The detent pin 306 operates using two ball bearings in the distal end of an insertion shaft 314 of the detent pin. The balls extend into "slotted cutouts" (i.e., holes or cavities) inside the probe attachment point 304 that are close to the end of a shaft cavity in the probe attachment point 304 that receives the shaft 314. When the detent pin 306 is in place in the probe attachment point 304, the ball bearings extend from the shaft 314, thus locking the detent pin 306 in the probe attachment point 304. Depressing the detent release button 308 allows the ball bearings to retreat into the shaft 314, thus allowing the detent pin 306 to be released from the probe attachment point 304 by pulling on the detent handles 310.

[0079] The compliant probe mount 204 includes an F/T sensor 316 connected in fixed relation to the probe attachment point 304, as shown in Figure 3. The F/T sensor 316 measures forces and torques between the probe attachment point 304 and an adaptor plate 318 (or "adapter plate") of the compliant probe mount 204. The F/T sensor 316 measures forces in three dimensions, and torques around three axes, and these are indicative of corresponding 6-dimensional forces and torques applied between the contact point of the probe 202 and the contacting body. The F/T sensor 316 includes a plurality of strain gauges for measuring the three orthogonal forces and three orthogonal torques, and an electronic output that carries F/T signals representing the measured F/T values to the robot controller 104 over an F/T sensor electronic connector 320. The F/T sensor 316 can include a plurality of threaded holes in each face (e.g., three threaded holes), including in the sensor face (on the sense plane) and in the reference face (on the reference plane) that is opposite from the sensor face (i.e., on the opposite side of the F/T sensor 316). The sensor face connects in fixed relation to the probe attachment point 304 using fixing elements (including threaded elements, e.g., bolts or screws) that hold the probe attachment point 304 to the sensing face of the F/T sensor 316. Similarly, the adaptor plate 318 is fixed using fixing elements (including threaded elements, e.g., bolts or screws) that fit the adaptor plate 318 in fixed relation to the reference face of the F/T sensor 316.

[0080] The adaptor plate 318 is configured: (i) to mount the F/T sensor 316 to a linear slide of the compliant probe mount 204; (ii) to interface with a passive compliant element in the form of a spring element 322 that provides the compliance of the compliant probe mount 204; and (iii) to mount protective covers that fit around the compliant probe mount 204. The adaptor plate 318 is fixedly connected to a carriage 324 of the slide using fixing elements (e.g., screws or bolts).

[0081] The carriage 324 is fixed in linear sliding relation to a rail 326 of the slide, such that the carriage 324 is constrained to move linearly along the axis of the compliant probe mount 204. The sliding axis of the carriage 324 is parallel to the axis of the probe 202, which is coaxially aligned to the last joint of the robotic manipulator 102. The slide, including the carriage 324 and the corresponding rail 326, can be a commercially available slide, with a selected length, which can be selected based on the possible maximum allowable displacement of the probe 202 in the compliant probe mount 204, e.g., 5 to 50 millimetres, or up to 100 millimetres.

[0082] The adaptor plate 318 includes an interface for the spring element 322. The interface for the spring element 322 includes an end cap 328 that acts against the end of the spring element 322, and thus transfers linear force from the spring element 322 to the adaptor plate 318 (and the other components of the compliant probe mount 204 that move with the adaptor plate 318). The interface for the spring element 322 includes a centre element 330 that holds the spring element 322 onto the end cap 328 by extending into the centre of the spring element 322. The end cap 328 provides an annular mating surface for the end of the spring element 322, and thus this surface can be built up, allowing fitment of a shorter spring element 322, using commercially available washers or spacers. The spring element 322 can be a commercially available spring, including a coiled metal spring.

[0083] The adaptor plate 318 includes external mounting points 602 for fitting the covers, and these mounting points 602 can be accessed once the compliant probe mount is assembled, and attached to the robotic manipulator 102, as shown in Figure 6. The covers can protect and limit access to the moving components of the compliant probe mount 204, e.g., to keep them clean and reduce interference.

[0084] The slide rail 326 is fixedly connected to a super structure 332 of the compliant probe mount.

[0085] The super structure 332 provides the main interface between the compliant probe mount 204 and the robotic manipulator 102. The super structure 332 is fixedly attached, using bolts 344, to an end effector attachment point (also referred to as the "end-effector mount") of the robotic manipulator 102.

[0086] The compliant probe mount 204 includes two limit switches 334 (i.e., a plurality of limit switches) that detect movement of the probe 202 beyond a selected distance towards the end effector attachment point. Such movement is due to applied forces on the probe 202. The compliant probe mount 204 automatically and mechanically detects movement of the carriage 324 reaching a pre-selected point along the rail 326 using the limit switches 334. Each limit switch 334 can be a commercially available switch, which can be a lever micro switch with a lever 336 that is activated by movement of the carriage 324 to an interference point with the lever 336 along the rail 326. The limit switches 334 can include other commercially available switches, including buttons, magnetic switches, and/or optical switches.

[0087] The limit switches 334 are electrically connected to the robot controller 104 to transmit emergency limit signals to the robot controller 104 indicating that the probe 202 has moved beyond the pre-selected maximum mount movement limit, thus indicating that the probe 202 has been moved too far into the body, due to a corresponding applied force on the spring element 322, thus triggering an emergency stop of the robotic manipulator 102. The two limit switches 334 are connected in series such that the emergency limit signal is sent if either of the limit switches 334 is activated by the movement of the compliant probe mount 204 beyond the pre-selected maximum distance. This configuration may be referred to as "fail-safe" wiring.

[0088] The mount 204 includes a switch mount 338 that holds the limit switches 334, but that can be slid along the super structure 332, and then affixed to the super structure 332, in a direction along the rail 326, thus allowing the maximum displacement distance of the probe 202 to be physically set by the positioning of the switch mount 338 (and thus the limit switches 334). The switch mount 338 cups around a Z-axis spine of the super structure 332, and includes a screw that allows the switch mount 338 to be slid along the spine, and tightened to hold the switch mount 338 in place.

[0089] The compliant probe mount 204 includes a pre-tension element 340, which can be in the form of a hex bolt driving an end cap that fits into, and presses on, an end of the spring element 322. The pre-tension element 340 can be manually rotated (using an Allen key extended through an aperture in the super structure 332) to pre-tension the spring element 322.

[0090] The switch mount 338 allows movement of the limit switches 334 along the Z axis of the compliant probe mount 204. The pre-tension element 340 allows a pre-tensioning end cap 342, which co-operates with the end cap 328 to hold the spring element 322 in place, to also move along the Z axis of the compliant probe mount 204. The combination of the pre-tension element 340 and the adjustable switch mount 338 allows control of the maximum acceptable movement within the compliant probe mount 204, and thus the maximum acceptable force applied to the compliant probe mount 204 along the Z axis.

[0091] The super structure 332 can be formed of one piece from metal, including aluminium or steel. The super structure 332 holds the rail 326 and the counter-end of the spring element 322 in fixed relation to the end effector attachment point of the final joint of the robotic manipulator 102. The super structure 332 includes mounting points to receive the mounting bolts 344 that mount the super structure 332 in fixed relation to the end effector attachment point. The robotic manipulator 102 can be a commercially available robotic manipulator, and thus the end effector attachment point can have a predetermined configuration. The super structure 332 is configured to hold the probe 202 in line with the final rotational axis of the robotic manipulator 102 (e.g., joint six), such that rotation of the final joint rotates the probe 202 about its central longitudinal axis: this alignment of the probe 202 and the final joint can extend the operating range of the robotic manipulator 102, can allow for more efficient programming of the robotic operational routines, can allow for efficient motion of the robotic manipulator 102 while rotating the probe 202 at the same contact point with the body (which may be advantageous for probes with rotatable imaging apparatuses, including ultrasound probes), and can allow for no additional pressure at the contact point while the probe 202 is rotated. Having the probe 202 aligned to the final rotational axis of the robot can improve clearance in the compliant probe mount 204 by avoiding interferences as the probe 202 moves in the compliant probe mount 204 towards the final joint of the robotic manipulator 102.

[0092] The compliant probe mount 204 is assembled by a method including steps of:

attaching the carriage 324 (also referred to as a "linear bearing") to the rail 326;

attaching the rail 326 to the super structure 332;

electrically connecting the F/T sensor 316 to the robot controller 104;

electrically connecting the limit switches 334 to the robot controller 104;

electrically connecting the probe 202 to the robot controller 104, using a probe port 206 on the probe 202;

attaching the pre-wired F/T sensor 316 to the adaptor plate 318;

attaching the spring-locating elements (including the end cap 328 and the centre element 330) to the adaptor plate 318;

attaching the adaptor plate 318 to the carriage 324;

inserting the pre-tension element 340 into the super structure 332;

inserting the spring element 322 (which can be referred to as a "Z-axis spring"), and the associated end caps 342, 328 (including the centre element 330);

attaching the probe attachment point 304 to the F/T sensor 316;

assembling the pre-wired limit switches 334 into the switch mount 338 to form a switch carriage (also known as a "microswitch" carriage);

attaching the switch carriage to the spine of the super structure 332 so it can slide along the Z axis of the super structure 332;

connecting the protective covers to the adaptor plate 318, and securing the electrical cables to allow appropriate movement of the compliant probe mount 204;

fixing the super structure 332 to the end effector attachment point;

partially inserting the detent pin 306 into the probe attachment point 304;

fitting the probe fixture 302 to the probe attachment point 304 using the cooperating holes and pins, where the probe fixture 302 is pre-attached to the pre-wired probe 202; and

performing a maximum-force and maximum-movement calibration procedure, with or without the probe 202 in place.

[0093] The maximum allowable movement, and force, of the compliant probe mount 204 are set in the calibration procedure that includes steps of:

selecting the maximum allowable travel of the probe 202 along the Z axis, which is preferably more than zero so the probe 202 can maintain contact with the body as it is moved over the body, thus providing mechanical following of the body's contours, but not too much such that the haptic feedback provided by the F/T sensor 316 to the user input apparatus 112 is too spongy for precise control by the operator 114;

selecting the maximum allowable force to be applied along the Z axis of the compliant probe mount 204 by the probe 202 on the body (e.g., based on testing with patients, which can be between 10 and 70 Newtons, including 35 Newtons), and holding the probe 202 against a fixed testing body, and monitoring the applied force signals from the F/T sensor 316 while adjusting the pre-tension element 340 to reach this force value; and

moving the switch mount 338 along the spine of the super structure 332, and fixing the switch mount 338 such that the limit switches 334 are activated precisely at the position of the carriage 324 at which the applied force is at the selected maximum value.

[0094] The configuration of the compliant mount 204 allows the probe 202 to be mounted to the robotic manipulator 102 with the F/T sensor 316 in the loop. The limit switches 334 automatically trigger an emergency stop if excess axial force is applied to a target body. The relationship of the probe fixture 302, probe attachment point 304, and the detent pin 306 allows the probe 202 to be efficiently and manually removed from the end effector 126 for separate manual use by a person on the robot side 122 (e.g., which may include stopping operation of the robotic manipulator 102, e.g., by activating the freedrive mode). The mechanically triggered emergency stop provided by limit switches 334 is activated at a pressure that can be selected by controlling the pre-compression of the spring element 322.

Over-Force Sensing and Response

[0095] The robot controller 104 is configured to execute an over-force monitoring routine that operates continuously during operation of the robotic manipulator 102. The over-force monitoring routine monitors a total of applied forces/torques to the robotic manipulator 102, and controls the robotic manipulator 102 to limit this total applied force to a pre-selected maximum total level, i.e., keeping it at or below a pre-selected over-force threshold. The selected maximum allowable total force can be between 35 and 140 Newtons, including 70 Newtons (N). As the safety systems in the system 100 are staggered, the over-force threshold can be set between the lowest detectable force (e.g., 0.1N), and the highest applicable total force (e.g., 50N per joint) because the over-force routine monitors the sum of forces or torques over all joints. The over-force monitoring routine stops the operation of the robotic manipulator 102, e.g., by activating the freedrive mode, when the over-force threshold is exceeded.

[0096] The over-force monitoring routine uses a force/torque measurement at each joint in the robotic manipulator 102 between the robot support 134 and where the force is applied (i.e., all joints between the static environment, including the bed 132, and the applied force). The F/T measurements are made by at least one internal F/T sensor in each joint. Each F/T sensor may measure torque applied at its corresponding joint. The internal joint F/T measurements are sent as signals from the joint sensors, and converted to values represented in measurement data in the robot controller 104.

[0097] The over-force routine is executed by one or more computer processors of the robot controller 104 following machine-readable instructions stored on computer-readable storage/memory of the robot controller 104. The over-force routine can be implemented using machine-readable instructions generated from computer code written in a scripting language of the robot controller 104, for example the scripting code in Appendix B.

[0098] The over-force routine forms a method including the following steps (which are implemented based on respective portions of the scripting code referred to by line numbers in the following steps):

receiving at least one F/T measurement from a sensor at each operational joint in the robotic manipulator 102 between the end effector mount and the robot support, e.g., based on code line 118;

generating total applied force data, representing a total applied force vector, by performing scalar addition of the forces or torques from the operational joints to determine the total applied force vector, e.g., based on code internal to the robotic controller and generated by the "force()" command in line 118;

if the total applied force is greater than the selected maximum allowable total force, the robot controller 104 generates an over-force signal, e.g., based on code line 126; and

the robot controller 104 receives the over-force signal and kills program execution, and the mechanical safety switches then enable the robotic manipulator 102 in the freedrive mode, in which the patient or an assistant on the robot side 122 can physically move the end effector 126, and the robotic manipulator 102, to a safe position, e.g., based on code lines 123-124.
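The over-force check in the steps above can be sketched as a scalar sum of per-joint force magnitudes compared against the threshold. This is an illustrative C++ sketch, not the Appendix B script or the robot controller's internal force() command, and the names jointForcesN and isOverForce are hypothetical.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Illustrative sketch of the over-force test of [0095]-[0098]: sum the
// magnitudes of the per-joint force (or torque) measurements and flag an
// over-force condition when the total exceeds the pre-selected threshold
// (e.g., 70 Newtons), which would trigger the freedrive-mode emergency stop.
bool isOverForce(const std::vector<double>& jointForcesN, double thresholdN) {
    double totalN = 0.0;
    for (double f : jointForcesN) {
        totalN += std::fabs(f); // scalar addition over the operational joints
    }
    return totalN > thresholdN;
}
```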

[0099] The over-force routine allows the robot controller 104 to monitor the total force applied using F/T sensors in the joints of commercially available robotic manipulators, e.g., based on code lines 118 and 126.

[0100] The over-force routines can be written in scripts using coding languages associated with commercially available robotic manipulators, e.g., the robotic manipulators from Universal Robots A/S in Denmark. The robot controller 104 can be a commercially available robot controller, e.g., from Universal Robots A/S.

Monitoring Patient Discomfort During Procedure

[0101] As shown in Figure 1, the system 100 includes a comfort control 700 that is configured to be operated by the patient, including manually. The comfort control 700 is connected to the robot controller 104 by a comfort monitor link 136 that carries discomfort signals, including discomfort level signals and emergency discomfort signals, from the comfort control 700 to the robot controller 104. The discomfort signals can be electronic signals.

[0102] As shown in Figures 7-10, the comfort control 700 (also referred to as a "comfort switch assembly") includes a housing 702 with ergonomic finger grooves 704 (also referred to as dips, cutouts, or dimples) to receive the patient's fingers while grasping and holding the comfort control 700. The comfort control 700 includes an ergonomic lip 706 that extends over the fingers of the patient while the comfort control 700 is held. The comfort control 700 includes a button 708 in a thumb position on the comfort control 700. The button 708 can be depressed into the housing 702 by a thumb force from the patient. Due to a reactive "upward" force on the button 708, the thumb force of the patient on the button 708 forces the comfort control 700 in one direction ("downward"), and the ergonomic lip 706 rests on the fingers of the patient to resist this downward force, thus making the comfort control 700 easier to grasp while the button 708 is being depressed. The force applied to the button 708 can be significant because the comfort control 700 can activate one of the emergency stop switches in the system 100, and this would preferably not be done accidentally.

[0103] As shown in Figure 11, the button 708 is rigidly connected to a plunger 1102 that includes a projection 1104 that activates an emergency-signal transducer (also referred to as "an output transducer for emergency discomfort" or "an emergency discomfort transducer") in the form of an on/on switch 1106, which in turn generates the emergency discomfort signal. The projection 1104 pushes against a toggle 1107 of the switch 1106 when the plunger 1102 reaches its maximum designed depth in the housing 702. As shown in Figure 11, the projection 1104 is configured to press against the toggle 1107 close to the fulcrum of the toggle 1107 to increase the force required to activate the switch 1106.

[0104] Once the switch 1106 has been activated, the toggle 1107 moves to a stop position, and the toggle 1107 can then be accessed through an aperture 802 in the housing that allows the toggle 1107 to be reset to the initial position. As shown in Figure 11, with the toggle 1107 in the initial position, the toggle 1107 cannot be moved or accessed through the aperture 802. The plunger 1102 includes a cutout 1108 through which the toggle 1107 projects during movement of the plunger 1102. The upper end of the cutout 1108 includes the projection 1104 for activating the on/on switch 1106. The lower (thus opposite) end of the cutout 1108 does not interfere with the on/on switch 1106, or its toggle 1107, in any operational position of the plunger 1102. Thus, the on/on switch 1106 is reset by a manual activation of the toggle 1107 through the aperture 802 and not by the returning movement of the plunger 1102.

[0105] The comfort control 700 includes a comfort-signal transducer 1110 (also referred to as a "discomfort transducer", "an input transducer for discomfort levels", or "a level transducer") in the form of a commercially available potentiometer. A pin of the potentiometer is mounted into the plunger 1102, and moves in fixed relation to the plunger 1102 to change the signal level from the transducer 1110 as the plunger 1102 moves. The body of the potentiometer generates a value for the discomfort signals based on the depth of the plunger 1102; this value is thus monotonically (and linearly) related to the depression of the button 708.
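The linear relation between plunger depth and the level transducer's output can be sketched as follows. The plunger travel and full-scale voltage are illustrative assumptions (the patent does not give values); the quantisation step mirrors code line 115 of Appendix B.

```cpp
// Hedged sketch of the level transducer of paragraph [0105].
#include <cassert>
#include <cmath>

const double MAX_DEPTH_MM = 10.0; // full plunger travel (assumed value)
const double V_FULL_SCALE = 2.7;  // full-scale analog input (assumed; cf. the divisor in Appendix B, line 115)

// Linear, monotonic map from plunger depth to the potentiometer output.
double discomfortVoltage(double depth_mm) {
    return V_FULL_SCALE * (depth_mm / MAX_DEPTH_MM);
}

// Discrete discomfort level, mirroring Appendix B line 115: floor((AnIn0*10)/2.7)
int discomfortLevel(double depth_mm) {
    return (int)std::floor((discomfortVoltage(depth_mm) * 10.0) / 2.7);
}
```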

[0106] If no force is applied to the button 708, the plunger 1102 is pushed to its outermost position by a spring element 1112 in the housing 702. The spring element 1112 is held between a lower locating cap shaped in the inner end of the plunger 1102 (i.e., the opposite end from the button end of the plunger 1102) and a locating cap 1114 fixed in the housing 702 along the movement axis (the Z axis) of the plunger 1102.

[0107] The components of the comfort control 700 fit inside the housing 702 without separate adhesives or mechanisms (e.g., glue, screws or bolts), and are held in place by the two halves of the housing 702 (including one half 902 and another symmetrical half 904, as shown in Figure 9) being held together by fixing elements 804 (e.g., bolts, pins or screws). The housing 702 is symmetrical from side to side, allowing the comfort control 700 to be operated equally in the patient's left hand or the patient's right hand.

[0108] The cap 1114 provides an end spacing, and the thickness of this end spacing can be selected, in combination with the length and spring constant of the spring element 1112, to control the force required to move the plunger 1102 into the housing 702, and thus to generate the discomfort level signals.

[0109] The spring constant of the on/on switch 1106, and the distance of the projection 1104 along the toggle 1107 from the fulcrum of the toggle 1107, are selected to determine the force required from the plunger 1102 to activate the emergency discomfort signal.
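These two force selections follow simple mechanics and can be sketched as below. Every numeric value is an illustrative assumption; the patent states only that these parameters are selected to set the required forces.

```cpp
// Hedged sketch of the force selections in paragraphs [0108]-[0109].
#include <cassert>

const double SPRING_K_N_PER_MM = 0.5; // spring constant of spring element 1112 (assumed)
const double PRELOAD_MM = 2.0;        // preload compression set by the end spacing of cap 1114 (assumed)

// Hooke's law: force needed to hold the plunger 1102 at a given depth.
double plungerForce(double depth_mm) {
    return SPRING_K_N_PER_MM * (depth_mm + PRELOAD_MM);
}

// Lever effect at the toggle 1107: delivering the same activating torque
// closer to the fulcrum requires proportionally more force from the projection 1104.
double requiredToggleForce(double ratedForce_N, double ratedDist_mm, double pressDist_mm) {
    return ratedForce_N * ratedDist_mm / pressDist_mm;
}
```

This illustrates why placing the projection 1104 close to the fulcrum increases the force required to activate the switch 1106.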

[0110] The master controller 108 receives the discomfort level signals, and generates one or more discomfort indicators for the operator 114 that are monotonically related to the level or value of the discomfort level signals. The indicators may include audible indicators (e.g., beeping), tactile indicators (e.g., vibration of the user input apparatus 112), increased resistance of the user input apparatus 112 in a direction towards the patient's body, or a visual indicator (which can be on the display 118).
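One possible monotonic mapping from the discomfort level to the indicators can be sketched as follows. The scales (beeps per second, vibration amplitude, resistance fraction) are assumptions for illustration only; the patent requires only that the indicators grow with the discomfort level.

```cpp
// Hedged sketch of a monotonic indicator mapping for paragraph [0110].
#include <cassert>

struct Indicators {
    int beepRate;       // audible beeps per second (assumed scale)
    int vibration;      // tactile vibration amplitude, 0-100 (assumed scale)
    double resistance;  // added resistance towards the patient body, 0.0-1.0 (assumed scale)
};

// Monotonic map: every indicator grows with the discomfort level (0-10).
Indicators indicatorsFor(int discomfortLevel) {
    Indicators out;
    out.beepRate = discomfortLevel;
    out.vibration = discomfortLevel * 10;
    out.resistance = discomfortLevel / 10.0;
    return out;
}
```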

[0111] In alternative embodiments, the "click" feel provided by the toggle of the on/on switch 1106 in the comfort control 700 could be provided by an alternative mechanical system, e.g., pushing the plunger 1102 past a ball resiliently held in the housing 702; and the on/on operation of the emergency switch 1106 (also referred to as a "set-reset operation") could be provided by an electronic microchip, thus removing the need for the protrusion in the housing 702, opposite the lip 706, that holds the large on/on switch 1106.

[0112] In embodiments, the comfort control 700 can include a tactile feedback apparatus, including an off-axis vibrating disc motor, that is driven to vibrate at a level monotonically related to the level of the discomfort level signals, thus providing tactile feedback to the patient as the level in the discomfort level signals increases, and the maximum displacement of the plunger 1102 approaches, i.e., as the activation of the emergency switch 1106 is approached. Alternatively and/or additionally, the comfort control 700 could provide audible feedback also monotonically related to the level of the discomfort level signals, e.g., a beeping tone for the patient, and other persons on the robot side 122, to hear.

End-Effector Safety Control

[0113] As shown in Figures 1, 2, and 4-6, the system 100 includes an emergency safety control 138 on the final joint of the robotic manipulator 102.

[0114] The emergency control 138 is affixed to the final joint in a position where a person, including the patient or a person on the robot side 122, would instinctively grasp the robotic manipulator 102 during a procedure, e.g., in an attempt to move the end effector 126 away from the patient. This location may be preferable to equivalent emergency switch controllers placed on or around the bed 132 or support 134.

[0115] The emergency control 138 is affixed on the same axis as the probe 202, but accessed from the opposite direction, thus making the emergency control 138 accessible to a person assisting the operation on the robot side 122 while the probe 202 is pointing towards or touching the patient, regardless of the rotational position of the probe 202 and the end effector 126 on that final joint. The visual appearance (including a colour) of the emergency control 138 is selected to contrast sharply from the appearance of the last joint of the robotic manipulator 102 so that it can be immediately visible to a person on the robot side 122. The emergency control 138 generates an electronic emergency control signal when activated manually by a person attending the target 130, or by the target 130 themselves.

[0116] The emergency stop switches of the system 100 are connected in series such that activation of any one switch stops the robotic manipulator 102, including due to any one of the following plurality of emergency signals: the emergency applied-force signal, the emergency discomfort signal, the emergency control signal, and the emergency over-force signal. This configuration of emergency stop switches can be referred to as a "fail safe" configuration. Stopping the robotic manipulator 102 includes activating the freedrive mode of the robotic manipulator 102.
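The logic of the series connection can be sketched as follows: each normally-closed switch must stay closed for the run signal to pass, so activating (opening) any one switch stops the manipulator. The array representation is an illustration of the series wiring, not the patent's circuit.

```cpp
// Hedged sketch of the series ("fail safe") emergency-stop chain of paragraph [0116].
#include <cassert>

bool runPermitted(const bool switchClosed[], int count) {
    for (int i = 0; i < count; ++i) {
        if (!switchClosed[i]) {
            return false; // any activated (open) switch breaks the series circuit
        }
    }
    return true;
}
```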

Constant Contact Pressure Mode

[0117] As mentioned hereinbefore, the constant contact pressure mode is for maintaining a constant contact pressure on the target 130 as the end effector 126 is moved.

[0118] In contrast to the freedrive mode, the constant contact pressure mode provides remote control of the end effector 126 by the master controller 108. The constant contact pressure mode can be used independently of, or with, the boundaries mode.

[0119] The constant contact pressure mode may be referred to as an open-loop haptic control mode, or a non-haptic control mode, because the applied force is controlled locally on the robot side 122 without requiring the haptic (force-feedback) signals to be sent to the operator side 124. The constant contact pressure mode may automatically become available when network performance (i.e., quality-of-service (QoS) of the remote link 110) deteriorates beyond a level practical for servicing the closed-loop haptic control. The constant contact pressure mode may be invoked by the user for procedures requiring very light or high contact pressures to be maintained, whereby the user relies on the robotic controller 104 to maintain their chosen contact pressure, thus reducing the user's physical and/or cognitive workloads.

[0120] In the constant contact pressure mode, the robotic controller 104 receives a desired contact force signal representing a selected contact force to be applied by the end effector 126. The desired contact force signal is received by the robotic controller 104 in the remote-control instructions from the master controller 108. The remote-control instructions represent the selected contact force selected by the user using the operator inputs from the user input apparatus 112, or using an alternative user interface of the master controller 108. The selected contact force can be a scalar force value selected by the user, e.g., between available minimum and maximum selectable values in the master controller 108, and the robotic controller 104 then maintains the applied force at this scalar force value (or close to it) as the user moves the end effector 126, and thus the probe, across a surface of the target 130.
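Limiting the user's selection to the available range can be sketched as a simple clamp. The minimum and maximum values below are assumptions; the patent states only that minimum and maximum selectable values exist in the master controller 108.

```cpp
// Hedged sketch of limiting the selected contact force per paragraph [0120].
#include <cassert>

const double MIN_FORCE_N = 0.5;  // minimum selectable value (assumed)
const double MAX_FORCE_N = 20.0; // maximum selectable value (assumed)

double clampSelectedForce(double requested_N) {
    if (requested_N < MIN_FORCE_N) return MIN_FORCE_N;
    if (requested_N > MAX_FORCE_N) return MAX_FORCE_N;
    return requested_N;
}
```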

[0121] In the constant contact pressure mode, the robotic controller 104 automatically maintains the selected contact force on the target 130 as the end effector 126 is moved across the target 130, i.e., generally parallel to the surface of the target 130, so the user does not need to worry about exerting a constant force with the user input apparatus 112 as the user scans the end effector 126 across the target 130. This may be of value in scanning applications, e.g., with the ultrasound probe, to keep the probe in contact with the surface of the target 130 while sweeping the probe across the body of the target 130.

[0122] In order to keep the applied force at or about the selected contact force, the robotic controller 104 receives a motion control signal representing a motion command for the end effector, and processes the desired contact force signal and the motion control signal to generate a control signal to control motion of the end effector 126 following the motion command while maintaining the selected contact force applied by the end effector. The robotic controller 104 accesses applied-force signals representing the applied force between the end effector 126 and the target 130, and processes the applied-force signals with the desired contact force signal and the motion control signal to generate the control signal. The applied-force signals are generated as described hereinbefore.

[0123] As shown in Figure 12, the operator side 124 includes a user interface with a contact force selector 1202. The user controls the contact force selector 1202 to send the selected contact force with the remote-control instructions. The user also controls the user input apparatus 112 to move the end effector 126 across the target 130. The robotic controller 104 on the robot side 122 controls the robotic manipulator 102 to follow nonlinear contours of the target 130.

[0124] The contact force selector 1202 can be provided in the routines of the master controller 108 described hereinbefore. The routines of the master controller 108 can also include a user control to switch the constant contact pressure mode on and off by sending mode instructions to the robotic controller 104. The contact force selector 1202 may include a physical dial or slider, an on-screen dial or slider, a touch-sensitive display, a computer mouse, a computer keyboard, or other input devices.

[0125] As shown in Figure 12, the user motion 1204 of the user input apparatus 112 may include a straight line 1204 (i.e., a linear motion) to generate the motion commands; however, the surface 1206 of the target 130 can be other than straight (i.e., nonlinear). In the constant contact pressure mode, the robotic controller 104 adjusts the position of the end effector 126 in a direction 1208 that may be generally perpendicular to the surface 1206 while following the user motion 1204, thus resulting in an actual controlled movement 1210 that follows the curvature of the surface 1206 instead of the straight line 1204.
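A two-dimensional sketch of this correction is shown below, with x as the scan direction of the user motion 1204 and z as the surface-normal direction 1208. The gain and the sign convention (+z towards deeper contact) are assumptions; the patent does not give a specific control law at this point in the description.

```cpp
// Hedged 2D sketch of the perpendicular correction in paragraph [0125].
#include <cassert>
#include <cmath>

const double NORMAL_GAIN_M_PER_N = 0.001; // normal correction per newton of force error (assumed)

struct Pos { double x; double z; };

Pos correctedStep(Pos current, double dxCommanded,
                  double setForce_N, double measuredForce_N) {
    Pos next = current;
    next.x = current.x + dxCommanded; // follow the commanded (straight-line) motion
    // Too little contact force: move further towards the surface; too much: back off.
    next.z = current.z + NORMAL_GAIN_M_PER_N * (setForce_N - measuredForce_N);
    return next;
}
```

Repeating such steps produces a controlled movement that tracks the surface contour rather than the commanded straight line.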

[0126] As shown in Figure 13, the motion commands can be in the form of a commanded motion 1302 sent from the operator side 124 to the robot side 122. The selected contact force can be in the form of a command to set the contact force 1304 sent from the operator side 124 to the robot side 122. The applied-force signal can represent a measured contact force 1306 from the F/T sensor 1212 in the end effector 126 (which can be the F/T sensor 316).

[0127] In the constant contact pressure mode, the robotic controller 104 executes a constant contact pressure routine using the computer processors of the robot controller 104 following machine-readable instructions stored on the computer-readable storage/memory of the robot controller 104. The constant contact pressure routine can be implemented using machine-readable instructions generated from computer code written in a scripting language of the robot controller 104, for example scripts using coding languages associated with commercially available robotic manipulators, e.g., the robotic manipulators from Universal Robots A/S in Denmark. The constant contact pressure routine can be implemented as a control loop running continuously in combination with the over-force monitoring routine described hereinbefore.

[0128] The constant contact pressure routine includes the robotic controller 104 determining a first error (or a difference) between the set contact force 1304 and the measured contact force 1306 in a first comparison module 1308 of the robotic controller 104. The constant contact pressure routine includes the robotic controller 104 determining a second error (or a difference) between the first error and the commanded motion 1302 in a second comparison module 1310 of the robotic controller 104. An output signal from the second comparison module 1310 then represents the motion control input (from the commanded motion 1302) modified by the difference between the set contact force 1304 and the measured contact force 1306 to maintain the selected contact pressure on the target 130. The first comparison module 1308 and the second comparison module 1310 can be script modules in the robotic controller 104. The first comparison module 1308 and the second comparison module 1310 can be combined into one module, or different modules of equivalent overall functionality can be provided to modify the commanded motion 1302 based on differences between the set contact force 1304 and the measured contact force 1306. These modules can include a control-loop feedback routine to keep the measured contact force 1306 tracking the set contact force 1304 as quickly as possible while avoiding undesirable resonances, e.g., using a proportional-integral-derivative controller (PID controller) routine in the script modules. The parameters of the control-loop feedback routine may be selected for each application of the robotic manipulator 102, e.g., taking into account expected resistance of the target 130 (e.g., which body parts) and inertia of the robotic manipulator 102 and the end effector 126.
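A minimal sketch of the control-loop feedback routine mentioned above is given below: a PID term on the error between the set contact force 1304 and the measured contact force 1306. The gains, the time step, and the spring-like plant used in the test are illustrative assumptions; the patent notes only that the parameters are selected per application.

```cpp
// Hedged PID sketch for the control-loop feedback routine of paragraph [0128].
#include <cassert>
#include <cmath>

struct PID {
    double kp, ki, kd;
    double integral, prevError;
    PID(double p, double i, double d)
        : kp(p), ki(i), kd(d), integral(0.0), prevError(0.0) {}

    // Returns the motion correction for one control cycle of length dt.
    double update(double error, double dt) {
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};
```

With the target modelled as a simple spring (measured force proportional to probe depth), iterating this loop drives the measured contact force towards the set contact force; the real routine would run continuously alongside the over-force monitoring routine.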

[0129] Having the constant contact pressure routine executed on the robot side 122 may allow the selected contact pressure to be maintained even if there are quality-of-service (QoS) problems (including latency and packet loss, etc.) in the remote link 110. If QoS problems arise, e.g., reducing the available bandwidth or increasing the latency of the remote link 110, the robotic controller 104 and the master controller 108 can include communications control routines that prioritise imaging traffic, including signals for probe signal visualisation 1314, over force-feedback commands (i.e., haptic information) for the master controller 108, leaving the applied force to be maintained at the selected contact force by the constant contact pressure routine.

[0130] The constant contact pressure routine allows a user (e.g., a specialist) to set their optimal contact pressure using the contact force selector 1202, and then have the robotic manipulator 102 automatically maintain this contact pressure once the user has moved a stylus (of the user input apparatus 112) to contact the target 130 (e.g., a patient). Once in contact, the robotic manipulator 102 moves to maintain the correct pressure, while the user uses their haptic device (of the user input apparatus 112) to move/orient the contact point. In the constant contact pressure mode, no haptic force feedback needs to be presented to the user, although an on-screen display may still communicate instantaneous contact pressure to the user. In the constant contact pressure routine, the closed control loop that maintains the selected pressure is located on the robot side 122, with only the desired contact pressure information needing to be sent from the specialist when it changes. This allows the robot manipulator 102 to respond very quickly to small changes in pressure, independent of the end-to-end latency from the robotic controller 104 to the master controller 108 (sending the current applied force signals as haptic signals to the user input apparatus 112) and back from the master controller 108 to the robotic controller 104 (i.e., the motion response from the user input apparatus 112 to the current applied force), potentially improving the consistency of the contact pressure compared to the closed-loop haptic modes. In the constant contact pressure mode, the master controller 108 no longer needs to receive the haptic force feedback over the remote link 110.

[0131] In the constant contact pressure mode, the end effector 126 may still be positioned naturally using the haptic user input apparatus 112, but there may be less chance of the user instructing the robotic manipulator 102 to apply too much or too little pressure to the target 130 when traversing with the end effector 126 (and the probe). For example, in a chest-imaging application, the probe may automatically adjust for rising and falling of the lungs caused by breathing; or, in a body-imaging application, the end effector 126 may automatically maintain the selected pressure when traversing between soft tissue and bony areas of the target 130.

Alternatives

[0132] In embodiments, the robotic manipulator 102 can be another robotic manipulator that is controllable at a low-enough level to receive and respond to motion commands at a fast-enough rate for haptic control, and that has mechanical and performance characteristics suitable for the system 100.

[0133] In embodiments, the probe 202 can include a stethoscope, an electrocardiograph, at least one electroencephalography electrode, a palpation probe, a blood-pressure probe, a temperature probe, a thermal camera, an optical camera, and/or a microscope.

Interpretation

[0134] Many modifications will be apparent to those skilled in the art without departing from the scope of the present invention.

[0135] The reference in this specification to any prior publication (or information derived from it), or to any matter which is known, is not, and should not be taken as, an acknowledgment or admission or any form of suggestion that the prior publication (or information derived from it) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.

APPENDIX A: C++ CODE

583 //Calculate the delta position of the haptic device from current measured position minus previous "old" position
584 cVector3d deltaHapticPos(newHapticPos.x - OldHapticPos.x, newHapticPos.y - OldHapticPos.y, newHapticPos.z - OldHapticPos.z);
585 PositionScaler = ((double)PROGRAM_POSITION_GAIN)/100; // Scalar quantity used to scale the position motion of the UR5 relative to the haptic device
586 deltaHapticPos.mul(PositionScaler); // Multiply delta haptic position by scalar
587
588 //Calculate Normal force vector to limit user from applying too high force
589 double norm = force.x*force.x + force.y*force.y + force.z*force.z;
590 norm = sqrt(norm);
591 //printf("force: %f, deltaX: %f deltaY: %f, deltaZ: %f\n", norm, deltaHapticPos.x, deltaHapticPos.y, deltaHapticPos.z);
592 printf("force: %f, plunger: %d\n", norm, AnIn0);
593
594 cVector3d Recforcedirection;
595 cVector3d forcedirection(force.x/norm, force.y/norm, force.z/norm);
596
597 // Limit X-axis to never go above 700mm and never go below 300mm. Limit Z-axis to never go above 610mm and never go below 100mm
598 if ((current.x < 0.7) && (current.x > 0.15) && (current.z < 0.610) && (current.z > 0.07))
599 {
600   if (norm > 5)
601   {
602     cVector3d CheckNextPos; // Define vector for next position to move robot to
603     CheckNextPos.x = UR5Position.x - deltaHapticPos.x; // Calculate where the next position in X direction will be based on haptic motion
604     CheckNextPos.y = UR5Position.y - deltaHapticPos.y; // Calculate where the next position in Y direction will be based on haptic motion
605     CheckNextPos.z = UR5Position.z + deltaHapticPos.z; // Calculate where the next position in Z direction will be based on haptic motion
606
607     if (ForceFlag == true)
608     {
609       ForceFlag = false;
610       ContactPos.x = UR5Position.x; // Store current UR5 position X
611       ContactPos.y = UR5Position.y; // Store current UR5 position Y
612       ContactPos.z = UR5Position.z; // Store current UR5 position Z
613       Recforcedirection = forcedirection; // Store the direction of the normal force vector at the first instance of the contact time
614       //Calculate the status of the first position of the UR5 with respect to the orthogonal plane at the first force limit
615       CurrPlane = Recforcedirection.x*(CheckNextPos.x - ContactPos.x) + Recforcedirection.y*(CheckNextPos.y - ContactPos.y) + Recforcedirection.z*(CheckNextPos.z - ContactPos.z);
616       PrevPlane = CurrPlane; // Store current plane as previous for reference.
617     }
618     else
619     {
620       //Update the status of the current position with respect to the orthogonal plane.
621       CurrPlane = Recforcedirection.x*(CheckNextPos.x - ContactPos.x) + Recforcedirection.y*(CheckNextPos.y - ContactPos.y) + Recforcedirection.z*(CheckNextPos.z - ContactPos.z);
622
623       UR5Position.set(output.x, output.y, output.z); // Stores the position to be sent to the UR5 into relativePosition Vector (at startup is the basic output position defined above)
624       OldHapticPos.set(newHapticPos.x, newHapticPos.y, newHapticPos.z); // Stores the position of the Omni into relativeHaptic Vector
625       UR5Orient.set(output.rx, output.ry, output.rz); // Stores the orientation to be sent to the UR5 into orientation Vector (at startup is the basic output position defined above)
626       OldHapticOrient = newHapticOrient; // Records the current haptic device orientation as the old haptic orientation
627
628       if (CurrPlane > PrevPlane) // If (CurrPlane) is more than (PrevPlane) then we are moving away from the position of the force limit
629       {
630         output.x = UR5Position.x - deltaHapticPos.x; // Assign new UR5 position based on old UR5 position and the delta position of haptic device (minus because of haptic device base frame direction)
631         output.y = UR5Position.y - deltaHapticPos.y; // Assign new UR5 position based on old UR5 position and the delta position of haptic device (minus because of haptic device base frame direction)
632         output.z = UR5Position.z + deltaHapticPos.z; // Assign new UR5 position based on old UR5 position and the delta position of haptic device (plus because of haptic device base frame direction)
633         PrevPlane = CurrPlane; // Store current plane as previous for reference.
634       }
635     }
636   }
637   else
638   {
639     if (norm <= 5)
640     {
641       output.x = UR5Position.x - deltaHapticPos.x; // Assign new UR5 position based on old UR5 position and the delta position of haptic device (minus because of haptic device base frame direction)
642       output.y = UR5Position.y - deltaHapticPos.y; // Assign new UR5 position based on old UR5 position and the delta position of haptic device (minus because of haptic device base frame direction)
643       output.z = UR5Position.z + deltaHapticPos.z; // Assign new UR5 position based on old UR5 position and the delta position of haptic device (plus because of haptic device base frame direction)
644       ForceFlag = true;
645     }
646   }
647 }
648 else
649 {
650   UR5Position.set(output.x, output.y, output.z); // Stores the position to be sent to the UR5 into relativePosition Vector (at startup is the basic output position defined above)
651   OldHapticPos.set(newHapticPos.x, newHapticPos.y, newHapticPos.z); // Stores the position of the Omni into relativeHaptic Vector
652   UR5Orient.set(output.rx, output.ry, output.rz); // Stores the orientation to be sent to the UR5 into orientation Vector (at startup is the basic output position defined above)
653   OldHapticOrient = newHapticOrient; // Records the current haptic device orientation as the old haptic orientation
654   if (current.x >= 0.7 && deltaHapticPos.x > 0.0) // If UR5 goes over 700mm in X axis then drive it back to regain control
655   {
656     output.x = UR5Position.x - deltaHapticPos.x;
657   }
658
659   if (current.x <= 0.15 && deltaHapticPos.x < 0.0) // If UR5 goes under 300mm in X axis then drive it forward to regain control
660   {
661     output.x = UR5Position.x - deltaHapticPos.x;
662   }
663
664   if (current.z >= 0.610 && deltaHapticPos.z < 0.0) // If UR5 goes over 610mm in Z-axis then drive it down to regain control
665   {
666     output.z = UR5Position.z + deltaHapticPos.z;
667   }
668
669   if (current.z <= 0.07 && deltaHapticPos.z > 0.0) // If UR5 goes under 100mm in Z-axis then drive it up to regain control
670   {
671     output.z = UR5Position.z + deltaHapticPos.z;
672   }
673
674   if (norm >= 20 && deltaHapticPos.z > 0.0) // If the force limit is exceeded then drive the UR5 up to regain control
675   {
676     output.z = UR5Position.z + deltaHapticPos.z;
677   }
678 }
679 }
680 else
681 {
682 //When the buttons are NOT pressed on the Omni (on startup it comes into here even if holding down a button for some reason):
683 //Set the relative position vector for the UR5 equal to the UR5 position
684 //Set the relative position vector for the haptic device equal to the position read from the haptic device
685 UR5Position.set(output.x, output.y, output.z); // Stores the position to be sent to the UR5 into relativePosition Vector (at startup is the basic output position defined above)
686 OldHapticPos.set(newHapticPos.x, newHapticPos.y, newHapticPos.z); // Stores the position of the Omni into relativeHaptic Vector
687
688 // HER Updated code//////////////
689 UR5Orient.set(output.rx, output.ry, output.rz); // Stores the orientation to be sent to the UR5 into orientation Vector (at startup is the basic output position defined above)
690 OldHapticOrient = newHapticOrient; // Records the current haptic device orientation as the old haptic orientation
691 // HER Updated code//////////////
692 }
693 //printf("RTCI: %f\n", UR5Time);
694 //Perform force torque transfer matrix
695 cVector3d fo(-force.x, -force.y, force.z); //Set the output force based on computed force components for the haptic device
696 double newPower = ((double)PROGRAM_FORCEFEEDBACK)/500; // Set a scaling power for the force feedback to the haptic device
697 fo.mul(newPower); // Multiply force vector by a scalar (newPower)
698 ////if (!PROGRAM_FORCES)
699 ////  fo.set(0, 0, 0);
700 // send computed force to haptic device
701 hapticDevices[i]->setForce(fo); // Apply the force to the haptic device
702
703 // increment counter
704 i++;
705 }

APPENDIX B: SCRIPT CODE

103 while (run_program):
104   # This array contains 6 elements, the pose to which the robot will move
105   global WorkPos = p[axis_1, axis_2, axis_3, axis_4, axis_5, axis_6]
106
107   # Display the WorkPos variable contents (pose) on the teach pendant
108   varmsg("Position", WorkPos)
109
110   # Read Digital input 0 and allocate boolean result to DigIn0
111   DigIn0 = get_digital_in(0)
112
113   # Read Analog input 0 and allocate result to AnIn0
114   global AnIn0 = get_analog_in(0)
115   global AnIn0 = floor((AnIn0 * 10)/2.7)
116   varmsg("AnIn0", AnIn0)
117
118   TCPForce = force()
119   varmsg("FORCE", TCPForce)
120
121   # If Digital input 0 is high then the soft E-Stop has been triggered, therefore terminate program.
122   # Else servo to position (linear in joint-space)
123   if DigIn0 == False:
124     # Kill program execution
125     run_program = False
126   elif TCPForce > 70:
127     # Kill program execution
128     run_program = False
129   else:
130     # Get joint angles from pose information and send to servoj command with time (t) to complete the move
131     servoj(get_inverse_kin(WorkPos), t=0.03)
132   end
133
134   # Uses up the remaining "physical" time Thread_1() has in the current frame
135   sync()
136 end