

Title:
PALPATION TRAINING APPARATUS
Document Type and Number:
WIPO Patent Application WO/2019/243828
Kind Code:
A1
Abstract:
A palpation device (100) comprises a variable stiffness element (130) that is configured to provide a spatially variable stiffness to represent one or more objects to be palpated. The spatially variable stiffness is adjustable such that the one or more objects to be palpated is adjustable. A system (1000) for palpation training is also provided comprising a palpation device (100) and a virtual reality device (200) configured to display a virtual reality environment (700) to a user. The system further comprises a motion sensor (300) to detect the position, gesture and/or motion of the user's hand(s). The virtual reality environment simulates a virtual object to be palpated, and a virtual representation of the user's hand(s). The user can perceive the virtual representation of the user's hand(s) palpating the virtual object when the user palpates the palpation device.

Inventors:
LI DONGYUAN (GB)
Application Number:
PCT/GB2019/051736
Publication Date:
December 26, 2019
Filing Date:
June 20, 2019
Assignee:
HAPTIC ILLUSION LTD (GB)
International Classes:
G09B23/30; G09B23/32
Domestic Patent References:
WO2015089512A12015-06-18
Foreign References:
US20170212589A12017-07-27
US20170303805A12017-10-26
US20020163497A12002-11-07
Other References:
J. OU ET AL.: "jamSheets: Thin interfaces with tunable stiffness enabled by layer jamming", THE PROCEEDINGS OF THE 8TH INTERNATIONAL CONFERENCE ON TANGIBLE, EMBEDDED AND EMBODIED INTERACTION, February 2014 (2014-02-01), pages 65 - 72
Attorney, Agent or Firm:
JOHNSON, Dr Carrie-Anne (GB)
Claims:
CLAIMS

1. A palpation device comprising a variable stiffness element, the variable stiffness element being configured to provide a spatially variable stiffness to represent one or more objects to be palpated; wherein the spatially variable stiffness is adjustable such that the one or more objects to be palpated is adjustable.

2. The device of claim 1, wherein the variable stiffness element is configured to provide one or more first regions having a first stiffness and one or more second regions having a second stiffness, wherein the first stiffness is greater than the second stiffness.

3. The device of claim 2, wherein the stiffness, position, size and/or shape of the one or more first regions and/or the one or more second regions is adjustable, optionally or preferably, in response to a user palpating the device.

4. The device of any of claims 1 to 3, wherein the variable stiffness element comprises an array of individually adjustable variable stiffness elements, optionally or preferably, wherein the variable stiffness element is or comprises a variable stiffness layer.

5. The device of any preceding claim, wherein the variable stiffness element is configured to change its stiffness in response to a pressure or force applied to it.

6. The device of any preceding claim, wherein the variable stiffness element comprises a granular and/or jamming material, and optionally or preferably, wherein the granular and/or jamming material is pressure actuated.

7. The device of claim 6, wherein the variable stiffness element comprises one or more variable stiffness chambers containing the granular and/or jamming material, and optionally or preferably, wherein the or each variable stiffness chamber is configured to change its stiffness in response to a pressure or force applied to it.

8. The device of claim 7, wherein the variable stiffness element further comprises one or more actuators associated with the one or more variable stiffness chambers and configured to apply a pressure to one or more of the variable stiffness chambers.

9. The device of claim 8, wherein the or each actuator is or comprises an inflatable chamber, such that inflation of the or each inflatable chamber changes the stiffness of one or more of the variable stiffness chambers, and optionally or preferably, wherein the variable stiffness chambers form at least a part of a wall of the inflatable chambers.

10. The device of any preceding claim, further comprising one or more tactile stimulus elements to provide tactile stimulus to the user.

11. The device of claim 10, wherein the one or more tactile stimulus elements is configured to provide a tactile stimulus including any one or more of: vibration, movement, temperature, and texture.

12. The device of claim 10 or 11, wherein the one or more tactile stimulus elements is configured to simulate a heartbeat, a pulse, muscular movement, breathing movement and/or a body temperature.

13. The device of any of claims 10 to 12, wherein at least one of the tactile stimulus elements is or comprises:

a movable element configured to move at least a portion of the variable stiffness element in one or more directions, and optionally or preferably, wherein the movable element is driven by a motor;

a heater configured to heat at least a portion of the variable stiffness element; and/or

a vibrating element.

14. The device of any of claims 10 to 13, wherein the one or more tactile stimulus elements are configured to provide tactile feedback to a user in response to the user palpating the device; or the device of any preceding claim further comprising one or more tactile feedback elements configured to provide tactile feedback to a user in response to the user palpating the device.

15. The device of any preceding claim, further comprising a touch interface layer positioned on or over the variable stiffness element for palpating the one or more objects of the variable stiffness element therethrough.

16. The device of claim 15, wherein the touch interface layer comprises a substantially compressible material, optionally or preferably, a foam material.

17. The device of claim 15 or 16, wherein the touch interface layer comprises a flexible outer layer or skin, optionally or preferably, comprising a rubber or plastics material.

18. A system for palpation training comprising:

a palpation device comprising a variable stiffness element configured to provide a spatially variable stiffness to represent one or more objects to be palpated;

a virtual reality device configured to display a virtual reality environment to a user; and a motion sensor configured to detect the position, gesture and/or motion of the user’s hand(s);

wherein the virtual reality environment simulates a virtual object to be palpated positioned according to a corresponding position of the palpation device, and a virtual representation of the user’s hand(s) positioned according to a corresponding position of the user’s hand(s) based on an output from the motion sensor, such that the user can perceive the virtual representation of the user’s hand(s) palpating the virtual object when the user palpates the palpation device, wherein the spatially variable stiffness is adjustable such that the one or more objects to be palpated is adjustable.

19. The system of claim 18, wherein the palpation device is configured to provide tactile feedback to a user in response to the user palpating the palpation device with their hand(s) based on an output from the virtual reality device.

20. The system of claim 19, wherein, when a user’s hand(s) touches, or is touching, the palpation device, the virtual representation of the user’s hand(s) collides with the virtual object in the virtual environment to trigger the palpation device to provide the tactile feedback.

21. The system of claim 19 or 20, wherein the tactile feedback includes any one or more of: stiffness, vibration, movement, temperature, and texture.

22. The system of any of claims 19 to 21, wherein the tactile feedback provided by the palpation device is configured to simulate a heartbeat, a pulse, muscular movement, breathing movement and/or a body temperature.

23. The system of any of claims 19 to 22, wherein the tactile feedback is provided in response to a pressure applied to the palpation device by the user, wherein the pressure applied is determined by the relative positions of the virtual representation of the hand(s) and the virtual object in the virtual environment.

24. The system of any of claims 19 to 23, wherein the stiffness, position, size and/or shape of the one or more objects represented by the variable stiffness element is adjustable, optionally or preferably, user selectable and/or adjustable, optionally in response to a user palpating the device.

25. The system of any of claims 18 to 24, wherein the palpation device is or comprises a palpation device according to any of claims 1 to 17.

Description:
PALPATION TRAINING APPARATUS

Technical Field

This invention relates generally to a palpation training apparatus, particularly, but not exclusively, to a virtual reality based palpation training system including a haptic feedback device.

Background to the Invention

Palpation is a physical examination technique whereby hands and fingers are used in prehensile motions, e.g. grasping and seizing, and/or in non-prehensile motions, e.g. pushing and lifting, to check a body for abnormal sizes, shapes, firmness and/or locations of organs or body parts, or the presence of growths or tumours. A substantial level of skill is required for a person, such as a physician, to successfully assess, detect and/or diagnose problems through palpation.

Various products exist for use in palpation training. Traditional training methods use rubber models of the human body or an organ. Such rubber models provide good shape and stiffness information, but are relatively expensive, and tend to age and need replacing after a period of time or number of uses. Different models with different shapes and properties are needed for different training contexts. For large institutions, a whole range of models may be affordable, but for smaller institutions or clinics, or those in developing countries, the cost of this training method may be prohibitively high. In addition, in the process of palpation, the person or physician does not need to touch all parts of the body for a particular examination or training exercise. Therefore, when a trainee uses a rubber model, most of the model provides only visual feedback/information, and the tactile function of the material in the rest of the model is wasted.

Aspects and embodiments of the present invention have been devised with the foregoing in mind.

Summary of the Invention

The present invention relates to a palpation device and/or a haptic feedback system for clinical palpation training that addresses many of the problems associated with traditional training methods. In particular, the invention is focused on non-prehensile movements of the user’s hand(s) and/or fingers, such as pushing and lifting. Advantageously, the system can be used in various different training situations or contexts, e.g. for palpation of a variety of different bodies, body parts or objects. This can reduce the cost associated with palpation training, in particular the cost associated with purchasing a range of different rubber models for each different training context/task.

The system may utilise a haptic feedback device to represent a key palpation training area of the body and provide tactile feedback to the user. Virtual reality (VR) technology may be used to allow the user to visualise the body or part of the body to be examined/palpated. The VR environment can also provide auditory information and/or feedback to the user. The visual and/or auditory feedback/information provided to the user can be readily changed in the VR environment, allowing the system to be adaptable and used in almost any palpation training context/task. For example, different bodies, or shapes and sizes of body parts such as an abdomen or a breast, can be simulated in the VR environment and viewed by the user.

The haptic feedback device may be a physical interface that the user can touch and interact with. The physical interface may be configurable such that the device can represent a range of different objects to be palpated. The haptic feedback device can provide a range of different tactile stimuli to the user, such as the shape and/or size of an object perceived to be in or on a body, the stiffness or firmness of the object, temperature and vibration. For example, the haptic feedback device may be able to simulate body temperature, muscular movement, breathing movements, a pulse and/or a heartbeat to provide a more realistic, dynamic and interactive training system compared to traditional rubber model systems that are static. In particular, the tactile stimuli can change dynamically in response to the user’s interaction with the haptic feedback device, such as changes in the hand(s) and/or finger position or pressure applied to the device, or the user’s interaction with the VR environment.

Visual information in the VR environment, such as the position of the user’s hand(s) and/or fingers, can be fed back to the haptic feedback device to trigger the tactile feedback and/or stimuli provided to the user. The system may further comprise a motion tracker to track the position of the user’s hand(s) and/or fingers. The motion tracker may be used to generate a virtual model of the user’s hand(s) and/or fingers that can be displayed in the VR environment together with the virtual model of the body or body part. In this way, the user may be able to visualise their hand(s) and/or fingers in the VR environment interacting with the virtual body or body part.

The user’s hand(s) and/or finger positions can be determined and recorded by the motion tracker and fed back to the VR environment in near real-time. When a user’s hand(s) and/or finger touches, or is touching, the haptic feedback device, the virtual model of the user’s hand/fingers will collide with the virtual representation of the body or body part to trigger the haptic feedback device to provide the required tactile feedback, such as a change in surface stiffness or a vibration.
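By way of illustration only, the collision-triggered feedback loop described above could be sketched as follows. This sketch is an editorial aid, not part of the disclosure: the function names, the spherical model of the virtual body part, and the trigger callback are all assumptions.

```python
# Illustrative sketch of the trigger logic described above: when the
# tracked hand position penetrates the virtual object, the haptic
# feedback device is asked to provide feedback (e.g. a stiffness
# change or a vibration). All names here are hypothetical.

def hand_collides(hand_pos, obj_centre, obj_radius):
    """True when the tracked fingertip is inside a spherical virtual object."""
    dist_sq = sum((h - o) ** 2 for h, o in zip(hand_pos, obj_centre))
    return dist_sq <= obj_radius ** 2

def update_feedback(hand_pos, obj_centre, obj_radius, trigger):
    """Invoke the device's feedback trigger on a virtual collision."""
    if hand_collides(hand_pos, obj_centre, obj_radius):
        trigger()
```

In a real system this check would run every tracking frame, with the hand position supplied by the motion tracker in near real-time.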

Through the interaction of tactile, visual and/or auditory information within a VR environment, the invention provides a palpation training system that can be used for a range of different palpation training tasks with improved dynamic tactile feedback and reduced cost compared to traditional training systems.

According to a first aspect of the invention, there is provided a palpation device which may comprise a variable stiffness element. The variable stiffness element may be configured to provide a spatially variable stiffness and/or a spatially variable profile to represent one or more objects to be palpated.

The spatially variable stiffness and/or the spatially variable profile may be adjustable or configurable such that the one or more objects to be palpated is or can be adjustable. For example, the stiffness, position, size and/or shape of the one or more objects represented by the variable stiffness element may be adjustable. The spatially variable profile may be or comprise one or more undulations in or on the variable stiffness element, e.g. in a surface thereof. The spatially variable profile may be or comprise a spatially variable thickness. The spatially variable profile may be or comprise one or more raised regions and/or one or more lowered regions. The spatially variable profile may be or comprise one or more convex regions and/or one or more concave regions. For example, the spatially variable profile may be or comprise one or more surface protrusions and/or one or more surface depressions. The or each protrusion may comprise a thickness and/or extend away from the variable stiffness element to a height. The height and/or thickness of the or each protrusion may be the same or different. The spatially variable profile of the variable stiffness element may be configurable or adjustable. For example, the position, size, shape, width, thickness and/or height of the or each element or surface protrusion or depression may be adjustable. In this way, the variable stiffness layer may be able to simulate a wide range of different objects to be palpated.

The variable stiffness element may be configured to provide one or more first stiffness regions having a first stiffness and one or more second stiffness regions having a second stiffness. The first stiffness may be greater than the second stiffness. The or each first region having an increased stiffness may be surrounded by a second region having a reduced stiffness.

The one or more first and/or second stiffness regions may or may not correspond to the surface profile of the variable stiffness element. At least one of the one or more first and/or second stiffness regions may correspond to at least one of the raised and/or lowered regions (e.g. at least one surface protrusion and/or depression). In one example, at least one of the surface protrusions may correspond to a region of increased stiffness, e.g. first region with a first stiffness.

The spatially variable stiffness of the variable stiffness element may further be configurable or adjustable. For example, the stiffness, position, size and/or shape of the one or more first regions and/or the one or more second regions may be adjustable. Optionally or preferably, the stiffness, position, size and/or shape of any one or more of the one or more first regions and/or the one or more second regions may be adjustable in response to a user palpating the device. In this way, the variable stiffness element may be able to simulate a wide range of different objects with different stiffnesses and shapes and sizes to be palpated.

The variable stiffness element may comprise an array of individually adjustable variable stiffness regions. The variable stiffness element may comprise a plurality or array of variable stiffness elements. Each variable stiffness element may be positioned adjacent to another.
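Purely as an illustration of the array arrangement described above, one hypothetical data model for a grid of individually adjustable stiffness regions is sketched below. The grid layout, the normalised stiffness values, and all names are editorial assumptions, not part of the disclosure.

```python
# Hypothetical model of an array of individually adjustable stiffness
# regions. Stiffness values are normalised (0 = softest, 1 = stiffest);
# this scale is an illustrative assumption.

class VariableStiffnessArray:
    def __init__(self, rows, cols, base_stiffness=0.2):
        # Start with a uniform soft "second region" stiffness everywhere.
        self.grid = [[base_stiffness] * cols for _ in range(rows)]

    def set_region(self, row, col, stiffness):
        """Adjust one element, e.g. to place a stiff first region ('lump')."""
        self.grid[row][col] = stiffness

    def stiffness_at(self, row, col):
        return self.grid[row][col]
```

For example, a stiff first region surrounded by softer second regions could be configured by raising the stiffness of a single element of the grid, and moved or resized between training exercises by updating different elements.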

The variable stiffness element(s) may form a variable stiffness layer.

The variable stiffness element(s) or layer may be configured to change its stiffness, size and/or position in response to a pressure or force applied to it, e.g. in response to a user palpating the variable stiffness element. The force may be applied in one or more directions, e.g. a downward direction (i.e. compression) and/or one or more lateral directions (i.e. manipulation).

The variable stiffness element(s) or layer may comprise a granular and/or a jamming material, e.g. a granular/particle jamming material and/or a layer jamming material. The variable stiffness element(s) or layer and/or jamming material may be pressure actuated. The variable stiffness element may comprise one or more variable stiffness chambers filled with or containing the granular and/or jamming material. The or each variable stiffness chamber may be configured to change its stiffness in response to a movement such as a pressure or force applied to it.

The granular/particle jamming material may be or comprise any granular substance. For example, coffee, couscous, sand, millet, glass or plastic particles. The layer jamming material may be or comprise a laminar sheet structure or a plurality of laminar sheets, i.e. a plurality of sheets stacked on top of each other. The sheets may be formed of or comprise a flexible plastics or polymer material. A pressure or force applied to the granular/particle and/or layer jamming material may reduce the inter particle and/or inter-layer separation, thereby increasing the stiffness of the granular and/or layer jamming material. The layer jamming material may be similar to that described in J. Ou et al. “jamSheets: Thin interfaces with tunable stiffness enabled by layer jamming” (February 2014) in the Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction (pp. 65-72).

The variable stiffness element(s) or layer may comprise one or more actuators associated with the one or more variable stiffness chambers. The one or more actuators may be configured to apply a movement such as a force or pressure to one or more of the variable stiffness chambers. For example, an actuator may be configured to apply a pressure to a single variable stiffness chamber or a plurality of variable stiffness chambers. Additionally or alternatively, a plurality of actuators may be configured and arranged to apply a pressure to a single variable stiffness chamber. The pressure or force applied by the or each actuator may be in a direction substantially opposing a pressure or force applied by a user palpating the variable stiffness chamber.

The or each actuator may be or comprise an inflatable chamber. Inflation of the or each inflatable chamber may change the stiffness of one or more of the variable stiffness chambers. Optionally, a variable stiffness chamber may form at least a part (or all) of a wall of an inflatable chamber.

The pressure or force applied by the actuators and/or the inflation chambers may contribute to the stiffness of the variable stiffness layer.

Alternatively, the variable stiffness element(s) or layer may comprise one or more actuators or inflatable chambers without any variable stiffness chambers filled with a jamming material (i.e. granular or layer). In this way, the actuators or inflatable chambers may themselves provide the spatially variable stiffness of the variable stiffness element(s) or layer. The or each inflatable chamber may, in use, be inflatable with a fluid. The fluid may be or comprise a gas, such as air, nitrogen etc. Alternatively, the fluid may be or comprise a liquid, such as water, oil, or any other hydraulic liquid known in the art.

The or each actuator may be or comprise a gas actuator, hydraulic actuator or a mechanical actuator, or a combination thereof. For example, each actuator may comprise a movable element configured to apply a movement, force or pressure, e.g. to a variable stiffness chamber or another layer adjacent the actuator(s). The actuator(s) may be configured to provide a resistance to movement of a variable stiffness chamber or another layer adjacent the actuator(s). The or each movable element of the actuator may be driven by a motor, linear actuator, hydraulic or gas actuator (e.g. a cylinder and rod arrangement).

The variable stiffness element(s) or layer may be supported on or by a solid or flexible platform or base. For example, the variable stiffness layer may be removably mounted on or attached to the platform or base.

The device may further comprise one or more tactile stimulus elements for providing a tactile stimulus to a user.

The device may be configured to provide tactile feedback to a user through the variable stiffness element(s) and/or the one or more tactile stimulus elements. The tactile feedback may be passive, for example, the user may be provided with tactile feedback through the action of touching and/or palpating the variable stiffness element(s). Any tactile stimulus provided by the one or more tactile stimulus elements may be unchanged by the user touching and/or palpating the device. Additionally or alternatively, the tactile feedback may be active, for example, the user may be provided with tactile feedback in response to a user touching and/or palpating the device (e.g. a change in stiffness of the variable stiffness element(s)). Any tactile stimulus provided by the one or more tactile stimulus elements may change or be triggered by the user touching and/or palpating the device.

The one or more tactile stimulus elements may be configured to provide a tactile stimulus (e.g. to a user) including any one or more of: a vibration, a movement, a pulsing, a temperature, and a texture. The one or more tactile stimulus elements may be configured to simulate a heartbeat, a pulse, muscular movement, breathing movement and/or a body temperature. In this way, the device may be more interactive, versatile, adaptable and provide a more realistic representation of a body or body part to be palpated compared to known devices.

At least one of the tactile stimulus elements may be or comprise a movable element. The moveable element may be configured to move at least a portion of the variable stiffness element(s) or layer in one or more directions. For example, the moveable element may be configured to move the variable stiffness element(s) or layer directly, or indirectly, e.g. by moving the platform or base on which the variable stiffness element(s) or layer is supported. Optionally or preferably, the movable element may be driven by a motor or actuator. The motor or actuator may be electronically controlled. The moveable element may be configured to move at least a portion of the variable stiffness element(s) or layer in response to a user palpating the variable stiffness element(s) or layer.

At least one of the tactile stimulus elements may be or comprise a heater. The heater may be configured to heat at least a portion of the variable stiffness element(s) or layer. The heater may be configured to heat at least a portion of the variable stiffness element(s) or layer to a temperature in the range between room temperature and body temperature, e.g. substantially any temperature in the range 15-40 °C, including standard body temperature 36.5 °C, 37 °C or 37.5 °C. The heater may be configured to heat at least a portion of the variable stiffness element(s) or layer in response to a user palpating the variable stiffness element(s) or layer.

At least one of the tactile stimulus elements may be or comprise a vibrating element. The vibrating element may be configured to provide a vibration in response to a user palpating the variable stiffness element(s) or layer.

The tactile stimulus provided by the or each tactile stimulus element may be static or dynamic. For example, the or each tactile stimulus element may provide a stimulus (constant or dynamic) irrespective of a user touching and/or palpating the device (i.e. passive feedback). Additionally or alternatively, the or each tactile stimulus element may provide a stimulus in response to a user touching and/or palpating the device (i.e. active feedback).

The device may further comprise one or more tactile feedback elements configured to provide tactile feedback to a user in response to the user palpating the device. For example, the tactile feedback element(s) may be in addition to or instead of the tactile stimulus element(s).

The device may further comprise a surface or touch interface layer. The touch interface layer may be positioned on or over the variable stiffness element(s) or layer. For example, the touch interface layer may be provided for palpating the one or more objects of the variable stiffness element(s) or layer through the touch interface layer. The touch interface layer may be formed of or comprise a substantially compressible material. For example, the touch interface layer may be formed of or comprise a foam material. The touch interface layer may be configured to simulate the softness and/or texture of skin, fat and/or tissue of a body or body part.

The touch interface layer may further comprise a flexible outer layer or skin. The flexible outer layer may be formed of or comprise a rubber or plastics material, or any other material e.g. with a texture or feel similar to skin, tissue, fat etc.

The touch interface layer and/or the variable stiffness element(s) or layer may be removable from the device. The touch interface layer may be interchangeable with other touch interface layers. For example, different touch interface layers may be used for different applications, for example to simulate different body parts, such as an abdomen or breast.

The device may further comprise one or more controllers for controlling the variable stiffness element(s) or layer and/or the or each tactile stimulus element. For example, the controller(s) may be configured to control the stiffness of the variable stiffness element(s) or layer through the one or more actuators or inflatable chambers. The controller(s) may be configured to control the movement of the one or more actuators and/or the inflation of the one or more inflatable chambers (e.g. through one or more pumps or fluid/gas sources). The controller(s) may be configured to control the vibration, movement and/or temperature of the or each tactile stimulus element. The controller(s) may be in wired or wireless (e.g. Wi-Fi, Bluetooth, optical) communication with the or each tactile stimulus element and/or the variable stiffness element(s) or layer (e.g. with the actuators).
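As a purely illustrative sketch of such a controller, the mapping from a desired chamber stiffness to an actuator command might look like the following. The linear mapping and the pressure limits are editorial assumptions; no particular control law is disclosed in this application.

```python
# Hypothetical controller sketch: maps a normalised target stiffness for
# a variable stiffness chamber to an inflation pressure for its actuator.
# The linear mapping and the pressure limits are illustrative assumptions.

P_MIN, P_MAX = 0.0, 100.0  # assumed actuator pressure limits (kPa)

def pressure_for_stiffness(target, k_min=0.0, k_max=1.0):
    """Linearly map a target stiffness in [k_min, k_max] to a clamped pressure."""
    frac = (target - k_min) / (k_max - k_min)
    frac = min(max(frac, 0.0), 1.0)  # clamp to the actuator's range
    return P_MIN + frac * (P_MAX - P_MIN)
```

In practice the controller would issue one such command per chamber, allowing the array of chambers to reproduce the spatially variable stiffness requested, for example, by the VR training software.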

The controller may be configured to control the variable stiffness element(s) or layer and/or the or each tactile stimulus element in response to one or more input signals received from an external device.

According to a second aspect of the invention, there is provided a system for palpation training. The system may comprise a palpation device comprising a variable stiffness element or layer configured to provide a spatially variable stiffness to represent one or more objects to be palpated. The system may further comprise a virtual reality device configured to display a virtual reality environment to a user. The system may further comprise a motion sensor configured to detect the position, gesture and/or motion of the user’s hand(s). The virtual reality environment may simulate a virtual object to be palpated positioned according to a corresponding position of the palpation device. The virtual environment may further simulate a virtual representation of the user’s hand(s) positioned according to a corresponding position of the user’s hand(s). The simulated virtual representation of the user’s hands may be based on an output from the motion sensor, such that the user can perceive the virtual representation of the user’s hand(s) palpating the virtual object when the user palpates the palpation device.

The palpation device may be configured to provide tactile feedback to a user in response to the user palpating the palpation device with their hand(s) based on an output from the virtual reality device. The palpation device may be configured to provide tactile feedback to a user in response to the user interacting with the VR environment.

In use, when a user’s hand(s) touches, or is touching, the palpation device, the virtual representation of the user’s hand(s) may interact or collide with the virtual object in the virtual environment to trigger the palpation device to provide the tactile feedback. Alternatively or additionally, the interaction of the user’s hand(s) with the palpation device may itself trigger the palpation device to provide the tactile feedback. The tactile feedback provided by the palpation device may include one or more of: stiffness, vibration, movement, temperature, and texture. The tactile feedback provided by the palpation device may be configured to simulate a heartbeat, muscular movement, breathing movement and/or a body temperature.

The tactile feedback may be provided in response to a pressure applied to the palpation device by the user. The pressure applied may be determined by the relative positions of the virtual representation of the hand(s) and the virtual object within the virtual environment. In this way, pressure sensors and/or additional sensing components in the palpation device may be avoided.
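To illustrate the sensorless pressure determination described above, the applied pressure could be inferred from how far the virtual representation of the hand has penetrated the virtual object's surface, so that no physical pressure sensor is required. The spring-like model and its constant below are editorial assumptions, not a disclosed method.

```python
# Hypothetical sketch of the sensorless pressure estimate described
# above: pressure is inferred from the penetration depth of the virtual
# hand into the virtual object. The linear (spring-like) model and the
# constant k are illustrative assumptions.

def estimated_pressure(hand_depth, surface_depth, k=2.5):
    """Estimate palpation pressure from virtual penetration depth."""
    penetration = hand_depth - surface_depth
    return k * penetration if penetration > 0.0 else 0.0
```

When the virtual hand is above the surface the estimate is zero; once it penetrates, the estimated pressure grows with depth and can be used to scale the tactile feedback provided by the device.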

The stiffness, position, size and/or shape of the one or more objects represented by the variable stiffness element(s) may be adjustable, optionally or preferably, user selectable and/or adjustable in response to a user palpating the device and/or interacting with the VR environment.

The virtual object displayed in the virtual environment may be or comprise a body or a part of a body. The virtual environment may further comprise a graphical user interface (GUI) for the user to interact with. The GUI may be configured with a number of user selectable input fields, for example, to allow the user to select a training exercise and/or a virtual object to be displayed in the virtual environment.

The virtual reality device may be or comprise a head mounted device, such as a headset.

The palpation device of the system may be or comprise a palpation device according to the first aspect.

According to a third aspect of the invention, there is provided a palpation training system comprising the palpation device according to the first aspect. The system may further comprise a virtual reality device configured to display a virtual reality environment to a user. The system may further comprise a motion sensor configured to detect the position, gesture and/or motion of the user’s hand(s). The virtual reality environment may simulate a virtual object to be palpated positioned according to a corresponding position of the palpation device. The virtual environment may further simulate a virtual representation of the user’s hand(s) positioned according to a corresponding position of the user’s hand(s). The simulated virtual representation of the user’s hands may be based on an output from the motion sensor, such that the user can perceive the virtual representation of the user’s hand(s) palpating the virtual object when the user palpates the palpation device.

The palpation device may be configured to provide tactile feedback to a user in response to the user palpating the palpation device with their hand(s) based on an output from the virtual reality device. The palpation device may be configured to provide tactile feedback to a user in response to the user interacting with the VR environment.

According to a fourth aspect of the invention, there is provided a method of palpation training. The method may comprise providing a palpation device having a variable stiffness element or layer, the variable stiffness element or layer being configured to provide a spatially variable stiffness to represent one or more objects to be physically palpated, and providing one or more corresponding virtual objects within a virtual environment to be virtually palpated, wherein the appearance of the one or more virtual objects in the virtual environment is different to the appearance of the palpation device. The method may further comprise providing a virtual representation of a user’s hand(s) within the virtual environment. The method may further comprise physically palpating the palpation device whilst simultaneously viewing the virtual representation of the user’s hand(s) palpating the one or more virtual objects in the virtual environment.

The method may further comprise providing tactile feedback to the user through the palpation device based on the interaction between the virtual representation of the user’s hand(s) and the one or more virtual objects in the virtual environment.

Providing the one or more corresponding virtual objects within a virtual environment may further comprise providing the one or more corresponding virtual objects within a virtual environment positioned according to a corresponding position of the palpation device. Providing the virtual representation of a user’s hand(s) within a virtual environment may further comprise providing the virtual representation of a user’s hand(s) within a virtual environment positioned according to a corresponding position of the user’s hand(s).

The method may further comprise generating the one or more corresponding virtual objects and/or the virtual representation of a user’s hand(s) within the virtual environment.

The method may further comprise tracking the position, gesture and/or motion of the user’s hand(s), and providing the virtual representation of a user’s hand(s) within a virtual environment positioned according to a corresponding position of the user’s hand(s) based on the tracked position, gesture and/or motion of the user’s hand(s).

The method may further comprise providing a graphical user interface (GUI) within the virtual environment, and optionally selecting, by interacting with the GUI, a training exercise and/or the virtual object to be provided in the virtual environment.

The method may further comprise configuring the one or more objects represented by the palpation device, based on the training exercise and/or the virtual object selected by the user. The method may further comprise generating the spatially variable stiffness in the variable stiffness element or layer of the palpation device to represent the one or more objects to be palpated, based on the training exercise and/or the virtual object selected by the user.

The method may further comprise generating a spatially variable profile in the variable stiffness element or layer of the palpation device to represent the one or more objects to be palpated, based on the training exercise and/or the virtual object selected by the user. The method may comprise providing tactile feedback including one or more of: a vibration, temperature change, movement of the variable stiffness element or layer, in response to the user palpating the palpation device, and/or the interaction between the virtual representation of the user’s hand(s) and the one or more virtual objects in the virtual environment.

The method may further comprise providing audio feedback to the user in response to the user palpating the palpation device, and/or the interaction between the virtual representation of the user’s hand(s) and the one or more virtual objects in the virtual environment.

The method may further comprise changing or adjusting the stiffness, size, shape, and/or position of the one or more objects represented by the palpation device in response to the user palpating the palpation device, and/or the interaction between the virtual representation of the user’s hand(s) and the one or more virtual objects in the virtual environment.

The method may be performed using the palpation device and/or the system of the first and second aspects.

Features which are described in the context of separate aspects and embodiments of the invention may be used together and/or be interchangeable. Similarly, where features are, for brevity, described in the context of a single embodiment, these may also be provided separately or in any suitable sub-combination. Features described in connection with the device may have corresponding features definable with respect to the method(s) and these embodiments are specifically envisaged.

Brief Description of Drawings

In order that the invention can be well understood, embodiments will now be discussed by way of example only with reference to the accompanying drawings, in which:

Figure 1 shows a schematic diagram of the system according to an embodiment of the invention;

Figure 2 shows an illustration of a user interacting with the system of figure 1;

Figure 3 shows a haptic feedback device;

Figures 4A and 4B show schematic sectional views of embodiments of the haptic feedback device of figure 3;

Figures 5 and 6 show a portion of a granular jamming air cell in a deflated and an inflated state, respectively;

Figure 7 shows an example variable stiffness layer;

Figure 8 shows an exploded view of the device of figure 3; and

Figures 9 and 10 show methods of using the system of figure 1.

Detailed Description

Figure 1 shows a palpation training system 1000 according to an embodiment of the present invention. The system 1000 comprises a physical interface device 100 and a virtual reality (VR) interface device 200. The physical interface device 100 and the virtual reality (VR) interface device 200 may be in wired or wireless data communication (e.g. using any wireless technology, including Wi-Fi, Bluetooth, optical, etc.).

The VR device 200 may be or comprise a head mounted device configured to display a VR environment to the user 500. The VR device 200 allows the user to visualise a body or a part of the body to be examined/palpated, or being examined/palpated, in the VR environment. The physical interface device 100 may be or comprise a haptic feedback device 100. The physical interface device 100 may be configured to provide tactile feedback or stimuli to a user 500, e.g. in response to the user 500 interacting with the haptic feedback device 100, in particular with their hand(s) and/or fingers, and/or in response to the user 500 interacting with the VR environment. The device 100 may be configured to represent a key palpation training area of the body.

The VR device 200 may comprise a display (not shown) or video screen configured, in use, to be positioned in front of the user’s eyes for displaying the VR environment to the user. Alternatively, the VR device 200 may be configured to project the VR environment onto a screen positioned, in use, in front of the user. For example, the VR device 200 may be or comprise any known VR headset technology.

The system 1000 may be configured to display a plurality of different simulated VR environments to the user 500. For example, the VR environment may comprise a virtual representation of a body or part of a body, or an organ. The virtual representation may take various forms, e.g. size, shape, age, gender. The VR environment may further comprise a virtual representation of the user’s hand(s). The system 1000 may be configured to change the position of the virtual representation of the user’s hand(s) in the VR environment in response to a change in the position of the user’s hand(s). In addition, the point of view of the user in the VR environment may further be responsive to the position and/or direction of the user’s head. In this way, the user 500 may be able to look around and/or scan the VR environment, making the training experience more realistic.

The system 1000 may further comprise a motion tracking device 300. The motion tracking device 300 may be configured to detect the position and/or movement of the user’s hand(s). The motion tracking device 300 may be further configured to detect the position and/or orientation of the user’s head. The motion tracking device 300 may further be configured to detect the position/location of the device 100 (e.g. using known VR device location and identification technology, such as image identification). The motion tracking device 300 may be configured to record position and/or tracking data of the user’s hand(s), head and/or the device 100 and transmit the tracking data to the VR device 200 for generating and displaying the virtual representation of the user’s hand(s), the virtual representation of the body or body part and/or adjusting the point of view of the user in the virtual environment.

The motion tracking device 300 may be or comprise any motion tracking device known in the art that is suitable for hand and/or finger tracking. For example, the motion tracking device 300 may be or comprise a commercial motion tracking device, such as a Leap Motion device. The motion tracking device 300 may be integrated, integratable with, or attachable to the VR device 200. Alternatively, the motion tracking device 300 may be separate from the VR device 200.

The VR environment may be generated by software executed on a computing device 400. The VR device 200 may comprise the computing device 400. Alternatively, the computing device 400 may be separate from the VR device 200. The motion tracking device 300 may be in wired or wireless communication (e.g. wifi, Bluetooth, optical) with the computing device 400. The computing device 400 may be configured to receive tracking data from motion tracking device 300 for generating the virtual representation of the user’s hands in the VR environment.

The virtual representation displayed may depend on the type of training exercise. In one example, the user may be able to choose or select the type of training exercise within the VR environment. For example, the user may be presented with a graphical user interface (GUI) with a number of selectable options or data entry fields within the VR environment. When the virtual representation of the user’s hand(s) collides with the GUI, one or more different exercises, options or input commands may be selected. The selection of the one or more different exercises, options or input commands may trigger a specific training exercise to be initiated or type of virtual representation to be generated. For example, the selection of the one or more different exercises, options or input commands from the GUI may send a signal to the computing device 400 to change the VR environment.
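The collision-based GUI selection described above can be illustrated with a simple hit test of the tracked fingertip against button bounding boxes. This is a hedged sketch only: the button names, coordinates and data structures are hypothetical and do not appear in the application:

```python
# Hypothetical GUI buttons: name -> axis-aligned bounding box (min, max corners)
# in VR-environment coordinates (metres).
BUTTONS = {
    "deep_palpation":  ((0.0, 0.0, 0.0), (0.1, 0.1, 0.02)),
    "light_palpation": ((0.0, 0.2, 0.0), (0.1, 0.3, 0.02)),
}

def select_exercise(fingertip):
    """Return the name of the first GUI button the virtual fingertip
    collides with, or None if no button is touched."""
    for name, (lo, hi) in BUTTONS.items():
        if all(l <= p <= h for p, l, h in zip(fingertip, lo, hi)):
            return name
    return None
```

A fingertip tracked inside a button's box selects the corresponding exercise; the selection event would then signal the computing device 400 to change the VR environment, as the paragraph above describes.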

The position of the virtual representation of the body or body part in the VR environment may correspond (or be perceived to correspond) to the actual position of the device 100 (e.g. based on the tracking data). Similarly, the position of the virtual representation of the user’s hand(s) in the VR environment may correspond (or be perceived to correspond) to the actual position of the user’s hand(s). For example, when a user’s hand(s) touches, or is touching, the device 100, the virtual representation of the user’s hand(s) may collide with the virtual representation of the body or body part in the VR environment.

The device 100 may be configured to provide tactile feedback or stimuli to a user 500 in response to the virtual representation of the user’s hand(s) interacting or colliding with the virtual representation of the body or part of the body. For example, collision between the virtual representation of the user’s hand(s) and the virtual representation of the body or body part in the VR environment may trigger a signal to be sent to the device 100 to provide the tactile feedback.
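The triggering signal could take the form of a small command message sent from the VR software to the device. The following is an illustrative sketch under stated assumptions: the application only says a collision triggers a signal, so the field names and the JSON wire format here are invented for the example:

```python
import json

def feedback_command(region, kind="vibration", intensity=0.5, duration_ms=200):
    """Encode a tactile-feedback command for the palpation device.

    region      -- e.g. an index identifying a stiffness cell (130a..130c)
    kind        -- assumed feedback types: vibration / movement / temperature
    All field names and the JSON encoding are illustrative assumptions.
    """
    return json.dumps({
        "region": region,
        "kind": kind,
        "intensity": float(intensity),
        "duration_ms": int(duration_ms),
    })
```

On collision, the VR side would emit such a message over the wired or wireless link and the device's controller would act on it.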

The tactile feedback may comprise a vibration, a movement of the device 100, a change in the stiffness of the device 100 and/or a change in the temperature of the device 100. For example, the device 100 may be configured to provide a vibration in response to the user 500 palpating the correct area or the incorrect area. Alternatively or additionally, the device 100 may be configured to provide a vibration periodically to simulate a pulse or heartbeat. The temperature of the device 100 may be adjusted to be close to body temperature, e.g. in the range 35-40 °C. The device 100 may further be configured with one or more movable elements to simulate muscular movement and/or breathing movement of a body. Further details of the tactile feedback will be described below.

Figure 2 shows an illustration of a user 500 interacting with the system 1000. In this example, the user 500 is touching a device 100 with both hands 550. The user 500 is shown wearing a VR device 200 (a VR headset) and the VR environment 700 that is displayed to the user 500 through the VR device 200 is shown in the display of a computing device 400. The VR environment 700 shows a virtual representation of the user’s hands 550’ interacting with a virtual representation of a body 600.

Figures 3, 4A and 4B illustrate a device 100 according to an embodiment of the invention. A touch interface layer 120 may be provided on a surface of the device 100, e.g. the top of the device 100, as shown in figure 3. Optionally or preferably, the outer layer 120b of the touch interface layer 120 may be configured to provide colour contrast against a user’s hand for the motion tracking device 300 to reliably detect and track the user’s hand(s). For example, the outer layer 120b of the touch interface layer 120 may be coated with or comprise a dark material or colour. Additionally or alternatively, the outer layer 120b may be representative of a patient’s skin.

The device 100 comprises a variable stiffness element or layer 130, as shown in figures 4A and 4B. The variable stiffness layer 130 may be configured to provide a range of different stiffnesses that the user 500 can feel. The variable stiffness layer 130 may be further configured to provide a spatially variable or varying stiffness in order to represent one or more palpable objects 130a, 130b, 130c. For example, the variable stiffness layer 130 may be configured to provide one or more first regions 130a, 130b, 130c having a first stiffness and one or more second regions having a second stiffness, wherein the first stiffness is greater than the second stiffness.

In addition, the variable stiffness layer 130 may be further configured to provide a spatially variable or varying profile or thickness to further represent one or more palpable objects 130a, 130b, 130c. For example, each of the one or more first regions of increased stiffness may be associated with a region of increased thickness (e.g. to represent a tumour). Alternatively, a region of increased thickness may comprise one or more first regions and/or second regions. In this way, the device 100 may be able to generate a variety of palpable objects having different properties, such as size, shape, thickness and stiffness.

The stiffness, position, size and/or shape of the one or more first regions 130a, 130b, 130c and/or the one or more second regions may be adjustable (e.g. in response to a user palpating the device or by a user selecting a specific training exercise or type of body/body part). Alternatively or additionally, the spatially varying profile of the variable stiffness layer 130 may be adjustable (e.g. in response to a user palpating the device or by a user selecting a specific training exercise or type of body/body part). As shown in figures 4A and 4B, the variable stiffness layer 130 may be or comprise an array of individually adjustable variable stiffness regions 130a, 130b, 130c. The stiffness and/or the thickness of each region may be adjusted individually.
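Configuring such an array can be pictured as writing a per-cell stiffness map, where one stiff cell represents the palpable object against a soft baseline. A minimal sketch, assuming a rectangular grid of cells and normalised stiffness values (none of which are specified in the application):

```python
def stiffness_map(rows, cols, obj_row, obj_col, obj_stiffness, base=0.1):
    """Build a per-cell target stiffness map for the variable stiffness layer.

    One cell is raised to obj_stiffness to represent a palpable object;
    all other cells keep a soft baseline. Values normalised 0..1 (assumed).
    """
    grid = [[base] * cols for _ in range(rows)]  # independent row lists
    grid[obj_row][obj_col] = obj_stiffness
    return grid
```

Each map entry would then be translated by the controller into an actuation level (e.g. an inflation pressure) for the corresponding cell.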

The variable stiffness layer 130 may be configured to change its stiffness and/or thickness in response to a pressure or force applied to it. In one embodiment, the variable stiffness layer 130 comprises one or more inflatable chambers 134. In this case, inflating the inflatable chambers 134 (e.g. via tubing 138) increases the stiffness and/or thickness of the variable stiffness layer 130.

Alternatively or additionally, the variable stiffness layer 130 may comprise a granular jamming material 133. The granular jamming material 133 may be pressure actuated. The variable stiffness layer 130 may comprise one or more variable stiffness chambers 132 filled with the granular jamming material 133 and be configured to retain the granular jamming material 133 within the variable stiffness chambers 132. The or each variable stiffness chamber 132 may be configured to change its stiffness in response to a pressure or force applied to it. The variable stiffness layer 130 may further comprise one or more actuators 134 associated with the one or more variable stiffness chambers 132 and configured to apply a pressure to one or more of the variable stiffness chambers 132. The or each actuator 134 may be mechanical, pneumatic or hydraulic.

The or each actuator 134 may be or comprise an inflatable chamber 134, such that inflation of the or each inflatable chamber 134 changes the stiffness of one or more of the variable stiffness chambers 132. The variable stiffness chamber 132 may form at least a part of a wall of the inflatable chamber 134. Alternatively, the variable stiffness chambers 132 and the inflatable chambers 134 may be separate.

The or each inflatable chamber and/or the or each variable stiffness chamber may be formed from or comprise a substantially flexible plastics material, such as polyvinyl chloride (PVC).

Figures 5 and 6 show an example of a variable stiffness layer 130 comprising a variable stiffness chamber 132 and an inflatable chamber 134 in a deflated and inflated state, respectively. In the deflated state, the pressure applied to the variable stiffness chambers 132 is low and the granular jamming material 133 within the variable stiffness chamber 132 is relatively free to move, resulting in low stiffness. In the inflated state, pressure is applied to the variable stiffness chamber 132 causing the granular jamming material 133 to jam together and increase the stiffness of the variable stiffness chamber 132. This inflated state may also provide a region of the variable stiffness layer 130 with increased thickness compared to the deflated state. In a specific example, pressure is applied to a bottom surface of the variable stiffness chamber 132 and a top surface of the variable stiffness chamber 132 comprises one or more perforations or holes (not shown) to allow air within the chamber 132 to escape when pressure is applied to the bottom surface. The perforations/holes are sufficiently small such that the granular jamming material 133 cannot escape. This allows the interior volume of the variable stiffness chamber 132 to reduce upon application of pressure (or inflation of chamber 134), which in turn increases the density of the granular jamming material 133 and the stiffness of the variable stiffness layer 130.
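The stiffening mechanism follows from a simple density argument: the grain mass is fixed, so venting air through the perforations and shrinking the chamber volume raises the packing density until the grains jam. The numbers below are purely illustrative, not taken from the application:

```python
def packing_density(grain_mass_g, chamber_volume_cm3):
    """Bulk density of the granular fill (g/cm^3); jamming onset
    correlates with increasing packing density."""
    return grain_mass_g / chamber_volume_cm3

# Deflated state: e.g. 120 g of grains loosely filling a 150 cm^3 chamber.
loose = packing_density(120.0, 150.0)   # 0.8 g/cm^3, grains free to move
# Inflating chamber 134 squeezes the cell to 100 cm^3 (air escapes through
# the perforations, the grains cannot), increasing density and stiffness.
jammed = packing_density(120.0, 100.0)  # 1.2 g/cm^3, grains jammed
```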

The variable stiffness layer 130 may comprise an array of individually adjustable inflatable chambers 134, as shown in figures 4A and 4B. Each inflatable chamber 134 may provide pressure to one or more variable stiffness chambers 132. Although three inflatable chambers 134 and variable stiffness chambers 132 are shown in figures 4A and 4B, it will be appreciated that the variable stiffness layer 130 may comprise any number of inflatable chambers 134 and variable stiffness chambers 132. The variable stiffness layer 130 may be supported by a solid or flexible platform or base 136. For example, the variable stiffness layer 130 may be removably mounted on or attached to the platform or base 136.

The granular jamming material 133 may be or comprise any granular substance, for example coffee, couscous, sand, millet, or glass or plastic particles.

As shown in figure 3, the device 100 may further comprise a touch interface layer 120 for the user 500 to touch and/or palpate the one or more objects 130a, 130b, 130c through. The touch interface layer 120 may comprise a substantially compressible material 120a, such as a foam material. In one example, the compressible material 120a is or comprises a high density memory foam. The touch interface layer 120 may further comprise an outer layer 120b, such as a sheet of rubber or plastics material. In one example, the outer layer 120b is or comprises a silicone sheet. The outer layer 120b may provide a smooth surface or a textured/profiled surface, for example, to simulate the smoothness and/or texture of skin and any surface features such as a nipple or navel.

The touch interface layer 120 may be removable from the device 100, for example to exchange one touch interface layer 120 with another. This is illustrated in the devices 100 shown in figures 4A and 4B, which are otherwise identical apart from the shape of the touch interface layer 120. Specifically, in the example of figure 4A, the touch interface layer is substantially planar or rectangular in cross-section, which may be suitable for representing certain body parts, such as an abdomen. In the example of figure 4B, the touch interface layer 120 is substantially curved in cross-section which may, for example, be suitable for representing a breast or part of a chest.

The device 100 may further comprise one or more tactile feedback elements 140, 150, 160 to provide a tactile stimulus to the user 500, e.g. in response to the user palpating the device 100 and/or in response to the user 500 interacting with the VR environment 700 (e.g. through the virtual representation of the user’s hand(s) 550’ interacting with one or more virtual objects 600 and/or the GUI). The one or more tactile feedback elements may be or comprise one or more heaters 140. The one or more heaters 140 may be arranged to heat at least a portion of the variable stiffness layer 130 and/or the touch interface layer 120. The one or more heaters 140 may be arranged to provide a uniform temperature across the variable stiffness layer 130 and/or the touch interface layer 120, or a non-uniform temperature across the variable stiffness layer 130 and/or the touch interface layer 120. For example, the one or more heaters 140 may be arranged to provide a localised temperature change (i.e. a region of higher or lower temperature) corresponding to the position or area of the one or more palpable objects 130a, 130b, 130c. In this way, the temperature of the device 100 may be adjusted uniformly or non-uniformly (e.g., to be close to body temperature), to provide a more realistic training experience.

Preferably, the or each heater 140 may be capable of changing its output temperature rapidly (i.e. increasing or decreasing it). Further, the or each heater 140 may provide an output sufficient to heat the touch interface layer 120 to a temperature in the range 15°C to 40°C, preferably 35°C to 40°C. For example, the or each heater 140 may be or comprise a thermoelectric heater.

The one or more tactile feedback elements 140, 150, 160 may be or comprise one or more moveable elements 160. The one or more moveable elements 160 may be arranged and configured to move at least a portion of the variable stiffness layer 130 and/or the touch interface layer 120 in one or more directions. In one example, the movable element 160 is driven by a motor 165, as shown in figures 4A and 4B. However, it will be appreciated that various other means (e.g. mechanical, pneumatic or hydraulic actuators or otherwise) to move the moveable element 160 may be implemented.

In the example shown in figures 4A and 4B, the moveable element 160 comprises an arm 164 coupled to a motor 165 at one end and a platform 162 at the other end. The platform 162 may be used to transfer the movement of the arm 164 to the variable stiffness layer 130 and/or the touch interface layer 120. In use, the moveable element 160 can push or pull, raise or lower the variable stiffness layer 130 and/or the touch interface layer 120 in a generally upwards/downwards direction, e.g. to simulate muscular movement or breathing movement of a body. Additionally or alternatively, the moveable element 160 may further be configured to move the variable stiffness layer 130 and/or the touch interface layer 120 in a sideways direction (not shown).

Although only one moveable element 160 is shown in figures 4A and 4B, it will be appreciated that the device 100 may comprise multiple moveable elements 160. In this case, each moveable element 160 may be configured to move a different area of the variable stiffness layer 130 and/or the touch interface layer 120.

The one or more tactile feedback elements 140, 150, 160 may be or comprise one or more vibrating elements 150. In one example, the or each vibrating element 150 is a vibration motor, such as a flat coin vibration motor, an eccentric rotating mass motor or a linear resonant actuator motor. The device 100 may be configured to provide a vibration in response to the user 500 palpating the device 100, e.g. in the correct area or the wrong area. Alternatively or additionally, the device 100 may be configured to provide a vibration periodically to simulate a pulse or heartbeat.
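The periodic heartbeat vibration can be scheduled straightforwardly from a target heart rate. A hedged sketch, with an assumed beats-per-minute parameterisation that the application does not specify:

```python
def pulse_onsets(bpm, duration_s):
    """Onset times (seconds) at which to fire a short vibration pulse,
    simulating a heartbeat at the given rate over the given duration."""
    interval = 60.0 / bpm          # seconds between beats
    times, t = [], 0.0
    while t < duration_s:
        times.append(round(t, 3))  # round to ms for a stable schedule
        t += interval
    return times
```

At each onset the controller would briefly drive the vibrating element 150, with pulse strength or frequency varied to convey different information as described above.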

The device 100 may further comprise a controller 170 in data communication with the one or more tactile feedback elements 140, 150, 160 and the variable stiffness layer 130 to control the tactile feedback provided by the device 100. The controller 170 may be configured to receive one or more input signals from the VR device 200 and motion tracking device 300 in order to provide tactile feedback in response to the user 500 interacting with the visual information in the VR environment 700. For example, the controller 170 may be configured to control the stiffness of the variable stiffness element(s) or layer 130 through the one or more actuators 134 or inflatable chambers 134. The controller may be configured to control the movement of the one or more actuators 134 and/or the inflation of the one or more inflatable chambers 134 (e.g. through one or more pumps or fluid/gas sources, not shown).
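The controller's role as a dispatcher between incoming VR events and the feedback elements can be sketched as follows. This is an illustrative skeleton only: the event schema and handler names are assumptions, and a log list stands in for driving real actuators, heaters and motors:

```python
class PalpationController:
    """Illustrative event dispatch for controller 170 (names assumed)."""

    def __init__(self):
        self.log = []  # stand-in for real hardware outputs

    def set_stiffness(self, cell, value):
        # In hardware this would adjust an actuator/inflatable chamber 134.
        self.log.append(("stiffness", cell, value))

    def vibrate(self, intensity):
        # In hardware this would drive a vibrating element 150.
        self.log.append(("vibrate", intensity))

    def handle(self, event):
        """Route an event from the VR device / motion tracker to feedback."""
        kind = event["kind"]
        if kind == "collision":
            self.vibrate(event.get("intensity", 0.5))
        elif kind == "configure":
            for cell, value in event["cells"]:
                self.set_stiffness(cell, value)
```

A "configure" event would correspond to a user selecting a training exercise (reshaping the variable stiffness layer), while a "collision" event corresponds to the virtual hands touching the virtual body.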

The device 100 may further comprise a housing 110 to hold or support the variable stiffness element(s) or layer 130, the one or more tactile feedback elements 140, 150, 160 and the controller 170, as shown in figures 3, 4A and 4B.

Figures 7 and 8 show further illustrations of an example device 100. Figure 7 shows an example device 100 with the touch interface layer 120 removed revealing a variable stiffness layer 130 comprising a 3 x 3 array of granular jamming air cells 132, 134. Although a 3 x 3 array is shown, it will be appreciated that any sized array of granular jamming air cells 132, 134 may be implemented. Figure 8 shows an exploded view of an example device 100 showing the touch interface layer 120, the variable stiffness layer 130, the controller 170 and the tactile stimulus elements 140, 150, 160.

Figure 9 shows an example method 2000 of using the system 1000. In step S1, the user 500 engages the system 1000. This may involve a user 500 sitting in front of the palpation device 100 and wearing the VR device 200, such that the VR environment 700 is displayed to the user 500. The method 2000 may proceed with a number of steps performed by the user 500 within the VR environment 700, with corresponding actions being performed in reality by the user 500 and/or the palpation device 100. In step S2, the user 500 may be introduced to objects of different stiffness. This may involve the user 500 being presented, within the VR environment, with one or more virtual objects 600 having different stiffnesses (e.g. each object may be marked with a stiffness indicator). One or more corresponding physical objects 130a, 130b, 130c may be generated by the palpation device 100 for the user 500 to touch, feel and/or palpate. For example, one or more regions 130a, 130b, 130c of increased stiffness and/or thickness may be generated in the variable stiffness layer 130 of the palpation device 100. In step S3, the user 500 may proceed to touch and palpate the or each object 130a, 130b, 130c with their hand(s) whilst simultaneously viewing the interaction of the virtual hand(s) 550’ with the virtual objects 600 within the VR environment 700. In this way, the VR environment provides visual information and feedback to the user 500, whilst the palpation device 100 provides tactile feedback to the user 500.

In step S4, the user may select a training task to start. The user 500 may be presented with a GUI within the VR environment 700 having one or more selectable data input fields. The user may select a data input field using the virtual representation of the hand(s). For example, a collision between the virtual hand(s) and a virtual data input field may trigger selection. Additionally or alternatively, data may be input to the system 1000 orally, i.e. via voice commands. The training task may comprise a type of body or body part to palpate or a type of abnormality within a body or body part to palpate (e.g. a type of growth or organ). In response to the data input/task selected by the user 500 within the VR environment 700, the palpation device 100 may generate one or more corresponding physical objects 130a, 130b, 130c in the palpation device 100 for the user 500 to touch, feel and/or palpate. For example, one or more specific regions 130a, 130b, 130c of increased stiffness and/or thickness may be generated in the variable stiffness layer 130 of the palpation device 100. In step S5, the user 500 may touch and/or palpate the palpation device 100 to execute the training process. Visual feedback may be provided to the user 500 within the VR environment to guide and/or teach the user the correct areas to palpate and/or correct hand gestures/movements to use. This may be accompanied by auditory feedback. At the same time, tactile feedback may be provided by the palpation device 100. For example, the palpation device 100 may change the stiffness of the object being palpated in response to a pressure or force being applied. The device 100 may provide a vibration to indicate when the user 500 is palpating the correct or incorrect area, or using the correct or incorrect movements. Vibrations of differing strength or frequency may be used to indicate different information. The device 100 may provide periodic short vibrations to simulate a pulse or heartbeat.
The device 100 may further periodically move the variable stiffness layer 130 and/or touch interface layer 120 to simulate muscle movement and/or chest movement, e.g. from breathing. At the same time, the temperature of the device 100 (e.g. the touch interface and/or the variable stiffness layer) may be set or adjusted to a body temperature, or changed in response to certain user actions.

In step S6, the user 500 may be presented with visual and/or auditory information to indicate the outcome or results of the training. For example, the accuracy of the palpation, correctness of hand movements, successful assessment and location of an object, and/or progress (e.g. by comparing to previous training results) may be provided to the user 500. The device 100 may then reset itself for the next training task.
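As a minimal sketch of the feedback logic described above (not taken from the application), different training events could be mapped to vibration cues of differing strength and frequency. The event names and cue values below are purely illustrative.

```python
def vibration_for(event):
    """Map a training event to an illustrative (frequency_hz, amplitude, duration_s) cue.

    Hypothetical mapping: stronger/faster vibrations convey warnings, while
    short repeated pulses can simulate a heartbeat, as the description suggests.
    """
    cues = {
        "correct_area":   (50.0, 0.3, 0.2),   # gentle confirmation buzz
        "incorrect_area": (150.0, 0.8, 0.5),  # stronger, longer warning
        "heartbeat":      (1.2, 0.5, 0.1),    # short pulse repeated at ~72 bpm
    }
    return cues.get(event)  # None for events with no tactile cue
```

In a real device these tuples would drive a vibration actuator; the point of the sketch is only that distinct information channels can be encoded in distinct vibration parameters.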

Figures 10A to 10C show further steps that may be involved for different training tasks, in this example deep palpation, light palpation and percussion. Such steps may, for example, correspond to steps S4 to S6 of figure 9.

At step S4’, a type of training is selected. Introductory video and/or auditory information relating to the training task may be provided to the user within the VR environment 700. Following selection of deep palpation, light palpation or percussion training, one or more virtual objects corresponding to a body or body part are generated and displayed in the VR environment 700, and corresponding objects with increased stiffness at specific positions are generated in the variable stiffness layer 130 of the palpation device 100. This may be achieved by increasing the stiffness and/or thickness of specific regions 130a, 130b, 130c of the variable stiffness layer 130.

In step S5’, the user 500 touches and/or palpates the one or more objects (which may be marked in the VR environment 700). Visual feedback may be provided to the user 500 if the hand gesture/movement is correct or incorrect (e.g. using the motion tracking device 300). The user 500 may report a position of the object(s), e.g. through one or more data input fields of the GUI, and receive an indication (visual, audio or tactile) of whether the reported position is correct or incorrect. The user 500 may further examine the size, shape and/or stiffness of the one or more objects and input corresponding information to the system, e.g. through the GUI. The above steps/process may be repeated until the user completes the training task, such as by indicating the correct position, size, shape and/or stiffness of the object being palpated.

At step S6’, the user 500 may be presented with visual and/or auditory information to indicate the outcome or results of the training.
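The repeat-until-correct loop of step S5’ can be sketched as follows. This is an illustrative outline only: the `run_training` function, the report dictionaries and the success criterion (an exact match of position, size and stiffness) are all assumptions, not details from the application.

```python
def run_training(target, reports):
    """Compare successive user reports against the generated object; stop on a match.

    Returns (succeeded, attempts_used), mirroring the idea that the task repeats
    until the user indicates the correct position, size, shape and/or stiffness.
    """
    for attempt, report in enumerate(reports, start=1):
        if report == target:
            return True, attempt   # task completed on this attempt
    return False, len(reports)     # no correct report was given


# Hypothetical target object generated in the variable stiffness layer,
# followed by one incorrect report and then a correct one.
target = {"position": (2, 3), "size": 2, "stiffness": "firm"}
ok, attempts = run_training(
    target,
    [{"position": (1, 1), "size": 2, "stiffness": "firm"}, target],
)
```

The attempt count gathered here is the kind of data that could feed the results summary of step S6’ (e.g. progress compared to previous training sessions).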

From reading the present disclosure, other variations and modifications will be apparent to the skilled person. Such variations and modifications may involve equivalent and other features which are already known in the art, and which may be used instead of, or in addition to, features already described herein.

Although the appended claims are directed to particular combinations of features, it should be understood that the scope of the disclosure of the present invention also includes any novel feature or any novel combination of features disclosed herein either explicitly or implicitly or any generalisation thereof, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention.

Features which are described in the context of separate embodiments may also be provided in combination in a single embodiment. Conversely, various features which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination.

For the sake of completeness it is also stated that the term "comprising" does not exclude other elements or steps, the term "a" or "an" does not exclude a plurality, and any reference signs in the claims shall not be construed as limiting the scope of the claims.