Title:
SYSTEM AND METHOD FOR MONITORING DRUG DELIVERY
Document Type and Number:
WIPO Patent Application WO/2024/042324
Kind Code:
A1
Abstract:
A system for monitoring delivery of a drug to a subject by a drug delivery device, the system comprising: a data capture device, separate from the drug delivery device, configured to obtain sensory data about the drug delivery device, when the drug delivery device is in use; a processor configured to process the sensory data to determine a state of the drug delivery device.

Inventors:
HUMPHRIES MARK ROBSON (GB)
HUDSON-FARMER KATE (GB)
HALL SIMON (GB)
PATEL MAYUR (GB)
Application Number:
PCT/GB2023/052196
Publication Date:
February 29, 2024
Filing Date:
August 23, 2023
Assignee:
PA KNOWLEDGE LTD (GB)
International Classes:
G16H40/63; G16H20/17; A61M5/20
Domestic Patent References:
WO2017147202A12017-08-31
Foreign References:
US20210280291A12021-09-09
US20200312437A12020-10-01
US20220122698A12022-04-21
US20200381106A12020-12-03
Attorney, Agent or Firm:
J A KEMP LLP (GB)
Claims:
CLAIMS

1. A system for monitoring delivery of a drug to a subject by a drug delivery device, the system comprising: a data capture device, separate from the drug delivery device, configured to obtain sensory data about the drug delivery device, when the drug delivery device is in use and when the data capture device is separated from the drug delivery device; a processor configured to process the sensory data to determine a state of the drug delivery device.

2. The system of claim 1, wherein the processor is further configured to compare the determined state of the drug delivery device to a desired state.

3. The system of claim 2 wherein the processor is configured to determine an action to be taken to change the determined state to the desired state.

4. The system of claim 2 or 3, comprising a user interface configured to communicate information relating to the desired state and/or determined action to a user of the drug delivery device.

5. The system of claim 4, wherein the user interface comprises a screen configured to display a real-time image of the drug delivery device with the information relating to the desired state and/or determined action visually overlaid.

6. The system of any one of claims 2 to 5, wherein the desired state is a next state in a predetermined sequence of states.

7. The system of claim 6, wherein the predetermined sequence of states depends on the type of drug delivery device determined by the processor.

8. The system of any preceding claim, wherein the processor is configured to continuously re-determine the state of the drug delivery device.

9. The system of claim 8, wherein the processor is configured to continuously re-determine the state of the drug delivery device until a desired end state is determined.

10. The system of any preceding claim, wherein the drug delivery device is an auto-injector.

11. A method of monitoring delivery of a drug to a subject by a drug delivery device, the method comprising: obtaining sensory data about the drug delivery device using a data capture device separate from the drug delivery device, when the drug delivery device is in use; processing the sensory data to determine a state of the drug delivery device.

12. The method of claim 11, further comprising comparing the determined state of the drug delivery device to a desired state.

13. The method of claim 12, further comprising determining an action to be taken to change the determined state to the desired state.

14. The method of claim 12 or 13, comprising communicating information relating to the desired state and/or determined action to a user of the drug delivery device.

15. The method of claim 14, comprising displaying, on a screen, a real-time image of the drug delivery device with the information relating to the desired state and/or determined action visually overlaid.

16. The method of any one of claims 11 to 15, wherein the desired state is a next state in a predetermined sequence of states.

17. The method of claim 16, wherein the predetermined sequence of states depends on the type of drug delivery device determined by the processor.

18. The method of any one of claims 11 to 17, comprising continuously re-determining the state of the drug delivery device.

19. The method of claim 18, comprising continuously re-determining the state of the drug delivery device until a desired end state is determined.

20. The method of any one of claims 11 to 18, wherein the drug delivery device is an auto-injector.

Description:
SYSTEM AND METHOD FOR MONITORING DRUG DELIVERY

TECHNICAL FIELD

The present disclosure relates to a system and method for monitoring delivery of a drug to a subject by a drug delivery device.

BACKGROUND ART

Drug delivery devices, particularly those that inject a drug into a patient, are often used by people without medical training, e.g. self-administering a drug or administering a drug to another person in their care. This can result in ineffective or incomplete drug delivery.

Auto-injectors are one example of such drug delivery devices, and are specifically designed for untrained users. However, even these devices may be misused, for example a user may lift the device too early such that only a partial dose is administered.

The present disclosure aims to at least partially solve the above problems.

SUMMARY OF THE INVENTION

According to an aspect of the disclosure there is provided a system for monitoring delivery of a drug to a subject by a drug delivery device, the system comprising: a data capture device, separate from the drug delivery device, configured to obtain sensory data about the drug delivery device, when the drug delivery device is in use; a processor configured to process the sensory data to determine a state of the drug delivery device.

Optionally, the processor is further configured to compare the determined state of the drug delivery device to a desired state. Optionally, the processor is configured to determine an action to be taken to change the determined state to the desired state.

Optionally, the system comprises a user interface configured to communicate information relating to the desired state and/or determined action to a user of the drug delivery device. Optionally, the user interface comprises a screen configured to display a real-time image of the drug delivery device with the information relating to the desired state and/or determined action visually overlaid.

Optionally, the desired state is a next state in a predetermined sequence of states.

Optionally, the predetermined sequence of states depends on the type of drug delivery device determined by the processor.

Optionally, the processor is configured to continuously re-determine the state of the drug delivery device. Optionally, the processor is configured to continuously re-determine the state of the drug delivery device until a desired end state is determined.

Optionally, the drug delivery device is an auto-injector.

Optionally, the state of the drug delivery device comprises the orientation of the drug delivery device determined based on image data. Optionally, the orientation comprises an angle formed by the drug delivery device with a reference surface separate from the drug delivery device. Optionally, the reference surface is part of the subject. Optionally, the orientation comprises an angle formed by the drug delivery device with the data capture device.

Optionally, the state of the drug delivery device comprises initiation of drug delivery by the drug delivery device. Optionally, the initiation of drug delivery is determined based on image data and/or acoustic data.

Optionally, the state of the drug delivery device comprises an end of drug delivery by the drug delivery device. Optionally, the end of drug delivery is determined based on image data. Optionally, the end of drug delivery is determined based on acoustic data.

Optionally, the state of the drug delivery device comprises an amount of a drug within the drug delivery device or delivered by the drug delivery device. Optionally, the processor is configured to determine the position of at least a portion of a mechanical dispensing means of the drug delivery device in order to determine an amount of a drug within the drug delivery device or delivered by the drug delivery device, the mechanical dispensing means being configured to move to dispense the drug. Optionally, the state of the drug delivery device comprises a rate of delivery of a drug by the drug delivery device. Optionally, the processor is configured to determine the rate of change in position of at least a portion of a mechanical dispensing means of the drug delivery device to determine the rate of delivery of a drug by the drug delivery device, the mechanical dispensing means being configured to move to dispense the drug.

Optionally, the state of the drug delivery device comprises the location of the drug delivery device relative to a reference location separate from the drug delivery device. Optionally, the reference location is part of the subject.

Optionally, the state is an angle formed by the drug delivery device with a reference surface separate from the drug delivery device and the desired state is an angle substantially perpendicular to the reference surface. Optionally, the processor is configured to determine the desired state based on image data of the reference surface.

Optionally, the state is an orientation of the device and the desired orientation is an orientation in which a specific part of the drug delivery device is visible to the data capture device. Optionally, the processor is configured to determine the desired state based on pre-stored data relating to the type of drug delivery device. Optionally, the specific part of the drug delivery device is a mechanical dispensing means of the drug delivery device, the mechanical dispensing means being configured to move to deliver the drug.

Optionally, the processor is configured to determine the type of drug delivery device and further process the sensory data based on the type of drug delivery device. Optionally, the type of drug delivery device is determined based on image data of the drug delivery device obtained by the data capture device. Optionally, the type of drug delivery device is determined based on the shape of the drug delivery device, or a part thereof, determined from the image data. Alternatively or additionally, the type of drug delivery device is determined based on visually encoded data within the image data. Optionally, the visually encoded data is a character string, a barcode or a QR code.

Optionally, the data capture device comprises one of a smartphone and a tablet computer.

According to a second aspect of the disclosure there is provided a method of monitoring delivery of a drug to a subject by a drug delivery device, the method comprising: obtaining sensory data about the drug delivery device using a data capture device separate from the drug delivery device, when the drug delivery device is in use; processing the sensory data to determine a state of the drug delivery device.

Optionally, the method further comprises comparing the determined state of the drug delivery device to a desired state. Optionally, the method further comprises determining an action to be taken to change the determined state to the desired state.

Optionally, the method further comprises communicating information relating to the desired state and/or determined action to a user of the drug delivery device. Optionally, the communicating comprises displaying, on a screen, a real-time image of the drug delivery device with the information relating to the desired state and/or determined action visually overlaid.

Optionally the desired state is a next state in a predetermined sequence of states.

Optionally, the predetermined sequence of states depends on the type of drug delivery device determined by the processor.

Optionally, the method comprises continuously re-determining the state of the drug delivery device.

Optionally, the method comprises continuously re-determining the state of the drug delivery device until a desired end state is determined.

BRIEF DESCRIPTION OF THE DRAWINGS

Further features of the disclosure will be described below, by way of non-limiting examples and with reference to the accompanying drawings, in which:

Fig. 1 shows an example system according to the disclosure;

Fig. 2 shows an example identifier of drug delivery device type;

Fig. 3 shows example displays of a user interface of the example system; and

Fig. 4 is a flow chart of an example process according to the disclosure.

DETAILED DESCRIPTION

Fig. 1 shows an example system for monitoring delivery of a drug to a subject 2 by a drug delivery device 1.

As in the example system described, the drug delivery device 1 may be an auto-injector. The drug delivery device 1 may therefore comprise a casing housing a syringe, the casing comprising an opening in one end through which the syringe can be deployed. The device 1 may also comprise a cap for covering the opening. The syringe may comprise a needle, a reservoir comprising a liquid comprising the drug and a mechanical dispensing means, such as a plunger for ejecting the liquid. The device 1 may further comprise an injection mechanism housed within the casing and configured to deploy the syringe, e.g. into a subject, and deliver the liquid, e.g. by actuating the mechanical dispensing means. The injection mechanism may be initiated by pressing a button. The casing may comprise a window for viewing the reservoir of the syringe.

In other example systems, different drug delivery devices may be used, e.g. a syringe, a prefilled syringe or a vial and syringe; a pen injector, an injector pump, an ambulatory injection device or a wearable injection device. The drug delivery device may alternatively comprise means for reconstituting a solid drug (e.g. in powder form, such as a lyophilized drug) with a liquid dispensing medium, such as water, prior to dispensing. Such a device is known as a reconstitution drug delivery device.

As shown in Fig. 1, the example system comprises a data capture device 3, separate from the drug delivery device 1, configured to obtain sensory data about the drug delivery device 1, when the drug delivery device 1 is in use. The data capture device 3 may be a smart phone, as shown in Fig. 1. In other example systems, the data capture device may be a tablet computer, or other type of hand-held device. A user may hold the device in use, but may alternatively use a holder (e.g. a stand-alone holder also separate from the drug delivery device 1), such as a tripod or similar device, mount or wear it on the body, or prop the device up. The data capture device 3 is configured to obtain sensory data about the drug delivery device 1, when separated from the drug delivery device 1. In other words, the data capture device 3 may remain separated from the drug delivery device 1 in use. This is in contrast to some known “add-on” devices that attach to a drug delivery device in use.

As in the example system described, the data capture device 3 may be configured to obtain sensory data about the drug delivery device 1 including image data. Image data may be obtained using a camera forming part of the data capture device 3. As in the example system described, the data capture device may be configured to obtain video image data. The video image data may comprise a plurality of image frames obtained at a predefined frame rate. Alternatively, single or multiple still images may be used.

If the data capture device 3 is a smartphone or tablet computer, for example, which has front and back cameras, one or both of these cameras may be used to capture image data. This may enable the user to use the data capture device 3 in a number of different positions and orientations, and to view a screen provided on the data capture device 3, as described below.

As in the example system described, the data capture device 3 may also be configured to obtain electromagnetic data, such as NFC data, e.g. via an NFC module forming part of the data capture device 3. In other examples, the data capture device 3 may alternatively or additionally obtain acoustic data, e.g. via a microphone forming part of the data capture device 3.

The example system further comprises a processor configured to process the sensory data to determine a state of the drug delivery device 1.

As in the example system described, the processor may form part of the data capture device 3. In other example systems, the processor may be separate from the data capture device 3, for example the processor may form part of a remote server. Alternatively, multiple processors may be used, with at least one processor forming part of the data capture device 3 and at least one processor being separate from the data capture device 3. If a processor separate from the data capture device is used, the data capture device 3 and the processor may comprise means for communicating data, either via a wireless or wired data connection.

The state of the drug delivery device 1 may relate to a specific configuration of the drug delivery device 1 during use. The state may be one of a predefined group of possible states. As in the example described, the state may form part of a sequence of states that the drug delivery device 1 may have during use. The determined state of the drug delivery device 1 may indicate whether the drug delivery device is operating, or being operated, correctly or incorrectly.

Predefined possible states may be based on the type of drug delivery device 1 and/or the type of drug. As in the example system described, the processor may be configured to determine the type of drug delivery device 1 and further process the sensory data based on the type of drug delivery device 1.

The type of drug delivery device 1 and/or the type of drug may be determined based on image data of the drug delivery device 1. The type of drug delivery device 1 and/or the type of drug may be determined based on the shape of the drug delivery device 1, or a part thereof, determined from the image data. The type of drug delivery device 1 and/or the type of drug may be determined based on visually encoded data within the image data. The visually encoded data may be a character string, a barcode or a QR code 11, as shown in Fig. 2. Alternatively or additionally, the type of drug delivery device 1 and/or the type of drug may be determined based on electromagnetic data, such as NFC data from an NFC tag of the drug delivery device 1. The processor may execute a machine learning algorithm trained to recognise the type of drug delivery device 1 and/or the type of drug.

Alternatively, the processor may perform standard image processing that identifies specific aspects of an image indicative of the type of drug delivery device 1 and/or the type of drug.
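By way of illustration only, once a code has been decoded the mapping from its payload to a device type could be as simple as a lookup table. The payload format ("model;lot") and the registry entries below are hypothetical examples, not part of the disclosure:

```python
# Hypothetical registry mapping a device model code to its type.
DEVICE_REGISTRY = {
    "AI-200": "auto-injector",
    "PEN-15": "pen injector",
    "RC-03": "reconstitution device",
}

def device_type_from_payload(payload: str) -> str:
    """Return the device type for a decoded code, or raise if unknown.

    Assumes an illustrative "model;lot" payload format.
    """
    model = payload.split(";")[0]  # e.g. "AI-200;LOT1234" -> "AI-200"
    try:
        return DEVICE_REGISTRY[model]
    except KeyError:
        raise ValueError(f"unrecognised device code: {model!r}")

print(device_type_from_payload("AI-200;LOT1234"))  # auto-injector
```

The determined type could then select which predefined sequence of states the processor uses.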

The processor may be configured to determine (or re-determine) the state of the drug delivery device 1 continuously, e.g. at predefined intervals. The processor may be configured to compare the determined state of the drug delivery device 1 to a desired state. The desired state of the drug delivery device 1 may be determined based on the determined state of the drug delivery device 1. The desired state may be a next state in a predefined sequence of states. If the determined state of the drug delivery device 1 does not change (e.g. within a predefined time period) or changes to a different state that is not the desired state, then it may be determined that the drug delivery device 1 is operating, or being operated, incorrectly. If it is determined that the drug delivery device 1 is operating, or being operated, incorrectly, the user and/or a clinician may be alerted. The alert may communicate details of the incorrect operation. For example, the processor may alert the user via a user interface as described below. For example, the processor may alert a clinician by sending an electronic message, e.g. via a wireless communication module forming part of the data capture device 3. In such a case, processing may be brought to an end.

If the determined state of the device 1 changes to the desired state (e.g. within a predefined time period), then it may be determined that the drug delivery device 1 is operating, and/or being operated, correctly. In such a case, processing may continue.

When the state of the device is changed, the desired state may also change. For example, if the desired state is a next state in a sequence, as a desired state is reached, the next desired state may be the next state in the sequence. This process may continue until an end state is determined by the processor.

The processor may be configured to determine an action to be taken (e.g. by a user of the device 1) to change the determined state to the desired state. The action may be determined based on the determined state of the drug delivery device 1 and/or the desired state (e.g. based on the type of drug delivery device 1). Information relating to the desired state and/or determined action may be communicated to a user of the drug delivery device 1. This information may provide the user with instructions to take the determined action.

The system may comprise a user interface configured to communicate the information relating to the desired state and/or the determined action, to a user of the drug delivery device 1. As in the example system described and shown in Fig. 1, the user interface may comprise a screen 30, e.g. forming part of the data capture device 3. This may be the in-built screen of a smartphone or tablet computer, for example. As shown in Fig. 3, the screen 30 may be configured to display a real-time image 31 of the drug delivery device 1 with the information 32, 33 relating to the desired state and/or determined action visually overlaid. As shown in Fig. 3, the information may comprise images 32 and/or text instructions 33. The images may comprise augmented reality images.

The information, such as instructions, may alternatively or additionally be communicated via audio. Accordingly, the user interface may comprise a speaker, e.g. forming part of the data capture device 3. The user interface may alternatively or additionally provide haptic feedback, e.g. in the form of vibrations, for example when a desired state is reached.

Fig. 4 is a flow chart showing example processing performed by the processor. In step S4.1 an initial state of the drug delivery device 1 is determined. In step S4.2 instructions for reaching the next state (desired state) are overlaid on a real-time image 31 of the drug delivery device 1 on the screen 30. In step S4.3, the state of the drug delivery device 1 is re-determined. In step S4.4 the state is compared to the next state. If the next state has not yet been reached, a counter is incremented at step S4.5, and if the counter reaches a threshold at step S4.6, an error message may be displayed on the screen 30 at step S4.7. If the threshold is not reached the process returns to step S4.3. If in step S4.4 the next state has been reached, it is determined whether the state is an end state at step S4.8. If the next state is not an end state the process returns to step S4.2 and the instructions for the new state are overlaid on a real-time image 31 of the drug delivery device 1 on the screen 30. If the next state is an end state, the processing ends.
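The loop of Fig. 4 can be rendered as a simple sketch in Python. The callables below are stand-ins for the image-processing and user-interface components described in this disclosure; their names and the retry threshold are illustrative assumptions:

```python
def monitor(get_state, show_instructions, show_error, sequence, max_retries=50):
    """Illustrative rendering of the Fig. 4 loop.

    get_state: stand-in for state determination from sensory data (S4.1/S4.3).
    show_instructions: stand-in for overlaying guidance on the screen (S4.2).
    show_error: stand-in for displaying an error message (S4.7).
    sequence: the predefined sequence of states; the last entry is the end state.
    """
    state = get_state()                      # S4.1: determine initial state
    while state != sequence[-1]:             # S4.8: stop once the end state is reached
        desired = sequence[sequence.index(state) + 1]
        show_instructions(desired)           # S4.2: overlay instructions for next state
        retries = 0
        while True:
            state = get_state()              # S4.3: re-determine the state
            if state == desired:             # S4.4: desired state reached
                break
            retries += 1                     # S4.5: increment counter
            if retries >= max_retries:       # S4.6: threshold reached
                show_error(state, desired)   # S4.7: report the problem
                return False
    return True                              # end state reached: processing ends
```

A scripted sequence of observed states can be used to exercise the loop without any real image processing.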

The state of the drug delivery device 1 may relate to one or more of: priming of the drug delivery device 1, reconstitution of the drug in a reconstitution drug delivery device 1, location of the drug delivery device 1, orientation of the drug delivery device 1, initiation of drug delivery by the drug delivery device 1, rate of drug delivery by the drug delivery device 1, amount of drug delivered by the drug delivery device 1, temperature of the drug delivered by the drug delivery device 1, interruption of drug delivery by the drug delivery device 1, and end of drug delivery by the drug delivery device 1.

Priming of the drug delivery device 1 may relate to removal of a cap from the drug delivery device 1. The processor may be configured to process images to determine the presence or absence of a cap. As shown in Fig. 3, if it is determined that the drug delivery device 1 is in a state in which the cap is on, instructions to remove the cap may be communicated to the user. This may include displaying an augmented reality image of the cap of the drug delivery device 1 in a real-time displayed image of the drug delivery device 1.

In the case of a reconstitution drug delivery device, the processor may be configured to process images to determine whether the drug has correctly reconstituted in the drug delivery device, e.g. from an image of a window showing the drug and/or liquid to be dispensed.

Location of the drug delivery device may relate to the location of the drug delivery device 1 relative to a reference location separate from the drug delivery device 1. The reference location may be part of the subject, e.g. an injection site. The processor may be configured to process images to determine whether the drug delivery device 1 is at the reference location. The reference location may also be determined by the processor based on image data. As shown in Fig. 3, if it is determined that the drug delivery device 1 is in a state in which it is not at the reference location, instructions to move it to the reference location may be communicated to the user. This may include displaying an augmented reality image of the reference location in a real-time displayed image of the drug delivery device 1.

Orientation of the drug delivery device 1 may relate to an angle formed by the drug delivery device 1 with a reference surface separate from the drug delivery device 1. The reference surface may be part of the subject, e.g. an injection site. The processor may be configured to process images to determine whether the drug delivery device 1 has a desired orientation, e.g. substantially perpendicular to the reference surface. The reference surface may also be determined by the processor based on image data. As shown in Fig. 3, if it is determined that the drug delivery device 1 is in a state in which it is not at the desired orientation, instructions to move it to the desired orientation may be communicated to the user. This may include displaying an augmented reality image of the desired orientation in a real-time displayed image of the drug delivery device 1.
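As an illustrative sketch only, such a perpendicularity check could compare the angle between the device's long axis and the surface normal against a tolerance; the vectors and the 10-degree tolerance below are assumptions standing in for the outputs of the image processing:

```python
import math

def angle_to_surface(device_axis, surface_normal):
    """Angle in degrees between the device's long axis and the surface normal.

    0 degrees means the device is exactly perpendicular to the reference
    surface. Both vectors are illustrative outputs of image processing.
    """
    dot = sum(a * b for a, b in zip(device_axis, surface_normal))
    norm = math.hypot(*device_axis) * math.hypot(*surface_normal)
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def is_substantially_perpendicular(device_axis, surface_normal, tol_deg=10.0):
    """True when the device is within tol_deg of perpendicular to the surface."""
    return angle_to_surface(device_axis, surface_normal) <= tol_deg
```

If the check fails, the instruction to re-orient the device could be overlaid on the real-time image as described above.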

Orientation of the drug delivery device 1 may relate to an orientation in which a specific part (or parts) of the drug delivery device is visible to the data capture device 3. For example, the specific part of the drug delivery device 1 may be a part (or parts) of the mechanical dispensing means, such as an interface between the part of the mechanical dispensing means (e.g. plunger) of the drug delivery device 1 and the liquid. The processor may be configured to process images to determine whether the drug delivery device 1 has a desired orientation, e.g. in which the specific part is visible. The processor may be configured to determine the desired orientation in which the specific part is visible based on pre-stored data relating to the drug delivery device 1. If it is determined that the drug delivery device 1 is in a state in which it is not at the desired orientation, instructions to move it to the desired orientation may be communicated to the user. This may include displaying an augmented reality image of the desired orientation in a real-time displayed image of the drug delivery device 1.

Initiation of drug delivery by the drug delivery device 1 may relate to pressing a button of an injection mechanism of the device, or initiation of movement of the drug through the drug delivery device. Movement of the drug may be based on the position of a part (or parts) of the mechanical dispensing means, such as an interface between the part of the mechanical dispensing means (e.g. plunger) of the drug delivery device 1 and the liquid. The processor may be configured to process images to determine whether the drug delivery has been initiated. Alternatively, or additionally, the processor may be configured to process acoustic data to determine whether the drug delivery has been initiated, e.g. the sound of a button press. As shown in Fig. 3, if it is determined that the drug delivery device 1 is in a state in which drug delivery has been initiated, instructions to continue administering the drug may be communicated to the user. This may include displaying an augmented reality image of the amount of drug remaining in a real-time displayed image of the drug delivery device 1.

It should be noted that drug delivery may refer to the act of the drug containing liquid being dispensed from the drug delivery device and/or the act of the drug containing liquid being administered to the subject. In some cases, dispensing and administering may be the same. However, in other cases, such as when the drug delivery device is not engaged with the subject, the drug may be dispensed but not administered. Herein, drug delivery may refer to drug dispensing and/or drug administration. Information regarding drug administration may be determined based on drug dispensing.

The rate of drug delivery by the drug delivery device 1 may be based on the rate of movement of the drug through the drug delivery device 1. The processor may be configured to process images to determine the rate of drug delivery. This may be based on the position of a part (or parts) of the mechanical dispensing means, such as an interface between the part of the mechanical dispensing means (e.g. plunger) of the drug delivery device 1 and the liquid over time. Based on this the rate of drug delivery may be calculated, e.g. taking into account the type of drug delivery device and the type of drug.

The amount of drug delivered by the drug delivery device 1 may be determined based on the amount of movement of the drug through the drug delivery device 1. The processor may be configured to process images to determine the amount of drug delivered. This may be based on the position of a part (or parts) of the mechanical dispensing means, such as an interface between the part of the mechanical dispensing means (e.g. plunger) of the drug delivery device 1 and the liquid. Based on this the amount of drug delivered may be calculated, e.g. taking into account the type of drug delivery device and the type of drug. This may be used to verify that the expected dose has been delivered.
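A minimal sketch of these amount and rate calculations, assuming the image processing yields the plunger position along a calibrated stroke; the positions, stroke limits and nominal dose below are illustrative, not values taken from the disclosure:

```python
def delivered_volume(plunger_pos, start_pos, end_pos, full_dose_ml):
    """Fraction of the plunger stroke travelled, scaled to the nominal dose.

    Positions (e.g. mm along the device axis) and the dose are illustrative;
    a real system would calibrate these per device type and drug.
    """
    fraction = (plunger_pos - start_pos) / (end_pos - start_pos)
    return max(0.0, min(1.0, fraction)) * full_dose_ml

def delivery_rate(samples, full_dose_ml, start_pos, end_pos):
    """Average delivery rate (ml/s) between the first and last
    (time_s, plunger_pos) samples."""
    (t0, p0), (t1, p1) = samples[0], samples[-1]
    v0 = delivered_volume(p0, start_pos, end_pos, full_dose_ml)
    v1 = delivered_volume(p1, start_pos, end_pos, full_dose_ml)
    return (v1 - v0) / (t1 - t0)
```

For example, a plunger halfway through a 10 mm stroke on a 1 ml device corresponds to 0.5 ml delivered.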

The temperature of the drug delivered by the drug delivery device 1 may be determined based on the rate of movement of the drug through the drug delivery device 1. The processor may be configured to process images to determine the temperature of the drug delivered. This may be based on the position of a part (or parts) of the mechanical dispensing means, such as an interface between the part of the mechanical dispensing means (e.g. plunger) of the drug delivery device 1 and the liquid over time. Based on this the temperature of the drug may be calculated, e.g. taking into account the type of drug delivery device and the type of drug. The higher the rate of movement, the less viscous the drug and the higher the temperature.

Interruption of drug delivery by the drug delivery device 1 may be determined based on whether the rate of drug delivery (e.g. as determined above) substantially changes. For example, if the rate of drug delivery increases, then it may be determined that the drug delivery device has been removed from the subject such that the resistance to the syringe has been lowered. If the rate of drug delivery decreases or stops before the full dose is provided, there may be a problem with the mechanical dispensing means or another factor inhibiting delivery, e.g. a blockage.

An end of drug delivery by the drug delivery device 1 may be determined when a full dose is calculated (as above) and/or when at least a part of a mechanical dispensing means reaches a predefined end position, e.g. the position of an interface between the drug and the plunger of a syringe reaches a predefined end position. The processor may be configured to process images to determine the end of drug delivery. Alternatively, or additionally, the processor may be configured to process acoustic data to determine the end of drug delivery, e.g. a sound (e.g. click) emitted by the drug delivery device, such as on needle retraction. As shown in Fig. 3, if it is determined that the drug delivery device 1 is in a state in which the end of drug delivery is reached, instructions to remove the drug delivery device 1 may be communicated to the user. This may include displaying an augmented reality image of the amount of drug remaining in a real-time displayed image of the drug delivery device 1.
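One simple, purely illustrative way to flag such a substantial change is to compare consecutive rate estimates against a relative tolerance; the 50% threshold below is an assumption, not a value from the disclosure:

```python
def delivery_interrupted(rates, tolerance=0.5):
    """Flag a substantial change in delivery rate between consecutive estimates.

    A large rise may indicate the device lifting off the subject (lowered
    resistance); a large fall may indicate a blockage or dispensing problem.
    The relative tolerance is an illustrative assumption.
    """
    for previous, current in zip(rates, rates[1:]):
        if previous > 0 and abs(current - previous) / previous > tolerance:
            return True
    return False
```

A steady sequence of rate estimates passes; a sudden jump or drop is flagged for alerting the user and/or a clinician.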

The state of the drug delivery device 1 may be determined based on standard image or audio processing techniques that identify specific features in the data indicative of a particular state. Alternatively, the state of the drug delivery device 1 may be determined by machine learning algorithms that are trained to identify the state based on input data, e.g. a machine learning classifier. The machine learning algorithms may be self-learning, e.g. user feedback regarding the state of the drug delivery device 1 may be used to improve the algorithms.

The process executed by the processor to provide feedback to the user may be based on learned behaviour of the user. This may be implemented by machine learning algorithms. For example, if a user performs a particular action the algorithm may learn to pre-empt that action and provide different feedback next time. For example, if a user prematurely removes the drug delivery device, the system may place greater emphasis on ‘Keep holding’ the next time the user uses the device.

It should be understood that variations of the above examples are possible in light of the above teachings without departing from the spirit or scope of the invention as defined by the claims.