
Title:
REAL-TIME CONTROL SYSTEM FOR BODY AUGMENTATION AND REDUCTION COSMETIC SURGERIES BY CALCULATION OF BODY FORM CHANGES IN DIFFERENT POSTURES
Document Type and Number:
WIPO Patent Application WO/2022/058777
Kind Code:
A1
Abstract:
The present invention includes a control system that provides guidance during body augmentation and reduction cosmetic surgeries. The control system comprises surgical guides produced by 3D printers based on the simulated 3D result of the surgery, and also a processor device using real-time 3D scanning; both are used to evaluate the compliance of the surgery result with the simulated one. In this invention, the simulated result is provided in different body postures, including the posture in which the surgery is performed.

Inventors:
VALINIA SEYED SOROUSH (IR)
Application Number:
PCT/IB2020/058761
Publication Date:
March 24, 2022
Filing Date:
September 21, 2020
Assignee:
VALINIA SEYED SOROUSH (IR)
International Classes:
A61B34/10; A61B90/11; G09B23/28
Foreign References:
CN110169821A (2019-08-27)
US10052159B2 (2018-08-21)
Claims:

CLAIMS

1. A system, comprising:

a simulation software which can simulate the result in the lying or other postures based on the simulated result of augmentation and reduction cosmetic surgeries in a standing posture,

at least one surgical guide which is designed and produced specifically for each applicant, and

a processor device which is used as a guide and control system during the surgery.

2. The method of claim 1, wherein a 3D scanning of the applicant in the standing, lying, and other postures is performed, and after producing the 3D model, removal or injection points are determined in the standing posture based on the doctor's opinions and the applicant's desires.

3. The method of claim 1, wherein the simulation software can simulate the result in the lying or other postures based on the body shape differences in different postures and the simulated result in the standing posture.

4. The method of claim 1, wherein the surgical guides are produced by 3D printers based on the simulated result in the lying or other postures.

5. The method of claim 1, wherein after the reduction cosmetic surgery, the surgical guide is used to physically evaluate the compliance of the body shape with the simulated result in the lying or other postures.

6. The method of claim 1, wherein the surgical guide is used during the augmentation cosmetic surgery and determines the points and amounts of injection based on its distance from the body; also, using the surgical guides, it is possible to physically check the compliance of the body form with the simulated result.

7. The method of claim 1, wherein the processor device includes:

at least one 3D scanner providing the applicant's real-time 3D model,

at least one camera which displays the applicant's real-time image on the monitor, and

a memory device which holds the information about simulated results in different postures, removal or injection points, and surgical guide positioning.

8. The method of claim 7, wherein it is possible to display the removal or injection points on the applicant's body during the surgery by connecting the processor device to a light-generating tool.

9. The method of claim 7, wherein by comparing the points of the real-time 3D model and the simulated model, the processor shows differing points in blue and matching points in green on the applicant's image on the monitor.

10. The method of claim 7, wherein the operator/doctor/surgeon can digitally ensure that the surgery result matches the simulated result by looking at the monitor and the colored points.

11. The method of claim 7, wherein if the removal or injection exceeds the predetermined amounts, the points in that area turn red.

12. The method of claim 7, wherein it is possible to hide the real-time camera image and only see the differences between the real-time 3D model and the simulated model, which helps the doctor understand the changes needed.

13. The method of claim 7, wherein the processor device displays the correct location of the surgical guide on the applicant's image on the monitor by performing a real-time scan and accessing the memory device information, and the doctor can place the surgical guide in the correct location by looking at it.

14. The method of claim 7, wherein the processor device checks the correct placement of the surgical guide on the applicant's body and warns if it is wrong.

Description:
INVENTION TITLE

Real-time control system for body augmentation and reduction cosmetic surgeries by calculation of body form changes in different postures

TECHNICAL FIELD OF INVENTION

This invention relates to a tool for determining the exact location and amount of injection or removal for body modeling, and also to a tool for evaluating the compliance of the applicant's body shape with the predetermined result, considering body form changes in standing, lying, and other body postures.

PRIOR ART

Different methods are used to shape and beautify the body form. Some of these methods involve removing fat from specific parts of the body; the removed fat may then be injected into other parts that need a volume increase. In addition to fat injection, other fillers whose composition is compatible with body tissue are used to shape the body.

Design software has been used to plan injections in different parts of the body; with it, the result of body augmentation or reduction surgery can be simulated, and the injection points and the amount of injection at each point are determined. The injection points are then displayed to the operator in various ways during the surgery so that the operator can better follow the injection process. However, in these methods, the simulated result and the injection points are determined according to the standing posture, while during the injection the applicant is often lying down. Also, using the previous methods, it is not possible to evaluate the compliance of the surgery result with the simulated one.

Patent US20160242853, entitled SYSTEMS AND METHODS FOR WEARABLE INJECTION GUIDES, filed by Elwha LLC at the US Patent Office, describes a method to determine the injection points of filler or any other injectable substance on the face or body. In this method, using one or more images of the applicant, a guide model is designed in software that can identify the injection points. Next, based on the designed model, a wearable injection guide is made using a 3D printer. This guide is made of materials that needles can penetrate. Accordingly, after the applicant has worn the guide, the operator can inject the filler at the specified injection points. This procedure does not simulate the surgical outcome, and the applicant has no idea about the outcome of the filler injection. Besides, the operator cannot ensure the symmetry of the face or the achievement of the desired shape. The mentioned method also does not provide information about the body form differences in standing and lying postures.

Also belonging to the present state of the art is patent US20170259013, entitled SYSTEMS AND METHODS FOR GENERATING AN INJECTION GUIDE, filed by Elwha LLC at the US Patent Office. In this invention, using one or more applicant images, an injection guide which can identify injection sites is designed in the software. Then, using a system that has a camera and a light generator and can be connected to a computer, the injection points on the applicant's face are marked with light. This procedure also does not simulate the surgical outcome, and the applicant has no idea about the outcome of the filler injection. Besides, the operator cannot ensure the symmetry of the face or the achievement of the desired shape. The mentioned method does not provide information about the body form differences in standing and lying postures.

Moreover, patent WO2019178287, entitled AUGMENTED REALITY TOOLS AND SYSTEMS FOR INJECTION, was filed at the US Patent Office. In this invention, the applicant's 3D model is developed using photos, video, genetic examination, and medical data such as MRI or CT scans, in which the anatomy and location of veins, bones, glands, etc. are identified. Using this method, the applicant can see the filler injection outcome. Once the filler injection points have been identified, the operator can see the injection site and the subcutaneous structures by looking at the applicant through virtual reality glasses. The invention also includes a system showing whether the injection is performed in the right place and direction during the filler injection. The problem with this method is that after filler injection, the operator cannot accurately evaluate the compliance of the applicant's face with the simulation. Using virtual reality glasses can also be difficult for the operator and can tire the eyes.

TECHNICAL ISSUES

In body augmentation and reduction surgeries, it is necessary to determine the points and amount of tissue removal or injection during the surgery so that the doctor/surgeon can perform the surgery with sufficient accuracy and the least error. Also, if the simulated result was shown to the applicant before surgery, it is necessary to have a tool to examine the compliance of the actual result with the simulated one. Various methods have been designed for this purpose, some of which have been filed as patents. Each of the designed methods and tools has its disadvantages, including that in these methods, surgery results are simulated in the standing posture while applicants are in the lying posture during the surgery. Also, in the mentioned methods, it is not possible to examine the compliance of the actual surgery results with the simulated results physically or digitally.

PURPOSE OF INVENTION

The present invention is designed to solve the mentioned problems. Using the guide system designed in this invention, it is possible to simulate three-dimensional results of body volume augmentation and reduction surgeries and to determine injection points in standing, lying, and other body postures. Also, using this system, the locations of the points and the injection amount at each point can be determined, and the operator/doctor/surgeon can examine the compliance of the actual surgery result with the simulated result physically and digitally at any time and ensure the achievement of the predetermined result.

INVENTION ILLUSTRATIONS DESCRIPTION

Figure one illustrates surgical guides

Figure two illustrates the non-compliance of the body shape with the surgical guide before injection

Figure three illustrates compliance of the body shape with the surgical guide after injection

Figure four illustrates 3D scanners embedded in the operating room

Figure five illustrates colored points on the applicant's image on the screen

Figure six illustrates the hidden camera image with the real-time and simulated 3D models displayed

INVENTION DESCRIPTION

The present invention is a control and guidance system for body augmentation and reduction cosmetic surgeries. In this invention, using the output of the 3D scanners, a 3D model of the applicant or of the area in need of surgery is made in the related software for the standing, lying, and other body postures. Then, through the body change design software, it is possible to observe the effect of tissue removal or filler injection in different areas of the body in the standing posture and to decide on the areas where the removal or injection is to be performed. After confirming the design of the ideal form in the standing posture, it is possible to simulate the result in the posture in which the surgery is performed. For this purpose, the differences between the 3D models in the different body postures are examined before applying the changes; as a result, based on the design of the ideal form in the standing posture, the result of the cosmetic surgery in the lying or other postures is simulated by the software.
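
As an illustration of this posture transfer, the following minimal sketch (Python/NumPy) applies the change designed in the standing posture to the lying-posture scan. It assumes the scans have already been brought into point-to-point correspondence and models the planned change as a simple per-point displacement; both assumptions are for illustration only and are not specified by the invention.

import numpy as np

def simulate_in_lying_posture(standing_scan, standing_design, lying_scan):
    # All inputs are (N, 3) arrays of corresponding surface points:
    #   standing_scan   - scanned body surface in the standing posture
    #   standing_design - ideal surface designed in the standing posture
    #   lying_scan      - scanned body surface in the lying posture
    # Illustrative model: the planned removal/injection is treated as a
    # per-point displacement and added to the lying-posture scan.
    displacement = standing_design - standing_scan
    return lying_scan + displacement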

In one embodiment, the guidance system in this invention comprises a series of surgical guides (Figure 1) made by 3D printers based on the final design in the lying or other postures. These surgical guides allow the operator/doctor/surgeon to check the compliance of the applicant's body shape with the predetermined design in the lying or any other posture during the removal or injection process. In body reduction surgery, this compliance can be checked by placing the surgical guides on the applicant's body. In body augmentation surgery, by placing the guides on the desired area, it can be understood how much filler should be injected to achieve the predetermined form, based on the distance of the guide from the body area (Figures 2 and 3).
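
The relation between the guide-to-body distance and the amount of filler still to be injected can be sketched as below. The point correspondence between the guide surface and the body surface, the per-point area, and the gap-times-area volume estimate are illustrative assumptions and are not specified by the invention.

import numpy as np

def remaining_filler_volume(guide_surface, body_surface, cell_area_cm2):
    # guide_surface, body_surface: (N, 3) arrays of corresponding points (cm)
    # cell_area_cm2: surface area (cm^2) represented by each point pair
    # Approximate the volume still to be injected as gap * area summed over
    # the guide surface (result in cm^3, i.e. millilitres).
    gaps = np.linalg.norm(guide_surface - body_surface, axis=1)
    return float(np.sum(gaps * cell_area_cm2))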

In one embodiment, the guidance system also includes a processor device that uses a 3D scanner or a combination of several 3D scanners simultaneously to scan the applicant at any time during the cosmetic procedure (Figure 4). The processor also has an internal memory that contains 3D simulation information of the result in different body postures, removal or injection points, and surgical guides’ location information. In one embodiment, by connecting the processor device to a focused light-generating tool such as a laser, it is possible to display the removal or injection points on the applicant's body during the surgery.
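
A minimal sketch of how the outputs of several 3D scanners might be fused into one real-time model is given below; the rigid calibration of each scanner into a shared operating-room frame is an assumption of the sketch, since the invention does not specify the registration method.

import numpy as np

def merge_scans(scans, calibrations):
    # scans: list of (Ni, 3) point clouds, each in its scanner's local frame
    # calibrations: list of (R, t) pairs (3x3 rotation, 3-vector translation)
    #   mapping each scanner frame into the shared operating-room frame
    merged = [points @ R.T + t for points, (R, t) in zip(scans, calibrations)]
    return np.vstack(merged)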

In one embodiment, the processor device also includes a camera that displays the applicant's real-time image, or the relevant body area, on the screen during the surgery. By analyzing and comparing the points of the real-time 3D model and the simulated model, this device uses artificial intelligence to show differing points in blue and matching points in green on the applicant's image on the screen. As the removal or injection proceeds and the predetermined result is approached, the blue points gradually change to green. If a part of the body is increased or reduced too much, the points in that area turn red (Figure 5). In this method, if the operator/doctor/surgeon does not want to see the real-time camera image, the image can be hidden, and only the difference between the real-time 3D model and the simulated 3D result is shown (Figure 6). Accordingly, the operator/doctor/surgeon can see the amount of change required and check the compliance of the result with the simulated one digitally.
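
The blue/green/red feedback can be illustrated with the following sketch for a reduction (removal) step. The point-to-point correspondence, the outward surface normals, and the 2 mm tolerance are assumptions of this sketch; the invention specifies only the colour convention, not how the comparison is computed.

import numpy as np

def classify_points(realtime, simulated, normals, tol_mm=2.0):
    # realtime, simulated: (N, 3) corresponding points of the real-time scan
    #   and of the simulated result (mm)
    # normals: (N, 3) outward unit normals of the simulated surface
    # Signed offset of the scanned tissue relative to the simulated surface:
    # positive = tissue still outside the target (removal pending),
    # near zero = target reached, negative = too much removed.
    offset = np.einsum('ij,ij->i', realtime - simulated, normals)
    return np.where(np.abs(offset) <= tol_mm, 'green',
                    np.where(offset > 0, 'blue', 'red'))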

The processor also shows the location of the surgical guides on the applicant's real-time image on the screen by accessing the stored information about the exact location of the surgical guides on the 3D models of the different body postures and by performing a real-time scan at any time. Thus, by looking at the monitor, the correct surgical guide location can be determined. If the surgical guide is placed on the wrong area, the processor system compares the guide location in the real-time 3D model with the predetermined 3D model, detects the wrong positioning using artificial intelligence, and warns.
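
This placement check can be sketched as a simple deviation test between the guide's position in the real-time scan and its stored target position; the landmark correspondence and the 5 mm threshold are illustrative assumptions, the invention describing the check only as an AI-based comparison of the two models.

import numpy as np

def guide_misplaced(guide_realtime, guide_planned, max_mean_error_mm=5.0):
    # guide_realtime: (N, 3) guide landmarks located in the real-time scan (mm)
    # guide_planned:  (N, 3) the same landmarks in the stored 3D model (mm)
    # Warn when the mean landmark deviation exceeds the allowed threshold.
    errors = np.linalg.norm(guide_realtime - guide_planned, axis=1)
    return float(errors.mean()) > max_mean_error_mm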

In one embodiment, the processor device can also be connected to augmented reality (AR) glasses; in this case, if the operator/doctor/surgeon wears the AR glasses, all the information from the processor device, including the removal or injection points, is displayed virtually on the applicant's body.

INVENTION ADVANTAGES

Creating high accuracy in body volume augmentation and reduction cosmetic surgeries

Considering the body shape differences in the simulated result in standing, lying and other postures

Performing surgery based on the applicant’s desires

Producing surgical guides specifically for each applicant

Examining the compliance of the surgery result with the simulated 3D model physically and digitally

Creating symmetry between two sides of the body

Facilitating cosmetic surgery for the surgeons

AN EXECUTABLE METHOD FOR THE INVENTION: FAT TRANSFER USING THE REAL-TIME CONTROL SYSTEM FOR BODY AUGMENTATION AND REDUCTION COSMETIC SURGERIES

In this method, a 3D scan of the applicant's body is performed in both the standing and lying postures. Then, in the design software, the result of the fat transfer is simulated in the standing posture based on the doctor's opinion and the applicant's desires. Next, the design of the ideal form is simulated by the software in the lying posture. At this stage, the points and amount of fat removal are also determined. Finally, the surgical guides are made by 3D printers based on the simulated result in the lying posture.

The applicant's real-time image is displayed on the screen using a camera, and 3D scanning is performed at any time using a 3D scanner. The fat removal points on the applicant's body are indicated by connecting the processor device to the focused light-generating tool. By comparing the real-time 3D model and the 3D simulated result in the lying posture at any time, the processor device displays the points of the surgery area that differ from the simulated result in blue and the points that match in green. As the tissue removal process progresses and the surgery result approaches the simulated result, the blue points gradually turn green. If too much tissue is removed from an area, the points there turn red. Thus, using the processor device, it is possible to check the compliance of the fat removal result with the simulated result digitally. In this method, if the operator/doctor/surgeon does not want to see the real-time camera image, the image can be hidden, and only the difference between the real-time 3D model and the simulated 3D result is shown. Also, the compliance of the body shape with the simulated result in the lying posture can be evaluated physically using the surgical guides designed in this invention. The processor device evaluates the positioning of the surgical guides; for this purpose, the 3D model of the surgical guides is displayed on the real-time image of the applicant on the screen, and the operator/doctor/surgeon can check the correct placement of the guides by looking at the screen. If a surgical guide is placed in the wrong place, the device detects the wrong positioning by comparing the real-time 3D model with the simulated one and warns. If the processor device is connected to augmented reality glasses, the removal points and all the information of the processor device are displayed virtually on the applicant's body.
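
The end condition of this removal step, i.e. that the surgery result matches the simulated result, can be expressed as a small completion check under the same assumptions (point correspondence, surface normals, illustrative tolerance) as the colour sketch above.

import numpy as np

def removal_complete(realtime, simulated, normals, tol_mm=2.0):
    # The step is treated as finished when every scanned point lies within
    # the tolerance of the simulated surface, i.e. no remaining (blue) or
    # overshot (red) areas are left.
    offset = np.einsum('ij,ij->i', realtime - simulated, normals)
    return bool(np.all(np.abs(offset) <= tol_mm))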

Next, the removed fat is purified and injected at the predetermined points. The injection points can be shown on the applicant's body by connecting the processor device to the focused light-generating tool. The differences between the real-time 3D model and the simulated one in the lying posture are displayed as blue points on the applicant's real-time image with the help of the processor device. If the area where the injection is performed becomes more prominent than the simulated model, the points in that area turn red. In this method, if the operator/doctor/surgeon does not want to see the real-time camera image, the image can be hidden, and only the difference between the real-time 3D model and the simulated 3D result is shown. Also, surgical guides are used to guide the injection process; by placing the guide on the area that needs an injection, the place and amount of injection are determined, and the compliance of the body shape with the simulated model can be checked physically based on the distance between the guide and the body. As in the removal step, the processor shows the location of the surgical guide on the applicant's real-time image on the screen and warns if the guide is placed incorrectly on the body. If the processor device is connected to augmented reality glasses, the injection points and all the information of the processor device are displayed virtually on the applicant's body. In this method, the fat transfer cosmetic surgery is completed when the result of the surgery matches the predetermined result.