

Title:
A SYSTEM FOR REALISTIC THREE-DIMENSIONAL RENDERING
Document Type and Number:
WIPO Patent Application WO/2020/261290
Kind Code:
A1
Abstract:
A system for realistic three-dimensional rendering of automotive vehicle glazing and depositing coating layers on the glazing is disclosed. The system includes a measurement device, an input device, a modelling device, a rendering device, a data storage module and a deposition module. The measurement device is configured to measure visual and optical properties of an object. The input device is configured to receive parameters from a user corresponding to the rendering properties of the object. The modelling device is configured to obtain output from the measurement device and the input device and generate a rendering file format. The rendering device is configured to receive the rendering file format. The data storage module is configured to store output data received from the measurement device, the modelling device, and the rendering device. The rendering device generates a physical realistic view of the realistic three-dimensional model, which is displayed on the display unit.

Inventors:
N BALASUBRAMANIYAN (IN)
B KARTHIKEYAN (IN)
THANGAMANI ARUNVEL (IN)
Application Number:
PCT/IN2020/050449
Publication Date:
December 30, 2020
Filing Date:
May 19, 2020
Assignee:
SAINT GOBAIN (FR)
N BALASUBRAMANIYAN (IN)
International Classes:
G06T15/04; B60J1/00; G06F30/15; G06T19/20
Foreign References:
US20200043246A12020-02-06
US9076247B22015-07-07
Attorney, Agent or Firm:
KUMAR, S. Giriraj (IN)
Claims:
CLAIMS

I/We claim:

1. A system for realistic three-dimensional rendering of automotive vehicle glazing and depositing coating layers on the glazing, the system comprising:

a measurement device 102 configured to measure visual, thermal and optical properties of an object, wherein the object is one of a vehicle interior, exterior and glazings present therein;

an input device 120 configured to receive parameters from a user corresponding to the rendering properties of the object;

a modelling device 104 configured to receive output from the measurement device and the input device and generate a rendering file format;

a rendering device 106 configured to receive the rendering file format;

a data storage module 108 configured to store output data received from the measurement device, the modelling device, and the rendering device; and

a deposition module 112 coupled to the display unit and configured to deposit a coating layer on the object based on the received user parameters and the output from the rendering device; characterized by the rendering device that generates a realistic three-dimensional model of the object based on the output from the measurement device and the input device and the display unit providing a physical realistic view of the realistic three-dimensional model, wherein the realistic three-dimensional model defines the properties of the object in varied environments and camera angles, and the deposition module is designed to deposit a coating layer on the object based on the realistic three-dimensional model generated.

2. The system as claimed in claim 1, wherein the deposition module 112 is configured to detect an RGB value of the coating color from the rendered image and further synthesize a metal oxide.

3. The system as claimed in claims 1 and 2, wherein the deposition module 112 is configured to deposit the synthesized metal oxide on the object to obtain a desired coating on the object.

4. The system as claimed in claim 1, wherein the display unit 110 is configured to display three-dimensional models of the rendered object in an environment with a front view, a rear view, a perspective view, a side view, a top view and a panoramic view.

5. The system as claimed in claim 1, wherein the rendering properties of the object comprise luminosity, color, lighting, specularity, transmission, diffusivity and visual appeal of a coating.

6. The system as claimed in claim 1, wherein the rendering device 106 comprises a processor configured to:

generate a three-dimensional model of the object;

apply material characteristics based on measured properties to the three-dimensional model;

generate a rendering environment for positioning the object based on external parameters such as view angle, background, lighting, sky conditions and color;

apply optical color texture to the rendering environment from a plurality of pre-defined textures; and

generate a realistic three-dimensional model of the object by mapping the three-dimensional model in the rendering environment with physical textures.

7. The system as claimed in claim 1, wherein the measurement device 102 is a spectrophotometer, a camera and one or more sensors.

8. The system as claimed in claim 1, wherein the measurement device 102 is configured to measure the object at multiple angles to determine functional values, Lab color, RAL value, reflection and transmission coefficients.

9. The system as claimed in claim 1, wherein the data storage module 108 is configured to store rendering file formats for a plurality of objects.

10. The system as claimed in claim 1, wherein the data storage module 108 is a local database, a cloud-based server, an application server and a data server.

11. The system as claimed in claims 1 and 2, wherein the display unit 110 comprises a graphical user interface and a touch interface that is configured to provide the physical realistic view of the three-dimensional object with varied lightings and view angles.

12. The system as claimed in claims 1 and 2, wherein the object is one of, but not limited to, a glazing, side lite, backlite automobile, vehicle interiors, and windshield.

13. The system as claimed in claim 1, wherein the deposition module 112 is designed to modify the optical, thermal and physical properties of the object by depositing the coating layer based on desired user parameters.

14. The system as claimed in claim 1, wherein the rendering device 106 configured to generate thermal planes of a vehicle; receive thermal measurements of a vehicle for a set of glazings; generate a rendering environment with thermal measurements mapped onto the thermal planes; and generate a realistic three-dimensional model of the vehicle displaying the thermal planes of the vehicle for a set of glazings.

15. A method for realistic three-dimensional rendering of automotive vehicle glazing and depositing coating layers on the glazing, the method comprising: measuring visual, thermal and optical properties of an object using a measurement device; receiving parameters from a user, through an input device corresponding to the rendering properties of the object;

receiving an output from the measurement device and the input device to generate a rendering file format;

storing the output data received from the measurement device, the modelling device, and the rendering device in a data storage module;

receiving the rendering file format, by the rendering device to generate a realistic three-dimensional model of the object, wherein the realistic three-dimensional model is generated based on the output from the measurement device and the input device; providing a physical realistic view of the three-dimensional model in a display unit; wherein the physical realistic view comprises a comparator output to compare a generic object with a modified object rendered with user parameters; and depositing a coating layer on the object based on the realistic three-dimensional model generated, wherein the coating layer comprises a metal oxide synthesized based on RGB values present in the three-dimensional model.

16. The method as claimed in claim 15, wherein the step of providing a physical realistic view of the three-dimensional model comprises:

generating a three-dimensional model of the object;

applying material characteristics based on measured properties to the three-dimensional model;

generating a rendering environment for positioning the object based on external parameters such as view angle, background, lighting, sky conditions and color;

applying optical color texture to the rendering environment from a plurality of pre-defined textures; and

generating a realistic three-dimensional model of the object by mapping the three-dimensional model in the rendering environment with physical textures.

17. The method as claimed in claim 15, comprises displaying three-dimensional models of the rendered object in an environment with a front view, a rear view, perspective view, side view, a top view and a panoramic view.

18. The method as claimed in claims 15 and 16, comprises providing a physical realistic view of the object positioned in an environment in an augmented reality device or mobile device.

19. The method as claimed in claim 15, wherein the physical textures of the object comprise luminosity, color, lighting, specularity and diffusivity.

20. The method as claimed in claim 15, wherein the realistic three-dimensional model provided on the display unit comprises three-dimensional objects positioned on a specific environment pre-defined by a user.

21. The method as claimed in claims 15 and 18, wherein the realistic three-dimensional model provided on the display unit comprises one or more three-dimensional objects displayed simultaneously on a pre-defined environment.

22. The method as claimed in claim 15, wherein the step of providing a physical realistic view of the three-dimensional object comprises providing realistic thermal maps of a vehicle.

23. The method as claimed in claim 22, wherein the step of providing realistic thermal maps of a vehicle comprises: receiving thermal measurements of a vehicle for a set of glazings from the measurement device; generating thermal planes of a vehicle from the thermal measurements; generating a rendering environment with thermal measurements mapped onto the thermal planes; and generating a realistic three-dimensional model of the vehicle displaying the thermal planes of the vehicle for a set of glazings.

Description:
A SYSTEM FOR REALISTIC THREE-DIMENSIONAL RENDERING

Technical Field

The present disclosure relates generally to systems and methods for generating an image file of a high-resolution 3D automotive glazing, in particular to an apparatus for realistic 3D rendering, and more particularly to an apparatus for realistic 3D rendering capturing the optical, thermal and visual properties of the vehicle and automobile glazing.

Background

[0001] Background description includes information that may be useful in understanding the present disclosure. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed disclosure, or that any publication specifically or implicitly referenced is prior art.

[0002] 3D rendering is a growing technology used in architecture, manufacturing, clothing, automobiles and the like. 3D rendering is a two-dimensional representation of a computer wireframe model that has been given properties such as texture, color, and material. 3D visualization is created using two primary software pillars: modelers and renderers. Every rendering starts as a 3D model, which is represented by a series of flat geometric shapes connected together in three-dimensional space. These shapes are called polygons. The models themselves are often very crude, represented in the digital space as a simple wireframe object or scene. In order to give these shapes real form, they must be combined with texture maps, artificial light sources, and a number of other filters that together produce a finished 3D rendering.

[0003] 3D renders may include photorealistic effects or non-photorealistic rendering. In photorealism, an object is replicated in a rendered environment with the features and characteristics of the original object. Such photorealistic rendering is typically implemented in the garment industry to visualize the texture of a fabric. Rendering is also done in real time using cameras to capture the object and create a 3D environment.

[0004] Typically, capturing the texture of an object is performed using a camera, as described in prior art WO2017029487. However, the method does not capture the properties of the object precisely, and the properties also vary with changes in camera settings and light conditions. Thus, there exists a need for a system that can capture the properties of the object precisely for realistic rendering.

[0005] Another prior art, US20090153673, teaches a method and apparatus for measuring a 3D graphical model using images. The method includes measuring optical and visual properties of the actual object to achieve similar properties in the 3D model. Further, the rendered output is analyzed to determine the variation in the product. However, the prior art does not teach measuring actual properties of the object to generate a realistic rendering.

[0006] Thus, there exists a need for a rendering system and method that captures the physical properties of the object to generate physical realistic renders. Further, there exists a need for a rendering system and method that is exclusively designed for automotive glazings. Furthermore, there exists a need for a system and method that allows a user to compare various rendered objects to determine the performance of the object in varied surroundings.

Summary of the Disclosure

[0007] The primary object of the present disclosure is to provide an apparatus or system that captures the physical properties of the object to generate physical realistic renders. The physical realistic three-dimensional model defines the properties of the object in varied environments and camera angles. Examples of properties of the object include optical properties, thermal properties and acoustic properties.

[0008] According to an embodiment of the disclosure, a system for realistic three-dimensional rendering of automotive vehicle glazing and depositing coating layers on the glazing is disclosed. The system includes a measurement device, an input device, a modelling device, a rendering device, a data storage module and a deposition module. The measurement device is configured to measure visual and optical properties of an object. The input device is configured to receive parameters from a user corresponding to the rendering properties of the object. The modelling device is configured to obtain output from the measurement device and the input device and generate a rendering file format. The rendering device is configured to receive the rendering file format. The data storage module is configured to store output data received from the measurement device, the modelling device, and the rendering device. The rendering device generates a physical realistic view of the realistic three-dimensional model, which is further displayed on the display unit. Lastly, the deposition module is coupled to the display unit and configured to deposit a coating layer on the object based on the received user parameters and the output from the rendering device.

[0009] According to an embodiment of the disclosure, a method for real-time deposition of material on an object using a realistic three-dimensional renderer is disclosed. In accordance with the method, the visual and optical properties of an object are measured using a measurement device. The properties are values obtained from the measurement device, including but not limited to functional values, LAB color, and reflection and transmission color at all angles. Thereafter, a set of parameters is received from a user, through an input device, corresponding to the rendering properties of the object. Combining the visual and optical properties with the parameters, a rendering file format is generated by a modelling device. The output data received from the measurement device, the modelling device, and the rendering device are stored in a data storage module. The three-dimensional model is generated by creating a 3D shell and/or a UV map based on the rendering file format. The rendering file format is received by the rendering device to generate a realistic three-dimensional model of the object. Finally, a physical realistic view of the object is provided on a display unit. Subsequently, a rendering environment is generated for positioning the object based on external parameters such as view angle, background, lighting, sky conditions and color. Ultimately, the 3D shell or model is mapped onto the rendering environment with physical textures. The physical textures of the object comprise luminosity, color, lighting, specularity and diffusivity.

[0010] Other features and aspects of this disclosure will be apparent from the following description and the accompanying drawings.

Brief Description of the Drawings

[0011] Embodiments are illustrated by way of example and are not limited in the accompanying figures.

[0012] FIG. 1A is a block diagram illustrating a system for rendering realistic three- dimensional models;

[0013] FIG. 1B illustrates a plurality of thermal planes generated for a vehicle by the system for rendering realistic 3D models;

[0014] FIG. 1C illustrates a thermal plane mapped to the thermal measurement of the vehicle;

[0015] FIG. 1D is a block diagram illustrating a system for realistic three-dimensional rendering of automotive vehicle glazing and depositing coating layers on the glazing;

[0016] FIG. 2 is a flowchart illustrating a method of rendering three-dimensional objects, according to an embodiment of the present invention;

[0017] FIG. 3A illustrates a side view of the rendered object;

[0018] FIG. 3B illustrates a perspective view of the rendered object;

[0019] FIG. 4A illustrates a panoramic exterior view of the realistic rendered object;

[0020] FIG. 4B illustrates a panoramic interior view of the realistic rendered object;

[0021] FIG. 5 illustrates an exemplary view of the rendered 3D model with comparator tool, according to an embodiment of the present invention; and

[0022] FIG. 6A, 6B and 6C illustrate an experimental setup for validation of the 3D rendered image.

[0023] Skilled artisans appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the disclosure.

Detailed Description

[0024] The present disclosure is now discussed in more detail referring to the drawings that accompany the present application. In the accompanying drawings, like and/or corresponding elements are referred to by like reference numbers.

[0025] Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or the like parts.

Definitions

[0026] For convenience, the meaning of certain terms and phrases used in the current disclosure are provided below. If there is an apparent discrepancy between the usage of a term in other parts of this specification and its definition provided in this section, the definition in this section shall prevail.

[0027] Physical realistic view- a view of an object that captures the real aesthetics of the object along with depicting the transmission, reflection and functional properties of the object.

[0028] 3D model- A 3D model is a mathematical representation of any surface of an object in three dimensions, created via specialized software on computing devices.

[0029] UV mapping- UV mapping is the 3D modelling process of projecting a 2D image onto a 3D model's surface for texture mapping, which indicates the transparency and reflective properties of the object.

[0030] Spectrophotometer- A tool used to quantitatively measure the transmission or reflection of visible light, UV light or infrared light.

[0031] In order to overcome the drawbacks associated with the prior art, the present disclosure provides an apparatus or system that automatically designs the heating circuit on automotive glazing based on a specific defrosting time. The apparatus also enables a user to visualize the heating pattern of the automotive glazing. The heating pattern indicates the defrosting or defogging pattern and the peak temperatures achieved in each zone of the glazing. The apparatus also analyzes the heating circuit present on the glazing to check for defects and estimate the performance. The performance of the heating circuit can be improved by providing the desired parameters through an input interface on the apparatus.

FIG. 1A is a block diagram illustrating a system for rendering realistic three-dimensional models. The system comprises measurement device 102, modelling device 104, rendering device 106, data storage module 108 and input device 110. The measurement device 102 is used to measure visual and optical properties of an object. In an example, the measurement device 102 is a spectrophotometer, an angular spectrophotometer, a camera, and sensors. The device collects three coefficients of the samples. The optical data of the glass is measured at five angles, namely 15, 30, 45, 60 and 75 degrees, to obtain reflection and transmission coefficients. In another example, the measuring device measures the coefficients at multiple points over a 360-degree angle. The physical data captured by the spectrophotometer includes transmission, reflection, refraction and related optical properties of the object, for example a glazing. In an example, a measurement device such as a camera combined with an image editing tool is utilized to capture the realistic textural properties of the surface. The aforementioned data is used to generate normal and bump maps which are applied for rendering.

[0032] The obtained values are further converted into a rendering material format supported to replicate the properties and characteristics of the glass. The measurement device measures the glass samples and other objects using the spectrometer in real time over all 360 degrees. The output values obtained from the measurement device include at least one of, but not limited to, functional values, LAB color, and reflection and transmission color at all angles. In an example, the measurement device generates output values corresponding to the physical glass samples, and the output is a material file with accurate measured values of the glass samples.
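As a rough illustration of the kind of material record described above, the sketch below packs per-angle reflection and transmission coefficients and a Lab color into a JSON structure. This is only a sketch under assumed names: the field names, the build_material_file helper and the sample values are invented for illustration, and the disclosure does not specify the actual rendering material format.

```python
# Illustrative sketch only: field names and values are hypothetical placeholders
# for the "rendering material format" described in the text.
import json

def build_material_file(sample_id, angular_measurements, lab_color):
    """Convert spectrophotometer output into a simple material record.

    angular_measurements: mapping of measurement angle (degrees) to a
    (reflection_coefficient, transmission_coefficient) pair, e.g. the five
    angles 15/30/45/60/75 mentioned in the description.
    """
    angles = sorted(angular_measurements)
    material = {
        "sample_id": sample_id,
        "lab_color": lab_color,                                  # L*, a*, b* triplet
        "angles_deg": angles,
        "reflection": [angular_measurements[a][0] for a in angles],
        "transmission": [angular_measurements[a][1] for a in angles],
    }
    return json.dumps(material, indent=2)

# Example: a hypothetical glazing sample measured at the five angles.
measurements = {15: (0.08, 0.72), 30: (0.09, 0.70), 45: (0.11, 0.66),
                60: (0.16, 0.58), 75: (0.28, 0.43)}
print(build_material_file("demo-glazing", measurements, lab_color=(82.1, -3.4, 1.9)))
```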

[0033] The modelling device 104 generates a rendering file format from the output values collected by the measurement device. The rendering file format is applied to the glass object during three-dimensional modeling. The rendering file includes information about the material properties and characteristics. The modelling device 104 generates a 3D model by creating and exporting a 3D model of the desired object with its appropriate measured dimensions. The modelling device also receives inputs from the user through the input device 110. User input includes rendering properties of the object, transmission, angle of view, luminosity, color, lighting, specularity and diffusivity. The input device 110 communicates with the modelling device 104 through a wired or a wireless communication protocol. Examples of the input device 110 include a microphone, a keyboard, a touch screen, a bar code reader, and a gesture unit. Thereafter, the modelling device 104 positions the 3D model of the object in a specific camera angle and environment. The environment is generated by defining the background, sky conditions and lighting (HDR). The 3D model generated by the modelling device incorporates material properties, physical textures, and render settings.
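The paragraph above describes the modelling device bundling measured material data, user parameters and an environment definition before rendering. The sketch below shows one hypothetical way such a bundle could be assembled; the RenderJob class, its keys and the HDR map name are assumptions, not the format actually used by the modelling device.

```python
# Minimal sketch, assuming a JSON-style "rendering file" that combines material,
# user parameters and environment settings. All names are illustrative.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class RenderJob:
    material: dict                       # output of the measurement step
    user_params: dict                    # e.g. transmission, view angle, color
    environment: dict = field(default_factory=lambda: {
        "background": "studio",
        "sky": "overcast",
        "lighting_hdr": "neutral_4000K.hdr",   # hypothetical HDR map name
        "camera_angle_deg": 35,
    })

    def to_rendering_file(self) -> str:
        return json.dumps(asdict(self), indent=2)

job = RenderJob(
    material={"sample_id": "demo-glazing", "transmission": 0.70},
    user_params={"color": "green", "specularity": 0.6, "view": "perspective"},
)
print(job.to_rendering_file())
```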

[0034] In an embodiment of the disclosure, the rendering device validates the model to be rendered and adjusts the resolution of the 3D model. The rendering device generates a realistic three-dimensional model of the object based on the output from the measurement device and the input device. The rendering device provides a physical realistic view of the 3D model with front, side, back, perspective, top and interior views along with panoramic views of the object. The rendering device transmits the physical realistic view to a display unit 110. The display unit 110 is at least one of a mobile device, a computing device, or an augmented reality device providing a realistic view of the rendered object. The display device enables a user to visualize the properties of the object in varied environments and camera angles.

[0035] In an embodiment of the present disclosure, a plurality of rendered objects is created with varied samples of the object by the modeling device and stored in a remote server 114. The plurality of rendered objects is used by the rendering device for creating output in varied environment conditions and camera angles. The remote server 114 is a cloud based server or an application server.

[0036] In another embodiment, the system for rendering 3D objects is used for visualizing the thermal profile of a vehicle. In accordance with the present disclosure, the measurement device 102 is configured to receive thermal measurements of the vehicle. The thermal measurements are captured by a plurality of wireless sensors positioned in the vehicle. The thermal measurements are transmitted to the modelling device 104. The measurement device also estimates the contour of the interior of the car. The modelling device 104 receives the output from the measurement device 102 and determines the thermal planes of the vehicle. The thermal planes are divided into the middle of the car, the middle point of the right seat, and the middle point of the left seat. Further, the modelling device 104 maps the thermal measurements onto the various thermal planes. A plurality of thermal planes (as shown in FIG. 1B) is generated for varying environmental conditions and stored in the data storage module 108. The modelling device 104 overlays the thermal maps onto the thermal planes using UV coordinates. The render settings are fed into the system to obtain a scene consisting of the vehicle and the thermal planes (as shown in FIG. 1C).
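To make the thermal-plane mapping concrete, the sketch below interpolates a handful of cabin temperature readings onto a regular UV grid representing one thermal plane. Inverse-distance weighting, the grid resolution and the sensor positions are illustrative assumptions; the disclosure does not state how the modelling device actually interpolates the measurements onto the planes.

```python
# Hedged sketch: map discrete cabin temperature readings onto a 2-D thermal plane
# using inverse-distance weighting (an assumed interpolation, not the patented one).
import numpy as np

def thermal_plane(sensor_uv, sensor_temps, resolution=(64, 32), power=2.0):
    """sensor_uv: (N, 2) UV coordinates of sensors on the plane, in [0, 1].
    sensor_temps: (N,) temperatures in degrees Celsius.
    Returns a (resolution[1], resolution[0]) grid of interpolated temperatures."""
    u = np.linspace(0.0, 1.0, resolution[0])
    v = np.linspace(0.0, 1.0, resolution[1])
    grid_u, grid_v = np.meshgrid(u, v)
    plane = np.zeros_like(grid_u)
    weights = np.zeros_like(grid_u)
    for (su, sv), t in zip(sensor_uv, sensor_temps):
        d = np.hypot(grid_u - su, grid_v - sv) + 1e-6   # avoid division by zero
        w = 1.0 / d ** power
        plane += w * t
        weights += w
    return plane / weights

# Hypothetical readings from sensors near the left seat, right seat and centre.
uv = np.array([[0.25, 0.5], [0.75, 0.5], [0.5, 0.4]])
temps = np.array([29.5, 31.2, 27.8])
grid = thermal_plane(uv, temps)
print(grid.shape, round(float(grid.min()), 1), round(float(grid.max()), 1))
```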

[0037] The rendering device 106 receives the output of the thermal maps from the modelling device. Thereafter, the rendering device first renders the car model without the thermal plane and then renders it including the thermal plane. The two images are further post-processed to create a transparent thermal plane on the vehicle.
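A minimal sketch of the two-pass compositing described above, assuming a plain alpha blend between the render without the thermal plane and the render with it; the actual post-processing operation is not specified in the disclosure.

```python
# Sketch of the described two-pass post-processing using a simple alpha blend.
import numpy as np

def blend_thermal_plane(render_without, render_with, alpha=0.45):
    """Both inputs are float RGB images in [0, 1] with identical shape.
    The thermal-plane pass shows through at `alpha` opacity."""
    return (1.0 - alpha) * render_without + alpha * render_with

# Tiny synthetic 2x2 example standing in for the two rendered frames.
base = np.zeros((2, 2, 3))                          # car-only render
overlay = np.ones((2, 2, 3)) * [1.0, 0.2, 0.0]      # thermal-plane render (reddish)
print(blend_thermal_plane(base, overlay))
```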

[0038] The rendering device 106 transmits the rendered output to the display unit 110. The display unit 110 includes an interactive touch interface that allows the user to hover around the rendered output with the thermal planes of the vehicle. The thermal planes of the vehicle vary with each type of glazing used in the car. The user is enabled to select a glazing type for the windshield, backlight and sidelights to view the rendered thermal planes for the selected glazing type. The rendered output on the display unit 110 varies for each glazing type to indicate different thermal planes. A plurality of such rendered outputs is stored in the data storage module 108 through a network or cloud. The data storage module 108 may be a server.

[0039] The display unit 110 enables the user to select one or more types of glazing, and to compare and visualize the aesthetic look, optical properties and thermal properties of the glazing. The rendering system with the display unit 110 may select a glazing type for rendering based on the HVAC information provided by the user and the geolocation data. The display unit 110 informs the user about thermal asymmetry and thus assists in selecting the sidelight and backlight to mitigate risk.

[0040] In another embodiment of the present disclosure, the outputs are imported into a custom designed digital application for mobile platforms. This application has features to vary the type of glass products, camera angles, sky conditions and environment. This also enables the viewer to compare products after making variations within the platform.

[0041] In another embodiment, the present disclosure also encompasses a panoramic view of the physical realistic rendered output, from the exterior and the interior, to give the viewer a look and feel of the glazing. This is also accompanied by a VR experience of the same, achieved by integrating the output with a VR app. The virtual reality experience of the rendered model enhances the customer experience in making choices among the wide range of products in the database, which replicates the physical realism of the actual product. Customer delight is achieved, and the impression and aesthetics of the end product are not falsified.

[0042] FIG. 1D is a block diagram illustrating a system for realistic three-dimensional rendering of automotive vehicle glazing and depositing coating layers on the glazing. With respect to FIG. 1D, the system includes a scanning device 116 that is configured to scan the object and further convert the scanned object into a 3D shell. The scanning device 116 includes a plurality of sensors including charge-coupled device (CCD) sensors, complementary metal oxide semiconductor (CMOS) sensors and the like. In an example, the scanning technique can be used to capture the shape and dimensions of an automobile and thereafter render a three-dimensional model of the automobile with a selected glazing or laminate or windshield. The rendered model of the automobile indicates the functional properties of the glazing or windshield in various lighting, color and camera angles.

[0043] In an example, the system is configured to create a panoramic view. The panoramic view is created by rendering a 360-degree panoramic image and wrapping it around a virtual sphere, with the camera positioned at the center of the sphere. This enables the user to turn around within the image and visualize the surroundings in all directions. The different views are achieved by placing multiple hotspots around the car. The hotspots are placed using multiple layered images. Each hotspot is linked to the images in each layer. The different hotspots open the corresponding image at the desired camera angle. In another example, the system is used to provide an augmented reality view. The augmented view is developed using a custom program defining the visual and optical properties and 3D geometry of the glazing. The augmented content is overlaid on the real-time environment to suit the prevailing lighting conditions.
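The sphere wrapping can be pictured as a mapping between equirectangular panorama pixels and viewing directions from the sphere center, as in the sketch below. The helper names are hypothetical; hotspot placement would simply reuse the inverse mapping to anchor a hotspot at a known direction.

```python
# Illustrative equirectangular <-> direction mapping for a 360-degree panorama
# wrapped on a virtual sphere with the camera at its center.
import math

def pixel_to_direction(px, py, width, height):
    """Map an equirectangular pixel to a unit direction (x, y, z)."""
    lon = (px / width) * 2.0 * math.pi - math.pi      # -pi .. pi
    lat = math.pi / 2.0 - (py / height) * math.pi     # +pi/2 .. -pi/2
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

def direction_to_pixel(x, y, z, width, height):
    """Inverse mapping, useful for anchoring a hotspot at a known direction."""
    lon = math.atan2(x, z)
    lat = math.asin(max(-1.0, min(1.0, y)))
    return ((lon + math.pi) / (2.0 * math.pi) * width,
            (math.pi / 2.0 - lat) / math.pi * height)

# A hotspot straight ahead of the camera lands at the horizontal centre of the image.
print(direction_to_pixel(0.0, 0.0, 1.0, width=4096, height=2048))
```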

[0044] In an embodiment of the present disclosure, the data storage module 108 is wirelessly coupled to a remote server 114. The remote server 114 is one of a cloud storage, application server, data server and the like. The remote server 114 stores a plurality of rendering files corresponding to the plurality of facade types, windshield type and the like. The remote server 114 also stores a plurality of car models, color options, texture options, environment/background options and the like.

[0045] In an embodiment of the present disclosure, the system includes a deposition module 112. The deposition module 112 is configured to receive output from the display unit and perform coating deposition in real time based on the rendered design and color of the object. In an example, the deposition module is a laser printer. The coating occurs by a dip or spray coating process, in which the corresponding metal oxides are synthesized to match the color data obtained from the rendered output. The optical properties defined in the system determine the thickness and the number of coating layers to be deposited. The coating applicator ejects the metal oxide onto the glass surface to be coated.
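Purely as an illustration of going from a rendered color to a coating choice, the sketch below averages the RGB of a rendered region and matches it to the nearest entry of a small, entirely hypothetical recipe table. The oxide names, thickness values and layer counts are invented; the real synthesis chemistry and thickness rules are not disclosed here.

```python
# Hypothetical RGB-to-coating-recipe lookup; values are placeholders, not real chemistry.
import numpy as np

# target RGB -> (oxide blend, nominal layer thickness in nm, number of layers)
RECIPES = {
    (30, 60, 45):  ("chromium oxide blend", 120, 3),
    (70, 90, 110): ("titanium dioxide blend", 90, 2),
    (40, 40, 40):  ("mixed oxide neutral grey", 150, 4),
}

def pick_recipe(rendered_rgb_region):
    """rendered_rgb_region: (H, W, 3) uint8 crop of the rendered glazing."""
    target = rendered_rgb_region.reshape(-1, 3).mean(axis=0)
    best = min(RECIPES, key=lambda rgb: np.linalg.norm(target - np.array(rgb)))
    return target.round(1), RECIPES[best]

region = np.full((8, 8, 3), (34, 58, 47), dtype=np.uint8)   # synthetic greenish crop
print(pick_recipe(region))
```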

[0046] In another example, the coating deposition may be obtained by screen printing, where the design is made on a screen and later a combination of ceramic powder and filler is applied over the object. With respect to the user input and the display output, a coating layer is deposited on the substrate/object. In an example, if the substrate is glass, a coating layer of metal or metal oxide and ceramic is deposited on the glass to suit the requirements of the user.

[0047] In an example, a vehicle with a specific type of windshield is to be rendered. The type of the automotive vehicle is rendered by processing stored values from the remote server 114. Further, the rendered model of the windshield type is generated by applying material characteristics retrieved from the remote server 114. Further, the system provides the user with options for adjusting transmission values, reflective values, texture, finish and color of the vehicle. Thereafter, the environment data corresponding to the road and sky is also generated or added to the rendered object through images. The rendering files corresponding to various glass types such as TSA-NX, TSA 3+, VV55 and SCN 450 are stored in the remote server 114.

[0048] In another example, the system provides a realistic 3D view with a comparator tool. The system allows a user to select a first product type from the database/server and a second product type from the database/server. Thereafter, the rendered views of both the first product type and the second product type are displayed simultaneously alongside each other on a graphical interface. The comparator tool enables a user to view and compare the differences in properties, reflection and transmission between two products.

[0049] In another embodiment, the input device 110, the data processing unit 118 and the display unit 110 are implemented in a computing device. The computing device is configured to receive input, and to process and visualize the physical realistic 3D models. Further, the computing device may be coupled to an automatic printing, deposition or coating mechanism. The computing device may be a computer, a smartphone, an iPad, a laptop and the like.

[0050] The data processing unit 118 comprises the measurement device 102, the modelling device 104 and the rendering device 106. The operations in the data processing unit 118 are executed by a processor. The processor may be any conventional processor, such as a commercially available CPU or a hardware-based processor. It will be understood by those of ordinary skill in the art that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing. Further, the processor is one of the components of the rendering device and transmits commands to the display unit. The memory is configured to store instructions accessible by the processor. Further, the memory includes data that is executed by the processor. Memory is any storage device, computer-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard drive, memory card, ROM, RAM, and write-capable or read-only memories.

[0051] The processor is configured to receive user parameters through the input device and transmit output signals to the display unit and the deposition module via a communication module. The processor is also configured to communicate with external devices and servers via the communication module. The processing unit can have wired or wireless communication with the output device.

[0052] The display unit 110 can be a display of a mobile device, a computing device or a smart phone that wirelessly communicates with the data processing unit 118. The display unit includes a graphical user interface to visualize the heating pattern. The graphical user interface can include a touch interface enabling a user to scroll through the heating pattern. Examples of the display unit 110 include, but are not limited to, a cathode ray tube display (CRT), a light-emitting diode display (LED), an electroluminescent display (ELD), a plasma display panel (PDP), a liquid crystal display (LCD), an organic light-emitting diode display and the like.

[0053] In an embodiment, the data processing unit 118 further controls the coating of conductive material on the glazing based on the generated realistic 3D model. The output requested by the customer is instantly processed in the data processing unit, which commands the deposition module to apply the coating on the object. The corresponding color output, similar to the rendered value, is deposited and demonstrated to the customer instantly, which is achieved through the combination of the data processing unit and the coating deposition module. The deposition module detects the RGB value of the coating color from the rendered image. The deposition module has preset metal oxides in its chamber, which are synthesized according to the RGB value. The synthesized metal oxide is further applied on the object to achieve the actual color of the intended coating on the glass.

[0054] In an embodiment of the present invention, the deposition module is a laser printer. The technology used is ink synthesis, where a combination of ceramic powder and filler is used for printing a rendered design and pattern. The laser printer deposits the conductive material in the desired design pattern on the glazing. In another example, the deposition module can be a screen printer. The printing system is integrated with the rendering device, so that a user is enabled to calibrate the design as per requirement.

[0055] In another embodiment of the present invention, the deposition module may comprise one or more robotic arms operated by a processor. The one or more robotic arms are configured to control the coating of conductive material on the windshield. The robotic arms are configured to perform coating of the synthesized metal oxides on the substrate. Each robotic arm is linked to one or more sensors in the data processing unit and a local positioning system. The live data from the sensors is fed into custom software allowing control of the deposition of the material output. Sensors mounted inside the deposition module control the direction, following a predefined path. Traveling in a circular path allows a vertical actuator to incrementally adjust the nozzle height for a smooth, continuous layer of conductive material/metal oxides. Further, the robotic arms may comprise one or more servo motors to enable rotation and movement across the length of the glazing.
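The circular deposition path with an incrementally rising nozzle can be sketched as a simple helical toolpath, as below. All dimensions, the step size and the function name are invented for illustration; a real system would derive the path from the live sensor feed described above rather than from fixed parameters.

```python
# Sketch of a circular nozzle path whose height rises by a small increment per
# revolution; dimensions are placeholders, not real process parameters.
import math

def circular_toolpath(radius_mm=150.0, revolutions=3, step_deg=10.0,
                      z_per_rev_mm=0.2, z_start_mm=5.0):
    """Yield (x, y, z) nozzle positions in millimetres along the path."""
    total_steps = int(revolutions * 360 / step_deg)
    for i in range(total_steps + 1):
        angle = math.radians(i * step_deg)
        z = z_start_mm + z_per_rev_mm * (angle / (2.0 * math.pi))
        yield (radius_mm * math.cos(angle), radius_mm * math.sin(angle), z)

path = list(circular_toolpath())
print(len(path), path[0], path[-1])
```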

[0056] According to an embodiment of the present invention, the communication between the processor, memory and other components within the processing unit is established by an address bus and a data bus. The communication module may include an antenna for transmission and reception of signals. In an example, a Bluetooth/Wi-Fi module is used for online data acquisition and management. The communication module is also enabled to establish communication with a remote server via a communication network such as the internet.

[0057] FIG. 2 is a flowchart illustrating a method of rendering three-dimensional objects. In accordance with an embodiment, the visual and optical properties of a substrate are measured using a measurement device (201). The properties are values obtained from the measurement device, including but not limited to functional values, LAB color, and reflection and transmission color at all angles. Thereafter, a set of parameters is received from a user, through an input device, corresponding to the rendering properties of the substrate (202). Combining the visual and optical properties with the parameters, a rendering file format is generated by a modelling device (203). The output data received from the measurement device, the modelling device, and the rendering device are stored in a data storage module. The three-dimensional model is generated by creating a 3D shell and/or a UV map based on the rendering file format. The rendering file format is received by the rendering device to generate a realistic three-dimensional model of the object (204). Finally, a physical realistic view of the object is provided on a display unit (205). The physical realistic view is created from the 3D shell by applying material characteristics thereto. Subsequently, a rendering environment is generated for positioning the object based on external parameters such as view angle, background, lighting, sky conditions and color. Ultimately, the 3D shell or model is mapped onto the rendering environment with physical textures. The physical textures of the object comprise luminosity, color, lighting, specularity and diffusivity.
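The flowchart steps can be read as a short pipeline, sketched below with stub functions standing in for the devices. Only the ordering of steps 201 to 205 comes from the description; every function body and returned value is a placeholder for illustration.

```python
# Stub pipeline mirroring the order of the flowchart in FIG. 2 (values are placeholders).
def measure_properties(substrate):                     # step 201
    return {"lab_color": (80.0, -2.0, 1.0), "transmission": 0.7, "reflection": 0.1}

def receive_user_parameters():                         # step 202
    return {"view_angle": "perspective", "lighting": "daylight", "color": "green"}

def generate_rendering_file(properties, params):       # step 203
    return {"material": properties, "user_params": params}

def render_realistic_model(rendering_file):            # step 204
    return {"model": "3d_shell", "textures": "uv_mapped", **rendering_file}

def display_physical_realistic_view(model):            # step 205
    print("Displaying physical realistic view of:", model["model"])

model = render_realistic_model(
    generate_rendering_file(measure_properties("glazing"), receive_user_parameters()))
display_physical_realistic_view(model)
```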

[0058] The physical realistic view comprises a front view, top view, perspective view, interior view and panoramic view of the substrate in a rendered environment. The physical realistic view also provides a comparator tool to compare a generic object with a rendered object (incorporating user parameters). The user parameters include but are not limited to view angle, background, lighting, sky conditions and color. The comparator tool simultaneously displays one or more three-dimensional objects alongside a pre-defined environment.

[0059] In an embodiment of the present disclosure, the physical realistic view is provided on a virtual reality device. In another embodiment of the present disclosure, a plurality of environments can be selected by the user. Further, the rendered three-dimensional model of the object is positioned on a specific environment selected by the user.

[0060] The sensing unit 104 receives the input parameters and further analyses the heating circuit. The sensing unit 104 identifies the properties and performance of the heating circuit. The sensing unit also identifies defects in the heating circuit. Examples of the sensing unit 104 include a charge-coupled device (CCD), a complementary metal oxide semiconductor (CMOS) sensor, an IR sensor, an electrical power sensor, and a voltage sensor. The sensing unit 104 also verifies the input parameters received. The sensing unit is configured to determine and verify the design parameters of the heating coil, wherein the design parameters include the thickness, width, profile, defrost time and peak temperature of the heating coil.

[0061] FIG. 3A illustrates a side view of the rendered object. FIG. 3B illustrates a perspective view of the rendered object. In an example, the rendered object is a windshield with side lite and backlight mapped onto an automobile model. The realistic rendering system enables a user to select the color, angle, texture, lighting and optics of the automobile model. Further, the realistic rendering system captures the properties of the glass and generates a 3D model of the glass with material and functional properties. Thus, the rendered realistic image enables the user to visualize the windshields and sidelights present within a vehicle in a realistic manner, as shown in FIG. 3A and 3B. The rendered image does not falsify the aesthetics and impression of the actual product.

[0062] FIG. 4A illustrates a panoramic exterior view of the realistic rendered object. FIG. 4B illustrates a panoramic interior view of the realistic rendered object. With respect to FIG. 4A and 4B, the user is enabled to select the type of windshield and other vehicle glass, and the type, color and model of the car. The realistic rendering system selects a rendering file corresponding to the type of windshield selected. The rendering file includes the material properties and functional properties of the glass. A 3D model of the glass is generated by incorporating the properties specified in the rendering file. Further, a 3D model of the vehicle is created based on the type, color and model of the car. Thereafter, a physical realistic three-dimensional view of the vehicle with the selected glass type is generated. The physical realistic three-dimensional view can be displayed based on the camera angle selected by the user. The user can select at least one of a top view, perspective view, side view, front view and interior view for the rendered vehicle. The physical realistic three-dimensional view (shown in FIG. 4A and 4B) is displayed on a mobile device, a tablet, a computing device or an augmented reality device.

[0063] FIG. 5 illustrates an exemplary view of the rendered 3D model with the comparator tool. The system allows a user to select a first product type from the database/server and a second product type from the database/server. Thereafter, the rendered views of both the first product type and the second product type are displayed simultaneously alongside each other on a graphical interface (501, 502, 503). The comparator tool enables a user to view and compare the differences in properties, reflection and transmission between two products.
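As a rough picture of the comparator view, the sketch below simply places two renders side by side with a gutter between them; the horizontal concatenation is an assumed stand-in for the graphical interface panels 501, 502 and 503.

```python
# Side-by-side comparator sketch: concatenate two renders for visual comparison.
import numpy as np

def comparator_view(render_a, render_b, gutter_px=8):
    """Both renders are (H, W, 3) arrays of equal height; returns one combined image."""
    gutter = np.ones((render_a.shape[0], gutter_px, 3), dtype=render_a.dtype) * 255
    return np.concatenate([render_a, gutter, render_b], axis=1)

a = np.zeros((4, 6, 3), dtype=np.uint8)      # stand-in for the first product render
b = np.full((4, 6, 3), 200, dtype=np.uint8)  # stand-in for the second product render
print(comparator_view(a, b).shape)           # (4, 20, 3)
```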

[0064] FIG. 6A, 6B and 6C illustrate an experimental setup for validation of the 3D rendered image. The visual and optical properties of the render output are validated by comparing them with an original photograph of the product in a controlled environment. The environment has a lighting setup common to both cases. The environment is termed a "light cabin", in which the intensity and angle of light are controlled. In an example, the glass sample is placed in the light cabin. Further, the light parameters are set. Photographs of the sample are captured at multiple angles. The light cabin along with the glass sample is modelled in a 3D tool with the exact measurements of the original setup. The physical realistic render output is generated with similar lighting parameters in the render tool. Finally, the original photograph and the render output are compared. The comparison results are analyzed to determine the similarity.
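One simple way to quantify the photograph-versus-render comparison is a per-pixel error such as RMSE, as sketched below. The disclosure does not name the similarity measure actually used in the analysis, so this metric and the synthetic images are assumed stand-ins.

```python
# Hedged sketch: RMSE between the light-cabin photograph and the render as a
# rough similarity score (the actual validation metric is not specified).
import numpy as np

def rmse_similarity(photo, render):
    """photo, render: float RGB images in [0, 1] of identical shape.
    Returns (rmse, similarity) where similarity = 1 - rmse is a rough [0, 1] score."""
    rmse = float(np.sqrt(np.mean((photo - render) ** 2)))
    return rmse, 1.0 - rmse

photo = np.random.default_rng(0).random((32, 32, 3))
render = np.clip(photo + 0.02, 0.0, 1.0)          # render nearly matching the photo
print(rmse_similarity(photo, render))
```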

[0065] FIG. 7 illustrates an exemplary graphical user interface on the rendering device showing the thermal profile of a vehicle. The rendering device includes a display unit with a graphical user interface to indicate the rendered thermal profile of a vehicle for a set of glazings. A user is enabled to provide inputs on the type of glazing and the location of the vehicle through an interactive interface on the rendering device. The rendering device is coupled to a cloud for retrieving rendered 3D models for each type. The rendering device provides a thermal map for each set of glazing types and environmental conditions. Thus, the rendering device enables the user to visualize, select and compare the differences in thermal performance of the varying automotive glazing solutions.

[0066] The rendering device provides realistic thermal comfort visualization. The output of the system is a series of realistic images which demonstrates/maps the thermal zones within the car region.

Industrial Applicability

[0067] According to the basic construction described above, the apparatus of the present disclosure is implemented to visualize the type and material characteristics of windshields, backlights and sidelites, and may be subject to changes in materials, dimensions, constructive details and/or functional and/or ornamental configuration without departing from the scope of the protection claimed. The system of the present disclosure is used to determine the actual properties of the glass/glazing or windshield for generating rendered 3D models. The system provides realistic images which demonstrate/map the thermal zones within the car region. The system may also be used to perform real-time deposition of coating on a glass substrate based on the rendered image.

[0068] Note that not all of the activities described above in the general description or the examples are required, that a portion of a specific activity may not be required, and that one or more further activities may be performed in addition to those described. Still further, the order in which activities are listed is not necessarily the order in which they are performed.

[0069] Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any feature(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature of any or all the claims.

[0070] The specification and illustrations of the embodiments described herein are intended to provide a general understanding of the structure of the various embodiments. The specification and illustrations are not intended to serve as an exhaustive and comprehensive description of all of the elements and features of apparatus and systems that use the structures or methods described herein. Certain features that are, for clarity, described herein in the context of separate embodiments may also be provided in combination in a single embodiment. Conversely, various features that are, for brevity, described in the context of a single embodiment may also be provided separately or in any subcombination. Further, reference to values stated in ranges includes each and every value within that range. Many other embodiments may be apparent to skilled artisans only after reading this specification. Other embodiments may be used and derived from the disclosure, such that a structural substitution, logical substitution, or another change may be made without departing from the scope of the disclosure. Accordingly, the disclosure is to be regarded as illustrative rather than restrictive.

[0071] The description in combination with the figures is provided to assist in understanding the teachings disclosed herein, is provided to assist in describing the teachings, and should not be interpreted as a limitation on the scope or applicability of the teachings. However, other teachings can certainly be used in this application.

[0072] As used herein, the terms "comprises," "comprising," "includes," "including," "has," "having" or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a method, article, or apparatus that comprises a list of features is not necessarily limited only to those features but may include other features not expressly listed or inherent to such method, article, or apparatus. Further, unless expressly stated to the contrary, "or" refers to an inclusive-or and not to an exclusive-or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).

[0073] Also, the use of "a" or "an" is employed to describe elements and components described herein. This is done merely for convenience and to give a general sense of the scope of the disclosure. This description should be read to include one or at least one and the singular also includes the plural, or vice versa, unless it is clear that it is meant otherwise. For example, when a single item is described herein, more than one item may be used in place of a single item. Similarly, where more than one item is described herein, a single item may be substituted for that more than one item.

[0074] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. The materials, methods, and examples are illustrative only and not intended to be limiting. To the extent that certain details regarding specific materials and processing acts are not described, such details may include conventional approaches, which may be found in reference books and other sources within the manufacturing arts.

[0075] While aspects of the present disclosure have been particularly shown and described with reference to the embodiments above, it will be understood by those skilled in the art that various additional embodiments may be contemplated by the modification of the disclosed machines, systems and methods without departing from the spirit and scope of what is disclosed. Such embodiments should be understood to fall within the scope of the present disclosure as determined based upon the claims and any equivalents thereof.

List of Elements

TITLE: A SYSTEM FOR REALISTIC THREE-DIMENSIONAL RENDERING

110 display unit

118 data processing unit

102 measurement device

104 modelling device

106 rendering device

112 deposition module

108 data storage module

network

input device

501, 502, 503 graphical interface