


Title:
SYSTEM AND METHOD FOR MAKING EMOTION BASED DIGITAL STORYBOARD
Document Type and Number:
WIPO Patent Application WO/2009/093784
Kind Code:
A1
Abstract:
An emotion based digital storyboard generating method and system is disclosed. The digital storyboard generating system includes an emotion-expressing character producing unit to produce an emotion-based emotion-expressing character, and a storyboard generating unit to generate storyboard data using the emotion-expressing character.

Inventors:
LEE WON-HYOUNG (KR)
YOO KIL-SANG (KR)
Application Number:
PCT/KR2008/004143
Publication Date:
July 30, 2009
Filing Date:
July 15, 2008
Assignee:
UNIV CHUNG ANG IND (KR)
LEE WON-HYOUNG (KR)
YOO KIL-SANG (KR)
International Classes:
G06F17/00
Foreign References:
KR20070120706A (2007-12-26)
KR20020038618A (2002-05-23)
Other References:
SHIM, YOUN-SOOK et al., "A Study on the Emotional Interface Design using User Interaction," Korea Society of Design Science, 2003 Spring Conference Proceeding, May 2003, ISSN 1226-8046, pp. 270-271
KWANG, TAE JUNG, "Sensibility and Preference Evaluation for Character Design," Journal of the Ergonomics Society of Korea, vol. 26, no. 1, February 2007, pp. 63-69
Attorney, Agent or Firm:
MUHANN PATENT & LAW FIRM (6th Floor Myeonglim Building,51-8 Nonhyeon-dong, Gangnam-gu, Seoul 135-814, KR)
Claims:

CLAIMS

1. A system for generating a storyboard, the system comprising: an emotion-expressing character producing unit to produce an emotion-based emotion-expressing character; and a storyboard generating unit to generate storyboard data using the emotion-expressing character.

2. The system of claim 1, wherein the emotion-expressing character producing unit includes at least one of an emotional facial expression producing unit to produce a facial expression of the emotion-expressing character, and an emotional pose producing unit to produce a pose of the emotion-expressing character.

3. The system of claim 2, wherein the emotional facial expression producing unit enables a user to select a facial expression of the character stored in advance in an emotional facial expression database, and the emotional pose producing unit enables the user to select a pose of the character stored in advance in an emotional pose database.

4. The system of claim 2, wherein: the emotional facial expression producing unit provides a facial manipulation box that manipulates and generates a facial expression based on the emotional facial expression database, and generates the facial expression through manipulation by a user using the facial manipulation box; and the emotional pose producing unit provides a pose manipulation box that manipulates and generates a pose based on the emotional pose database, and generates the pose through manipulation by the user using the pose manipulation box.

5. The system of claim 2, wherein the storyboard generating unit generates the storyboard data by combining the produced facial expression and the produced pose.

6. The system of claim 1, further comprising a storage unit to store layout data with respect to the emotion-expressing character and camera setup data set by a user, and to store subsidiary image data including at least one of background data, clothing data, and prop data.

7. The system of claim 6, wherein: the layout data comprises a facial expression of the character or a layout image of a pose; and the camera setup data comprises a coordinate value for reproducing the facial expression or the pose, and a setup value with respect to an angle.

8. The system of claim 6, wherein the storyboard generating unit decodes, into a three dimensional (3D) object, an emotion-expressing character selected by the user and subsidiary image data extracted from the storage unit.

9. The system of claim 1, wherein the storyboard generating unit further comprises a script input unit to receive script data, input by the user, to be displayed in the storyboard.

10. The system of claim 1, wherein the storyboard generating unit further comprises a user interface including a keyboard, a touch screen, a mouse, and a pen mouse, and generates the storyboard using the user interface.

11. The system of claim 1, further comprising: a cartoon producing unit to perform cartoon-rendering of the generated storyboard data; and an output unit to output the cartoon.

12. The system of claim 1, further comprising: a file converting unit to convert the generated storyboard data into one or more file formats; and a file storage unit to store the converted file.

13. A method for generating a storyboard, the method comprising: producing an emotion-based emotion-expressing character; and generating storyboard data using the emotion-expressing character.

14. The method of claim 13, wherein the producing of the emotion-expressing character includes at least one of producing of an emotional facial expression of the emotion-expressing character and producing of an emotional pose of the emotion-expressing character.

15. The method of claim 14, wherein the producing of the emotional facial expression enables a user to select a facial expression of the character stored in advance in an emotional facial expression database, and the producing of the emotional pose enables the user to select a pose of the character stored in advance in an emotional pose database.

16. The method of claim 14, wherein: the producing of the emotional facial expression provides a facial manipulation box that manipulates and generates a facial expression based on the emotional facial expression database, and generates the facial expression through manipulation by a user using the facial manipulation box; and the producing of the emotional pose provides a pose manipulation box that manipulates and generates a pose based on the emotional pose database, and generates the pose through manipulation by the user using the pose manipulation box.

17. The method of claim 14, wherein the generating of the storyboard data generates the storyboard data by combining the produced facial expression and the produced pose.

18. The method of claim 13, wherein the generating of the storyboard data further comprises receiving script data, input by the user, to be displayed in the storyboard.

19. The method of claim 13, further comprising storing layout data and camera setup data set by a user with respect to the emotion-expressing character.

20. The method of claim 13, further comprising: performing cartoon-rendering of the generated storyboard data; and outputting the cartoon.

21. The method of claim 13, further comprising: converting the generated storyboard data into one or more file formats; and storing the converted file.

22. A computer-readable recording medium storing a program implementing the method of any one of claims 13 through 21.

Description:

SYSTEM AND METHOD FOR MAKING EMOTION BASED DIGITAL STORYBOARD

Technical Field

The present invention relates to a method and a system for generating an emotion-based digital storyboard, and more particularly, to a method and a system for generating a digital storyboard in which various emotions are easily produced.

Background Art

A storyboard expresses, as a sequence of illustrations, the images to be delivered to an audience and a customer. Before filming, it visualizes each scene, showing a motion of a character, a role, a scene change, a motion of the camera, and the like, and, like a design drawing, explains through illustration both the story and how the film is to be made. That is, it is a pre-production process that visualizes everything in a scenario in detail. It is at once a scene design drawing and a production manual in which a predetermined place and situation, a role of a character, a motion and a timing, music and sound effects, a scene change method, a filming method, and the like are written all together, aiming at the scene intended by a director.

When the storyboard is written, all information may be provided, so that all staff, such as a producer, a director, an art director, and the like, may understand the construction of the story. The director may write everything from the connectivity of a shot and a sequence down to details such as the creation of a space, the line of flow of a character, the type and location of a camera, an intensity and a color of light, the location of props, a script, a sound effect, the atmosphere of the scene, a time, and the like.

What is important for the storyboard is maintaining the scene as written in words, and directing based on decisions on how to divide scenes and how to assign the divided scenes. Although the strength and weakness of each scene is of great consequence, careful attention is needed not to lose intention, direction, and connectivity.

A precisely and well-made storyboard enables various problems that may occur in a project production to be predicted and corrected in advance. As an example, when a storyboard indicates that a specific shot is impossible due to excessively high cost, the shot may be changed to a practical one. Accordingly, the role of the storyboard in pre-production is to help estimate a precise budget, and to help the staff recognize and understand an idea based on consensus on the concept of a work.

A conventional hand-drawn storyboard lacks spatial and temporal efficiency when expressing an image. The substance of the image is the illusion of motion, that is, spatially sequential motion over time. The storyboard may include information on a scene and a motion displayed on a screen, information on an actor appearing and disappearing on the screen, and information on other actions and transitions that may affect the flow of a sequence. Accordingly, the storyboard is required to include complex motions of a character and a camera. However, since the complex motions are illustrated in a fixed standard form, the storyboard is limited as a tool for visualizing the director's idea. Also, there is a burden of re-drawing when a stage set or a character needs to be composed and corrected.

To offset the described weak points of the conventional storyboard, a digital storyboard is able to illustrate details for each scene, such as a background, a motion, a frame setup, a motion of a camera, and the like, in a film drawing. Accordingly, a system and a method for producing and using a character that expresses the emotion desired by the user, to express a detailed emotion of the character in the digital storyboard, are required.

Disclosure of Invention

Technical Goals

An aspect of the present invention provides a storyboard generating method and system that may simply and easily create, using a computer, a storyboard which is conventionally handwritten, and may enable a user to set up a facial expression and a pose of a character, thereby making a detailed facial expression and pose.

An aspect of the present invention provides a storyboard generating method and system that may perform cartoon-rendering of a storyboard generated by a user to output an image like a cartoon, thereby producing an outcome similar to a handwritten storyboard.

An aspect of the present invention provides a storyboard generating method and system that may convert a storyboard generated by a user into various file formats for storage, so that the storyboard may be read in various environments including the web.

Technical solutions

According to an aspect of the present invention, there may be provided a system for generating a storyboard, the system including an emotion-expressing character producing unit to produce an emotion-based emotion-expressing character, and a storyboard generating unit to generate storyboard data using the emotion-expressing character.

In an aspect of the invention, the emotion-expressing character producing unit may include at least one of an emotional facial expression producing unit to produce a facial expression of the emotion-expressing character, and an emotional pose producing unit to produce a pose of the emotion-expressing character.

In an aspect of the invention, the system may further include a cartoon producing unit to perform cartoon-rendering of the generated storyboard data, and an output unit to output the cartoon.

In an aspect of the invention, the system may further include a file converting unit to convert the generated storyboard data into one or more file formats, and a file storage unit to store the converted file.

According to an aspect of the present invention, there may be provided a method for generating a storyboard, the method including producing an emotion-based emotion-expressing character, and generating storyboard data using the emotion-expressing character.

In an aspect of the present invention, the producing of the emotion-expressing character may include at least one of producing of an emotional facial expression of the emotion-expressing character and producing of an emotional pose of the emotion-expressing character.

In an aspect of the present invention, the method may further include performing cartoon-rendering of the generated storyboard data and outputting the cartoon.

In an aspect of the present invention, the method may further include converting the generated storyboard data into one or more file formats, and storing the converted file.

Advantageous Effect

According to the present invention, when various facial expression data and pose data are generated and stored in advance, and a user selects and drags an icon representing a motion to write a digital storyboard, a user who does not have background knowledge about direction may directly and easily write the storyboard.

According to the present invention, a storyboard enables a user to set up a facial expression and a pose of a character, thereby making a detailed facial expression and pose.

According to the present invention, cartoon-rendering may be performed on a storyboard generated by a user to output an image like a cartoon, thereby producing an outcome similar to a handwritten storyboard.

According to the present invention, a storyboard generated by a user may be converted into various file formats for storage, so that the storyboard may be read in various environments including the web.

Brief Description of Drawings

FIG. 1 illustrates a block diagram of a digital storyboard generating system according to an example of the present invention;

FIG. 2 illustrates a flowchart of a method for generating a digital storyboard according to an example of the present invention;

FIG. 3 illustrates a flowchart of a method for producing and storing an emotion-expressing character according to an example of the present invention; and

FIG. 4 illustrates an emotional facial expression producing unit according to an example of the present invention.

Best Mode for Carrying Out the Invention

Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. The embodiments are described below in order to explain the present invention by referring to the figures.

FIG. 1 illustrates a block diagram of a digital storyboard generating system according to an example of the present invention. Here, the digital storyboard generating system includes a user interface 110, a storyboard generating unit 120, an emotion-expressing character producing unit 130, an emotional facial expression DB 140, an emotional pose DB 141, a storage unit 150, a file converting unit 160, a file storage unit 170, a cartoon producing unit 180, and an output unit 190, as illustrated in FIG. 1.

The user interface 110 provides an interface to input a command of a user to generate a storyboard. In this instance, at least one of a keyboard, a touch screen, a mouse, a tablet, and the like may be provided as an example of the user interface.
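As a rough illustration of the unit composition above, the character producing unit 130 and the storyboard generating unit 120 can be sketched as a pair of cooperating classes. Every name below (the classes, fields, and `produce`/`generate` signatures) is an assumption made for illustration, not the patent's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class EmotionExpressingCharacter:
    facial_expression: str = "neutral"
    pose: str = "standing"

class CharacterProducingUnit:
    """Stands in for unit 130: produces an emotion-based character."""
    def produce(self, expression: str, pose: str) -> EmotionExpressingCharacter:
        return EmotionExpressingCharacter(expression, pose)

class StoryboardGeneratingUnit:
    """Stands in for unit 120: combines the character with subsidiary data."""
    def generate(self, character: EmotionExpressingCharacter,
                 background: str, script: str) -> dict:
        return {"character": character, "background": background, "script": script}

producer = CharacterProducingUnit()
generator = StoryboardGeneratingUnit()
character = producer.produce("surprised", "arms_raised")
cut = generator.generate(character, background="street", script="What was that?!")
```

The point of the sketch is only the data flow: the producing unit yields a character, which the generating unit combines with a background and a script into one storyboard cut.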

As an example, the user may select a character, a background, clothes, props, and the like via the user interface, and may generate the storyboard by arranging the selected items in a creation window of the storyboard using drag and drop.

The emotion-expressing character producing unit 130 may produce a detailed facial expression, pose, and the like, by enabling the user to select a facial expression, a pose, and the like, stored in advance in a database.

That is, the user may compose the storyboard by extracting the character stored in the database as is, and also may compose the storyboard by changing the stored character and producing a desired emotion-expressing character. In this instance, the user may change and store a color of the clothes, and also may direct a facial expression and a pose of the character as the user desires.

Accordingly, the emotion-expressing character producing unit 130 may include an emotional facial expression producing unit 131 and an emotional pose producing unit 132.

In this instance, the emotional facial expression producing unit 131 is for selecting a precise facial expression and generating an emotional facial expression as the user desires. The emotional facial expression producing unit 131 may read emotional facial expression data of the character stored in advance in the emotional facial expression DB 140 and may produce a desired facial expression by manipulating eyebrows, eyes, a nose, a mouth, and the like of the character based on the read data.

As an example, to produce a surprised look, raising the position of the eyebrows, making the mouth small, lengthening the lower jaw, and taking the edges of the eyebrows down may be performed, thereby producing an appropriate facial expression. The emotional facial expression producing unit 131 will be described in detail with reference to FIG. 4.

The emotional pose producing unit 132 is for closely selecting a pose of the character and generating an emotional pose as the user desires. The emotional pose producing unit 132 may read emotional pose data of the character stored in advance in the emotional pose DB 141 and may produce a desired pose by manipulating hands, feet, a head, and the like of the character based on the read data. In this instance, the pose of the character may be variously directed in a variety of angles based on three dimensions (3D).
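The surprised-look recipe described above (eyebrows raised, mouth made small, lower jaw lengthened, eyebrow edges taken down) can be sketched as offsets applied to a set of facial parameters. The parameter names and value ranges are illustrative assumptions, not values from the patent.

```python
def make_surprised(face: dict) -> dict:
    """Apply the adjustments described for a surprised look.
    Parameter names and magnitudes are illustrative only."""
    out = dict(face)
    out["eyebrow_height"] = face["eyebrow_height"] + 0.4   # raise the eyebrows
    out["mouth_width"]    = face["mouth_width"] * 0.5      # make the mouth small
    out["jaw_drop"]       = face["jaw_drop"] + 0.3         # lengthen the lower jaw
    out["eyebrow_edge"]   = face["eyebrow_edge"] - 0.2     # take eyebrow edges down
    return out

neutral = {"eyebrow_height": 0.0, "mouth_width": 1.0,
           "jaw_drop": 0.0, "eyebrow_edge": 0.0}
surprised = make_surprised(neutral)
```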

The storage unit 150 may store layout data with respect to an emotion-expressing character and camera setup data set by the user, and may store subsidiary image data including at least one of background data, clothing data, and prop data. That is, the storage unit 150 stores the subsidiary image data, such as the background data, clothing data, prop data, and the like, required for generating the storyboard, and enables the user to extract and use the data when the user produces the storyboard. Also, the storage unit 150 stores the layout data of the emotion-expressing character generated according to facial expression or pose setup information input by the user, and stores camera setup data by receiving a coordinate value for reproducing the facial expression or the pose and a setup value with respect to an angle input by the user.

In this instance, the storage unit 150 may support image formats such as bmp, gif, jpeg, png, psd, tiff, and the like, and may enable the selected image to be arranged. Also, the storage unit 150 may provide a user-defined expanding function in a panel, so that the user may directly add a background image and use it in the storyboard. Also, the storage unit 150 may store and load each cut or a project of the storyboard.
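The layout and camera setup data held by the storage unit 150 might look like the following records; the field names are assumptions chosen to mirror the description (a coordinate value for reproducing the expression or pose, plus an angle setup value).

```python
from dataclasses import dataclass

@dataclass
class CameraSetup:
    position: tuple      # coordinate value for reproducing the expression/pose
    angle_deg: float     # setup value with respect to an angle
    aspect: str = "16:9"

@dataclass
class LayoutRecord:
    character_id: str
    layout_image: str    # e.g. a png/jpeg path in one of the supported formats
    camera: CameraSetup

record = LayoutRecord("hero", "cut01_layout.png",
                      CameraSetup((1.0, 1.5, -3.0), 30.0))
```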

The storyboard generating unit 120 generates the storyboard by combining the emotion-expressing character data extracted by the user and the subsidiary image data, such as the background, props, and the like. In this instance, the storyboard generating unit 120 may include a script input unit 121 that receives script data, input by the user, to be included in the storyboard. That is, the user may generate the storyboard by compounding the extracted emotion-expressing character data and the subsidiary image data, together with the script data. Also, the storyboard generating unit 120 may decode, into a 3D object, data selected by the user, such as a facial expression, a motion of a character, a background, an indication of switching a camera, clothes, props, and the like. Accordingly, the data which is decoded into the 3D object may be arranged in the storyboard window and may be produced as a 3D screen.

When the user desires to convert and store the completed storyboard, the file converting unit 160 may convert the storyboard into various file formats. In this instance, the converted file may be stored in the file storage unit 170. That is, the storyboard may be converted into various file formats such as ASCII, HTML, XML, and the like, and the user may also convert the data into a file format that is directly readable from the web.
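A minimal sketch of converting one storyboard cut into the text formats the description names (plain text, HTML, XML); a real converter would of course carry far more data, and the dictionary layout here is an assumption.

```python
from xml.sax.saxutils import escape

def convert(cut: dict, fmt: str) -> str:
    """Serialize one storyboard cut into a chosen text format."""
    if fmt == "ascii":
        return "\n".join(f"{k}: {v}" for k, v in cut.items())
    if fmt == "html":
        return "<div class='cut'>" + "".join(
            f"<p>{escape(k)}: {escape(str(v))}</p>" for k, v in cut.items()
        ) + "</div>"
    if fmt == "xml":
        return "<cut>" + "".join(
            f"<{k}>{escape(str(v))}</{k}>" for k, v in cut.items()
        ) + "</cut>"
    raise ValueError(f"unsupported format: {fmt}")

cut = {"scene": "1", "script": "Hello"}
```

The XML and HTML variants escape the values so script text containing `<` or `&` stays well-formed.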

The cartoon producing unit 180 performs cartoon-rendering to output an image like a cartoon, when the user desires to print out the completed storyboard. In this instance, the cartoon-rendering process converts the storyboard data generated by the storyboard generating unit 120 into a form most similar to a real storyboard, to output the storyboard data as an image like a cartoon. That is, when the user desires to print out the storyboard data, the cartoon producing unit 180 may perform cartoon-rendering of the storyboard data to make it most similar to handwritten storyboard data, and outputs the rendered data via the output unit 190, thereby reducing confusion of the user caused by a difference from a conventional storyboard. In this instance, a page layout function may be provided, which provides a preview before printing out the completed storyboard.
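The patent does not specify its cartoon-rendering algorithm, but a common basis for a cartoon-like look is toon shading, which quantizes continuous shading into a few flat bands. The following is a minimal grayscale sketch of that idea, not the unit 180 implementation.

```python
def toon_shade(intensity: float, bands: int = 4) -> float:
    """Quantize a shading intensity in [0, 1] into flat bands,
    the basic step behind a cartoon-like rendered look."""
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must be in [0, 1]")
    band = min(int(intensity * bands), bands - 1)
    return band / (bands - 1)

# Smoothly lit values collapse into a handful of flat tones:
shades = [toon_shade(i / 10) for i in range(11)]
```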

FIG. 2 illustrates a flowchart of a method for generating a digital storyboard according to an example of the present invention.

In operation S210, a user may produce and store an emotion-expressing character. That is, the user may compose the storyboard using the character stored in advance in a database without change, or may compose the storyboard by setting a facial expression, a motion, and the like to express the emotion of the character according to the intention of the user.

A process of operation S210 will be described in detail with reference to FIG. 3 later.

In operation S220, the user may select a character and a subsidiary image such as a background, props, clothes, and the like from the database. That is, to compose the storyboard, the user may select either character data stored in advance in the database or character data set by the user in operation S210, and may select the subsidiary image such as the background, the props, the clothes, and the like.

In operation S230, the user may arrange the selected character and subsidiary image in a creation window of the storyboard as the user intends. In this instance, the user may arrange the character and the subsidiary image at a desired position by merely using drag and drop. Also, the character and the subsidiary image may be decoded into a 3D object, and thus a screen may be composed in a 3D format.

In operation S240, to compose the storyboard to have a script, the digital storyboard may receive script data input by the user and insert the script data into the storyboard. That is, the 3D images appropriately arranged by the user according to a scenario may be re-arranged by combining them with the script data input by the user via a script input unit.

In operation S250, layout data of the character and camera setup data are stored. That is, the digital storyboard may store the layout data of the emotion-expressing character generated according to a facial expression or pose setup information inputted by the user, and may store the camera setup data by receiving a coordinate value for reproducing the facial expression or the pose and a setup value with respect to an angle inputted by the user.

In this instance, various camera presets, such as 4:3 and 16:9, and a change of image resolution according to the application medium are supported, thereby supporting various functions directly related to an animation or a movie.

In operation S260, when the storyboard is generated, whether to print out the generated storyboard or to store it may be determined.
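The camera presets mentioned for operation S250 (4:3, 16:9) amount to deriving an image resolution from an aspect ratio; an illustrative helper, with the function name and preset table assumed:

```python
# Map an aspect-ratio preset name to its width:height ratio.
PRESETS = {"4:3": (4, 3), "16:9": (16, 9)}

def resolution_for(preset: str, width: int) -> tuple:
    """Derive an image resolution for a given preset and target width."""
    w_ratio, h_ratio = PRESETS[preset]
    return (width, width * h_ratio // w_ratio)
```

For example, `resolution_for("16:9", 1920)` yields a full-HD frame, while `resolution_for("4:3", 640)` yields a VGA-sized one.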

In this instance, when the generated storyboard is determined to be printed out, cartoon-rendering may be performed to print out an image like a cartoon in operation S270. The cartoon-rendering is a process of rendering the storyboard as a cartoon image most similar to real handwritten storyboard data, when the user desires to print out the storyboard data.

Accordingly, when the cartoon-rendering is completed in operation S270, the cartoon-rendered image is printed out in operation S280.

Also, when the generated storyboard is determined to be stored, the storyboard data may be converted into various file formats in operation S290. That is, the storyboard data may be converted into a file format such as ASCII, HTML, XML, and the like, and thus, if necessary, the user may convert the file to read the storyboard on the web.

FIG. 3 illustrates a flowchart of a method for producing and storing an emotion-expressing character according to an example of the present invention.

In operation S211, a user may read a character stored in advance from a database, and may set a facial expression using a facial manipulation box. The facial manipulation box may separately manipulate eyebrows, a nose, a mouth, and the like of a face of the character, and may deftly express emotion by minutely manipulating each part. The facial manipulation box will be described in detail with reference to FIG. 4.

In operation S212, the user may set a pose using a pose manipulation box. The pose manipulation box may set details of the pose by separately controlling arms, legs, a head, and the like of the character, thereby accurately expressing the emotion.

In operation S213, the manipulated emotional facial expression and emotional pose are respectively stored in an emotional facial expression DB and an emotional pose DB. Accordingly, the user may extract the previously produced pose whenever it is needed, and may compose a storyboard using the extracted pose.

FIG. 4 illustrates an emotional facial expression producing unit according to an example of the present invention.

First, the emotional facial expression producing unit is mainly constituted by a basic facial expression button unit 410, a facial manipulation box 420, and a preview screen 430.

The basic facial expression button unit 410 stores emotions in advance as a single setting value, such as disappointed, frightened, kissing, sleeping, surprised, happy, angry, annoyed, talking, and the like, and thus, the user may produce a desired facial expression by merely selecting each button of the basic facial expression button unit 410.
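Storing each basic emotion "as a single setting value" behind a button can be modeled as a lookup of a parameter bundle; the bundles and parameter names below are illustrative assumptions, not values from the patent.

```python
# Hypothetical single-setting bundles, one per basic expression button.
BASIC_EXPRESSIONS = {
    "surprised": {"eyebrow_height": 0.4,  "mouth_open": 0.6},
    "happy":     {"eyebrow_height": 0.1,  "mouth_open": 0.3},
    "angry":     {"eyebrow_height": -0.3, "mouth_open": 0.2},
}

def apply_button(face: dict, emotion: str) -> dict:
    """Selecting a button applies the stored setting in one step."""
    out = dict(face)
    out.update(BASIC_EXPRESSIONS[emotion])
    return out

surprised_face = apply_button({"eyebrow_height": 0.0, "mouth_open": 0.0,
                               "gaze": 0.5}, "surprised")
```

Parameters the bundle does not mention (here, `gaze`) are left untouched, so a button press only overrides the stored details.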

The facial manipulation box 420 is an interface for separately controlling details of a facial expression. As illustrated in FIG. 4, the facial manipulation box 420 may provide an interface for setting eyebrows and eyelids, a gaze, a mouth, a head, and the like. Also, the facial manipulation box may be constituted to additionally control lips, an inclination of the face, and the like, besides the above-described details. In this instance, the facial manipulation box may set each detail precisely through levels such as up, mid, down, and the like, thereby producing a more detailed and deft facial expression and well expressing the emotion that the user desires.

The preview screen 430 displays a change of a facial expression in real time while the user controls each detail of the face, thereby enabling the user to easily recognize the change of each of the details. As an example, when producing a surprised look, the degree of surprise may be variously set depending on the situation, such as a situation in which a character is greatly surprised, or a situation in which a character slightly shows a surprised look even while trying to hide it. In this instance, the position of the eyebrows, the mouth (the size of an open mouth), and the like may be minutely controlled by controlling the facial manipulation box 420 and checking the result on the preview screen 430.
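Setting the degree of surprise anywhere between a hidden and a fully surprised look, as described, can be modeled as linear interpolation between a neutral and a target parameter set. The parameter names and values are assumptions for illustration.

```python
def blend(neutral: dict, target: dict, degree: float) -> dict:
    """Interpolate facial parameters so the degree of an emotion can be
    set anywhere between 'hiding it' (0.0) and 'fully shown' (1.0)."""
    return {k: neutral[k] + degree * (target[k] - neutral[k]) for k in neutral}

neutral   = {"eyebrow_height": 0.0, "mouth_open": 0.0}
surprised = {"eyebrow_height": 0.4, "mouth_open": 0.6}
slightly  = blend(neutral, surprised, 0.25)   # a barely-shown surprised look
```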

The storyboard generating method according to the exemplary embodiments of the present invention may be implemented using computer-readable media including program instructions to implement various operations embodied by a computer. The media may also include, alone or in combination with the program instructions, data files, data structures, tables, and the like. The media and program instructions may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM) and random access memory (RAM). Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher-level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described embodiments of the present invention, or vice versa.

Although a few embodiments of the present invention have been shown and described, the present invention is not limited to the described embodiments. Instead, it would be appreciated by those skilled in the art that changes may be made to these embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.