


Title:
PROVISION OF USER INTERFACE BASED ON GROUP ATTRIBUTE INFORMATION
Document Type and Number:
WIPO Patent Application WO/2022/055547
Kind Code:
A1
Abstract:
An example image forming apparatus includes a user interface (UI) device to provide a UI, a memory to store a UI table, the UI table comprising a plurality of records of personal attribute information, a plurality of records of group attribute information, and a plurality of records of UI setting information for setting a presentation of the UI, the records of the UI setting information respectively corresponding to the records of the personal attribute information and the records of the group attribute information, and a processor to identify a first record in the records of the personal attribute information and a second record in the records of the group attribute information as being associated with a user, identify a third record and a fourth record in the records of the UI setting information as respectively corresponding to the first record and the second record, and control the UI device to provide a presentation of the UI for the user based on the third record and the fourth record.

Inventors:
YOON JIHYUN (KR)
BAE JUNGNAM (KR)
LEE CHULGEE (KR)
OH SEUNGHEE (KR)
Application Number:
PCT/US2021/018978
Publication Date:
March 17, 2022
Filing Date:
February 22, 2021
Assignee:
HEWLETT PACKARD DEVELOPMENT CO (US)
International Classes:
G06F3/12; G06F13/00
Foreign References:
JP2015118517A2015-06-25
US8275876B22012-09-25
KR101487357B12015-01-29
JP4559828B22010-10-13
Attorney, Agent or Firm:
KIM, Minsun et al. (US)
Claims:

WHAT IS CLAIMED IS:

1. An image forming apparatus, comprising: a user interface (UI) device to provide a UI; a memory to store a UI table, the UI table comprising a plurality of records of personal attribute information, a plurality of records of group attribute information, and a plurality of records of UI setting information for setting a presentation of the UI, the records of the UI setting information respectively corresponding to the records of the personal attribute information and the records of the group attribute information; and a processor to: identify a first record in the records of the personal attribute information and a second record in the records of the group attribute information as being associated with a user; identify a third record and a fourth record in the records of the UI setting information as respectively corresponding to the first record and the second record; and control the UI device to provide a presentation of the UI for the user based on the third record and the fourth record.

2. The image forming apparatus of claim 1, wherein the personal attribute information comprises an age, a visual acuity, a color vision deficiency (CVD), an auditory acuity, a preferred language, or an indication of impairment.

3. The image forming apparatus of claim 1, wherein the UI setting information comprises a size of a font for use in the presentation of the UI, a type of the font, an indication of whether to activate a voice assistant service in the presentation, an indication of whether to activate an easy-to-use mode of the UI in the presentation, or an indication of whether to activate a function of color adjustment in the presentation.

4. The image forming apparatus of claim 1, wherein the group attribute information comprises an organization department of the user, a job position of the user, a type of task assigned to the user, an amount of time spent on the task, or an indication of a status of the user for the task.

5. The image forming apparatus of claim 1, wherein the UI setting information comprises which menu for image forming jobs including scan, photocopy, and print is to be provided in the presentation by priority, which menu is optional for the presentation, whether to provide an option in the presentation to verify a total number of to-be-printed copies before printing, whether scanning with optical character recognition (OCR) processing is allowed or not, whether printing of multiple pages per sheet is allowed or not, whether printing with a modification to an original text is allowed or not, whether grayscale or color printing is to be used, or a dots-per-inch (DPI) resolution in which scanning is to be performed.

6. The image forming apparatus of claim 1, wherein the first record and the second record respectively have a first item and a second item to each of which a respective priority is assigned and a same item of the UI setting information corresponds to, and wherein the processor is to: if the third record and the fourth record have different values for the same item, control the UI device based on which of the first item and the second item is assigned a higher priority.

7. The image forming apparatus of claim 1, wherein the processor is to: identify a first additional record in the records of the group attribute information as being associated with the user; identify a second additional record in the records of the UI setting information to correspond to the first additional record; and combine a pair of the third record and the fourth record with another pair of the third record and the second additional record to control the UI device based on the combination.

8. The image forming apparatus of claim 1, wherein the UI device is to: if the third record and the fourth record have different values for a same item of the UI setting information, provide respective presentations of the UI; and receive a user input to select one of the respective presentations of the UI.

9. The image forming apparatus of claim 1, wherein the UI device is to receive update information regarding the UI table, and wherein the processor is to update the UI table based on the update information.

10. The image forming apparatus of claim 1, further comprising a communications unit to communicate with an external server, wherein the UI device is to receive user identification (ID) information regarding the user, and wherein the processor is to: control the communications unit to deliver the user ID information to the external server such that the external server uses the user ID information to provide a first instance of the personal attribute information and a second instance of the group attribute information; and receive the first instance and the second instance from the external server to identify the first record and the second record.

11. An electronic device, comprising: a user interface (UI) device to provide a UI; a memory to store a UI table, the UI table comprising a plurality of records of personal attribute information, a plurality of records of group attribute information, and a plurality of records of UI setting information for setting a presentation of the UI, the records of the UI setting information respectively corresponding to the records of the personal attribute information and the records of the group attribute information; and a processor to: identify a first record in the records of the personal attribute information and a second record in the records of the group attribute information as being associated with a user; identify a third record and a fourth record in the records of the UI setting information as respectively corresponding to the first record and the second record; and control the UI device to provide a presentation of the UI for the user based on the third record and the fourth record.

12. The electronic device of claim 11, wherein the personal attribute information comprises an age, a visual acuity, a color vision deficiency (CVD), an auditory acuity, a preferred language, or an indication of impairment.

13. The electronic device of claim 11, wherein the UI setting information comprises a size of a font for use in the presentation of the UI, a type of the font, an indication of whether to activate a voice assistant service in the presentation, an indication of whether to activate an easy-to-use mode of the UI in the presentation, or an indication of whether to activate a function of color adjustment in the presentation.

14. The electronic device of claim 11, wherein the group attribute information comprises an organization department of the user, a job position of the user, a type of task assigned to the user, an amount of time spent on the task, or an indication of a status of the user for the task.

15. A computer readable storage medium storing a computer program that comprises instructions that, when executed by a processor, cause the processor to: identify a first record in a plurality of records of personal attribute information and a second record in a plurality of records of group attribute information as being associated with a user, the records of the personal attribute information and the records of the group attribute information being comprised in a user interface (UI) table, the UI table further comprising a plurality of records of UI setting information for setting a presentation of a UI to be provided by a UI device, the records of the UI setting information respectively corresponding to the records of the personal attribute information and the records of the group attribute information; identify a third record and a fourth record in the records of the UI setting information as respectively corresponding to the first record and the second record; and control the UI device to provide a presentation of the UI for the user based on the third record and the fourth record.

Description:
PROVISION OF USER INTERFACE BASED ON GROUP ATTRIBUTE INFORMATION

BACKGROUND

[0001] There exist different types of image forming apparatuses, including printers, scanners, photocopiers, facsimile machines, etc., as well as multifunction products (MFPs) that may provide a combination of functions, e.g., print, copy, scan, and fax functions.

[0002] An image forming apparatus may be used in an organization such as a company, a corporation, and the like, and in a household as well. An image forming apparatus may provide a user interface (UI), through which a user can interact with the image forming apparatus to perform any of various activities such as conducting his/her workplace activities.

BRIEF DESCRIPTION OF THE DRAWINGS

[0003] FIG. 1 schematically depicts an example of an image forming apparatus that may provide a user interface (UI) having identical or different UI settings for different users.

[0004] FIG. 2 is a schematic illustration of attribute information and items of user identification (ID) information according to an example.

[0005] FIG. 3 schematically depicts a sequence of interactions between the image forming apparatus of FIG. 1 and a user terminal according to an example.

[0006] FIG. 4 schematically depicts a sequence of interactions between the image forming apparatus of FIG. 1 and a server according to an example.

[0007] FIG. 5 is a block diagram illustrating an example of the image forming apparatus of FIG. 1.

[0008] FIG. 6 is a conceptual diagram illustrating examples of items of information recorded in a UI table.

[0009] FIG. 7 is a conceptual diagram illustrating examples of values of information recorded in a UI table.

[0010] FIG. 8 is a conceptual diagram illustrating an example of a UI provision model.

[0011] FIG. 9 is a schematic illustration of a scenario in which an image forming apparatus may use pre-configured UI settings according to an example.

[0012] FIG. 10 is a schematic illustration of a scenario in which an image forming apparatus may configure combined UI settings according to an example.

[0013] FIG. 11 is a flow diagram illustrating a method of image forming according to an example.

[0014] FIG. 12 is a block diagram of an electronic device according to an example.

[0015] FIG. 13 is a block diagram of a computer readable storage medium according to an example.

DETAILED DESCRIPTION

[0016] The terms used in the following examples are general terms that are currently widely used, selected in consideration of their functions herein; their meanings may vary according to the intention of a person skilled in the art, precedent cases, or the emergence of new technologies. In some cases, terms may be construed as set forth in the following examples. Accordingly, the terms used herein are to be defined based on their meanings in the context of the various examples, rather than simply by their names.

[0017] The terms “comprising,” “including,” “having,” “containing,” etc. are used herein when specifying the presence of the elements listed thereafter. Unless otherwise indicated, these terms and variations thereof are not meant to exclude the presence or addition of other elements.

[0018] As used herein, the ordinal terms “first,” “second,” and so forth are meant to identify several similar elements. Unless otherwise specified, such terms are not intended to impose limitations, e.g., a particular order of these elements or of their use, but rather are used merely for referring to multiple elements separately. For instance, an element may be referred to in an example with the term “first” while the same element may be referred to in another example with a different ordinal number such as “second” or “third.” In such examples, the ordinal terms are not to limit the scope of the present disclosure. Also, the use of the term “and/or” in a list of multiple elements is inclusive of all possible combinations of the listed items, including any one or a plurality of the items.

[0019] The term “image forming job” as used herein may encompass any of a variety of image-related jobs, such as a print job, a scan job, a photocopy job, a facsimile transmission job, and the like, that involve an operation of forming an image and/or other processing operation, e.g., creation, generation and/or transfer of an image file. Furthermore, an image forming job performed by an image forming apparatus may comprise various jobs related to printing, photocopying, scanning, faxing, storing, transmitting, coating, etc.

[0020] The term “image forming apparatus” as used herein may encompass any of a variety of apparatuses, such as a printer, a scanner, a photocopier, a facsimile machine, a multi-function product (MFP), a display device, and the like, that are capable of performing an image forming job. In some examples, an image forming apparatus may be a two-dimensional (2D) or three- dimensional (3D) image forming apparatus.

[0021] The term “user” as used herein may refer to a person who manipulates an image forming apparatus to perform an image forming job. Further, the term “administrator” as used herein may refer to a person who has access to the entire functionality of an image forming apparatus. In some examples, one person may have both the roles of an administrator and a user.

[0022] Certain examples of the present disclosure will now be described with reference to the accompanying drawings. The various examples may, however, be implemented in many different forms and should not be construed as limited to the examples set forth herein. Rather, these examples are given in order to provide a better understanding of the scope of the present disclosure.

[0023] FIG. 1 schematically depicts an example of an image forming apparatus that may provide a user interface (UI) having identical or different UI settings for different users.

[0024] Referring to FIG. 1, an image forming apparatus 100 may retain, for a UI, UI setting information as a record that is identifiable in association with a user. For example, a record included in the UI setting information may contain a value of a UI setting for controlling a presentation of the UI, including a setting pertaining to the UI and/or an operation or component of the image forming apparatus 100. The setting may include a value having a logical type such as Boolean, a numeric type such as an integer, a text type such as a string, or any other suitable data type. In response to identifying setting values associated with a user, the image forming apparatus 100 may coordinate the settings for the user to carry out an image forming job accordingly. In the processing of the image forming job, the image forming apparatus 100 may control, based on the coordinated settings, a hardware and/or a software component thereof to, e.g., provide a visual, audio, and/or other sensory presentation of the UI to the user.

[0025] As an example, the image forming apparatus 100 may use the UI setting information to perform a user-specific operation related to an image forming job. For example, different users may be provided with identical or different presentations of the UI, depending on which record of the UI setting information is identified as being associated with each of the users. In the example of FIG. 1, the image forming apparatus 100 may obtain, for each of users A, B, C, D, and E, an associated instance UI#1, UI#2, or UI#3 of the UI setting information. In this and other examples, an instance of the UI setting information is a set of data identified, extracted, fetched, retrieved, constructed, and/or otherwise processed, in association with a specific user, from one or more records of the UI setting information. The three different reference characters “UI#1,” “UI#2,” and “UI#3” indicate differences in their underlying data. Thus, the user E may be presented with a UI in accordance with the information associated with the reference character “UI#3,” that is, a UI in a different form from those for the remaining users A, B, C, and D. In addition, based on the same data designated by “UI#1,” the image forming apparatus 100 may serve the users A and B with identical presentations of the UI. These presentations may also be different from the presentations for the users C and D, for both of whom the image forming apparatus 100 utilizes the same data indicated by the reference designator “UI#2.”

[0026] In various examples, the UI setting information may include an item that is applicable, upon provision of the UI, to the image forming apparatus 100. For example, the UI setting information may include a size of a font for use in a presentation of the UI, a type of the font, whether or not to activate a voice assistant service in the presentation, whether or not to activate an easy-to-use mode of the UI in the presentation, whether or not to activate a function of color adjustment in the presentation, which of image forming job menus (e.g., scan, photocopy, print, etc. menus) is to be provided in the presentation by priority, which of the menus is optional for the presentation, whether or not an option is to be provided in the presentation to verify a total number of to-be-printed copies before printing, whether scanning with optical character recognition (OCR) processing is allowed or not, whether printing of multiple pages per sheet is allowed or not, whether printing with a modification to an original text is allowed or not, whether grayscale or color printing is to be used, a dots-per-inch (DPI) resolution in which scanning is to be performed, or the like.
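The record structure described above can be sketched in a few lines. This is a hypothetical illustration only: the record IDs, field names, and merge order are assumptions, not details taken from the application.

```python
# Records of personal and group attribute information, keyed by record ID
# (hypothetical data for users such as A-E in FIG. 1).
PERSONAL_ATTRIBUTES = {
    "P1": {"age_group": "50s", "visual_acuity": "low"},
    "P2": {"age_group": "20s", "visual_acuity": "normal"},
}
GROUP_ATTRIBUTES = {
    "G1": {"department": "Financial Department", "task_type": {"Bulk Copy"}},
}

# UI setting records corresponding one-to-one to the attribute records above.
UI_SETTINGS = {
    "P1": {"font_size": 18, "voice_assistant": True},
    "P2": {"font_size": 12, "voice_assistant": False},
    "G1": {"priority_menu": "photocopy", "color_printing": False},
}

def resolve_presentation(personal_id, group_id):
    """Combine the UI setting records (the 'third' and 'fourth' records)
    that correspond to a user's personal and group attribute records."""
    presentation = {}
    presentation.update(UI_SETTINGS[personal_id])  # third record
    presentation.update(UI_SETTINGS[group_id])     # fourth record
    return presentation

print(resolve_presentation("P1", "G1"))
```

In this sketch a simple dictionary merge stands in for the combination step; the application itself contemplates richer behavior, such as priority-based resolution when the two records conflict (claim 6).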

[0027] In an example, when two users of the image forming apparatus 100 exhibit different attributes, the users may be provided with their respective different presentations of the UI of the image forming apparatus 100. Otherwise, these users may have in common some features of their respective UI presentations. In an example, this may facilitate sharing of the same UI settings, e.g., those settings for a “most suitable” presentation of the UI, between users who have identical, well-defined attributes.

[0028] In an example, the image forming apparatus 100 may retain attribute information as a record that is identifiable in association with a user such as a user A, B, C, D, or E. In an example, the attribute information may be predefined based on a collection of records of information regarding attributes of real and/or fictional people, e.g., the users A, B, C, D, and E, who may be viewed as individuals as well as members of a group such as a corporate organization. As an example, a record of attribute information may contain a value of a person’s attribute that is available, as quantitative data, to the image forming apparatus 100. In an example, the attribute may have a value of a logical, numeric, text, or any other suitable data type.

[0029] Further, the image forming apparatus 100 may identify which of the records of the attribute information is associated with a user to identify which of the records of the UI setting information corresponds to the identified record of the attribute information. Based on the identified record of the UI setting information, the image forming apparatus 100 may determine how to alter the settings of the UI. For example, the image forming apparatus 100 may determine what presentation of the UI to provide for the user. In an example, for each of the users A and B, an identical instance of the attribute information associated with a user may be obtained, which, however, may be different from that for, e.g., the user C. In this and other examples, an instance of the attribute information is a set of data identified, extracted, fetched, retrieved, constructed, and/or otherwise processed, in association with a specific user, from one or more records of the attribute information.

[0030] FIG. 2 is a schematic illustration of attribute information and items of user identification (ID) information according to an example.

[0031] Referring to FIG. 2, an example is illustrated including the fields ID Code, Name, Age, Department, and Task Type. For example, the ID Code and the Name fields are indicative of values of items of the user ID information. Such ID information may be delivered to the image forming apparatus 100 from a user’s terminal. The remaining fields may be populated with values of items of the attribute information. By way of example, an item of the attribute information may be classified as a single-value attribute, if two or more values of its data type are prevented from being concurrently assigned to the item, or otherwise as a set-value attribute. In the example of FIG. 2, while the Age field is of the single-value type, the Department and the Task Type fields are of the set-value type. As shown, the Task Type field has a set of the two text values, “Bulk Copy” and “Name Change,” concatenated by the sign “+” as indicated. The Department field is filled in with a single value, i.e., the text-type value “Financial Department,” although it is also susceptible to multiple text-type entries.
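The single-value versus set-value distinction in FIG. 2 can be illustrated with a short parsing sketch. The “+” separator follows the example above; the field names and function are otherwise hypothetical.

```python
# Fields treated as set-value attributes in the FIG. 2 example (assumed).
SET_VALUE_FIELDS = {"Department", "Task Type"}

def parse_attribute(field, raw):
    """Return a single value for single-value attributes, or a set of
    values split on the '+' concatenation sign for set-value attributes."""
    if field in SET_VALUE_FIELDS:
        return {part.strip() for part in raw.split("+")}
    return raw.strip()

print(parse_attribute("Age", "25"))
print(parse_attribute("Task Type", "Bulk Copy+Name Change"))
print(parse_attribute("Department", "Financial Department"))
```

Note that a set-value field holding one entry, like the Department field above, still parses to a (one-element) set, matching the observation that it remains susceptible to multiple entries.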

[0032] To further demonstrate examples of how the attribute information could be organized, it may be assumed that the attribute information includes two example types of information, i.e., personal attribute information and group attribute information. Personal attribute information may be defined to refer to an attribute of an individual. For example, personal attribute information may include a user’s age, visual acuity, color vision deficiency (CVD), auditory acuity, preferred language, disability, or the like. These items pertain to physical or, at least by definition, non-multi-value characteristics, and may thus be treated as single-value attributes in this and other examples. Further, group attribute information may represent an attribute of members of a group, such as a corporation, business, or other organizational entity. For example, group attribute information may include an organization department, a job position, a type of task assigned, an amount of time spent on the assigned task, an indication of user status (e.g., whether the user is a beginner or experienced for the task), or the like. In this and other examples, the above-listed items may be referred to as set-value attributes.

[0033] In some examples, an item of the attribute information is categorical in a sense that the item is classified by the type of value it carries. For example, an item of the attribute information may hold a value selected from a set of enumerated numeric-based values (i.e., a numeric-based-categorized value), a value of a yes-or-no answer or choice (i.e., a YES/NO value), or a value representative of a category into which a text-type value is mapped based on a certain criterion (i.e., a text-based-categorized value). Examples of types of attribute values are further discussed below.

[0034] In an example, an item of attribute information, e.g., age, visual acuity value, auditory acuity value, time spent on an assigned task, etc., may have a numeric-based-categorized value, with each value mapped to a group delimited by a boundary value. As an example, the value entry for the age item is a data type for describing a person’s age with boundary values of ten, twenty, thirty, forty, and fifty years old. In this example, if a user’s age is twenty-five, thirty, forty-one, or fifty-nine, the item value is indicative of the user being in his/her twenties, thirties, forties, or fifties or older, respectively. In another example, the visual acuity value item holds a value representing that a decimal visual acuity score of a person’s left or right eye, or both of his/her eyes on average, is greater than or equal to 2.0, greater than or equal to 1.0 and lower than 2.0, greater than or equal to 0.1 and lower than 1.0, or lower than 0.1.
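The boundary-value mapping described above amounts to placing an exact value into a bucket. A minimal sketch using the age boundaries from the example (the category labels and function name are assumptions):

```python
import bisect

# Boundary values from the example: ten, twenty, thirty, forty, fifty.
AGE_BOUNDARIES = [10, 20, 30, 40, 50]
AGE_CATEGORIES = ["under 10", "10s", "20s", "30s", "40s", "50s or older"]

def categorize_age(age):
    """Map an exact age to the group delimited by the boundary values.
    bisect_right puts a boundary age (e.g., 30) into the higher group."""
    return AGE_CATEGORIES[bisect.bisect_right(AGE_BOUNDARIES, age)]

print(categorize_age(25))  # '20s'
print(categorize_age(59))  # '50s or older'
```

The visual acuity categorization (boundaries 0.1, 1.0, 2.0) could be implemented the same way with float boundaries and acuity-range labels.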

[0035] An example item of the attribute information, e.g., indication of CVD, indication of disability, indication of user status, etc., may have a YES/NO value. By way of example, the indication of CVD item may have a value of YES or NO to indicate color blindness.

[0036] An example item of the attribute information, e.g., organization department, job position, type of assigned task, etc., may hold a text-based-categorized value. Examples of text-based-categorized values may be obtained by categorizing text-type values based on the equivalence therebetween. The criterion for equivalence may be established by, e.g., an administrator of the image forming apparatus 100. For example, for the organizational department item, two users from the financial department may be categorized into the same category of “Financial Department,” while a user from the financial department and another user from the personnel department may be categorized into two different categories. In another example, however, different text values of the organizational department item may be mapped to the same category based on the equivalence between the two departments. For example, the text-type value “Administrative Department” may be mapped to the category of “Personnel Department.”
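An administrator-defined equivalence criterion of this kind can be sketched as a simple alias table; the table contents and function name below are hypothetical, with only the “Administrative Department” mapping taken from the example above.

```python
# Administrator-defined equivalences: raw text value -> category
# (hypothetical table; only the first entry follows the example in the text).
CATEGORY_ALIASES = {
    "Administrative Department": "Personnel Department",
}

def categorize_department(raw_text):
    """Map a raw department string to its text-based category, treating
    listed aliases as equivalent to their target category."""
    return CATEGORY_ALIASES.get(raw_text, raw_text)

print(categorize_department("Financial Department"))
print(categorize_department("Administrative Department"))
```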

[0037] Examples of approaches for identifying personal attribute information and group attribute information are described below.

[0038] In an example, the image forming apparatus 100 may obtain user ID information regarding a user from the user’s terminal. For example, upon detection of the presence of a terminal, the image forming apparatus 100 may receive therefrom user ID information. The image forming apparatus 100 may identify, from the records of the attribute information already retained therein, a corresponding record associated with the user (e.g., associated with the obtained user ID information). For example, in response to receiving the user ID information, the image forming apparatus 100 may utilize the ID information to identify a record of personal attribute information or a record of group attribute information as being associated with the user.

[0039] In an example, the image forming apparatus 100 may obtain an instance of the attribute information from an external device that retains at least some of the attribute information as a record and is capable of communicating and interacting with the image forming apparatus 100. For example, a user’s terminal may maintain the user’s associated record of the attribute information. Alternatively, or in addition, a server may retain, in any suitable data format, a record of the attribute information that is identifiable in association with the user.

[0040] FIG. 3 schematically depicts a sequence of interactions between the image forming apparatus of FIG. 1 and a user terminal according to an example.

[0041] Referring to FIG. 3, a terminal 210 may be a smartphone, a tablet, a wearable device, a radio frequency identification (RFID) tag, or the like. In an example, the terminal 210 may include a memory, a communications unit, and a control unit. The memory may include means for storing, e.g., user ID information that identifies a user 200, and corresponding personal attribute information and group attribute information. The communications unit may enable communications with the image forming apparatus 100. The control unit may control the communications unit to provide the information stored in the memory to the image forming apparatus 100.

[0042] In an example, the image forming apparatus 100 may obtain, from the terminal 210, an associated instance of the attribute information including, e.g., personal attribute information and group attribute information of the user 200. In an example, upon detecting the presence of the terminal 210 within a certain proximity of the image forming apparatus 100, the terminal 210 and the image forming apparatus 100 may recognize each other in a suitable manner, e.g., via a short range communication technology such as RFID, Bluetooth, near-field communication (NFC), etc. The terminal 210 may fetch the retained record of the attribute information, which is associated with the user 200, to deliver the resulting instance of the attribute information to the image forming apparatus 100. Based on the delivered instance of the attribute information, the image forming apparatus 100 may identify which of the records of the attribute information that are retained therein is associated with the user, to provide a presentation of the UI for the user 200.

[0043] FIG. 4 schematically depicts a sequence of interactions between the image forming apparatus of FIG. 1 and a server according to an example.

[0044] Referring to FIG. 4, the image forming apparatus 100 is in communication with a server 300, as well as with the terminal 210, to provide the UI for the user 200. In this example, in response to recognizing the image forming apparatus 100, the terminal 210 may provide the user ID information without the attribute information such as personal attribute information and group attribute information. The image forming apparatus 100 may provide the user ID information regarding the user 200 to the server 300, which may in turn perform authentication of the user 200 based on the user ID information. If the authentication is successful, the server 300 may retrieve, from its own records of attribute information, a corresponding record associated with the user 200 to provide the resulting instance of attribute information, e.g., an instance of personal attribute information and an instance of group attribute information, to the image forming apparatus 100. Based on the instance received from the server 300, the image forming apparatus 100 may identify which of the records of the attribute information that are retained therein is associated with the user 200.
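The server-side portion of this flow (authenticate the user ID, then return the attribute instances) can be sketched as follows. The class, method names, and credential scheme are assumptions for illustration only; FIG. 4 does not specify an authentication mechanism.

```python
class Server:
    """Hypothetical stand-in for the server 300: authenticates a user ID
    and returns the corresponding attribute instances."""

    def __init__(self, credentials, attributes):
        self._credentials = credentials   # user_id -> secret
        self._attributes = attributes     # user_id -> (personal, group)

    def authenticate(self, user_id, secret):
        return self._credentials.get(user_id) == secret

    def get_attributes(self, user_id, secret):
        """Return (personal, group) attribute instances for the user,
        or None if authentication fails."""
        if not self.authenticate(user_id, secret):
            return None
        return self._attributes[user_id]

server = Server(
    credentials={"u200": "s3cret"},
    attributes={"u200": ({"age_group": "20s"},
                         {"department": "Financial Department"})},
)
print(server.get_attributes("u200", "s3cret"))
print(server.get_attributes("u200", "wrong"))  # None
```

The image forming apparatus would then use the returned instances to identify the matching records in its own UI table, as in paragraph [0044].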

[0045] FIG. 5 is a block diagram illustrating an example of the image forming apparatus of FIG. 1.

[0046] Referring to FIG. 5, the image forming apparatus 100 may include a memory 110, a communications unit 120, a UI device 130, an image forming unit 140, and a processor 150. For example, the image forming apparatus 100 may also include an additional component, e.g., a power supply unit to supply power to the above-mentioned components. In another example, the image forming apparatus 100 may include some, but not all, of the components shown in FIG. 5. Other examples are also contemplated.

[0047] The memory 110 may include any computer-readable storage medium that stores data in a non-transitory form. Thus, the memory 110 may be, for example, random access memory (RAM), read-only memory (ROM), and/or any other type of storage medium. The memory 110 may have stored therein a variety of information, for example, a set of instructions that are to be executed by the processor 150, the above-mentioned attribute information including, e.g., personal attribute information and group attribute information, and/or other information. In an example, the memory 110 may have stored therein a UI table, and/or a UI provision model, examples of both of which will be discussed below. Further, the information stored in the memory 110 may be updated.

[0048] The communications unit 120 may communicate with, e.g., the terminal 210, the server 300, and so forth. For example, the communications unit 120 may include a reader to recognize a sticker, a barcode, or any other suitable type of tag, including, e.g., an NFC tag. In addition, or alternatively, the communications unit 120 may include a wired communications module and/or a wireless communications module. The wired communications module may support one or more of a local area network (LAN), Universal Serial Bus (USB), High Definition Multimedia Interface (HDMI), and any other suitable type of wired communication technology. The wireless communications module may support one or more of Wi-Fi, Wi-Fi Direct, Bluetooth, Ultra-Wide Band (UWB), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), Fifth Generation (5G), NFC, and any other suitable type of wireless communication technology. As such, the communications unit 120 may be used by the image forming apparatus 100 for, e.g., network communications with another entity. The communications unit 120 may be a transceiver.

[0049] The UI device 130 may include an input unit to receive a variety of user inputs, e.g., a user input to perform an image forming job, a user input of user ID information from a terminal such as the terminal 210 of the user 200, a user input of update information on a UI, and the like. Examples of an input unit include a keyboard, a keypad, a physical button, a touch pad, a touch screen, a camera, a microphone, a reader to recognize a sticker, a barcode, etc., and any other type of device that can receive a variety of forms of user input.

[0050] The UI device 130 may include an output unit to output or provide a presentation of a UI to a user during an image forming job and/or to output or provide information on, e.g., a result of the image forming job or a status of the image forming apparatus 100. Examples of the output unit include a display panel, a speaker, and any other type of device that can provide a variety of outputs. In an example, the output unit may generate a form of sensory output, e.g., a user-specific presentation of a UI, based on the UI setting information as discussed above. For example, for each of the users A and B of FIG. 1, a size of a font is set to 12-point for use with a presentation of the UI; for the UI presentation to the user A, a voice assistant service is activated, and for the UI presentation to the user B, the voice assistant service is deactivated. In this example, the output unit may provide the two users A and B with their respective different UI presentations, one with the voice assistant service and the other without it, although these presentations may feature the same-sized font.
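By way of illustration, the relationship between UI setting records and the resulting per-user presentations may be sketched as follows; the field names, values, and the `build_presentation` helper are assumptions for illustration only and do not appear in the drawings.

```python
# Hypothetical sketch: deriving an output presentation from a UI setting
# record; field names and values are illustrative assumptions.
def build_presentation(ui_settings):
    """Describe the presentation implied by one UI setting record."""
    return {
        "font_size_pt": ui_settings["font_size_pt"],
        "voice_assistant": "on" if ui_settings["voice_assistant"] else "off",
    }

settings_a = {"font_size_pt": 12, "voice_assistant": True}   # user A
settings_b = {"font_size_pt": 12, "voice_assistant": False}  # user B

# Same font size for both users, but only user A's presentation
# activates the voice assistant service.
pres_a = build_presentation(settings_a)
pres_b = build_presentation(settings_b)
```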

[0051] The image forming unit 140 may perform an image forming job including, e.g., printing, copying, scanning, and/or facsimile transmission. Depending on the UI setting information involved, as discussed earlier, the image forming unit 140 may operate in different manners or produce different results of the operations. In various examples, the image forming unit 140 may include a print unit 141, a scan unit 142, and a fax unit 143, as shown in FIG. 5. In an example, the image forming unit 140 may include a subset of the above-listed components or include additional components for processing of other image forming jobs.

[0052] The print unit 141 may include a printing mechanism to form an image on a recording medium such as a sheet of paper. Various examples of the printing mechanism include an electro-photographic mechanism, an inkjet mechanism, a thermal transfer mechanism, a direct thermal mechanism, and the like.

[0053] The scan unit 142 may irradiate light onto a document and receive the light reflected therefrom to read the imagery on the document. As an example, an image sensor such as a charge coupled device (CCD) type sensor, a contact-type image sensor (CIS), or any other suitable type of image sensor may be used therein for image reading from a document. In various examples, the scan unit 142 may have a flatbed structure in which an image sensor is to move to read an image from a document page placed fixedly on a specific location, a document feed structure in which document sheets are to be fed to allow a fixedly-positioned image sensor to read images therefrom, or a combination thereof.

[0054] For example, the fax unit 143 may include a component to scan an image and a component to print a received image file. These components may also be used by the scan unit 142 and the print unit 141 , respectively. Further, the fax unit 143 may transfer a scanned image file to a destination or receive an image file from an external source.

[0055] The processor 150 may execute an instruction stored in the memory 110. The processor 150 may also read other information stored in the memory 110. In addition, the processor 150 may store information in the memory 110 and may update information stored in the memory 110.

[0056] The processor 150 may control an operation of the image forming apparatus 100. For example, the processor 150 may control the communications unit 120 to deliver, to the server 300, the user ID information received from the terminal 210. In an example, the processor 150 may be implemented with a central processing unit (CPU) or other processing circuitry to perform example operations herein.

[0057] The processor 150 may determine which record of UI setting information is associated with a user. In an example, this determination may be made based on which record of the attribute information is associated with the user. In various examples, the processor 150 may identify a record of personal attribute information or group attribute information as being associated with the user by receiving an instance of attribute information from the terminal 210 of the user 200 through the communications unit 120, by receiving an instance of attribute information from the external server 300 through the communications unit 120, or by receiving user ID information from the terminal 210 and utilizing the received user ID information to retrieve the user-associated record from the records of attribute information that are stored in the memory 110.

[0058] In various examples, the processor 150 may perform the above-described determination using a UI table stored in the memory 110. In such examples, the term “UI table” refers to a table by which sets of records of the UI setting information respectively correspond to sets of records of attribute information, for example, one set of personal attribute information and one set of group attribute information. In an example, the UI table may be updated by the processor 150 in response to update information regarding the UI table being received from the UI device 130.
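As an illustration of such correspondences, a UI table might be held in memory along the lines of the following sketch; the record identifiers, field names, and values are hypothetical and are not taken from the figures.

```python
# Hypothetical in-memory sketch of a UI table: attribute records and the
# UI setting records that respectively correspond to them.
ui_table = {
    "personal": {
        "P1": {"visual_acuity": 0.5, "age": 35},
        "P2": {"visual_acuity": 1.2, "age": 59},
    },
    "group": {
        "G1": {"department": "Accounting", "position": "Staff"},
        "G2": {"department": "Design", "position": "Manager"},
    },
    # Each UI setting record is keyed by the attribute record it maps to.
    "ui_settings": {
        "P1": {"font_size_pt": 22, "voice_assistant": True},
        "P2": {"font_size_pt": 12, "voice_assistant": False},
        "G1": {"color_print": False, "ocr_scan": True},
        "G2": {"color_print": True, "ocr_scan": False},
    },
}

def settings_for(personal_id, group_id, table=ui_table):
    """Return the two UI setting records corresponding to a user's
    personal and group attribute records."""
    return table["ui_settings"][personal_id], table["ui_settings"][group_id]
```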

[0059] FIG. 6 is a conceptual diagram illustrating examples of items of information recorded in a UI table.

[0060] Referring to FIG. 6, an example UI table may retain an item of personal attribute information, such as visual acuity, age, auditory acuity, an indication of CVD, or the like. In an example, the UI table may also retain an item of group attribute information, such as organization department, job position, type of task assigned, time spent on the assigned task, a status indication of the user (e.g., whether a user is a beginner or experienced for the task), or the like. For each personal attribute information and group attribute information, the UI table may include individual records thereof that are associated with respective users, for example, previously collected from the users, and thus identifiable as being associated with the users.

[0061] In an example, a UI table may include records of the UI setting information, some of which are mapped to a set of records of personal attribute information, and others of which are mapped to another set of records of group attribute information. The example of FIG. 6 shows that for a given user, values of various items of the UI setting information may be predetermined based on the user’s visual acuity value, age, and/or auditory acuity value. For example, the size of a font for use in a presentation of the UI, the type of the font, whether or not to activate a voice assistant service in the presentation, and whether or not to activate an easy-to-use mode of the UI in the presentation may be predetermined based on the user’s visual acuity value, age, and/or auditory acuity value. Likewise, the value of whether or not to activate a function of color adjustment in the presentation and the value of whether or not to activate a voice assistant service in the presentation may be predetermined based on the user’s CVD status. In addition, values of various items of the UI setting information may be predetermined based on the user’s organization department, job position, type of assigned task, and/or time spent on the assigned task. For example, print settings status indicating whether grayscale or color printing is to be used, whether printing of multiple pages per sheet is allowed or not, whether bulk copy is allowed or not, whether name change is allowed or not, whether scanning with OCR processing is allowed or not, and a DPI resolution in which scanning is to be performed may be predetermined based on the user’s organization department, job position, type of assigned task, and/or time spent on the assigned task. Likewise, whether or not an option is to be provided in the presentation to verify a total number of to-be-printed copies before printing may be predetermined based on the user’s level of experience for the task.
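Such predetermination of UI setting values from personal attribute values may be sketched as a simple mapping function; the thresholds, field names, and the choice of 12-point versus 22-point fonts below are assumptions for illustration, not values taken from FIG. 6.

```python
# Illustrative mapping from a record of personal attribute information to
# UI setting values; thresholds and field names are assumptions.
def ui_settings_from_personal(attrs):
    settings = {}
    # Larger font and easy-to-use mode for low visual acuity or higher age.
    low_vision = attrs["visual_acuity"] < 1.0 or attrs["age"] >= 50
    settings["font_size_pt"] = 22 if low_vision else 12
    settings["easy_mode"] = low_vision
    # Voice assistant for reduced visual or auditory acuity.
    settings["voice_assistant"] = low_vision or attrs["hearing_loss"]
    # Color adjustment for a user with a CVD indication.
    settings["color_adjustment"] = attrs.get("cvd", False)
    return settings

young_user = {"visual_acuity": 1.2, "age": 35, "hearing_loss": False}
older_user = {"visual_acuity": 0.4, "age": 59, "hearing_loss": True}
```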

[0062] FIG. 7 is a conceptual diagram illustrating examples of values of information recorded in a UI table.

[0063] Referring to FIG. 7, some of the items of FIG. 6, including the items of age, organization department, font size, whether to activate a voice assistant service, grayscale or color printing, and scanning with or without OCR, may be assigned certain values. Each of the examples of values may be entered in a corresponding field of a record of the attribute information or the UI setting information. It is noted that in an example, the items of the attribute information may be ranked by priority. For example, the ranking may be pre-established (e.g., by an administrator of the image forming apparatus 100). In this example, if both higher-ranking and lower-ranking items of the attribute information are mapped to a same item of the UI setting information, the higher-ranking item of the attribute information is prioritized in determining the resulting value of the same item of the UI setting information. By way of example, it may be assumed that the visual acuity value item and the auditory acuity value item have higher ranking than the age item. Referring merely to FIG. 7, it could be understood that when the user A is thirty-five years old, the default font size is 22-point and the voice assistant service is deactivated by default. If, however, the user A has a visual acuity value lower than 1.0 on a decimal scale and an auditory acuity value higher than a certain hearing loss threshold, it may be determined by default that the font size is larger than 22-point and the voice assistant service is activated. Similarly, if the user B is fifty-nine years old with a decimal visual acuity value higher than 1.0 and an auditory acuity value lower than the hearing loss threshold, the default font size may be lower than 32-point and the voice assistant service may be deactivated by default.
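The priority ranking among attribute items described above may be sketched as follows; the numeric ranks, item names, and the assumption that the acuity items outrank the age item are illustrative only.

```python
# Sketch of priority among attribute items mapped to the same UI setting
# item; lower number = higher rank. Ranks are illustrative assumptions.
ITEM_PRIORITY = {"visual_acuity": 1, "auditory_acuity": 1, "age": 2}

def resolve(candidates):
    """candidates: (attribute_item, proposed_setting_value) pairs that all
    target the same UI setting item; the highest-ranked item's value wins."""
    return min(candidates, key=lambda c: ITEM_PRIORITY[c[0]])[1]

# The age item alone would suggest a 22-point font, but the low visual
# acuity value, which outranks age, yields a larger font:
font_size = resolve([("age", 22), ("visual_acuity", 28)])
```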

[0064] As such, in various examples, the UI table may be employed in which two different sets of records of the UI setting information are respectively mapped to different sets of records of the attribute information. By way of illustration and for ease of discussion, in the following example, the UI table may be described to include a first set of records of personal attribute information, a second set of records of group attribute information, and a third and a fourth set of records of the UI setting information, wherein the third and the fourth sets are respectively mapped to the first and the second sets. Thus, in response to a first record in the first set and a second record in the second set being identified to be associated with a user, e.g., the user A, B, C, D, or E, the processor 150 may identify a third record in the third set and a fourth record in the fourth set as respectively corresponding to the first and the second records. Based on the third and the fourth records, the processor 150 may control the image forming apparatus 100, including the UI device 130, to provide a presentation of the UI for the user.

[0065] For example, it may be assumed that the third set is a set of N records of UI setting information that respectively correspond to N records of personal attribute information, and that the fourth set is a set of M records of UI setting information that respectively correspond to M records of group attribute information, where N and M are natural numbers. The processor 150 may identify the third and fourth records from the third, N-record set, and the fourth, M-record set, respectively. Thus, in various examples, the user may be presented with one of N * M presentations of the UI, which are configured in accordance with N * M record pairings between the two sets.
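The N * M pairings may be sketched as a Cartesian product of the two sets; the records below are illustrative assumptions.

```python
# Sketch of the N * M record pairings: each candidate presentation pairs
# one of N personal-side setting records with one of M group-side records.
from itertools import product

personal_settings = [{"font_size_pt": 12}, {"font_size_pt": 22}]  # N = 2
group_settings = [{"color_print": True}, {"color_print": False},
                  {"ocr_scan": True}]                             # M = 3

# Each pairing merges one record from each set into one presentation.
pairings = [{**p, **g} for p, g in product(personal_settings, group_settings)]
```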

[0066] In an example, however, more than one UI presentation may be made available from a pair of the third and the fourth records. In an example, the two paired records may have different values for a same item of the UI setting information; for instance, for the font size item, one of the paired records has a value of 10-point and the other has a value of 20-point. In this example, the processor 150 may use the two different setting values to control the output unit of the UI device 130 to provide respective presentations of the UI for the user. Further, the UI device 130 may receive, from its input unit, a user input indicative of a selected one of the respective presentations of the UI. Thus, in response to receiving the user input, the UI device 130 may provide the selected presentation of the UI while suspending the other one of the respective presentations. In addition, or alternatively, in response to receiving the user input, the UI device 130 may determine to provide the selected presentation of the UI for later use of the image forming apparatus 100 by the user.

[0067] Instead of such a user-selectable UI presentation, in an example, if the third and the fourth records have different values for the same item of the UI setting information, the processor 150 may control the UI device 130 in accordance with a certain criterion of priority between personal and group attribute items. In an example, the first and the second records may respectively have a first and a second item, to each of which a respective priority is assigned and to each of which the same item of the UI setting information is mapped. In this example, the processor 150 may control the UI device 130 based on which of the first and the second item is assigned a higher priority. For instance, if the priority of the first item is higher than that of the second item, the UI device 130 may provide a presentation of the UI using the value represented by the third record, rather than the value represented by the fourth record, for the same mapped item of the UI setting information, and vice versa.
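Such priority-based resolution of a same-item conflict between the personal-side and group-side records may be sketched as follows; the field names and the priority flag are illustrative assumptions.

```python
# Sketch of priority-based conflict resolution between the personal-side
# (third) and group-side (fourth) UI setting records.
def pick_value(item, third_record, fourth_record, first_item_outranks_second):
    """Resolve a same-item conflict using the priority assigned to the
    attribute items from which the two records derive."""
    if third_record[item] == fourth_record[item]:
        return third_record[item]
    return third_record[item] if first_item_outranks_second else fourth_record[item]

# Here the personal attribute item is assumed to outrank the group
# attribute item, so the 20-point value from the third record wins:
font_size = pick_value("font_size_pt",
                       {"font_size_pt": 20}, {"font_size_pt": 10}, True)
```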

[0068] In an example, the processor 150 may employ a UI provision model, rather than the above-described UI table, to determine the third and fourth records of the UI setting information. The UI provision model may be stored in the memory 110 and may be a pre-trained deep learning or other machine learning model.

[0069] FIG. 8 is a conceptual diagram illustrating an example of a UI provision model.

[0070] Referring to FIG. 8, an input dataset and a label dataset may be taken for the pre-training of a UI provision model 122. The input dataset contains a plurality of pieces of attribute information for input to the un-trained model. The label dataset contains a plurality of labels, or correct answers, i.e., labelled pieces of the UI setting information, that are to be compared with output results of a certain operation of the model upon the input in the pre-training phase. For a suitable presentation of the UI, the model may be trained by repeatedly updating its internal parameters, e.g., weights, based on the comparison of differences, or errors, between the labels and the output results.
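The pre-training loop described above may be sketched with a toy stand-in for the model; the single linear weight, learning rate, and datasets below are illustrative assumptions, not the actual UI provision model 122.

```python
# Toy stand-in for the pre-training loop: the "model" is a single linear
# weight mapping a numeric attribute (e.g., age) to a numeric UI setting
# (e.g., font size), updated from the error against the labels.
inputs = [20.0, 40.0, 60.0]   # attribute values (illustrative input dataset)
labels = [10.0, 20.0, 30.0]   # labelled UI setting values (illustrative)

w = 0.0                        # internal parameter ("weight")
lr = 0.0005                    # learning rate
for _ in range(2000):
    for x, y in zip(inputs, labels):
        error = w * x - y      # difference between output result and label
        w -= lr * error * x    # update the weight based on the error
```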

[0071] FIG. 9 is a schematic illustration of a scenario in which an image forming apparatus may use pre-configured UI settings according to an example. FIG. 10 is a schematic illustration of a scenario in which an image forming apparatus may configure combined UI settings according to an example.

[0072] Referring to FIG. 9, it is seen that some examples of records of group attribute information and some examples of records of UI setting information are identifiable from the UI table in association with the users A, B, C, D, and E. Associated records of personal attribute information are not illustrated in FIG. 9 for the sake of brevity. In the example of FIG. 9, an example of a new user, user F, is using the image forming apparatus 100. In a similar manner to those described above, e.g., with respect to FIGS. 3 and 4, a record in the first set of records of personal attribute information and a record in the second set of records of group attribute information may be identified as being associated with the user F. The identified records of the attribute information are those identifiable as being associated with the users A and B, including the associated record of group attribute information, as indicated by the reference designator “Group-1” throughout FIG. 9. Based on a pair of records of UI setting information that are identifiable as respectively matching the identified records of attribute information, the processor 150 may control the image forming apparatus 100, including the UI device 130, to provide the same presentation of the UI for the users A, B, and F, as indicated by “UI#1” throughout FIG. 9.

[0073] In an example of a new user, user G, the processor 150 may control the UI device 130 in a similar manner, as indicated by the reference designators “Group-2” and “UI#2” in FIG. 9.

[0074] Referring to FIG. 10, the examples of records of FIG. 9 are shown again in association with each of the users A, B, C, D, and E. In this regard, it is noted that in an example, the group attribute information is allowed to have a set-valued item while the personal attribute information is prevented from having any set-valued item. Thus, in the example of FIG. 10, various records may be identified as being associated with a new user H. For example, a record in the first set of records of personal attribute information, and two records in the second set of records of group attribute information, as indicated in FIG. 10 by the concatenation of “Group-1” and “Group-2” with “+” inserted therebetween, may be identified as being associated with the new user H.

[0075] In such an example, the processor 150 may identify two pairs of records of UI setting information. For example, the processor 150 may identify a pair of records that respectively correspond to the identified record in the first set and one of the identified records in the second set and another pair of records that respectively correspond to the identified record in the first set and the other one of the identified records in the second set. As indicated in FIG. 10 by the concatenation of “UI#1” and “UI#2” with “+” inserted therebetween, the processor 150 may combine the two pairs UI#1 and UI#2 to control the image forming apparatus 100, including the UI device 130, to provide a combined presentation of the UI for the user H. In an example, this combination may be retained as a new record of the UI setting information.
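The combining of two sets of UI settings into one presentation record may be sketched as a merge with a pluggable conflict resolver; the field names and the particular resolver below are illustrative assumptions.

```python
# Sketch of combining two pairs of UI settings (e.g., "UI#1 + UI#2") into
# one presentation record; where both define the same item with different
# values, a resolver decides which value survives.
def combine(ui1, ui2, resolver):
    combined = dict(ui1)
    for item, value in ui2.items():
        if item in combined and combined[item] != value:
            combined[item] = resolver(item, combined[item], value)
        else:
            combined[item] = value
    return combined

ui_1 = {"color_print": True, "ocr_scan": False}
ui_2 = {"ocr_scan": True, "bulk_copy": True}
# This illustrative resolver simply keeps the first pair's value:
merged = combine(ui_1, ui_2, lambda item, a, b: a)
```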

[0076] Further, for each item of the UI setting information, different values may or may not be concurrently used in the presentation of the UI. The processor 150 may perform processing on that item using priorities that are pre-assigned to all possible values thereof. For example, the UI setting information may include an item indicating which one of the print, scan, and facsimile transmission menus is to be provided in a presentation of the UI based on priority. In this example, the item may hold the value of “Print,” “Scan,” or “Facsimile Transmission,” with each value assigned a respective priority. Based on the value represented by each of the pairs, UI#1 and UI#2, and also possibly on its priority, the processor 150 may control the image forming apparatus 100, including the UI device 130, to perform processing for combined UI settings. For example, if the two pairs have different values for that item, the processor 150 may have the item presented with a distinctive feature corresponding to the more highly prioritized value, for example, higher or more to the left on a screen of the UI device 130 or with a larger-sized or thicker font on the screen, and/or may prevent the item from featuring the lower-prioritized value on the screen.
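The pre-assigned priorities among possible values of a single item may be sketched as follows; the particular ordering of the three menu values is an assumption for illustration.

```python
# Sketch of pre-assigned priorities among the possible values of one UI
# setting item; lower number = more highly prioritized (assumed ordering).
VALUE_PRIORITY = {"Print": 1, "Scan": 2, "Facsimile Transmission": 3}

def order_menu_values(values):
    """Order conflicting values so the more highly prioritized one can be
    featured first (e.g., higher or more to the left on the screen)."""
    return sorted(set(values), key=VALUE_PRIORITY.get)

# The two pairs hold "Scan" and "Print"; "Print" is featured first:
menu_order = order_menu_values(["Scan", "Print"])
```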

[0077] FIG. 11 is a flow diagram illustrating a method of image forming according to an example.

[0078] The example method of FIG. 11 may be performed by the image forming apparatus 100, for example, the processor 150. This flow diagram is not intended to indicate that the operations of the example method are to be executed in any particular order, or that all of the operations of FIG. 11 are to be included in every case. Further, any number of additional operations not shown may be included within the example method.

[0079] Referring to FIG. 11, a record of personal attribute information and a record of group attribute information associated with a user are obtained at operation S100. Various examples of records of personal attribute information and group attribute information may be included in a user interface (UI) table. The UI table may further include records of UI setting information regarding the UI to be provided by the UI device 130. The records of UI setting information may be respectively mapped to the records of personal attribute information and the records of group attribute information.

[0080] At operation S110, the records of UI setting information corresponding to the user-associated record of personal attribute information and that of group attribute information are identified in the UI table.

[0081] At operation S120, the UI device 130 is controlled based on the identified records of UI setting information.
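The sequence of operations S100 through S120 may be sketched end to end as follows; the table layout, field names, and the representation of "controlling the UI device" as a returned settings record are hypothetical.

```python
# End-to-end sketch of the example method; all identifiers are illustrative.
def provide_ui(user_id, table):
    # S100: obtain the user-associated records of personal and group
    # attribute information.
    personal_id = table["user_personal"][user_id]
    group_id = table["user_group"][user_id]
    # S110: identify the corresponding records of UI setting information.
    third = table["ui_settings"][personal_id]
    fourth = table["ui_settings"][group_id]
    # S120: control the UI device based on the identified records
    # (represented here by returning the merged presentation settings).
    return {**fourth, **third}

table = {
    "user_personal": {"F": "P1"},
    "user_group": {"F": "G1"},
    "ui_settings": {"P1": {"font_size_pt": 22}, "G1": {"color_print": False}},
}
presentation = provide_ui("F", table)
```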

[0082] For further details on the example method, reference may be made to the above description of the image forming apparatus 100.

[0083] FIG. 12 is a block diagram of an electronic device according to an example.

[0084] The example of an electronic device may include, implement, or be included within the image forming apparatus 100. Referring to FIG. 12, an electronic device 1100 includes a memory 1110, a communications unit 1120, a UI device 1130, and a processor 1150, which may have the same configurations and perform the same functions as those of the memory 110, the communications unit 120, the UI device 130, and the processor 150 of FIG. 5, respectively. The electronic device 1100 may be implemented without functions such as those of the image forming unit 140 or may be implemented using additional components to perform other electronic functions. For further details on the example electronic device 1100, reference may be made to the above description of the image forming apparatus 100.

[0085] FIG. 13 is a block diagram of a computer readable storage medium according to an example.

[0086] When executed by a processor, a computer program stored in the computer readable storage medium 1200 may implement a method of image forming, for example, the method of FIG. 11. As shown in FIG. 13, the computer program may include a set of program instructions S200, S210, and S230 that direct the processor to perform example operations.

[0087] Referring to FIG. 13, the instructions S200 may be to obtain a record of personal attribute information and a record of group attribute information associated with a user. Various examples of records of personal attribute information and group attribute information may be included in a user interface (UI) table. The UI table may further include records of UI setting information regarding the UI to be provided by the UI device 130. The records of UI setting information may be respectively mapped to the records of personal attribute information and the records of group attribute information.

[0088] The instructions S210 may be to identify, in the records of UI setting information, a third record and a fourth record as respectively corresponding to the user-associated record of personal attribute information and the user-associated record of group attribute information.

[0089] The instructions S230 may be to control the UI device 130 based on the third and fourth records.

[0090] For further details on the examples of operations, reference may be made to the above description of the image forming apparatus 100.

[0091] The computer readable storage medium 1200 may be a non-transitory readable medium. The term “non-transitory readable medium” as used herein refers to a medium that is capable of semi-permanently storing data and is readable by an apparatus, rather than a medium, e.g., a register, a cache, a volatile memory device, etc., that temporarily stores data. For example, the foregoing program instructions may be stored and provided in a CD, a DVD, a hard disk, a Blu-ray disc, a USB, a memory card, a ROM device, or any of other types of non-transitory readable media.

[0092] In an example, the methodology disclosed herein may be incorporated into a computer program product. The computer program product may be available as a product for trading between a seller and a buyer. The computer program product may be distributed in the form of a machine-readable storage medium, e.g., compact disc read only memory (CD-ROM), or distributed online through an application store, e.g., PlayStore™. For online distribution, at least a portion of the computer program product may be temporarily stored, or temporarily created, in a storage medium such as a server of the manufacturer, a server of the application store, or a storage medium such as a memory of a relay server.

[0093] The foregoing description has been presented to illustrate and describe various examples. It should be appreciated by those skilled in the art that many modifications and variations are possible in light of the above teaching. In various examples, suitable results may be achieved if the above-described techniques are performed in a different order, and/or if some of the components of the above-described systems, architectures, devices, circuits, and the like are coupled or combined in a different manner, or substituted for or replaced by other components or equivalents thereof.

[0094] Therefore, the scope of the disclosure is not to be limited to the precise form disclosed, but rather defined by the following claims and equivalents thereof.