


Title:
SYSTEM AND METHOD FOR AUTOMATED GRADING
Document Type and Number:
WIPO Patent Application WO/2021/205378
Kind Code:
A1
Abstract:
A method for automated grading configures, by a processor 122 of an application server 102, a question paper with a plurality of sections and a corresponding answer key through an assessor; receives a plurality of answer sheets, each having a plurality of sections and tagged to a learner; authenticates the assessor for assessing the plurality of answer sheets; and thereby creates a schema of the plurality of sections of the plurality of answer sheets. The method further generates a context based review report 350 of the plurality of sections by comparing elements of the schema with the answer key using a self-learning assessment model 242, to validate and calculate a respective score of the learner based on the validated report 350 for the plurality of sections of the plurality of answer sheets.

Inventors:
GANESAN SWAMINATHAN (IN)
Application Number:
PCT/IB2021/052921
Publication Date:
October 14, 2021
Filing Date:
April 08, 2021
Assignee:
SMARTAIL PRIVATE LTD (IN)
International Classes:
G09B7/00
Foreign References:
US20090226872A1 (2009-09-10)
US20160063873A1 (2016-03-03)
US20050272024A1 (2005-12-08)
US20030180703A1 (2003-09-25)
US20190333401A1 (2019-10-31)
Attorney, Agent or Firm:
VARMA, B. Naveen Kumar (IN)
Claims:
We Claim:

1. A method for automated grading, the method comprising:
configuring, by a processor (122) of an application server (102), a question paper and a corresponding answer key by an assessor, wherein the question paper has a plurality of sections;
receiving, by the processor (122), a plurality of answer sheets, each having a plurality of sections, wherein each of the plurality of the answer sheets is tagged to a learner;
authenticating, by the processor (122), the assessor for assessing the plurality of the answer sheets;
creating, by the processor (122), a schema of the plurality of the sections of the plurality of the answer sheets;
generating, by the processor (122), a context based review report (350) of the plurality of the sections by comparing elements of the schema with the answer key, wherein the comparing is performed using a self-learning assessment model (242);
validating, by the processor (122), the context based review report (350) of the plurality of the answer sheets; and
calculating, by the processor (122), a respective score of the learner based on the validated context based review report (350) for the plurality of the sections of the plurality of the answer sheets.

2. The method as claimed in claim 1, wherein the method is executed using a micro service (208) configured on a cloud solution.

3. The method as claimed in claim 1, wherein receiving an answer sheet further includes having an OCR micro service (208) to identify and accept a handwritten answer sheet.

4. The method as claimed in claim 1, wherein the context based review report (350) further provides an inhibitor of a section of the answer sheet without an assessment.

5. The method as claimed in claim 1, wherein the self-learning assessment model (242) is executed by an AI Engine (162).

6. The method as claimed in claim 1, wherein creating the schema further comprises: tagging, by the processor (122), each answer sheet with a unique sheet identifier; segmenting, by the processor (122), the plurality of sections of an answer sheet of the plurality of the answer sheets; and tagging, by the processor (122), each of the segmented plurality of the sections of the answer sheet with a unique section identifier.

7. The method as claimed in claim 1, further comprising: segregating, by the processor (122), questions into different types of questions with a classification criterion, wherein the classification criterion uses a Bloom's taxonomy criterion.

8. The method as claimed in claim 1, further comprising: generating, by the processor (122), a dashboard (402) for reporting and analysing the assessment based on user access.

9. A system for automated grading, the system comprising:
a processor (122); and
a memory (126) coupled to the processor (122), wherein the processor (122) executes a plurality of modules (128) stored in the memory (126), and wherein the plurality of modules (128) comprising:
a configuration module (130), for configuring a question paper and a corresponding answer key by an assessor, wherein the question paper has a plurality of sections;
a scanning module (132), for receiving a plurality of answer sheets, each having a plurality of sections, wherein each of the plurality of the answer sheets is tagged to a learner;
an access module (142), for authenticating the assessor for assessing the plurality of the answer sheets;
a dissection module (138), for creating a schema of the plurality of the sections of the plurality of the answer sheets;
a grading module (140), for generating a context based review report (350) of the plurality of the sections by comparing elements of the schema with the answer key, wherein the comparing is performed using a self-learning assessment model (242), and validating the context based review report (350) of the plurality of the answer sheets and calculating a respective score of the learner based on the validated context based review report (350) for the plurality of the sections of the plurality of the answer sheets.

10. The system as claimed in claim 9, wherein the system uses a micro service (208) configured on a cloud solution.

11. The system as claimed in claim 9, wherein the scanning module (132) further includes having an OCR micro service (208) to identify and accept a handwritten answer sheet.

12. The system as claimed in claim 9, wherein the context based review report (350) further provides an inhibitor of a section of the answer sheet without an assessment.

13. The system as claimed in claim 9, wherein the self-learning assessment model (242) is executed by an AI Engine (162).

14. The system as claimed in claim 9, wherein the dissection module (138) further includes the schema for tagging each answer sheet with a unique sheet identifier and segmenting the plurality of sections of an answer sheet of the plurality of the answer sheets and further tagging each of the segmented plurality of the sections of the answer sheet with a unique section identifier.

15. The system as claimed in claim 9, further comprising: a classification module (136), for segregating questions into different types of questions with a classification criterion, wherein the classification criterion uses a Bloom's taxonomy criterion.

16. The system as claimed in claim 9, further comprising: a report module (144), for generating a dashboard (402) for reporting and analysing the assessment based on user access.

Description:
SYSTEM AND METHOD FOR AUTOMATED GRADING

FIELD OF THE INVENTION

The present disclosure generally relates to the education industry; more specifically, the present disclosure relates to a system and method for automated grading which assists reviewers in the education sector to efficiently analyse and grade learner submissions, allowing faster assessment of answer sheets, project reports and other submissions while maintaining expert-quality grading standards.

BACKGROUND OF THE INVENTION

Tests and projects have long been used for assessing a student's knowledge of a particular subject matter. The assessment methods were developed to ensure that each student understood or learnt the topics, subjects and their scope. A test or project would normally have questions or directions directed to analysing the understanding of a student pertaining to a subject matter in a theoretical manner, a practical manner, or both.

Generally, a test or examination may refer to an assessment intended to measure a student's knowledge, skill, aptitude, or classification on a particular topic or on a variety of topics. Moreover, a test may be administered verbally, on paper, on a computer, or in a predetermined area that requires a learner, test taker or student to demonstrate or perform a set of skills. Often, assessors utilize a specified grading system in order to analyse the performance of a student or test-taker.

An answer sheet, test attempt report or project report usually entails the answers or responses provided by the student, learner or test taker at the time of the assessment. Such answers may be graded by an assessor, a reviewer or a teacher to estimate the understanding of the student pertaining to a particular question or set of questions. Standardized tests assist reviewers in understanding the performance of students and whether a particular subject matter is being understood by them; further, they also assess the performance of an individual student compared to the rest of the students. In plain words, a test conventionally provides a benchmark to assess knowledge or understanding. Usually a reviewer would work on one particular answer sheet, test report or project report submitted by a student and grade all answers therein individually before proceeding to a second answer sheet, submitted by a second student. Such a process is repetitive, time-consuming and redundant, aside from being disorganized. It has also been shown that such a method may result in an inaccurate assessment due to unavoidable distractions, and may also be influenced by an unconscious bias due to external factors.

In particular, systems are available in the market to assist teachers in grading answer sheets; however, such systems are very primitive and do not offer any real benefit for accomplishing the task at hand. At most, such systems offer tips and tricks for grading through colour coding and time management. Further, the unorganized or partly organized education sector, and particularly grading, has limitations in assessing the knowledge of a student in a standard manner. In the competitive times of today, grades shape the future of an individual, since grades determine entry into colleges and courses and even eligibility for certain work profiles. This may further drastically change the opportunities offered to an individual for growth in a particular field. With an inappropriate assessment, most individuals may not be able to realize their respective roles and suitably take them up in society. Therefore, importance should be paid to improving the current grading system in order to ascertain genuine grades which are not skewed by any extraneous, involuntary factors.

Due to the segmented and largely unorganized education sector, there exists a need for a grading system which would address a variety of issues including, but not limited to: reducing the amount of time required at an assessor's end to grade an answer sheet, test report or project report; providing standardized grading for all answer sheets being graded; removing bias; and scaling up, while addressing the variety of issues associated with current solutions and their components.

SUMMARY OF THE INVENTION

This summary is provided to introduce concepts related to systems and methods for automated grading; the concepts are further described below in the detailed description. This summary is neither intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter. In one implementation, a method for automated grading is disclosed. The method comprises configuring, by a processor of an application server, a question paper and a corresponding answer key by an assessor, wherein the question paper has a plurality of sections, and receiving, by the processor, a plurality of answer sheets, each having a plurality of sections, wherein each of the plurality of the answer sheets is tagged to a learner. The method further comprises authenticating, by the processor, the assessor for assessing the plurality of the answer sheets, and creating, by the processor, a schema of the plurality of the sections of the plurality of the answer sheets. The method further comprises generating, by the processor, a context based review report of the plurality of the sections by comparing elements of the schema with the answer key, wherein the comparing is performed using a self-learning assessment model; validating, by the processor, the context based review report of the plurality of the answer sheets; and calculating, by the processor, a respective score of the learner based on the validated context based review report for the plurality of the sections of the plurality of the answer sheets.
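The claimed flow can be sketched in outline as follows. The disclosure provides no code, so everything here is an illustrative assumption: the data types, the schema representation, and the naive word-overlap scorer standing in for the self-learning assessment model (242).

```python
from dataclasses import dataclass

# Hypothetical data types; names are illustrative, not from the specification.
@dataclass
class AnswerSheet:
    learner_id: str       # the learner the sheet is tagged to
    sections: list        # raw text of each answered section

@dataclass
class ReviewReport:
    learner_id: str
    section_scores: list
    validated: bool = False

def grade_answer_sheets(answer_key, answer_sheets, score_fn):
    """Sketch of the claimed flow: build a per-section schema, compare each
    element against the answer key, validate, and compute per-section scores."""
    reports = []
    for sheet in answer_sheets:
        # The "schema": the sheet's sections paired with section indices.
        schema = list(enumerate(sheet.sections))
        # Context-based comparison; score_fn stands in for the model (242).
        section_scores = [score_fn(answer_key[i], text) for i, text in schema]
        reports.append(ReviewReport(sheet.learner_id, section_scores, validated=True))
    return reports

def naive_score(expected, given):
    """Placeholder similarity: fraction of expected words found in the answer."""
    expected_words = set(expected.lower().split())
    given_words = set(given.lower().split())
    return len(expected_words & given_words) / max(len(expected_words), 1)
```

In a real deployment the `score_fn` would be the trained assessment model and validation would involve assessor review rather than being set unconditionally.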

In yet another implementation, the method is executed using a micro service configured on a cloud solution.

In yet another implementation, receiving an answer sheet further includes using an OCR micro service to identify and accept a handwritten answer sheet.

In yet another implementation, the method has the context based review report that further provides an inhibitor of a section of the answer sheet without an assessment.

In yet another implementation, the method has the self-learning assessment model that is executed by an AI Engine.

In another implementation, the method has the schema that further comprises tagging, by the processor, each answer sheet with a unique sheet identifier, segmenting, by the processor, the plurality of sections of an answer sheet of the plurality of the answer sheets, tagging, by the processor, each of the segmented plurality of the sections of the answer sheet with a unique section identifier.
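The schema step above can be sketched as follows; this is a minimal illustration, assuming UUID-based identifiers and blank-line segmentation, neither of which is specified in the disclosure.

```python
import uuid

def build_schema(answer_sheets):
    """Sketch of the schema step: tag each sheet with a unique sheet
    identifier, segment it into sections, and tag each section with a
    unique section identifier. Splitting on blank lines is a stand-in;
    the specification leaves the segmentation mechanism open."""
    schema = []
    for sheet_text in answer_sheets:
        sheet_id = uuid.uuid4().hex
        sections = [s for s in sheet_text.split("\n\n") if s.strip()]
        schema.append({
            "sheet_id": sheet_id,
            "sections": [
                {"section_id": f"{sheet_id}-{n}", "text": text}
                for n, text in enumerate(sections, start=1)
            ],
        })
    return schema
```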

In another implementation, the method further comprises segregating, by the processor, questions into different types of questions with a classification criterion, wherein the classification criterion uses a Bloom's taxonomy criterion.

In another implementation, the method further comprises generating, by the processor, a dashboard for reporting and analysing the assessment based on user access.
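A Bloom's taxonomy classification criterion could, for instance, be approximated with a keyword lookup on the question stem. The keyword lists below are illustrative assumptions; the disclosure does not state how the criterion is evaluated.

```python
# Illustrative mapping of question stems to Bloom's taxonomy levels.
BLOOM_KEYWORDS = {
    "remember":   ("define", "list", "name", "recall"),
    "understand": ("explain", "summarize", "describe"),
    "apply":      ("solve", "demonstrate", "calculate"),
    "analyse":    ("compare", "contrast", "differentiate"),
    "evaluate":   ("justify", "critique", "assess"),
    "create":     ("design", "construct", "propose"),
}

def classify_question(question):
    """Return the first Bloom's level whose keywords appear in the question."""
    text = question.lower()
    for level, keywords in BLOOM_KEYWORDS.items():
        if any(word in text for word in keywords):
            return level
    return "unclassified"
```

A production system would more plausibly use the NLP engine mentioned later in the disclosure; the lookup table simply makes the classification criterion concrete.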

In one implementation, a system for automated grading is disclosed. The system comprises a processor, and a memory coupled to the processor, wherein the processor executes a plurality of modules stored in the memory, and wherein the plurality of modules comprises: a configuration module, for configuring a question paper and a corresponding answer key by an assessor, wherein the question paper has a plurality of sections; a scanning module, for receiving a plurality of answer sheets, each having a plurality of sections, wherein each of the plurality of the answer sheets is tagged to a learner; an access module, for authenticating the assessor for assessing the plurality of the answer sheets; a dissection module, for creating a schema of the plurality of the sections of the plurality of the answer sheets; and a grading module, for generating a context based review report of the plurality of the sections by comparing elements of the schema with the answer key, wherein the comparing is performed using a self-learning assessment model, and validating the context based review report of the plurality of the answer sheets and calculating a respective score of the learner based on the validated context based review report for the plurality of the sections of the plurality of the answer sheets.

In yet another implementation, the system uses a micro service that is configured on a cloud solution.

In yet another implementation, the system has the scanning module that further includes an OCR micro service to identify and accept a handwritten answer sheet.

In yet another implementation, the system has the context based review report that further provides an inhibitor of a section of the answer sheet without an assessment.

In yet another implementation, the system has the self-learning assessment model that is executed by an AI Engine.

In yet another implementation, the system has the dissection module that further includes the schema for tagging each answer sheet with a unique sheet identifier, segmenting the plurality of sections of an answer sheet of the plurality of the answer sheets, and further tagging each of the segmented plurality of the sections of the answer sheet with a unique section identifier.

In another implementation, the system further comprises a classification module, for segregating questions into different types of questions with a classification criterion, wherein the classification criterion uses a Bloom's taxonomy criterion.

In another implementation, the system further comprises a report module, for generating a dashboard for reporting and analysing the assessment based on user access.

It is the primary object of the subject matter to provide a system and method for automated grading that may be used by an assessor to consecutively view the answers written by all students to one particular question, in order to gauge the understanding of the entire class as well as of individual students. It may register a question paper or assignment, classify the entailed questions by type of question as well as by the chapter of the syllabus entailed, and perceive the grading criterion for every question. The system and method for automated grading may be used to scan multiple hand-written or digital answer sheets/project reports, convert the hand-written answer sheets into digital answer sheets, dissect answer sheets so as to be able to view one or more answers to one or more questions at once, register all digital answers, grade all answers on the basis of the predetermined grading criterion, and showcase the results.

It is another object of the subject matter to provide a system and method for automated grading that provides standardized assessment and may reduce the amount of work and time invested by reviewers in grading tests and assignments, providing them with more constructive time to work with the students.

It is another object of the subject matter to provide a system and method for automated grading that may provide additional analytical insights like the overall grades for every student, class, and batch, among others.

It is another object of the subject matter to provide a system and method for automated grading that may be customized as per institutional requirement or user’s desire, allowing a higher level of customization in the grading mechanism, allowing different components to integrate and interact with each other and create combinations and permutations without the need of changing any other detail in either of the parts of the automated grading system.

It is another object of the subject matter to provide a system and method for automated grading that may provide data-driven decision intelligence. The system may provide insights to a reviewer on what and which parts of a subject have been understood by learners or students, may have the capability to provide cross-subject feedback, and may identify issues in answer sheets, make corrections and provide reports for the teachers and the management to make better decisions. Further, it has a self-learning capacity to augment the reviewer's expertise and bring consistency to the whole system.

It is another object of the subject matter to provide a system and method for automated grading that brings grading accuracy, saves time and removes unconscious bias.

It is another object of the subject matter to provide a system and method for automated grading that may generate deep learning based reports and provide data traceability; further, it may scale up using artificial intelligence tools and cloud solutions.

It is another object of the subject matter to provide a system and method for automated grading that grades the answer sheet submissions of learners while reducing the time spent by reviewers, bringing consistency and efficiency, improving overall execution, and enhancing the reviewer's performance and productivity.

It is another object of the subject matter to provide a number of advantages depending on the particular aspect, embodiment, implementation and/or configuration.

It is another object of the subject matter to provide a platform that can provide reliable execution, scalability, and value-added services, while controlling operating effort and costs.

It is another object of the subject matter to efficiently manage numerous instances simultaneously, work under different regulatory requirements, and enable resources to collaborate and work together closely, efficiently and collectively with user-friendly interfaces.

These and other implementations, embodiments, processes and features of the subject matter will become more fully apparent when the following detailed description is read with the accompanying experimental details. However, both the foregoing summary of the subject matter and the following detailed description of it represent one potential implementation or embodiment and are not restrictive of the present disclosure or other alternate implementations or embodiments of the subject matter.

BRIEF DESCRIPTION OF THE DRAWINGS

A clear understanding of the key features of the subject matter summarized above may be had by reference to the appended drawings, which illustrate the method and system of the subject matter, although it will be understood that such drawings depict preferred embodiments of the subject matter and, therefore, are not to be considered as limiting its scope with regard to other embodiments which the subject matter is capable of contemplating. Accordingly:

FIGURE.1 illustrates a schematic module diagram depicting an exemplary automated grading system, in accordance with an embodiment of the present subject matter.

FIGURE.2 illustrates a system diagram describing the working of an exemplary automated grading system, in accordance with an embodiment of the present subject matter.

FIGURE.3.1 depicts an exemplary view of an automated grading method, in accordance with an embodiment of the present subject matter.

FIGURE.3.2 depicts another exemplary view of an automated grading method, in accordance with an embodiment of the present subject matter.

FIGURE.4 illustrates an exemplary assessment report automatically generated by an exemplary automated grading system, in accordance with an embodiment of the present subject matter.

FIGURE.5 illustrates an exemplary flowchart of a method of automated grading, in accordance with an embodiment of the present subject matter.

DETAILED DESCRIPTION OF THE INVENTION

The following is a detailed description of implementations of the present disclosure depicted in the accompanying drawings. The implementations are described in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the implementations; it is intended to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure. While aspects of the described systems and methods for automated grading can be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system(s). The present disclosure provides an automated grading system for examining learner or student submissions. The cloud based solution assists an assessor or teacher to consecutively view the answers written by all students or test takers to one particular question in order to gauge the understanding of the entire class as well as that of an individual student, thus migrating an existing manual exercise to a cloud based intelligent application that works efficiently by taking advantage of the scalability and elasticity of the cloud.
Further, the automated grading system, enabled by an artificial intelligence engine and supported by natural language processing (NLP) and data-driven intelligent tools, is able to register a question paper or assignment, classify the entailed questions by type of question as well as by the chapter of the syllabus entailed, perceive the grading criterion for every question, scan multiple hand-written or digital answer sheets/project reports, convert the hand-written answer sheets into digital answer sheets, dissect answer sheets so as to be able to view one or more answers to one or more questions at once, register all digital answers, automatically grade all answers on the basis of the predetermined grading criterion, and accept, validate, or subsequently manually modify and showcase the results. This leads to higher efficiency, time reduction and cost savings for the management of the whole educational and grading system. The present disclosure lists a few important steps to be performed in the application, although they may vary from application to application or be adapted from an existing application to work effectively on a cloud platform or PaaS.

The automated grading system may be used to grade answer sheets including but not limited to test reports, project reports, assignment submissions, and examination answers. The automated grading system may be customized based on the requirements or the user's application. Further, the automated grading system may be utilized for grading answer sheets of a theoretical or a practical nature, while assessing the understanding and remembering capabilities of a student or test taker. A component-wise structure and a step-wise method are described below.

FIGURE.1 illustrates a schematic module diagram 100 depicting an exemplary automated grading system, in accordance with an embodiment of the present subject matter.

In one implementation, an automated grading system 120 implements a method for automated grading on a server 102. The system 120 includes a processor(s) 122, interface(s) 124, an AI Engine 162, and a memory 126 coupled to, or in communication with, the processor(s) 122. The processor(s) 122 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on automated grading application instructions. Among other capabilities, the processor(s) 122 is configured to fetch and execute computer-readable instructions stored in the memory 126. The AI Engine 162 has various engines, connected and configured, including but not limited to a machine learning engine, a natural language processing (NLP) engine, a deep learning engine, a handwriting recognition engine, an HPA engine, a rules engine, and a pattern recognition engine. The various engines may work directly, may be connected or in communication with each other, or may be micro-services offered by a cloud network.

Although the present disclosure is explained by considering a scenario in which the system is implemented as an application on a server, the systems and methods can be implemented in a variety of computing systems. The computing systems that can implement the described method(s) include, but are not limited to, mainframe computers, workstations, personal computers, desktop computers, minicomputers, servers, multiprocessor systems, laptops, tablets, SCADA systems, smartphones, mobile computing devices, cloud networks, web services, web solutions, and the like.

The interface(s) 124 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, etc., allowing the system 120 to interact with a user. Further, the interface(s) 124 may enable the system 120 to communicate with other computing devices, such as web servers and external data servers (not shown in figure), as needed.

A network used for communicating between all elements of the application server 102 and a cloud environment 110 may be a wireless network, a wired network or a combination thereof. The network can be implemented as one of the different types of networks, such as an intranet, a local area network (LAN), a wide area network (WAN), the internet, and the like. The network may either be a dedicated network or a shared network. The network further has access to storage devices residing at a client site computer, at a host site server or computer, over the cloud, or a combination thereof. The storage has one or many local and remote computer storage media, including one or many memory storage devices, databases, and the like.

The memory 126 can include any computer-readable medium known in the art including, for example, volatile memory (e.g., RAM) and/or non-volatile memory (e.g., EPROM, flash memory, etc.), cloud-distributed storage, and the like. In one embodiment, the memory 126 includes module(s) 128 and system data 150. The modules 128 further include a configuration module 130, a scanning module 132, a deep matrix module 134, a classification module 136, a dissection module 138, a grading module 140, an access module 142, and other modules 144 including but not limited to an analytics module, a report module, a video & quiz module, an assignment module, and the like. It will be appreciated that any of such modules may be represented as a single module or a combination of different modules. Furthermore, the memory 126 further includes the system data 150 that serves, amongst other things, as a repository for storing data fetched, processed, received and generated by one or more of the modules 128. The system data 150 includes, for example, operational data, workflow data, and other data at a storage 152. The system data 150 has the storage 152, represented by 152a, 152b, ..., 152n, as the case may be. In one embodiment, the system data 150 is, or has access to, databases over a web or cloud network. The storage 152 includes multiple databases including but not limited to configuration module data, scanning module data, deep matrix module data, classification module data, dissection module data, grading module data, access module data, and other module data, as well as libraries, link identifiers, a database adapter, a dictionary, a database parser, an application parser, open source libraries, a rules library, and the like. It will be appreciated that such databases may be represented as a single database or a combination of different databases hosted locally, remotely or over the cloud.
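The arrangement of memory-resident modules executed by the processor can be sketched as a simple module registry. The module names follow FIG.1, but the class bodies are placeholders; the actual modules would carry the behaviour described in the following sections.

```python
# Minimal sketch of the module layout: a registry of memory-resident
# modules (128) that the processor (122) executes in turn against the
# system data (150). Bodies are illustrative placeholders.
class Module:
    def run(self, system_data):
        raise NotImplementedError

class ConfigurationModule(Module):   # module 130
    def run(self, system_data):
        system_data["configured"] = True
        return system_data

class ScanningModule(Module):        # module 132
    def run(self, system_data):
        system_data["scanned"] = True
        return system_data

MODULES = [ConfigurationModule(), ScanningModule()]

def execute_pipeline(system_data):
    """Processor 122 fetching and executing modules 128 in sequence."""
    for module in MODULES:
        system_data = module.run(system_data)
    return system_data
```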
In one embodiment data may be stored in the memory 126 in the form of data structures. Additionally, the aforementioned data can be organized using data models, such as relational or hierarchical data models.

The server 102 is further connected to, connectable to, or a part of a cloud environment 110 that has a plurality of computing systems, including but not limited to a server 104 and a server 106, and databases, including but not limited to a database 108, databases 152a, ...n, and the like. The computing systems and databases communicate with each other under cloud computing rules, and with the server 102 over the web, the cloud, or other available communication mediums/protocols. The computing systems of the servers 102, 104, 106 generally are distributed processing systems including multiple computing devices connected by and communicating over a network or cloud. Software applications may be run "in the cloud" by configuring them to execute across one or more of the computing devices in a particular cloud computing environment/system. The computing devices of a cloud computing system may each execute separate copies of the software application, or, in some cases, the operations of the software application may be split among different computing devices and executed in parallel. A cloud computing system may include a plurality of cloud computing instances representing resources available for executing applications. Each instance may be a physical computing device having particular capabilities (storage size, processing speed, network bandwidth, etc.), or may be a virtual computing device having particular capabilities. A particular cloud computing system may offer different instance types having different sets of capabilities for executing software applications.

In one implementation, at first, a user including but not limited to a learner or student, a reviewer or assessor, an application administrator, a software professional, a database administrator, an application developer, a quality analyst, and a test professional may use a user access device to access the system 120 via the interface 124 through the server 102. The working of the system 120 and the related method, associated modules, sub-modules, and methods is explained in detail using FIG.2, FIG.3.1, FIG.3.2, FIG.4, and FIG.5 below.

In one embodiment, the system 120 receives user instruction data through the interface 124 on the server 102. The modules 128 of the system 120 process the instructions using the processor 122 while using the system data 150, the other modules 144, and supporting components.

The configuration module 130 of the automated grading system 120 configures the automated grading application such that the system 120 may input question papers and grading criteria for the particular questions being asked (if the same are not fed into the system previously), as well as an answer key, in order to provide a preliminary grade for all the answer sheets as scanned. The grading key as configured may assist assessors or teachers in appropriately grading answer sheets on the basis of a determined standard. The configuration module 130 configures the questions and grading system for a particular institute, exam, subject or syllabus; the questions may be of multiple-choice or subjective type and may include maps, equations, formulae, diagrams, graphs, geometry or mathematical calculations.

The configuration module 130 configures a question paper, which an administrator or a teacher sets up with a few clicks via the interface 124. Once a question paper is configured, it is stored safely in a database of the storage 152. The same questions can be re-used by the teacher for future exams. Teachers can provide their requirements for a question paper to the application and generate a sample or random question paper with a single click.
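This configure-and-reuse flow can be sketched minimally in Python. All names here (`QUESTION_BANK`, `configure_question_paper`, `generate_sample_paper`, and the field layout) are illustrative assumptions, not terms from the disclosure; an in-memory dictionary stands in for the database of the storage 152:

```python
import random

# Hypothetical in-memory stand-in for a database of the storage 152.
QUESTION_BANK = {}

def configure_question_paper(paper_id, sections, answer_key):
    """Store a configured question paper so its questions can be re-used later."""
    paper = {"id": paper_id, "sections": sections, "answer_key": answer_key}
    QUESTION_BANK[paper_id] = paper
    return paper

def generate_sample_paper(question_type, count, seed=None):
    """Draw a random sample paper from all stored questions of a given type."""
    rng = random.Random(seed)
    pool = [q for paper in QUESTION_BANK.values()
            for sec in paper["sections"]
            for q in sec["questions"]
            if q["type"] == question_type]
    return rng.sample(pool, min(count, len(pool)))
```

A teacher's "single click" would then amount to one `generate_sample_paper` call against the previously stored bank.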

The scanning module 132 of the automated grading system 120 provides or feeds the scanned copies of the answer sheets. The scanning module 132 works with a document scanner or any digitizing device, such as a photo or image submission by a camera or smartphone, and the like, as well as a parser which may scan question papers and/or all of the answer sheets by learners or students and register the same in the storage unit of the storage 152. The scanning module 132 scans the question papers on the basis of sets, classes, and batches. The scanning module 132 also scans the answer sheets with respect to a question paper previously scanned and logged. These answer sheets may be digital answer sheets or hand-written answer sheets.

The classification module 136 of the automated grading system 120 utilizes a classification criterion, such as Bloom's taxonomy, in order to segregate questions into different types. Scanned answer sheets are further analysed by the classification module 136 on the basis of Bloom's taxonomy, or Bloom's classification. The question paper may also be classified on the basis of Bloom's taxonomy or Bloom's categorization by the classification module 136. Different parts of the question paper may be categorized by the percentage of remembering or understanding questions entailed therein. The marks allotted to the questions entailed in a question paper may be categorized on the basis of the chapters in the syllabus. This assists in analysing the weightage of marks being given to various chapters, modules or topics in the exam or test.

A Bloom analysis by the classification module 136 includes question-paper setting and analysis via the aspects of remembering, understanding, applying, analysing, evaluating, and creating. The analysis further creates a chapter-wise marks density and weightages. A Bloom density mapping is an outcome of this exercise that results in a graphical representation and a backing data map, leading to a marks distribution over a chapters axis against a Bloom-aspects axis (remembering, understanding, applying, analysing, evaluating, and creating). A Bloom density map is thus helpful for presenting the distribution of marks as well as the weightages. Different charts can be prepared for individual questions and for the whole test paper. The same may be shown to a user or used in an analytics module of the other modules 144.
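The density mapping described above amounts to aggregating allotted marks over a (chapter, Bloom aspect) grid. A minimal sketch, with hypothetical field names not taken from the disclosure:

```python
def bloom_density_map(questions):
    """Aggregate allotted marks per (chapter, Bloom aspect) pair,
    giving the data behind a Bloom density chart."""
    density = {}
    for q in questions:
        key = (q["chapter"], q["bloom"])
        density[key] = density.get(key, 0) + q["marks"]
    return density

def chapter_weightage(density):
    """Marks weightage per chapter, as a fraction of the whole test paper."""
    total = sum(density.values())
    per_chapter = {}
    for (chapter, _aspect), marks in density.items():
        per_chapter[chapter] = per_chapter.get(chapter, 0) + marks
    return {c: m / total for c, m in per_chapter.items()}
```

The returned dictionaries can feed any charting front end; the weightage values sum to 1 across chapters.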

In one embodiment, the configuration module 130 allows an administrator or management personnel to create a teacher list and a student list and to grant access rights, along with access details, to the access module 142. Teachers create questions and store them in a database of the storage 152; the questions are classified by Bloom's taxonomy by the classification module 136 and mapped to answers by an artificial intelligence engine such as the AI engine 162. In one embodiment, answer submission is done via smartphone, where students can submit their homework and assignments to the teacher through the scanning module 132 by a mobile application and/or a web application. The answers are written on a paper or in a book, and a picture is taken using the mobile application to submit the answers. Students can check their submissions. After correction, the teacher can review the score/performance report and publish the results to the student.

The dissection module 138 of the automated grading system 120 utilizes a dissection methodology to dissect answer sheets or submissions by learners in order to view one or more answers to one or more questions by one or more students at the same time, providing a cohesive view of the entire answer sheet data. The dissection module 138 creates a schema of the plurality of the sections of the plurality of the answer sheets. The assessors may be able to view hand-written answer sheets for one or more answers pertaining to one or more questions by one or more students at the same time utilizing the dissection module 138. The dissection module 138 provides neat margins and aligns the scanned hand-written answer sheets to optimize the view for an assessor. It can turn reverse-scanned pages and cut/crop the right sections for aligning and tagging to the right questions. This is one of the important features of the application, as hundreds of answer sheets may have tens or hundreds of sub-sections each, and the respective answers of each sub-section are visible to an assessor in multiple modes.

In one embodiment, an assessor can view all section A answers of all students together so as to grade only one section for all at higher efficiency, or can choose to view, from each answer sheet, an answer containing, for example, a log chart or a geographical graph, to see how different students have opted to answer the same question, and thus use this as a grading factor.
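The schema creation and the side-by-side section view can be sketched as follows. The identifier scheme and field names are assumptions for illustration only; any unique identifier generator would serve:

```python
import uuid

def create_schema(answer_sheets):
    """Tag each answer sheet with a unique sheet identifier and each
    segmented section with a unique section identifier."""
    schema = []
    for sheet in answer_sheets:
        sheet_id = uuid.uuid4().hex
        sections = [{"section_id": f"{sheet_id}:{name}",
                     "name": name,
                     "answer": text}
                    for name, text in sheet["sections"].items()]
        schema.append({"sheet_id": sheet_id,
                       "learner": sheet["learner"],
                       "sections": sections})
    return schema

def answers_for_section(schema, section_name):
    """Collect one section's answers across all learners, so an assessor
    can grade the whole section for every student side by side."""
    return [(entry["learner"], sec["answer"])
            for entry in schema
            for sec in entry["sections"]
            if sec["name"] == section_name]
```

Grading "all section A answers together" then reduces to iterating over `answers_for_section(schema, "A")`.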

The grading module 140 of the automated grading system 120 grades the answer sheet data forwarded to it, providing a preliminary grade to every answer on the basis of a predetermined criterion in the form of a context review report 350. This grade may be accepted or rejected by an assessor. The predetermined criterion for grading an answer may include but not be limited to Bloom's categorization of questions into a remembering question or an understanding question, the subject matter of the entailed chapters to which the question belongs, the configured information standardizing an ideal answer fed in by a teacher or assessor while configuring the automated grading system 120 through the configuration module 130, and the number of marks allotted to that particular question. The assessor may be able to change the grade or reject a preliminary grade assigned by the system 120, for example to include extra marks for good handwriting or other such factors, or to correct the answer key. In some cases, the automated grading system 120 may not provide a preliminary grade to the answer sheet and, instead, the assessors may be able to view all answers pertaining to a question at once in order to accurately grade all of them after review. The changes are understood by the system and fed to the self-learning assessment model for learning and intelligent decision making.

The grading module 140 uses the natural language processing engine of the AI engine 162, which works on the logic of context, meaning and intent. It also uses a rubric grading matrix to grade, and stores the grades in the storage 152. The grading module 140 may utilize numeric or alphanumeric grades for assessing answers on the basis of the configuration data. The grading module 140 may calculate overall grades for every answer sheet. The grading module 140 may be utilized to grade assessments including but not limited to tests, assignments, project reports, homework, classwork, theses, essays, written educational competitions and research papers.
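One crude way to illustrate the combination of a context comparison and a rubric grading matrix is sketched below. A bag-of-words overlap stands in for the NLP engine's context/meaning/intent logic, and the rubric thresholds are invented for the example; neither is the disclosure's actual method:

```python
def context_score(answer, key):
    """Bag-of-words overlap with the answer key: a deliberately crude
    stand-in for the NLP engine's context/meaning/intent comparison."""
    a = set(answer.lower().split())
    k = set(key.lower().split())
    return len(a & k) / len(k) if k else 0.0

# Hypothetical rubric grading matrix: (minimum context score, fraction of marks).
RUBRIC = [(0.8, 1.0), (0.5, 0.6), (0.2, 0.3), (0.0, 0.0)]

def preliminary_grade(answer, key, max_marks):
    """Map a context score onto marks via the rubric matrix."""
    score = context_score(answer, key)
    for threshold, fraction in RUBRIC:
        if score >= threshold:
            return round(max_marks * fraction, 1)
    return 0.0
```

An assessor accepting or overriding `preliminary_grade` output corresponds to the review step described above.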

The deep matrix module 134 of the automated grading system 120 forwards the scanned answer sheets to the intelligence engine, such as the AI engine 162, where the hand-written text is converted to digital text utilizing machine learning and natural language processing technologies. The automated grading system 120 may utilize artificial intelligence technologies in order to make the system a self-learning system which improves and learns with every answer it grades and feeds such learning to the self-learning assessment model.

In one embodiment, an answer sheet correction is performed by the deep matrix module 134. All answer sheets are scanned and uploaded to the application ecosystem by a teacher. The application identifies the present exam or assignment and links the answer sheet to the storage 152. The AI engine 162 identifies the answer sheet in the prerequisite format, dissects the answers using the dissection module 138 and connects the answer sheet to a student account from the storage 152. On verification by the teacher, and further dissection, correction is performed and a scoring engine records marks for each question for each student in the storage 152. Thereby, the deep matrix module 134 corrects and provides reports for the teachers and the management to make better decisions.

The access module 142 of the automated grading system 120 provides access rights to different users, such as teachers or assessors, students or learners, administrators or management personnel, database administrators, application administrators, developers, and the like.
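The correction flow (link sheet to exam, connect to a student account, score, and record) can be sketched as a small pipeline. The storage layout, the `score_fn` hook and all field names are illustrative assumptions rather than the disclosure's data model:

```python
def correct_answer_sheet(sheet, exams, storage, score_fn):
    """Identify the exam a scanned sheet belongs to, link it to the
    student's account, score each dissected answer, record the marks."""
    exam = exams[sheet["exam_id"]]                     # link sheet to the present exam
    record = storage.setdefault(sheet["student"], {})  # connect to student account
    for qid, answer in sheet["answers"].items():
        key = exam["answer_key"][qid]
        record[qid] = score_fn(answer, key["text"], key["marks"])
    return record
```

Passing different `score_fn` implementations lets the same pipeline use exact matching, context scoring, or a trained model.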

The other modules 144 of the automated grading system 120 perform multiple tasks. For example, an analytics module 144 of the other modules 144 generates analytics based on the collected data from student scores, question scores, patterns of misses, errors, and attempts, as well as teacher efficiency for a particular subject or topic, student learning efficiency, and the like. Different matrices can be prepared and fed to a report module 144.

The report module 144 of the other modules 144 generates data for a dashboard or reports for specific usage by management, assessors, or students. An assessment report 402 generated by the report module 144 using the AI engine 162 provides in-depth details regarding but not limited to a timeline, student groups, test groups, teacher groups, subject groups, marks obtained, percentages, group scores, individual scores, attendance, tests conducted, assignments, worksheets, lessons, quizzes, documents, bloom analysis data or any other visual, graphical reports for consumption by students, teachers or management.

In one embodiment, an RPA bot is configured to generate an assessment report 402 using a proactive automated grading (PAG) algorithm. In one embodiment, the RPA bot is a software bot or a combination of a software and hardware bot. In an embodiment, the software bot is a computer program enabling a processor to perform robotic process automation by utilizing AI. In another embodiment, the bot has a combination of hardware and software, where the hardware includes memory, a processor, a controller and other associated chipsets especially dedicated to performing functions that enable robotic process automation for automated grading.

A video & quiz module 144 of the other modules 144 is an AI-powered module to manage videos and quizzes for students or learners. Conventionally, videos comprise online classes and recorded sessions and are generally available but less reusable; the video & quiz module 144 has videos pre-prepared for the class, which are highly effective, reusable and available in various modes. Conventionally, quizzes are restricted to the classroom, have separate interactions, are not in the context of videos or lesson-specific, and may not be available immediately. The video & quiz module 144 has quizzes which are context- and concept-specific, provide immediate feedback and help the assessors with learning-gap identification. An assignment module 144 of the other modules 144 is an AI-powered module to manage assignments by the students or learners; conventionally, assignments are mostly manual, partially digital or semi-automated, and mimic a manual process. The assignment module 144 provides automated assignments using AI with a question-wise approach rather than a student-wise approach. The data thus collected drives decision making. The new features also provide question paper analysis using both handwritten and digital contents.

FIGURE.2 illustrates a system diagram 200 describing the working of an exemplary automated grading system, in accordance with an embodiment of the present subject matter.

The present disclosure depicts architecture and working of the automated grading system 120 for grading or assessing an answer sheet as submitted by a student(s) while using an AI engine to increase efficiency and bring data driven decision intelligence.

In one embodiment, a system 120, such as an automated grading system 120, resides on a server 102 in a cloud environment 110. The system 120 is linked to various supporting cloud components, such as web services, data services, and the like, in the following architecture.

In a first step, the automated grading system application works with a user interface 202, such as the interface 124, where users can interact with the system 120 using a web application and/or a smartphone application, be it on Android or iOS platforms. The users are authenticated into the system and then allowed to access all features of the application. The user pool & control access 204 is a security-group step which works with an encrypted database that contains the user details and their access levels, which are used for authentication and authorization of their roles. For example, the access module 142 works with the storage 152 to provide user access rights depending on their role levels, for example, access rights of management personnel, teachers, students, application developers and of troubleshooting teams. Further, the application uses an API gateway 206 to authenticate and access the underlying APIs which are served from the micro-services (detailed further). The APIs are authenticated through a multi-level process: access by role, access by authority, and access by another internal service with security levels such as IP-restricted access, an authentication key and more. In one example, the application utilizes an AWS API gateway.

In a next step, the automated grading system 120 interacts with AI micro-services 208. A plurality of the AI micro-services 208 perform different functions and scale as per the need of the application and the user load. Each of the AI micro-services 208, 208a, ..., n has a built-in architecture 210, 210a, ..., n, EWS containers 212, 212a, 214, 214a, ..., n, and a multi-tenant encrypted DB 216, 218, 220, 216a, 218a, 220a, ..., n along with respective S3 buckets. The 208 and 208a are shown as examples; the AI micro-services 208 can be multiple and will proportionally have all sub-components. For example, application micro-services 208a, many of which are built using a JAVA Spring Boot architecture. An upgraded self-learning assessment model 242, as provided by a data training engine 240, is fed to upgrade, correct, modify or change inputs of a micro-service 208, 208a, ..., n. Each of the micro-services performs a specific domain function that has been pre-determined during the design phase. Each institution which is using the application is provided with a separate database and S3 bucket for storing the content in an authenticated manner. This ensures that there is no data integrity issue, inconsistency or data loss, while ensuring data privacy. This is in line with the popular multi-tenant architecture used in complex software systems. Another micro-service, such as an OCR AI micro-service 208, authorizes a user role and dynamically loads the appropriate personalized weights file from S3. In one example, the OCR model 210 is built on TensorFlow Python. Similarly, a grading micro-service 208 takes care of grading the student's answer in comparison with the answer key that has been provided as part of the answer schema configuration by the dissection module 138. Here, the intent, context and meaning of the student's answer are compared and a cumulative context rating is provided for the back-end micro-services to consume and give appropriate marks or points.
Various kinds of micro-services can be used via the AI micro-service 208, 208a, ..., n models, such as for handwriting recognition, machine learning, natural language processing, RPA, vision, deep learning, pattern recognition, and the like.

In a next step, the automated grading system 120 proceeds to a logger engine 230, which is a custom log handler and is designed using AWS CloudWatch in one example. The status of every component in the micro-services is saved in the custom logs for further insights and event triggers. This enables analysis and incident detection to identify anomalies. In one example, it is connected via an Amazon ECR 232. Further, the micro-services 208, 208a, ..., n send data to a data processing engine 234 that has a data pipeline engine 236 that handles streams of data in batches; it analyses and performs various computer vision tasks on images and NLP pre-processing tasks on text data. The data processing engine 234 is powered by unsupervised and supervised algorithms which take decisions on the data's credibility. Queries can be passed to the data processing engine 234 for particular types of data, in one example, answer sheets with handwritten text written using a black ink pen. Further, the data pipeline 236 classifies the data as good or bad; all the good data tags are sorted and scheduled for training. It also has a training job scheduler 238 that is a job scheduler and handler for performing training tasks of the data processing engine 234. Furthermore, the data processing engine 234 sends data to the data training engine 240 that is powered by GPU containers (240a, 240b, 240c, ..., 240n) that train data models at scale. It performs distributed training using CUDA, and the newly trained weights file for every user is staged to production. The data training engine 240 generates a learnt and upgraded self-learning assessment model 242 that is generated and provided by the AI engine 162 to the AI micro-services 208 based on user changes, error detection, customised solutions, intelligence drawn, AI matrices or any such variables or changes made by choice or by the intelligence of automatic decision-making tools.
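The good/bad classification and training-scheduling step of the pipeline can be illustrated with toy rules. The thresholds and field names below are invented for the sketch; the disclosure's actual credibility algorithms are supervised/unsupervised models, not fixed cut-offs:

```python
def classify_scan(scan):
    """Toy data-credibility check: tag a scan as good or bad for training.
    Thresholds are illustrative assumptions, not from the disclosure."""
    good = scan["dpi"] >= 150 and scan["contrast"] >= 0.3
    return "good" if good else "bad"

def schedule_training_batch(scans):
    """Sort all good-tagged scans (oldest first) for the training job scheduler."""
    good = [s for s in scans if classify_scan(s) == "good"]
    return sorted(good, key=lambda s: s["received_at"])
```

Bad-tagged scans simply never reach the scheduler, mirroring the pipeline's filtering role.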

The learnt and upgraded self-learning assessment model 242 provided to the AI micro-services 208 provides data-driven decision intelligence. In one example, in a class of 30, the application will assist in finding how many of the students answered a particular question with the expected content and how many did not, and to what level their answers matched the answer provided by the teacher in terms of meaning. This offers great insight to the teacher on how well the lesson/chapter has been understood by the students. For example, if there is a question for which the teacher expects every child to write at least two points (from a total of 5 points available), the application will provide an analysis of how many points the children actually wrote, which points were preferred by the children and which points were ignored, etc. These reports cannot be made manually in a consistent fashion, and that is where the application brings in a huge difference and provides the data in a fashion through which both teachers and the management can take intelligent decisions. The application also has the capability to provide cross-subject feedback. For instance, in a social science exam, the English grammatical errors could easily be sent as feedback to the English teacher. Not only for one class but even for the whole school, it is now easy to find out the pattern of errors that the children are making. This enables management to take corrective action more easily and in a better-informed fashion.

FIGURE.3.1 depicts an exemplary view 300 of an automated grading method, in accordance with an embodiment of the present subject matter.

A method for automatic grading is depicted in the view 300, where a reading section 302 is used to configure a question test paper 312 and a corresponding answer key by an assessor over a secure cloud. The question paper has a plurality of sections 352. The reading section 302 also receives a plurality of answer sheets 314, each having a plurality of sections 358 and each belonging to a learner. Various steps help in segregating questions into different types of questions with a classification criterion that is based on Bloom's taxonomy, and in authenticating the assessor for assessing the answer sheets.

Further, a methodical section 304 is used to create a schema of the plurality of the sections of the plurality of the answer sheets 314 over a secure cloud. The schema further tags each answer sheet 314 with a unique sheet identifier, segments the plurality of sections of an answer sheet 314 of the plurality of the answer sheets, and tags each of the segmented plurality of the sections of the answer sheet with a unique section identifier. A few scenarios covered also include reading and correctly marking or dissecting answers written in different colours, having blank pages, with strike-outs, having answers overflowing to the next page, and on inverted sheets or written with a tilted flow, and the like.

Still further, an assessment section 306 is used to generate a context based review report 350 of the plurality of the sections by comparing the elements of the schema with the answer key, where the comparing is performed using a self-learning assessment model 242. The assessment section 306 further validates the context based review report 350 of the plurality of the answer sheets 314 and calculates a respective score based on the validated context based review report 350 for the plurality of the sections of the plurality of the answer sheets 314. It then generates a dashboard 402 for reporting and analysing the assessment based on user access, with analytics such as an analytical section 408.

The amount of time saved in grading using the application is enormous, as at least 40-50% of teacher time is spent on grading class work, homework, tests, and exams, and this volume tests the versatility of teachers. At least 80% of this time is now easily saved because of automated grading, as it helps not only in grading the exam papers but also the homework. The teacher creates homework, and the students can simply click a picture of the homework and upload it to the application; the OCR-extracted result is shown to the students, which they can review, modify and submit for grading by the assessor.

The accuracy of grading, especially of descriptive answers, is something which varies from person to person and teacher to teacher. There are too many variables when it comes to human understanding and perception, and there are many impacting factors such as the mood of the person, comfort during grading, fatigue level, focus level and many more. With the present subject matter, all these problems are solved, and the grading accuracy is as accurate as possible for the given parameters. With a consistent and standard process, unconscious bias is also removed to a great extent.

FIGURE.3.2 depicts another exemplary view 300 of an automated grading method, in accordance with an embodiment of the present subject matter.

In one embodiment, a report module 144 of the other modules 144 of the automated grading system 120 generates a context based review report 350 that comprises a question 352 with a performance matrix 354 for each question, section, test paper or subject (as the case may be). The performance matrix 354 is generated after assessing an answer sheet 314 submission using the application and has a context match score 356a, a marks obtained score 356b, and any such parameters 356n based on the scanned and dissected answer 358 of a student. It also has a learning gap analysis with sections such as misspelled words 360a, grammatical mistakes 360b and any such parameters 360n. The context based review report 350 further provides an inhibitor for a section of the answer sheet 314 that is left without an assessment. The inhibitor belongs to a section which cannot be graded by the application due to a technical, subjective, non-answer or any other issue.
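The structure of one report entry, including the inhibitor for ungradable sections, might be sketched as below. The field names loosely mirror the reference signs 354/356/360 but are assumptions, as is the toy context-match and spell-check logic:

```python
def review_report_entry(question, answer, key, max_marks,
                        dictionary=None, gradable=True):
    """Assemble one entry of a context based review report (350):
    performance matrix, learning-gap analysis, and an inhibitor
    for a section left without an assessment."""
    if not gradable:
        return {"question": question,
                "inhibitor": "section left without an assessment"}
    key_words = set(key.lower().split())
    answer_words = answer.lower().split()
    match = len(set(answer_words) & key_words) / max(len(key_words), 1)
    misspelled = ([w for w in answer_words if w not in dictionary]
                  if dictionary else [])
    return {"question": question,
            "performance": {"context_match": round(match, 2),       # cf. 356a
                            "marks_obtained": round(max_marks * match, 1)},  # cf. 356b
            "learning_gap": {"misspelled_words": misspelled,        # cf. 360a
                             "grammatical_mistakes": []}}           # cf. 360b
```

A full report 350 would then be a list of such entries, one per dissected section.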

FIGURE.4 illustrates an exemplary assessment report 400 automatically generated by an exemplary automated grading system, in accordance with an embodiment of the present subject matter.

In one embodiment, a report module 144 generates a dashboard view 402 and a corresponding downloadable or sharable report 402 after assessing an answer sheet 314 or a plurality of answer sheets 314 for a class or a plurality of students. The dashboard 402 provides automated insights and analytics for management of the automatic grading system 120. The analytics are developed using an analytics module 144 of the other modules 144.

The dashboard 402 view provides various designs and customisable reports. In one example, the dashboard 402 has a filter section 404 to see examination 404a, teacher 404b, class 404c, and student 404n or other based views; the same provides filters on various sections of the report to provide data or visual graphs based on the need or the user's access rights. An insight section 406 provides data insights such as uploaded data 406a about the lessons, quizzes, documents, video lectures, worksheets, assignments, test papers, assessments, and the like, and consumed data 406b about the attempted or historical lessons, quizzes, documents, video lectures, worksheets, assignments, test papers, assessments, and the like. The insight section 406 further provides teacher, student and exam matrices through 406c and 406n, etc. An analytical section 408 provides charts 408a and visual data 408b in the form of analytics, charts or visual graph insights such as teacher-student ratio, student percentages, section-wise marks, section-wise scores, and the like. All sections of the dashboard 402 can be customised to include any data for usage by the various users involved. The reports also provide data traceability and offer different kinds of reports, be it subject-wise, chapter-wise, Bloom's-taxonomy-wise, etc.

FIGURE.5 illustrates an exemplary flowchart 500 of a method of automated grading, in accordance with an embodiment of the present subject matter.

In one implementation, a method 500 for automated grading is shown. The method may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communication network. In a distributed computing environment, computer executable instructions may be located in any of local, remote or cloud based computer storage media, including memory storage devices. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method or alternate methods. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the disclosure described herein. Furthermore, the method may be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method may be implemented in the above-described system.

At step/ block 502, configure a question test paper having a plurality of sections and a corresponding answer key, by an assessor, through the processor 122 of an application server 102. In one implementation, the configuration may be performed by a configuration module 130 of a system 120.

At step/ block 504, receive a plurality of answer sheets each tagged to a learner and each having a plurality of sections. In one implementation, the plurality of answer sheets is received by a scanning module 132 of the system 120.

At step/ block 506, authenticate the assessor for assessing the plurality of the answer sheets. In one implementation, the authentication may be provided by an access module 142 of the system 120.

At step/ block 508, create a schema of the plurality of the sections of the plurality of the answer sheets by the processor 122 of the application server 102. In one implementation, the schema is created by a dissection module 138 of the system 120.

At step/ block 510, generate a context based review report 350 of the plurality of the sections by comparing elements of the schema with the answer key using a self-learning assessment model 242. In one implementation, the context based review report 350 may be generated by a grading module 140 of the system 120.

At step/ block 512, validate the context based review report 350 of the plurality of the answer sheets. In one implementation, the validation of the context based review report 350 may be performed by the grading module 140 of the system 120.

At step/ block 514, calculate a respective score of the learner based on the validated context based review report for the plurality of the sections of the plurality of the answer sheets. In one implementation, the score may be calculated by the grading module 140 of the system 120.
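The score calculation at this step can be sketched as summing the validated section marks against a tally of the marks actually available, which also covers the disclosure's scenario of a paper configured for 80 marks whose sections only add up to 75. The section layout `{name: (question count, marks each)}` and function names are assumptions for illustration:

```python
def paper_maximum(sections):
    """Tally the marks actually available across the paper's sections,
    given {section: (question count, marks per question)}."""
    return sum(count * marks for count, marks in sections.values())

def respective_score(validated_report, sections, configured_max):
    """Sum the validated marks and report them against the tallied maximum,
    self-correcting a mis-configured total (e.g. 80 configured vs 75 available)."""
    obtained = sum(entry["marks_obtained"] for entry in validated_report)
    maximum = paper_maximum(sections)
    if maximum != configured_max:
        configured_max = maximum  # use what a learner could actually attempt
    return obtained, configured_max
```

With sections A (20×1), B (8×3), C (5×5) and D (1×6), the tally is 20+24+25+6 = 75, so a configured maximum of 80 is corrected down to 75 for all students.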

Thus, the method 500 helps in automated grading by configuring the question paper and scanning the answer sheets, dissecting the answer sheets, generating a context based review report for assessor review, validating the report, and generating a respective score for the student.

The present solution is used to automatically grade answer sheets using AI-enabled and cloud-based micro-services. The following shows the preparation, planning and execution, along with the outcome, of a few studies. In one example, the AI engine 162 uses the deep matrix module 134 to correct an error in the system using the self-learning assessment model 242. A question paper construction has a calculation error: though the maximum marks are set to 80, the actual total of the spread of the questions is only 75. The question paper Q has a Section A with 20 marks (20 questions of 1 mark each), a Section B with 24 marks (8 questions of 3 marks each), a Section C with 25 marks (5 questions of 5 marks each), and a Section D with 6 marks (1 question of 6 marks). The deep matrix module 134 is used in correction and identifies a context review score for each sub-section after tallying the dissected answers from the answer sheet against the answer key. The maximum used for grading a student (for all students) is converted by the self-learning assessment model from 80 to 75, as a student could have attempted at most 75 marks given the design of the question paper. Such issues are identified, highlighted and corrected by the automatic grading system 120.

Although implementations of the system and method for automated grading have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for automated grading.