


Title:
AUTOMATED TESTING WITH IMPROVED SCALABILITY AND COMPATIBILITY
Document Type and Number:
WIPO Patent Application WO/2023/096690
Kind Code:
A1
Abstract:
The present disclosure proposes a method for performing an automated test. A test request for performing an automated test with a specified test device set may be received through a registration center. A test task corresponding to the test request may be generated through the registration center. The test task may be scheduled to a test agent associated with the specified test device set through the registration center. The automated test may be performed with the specified test device set through the test agent. The present disclosure also proposes an automated testing system. The automated testing system may comprise: a registration center, at least one test agent and at least one test device set.

Inventors:
BU SHAOPENG (US)
DING HONG (US)
TAO RAN (US)
HU YUXING (US)
GUO JIAN (US)
LI YINGJIE (US)
SHEN LI (US)
Application Number:
PCT/US2022/042581
Publication Date:
June 01, 2023
Filing Date:
September 05, 2022
Assignee:
MICROSOFT TECHNOLOGY LICENSING LLC (US)
International Classes:
G06F11/36
Domestic Patent References:
WO2017116912A12017-07-06
Other References:
MA XIN ET AL: "An Automated Testing Platform for Mobile Applications", 2016 IEEE INTERNATIONAL CONFERENCE ON SOFTWARE QUALITY, RELIABILITY AND SECURITY COMPANION (QRS-C), IEEE, 1 August 2016 (2016-08-01), pages 159 - 162, XP032969553, DOI: 10.1109/QRS-C.2016.25
Attorney, Agent or Firm:
CHATTERJEE, Aaron C. et al. (US)
Claims:
CLAIMS

1. A method for performing an automated test, comprising: receiving, through a registration center, a test request for performing an automated test with a specified test device set; generating, through the registration center, a test task corresponding to the test request; scheduling, through the registration center, the test task to a test agent associated with the specified test device set; and performing, through the test agent, the automated test with the specified test device set.

2. The method of claim 1, wherein the automated test comprises at least one of a test for a single-operating-system application, a test for a cross-operating-system application, and a test for a Web application.

3. The method of claim 1, wherein a form of the specified test device set is at least one of: a single test device, a test device pair consisting of two test devices, and a test device group consisting of more than two test devices.

4. The method of claim 1, wherein the specified test device set includes more than one test device, and the performing the automated test comprises: coordinately performing, through the test agent, the automated test with the more than one test device.

5. The method of claim 1, wherein the test agent is implemented independently of the specified test device set, or is implemented in the specified test device set.

6. The method of claim 1, further comprising: determining, through the test agent, whether the test task includes an authorization code of the specified test device set, the authorization code being from the test request, and wherein the automated test is performed in response to determining that the test task includes the authorization code.

7. The method of claim 1, further comprising creating the test agent through the following operations: running a test agent creating program on a terminal device; initiating, at the terminal device, a registration process to the registration center; and configuring the terminal device as the test agent in response to the completion of the registration process.

8. The method of claim 7, further comprising: receiving, through the registration center, an authorization code generating request for generating an authorization code of a test device set associated with the test agent; generating, through the registration center, the authorization code; and sending, through the registration center, the generated authorization code to the test agent.

9. The method of claim 1, further comprising: receiving, through the registration center, an authorization code acquiring request for acquiring an authorization code of the specified test device set; forwarding, through the registration center, the authorization code acquiring request to the test agent; and determining, through the registration center, whether to provide the authorization code based on a response received from the test agent.

10. The method of claim 1, further comprising visualizing a test result of the automated test, the test result including a video navigation interface, and the video navigation interface including: a navigation area for displaying multiple test cases included in the automated test, each test case being selectable, and a video area for displaying a video clip corresponding to a selected test case.

11. An automated testing system, comprising: a registration center, configured to receive a test request for performing an automated test for a target application with a specified test device set, generate a test task corresponding to the test request, and schedule the test task to a test agent associated with the specified test device set; at least one test agent, each test agent being configured to receive a test task, and perform an automated test with a test device set specified in the received test task; and at least one test device set, each test device set being associated with one of the at least one test agent and configured to run the target application.

12. The automated testing system of claim 11, wherein each test agent in the at least one test agent is further configured to: determine whether the received test task includes an authorization code of a corresponding specified test device set; and perform the automated test in response to determining that the received test task includes the authorization code.

13. The automated testing system of claim 11, wherein the registration center is further configured to: receive a registration request from a terminal device, the registration request being triggered by a test agent creating program running on the terminal device; and register the terminal device as a test agent in response to receiving the registration request.

14. The automated testing system of claim 11, wherein the registration center is further configured to: receive an authorization code acquiring request for acquiring an authorization code of the specified test device set; forward the authorization code acquiring request to the test agent; and determine whether to provide the authorization code based on a response received from the test agent.

15. A computer program product for performing an automated test, comprising a computer program that is executed by at least one processor for: receiving, through a registration center, a test request for performing an automated test with a specified test device set; generating, through the registration center, a test task corresponding to the test request; scheduling, through the registration center, the test task to a test agent associated with the specified test device set; and performing, through the test agent, the automated test with the specified test device set.

Description:
AUTOMATED TESTING WITH IMPROVED SCALABILITY AND COMPATIBILITY

BACKGROUND

In the development of software applications, testing plays a critical role in ensuring application quality. Herein, a software application for which tests are performed may be referred to as a target application. Usually, when testing a target application, after a test case is determined, testers may perform the test step by step according to a procedure described in the test case, and may compare an actual test result with an expected test result to verify whether functions of the target application are correct. In this process, in order to save manpower, time or hardware resources and improve test efficiency, automated testing is introduced. An automated test may be a process of converting a human-driven test into a machine-executed test. In an automated test, specific software or programs may be utilized to control execution of the test and comparison between an actual test result and an expected test result. Through automated testing, some repetitive but necessary test tasks that exist in the testing process may be automated, and some test tasks that are otherwise difficult to perform manually may be performed.

SUMMARY

This Summary is provided to introduce a selection of concepts that are further described below in the Detailed Description. It is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

Embodiments of the present disclosure propose a method for performing an automated test. A test request for performing an automated test with a specified test device set may be received through a registration center. A test task corresponding to the test request may be generated through the registration center. The test task may be scheduled to a test agent associated with the specified test device set through the registration center. The automated test may be performed with the specified test device set through the test agent.

The embodiments of the present disclosure also propose an automated testing system. The automated testing system may comprise: a registration center, configured to receive a test request for performing an automated test for a target application with a specified test device set, generate a test task corresponding to the test request, and schedule the test task to a test agent associated with the specified test device set; at least one test agent, each test agent being configured to receive a test task, and perform an automated test with a test device set specified in the received test task; and at least one test device set, each test device set being associated with one of the at least one test agent and configured to run the target application. It should be noted that the above one or more aspects comprise the features hereinafter fully described and particularly pointed out in the claims. The following description and the drawings set forth in detail certain illustrative features of the one or more aspects. These features are only indicative of the various ways in which the principles of various aspects may be employed, and this disclosure is intended to include all such aspects and their equivalents.

BRIEF DESCRIPTION OF THE DRAWINGS

The disclosed aspects will hereinafter be described in connection with the appended drawings that are provided to illustrate and not to limit the disclosed aspects.

FIG.1 illustrates an exemplary process for performing an automated test according to an embodiment of the present disclosure.

FIG.2 illustrates an exemplary process for creating a test agent according to an embodiment of the present disclosure.

FIG.3 illustrates an exemplary process for acquiring an authorization code of a test device set according to an embodiment of the present disclosure.

FIG.4 illustrates an exemplary video navigation interface according to an embodiment of the present disclosure.

FIG.5 illustrates an exemplary architecture for an automated testing system according to an embodiment of the present disclosure.

FIG.6 illustrates an exemplary test device set according to an embodiment of the present disclosure.

FIG.7 is a flowchart of an exemplary method for performing an automated test according to an embodiment of the present disclosure.

FIG.8 illustrates an exemplary automated testing system according to an embodiment of the present disclosure.

FIG.9 illustrates an exemplary apparatus for performing an automated test according to an embodiment of the present disclosure.

DETAILED DESCRIPTION

The present disclosure will now be discussed with reference to several example implementations. It is to be understood that these implementations are discussed only for enabling those skilled in the art to better understand and thus implement the embodiments of the present disclosure, rather than suggesting any limitations on the scope of the present disclosure.

Currently, a single machine is usually used to perform an automated test and present a test result. Taking an automated test for a mobile application as an example, the automated test usually runs on a personal computer connected to a mobile device, and the personal computer performs the automated test and presents a test result. The personal computer may crash due to malfunctions. Once it crashes, the automated test will be interrupted, and the existing test result will be lost. Therefore, it may be difficult to provide a reliable automated testing service in this way. Furthermore, since mobile devices are typically connected to a personal computer in a wired way, the number of mobile devices that may be connected to the personal computer is limited. This results in a limited number of automated tests that may be executed simultaneously.

Embodiments of the present disclosure propose an improved automated testing service. The automated testing service may be decoupled into a registration center and a test agent. The registration center may manage test agents, schedule test tasks, visualize test results, etc. The test agent may perform test tasks, send test results to the registration center, etc. The registration center and the test agent may be deployed at different locations. Multiple test agents may be deployed at the same time. Even if one or some of the test agents fail, automated tests may still be performed through other test agents, and the test results may be preserved. Registration centers may also be deployed at multiple points to avoid a single point of failure. In this way, the reliability of the automated testing service may be significantly improved.
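As an illustrative sketch of this decoupling (the class and method names below are hypothetical, not from the disclosure), the registration center might route each task to any healthy agent associated with the requested test device set:

```python
from dataclasses import dataclass


@dataclass
class TestAgent:
    """A test agent performs test tasks and reports results back."""
    agent_id: str
    device_set_ids: list  # identifiers of test device sets connected to this agent
    healthy: bool = True


class RegistrationCenter:
    """Manages test agents and schedules test tasks to them."""

    def __init__(self):
        self.agents = {}  # agent_id -> TestAgent

    def register(self, agent: TestAgent):
        self.agents[agent.agent_id] = agent

    def schedule(self, device_set_id: str):
        """Route a task to any healthy agent associated with the device set."""
        for agent in self.agents.values():
            if agent.healthy and device_set_id in agent.device_set_ids:
                return agent
        return None  # no agent currently serves this device set


# Even if one agent fails, tasks can still route to another agent that
# serves the same device set, so existing results are not lost with it.
center = RegistrationCenter()
center.register(TestAgent("agent-a", ["set-1"]))
center.register(TestAgent("agent-b", ["set-1", "set-2"]))
center.agents["agent-a"].healthy = False
assert center.schedule("set-1").agent_id == "agent-b"
```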

In an aspect, the embodiments of the present disclosure propose creating a test agent with a terminal device. The terminal device may be any computing device located at any geographic location, e.g., a desktop computer, a laptop computer, a tablet computer, a cellular phone, a wearable device, etc. The terminal device may be configured as a test agent through running a test agent creating program on the terminal device and registering the terminal device to a registration center. A desired test device set installed with a target application may be connected to a test agent, to perform an automated test for the target application. Herein, a test device set may refer to a set of test devices connected to a test agent as a whole. A form of the test device set may include, e.g., a single test device, a test device pair consisting of two test devices, and a test device group consisting of more than two test devices, etc. Herein, a test device may refer to a computing device on which a target application is running and with which an automated test for the target application is performed, e.g., a desktop computer, a laptop computer, a tablet computer, a cellular phone, a wearable device, etc. Since the test agent may be created from a terminal device located at any geographic location, and an automated test may be performed through connecting a desired test device set to the test agent, the number of supported test device sets may be greatly expanded, thereby increasing the scalability of the automated testing service. In addition, this approach makes it easy to perform an automated test with a test device set located at any geographic location. For example, a user may be located at location A, while a desired test device set may be located at location B. A test agent may be created at location B and the desired test device set may be connected to the test agent. The user located at location A may request to perform an automated test with the test device set at location B.
Additionally, in a case where the number of currently available test device sets is insufficient, a test device set may be supplemented through creating a test agent and connecting an appropriate test device set to the test agent, or a test device set may be supplemented through requesting use of a test device set on another test agent. In this way, the time for scheduling test tasks and waiting for test execution may be shortened.

In another aspect, the embodiments of the present disclosure propose to enable an automated testing service to be compatible with various types of automated tests through enabling a test agent to support various types of testing frameworks and associating various forms of test device sets with the test agent. Types of automated tests that may be compatible may include, e.g., a test for a single-operating-system application, a test for a cross-operating-system application, and a test for a Web application, etc. Herein, a single-operating-system application may refer to an application involving only a single operating system, e.g., an Android application, an iOS application, a Windows application, a macOS application, etc. A cross-operating-system application may refer to an application involving an interaction between multiple operating systems, e.g., an application involving an interaction between an Android operating system and a Windows operating system, an application involving an interaction between an iOS operating system and a macOS operating system, etc. A Web application may refer to a Web-based application, e.g., a website accessed through a browser, a plug-in running through a browser, etc. For example, a test for a cross-operating-system application may be achieved through enabling a test agent to support a testing framework such as Appium, and connecting a test device pair installed with different operating systems to the test agent. In this way, application scenarios for automated tests may be wider, thereby significantly improving the compatibility of the automated testing service.

In another aspect, the embodiments of the present disclosure propose unified management of test resources for sharing among various test agents. The test resources may include, e.g., an application package of a target application, a test suite including test cases for an automated test to be executed, a test device set, etc. Application packages and test suites may be stored in a data storage deployed in the cloud and managed by a registration center. Test device sets may also be managed through the registration center. In this way, different users or teams, e.g., users or teams located at different geographical locations, may share test resources, thereby improving the reusability of test resources and saving resource costs.

In another aspect, the embodiments of the present disclosure propose to manage use permission of a test device set through an authorization code. Herein, an authorization code may refer to an encrypted code associated with a specific test device set and used to authorize a sender of a test request to use the specific test device set for an automated test. Unlike a publicly visible identifier of a test device set, the authorization code of the test device set is access-restricted. The authorization code of the test device set may be provided to a test agent associated with the test device set by a registration center when the test agent is created, and further provided to a specified user of the test agent, e.g., a creator, an administrator, etc., of the test agent. In addition, the authorization code of the test device set may be provided by the registration center to an ordinary user with the grant of a specified user of the test agent associated with the test device set. When the test agent receives a test task for performing an automated test with a specified test device set from the registration center, the test agent may determine whether the received test task includes an authorization code of the specified test device set, and if so, perform an automated test. In this way, arbitrary use of the test device set may be avoided, thereby improving the security of the automated testing service.
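The disclosure does not specify how an authorization code is encrypted; one plausible sketch derives it as an HMAC keyed by a registration-center secret, so a code is bound to exactly one test device set and can be verified without storing every issued code. The secret, function names, and scheme below are assumptions for illustration only:

```python
import hashlib
import hmac

# Hypothetical secret held only by the registration center (an assumption,
# not part of the disclosure).
CENTER_SECRET = b"registration-center-secret"


def make_authorization_code(device_set_id: str) -> str:
    """Derive an access-restricted code bound to a specific test device set."""
    return hmac.new(CENTER_SECRET, device_set_id.encode(), hashlib.sha256).hexdigest()


def verify_authorization_code(device_set_id: str, code: str) -> bool:
    """Check, in constant time, that a code matches the specified device set."""
    return hmac.compare_digest(make_authorization_code(device_set_id), code)


code = make_authorization_code("device-set-42")
assert verify_authorization_code("device-set-42", code)
assert not verify_authorization_code("device-set-43", code)  # bound to one set only
```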

In another aspect, the embodiments of the present disclosure propose visualizing a test result of an automated test through presenting a video navigation interface. The video navigation interface may include a navigation area for displaying multiple test cases included in the automated test. Each test case may be selected. The video navigation interface may further comprise a video area for displaying a video clip corresponding to a selected test case. In this way, it is possible for a user to conveniently view the situation of a test device during the execution of each test case of the automated test. Furthermore, a video clip of a failed test case may be connected with a corresponding test log, device log, etc., so as to locate a bug existing in a target application more quickly and accurately.

FIG.1 illustrates an exemplary process 100 for performing an automated test according to an embodiment of the present disclosure. In the process 100, an automated testing service may be decoupled into a registration center and a test agent. The registration center may manage test agents, schedule test tasks, visualize test results, etc. The test agent may perform test tasks, send test results to the registration center, etc.

At 102, a test request for performing an automated test with a specified test device set may be received through a registration center. The test request may include, e.g., an application package of a target application, a test suite including a set of test cases for an automated test to be executed, an identifier of a specified test device set, a recipient of a test report, etc. For example, when a user initiates a test request, a test device set on which an automated test is to be performed may be selected as a specified test device set. In addition, in the case where the user knows an authorization code of the specified test device set, the test request may also include the authorization code. The specified test device set may have various forms, e.g., a single test device, a test device pair consisting of two test devices, and a test device group consisting of more than two test devices, etc. An exemplary form of a test device set will be described later in conjunction with FIG.6. A test device may be any type of computing device, e.g., a desktop computer, a laptop computer, a tablet computer, a cellular phone, a wearable device, etc. Preferably, after the registration center receives the test request, information of the specified test device set in the test request may be presented to the user via a front end connected to the registration center, e.g., an identifier of the specified test device set, supported test types, supported test frameworks, and a current status, a device type, a device model, etc. of a test device included in the specified test device set.
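The fields listed above for a test request might be carried in a payload like the following sketch; the field names and example values are illustrative assumptions, not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class TestRequest:
    """Hypothetical payload for step 102 of process 100."""
    application_package: str   # path or URI of the target application package
    test_suite: list           # test cases of the automated test to be executed
    device_set_id: str         # identifier of the specified test device set
    report_recipient: str      # recipient of the test report
    # Included only when the user knows the code of the specified device set.
    authorization_code: Optional[str] = None


request = TestRequest(
    application_package="target_app.apk",
    test_suite=["case_login", "case_checkout"],
    device_set_id="device-set-42",
    report_recipient="team@example.com",
)
assert request.authorization_code is None  # optional until the user supplies one
```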

At 104, a test task corresponding to the test request may be generated through the registration center. For example, the registration center may identify an application package, a test suite, an identifier of a specified test device set, an authorization code of the specified test device set, etc., from the test request, and generate a test task based on the identified application package, test suite, identifier of the specified test device set, authorization code of the specified test device set, etc.

At 106, the test task may be scheduled to a test agent associated with the specified test device set through the registration center. For example, the test task may include the identifier of the specified test device set. The registration center may identify a test agent associated with the specified test device set and schedule the test task to the identified test agent.
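Steps 104 and 106 together can be condensed into a small sketch: the registration center extracts the relevant fields of the request into a task, then looks up the agent associated with the specified device set. Function names and the dict shapes are assumptions for illustration:

```python
def generate_test_task(request: dict) -> dict:
    """Step 104: build a test task from the fields identified in the request."""
    return {
        "application_package": request["application_package"],
        "test_suite": request["test_suite"],
        "device_set_id": request["device_set_id"],
        "authorization_code": request.get("authorization_code"),
    }


def schedule_test_task(task: dict, agents: dict) -> str:
    """Step 106: find the test agent associated with the specified device set.

    `agents` maps agent identifiers to the device set identifiers they serve
    (a hypothetical registry kept by the registration center).
    """
    for agent_id, device_set_ids in agents.items():
        if task["device_set_id"] in device_set_ids:
            return agent_id
    raise LookupError("no test agent is associated with the specified device set")


task = generate_test_task({
    "application_package": "app.apk",
    "test_suite": ["case_1"],
    "device_set_id": "set-7",
})
assert schedule_test_task(task, {"agent-x": ["set-7"]}) == "agent-x"
```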

The automated test may be performed with the specified test device set through the test agent. Preferably, the test agent may first determine whether the test task it receives includes an authorization code of a test device set specified in the test task, and perform an automated test in response to determining that the test task includes the authorization code. For example, at 108, it may be determined, through the test agent, whether the test task includes an authorization code of a specified test device set. The authorization code may be from the test request.

If, at 108, it is determined that the test task does not include the authorization code of the specified test device set, the process 100 may proceed to a step 110. At 110, the specified test device set will not be used to perform the automated test.

If, at 108, it is determined that the test task includes the authorization code of the specified test device set, the process 100 may proceed to a step 112. At 112, the automated test may be performed with the specified test device set through the test agent.
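The branch at steps 108, 110, and 112 can be sketched as a single check on the agent side; the function names and the `known_codes` store are hypothetical stand-ins, not named in the disclosure:

```python
def handle_test_task(task: dict, known_codes: dict) -> bool:
    """Steps 108-112 as seen by the test agent.

    `known_codes` maps device set identifiers to their authorization codes,
    as previously provided to the agent by the registration center
    (a hypothetical local store).
    """
    device_set_id = task["device_set_id"]
    expected = known_codes.get(device_set_id)
    if expected is None or task.get("authorization_code") != expected:
        return False  # step 110: the specified device set will not be used

    def run_automated_test(t: dict) -> bool:
        return True   # placeholder for actually driving the device set

    return run_automated_test(task)  # step 112: perform the automated test


codes = {"set-7": "abc123"}
assert handle_test_task({"device_set_id": "set-7", "authorization_code": "abc123"}, codes)
assert not handle_test_task({"device_set_id": "set-7"}, codes)  # no code, no test
```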

According to the embodiments of the present disclosure, during the performing of the automated test, a status of each test device in the test device set may be monitored in real time, to obtain performance data of a target application under test. For example, a Central Processing Unit (CPU) utilization, a memory utilization, a network connection status, etc. of the test device may be monitored in real time. A battery usage of the test device may also be monitored in real time, so that a user may know a power consumption of the target application under test. In addition, a delay of application clicks may also be monitored in real time, so that a user may know a running speed of the target application. Furthermore, a video corresponding to the automated test may be recorded through capturing a screen and/or sound of the test device in real time during the performing of the automated test. The recorded video may include multiple video clips corresponding to multiple test cases included in the automated test.
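Real-time monitoring of this kind could be structured as a polling loop; actual metric collection is platform-specific (e.g. querying CPU utilization, memory utilization, or battery level of a mobile device), so the sampler below is a stub and all names are illustrative assumptions:

```python
import time


def monitor_device(sample_fn, interval_s: float, samples: int) -> list:
    """Poll a test device's status at a fixed interval during an automated test.

    `sample_fn` stands in for platform-specific collection, e.g. reading the
    CPU utilization of the target application from the device.
    """
    readings = []
    for _ in range(samples):
        readings.append(sample_fn())
        time.sleep(interval_s)
    return readings


# Stub sampler standing in for a real device query.
fake_cpu = iter([12.5, 30.1, 27.8])
readings = monitor_device(lambda: next(fake_cpu), interval_s=0.0, samples=3)
assert readings == [12.5, 30.1, 27.8]
assert max(readings) == 30.1  # e.g. report the peak CPU utilization observed
```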

At 114, a test result of the automated test may be acquired through the test agent, and the test result may be sent to the registration center. The test result of the automated test may include, e.g., pass/fail information, a running time, a test log, a device log, a customized log, a screenshot, performance data, a video, etc. corresponding to each test case. The test result described above may comprehensively reflect the situation of the tested target application. A detailed test report for the automated test may be generated based on the comprehensive test result described above. The detailed test report may help a developer of the target application to quickly and accurately locate a bug in the target application.
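A per-case result record carrying those fields, and a report skeleton the registration center might build from it, could look like the following sketch (all names are hypothetical):

```python
from dataclasses import dataclass, field


@dataclass
class TestCaseResult:
    """Hypothetical per-test-case record sent from a test agent at step 114."""
    case_name: str
    passed: bool
    running_time_s: float
    test_log: str = ""
    device_log: str = ""
    screenshots: list = field(default_factory=list)
    video_clip: str = ""  # path of the video clip recorded for this case


def summarize(results: list) -> dict:
    """Skeleton of a detailed test report built from the per-case results."""
    failed = [r.case_name for r in results if not r.passed]
    return {"total": len(results), "failed": failed}


results = [
    TestCaseResult("case_login", True, 3.2),
    TestCaseResult("case_checkout", False, 8.7, video_clip="checkout.mp4"),
]
assert summarize(results) == {"total": 2, "failed": ["case_checkout"]}
```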

At 116, the test result may be visualized through the registration center. For example, the test result may be presented through a dashboard. The dashboard may be displayed, e.g., via a front end connected to the registration center. Preferably, when visualizing the test result, a video navigation interface may be presented. The video navigation interface may include a navigation area for displaying multiple test cases included in the automated test. Each test case may be selected. The video navigation interface may further comprise a video area for displaying a video clip corresponding to a selected test case. An exemplary video navigation interface will be described later in conjunction with FIG.4.
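The navigation-to-video linkage of the interface reduces to a lookup from the selected test case to its recorded clip; the mapping and function name below are assumptions for illustration:

```python
def select_video_clip(selected_case: str, clips: dict) -> str:
    """When a test case is selected in the navigation area, return the
    corresponding video clip to display in the video area."""
    if selected_case not in clips:
        raise KeyError(f"no video clip recorded for {selected_case!r}")
    return clips[selected_case]


clips = {"case_login": "clips/login.mp4", "case_checkout": "clips/checkout.mp4"}
assert select_video_clip("case_checkout", clips) == "clips/checkout.mp4"
```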

It should be appreciated that the process for performing the automated test described above in conjunction with FIG.1 is merely exemplary. Depending on actual application requirements, the steps in the process for performing the automated test may be replaced or modified in any manner, and the process may include more or fewer steps. For example, the test agent may directly perform the automated test with the specified test device set in the test task after receiving the test task, without determining whether the test task includes the authorization code of the specified test device set, such as in the case where the test agent knows that the test task is from its specified user. Additionally, at 116, in addition to visualizing the test result, a status of a test device in the test device set may also be presented. Furthermore, the specific order or hierarchy of the steps in the process 100 is merely exemplary, and the process for performing the automated test may be performed in an order different from the described one.

According to the embodiments of the present disclosure, a test agent may be created from a terminal device. The terminal device may be any computing device located at any geographic location, e.g., a desktop computer, a laptop computer, a tablet computer, a cellular phone, a wearable device, etc. For example, a user may create a test agent through using a computing device accessible to him or her. FIG.2 illustrates an exemplary process 200 for creating a test agent according to an embodiment of the present disclosure.

At 202, a test agent creating program may run on a terminal device. The test agent creating program may be a predetermined computer program that is able to be used to create a test agent. At 204, a registration process to a registration center may be initiated at the terminal device. For example, a registration request may be sent to the registration center through the terminal device. The registration center may register the terminal device in response to receiving the registration request.

At 206, the terminal device may be configured as a test agent in response to the completion of the registration process.

After the test agent is registered, an authorization code of a test device set associated with the test agent may be further acquired.

At 208, an authorization code generating request for generating an authorization code of a test device set associated with the test agent may be received through the registration center. For example, the test agent may send, to the registration center, an authorization code generating request for generating an authorization code of a test device set associated with the test agent. The authorization code generating request may include information about the test device set, e.g., an identifier of the test device set, supported test types, supported test frameworks, and a current status, a device type, a device model, etc. of a test device included in the test device set.

At 210, an authorization code may be generated through the registration center. For example, the registration center may generate the authorization code of the test device set based on the information of the test device set included in the authorization code generating request received from the test agent.

At 212, the generated authorization code may be sent to the test agent through the registration center.
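The registration and authorization-code flow of FIG.2 (steps 202 through 212) can be condensed into a small stub; the class, its methods, and the SHA-256 derivation are illustrative assumptions, not part of the disclosure:

```python
import hashlib


class RegistrationCenterStub:
    """Minimal stand-in for the registration process of FIG.2."""

    def __init__(self):
        self.registered = set()

    def register(self, terminal_device: str) -> bool:
        # Steps 204-206: register the terminal device so it can be
        # configured as a test agent on completion.
        self.registered.add(terminal_device)
        return True

    def generate_authorization_code(self, device_set_id: str) -> str:
        # Steps 208-212: generate a code from the device set information
        # (the hashing scheme here is an assumption).
        return hashlib.sha256(device_set_id.encode()).hexdigest()[:16]


center = RegistrationCenterStub()
assert center.register("laptop-at-location-B")  # terminal device becomes a test agent
code = center.generate_authorization_code("set-9")
assert len(code) == 16
assert code == center.generate_authorization_code("set-9")  # stable per device set
```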

Since a test agent may be created from a terminal device located at any geographic location, and an automated test may be performed through connecting a desired test device set to the test agent, the number of supported test device sets may be greatly expanded, thereby increasing the scalability of the automated testing service. In addition, a user may be located at location A, while a desired test device set may be located at location B. A test agent may be created at location B and the desired test device set may be connected to the test agent. The user located at location A may request to perform an automated test with the test device set at location B. This approach enables the automated test to be conveniently performed through using a test device set located at any geographic location. Additionally, in a case where the number of currently available test device sets is insufficient, a test device set may be supplemented through creating a test agent and connecting an appropriate test device set to the test agent. Additionally or alternatively, a test device may be supplemented through requesting use of a test device set on another test agent. In this way, the time for scheduling test tasks and waiting for test execution may be shortened.

It should be appreciated that the process for creating the test agent described above in conjunction with FIG.2 is merely exemplary. Depending on actual application requirements, the steps in the process for creating the test agent may be replaced or modified in any manner, and the process may include more or fewer steps. In addition, the specific order or hierarchy of the steps in the process 200 is merely exemplary, and the process for creating the test agent may be performed in an order different from the described one.

As described above, when initiating a test request to a registration center for performing an automated test with a specified test device set, an authorization code of the specified test device set may be provided. When a test agent receives a test task corresponding to the test request from the registration center, the test agent may determine whether the received test task includes the authorization code of the specified test device set, and if so, perform the automated test. For a specified user of a test agent associated with a specified test device set, e.g., for a creator, an administrator, etc. of the test agent, an authorization code may be obtained from the test agent. The authorization code at the test agent may be obtained from the registration center when the test agent is created, e.g., through the process 200 of FIG.2. For an ordinary user, e.g., for a user who is not the specified user of the test agent, an authorization code of the specified test device set may be requested from the test agent associated with the specified test device set via the registration center. The registration center may provide the authorization code to the ordinary user with grant from the specified user of the test agent associated with the test device set. In this way, arbitrary use of the test device set may be avoided, thereby improving the security of the automated testing service. FIG.3 illustrates an exemplary process 300 for acquiring an authorization code of a test device set according to an embodiment of the present disclosure.

At 302, an authorization code acquiring request for acquiring an authorization code of a specified test device set may be received through a registration center. The authorization code acquiring request may be received from, e.g., a computing device associated with a user who wants to perform an automated test with the specified test device set.

At 304, the authorization code acquiring request may be forwarded to a test agent associated with the specified test device set through the registration center.

Subsequently, it may be determined, through the registration center, whether to provide the authorization code based on a response received from the test agent. At 306, a response to the authorization code acquiring request may be received from the test agent through the registration center. For example, after receiving the authorization code acquiring request forwarded by the registration center, the test agent may determine whether to grant the authorization code to be provided to a sender of the authorization code acquiring request, and may include the determination result in the response to the authorization code acquiring request to be sent to the registration center.

At 308, it may be determined, through the registration center, whether the response received from the test agent indicates that the test agent grants the authorization code to be provided. If, at 308, it is determined that the response received from the test agent indicates that the test agent grants the authorization code to be provided, the process 300 may proceed to a step 310. At 310, the authorization code may be provided through the registration center. If, at 308, it is determined that the response received from the test agent does not indicate that the test agent grants the authorization code to be provided, the process 300 may proceed to a step 312. At 312, the authorization code will not be provided.
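The branch at steps 306 to 312 can be modeled as follows. The test agent's decision is stood in for by a callback, and all names are hypothetical; the sketch shows only that the registration center returns the code when the agent's response indicates a grant, and withholds it otherwise.

```python
def handle_code_acquiring_request(requester: str, device_set_id: str,
                                  agent_grants, stored_codes: dict):
    """Registration-center side of process 300: forward the request to the
    test agent (modeled as the `agent_grants` callback) and provide the
    authorization code only when the agent's response indicates a grant."""
    granted = agent_grants(requester, device_set_id)  # steps 304/306: forward and await response
    if granted:                                       # step 308: response indicates a grant?
        return stored_codes[device_set_id]            # step 310: provide the code
    return None                                       # step 312: the code is not provided

codes = {"set-001": "abc123"}
# A toy grant policy standing in for the test agent's decision at step 306.
grant_only_alice = lambda requester, device_set_id: requester == "alice"
```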

It should be appreciated that the process for acquiring the authorization code of the test device set described above in conjunction with FIG.3 is merely exemplary. Depending on actual application requirements, the steps in the process for acquiring the authorization code of the test device set may be replaced or modified in any manner, and the process may include more or fewer steps. For example, the test agent, when determining whether to grant the authorization code to be provided to the sender of the authorization code acquiring request, may consider whether the sender of the authorization code acquiring request has ever created another test agent. A sender who has created another test agent may use the test device set of that test agent preferentially. In addition, the specific order or hierarchy of the steps in the process 300 is merely exemplary, and the process for acquiring the authorization code of the test device set may be performed in an order different from the described one.

FIG.4 illustrates an exemplary video navigation interface 400 according to an embodiment of the present disclosure.

The video navigation interface 400 may include a navigation area 410 for displaying multiple test cases included in an automated test. In the navigation area 410, a name and a corresponding time of each test case are shown. Optionally, in the navigation area 410, an initializing event, a test run started event and corresponding times are also shown. Preferably, a failed test case may be marked. The failed test case may be marked in various ways. For example, the failed test case may be displayed with a different color than a successful test case, the failed test case may be underlined, the failed test case may be highlighted, etc. For example, a test case "5.NotesCardTest.createTextNote" is highlighted in the navigation area 410, which indicates that this test case failed.

Each test case may be selected through clicking it. The video navigation interface 400 may further comprise a video area 420 for displaying a video clip corresponding to a selected test case. After a test case is selected from the navigation area 410, a video corresponding to the automated test may jump to a time point corresponding to the selected test case, so that a video clip of the selected test case may be displayed in the video area 420. In this way, it is possible for the user to conveniently view the situation of a test device during the execution of each use case of the automated test. Furthermore, a video clip of a failed test case may be connected with a corresponding test log, device log, etc., so as to locate a bug existing in a target application more quickly and accurately.
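The jump-to-clip behavior amounts to mapping each test case to an interval of the test video. The timeline representation below is an illustrative assumption; the disclosure does not prescribe how case start times are recorded.

```python
def clip_bounds(timeline, case_name, video_end):
    """Given a timeline of (start_second, case_name) entries sorted by time,
    return the (start, end) of the video clip for the selected test case.
    The clip ends where the next case starts, or at the video's end for the
    last entry."""
    for i, (start, name) in enumerate(timeline):
        if name == case_name:
            end = timeline[i + 1][0] if i + 1 < len(timeline) else video_end
            return start, end
    raise KeyError(case_name)

# A hypothetical timeline mirroring the navigation area of FIG.4.
timeline = [
    (0, "initializing"),
    (4, "test run started"),
    (9, "5.NotesCardTest.createTextNote"),
    (31, "6.NotesCardTest.deleteTextNote"),
]
```

Selecting a case in the navigation area would then seek the video to the returned start time and play until the returned end time.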

Optionally, the video navigation interface 400 may further include a button 430 for downloading a video and a group of buttons 440 for setting a play speed.

It should be appreciated that the video navigation interface 400 shown in FIG.4 is only an example of the video navigation interface. Depending on actual application requirements, the video navigation interface may have any other structure and may include more or fewer elements. For example, in the video navigation interface 400, in addition to the buttons for downloading the video and setting the play speed, buttons such as for setting the video resolution may also be displayed.

FIG.5 illustrates an exemplary architecture 500 for an automated testing system according to an embodiment of the present disclosure. The architecture 500 may provide an automated testing service for a target application through performing, e.g., the processes described above in connection with FIG.1 to FIG.4. The architecture 500 may include a registration center 510 and at least one test agent, e.g., a test agent 520-1 to a test agent 520-K (K>1). Additionally, the architecture 500 may also include at least one test device set. Each test device set may be associated with one of the at least one test agent. For example, a test device set 540-1 to a test device set 540-M (M>1) may be associated with the test agent 520-1, and a test device set 542-1 to a test device set 542-N (N>1) may be associated with the test agent 520-K.

The registration center 510 may manage test agents, schedule test tasks, visualize test results, etc. The registration center 510 may be deployed in the cloud. It should be appreciated that although only one registration center 510 is shown in the architecture 500, the registration center may be scalable in some embodiments. For example, an automated testing system may include more than one registration center. These registration centers may be managed as a registration center cluster with unified endpoints. Various registration centers in the registration center cluster may be deployed on multiple points in a distributed way to avoid single point failure.

The registration center 510 may be connected with a front end 550. The front end 550 may interface with a user and present a user interface associated with the registration center 510 to the user. In addition, the registration center 510 may be connected to a data storage 560. The data storage 560 may be deployed in the cloud. The data storage 560 may store test resources, e.g., an application package 562, a test suite 564, etc. The application package 562 may include applications for installing and running a target application. The test suite 564 may include a set of test cases for an automated test to be performed. The registration center 510 may manage the test resources stored in the data storage 560.

The registration center 510 may include a permission management unit 512. The permission management unit 512 may manage permission of a test agent. For example, when receiving a registration request from a terminal device, the permission management unit 512 may determine whether to register the terminal device to configure the terminal device as a test agent. The registration request may be triggered by a test agent creating program running on the terminal device. In addition, the permission management unit 512 may manage permission of a user to determine a test device set that the user can use.

The registration center 510 may include an agent and device set management unit 514. The agent and device set management unit 514 may manage a test agent registered to the registration center 510 and a test device set associated with the test agent. A status of the test agent and/or a test device in the test device set may be presented to a user via the front end 550.

The registration center 510 may include a test task scheduling unit 516. The test task scheduling unit 516 may generate a test task corresponding to a test request, and schedule the test task to a corresponding test agent. For example, a test request might specify a test device set with which to perform an automated test. The test task scheduling unit 516 may schedule a test task corresponding to a test request to a test agent associated with a test device set specified in the test request.

The registration center 510 may include a test result visualization unit 518. The test result visualization unit 518 may visualize a test result of an automated test. The test result of the automated test may include, e.g., pass/fail information, a running time, a test log, a device log, a customized log screenshot, performance data, etc., corresponding to each test case. Preferably, when visualizing the test result, a video navigation interface, e.g., the video navigation interface 400 described above in conjunction with FIG.4, may be presented.

Each test agent 520-k (1≤k≤K) from the test agent 520-1 to the test agent 520-K may be registered to the registration center 510. The registration center 510 and the test agent 520-k may access each other through, e.g., Remote Procedure Call. The test agent 520-k may be created from any computing device located at any geographic location. For example, a test agent may be created through the process 200 previously described in connection with FIG.2. The test agent 520-k may perform test tasks, send test results to the registration center 510, etc.

The test agent 520-k may include a registration unit 522-k for initiating a registration process with the registration center 510.

The test agent 520-k may include a security unit 524-k for determining whether to perform a test task scheduled by the registration center 510. For example, the security unit 524-k may analyze a test task received from the registration center 510, determine whether the received test task includes an authorization code of a corresponding specified test device set, and if so, notify the test agent 520-k, e.g., the test performing unit 530-k in the test agent 520-k, to perform the automated test.
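The security unit's check can be sketched as a single predicate over the received test task. The field names and the set of recognized codes are illustrative assumptions.

```python
def should_perform(test_task: dict, valid_codes: set) -> bool:
    """Security-unit check: perform the scheduled test task only when it
    carries a recognized authorization code for the specified test device
    set. Field names are illustrative, not mandated by the disclosure."""
    code = test_task.get("authorization_code")
    return code is not None and code in valid_codes

# Codes the agent obtained from the registration center at creation time.
valid = {"abc123"}
```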

The test agent 520-k may include a device set management unit 526-k for locally managing one or more test device sets associated therewith.

The test agent 520-k may include a device set control tool 528-k for controlling and debugging one or more test device sets associated therewith. The test agent 520-k is typically associated with one type of test device set. The device set control tool 528-k may, e.g., correspond to a type of a test device set associated with the test agent 520-k. As an example, when the test device set associated with the test agent 520-k is an Android device, the device set control tool 528-k may be a software development kit (SDK) for the Android device. As another example, when the test device set associated with the test agent 520-k is an iOS device, the device set control tool 528-k may be an SDK for the iOS device.
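The correspondence between device set type and control tool can be sketched as a lookup. The tool names below are placeholders for the platform SDKs, not real package identifiers.

```python
# Hypothetical mapping from device-set type to the control/debug tool the
# agent would load; the tool names are placeholders, not real packages.
CONTROL_TOOLS = {
    "android": "android-platform-sdk",
    "ios": "ios-platform-sdk",
    "windows": "windows-device-toolkit",
}

def select_control_tool(device_set_type: str) -> str:
    """Pick the device set control tool matching the type of the test
    device set associated with the agent."""
    try:
        return CONTROL_TOOLS[device_set_type]
    except KeyError:
        raise ValueError(f"unsupported device set type: {device_set_type}")
```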

The test agent 520-k may include a test performing unit 530-k for performing an automated test with a test device set, a test suite, etc. specified in a test task.

The test agent 520-k may include a test result processing unit 532-k for acquiring a test result of an automated test and sending the test result to the registration center 510.

The test agent 520-k may support various types of testing frameworks, e.g., Appium, Espresso, Junit, etc. Various forms of test device sets may be associated with the test agent 520-k, to enable the automated testing service to be compatible with various types of automated tests. Types of automated tests that may be compatible may include, e.g., a test for a single-operating-system application, a test for a cross-operating-system application, and a test for a Web application, etc. The test agent 520-k may be associated with one or more test device sets. For example, the test agent 520-1 may be associated with the test device set 540-1 to the test device set 540-M, and the test agent 520-K may be associated with the test device set 542-1 to the test device set 542-N. Each test device set may be configured to run a target application.

The test device set may have various forms, e.g., a single test device, a test device pair consisting of two test devices, and a test device group consisting of more than two test devices, etc. FIG.6 illustrates exemplary test device sets 600, 620, and 640 according to an embodiment of the present disclosure. Each test device set of the test device sets 600, 620, and 640 may correspond to any one of the test device set 540-1 to the test device set 540-M and the test device set 542-1 to the test device set 542-N in FIG.5.

Test device set 600 may include a single test device 610. The single test device 610 may perform an automated test independently. A target application 612 may run on the test device 610. The target application 612 may be, e.g., a single-operating-system application, a Web application, etc.

The test device set 620 may include two test devices, e.g., a test device 630-1 and a test device 630-2. The test device 630-1 and the test device 630-2 may be bound to each other, forming a test device pair. The test device 630-1 and the test device 630-2 may perform an automated test coordinately. A target application 632 may run on the test device 630-1 and the test device 630-2. The target application 632 may be, e.g., a single-operating-system application, a Web application, a cross-operating-system application, etc. In the case where the target application 632 running on the test device 630-1 and the test device 630-2 is a cross-operating-system application, a version of the target application 632 corresponding to a respective operating system may run on the test device 630-1 and the test device 630-2, respectively. As an example, the target application 632 may be an application involving an interaction between an Android operating system and a Windows operating system. In this case, the test device 630-1 may be an Android mobile phone installed with an Android operating system, and the test device 630-2 may be a Windows computer installed with a Windows operating system. An Android version of the target application 632 may run on the test device 630-1, and a Windows version of the target application 632 may run on the test device 630-2. A test agent associated with the test device set 620 may be a test agent that supports a testing framework such as Appium. When performing an automated test, operations may be performed on both the test device 630-1 and the test device 630-2, respectively. Interoperation may also be performed between the test device 630-1 and the test device 630-2.

The test device set 640 may include more than two test devices, e.g., a test device 650-1, a test device 650-2, ..., a test device 650-T (T>2). The test device 650-1, the test device 650-2, ..., and the test device 650-T may be bound to each other, forming a test device group or a test device cluster. These test devices may perform an automated test coordinately. A target application 652 may run on the test device 650-1, the test device 650-2, ..., the test device 650-T. The target application 652 may be, e.g., a single-operating-system application, a Web application, a cross-operating-system application, etc. As an example, the target application 652 may be, e.g., an application about server distribution operations, which requires the participation of more than two test devices. By enabling the test agent to support various types of testing frameworks and associating various forms of test device sets with the test agent, application scenarios for automated tests may be wider, thereby significantly improving the compatibility of the automated testing service. Referring back to FIG.5, according to the embodiments of the present disclosure, a test agent may have various implementations with respect to a test device set.
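The three forms of test device set described above can be modeled with a small data structure. The class layout and device labels are illustrative assumptions; only the single/pair/group distinction comes from the disclosure.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TestDeviceSet:
    """A test device set in any of the three forms described above; this
    layout is an illustrative model, not part of the disclosure."""
    devices: List[str]

    @property
    def form(self) -> str:
        n = len(self.devices)
        if n == 1:
            return "single"  # a single test device (e.g., test device set 600)
        if n == 2:
            return "pair"    # a bound test device pair (e.g., test device set 620)
        return "group"       # a test device group/cluster (e.g., test device set 640)

single = TestDeviceSet(["android-phone"])
pair = TestDeviceSet(["android-phone", "windows-pc"])
group = TestDeviceSet(["server-1", "server-2", "server-3"])
```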

In an implementation, a test agent may be implemented independently of a test device set. For example, the test agent 520-1 may be implemented independently of the test device set 540-1 to the test device set 540-M, the test agent 520-K may be implemented independently of the test device set 542-1 to the test device set 542-N, etc. FIG.5 illustrates the way in which a test agent is implemented independently of a test device set. In such an implementation, the test device set may be connected to its associated test agent in a wired way. The number of test device sets connected to a test agent may depend on the number of ports that the test agent can provide.

In another implementation, a test agent may be implemented in a test device set. For example, a test agent may be implemented as one test device in a test device set. The test agent needs to obtain resources of other applications when executing an automated test, so it requires mutual interaction across applications. If a test device can enable mutual interaction across applications, then the test device itself may become a test agent. In such an implementation, when the test device set includes only a single test device, the test agent and the test device may be the same device. When the test device set includes more than one test device, the test agent may be one test device of the more than one test device.

According to the embodiments of the present disclosure, in the architecture 500, unified management may be performed on test resources for sharing among various test agents. The test resources may include, e.g., the application package 562, the test suite 564, the test device set 540-1 to the test device set 540-M, and the test device set 542-1 to the test device set 542-N, etc. The application package 562 and the test suite 564 may be stored in the data storage 560 deployed in the cloud, and managed by the registration center 510. The test device set 540-1 to the test device set 540-M and the test device set 542-1 to the test device set 542-N may also be managed by the registration center 510. In this way, different users or teams, e.g., users or teams located at different geographical locations, may share test resources, thereby improving the reusability of test resources and saving resource costs.

It should be appreciated that the architecture 500 shown in FIG.5 is only an example of the architecture of the automated testing system. Depending on actual application requirements, the automated testing system may have any other structure, and may include more or fewer components. For example, a new test agent may be added to an existing automated testing system through the process 200 of FIG.2, to enhance the scalability of the automated testing system. In addition, the registration center 510 may be used as a test agent to re-register to another registration center, thereby further enhancing the scalability of the automated testing system. FIG.7 is a flowchart of an exemplary method 700 for performing an automated test according to an embodiment of the present disclosure.

At 710, a test request for performing an automated test with a specified test device set may be received through a registration center.

At 720, a test task corresponding to the test request may be generated through the registration center.

At 730, the test task may be scheduled to a test agent associated with the specified test device set through the registration center.

At 740, the automated test may be performed with the specified test device set through the test agent.
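Steps 710 to 740 can be sketched end to end as a minimal in-memory model. The class, field names, and routing table are illustrative assumptions; the agent is stood in for by a callable so the scheduling path stays visible.

```python
class RegistrationCenter:
    """Minimal model of steps 710-740: receive a test request, generate a
    test task, and schedule it to the test agent associated with the
    specified test device set. All names are illustrative assumptions."""

    def __init__(self):
        self.agents = {}        # device_set_id -> test agent (modeled as a callable)
        self._next_task_id = 0

    def register_agent(self, device_set_id, agent):
        self.agents[device_set_id] = agent

    def handle_test_request(self, request):
        # 710: receive the test request; 720: generate a corresponding test task
        self._next_task_id += 1
        task = {"task_id": self._next_task_id,
                "device_set": request["device_set"],
                "test_suite": request["test_suite"]}
        # 730: schedule the task to the agent associated with the specified device set
        agent = self.agents[request["device_set"]]
        # 740: the test agent performs the automated test and returns a result
        return agent(task)

center = RegistrationCenter()
center.register_agent("set-001", lambda task: {"task_id": task["task_id"], "status": "passed"})
result = center.handle_test_request({"device_set": "set-001", "test_suite": "smoke"})
```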

In an implementation, the automated test may comprise at least one of a test for a single-operating-system application, a test for a cross-operating-system application, and a test for a Web application.

In an implementation, a form of the specified test device set may be at least one of: a single test device, a test device pair consisting of two test devices, and a test device group consisting of more than two test devices.

In an implementation, the specified test device set may include more than one test device. The performing the automated test may comprise: coordinately performing, through the test agent, the automated test with the more than one test device.

In an implementation, the test agent may be implemented independently of the specified test device set, or may be implemented in the specified test device set.

In an implementation, the method 700 may further comprise: determining, through the test agent, whether the test task includes an authorization code of the specified test device set, the authorization code being from the test request. The automated test may be performed in response to determining that the test task includes the authorization code.

In an implementation, the method 700 may further comprise creating the test agent through the following operations: running a test agent creating program on a terminal device; initiating, at the terminal device, a registration process to the registration center; and configuring the terminal device as the test agent in response to the completion of the registration process.

The method 700 may further comprise: receiving, through the registration center, an authorization code generating request for generating an authorization code of a test device set associated with the test agent; generating, through the registration center, the authorization code; and sending, through the registration center, the generated authorization code to the test agent. In an implementation, the method 700 may further comprise: receiving, through the registration center, an authorization code acquiring request for acquiring an authorization code of the specified test device set; forwarding, through the registration center, the authorization code acquiring request to the test agent; and determining, through the registration center, whether to provide the authorization code based on a response received from the test agent.

In an implementation, the method 700 may further comprise: visualizing a test result of the automated test. The test result may include a video navigation interface. The video navigation interface may include: a navigation area for displaying multiple test cases included in the automated test, each test case being selectable, and a video area for displaying a video clip corresponding to a selected test case.

It should be appreciated that the method 700 may further comprise any step/process for performing the automated test according to the embodiments of the present disclosure as mentioned above.

FIG.8 illustrates an exemplary automated testing system 800 according to an embodiment of the present disclosure.

The automated testing system 800 may comprise: a registration center 810, configured to receive a test request for performing an automated test for a target application with a specified test device set, generate a test task corresponding to the test request, and schedule the test task to a test agent associated with the specified test device set; at least one test agent 820, each test agent being configured to receive a test task, and perform an automated test with a test device set specified in the received test task; and at least one test device set 830, each test device set being associated with one of the at least one test agent and configured to run the target application.

In an implementation, the target application may include at least one of a single-operating-system application, a cross-operating-system application, and a Web application.

In an implementation, a form of each test device set in the at least one test device set may be at least one of: a single test device, a test device pair consisting of two test devices, and a test device group consisting of more than two test devices.

In an implementation, each test agent in the at least one test agent may be implemented independently of a test device set associated with the test agent, or may be implemented in the test device set.

In an implementation, each test agent in the at least one test agent may be further configured to: determine whether the received test task includes an authorization code of a corresponding specified test device set; and perform the automated test in response to determining that the received test task includes the authorization code.

In an implementation, the registration center may be further configured to: receive a registration request from a terminal device, the registration request being triggered by a test agent creating program running on the terminal device; and register the terminal device as a test agent in response to receiving the registration request.

The registration center may be further configured to: receive an authorization code generating request for generating an authorization code of a test device set associated with the registered test agent; generate the authorization code; and send the generated authorization code to the registered test agent.

In an implementation, the registration center may be further configured to: receive an authorization code acquiring request for acquiring an authorization code of the specified test device set; forward the authorization code acquiring request to the test agent; and determine whether to provide the authorization code based on a response received from the test agent.

In an implementation, the registration center may be further configured to visualize a test result of the automated test. The test result may include a video navigation interface. The video navigation interface may include: a navigation area for displaying multiple test cases included in the automated test, each test case being selectable, and a video area for displaying a video clip corresponding to a selected test case.

It should be appreciated that the automated testing system may further comprise any other components configured to perform an automated test according to embodiments of the present disclosure as mentioned above.

FIG.9 illustrates an exemplary apparatus 900 for performing an automated test according to an embodiment of the present disclosure.

The apparatus 900 may comprise at least one processor 910 and a memory 920 storing computer-executable instructions. The computer-executable instructions, when executed, may cause the at least one processor 910 to: receive, through a registration center, a test request for performing an automated test with a specified test device set; generate, through the registration center, a test task corresponding to the test request; schedule, through the registration center, the test task to a test agent associated with the specified test device set; and perform, through the test agent, the automated test with the specified test device set.

It should be appreciated that the at least one processor 910 may further perform any other steps/processes of the method for performing the automated test according to the embodiments of the present disclosure as mentioned above.

The embodiments of the present disclosure propose a computer program product for performing an automated test, comprising a computer program that is executed by at least one processor for: receiving, through a registration center, a test request for performing an automated test with a specified test device set; generating, through the registration center, a test task corresponding to the test request; scheduling, through the registration center, the test task to a test agent associated with the specified test device set; and performing, through the test agent, the automated test with the specified test device set. In addition, the computer program may further be performed for implementing any other steps/processes of the method for performing the automated test according to the embodiments of the present disclosure as mentioned above.

The embodiments of the present disclosure may be embodied in a non-transitory computer-readable medium. The non-transitory computer-readable medium may comprise instructions that, when executed, cause one or more processors to perform any operation of the method for performing the automated test according to an embodiment of the present disclosure as mentioned above.

It should be appreciated that all the operations in the methods described above are merely exemplary, and the present disclosure is not limited to any operations in the methods or sequence orders of these operations, and should cover all other equivalents under the same or similar concepts. In addition, the articles “a” and “an” as used in this specification and the appended claims should generally be construed to mean “one” or “one or more” unless specified otherwise or clear from the context to be directed to a singular form.

It should also be appreciated that all the modules in the apparatuses described above may be implemented in various approaches. These modules may be implemented as hardware, software, or a combination thereof. Moreover, any of these modules may be further functionally divided into sub-modules or combined together.

Processors have been described in connection with various apparatuses and methods. These processors may be implemented using electronic hardware, computer software, or any combination thereof. Whether such processors are implemented as hardware or software will depend upon the particular application and overall design constraints imposed on the system. By way of example, a processor, any portion of a processor, or any combination of processors presented in the present disclosure may be implemented with a microprocessor, microcontroller, digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a state machine, gated logic, discrete hardware circuits, and other suitable processing components configured for performing the various functions described throughout the present disclosure. The functionality of a processor, any portion of a processor, or any combination of processors presented in the present disclosure may be implemented with software being executed by a microprocessor, microcontroller, DSP, or other suitable platform. Software shall be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, threads of execution, procedures, functions, etc. The software may reside on a computer-readable medium. A computer-readable medium may include, by way of example, memory such as a magnetic storage device (e.g., hard disk, floppy disk, magnetic strip), an optical disk, a smart card, a flash memory device, random access memory (RAM), read only memory (ROM), programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), a register, or a removable disk. 
Although memory is shown separate from the processors in the various aspects presented throughout the present disclosure, the memory may be internal to the processors, e.g., cache or register.

The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein. All structural and functional equivalents to the elements of the various aspects described throughout the present disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein and intended to be encompassed by the claims.