
Title:
FRAUDULENT TRANSACTION IDENTIFICATION METHOD AND APPARATUS, SERVER, AND STORAGE MEDIUM
Document Type and Number:
WIPO Patent Application WO/2019/178501
Kind Code:
A1
Abstract:
Implementations of the present specification provide a fraudulent transaction identification method. One example method includes training a deep learning network based on an operation sequence and time difference information; obtaining the operation sequence of a transaction to be identified and the time difference information of adjacent operations; and predicting, based on the operation sequence and the time difference information and based on a deep learning network, the probability that the transaction is a fraudulent transaction.

Inventors:
LI LONGFEI (CN)
Application Number:
PCT/US2019/022512
Publication Date:
September 19, 2019
Filing Date:
March 15, 2019
Assignee:
ALIBABA GROUP HOLDING LTD (US)
International Classes:
G06Q20/02; G06N20/00; G06Q20/08; G06Q20/38; G06Q20/40
Foreign References:
US20170372317A12017-12-28
Other References:
None
Attorney, Agent or Firm:
MATTSON, Matthew (US)
Claims:
CLAIMS

What is claimed is:

1. A method for identification of fraudulent transactions, the method comprising:

training a deep learning network based on an operation sequence and time difference information (S305);

obtaining the operation sequence of a transaction to be identified and the time difference information of adjacent operations (S201); and

predicting, based on the operation sequence and the time difference information and based on a deep learning network, the probability that the transaction is a fraudulent transaction (S202).

2. The method according to claim 1, wherein training the deep learning network comprises: obtaining a black sample and a white sample of a transaction, and separately extracting operation sequences of the black sample and the white sample and time difference information of adjacent operations (S301);

separately performing feature conversion and selection on the sample operation sequence and the time difference information to obtain an operation feature and a time difference feature (S302);

calculating a similarity between the operation feature and the time difference feature

(S303);

combining a plurality of operation features based on the similarity, to obtain a combined operation feature (S304); and

training the deep learning network through classification based on the combined operation feature (S305).

3. The method according to claim 2, wherein separately performing the feature conversion comprises:

performing the feature conversion on the operation sequence and the time difference information to obtain an initial operation feature and an initial time feature; and

separately performing dimension reduction and irrelevant feature removal operations on the initial operation feature and the initial time feature to select the operation feature and the time difference feature.

4. The method according to claim 2, wherein calculating the similarity between the operation feature and the time difference feature comprises:

calculating an inner product of an operation feature matrix of the operation feature and a time difference feature matrix of the time difference feature to obtain the similarity between the operation feature and the time difference feature.

5. The method according to claim 2, wherein combining a plurality of operation features based on the similarity comprises:

combining the plurality of operation features by calculating a sum of corresponding similarities or by taking a maximum value of corresponding similarities.

6. The method according to claim 2, wherein the deep learning network is trained based on a recurrent neural network (RNN) algorithm, a long short-term memory model (LSTM) algorithm, a gated recurrent unit (GRU) algorithm, or a simple recurrent unit (SRU) algorithm.

7. The method according to claim 2, wherein the combined operation feature is based on a sum or a maximum value of the operation feature and a time difference feature matrix of the time difference feature.

8. The method according to claim 1, wherein the operation sequence comprises an operation identifier involved in the transaction.

9. The method according to claim 8, wherein the operation identifier comprises a code or a sequence number.

10. An apparatus for identification of a fraudulent transaction, the apparatus comprising a plurality of modules configured to perform the method of any one of claims 1 to 9.

Description:
FRAUDULENT TRANSACTION IDENTIFICATION METHOD AND APPARATUS,

SERVER, AND STORAGE MEDIUM

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority to Chinese Patent Application No. 201810214431.6, filed on March 15, 2018, which is hereby incorporated by reference in its entirety.

TECHNICAL FIELD

[0002] Implementations of the present specification relate to the field of machine learning technologies, and in particular, to a fraudulent transaction identification method and apparatus, a server, and a storage medium.

BACKGROUND

[0003] The development of network transactions has been accompanied by various frauds. Fraudsters use some information about victims to cheat them and induce them to transfer money to designated accounts or bank cards. This causes huge losses to the victims. Therefore, how to identify these fraudulent transactions is very important to financial security.

SUMMARY

[0004] Implementations of the present specification provide a fraudulent transaction identification method and apparatus, a server, and a storage medium.

[0005] According to a first aspect, an implementation of the present specification provides a fraudulent transaction identification method, including the following: obtaining an operation sequence of a transaction to be identified and time difference information of adjacent operations; and predicting, based on the operation sequence and the time difference information and based on a deep learning network, the probability that the transaction to be identified is a fraudulent transaction, where the deep learning network is obtained by performing training in advance based on an operation sequence and a time difference similarity of a transaction sample.

[0006] According to a second aspect, an implementation of the present specification provides a deep learning network training method, and the method is used for fraudulent transaction identification and includes the following: obtaining a black sample and a white sample of a transaction, and separately extracting operation sequences of the black sample and the white sample and time difference information of adjacent operations; separately performing feature conversion and selection on the sample operation sequence and the time difference information to obtain an operation feature and a time difference feature; calculating a similarity between the operation feature and the time difference feature; combining a plurality of operation features based on the similarity, to obtain a combined operation feature; and training a deep learning network through classification based on the combined operation feature.

[0007] According to a third aspect, an implementation of the present specification provides a fraudulent transaction identification apparatus, including the following: an information acquisition unit, configured to obtain an operation sequence of a transaction to be identified and time difference information of adjacent operations; and a prediction unit, configured to predict, based on the operation sequence and the time difference information and based on a deep learning network, the probability that the transaction to be identified is a fraudulent transaction, where the deep learning network is obtained by performing training in advance based on an operation sequence and a time difference similarity of a transaction sample.

[0008] According to a fourth aspect, an implementation of the present specification provides a deep learning network training apparatus, and the apparatus is used for fraudulent transaction identification and includes the following: a sample acquisition and extraction unit, configured to obtain a black sample and a white sample of a transaction, and separately extract operation sequences of the black sample and the white sample and time difference information of adjacent operations; a feature conversion unit, configured to separately perform feature conversion and selection on the sample operation sequence and the time difference information to obtain an operation feature and a time difference feature; a similarity calculation unit, configured to calculate a similarity between the operation feature and the time difference feature; a feature combination unit, configured to combine a plurality of operation features based on the similarity, to obtain a combined operation feature; and a classification training unit, configured to train a deep learning network through classification based on the combined operation feature.

[0009] According to a fifth aspect, an implementation of the present specification provides a server, including a memory, a processor, and a computer program that is stored in the memory and that can run on the processor, where when executing the program, the processor performs the steps of any described method.

[0010] According to a sixth aspect, an implementation of the present specification provides a computer readable storage medium, where the computer readable storage medium stores a computer program, and when the program is executed by a processor, the steps of any described method are performed.

[0011] The implementations of the present specification have the following beneficial effects:

[0012] In the implementations of the present invention, by combining a time factor and a deep learning network, a time-focused deep learning network is innovatively provided for fraudulent transaction identification, to find some associations of transaction operations in terms of time, thereby better identifying a fraudulent transaction.

BRIEF DESCRIPTION OF DRAWINGS

[0013] FIG. 1 is a schematic diagram illustrating a scenario of a fraudulent transaction identification method, according to an implementation of the present specification;

[0014] FIG. 2 is a flowchart illustrating a fraudulent transaction identification method, according to a first aspect of the implementations of the present specification;

[0015] FIG. 3 is a flowchart illustrating a deep learning network training method, according to a second aspect of the implementations of the present specification;

[0016] FIG. 4 is a schematic diagram illustrating an example of a deep learning network training method, according to a second aspect of the implementations of the present specification;

[0017] FIG. 5 is a schematic structural diagram illustrating a fraudulent transaction identification apparatus, according to a third aspect of the implementations of the present specification;

[0018] FIG. 6 is a schematic structural diagram illustrating a deep learning network training apparatus, according to a fourth aspect of the implementations of the present specification;

[0019] FIG. 7 is a schematic structural diagram illustrating a server, according to a fifth aspect of the implementations of the present specification; and

[0020] FIG. 8 is a flowchart illustrating an example of a computer-implemented method for identifying a fraudulent transaction using a deep learning network, according to an implementation of the present disclosure.

DESCRIPTION OF IMPLEMENTATIONS

[0021] To better understand the previous technical solutions, the following describes the technical solutions in the implementations of the present specification in detail by using the accompanying drawings and specific implementations. It should be understood that the implementations of the present specification and the specific features in the implementations are detailed descriptions of the technical solutions in the implementations of the present specification, and are not intended to limit the technical solutions of the present specification. The implementations of the present specification and the technical features in the implementations can be mutually combined without conflicts.

[0022] FIG. 1 is a schematic diagram illustrating a fraudulent transaction identification scenario, according to an implementation of the present specification. A client (user side) submits a transaction processing request to a server (network side); the server parses the transaction processing request and performs time-difference-based exception identification on it. The server obtains a deep learning network by performing training in advance based on an operation sequence and a time difference similarity of a transaction sample, and predicts, by using the deep learning network, the probability that a transaction to be identified is a fraudulent transaction.

[0023] According to a first aspect, an implementation of the present specification provides a fraudulent transaction identification method. Referring to FIG. 2, the method includes the following steps:

[0024] S201. Obtain an operation sequence of a transaction to be identified and time difference information of adjacent operations.

[0025] S202. Predict, based on the operation sequence and the time difference information and based on a deep learning network, the probability that the transaction to be identified is a fraudulent transaction, where the deep learning network is obtained by performing training in advance based on an operation sequence and a time difference similarity of a transaction sample.

[0026] In the present implementation of the present invention, a time factor is considered in the fraudulent transaction identification process, so that a fraudulent transaction can be better identified. This is because a fraudulent transaction usually includes a plurality of operations periodically initiated by a machine. For example, a typical operation sequence of a fraudulent account might always be A, B, C, D, A, B, C, D, where A, B, C, and D each denote a specific operation. Judged by the sequence alone there appears to be nothing wrong, because many legitimate users perform operations in this order; however, problems can be identified once the time intervals are considered. The time difference between two operations of an ordinary user is usually random, whereas the time difference between two operations of the account described above is basically stable, which suggests the operations are performed by a machine rather than a person. After checking, such a user can be identified as a fraud robot that constantly sends collection requests to deceive others into transferring money.
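
To make this intuition concrete, here is a small illustrative sketch (the operations and timestamps are invented for illustration and are not part of the original disclosure) that compares the spread of adjacent-operation time differences for a human-like session and a machine-like session with the same operation sequence:

```python
import statistics

# Same operation sequence "A, B, C, D, A, B, C, D" for both sessions;
# only the timestamps (in seconds) differ.
human_times = [0.0, 3.1, 9.8, 12.4, 40.2, 44.9, 51.0, 55.7]
machine_times = [0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0, 35.0]

def time_differences(timestamps):
    """Time difference between each pair of adjacent operations."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

for label, ts in [("human", human_times), ("machine", machine_times)]:
    diffs = time_differences(ts)
    # A near-zero standard deviation suggests periodic, machine-driven operations.
    print(label, diffs, "stdev:", round(statistics.stdev(diffs), 3))
```

The near-zero spread of the machine-like session is exactly the kind of temporal regularity that the trained network is intended to pick up.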

[0027] As described above, whether the transaction to be identified is a fraudulent transaction is identified based on the deep learning network. Therefore, the deep learning network needs to be trained in advance before actual identification.

[0028] In an optional method, a process of training the deep learning network includes the following.

[0029] (1) Obtain a black sample and a white sample of a transaction, and separately extract operation sequences of the black sample and the white sample and time difference information of adjacent operations.

[0030] The black sample is a transaction sample determined to be a fraudulent transaction, and the white sample is a transaction sample determined to be a non-fraudulent transaction. The operation sequence consists of the operation identifiers (for example, codes or sequence numbers) involved in the transaction, and the time difference information is the time difference between two adjacent operations.
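
As a rough illustration of this extraction step, the following sketch assumes a hypothetical log format in which each record carries an operation identifier and a timestamp; the field names, labels, and values are illustrative only and are not taken from the specification:

```python
# Hypothetical labeled transaction logs: each operation is (operation_id, timestamp_in_seconds).
# label=1 marks a black (fraudulent) sample, label=0 a white (non-fraudulent) sample.
raw_samples = [
    {"label": 1, "ops": [(1, 0.0), (2, 5.0), (3, 10.0), (4, 15.0)]},
    {"label": 0, "ops": [(1, 0.0), (5, 2.3), (2, 31.7), (7, 33.0)]},
]

def extract_sequence_and_time_diffs(sample):
    """Return the operation-identifier sequence D and the adjacent time differences T."""
    ops = sorted(sample["ops"], key=lambda record: record[1])  # chronological order
    operation_sequence = [op_id for op_id, _ in ops]
    timestamps = [t for _, t in ops]
    # The first operation has no predecessor, so its time difference is set to 0.
    time_diffs = [0.0] + [b - a for a, b in zip(timestamps, timestamps[1:])]
    return operation_sequence, time_diffs

for sample in raw_samples:
    D, T = extract_sequence_and_time_diffs(sample)
    print(sample["label"], D, T)
```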

[0031] (2) Separately perform feature conversion and selection on the sample operation sequence and the time difference information to obtain an operation feature and a time difference feature.

[0032] For example, feature conversion is first performed on the operation sequence and the time difference information to obtain an initial operation feature and an initial time feature, and then dimension reduction and irrelevant feature removal operations are separately performed on the initial operation feature and the initial time feature to select the operation feature and the time difference feature.
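
The specification does not prescribe particular conversion or selection techniques. As one plausible, purely illustrative realization, the sketch below performs dimension reduction with PCA and removes near-constant (irrelevant) columns with a variance threshold; all shapes, component counts, and thresholds are assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import VarianceThreshold

# Illustrative initial features: one row per time step, one column per raw feature
# dimension (in practice these could come from one-hot encoding or an embedding layer).
initial_operation_feature = np.random.rand(20, 32)
initial_time_feature = np.random.rand(20, 32)

def reduce_and_select(features, n_components=8, variance_threshold=1e-3):
    """Dimension reduction (PCA) followed by removal of near-constant columns."""
    reduced = PCA(n_components=n_components).fit_transform(features)
    return VarianceThreshold(threshold=variance_threshold).fit_transform(reduced)

operation_feature = reduce_and_select(initial_operation_feature)
time_difference_feature = reduce_and_select(initial_time_feature)
print(operation_feature.shape, time_difference_feature.shape)
```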

[0033] (3) Calculate a similarity between the operation feature and the time difference feature.

[0034] For example, an inner product of an operation feature matrix of the operation feature and a time difference feature matrix of the time difference feature is calculated to obtain the similarity between the operation feature and the time difference feature.

[0035] (4) Combine a plurality of operation features based on the similarity, to obtain a combined operation feature.

[0036] For example, the plurality of operation features are combined by calculating a sum of corresponding similarities or by taking a maximum value of corresponding similarities.
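
For concreteness, a short numpy sketch of these two steps: the similarity as an inner product per time step, and the combination as either a similarity-weighted sum or the feature at the time step with the maximum similarity. The shapes and placeholder values are illustrative:

```python
import numpy as np

# One row per time step; the dimensions are illustrative.
operation_features = np.random.rand(6, 16)        # operation feature matrix
time_difference_features = np.random.rand(6, 16)  # time difference feature matrix

# Similarity between the operation feature and the time difference feature at each
# time step, computed as an inner product (one scalar per time step).
similarity = np.sum(operation_features * time_difference_features, axis=1)

# Combine the operation features based on the similarity: a similarity-weighted sum,
# or the operation feature at the time step with the maximum similarity.
combined_by_sum = (similarity[:, None] * operation_features).sum(axis=0)
combined_by_max = operation_features[np.argmax(similarity)]

print(combined_by_sum.shape, combined_by_max.shape)
```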

[0037] (5) Train the deep learning network through classification based on the combined operation feature.

[0038] For example, the deep learning network can be trained based on algorithms such as a recurrent neural network (RNN) algorithm, a long short-term memory (LSTM) algorithm, a gated recurrent unit (GRU) algorithm, or a simple recurrent unit (SRU) algorithm.
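
A minimal sketch of this classification-based training step, assuming the combined operation features have already been computed upstream (placeholder tensors stand in for them here) and using a small feed-forward classification head with a binary cross-entropy loss; in the full method the recurrent backbone (RNN, LSTM, GRU, or SRU) would be trained jointly through the same loss. Layer sizes and hyperparameters are illustrative:

```python
import torch
import torch.nn as nn

# Placeholder combined operation features and labels
# (1 = black/fraudulent sample, 0 = white/non-fraudulent sample).
combined_features = torch.randn(128, 16)
labels = torch.randint(0, 2, (128, 1)).float()

classifier = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(classifier(combined_features), labels)
    loss.backward()
    optimizer.step()

# Predicted probability that each transaction is fraudulent.
probabilities = torch.sigmoid(classifier(combined_features))
```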

[0039] It can be seen that in the present implementation of the present invention, by combining the time factor and the deep learning network, a time-focused deep learning network is innovatively provided for fraudulent transaction identification, to find some associations of transaction operations in terms of time, thereby better identifying a fraudulent transaction.

[0040] According to a second aspect based on the same invention concept, an implementation of the present specification provides a deep learning network training method for fraudulent transaction identification.

[0041] Referring to FIG. 3, the method includes S301 to S305.

[0042] S301. Obtain a black sample and a white sample of a transaction, and separately extract operation sequences of the black sample and the white sample and time difference information of adjacent operations.

[0043] The black sample is a transaction sample determined to be a fraudulent transaction, and the white sample is a transaction sample determined to be a non-fraudulent transaction. The operation sequence consists of the operation identifiers (for example, codes or sequence numbers) involved in the transaction, and the time difference information is the time difference between two adjacent operations.

[0044] S302. Separately perform feature conversion and selection on the sample operation sequence and the time difference information to obtain an operation feature and a time difference feature.

[0045] For example, feature conversion is first performed on the operation sequence and the time difference information to obtain an initial operation feature and an initial time feature, and then dimension reduction and irrelevant feature removal operations are separately performed on the initial operation feature and the initial time feature to select the operation feature and the time difference feature.

[0046] S303. Calculate a similarity between the operation feature and the time difference feature.

[0047] For example, an inner product of an operation feature matrix of the operation feature and a time difference feature matrix of the time difference feature is calculated to obtain the similarity between the operation feature and the time difference feature.

[0048] S304. Combine a plurality of operation features based on the similarity, to obtain a combined operation feature.

[0049] For example, the plurality of operation features are combined by calculating a sum of corresponding similarities or by taking a maximum value of corresponding similarities.

[0050] S305. Train the deep learning network through classification based on the combined operation feature.

[0051] For example, the deep learning network can be trained based on algorithms such as a recurrent neural network (RNN) algorithm, a long short-term memory (LSTM) algorithm, a gated recurrent unit (GRU) algorithm, or a simple recurrent unit (SRU) algorithm.

[0052] FIG. 4 is a schematic diagram illustrating an example of a deep learning network training method, according to a second aspect of the implementations of the present specification. In the present example, a deep learning network used for fraudulent transaction identification is trained based on an LSTM.

[0053] An example training process is described as follows:

[0054] 1. Extract, based on the transaction time, the operation records of both parties of a transaction from the system, including the operation name and the specific operation time.

[0055] 2. Based on the time and users of a fraudulent transaction that are recorded in a database, select some operations performed by the fraudulent user before that time, and chronologically sort the operations; number the operations, for example, number operation a as 1 and number operation b as 2; use D to represent the chronologically sorted data, where each element is D_i; record the time between each operation and the previous operation, and use T to represent the time, where each element is T_i; and constitute a black sample in a training set by using the two pieces of data.

[0056] 3. Randomly select, from the times and users of non-fraudulent transactions that are recorded in the database, some operations performed by the user before the transaction time, and chronologically sort the operations; number the operations, for example, number operation a as 1 and number operation b as 2; use D to represent the chronologically sorted data, where each element is D_i; record the time difference between two adjacent operations and use T to represent the time differences, where each element is T_i; and constitute a white sample in the training set by using the two pieces of data.

[0057] 4. Input the processed data D into a long short-term memory model (LSTM). Different layers (for example, an additional LSTM layer, a fully connected layer, or a convolutional layer) can be added based on actual needs, to achieve a better effect through layer stacking.

[0058] 5. Obtain the output at each moment of the LSTM to serve as an operation embedding. The operation embedding is, for example, a two-dimensional matrix representing an operation feature.

[0059] 6. Input the data T into the model, and look up the corresponding time embedding for each element of T. The time embedding is, for example, a two-dimensional matrix representing a time feature.

[0060] 7. Calculate a similarity between the operation embedding and the time embedding at each corresponding time point. For example, the similarity is calculated by using an inner product:

F(D_i, T_i) = <D_i, T_i>

[0061] 8. Combine the different embeddings based on the similarity, by calculating a sum or taking a maximum value, for example, by using the following sum equation:

O = Σ_i F(D_i, T_i) · D_i

[0062] 9. Place the output O (namely, the combined operation feature described above) into a classifier for network training.
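
Putting steps 1 to 9 together, the following PyTorch sketch is one possible realization of such a time-focused LSTM: the operation identifiers D pass through an LSTM whose per-step outputs serve as operation embeddings D_i, bucketed time differences T are looked up as time embeddings T_i, an inner product gives the per-step similarity F(D_i, T_i), the similarity-weighted sum produces O, and O is fed to a classifier. The class name, dimensions, and the bucketing of time differences are illustrative assumptions rather than details taken from the specification:

```python
import torch
import torch.nn as nn

class TimeFocusedLSTM(nn.Module):
    """Illustrative sketch of a time-focused LSTM for fraudulent transaction identification."""

    def __init__(self, num_operations=100, num_time_buckets=50, embed_dim=32, hidden_dim=32):
        super().__init__()
        self.op_embedding = nn.Embedding(num_operations, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Time differences are assumed to be bucketed into discrete ids beforehand.
        self.time_embedding = nn.Embedding(num_time_buckets, hidden_dim)
        self.classifier = nn.Linear(hidden_dim, 1)

    def forward(self, operation_ids, time_bucket_ids):
        # Steps 4-5: per-step LSTM outputs serve as the operation embeddings D_i.
        op_emb, _ = self.lstm(self.op_embedding(operation_ids))   # (batch, steps, hidden)
        # Step 6: select the time embedding T_i for each time difference.
        t_emb = self.time_embedding(time_bucket_ids)              # (batch, steps, hidden)
        # Step 7: similarity F(D_i, T_i) via an inner product at each time step.
        similarity = (op_emb * t_emb).sum(dim=-1, keepdim=True)   # (batch, steps, 1)
        # Step 8: combine the embeddings with the sum O = sum_i F(D_i, T_i) * D_i.
        combined = (similarity * op_emb).sum(dim=1)               # (batch, hidden)
        # Step 9: classifier output; a sigmoid gives the fraud probability.
        return torch.sigmoid(self.classifier(combined)).squeeze(-1)

model = TimeFocusedLSTM()
operation_ids = torch.randint(0, 100, (4, 10))    # batch of 4 sequences of 10 operations
time_bucket_ids = torch.randint(0, 50, (4, 10))   # bucketed adjacent time differences
fraud_probability = model(operation_ids, time_bucket_ids)
```

In an actual system the model would be trained end to end on the black and white samples with a binary classification loss, as described above.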

[0063] In the present implementation of the present invention, by combining a time factor and the LSTM, a time-focused long short-term memory is innovatively provided. The resulting model can be used for both sequence modeling and time modeling, and can find some associations in the data. Compared with a conventional global variable model, the performance during fraudulent transaction identification is improved by about 10%.

[0064] According to a third aspect, based on the same invention concept, an implementation of the present specification provides a fraudulent transaction identification apparatus. Referring to FIG. 5, the apparatus includes the following: an information acquisition unit 501, configured to obtain an operation sequence of a transaction to be identified and time difference information of adjacent operations; and a prediction unit 502, configured to predict, based on the operation sequence and the time difference information and based on a deep learning network, the probability that the transaction to be identified is a fraudulent transaction.

[0065] The deep learning network is obtained by performing training in advance based on an operation sequence and a time difference similarity of a transaction sample.

[0066] In an optional method, the apparatus further includes the following: a training unit 503, configured to train the deep learning network.

[0067] The training unit 503 includes the following: a sample acquisition and extraction subunit 5031, configured to obtain a black sample and a white sample of a transaction, and separately extract operation sequences of the black sample and the white sample and time difference information of adjacent operations; a feature conversion subunit 5032, configured to separately perform feature conversion and selection on the sample operation sequence and the time difference information to obtain an operation feature and a time difference feature; a similarity calculation subunit 5033, configured to calculate a similarity between the operation feature and the time difference feature; a feature combination subunit 5034, configured to combine a plurality of operation features based on the similarity, to obtain a combined operation feature; and a classification training subunit 5035, configured to train the deep learning network through classification based on the combined operation feature.

[0068] In an optional method, the feature conversion subunit 5032 is configured to perform feature conversion on the operation sequence and the time difference information to obtain an initial operation feature and an initial time feature; and separately perform dimension reduction and irrelevant feature removal operations on the initial operation feature and the initial time feature to select the operation feature and the time difference feature.

[0069] In an optional method, the similarity calculation subunit 5033 is configured to calculate an inner product of an operation feature matrix of the operation feature and a time difference feature matrix of the time difference feature to obtain the similarity between the operation feature and the time difference feature.

[0070] In an optional method, the feature combination subunit 5034 is configured to combine the plurality of operation features by calculating a sum of corresponding similarities or by taking a maximum value of corresponding similarities.

[0071] In an optional method, the training unit 503 trains the deep learning network based on an RNN algorithm, an LSTM algorithm, a GRU algorithm, or an SRU algorithm.

[0072] According to a fourth aspect, based on the same invention concept, an implementation of the present specification provides a deep learning network training apparatus. Referring to FIG. 6, the apparatus includes the following: a sample acquisition and extraction unit 601, configured to obtain a black sample and a white sample of a transaction, and separately extract operation sequences of the black sample and the white sample and time difference information of adjacent operations; a feature conversion unit 602, configured to separately perform feature conversion and selection on the sample operation sequence and the time difference information to obtain an operation feature and a time difference feature; a similarity calculation unit 603, configured to calculate a similarity between the operation feature and the time difference feature; a feature combination unit 604, configured to combine a plurality of operation features based on the similarity, to obtain a combined operation feature; and a classification training unit 605, configured to train a deep learning network through classification based on the combined operation feature.

[0073] In an optional method, the feature conversion unit 602 is configured to perform feature conversion on the operation sequence and the time difference information to obtain an initial operation feature and an initial time feature; and separately perform dimension reduction and irrelevant feature removal operations on the initial operation feature and the initial time feature to select the operation feature and the time difference feature.

[0074] In an optional method, the similarity calculation unit 603 is configured to calculate an inner product of an operation feature matrix of the operation feature and a time difference feature matrix of the time difference feature to obtain the similarity between the operation feature and the time difference feature.

[0075] In an optional method, the feature combination unit 604 is configured to combine the plurality of operation features by calculating a sum of corresponding similarities or by taking a maximum value of corresponding similarities.

[0076] In an optional method, the deep learning network is trained based on an RNN algorithm, an LSTM algorithm, a GRU algorithm, or an SRU algorithm.

[0077] According to a fifth aspect, based on an invention concept that is the same as that of the fraudulent transaction identification method or the deep learning network training method in the previous implementations, the present invention further provides a server. As shown in FIG. 7, the server includes a memory 704, a processor 702, and a computer program that is stored on the memory 704 and that can run on the processor 702. The processor 702 performs the steps of either the described fraudulent transaction identification method or the described deep learning network training method when executing the program.

[0078] In FIG. 7, a bus architecture is represented by a bus 700. The bus 700 can include any quantity of interconnected buses and bridges, and connects various circuits, including one or more processors represented by the processor 702 and memories represented by the memory 704. The bus 700 can further connect various other circuits such as a peripheral device, a voltage regulator, and a power management circuit, which are well known in the art and therefore are not further described in the present specification. A bus interface 706 provides an interface between the bus 700, a receiver 701, and a transmitter 703. The receiver 701 and the transmitter 703 can be the same element, that is, a transceiver, providing units that are configured to communicate with various other apparatuses on a transmission medium. The processor 702 is responsible for managing the bus 700 and for general processing, and the memory 704 can be configured to store data used when the processor 702 performs an operation.

[0079] According to a sixth aspect, based on an invention concept that is the same as that of the fraudulent transaction identification method or the deep learning network training method in the previous implementations, the present invention further provides a computer readable storage medium. The computer readable storage medium stores a computer program, and the steps of either the described fraudulent transaction identification method or the described deep learning network training method are performed when the program is executed by a processor.

[0080] The present specification is described with reference to the flowcharts and/or block diagrams of the method, the device (system), and the computer program product according to the implementations of the present specification. It should be understood that computer program instructions can be used to implement each process and/or each block in the flowcharts and/or the block diagrams and a combination of a process and/or a block in the flowcharts and/or the block diagrams. These computer program instructions can be provided for a general purpose computer, a dedicated computer, an embedded processor, or a processor of any other programmable data processing device to generate a machine, so that the instructions executed by a computer or a processor of any other programmable data processing device generate a device for implementing a specific function in one or more processes in the flowcharts and/or in one or more blocks in the block diagrams.

[0081] These computer program instructions can be stored in a computer readable memory that can instruct the computer or any other programmable data processing device to work by using a specific method, so that the instructions stored in the computer readable memory generate an artifact that includes an instruction device. The instruction device implements a specific function in one or more processes in the flowcharts and/or in one or more blocks in the block diagrams.

[0082] These computer program instructions can be loaded to a computer or another programmable data processing device, so that a series of operations and steps are performed on the computer or the another programmable device, thereby generating computer-implemented processing. Therefore, the instructions executed on the computer or the another programmable device provide steps for implementing a specific function in one or more processes in the flowcharts and/or in one or more blocks in the block diagrams.

[0083] Although some preferred implementations of the present specification have been described, a person skilled in the art can make changes and modifications to these implementations once they learn the basic invention concept. Therefore, the following claims are intended to be construed as covering the preferred implementations and all changes and modifications falling within the scope of the present specification.

[0084] Obviously, a person skilled in the art can make various modifications and variations to the present specification without departing from the spirit and scope of the present specification. The present specification is intended to cover these modifications and variations of the present specification provided that they fall within the scope of protection described by the following claims and their equivalent technologies.

[0085] FIG. 8 is a flowchart illustrating an example of a computer-implemented method 800 for identifying a fraudulent transaction using a deep learning network, according to an implementation of the present disclosure. For clarity of presentation, the description that follows generally describes method 800 in the context of the other figures in this description. However, it will be understood that method 800 can be performed, for example, by any system, environment, software, and hardware, or a combination of systems, environments, software, and hardware, as appropriate. In some implementations, various steps of method 800 can be run in parallel, in combination, in loops, or in any order.

[0086] At 802, an operation sequence and time difference information associated with a transaction are identified by a server. In some implementations, the transaction comprises a plurality of operations, wherein the operation sequence is an order associated with the plurality of operations, and wherein the time difference information comprises a time difference between each two adjacent operations included in the transaction. From 802, method 800 proceeds to 804.

[0087] At 804, a probability that the transaction is a fraudulent transaction is predicted by the server based on a result provided by a deep learning network, wherein the deep learning network is trained to predict fraudulent transactions based on operation sequences and time differences associated with a plurality of transaction samples, and wherein the deep learning network provides the result in response to input including the operation sequence and the time difference information associated with the transaction.

[0088] In some implementations, training the deep learning network includes obtaining a black sample associated with a fraudulent transaction and a white sample associated with a non-fraudulent transaction; separately extracting, from the black sample and from the white sample, an operation sequence and time difference information; performing a feature conversion and a selection on each of the operation sequences and on each of the time difference information that are extracted from the black sample and the white sample to obtain a number of operation features and a number of time difference features; calculating a similarity between each pair of the operation feature and the time difference feature that corresponds to a specific time point; combining more than one operation features based on the calculated similarity to obtain a combined operation feature; and training the deep learning network through classification based on the combined operation feature.

[0089] In some cases, performing a feature conversion and a selection on each of the operation sequences and each of the time difference information includes performing a feature conversion on each of the operation sequences and each of the time difference information to obtain an initial operation feature and an initial time feature; and separately performing a dimension reduction and an irrelevant feature removal on the initial operation feature and the initial time feature to select a number of operation features and time difference features.

[0090] In some implementations, calculating a similarity between each pair of the operation feature and the time difference feature that corresponds to a specific time point includes calculating an inner product of an operation feature matrix that includes operation features and a time difference feature matrix that includes time difference features to obtain a similarity between each of the operation features and the time difference features.

[0091] In some implementations, the plurality of operation features are combined by calculating a sum of corresponding similarities. In some cases, the deep learning network is trained based on at least one of a recurrent neural network (RNN) algorithm, a long short-term memory (LSTM) algorithm, a gated recurrent unit (GRU) algorithm, and a simple recurrent unit (SRU) algorithm. After 804, method 800 stops.

[0092] Implementations of the present application can solve technical problems in fraudulent transaction identification. Traditionally, the industry has mainly used methods such as logistic regression and random forest to detect fraudulent transactions. With such methods, features are extracted from the user's full operation data. In fraud scenarios, however, the order of the user's operations and the operation times are also important factors, so conventional models bring at least two problems. First, when using user access data, these models need to put all the data together to extract features, ignoring the important factor of the time duration between a user's operations. Second, even if the time-duration data is added to the full data to train the model, the limitations of the traditional global model itself mean that local features cannot be extracted, so some of the user's local operation characteristics are lost. Other models, such as an LSTM that models the sequence data and then determines whether the transaction is fraudulent, likewise ignore the influence of time factors.

[0093] Implementations of the present application provide methods and apparatuses for improving the identification of fraudulent transactions. According to the present disclosure, a client (user side) submits a transaction processing request to a server (network side); the server parses the transaction processing request and performs time-difference-based exception identification on it. The server obtains a deep learning network by performing training in advance based on an operation sequence and a time difference similarity of a transaction sample, and predicts, by using the deep learning network, the probability that a transaction to be identified is a fraudulent transaction. In the implementations of the present invention, by combining a time factor and a deep learning network, a time-focused deep learning network is innovatively provided for fraudulent transaction identification, to find some associations of transaction operations in terms of time, thereby better identifying a fraudulent transaction. For example, by combining a time factor and the LSTM, a time-focused long short-term memory is innovatively provided. The resulting model can be used for both sequence modeling and time modeling, and can find some associations in the data. Compared with a conventional global variable model, the performance during fraudulent transaction identification is improved by about 10%.

[0094] Embodiments and the operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification or in combinations of one or more of them. The operations can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources. A data processing apparatus, computer, or computing device may encompass apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing. The apparatus can include special purpose logic circuitry, for example, a central processing unit (CPU), a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC). The apparatus can also include code that creates an execution environment for the computer program in question, for example, code that constitutes processor firmware, a protocol stack, a database management system, an operating system (for example an operating system or a combination of operating systems), a cross-platform runtime environment, a virtual machine, or a combination of one or more of them. The apparatus and execution environment can realize various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.

[0095] A computer program (also known, for example, as a program, software, software application, software module, software unit, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A program can be stored in a portion of a file that holds other programs or data (for example, one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (for example, files that store one or more modules, sub-programs, or portions of code). A computer program can be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.

[0096] Processors for execution of a computer program include, by way of example, both general- and special-purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random-access memory or both. The essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data. A computer can be embedded in another device, for example, a mobile device, a personal digital assistant (PDA), a game console, a Global Positioning System (GPS) receiver, or a portable storage device. Devices suitable for storing computer program instructions and data include non-volatile memory, media and memory devices, including, by way of example, semiconductor memory devices, magnetic disks, and magneto-optical disks. The processor and the memory can be supplemented by, or incorporated in, special-purpose logic circuitry.

[0097] Mobile devices can include handsets, user equipment (UE), mobile telephones (for example, smartphones), tablets, wearable devices (for example, smart watches and smart eyeglasses), implanted devices within the human body (for example, biosensors, cochlear implants), or other types of mobile devices. The mobile devices can communicate wirelessly (for example, using radio frequency (RF) signals) to various communication networks (described below). The mobile devices can include sensors for determining characteristics of the mobile device's current environment. The sensors can include cameras, microphones, proximity sensors, GPS sensors, motion sensors, accelerometers, ambient light sensors, moisture sensors, gyroscopes, compasses, barometers, fingerprint sensors, facial recognition systems, RF sensors (for example, Wi-Fi and cellular radios), thermal sensors, or other types of sensors. For example, the cameras can include a forward- or rear-facing camera with movable or fixed lenses, a flash, an image sensor, and an image processor. The camera can be a megapixel camera capable of capturing details for facial and/or iris recognition. The camera along with a data processor and authentication information stored in memory or accessed remotely can form a facial recognition system. The facial recognition system or one-or-more sensors, for example, microphones, motion sensors, accelerometers, GPS sensors, or RF sensors, can be used for user authentication.

[0098] To provide for interaction with a user, embodiments can be implemented on a computer having a display device and an input device, for example, a liquid crystal display (LCD) or organic light-emitting diode (OLED)/virtual-reality (VR)/augmented-reality (AR) display for displaying information to the user and a touchscreen, keyboard, and a pointing device by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, for example, visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.

[0099] Embodiments can be implemented using computing devices interconnected by any form or medium of wireline or wireless digital data communication (or combination thereof), for example, a communication network. Examples of interconnected devices are a client and a server generally remote from each other that typically interact through a communication network. A client, for example, a mobile device, can carry out transactions itself, with a server, or through a server, for example, performing buy, sell, pay, give, send, or loan transactions, or authorizing the same. Such transactions may be in real time such that an action and a response are temporally proximate; for example, an individual perceives the action and the response as occurring substantially simultaneously, the time difference for a response following the individual's action is less than 1 millisecond (ms) or less than 1 second (s), or the response is without intentional delay, taking into account processing limitations of the system.

[00100] Examples of communication networks include a local area network (LAN), a radio access network (RAN), a metropolitan area network (MAN), and a wide area network (WAN). The communication network can include all or a portion of the Internet, another communication network, or a combination of communication networks. Information can be transmitted on the communication network according to various protocols and standards, including Long Term Evolution (LTE), 5G, IEEE 802, Internet Protocol (IP), or other protocols or combinations of protocols. The communication network can transmit voice, video, biometric, or authentication data, or other information between the connected computing devices.

[00101] Features described as separate implementations may be implemented, in combination, in a single implementation, while features described as a single implementation may be implemented in multiple implementations, separately, or in any suitable sub-combination. Operations described and claimed in a particular order should not be understood as requiring that particular order, nor that all illustrated operations be performed (some operations can be optional). As appropriate, multitasking or parallel-processing (or a combination of multitasking and parallel-processing) can be performed.