Title:
METHOD AND SYSTEM FOR UPDATING A USER IDENTIFICATION SYSTEM
Document Type and Number:
WIPO Patent Application WO/2022/084039
Kind Code:
A1
Abstract:
Described herein are a user identification system and a method for updating it. The method comprises pre-processing image data of a user acquired from an image capture device, for face detection and alignment; identifying the user by extracting a first feature vector from the pre-processed image data using a first feature extractor and comparing the first feature vector to first user data stored in a first database; and enrolling the user, concurrently with the identifying, by extracting a second feature vector from the pre-processed image data using a second feature extractor and storing the second feature vector with second user data in a second database.

Inventors:
FARESSE MARC (CH)
Application Number:
PCT/EP2021/077554
Publication Date:
April 28, 2022
Filing Date:
October 06, 2021
Assignee:
DORMAKABA SCHWEIZ AG (CH)
International Classes:
G06V40/16; G06V10/74; G06V40/50
Domestic Patent References:
WO2019173562A1 (2019-09-12)
Foreign References:
US20180365402A1 (2018-12-20)
US20190213394A1 (2019-07-11)
US9418214B1 (2016-08-16)
US20020112177A1 (2002-08-15)
Attorney, Agent or Firm:
BALDER IP LAW S.L. (ES)
Claims:
CLAIMS

1. A method for updating a user identification system, the method comprising: pre-processing image data of a user acquired from an image capture device, for face detection and alignment; identifying the user by extracting a first feature vector from the pre-processed image data using a first feature extractor and comparing the first feature vector to first user data stored in a first database; and enrolling the user, concurrently with the identifying, by extracting a second feature vector from the pre-processed image data using a second feature extractor and storing the second feature vector with second user data in a second database.

2. The method of claim 1, wherein the second user data is anonymous.

3. The method of claim 1 or 2, wherein the second user data is obtained from an ID reader.

4. The method of any one of claims 1 to 3, further comprising completing the updating of the identification system when an update criterion has been met.

5. The method of claim 4, wherein the update criterion is at least one of: a total number of user IDs stored in the second database as the second user data, and expiry of a timer.

6. The method of claim 4 or 5, further comprising transitioning from the first feature extractor to the second feature extractor for performing user identification when the update is completed.

7. The method of claim 4 or 5, further comprising performing user identification with the first feature extractor and the second feature extractor when the update is completed.

8. The method of any one of claims 1 to 7, wherein the first database is associated with a first geographical location and the second database is associated with a second geographical location.

9. The method of any one of claims 1 to 8, wherein pre-processing the image data comprises pre-processing the image data for the first feature extractor separately from pre-processing the image data for the second feature extractor.

10. The method of any one of claims 1 to 9, wherein the second feature extractor is an updated or modified version of the first feature extractor.

11. A user identification system comprising: at least one processor; and at least one non-transitory computer readable medium having stored thereon program instructions executable by the at least one processor for: pre-processing image data of a user acquired from an image capture device, for face detection and alignment; identifying the user by extracting a first feature vector from the pre-processed image data using a first feature extractor and comparing the first feature vector to first user data stored in a first database; and enrolling the user, concurrently with the identifying, by extracting a second feature vector from the pre-processed image data using a second feature extractor and storing the second feature vector with second user data in a second database.

12. The system of claim 11, wherein the second user data is anonymous.

13. The system of claim 11 or 12, wherein the second user data is obtained from an ID reader.

14. The system of any one of claims 11 to 13, wherein the program instructions are further executable for completing the updating of the identification system when an update criterion has been met.

15. The system of claim 14, wherein the update criterion is at least one of: a total number of user IDs stored in the second database as the second user data, and expiry of a timer.

16. The system of claim 14 or 15, wherein the program instructions are further executable for transitioning from the first feature extractor to the second feature extractor for performing user identification when the update is completed.

17. The system of claim 14 or 15, wherein the program instructions are further executable for performing user identification with the first feature extractor and the second feature extractor when the update is completed.

18. The system of any one of claims 11 to 17, wherein the first database is associated with a first geographical location and the second database is associated with a second geographical location.

19. The system of any one of claims 11 to 18, wherein pre-processing the image data comprises pre-processing the image data for the first feature extractor separately from pre-processing the image data for the second feature extractor.

20. The system of any one of claims 11 to 19, wherein the second feature extractor is an updated or modified version of the first feature extractor.

Description:
METHOD AND SYSTEM FOR UPDATING A USER IDENTIFICATION SYSTEM

TECHNICAL FIELD

[0001] The present disclosure generally relates to the field of neural network-based facial recognition systems for user enrollment and identification.

BACKGROUND OF THE ART

[0002] Facial recognition is broadly used in many industries. Some example applications are in law enforcement, retail, healthcare, and access and security. While recognizing faces is a trivial task for humans, it remains a complex problem for machines. Neural networks are used to recognize faces by mapping facial features from a photograph or video against databases of known faces to find a match. Deep learning can leverage very large datasets of faces to provide high-performing systems.

[0003] While the ability to use complex systems to perform facial recognition is advantageous in some regards, it presents challenges in others, such as when a system needs an update. Therefore, improvements are needed.

SUMMARY

[0004] In accordance with one aspect, there is provided a method for updating a user identification system. The method comprises pre-processing image data of a user acquired from an image capture device, for face detection and alignment; identifying the user by extracting a first feature vector from the pre-processed image data using a first feature extractor and comparing the first feature vector to first user data stored in a first database; and enrolling the user, concurrently with the identifying, by extracting a second feature vector from the pre-processed image data using a second feature extractor and storing the second feature vector with second user data in a second database.

[0005] In accordance with another aspect, there is provided a user identification system comprising at least one processor and at least one non-transitory computer readable medium having stored thereon program instructions. The program instructions are executable by the processor for pre-processing image data of a user acquired from an image capture device, for face detection and alignment; identifying the user by extracting a first feature vector from the pre-processed image data using a first feature extractor and comparing the first feature vector to first user data stored in a first database; and enrolling the user, concurrently with the identifying, by extracting a second feature vector from the pre-processed image data using a second feature extractor and storing the second feature vector with second user data in a second database.

[0006] Many further features and combinations thereof concerning embodiments described herein will appear to those skilled in the art following a reading of the instant disclosure.

DESCRIPTION OF THE FIGURES

[0007] In the figures,

[0008] Fig. 1 is a block diagram of an example of a user enrollment and identification system;

[0009] Figs. 2A-2C are block diagrams of example embodiments for updating the user identification system of Fig. 1;

[0010] Fig. 3 is a schematic diagram of multiple instances of algorithms running in parallel;

[0011] Figs. 4A-4C are flowcharts of example methods for updating a user identification system; and

[0012] Fig. 5 is a block diagram of an example computing device.

DETAILED DESCRIPTION

[0013] There are described herein methods and systems for updating a user identification system. Fig. 1 shows an example of an identification system 100. The identification is based on facial recognition provided from image data acquired by an image capture device 101. In some embodiments, the image capture device 101 is a sensor, such as a camera, that can acquire an image directly from a person standing in front of it. In some embodiments, the image capture device 101 acquires an image of an image, such as a picture on a badge. Any device capable of acquiring image data may be used. The image data is received for pre-processing 102, which generally includes face detection and alignment steps. Face detection segments the face areas from the background and provides coarse estimates of the location and scale of a detected face. Face alignment then refines localization and normalization of the detected face. Pre-processing 102 may include cropping, rotation, and/or scaling of an image. If the image data is taken from a video stream, the detected face may need to be tracked using a face tracking component. Any known or other pre-processing steps for facial recognition may be used.

[0014] The pre-processed image data is transmitted to a feature extractor 104, where feature vectors are extracted. The feature vectors are used for feature matching 108, by comparing the extracted feature vectors to user data stored in a database 106. The user data generally comprises previously extracted feature vectors associated with a given user, for example with a user identification number. When a match is found with sufficient confidence, for example using a Euclidean distance calculation between the extracted feature vector and the stored feature vectors, identification of the user is confirmed. If a match is not found with sufficient confidence, identification is denied. Other techniques for feature matching may also be used for user identification.
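As an illustration of the feature matching 108 of paragraph [0014], the following is a minimal sketch, assuming feature vectors are numpy arrays and the database is an in-memory mapping of user IDs to stored vectors; the function name and threshold value are hypothetical, not part of the disclosure.

```python
import numpy as np

def identify(feature_vector: np.ndarray, database: dict, threshold: float = 0.9):
    """Compare an extracted feature vector to stored user data.

    Returns the matching user ID when the smallest Euclidean distance
    falls below the threshold (a match with sufficient confidence),
    otherwise None (identification denied).
    """
    best_id, best_dist = None, float("inf")
    for user_id, stored_vector in database.items():
        dist = np.linalg.norm(feature_vector - stored_vector)
        if dist < best_dist:
            best_id, best_dist = user_id, dist
    return best_id if best_dist < threshold else None
```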

[0015] Enrollment, or registration, of users into the system 100 may be performed at any time by storing the extracted feature vector in the database 106 and associating the feature vector with a user identifier. In some embodiments, the user identifier and any other data stored in the database 106 are devoid of any details allowing users to be identified, such as pictures, names, addresses, etc. This may be done, for example, to respect privacy laws in certain jurisdictions, such as the General Data Protection Regulation (GDPR) in the European Union.
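A minimal sketch of such enrollment, under the same assumptions as above; generating the anonymous identifier via uuid is an illustrative choice, not part of the disclosure.

```python
import uuid
import numpy as np

def enroll(feature_vector: np.ndarray, database: dict, user_id=None) -> str:
    """Store an extracted feature vector in the database, keyed by a user
    identifier. When none is supplied, an anonymous ID is generated so
    that no identifying detail (picture, name, address) is retained."""
    if user_id is None:
        user_id = uuid.uuid4().hex  # anonymous identifier
    database[user_id] = feature_vector
    return user_id
```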

[0016] With reference to Fig. 2A, an update to the system 100 is performed seamlessly by running a parallel instance 202 while concurrently performing user identification. The parallel instance comprises an updated or modified feature extractor 204, which receives the pre-processed image data in parallel to the feature extractor 104. As used herein, “updated” and “modified” may be used interchangeably to refer to any changes having been applied to an original version. Similarly, the expression “updating the user identification system” encompasses any changes to be made to the software and/or hardware of the system.

[0017] Feature vectors extracted by the feature extractor 104 are used to perform user identification through comparison with feature vectors stored in the database 106. Feature vectors extracted by the updated feature extractor 204 are used to populate a new database 206, for enrollment purposes. The update to the system 100 is therefore performed concurrently with continued use of the system 100 for identification purposes, and no downtime of the system 100 is required for the update. Moreover, the update is performed without having to store any images or other personal information related to the users.
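The parallel operation of Fig. 2A might be sketched as follows, reusing the identify and enroll helpers above; the extractor objects with an extract() method are hypothetical, and enrolling only upon a successful identification is one possible policy (an ID reader, as in Fig. 2C, is another source for the user ID).

```python
def process_frame(pre_processed, extractor_v1, extractor_v2, db_v1, db_v2):
    # Identification continues against the original instance (104/106)...
    match = identify(extractor_v1.extract(pre_processed), db_v1)
    # ...while the updated extractor (204) concurrently populates the new
    # database (206): no downtime, and no images are stored.
    if match is not None:
        enroll(extractor_v2.extract(pre_processed), db_v2, user_id=match)
    return match
```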

[0018] Fig. 2B illustrates an example where the update to the system 100 concerns not only the algorithm used for extracting features through the feature extractor 104, but also the pre-processing steps 102. In this case, a parallel instance 212 includes an updated pre-processing stage 208. The image data is sent to both pre-processing stages 102, 208, and each feature extractor 104, 204 receives pre-processed image data from a respective pre-processing stage 102, 208.
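For the Fig. 2B case, each instance applies its own pre-processing stage before extraction; a sketch under the same assumptions, with preprocess_v1 and preprocess_v2 as hypothetical face detection and alignment callables:

```python
def process_frame_2b(image, preprocess_v1, preprocess_v2,
                     extractor_v1, extractor_v2, db_v1, db_v2):
    # The image data is sent to both pre-processing stages (102, 208),
    # and each feature extractor receives its own pre-processed output.
    match = identify(extractor_v1.extract(preprocess_v1(image)), db_v1)
    if match is not None:
        enroll(extractor_v2.extract(preprocess_v2(image)), db_v2,
               user_id=match)
    return match
```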

[0019] In some embodiments, the system 100 may also be used to transition from an identification system based on credentials to an identification system based on facial recognition. With reference to Fig. 2C, an ID reader 210 may scan the credentials of the user, such as from a card, badge, or mobile device, and extract therefrom a user ID. In parallel, the image capture device 101 acquires the image data for the user. The database 206 of a parallel instance 222 can then be built without any external intervention, as the user ID from the credentials is automatically associated with the feature vectors extracted from the pre-processed image data by the updated feature extractor 204. Although the transition from a credentials-based system to a facial recognition system is illustrated herein as applied to a parallel instance 222 where both the feature extractor 104 and the pre-processing 102 are updated, it will be understood that this feature may also apply to an embodiment where only the feature extractor 104 is updated, such as the example illustrated in Fig. 2A.
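A sketch of the credential-assisted enrollment of Fig. 2C, under the same assumptions; the id_reader object and its read() method are hypothetical stand-ins for the ID reader 210:

```python
def enroll_from_credentials(image, id_reader, preprocess_v2,
                            extractor_v2, db_v2):
    # The user ID scanned from the card, badge, or mobile device keys
    # the new database directly, with no external intervention.
    user_id = id_reader.read()
    db_v2[user_id] = extractor_v2.extract(preprocess_v2(image))
    return user_id
```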

[0020] In some embodiments, the update to the system 100 is completed when one or more update criteria have been met. For example, the update criterion may be a total number of users to enroll in the new database 206. Alternatively, the update criterion may be a finite list of users to be enrolled in the new database 206, and the update is completed when each user from the list has been enrolled. In some embodiments, it may be desirable to account for new users to the system and/or old user accounts that are no longer active. As such, the update criterion may be the expiry of a timer or a combination of a number of user IDs and a timer. Various criteria may be used to determine that the update has been completed, depending on the practical implementation.
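Such completion criteria could be checked, for example, as follows; the target count and deadline are illustrative, and any other criterion (e.g. a finite list of users) could be substituted:

```python
import time

class UpdateCriteria:
    """Completion check combining a target number of enrolled user IDs
    and/or a timer; the update is complete when either is met."""
    def __init__(self, target_count=None, deadline=None):
        self.target_count = target_count
        self.deadline = deadline  # absolute time, e.g. time.time() + 30 * 86400

    def met(self, db_v2: dict) -> bool:
        if self.target_count is not None and len(db_v2) >= self.target_count:
            return True
        return self.deadline is not None and time.time() >= self.deadline
```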

[0021] In some embodiments, operation of the system 100 is transitioned to the second instance 202, 212, 222 for user identification when the update is completed. The first instance, namely the original feature extractor 104 and original database 106, and in some cases the original pre-processing 102, may be removed entirely from the system 100 or simply deactivated (i.e., taken offline). The image data is then received only in the second instance 202, 212, 222, which becomes the only instance of the algorithm. Transition to the second instance 202, 212, 222 may occur automatically or manually.

[0022] In some embodiments, both the first instance and the second instance 202, 212, 222 are used for user identification once the update is completed. For example, feature matching 108 is performed by searching for a match in both the first database 106 and the second database 206. In addition, feature matching 108 may be based on feature vectors received from the feature extractor 104 and from the updated feature extractor 204.
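Where both instances remain in use after the update, feature matching may search both databases, as sketched below under the same assumptions; falling back to the second database only on a miss is one possible policy:

```python
def identify_dual(pre_processed, extractor_v1, extractor_v2, db_v1, db_v2):
    # Search for a match in both the first database (106) and the
    # second database (206), each with its matching feature extractor.
    match = identify(extractor_v1.extract(pre_processed), db_v1)
    if match is None:
        match = identify(extractor_v2.extract(pre_processed), db_v2)
    return match
```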

[0023] Subsequent updates to the system 100 may result in multiple instances running in parallel, as illustrated in Fig. 3. An original version 300 may be supplemented and/or replaced by subsequent instances 302, 304, 306, which may be updated versions of an algorithm and/or modified versions of the algorithm. In some embodiments, the multiple instances may be associated with distinct geographical locations. For example, a company with different locations around the world may have different requirements with regard to the algorithms it can run at each location. There may also be restrictions with regard to the data stored in each of the databases associated with each algorithm, as a function of the location. The system 100 thus offers interoperability between locations and more flexibility. A user enrolled in an instance running at a location in the United States who travels to China would not need to re-enroll in the system in China; the system running in China could access the database of the instance located in the United States to verify the identity of the user.
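Such cross-location interoperability could take a form like the following sketch, assuming hypothetical instance objects that each wrap a local pipeline and expose an identify() method (e.g. over a network API):

```python
def identify_across_sites(pre_processed, local_instance, remote_instances):
    # Try the local instance first, then fall back to instances at
    # other geographical locations, so a travelling user need not
    # re-enroll at each site.
    match = local_instance.identify(pre_processed)
    for remote in remote_instances:
        if match is not None:
            break
        match = remote.identify(pre_processed)
    return match
```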

[0024] Fig. 4A illustrates an example of a method 400 for updating a user identification system, such as the system 100. At step 402, image data acquired by an image capture device for a given user is pre-processed for face detection and alignment. The pre-processed image data is transmitted to a first feature extractor associated with a first database and to a second feature extractor associated with a second database. The second feature extractor is an updated and/or alternate version of the first feature extractor. Once the image data is received by both feature extractors, user identification 404 and user enrollment 406 are performed concurrently. User identification 404 is performed with the former version of the feature extractor, whereas user enrollment 406 is performed with the updated or alternate version. User identification 404 comprises extracting feature vector(s) with the first feature extractor at step 408, and comparing the first feature vector(s) to user data stored in the first database to identify the user. User enrollment 406 comprises extracting second feature vector(s) with the second feature extractor at step 412, and storing the second feature vector(s) with user data in the second database at step 414.

[0025] In some embodiments, user enrollment 406 is performed in compliance with privacy regulations by updating the user identification system without storing image data that would allow a user to be identified, such as face images. In this case, step 414 of enrolling users in the second database comprises associating an anonymous user ID with the second feature vector and storing the anonymous user ID and second feature vector(s) in the second database. In some embodiments, the user ID is obtained from an ID reader concurrently with the acquisition of the image data for facial recognition. While this may be done to facilitate the update to the user identification system, it may also allow a transition from a credentials-based system that only uses an ID reader to a facial recognition system based on image data.

[0026] In some embodiments, and as illustrated in Fig. 4B, the method 400 further comprises completing the update to the user identification system when one or more update criteria have been met. At step 416, user identification transitions to the second feature extractor and second database, which may involve taking the first feature extractor and the first database offline. Alternatively, both feature extractors and both databases are used concurrently to perform user identification.

[0027] In some embodiments, the update is not only for the feature extractor, but also for the pre-processing steps. In this case, pre-processing of the image data 402 is performed separately for each feature extractor, as illustrated in the embodiment of Fig. 4C.

[0028] In some embodiments, the user identification system 100 and the method 400 are implemented in one or more computing devices 500, as illustrated in Fig. 5. For simplicity, only one computing device 500 is shown, but the user identification system 100 may include more computing devices 500 operable to exchange data. For example, each instance 300, 302, 304, 306 may be implemented in a separate computing device 500. The computing devices 500 may be the same or different types of devices.

[0029] The computing device 500 comprises a processing unit 502 and a memory 504 which has stored therein computer-executable instructions 506. The processing unit 502 may comprise any suitable devices configured to implement a method such that instructions 506, when executed by the computing device 500 or other programmable apparatus, may cause the functions/acts/steps to be executed. The processing unit 502 may comprise, for example, any type of general-purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, a central processing unit (CPU), a graphics processing unit (GPU), AI-enabled chips, an integrated circuit, a field-programmable gate array (FPGA), a reconfigurable processor, other suitably programmed or programmable logic circuits, or any combination thereof.

[0030] The memory 504 may comprise any suitable known or other machine-readable storage medium. The memory 504 may comprise a non-transitory computer-readable storage medium, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. The memory 504 may include a suitable combination of any type of computer memory that is located either internally or externally to the device, for example random-access memory (RAM), read-only memory (ROM), compact disc read-only memory (CDROM), electro-optical memory, magneto-optical memory, erasable programmable read-only memory (EPROM), electrically-erasable programmable read-only memory (EEPROM), ferroelectric RAM (FRAM), or the like. Memory 504 may comprise any storage means (e.g., devices) suitable for retrievably storing machine-readable instructions 506 executable by the processing unit 502.

[0031] The methods and systems described herein may be implemented in a high-level procedural or object-oriented programming or scripting language, or a combination thereof, to communicate with or assist in the operation of a computer system, for example the computing device 500. Alternatively, the methods and systems may be implemented in assembly or machine language. The language may be a compiled or interpreted language. Program code for implementing the methods and systems may be stored on a storage medium or a device, for example a ROM, a magnetic disk, an optical disc, a flash drive, or any other suitable storage medium or device. The program code may be readable by a general or special-purpose programmable computer for configuring and operating the computer when the storage medium or device is read by the computer to perform the procedures described herein. Embodiments of the methods and systems may also be considered to be implemented by way of a non-transitory computer-readable storage medium having a computer program stored thereon. The computer program may comprise computer-readable instructions which cause a computer, or more specifically the processing unit 502 of the computing device 500, to operate in a specific and predefined manner to perform the functions described herein, for example those described in the method 400.

[0032] Computer-executable instructions may be in many forms, including program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.

[0033] The embodiments described herein provide useful physical machines and particularly configured computer hardware arrangements. The embodiments described herein are directed to electronic machines and methods implemented by electronic machines adapted for processing and transforming electromagnetic signals which represent various types of information. The embodiments described herein pervasively and integrally relate to machines, and their uses; and the embodiments described herein have no meaning or practical applicability outside their use with computer hardware, machines, and various hardware components. Substituting the physical hardware particularly configured to implement various acts for non-physical hardware, using mental steps for example, may substantially affect the way the embodiments work. Such computer hardware limitations are clearly essential elements of the embodiments described herein, and they cannot be omitted or substituted for mental means without having a material effect on the operation and structure of the embodiments described herein. The computer hardware is essential to implement the various embodiments described herein and is not merely used to perform steps expeditiously and in an efficient manner.

[0034] The term “connected” or “coupled to” may include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements).

[0035] The technical solution of embodiments may be in the form of a software product. The software product may be stored in a non-volatile or non-transitory storage medium, which can be a compact disk read-only memory (CD-ROM), a USB flash disk, or a removable hard disk. The software product includes a number of instructions that enable a computer device (personal computer, server, or network device) to execute the methods provided by the embodiments.

[0036] The embodiments described in this document provide non-limiting examples of possible implementations of the present technology. Upon review of the present disclosure, a person of ordinary skill in the art will recognize that changes may be made to the embodiments described herein without departing from the scope of the present technology. For example, the parallel instance may run an identical version of the original feature extractor but, for geographical reasons, a second database needs to be populated with users. Yet further modifications could be implemented by a person of ordinary skill in the art in view of the present disclosure, which modifications would be within the scope of the present technology.




 