
Title:
APPARATUS, METHOD AND COMPUTER PROGRAM FOR MANAGING A REQUEST FOR COGNITIVE NETWORK FUNCTIONS AND/OR MACHINE LEARNING MODELS
Document Type and Number:
WIPO Patent Application WO/2022/218519
Kind Code:
A1
Abstract:
The disclosure relates to an apparatus comprising means for: storing (900) information relating to a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receiving (902) a first request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement; processing (904) the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement; and sending (906) a response to the first request based on the processing.

Inventors:
MWANJE STEPHEN (DE)
GOERGE JÜRGEN (DE)
Application Number:
PCT/EP2021/059631
Publication Date:
October 20, 2022
Filing Date:
April 14, 2021
Assignee:
NOKIA TECHNOLOGIES OY (FI)
International Classes:
H04L12/24
Domestic Patent References:
WO2020253953A12020-12-24
Other References:
LIU TENGFEI CHINA UNICOM P R CHINA: "Architecture for ML marketplace integration in future networks including IMT-2020;ML5G-I-146-R9", vol. ML5G, 12 June 2019 (2019-06-12), pages 1 - 30, XP044267640, Retrieved from the Internet [retrieved on 20190612]
QI SUN CHINA MOBILE P R CHINA: "Draft new Recommendation ITU-T Y.ML-IMT2020-Data-Handling: "Framework of data handling to enable machine learning in future networks including IMT-2020" (output of Q20/13 e-meetings in August 2019);TD409/WP1", vol. 20/13, 24 September 2019 (2019-09-24), pages 1 - 52, XP044274255, Retrieved from the Internet [retrieved on 20190924]
Attorney, Agent or Firm:
NOKIA EPO REPRESENTATIVES (FI)
Claims:
CLAIMS

1. An apparatus comprising at least one processor and at least one memory including computer code for one or more programs, the at least one memory and the computer code configured, with the at least one processor, to cause the apparatus at least to: store information relating to a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receive a first request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement; process the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement; and send a response to the first request based on the processing.

2. The apparatus of claim 1, wherein processing the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement comprises: comparing the information relating to the plurality of cognitive network functions and/or machine learning models to the requirement to identify a cognitive network function and/or a machine learning model fulfilling the requirement.

3. The apparatus of claim 1 or claim 2, wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: identify a cognitive network function and/or a machine learning model fulfilling the requirement; send a second request for the identified cognitive network function and/or machine learning model; receive a response to the second request comprising the identified cognitive network function and/or machine learning model; and send a response to the first request comprising the identified cognitive network function and/or machine learning model.

4. The apparatus of claim 1 or claim 2, wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: identify a cognitive network function and/or a machine learning model fulfilling the requirement; and send a response to the first request comprising information relating to the identified cognitive network function and/or machine learning model.

5. The apparatus of claim 1 or claim 2, wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: identify no cognitive network function and/or machine learning model fulfilling the requirement; and send a response to the first request comprising an indication that no cognitive network function and/or machine learning model has been identified.

6. The apparatus of any of claims 1 to 5, wherein the first request comprises an expected context or a detected context; and wherein the requirement comprises an availability of a cognitive network function and/or a machine learning model with a training context matching the expected context or detected context.

7. The apparatus of any of claims 1 to 6, wherein each training context, expected context or detected context comprises one or more of the following context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute.

8. The apparatus of claim 7, wherein the first request originates from a network management function or a cognitive network function deployed on the network; and wherein the response to the first request is directed to the network management function or the cognitive network function deployed on the network.

9. The apparatus of any of claims 3 to 8, wherein the second request is directed to a cognitive network function and model repository function deployed on the network; and wherein the response to the second request originates from the cognitive network function and model repository function deployed on the network.

10. The apparatus of any of claims 1 to 9, wherein the apparatus is a cognitive network function and model selection function deployed on the network.

11. The apparatus of claim 9 or claim 10, wherein the cognitive network function and model selection function is integrated within the cognitive network function and model repository function or is separate from the cognitive network function and model repository function.

12. The apparatus of any of claims 1 to 11, wherein the at least one memory and the computer code are configured, with the at least one processor, to cause the apparatus at least to: receive a first request for all cognitive network functions and/or all machine learning models fulfilling a requirement or for information relating to all cognitive network functions and/or all machine learning models fulfilling a requirement; process the first request to identify all cognitive network functions and/or all machine learning models fulfilling the requirement.

13. The apparatus of any of claims 1 to 12, wherein each cognitive network function is associated with a cognitive network function information object class in an information model; wherein each machine learning model is associated with a machine learning model information object class in the information model; and/or wherein each training context, expected context or detected context is associated with a training context, expected context or detected context information object class in the information model.

14. The apparatus of claim 13, wherein the cognitive network function information object class comprises a cognitive network function properties attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

15. The apparatus of claim 13 or claim 14, wherein the machine learning model information object class comprises a training context attribute, an expected context attribute and/or a detected context attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

16. The apparatus of any of claims 13 to 15, wherein the training context, expected context or detected context information object class comprises one or more of the following training context, expected context or detected context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

17. The apparatus of any of claims 1 to 16, wherein each cognitive network function is identified by a cognitive network function identifier; and/or wherein each machine learning model is associated with a machine learning model identifier.

18. An apparatus comprising at least one processor and at least one memory including computer code for one or more programs, the at least one memory and the computer code configured, with the at least one processor, to cause the apparatus at least to: store a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receive a second request for an identified cognitive network function and/or machine learning model fulfilling a requirement; and send a response to the second request comprising the identified cognitive network function and/or machine learning model fulfilling the requirement.

19. The apparatus of claim 18, wherein the second request originates from a cognitive network function and model selection function deployed on the network; and wherein the response to the second request is directed to the cognitive network function and model selection function deployed on the network.

20. The apparatus of claim 18 or claim 19, wherein the apparatus is a cognitive network function and model repository function deployed on the network.

21. The apparatus of claim 19 and claim 20, wherein the cognitive network function and model repository function integrates the cognitive network function and model selection function or is separate from the cognitive network function and model selection function.

22. A method comprising: storing information relating to a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receiving a first request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement; processing the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement; and sending a response to the first request based on the processing.

23. A method comprising: storing a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receiving a second request for an identified cognitive network function and/or machine learning model fulfilling a requirement; and sending a response to the second request comprising the identified cognitive network function and/or machine learning model fulfilling the requirement.

24. A computer program comprising computer executable instructions which when run on one or more processors perform the steps of the method of claim 22 or claim 23.

Description:
APPARATUS, METHOD AND COMPUTER PROGRAM FOR MANAGING A REQUEST FOR COGNITIVE NETWORK FUNCTIONS AND/OR MACHINE LEARNING MODELS

Field of the disclosure

The present disclosure relates to an apparatus, a method, and a computer program for managing a request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement.

Background

A communication system can be seen as a facility that enables communication sessions between two or more entities such as communication devices, base stations and/or other nodes by providing carriers between the various entities involved in the communications path.

The communication system may be a wireless communication system. Examples of wireless systems comprise public land mobile networks (PLMN) operating based on radio standards such as those provided by 3GPP, satellite based communication systems and different wireless local networks, for example wireless local area networks (WLAN). The wireless systems can typically be divided into cells, and are therefore often referred to as cellular systems.

The communication system and associated devices typically operate in accordance with a given standard or specification which sets out what the various entities associated with the system are permitted to do and how that should be achieved. Communication protocols and/or parameters which shall be used for the connection are also typically defined. Examples of standards are the so-called 5G standards.

Summary

According to an aspect there is provided an apparatus comprising means for: storing information relating to a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receiving a first request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement; processing the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement; and sending a response to the first request based on the processing.
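The store/receive/process/send sequence of this aspect can be illustrated with a minimal sketch. This is not part of the disclosure: all class, function and attribute names below (`SelectionFunction`, `CNF`, `MLModel`, the dictionary response format) are invented for illustration, and the "requirement" is modelled simply as a predicate over the stored information.

```python
from dataclasses import dataclass


@dataclass
class MLModel:
    """A machine learning model, trained with one or more training contexts."""
    model_id: str
    training_contexts: list


@dataclass
class CNF:
    """A cognitive network function, associated with one or more ML models."""
    cnf_id: str
    models: list


class SelectionFunction:
    """Hypothetical sketch of the apparatus: stores information about
    deployable CNFs/models and answers first requests."""

    def __init__(self):
        self._catalogue = []  # stored information about CNFs and their models

    def store(self, cnf: CNF) -> None:
        self._catalogue.append(cnf)

    def handle_first_request(self, requirement) -> dict:
        """Process the first request by comparing stored information to the
        requirement; the response is based on the outcome of the processing."""
        for cnf in self._catalogue:
            for model in cnf.models:
                if requirement(cnf, model):
                    return {"status": "found",
                            "cnf": cnf.cnf_id,
                            "model": model.model_id}
        # No match: respond with an indication that nothing was identified.
        return {"status": "none-identified"}
```

For example, a request for a model trained in a hypothetical "urban-macro" context would be answered with the matching identifiers, while an unsatisfiable requirement yields the "none identified" indication.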

Processing the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement may comprise: comparing the information relating to the plurality of cognitive network functions and/or machine learning models to the requirement to identify a cognitive network function and/or a machine learning model fulfilling the requirement.

The apparatus may comprise means for: identifying a cognitive network function and/or a machine learning model fulfilling the requirement; sending a second request for the identified cognitive network function and/or machine learning model; receiving a response to the second request comprising the identified cognitive network function and/or machine learning model; and sending a response to the first request comprising the identified cognitive network function and/or machine learning model.

The apparatus may comprise means for: identifying a cognitive network function and/or a machine learning model fulfilling the requirement; and sending a response to the first request comprising information relating to the identified cognitive network function and/or machine learning model.

The apparatus may comprise means for: identifying no cognitive network function and/or machine learning model fulfilling the requirement; and sending a response to the first request comprising an indication that no cognitive network function and/or machine learning model has been identified.

The first request may comprise an expected context or a detected context; and the requirement may comprise an availability of a cognitive network function and/or a machine learning model with a training context matching the expected context or detected context.

Each training context, expected context or detected context may comprise one or more of the following context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute.
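A context object carrying the attributes listed above could be modelled as follows. This is a sketch only: the field names paraphrase the listed attributes, the `Optional` typing reflects the "one or more of" wording, and the matching rule (every attribute set on the requested context must equal the stored one) is an invented simplification of "training context matching the expected context or detected context".

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Context:
    """Training, expected or detected context (hypothetical sketch)."""
    managed_entity_ref: Optional[str] = None
    data_provider_ref: Optional[str] = None
    start_time: Optional[str] = None
    end_time: Optional[str] = None
    training_conditions: Optional[dict] = None
    training_state: Optional[str] = None
    operating_conditions: Optional[dict] = None
    reference_performance: Optional[float] = None
    cnf_properties: Optional[dict] = None
    data_characteristics: Optional[dict] = None

    def matches(self, wanted: "Context") -> bool:
        """Naive match: every attribute set on `wanted` must equal the
        corresponding attribute of this (stored training) context."""
        for name in self.__dataclass_fields__:
            value = getattr(wanted, name)
            if value is not None and getattr(self, name) != value:
                return False
        return True
```

A stored training context with `training_state="trained"` would then match an expected context that specifies only that attribute, regardless of the attributes the expected context leaves unset.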

The first request may originate from a network management function or a cognitive network function deployed on the network; and the response to the first request may be directed to the network management function or the cognitive network function deployed on the network.

The second request may be directed to a cognitive network function and model repository function deployed on the network; and the response to the second request may originate from the cognitive network function and model repository function deployed on the network.
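The interaction between the selection function and the repository function could be sketched as below, under the assumption of simple in-process calls (a real deployment would use a management-plane interface). All names here (`RepositoryFunction`, `respond_to_first_request`, the dictionary formats) are hypothetical illustrations, not terms from the disclosure.

```python
class RepositoryFunction:
    """Hypothetical CNF and model repository function: stores the model
    artefacts themselves, while the selection function stores only
    information about them."""

    def __init__(self):
        self._store = {}

    def add(self, model_id: str, artefact: bytes) -> None:
        self._store[model_id] = artefact

    def handle_second_request(self, model_id: str):
        # The response to the second request comprises the identified model.
        return self._store.get(model_id)


def respond_to_first_request(selection_result: dict,
                             repository: RepositoryFunction) -> dict:
    """After identifying a model, the selection function sends a second
    request to the repository and forwards the returned artefact in its
    response to the first request."""
    if selection_result.get("status") != "found":
        return {"status": "none-identified"}
    artefact = repository.handle_second_request(selection_result["model"])
    return {"status": "found", "artefact": artefact}
```

The same flow covers the no-match case: when nothing was identified, no second request is sent and the response to the first request carries only the "none identified" indication.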

The apparatus may be a cognitive network function and model selection function deployed on the network.

The cognitive network function and model selection function may be integrated within the cognitive network function and model repository function or may be separate from the cognitive network function and model repository function.

The apparatus may comprise means for: receiving a first request for all cognitive network functions and/or all machine learning models fulfilling a requirement or for information relating to all cognitive network functions and/or all machine learning models fulfilling a requirement; processing the first request to identify all cognitive network functions and/or all machine learning models fulfilling the requirement.

Each cognitive network function may be associated with a cognitive network function information object class in an information model; each machine learning model may be associated with a machine learning model information object class in the information model; and/or each training context, expected context or detected context may be associated with a training context, expected context or detected context information object class in the information model.

The cognitive network function information object class may comprise a cognitive network function properties attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

The machine learning model information object class may comprise a training context attribute, an expected context attribute and/or a detected context attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

The training context, expected context or detected context information object class may comprise one or more of the following training context, expected context or detected context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.
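The "single value selected among a fixed set of alternatives" wording maps naturally onto an enumerated type. A sketch follows; the enum members are invented placeholders, since the disclosure does not name the alternatives, and `MLModelIOC` is a hypothetical stand-in for an information object class.

```python
from enum import Enum


class TrainingState(Enum):
    # Invented placeholder alternatives for the fixed set of values.
    UNTRAINED = "untrained"
    TRAINED = "trained"
    RETRAINING = "retraining"


class MLModelIOC:
    """Hypothetical information object class whose field holds exactly one
    value from a fixed set of alternatives."""

    def __init__(self, training_state: TrainingState):
        if not isinstance(training_state, TrainingState):
            raise TypeError("training_state must be one of the fixed alternatives")
        self.training_state = training_state
```

Restricting each field to an enumeration keeps the stored information directly comparable when a first request is matched against the catalogue, since two values from the same fixed set either are or are not equal.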

Each cognitive network function may be identified by a cognitive network function identifier; and/or each machine learning model may be associated with a machine learning model identifier.

According to an aspect there is provided an apparatus comprising at least one processor and at least one memory including computer code for one or more programs, the at least one memory and the computer code configured, with the at least one processor, to cause the apparatus at least to: store information relating to a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receive a first request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement; process the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement; and send a response to the first request based on the processing.

Processing the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement may comprise: comparing the information relating to the plurality of cognitive network functions and/or machine learning models to the requirement to identify a cognitive network function and/or a machine learning model fulfilling the requirement.

The at least one memory and the computer code may be configured, with the at least one processor, to cause the apparatus at least to: identify a cognitive network function and/or a machine learning model fulfilling the requirement; send a second request for the identified cognitive network function and/or machine learning model; receive a response to the second request comprising the identified cognitive network function and/or machine learning model; and send a response to the first request comprising the identified cognitive network function and/or machine learning model.

The at least one memory and the computer code may be configured, with the at least one processor, to cause the apparatus at least to: identify a cognitive network function and/or a machine learning model fulfilling the requirement; and send a response to the first request comprising information relating to the identified cognitive network function and/or machine learning model.

The at least one memory and the computer code may be configured, with the at least one processor, to cause the apparatus at least to: identify no cognitive network function and/or machine learning model fulfilling the requirement; and send a response to the first request comprising an indication that no cognitive network function and/or machine learning model has been identified.

The first request may comprise an expected context or a detected context; and the requirement may comprise an availability of a cognitive network function and/or a machine learning model with a training context matching the expected context or detected context.

Each training context, expected context or detected context may comprise one or more of the following context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute.

The first request may originate from a network management function or a cognitive network function deployed on the network; and the response to the first request may be directed to the network management function or the cognitive network function deployed on the network.

The second request may be directed to a cognitive network function and model repository function deployed on the network; and the response to the second request may originate from the cognitive network function and model repository function deployed on the network.

The apparatus may be a cognitive network function and model selection function deployed on the network.

The cognitive network function and model selection function may be integrated within the cognitive network function and model repository function or may be separate from the cognitive network function and model repository function.

The at least one memory and the computer code may be configured, with the at least one processor, to cause the apparatus at least to: receive a first request for all cognitive network functions and/or all machine learning models fulfilling a requirement or for information relating to all cognitive network functions and/or all machine learning models fulfilling a requirement; process the first request to identify all cognitive network functions and/or all machine learning models fulfilling the requirement.

Each cognitive network function may be associated with a cognitive network function information object class in an information model; each machine learning model may be associated with a machine learning model information object class in the information model; and/or each training context, expected context or detected context may be associated with a training context, expected context or detected context information object class in the information model.

The cognitive network function information object class may comprise a cognitive network function properties attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

The machine learning model information object class may comprise a training context attribute, an expected context attribute and/or a detected context attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

The training context, expected context or detected context information object class may comprise one or more of the following training context, expected context or detected context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

Each cognitive network function may be identified by a cognitive network function identifier; and/or each machine learning model may be associated with a machine learning model identifier.

According to an aspect there is provided an apparatus comprising circuitry configured to: store information relating to a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receive a first request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement; process the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement; and send a response to the first request based on the processing.

Processing the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement may comprise: comparing the information relating to the plurality of cognitive network functions and/or machine learning models to the requirement to identify a cognitive network function and/or a machine learning model fulfilling the requirement.

The circuitry may be configured to: identify a cognitive network function and/or a machine learning model fulfilling the requirement; send a second request for the identified cognitive network function and/or machine learning model; receive a response to the second request comprising the identified cognitive network function and/or machine learning model; and send a response to the first request comprising the identified cognitive network function and/or machine learning model.

The circuitry may be configured to: identify a cognitive network function and/or a machine learning model fulfilling the requirement; and send a response to the first request comprising information relating to the identified cognitive network function and/or machine learning model.

The circuitry may be configured to: identify no cognitive network function and/or machine learning model fulfilling the requirement; and send a response to the first request comprising an indication that no cognitive network function and/or machine learning model has been identified.

The first request may comprise an expected context or a detected context; and the requirement may comprise an availability of a cognitive network function and/or a machine learning model with a training context matching the expected context or detected context.

Each training context, expected context or detected context may comprise one or more of the following context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute.

The first request may originate from a network management function or a cognitive network function deployed on the network; and the response to the first request may be directed to the network management function or the cognitive network function deployed on the network.

The second request may be directed to a cognitive network function and model repository function deployed on the network; and the response to the second request may originate from the cognitive network function and model repository function deployed on the network.

The apparatus may be a cognitive network function and model selection function deployed on the network.

The cognitive network function and model selection function may be integrated within the cognitive network function and model repository function or may be separate from the cognitive network function and model repository function.

The circuitry may be configured to: receive a first request for all cognitive network functions and/or all machine learning models fulfilling a requirement or for information relating to all cognitive network functions and/or all machine learning models fulfilling a requirement; process the first request to identify all cognitive network functions and/or all machine learning models fulfilling the requirement.

Each cognitive network function may be associated with a cognitive network function information object class in an information model; each machine learning model may be associated with a machine learning model information object class in the information model; and/or each training context, expected context or detected context may be associated with a training context, expected context or detected context information object class in the information model.

The cognitive network function information object class may comprise a cognitive network function properties attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

The machine learning model information object class may comprise a training context attribute, an expected context attribute and/or a detected context attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

The training context, expected context or detected context information object class may comprise one or more of the following training context, expected context or detected context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

Each cognitive network function may be identified by a cognitive network function identifier; and/or each machine learning model may be associated with a machine learning model identifier.

According to an aspect there is provided a method comprising: storing information relating to a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receiving a first request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement; processing the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement; and sending a response to the first request based on the processing.

Processing the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement may comprise: comparing the information relating to the plurality of cognitive network functions and/or machine learning models to the requirement to identify a cognitive network function and/or a machine learning model fulfilling the requirement.

The method may comprise: identifying a cognitive network function and/or a machine learning model fulfilling the requirement; sending a second request for the identified cognitive network function and/or machine learning model; receiving a response to the second request comprising the identified cognitive network function and/or machine learning model; and sending a response to the first request comprising the identified cognitive network function and/or machine learning model. The method may comprise: identifying a cognitive network function and/or a machine learning model fulfilling the requirement; and sending a response to the first request comprising information relating to the identified cognitive network function and/or machine learning model.

The method may comprise: identifying no cognitive network function and/or machine learning model fulfilling the requirement; and sending a response to the first request comprising an indication that no cognitive network function and/or machine learning model has been identified.

The first request may comprise an expected context or a detected context; and the requirement may comprise an availability of a cognitive network function and/or a machine learning model with a training context matching the expected context or detected context.
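One possible reading of this matching requirement is that every attribute given in the expected or detected context must equal the corresponding attribute of a model's training context. The sketch below assumes contexts encoded as plain dictionaries; the function names and the matching rule are illustrative only:

```python
def context_matches(training_ctx: dict, requested_ctx: dict) -> bool:
    """True when every attribute of the expected or detected context is
    present with an equal value in the training context."""
    return all(training_ctx.get(key) == value
               for key, value in requested_ctx.items())

def find_matching_models(models: list, requested_ctx: dict) -> list:
    """Identify the models trained with at least one training context
    matching the expected or detected context."""
    return [model for model in models
            if any(context_matches(tc, requested_ctx)
                   for tc in model["training_contexts"])]
```

Stricter or fuzzier matching rules (for example, ranges for the start time and end time attributes) could be substituted without changing the overall flow.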

Each training context, expected context or detected context may comprise one or more of the following context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute.

The first request may originate from a network management function or a cognitive network function deployed on the network; and the response to the first request may be directed to the network management function or the cognitive network function deployed on the network.

The second request may be directed to a cognitive network function and model repository function deployed on the network; and the response to the second request may originate from the cognitive network function and model repository function deployed on the network. The method may be performed by a cognitive network function and model selection function deployed on the network.
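The interaction described above (a first request handled by the selection function, a second request served by the repository function) can be sketched as two cooperating stubs. The class and method names below are invented for illustration; the actual interfaces between the functions are not defined here:

```python
class CoMReF:
    """Repository stub: stores model artifacts and serves second requests."""
    def __init__(self):
        self._store = {}

    def add(self, model_id: str, artifact: bytes) -> None:
        self._store[model_id] = artifact

    def fetch(self, model_id: str):
        # Response to the second request: the identified model, if stored.
        return self._store.get(model_id)

class CoMSeF:
    """Selection stub: processes first requests against a model catalogue."""
    def __init__(self, repository: CoMReF, catalogue: dict):
        self.repository = repository    # may be integrated or separate
        self.catalogue = catalogue      # model_id -> training context

    def handle_first_request(self, requirement: dict):
        for model_id, ctx in self.catalogue.items():
            if all(ctx.get(k) == v for k, v in requirement.items()):
                # Second request, directed to the repository function.
                return self.repository.fetch(model_id)
        # Indication that no model fulfilling the requirement was identified.
        return None
```

Whether the selection function is integrated within the repository function or separate from it only changes whether the second request is an internal call or crosses an interface.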

The cognitive network function and model selection function may be integrated within the cognitive network function and model repository function or may be separate from the cognitive network function and model repository function.

The method may comprise: receiving a first request for all cognitive network functions and/or all machine learning models fulfilling a requirement or for information relating to all cognitive network functions and/or all machine learning models fulfilling a requirement; processing the first request to identify all cognitive network functions and/or all machine learning models fulfilling the requirement.

Each cognitive network function may be associated with a cognitive network function information object class in an information model; each machine learning model may be associated with a machine learning model information object class in the information model; and/or each training context, expected context or detected context may be associated with a training context, expected context or detected context information object class in the information model.

The cognitive network function information object class may comprise a cognitive network function properties attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives. The machine learning model information object class may comprise a training context attribute, an expected context attribute and/or a detected context attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives. The training context, expected context or detected context information object class may comprise one or more of the following training context, expected context or detected context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

Each cognitive network function may be identified by a cognitive network function identifier; and/or each machine learning model may be associated with a machine learning model identifier.

According to an aspect there is provided a computer program comprising computer executable code which when run on at least one processor is configured to: store information relating to a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receive a first request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement; process the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement; and send a response to the first request based on the processing.

Processing the first request to identify a cognitive network function and/or a machine learning model fulfilling the requirement may comprise: comparing the information relating to the plurality of cognitive network functions and/or machine learning models to the requirement to identify a cognitive network function and/or a machine learning model fulfilling the requirement.

The computer program may comprise computer executable code which when run on at least one processor is configured to: identify a cognitive network function and/or a machine learning model fulfilling the requirement; send a second request for the identified cognitive network function and/or machine learning model; receive a response to the second request comprising the identified cognitive network function and/or machine learning model; and send a response to the first request comprising the identified cognitive network function and/or machine learning model.

The computer program may comprise computer executable code which when run on at least one processor is configured to: identify a cognitive network function and/or a machine learning model fulfilling the requirement; and send a response to the first request comprising information relating to the identified cognitive network function and/or machine learning model.

The computer program may comprise computer executable code which when run on at least one processor is configured to: identify no cognitive network function and/or machine learning model fulfilling the requirement; and send a response to the first request comprising an indication that no cognitive network function and/or machine learning model has been identified.

The first request may comprise an expected context or a detected context; and the requirement may comprise an availability of a cognitive network function and/or a machine learning model with a training context matching the expected context or detected context.

Each training context, expected context or detected context may comprise one or more of the following context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute.

The first request may originate from a network management function or a cognitive network function deployed on the network; and the response to the first request may be directed to the network management function or the cognitive network function deployed on the network. The second request may be directed to a cognitive network function and model repository function deployed on the network; and the response to the second request may originate from the cognitive network function and model repository function deployed on the network.

The at least one processor may be part of a cognitive network function and model selection function deployed on the network.

The cognitive network function and model selection function may be integrated within the cognitive network function and model repository function or may be separate from the cognitive network function and model repository function.

The computer program may comprise computer executable code which when run on at least one processor is configured to: receive a first request for all cognitive network functions and/or all machine learning models fulfilling a requirement or for information relating to all cognitive network functions and/or all machine learning models fulfilling a requirement; process the first request to identify all cognitive network functions and/or all machine learning models fulfilling the requirement.

Each cognitive network function may be associated with a cognitive network function information object class in an information model; each machine learning model may be associated with a machine learning model information object class in the information model; and/or each training context, expected context or detected context may be associated with a training context, expected context or detected context information object class in the information model.

The cognitive network function information object class may comprise a cognitive network function properties attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

The machine learning model information object class may comprise a training context attribute, an expected context attribute and/or a detected context attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

The training context, expected context or detected context information object class may comprise one or more of the following training context, expected context or detected context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

Each cognitive network function may be identified by a cognitive network function identifier; and/or each machine learning model may be associated with a machine learning model identifier.

According to an aspect there is provided an apparatus comprising means for: storing a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receiving a second request for an identified cognitive network function and/or machine learning model fulfilling a requirement; and sending a response to the second request comprising the identified cognitive network function and/or machine learning model fulfilling the requirement.

The second request may originate from a cognitive network function and model selection function deployed on the network. The response to the second request may be directed to the cognitive network function and model selection function deployed on the network.

The apparatus may be a cognitive network function and model repository function deployed on the network. The cognitive network function and model repository function may integrate the cognitive network function and model selection function or may be separate from the cognitive network function and model selection function.

According to an aspect there is provided an apparatus comprising at least one processor and at least one memory including computer code for one or more programs, the at least one memory and the computer code configured, with the at least one processor, to cause the apparatus at least to: store a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receive a second request for an identified cognitive network function and/or machine learning model fulfilling a requirement; and send a response to the second request comprising the identified cognitive network function and/or machine learning model fulfilling the requirement.

The second request may originate from a cognitive network function and model selection function deployed on the network. The response to the second request may be directed to the cognitive network function and model selection function deployed on the network.

The apparatus may be a cognitive network function and model repository function deployed on the network.

The cognitive network function and model repository function may integrate the cognitive network function and model selection function or may be separate from the cognitive network function and model selection function.

According to an aspect there is provided an apparatus comprising circuitry configured to: store a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receive a second request for an identified cognitive network function and/or machine learning model fulfilling a requirement; and send a response to the second request comprising the identified cognitive network function and/or machine learning model fulfilling the requirement.

The second request may originate from a cognitive network function and model selection function deployed on the network. The response to the second request may be directed to the cognitive network function and model selection function deployed on the network.

The apparatus may be a cognitive network function and model repository function deployed on the network. The cognitive network function and model repository function may integrate the cognitive network function and model selection function or may be separate from the cognitive network function and model selection function.

According to an aspect there is provided a method comprising: storing a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receiving a second request for an identified cognitive network function and/or machine learning model fulfilling a requirement; and sending a response to the second request comprising the identified cognitive network function and/or machine learning model fulfilling the requirement.

The second request may originate from a cognitive network function and model selection function deployed on the network. The response to the second request may be directed to the cognitive network function and model selection function deployed on the network. The method may be performed by a cognitive network function and model repository function deployed on the network.

The cognitive network function and model repository function may integrate the cognitive network function and model selection function or may be separate from the cognitive network function and model selection function.

According to an aspect there is provided a computer program comprising computer executable code which when run on at least one processor is configured to: store a plurality of cognitive network functions and/or machine learning models deployable on a network, each cognitive network function being associated with one or more machine learning models, each machine learning model being trained with one or more training contexts; receive a second request for an identified cognitive network function and/or machine learning model fulfilling a requirement; and send a response to the second request comprising the identified cognitive network function and/or machine learning model fulfilling the requirement.

The second request may originate from a cognitive network function and model selection function deployed on the network. The response to the second request may be directed to the cognitive network function and model selection function deployed on the network.

The at least one processor may be part of a cognitive network function and model repository function deployed on the network.

The cognitive network function and model repository function may integrate the cognitive network function and model selection function or may be separate from the cognitive network function and model selection function.

According to an aspect, there is provided a computer readable medium comprising program instructions stored thereon for performing at least one of the above methods.

According to an aspect, there is provided a non-transitory computer readable medium comprising program instructions stored thereon for performing at least one of the above methods.

According to an aspect, there is provided a non-volatile tangible memory medium comprising program instructions stored thereon for performing at least one of the above methods.

In the above, many different aspects have been described. It should be appreciated that further aspects may be provided by the combination of any two or more of the aspects described above.

Various other aspects are also described in the following detailed description and in the attached claims.

List of abbreviations

AF: Application Function

AI: Artificial Intelligence

AMF: Access and Mobility Management Function

API: Application Protocol Interface

BS: Base Station

CM: Conditional Mandatory

CNF: Cognitive Network Function

CoMDI: Cognitive network function and model Deployment Interface

CoMReF: Cognitive network function and Model Repository Function

CoMSeF: Cognitive network function and Model Selection Function

CoMSI: Cognitive network function and model Specification Interface

CU: Centralized Unit

DL: Downlink

DU: Distributed Unit

F: False

gNB: gNodeB

GSM: Global System for Mobile communication

HSS: Home Subscriber Server

IOC: Information Object Class

IoT: Internet of Things

LTE: Long Term Evolution

M: Mandatory

MAC: Medium Access Control

MDAS: Management Data Analytics Service

ML: Machine Learning

MS: Mobile Station

MTC: Machine Type Communication

NEF: Network Exposure Function

NF: Network Function

NR: New Radio

NRF: Network function Repository Function

NSSAI: Network Slice Selection Assistance Information

O: Optional

PDU: Packet Data Unit

RAM: Random Access Memory

(R)AN: (Radio) Access Network

ROM: Read Only Memory

SINR: Signal to Interference Plus Noise Ratio

SMF: Session Management Function

T: True

TR: Technical Report

TS: Technical Specification

UE: User Equipment

UMTS: Universal Mobile Telecommunication System

3GPP: 3rd Generation Partnership Project

5G: 5th Generation

5GC: 5G Core network

5GS: 5G System

Brief Description of the Figures

Embodiments will now be described, by way of example only, with reference to the accompanying Figures, in which:

Figure 1 shows a schematic representation of a 5G system;

Figure 2 shows a schematic representation of a control apparatus;

Figure 3 shows a schematic representation of a terminal;

Figure 4 shows a schematic representation of a 5GC sub-system comprising a cognitive network function and model repository function and cognitive network function and model selection function with interfaces for requesting a cognitive network function and/or a machine learning model, where the cognitive network function and model repository function integrates the cognitive network function and model selection function;

Figures 5a and 5b show a signaling diagram of a process for managing a request for a cognitive network function and/or a machine learning model fulfilling a requirement;

Figures 6a and 6b show a block diagram illustrating the relationships among a cognitive network function, machine learning models and other network functions;

Figure 7 shows a schematic representation of a 5GC sub-system comprising a cognitive network function and model repository function and cognitive network function and model selection function with interfaces for requesting a cognitive network function and/or a machine learning model, where the cognitive network function and model repository function is separate from the cognitive network function and model selection function;

Figure 8 shows a block diagram of a process for managing a request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement;

Figure 9 shows a block diagram of a method for managing a request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement, performed for example by a cognitive network function and model selection function;

Figure 10 shows a block diagram of a method for managing a request for a cognitive network function and/or a machine learning model fulfilling a requirement or for information relating to a cognitive network function and/or a machine learning model fulfilling a requirement, performed for example by a cognitive network function and model repository function; and

Figure 11 shows a schematic representation of a non-volatile memory medium storing instructions which, when executed by a processor, allow the processor to perform one or more of the steps of the methods of Figures 9 and 10.

Detailed Description of the Figures

In the following certain embodiments are explained with reference to mobile communication devices capable of communication via a wireless cellular system and mobile communication systems serving such mobile communication devices. Before explaining in detail the exemplifying embodiments, certain general principles of a wireless communication system, access systems thereof, and mobile communication devices are briefly explained with reference to Figures 1, 2 and 3 to assist in understanding the technology underlying the described examples.

Figure 1 shows a schematic representation of a 5G system (5GS). The 5GS may comprise a terminal, a (radio) access network ((R)AN), a 5G core network (5GC), one or more application functions (AF) and one or more data networks (DN). The 5G (R)AN may comprise one or more gNodeB (gNB) distributed unit functions connected to one or more gNodeB (gNB) centralized unit functions.

The 5GC may comprise an access and mobility management function (AMF), a session management function (SMF), an authentication server function (AUSF), a user data management (UDM), a user plane function (UPF) and/or a network exposure function (NEF). Although not illustrated, the 5GC may comprise other network functions (NF), such as a network management function (NMF), a cognitive network function (CNF), a cognitive network function and model repository function (CoMReF) and a cognitive network function and model selection function (CoMSeF).

Figure 2 illustrates an example of a control apparatus 200 for controlling a function of the (R)AN or the 5GC as illustrated in Figure 1. The control apparatus may comprise at least one random access memory (RAM) 211a, at least one read only memory (ROM) 211b, at least one processor 212, 213 and an input/output interface 214. The at least one processor 212, 213 may be coupled to the RAM 211a and the ROM 211b. The at least one processor 212, 213 may be configured to execute appropriate software code 215. The software code 215 may for example allow one or more steps of one or more of the present aspects to be performed. The software code 215 may be stored in the ROM 211b. The control apparatus 200 may be interconnected with another control apparatus 200 controlling another function of the 5G (R)AN or the 5GC. In some embodiments, each function of the (R)AN or the 5GC comprises a control apparatus 200. In alternative embodiments, two or more functions of the (R)AN or the 5GC may share a control apparatus.

Figure 3 illustrates an example of a terminal 300, such as the terminal illustrated in Figure 1. The terminal 300 may be provided by any device capable of sending and receiving radio signals. Non-limiting examples comprise a user equipment, a mobile station (MS) or mobile device such as a mobile phone or what is known as a ’smart phone’, a computer provided with a wireless interface card or other wireless interface facility (e.g., USB dongle), a personal data assistant (PDA) or a tablet provided with wireless communication capabilities, a machine-type communications (MTC) device, a cellular Internet of Things (CIoT) device or any combination of these or the like. The terminal 300 may provide, for example, communication of data for carrying communications. The communications may be one or more of voice, electronic mail (email), text message, multimedia, data, machine data and so on. The terminal 300 may receive signals over an air or radio interface 307 via appropriate apparatus for receiving and may transmit signals via appropriate apparatus for transmitting radio signals. In Figure 3, the transceiver apparatus is designated schematically by block 306. The transceiver apparatus 306 may be provided for example by means of a radio part and associated antenna arrangement. The antenna arrangement may be arranged internally or externally to the mobile device.

The terminal 300 may be provided with at least one processor 301, at least one ROM 302a, at least one RAM 302b and other possible components 303 for use in software and hardware aided execution of tasks it is designed to perform, including control of access to and communications with access systems and other communication devices. The at least one processor 301 is coupled to the RAM 302b and the ROM 302a. The at least one processor 301 may be configured to execute appropriate software code 308. The software code 308 may for example allow one or more of the present aspects to be performed. The software code 308 may be stored in the ROM 302a.

The processor, storage and other relevant control apparatus can be provided on an appropriate circuit board and/or in chipsets. This feature is denoted by reference 304. The device may optionally have a user interface such as keypad 305, touch sensitive screen or pad, combinations thereof or the like. Optionally one or more of a display, a speaker and a microphone may be provided depending on the type of the device.

One or more aspects of the present disclosure relate to NFs using artificial intelligence (AI) models and more specifically to machine learning (ML) models.

NFs using ML models are steadily gaining ground in communication networks. Such NFs may for example be used for automating network management or for fundamental network procedures like resource scheduling. Typically, a ML model is developed and trained by a vendor. The ML model may be trained with one or more ML model version contexts (i.e. network training contexts) during a training or development phase. Then, when it is confirmed that the ML model achieves a desired behavior, the vendor may parameterize and bind the ML model with operational meta information to implement a cognitive network function (CNF). On deployment into an operator’s production network, the CNF may be tested and validated by the operator (e.g., using a sandbox environment). The CNF may be deployed into operation after successfully passing the validation.

Alternatively, multiple ML models may be developed and trained by a vendor. Each ML model may be trained with one or more ML model version contexts during a training or development phase. The vendor may parameterize and bind the multiple ML models with operational meta information to implement a CNF. For optimal performance, a ML model among the multiple ML models may be selected so that the ML model version context for which the ML model has been trained matches a network expected context or a network detected context.

One or more aspects of the present disclosure provide techniques for supporting the request, identification and/or selection of a ML model so that the ML model version context for which a ML model has been trained matches a network expected context or a network detected context.

In a typical network operation, it may be possible to have multiple CNFs and/or multiple ML models available for supporting a same NF / addressing a same problem (e.g. optimization of handover thresholds). The multiple ML models may be trained with the same or different ML model version contexts (e.g. city centre vs. rural area).

In one example, multiple CNFs may have been purchased from different vendors for supporting a same NF / addressing a same problem. Each CNF may be associated with a single ML model trained with a ML model version training context. In another example, a single CNF may have been purchased from a vendor for supporting a NF / addressing a problem. The single CNF may be associated with multiple ML models trained with different ML model version contexts. There can be significant variations in the network expected context or network detected context. It may then become important to choose among multiple CNFs and/or multiple ML models so that the ML model version context for which a ML model has been trained matches a network expected context or a network detected context. There may be a need to provide means for allowing an operator, a network automation platform or a function on such a platform to request a CNF and/or a ML model that fits a certain context, or information about a CNF and/or a ML model that fits a certain context. There may be a need to provide means for parsing such a request and for matching the request to one of multiple CNFs and/or one of multiple ML models.
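To make the selection problem concrete, a minimal sketch follows. The catalogue entries, the model identifiers and the single `area` attribute are hypothetical stand-ins for the richer context attributes described elsewhere in this disclosure:

```python
# Hypothetical catalogue: two models for the same NF (e.g. optimization
# of handover thresholds), trained in different ML model version contexts.
catalogue = [
    {"model_id": "ho-opt-urban", "training_context": {"area": "city centre"}},
    {"model_id": "ho-opt-rural", "training_context": {"area": "rural"}},
]

def select_for_context(catalogue: list, detected_context: dict):
    """Choose the model whose training context matches the network
    expected or detected context; None when no model fits."""
    for entry in catalogue:
        if entry["training_context"] == detected_context:
            return entry["model_id"]
    return None
```

Under these assumptions, a detected rural context would select the rurally trained model, while an unseen context would yield the indication that no fitting model exists.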

There may be a need to provide means for sending one of multiple CNFs and/or one of multiple ML models or information relating to one of multiple CNFs and/or one of multiple ML models to the operator, the network automation platform or function on such a platform.

Moreover, to support such selection among the multiple CNFs and/or multiple ML models, there may be a need to provide means for describing the multiple CNFs and/or multiple ML models in an information model. Such an information model may then allow standard information objects to be exchanged on interfaces despite multiple vendors, NFs, network contexts and/or a combination thereof. The information objects may use fields of well-defined semantics. One or more aspects of this disclosure provide a solution allowing a selection of a fitting context-specific CNF and/or ML model.

Multiple architectures have been proposed for the use of AI/ML in mobile networks (e.g. ITU-T FG-ML5G, FG-ML5G-ARC5G Unified architecture for machine learning in 5G and future networks, and ETSI GS ZSM 002 V0.12.0 (2019-05) Reference Architecture). However, these architectures typically assume that there is only one ML model available to address all possible network contexts. They do not consider scenarios where multiple CNFs or ML models have been trained and where multiple CNFs or ML models need to be matched to a network expected context or a network detected context.

The management data analytics service (MDAS) concept in 3GPP SA5 proposes to standardize the interfaces that allow consumers to request and receive analytics for a number of analytics use cases (e.g. S5-204246 pCR Add overview of MDA functionality). Although MDAS assumes that several analytics functions may be available, it does not provide any means through which one among the many available analytics functions may be selected according to a network training context, a network expected context or a network detected context. Instead, it is assumed either that the requestor knows exactly the kind of analytics they need and makes a request for the specific analytics, or that the analytics function is able to select the correct model because the analytics function knows the context of the consumer. As such, MDAS would also benefit from a solution that provides means for such context-based selection of CNFs (in that case supporting analytics functions) and/or ML models.

Outside the networking area, there exist some approaches to add metadata to AI/ML models. For example, some platforms provide the possibility to add and retrieve ML metadata (e.g. Google Cloud, Data Analytics Products). Such metadata is focused on the ML model and its training, such as when it was created, the ML model type and various training details. The proposed models cannot be used for selecting among multiple network-focussed ML models or CNFs, and no interfaces are provided through which such ML models may be requested for a network expected context or a network detected context.

One or more aspects of this disclosure propose a CNF model repository function (CoMReF) and mechanisms through which entities (e.g. an operator, a network automated platform or a network automated function on a network automated platform) may select CNFs and/or ML models and/or information (i.e. metadata) relating to the CNFs and/or ML models. One or more aspects of this disclosure propose a mechanism allowing for a structured way of defining CNFs and/or ML models.

One or more aspects of this disclosure propose a mechanism allowing for a structured way in which an external entity can describe desired CNFs, ML models and/or ML model version contexts so that a CoMReF can understand and avail such CNFs and/or ML models.

A CoMReF may be provided so that entities (e.g. an operator, a network automated platform or a network automated function on a network automated platform) may select CNFs and/or ML models and/or information (i.e. metadata) relating to the CNFs and/or ML models.

An information model may be provided. The information model may comprise an information object class (IOC) for CNFs and/or ML models. The information model may support the interactions between entities (e.g. an operator, a network automated platform or a network automated function on a network automated platform) and the CoMReF. The information model may enable a standardized characterization of the CNFs and/or ML models. Standard specifications may be provided for such IOCs. The standard specifications may define a standard format for specifying a network training context, a network expected context or a network detected context. Vendors may provide trained ML models for a CNF with a standardized “scope” specification.

A cognitive network function and model selection function (CoMSeF) may be provided. The CoMSeF may be an implementation of logic that enables the CoMReF to match the available CNFs and/or ML models to an incoming request and to identify a best fitting CNF and/or ML model.

A structured interface, the Cognitive Network Function and Model Specification Interface (CoMSI), may be provided. The CoMSI may allow entities (e.g. an operator, a network automated platform or a network automated function on a network automated platform) to specify (automation) requirements (e.g. expected context or detected context) that need to be fulfilled and to request a CNF and/or ML model that can fulfill these requirements. Through the CoMSI the CoMReF may return the available CNFs and/or ML models that fulfil the requirements, or respond to inform the entities (e.g. an operator, a network automated platform or a network automated function on a network automated platform) when such requests cannot be met.

A mechanism for selecting CNFs and/or ML models may be provided using the CoMSI, CoMReF, CoMSeF and/or the information model. Figure 4 shows a 5GC sub-system comprising a CoMReF and a CoMSeF, where the CoMReF integrates the cognitive network function and model selection function (CoMSeF).

The CoMReF may implement a database for storing the CNFs and ML models. The CoMReF may cooperate with (and optionally host) the CoMSeF. The CoMSeF may be used to select the CNFs and/or ML models depending on requirements (e.g. network expected context or network detected context) stated on the CoMSI. A request for a CNF and/or a ML model may be initiated toward the CoMSeF by an operator, a network automated platform or a network automated function on a network automated platform. The request may state requirements (e.g. network expected context or network detected context) for a CNF and/or a ML model. The requirements may be stated in a well-defined format with well-defined fields. A priori, an operator, a network automated platform or a network automated function on a network automated platform may not know which CNFs and ML models are available in the CoMReF. Also, an operator, a network automated platform or a network automated function on a network automated platform may not know the ML model version contexts with which the ML models have been trained. An operator, a network automated platform or a network automated function on a network automated platform may only know the requirements and thus may send a request that only states the requirements (e.g. network expected context or network detected context) to be matched to the ML model version contexts with which the ML models have been trained.

Following the request, the CoMSeF may evaluate the request by comparing the stated requirements (e.g. network expected context or network detected context) with the ML model version contexts with which the ML models have been trained. If a matching CNF and/or ML model is found, the CoMSeF may fetch the matched CNF and/or ML model from the CoMReF and may return the matched CNF and/or ML model to the operator, network automated platform or network automated function on a network automated platform for deployment on the network. In an implementation, the matched CNF and/or ML model may be deployed on the network by the CoMReF.

Figures 5a and 5b show a signaling diagram of a process for managing a request for a CNF and/or a ML model fulfilling a requirement (e.g. network expected context or network detected context).

In step 1 a vendor may develop a CNF and train multiple ML models with multiple ML model version contexts. The vendor may offer the CNF and/or ML models for sale via a sales catalogue (e.g. a special kind of CoMReF). The sales catalogue may expose CNF packages. A CNF package may comprise a CNF and/or ML models. For each CNF and/or ML model, information (metadata) of the ML model version context may be provided in a well-defined machine-readable format. For each CNF and/or ML model, information (metadata) of the network expected context and/or network detected context for which the vendor guarantees valid results may be provided. The information (metadata) may be provided in a well-defined machine-readable format.
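
As a hedged illustration only, such a machine-readable metadata record for a CNF package might look as follows (JSON built in Python; every field name and value is invented for the example, the disclosure only requires a well-defined machine-readable format):

```python
import json

# Hypothetical CNF-package metadata a vendor could expose in a sales
# catalogue: the ML model version context, plus the context for which
# the vendor guarantees valid results, in machine-readable form.
package_metadata = {
    "cnfIdentifier": "Vendor_A_CCO_2020_12",
    "cnfVersion": "Vendor_A_CCO_2020_12_Rural",
    "models": [
        {
            "modelVersionIdentifier": "Vendor_A_CCO_2021_02_Rural_v2102",
            "modelVersionContext": {
                "trainingConditions": {"location": "rural", "date": "2021-03-01"},
                "trainingState": {"converged": True},
            },
            "guaranteedContext": {"location": "rural"},
        }
    ],
}
print(json.dumps(package_metadata, indent=2))
```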

In step 2, an operator may (manually) select the CNF and/or one of the multiple ML models that fits a network expected context based on the information (metadata) stored by the CoMReF.

In step 3 the operator may purchase the selected CNF and/or one of the multiple ML models. In step 3a the operator may check a CoMReF for the portfolio of already available CNFs and/or ML models by retrieving and inspecting the information (metadata) stored by the CoMReF.

In step 4 the operator may onboard the selected CNF and/or one of the multiple ML models to the CoMReF. The operator may onboard information (metadata) relating to the selected CNF and/or one of the multiple ML models to the CoMReF.

In step 5 during design time of a network management function (e.g. a closed loop) the operator may query the information (metadata) of the CoMReF to select a CNF and/or a ML model that matches a network expected context.

The network management function may require a CNF fit for the respective use case handled by that NMF. But there may be multiple CNFs or even for one CNF multiple ML models that can support the NMF. The NMF may need to identify the appropriate CNF and ML model which it then may deploy for its use case.

In step 6 during run time of the network management function, the network management function may determine a network detected context. The network detected context may be associated with a specific scope (e.g. area and/or time) of the network.

In step 7 based on the detected context and/or a network expected context the network management function may send a request to a CoMSeF for a CNF and/or a ML model with a ML model version context that matches the network detected context and/or the network expected context.

In step 7a the CoMSeF may process the request. The CoMSeF may store metadata relating to the CNFs and/or ML models stored on the CoMReF. The metadata may comprise the ML model version contexts with which the ML models have been trained. The CoMSeF may compare the network detected context and/or the network expected context with the ML model version contexts with which the ML models have been trained. The ML model version context may comprise training conditions, a training state, operating or inference conditions, reference performance, CNF Properties / parameters, data characteristics or other (as discussed below). The ML model version context may also comprise expected conditions.

In step 7b the CoMSeF may send a request to the CoMReF for a CNF and/or a ML model with a ML model version context that matches the network detected context and/or the network expected context. The CoMReF may identify a CNF and/or a ML model with a ML model version context that matches the network detected context and/or the network expected context.

In step 7c the CoMReF may send a response to the request to the CoMSeF comprising the CNF and/or the ML model with a ML model version context that matches the network detected context and/or the network expected context.

In step 7d, the CoMSeF may send a response to the request to the network management function comprising the CNF and/or the ML model with a ML model version context that matches the network detected context and/or the network expected context.

It will be understood that although in the above the requirement to be matched to a ML model version context is the network detected context and/or the network expected context other characteristics may be stated in the requests and may be matched to the ML model version context (e.g. training state, CNF properties or parameters, data characteristics or other).

In step 8 the network management function may run the CNF, which further on acts autonomously. In step 9 during run time the CNF may determine a network detected context. The network detected context may be associated with a specific scope (e.g. area or time) of the network. In step 10 based on the network detected context and/or a network expected context the CNF may send a request to the CoMSeF for a CNF and/or a ML model with a ML model version context that matches the network detected context and/or the network expected context.

In step 10a the CoMSeF may process the request. The CoMSeF may store metadata relating to the CNFs and/or ML models stored on the CoMReF. The metadata may comprise the ML model version contexts with which the ML models have been trained. The CoMSeF may compare the network detected context and/or the network expected context with the ML model version contexts with which the ML models have been trained. The ML model version context may comprise training conditions, a training state, operating or inference conditions, reference performance, CNF properties / parameters, data characteristics or other (as discussed below). The ML model version context may also comprise expected conditions.

In step 10b the CoMSeF may send a request to the CoMReF for a ML model version with a ML model version context that matches the network detected context and/or the network expected context. The CoMReF may identify a ML model with a ML model version context that matches the network detected context and/or the network expected context.

In step 10c, the CoMReF may send a response to the request to the CoMSeF comprising the ML model with a ML model version context that matches the network detected context and/or the network expected context.

In step 10d, the CoMSeF may send a response to the request to the CNF comprising the ML model with a ML model version context that matches the network detected context and/or the network expected context. Again, it will be understood that although in the above the requirement to be matched to a ML model version context is the network detected context and/or the network expected context other characteristics may be stated in the requests and may be matched to the ML model version context (e.g. training state, CNF properties or parameters, data characteristics or other).

The formal description of a CNF and/or ML model may include the ML model version context that applies to the CNF and/or ML model. Multiple ML models may be maintained for a CNF, each differing in the ML model version context. A ML model version context may be identified by ML model version context attributes.

Table 1 illustrates an example of a formal description of a CNF. The CNF may be identified by a CNF identifier (e.g. Nokia_CMO_2021a) and a CNF version identifier (e.g. NK01_Rural). The CNF version identifier may be associated with a specific scope (e.g. designed for a rural area). The CNF may be associated with one or more ML models. Each ML model may be identified by a ML model identifier (e.g. NK01_Rural) and a ML model version identifier (e.g. NK01_Rural_v2102). Each ML model version identifier may be associated with a specific scope (e.g. designed for a rural area). Each ML model version identifier may be associated with a specific sub-scope (e.g. auto-generated based on training data from February 2021). Each ML model may be associated with one or more ML model version contexts. The ML model version context may be identified by ML model version context attributes.

Table 1

The ML model version context may comprise a training conditions attribute. The training conditions may comprise the conditions under which the ML model was trained. Such conditions may include statistics of UE velocity during training, statistics of channel conditions during training, statistics of signal to interference plus noise ratio (SINR) during training, cell traffic during training or others.

The ML model version context may comprise a training state attribute. The training state may indicate an amount of training data and/or duration of training runs. A training state may for example distinguish a partially trained version from a mature version.

The ML model version context may comprise operating or inference conditions. The operating or inference conditions may be the conditions under which the ML model may perform best. The ML model may have been trained on a wide data set but the capability or achievable performance may not be uniform across the data set. The ML model may not have been trained or may have been less trained on a sub-space of a data set. In such cases it may be useful to identify a sub-space of a data set in which the ML model performs best. The operating or inference conditions may for example state desired traffic characteristics.

The ML model version context may comprise a reference performance attribute. The reference performance may indicate an expected performance of the ML model. The reference performance may track an exact performance achieved by the ML model on a set of specified test cases. The test cases may indicate differences in usage scenarios.

The ML model version context may comprise a CNF properties or parameters attribute. The CNF properties or parameters may document the static properties of functions themselves or the algorithms used to create the functions. Example information may include neural network parameters like activation functions, number and size of layers, etc. The ML model version context may comprise a data characteristics attribute. The data characteristics may comprise the characteristics of the assumed input data (e.g. whether the assumed input data is a matrix or a one-hot vector and for any such data structure the relations among the entries). The data characteristics may also comprise a length of training data as well as the pre-processing or quality requirements for that training data.
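
Taken together, the version-context attributes described above could be grouped in a structure such as the following (a sketch only; the grouping into dictionaries, the field names and the example values are assumptions, not part of the disclosure):

```python
from dataclasses import dataclass, field

@dataclass
class MLModelVersionContext:
    """One ML model version context, keyed by the attributes described above."""
    training_conditions: dict = field(default_factory=dict)   # e.g. UE velocity/SINR statistics
    training_state: dict = field(default_factory=dict)        # e.g. data volume, run duration
    operating_conditions: dict = field(default_factory=dict)  # conditions where the model performs best
    reference_performance: dict = field(default_factory=dict) # results on specified test cases
    cnf_properties: dict = field(default_factory=dict)        # e.g. layer count, activation functions
    data_characteristics: dict = field(default_factory=dict)  # input shape, pre-processing requirements

ctx = MLModelVersionContext(
    training_conditions={"ue_velocity_kmh_mean": 12, "location": "rural"},
    training_state={"training_hours": 10, "samples": 76548},
)
print(ctx.training_conditions["location"])  # rural
```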

Based on information (metadata) relating to CNFs and/or ML models a tree may be created. The tree may comprise the CNFs and/or ML models. The tree may identify the critical differences between the ML models.

The CNF identifier may be autogenerated or may be generated by the operator. The CNF version identifier may be autogenerated or may be generated by the operator.

The ML model identifier may be autogenerated or may be generated by the operator. The ML model version identifier may be autogenerated or may be generated by the operator.

The ML model version context may be a characteristic of a ML model but may also be attributed to a CNF. Table 1 only illustrates the case where the ML model version context is attributed to the ML model version.

For each ML model version context attribute an information object may be provided. The information object may comprise a fixed set of dimensions. Example dimensions for three ML model version context attributes are shown in Table 2. For each dimension there may be a fixed set of alternatives among which a value may be selected.
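
A fixed set of dimensions, each with a fixed set of alternative values as described above, could for instance be encoded with enumerations (a sketch; the dimension names and alternatives below are invented for illustration, Table 2 holds the actual examples):

```python
from enum import Enum

# Hypothetical fixed alternatives for two dimensions of the
# "training conditions" context attribute.
class Location(Enum):
    RURAL = "rural"
    URBAN = "urban"
    CBD = "cbd"

class Load(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

# An information object then selects exactly one value per dimension.
training_conditions = {"location": Location.RURAL, "load": Load.LOW}
print(training_conditions["location"].value)  # rural
```

Restricting each dimension to a fixed alternative set is what makes requests machine-comparable across vendors.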

To allow for usage in a multi-vendor environment, a CNF and/or a ML model may be described by a standardized information model. To this end, the CNF and/or ML models may be introduced as new IOCs in the 3GPP Network Resource Model.

Such an information model may serve various purposes. The information model may enable different network management tools to exchange management data to manage a CNF and/or a ML model during runtime (i.e. to create, modify and delete corresponding managed entities). Since a CNF is able to control other network functions, the information model may mirror these relationships. To characterize a CNF and/or a ML model during the training phase, the same standardized structure of the information model may be used. The information model may be used to document the network training context of the CNF and/or ML models during the training phase and the preconditions required at runtime to ensure correct results of the CNF and/or ML models.

The applicable relationship between a CNF, ML models and other NFs is described by Figures 6a and 6b.

A CNF IOC may represent a CNF in the information model. A CNF may be associated with multiple ML models. A CNF may be identified by a CNF identifier and a CNF version identifier.

In an example the CNF IOC may have attributes as listed in Table 3: a CNFIdentifier attribute, a CNFVersion attribute and a CNFProperties attribute. These attributes may be a dataType or a class of its own. These attributes may comprise fields. The fields may comprise a support qualifier field, a readable field, a writable field, an invariant sub-field and/or a notifiable field.

Table 3

Versions may not be part of the CNF but part of the ML model. In such a case the selection may be a two-step approach: first select the CNF, then select the ML model that matches an expected context or a detected context.

An ML model IOC may represent a ML model in the information model. The ML model may be trained with a ML model version context. The ML model may be associated with a CNF.

An ML model IOC may reflect the contexts at different phases. An ML model IOC may allow a CNF and/or a ML model to be managed as part of a network (network detected context/runtime context). An ML model IOC may allow a management system to manage the relationship of a CNF and/or a ML model with respect to other network functions (network detected context/runtime context). An ML model IOC may allow the context during training (network training context) of a CNF and/or a ML model to be documented, as well as the context that is expected by the CNF for correct inference (network expected context/expected runtime context).

Since each ML model may be related to a set of contexts (e.g. network training context, network expected context/expected runtime context and network detected context/runtime context), a ML Model IOC may contain lists of contexts.

A context may be associated to managed entities (e.g. network slice, network slice subnets or network functions) to model the configuration of the network and to characterize the data provided by the managed entities. Further, a context may refer to data providers that are not part of the 3GPP system and are thus not modelled as 3GPP managed entities.

A set of attributes for the ML Model IOC is stated by Table 4. The ML Model IOC may comprise a modelVersionContextList or trainingContextList attribute, an expectedRuntimeContextList attribute, a runtimeContextList attribute, a modelProperties attribute, a modelVersionIdentifier attribute and an operatingConditions attribute. These attributes may be a dataType or a class of its own. These attributes may comprise fields. The fields may comprise a support qualifier field, a readable field, a writable field, an invariant sub-field and/or a notifiable field.

Table 4

Model properties may comprise a number of neural network layers, a size of neural network layers or other properties. Three contexts may be documented: the detected context/runtimeContext, the trainingContext, and the expected context/ expected runtime context. Each of these contexts may be a data type of its own. These data types may document the context of a CNF and/or ML model during the different phases.
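
As a sketch of how the Table 4 attributes and the three context data types could hang together (the attribute names follow Table 4 and Table 5; the Python representation, field types and example values are assumptions for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class Context:
    """Shared structure for training, expected-runtime and runtime contexts."""
    managed_entity_ref: str = ""
    data_provider_ref: str = ""
    start_time: str = ""
    end_time: str = ""
    parameters: dict = field(default_factory=dict)

@dataclass
class MLModel:
    """ML Model IOC sketch: one model version with its context lists."""
    model_version_identifier: str
    model_properties: dict = field(default_factory=dict)      # e.g. layer count/size
    operating_conditions: dict = field(default_factory=dict)
    training_context_list: list[Context] = field(default_factory=list)
    expected_runtime_context_list: list[Context] = field(default_factory=list)
    runtime_context_list: list[Context] = field(default_factory=list)

m = MLModel("NK01_Rural_v2102",
            training_context_list=[Context(start_time="2021-02-01",
                                           end_time="2021-02-28")])
print(len(m.training_context_list))  # 1
```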

For each datatype the attributes in Table 5 may be applicable. Table 5

DataProviderRef may refer to the data used for training of the model in a specific context or might refer to the data used for inference in a specific context. Troubleshooting of irregularities during the runtime of the CNF using a specific model in a specific context might rely on a comparison of the corresponding data sets. Parameter values in TrainingContext and ExpectedRuntimeContext may be for documentation purpose and thus read-only.

StartTime and endTime may document the time interval of a TrainingContext or ExpectedRuntimeContext to characterize the amount of data that has been used for training or the amount of data that is expected during runtime, respectively.

The CoMSeF may be implemented as a submodule within the CoMReF or as a standalone function that interacts with the CoMReF. Figure 4 illustrates the case where the CoMSeF is implemented as a submodule within the CoMReF, while Figure 7 illustrates the case where the CoMSeF is a standalone function.

As a standalone function, the CoMSeF may implement both the CoMSI as well as the optional Cognitive Network Function and Model deployment Interface (CoMDI). The CoMDI would otherwise be implemented by the CoMReF when the CoMSeF is integrated within the CoMReF. Moreover, the CoMSeF may implement the query interface to the CoMReF.

The logic for the operation of the CoMSeF may be broken down as illustrated by Figure 8.

The CoMSeF may parse the request to identify the nature of the request: either a request for a CNF and/or a ML model, or a request for information relating to a CNF and/or a ML model for a context (e.g. expected context or detected context).

The CoMSeF may parse the request to identify the context attributes, the dimensions for each attribute and the values for each dimension.

The CoMSeF may query the CoMReF to identify a CNF and/or a ML model with a training context that matches the identified context.

If no CNF and/or ML model is found, the CoMSeF may notify the requester as such. If a CNF and/or a ML model is found and a CNF and/or a ML model was requested, the CoMSeF may send the CNF and/or ML model to the requester.

If a CNF and/or a ML model is found and information relating to a CNF and/or a ML model was requested, the CoMSeF may build an information object comprising information relating to the CNF and/or ML model and may send the information object to the requester.
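
The operation logic described above (parse the request kind, extract context attributes, query the repository, respond) might be summarized as follows (a sketch; the request and repository shapes, the "getModel"/"getModelInfo" labels and the exact-match query are assumptions for illustration, not the disclosure's interface definition):

```python
# Sketch of the CoMSeF request-handling flow: identify the request kind,
# match the stated context against stored version contexts, then return
# the model, its metadata, or a not-found notification.

def handle_request(request: dict, repository: list[dict]) -> dict:
    kind = request["kind"]       # "getModel" or "getModelInfo"
    wanted = request["context"]  # attribute -> value pairs to match
    matches = [m for m in repository
               if all(m["versionContext"].get(k) == v for k, v in wanted.items())]
    if not matches:
        return {"status": "not_found"}
    best = matches[0]
    if kind == "getModel":
        return {"status": "ok", "model": best["modelVersion"]}
    return {"status": "ok", "info": best["versionContext"]}

repo = [{"modelVersion": "NK01_Rural_v2102",
         "versionContext": {"location": "rural"}}]
print(handle_request({"kind": "getModel", "context": {"location": "rural"}}, repo))
```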

The CoMSI may allow an operator, a network automation platform or a function on such a platform to request a CNF and/or a ML model by stating a required context (e.g. expected context or detected context). The request may be stated as a “getCNF”, “getModel” or an equivalent “read” operation as illustrated below.

getModel with context {
    Training conditions {
        Load, Low
        Location, CBD
    }
    Training state {
        training_time, 10hours
        Trainingsample, 76548
    }
}
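
A minimal parser for the bracketed request syntax shown above could look as follows (a sketch under the assumption that the syntax is line-oriented, with "name {" opening a block, "}" closing it, and "key, value" pairs inside; the disclosure does not prescribe this grammar):

```python
def parse_comsi(text: str) -> dict:
    """Line-based parser for the bracketed CoMSI request syntax."""
    root: dict = {}
    stack = [root]
    for raw in text.strip().splitlines():
        line = raw.strip()
        if line.endswith("{"):            # open a nested block
            child: dict = {}
            stack[-1][line[:-1].strip()] = child
            stack.append(child)
        elif line == "}":                 # close the current block
            stack.pop()
        elif line:                        # "key, value" pair
            key, value = (part.strip() for part in line.split(",", 1))
            stack[-1][key] = value
    return root

request = """\
getModel with context {
    Training conditions {
        Load, Low
        Location, CBD
    }
    Training state {
        training_time, 10hours
        Trainingsample, 76548
    }
}
"""
parsed = parse_comsi(request)
print(parsed["getModel with context"]["Training conditions"]["Location"])  # CBD
```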

The CoMSI may allow an operator, a network automation platform or a function on such a platform to request information on some or all CNFs and/or ML models by stating a required context (e.g. expected context or detected context). The request may be stated as a “getAllCNFs”, “getAllModels” or an equivalent “read” operation as illustrated below.

getAllModels with context {
    Training conditions {
        Location, Rural
    }
}

The CoMSI may allow the CoMSeF to be either a standalone module or a module integrated into the CoMReF.

The CoMSI may allow the CoMSeF to notify an operator, a network automation platform or a function thereof when no CNF and/or ML model is found.

The CoMSI may allow the CoMSeF to send a CNF and/or ML model to an operator, a network automation platform or a function thereof. The CoMSeF may send an identifier of a CNF and/or ML model and/or a file containing the learned parameters of a CNF and/or ML model.

The CoMSI may allow the CoMSeF to send information relating to a CNF and/or ML model to the operator, to a network automation platform or to a function thereof. The CoMSeF may send an identifier of a CNF and/or a ML model and/or a training context.

In an example an operator may have a radio system supplied by two vendors A and B each of which supplies equipment for an urban network and a rural network.

Considering that both vendors A and B supply CNFs and/or ML models for coverage and capacity optimization (CCO), trained at different times with different data (rural or urban), the CoMReF for this scenario may contain the information model represented by Table 6.

Table 6

An operator, a network automation platform or a function thereof wishing to find an appropriate CNF and/or ML model for a rural area may send the request below.

getAllModels with context {
    Training conditions {
        Location, Rural
    }
}

The CoMSeF may process the request, identify that there are two models for rural locations and send the response below to the operator, the network automation platform or the function thereof.

Models with context {
    Training conditions {
        Location, Rural
    }
    Are: {
        CNF, Vendor_A_CCO_2020_12 {
            CNF version, Vendor_A_CCO_2020_12_Rural {
                Model, Vendor_A_CCO_2021_02_Rural {
                    Model identifier, Vendor_A_CCO_2021_02_Rural_v2101;
                    Training conditions {
                        Date, 01022021_03:46
                    }
                    Training state {
                        Converged, No
                    }
                    Operating/inference conditions {
                        Subnetwork, Vendor_A
                    }
                }
            }
        }
        CNF, Vendor_A_CCO_2020_12 {
            CNF version, Vendor_A_CCO_2020_12_Rural {
                Model, Vendor_A_CCO_2021_02_Rural {
                    Model version identifier, Vendor_A_CCO_2021_02_Rural_v2102;
                    Training conditions {
                        Date, 01032021_04:51
                    }
                    Training state {
                        Converged, Yes
                    }
                    Operating/inference conditions {
                        Subnetwork, Vendor_A
                    }
                }
            }
        }
    }
}

Accordingly, the operator, the network automation platform or the function thereof may realize that only ML models fitting the network area of vendor A are available and that only one of the two is matured/converged. The operator, the network automation platform or the function thereof may then deploy the converged ML model. However, a month later, when a new ML model is likely to have been trained, the operator, the network automation platform or the function thereof may want to see if there is a ML model with better performance and may as such request all ML models for the Vendor A area that have converged. In that case the request is more specific, including more context fields as follows.

getAllModels with context {
    Training conditions {
        Location, Rural
    }
    Training state {
        Converged, Yes
    }
    Operating/inference conditions {
        Subnetwork, Vendor_A
    }
}

One or more aspects of this disclosure enable CNFs and/or ML models to be characterized with a description of the conditions assumed when training the ML model so that usage is matched to those conditions.

CNF and/or ML model information may allow a tree of different ML models to be created in a way that identifies the critical differences among them.
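
In a simple form, building such a tree could group the ML models by the context attributes on which they differ (a sketch; the two-level grouping order and the example models are assumptions for illustration):

```python
from collections import defaultdict

# Group ML models into a two-level tree: first by location, then by
# convergence state - the attributes on which these example models differ.
models = [
    {"id": "A_Rural_v2101", "location": "rural", "converged": False},
    {"id": "A_Rural_v2102", "location": "rural", "converged": True},
    {"id": "A_Urban_v2102", "location": "urban", "converged": True},
]
tree: dict = defaultdict(lambda: defaultdict(list))
for m in models:
    tree[m["location"]][m["converged"]].append(m["id"])

print(tree["rural"][True])  # ['A_Rural_v2102']
```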

One or more aspects of this disclosure supports performance guarantees by providing means for selecting among different ML models that serve the same objective. One or more aspects of this disclosure allow a CNF and/or a ML model to be matched with the right operating conditions, which avoids sub-optimal CNF and/or ML model performance.

Standardizing metadata allows the identification and selection of a CNF and/or ML model to be undertaken by autonomic functions in the management plane.

Figure 9 shows a block diagram of a method for managing a request for a CNF and/or a ML model fulfilling a requirement or for information relating to a CNF and/or a ML model fulfilling a requirement, performed for example by a CoMSeF.

In step 900, a CoMSeF may store information relating to a plurality of CNFs and/or ML models deployable on a network. Each CNF may be associated with one or more ML models. Each ML model may be trained with one or more training contexts.

In step 902, the CoMSeF may receive a first request for a CNF and/or a ML model fulfilling a requirement or for information relating to a CNF and/or a ML model fulfilling a requirement.

In step 904, the CoMSeF may process the first request to identify a CNF and/or a ML model fulfilling the requirement.

In step 906, the CoMSeF may send a response to the first request based on the processing.

Processing the first request to identify a CNF and/or a ML model fulfilling the requirement may comprise: comparing the information relating to the plurality of CNFs and/or ML models to the requirement to identify a CNF and/or a ML model fulfilling the requirement.
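Steps 900 to 906 above can be sketched as follows. The class and attribute names (CoMSeF, registry) and the representation of the requirement as a predicate are assumptions for this sketch, not definitions from the disclosure:

```python
class CoMSeF:
    """Minimal sketch of the method of Figure 9."""

    def __init__(self):
        self.registry = []  # step 900: stored CNF and/or ML model information

    def store(self, entry):
        """Step 900: store information relating to a CNF and/or ML model."""
        self.registry.append(entry)

    def handle_request(self, requirement):
        """Steps 902-906: receive the first request, compare the stored
        information against the requirement, and return a response."""
        hits = [entry for entry in self.registry if requirement(entry)]
        if hits:
            return hits  # step 906: response based on the processing
        return "no CNF and/or ML model identified"
```

A caller would register model information once and then issue requests with different requirements, e.g. `handle_request(lambda e: e["converged"])`.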

The CoMSeF may identify a CNF and/or a ML model fulfilling the requirement. The CoMSeF may send a second request for the identified CNF and/or ML model. The CoMSeF may receive a response to the second request comprising the identified CNF and/or ML model. The CoMSeF may send a response to the first request comprising the identified CNF and/or ML model.

The CoMSeF may identify a CNF and/or a ML model fulfilling the requirement. The CoMSeF may send a response to the first request comprising information relating to the identified CNF and/or ML model.

The CoMSeF may identify no CNF and/or ML model fulfilling the requirement. The CoMSeF may send a response to the first request comprising an indication that no CNF and/or ML model has been identified.
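The three outcomes above (a model retrieved via a second request to a CoMReF, an information-only lookup, or no match) can be illustrated with a hypothetical sketch. The function names, the index and repository dictionaries, and the response format are all assumptions for illustration, not part of the disclosure:

```python
def comref_fetch(repository, model_id):
    """Response to the second request: the identified ML model itself,
    served by a CoMReF-like repository."""
    return repository[model_id]


def comsef_respond(index, repository, requirement):
    """Process the first request and build a response: fetch the first
    matching model via a second request, or indicate that none exists."""
    matched = [model_id for model_id, info in index.items() if requirement(info)]
    if not matched:
        # Indication that no CNF and/or ML model has been identified.
        return {"status": "no-match"}
    # Second request to the repository for the identified model.
    return {"status": "ok", "model": comref_fetch(repository, matched[0])}
```

An information-only variant would return `matched` itself instead of issuing the second request.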

The first request may comprise an expected context or a detected context. The requirement may comprise an availability of a CNF and/or a ML model with a training context matching the expected context or detected context.

Each training context, expected context or detected context may comprise one or more of the following context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute.
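The context attributes listed above can be grouped, for illustration only, into a single record. The snake_case field names and the types are assumptions paraphrased from the attribute list, not normative definitions:

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Context:
    """Sketch of a training, expected or detected context record."""
    managed_entity_ref: Optional[str] = None     # managed entity reference
    data_provider_ref: Optional[str] = None      # data provider reference
    start_time: Optional[str] = None
    end_time: Optional[str] = None
    training_conditions: dict = field(default_factory=dict)
    training_state: dict = field(default_factory=dict)
    operating_conditions: dict = field(default_factory=dict)
    reference_performance: dict = field(default_factory=dict)
    cnf_properties: dict = field(default_factory=dict)   # CNF properties
    data_characteristics: dict = field(default_factory=dict)
```

Every attribute is optional here, reflecting that a context may comprise one or more of the listed attributes.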

The first request may originate from a NMF or a CNF deployed on the network. The response to the first request may be directed to the NMF or the CNF deployed on the network.

The second request may be directed to a CoMReF deployed on the network. The response to the second request may originate from the CoMReF deployed on the network.

The CoMSeF may be integrated within the CoMReF or may be separate from the CoMReF.

The CoMSeF may receive a first request for all CNFs and/or all ML models fulfilling a requirement or for information relating to all CNFs and/or all ML models fulfilling a requirement. The CoMSeF may process the first request to identify all CNFs and/or all ML models fulfilling the requirement.

Each CNF may be associated with a CNF IOC in an information model.

Each ML model may be associated with a ML model IOC in the information model.

Each training context, expected context or detected context may be associated with a training context, expected context or detected context IOC in the information model.

The CNF IOC may comprise a CNF properties attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

The ML model IOC may comprise a training context attribute, an expected context attribute and/or a detected context attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.

The training context, expected context or detected context IOC may comprise one or more of the following training context, expected context or detected context attributes: a managed entity reference attribute, a data provider reference attribute, a start time attribute, an end time attribute, a training conditions attribute, a training state attribute, an operating conditions attribute, a reference performance attribute, a cognitive network function properties attribute and/or a data characteristics attribute with a plurality of fields, each field having a single value selected among a fixed set of alternatives.
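The constraint that each field holds a single value selected among a fixed set of alternatives can be illustrated with an enumeration. The example field (Location) and its alternative values are illustrative assumptions, not normative:

```python
from enum import Enum


class Location(Enum):
    """A fixed set of alternatives for one illustrative context field."""
    RURAL = "Rural"
    URBAN = "Urban"
    DENSE_URBAN = "DenseUrban"


def set_location(value):
    """Accept only one of the fixed alternatives; Enum lookup raises
    ValueError for any value outside the set."""
    return Location(value)
```

Modelling each field this way makes requests machine-checkable, supporting the selection by autonomic functions described above.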

Each CNF may be identified by a CNF identifier.

Each ML model may be associated with a ML model identifier.

Figure 10 shows a block diagram of a method for managing a request for a CNF and/or a ML model fulfilling a requirement or for information relating to a CNF and/or a ML model fulfilling a requirement, performed for example by a CoMReF. The CoMReF may store a plurality of CNFs and/or ML models deployable on a network. Each CNF may be associated with one or more ML models. Each ML model may be trained with one or more training contexts.

The CoMReF may receive a second request for an identified CNF and/or a ML model fulfilling a requirement.

The CoMReF may send a response to the second request comprising the identified CNF and/or ML model fulfilling the requirement. The second request may originate from a CoMSeF. The response to the second request may be directed to the CoMSeF.

The CoMReF may integrate the CoMSeF or may be separate from the CoMSeF.

Figure 11 shows a schematic representation of non-volatile memory media 1100a (e.g. compact disc (CD) or digital versatile disc (DVD)) and 1100b (e.g. universal serial bus (USB) memory stick) storing instructions and/or parameters 1102 which, when executed by a processor, allow the processor to perform one or more of the steps of the methods of Figures 9 and 10.

It is noted that while the above describes example embodiments, there are several variations and modifications which may be made to the disclosed solution without departing from the scope of the present invention. It will be understood that although the above concepts have been discussed in the context of a 5GS, one or more of these concepts may be applied to other cellular systems. The embodiments may thus vary within the scope of the attached claims. In general, some embodiments may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. For example, some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although embodiments are not limited thereto. While various embodiments may be illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.

The embodiments may be implemented by computer software stored in a memory and executable by at least one data processor of the involved entities, or by hardware, or by a combination of software and hardware. Further in this regard it should be noted that any procedures, e.g., as in Figures 9 and 10, may represent program steps, or interconnected logic circuits, blocks and functions, or a combination of program steps and logic circuits, blocks and functions. The software may be stored on such physical media as memory chips, or memory blocks implemented within the processor, magnetic media such as hard disks or floppy disks, and optical media such as, for example, DVD and the data variants thereof, or CD.

The memory may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory. The data processors may be of any type suitable to the local technical environment, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASIC), gate level circuits and processors based on multi-core processor architecture, as non-limiting examples. Alternatively or additionally some embodiments may be implemented using circuitry. The circuitry may be configured to perform one or more of the functions and/or method steps previously described. That circuitry may be provided in the base station and/or in the communications device.

As used in this application, the term “circuitry” may refer to one or more or all of the following:

(a) hardware-only circuit implementations (such as implementations in only analogue and/or digital circuitry);

(b) combinations of hardware circuits and software, such as:

(i) a combination of analogue and/or digital hardware circuit(s) with software/firmware and

(ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as the communications device or base station, to perform the various functions previously described; and

(c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.

This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors), or a portion of a hardware circuit or processor, and its (or their) accompanying software and/or firmware. The term circuitry also covers, for example, an integrated device.

The foregoing description has provided, by way of exemplary and non-limiting examples, a full and informative description of some embodiments. However, various modifications and adaptations may become apparent to those skilled in the relevant arts in view of the foregoing description, when read in conjunction with the accompanying drawings and the appended claims. Nevertheless, all such and similar modifications of the teachings will still fall within the scope as defined in the appended claims.