Title:
APPARATUS, METHOD, AND COMPUTER PROGRAM
Document Type and Number:
WIPO Patent Application WO/2022/057998
Kind Code:
A1
Abstract:
The disclosure relates to an apparatus comprising means for: receiving a request to deploy a data analytics function instance from an operator; causing the data analytics function instance to be deployed; determining that another data analytics function instance is to be deployed or is deployed and is to be reconfigured to provide input data to the data analytics function instance; and causing the other data analytics function instance to be deployed or to be reconfigured.

Inventors:
BAJZIK LAJOS (HU)
SZILÁGYI PÉTER (HU)
Application Number:
PCT/EP2020/075751
Publication Date:
March 24, 2022
Filing Date:
September 15, 2020
Assignee:
NOKIA SOLUTIONS & NETWORKS OY (FI)
International Classes:
H04L12/24
Domestic Patent References:
WO2020169174A12020-08-27
Foreign References:
CN110677299A2020-01-10
US20200228420A12020-07-16
Other References:
ERICSSON: "Automatic provisioning of Expected UE Behaviour in UDR", vol. SA WG2, no. Xi'an, China; 20190408 - 20190412, 2 April 2019 (2019-04-02), XP051719281, Retrieved from the Internet [retrieved on 20190402]
NOKIA ET AL: "KI #2, New Solution: Distributed NWDAFs deployment and Aggregation Function", vol. SA WG2, no. Elbonia; 20200819 - 20200902, 2 September 2020 (2020-09-02), XP051928783, Retrieved from the Internet [retrieved on 20200902]
Attorney, Agent or Firm:
NOKIA EPO REPRESENTATIVES (FI)
Claims:

CLAIMS

1. An apparatus comprising means for: receiving a request to deploy a data analytics function instance from an operator; causing the data analytics function instance to be deployed; determining that another data analytics function instance is to be deployed or is deployed and is to be reconfigured to provide input data to the data analytics function instance; and causing the other data analytics function instance to be deployed or to be reconfigured.

2. The apparatus of claim 1, wherein the request to deploy the data analytics function instance comprises an input data template defining one or more input data type for the data analytics function instance and/or an output data template defining one or more output data type for the data analytics function instance.

3. The apparatus of claim 2, wherein the input data template and/or the output data template comprise one or more measurement or analytics result.

4. The apparatus of claim 1 or claim 2, wherein the request to deploy the data analytics function instance comprises an input data scope defining one or more input data source for the data analytics function instance and/or an output data scope defining one or more output data destination for the data analytics function instance.

5. The apparatus of claim 4, wherein the input data scope and/or the output data scope comprise one or more measurement source or data analytics function instance.

6. The apparatus of any of claims 2 to 5, wherein determining that another data analytics function instance is to be deployed or is deployed and is to be reconfigured is based on the input data template and the input data scope of the data analytics function instance.

7. The apparatus of any of claims 1 to 6, wherein causing the data analytics function instance to be deployed comprises providing at least part of a data analytics function deployment package to a virtual network function and/or cloud native network function orchestrator; and/or wherein causing the other data analytics function instance to be deployed comprises providing at least part of another data analytics function deployment package to the virtual network function and/or cloud-native network function orchestrator.

8. The apparatus of any of claims 1 to 7, wherein causing the other data analytics function instance to be reconfigured comprises causing the other data analytics function instance to modify the input data scope and/or output data scope of the other data analytics function instance.

9. The apparatus of any of claims 1 to 8, wherein the request to deploy the data analytics function instance comprises a data analytics function deployment package identifier associated with a data analytics function deployment package.

10. The apparatus of any of claims 1 to 9, comprising means for: receiving a request to register a data analytics function deployment package, the request to register the data analytics function deployment package comprising the data analytics function deployment package; associating a data analytics function deployment package identifier to the data analytics function deployment package; and storing the data analytics function deployment package and the data analytics function deployment package identifier.

11. The apparatus of claim 9 or claim 10, wherein the data analytics function deployment package comprises one or more profile, each profile comprises an input data template and an output data template.

12. The apparatus of any of claims 9 to 11, wherein the data analytics function deployment package comprises a data analytics function input and output descriptor file, wherein the data analytics function input and output descriptor file comprises one or more profile associated with one or more profile identifier, each profile comprises an input data template and an output data template.

13. The apparatus of any of claims 9 to 12, wherein the apparatus comprises means for: storing the one or more profiles with the one or more profile identifier.

14. The apparatus of claim 13, wherein the request to deploy the data analytics function instance comprises a profile identifier associated with one of the one or more profiles.

15. The apparatus of any of claims 9 to 14, comprising means for: providing the data analytics function deployment package identifier to the operator.

16. The apparatus of claim 15, wherein the request to deploy the data analytics function instance comprises the data analytics function deployment package identifier.

17. The apparatus of any of claims 1 to 16, comprising means for: providing a uniform resource link of the data analytics function instance to the operator.

18. The apparatus of any of claims 1 to 17, wherein the apparatus is a data analytics management platform.

19. The apparatus of claim 18, wherein the apparatus is a data collection coordination function.

20. A method comprising: receiving a request to deploy a data analytics function instance from an operator; causing the data analytics function instance to be deployed; determining that another data analytics function instance is to be deployed or is deployed and is to be reconfigured to provide input data to the data analytics function instance; and causing the other data analytics function instance to be deployed or to be reconfigured.

21. A computer program comprising computer executable instructions which when run on one or more processors perform the steps of the method of claim 20.

Description:
APPARATUS, METHOD, AND COMPUTER PROGRAM

Field of the disclosure

The present disclosure relates to an apparatus, a method, and a computer program for managing data analytics function instances in a mobile network.

Background

A communication system can be seen as a facility that enables communication sessions between two or more entities such as user terminals, base stations/access points and/or other nodes by providing carriers between the various entities involved in the communications path. A communication system can be provided for example by means of a communication network and one or more compatible communication devices. The communication sessions may comprise, for example, communication of data for carrying communications such as voice, electronic mail (email), text message, multimedia and/or content data and so on. Non-limiting examples of services provided comprise two-way or multi-way calls, data communication or multimedia services and access to a data network system, such as the Internet. In a wireless communication system at least a part of a communication session between at least two stations occurs over a wireless link.

A user can access the communication system by means of an appropriate communication device or terminal. A communication device of a user is often referred to as user equipment (UE) or user device. A communication device is provided with an appropriate signal receiving and transmitting apparatus for enabling communications, for example enabling access to a communication network or communications directly with other users. The communication device may access a carrier provided by a station or access point and transmit and/or receive communications on the carrier. The communication system and associated devices typically operate in accordance with a required standard or specification which sets out what the various entities associated with the system are permitted to do and how that should be achieved. Communication protocols and/or parameters which shall be used for the connection are also typically defined. One example of a communications system is UTRAN (3G radio). Another example is an architecture known as the long-term evolution (LTE) of the Universal Mobile Telecommunications System (UMTS) radio-access technology. Another example communication system is the so-called 5G radio or new radio (NR) access technology.

Summary

According to an aspect there is provided an apparatus comprising means for: receiving a request to deploy a data analytics function instance from an operator; causing the data analytics function instance to be deployed; determining that another data analytics function instance is to be deployed or is deployed and is to be reconfigured to provide input data to the data analytics function instance; and causing the other data analytics function instance to be deployed or to be reconfigured.

The request to deploy the data analytics function instance may comprise an input data template defining one or more input data type for the data analytics function instance and/or an output data template defining one or more output data type for the data analytics function instance.

The input data template and/or the output data template may comprise one or more measurement or analytics result.

The request to deploy the data analytics function instance may comprise an input data scope defining one or more input data source for the data analytics function instance and/or an output data scope defining one or more output data destination for the data analytics function instance. The input data scope and/or the output data scope may comprise one or more measurement source or data analytics function instance.

Determining that another data analytics function instance is to be deployed or is deployed and is to be reconfigured may be based on the input data template and the input data scope of the data analytics function instance.

Causing the data analytics function instance to be deployed may comprise providing at least part of a data analytics function deployment package to a virtual network function and/or cloud native network function orchestrator; and/or causing the other data analytics function instance to be deployed may comprise providing at least part of another data analytics function deployment package to the virtual network function and/or cloud-native network function orchestrator.

Causing the other data analytics function instance to be reconfigured may comprise causing the other data analytics function instance to modify the input data scope and/or output data scope of the other data analytics function instance.

The request to deploy the data analytics function instance may comprise a data analytics function deployment package identifier associated with a data analytics function deployment package.

The apparatus may comprise means for: receiving a request to register a data analytics function deployment package, the request to register the data analytics function deployment package comprising the data analytics function deployment package; associating a data analytics function deployment package identifier to the data analytics function deployment package; storing the data analytics function deployment package and the data analytics function deployment package identifier.

The data analytics function deployment package may comprise one or more profile, each profile comprises an input data template and an output data template. The data analytics function deployment package may comprise a data analytics function input and output descriptor file, wherein the data analytics function input and output descriptor file comprises one or more profile associated with one or more profile identifier, each profile comprises an input data template and an output data template.

The apparatus may comprise means for: storing the one or more profiles with the one or more profile identifier.

The request to deploy the data analytics function instance may comprise a profile identifier associated with one of the one or more profiles.

The apparatus may comprise means for: providing the data analytics function deployment package identifier to the operator.

The request to deploy the data analytics function instance may comprise the data analytics function deployment package identifier.

The apparatus may comprise means for: providing a uniform resource link of the data analytics function instance to the operator.

The apparatus may be a data analytics management platform.

The apparatus may be a data collection coordination function.

According to an aspect there is provided an apparatus comprising at least one processor and at least one memory including computer code for one or more programs, the at least one memory and the computer code configured, with the at least one processor, to cause the apparatus at least to: receive a request to deploy a data analytics function instance from an operator; cause the data analytics function instance to be deployed; determine that another data analytics function instance is to be deployed or is deployed and is to be reconfigured to provide input data to the data analytics function instance; and cause the other data analytics function instance to be deployed or to be reconfigured.

The request to deploy the data analytics function instance may comprise an input data template defining one or more input data type for the data analytics function instance and/or an output data template defining one or more output data type for the data analytics function instance.

The input data template and/or the output data template may comprise one or more measurement or analytics result.

The request to deploy the data analytics function instance may comprise an input data scope defining one or more input data source for the data analytics function instance and/or an output data scope defining one or more output data destination for the data analytics function instance.

The input data scope and/or the output data scope may comprise one or more measurement source or data analytics function instance.

Determining that another data analytics function instance is to be deployed or is deployed and is to be reconfigured may be based on the input data template and the input data scope of the data analytics function instance.

Causing the data analytics function instance to be deployed may comprise providing at least part of a data analytics function deployment package to a virtual network function and/or cloud native network function orchestrator; and/or causing the other data analytics function instance to be deployed may comprise providing at least part of another data analytics function deployment package to the virtual network function and/or cloud-native network function orchestrator. Causing the other data analytics function instance to be reconfigured may comprise causing the other data analytics function instance to modify the input data scope and/or output data scope of the other data analytics function instance.

The request to deploy the data analytics function instance may comprise a data analytics function deployment package identifier associated with a data analytics function deployment package.

The at least one memory and the computer code may be configured, with the at least one processor, to cause the apparatus at least to: receive a request to register a data analytics function deployment package, the request to register the data analytics function deployment package comprising the data analytics function deployment package; associate a data analytics function deployment package identifier to the data analytics function deployment package; store the data analytics function deployment package and the data analytics function deployment package identifier.

The data analytics function deployment package may comprise one or more profile, each profile comprises an input data template and an output data template.

The data analytics function deployment package may comprise a data analytics function input and output descriptor file, wherein the data analytics function input and output descriptor file comprises one or more profile associated with one or more profile identifier, each profile comprises an input data template and an output data template.

The at least one memory and the computer code may be configured, with the at least one processor, to cause the apparatus at least to: store the one or more profiles with the one or more profile identifier.

The request to deploy the data analytics function instance may comprise a profile identifier associated with one of the one or more profiles. The at least one memory and the computer code may be configured, with the at least one processor, to cause the apparatus at least to: provide the data analytics function deployment package identifier to the operator.

The request to deploy the data analytics function instance may comprise the data analytics function deployment package identifier.

The at least one memory and the computer code may be configured, with the at least one processor, to cause the apparatus at least to: provide a uniform resource link of the data analytics function instance to the operator.

The apparatus may be a data analytics management platform.

The apparatus may be a data collection coordination function.

According to an aspect there is provided an apparatus comprising circuitry configured to: receive a request to deploy a data analytics function instance from an operator; cause the data analytics function instance to be deployed; determine that another data analytics function instance is to be deployed or is deployed and is to be reconfigured to provide input data to the data analytics function instance; and cause the other data analytics function instance to be deployed or to be reconfigured.

The request to deploy the data analytics function instance may comprise an input data template defining one or more input data type for the data analytics function instance and/or an output data template defining one or more output data type for the data analytics function instance.

The input data template and/or the output data template may comprise one or more measurement or analytics result. The request to deploy the data analytics function instance may comprise an input data scope defining one or more input data source for the data analytics function instance and/or an output data scope defining one or more output data destination for the data analytics function instance.

The input data scope and/or the output data scope may comprise one or more measurement source or data analytics function instance.

Determining that another data analytics function instance is to be deployed or is deployed and is to be reconfigured may be based on the input data template and the input data scope of the data analytics function instance.

Causing the data analytics function instance to be deployed may comprise providing at least part of a data analytics function deployment package to a virtual network function and/or cloud native network function orchestrator; and/or causing the other data analytics function instance to be deployed may comprise providing at least part of another data analytics function deployment package to the virtual network function and/or cloud-native network function orchestrator.

Causing the other data analytics function instance to be reconfigured may comprise causing the other data analytics function instance to modify the input data scope and/or output data scope of the other data analytics function instance.

The request to deploy the data analytics function instance may comprise a data analytics function deployment package identifier associated with a data analytics function deployment package.

The apparatus may comprise circuitry configured to: receive a request to register a data analytics function deployment package, the request to register the data analytics function deployment package comprising the data analytics function deployment package; associate a data analytics function deployment package identifier to the data analytics function deployment package; store the data analytics function deployment package and the data analytics function deployment package identifier.

The data analytics function deployment package may comprise one or more profile, each profile comprises an input data template and an output data template.

The data analytics function deployment package may comprise a data analytics function input and output descriptor file, wherein the data analytics function input and output descriptor file comprises one or more profile associated with one or more profile identifier, each profile comprises an input data template and an output data template.

The apparatus may comprise circuitry configured to: store the one or more profiles with the one or more profile identifier.

The request to deploy the data analytics function instance may comprise a profile identifier associated with one of the one or more profiles.

The apparatus may comprise circuitry configured to: provide the data analytics function deployment package identifier to the operator.

The request to deploy the data analytics function instance may comprise the data analytics function deployment package identifier.

The apparatus may comprise circuitry configured to: provide a uniform resource link of the data analytics function instance to the operator.

The apparatus may be a data analytics management platform.

The apparatus may be a data collection coordination function.

According to an aspect there is provided a method comprising: receiving a request to deploy a data analytics function instance from an operator; causing the data analytics function instance to be deployed; determining that another data analytics function instance is to be deployed or is deployed and is to be reconfigured to provide input data to the data analytics function instance; and causing the other data analytics function instance to be deployed or to be reconfigured.

The request to deploy the data analytics function instance may comprise an input data template defining one or more input data type for the data analytics function instance and/or an output data template defining one or more output data type for the data analytics function instance.

The input data template and/or the output data template may comprise one or more measurement or analytics result.

The request to deploy the data analytics function instance may comprise an input data scope defining one or more input data source for the data analytics function instance and/or an output data scope defining one or more output data destination for the data analytics function instance.

The input data scope and/or the output data scope may comprise one or more measurement source or data analytics function instance.

Determining that another data analytics function instance is to be deployed or is deployed and is to be reconfigured may be based on the input data template and the input data scope of the data analytics function instance.

Causing the data analytics function instance to be deployed may comprise providing at least part of a data analytics function deployment package to a virtual network function and/or cloud native network function orchestrator; and/or causing the other data analytics function instance to be deployed may comprise providing at least part of another data analytics function deployment package to the virtual network function and/or cloud-native network function orchestrator. Causing the other data analytics function instance to be reconfigured may comprise causing the other data analytics function instance to modify the input data scope and/or output data scope of the other data analytics function instance.

The request to deploy the data analytics function instance may comprise a data analytics function deployment package identifier associated with a data analytics function deployment package.

The method may comprise: receiving a request to register a data analytics function deployment package, the request to register the data analytics function deployment package comprising the data analytics function deployment package; associating a data analytics function deployment package identifier to the data analytics function deployment package; storing the data analytics function deployment package and the data analytics function deployment package identifier.

The data analytics function deployment package may comprise one or more profile, each profile comprises an input data template and an output data template.

The data analytics function deployment package may comprise a data analytics function input and output descriptor file, wherein the data analytics function input and output descriptor file comprises one or more profile associated with one or more profile identifier, each profile comprises an input data template and an output data template.

The method may comprise: storing the one or more profiles with the one or more profile identifier.

The request to deploy the data analytics function instance may comprise a profile identifier associated with one of the one or more profiles.

The method may comprise: providing the data analytics function deployment package identifier to the operator. The request to deploy the data analytics function instance may comprise the data analytics function deployment package identifier.

The method may comprise: providing a uniform resource link of the data analytics function instance to the operator.

The method may be performed by a data analytics management platform.

The method may be performed by a data collection coordination function.

According to an aspect there is provided a computer program comprising computer executable code which when run on at least one processor is configured to: receive a request to deploy a data analytics function instance from an operator; cause the data analytics function instance to be deployed; determine that another data analytics function instance is to be deployed or is deployed and is to be reconfigured to provide input data to the data analytics function instance; and cause the other data analytics function instance to be deployed or to be reconfigured.

The request to deploy the data analytics function instance may comprise an input data template defining one or more input data type for the data analytics function instance and/or an output data template defining one or more output data type for the data analytics function instance.

The input data template and/or the output data template may comprise one or more measurement or analytics result.

The request to deploy the data analytics function instance may comprise an input data scope defining one or more input data source for the data analytics function instance and/or an output data scope defining one or more output data destination for the data analytics function instance. The input data scope and/or the output data scope may comprise one or more measurement source or data analytics function instance.

Determining that another data analytics function instance is to be deployed or is deployed and is to be reconfigured may be based on the input data template and the input data scope of the data analytics function instance.

Causing the data analytics function instance to be deployed may comprise providing at least part of a data analytics function deployment package to a virtual network function and/or cloud native network function orchestrator; and/or causing the other data analytics function instance to be deployed may comprise providing at least part of another data analytics function deployment package to the virtual network function and/or cloud-native network function orchestrator.

Causing the other data analytics function instance to be reconfigured may comprise causing the other data analytics function instance to modify the input data scope and/or output data scope of the other data analytics function instance.

The request to deploy the data analytics function instance may comprise a data analytics function deployment package identifier associated with a data analytics function deployment package.

The computer program may comprise computer executable code which when run on at least one processor is configured to: receive a request to register a data analytics function deployment package, the request to register the data analytics function deployment package comprising the data analytics function deployment package; associate a data analytics function deployment package identifier to the data analytics function deployment package; store the data analytics function deployment package and the data analytics function deployment package identifier. The data analytics function deployment package may comprise one or more profile, each profile comprises an input data template and an output data template.

The data analytics function deployment package may comprise a data analytics function input and output descriptor file, wherein the data analytics function input and output descriptor file comprises one or more profile associated with one or more profile identifier, each profile comprises an input data template and an output data template.

The computer program may comprise computer executable code which when run on at least one processor is configured to: store the one or more profiles with the one or more profile identifier.

The request to deploy the data analytics function instance may comprise a profile identifier associated with one of the one or more profiles.

The computer program may comprise computer executable code which when run on at least one processor is configured to: provide the data analytics function deployment package identifier to the operator.

The request to deploy the data analytics function instance may comprise the data analytics function deployment package identifier.

The computer program may comprise computer executable code which when run on at least one processor is configured to: provide a uniform resource link of the data analytics function instance to the operator.

The at least one processor may be part of a data analytics management platform.

The at least one processor may be part of a data collection coordination function.

According to an aspect, there is provided a computer readable medium comprising program instructions stored thereon for performing at least one of the above methods.

According to an aspect, there is provided a non-transitory computer readable medium comprising program instructions stored thereon for performing at least one of the above methods.

According to an aspect, there is provided a non-volatile tangible memory medium comprising program instructions stored thereon for performing at least one of the above methods.

In the above, many different aspects have been described. It should be appreciated that further aspects may be provided by the combination of any two or more of the aspects described above.

Various other aspects are also described in the following detailed description and in the attached claims.

List of abbreviations

AF: Application Function

AMF: Access Management Function

API: Application Programming Interface

BS: Base Station

CNF: Cloud-native/Containerized Network Function

C-Plane: Control Plane

CU: Centralized Unit

DCCF: Data Collection Coordination Function

DA: DCCF adaptor

DAF: Data Analytics Function

DAMP: Data Analytics Management Platform

DL: Downlink

DU: Distributed Unit

E2E: End to End

eNB: eNodeB

ETSI: European Telecommunications Standards Institute

gNB: gNodeB

GSM: Global System for Mobile communication

HSS: Home Subscriber Server

IMS: IP multimedia subsystem

IOC: Information Object Class

IoT: Internet of Things

KPI: Key Performance Indicator

LTE: Long Term Evolution

MAC: Medium Access Control

MDA: Management Data Analytics

MDAS: Management Data Analytics Service

MnS: Management Service

MOI: Managed Object Instance

M-Plane: Management Plane

MS: Mobile Station

MTC: Machine Type Communication

NEF: Network Exposure Function

NF: Network Function

NMS: Network Management System

NR: New radio

NRF: Network function Repository Function

NSI: Network Slice Instance

NSSI: Network Slice Subnet Instance

NWDAF: Network Data Analytics Function

OAM: Operation, Administration and Maintenance

OSS: Operation Support System

PDU: Packet Data Unit

PM: Performance Measurement

RAM: Random Access Memory

(R)AN: (Radio) Access Network

ROM: Read Only Memory

SDO: Standards Developing Organization

SMF: Session Management Function

SON: Self-Organised Network

TR: Technical Report

TS: Technical Specification

UE: User Equipment

UMTS: Universal Mobile Telecommunication System

U-Plane: User Plane

URL: Uniform Resource Link

USB: Universal Serial Bus

VNF: Virtualized Network Function

3CA: Third-Party consumer adaptor

3PA: Third-Party producer adaptor

3GPP: 3rd Generation Partnership Project

5G: 5th Generation

5GC: 5G Core network

5GS: 5G System

Brief Description of the Figures

Embodiments will now be described, by way of example only, with reference to the accompanying Figures in which:

Figure 1 shows a schematic representation of a 5G system;

Figure 2 shows a schematic representation of a control apparatus;

Figure 3 shows a schematic representation of a terminal;

Figure 4 shows a block diagram of an analytics pipeline comprising data analytics function instances and measurement sources;

Figure 5 shows a block diagram of a data collection framework discussed in [TR 23.700-91];

Figure 6 shows a block diagram of a data analytics management platform determining what input data is to be input to a data analytics function instance and what output data is to be output by the data analytics function instance;

Figure 7 shows a flow chart of a method performed by a data analytics management platform for deploying a data analytics function instance;

Figure 8 shows a signalling diagram of a process for registering a data analytics function instance deployment package;

Figure 9 shows a signalling diagram of a process for deploying a data analytics function instance;

Figure 10 shows a signalling diagram of a process for configuring a data analytics function instance;

Figure 11 shows a flow chart of a method performed by a data analytics management platform for deploying a data analytics function instance;

Figure 12 shows a flow chart of a method performed by a data collection and coordination function for deploying a data analytics function instance;

Figure 13 shows a block diagram of a method performed by a data analytics management platform for deploying a data analytics function instance; and

Figure 14 shows a schematic representation of a non-volatile memory medium storing instructions which when executed by a processor allow a processor to perform one or more of the steps of the method of Figure 13.

Detailed Description of the Figures

In the following, certain embodiments are explained with reference to mobile communication devices capable of communication via a wireless cellular system and mobile communication systems serving such mobile communication devices. Before explaining in detail the exemplifying embodiments, certain general principles of a wireless communication system, access systems thereof, and mobile communication devices are briefly explained with reference to Figures 1, 2 and 3 to assist in understanding the technology underlying the described examples.

Figure 1 shows a schematic representation of a 5G system (5GS). The 5GS may comprise a terminal, a (radio) access network ((R)AN), a 5G core network (5GC), one or more application functions (AF) and one or more data networks (DN).

The 5G (R)AN may comprise one or more gNodeB (gNB) distributed unit functions connected to one or more gNodeB (gNB) centralized unit functions.

The 5GC may comprise an access management function (AMF), a session management function (SMF), an authentication server function (AUSF), a unified data management (UDM), a user plane function (UPF) and/or a network exposure function (NEF). Although not illustrated, the 5GC may comprise other network functions (NF), such as a network address and/or port translation function (NAPTF), a network function repository function (NRF), a binding support function (BSF) or a data collection coordination function (DCCF).

Figure 2 illustrates an example of a control apparatus 200 for controlling a function of the (R)AN or the 5GC as illustrated on Figure 1. The control apparatus may comprise at least one random access memory (RAM) 211a, at least one read only memory (ROM) 211b, at least one processor 212, 213 and an input/output interface 214. The at least one processor 212, 213 may be coupled to the RAM 211a and the ROM 211b. The at least one processor 212, 213 may be configured to execute an appropriate software code 215. The software code 215 may, for example, allow one or more steps of one or more of the present aspects to be performed. The software code 215 may be stored in the ROM 211b. The control apparatus 200 may be interconnected with another control apparatus 200 controlling another function of the 5G (R)AN or the 5GC. In some embodiments, each function of the (R)AN or the 5GC comprises a control apparatus 200. In alternative embodiments, two or more functions of the (R)AN or the 5GC may share a control apparatus.

Figure 3 illustrates an example of a terminal 300, such as the terminal illustrated on Figure 1. The terminal 300 may be provided by any device capable of sending and receiving radio signals. Non-limiting examples comprise a user equipment, a mobile station (MS) or mobile device such as a mobile phone or what is known as a 'smart phone', a computer provided with a wireless interface card or other wireless interface facility (e.g., USB dongle), a personal data assistant (PDA) or a tablet provided with wireless communication capabilities, a machine-type communications (MTC) device, a Cellular Internet of Things (CIoT) device or any combinations of these or the like. The terminal 300 may provide, for example, communication of data for carrying communications. The communications may be one or more of voice, electronic mail (email), text message, multimedia, data, machine data and so on.

The terminal 300 may receive signals over an air or radio interface 307 via appropriate apparatus for receiving and may transmit signals via appropriate apparatus for transmitting radio signals. In Figure 3 transceiver apparatus is designated schematically by block 306. The transceiver apparatus 306 may be provided for example by means of a radio part and associated antenna arrangement. The antenna arrangement may be arranged internally or externally to the mobile device.

The terminal 300 may be provided with at least one processor 301, at least one ROM 302a, at least one RAM 302b and other possible components 303 for use in software and hardware aided execution of tasks it is designed to perform, including control of access to and communications with access systems and other communication devices. The at least one processor 301 is coupled to the ROM 302a and the RAM 302b. The at least one processor 301 may be configured to execute an appropriate software code 308. The software code 308 may, for example, allow one or more of the present aspects to be performed. The software code 308 may be stored in the ROM 302a.

The processor, storage and other relevant control apparatus can be provided on an appropriate circuit board and/or in chipsets. This feature is denoted by reference 304. The device may optionally have a user interface such as keypad 305, touch sensitive screen or pad, combinations thereof or the like. Optionally one or more of a display, a speaker and a microphone may be provided depending on the type of the device.

One or more aspect of this disclosure relates to managing data analytics function (DAF) instances in mobile networks. A DAF may refer to a software function performing data analytics on input data and producing an analytics result as output data.

One or more aspect of this disclosure relates to managing DAF instances in 5G mobile networks but it will be understood that one or more aspect may be adapted to mobile networks with former radio access technologies and to other domains or cross-domain environments (e.g. industrial networks with correlated IoT application data and network data analytics).

In 5G mobile networks, automation may be more important than ever due to the increased complexity and the wide range of services provided. Advanced data collection and analytics may be useful enablers for automation and therefore this may be a key area for both standardization and research.

Analytics for automation (e.g. the analytics required for either fully automated or supervised end to end (E2E) services) may not be computed by a single DAF instance but by complex analytics pipelines built from several DAF instances (i.e. building blocks). In the future, these analytics pipelines may be deployed dynamically and from any DAF instance. That is, the deployment and integration of these analytics pipelines may be seamless regardless of the actual functionality of the DAF instances.

Figure 4 shows a block diagram of an analytics pipeline comprising DAF instances DAF#1, DAF#2, DAF#3 and DAF#4 and measurement sources.

An analytics pipeline may be described as a directed graph where the nodes with only outgoing arrows represent measurement sources (light grey boxes on Figure 4) and nodes having incoming arrows represent DAF instances (dark grey boxes on Figure 4). The arrows represent data provided by a source node to a destination node. The data provided by a source node to a destination node is referred to as output data for the source node and input data for the destination node.

The DAF instances DAF#2, DAF#3 and DAF#4 may perform data analytics on input data measured for specific types of network functions (NF). The input data may comprise a time series of performance measurements (PM).

For example, the DAF instance DAF#2 may perform data analytics on PM for gNBs. The DAF instance DAF#2 may build a gNB state model. For each gNB of a set of gNBs the DAF instance DAF#2 may receive a same set of PM per timestamp. The DAF instance DAF#2 may use the set of PM per timestamp to determine an abstract gNB state per time stamp. The DAF instance DAF#2 may output the abstract gNB state per timestamp.

The DAF instances DAF#3 and DAF#4 may operate similarly but instead of performing data analytics on PM for gNBs, they may perform data analytics on PM for N3 interfaces (i.e. user plane interfaces between gNBs and UPFs) and for UPFs respectively. The DAF instance DAF#1 on the other hand may be one level higher in the DAF hierarchy than the DAF instances DAF#2, DAF#3 and DAF#4. The DAF instances DAF#2, DAF#3 and DAF#4 may be referred to as lower-level DAFs. The DAF instance DAF#1 may be referred to as a higher-level DAF.

The DAF instance DAF#1 may receive the abstract states from the DAF instances DAF#2, DAF#3 and DAF#4 along with direct E2E measurements. The E2E measurements may for example comprise delay measurements between UEs and UPFs. The E2E measurements may have a given E2E scope (e.g. E2E connections handled by a given set of UPFs, E2E connections for a given group of UEs, etc.).

The DAF instance DAF#1 may find anomalies in the E2E measurements. The DAF instance DAF#1 may correlate the anomalies in the E2E measurements to the gNB abstract states, the N3 interface abstract states and the UPF abstract states provided by the DAF instances DAF#2, DAF#3 and DAF#4. In this way, root cause analysis for the E2E anomalies may be facilitated.

Input data provided to a DAF instance may be referred to as data dependencies of the DAF instance. For example, the data dependencies for the DAF instances DAF#2, DAF#3 and DAF#4 are respectively the per-gNB PM measurements, the per-N3 interface PM measurements and the per-UPF PM measurements.

The data dependencies of the DAF instance DAF#1 may comprise the E2E measurements, the gNB abstract states, the N3 interface abstract states and the UPF abstract states provided by the DAF instances DAF#2, DAF#3 and DAF#4.
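
To make the directed-graph view of Figure 4 concrete, the following sketch models the example pipeline as a small Python structure in which measurement sources have only outgoing edges and DAF instances have incoming edges. The node and measurement names are illustrative assumptions only and are not taken from the disclosure.

```python
# Minimal sketch of the Figure 4 analytics pipeline as a directed graph.
# Node names are illustrative assumptions, not part of the disclosure.

pipeline = {
    # source node -> destination nodes (output data of the source is
    # input data of the destination)
    "gNB PM source":    ["DAF#2"],
    "N3 PM source":     ["DAF#3"],
    "UPF PM source":    ["DAF#4"],
    "E2E delay source": ["DAF#1"],
    "DAF#2": ["DAF#1"],   # gNB abstract states
    "DAF#3": ["DAF#1"],   # N3 interface abstract states
    "DAF#4": ["DAF#1"],   # UPF abstract states
    "DAF#1": [],          # top of the hierarchy, no further consumer shown
}

def data_dependencies(node: str) -> list[str]:
    """Return the direct data dependencies (input providers) of a node."""
    return [src for src, dsts in pipeline.items() if node in dsts]

if __name__ == "__main__":
    for daf in ("DAF#1", "DAF#2", "DAF#3", "DAF#4"):
        print(daf, "depends on", data_dependencies(daf))
```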

Figure 4 illustrates an example but any analytics pipeline may be modelled as a similar directed graph. Different deployment scenarios may be possible to deploy (i.e. install and configure) such a data analytics pipeline. One option may be that the operator is aware of the structure of the analytics pipeline and deploys the measurement sources and DAF instances one-by-one in the right order so that data dependencies are available at the time when a given DAF instance is deployed. This approach has two main problems.

On the one hand, the complexity of expected analytics pipelines in the future may make this process very hard and error prone. On the other hand, the operator's needs for different levels within the analytics pipelines may evolve organically rather than being pre-planned. For example, the operator may decide for some reason to deploy a higher-level DAF instance like the one provided by DAF#1 in the example of Figure 4 without being aware of what part of the whole analytics pipeline is already deployed and what part is not. In addition, in order to enable an open software market ecosystem, an operator should be able to easily replace one vendor's implementation of a specific data analytics functionality with another vendor's implementation of the same specific data analytics functionality. Thus, deploying DAF instances should not depend in any way on the specific implementation of other DAF instances deployed.

One or more aspect of this disclosure relates to deploying DAF instances.

One or more aspect of this disclosure relates to automating the deployment of DAF instances.

One or more aspect of this disclosure relates to starting (i.e. running) DAF instances.

One or more aspect of this disclosure relates to deploying or reconfiguring measurement sources and/or other DAF instances required to provide input data to a DAF instance. The input data may comprise performance measurements and/or analytics results.

One or more aspect of this disclosure relates to connecting the DAF instance to the measurement sources and/or other DAF instances. The deploying of the DAF instance may be agnostic to the actual functionality that the DAF instance is performing. The deploying of the DAF instance may support the organic evolution of the operator’s analytics need. The deploying of the DAF instance may support swapping different implementations of the same functionality. The deploying of the DAF instance may be intent-based meaning that the operator may be relieved from any integration effort that can be done automatically. In this way, the role of the operator may be reduced to a minimum.

There is no existing DAF instance deploying solution addressing the above problems.

On the products side, data analytics may be part of large network management system (NMS) or centralized self-organised network (SON) software products. Connecting these software products to data sources in a network may be a manual integration process. The data sources may themselves be large NMS or operations support system (OSS).

On the standardization side, there may be related standards in the SA5 and SA2 working groups of 3GPP. These are briefly described below.

The NRM standard [3GPP TS 28.541] defines the object-oriented model of 5G networks for management purposes in the form of information object class (IOC) definitions and their relations. It is built on the generic NRM model classes defined in [3GPP TS 28.622]. An instance of an IOC is called a managed object instance (MOI) and represents an actual managed element of the network of the type described by the IOC. For example, the IOC for UPF functionality is named UPFFunction, and the UPF functionality of each UPF network function in the network is modelled as a separate MOI of this IOC. According to the hierarchical relationships of the IOCs as defined by the NRM, the MOI objects form a tree. The distinguished name (DN) syntax for identifying a MOI in this tree is defined in [3GPP TS 32.300].
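
As a purely illustrative aid, the sketch below shows how MOIs of different IOCs could be referenced by distinguished names following the general "ClassName=value,..." style of [3GPP TS 32.300]; the concrete identifiers are invented, not taken from the disclosure or from a live network.

```python
# Hypothetical MOIs identified by distinguished names (DN). The instance
# identifiers are invented; only the general "IOC=instance" component style
# follows [3GPP TS 32.300].
mois = [
    "SubNetwork=SN1,ManagedElement=ME1,GNBDUFunction=DU1",
    "SubNetwork=SN1,ManagedElement=ME2,UPFFunction=UPF1",
    "SubNetwork=SN1,ManagedElement=ME3,UPFFunction=UPF2",
]

# Group MOIs by their IOC (the class name of the last DN component).
by_ioc: dict[str, list[str]] = {}
for dn in mois:
    ioc = dn.rsplit(",", 1)[-1].split("=")[0]
    by_ioc.setdefault(ioc, []).append(dn)

print(by_ioc)  # e.g. {'GNBDUFunction': [...], 'UPFFunction': [...]}
```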

SA5 defines standard measurements on a per-IOC basis. These performance measurements are standardized in [TS 28.552]. [TS 28.552] specifies the different measurements per IOC. [TS 28.552] also specifies a standardized measurement name for each of them. [TS 28.554] is similar for key performance indicators (KPI). [TS 28.554] defines E2E KPIs for specific IOCs, such as higher-level IOCs (e.g. NetworkSliceSubnet). The KPIs also have a standard name.

SA5 has also standardized application programming interfaces (API) (services in the terminology of SA5) for starting and stopping measurements. The consumer of the service specifies the list of standard measurement names ([TS 28.552], [TS 28.554]) and the list of DNs identifying the measured objects (MOIs) for which the measurements need to be started. The job control APIs have two variants: the dedicated API is defined in [TS 28.550] and the control NRM fragment for performance job control is defined in [TS 28.622]. There is also a standard for trace job control, with a control NRM fragment ([TS 28.622]).
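
A hedged sketch of what such a performance measurement job request could carry is given below: a list of measurement names plus the DNs of the measured MOIs. The field names, measurement names and values are illustrative placeholders rather than the exact payloads standardized in [TS 28.550]/[TS 28.622].

```python
# Illustrative (non-normative) sketch of a performance measurement job
# request. Field and measurement names are placeholder assumptions.
perf_job_request = {
    "measurementNames": [          # names in the style of [TS 28.552]
        "assumed.drb.throughput.dl",
        "assumed.rrc.conn.mean",
    ],
    "objectInstances": [           # DNs of the MOIs to be measured
        "SubNetwork=SN1,ManagedElement=ME1,GNBDUFunction=DU1",
        "SubNetwork=SN1,ManagedElement=ME4,GNBDUFunction=DU2",
    ],
    "granularityPeriod": 900,      # seconds, assumed value
}

print(perf_job_request)
```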

SA5 defines the interfaces between the producers and consumers of the measurements and traces for the actual transfer of the data in standards [TS 28.550] and [TS 28.532]. There are file-based and streaming reporting options. SA5 standards specify all aspects of the interaction between producers and consumers, including the operations and the file formats and stream serialization formats.

In summary, the above SA5 performance measurement standards, together with the NRM, give the basic enablers and building blocks for standardized performance measurement collection. However, they do not solve the problem of automatic deployment and integration of DAF instances.

While the basic SA5 enablers described above are well defined, the standardization of analytics based on the collected measurements is still under discussion in SA5.

The standard [TS 28.550] contains two general use cases for management data analytics on how the provider of a management data analytics service (MDAS) can provide analytics results to the consumer of that service. However, the use cases already specify that the analytics are for network slice instance (NSI) or network slice subnet instance (NSSI) level (5.1.5.1) or network level (5.1.5.2), limiting the scope of analytics. On the other hand, step 2 of these two use cases specifies that it is the responsibility of the provider of the MDAS (i.e. the DAF instance) to resolve data dependencies itself. The DAF, using the SA5 APIs for measurement control, needs to determine what measurements are already collected, start the missing measurement jobs and connect itself to the producers of these measurements.

In other words, the current SA5 approach for dependency resolution is to delegate it to the running DAF itself. There might be issues with this approach.

There might be no visibility on analytics pipelines. As one DAF instance in isolation is not aware of which analytics pipelines it is part of, the automated coordination of multi-level pipeline DAF instances may not be achieved this way.

The operator may not want to give automated control to the DAF product of some vendor over what data is collected in the network. This is not an issue nowadays because there is no automated control and all data collection is started as part of a manual integration process inherently supervised by the operator.

The software development efficiency may be affected. Delegating dependency resolution to the DAF instance may complicate and slow down the software development.

There is a Rel-17 work item in 3GPP on the MDAS topic FS_eMDAS (Study on enhancement of Management Data Analytics Service). There is also a TR report [TR 28.809]. Currently it is collecting concrete analytics service use cases, defining for each use case the input data and what the analytics service is producing. Thus, it is not targeting general DAF instances. Furthermore, it is not covering the deployment and dependency resolution of DAF instances.

A network data analytics function (NWDAF) is part of the 5G architecture specified in [TS 23.501]. It can receive data from 5G NFs via interfaces defined by SA2 and is allowed to consume SA5 measurements. Based on these inputs, it provides analytics results to consumers via service-based interfaces standardized by SA2. The analytics services provided by the NWDAF are standardized in [TS 23.288]. These NWDAF standards are also not agnostic to the actual analytics. All the possible analytics are standardized in [TS 23.288], specifying the input data and the output data. In addition, the approach for dependency resolution is the same as for the SA5 MDAS provider: it is the responsibility of the running DAF instance (in this case the NWDAF instance), so the same issues apply as described above for the SA5 MDAS provider.

A concept of a programmable data collection framework has been submitted as a contribution to SA2 #139E for FS_eNA_Ph2. The contribution has been accepted and is already included in the technical report [TR 23.700-91]. The functional architecture of the proposed data collection framework is shown in Figure 5.

The scope of the data collection framework is to enable the collection and distribution of any data (operational, trace, event, control plane (C-plane), management plane (M-plane), etc., except for the user plane (U-plane) packet data unit (PDU) session data) from data sources to data consumers. Data sources are dynamically discovered by the data collection framework based on requests of data consumers. A messaging framework is used to efficiently distribute data from data sources to data consumers so that a data source does not need to replicate data towards multiple data consumers. In fact, data sources are not even aware of the location and identity of the data consumers. This is handled by the data collection framework. A data collection and coordination function (DCCF) controls the data sources and data consumers. The DCCF exposes available data to potential data consumers, receives requests for data from data consumers, triggers the production of data at data sources and dynamically configures the messaging framework to route and replicate data from data sources to data consumers. The DCCF itself does not handle data but is aware of the bindings between data sources and data consumers.

The data collection framework includes adaptors on the data source and data consumer side. The adaptors comprise a 3rd party producer adaptor (3PA) that interfaces with its associated data source(s) to collect data using protocols and mechanisms suiting the data source(s). The adaptors also comprise a 3rd party consumer adaptor (3CA) that interfaces with its associated data consumer(s) to provide data using protocols and mechanisms suiting the data consumer(s). Such protocols and mechanisms may be standardized. In general, the 3PA and 3CA have the role of adapting the interfaces of the data source(s) and data consumer(s) to the interface of the messaging framework.
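
To illustrate the coordination role described above (and not the normative [TR 23.700-91] interfaces, whose message names are not reproduced here), the sketch below shows a toy DCCF that records source-to-consumer bindings and derives a routing configuration for the messaging framework, so that each source produces its data only once and the framework replicates it to all interested consumers. All class, function and data-type names are invented for illustration.

```python
from collections import defaultdict

class ToyDCCF:
    """Toy model of a DCCF: tracks which consumers need which data and
    derives messaging-framework routes; it never touches the data itself."""

    def __init__(self) -> None:
        self.bindings: dict[str, set[str]] = defaultdict(set)  # data type -> consumers

    def request_data(self, consumer: str, data_type: str) -> None:
        # A consumer asks the DCCF for a data type; the DCCF records the binding.
        self.bindings[data_type].add(consumer)

    def routing_config(self, sources: dict[str, str]) -> dict[str, list[str]]:
        # sources: data type -> producing source (dynamically discovered).
        # One route per source; the messaging framework replicates to consumers.
        return {sources[dt]: sorted(consumers)
                for dt, consumers in self.bindings.items() if dt in sources}

dccf = ToyDCCF()
dccf.request_data("NWDAF-1", "gNB PM")
dccf.request_data("NWDAF-2", "gNB PM")
print(dccf.routing_config({"gNB PM": "3PA-gNB"}))
# {'3PA-gNB': ['NWDAF-1', 'NWDAF-2']}
```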

The solution proposal #9 for a data management framework in [TR 23.700-91] moves the responsibility of resolving the data dependencies from the NWDAF to the DCCF, once the NWDAF sends a detailed request for the input data to the DCCF. However, it still assumes that the DAF instance (NWDAF instance) is already deployed/running and knows its detailed input data requirement (possibly from operator configuration). It does not provide a solution for the automatic deployment of the DAF instance.

One or more aspects of this disclosure relate to a mechanism for automatically resolving data dependencies and deploying a DAF instance. The mechanism may be implemented by an operation administration and management (OAM) system of the operator rather than by a DAF instance.

A functional part of the OAM system responsible for the dependency resolution and deployment of DAF instances may be referred to as a data analytics management platform (DAMP).

One or more aspects of this disclosure relate to the provision of an input data template and an output data template for a DAF instance to a DAMP. The input data template may define an input data type for the DAF instance. The input data type may comprise one or more measurement or analytics result. The output data template may define an output data type for the DAF instance. The output data type may comprise one or more analytics result. The input data template and the output data template may be provided by a DAF vendor as part of a DAF deployment package. The input data template and the output data template may be selected by an operator.

One or more aspects of this disclosure relate to the provision of an input data scope and an output data scope for a DAF instance. The input data scope may define an input data source for the DAF instance. The input data scope may comprise one or more measurement source or DAF instance. The output data scope may define an output data destination for the DAF instance. The output data scope may comprise one or more measurement source or DAF instance. The input data scope and the output data scope may comprise one or more NF instance (e.g. DAF instance), RAN instance (e.g. cell instance) or other. The input data scope and the output data scope may be provided by an operator to a DAMP.

The DAMP may combine the input data template and the input data scope for the DAF instance to determine what input data is to be input to a DAF instance.

The DAMP may combine the output data template and the output data scope for the DAF instance to determine what output data is to be output by the DAF instance.

The DAMP may perform automated deployment of the DAF instance. The DAMP may make sure that all the required input data is available by recursively deploying or reconfiguring other DAF instances from which the DAF instance directly or indirectly requires input data.

Figure 6 shows a schematic representation of a block diagram of a mechanism to allow a DAMP to determine what input data is to be input to a DAF instance and what output data is to be output by the DAF instance before deploying the DAF instance.

In steps 1.a and 1.b a DAF deployment package may contain one or more input data template and one or more output data template for a DAF instance. These one or more input data template and one or more output data template may define the required and the produced data not on the level of concrete measured/analysed objects but on the level of IOCs. For example, each input data template may define what type of input data is needed rather than which entities are providing it. These one or more input data template and one or more output data template may be created by the DAF vendor at development time. These one or more input data template and one or more output data template may be a mandatory part of the DAF deployment package provided by the DAF vendor.

For the sake of flexibility, several input data templates and output data templates may be allowed. The DAF deployment package may comprise a DAF input and output descriptor file provided by the vendor. The DAF input and output descriptor file may contain the input data templates and output data templates and a list of profiles. Each profile may comprise a pair of an input data template and an output data template. A DAF instance may be deployed according to a selected profile, that is, according to an input data template and an output data template. Using the terminology of SA5, the input data template and the output data template may comprise one or more IOCs.
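By way of a non-limiting illustration only, the following Python sketch shows one possible in-memory representation of such a descriptor with profiles. The class and field names (e.g. DafInputOutputDescriptor, profiles) and the measurement and schema identifiers used below are assumptions made for the sake of the example and are not taken from any standard or from an actual DAF input and output descriptor file format.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DataTemplate:
    # Maps an IOC name to the data identifiers (measurement or analytics
    # result identifiers) required or produced for instances of that IOC.
    entries: Dict[str, List[str]] = field(default_factory=dict)

@dataclass
class Profile:
    # A profile pairs one input data template with one output data template.
    profile_id: str
    input_template: DataTemplate
    output_template: DataTemplate

@dataclass
class DafInputOutputDescriptor:
    # Vendor-provided descriptor carried inside the DAF deployment package.
    daf_name: str
    profiles: List[Profile] = field(default_factory=list)

# Example: a descriptor with a single profile that consumes cell-level
# measurements and produces an intermediate analytics result (all identifiers
# below are illustrative assumptions).
descriptor = DafInputOutputDescriptor(
    daf_name="example-coverage-daf",
    profiles=[
        Profile(
            profile_id="profile-1",
            input_template=DataTemplate(
                entries={"NRCellDU": ["RRC.ConnEstabSucc", "RRC.ConnEstabAtt"]}),
            output_template=DataTemplate(
                entries={"NRCellDU": ["vendor-x.example.com/coverage-analytics.schema"]}),
        )
    ],
)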

In step 2, the operator may define the input data scope and the output data scope for the DAF instance. Using the terminology of SA5, the input data scope and the output data scope may comprise one or more MOIs. The input data scope and the output data scope may define the actual set of measured/analysed objects. The input data scope and the output data scope may define the scope of a DAF instance.

The DAMP may receive the input data template and the output data template from the DAF deployment package. The DAMP may receive the input data scope and the output data scope from the operator.

In step 3a, the DAMP may combine the input data template and the input data scope and may determine what input data is required by the DAF instance.

In step 3b, the DAMP may combine the output data template and the output data scope and may determine what output data is to be produced by the DAF instance.
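As a minimal, non-limiting sketch of steps 3a and 3b, the following Python function combines a template (here assumed to be a mapping from IOC names to data identifiers) with a scope (here assumed to be a list of MOIs, each carrying its IOC name and distinguished name) into concrete data items per object instance. The helper name and the data representation are illustrative assumptions only.

from typing import Dict, List, Tuple

def combine(template: Dict[str, List[str]],
            scope: List[Dict[str, str]]) -> List[Tuple[str, str]]:
    """Return (distinguished name, data identifier) pairs for every MOI in
    the scope whose IOC appears in the template."""
    concrete = []
    for moi in scope:
        for data_id in template.get(moi["ioc"], []):
            concrete.append((moi["dn"], data_id))
    return concrete

# Illustrative example: two cell instances in the scope, one template entry.
input_template = {"NRCellDU": ["RRC.ConnEstabSucc"]}
input_scope = [
    {"ioc": "NRCellDU", "dn": "SubNetwork=1,ManagedElement=1,GNBDUFunction=1,NRCellDU=1"},
    {"ioc": "NRCellDU", "dn": "SubNetwork=1,ManagedElement=1,GNBDUFunction=1,NRCellDU=2"},
]
print(combine(input_template, input_scope))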

The input data template may contain IOCs. The input data template may contain an input data identifier for each IOC. The input data identifier may be a measurement identifier set out in standardized measurement definitions and/or an intermediate analytics result identifier set out in publicly available data schema documents.

The input data scope may contain MOIs from which the input data is to be provided.

Likewise, the output data template may contain IOCs. The output data template may contain an output data identifier for each IOC. The output data identifier may be an intermediate analytics result identifier set out in publicly available data schema documents.

The output data scope may contain MOIs to which the output data is to be provided.

The role of the operator may be as follows. The operator may select a DAF deployment package. The operator may select a profile comprising an input data template and an output data template for a DAF instance to be deployed. The operator may provide the profile comprising the input data template and the output data template to the DAMP. The operator may provide the input data scope and the output data scope.

The DAMP may then automatically deploy the DAF instance and all data dependencies accordingly. The DAMP may execute the steps of the flow chart shown on Figure 7, recursively resolving data dependencies if necessary.

Via this recursive process, the DAMP may deploy a whole data analytics pipeline comprising a DAF instance, its direct or indirect DAF dependencies, and the measurements required either by the given DAF or by its direct or indirect DAF dependencies. For the actual deployment, the DAMP may hand over the DAF deployment package to a virtualized network function (VNF) and/or cloud-native/containerized network function (CNF) orchestrator of the operator's system.

Figure 7 shows a schematic representation of a flow chart of a method performed by a DAMP for deploying a DAF instance. The DAMP may receive the input data template and the output data template from the DAF deployment package.

The DAMP may receive the input data scope and the output data scope from the operator for an operator-initiated DAF instance deployment. In the event of a recursively initiated DAF instance deployment, the DAMP may have previously received the input data scope and the output data scope from the operator.

The DAMP may combine the input data template, the output data template, the input data scope and the output data scope to specifically determine what input data is to be input to a DAF instance and what output data is to be output by the DAF instance. The input data to be input to the DAF instance may comprise measurements and analytics results.

The DAMP may hand over the DAF deployment package to the VNF and/or CNF orchestrator to deploy and start (i.e. activate) the DAF instance.

For each measurement, the DAMP may find a measurement source that is already deployed and started to provide the measurement to the DAF instance. The DAMP may reconfigure the measurement source. The DAMP may connect the DAF instance to the measurement source.

Alternatively, the DAMP may deploy and start a measurement source to provide the measurement to the DAF instance. The DAMP may connect the DAF instance to the measurement source.

For each analytics result, the DAMP may find another DAF instance that is already deployed and started and that provides the analytics result to the DAF instance. The DAMP may reconfigure the other DAF instance. The DAMP may connect the DAF instance to the other DAF instance. Alternatively, the DAMP may deploy and start another DAF instance from a registered DAF deployment package by recursively performing the flow chart of Figure 7. The DAMP may connect the DAF instance to the other DAF instance.

The DAF deployment package may be standardized. One option may be to use the VNF packaging standardized by the European Telecommunications Standards Institute (ETSI) Network Function Virtualization (NFV) in the specification [ETSI GS NFV-SOL 004]. It may be a zip file with a standardized internal structure and a standardized location and format for the files describing the package.

A DAF input and output descriptor file format may be standardized. The DAF input and output descriptor file format may contain the input templates and output templates and the profiles as allowed combinations thereof.

The way in which the DAMP may locate the DAF input and output descriptor file in the package may also be standardized. For example, the DAF deployment package may include the DAF input and output descriptor file as a non-MANO artefact (section 4.3.7 in [ETSI GS NFV-SOL 004]). For the DAMP, only the DAF input and output descriptor file may be of interest in the DAF deployment package. The remaining files of the DAF deployment package may be transparent to the DAMP. The remaining files of the DAF deployment package may be interpreted by the VNF and/or CNF orchestrator to which the DAMP passes the DAF deployment package for DAF instance deployment.

There may be two types of entries, referred to as type I and type II entries, that may be used in input data templates. An input data template may contain both type I and type II entries, while an output data template may contain only type II entries.

The type I entry serves for the definition of a data dependency on direct measurements. A type I entry may associate a specific standard IOC name (SA5 NRM standards) with a list of standard measurement names for that IOC (SA5 measurement standards). The type II entry serves for defining analytics results, either as an input data dependency if it is included in an input data template or as an output data dependency if it is used in an output data template. A type II entry may associate one standard IOC name, or in general a set of IOC names, with a reference to a publicly available data model scheme document. The data model scheme document may define what data constitutes the analytics result per object instance of the specific IOC or per object instances of the specific set of IOCs. This way, a specific analytics result may not need to be standardized, but its formal syntactical definition may be made available publicly by the DAF vendor along with the documentation of the semantics of the data. Alternatively, standards developing organizations (SDOs) can also publish documents for commonly used analytics. Possible standardized options for the data scheme format are: ASN.1, YANG, IDL, XML schema, etc.

Both types of entries may contain other parameters for measurement control and/or data transfer.

These parameters may comprise a granularity period. The granularity period may be a time periodicity of the input data or output data described by the entry. The granularity period may be different for different input data entries of the same input data template. The granularity period of input data entries may be different from the granularity period of output data entries (e.g. the latter can be an integer multiple of the former).

These parameters may comprise a reporting option (SA5). The reporting option may comprise file-based or streaming reporting, a file format and/or a stream serialization format.
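The following Python sketch illustrates, in a non-limiting way, how type I and type II entries together with the granularity period and reporting option parameters could be represented. All class names, default values and identifiers below are assumptions made for illustration and do not reproduce any standardized format.

from dataclasses import dataclass
from typing import List

@dataclass
class TypeIEntry:
    # Data dependency on direct measurements: one standard IOC name and the
    # standard measurement names requested for that IOC.
    ioc_name: str
    measurement_names: List[str]
    granularity_period_s: int = 900          # illustrative default
    reporting_option: str = "file-based"     # e.g. "file-based" or "streaming"

@dataclass
class TypeIIEntry:
    # Analytics result: a set of IOC names and a reference to a publicly
    # available data model scheme document (ASN.1, YANG, IDL, XML schema, etc.).
    ioc_names: List[str]
    data_scheme_reference: str
    granularity_period_s: int = 900
    reporting_option: str = "streaming"

# Illustrative entries; the identifiers below are assumptions, not standard values.
measurement_dependency = TypeIEntry(
    ioc_name="NRCellDU",
    measurement_names=["RRC.ConnEstabSucc", "RRC.ConnEstabAtt"])
analytics_dependency = TypeIIEntry(
    ioc_names=["NRCellDU"],
    data_scheme_reference="https://vendor-x.example.com/coverage-analytics.yang")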

A simple way to specify the input data scope and the output data scope, that is the MOIs to which the DAF instance is to be applied, is to give a list of distinguished names (DNs). As the SA5 MOIs form an object tree according to the IOC hierarchy and rules defined in the NRM, and the IOCs have standardized attributes, query language expressions applied to this tree may also be used to specify objects as an alternative to, or in combination with, simple DN listing. One example of a query language for selecting objects from a tree hierarchy is XPath, defined by the W3C.
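As a non-limiting illustration of scope selection by query expression, the sketch below applies an XPath-style path expression to a small XML rendering of an NRM-like object tree using the Python standard library. The element and attribute names are assumptions made for the example and do not reproduce the SA5 NRM.

import xml.etree.ElementTree as ET

# A toy object tree loosely mimicking an IOC hierarchy (illustrative only).
tree = ET.fromstring("""
<SubNetwork id="1">
  <ManagedElement id="1">
    <GNBDUFunction id="1">
      <NRCellDU id="1" cellState="ACTIVE"/>
      <NRCellDU id="2" cellState="INACTIVE"/>
    </GNBDUFunction>
  </ManagedElement>
</SubNetwork>
""")

# Select all active cells with an XPath-style expression instead of listing
# their distinguished names explicitly.
selected = tree.findall(".//NRCellDU[@cellState='ACTIVE']")
for cell in selected:
    print("selected NRCellDU id:", cell.get("id"))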

One or more aspects of this disclosure relate to three procedures: a DAF deployment package registering procedure (also referred to as DAF onboarding procedure), a DAF instance deployment procedure (also referred to as DAF instance activation procedure) and a DAF instance configuration procedure (also referred to as DAF instance scope update procedure).

Figure 8 shows a schematic representation of a signalling diagram of a process to register a DAF deployment package.

The purpose of the DAF deployment package registering procedure (also referred to as DAF onboarding procedure) is the registration of the DAF deployment package and its available profiles in advance of the deployment of the DAF instance. The deployment of the DAF instance is performed subsequently via the DAF instance deployment procedure (also referred to as DAF instance activation procedure). Separating the DAF deployment package registration from the DAF instance deployment enables scenarios where the DAMP deploys a DAF instance automatically when another DAF instance requires the output data from the DAF instance as input data.

The operator may acquire the DAF deployment package from the DAF vendor. For example, the DAF deployment package may be in the format of the VNF packaging standardized by [ETSI GS NFV-SOL 004]. The DAF deployment package may include the DAF input and output descriptor file describing the input and output templates and profiles, in a standardized location of the DAF deployment package internal directory hierarchy.

The operator may send a request for registering the DAF deployment package. The operator may send the request via a standardized DAMP interface for registering the DAF deployment package. The operator may pass the DAF deployment package as a parameter.

The DAMP may find the DAF input and output descriptor file in the DAF deployment package. The DAMP may ignore other files. The DAMP may store the DAF deployment package in a register along with the profiles included in the DAF input and output descriptor file and a DAF deployment package identifier.

The DAMP may allocate the DAF deployment package identifier to the DAF deployment package. The DAF deployment package identifier may be provided to the operator and may be used by the operator in the DAF instance deployment procedure (also referred to as DAF instance activation procedure) to identify the DAF deployment package.
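A minimal, non-limiting sketch of the DAMP-side handling of the registration request may look as follows, assuming (purely for illustration) that the descriptor is a JSON file with a known name inside the zip-format package and that the register is an in-memory dictionary; none of these assumptions is part of any standard.

import json
import uuid
import zipfile
from typing import Dict

# Illustrative in-memory registers kept by the DAMP.
PACKAGE_REGISTER: Dict[str, bytes] = {}   # package identifier -> raw package bytes
PROFILE_REGISTER: Dict[str, dict] = {}    # package identifier -> parsed descriptor

def register_daf_package(package_path: str,
                         descriptor_name: str = "daf_io_descriptor.json") -> str:
    """Register a DAF deployment package and return its allocated identifier.

    The descriptor file name and JSON encoding are assumptions for this
    sketch; in practice the location would follow the packaging standard.
    """
    with zipfile.ZipFile(package_path) as package:
        # Locate the DAF input and output descriptor file; other files in the
        # package are transparent to the DAMP.
        descriptor_member = next(
            name for name in package.namelist() if name.endswith(descriptor_name))
        descriptor = json.loads(package.read(descriptor_member))

    package_id = str(uuid.uuid4())
    with open(package_path, "rb") as f:
        PACKAGE_REGISTER[package_id] = f.read()
    PROFILE_REGISTER[package_id] = descriptor
    return package_id  # returned to the operator for later deployment requests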

Figure 9 shows a schematic representation of a signalling diagram of a process to deploy a DAF instance.

Using the process to register a DAF deployment package of Figure 8, the operator may have already registered the DAF deployment package.

The operator may select the DAF deployment package and may decide to deploy a DAF instance with a selected profile comprising a selected input data template and a selected output data template.

The operator may select an input data scope and an output data scope. The input data scope and the output data scope may be in a standardized format. The input data scope and the output data scope may comprise a list of DNs, query expressions or the combination of these.

The operator may send a request to deploy and start the DAF instance to the DAMP. The request may be sent via a standardized DAMP interface for deploying a DAF instance. The operator may pass the DAF deployment package identifier, the profile and the input data scope and the output data scope as parameters.

The DAMP may perform automatic dependency resolution. The DAMP may determine which other DAF instances need to be deployed and started, and with what profile. The DAMP may delegate the deployment and start of the DAF instances to the VNF and/or CNF orchestrator by passing the DAF deployment package to the VNF and/or CNF orchestrator. The VNF and/or CNF orchestrator may support the DAF deployment package format.

The DAMP may provide a response to the operator indicating whether the DAF instance deployment, including the recursive deployment or reconfiguration of other DAF instances, is successful or unsuccessful. If the DAF instance deployment and start is successful, the response may contain a northbound interface uniform resource locator (URL) of the deployed DAF instance. The operator may use the URL to directly interact with the DAF instance (e.g. to monitor the analytics results).
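Purely for illustration, the parameters of the deployment request and the corresponding response may be sketched as follows; the field names, identifiers and the DN shown are assumptions and not part of any standardized interface.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DeployDafRequest:
    # Parameters the operator passes over the (assumed) DAMP deployment interface.
    daf_package_id: str
    profile_id: str
    input_scope: List[str] = field(default_factory=list)   # DNs and/or query expressions
    output_scope: List[str] = field(default_factory=list)

@dataclass
class DeployDafResponse:
    success: bool
    northbound_url: Optional[str] = None   # present when deployment succeeded

# Illustrative request; the identifiers and the DN below are assumptions.
request = DeployDafRequest(
    daf_package_id="pkg-0001",
    profile_id="profile-1",
    input_scope=["SubNetwork=1,ManagedElement=1,GNBDUFunction=1"],
    output_scope=["SubNetwork=1"],
)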

Figure 10 shows a schematic representation of a signalling diagram of a process to reconfigure a DAF instance.

Reconfiguring a deployed lower-level DAF instance may be required when, during the deployment of a higher-level DAF instance, one of its data dependencies is an intermediate analytics result from a lower-level DAF instance and the input data scope and/or the output data scope of the lower-level DAF is different from the input data scope and/or output data scope of the lower-level DAF that would be needed by the higher-level DAF to be deployed.

In this case, it may be part of the role of the DAMP to determine a union input data scope and/or union output data scope of the lower-level DAF instance. The union input data scope may be formed by the union of the current input data scope of the lower-level DAF instance and the input data scope of the lower-level DAF instance required by the higher-level DAF. The union output data scope may be formed by the union of the current output data scope of the lower-level DAF instance and the output data scope of the lower-level DAF instance required by the higher-level DAF.

For example, a lower-level DAF may already be analysing measurement data collected from one geo-area and producing analytics results relevant for that geo-area. Next, a higher-level DAF may be deployed that requires the analytics output generated by the lower-level DAF but for a greater geo-area than what is currently configured as the input for the lower-level DAF. The DAMP may then reconfigure the lower-level DAF to take input from this greater geo-area (or, if the greater geo-area is not a superset of the current geo-area, from the union of the two geo-areas). The DAMP may reconfigure the lower-level DAF instead of deploying an additional instance of the lower-level DAF configured for the greater geo-area, because this may be more efficient if the two geo-areas overlap (otherwise the intersection would be analysed twice, both by the current and the newly deployed lower-level DAF instances).

Other dimensions of union/extension may also be possible, for example, time-wise (a lower-level DAF may currently be performing analytics during a given period and may be required to perform analytics for an additional period) or other. As another example, the lower-level DAF may currently be performing analytics on measurements X and Y, and the new higher-level DAF may require the lower-level DAF to perform analytics on measurements X, Y and Z (this example may be valid only if the lower-level DAF has declared its capability to flexibly operate both on measurement set X, Y and on X, Y, Z).
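A minimal sketch of forming the union scope, assuming purely for illustration that a scope is represented as a set of distinguished names, may be as follows; the DNs shown are illustrative only.

from typing import Set

def union_scope(current_scope: Set[str], required_scope: Set[str]) -> Set[str]:
    """Union of the scope already configured on the lower-level DAF instance
    and the scope required by the higher-level DAF instance."""
    return current_scope | required_scope

# Illustrative geo-area example: two overlapping sets of cell DNs.
current = {"SubNetwork=1,NRCellDU=1", "SubNetwork=1,NRCellDU=2"}
required = {"SubNetwork=1,NRCellDU=2", "SubNetwork=1,NRCellDU=3"}
new_scope = union_scope(current, required)
# Reconfiguration is needed only if the union is larger than the current scope.
needs_reconfiguration = new_scope != current
print(sorted(new_scope), needs_reconfiguration)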

The DAMP may reconfigure the lower-level DAF instance to produce data according to the union input data scope and/or union output data scope. The DAMP may notify the lower-level DAF instance to consume input data and produce output data according to the union input data scope and/or union output data scope. The DAMP may receive a response from the lower-level DAF instance indicating that the lower-level DAF instance consumes input data and produces output data according to the union input data scope and/or union output data scope.

The DAMP may send a request to reconfigure a DAF instance. The request may be sent to a standardized interface for reconfiguring a DAF instance. The request may pass the union input data scope and/or union output data scope as a parameter. The union input data scope and/or union output data scope may be in the same standardized format as that used for the initial input data scope and/or output data scope in the process of Figure 9.

The DAF instance may make the necessary internal reconfiguration to receive and process the input data and produce the output data according to the union input data scope and/or union output data scope. This may involve auto-scaling, either autonomously or in cooperation with a DAF-specific or generic VNF manager.

The DAF instance may acknowledge the request. After the acknowledgement the DAF instance may be expected to process the input data and produce the output data according to the union input data scope and/or union output data scope.

It will be understood that a special case of this procedure may be used to inform the DAF instance of its initial output data scope, right after it was deployed via the DAF deployment procedure.

Figure 11 shows a schematic representation of a flow chart of a method performed by a DAMP for deploying a DAF instance.

The DAMP may receive an input data template and an output data template from the DAF deployment package.

The DAMP may receive an input data scope and an output data scope from the operator for an operator-initiated DAF instance deployment. In the event of a recursively initiated DAF instance deployment, the DAMP may have previously received the input data scope and the output data scope from the operator.

The DAMP may combine the input data template, the output data template, the input data scope and the output data scope to determine what input data is to be input to a DAF instance and what output data is to be output by the DAF instance before deploying the DAF instance. The input data to be input to the DAF instance may comprise measurements and analytics results.

The DAMP may hand over the DAF deployment package to the VNF and/or CNF orchestrator to deploy and start the DAF instance.

For each input data item to be input to the DAF instance the DAMP may determine whether the input data item is a measurement (type I) or an analytics result (type II).

If the input data item is a measurement (type I), the DAMP may deploy and start, or find and reconfigure, a measurement source to make sure that the input data to be input to the DAF is available. The DAMP may connect the DAF instance to the measurement source.

The DAMP may determine if other input data items to be input to the DAF instance are available. If other input data items to be input to the DAF instance are available, the DAMP repeats the above steps. Otherwise, the DAMP may provide a notification to the DAF instance indicating that the DAF deployment has been successful.

If the input data item is an analytics result (type II), the DAMP may deploy and start, or find and reconfigure, another DAF to make sure that the input data to be input to the DAF is available.

If the DAMP determines that there is another DAF instance already deployed and started to provide the input data item to the DAF instance, the DAMP may reconfigure the other DAF instance to provide the input data item to the DAF instance. The DAMP may determine a union input data scope and/or union output data scope for the other DAF instance. The DAMP may recursively reconfigure DAF instances dependent on the other DAF instance to allow the other DAF instance to process input data and produce output data according to the union input data scope and/or union output data scope. The DAMP may connect the DAF instance to the other DAF instance.

The DAMP may determine if another input data item to be input to the DAF instance is available. If another input data item to be input to the DAF instance is available, the DAMP repeats the above steps. Otherwise, the DAMP may provide a notification to the DAF instance indicating that the DAF deployment has been successful.

If the DAMP determines that there is no other DAF instance already deployed and started to provide the input data item to the DAF instance, the DAMP may determine if there is a registered DAF deployment package with a suitable profile so that another DAF instance could be deployed and started to provide the input data item to the DAF instance.

If there is no registered DAF deployment package with a suitable profile, the DAMP may provide a notification to the DAF instance indicating that the DAF deployment has been unsuccessful.

If there is a registered DAF deployment package with a suitable profile, the DAMP may perform the whole process of Figure 11 recursively and may deploy the other DAF instance to provide the input data item to the DAF instance. The DAMP may connect the DAF instance to the other DAF instance.

The DAMP may determine if another input data item to be input to the DAF instance is available. If another input data item to be input to the DAF instance is available, the DAMP repeats the above steps. Otherwise, the DAMP may provide a notification to the DAF instance indicating that the DAF deployment has been successful.
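A strongly reduced, non-limiting sketch of the Figure 11 flow may look as follows. The registers, the stand-in for the orchestrator handover and all identifiers are hypothetical and introduced only for the example; the sketch is intended solely to illustrate the recursion over type I (measurement) and type II (analytics result) dependencies.

from typing import Dict, List, Optional, Tuple

# Hypothetical in-memory registers kept by the DAMP (illustrative names only).
RUNNING_SOURCES: Dict[str, str] = {}        # measurement identifier -> source instance
RUNNING_DAFS: Dict[str, str] = {}           # analytics result identifier -> DAF instance
REGISTERED_PACKAGES: Dict[str, dict] = {}   # analytics result identifier -> package/profile
CONNECTIONS: List[Tuple[str, str]] = []     # (consumer instance, producer instance)

def deploy_daf_instance(name: str, input_items: List[Tuple[str, str]]) -> Optional[str]:
    """Reduced sketch of the Figure 11 flow.

    Each input item is (kind, data identifier), where kind is "measurement"
    (type I) or "analytics" (type II). Returns the deployed DAF instance name,
    or None if a dependency cannot be resolved.
    """
    instance = name + "-instance"           # stand-in for the orchestrator handover
    for kind, data_id in input_items:
        if kind == "measurement":
            # Find an already started measurement source or start a new one.
            producer = RUNNING_SOURCES.setdefault(data_id, "source-for-" + data_id)
        else:
            producer = RUNNING_DAFS.get(data_id)
            if producer is None:
                package = REGISTERED_PACKAGES.get(data_id)
                if package is None:
                    return None             # no suitable registered profile: unsuccessful
                # Recursively deploy the other DAF instance providing data_id.
                producer = deploy_daf_instance(package["name"], package["inputs"])
                if producer is None:
                    return None
                RUNNING_DAFS[data_id] = producer
        CONNECTIONS.append((instance, producer))
    return instance

# Illustrative use: a higher-level DAF depends on an intermediate analytics
# result that another registered package produces from a direct measurement.
REGISTERED_PACKAGES["coverage-analytics"] = {
    "name": "lower-level-daf",
    "inputs": [("measurement", "RRC.ConnEstabSucc")],
}
print(deploy_daf_instance("higher-level-daf", [("analytics", "coverage-analytics")]))
print(CONNECTIONS)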

One or more aspects of the disclosure may be standardized as follows. The interface between the operator and the DAMP to support the registration of the DAF deployment package may be standardized. It may include the specification of the content and format of the DAF input and output descriptor file, or a specification of how this file can be included in and located within a supported VNF packaging format.

The interface between the operator and the DAMP to support the deployment of the DAF instance may be standardized. It may include the format of the input data scope and the output data scope.

The interface between the DAMP and the DAF instance to support the configuration of the DAF instance may be standardized. It may include the format of the union input data scope and the union output data scope.

There may be two possible standardization targets in 3GPP. The standardization may be an extension to the SA5 standard or it may be a further contribution to SA2 NWDAF enhancement work item.

SA5 specifies only management APIs, called management service (MnS). SA5 does not specify the management functions that provide these APIs.

One or more aspects of the disclosure may be standardized in SA5 as follows.

An MnS for registering the DAF deployment package may be standardized in SA5. The DAMP may be the provider of the MnS and the operator (or the network management tool used by the operator) may be the consumer.

An MnS for deploying the DAF instance may be standardized in SA5. The DAMP may be the provider of the MnS and the operator (or the network management tool used by the operator) may be the consumer.

An MnS for configuring a DAF instance may be standardized in SA5. The DAF instance may be the provider of the MnS and the DAMP may be the consumer.

The data management framework has been contributed to the Rel-17 work item FS_eNA_Ph2 and added to the technical report [TR 23.700-91] as solution #9.

One or more aspects of this disclosure may provide a solution for automatic NWDAF deployment and dependency resolution in a way that is agnostic to analytics use cases. In this case, the role of the DAMP is played by the data management framework.

Figure 12 shows a schematic representation of a flow chart of a method performed by a DCCF for deploying a DAF instance. The DCCF may operate as a DAMP.

In steps 1 to 3 the DCCF may perform a procedure for registering a DAF deployment package. The procedure for registering a DAF deployment package may be as per Figure 8 or may include some variations.

In step 1 the operator may send a request for registering the DAF deployment package to the DCCF. The operator may pass the DAF deployment package as a parameter.

In step 2 the DCCF may find the DAF input and output descriptor file in the DAF deployment package. The DCCF may store the DAF deployment package in a register along with the profiles included in the DAF input and output descriptor file and a DAF deployment package identifier. The DCCF may allocate the DAF deployment package identifier to the DAF deployment package.

In step 3 the DCCF may provide the DAF deployment package identifier to the operator.

In steps 4 to 11 the DCCF may perform a procedure for deploying a DAF instance. The procedure for deploying a DAF instance may be as per Figure 9 or may include some variations.

In step 4 the operator may select the DAF deployment package and may decide to deploy a DAF instance with a selected profile comprising a selected input data template and a selected output data template. The operator may select an input data scope and an output data scope.

In step 5 the operator may send a request to deploy and start the DAF instance to the DCCF. The operator may pass the DAF deployment package identifier, the profile and the input data scope and the output data scope as parameters.

In step 6 the DCCF may identify the DAF deployment package based on the DAF deployment package identifier and may pass the DAF deployment package to a VNF and/or CNF orchestrator to deploy the DAF instance. The VNF and/or CNF orchestrator may deploy and start the DAF instance.

Starting the DAF may be performed before the one or more measurement sources or one or more other DAF instances providing the input data required by the DAF instance are connected to the DAF instance.

In step 7 the DCCF may combine the input data template and the input data scope to specifically determine what input data is to be input to a DAF instance.

In step 8 the DCCF may perform automatic dependency resolution. The DCCF may determine which other DAF instances need to be deployed and started to provide the input data to the DAF instance. The DCCF may delegate the deployment and start of the other DAF instances to the VNF and/or CNF orchestrator by passing the other registered DAF deployment packages to the VNF and/or CNF orchestrator. The VNF and/or CNF orchestrator may deploy and start the other DAF instances.

The DCCF may determine which other DAF instances already deployed and started need to be reconfigured to provide the input data to the DAF instance. The DCCF may reconfigure the other DAF instances. The DCCF may determine which measurement sources need to be deployed and started to provide the input data to the DAF instance. The VNF and/or CNF orchestrator may deploy and start the measurement sources.

The DCCF may determine which measurement sources already deployed and started need to be reconfigured to provide the input data to the DAF instance. The DCCF may reconfigure the measurement sources.

The DCCF may integrate the input data provided to the DAF instance into the messaging framework.

In step 9 the DCCF may connect the DAF instance to the other DAF instances and/or the measurement sources.

By executing steps 8 and 9, the DCCF may ensure that all input data to be provided to the DAF instance is available and the DAF instance is connected to the measurement sources and/or other DAF instances providing the input data via the messaging framework.

While the data management framework description already included in [TR 23.700-91] may already give the DCCF the responsibility of deploying and starting, or finding and reconfiguring, measurement sources on behalf of the NWDAF, it may be a new responsibility that the DCCF also deploys and starts, or finds and reconfigures, other DAFs on behalf of the NWDAF.

The DCCF may receive input data from the measurement sources and/or other DAF instances (producers) via the messaging framework.

In step 10 the DCCF may combine the output data template and the output data scope to specifically determine what output data is to be output by the DAF instance. The DCCF may store what output data is to be output by the DAF instance in a register. In this way, when subsequently deploying and starting another DAF, the DCCF can detect whether the DAF instance may be used to provide input data to the other DAF instance.

In step 11 the DCCF may provide a response to the operator indicating that the DAF instance deployment, including the recursive deployment of other DAF instances, is successful. If the DAF instance deployment is successful, the response may contain a northbound interface uniform resource locator (URL) of the deployed DAF instance. The operator may use the URL to directly interact with the DAF instance (e.g. to monitor the analytics results).

In step 12 the DCCF may integrate the output data of the DAF instance into the messaging framework. Thus, after deploying and starting the DAF instance the DCCF may connect the DAF instance to other DAF instances (consumers).
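Purely as an illustration of steps 10 and 12, the following sketch shows how the DCCF could record what output data each deployed DAF instance produces, so that a later deployment can detect whether the instance may serve as an input source for another DAF instance. The register structure and all identifiers below are assumptions made for the example.

from typing import Dict, List, Optional, Tuple

# Hypothetical DCCF-side register of what each running DAF instance produces:
# (data identifier, DN of the analysed object) -> producing DAF instance.
OUTPUT_REGISTER: Dict[Tuple[str, str], str] = {}

def record_outputs(daf_instance: str, output_items: List[Tuple[str, str]]) -> None:
    """Sketch of steps 10/12: remember which DAF instance produces which output."""
    for data_id, dn in output_items:
        OUTPUT_REGISTER[(data_id, dn)] = daf_instance

def find_producer(data_id: str, dn: str) -> Optional[str]:
    """Later deployments query the register to reuse an existing producer."""
    return OUTPUT_REGISTER.get((data_id, dn))

# Illustrative use with assumed identifiers.
record_outputs("lower-level-daf-1",
               [("coverage-analytics", "SubNetwork=1,ManagedElement=1")])
print(find_producer("coverage-analytics", "SubNetwork=1,ManagedElement=1"))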

A new service interface may be introduced in SA2 to support the procedure for configuring a DAF instance. This service interface may be provided by all running DAF instances and may be consumed by the DCCF to reconfigure already running DAFs.

Figure 13 shows a block diagram of a method performed by a DAMP for deploying a DAF instance. The DAMP may be a DCCF.

In step 1300 the DAMP may receive a request to deploy a DAF instance from an operator.

The request to deploy the DAF instance may comprise an input data template defining one or more input data type for the DAF instance and/or an output data template defining one or more output data type for the DAF instance.

The input data template and/or the output data template may comprise one or more measurement or analytics result.

The request to deploy the DAF instance may comprise an input data scope defining one or more input data source for the DAF instance and/or an output data scope defining one or more output data destination for the DAF instance.

The input data scope and/or the output data scope may comprise one or more measurement source or data analytics function instance.

The request to deploy the DAF instance may comprise a DAF deployment package ID associated with a DAF deployment package.

In step 1302 the DAMP may cause the DAF instance to be deployed.

Causing the DAF instance to be deployed may comprise providing at least part of a DAF deployment package to a VNF and/or CNF orchestrator.

In step 1304 the DAMP may determine that another DAF instance is to be deployed or is deployed and is to be reconfigured to provide input data to the DAF instance.

Determining that another DAF instance is to be deployed or is deployed and is to be reconfigured is based on the input data template and the input data scope of the DAF instance.

In step 1306 the DAMP may cause the other data analytics function instance to be deployed or to be reconfigured.

Causing the other DAF instance to be deployed may comprise providing at least part of another DAF deployment package to the VNF and/or CNF orchestrator.

Causing the other DAF instance to be reconfigured may comprise causing the other DAF instance to modify the input data scope and/or output data scope of the other DAF instance.

Prior to step 1302 the DAMP may receive a request to register a DAF deployment package, the request to register the DAF deployment package comprising the DAF deployment package. The DAMP may associate a DAF deployment package identifier with the DAF deployment package. The DAMP may store the DAF deployment package and the DAF deployment package ID.

The DAF deployment package may comprise one or more profile, each profile comprising an input data template and an output data template.

The DAF deployment package may comprise a DAF input and output descriptor file, wherein the DAF input and output descriptor file comprises one or more profile associated with one or more profile identifier, each profile comprising an input data template and an output data template.

The DAMP may store the one or more profiles with the one or more profile identifier.

The request to deploy the DAF instance may comprise a profile identifier associated with one of the one or more profiles.

The DAMP may provide the DAF deployment package ID to the operator.

The request to deploy the DAF instance may comprise the DAF deployment package ID.

The DAMP may provide a URI of the DAF instance to the operator.

Figure 14 shows a schematic representation of non-volatile memory media 1400a (e.g. computer disc (CD) or digital versatile disc (DVD)) and 1400b (e.g. universal serial bus (USB) memory stick) storing instructions and/or parameters 1402 which, when executed by a processor, allow the processor to perform one or more of the steps of the methods of Figure 13.

It is noted that while the above describes example embodiments, there are several variations and modifications which may be made to the disclosed solution without departing from the scope of the present invention.

It will be understood that although the above concepts have been discussed in the context of a 5GS, one or more of these concepts may be applied to other cellular systems.

It will also be understood that although the above mechanism has been described as being implemented by a messaging framework, the mechanism may be implemented by another apparatus, entity or function.

The embodiments may thus vary within the scope of the attached claims. In general, some embodiments may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. For example, some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although embodiments are not limited thereto. While various embodiments may be illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.

The embodiments may be implemented by computer software stored in a memory and executable by at least one data processor of the involved entities or by hardware, or by a combination of software and hardware. Further in this regard it should be noted that any procedures, e.g., as in Figure 13, may represent program steps, or interconnected logic circuits, blocks and functions, or a combination of program steps and logic circuits, blocks and functions. The software may be stored on such physical media as memory chips, or memory blocks implemented within the processor, magnetic media such as hard disk or floppy disks, and optical media such as for example DVD and the data variants thereof, CD.

The memory may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory. The data processors may be of any type suitable to the local technical environment, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASIC), gate level circuits and processors based on multi-core processor architecture, as non-limiting examples.

Alternatively or additionally some embodiments may be implemented using circuitry. The circuitry may be configured to perform one or more of the functions and/or method steps previously described. That circuitry may be provided in the base station and/or in the communications device.

As used in this application, the term “circuitry” may refer to one or more or all of the following:

(a) hardware-only circuit implementations (such as implementations in only analogue and/or digital circuitry);

(b) combinations of hardware circuits and software, such as:

(i) a combination of analogue and/or digital hardware circuit(s) with software/firmware and

(ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as the communications device or base station to perform the various functions previously described; and

(c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.

This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or a portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware. The term circuitry also covers, for example, an integrated device.

The foregoing description has provided, by way of exemplary and non-limiting examples, a full and informative description of some embodiments. However, various modifications and adaptations may become apparent to those skilled in the relevant arts in view of the foregoing description, when read in conjunction with the accompanying drawings and the appended claims. All such and similar modifications of the teachings will still fall within the scope as defined in the appended claims.