Title:
APPARATUS, COMPUTER PROGRAM AND METHOD FOR CONTROLLING MEDIA SYSTEM OF MEETING SPACE
Document Type and Number:
WIPO Patent Application WO/2018/109270
Kind Code:
A1
Abstract:
Apparatus, computer program and method for controlling media system of meeting space. The apparatus (100) operates as follows: detect (120) an arrival of a user (160, 160A, 160B) to a meeting space (130); determine (122) user meeting rights of the user (160, 160A, 160B) associated with a mobile apparatus (140, 140A, 140B); receive (124) a meeting service request from the mobile apparatus (140, 140A, 140B); and process (126) the received meeting service request according to the determined user meeting rights to control a media system (132) of the meeting space (130).

Inventors:
KIVELÄ JORMA (FI)
Application Number:
PCT/FI2017/050878
Publication Date:
June 21, 2018
Filing Date:
December 12, 2017
Assignee:
JUTEL OY (FI)
International Classes:
H04L12/18
Foreign References:
US20140267559A12014-09-18
EP1085774A22001-03-21
Other References:
None
Attorney, Agent or Firm:
KOLSTER OY AB (FI)
Claims:

1. An apparatus (100) comprising:

one or more processors (102); and

one or more memories (104) including computer program code (106), the one or more memories (104) and the computer program code

(106) configured to, with the one or more processors (102), cause the apparatus (100) at least to:

detect (120) an arrival of a user (160, 160A, 160B) to a meeting space

(130);

determine (122) user meeting rights of the user (160, 160A, 160B) associated with a mobile apparatus (140, 140A, 140B);

receive (124) a meeting service request from the mobile apparatus (140, 140A, 140B);

process (126) the received meeting service request according to the determined user meeting rights to control a media system (132) of the meeting space (130);

receive (400) a comment request as the service request (124);

transmit (402) the comment request to a moderator (220) of the meeting;

receive (404) a decision regarding the comment request from the moderator (220);

transmit (406) the decision regarding the comment request to the mobile apparatus (140A); and

if the decision regarding the comment request is affirmative (408 YES), transmit (410) a configuration request to the mobile apparatus (140A) to configure the mobile apparatus (140A) to catch speech of the user (160A) with a microphone (156) coupled with the mobile apparatus (140A) and transmit (412) the speech wirelessly to the media system (132) of the meeting space (130), and configure (414) the media system (132) of the meeting space (130) to receive (416) the speech and output it through at least one audio output apparatus (212) located in the meeting space (130).

2. The apparatus of claim 1, further caused to:

detect (120) the arrival of the user (160, 160A, 160B) to the meeting space (130) by detecting a presence (302) of the user (160, 160A, 160B) in the meeting space (130).

3. The apparatus of any preceding claim, further caused to:

detect (120) the arrival of the user (160, 160A, 160B) to the meeting space (130) by detecting a presence (304) of the mobile apparatus (140, 140A, 140B) in the meeting space (130).

4. The apparatus of any preceding claim, further caused to:

detect (120) the arrival of the user (160, 160A, 160B) to the meeting space (130) by analyzing a code (306) received from the mobile apparatus (140, 140A, 140B).

5. The apparatus of any preceding claim, further caused to:

detect (120) the arrival of the user (160, 160A, 160B) to the meeting space (130) by detecting an interaction of the mobile apparatus (140, 140A, 140B) with a discovery protocol (308) broadcasted in the meeting space (130).

6. The apparatus of any preceding claim, further caused to:

detect (314) an exit of the user (160, 160A, 160B) from the meeting space (130); and

transmit (316) a meeting service termination order to the mobile apparatus (140, 140A, 140B).

7. The apparatus of any preceding claim, further caused to:

configure (414) the media system (132) of the meeting space (130) to broadcast (418) the speech wirelessly to a plurality of mobile apparatuses (140B) located in the meeting space (130).

8. The apparatus of any preceding claim, further caused to:

transmit (422) a configuration request to the mobile apparatus (140A) to configure the mobile apparatus (140A) to catch video of the user (160A) with a camera (158) coupled with the mobile apparatus (140A) and transmit (424) the video wirelessly to the media system (132) of the meeting space (130), and configure (414) the media system (132) of the meeting space (130) to receive (426) the video and output it through at least one video output apparatus (214) located in the meeting space (130).

9. The apparatus of claim 8, further caused to:

configure (414) the media system (132) of the meeting space (130) to broadcast (420) the video wirelessly to a plurality of mobile apparatuses (140B) located in the meeting space (130).

10. The apparatus of any preceding claim, further caused to:

determine (122) the user meeting rights of the user (160, 160A, 160B) associated with the mobile apparatus (140, 140A, 140B) in the form of software configuration information (310);

transmit (312) the software configuration information to the mobile apparatus (140, 140A, 140B); and

receive (124) the service request from an application (148) configured with the software configuration information (310) in the mobile apparatus (140, 140A, 140B).

11. A computer-readable storage medium (112) comprising the computer program code (106) of any of claims 1 to 10, which, when loaded into the apparatus (100), causes the apparatus (100) to perform the described processing.

12. A method comprising:

detecting (120) an arrival of a user to a meeting space;

determining (122) user meeting rights of the user associated with a mobile apparatus;

receiving (124) a meeting service request from the mobile apparatus;

processing (126) the received meeting service request according to the determined user meeting rights to control a media system of the meeting space;

receiving (400) a comment request as the service request;

transmitting (402) the comment request to a moderator of the meeting;

receiving (404) a decision regarding the comment request from the moderator;

transmitting (406) the decision regarding the comment request to the mobile apparatus; and

if the decision regarding the comment request is affirmative (408 YES), transmitting (410) a configuration request to the mobile apparatus to configure the mobile apparatus to catch speech of the user with a microphone coupled with the mobile apparatus, and transmitting (412) the speech wirelessly to the media system of the meeting space, and configuring (414) the media system of the meeting space to receive (416) the speech and output it through at least one audio output apparatus located in the meeting space.

Description:
Apparatus, computer program and method for controlling media system of meeting space

Field

The invention relates to an apparatus, a computer program, and a method.

Background

Although video conferences have become commonplace, traditional meetings still have their place in training and business, for example. As participants have accustomed themselves to the tools implementing video conferences, further sophistication is desired to enhance a traditional meeting.

Brief description

The present invention seeks to provide an improved apparatus, computer program and method for controlling a media system of a meeting space.

According to an aspect of the present invention, there is provided an apparatus as specified in claim 1.

According to another aspect of the present invention, there is provided a computer program as specified in claim 11.

According to another aspect of the present invention, there is provided a method as specified in claim 12.

List of drawings

Example embodiments of the present invention are described below, by way of example only, with reference to the accompanying drawings, in which

Figure 1 illustrates example embodiments of an apparatus and a mobile apparatus;

Figure 2 illustrates further example embodiments of the apparatus and its operating environment; and

Figures 3 and 4 are flow-charts illustrating example embodiments of a method.

Description of embodiments

The following embodiments are only examples. Although the specification may refer to "an" embodiment in several locations, this does not necessarily mean that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments. Furthermore, the words "comprising" and "including" should be understood as not limiting the described embodiments to consist of only those features that have been mentioned; such embodiments may also contain features/structures that have not been specifically mentioned.

Figure 1 illustrates example embodiments of an apparatus 100 and a mobile apparatus 140, and Figure 2 illustrates further example embodiments of the apparatus 100 and its operating environment.

In an example embodiment, the apparatus 100 is a computing device. The computing device 100 may be portable, mobile or stationary. It may be an independent apparatus or it may be more or less integrated with another object, such as a media system 132. A non-limiting list of example embodiments of the computing device 100 comprises: a computer, a portable computer, a laptop, a mobile phone, a smartphone, a tablet computer, or any other portable/mobile/stationary computing device capable of controlling a media system 132 of a meeting space 130.

In an example embodiment, the apparatus 100 is a personal computing device operated by a chair/operator 220 of a meeting, or a technician 216 responsible for the meeting, and/or the media system 132, and/or the meeting space 130.

In an example embodiment, the apparatus 100 is a computing server. The computing server 100 may be implemented with any applicable technology. It may include one or more centralized computing apparatuses 100, or it may include more than one distributed computing apparatuses 100. It may be implemented with client-server technology, or in a cloud computing environment, or with another technology applicable to the computing server 100 capable of controlling the media system 132 of the meeting space 130.

In an example embodiment, the mobile apparatus 140, 140A, 140B is a personal computing device of a user 160, 160A, 160B. It may be portable. A non- limiting list of example embodiments of the mobile apparatus 140, 140A, 140B comprises: a computer, a portable computer, a laptop, a mobile phone, a smartphone, a tablet computer, a smartwatch, smartglasses, or any other portable/mobile computing device, which may be manipulated by the user 160, 160A, 160B.

In an example embodiment, the apparatus 100 and/or the mobile apparatus 140, 140A, 140B is a general-purpose off-the-shelf computing device, as opposed to purpose-built proprietary equipment, whereby research and development costs are lower, as only the special-purpose software (and not the hardware) needs to be designed, implemented and tested.

In an example embodiment, the apparatus 100 and/or the mobile apparatus 140, 140A, 140B employs a suitable operating system such as Windows, iOS or Android, for example.

The apparatus 100 comprises one or more processors 102, and one or more memories 104 including computer program code 106.

The mobile apparatus 140 also comprises one or more processors 142, and one or more memories 144 including computer program code 146.

The term 'processor' 102, 142 refers to a device that is capable of processing data. Depending on the processing power needed, the apparatus 100 or the mobile apparatus 140, 140A, 140B may comprise several processors 102, 142, such as parallel processors or a multicore processor. When designing the implementation of the processor 102, 142, a person skilled in the art will consider the requirements set for the size and power consumption of the apparatus 100 or the mobile apparatus 140, 140A, 140B, the necessary processing capacity, production costs, and production volumes, for example.

The term 'memory' 104, 144 refers to a device that is capable of storing data at run-time (= working memory) or permanently (= non-volatile memory). The working memory and the non-volatile memory may be implemented by a random-access memory (RAM), dynamic RAM (DRAM), static RAM (SRAM), a flash memory, a solid state disk (SSD), PROM (programmable read-only memory), a suitable semiconductor, or any other means of implementing an electrical computer memory.

The processor 102, 142 and the memory 104, 144 may be implemented by an electronic circuitry. A non-exhaustive list of implementation techniques for the processor 102, 142 and the memory 104, 144 includes, but is not limited to: logic components, standard integrated circuits, application-specific integrated circuits (ASIC), system-on-a-chip (SoC), application-specific standard products (ASSP), microprocessors, microcontrollers, digital signal processors, special-purpose computer chips, field-programmable gate arrays (FPGA), and other suitable electronics structures.

The computer program code 106, 146 may be implemented by software and/or hardware. In an example embodiment, the software may be written by a suitable programming language (a high-level programming language, such as C, C++, or Java, or a low-level programming language, such as a machine language, or an assembler, for example), and the resulting executable code 106, 146 may be stored on the memory 104, 144 and run by the processor 102, 142. In an alternative example embodiment, the functionality of the hardware may be designed by a suitable hardware description language (such as Verilog or VHDL), and transformed into a gate-level netlist (describing standard cells and the electrical connections between them), and after further phases the chip implementing the processor 102, 142, memory 104, 144 and the code 106, 146 of the apparatus 100 or the mobile apparatus 140, 140A, 140B may be fabricated with photo masks describing the circuitry.

An example embodiment provides a computer-readable medium 112 comprising the computer program code 106, which, when loaded into the apparatus 100 and executed by the apparatus 100 causes the apparatus 100 to perform processing of the example embodiments. An example embodiment provides a computer-readable medium 154 comprising the computer program code 146, which, when loaded into the mobile apparatus 140, 140A, 140B and executed by the mobile apparatus 140, 140A, 140B causes the mobile apparatus 140, 140A, 140B to perform processing of the example embodiments.

In an example embodiment, the operations of the computer program code 106, 146 may be divided into functional modules, sub-routines, methods, classes, objects, applets, macros, etc., depending on the software design methodology and the programming language used. In modern programming environments, there are software libraries, i.e. compilations of ready-made functions, which may be utilized by the computer program code 106, 146 for performing a wide variety of standard operations. In an example embodiment, the computer program code 106, 146 may be in source code form, object code form, executable file, or in some intermediate form. The computer-readable medium 112, 154 may comprise at least the following: any entity or device capable of carrying the computer program code 106, 146 to the apparatus 100 or to the mobile apparatus 140, 140A, 140B, a record medium, a computer memory, a read-only memory, an electrical carrier signal, a telecommunications signal, and a software distribution medium. In some jurisdictions, depending on the legislation and the patent practice, the computer-readable medium 112, 154 may not be the telecommunications signal. In an example embodiment, the computer-readable medium 112, 154 may be a non-transitory computer-readable storage medium.

In an example embodiment, the mobile apparatus 140, 140A, 140B comprises a user interface 150 implementing the exchange 164 of graphical, textual and/or auditory information with the user 160, 160A, 160B. The user interface 150 may be used to perform required user actions in relation to controlling the media system 132 of the meeting space 130. The user interface 150 may be realized with various techniques, such as a (multi-touch) display, means for producing sound, a keyboard, and/or a keypad, for example. The means for producing sound may be a loudspeaker or a simpler means for producing beeps or other sound signals. The keyboard/keypad may comprise a complete (QWERTY) keyboard, a mere numeric keypad or only a few push buttons and/or rotary buttons. In addition, or alternatively, the user interface may comprise other user interface components, for example various means for focusing a cursor (mouse, track ball, arrow keys, touch sensitive area etc.) or elements enabling audio control.

In an example embodiment, the mobile apparatus 140, 140A, 140B comprises a radio transceiver 152 implementing the communication with the apparatus 100. In an example embodiment, the radio transceiver 152 comprises a cellular radio transceiver (communicating with technologies such as GSM, GPRS, EGPRS, WCDMA, UMTS, 3GPP, IMT, LTE, LTE-A, etc.) and/or a non-cellular radio transceiver (communicating with technologies such as Bluetooth, Bluetooth Low Energy, Wi-Fi, WLAN, etc.).

Now that the basic equipment, the apparatus 100 and the mobile apparatus 140, 140A, 140B, has been described, let us proceed to describe an example embodiment of a typical use case.

The person 160, 160A, 160B enters the meeting space 130 in order to participate in a meeting. The user 160, 160A, 160B is carrying his/her personal mobile apparatus 140, 140A, 140B, or, alternatively, the user 160, 160A, 160B is given a mobile apparatus 140, 140A, 140B for use during the meeting.

The one or more memories 104 and the computer program code 106 of the apparatus 100 are configured to, with the one or more processors 102 of the apparatus 100, cause the apparatus 100 at least to perform the following four- step sequence of operations:

120) Detect an arrival 170/172 of the user 160, 160A, 160B to the meeting space 130. The user detector 108 may be implemented with any technology suitable for detecting the user 160, 160A, 160B, and/or his/her mobile apparatus 140, 140A, 140B.

122) Determine user meeting rights of the user 160, 160A, 160B associated 162 with the mobile apparatus 140, 140A, 140B. The user meeting rights may determine the role (attendant, speaker, listener, etc.) of the user 160, 160A, 160B in the meeting, and, consequently, the allowed actions (speak, vote, etc.).

124) Receive a meeting service request 174 from the mobile apparatus 140, 140A, 140B. The meeting service request may have been generated by an interaction of the user 160, 160A, 160B with a meeting application 148 in his/her mobile apparatus 140, 140A, 140B.

126) Process the received meeting service request according to the determined user meeting rights to control 176 the media system 132 of the meeting space 130. The media system 132 of the meeting space 130 may comprise loudspeakers, displays and/or other apparatus presenting audio and/or visual information to the attendees of the meeting.
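The four-step sequence of operations 120-126 can be sketched in code. This is a minimal illustration only; the class name, the rights table, and the role-to-action mapping are assumptions for the sketch and do not appear in the patent text.

```python
from dataclasses import dataclass, field

@dataclass
class MeetingController:
    # user id -> set of allowed actions, e.g. {"speak", "vote"}
    rights: dict = field(default_factory=dict)
    present: set = field(default_factory=set)

    def detect_arrival(self, user_id: str) -> None:
        # Operation 120: register the user as present in the meeting space.
        self.present.add(user_id)

    def determine_rights(self, user_id: str, role: str) -> None:
        # Operation 122: map the user's role to allowed actions.
        roles = {"speaker": {"speak", "vote"}, "listener": {"vote"}}
        self.rights[user_id] = roles.get(role, set())

    def process_request(self, user_id: str, action: str) -> bool:
        # Operations 124-126: honour the request only if the user is
        # present and the action is within the determined rights.
        return user_id in self.present and action in self.rights.get(user_id, set())

ctrl = MeetingController()
ctrl.detect_arrival("160A")
ctrl.determine_rights("160A", "speaker")
assert ctrl.process_request("160A", "speak")       # allowed
assert not ctrl.process_request("160A", "record")  # outside the rights
```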

In an example embodiment, the apparatus 100 may interact with, or even be a part of, an Internet of Sound® system developed by the applicant, Jutel Oy, for providing IP-network-based media processing and delivery.

These operations 120-122-124-126 implement a method illustrated in Figure 3, which starts in 300. The method ends in 318 after the processing is finished, or, it may be looped back to the beginning, and the processing of the next user may be started from the operation 120, or looped back to the operation 124, and the processing of the next meeting service request may be started. Further example embodiments are illustrated in Figure 4. The operations are not strictly in chronological order, and some of the operations may be performed simultaneously or in an order differing from the given ones. Other functions may also be executed between the operations or within the operations and other data exchanged between the operations. Some of the operations or part of the operations may also be left out or replaced by a corresponding operation or a part of the operation. It should be noted that no special order of operations is required, except where necessary due to the logical requirements for the processing order.

In an example embodiment, the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by detecting a presence 302 of the user 160, 160A, 160B in the meeting space 130. In an example embodiment, computer vision may be used to detect the user 160, 160A, 160B, and identify him/her with aid of a database 114 storing user data, such as facial images of users 160, 160A, 160B.

In an example embodiment, the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by detecting a presence 304 of the mobile apparatus 140, 140A, 140B in the meeting space 130. The presence 304 of the mobile apparatus 140, 140A, 140B in the meeting space 130 may be detected with any suitable means for locating the mobile apparatus 140, 140A, 140B, such as by location services 206 detecting that the location of the mobile apparatus 140, 140A, 140B is in the meeting space 130 or in its immediate vicinity, for example. The mobile apparatus 140, 140A, 140B may also be detected by its address (such as a WLAN address or a Bluetooth MAC address).
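Address-based detection of the presence 304 can be illustrated as follows. The registration table and the scan-result format are assumptions made for this sketch; the patent only says the mobile apparatus may be detected by its address.

```python
# Hypothetical registry mapping known device addresses to registered users.
KNOWN_DEVICES = {
    "AA:BB:CC:DD:EE:01": "160A",
    "AA:BB:CC:DD:EE:02": "160B",
}

def detect_arrivals(scanned_addresses):
    """Map addresses seen in the meeting space to registered users.

    Unknown addresses (visitors' other devices, etc.) are ignored.
    """
    return [KNOWN_DEVICES[a] for a in scanned_addresses if a in KNOWN_DEVICES]

assert detect_arrivals(["AA:BB:CC:DD:EE:01", "11:22:33:44:55:66"]) == ["160A"]
```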

In an example embodiment, the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by analyzing a code 306 received from the mobile apparatus 140, 140A, 140B. In this example embodiment, the mobile apparatus 140, 140A, 140B transmits the code 306 to the apparatus 100 through a wireless network 202 or a media network 204, after the user 160, 160A, 160B has obtained the code, by receiving the code 306 in an e-mail message, in a text message, or after having perceived (seen or heard) the code 306. The code 306 may be displayed in the meeting space 130, for example, and the user may read it or photograph it (such as QR code).
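Validation of the code 306 on the apparatus side can be sketched as below. The use of an HMAC-derived code and the shared secret are assumptions for illustration; the patent does not specify how the code is generated.

```python
import hashlib
import hmac

# Hypothetical shared secret held by the apparatus 100.
SECRET = b"meeting-space-secret"

def issue_code(space_id: str) -> str:
    # Derive a short code for a meeting space; this could be shown as a
    # QR code on a display, or sent by e-mail or text message.
    return hmac.new(SECRET, space_id.encode(), hashlib.sha256).hexdigest()[:8]

def verify_code(space_id: str, code: str) -> bool:
    # Operation 306: analyze the code received from the mobile apparatus.
    return hmac.compare_digest(issue_code(space_id), code)

code = issue_code("130")
assert verify_code("130", code)       # code matches meeting space 130
assert not verify_code("131", code)   # same code fails for another space
```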

In an example embodiment, the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by detecting an interaction of the mobile apparatus 140, 140A, 140B with a discovery protocol 308 broadcasted in the meeting space 130. The discovery protocol 308 may be a part of zero-configuration networking, wherein the mobile apparatus 140, 140A, 140B is able to connect to a computer network 202, 204 without manual operator intervention or a special configuration server. In an example embodiment, the discovery protocol 308 is Bonjour, which provides service discovery, address assignment, and hostname resolution.

In an example embodiment, the apparatus 100 is further caused to detect 120 the arrival of the user 160, 160A, 160B to the meeting space 130 by receiving data from the mobile apparatus 140, wherein the data identifies the meeting space 130. For example: a camera 158 of the mobile apparatus 140 captures live video of a display 214 in the meeting space 130, and by analyzing the live video, the meeting space 130 is identified. The identification may be based on recognizing a certain feature from the video, such as an image, a series of images or characters shown in the display 214, a certain pattern of colours shown in the display 214, or a certain pattern of flickering shown in the display 214. Another example embodiment provides a gesture performed with the mobile apparatus 140, which identifies the meeting space 130. For example: it is announced in the meeting space 130 (by an image or by an announcement, for example) that the meeting space identification is number eight, and the user 160 moves his/her mobile apparatus 140 in the air to form a figure eight, which is registered by an inertial measurement unit (including an acceleration sensor and possibly also a magnetometer and/or a gyroscope) of the mobile apparatus 140 and transmitted to the apparatus 100 as an identification from a meeting application 148 of the mobile apparatus 140.
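The discovery-protocol interaction 308 can be sketched in protocol-agnostic form. The patent names Bonjour as one possible protocol; the stand-in below simulates the same idea in memory (a broadcast service record, and any device answering it is treated as having arrived). The record fields and class names are assumptions for the sketch.

```python
# Hypothetical service record the meeting space would announce, in the
# spirit of DNS-SD/Bonjour service discovery.
SERVICE_RECORD = {"service": "_meeting._tcp", "space_id": "130", "port": 5353}

class DiscoveryRegistry:
    def __init__(self):
        self.arrived = []

    def broadcast(self) -> dict:
        # In a real deployment this would be a multicast announcement in
        # the meeting space's network 202, 204.
        return dict(SERVICE_RECORD)

    def on_response(self, device_id: str, record: dict) -> None:
        # Operation 120: an interaction with the broadcast counts as the
        # arrival of the mobile apparatus carrying that record.
        if record.get("space_id") == SERVICE_RECORD["space_id"]:
            self.arrived.append(device_id)

reg = DiscoveryRegistry()
offer = reg.broadcast()
reg.on_response("140A", offer)
assert reg.arrived == ["140A"]
```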

In an example embodiment, the apparatus 100 is further caused to detect 314 an exit of the user 160, 160A, 160B from the meeting space 130, and transmit 316 a meeting service termination order to the mobile apparatus 140, 140A, 140B. The exit of the user 160, 160A, 160B may be detected by applying similar technologies as for detecting the arrival of the user 160, 160A, 160B.

In an example embodiment, the apparatus 100 is further caused to receive 400 a comment request as the service request 124, transmit 402 the comment request to a moderator 220 of the meeting, receive 404 a decision regarding the comment request from the moderator 220, and transmit 406 the decision regarding the comment request to the mobile apparatus 140A. The comment request may be generated by the user 160A performing an appropriate user action (such as pressing a button labelled "REQUEST COMMENT") in the meeting application 148 present in the mobile apparatus 140A. This mechanism may also be utilized for other applicable meeting services such as written comments, voting, etc.
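The relay of operations 400-406 (request in, moderator decision out) can be sketched as a small state machine. The queue structure and the enum are assumptions introduced for the sketch.

```python
from enum import Enum

class Decision(Enum):
    PENDING = 0
    APPROVED = 1
    DENIED = 2

class CommentModeration:
    def __init__(self):
        self.requests = {}   # device_id -> Decision

    def receive_request(self, device_id: str) -> str:
        # 400: comment request received as the service request;
        # 402: forwarded to the moderator 220.
        self.requests[device_id] = Decision.PENDING
        return device_id

    def moderator_decides(self, device_id: str, approve: bool) -> Decision:
        # 404: decision received from the moderator;
        # 406: decision transmitted back to the requesting device.
        decision = Decision.APPROVED if approve else Decision.DENIED
        self.requests[device_id] = decision
        return decision

mod = CommentModeration()
mod.receive_request("140A")
assert mod.moderator_decides("140A", True) is Decision.APPROVED
```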

In an example embodiment, the apparatus 100 is further caused to, if the decision regarding the comment request is affirmative 408 YES, transmit 410 a configuration request to the mobile apparatus 140A to configure the mobile apparatus 140A to catch speech of the user 160A with a microphone 156 coupled with the mobile apparatus 140A and transmit 412 the speech wirelessly to the media system 132 of the meeting space 130, and configure 414 the media system 132 of the meeting space 130 to receive 416 the speech and output it through at least one audio output apparatus 212 located in the meeting space 130. In an example embodiment, the microphone 156 is a built-in microphone of the mobile apparatus 140A.

In an example embodiment, the apparatus 100 is further caused to configure 414 the media system 132 of the meeting space 130 to broadcast 418 the speech wirelessly to a plurality of mobile apparatuses 140B located in the meeting space 130. This example embodiment may augment or even replace the use of general loudspeakers 212 in the meeting space 130 as each user 160B may listen to the speech in his/her own mobile apparatus 140B, from a loudspeaker of the mobile apparatus 140B or from a (wired or wireless) earpiece coupled with the mobile apparatus 140B, for example.
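The fan-out of operation 418 can be illustrated as follows: the media system keeps a set of subscribed mobile apparatuses and forwards each speech frame to everyone except the speaking device. The subscriber bookkeeping and frame format are assumptions for the sketch.

```python
class MediaSystem:
    def __init__(self):
        self.subscribers = set()   # mobile apparatuses 140B in the space
        self.delivered = []        # (device_id, frame) pairs sent out

    def subscribe(self, device_id: str) -> None:
        # A mobile apparatus located in the meeting space registers itself.
        self.subscribers.add(device_id)

    def broadcast_speech(self, source_id: str, frame: bytes) -> None:
        # Operation 418: send the received speech to every subscribed
        # device except the one the speech originated from.
        for dev in sorted(self.subscribers - {source_id}):
            self.delivered.append((dev, frame))

ms = MediaSystem()
for dev in ("140A", "140B1", "140B2"):
    ms.subscribe(dev)
ms.broadcast_speech("140A", b"frame-0")
assert ms.delivered == [("140B1", b"frame-0"), ("140B2", b"frame-0")]
```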

In an example embodiment, the apparatus 100 is further caused to transmit 422 a configuration request to the mobile apparatus 140A to configure the mobile apparatus 140A to catch video of the user 160A with a camera 158 coupled with the mobile apparatus 140 and transmit 424 the video wirelessly to the media system 132 of the meeting space 130, and configure 414 the media system 132 of the meeting space 130 to receive 426 the video and output it through at least one video output apparatus 214 located in the meeting space 130. In an example embodiment, the camera 158 is a built-in camera of the mobile apparatus 140A.

In an example embodiment, the apparatus 100 is further caused to configure 414 the media system 132 of the meeting space 130 to broadcast 420 the video wirelessly to a plurality of mobile apparatuses 140B located in the meeting space 130. This example embodiment may augment or even replace the use of general displays 214 in the meeting space 130 as each user 160B may see the video in his/her own mobile apparatus 140B, in a display 150 of the mobile apparatus 140B, for example.

In an example embodiment, the apparatus 100 is further caused to determine 122 the user meeting rights of the user 160, 160A, 160B associated with the mobile apparatus 140, 140A, 140B in the form of software configuration information 310, transmit 312 the software configuration information to the mobile apparatus 140, 140A, 140B, and receive 124 the service request from an application 148 configured with the software configuration information 310 in the mobile apparatus 140, 140A, 140B. With this example embodiment, the user 160, 160A, 160B may obtain the meeting application 148 beforehand; it is then configured on the go with the software configuration information 310 to support various meeting spaces 130 as required. The meeting application 148 may be a so-called stub, i.e., basic software with a communication interface and a configurator, which enables flexible configuration of the meeting application 148. The meeting application 148 may gain various functionalities (and their help functions) through the software configuration information 310. These functionalities may already exist in the mobile apparatus 140 and/or they may be downloaded from an external source such as the apparatus 100.
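The stub-and-configurator idea can be sketched as below: the application exposes no meeting features until the software configuration information 310 enables them. The configuration schema and names are assumptions for the sketch.

```python
class StubMeetingApp:
    """Basic software with a communication interface and a configurator."""

    def __init__(self):
        self.features = {}

    def configure(self, config: dict) -> None:
        # Operation 312: software configuration information 310 received
        # from the apparatus enables a set of meeting functionalities.
        self.features = {name: True for name in config.get("allowed", [])}

    def request(self, action: str) -> dict:
        # Operation 124: only a configured feature can generate a
        # meeting service request.
        if not self.features.get(action):
            raise PermissionError(action)
        return {"service_request": action}

app = StubMeetingApp()
app.configure({"allowed": ["comment", "vote"]})
assert app.request("vote") == {"service_request": "vote"}
```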

Finally, let us study Figure 2 illustrating further example embodiments of the apparatus 100 and its operating environment.

The attendants 160A, 160B of the meeting each have an attendant user interface 230A, 230B for interacting with the apparatus 100 and the meeting services provided by the apparatus 100. The chair/operator 220 of the meeting has a control user interface 226 for interacting with the apparatus 100 and the meeting services provided by the apparatus 100, and also a microphone 222 coupled with a microphone amplifier 224. The technician 216 responsible for the meeting, and/or the media system 132, and/or the meeting space 130 has a technician's user interface 218 for interacting with the apparatus 100 and the meeting services provided by the apparatus 100. The roles of the technician 216 and the chair/operator 220 may also be combined.

A data network 200 may be a wired network, utilizing copper or optical fibre, for example. The wired network 200 may even supply power (Power over Ethernet, PoE).

A wireless network 202 may be a wireless local area network and/or a mobile network.

A media network 204 may exist instead of or in addition to the wireless network 202 and it may be based on Bluetooth or some other wireless technology.

The location services 206 may utilize various location technologies, such as satellite-based location, cellular network location, indoor radio-beacon-based location, indoor magnetic positioning, etc. The location services 206 may also utilize other electronic or optical mechanisms. The location services 206 may also utilize an electronic seating arrangement management system.

The apparatus 100 may be coupled to external cloud services 210, which may further enhance the services provided for the meeting in the meeting space 130.

A firewall 208 may be utilized to protect the apparatus 100 from unauthorized traffic from and to the Internet.

It will be obvious to a person skilled in the art that, as technology advances, the inventive concept can be implemented in various ways. The invention and its embodiments are not limited to the example embodiments described above but may vary within the scope of the claims.