


Title:
APPARATUS AND METHOD FOR PROVIDING A CONFERENCE BETWEEN A CONFERENCE VENUE AND A REMOTE ATTENDEE
Document Type and Number:
WIPO Patent Application WO/2022/136856
Kind Code:
A1
Abstract:
Apparatus for use at a conference venue comprises a conference control unit (CCU) interface configured to communicate with a conference control unit 140 at the conference venue. The CCU is connectable to a plurality of DUs each having a microphone. A cellular wireless apparatus 320 is configured to support a voice call with a remote attendee telephone RA(T1). A processor is configured to support an audio conference between the CCU at the conference venue and the remote attendee telephone via the cellular wireless apparatus. The processor is configured to: receive CCU audio from the CCU and send audio data representing the CCU audio to the cellular wireless apparatus; and receive remote attendee audio data from the cellular wireless apparatus and send audio representing the remote attendee audio data to the CCU.

Inventors:
CREMIN DANIEL (GB)
WALLMAN RICHARD (GB)
COX JORDAN (GB)
SCHUMAN HASSAN (GB)
Application Number:
PCT/GB2021/053387
Publication Date:
June 30, 2022
Filing Date:
December 21, 2021
Assignee:
CIVICO LTD (GB)
International Classes:
H04L65/403; H04M3/56; H04N7/15
Foreign References:
US20180288368A12018-10-04
US20180288111A12018-10-04
EP2974294A22016-01-20
CN111246151A2020-06-05
CN112188144A2021-01-05
Other References:
SENNHEISER ELECTRONIC GMBH: "Wireless conference System WICOS", 30 April 2009 (2009-04-30), pages 1 - 90, XP002597331, Retrieved from the Internet [retrieved on 20100819]
Attorney, Agent or Firm:
YEADON, Mark (GB)
Claims:

CLAIMS

1. Apparatus for use at a first conference venue comprising: a conference control unit (CCU) interface configured to communicate with a conference control unit at the first conference venue, wherein the CCU is connectable to a plurality of DUs each having a microphone; a cellular wireless apparatus which is configured to support a voice call with a remote attendee telephone; a processor configured to support an audio conference between the CCU at the first conference venue and the remote attendee telephone via the cellular wireless apparatus, the processor configured to: receive CCU audio from the CCU and send audio data representing the CCU audio to the cellular wireless apparatus; receive remote attendee audio data from the cellular wireless apparatus and send audio representing the remote attendee audio data to the CCU.

2. Apparatus according to claim 1 wherein the processor is configured to receive control data from the cellular wireless apparatus.

3. Apparatus according to claim 2 wherein the cellular wireless apparatus is configured to map an operating state of the cellular wireless apparatus to control data which emulates CCU control data.

4. Apparatus according to claim 3, wherein the control data comprises at least one of: an indication of an operating state (on/off) of a CCU, wherein an initialised state of the cellular wireless apparatus is mapped to a CCU operating state of “on”; an indication of a state of a delegate unit (DU) connected to a CCU, wherein a state where the cellular wireless apparatus is connected to a cellular wireless network is mapped to a CCU operating state of “connected”; an indication of a state of a microphone of the DU, wherein an operating state where a voice call is connected with audio to a remote attendee is mapped to a microphone state of “on”; an indication of a voting selection.

5. Apparatus according to any one of claims 2 to 4 wherein the processor is configured to: maintain a request-to-speak queue of DUs; add a DU to the request-to-speak queue in response to receiving CCU control data indicating that a request-to-speak has been made at a DU connected to the CCU; and, add the remote attendee telephone to the request-to-speak queue in response to receiving control data indicating that a request-to-speak has been made at the remote attendee telephone.

6. Apparatus according to any one of claims 2 to 4 wherein the cellular wireless apparatus is configured to: determine if a key has been pressed on the remote attendee telephone by monitoring audio from the telephone; and, when it is determined that a key has been pressed, output control data to the processor.

7. A cellular wireless apparatus for use at a conference venue comprising: a connector port configured to connect to a host apparatus at the conference venue; a cellular wireless transceiver configured to support a voice call with a remote attendee telephone; a processor configured to: receive audio data from the host apparatus for sending in the voice call with the remote attendee telephone; send remote attendee audio data from the voice call with the remote attendee telephone to the host apparatus; receive control data from the host apparatus for controlling the voice call with the remote attendee telephone; send control data to the host apparatus indicative of a status of the cellular wireless apparatus.

8. Apparatus according to claim 7 wherein the processor is configured to send control data which emulates conference control unit (CCU) control data, and the processor is configured to map an operating state of the cellular wireless apparatus to the CCU control data.

9. Apparatus according to claim 8 wherein the processor is configured to send control data which comprises: an indication of an operating state (on/off) of a CCU, wherein an initialised state of the cellular wireless apparatus is mapped to a CCU operating state of “on”; an indication of a state of a delegate unit (DU) connected to a CCU, wherein a state where the cellular wireless apparatus is connected to a cellular wireless network is mapped to a CCU operating state of “connected”; an indication of a state of a microphone of the DU, wherein an operating state where a voice call is connected with audio to a remote attendee is mapped to a microphone state of “on”; an indication of a voting selection.

10. Apparatus according to any one of claims 7 to 9 which is configured to send control data indicating a request-to-speak in response to receiving an incoming cellular wireless call.

11. Apparatus according to any one of claims 7 to 10 wherein the apparatus is configured to: determine if a key has been pressed on the remote attendee telephone by monitoring audio from the telephone; and, when it is determined that a key has been pressed, output control data to the host apparatus.

12. Apparatus according to any one of claims 7 to 11 which is configured to receive control data from the host apparatus to instruct the cellular wireless transceiver to dial out to a telephone number of a remote attendee telephone, the control data indicating a number to be dialled.

13. Apparatus according to any one of claims 7 to 12 which is configured to send control data to the host apparatus when the cellular wireless transceiver receives a call from a remote attendee telephone.

14. Apparatus according to any one of claims 7 to 13 wherein the apparatus comprises a housing with an RF port and an RF antenna is connected to, or connectable to, the RF port by an RF cable.

15. Apparatus according to any one of the preceding claims wherein the cellular wireless apparatus comprises a cellular wireless transceiver and an input configured to receive a subscriber identity module.

16. A computer-implemented method for providing a conference between conference equipment at a first conference venue and a remote attendee telephone, the method comprising, at apparatus at the first conference venue: communicating with a conference control unit (CCU) and with a cellular wireless apparatus which is configured to support a voice call with a remote attendee telephone, wherein the first CCU is connectable to a plurality of first delegate units (DU) each having a microphone; the communicating comprising: receiving CCU audio from the CCU and sending audio data representing the CCU audio to the cellular wireless apparatus; receiving remote attendee audio data from the cellular wireless apparatus and sending audio representing the remote attendee audio data to the CCU.

17. A method according to claim 16 comprising receiving control data from the cellular wireless apparatus.

18. A method according to claim 17 comprising: mapping an operating state of the cellular wireless apparatus to control data which emulates CCU control data at the cellular wireless apparatus; and sending the control data from the cellular wireless apparatus to the apparatus at the first conference venue.

19. A method according to claim 17 or 18, wherein the control data comprises at least one of: an indication of an operating state (on/off) of a CCU, wherein an initialised state of the cellular wireless apparatus is mapped to a CCU operating state of “on”; an indication of a state of a delegate unit (DU) connected to a CCU, wherein a state where the cellular wireless apparatus is connected to a cellular wireless network is mapped to a CCU operating state of “connected”; an indication of a state of a microphone of the DU, wherein an operating state where a voice call is connected with audio to a remote attendee is mapped to a microphone state of “on”; an indication of a voting selection.

20. A method according to any one of claims 17 to 19 comprising: maintaining a request-to-speak queue of DUs; adding a DU to the request-to-speak queue in response to receiving CCU control data indicating that a request-to-speak has been made at a DU connected to the CCU; and, adding the remote attendee telephone to the request-to-speak queue in response to receiving control data indicating that a request-to-speak has been made at the remote attendee telephone.

21. A method according to any one of claims 17 to 20 comprising: determining if a key has been pressed on the remote attendee telephone by monitoring audio from the telephone; and when it is determined that a key has been pressed, outputting control data to the processor.

22. A computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of any one of claims 16 to 21.

23. A computer-readable medium having stored thereon the computer program of claim 22.

Description:
APPARATUS AND METHOD FOR PROVIDING A CONFERENCE BETWEEN A CONFERENCE VENUE AND A REMOTE ATTENDEE

BACKGROUND

Meetings involving a group of participants can use conference equipment which provides a unit on each participant’s desk and allows participants to speak, to hear others and to participate in voting. The conference equipment may be installed in a conference venue on a permanent or temporary basis. Examples of venues where this equipment is installed include conference venues, debating chambers of government buildings (e.g. councils, parliaments), commercial offices (e.g. meeting rooms, boardrooms) and educational establishments.

FIGURE 1 shows an example of conference equipment at a venue. Audio conferencing equipment typically comprises a Conference Control Unit (CCU) 40 and a plurality of delegate units (DU) 10. A delegate unit DU is typically a small desktop unit with a high quality microphone 11 and a loudspeaker 12. The delegate units DU 10 are connected to the CCU 40 by a wired link or a wireless link. During a conference, a participant wishing to speak can activate the microphone of their DU by pressing a button, or a microphone at the DU of the participant may be activated by an organiser. This causes audio to be output from that DU 10 to the CCU 40. Other DUs may be controlled so as to mute their microphones. The CCU 40 may mix audio from a single, or multiple, DUs and output the mixed audio to loudspeakers at the DUs so that the speaking participant can be heard by other participants. DUs may support other functions, such as voting.
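The mixing step performed by the CCU can be illustrated with a short sketch. The following Python function is hypothetical (the application does not specify a sample format); it assumes 16-bit PCM frames of equal length, one per active DU, and clips the sum to the legal sample range:

```python
def mix_frames(frames):
    """Mix equal-length frames of signed 16-bit PCM samples, one frame per active DU.

    Returns a single mixed frame, clipped to the 16-bit range so that
    simultaneous speakers cannot overflow the output.
    """
    mixed = [sum(column) for column in zip(*frames)]
    return [max(-32768, min(32767, s)) for s in mixed]
```

A real CCU would typically also apply per-channel gain and automatic level control before summing, but the clip-after-sum structure above captures the basic operation.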

Video equipment may also be provided at the venue. The video equipment is typically separate from the audio conferencing equipment. One or more cameras 20 are connected to a camera and video control unit 35. An operator can select a video input from one of the cameras for output to a display 30. The operator can also manually control the cameras to provide a required view, such as of the participants currently speaking.

The audio conferencing equipment at a venue is typically supplied by a single manufacturer and is an expensive investment. Currently, the installed conference equipment at a conference venue is used in isolation. That is, a conference takes place between the participants within the venue, but does not allow participants at the venue to participate in a conference with other participants outside the venue, or participants at another conference venue. Manufacturers have, in general, developed proprietary technology which is intended to tie a customer to their conference equipment. The video conferencing equipment at a venue is typically supplied by a different manufacturer to the audio conferencing equipment.

There are known technologies for conferencing between multiple geographically distributed parties. For example, audio telephone conferencing is well known via telephone lines, with a switch acting as a conference bridge. Audio and video conferencing is also possible between remote users via a conferencing service hosted on an Internet-based server. FIGURE 2 shows an example of an internet-based conferencing service. Each terminal 60 participating in the conference may be a personal computer (PC), tablet, smartphone or other computing device running an application 62 of the conferencing provider. Each terminal 60 communicates, via a network 64, with a server 66 supporting the conferencing service. Video conferencing services can support audio, video and sharing of presentation materials, such as slides. Existing technologies for video conferencing between remote parties are typically restricted to terminals which are dedicated to a single user (e.g. a user’s computer or phone), or a terminal which is intended to be shared by a group of users, such as a microphone unit in a meeting room which is shared by the group of users. These systems are not compatible with venues with multi-user installed conferencing systems.

It is possible to hold a meeting between a plurality of participants within a conference venue, who are served by conventional on-site conferencing equipment, and remote attendees. Typically, this requires remote attendees to use a video conferencing application which is independent of the onsite equipment and for an operator at a conference venue to manually switch to a feed from the internet video conferencing application when a remote attendee has permission to speak, or to patch a single combined feed of remote attendees from an internet video conferencing application to the conference venue. This is cumbersome for the meeting operator to manage. For example, the operator may need to manually mute and unmute the microphone of a remote attendee when it is their turn to speak, or each remote attendee may need to manually unmute their microphone when it is their turn to speak. This can interrupt the flow of a meeting and can result in a frustrating user experience.

It is an aim of the present invention to address at least one disadvantage associated with the prior art.

SUMMARY OF THE INVENTION

There is provided apparatus for use at a first conference venue comprising: a conference control unit (CCU) interface configured to communicate with a conference control unit at the first conference venue, wherein the CCU is connectable to a plurality of DUs each having a microphone; a cellular wireless apparatus which is configured to support a voice call with a remote attendee telephone; a processor configured to support an audio conference between the CCU at the first conference venue and the remote attendee telephone via the cellular wireless apparatus, the processor configured to: receive CCU audio from the CCU and send audio data representing the CCU audio to the cellular wireless apparatus; receive remote attendee audio data from the cellular wireless apparatus and send audio representing the remote attendee audio data to the CCU.

Optionally, the processor is configured to receive control data from the cellular wireless apparatus.

Optionally, the cellular wireless apparatus is configured to map an operating state of the cellular wireless apparatus to control data which emulates CCU control data.

Optionally, the control data comprises at least one of: an indication of an operating state (on/off) of a CCU, wherein an initialised state of the cellular wireless apparatus is mapped to a CCU operating state of “on”; an indication of a state of a delegate unit (DU) connected to a CCU, wherein a state where the cellular wireless apparatus is connected to a cellular wireless network is mapped to a CCU operating state of “connected”; an indication of a state of a microphone of the DU, wherein an operating state where a voice call is connected with audio to a remote attendee is mapped to a microphone state of “on”; an indication of a voting selection.
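The state mapping described above can be sketched as a simple lookup table. This is illustrative only: the state names and control-data field names (`ccu_power`, `du_state`, `mic_state`) are assumptions, as the application does not define a wire format for the emulated CCU control data.

```python
# Hypothetical state and field names, chosen for illustration only.
CCU_EMULATION_MAP = {
    "initialised": {"ccu_power": "on"},               # modem booted -> CCU state "on"
    "network_registered": {"du_state": "connected"},  # on cellular network -> DU "connected"
    "call_audio_active": {"mic_state": "on"},         # voice call with audio -> microphone "on"
}

def emulate_ccu_control(modem_state):
    """Translate an operating state of the cellular wireless apparatus
    into control data that emulates CCU control data."""
    return CCU_EMULATION_MAP.get(modem_state, {})
```

Because the emitted control data looks like ordinary CCU control data, the host apparatus can treat the remote attendee telephone as if it were just another delegate unit.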

Optionally, the processor is configured to: maintain a request-to-speak queue of DUs; add a DU to the request-to-speak queue in response to receiving CCU control data indicating that a request-to-speak has been made at a DU connected to the CCU; and, add the remote attendee telephone to the request-to-speak queue in response to receiving control data indicating that a request-to-speak has been made at the remote attendee telephone.
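A minimal sketch of such a unified request-to-speak queue follows, treating in-venue DUs and the remote attendee telephone as interchangeable entries (the class and method names are illustrative, not taken from the application):

```python
class SpeakQueue:
    """First-come-first-served request-to-speak queue mixing DUs and remote attendees."""

    def __init__(self):
        self._queue = []  # ordered (source, identifier) pairs

    def on_request_to_speak(self, source, identifier):
        # source: "DU" for an in-venue delegate unit, "RA" for a remote attendee telephone
        entry = (source, identifier)
        if entry not in self._queue:  # ignore repeated requests from the same unit
            self._queue.append(entry)

    def next_speaker(self):
        # The chairperson grants the floor to the head of the queue, if any.
        return self._queue.pop(0) if self._queue else None
```

Keeping both kinds of participant in one ordered structure is what lets a remote attendee wait their turn alongside in-venue delegates rather than being handled out of band.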

Optionally, the cellular wireless apparatus is configured to: determine if a key has been pressed on the remote attendee telephone by monitoring audio from the telephone; and, when it is determined that a key has been pressed, output control data to the processor.
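Detecting a key press by monitoring the call audio is commonly done by looking for DTMF tone pairs, for example with the Goertzel algorithm. The sketch below is one plausible implementation, not the application's own; the power threshold is an assumption tuned for unit-amplitude test tones at an 8 kHz telephony sample rate.

```python
import math

# Standard DTMF frequency pairs: one row tone plus one column tone per key.
DTMF_ROWS = (697, 770, 852, 941)
DTMF_COLS = (1209, 1336, 1477, 1633)
DTMF_KEYS = (("1", "2", "3", "A"),
             ("4", "5", "6", "B"),
             ("7", "8", "9", "C"),
             ("*", "0", "#", "D"))

def goertzel_power(samples, sample_rate, freq):
    """Goertzel filter: spectral power of one frequency in a block of samples."""
    coeff = 2.0 * math.cos(2.0 * math.pi * freq / sample_rate)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev * s_prev + s_prev2 * s_prev2 - coeff * s_prev * s_prev2

def detect_key(samples, sample_rate=8000):
    """Return the DTMF key present in the audio block, or None if no key tone."""
    n = len(samples)
    threshold = 0.2 * (n / 2.0) ** 2  # assumed threshold, relative to a full-scale tone
    row_p = [goertzel_power(samples, sample_rate, f) for f in DTMF_ROWS]
    col_p = [goertzel_power(samples, sample_rate, f) for f in DTMF_COLS]
    r = max(range(4), key=row_p.__getitem__)
    c = max(range(4), key=col_p.__getitem__)
    if row_p[r] < threshold or col_p[c] < threshold:
        return None  # no key tone present in this block
    return DTMF_KEYS[r][c]
```

When a key is detected in this way, the apparatus can output the corresponding control data, for example interpreting a particular key as a request-to-speak or a voting selection.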

An aspect provides cellular wireless apparatus for use at a conference venue comprising: a connector port configured to connect to a host apparatus at the conference venue; a cellular wireless transceiver configured to support a voice call with a remote attendee telephone; a processor configured to: receive audio data from the host apparatus for sending in the voice call with the remote attendee telephone; send remote attendee audio data from the voice call with the remote attendee telephone to the host apparatus; receive control data from the host apparatus for controlling the voice call with the remote attendee telephone; send control data to the host apparatus indicative of a status of the cellular wireless apparatus.

Optionally, the processor is configured to send control data which emulates conference control unit (CCU) control data, and the processor is configured to map an operating state of the cellular wireless apparatus to the CCU control data.

Optionally, the processor is configured to send control data which comprises: an indication of an operating state (on/off) of a CCU, wherein an initialised state of the cellular wireless apparatus is mapped to a CCU operating state of “on”; an indication of a state of a delegate unit (DU) connected to a CCU, wherein a state where the cellular wireless apparatus is connected to a cellular wireless network is mapped to a CCU operating state of “connected”; an indication of a state of a microphone of the DU, wherein an operating state where a voice call is connected with audio to a remote attendee is mapped to a microphone state of “on”; an indication of a voting selection.

Optionally, the apparatus is configured to send control data indicating a request-to-speak in response to receiving an incoming cellular wireless call.

Optionally, the apparatus is configured to: determine if a key has been pressed on the remote attendee telephone by monitoring audio from the telephone; and, when it is determined that a key has been pressed, output control data to the host apparatus.

Optionally, the apparatus is configured to receive control data from the host apparatus to instruct the cellular wireless transceiver to dial out to a telephone number of a remote attendee telephone, the control data indicating a number to be dialled.

Optionally, the apparatus is configured to send control data to the host apparatus when the cellular wireless transceiver receives a call from a remote attendee telephone.

Optionally, the apparatus comprises a housing with an RF port and an RF antenna is connected to, or connectable to, the RF port by an RF cable.

Optionally, the apparatus comprises a cellular wireless transceiver and an input configured to receive a subscriber identity module.

An aspect provides a computer-implemented method for providing a conference between conference equipment at a first conference venue and a remote attendee telephone, the method comprising, at apparatus at the first conference venue: communicating with a conference control unit (CCU) and with a cellular wireless apparatus which is configured to support a voice call with a remote attendee telephone, wherein the first CCU is connectable to a plurality of first delegate units (DU) each having a microphone; the communicating comprising: receiving CCU audio from the CCU and sending audio data representing the CCU audio to the cellular wireless apparatus; receiving remote attendee audio data from the cellular wireless apparatus and sending audio representing the remote attendee audio data to the CCU.

Optionally, the method comprises receiving control data from the cellular wireless apparatus.

Optionally, the method comprises: mapping an operating state of the cellular wireless apparatus to control data which emulates CCU control data at the cellular wireless apparatus; and sending the control data from the cellular wireless apparatus to the apparatus at the first conference venue.

Optionally, the control data comprises at least one of: an indication of an operating state (on/off) of a CCU, wherein an initialised state of the cellular wireless apparatus is mapped to a CCU operating state of “on”; an indication of a state of a delegate unit (DU) connected to a CCU, wherein a state where the cellular wireless apparatus is connected to a cellular wireless network is mapped to a CCU operating state of “connected”; an indication of a state of a microphone of the DU, wherein an operating state where a voice call is connected with audio to a remote attendee is mapped to a microphone state of “on”; an indication of a voting selection.

Optionally, the method comprises: maintaining a request-to-speak queue of DUs; adding a DU to the request-to-speak queue in response to receiving CCU control data indicating that a request-to-speak has been made at a DU connected to the CCU; and, adding the remote attendee telephone to the request-to-speak queue in response to receiving control data indicating that a request-to-speak has been made at the remote attendee telephone.

Optionally, the method comprises: determining if a key has been pressed on the remote attendee telephone by monitoring audio from the telephone; and when it is determined that a key has been pressed, outputting control data to the processor.

An aspect provides a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method as disclosed or claimed. Another aspect provides a computer-readable medium having stored thereon the computer program.

The functionality described in this document can be implemented in hardware, software executed by a processing apparatus, or by a combination of hardware and software. The processing apparatus can comprise a computer, a processor, a state machine, a logic array or any other suitable processing apparatus. The processing apparatus can be a general-purpose processor which executes software to cause the general-purpose processor to perform the required tasks, or the processing apparatus can be dedicated to perform the required functions. Another aspect of the invention provides machine-readable instructions (software) which, when executed by a processor, perform any of the described methods. The machine-readable instructions may be stored on an electronic memory device, hard disk, optical disk or other machine-readable storage medium. The machine-readable medium can be a non-transitory machine-readable medium. The term “non-transitory machine-readable medium” comprises all machine-readable media except for a transitory, propagating signal. The machine-readable instructions can be downloaded to the storage medium via a network connection.

An advantage of at least one example or embodiment is that participants at the first conference venue and the remote attendee can participate in a more unified meeting environment. That is, participants at the first venue and the remote attendee can hear each other. A remote attendee can have their microphone turned on under the control of a chairperson, and may take part in voting.

An advantage of at least one example or embodiment is that the conference organiser can retain control over the equipment and phone numbers to join a meeting. A call via the cellular wireless network terminates at the cellular wireless apparatus. This contrasts with a situation where there is a need to connect to a private branch exchange (PBX) or similar equipment at a conference venue. This can be especially problematic when a meeting is held at a new conference venue. Using a PBX at a venue has disadvantages such as: the phone number for dialling in/out is associated with the PBX equipment at the venue; there is a need to correctly interface with the PBX; the PBX must have sufficient spare capacity to support the number of remote attendees; and distributing the phone number to remote attendees.

The term “conference control unit” (CCU) relates to a unit which is found in audio conferencing systems. The CCU is connectable to a plurality of delegate units (DU). The connection between a CCU and a DU may be a wired or a wireless link. The maximum number of DUs served by a CCU varies between vendors and may range, for example, from 8 or 12, to a larger number such as 150 or 245. The CCU is configured to send audio to the DUs and to receive audio from the DUs. The CCU is configured to send control data/signals to the DUs and to receive control data/signals from the DUs. The CCU may provide power to the DUs. DUs may alternatively be called microphone units (MU) or contribution units. Some vendors use a different term to describe a unit which performs the function of a conference control unit, such as “central control unit (CCU)” or “audio processor and powering switch”.

Embodiments of the invention may be understood with reference to the appended claims.

Within the scope of this application it is envisaged that the various aspects, embodiments, examples and alternatives, and in particular the individual features thereof, set out in the preceding paragraphs, in the claims and/or in the following description and drawings, may be taken independently or in any combination. For example features described in connection with one embodiment are applicable to all embodiments, unless such features are incompatible.

For the avoidance of doubt, it is to be understood that features described with respect to one aspect of the invention may be included within any other aspect of the invention, alone or in appropriate combination with one or more other features.

BRIEF DESCRIPTION OF THE DRAWINGS

One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying figures in which:

FIGURE 1 shows conference equipment at a venue with separate audio equipment and video equipment;

FIGURE 2 shows conference equipment which supports an audio and video conference via a network-based server;

FIGURE 3 shows a system with conference equipment at two venues;

FIGURE 4 shows an alternative version of the system of FIGURE 3;

FIGURE 5A shows an example of a delegate unit for use in the system of FIGURES 3 and 4;

FIGURE 5B shows an example of a camera for use in the system of FIGURES 3 and 4;

FIGURES 6A and 6B show examples of interfaces between equipment in the system of FIGURES 3 and 4;

FIGURE 7 shows an example of sending audio between VENUE A and VENUE B;

FIGURE 8 shows an example of sending audio between VENUE B and VENUE A;

FIGURE 9 shows an example of sending video between VENUE A and VENUE B, and between VENUE B and VENUE A;

FIGURE 10 shows an example of sending control data between VENUE A and VENUE B when a CCU is operated in a first mode;

FIGURE 11 shows an example of sending control data between VENUE A and VENUE B when a CCU is operated in a second mode;

FIGURE 12 shows another example of sending control data between VENUE A and VENUE B when a CCU is operated in the second mode;

FIGURE 13 shows an example data structure for participants/seats;

FIGURE 14 shows an example data structure for a queue of delegate units waiting to speak;

FIGURE 15 shows a system with conference equipment at three venues;

FIGURE 16 shows an example of configuring a conference between venues;

FIGURE 17 shows an example of a GUI display during the configuration;

FIGURE 18 shows another example of a GUI display during the configuration;

FIGURE 19 shows another example of a GUI display during the configuration;

FIGURE 20 shows a conferencing system which supports remote attendees;

FIGURE 21A shows a Director at VENUE A communicating with remote attendees via a server;

FIGURE 21B shows a Director at VENUE A and a Director at VENUE B communicating with remote attendees via a server;

FIGURE 22 shows an example of sending audio between VENUE A and a remote attendee;

FIGURE 23 shows an example of sending control data between VENUE A and a remote attendee;

FIGURE 24 shows cellular wireless equipment for use in FIGURE 20;

FIGURES 25 to 27 show operation of the cellular wireless equipment;

FIGURE 28 shows a system with multiple CCUs;

FIGURE 29 shows an example of interfaces between equipment in the system of FIGURE 28;

FIGURE 30 shows an example of sending audio between CCUs;

FIGURE 31 shows an example of sending control data between CCUs;

FIGURE 32 shows functional modules of a Director apparatus;

FIGURE 33 shows software for CCU control at a Director apparatus;

FIGURE 34 shows a system with conference equipment at two venues and bridging functionality at a server;

FIGURE 35 shows processing apparatus for implementing a Director;

FIGURES 36 to 39 show methods of operation.

DETAILED DESCRIPTION

FIGURE 3 shows an example of a system with conference equipment at two venues: VENUE A, 100; and VENUE B, 200. The two venues may be different rooms in the same building or venues which are geographically spaced, such as council offices in two different towns or cities. A meeting may involve participants solely at VENUE A, using the conference equipment at VENUE A. Similarly, a meeting may involve participants solely at VENUE B, using the conference equipment at VENUE B. In some examples, it is possible to provide a meeting which involves participants at both VENUE A and VENUE B, using the conference equipment at VENUE A and VENUE B. This will be called a bridged meeting or an inter-venue meeting. A meeting may only involve participants who are physically present in the venue, or a meeting may involve participants who are physically present in the venue and at least one remote attendee (i.e. a participant who is not physically present in the venue.) In this disclosure, the term “meeting” includes a conference or an event.

VENUE A has audio conferencing equipment 105 comprising a conference control unit (CCU) 140 connected to a plurality of delegate units (DU) 110. Each DU has a microphone 111. Each DU may also have one or more of: a loudspeaker 112, an audio output socket for connecting to a headset, or some other form of audio output for providing audio to the participant. The connection between the DUs 110 and the CCU 140 may be wired or wireless. A delegate unit may be called a microphone unit (MU), a discussion unit or a contribution unit. Another name for a conference control unit is a central control unit or an audio processor and powering switch. The audio conferencing equipment 105 may be permanently installed in the venue, such as DUs which are mounted on desks in a fixed manner. Alternatively, the audio conferencing equipment may be temporarily installed in the venue, such as portable DUs placed on desks, and the audio conferencing equipment can be moved from one location to another. The audio conferencing equipment 105 may be equipment which is intended to be used in a standalone manner within VENUE A, i.e. equipment which is not intended to be used in conjunction with equipment external to the venue. The conferencing equipment 105 may be battery powered or powered via the CCU 140.

VENUE A has video equipment comprising a plurality of video sources, such as cameras 120. The video equipment comprises one or more displays 130. The cameras 120 may be controllable in terms of direction (pan and tilt) and field of view (zoom). This type of control is called pan, tilt and zoom (PTZ). During a conference, one of the cameras 120 can be controlled to steer, in terms of pan and tilt, to point at a speaking participant. Zoom can be controlled to suitably frame the speaking participant, or multiple participants. A video output of the speaking participant from the camera 120 can be output to display 130. The display 130, or other digital signage, can be used to display other information related to the conference, such as the name of the speaking participant, presentation materials (e.g. slides), voting instructions, voting results etc. An audio output may also be provided to an in-venue audio system 135 and/or to other parties wishing to listen to the conference, such as a translator.

VENUE A has a control unit called a Director 150. The Director 150 is connected to the CCU 140. In this example, the cameras 120 and displays 130 are also connected to the Director 150. The connection between the Director 150 and the CCU 140 can carry audio signals (e.g. in a digital format or in analogue format) and control/data signals. Examples of control data/signals are: control data/signals for turning a DU microphone on/off; control data/signals for requesting to speak; control data/signals indicating a voting selection made at a DU; control data for display at a DU.

VENUE B has similar equipment to VENUE A, with audio conferencing equipment 205 comprising a plurality of DUs 210 connected to a CCU 240. VENUE B also has video equipment comprising a plurality of cameras 220 and one or more displays 230. The cameras 220 and displays 230 are connected to the Director 250. The Director 250 is connected to the CCU 240. Each Director 150, 250 has control software which is executed by a processor to control operation of the Director.

In some examples, a conference can be established between VENUE A and VENUE B, with participants at both VENUE A and VENUE B. Control software at each venue is used to generate meeting information in respect of that venue. The meeting information is stored and accessible to the Director associated with the venue. The Directors 150, 250 each have an interface which is configured to allow Director-to-Director communication via a network 300. The network 300 may be a local area network or a wide area network (e.g. Internet).

The Directors 150, 250 can access network-based servers 302, 304, 306, 310. In this example, the servers comprise: a webcast server 302; a remote attendee system (RAS) 304; an admin server 306; and a third party document management server 310. The webcast server 302 supports distribution of an audio and video stream of the meeting via the Internet. The webcast may be limited to parts of the meeting which are open to the public.

Each Director 150, 250 stores data about participants in a meeting, shown as the boxes VENUE A, VENUE B, REMOTE. Each Director 150, 250 has an overview of all participants (seats) in a meeting and can allow all of the participants to participate in a unified meeting environment. For example, participants at VENUE B and remote attendees can hear (and see) other participants in the same manner as participants sitting at seats in VENUE A. Also, participants at VENUE B and remote attendees can participate in voting and access the meeting agenda and related documents in the same manner as participants sitting at seats in VENUE A. The data stored at each Director 150, 250 allows the Director to determine where the participant is physically located (e.g. VENUE A, VENUE B or remote attendee) and which interface the Director should use to communicate with the participant. The data stored at each Director 150, 250 indicates the current status of all participants. Some possible states are: MIC ON; MIC OFF; request-to-speak (RTS).
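The per-participant data held by a Director, as described above, can be pictured as a registry keyed by seat identifier. The following is a minimal illustrative sketch only; the class and field names are invented and do not form part of the disclosed apparatus.

```python
# Illustrative sketch of per-participant data a Director might hold.
# All names (Participant, ParticipantRegistry, field names) are invented.
from dataclasses import dataclass


@dataclass
class Participant:
    seat_id: str              # e.g. "A1", "B2", "R1"
    location: str             # "VENUE A", "VENUE B" or "REMOTE"
    interface: str            # which Director interface reaches this participant
    mic_state: str = "MIC OFF"  # "MIC ON", "MIC OFF" or "RTS"


class ParticipantRegistry:
    def __init__(self):
        self._seats = {}

    def add(self, p: Participant):
        self._seats[p.seat_id] = p

    def set_mic_state(self, seat_id: str, state: str):
        assert state in ("MIC ON", "MIC OFF", "RTS")
        self._seats[seat_id].mic_state = state

    def active_seats(self):
        # Seats whose microphone is currently on
        return [s for s, p in self._seats.items() if p.mic_state == "MIC ON"]
```

Such a registry would let a Director determine, for any participant, where they are physically located and which interface to use to communicate with them.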

FIGURE 4 shows a similar system to FIGURE 3. At VENUE A the Director 150 is connected to a switch 145 (e.g. an Ethernet switch). CCU 140 connects to the switch 145. Video equipment 120, 130 also connects to the switch 145. The CCU 140, video equipment 120, 130 and the Director 150 are all connected to a private local area network (LAN). Director 150 communicates with audio equipment 105 and video equipment 120, 130 using data frames/packets. This arrangement is suitable for a system where the audio equipment 105 and video equipment 120, 130 has an Ethernet or IP interface, or can be connected to a network by a network adapter. FIGURE 4 shows VENUE B with the same connection arrangement as FIGURE 3. However, VENUE B may also use a LAN and a switch to connect Director 250 to audio equipment 205 and video equipment 220, 230. Each Director 150, 250 in FIGURE 4 can also communicate with cloud servers 302-310, in the same manner as shown in FIGURE 3. Video equipment may connect directly to the Directors 150, 250 instead of, or in addition to, any video equipment connected to a LAN. A direct connection may use a High Definition Multimedia Interface (HDMI), or another kind of video or media interface.

The conference equipment 110, 120, 130, 140 at VENUE A may be the same as the conference equipment 210, 220, 230, 240 at VENUE B. Alternatively, the conference equipment 110, 120, 130, 140 at VENUE A may be different to the conference equipment 210, 220, 230, 240 at VENUE B. That is, the conference equipment at VENUE A and the conference equipment at VENUE B may be heterogeneous. The conference equipment at VENUE A may use different technology compared to the conference equipment at VENUE B. For example, the CCU 140 may send control commands to turn a microphone on/off at a DU 110. The format of these control commands can be different between the CCU 140 and DU 110 of the conference equipment at VENUE A compared to control commands between the CCU 240 and DU 210 of the conference equipment at VENUE B. Similarly, the PTZ control commands to control pan, tilt and zoom of the camera 120 may differ from the PTZ control commands to control pan, tilt and zoom of the camera 220. Other possible differences in technology include: audio coding formats for carrying audio between DU 110 and CCU 140 compared to DU 210 and CCU 240; video coding formats for carrying video between camera 120 and Director 150 compared to camera 220 and Director 250; the external interfaces of CCU 140 and CCU 240. It will be understood that there can be other differences. The conference equipment at VENUE A may be manufactured by a different company compared to the conference equipment at VENUE B. For example, the audio conference equipment 110, 140 at VENUE A may be manufactured by company X and the audio conference equipment 210, 240 at VENUE B may be manufactured by company Y. Each company may have a proprietary interface for sending and receiving audio data signals, and/or a proprietary interface for sending and receiving control data/signals.
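One way to picture the handling of heterogeneous, vendor-specific command formats is a per-vendor adapter that translates a common request (e.g. "turn microphone on") into each vendor's wire format. This is an illustrative sketch only; the two command formats below are invented and do not correspond to any actual manufacturer's protocol.

```python
# Illustrative sketch: per-vendor adapters translating a common
# "turn microphone on/off" request into vendor-specific command formats.
# Both wire formats here are invented for illustration.

class VendorXAdapter:
    def mic_command(self, du_id: int, on: bool) -> bytes:
        # Hypothetical vendor X: ASCII command terminated by a newline
        return f"MIC {du_id} {'ON' if on else 'OFF'}\n".encode()


class VendorYAdapter:
    def mic_command(self, du_id: int, on: bool) -> bytes:
        # Hypothetical vendor Y: compact binary frame [0xA5, du_id, state]
        return bytes([0xA5, du_id, 0x01 if on else 0x00])


def turn_mic_on(adapter, du_id: int) -> bytes:
    # The caller works with one common interface regardless of vendor
    return adapter.mic_command(du_id, True)
```

With such adapters, the rest of the system can issue the same logical command whether it is addressing the equipment at VENUE A or the differently manufactured equipment at VENUE B.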

FIGURE 5A shows an example of a Delegate Unit (DU) 110. The DU comprises a housing 111 with a microphone 112. A control button or key 113 is provided on the housing 111. The control button 113 may function as a switch to allow the participant to turn on the microphone 112 before speaking. Alternatively, the control button 113 may function as a request-to-speak button. The participant may press the control button 113 to indicate that they wish to speak. The DU may also comprise an indicator 114 which can be illuminated when the microphone is turned on (active). The DU may comprise one or more loudspeakers 115 to provide the participant with audio. The audio may be the current speaking participant, a mix of the current speaking participants, or other audio, such as audio accompanying a video presentation. The DU may comprise a display 116. The display 116 can be used to display operating settings or information related to the conference. The DU may comprise one or more further buttons or keys 117 for other purposes during the conference, such as voting (e.g. “YES”, “NO”). The display 116 may be a display which only displays information to a participant, or it may be a touch screen display which allows a participant to make selections. One or more of the buttons, such as buttons 117 may be implemented as virtual buttons on the touch screen. For example, at a time of a vote, the display can display two virtual buttons for “YES” and “NO”. The DU may receive power via a cable which carries audio/data and power. Alternatively, the DU may have a connection to a power supply, or include a battery.

FIGURE 5B shows an example of a camera 120 supporting pan, tilt and zoom. The image sensing portion of the camera (lens 126 and an image sensor) is carried by part 124. The camera comprises a base 121 and a cradle 123 which is rotatable relative to the base 121. This provides movement in the pan direction 128. Part 124 is movable relative to the cradle 123 about an axis 124A. This provides movement in the tilt direction 125. Movement is performed by electrical motors. The camera 120 is highly manoeuvrable and has a small physical footprint. It will be understood that many other different cameras are possible.

FIGURE 6A shows a first example overview of interfaces and of signals and data sent between equipment at venue 100. The DU 110 typically communicates with a CCU 140 using a proprietary communications protocol. The communications link may be wired or wireless, such as Institute of Electrical and Electronics Engineers IEEE 802.11 (Wi-Fi™). Each DU 110 is able to transmit an audio signal (in analogue or digital format) via the communications link when the microphone associated with the DU is active. The DU 110 can also receive an audio signal from the CCU and output the audio through one or more loudspeakers 115 (FIGURE 5A). The DU 110 can transmit a control signal/data in the form of a microphone Request-to-Speak (RTS) indication each time a microphone RTS button is activated. The DU 110 can also transmit MIC ON and MIC OFF indications (e.g. in the form of digital control data) to inform the CCU 140 of the status of the microphone associated with the DU. Other possible control signals are control signals to support voting functions. During a meeting, a participant may have an opportunity to vote on a matter. A DU may present the participant with voting selections, such as “Yes”, “No” or “Abstain”. A DU may receive a control signal indicating when a vote occurs (and voting options) and the DU may output a voting selection, indicating the voting selection of the user, such as Yes/No/Abstain.

The CCU 140 is configured to output an audio signal/audio data and control data to the Director 150. The audio signal may be a mix of audio signals from one or more active microphones. The audio signal may comprise two or more individual channels. For example, in systems employing Dante™ audio networking, up to eight individual audio channels may be output by the CCU 140 in a digital, frame or packet-based format.

The Director has an interface 152 configured to communicate with the CCU 140. Interface 152 will be called an audio interface or a CCU audio interface. In general, audio interface 152 can receive audio (in analogue or digital form) from the CCU 140 and can send audio (in analogue or digital form) to the CCU 140.

Various control data is sent from the CCU 140 and received by the Director 150. The control data output by the CCU 140 can indicate whether a DU is connected to the CCU 140. The control data output by the CCU 140 can provide information in respect of a microphone of a DU, such as one or more of: an indication of when an RTS button of a DU is pressed; confirmation of a microphone on/off at a DU. The Director 150 has an interface 151 configured to communicate with the CCU 140. Interface 151 will be called a CCU interface or a CCU control interface as it sends control data/signals to the CCU 140 and receives control data/signals from the CCU 140.

As described above, the audio and control signals between the CCU 140 and Director 150 may be carried in frame or packet-based form over a local area network, such as in payloads of Ethernet frames. The interfaces 151, 152 between the CCU 140 and Director 150 can have an IP (TCP/IP) layer. The interfaces 151, 152 can comprise at least one vendor-specific layer, such as one or more of: an audio coding format; an audio data encryption type/format; a protocol for sending control data; a control data coding format; a control data encryption type/format. The CCU interface 151 can have at least one different layer to the audio interface 152, such as different coding, encryption or protocol. The term “layer” refers to a layer in the Open Systems Interconnection (OSI) model. The differences may occur, for example, in Layer 6 (presentation).

The Director 150 can receive video from a range of video sources, such as cameras. Camera 120 has an interface 122 to connect to a network. Video and control signals between the camera 120 and Director 150 may be carried in frame or packet-based form over a local area network, such as in payloads of Ethernet frames. The camera 120 can have an IP address and the interface can have an IP (TCP/IP) layer. The Director 150 has a video source (video) interface 153 for receiving video data from the camera 120 and a video source (control) interface 154 for communicating control data to the camera, such as commands to control pan, tilt and zoom (PTZ) operations of the camera. The interface with the camera may also carry power, or a separate power supply may be provided for the camera. An example of an interface which carries data and power is Power over Ethernet (PoE), with the video data and control data carried in the payloads of Ethernet frames. The video source (video) and video source (control) interfaces 153, 154 can have an IP (TCP/IP) layer. They can also comprise at least one vendor-specific layer, such as one or more of: a video coding format; a protocol for sending control data; a control data coding format. Other video sources with a network interface (e.g. IP) can be connected in the same manner as shown. A video source may connect directly to the Directors 150, 250 instead of, or in addition to, any video equipment connected to a LAN. A direct connection may use a High Definition Multimedia Interface (HDMI), or another kind of video or media interface.

The Director 150 comprises a video output interface 155 for communicating with at least one display 130 or digital signage device. The Director 150 may comprise additional audio interfaces 156, 157. Audio interface 156 is configured to receive audio from another audio source, such as a microphone (other than a DU 110), a microphone associated with camera 120, audio accompanying video received from a video player or another media source (local or networked). Audio interface 157 is configured to output audio to an audio system 135 or other audio device requiring a feed from a meeting. For example, a meeting requiring a translation service requires an audio output to allow a translator to perform translation in real time. It is also advantageous to provide an accompanying video output to the translator so that the translator can see the lips of the person speaking.

FIGURE 6A shows logically separate flows of control data to CCU (control) interface 151 and audio to audio interface 152. In practice, these two flows can be carried over the same physical connection for at least part of their paths, such as a LAN segment. Similarly, in systems where video is also carried over the same LAN as audio and control data, the video, audio and control can be carried over the same physical connection for at least part of their paths.

The Director 150 has an interface 158 for communicating with one or more other Directors. This will be called an inter-venue interface 158. Communication with other Directors may be via a local area network (e.g. where Directors are located in the same building) or via a wide area network such as the Internet (e.g. where Directors are located in different buildings). This interface can be an IP interface, with audio data, video data and control data carried in IP packets.

The interfaces at the Director 150 comprise a physical port/connection (e.g. an Ethernet socket), hardware associated with the port (e.g. an Ethernet card) and software, executed by the Director, which is configured to format data for transmission and extract data on reception. The software is also configured to communicate with another entity (CCU 140, camera 120, Director B) to coordinate transmission of data. The communication may use a protocol which is proprietary (closed) or open.

FIGURE 6B shows another example overview of interfaces and of signals sent between equipment at venue 100. The main difference in FIGURE 6B is that the CCU sends/receives analogue audio and the camera outputs analogue video. A conversion unit 160 is provided. In some examples, the conversion unit 160 is a separate physical device to the Director. This can allow the Director 150 to be implemented on a conventional PC, while the conversion unit 160 can be tailored to the particular connectivity requirements. The conversion unit 160 has an interface 161 for communicating control data/signals with the CCU 140 and an interface 162 for communicating audio signals with the CCU 140. In an upstream (CCU to Director) direction, analogue audio signals are converted to digital audio data by an analogue-to-digital converter (ADC) at the conversion unit 160. In a downstream (Director to CCU) direction, digital audio data is converted to audio signals by a digital-to-analogue converter (DAC) at the conversion unit 160. The conversion unit 160 has an interface 163 for receiving analogue video signals from the camera 120. A video capture unit converts the analogue video signals to digital video data. The conversion unit 160 has a digital interface 164 for communicating with the Director 150. For example, the audio data, video data and control data can be carried over Universal Serial Bus (USB), in the payloads of Ethernet frames, or some other kind of digital interface. The Director 150 has a complementary digital interface 166 for communicating with the conversion unit 160. The conversion unit has a module 165 which is configured to package digital data (audio, control, video) into data units for transmission (e.g. USB data packets or Ethernet frames) to the Director 150 and to unpackage digital data (audio, control, video) received from the Director 150. In FIGURE 6B the Director has a CCU (control) interface 151 and an audio interface 152. 
Control data can require similar processing at interface 151 as described for FIGURE 6A, such as encoding/decoding, encryption/decryption, formatting. Audio data may be simpler to process compared to FIGURE 6A, as analogue-to-digital conversion has been performed locally by the ADC at the conversion unit 160. The video source (video) interface 153 can operate in a similar manner as described for FIGURE 6A. The video source (video) interface 153 may comprise at least one vendor-specific layer, such as one or more of: a protocol for sending control data; a control data coding format. The video source (video) interface 153 may receive video data in raw or compressed format from the video capture unit of the conversion unit 160.
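The packaging performed by module 165 of the conversion unit can be pictured as tagging each payload (audio, video or control) with a type and length so that the receiving Director can demultiplex the stream. The following is an illustrative sketch only; the framing format (a type byte followed by a big-endian length) is invented, not the actual USB or Ethernet framing used.

```python
# Illustrative sketch of multiplexing audio/video/control payloads into
# tagged data units, as module 165 might do. The framing is invented:
# 1 type byte + 2-byte big-endian length + payload.
import struct

TYPES = {"audio": 0x01, "video": 0x02, "control": 0x03}


def package(kind: str, payload: bytes) -> bytes:
    # Prepend a type byte and a 16-bit payload length
    return struct.pack("!BH", TYPES[kind], len(payload)) + payload


def unpackage(frame: bytes):
    # Recover the payload kind and bytes from a tagged data unit
    kind, length = struct.unpack("!BH", frame[:3])
    name = {v: k for k, v in TYPES.items()}[kind]
    return name, frame[3:3 + length]
```

The complementary digital interface 166 at the Director would perform the inverse operation on reception.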

FIGURES 7-12 show examples of a meeting/conference between participants at VENUE A and VENUE B. The process of configuring a meeting between venues, and of pairing Directors at VENUE A and VENUE B is described later on. The upper part of FIGURE 7 shows a simplified schematic of equipment at VENUE A and VENUE B. The equipment at each venue includes: delegate units DU; a conference control unit CCU; a camera CAM and a display (DISP). VENUE A has a plurality (N) of DUs: DU(A1)-DU(AN). Similarly, VENUE B has a plurality (N) of DUs: DU(B1)-DU(BN). The number of DUs at each venue can be the same, or different. The lower part of each of FIGURES 7 to 12 shows flows of audio, data and control signals between equipment within a venue, and between venues.

FIGURES 7 and 8 show audio flow paths. FIGURE 7 shows audio flow between VENUE A and VENUE B and FIGURE 8 shows audio flow between VENUE B and VENUE A. The main sources of audio signals are the microphones at the DUs. In this example, DU(A1) is active (i.e. microphone turned on) at VENUE A and DU(B1) and DU(B2) are both active at VENUE B. Audio of the speaking participant is captured by a microphone at DU(A1) and output to the CCU 140. CCU 140 outputs a mix of the audio of active DUs to each of the DUs DU(A1-AN). So far this is the normal operation of the audio conferencing system. CCU 140 also outputs the audio (or an audio mix, if there are multiple active DUs) to Director 150. Director 150 may decode, or transcode, audio received from the CCU 140. An example format for forwarding audio between Directors is Advanced Audio Coding (AAC). Various other audio coding formats can be used. A selection of an audio coding format may be based on available bandwidth. For example, when two Directors are located within a single building and are connected by a high bandwidth network connection, audio can be sent in a high bit rate raw or lossless audio coding format whereas when two Directors are connected by a lower bandwidth network a lower bit rate compressed audio coding format can be used. Director 150 encodes the audio and forwards the encoded audio to Director 250 at VENUE B as inter-venue audio data. This inter-venue audio data represents the audio received from the CCU. Director 250 decodes (or transcodes) the audio and outputs the audio to CCU 240. The Director 250 may transcode the audio to a particular format of the CCU 240. CCU 240 outputs audio to DUs DU(B1-BN). This allows each participant at VENUE B to hear the speaking participant(s) at VENUE A.
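The bandwidth-based selection of an audio coding format described above can be sketched as a simple decision function. This is an illustrative sketch under assumed numbers: the threshold and the codec labels are invented, not values from the disclosure.

```python
# Illustrative sketch of bandwidth-based audio codec selection between
# Directors. The 10 Mbit/s threshold and codec labels are invented.
def select_audio_codec(available_kbps: float) -> str:
    """Pick an inter-venue audio coding format from available bandwidth."""
    if available_kbps >= 10_000:
        # High-bandwidth link (e.g. Directors in the same building):
        # raw or lossless coding preserves quality
        return "lossless"
    # Lower-bandwidth link (e.g. Internet): compressed coding such as AAC
    return "aac"
```

A Director could re-evaluate such a choice when the network path between venues changes.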

FIGURE 8 shows audio flow paths between VENUE B and VENUE A. Audio of the speaking participants is captured by a microphone at each of DU(B1) and DU(B2) and output to the CCU 240. CCU 240 outputs a mix of the audio of active DUs to each of the DUs DU(B1-BN). CCU 240 also outputs the audio mix to Director 250. Director 250 may decode, or transcode, audio received from the CCU 240. Director 250 encodes the audio and forwards the encoded audio to Director 150 at VENUE A as inter-venue audio data. Director 150 decodes (or transcodes) the audio and outputs the audio to CCU 140. This audio represents the inter-venue audio data received from Director B. The Director 150 may transcode the audio to a particular format of the CCU 140. CCU 140 outputs audio to DUs DU(A1-AN). This allows each participant at VENUE A to hear the speaking participants at VENUE B.

The audio processing for the direction VENUE A to VENUE B shown in FIGURE 7 and the audio processing for the direction VENUE B to VENUE A shown in FIGURE 8 occurs simultaneously. CCU 140 forms a mix of the active local DUs and any audio received from the Director 150. In this example, CCU 140 will form an audio mix of the audio from DU(A1) and remote DU(B1), DU(B2) received from Director 150. Similarly, CCU 240 forms a mix of the active local DUs and any audio received from the Director 250. In this example, CCU 240 will form an audio mix of the audio from DU(B1), DU(B2) and the remote DU(A1) received from Director 250.
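The mix formed by each CCU, combining active local DUs with audio received from its Director, amounts to summing the sample streams. The following is an illustrative sketch assuming 16-bit PCM samples with simple clipping; actual CCUs may use more sophisticated mixing and gain control.

```python
# Illustrative sketch of an audio mix: sum equal-length streams of
# 16-bit PCM samples (active local DUs plus inter-venue audio),
# clipping the result to the int16 range. Real CCU mixing is likely
# more sophisticated (per-channel gain, limiting, etc.).
def mix(frames):
    """frames: list of equal-length lists of 16-bit PCM samples."""
    out = []
    for samples in zip(*frames):
        s = sum(samples)
        out.append(max(-32768, min(32767, s)))  # clip to int16 range
    return out
```

In the FIGURE 7/8 example, CCU 140 would mix the DU(A1) stream with the DU(B1)/DU(B2) audio arriving via Director 150, and CCU 240 would do the converse.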

FIGURES 7 and 8 show audio from speaking participants at DUs. Each Director 150, 250 can also connect to one or more audio sources. One example of an audio source is audio accompanying a presentation, e.g. played by a video player apparatus, a software player playing a video file, or a network media source. If the source of the audio is local to VENUE A, the audio is output to Director 150, decoded/transcoded (if required) and forwarded to Director 250. Director 250 can decode/transcode the audio (if required) before outputting it to CCU 240 and on to DU(B1-BN) and/or other loudspeakers at VENUE B. CCU 140 forms an audio mix which includes the audio and outputs audio to DU(A1-AN). Similarly, if the source of the audio is local to VENUE B, the audio is output to Director 250, decoded/transcoded (if required) and forwarded to Director 150. Director 150 can decode/transcode the audio (if required) before outputting it to CCU 140 and on to DU(A1-AN) and/or other loudspeakers at VENUE A.

The upper part of FIGURE 9 shows video flow paths between VENUE A and VENUE B. One form of video sent from VENUE A is video output by cameras at the venues. Typically, this is video of the currently speaking participant(s), chairperson etc. This video allows other participants to see the currently speaking participant(s). In this example, the participant at DU(A1) is the currently speaking participant. CAM(A) outputs a video signal to Director 150. CAM(A) is controlled by the Director to point at DU(A1). Director 150 encodes (or transcodes) the video, if required, and forwards the encoded video to Director 250 at VENUE B. Director 250 decodes (or transcodes) the video, and outputs the video to one or more displays 230. This allows each participant at VENUE B to see the speaking participant at VENUE A. The Director 250 may transcode the video to a particular format for the display 230. Formats include: High Definition (HD, 1920 x 1080 pixels) and Ultra HD/4K (3840 x 2160 pixels). Frame rates are typically 25 or 30 frames per second (fps). An example format for forwarding video between Directors is Advanced Video Coding (AVC), H.264. Various other video coding formats can be used.

The lower part of FIGURE 9 shows video flow paths between VENUE B and VENUE A. This can include video of the currently speaking participant(s) at VENUE B. In this example, the participants at DU(B1) and DU(B2) are the currently speaking participants. CAM(B) outputs a video signal to Director 250. CAM(B) is controlled by the Director to point at DU(B1) and DU(B2). Alternatively, two cameras may be used: one pointing at DU(B1) and another camera pointing at DU(B2). Director 250 encodes (or transcodes) the video, if required, and forwards the encoded video to Director 150 at VENUE A. Director 150 decodes (or transcodes) the video, and outputs the video to one or more displays 130. This allows each participant at VENUE A to see the speaking participants at VENUE B. The Director 150 may transcode the video to a particular format for the display 130. The Director can use stored configuration data to determine a camera associated with a DU and camera settings for the DU (e.g. PTZ settings). This is described in more detail later on.

Each Director 150, 250 can also connect to one or more video sources. One example of a video source is a video player (an apparatus, or a software video player playing a file) or some other source. If the source of the video is local to VENUE A, the video is output to Director 150, decoded/transcoded (if required) and forwarded to Director 250. Director 250 can decode/transcode the video (if required) before outputting it to display 230. Similarly, if the source of the video is local to VENUE B, the video is output to Director 250, decoded/transcoded (if required) and forwarded to Director 150.

The Director 150 synchronises audio and video before forwarding the synchronised audio and video to the Director 250 at VENUE B. Similarly, the Director 250 synchronises audio and video before forwarding the synchronised audio and video to the Director 150 at VENUE A. This helps to minimise an offset, in time, between audio and corresponding video. This helps to reduce lip sync issues when the audio and video are output at VENUE A. Director 250 may introduce an audio-video offset to compensate for equipment at VENUE B. For example, if the audio processing path to DUs 210 is known to be longer than the video processing path to display 230 then a delay can be introduced to the video processing path. If the audio processing path to DUs 210 is known to be shorter than the video processing path to display 230 then a delay can be introduced to the audio processing path.
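The offset compensation described above, delaying whichever path is shorter so that audio and video arrive at the participant together, can be sketched as follows. This is an illustrative sketch only; the function name and millisecond units are assumptions.

```python
# Illustrative sketch of audio-video offset compensation: delay the
# shorter processing path so both paths have equal end-to-end latency.
def compensation_delays(audio_path_ms: float, video_path_ms: float):
    """Return (audio_delay_ms, video_delay_ms) to insert."""
    if audio_path_ms > video_path_ms:
        # Audio path is longer: hold back the video
        return 0.0, audio_path_ms - video_path_ms
    # Video path is longer (or equal): hold back the audio
    return video_path_ms - audio_path_ms, 0.0
```

With measured (or configured) path latencies for the DUs 210 and display 230, a Director could compute the delay to insert and keep lip sync within tolerance.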

CONTROL DATA

Control data is forwarded between Directors 150, 250 at VENUE A and VENUE B as inter-venue control data. One type of inter-venue control data is indicative of a status or action at a DU connected to the CCU 140. For example, Director A can receive control data from a CCU which indicates that a microphone has been turned on at a DU connected to that CCU. Director A can then send inter-venue control data to Director B indicating that a microphone has been turned on at the DU. The inter-venue control data can comprise an identifier of the DU. Another type of inter-venue control data is control data to control operation of a DU. For example, Director A can send control data to Director B to instruct a microphone to be turned on at a DU connected to a CCU at VENUE B, e.g. turn on microphone at DU(B1). Director B can then send control data to the CCU at VENUE B to turn on the microphone at DU(B1).
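An inter-venue control message of the kind described above, carrying a DU identifier and microphone state, could be serialised in many ways. The following sketch uses JSON with invented field names purely for illustration; the disclosure does not specify a wire format.

```python
# Illustrative sketch of an inter-venue control message carrying a DU
# identifier and microphone state. The JSON field names are invented;
# no wire format is specified by the disclosure.
import json


def make_mic_status_message(du_id: str, mic_on: bool) -> str:
    return json.dumps({
        "type": "mic_status",
        "du": du_id,
        "state": "ON" if mic_on else "OFF",
    })


def parse_message(raw: str) -> dict:
    return json.loads(raw)
```

Director A might emit such a message when CCU 140 confirms a microphone has turned on; Director B would parse it and act, e.g. by selecting the corresponding video stream.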

Some types of control data associated with DU operation include: control data indicating a participant requesting to speak; turning a DU microphone on/off; confirmation that a microphone has been turned on/off; an indication that a DU is connected to a CCU. Forwarding this control data informs a venue when a microphone at a DU is on or off. This allows a venue to take actions such as displaying video of the speaker.

By way of background, in conventional audio conferencing equipment there is an operating mode in which:

(i) a user presses a button on the DU to request-to-speak, and the DU sends a request-to-speak indication to the CCU;

(ii) the DU receives from the CCU an instruction to turn on the microphone at the DU;

(iii) the DU confirms that the microphone is now turned on.

There are other ways of operating a microphone at a DU. These include: (i) microphone is always on; (ii) push to talk; (iii) push to activate; (iv) voice activation.

There are two ways in which a CCU can operate:

• standalone mode;

• request-to-speak (RTS) mode. This may also be called PC control mode.

In standalone mode, a CCU at a venue is in control of activating a DU (i.e. turning a microphone on at a DU) and deactivating a DU (i.e. turning a microphone off at a DU). The CCU receives request-to-speak indications from DUs and sends “turn microphone on” and “turn microphone off” instructions to DUs. A CCU outputs, to a Director, control signals indicating when DUs are activated and deactivated. This allows a Director to monitor when DUs are activated/deactivated, and share this information with other venues.

In request-to-speak (RTS) mode, the Director connected to the CCU is in control of activating a DU (i.e. turning a microphone on at a DU) and deactivating a DU (i.e. turning a microphone off at a DU). When a DU requests to speak, the CCU sends a request to the Director. The Director informs the CCU when a DU should be activated/deactivated and the CCU sends control signals to the DU.
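The difference between the two modes, which entity decides when a DU is activated, can be sketched as follows. This is an illustrative sketch; the function, the returned command strings and the queueing behaviour are invented for illustration.

```python
# Illustrative sketch of the two CCU operating modes. In standalone
# mode the CCU decides itself when to activate a DU; in RTS (PC
# control) mode the decision is delegated to the Director. The command
# strings and queueing behaviour are invented.
def handle_rts(mode: str, du_id: str, director_approves) -> str:
    """Handle a request-to-speak from du_id under the given CCU mode."""
    if mode == "standalone":
        # CCU is in control: it activates the DU itself
        return f"TURN MIC ON {du_id}"
    if mode == "rts":
        # Director is in control: CCU forwards the request and waits
        if director_approves(du_id):
            return f"TURN MIC ON {du_id}"
        return f"QUEUED {du_id}"
    raise ValueError(f"unknown mode: {mode}")
```

In a real system the standalone-mode CCU may also apply its own policy (e.g. a maximum number of open microphones) before activating a DU.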

These modes are now described in more detail.

STANDALONE MODE

FIGURE 10 shows an example of flow paths for control signals/data associated with activating/deactivating DUs when a CCU operates in standalone mode. The system is the same as the one shown in FIGURES 7 to 9. The DUs support request-to-speak operation, i.e. a DU sends a request-to-speak (RTS) request to a CCU and receives a TURN MIC ON indication when the DU can be activated. Running through FIGURE 10:

• DU(A1) sends a request to speak (RTS) indication to the CCU.

• at some later time, the CCU sends a TURN MIC ON indication to DU(A1). DU(A1) turns its microphone on.

• DU(A1) sends a MIC ON confirmation to the CCU. The CCU sends the MIC ON confirmation to Director A.

• Director A can perform one or more actions when the MIC ON indication is received. For example, Director A can update a local table of DUs to indicate the status of DU(A1) as active. Director A can use the indication of active DU(s) to control cameras 120. When a new DU becomes active, Director A can instruct one of the cameras 120 to point at the newly activated DU. Director A can also select video from the camera showing the newly activated DU, and cause that video to be output to a display at VENUE A.

• Director A sends an indication of DU(A1) being activated to Director B at VENUE B. For example, Director A can send control data which indicates: an identifier of DU(A1); microphone state (ON, OFF). Director B decodes the control data.

• Director B receives the control data from Director A. Director B can perform one or more actions when it receives control data indicating that DU(A1) is active. For example, it can update a local table of DUs to indicate the status of DU(A1) as active. Optionally, Director B can use the indication that DU(A1) is active to select video data received from VENUE A (showing the new speaker) and cause that video to be output to a display at VENUE B. This is useful where Director B receives multiple video streams (e.g. from every camera at VENUE A). The control data received at Director B can indicate an identifier of a video stream to select. Optionally, Director B can use the indication that DU(A1) is active to select data to be overlaid on a video signal sent to a display at VENUE B, such as a name and/or job title of the participant.
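
The standalone-mode sequence above can be sketched in code. This is an illustrative simulation only: the class names, method names and the literal strings ("TURN MIC ON", the control data dictionary) are assumptions standing in for equipment-specific control data formats.

```python
# Sketch of standalone mode: the CCU grants requests itself and the
# Director only monitors activations (names are illustrative only).

class CCU:
    """In standalone mode the CCU decides when to activate a DU."""
    def __init__(self, director):
        self.director = director
        self.active = set()

    def request_to_speak(self, du_id):
        # The CCU grants the request without involving the Director.
        self.active.add(du_id)
        return "TURN MIC ON"

    def mic_on_confirmed(self, du_id):
        # Forward the DU's MIC ON confirmation to the local Director.
        self.director.on_mic_on(du_id)

class Director:
    """Monitors activations and shares them with other venues."""
    def __init__(self):
        self.status = {}   # du_id -> "ON"/"OFF"
        self.sent = []     # inter-venue control data destined for Director B

    def on_mic_on(self, du_id):
        self.status[du_id] = "ON"
        # Control data for the remote Director: DU identifier + mic state.
        self.sent.append({"du": du_id, "mic": "ON"})

director_a = Director()
ccu = CCU(director_a)

# DU(A1) asks to speak; the CCU replies and DU(A1) confirms MIC ON.
reply = ccu.request_to_speak("A1")
if reply == "TURN MIC ON":
    ccu.mic_on_confirmed("A1")

print(director_a.status["A1"])
print(director_a.sent[0])
```

The key point of the sketch is the direction of control: the CCU never waits for the Director before activating a DU; the Director only records and forwards the result.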

In FIGURE 10 Director A is only shown receiving control data from the CCU. In some conferencing systems, it is possible to send control data to a CCU operating in standalone mode. For example, it may be possible to send control data to a CCU to perform one or more of: change volume level; change a mode of a CCU (between standalone mode and RTS mode). Accordingly, the Director may still send some control data to a CCU even when the CCU is operating in standalone mode.

CCU REQUEST-TO-SPEAK MODE (CHAIRPERSON AT VENUE A)

FIGURE 11 shows an example of flow paths for control signals associated with activating/deactivating DUs when a CCU operates in request-to-speak mode and the conference Chairperson is at VENUE A. The DUs support request-to-speak operation, i.e. a DU sends a request-to-speak (RTS) request to a CCU and receives a TURN MIC ON indication when the DU can be activated. The upper part of FIGURE 11 shows steps when a DU wishes to participate in a conference:

• DU(A1) sends a request to speak (RTS) indication to the CCU.

• CCU sends the RTS indication to Director A.

• Director A adds an identifier of DU(A1) to a list of waiting DUs. The list may be ordered according to one or more criteria, such as time of receipt, priority etc.

• At some later time, Director A determines that DU(A1) can be activated.

• Director A sends a TURN MIC ON indication to CCU.

• CCU sends the TURN MIC ON indication to DU(A1). DU(A1) turns its microphone on.

• DU(A1) sends a MIC ON confirmation to the CCU. The CCU sends the MIC ON confirmation to Director A.

• Director A can perform one or more new speaker actions. For example, Director A can update a local table of DUs to indicate the status of DU(A1) as active. Director A can use the indication of active DU(s) to control cameras 120. When a new DU becomes active, Director A can instruct one of the cameras 120 to point at the newly activated DU.

• Director A sends an indication of DU(A1) being activated to Director B at VENUE B. For example, Director A can send control data which indicates: an identifier of DU(A1); microphone state (ON, OFF, RTS). Director B decodes the control data.

• Director B can perform one or more actions when it receives control data indicating that DU(A1) is active. For example, it can update a local table of DUs to indicate the status of DU(A1) as active. Optionally, Director B can use the indication that DU(A1) is active to select video data received from VENUE A (showing the new speaker) and cause that video to be output to a display at VENUE B. Optionally, Director B can use the indication that DU(A1) is active to select data to be overlaid on a video signal sent to a display at VENUE B, such as a name and/or job title of the participant.

The lower part of FIGURE 11 shows steps when a DU is deactivated:

• Conference Chairperson decides DU(A1) should be deactivated.

• Director A sends a TURN MIC OFF indication to the CCU.

• CCU sends the TURN MIC OFF indication to DU(A1).

• DU(A1) turns its microphone off and sends a MIC OFF confirmation to the CCU.

• CCU sends the MIC OFF confirmation to Director A.

• Director A sends an indication of DU(A1) being deactivated to Director B. For example, Director A can send data which indicates: an identifier of DU(A1); microphone state (OFF).

• Director A can perform one or more remove speaker actions. For example, Director A can update a local table of DUs, e.g. to indicate the status of DU(A1) as inactive. Director A can use the indication of an inactive DU to switch away from the video data of DU(A1).

• Director B can perform one or more actions when the MIC OFF indication is received. For example, it can update a local table of DUs to indicate the status of DU(A1) as inactive. Director B can switch away from video data of DU(A1).
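
The request-to-speak flow above, with the Director in control, can be sketched as follows. The class names, method names and message relaying are assumptions; real control data between a Director, CCU and DU is equipment-specific.

```python
# Sketch of RTS mode with the Chairperson at VENUE A: the Director
# queues requests and decides when a DU is activated/deactivated.
from collections import deque

class DirectorA:
    def __init__(self, ccu):
        self.ccu = ccu
        self.queue = deque()   # ordered list of waiting DUs
        self.status = {}       # du_id -> "ON"/"OFF"

    def on_rts(self, du_id):
        self.queue.append(du_id)        # RTS forwarded by the CCU

    def activate_next(self):
        du_id = self.queue.popleft()
        self.ccu.turn_mic_on(du_id)     # TURN MIC ON -> CCU -> DU
        return du_id

    def on_mic_on(self, du_id):
        self.status[du_id] = "ON"       # MIC ON confirmation received

    def deactivate(self, du_id):
        self.ccu.turn_mic_off(du_id)    # Chairperson's decision

    def on_mic_off(self, du_id):
        self.status[du_id] = "OFF"      # MIC OFF confirmation received

class CCUStub:
    """Relays Director instructions to DUs and confirmations back."""
    def __init__(self):
        self.director = None
    def rts(self, du_id):
        self.director.on_rts(du_id)     # forward RTS to the Director
    def turn_mic_on(self, du_id):
        self.director.on_mic_on(du_id)  # DU turns mic on and confirms
    def turn_mic_off(self, du_id):
        self.director.on_mic_off(du_id) # DU turns mic off and confirms

ccu = CCUStub()
director = DirectorA(ccu)
ccu.director = director

ccu.rts("A1")                   # DU(A1) requests to speak
active = director.activate_next()
print(active, director.status["A1"])
director.deactivate("A1")       # Chairperson decides to deactivate
print(director.status["A1"])
```

In contrast with standalone mode, the CCU here never grants a request itself; it only relays messages between the DU and the Director.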

CCU REQUEST-TO-SPEAK MODE (CHAIRPERSON AT VENUE B)

FIGURE 12 shows an example of flow paths for control signals associated with activating/deactivating DUs when a CCU operates in request-to-speak mode and the conference Chairperson is at VENUE B. Operation is similar to FIGURE 11, but the conference Chairperson at Director B is in control of activating and deactivating DUs. The upper part of FIGURE 12 shows steps when a DU wishes to participate in a conference:

• DU(A1) sends a request to speak (RTS) indication to the CCU.

• CCU sends the RTS indication to Director A.

• Director A sends the RTS indication to Director B. For example, Director A can send data which indicates: an identifier of DU(A1); microphone state (RTS).

• Director B adds an identifier of DU(A1) to a list of waiting DUs. The list may be ordered according to one or more criteria, such as time of receipt, priority etc.

• at some later time, Director B determines that DU(A1) can be activated.

• Director B sends a TURN MIC ON indication to Director A.

• Director A sends a TURN MIC ON indication to CCU.

• CCU sends the TURN MIC ON indication to DU(A1). DU(A1) turns its microphone on.

• DU(A1) sends a MIC ON confirmation to the CCU. The CCU sends the MIC ON confirmation to Director A.

• Director A can perform one or more actions when the MIC ON confirmation is received. For example, it can update a local table of DUs to indicate the status of DU(A1) as active. Director A can use the MIC ON control data to select video data from CAM(A) showing the new speaker. Stored data at Director A indicates the camera associated with DU(A1).

• Director A sends the MIC ON confirmation to Director B. For example, Director A can send data which indicates: an identifier of DU(A1); microphone state (ON).

• Director B can perform one or more new speaker actions. For example, Director B can update a local table of DUs, e.g. to indicate the status of DU(A1) as active. Director B can use the indication of active DU(s) to control cameras 120 at VENUE A.

The lower part of FIGURE 12 shows steps when a DU is deactivated:

• Conference Chairperson decides DU(A1) should be deactivated.

• Director B sends a TURN MIC OFF indication to Director A.

• Director A sends the TURN MIC OFF indication to the CCU.

• CCU sends the TURN MIC OFF indication to DU(A1).

• DU(A1) turns its microphone off and sends a MIC OFF confirmation to the CCU.

• CCU sends the MIC OFF confirmation to Director A.

• Director A sends data indicative of DU(A1) being deactivated to Director B. For example, Director A can send data which indicates: an identifier of DU(A1); microphone state (OFF).

• Director A and Director B can each perform one or more remove speaker actions. For example, Director A can update a local table of DUs to indicate the status of DU(A1) as inactive. Director A can switch away from video data of DU(A1). Director B can update a local table of DUs to indicate the status of DU(A1) as inactive. Director B can switch away from video data of DU(A1).

In this example, the inter-venue control data that Director A receives from Director B is to control a DU at VENUE A. This allows the Chairperson to be located at VENUE B while retaining control over DUs at VENUE A, in the same manner as if the Chairperson were located at VENUE A.

FIGURE 13 shows an example of a data structure 350 which can be used by one of the Directors to store data about participants/seats in a meeting. In this simple example there are participants at VENUE A, VENUE B and a remote attendee. Each Director 150, 250 has an overview of all participants/seats in a meeting and can allow all of the participants to participate in a unified meeting environment. The data structure 350 contains an identifier 352 of a seat number in the overall meeting across all participants. The data structure 350 contains an identifier 354 of where the participant is physically located. Identifier 354 maps the seat number to a physical location of the seat. In this example, seat 1 corresponds to DU1 of a hardware CCU at VENUE A, seat 3 corresponds to DU1 of a hardware CCU at VENUE B, and seat 5 corresponds to a remote attendee, hosted by the RAS. The mapping 354 indicates which interface the Director should use to communicate with the participant (seat). The data structure 350 contains an identifier 356 of a status of the seat. In this example, the current statuses are: MIC ON; MIC OFF; RTS. Other statuses are possible. The data structure 350 contains an identifier 358 of a video source which corresponds to the seat. The video source can be a camera, an image file or some other source. The data structure 350 can also store video presets 360. A video preset can be associated with a set of instructions for controlling a camera, such as PTZ values. This causes a camera to frame a shot for the seat. The data structure 350 can also store one or more other types of data for each seat. A Director 150, 250 can update the data structure 350 when: control data is received from a local CCU; inter-venue control data is received from another Director about the status of DUs at that venue. Directors may periodically exchange status data with each other to ensure the locally held data in data structure 350 is up-to-date.
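
A minimal sketch of how a Director might hold data structure 350 in memory is given below. The field names and example values are assumptions based on the fields 352-360 described above, not a definitive schema.

```python
# Illustrative in-memory form of data structure 350: one record per
# seat, keyed by the meeting-wide seat number (identifier 352).
seats = {
    1: {"location": "VENUE A / DU1",          # identifier 354
        "status": "MIC ON",                   # identifier 356
        "video_source": "CAM(A)",             # identifier 358
        "preset": {"pan": 10, "tilt": -5, "zoom": 2}},  # presets 360
    3: {"location": "VENUE B / DU1",
        "status": "RTS",
        "video_source": "CAM(B)",
        "preset": None},
    5: {"location": "RAS remote attendee",
        "status": "MIC OFF",
        "video_source": "RA video stream",
        "preset": None},
}

def update_status(seat_no, new_status):
    """Apply control data (local CCU or inter-venue) to the table."""
    seats[seat_no]["status"] = new_status

# e.g. Director B reports that its DU1 (seat 3) has been activated.
update_status(3, "MIC ON")
print(seats[3]["status"])
```

The same update function serves both sources of control data, which is what lets each Director keep a unified view of local, remote-venue and remote-attendee seats.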

FIGURE 14 shows an example of a data structure 400 which can be used by one of the Directors to maintain a queue of DUs that have sent a RTS indication. In this example, the data structure 400 holds, for each requesting DU: an identifier 402 which identifies the DU making the request. The data structure 400 can store the identifiers 402 in an order in which the requests were received from DUs. This order implicitly represents time of the requests. Optionally, the data structure 400 may include one or more additional fields for each requesting DU. One possible field is a field 404 which indicates a time of the request. The time can be a timestamp recorded by a Director local to the DU. Another possible field is a field indicating a priority 406 of the request. DUs may have different priorities. For example, a council leader may have priority over councillors. One or more DUs may be designated as a Very Important Person (VIP). The data structure may store one or more other criteria which are used to determine an order for selecting a DU. In the example of FIGURE 14, the RTS queue includes a participant at VENUE A (seat 1), a participant at VENUE B (seat 3) and a remote attendee (seat 4). Participants to a meeting are handled in a unified manner even though they are located at different places (VENUE A, VENUE B and remotely).

A Director may automatically update the data structure 400 when a new request-to-speak is received from a DU. An identifier of the DU sending the request is added to the end of the ordered list. Requests can be received from DUs at multiple locations (e.g. VENUE A, VENUE B, RAS). When a DU has completed their turn to speak, the DU is removed from the data structure 400.

A Director may select the next DU to activate based on the queue of waiting DUs, or the Director may display information to allow a Chairperson or other user to select a DU to activate. The Director may automatically switch off the microphone of a DU (e.g. at the end of an item, or upon expiry of a time limit) or the Director may wait for a participant to press a button to turn their microphone off.
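
One way the queue 400 and its optional priority field could work is sketched below. The record fields mirror identifiers 402, 404 and 406; the numeric priority values are assumptions for illustration.

```python
# Sketch of the RTS queue (data structure 400): arrival order is kept
# implicitly by list position, with an optional priority override.
import time

queue = []  # list of request records, in order of arrival

def add_request(du_id, priority=0):
    # field 402 (DU identifier), 404 (timestamp), 406 (priority)
    queue.append({"du": du_id, "time": time.time(), "priority": priority})

def next_du():
    """Pick highest priority first; ties broken by arrival order."""
    best = max(queue, key=lambda r: (r["priority"], -r["time"]))
    queue.remove(best)
    return best["du"]

add_request("seat 1")                # participant at VENUE A
add_request("seat 3")                # participant at VENUE B
add_request("seat 4", priority=1)    # remote attendee designated VIP
print(next_du())    # the VIP jumps the queue
print(next_du())    # then arrival order resumes
```

With all priorities equal, the selection degenerates to plain first-come-first-served, matching the default ordering described above.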

A CCU may be configured to permit a Chairperson of a meeting to speak when they turn on their microphone, without a need to join the queue 400. For a CCU which does not support this functionality, a Director 150, 250 may be configured to allow a selected DU, associated with a Chairperson of the meeting, to override all other DUs. Thus, if a RTS signal is received from the DU having an ID code that is associated with the Chairperson of the meeting, the Director is configured to instruct each DU to switch off the microphone associated with that DU and to instruct the DU associated with the Chairperson to switch on the microphone associated with that DU. When the Chairperson has finished speaking and their RTS button is no longer activated, the Director instructs the DU associated with the Chairperson to switch off the microphone and receives a confirmation from that DU that the microphone has been switched off. The Director may then instruct the DUs associated with microphones that were previously active to switch on their respective microphones.

Similarly, if a Director receives a signal from a DU that is associated with a VIP seat, the Director may turn off all other microphones and activate only the microphone of the DU associated with the VIP seat. The Director may allow a DU associated with a Chairperson of the meeting to override a DU associated with a VIP seat.

The Director may implement the VIP override functionality even if the CCU is unable to implement such functionality. Thus, the Director is able to provide functionality to a CCU that it would not otherwise be able to provide.
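
The Chairperson/VIP override described above could be implemented Director-side roughly as follows. The DU identifiers and the save/restore mechanism are assumptions for illustration.

```python
# Sketch of Director-side Chairperson/VIP override. The Director, not
# the CCU, suspends and restores the active microphones.
CHAIR = "DU-CHAIR"
VIP = "DU-VIP"

class OverrideDirector:
    def __init__(self):
        self.mics_on = set()
        self.suspended = set()  # mics to restore after the Chairperson

    def on_rts(self, du_id):
        if du_id == CHAIR:
            # Chairperson overrides everyone, including a VIP.
            self.suspended = set(self.mics_on)
            self.mics_on = {CHAIR}
        elif du_id == VIP and CHAIR not in self.mics_on:
            # VIP overrides ordinary DUs but not the Chairperson.
            self.mics_on = {VIP}
        else:
            self.mics_on.add(du_id)

    def on_rts_released(self, du_id):
        if du_id == CHAIR:
            # Restore the previously active microphones.
            self.mics_on = set(self.suspended)
            self.suspended = set()

d = OverrideDirector()
d.on_rts("DU-A1")
d.on_rts(CHAIR)             # Chair speaks: all other mics off
print(sorted(d.mics_on))
d.on_rts_released(CHAIR)    # Chair finishes: previous mics restored
print(sorted(d.mics_on))
```

Because the logic lives entirely in the Director, it works the same way whether or not the underlying CCU supports an override mode.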

ACTIONS BY CONFERENCE CHAIRPERSON

When two or more venues are connected in a bridging mode, one Director is primary and the other Directors are secondary. The primary director is operated by the conference Chairperson or another operator via a GUI. The GUI supports the following functions:

• open the meeting, allowing users to log on to the meeting.

• start the meeting; pause the meeting; restart the meeting (e.g. where a meeting has exceeded a meeting duration).

• change a meeting from a private session (e.g. not webcast and optionally not recorded) to a public session (e.g. webcast to a public audience and recorded) and vice versa, e.g. when an agenda item is private.

• RTS mode: set queue length for RTS mode; manually advance to next waiting speaker (in manual mode).

• delete a person from the queue if they do not want to speak, and clear the queue e.g. if an allocated time runs out.

• trigger a vote.

• mute a speaker.

• elevate or demote a person within an RTS list.

• set a time limit for each agenda item, or sub part of an item. Once the time has elapsed, the director may automatically move on to the next sub part or agenda item.

• set an advance warning period. For example, a participant may be warned 30 seconds before an allotted time expires, such as by causing a light on the DU to illuminate or flash.

DIRECTOR TO CCU COMMUNICATION

Each Director 150, 250 is configured to use a library of control data to format control data sent to conference equipment and to interpret control data received from conference equipment. For example, Director 150 can use the library to interpret the particular form of MIC ON control data sent from CCU 140. Different manufacturers, and different equipment by a particular manufacturer, may encode a MIC ON command in different ways. The library allows the Director 150 to interpret the control data. Director 150 can use the library to format control data sent to cameras 120. Each of the CCU interface 151, CCU audio interface 152 and video source interface 153 can use a library of control data. Cameras 120 from different manufacturers, or different cameras from the same manufacturer, may use different control data to perform commands such as pan, tilt and zoom. The library allows the Director 150 to output camera control data in a format used by the cameras 120.
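
A minimal sketch of such a control-data library is shown below. The manufacturer names and byte sequences are invented for illustration; real CCU and camera protocols differ per vendor.

```python
# Sketch of a control-data library: one generic command vocabulary,
# mapped to per-manufacturer wire formats (byte sequences invented).
LIBRARY = {
    "manufacturer_x": {"MIC_ON": b"\x02MON", "MIC_OFF": b"\x02MOF"},
    "manufacturer_y": {"MIC_ON": b"<mic state='on'/>",
                       "MIC_OFF": b"<mic state='off'/>"},
}

def encode(equipment, command):
    """Format a generic command for a specific CCU/camera."""
    return LIBRARY[equipment][command]

def decode(equipment, payload):
    """Interpret equipment-specific control data as a generic command."""
    for command, wire in LIBRARY[equipment].items():
        if wire == payload:
            return command
    raise ValueError("unknown control data")

print(decode("manufacturer_x", b"\x02MON"))
print(encode("manufacturer_y", "MIC_OFF"))
```

The Director's internal logic then deals only in generic commands (MIC_ON, MIC_OFF, pan/tilt/zoom), and the library confines vendor differences to the interfaces 151-153.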

DIRECTOR-TO-DIRECTOR COMMUNICATION

Communication between Directors 150, 250 is in a format which is common to the Directors. For example, audio data is sent and received in one format, independently of what format is used on the CCU interface in VENUE A or the CCU interface in VENUE B. Similarly, video data is sent and received in one format, independently of what format is used by VENUE A and VENUE B.

THREE OR MORE DIRECTORS

FIGURE 15 shows a meeting with three venues: VENUE A, VENUE B, VENUE C. Each venue has a Director (DIRECTOR A, DIRECTOR B, DIRECTOR C). The Directors operate as described above. Communication between Directors can be configured with:

(i) a point-to-point communication link between each pair of Directors, i.e. A-B, A-C, B-C; or

(ii) point-to-point communication links between a limited number of Directors, with one of the Directors acting as a relay. This is shown in FIGURE 15. Director A is connected to Director B and Director C. There is no direct connection between DIRECTOR B and DIRECTOR C. Instead, Director B communicates with Director C via Director A, i.e. B-A and A-C. The reverse path is the same, i.e. C-A and A-B.

FIGURE 15 shows how a Director combines audio for the different Directors. Each participant at a DU should hear every other participant at VENUES A, B and C. This requires each DU to receive an audio mix A+B+C. Director A receives audio data (AUDIO: B) from Director B and (AUDIO: C) from Director C. Audio A is already available to the CCU at VENUE A. The CCU at VENUE A forms an audio mix A+B+C by combining the audio received from its own local DUs with the audio mix B+C received from Director A. Director B receives an audio mix A+C from Director A. The CCU at VENUE B forms an audio mix A+B+C by combining the audio received from its own local DUs with the audio mix A+C received from Director A. Similarly, Director C receives an audio mix A+B from Director A. The CCU at VENUE C forms an audio mix A+B+C by combining the audio received from its own local DUs with the audio mix A+B received from Director A. This scheme avoids looping audio, which can help to reduce echo and improve clarity. It also minimises the amount of data sent between Directors. It can also be advantageous where a particular pair of venues lack a direct connection, such as VENUE B and VENUE C.
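
This is a mix-minus arrangement: each venue receives the sum of every other venue's audio and never its own. A numeric sketch, using short lists of sample values as stand-in audio streams:

```python
# Sketch of the relay Director's mix-minus scheme: each venue is sent
# the total mix minus its own contribution (sample values invented).
def mixes_for(venues):
    """Return, per venue, the mix of every *other* venue's audio."""
    total = [sum(s) for s in zip(*venues.values())]
    return {name: [t - s for t, s in zip(total, samples)]
            for name, samples in venues.items()}

audio = {"A": [1, 1], "B": [2, 2], "C": [3, 3]}
out = mixes_for(audio)
print(out["A"])   # B+C, sent to VENUE A
print(out["B"])   # A+C, sent to VENUE B
print(out["C"])   # A+B, sent to VENUE C
```

Each CCU then adds its own local DU audio to reach the full A+B+C mix, so no venue's audio is ever looped back to itself.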

CONFIGURING A CONFERENCE PAIRING

As described above, there is a Director associated with each venue (VENUE A, VENUE B). In some examples, a conference can be established between two venues, with participants at VENUE A and VENUE B.

A software application (called a “Conductor”) associated with each venue is used to generate meeting information in respect of that venue. The meeting information is stored and accessible to the Director associated with the venue. The Conductor software application may provide a graphical user interface (GUI) to allow configuration of the meeting.

FIGURE 16 shows an example of configuring a conference between two venues (VENUE A, VENUE B). Initially, at blocks 601, 602, a user configures the conference equipment/participants at each venue. This can be performed separately by a user at each venue. Blocks 603-605 show some of the functions to configure a conference at VENUE A. At block 603 a user configures Director A with the seats/seating positions to be used in VENUE A of the conference. At block 604 a user configures Director A with the audio conferencing equipment to be used in VENUE A of the conference. The main audio equipment is DUs (microphones) used at seats. At block 605 a user configures Director A with the video conferencing equipment to be used in VENUE A of the conference. The video equipment includes one or more cameras and one or more displays or digital signage. A user configures the conference at VENUE B in the same manner as for the user at VENUE A.

At block 606, a user configures a Chairperson of the conference. This indicates the seat where the chairperson of the conference will be sitting. The chairperson can have particular functions which are described elsewhere.

At block 608, an association or “pairing” is formed between Director A at VENUE A and Director B at VENUE B. The pairing uses a pairing communication between Director A and Director B. In this example, Director A at VENUE A sends a pairing request 610 to Director B at VENUE B. Director B sends a pairing reply 611 to Director A. If the pairing reply accepts the pairing request, then a pairing is formed between Director A and Director B. A Chairperson of a meeting can be located at either venue.

Director A and Director B exchange configuration data. Director A sends configuration data 612 to Director B. Director B sends configuration data 613 to Director A. The configuration data 612, 613 can include: number of participants; names of participants. The Directors can send abstracted configuration data. As an example, consider VENUE A has 5 DUs from manufacturer X and a CCU with a digital interface to the Director 150. While Director A needs to know this information in order to locally communicate with the audio equipment, Director B at VENUE B does not need to know all of this information. Director A can determine an abstracted set of configuration data. This conveys the important information that Director B needs to know. The abstracted configuration data is reduced in size and carries less detail. In this example, the abstracted configuration data can define that VENUE A has: five seats/DUs and identifiers of the seats/DUs: 1, 2...5; metadata for each seat/DU, such as name, title etc.; an identifier of a camera/video source associated with each seat/DU.

After the pairing process, Director A has abstracted configuration data for VENUE B and Director B has abstracted configuration data for VENUE A. Each Director already has more detailed configuration data about the local venue, which is sufficient to communicate locally with the audio and video equipment. Each Director can merge the abstracted configuration data with its own local configuration information to form a combined set of configuration data that includes the participant information in respect of the two venues.
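
The abstraction step described above can be sketched as a simple projection of the local configuration. The field names, the serial-port detail and the participant names are assumptions based on the worked example in the text.

```python
# Sketch of deriving abstracted configuration data for a paired
# Director: equipment-specific detail stays local, only seat-level
# information is shared (field names are illustrative).
local_config = {
    "ccu": {"manufacturer": "X", "interface": "digital",
            "port": "/dev/ttyS0"},   # local detail, not shared
    "dus": [
        {"id": i, "manufacturer": "X", "protocol": "proprietary-v2",
         "name": f"Councillor {i}", "camera": "CAM-1"}
        for i in range(1, 6)         # five DUs from manufacturer X
    ],
}

def abstract(config):
    """Keep only what the remote Director needs: seat id, metadata,
    and the associated camera/video source."""
    return [{"seat": du["id"], "name": du["name"], "camera": du["camera"]}
            for du in config["dus"]]

shared = abstract(local_config)
print(len(shared))
print(shared[0])
```

Merging the received abstracted data with the local detailed data then yields the combined participant view each Director holds after pairing.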

It is possible to form a conference between more than two venues. Where a conference is to be established between more than two venues, pairing signals may be sent between each venue. For example, where a conference is required between VENUE A, VENUE B and VENUE C, there is a pairing (as described above) between: VENUE A and B; VENUE A and C; and VENUE B and C. This allows each Director to acquire information about the overall conference. A group of two or more Directors associated with respective venues that are configured to cooperate to enable a single meeting to be held may be referred to as a cluster of Directors, or a cluster of venues.

A Director can provide an interactive configuration application to allow a user to configure a conference/meeting. The configuration application uses a GUI. FIGURES 17-19 show an example of displays of the application to configure a venue to participate in a conference. The application provides an interactive visual representation of a venue. The application may initially display a blank box which the user can fill, or the application may display a template representing a plan of a meeting environment, such as a conference room or a council chamber which is regularly used for meetings. The configuration application displays a pane 622 with symbols representing “assets” in the form of audio conferencing equipment 623 and video equipment (cameras 624, displays 625). Each of the assets has been previously registered with the Director at a time of installing the equipment. For example, the Director knows that a DU with an identifier code XYZ is a DU from manufacturer X, and the capabilities of the DU (e.g. audio format (analogue/digital), coding/decoding format(s), bit rate(s), control data protocol, control data format). The GUI has a pane 626 with symbols representing seats (seating positions). The configuration application displays a pane 627 of participants. The panes 622, 626, 627 could be implemented as drop-down menus or in some other way.

The configuration application supports drag-and-drop functionality to allow seats to be dragged from pane 626 to required positions within the meeting environment template. In FIGURE 17, four seats have already been added to the plan. The Director is configured to assign to each seat a unique identifier in the form of a code. Audio equipment is associated with seating positions. In FIGURE 17, three of the seats already have DUs allocated to them. A DU (microphone) may be dragged from the DU store portion 623 of the GUI and dropped over a seat 621 that is to be associated with a DU. In some venues, audio conferencing equipment is installed in a fixed manner. For example, a DU may be installed (in a fixed manner) at a desk at a particular seating position. Therefore, a DU/microphone is already associated with a seating position.

Participants of a conference can be added to the plan in a similar manner. A participant name may be dragged from the participant store portion 627 of the application and dropped over a seat 621 that is to be associated with the participant. A participant can be associated with participant information such as: personal name; title (e.g. council leader, councillor), name of the local ward or district they represent.

The configuration application allows a camera to be associated with a seat. A venue may have a single camera, or multiple cameras. In each case, a camera may be shared between a plurality of seats. This means that, at different points in time during a conference, the camera can be pointed at different participants. The application allows a user to configure, for each seat, an identity of a camera for that seat and, where applicable, camera settings information in respect of that seat, such as camera pan, tilt and zoom (PTZ) information. FIGURE 18 shows an example of a display screen 640 of the application for configuring camera settings. A pane 641 displays information about the seat, such as: seat number; participant; microphone identifier and camera identifier. A pane 642 displays a live view of the selected camera. A tool bar 644 allows the user to vary the view of the camera to frame the user at the seat. The tool bar 644 may provide control of: pan (left/right), tilt (up/down) and zoom (in/out). A save button 645 allows a user to save the current camera settings. These camera settings are stored by the Director and can be recalled, at a later time, to cause the camera to provide the same view.
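
The save/recall behaviour of the camera settings screen can be sketched as a small preset store. The per-seat storage and the PTZ value ranges are assumptions; real camera control goes through the Director's control-data library.

```python
# Sketch of per-seat camera preset storage: the save button 645 stores
# the current PTZ values, which are recalled later to reframe the shot.
presets = {}  # seat number -> stored PTZ values

def save_preset(seat, pan, tilt, zoom):
    """Store the camera settings currently framing the seat."""
    presets[seat] = {"pan": pan, "tilt": tilt, "zoom": zoom}

def recall_preset(seat):
    """Return the stored PTZ values to send back to the camera."""
    return presets[seat]

save_preset(1, pan=-12.5, tilt=3.0, zoom=2.2)  # user presses save 645
print(recall_preset(1))  # later: restore the same view of seat 1
```

Recalling a preset when a seat's DU becomes active is what lets a shared camera frame each speaker automatically.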

Another possible configuration setting is a lamp illumination colour of a request to speak (RTS) button at a DU. At least one lamp is also typically associated with each DU. For example, the lamp may be illuminated in a red colour or a green colour under the control of the director application associated with the venue.

FIGURE 19 shows an example of a configuration application display 650 at one of the Directors, after the pairing of VENUE A and VENUE B. The display 650 schematically shows: a region 651 showing VENUE A with seats, audio equipment and video equipment; and a region showing VENUE B with audio equipment and video equipment. Participant information is stored for each seat on the plan. As described above, after a pairing process both Directors have access to this combined configuration data for the meeting. The configuration application can be provided as a web application which can be accessed directly from a GUI of the computer hosting the Director, or from any computing device (e.g. a desktop computer, mobile computing device) which communicates with the web application.

DOCUMENT MANAGEMENT SYSTEM (DMS)

Referring again to FIGURE 3, Directors 150, 250 can each access an admin server 306. The admin server hosts a document management system (DMS) 307. The DMS can interface to third party DMS systems 310 and import documents to the DMS 307. The DMS provides document management services for meetings/conferences, such as documents which are discussed during meetings/conferences.

A meeting or conference typically has an agenda of items to be discussed. The DMS can store agenda data (e.g. in an agenda folder) comprising agenda meeting items. The Agenda application communicates with the Director. Each agenda meeting item may include information in respect of:

- the name of the person that is leading the agenda item;

- a list of documents in the DMS associated with that agenda item;

- an amount of time allotted to the agenda item;

- information regarding a vote in respect of that agenda item e.g. a statement defining the vote such as a motion.

The Directors 150, 250 may be configured to access the agenda data. The Directors 150, 250 can use the agenda data to control the meeting. For example, the conference Chairperson can use the agenda data to ensure the agenda topics are discussed and the meeting runs to time. The Directors 150, 250 can be configured to retrieve data from the agenda data and display, on the one or more video displays 130, 230, information indicating the current agenda item and an amount of time allotted to the item. The Directors 150, 250 may be configured to display a timer that indicates the amount of time elapsed since the agenda item was begun and/or an amount of time remaining for that agenda item. The information displayed is configurable by the local Director 150, 250 based on a preset meeting display style. The Directors 150, 250 at respective venues may employ their own preset meeting display style and access the information required to be displayed from the agenda folder. Thus, the Directors 150, 250 have access to the same information but choose which information is to be displayed, and the manner in which it is to be displayed (such as text font, colour and/or size).
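
An agenda item record and the remaining-time calculation for the displayed timer could look like the sketch below. The field names and example values are assumptions; the DMS schema is not specified in the text.

```python
# Sketch of agenda data: one record per agenda item, holding the
# information listed above (leader, documents, allotted time, vote).
agenda = [
    {"item": 1, "leader": "Council Leader",
     "documents": ["budget.pdf"],
     "allotted_s": 600,
     "vote": "Motion to approve the budget"},
    {"item": 2, "leader": "Councillor",
     "documents": [],
     "allotted_s": 300,
     "vote": None},
]

def remaining(item, elapsed_s):
    """Time left on an agenda item, floored at zero, for the
    on-display timer."""
    return max(0, item["allotted_s"] - elapsed_s)

print(remaining(agenda[0], 450))
print(remaining(agenda[0], 700))
```

Each Director can read the same records from the agenda folder and render the remaining time in its own preset display style.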

A user may use an agenda application to set up the agenda data. The agenda application can be a separate application to the Director. That is, it can be used by administrator personnel independently of the Director application. In some embodiments the Director application itself may implement the agenda application.

REMOTE ATTENDEES

FIGURE 20 shows a system which supports remote attendees to a meeting. Remote attendees are participants who are not physically present in VENUE A or VENUE B. A remote attendee may join the meeting by using a communication device in their home or office, or a mobile device while travelling. FIGURE 20 shows two ways of supporting remote attendee devices:

- remote attendee devices RA(A1-AN) connecting via a Remote Attendee System (RAS) and a wide area network (e.g. Internet);

- remote attendee telephone devices RA(T1-TN) connecting using a voice call over a cellular wireless network.

Each of the remote attendee devices RA(A1-AN) may be a device such as a personal computer (PC), tablet, smartphone or other computing device running an application which can individually communicate with the remote attendee system. Each of the remote attendee devices RA(A1-AN) is typically used by a user who is working at home, at an office outside of the first conference venue, or travelling. Each of the remote attendee devices RA(A1-AN) may be used by an individual remote attendee, or shared by a group of remote attendees who can gather around the device to share the microphone and loudspeaker/audio output of the device. The remote attendee device RA(A1-AN) is not intended to be an installed conference system, such as the equipment at the first conference venue with a CCU and multiple DUs.

Remote attendee devices RA(A1-AN), RA(B1-BN).

The Remote Attendee System (RAS) 304 is a network-based system which supports communication links with remote attendee devices RA(A1-AN), RA(B1-BN). RA(A1-AN) are associated with VENUE A. RA(B1-BN) are associated with VENUE B. The number of remote attendee devices can be any number between 1 and N. The remote attendee devices may be a personal computer (PC), tablet, smartphone or other computing device running an application 305 which can communicate with the RAS. The application 305 is capable of communicating audio data with the RAS via a data network, i.e. audio is carried in packetised form over a data network. A suitable protocol is Voice over Internet Protocol (VoIP). Other protocols can be used. Audio is captured by a microphone on the remote attendee device (or an external microphone connected to the remote attendee device) and audio is output by a loudspeaker on the remote attendee device (or an external loudspeaker, headset or earpiece connected to the remote attendee device). The communication path between the Director, RAS 304 and remote attendee device may also support video and control data. The application 305 can display video of the meeting. The application 305 can also display virtual control buttons which provide the same, or similar, functionality as is available to participants in the venues 100, 200. For example, the virtual control buttons can support request-to-speak, voting etc. This allows a remote attendee who has an Internet connection to participate in the conference in the same manner as a participant who is physically present in one of the venues. Remote attendee devices RA(A1-AN) can participate in a meeting with VENUE A, or they can participate in an inter-venue meeting, such as a meeting between VENUE A and VENUE B. Each Director 150, 250 communicates with the RAS 304.

In one method of operation, RA(A1-AN) are associated with VENUE A and communicate with Director A. Similarly, RA(B1-BN) are associated with VENUE B and communicate with Director B. When an inter-venue meeting takes place between VENUE A and VENUE B, Director A shares information about RA(A1-AN) as part of the configuration data of participants at VENUE A and Director B shares information about RA(B1-BN) as part of the configuration data of participants at VENUE B.

FIGURE 21A shows the system for supporting remote attendee devices in more detail. The Director 150 at VENUE A communicates with RAS 304. RAS 304 communicates with remote attendee devices RA(A1), RA(A2). RAS 304 comprises a Remote Attendee CCU 312 and a Media Server 313. The Remote Attendee CCU 312 manages connectivity and status using control data. The Media Server 313 supports the forwarding of media content, such as audio, video and data (e.g. documents associated with a meeting). The Remote Attendee CCU 312 can also be called a Virtual CCU.

The Director hosts a software module 311 which provides an interface to the RAS and the remote attendee devices RA(A1), RA(A2) connected to the RAS. The software module 311 can be a library stub. The software module 311 appears to the Director as an interface to another CCU and connected DUs, which allows unified functionality at the core of the Director. The RAS and remote attendee devices RA(A1-AN) can be viewed as a remote attendee CCU with a plurality of connected DUs. The CCU (RAS) 311 provides an interface between the Director and the RAS. It offers the same functionality to the Director as a hardware CCU 140, but calls upon the RAS 304 to provide status and audio/video functionality. The Director provides seamless management of local and remote attendees.

The RAS 304 runs as an Internet service which accepts connections from both the CCU (RAS) 311 and Remote Attendee Client applications 305. It maintains the activation state of each attendee, handles authorisation of attendees to a meeting, provides the audio and video streams for remote attendees, and ensures that all attendees are provided the same content, regardless of location.

The Remote Attendee Client application 305 runs on the remote attendee’s device, and connects to the RAS 304. The client application 305 presents audio and video data from the meeting, sends voice and video data into the meeting, and provides convenient access to the same documents and facilities that regular event attendees enjoy. An example user interface 314 of the application is shown. The user interface 314 comprises controls, such as a virtual request to speak (RTS) button 315. An indicator 316 indicates status of the microphone (on or off). A pane 317 displays video. Typically, this will be the same video as is currently being displayed in VENUE A. Metadata 318 associated with the speaker is displayed, such as speaker name. A pane 319 provides access to documents associated with the meeting, such as the agenda and any documents discussed on the agenda.

When an event is marked as having remote attendees, a Remote Attendee CCU 312 is prepared in the RAS 304. Remote attendees can then be added to the Remote Attendee CCU 312, allowing the proper control of who may and may not be present (remotely) in an event. Each meeting will have a unique Remote Attendee CCU identifier associated with it, and once the Client has successfully authorised itself to the RAS, a list of appropriate identifiers will be provided by the server.

The Director 150 connects to the Remote Attendee CCU 312 and integrates the remote attendees into the meeting, to form a single list. When the meeting is running, it is the job of the CCU (RAS) 311 to communicate with the RAS 304 to ensure that updates to the state of the meeting (e.g. change of active speaker, or the start of a vote) are made available to Remote Attendee Client applications 305, and any actions or notifications sent by a Remote Attendee Client application 305 are handled appropriately.
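The integration of local DUs and remote attendees into a single list might be sketched as follows; `Participant` and `merge_participants` are illustrative names, not part of the disclosed apparatus:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Participant:
    """A meeting participant, local (at a DU) or remote (via the RAS)."""
    ident: str
    source: str   # "CCU" for a local DU, "RAS" for a remote attendee

def merge_participants(local_dus, remote_attendees):
    """Combine local DUs and remote attendees into one unified list,
    as the Director does when it connects to the Remote Attendee CCU."""
    merged = [Participant(d, "CCU") for d in local_dus]
    merged += [Participant(r, "RAS") for r in remote_attendees]
    return merged

participants = merge_participants(["DU(A1)", "DU(A2)"], ["RA(A1)"])
```

Once merged, meeting-state updates (active speaker, votes) can be applied uniformly over the single list, regardless of each participant's location.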

The Remote Attendee Client application 305 will negotiate a connection to an existing remote attendee CCU 312 and, if successful, will appear as a connected DU on that device. The RAS 304 will send meeting audio and video to the Remote Attendee Client application 305, and will notify the Remote Attendee Client application 305 about any changes to the state of the meeting. The Director 150 and the RAS 304 can manage the status of remote attendees to a meeting, such as by an asynchronous communication channel. The Director 150 and the RAS 304 can query, respond and update the status of a Remote Attendee.

FIGURE 21B shows a system with two venues (VENUE A, VENUE B) and remote attendee devices RA(A1-AN), RA(B1-BN). The remote attendee devices can participate in an inter-venue meeting between VENUE A and VENUE B. The remote attendee devices RA(A1-AN), RA(B1-BN) communicate with the RAS 304. Each of the remote attendee devices is associated with one of the venues. Remote attendee devices RA(A1-AN) are associated with VENUE A and remote attendee devices RA(B1-BN) are associated with VENUE B. The RAS 304 has a Remote Attendee CCU (Virtual CCU) per venue. These are shown as virtual CCUs V-CCU_A 312A and V-CCU_B 312B. Director A 150 receives audio (and video) data for the remote attendee devices RA(A1-AN) from RAS 304. Similarly, Director B 250 receives audio (and video) data for the remote attendee devices RA(B1-BN) from RAS 304. The virtual CCUs 312A, 312B inform the Directors 150, 250 how to obtain the audio and video data from the RAS. The media server 313 handles the forwarding of audio and video data.

For inter-venue meetings, there are two possible ways of forwarding audio (and video) data. In a first option, audio and video data for the remote attendee devices RA(A1-AN) is sent from the RAS to Director A. Director A then forwards the audio and video data for the remote attendee devices RA(A1-AN) to Director B along with other VENUE A audio and video data. In the reverse direction, Director B sends audio and video data from VENUE B to Director A and then Director A forwards the audio and video data to the RAS. The RAS forwards the audio and video data to the remote attendee devices RA(A1-AN). Similarly, audio and video data for the remote attendee devices RA(B1-BN) is sent from the RAS to Director B. Director B then forwards the audio (and video) data for the remote attendee devices RA(B1-BN) to Director A. In the reverse direction, Director A sends audio and video data from VENUE A to Director B and then Director B forwards the audio and video data to the RAS. The RAS forwards the audio and video data to the remote attendee devices RA(B1-BN).

In a second option, Director B 250 obtains audio and video data for the remote attendee devices RA(A1-AN) directly from the RAS 304. Director A tells Director B how to obtain the audio and video data for the remote attendee devices RA(A1-AN). For example, Director A can send to Director B an address of a source of the audio (and video) data at the media server 313. In the reverse direction, Director B sends audio and video data from VENUE B to the RAS. The RAS forwards the audio and video data to the remote attendee devices RA(A1-AN). Similarly, Director A obtains audio and video data for the remote attendee devices RA(B1-BN) directly from the RAS 304. Director B tells Director A how to obtain the audio and video data for the remote attendee devices RA(B1-BN). In the reverse direction, Director A sends audio and video data from VENUE A to the RAS. The RAS forwards the audio and video data to the remote attendee devices RA(B1-BN). This second option has an advantage of reducing delay for the audio and video data. It also has an advantage of reducing processing load on the Directors 150, 250.
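The difference between the two forwarding options can be illustrated with a minimal sketch of the hop sequence audio takes from the RAS to the consuming Director. The names (`home_director`, `media_path`) are illustrative only, not part of the disclosure:

```python
def media_path(option, remote, consumer_director):
    """Return the hop sequence audio takes from the RAS to the Director
    that consumes it, for a remote attendee associated with the other venue."""
    if option == 1:
        # Option 1: RAS -> home Director -> peer Director (extra hop, extra delay)
        return ["RAS", remote["home_director"], consumer_director]
    # Option 2: RAS -> peer Director directly, using a media server source
    # address shared by the home Director
    return ["RAS", consumer_director]

# A remote attendee associated with VENUE A, consumed by Director B
ra = {"id": "RA(A1)", "home_director": "Director A"}
path_option1 = media_path(1, ra, "Director B")
path_option2 = media_path(2, ra, "Director B")
```

The shorter hop sequence of option 2 reflects the stated advantages: reduced delay and reduced processing load on the Directors.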

In FIGURE 21A and FIGURE 21B each of the remote attendee devices is associated with one of the venues. In another example, each of the remote attendee devices is not associated with a particular one of the venues. Each of the Directors 150, 250 can obtain information about the available remote attendee devices from the RAS, receive audio (and video) data for the remote attendee devices from the RAS and send audio (and video) data for the remote attendee devices to the RAS. One or both of the Directors 150, 250 can send control data to the RAS and receive control data from the RAS for the remote attendee devices.

FIGURES 22 and 23 illustrate how a remote attendee device RA(A1) can participate in a meeting. FIGURE 22 shows an example of audio flow paths. These are similar to the inter-venue audio flow paths shown in FIGURE 7. Audio is forwarded and received (i.e. bridged) between VENUE A and remote attendee device RA(A1). In this example, DU(A1) is active (i.e. microphone turned on) at VENUE A and RA(A1) is participating in the meeting. Audio of the speaking participant is captured by a microphone at DU(A1) and output to the CCU 140. CCU 140 outputs a mix of the audio of active DUs to each of the DUs DU(A1-AN). CCU 140 also outputs the audio (or an audio mix if there are multiple active DUs) to Director 150. Director 150 may decode, or transcode, audio received from the CCU 140. Director 150 encodes the audio and forwards the encoded audio to RAS 304. RAS 304 forwards the audio to the Remote Attendee Client Application of RA(A1). This allows the participant using RA(A1) to hear the speaking participant at VENUE A. The reverse audio path between RA(A1) and VENUE A is also shown.

FIGURE 23 shows an example of flow paths for control signals associated with activating/deactivating DUs. This is similar to the inter-venue example shown in FIGURE 12. In this example, the participant at RA(A1) wishes to speak. They can request to speak by pressing a virtual button on the display of their Remote Attendee Client Application 305. Control data indicating RTS is sent from RA(A1) to the RAS, and forwarded to Director A. RA(A1) is added to the RTS list. At a later time, the participant at DU(A1) at VENUE A requests to speak by pressing the RTS button on their DU. DU(A1) is added to the RTS list. At a later time, the Chairperson allows RA(A1) to speak. A “Turn MIC ON” indication is sent to the RAS and forwarded to the Remote Attendee Client Application 305 of RA(A1). This turns on the microphone at RA(A1). The Remote Attendee Client Application 305 sends a MIC ON confirmation to the RAS, which is forwarded to Director A. From these examples, it can be seen that the remote attendee has the same kind of meeting experience as a participant at VENUE A.
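The RTS list behaviour described above, in which local and remote requests are queued alike and the Chairperson grants the microphone, might be sketched as follows (the `RtsList` class is illustrative):

```python
class RtsList:
    """Illustrative sketch of the Director's request-to-speak (RTS) list.
    Local DUs and remote attendees are queued in arrival order; the
    Chairperson grants the microphone to one participant at a time."""

    def __init__(self):
        self.queue = []     # participants waiting to speak, in order
        self.active = None  # participant currently granted the microphone

    def request_to_speak(self, participant):
        """Add a participant to the RTS list (duplicates ignored)."""
        if participant not in self.queue:
            self.queue.append(participant)

    def grant(self, participant):
        """Chairperson allows a queued participant to speak."""
        self.queue.remove(participant)
        self.active = participant
        return f"Turn MIC ON -> {participant}"

rts = RtsList()
rts.request_to_speak("RA(A1)")  # remote attendee presses virtual RTS button
rts.request_to_speak("DU(A1)")  # local participant presses DU RTS button
msg = rts.grant("RA(A1)")       # Chairperson allows RA(A1) to speak
```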

FIGURES 22 and 23 only show VENUE A. Remote attendees can also participate in an inter-venue meeting, such as a meeting between VENUE A and VENUE B. As described above, Director A can forward to Director B audio, video and control data for a remote attendee associated with VENUE A. Alternatively, Director A can forward to Director B control data which allows Director B to communicate with the RAS and directly receive audio and video data for a remote attendee associated with VENUE A. Director A can receive from Director B audio, video and control data for a remote attendee associated with VENUE B. Alternatively, Director A can forward to Director B control data which allows Director B to communicate with the RAS and directly send audio and video data for a remote attendee associated with VENUE A.

Remote attendee devices RA(T1),...RA(TN).

Returning to FIGURE 20, the Director 150 at VENUE A is connected to wireless equipment 320 which supports a wireless connection with remote attendee telephone devices RA(T1),...RA(TN) via a cellular wireless network. The number of remote attendee telephone devices can be any number between 1 and N. There are several possible ways of forming a communication path to the remote attendee telephone device. RA(TN) is connected end-to-end via a cellular network. For example, RA(TN) is a wireless cellular phone (e.g. 2G/3G/4G/5G cellular phone). RA(T1) is a telephone connected to the public switched telephone network (PSTN). The communication path between the wireless equipment 320 and the remote attendee device RA(T1) comprises a portion via a cellular wireless network and a portion via the PSTN. The wireless equipment 320 has one or more phone numbers associated with it. A remote attendee can dial in to join a meeting by dialling the phone number associated with the wireless equipment 320. It is also possible for an operator, such as a chairperson of a meeting, to dial out to the remote attendee. An advantage of providing wireless equipment 320 connected to the Director is that the conference organiser can retain control over the phone numbers to join a meeting. A call via the cellular wireless network terminates at the wireless equipment 320. This contrasts with a situation where the Director connects to a private branch exchange (PBX) or similar equipment at a venue, where the phone number is associated with the PBX equipment and there is a need to interface with the PBX, ensure the PBX has sufficient capacity to support the number of remote attendees who will be dialling in, and distribute the phone number to attendees.

FIGURE 24 shows an example of the wireless apparatus 320 for use at a venue. The wireless apparatus 320 can be provided in the form of a dongle or other plug-in device. The wireless apparatus 320 supports a connection with a single remote attendee telephone device. A plurality of similar dongles 320 can be provided. The wireless equipment 320 has an interface 322 for interfacing with the Director 150, such as a Universal Serial Bus (USB) interface with data lines and power lines. The interface 322 can be in the form of a USB plug. The interface 322 has data lines 322A and power lines 322B. The data lines 322A carry audio data and control data, such as control data relating to call control and control data which indicates status of the apparatus 320 and/or a call. The wireless apparatus 320 comprises an audio processing module 323 which is configured to convert audio data between a format used on the interface 322 and a format used by the cellular wireless transceiver 324. The cellular wireless transceiver 324 is configured to receive/send audio data from/to the audio processing module 323. The cellular wireless transceiver 324 has an RF interface which connects to an antenna 321. The RF interface is configured to send/receive a modulated RF signal. The cellular wireless transceiver 324 is configured to support a voice call via a cellular wireless network, such as a cellular wireless network operating according to 2G/3G/4G/5G or other network technology. The cellular wireless transceiver 324 is connected to, or includes, a Subscriber Identity Module (SIM) reader 326. The SIM reader accepts a SIM card 325 which provides a cellular wireless network identity, such as a Mobile Station International Subscriber Directory Number (MSISDN).

A processor 328 is connected to the data lines 322A, and to the cellular wireless transceiver 324. The processor 328 controls operation of the cellular wireless transceiver 324. The processor 328 is configured to handle call control. The processor 328 notifies the Director when there is an incoming call to the cellular wireless transceiver 324 and causes the cellular wireless transceiver 324 to answer the incoming call in response to an instruction received from the Director. The processor 328 can also cause the cellular wireless transceiver 324 to dial out to a telephone number provided by control data received from the Director. The processor 328 is also configured to output control data onto the data lines 322A. The control data can emulate the control signals normally found on a CCU interface. In this way, the wireless apparatus 320 can appear, to the Director, as another CCU. The processor 328 is configured to map an operating state of the wireless apparatus to CCU control data. The state diagrams of FIGURES 25-27 illustrate example values of the mapped control data. The control data is indicative of: an operating state of a CCU (on/off); a connection state of a delegate unit (DU) connected to a CCU (connected/disconnected); a state of a microphone of the DU (on/off/RTS).

The antenna 321 may be integrated with the wireless apparatus 320 or may be physically separate from the wireless apparatus 320 and connected to it by a cable. Physically separating the antenna from the wireless apparatus 320 can be advantageous where the Director is located in a room with poor cellular wireless reception.

Operation of the wireless apparatus/dongle 320 will now be described. FIGURE 25 shows a state diagram for the wireless apparatus 320. In state 331 the dongle 320 is disconnected from the Director 150. When the dongle is plugged into the Director 150, it moves to a Device Initialised state. To emulate a CCU, the dongle indicates that the CCU is on, but the DU/MU is disconnected. When the dongle connects to a wireless cellular network, it moves to a Connected to mobile network state. To emulate a CCU, the dongle indicates that the CCU is on, and the DU/MU is connected and off. The dongle shown in FIGURE 24 has a single SIM card and is capable of connecting to a single caller at a time. The dongle reports that it is connected to a single DU.

FIGURE 26 shows operation of the wireless apparatus/dongle 320 for an incoming call. When a person wishes to participate in a meeting, they can dial the mobile network number of the dongle 320. The initial state is state 334. To emulate a CCU, the dongle indicates that the CCU is on, and the DU/MU is connected and off. An incoming call (ringing, but not answered) causes the dongle to move to state 335. To emulate a CCU, the dongle indicates that the CCU is on, and the DU/MU is connected and RTS. When an operator at VENUE A (e.g. the Chairperson) accepts the call, the dongle moves to state 336. To emulate a CCU, the dongle indicates that the CCU is on, and the DU/MU is connected, on and muted. The caller can hear audio of the meeting. When an operator at VENUE A unmutes the caller, the dongle moves to state 337. To emulate a CCU, the dongle indicates that the CCU is on, and the DU/MU is connected and on. The caller can hear audio of the meeting and be heard in the meeting. The various paths back to state 334 are illustrated, along with the conditions required for the return.
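The incoming-call state transitions described above, and the CCU control data each state emulates, might be summarised in a table-driven sketch. The state names are illustrative labels for states 334-337; the mapping follows the description of FIGURE 26:

```python
# Emulated CCU control data per dongle state (states 334-337)
STATES = {
    "idle":     {"ccu": "on", "du": "connected, off"},        # state 334
    "ringing":  {"ccu": "on", "du": "connected, RTS"},        # state 335
    "answered": {"ccu": "on", "du": "connected, on, muted"},  # state 336
    "unmuted":  {"ccu": "on", "du": "connected, on"},         # state 337
}

# Valid (state, event) -> next-state transitions
TRANSITIONS = {
    ("idle", "incoming_call"): "ringing",
    ("ringing", "operator_accepts"): "answered",
    ("answered", "operator_unmutes"): "unmuted",
    ("ringing", "caller_hangs_up"): "idle",
    ("answered", "caller_hangs_up"): "idle",
    ("unmuted", "caller_hangs_up"): "idle",
}

def step(state, event):
    """Return the next dongle state for an event; ignore invalid events."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["incoming_call", "operator_accepts", "operator_unmutes"]:
    state = step(state, event)
emulated = STATES[state]  # control data reported to the Director
```

The same table-driven structure would cover the outgoing-call states 341-344 of FIGURE 27, with "operator dials out" replacing "incoming_call".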

FIGURE 27 shows operation of the wireless apparatus/dongle 320 for an outgoing call. In some situations, a person can be invited to join a meeting by phoning out to the person. For example, a council meeting to discuss planning applications may only require input from an applicant during a short period of the meeting. The person can be contacted by phone shortly before their item is due to be discussed. The initial state is state 341. An operator calls a phone number of a participant and moves to ringing state 342. To emulate a CCU, the dongle indicates that the CCU is on, and the DU/MU is connected and RTS. When the called party accepts (answers) the call, the dongle moves to state 343. To emulate a CCU, the dongle indicates that the CCU is on, and the DU/MU is connected, on and muted. The called party can hear audio of the meeting. When an operator at VENUE A unmutes the caller, the dongle moves to state 344. To emulate a CCU, the dongle indicates that the CCU is on, and the DU/MU is connected and on. The caller can hear audio of the meeting and be heard in the meeting. The various paths back to state 341 are illustrated, along with the conditions required for the return.

FIGURES 26 and 27 show examples where the microphone of a remote attendee is always on, and the operator (Chairperson) can control when to mute and unmute the microphone. In another method of operation, the remote attendee can request to speak in a similar manner as a participant of a DU in VENUE A. The remote attendee can press a key (e.g. #) to request to speak and the dongle can determine when the RTS key has been pressed by recognising Dual Tone Multi Frequency (DTMF) tones in the audio. The dongle can forward a RTS indication to the Director when the RTS indication is detected from the remote attendee. In FIGURES 26 and 27, a participant is added to an RTS list when the participant answers their phone (outgoing call) or the incoming call is ringing. A remote attendee may use a keypad of their telephone device to vote by selecting a predetermined key, such as ‘1’ to vote YES, ‘2’ to vote NO and ‘3’ to ABSTAIN from the vote.
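The DTMF handling described above might be sketched as a simple key-to-action mapping. The specific keys follow the examples given ('#' for RTS, '1'/'2'/'3' for voting), but the key assignments are configurable, not fixed:

```python
# Illustrative mapping of DTMF keys detected in the caller's audio to
# meeting actions forwarded by the dongle to the Director.
DTMF_ACTIONS = {
    "#": "REQUEST_TO_SPEAK",
    "1": "VOTE_YES",
    "2": "VOTE_NO",
    "3": "VOTE_ABSTAIN",
}

def handle_dtmf(key):
    """Translate a detected DTMF key into an action; unknown keys
    produce no action."""
    return DTMF_ACTIONS.get(key)

actions = [handle_dtmf(k) for k in ["#", "1", "9"]]
```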

The wireless apparatus 320 supports a connection to a single remote attendee device. A plurality of the wireless apparatus (dongle) 320 can be provided, depending on the number of remote attendees to be supported. Each wireless apparatus 320 can appear, to the Director, as a separate CCU serving a single DU (e.g. RA(T1)). In another example, a wireless apparatus can be provided which supports a plurality of simultaneous wireless cellular connections. The wireless apparatus 320 can comprise a plurality of individual dongles, the same as shown in FIGURE 24. Alternatively, the wireless apparatus 320 can comprise apparatus with capacity to handle a plurality of simultaneous connections, where each connection is associated with a mobile phone number to allow a user to dial in, if required.

In an alternative form of the wireless apparatus 320, the connector 322 can be an Ethernet plug and the processor 328 can be configured to communicate via an Ethernet interface. This can allow the wireless apparatus 320 to be located further from the Director 150 than is possible with a USB interface. For example, the wireless apparatus 320 can be connected to an Ethernet LAN at a position which offers good cellular wireless coverage.

DIRECTOR CONNECTED TO MULTIPLE CCUS

In the example shown in FIGURE 3, a Director 150 connects to one CCU 140 or a chain of CCUs which are homogeneous, such as CCUs from the same vendor. FIGURE 28 shows an example of apparatus at VENUE A which connects to multiple CCUs. Two CCUs (CCU1, CCU2) are shown, but this can be extended to a larger number of CCUs. Audio conferencing equipment 105(1) comprises CCU1 140(1) and DUs 110(1). Audio conferencing equipment 105(2) comprises CCU2 140(2) and DUs 110(2).

In one example, CCU1 is different to CCU2. That is, an interface required to communicate with CCU1 is different to an interface required to communicate with CCU2. The interfaces are heterogeneous. CCU1 may use different technology compared to CCU2. CCU1 may be from a different vendor to CCU2, or CCU1 and CCU2 may be from the same vendor, but different (incompatible) model ranges. For example, CCU1 may send control commands to turn a microphone on/off at a DU 110. The format of these control commands can be different between CCU1 and CCU2. Other examples of differing technologies include audio coding, audio formats and encryption. The interface with a CCU comprises a CCU control interface for carrying control data and a CCU audio interface for carrying audio data.

The Director 150 can connect to one or more other Directors, as previously described, to form inter- venue meetings. The Director 150 can connect to a RAS to connect to remote attendee devices, as previously described.

Director 150 stores data about participants in a meeting, shown as the boxes: VENUE A (CCU1); VENUE A (CCU2); REMOTE (A). The Director 150 has an overview of all participants in a meeting and can allow all of the participants to participate in a unified meeting environment. Participants at DUs connected to CCU1, participants at DUs connected to CCU2 and remote attendees can hear (and optionally see) each other. Also, all participants can participate in voting and access the meeting agenda and related documents.

FIGURE 29 shows an example overview of interfaces and of signals sent between equipment at venue 100. This is similar to FIGURE 6A, and therefore only significant differences will be described. Each CCU (CCU1, CCU2) may communicate with connected DUs using a (different) proprietary communications protocol. CCU1 140(1) is configured to output audio and control data/signals to the Director 150. The audio may be a mix of audio signals from one or more active microphones. The Director 150 has a CCU1 (control) interface 151(1) configured to communicate with CCU1, and a CCU2 (control) interface 151(2) configured to communicate with CCU2. The Director 150 has an audio interface (CCU1) or CCU1 audio interface 152(1) configured to communicate audio with CCU1, and an audio interface (CCU2) or CCU2 audio interface 152(2) configured to communicate audio with CCU2. As described above, the audio and control signals between the CCU 140 and Director 150 may be carried in frame or packet-based form over a local area network, such as in payloads of Ethernet frames. The CCU1 interface 151(1), 152(1) between the CCU1 140(1) and Director 150 can have an IP (TCP/IP) layer. It can also comprise at least one vendor-specific layer, such as one or more of: an audio coding format; an audio data encryption type/format; a protocol for sending control data; a control data coding format; a control data encryption type/format. The at least one vendor-specific layer of the CCU1 interface 151(1), 152(1) may be different to the at least one vendor-specific layer of the CCU2 interface 151(2), 152(2). The at least one vendor-specific layer of the CCU1 audio interface 152(1) may be different to the at least one vendor-specific layer of the CCU2 audio interface 152(2). The at least one vendor-specific layer of the CCU1 control interface 151(1) may be different to the at least one vendor-specific layer of the CCU2 control interface 151(2).
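The notion of a vendor-specific interface layer per CCU might be sketched as one adapter per vendor, so the Director's core speaks a single internal command format. The class names and command formats below are invented for illustration only:

```python
class CcuAdapter:
    """Base class for a vendor-specific CCU control-interface layer."""
    def encode_mic_on(self, du_id):
        raise NotImplementedError

class Vendor1Adapter(CcuAdapter):
    def encode_mic_on(self, du_id):
        # Hypothetical vendor-1 text command format
        return f"V1:MIC_ON:{du_id}"

class Vendor2Adapter(CcuAdapter):
    def encode_mic_on(self, du_id):
        # Hypothetical vendor-2 markup command format
        return f"<cmd du='{du_id}'>on</cmd>"

# The Director holds one adapter instance per connected CCU, so the same
# core instruction ("turn this microphone on") is translated per vendor.
adapters = {"CCU1": Vendor1Adapter(), "CCU2": Vendor2Adapter()}
cmd_ccu1 = adapters["CCU1"].encode_mic_on("A1")
cmd_ccu2 = adapters["CCU2"].encode_mic_on("A1")
```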

The Director 150 can be provided with interface data to allow the Director to communicate with a range of different CCUs. The Director 150 can be provided with interface data to allow the Director to communicate with a range of different video equipment.

FIGURE 29 shows a system which is similar to FIGURE 6A, with a LAN connection between equipment and the Director. It is also possible to use a conversion unit 160, as shown in FIGURE 6B, to convert between analogue signals (to/from a CCU) and digital signals (to/from the Director).

One advantage of the arrangement shown in FIGURES 28 and 29 is that an owner can add new conferencing equipment to an existing system, or replace some of their existing equipment, while still providing a uniform experience to participants. Existing equipment can comprise CCU1 140(1) and DUs 110(1), and new equipment can comprise CCU2 140(2) and DUs 110(2). The Director interfaces with each set of equipment and allows incompatible equipment, or equipment which is not designed to be used together, to interoperate via the Director. In some examples, CCU1 and CCU2 are the same or similar (i.e. they are homogeneous), but CCU1 and CCU2 are of a type which are not intended to connect or interoperate with one another. By individually connecting CCU1 and CCU2 to Director 150, it is possible to support a unified meeting between DUs 110(1) connected to CCU1 and DUs 110(2) connected to CCU2. CCU1 and CCU2 can be unaware of the existence of each other. This can be useful when an operator wants to expand the number of DUs in their system. For example, CCU1 may already support the maximum number of DUs and CCU1 may not be designed to interoperate with another CCU. By connecting CCU1 and CCU2 to the Director 150, the operator can add CCU2 and DUs 110(2) to the system without having to replace CCU1 and DUs 110(1).

FIGURE 30 shows an example of audio flow paths between DUs connected to CCU1 and DUs connected to CCU2. These are similar to the inter-venue audio flow paths shown in FIGURE 7. In this example, DU(CCU1_A1) and DU(CCU2_A1) are active (i.e. microphone turned on). Audio is forwarded and received (i.e. bridged) between CCU1 and CCU2 at VENUE A. Audio of the speaking participant is captured by a microphone at DU(CCU1_A1) and output to CCU1. CCU1 outputs a mix of the audio of active DUs to each of the DUs DU(CCU1_A1-AN). CCU1 also outputs the audio (or an audio mix if there are multiple active DUs) to Director 150. Director 150 may decode, or transcode, audio received from CCU1. This will depend on the requirements of CCU2. Director 150 forwards audio to CCU2 and DUs DU(CCU2_A1-AN). This allows the participants at DU(CCU2_A1-AN) to hear the speaking participant. The reverse audio path from a participant at a DU connected to CCU2 is similar.

If Director A is connected to a Director at another venue (VENUE B), then Director A will also receive audio data from VENUE B. Director A forwards audio (or an audio mix) of CCU1_A1 plus VENUE B audio. If Director A is connected to a RAS and/or a cellular wireless apparatus, then Director A will also receive audio data from remote attendees. Director A forwards audio (or an audio mix) of CCU1_A1 plus remote attendee audio. If Director A is connected to VENUE B and a RAS and/or a cellular wireless apparatus, then Director A forwards audio (or an audio mix) of CCU1_A1 plus VENUE B audio plus remote attendee audio.
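The forwarding rule described above, in which each destination receives a mix of all other active sources, might be sketched as follows (the source labels are illustrative):

```python
def mix_for(destination, sources):
    """Sketch of the Director's forwarding rule: each destination receives
    a mix of all active audio sources except its own, so participants do
    not hear their own audio back."""
    return sorted(s for s in sources if s != destination)

# Director A connected to CCU1, VENUE B and remote attendees
sources = {"CCU1", "VENUE_B", "REMOTE"}
to_ccu1 = mix_for("CCU1", sources)      # what CCU1/DUs hear
to_remote = mix_for("REMOTE", sources)  # what remote attendees hear
```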

FIGURE 31 shows an example of flow paths for control signals associated with activating/deactivating DUs. This is similar to the inter-venue example shown in FIGURE 12. In this example, participant DU(CCU1_A1) wishes to speak. Control data indicating RTS is sent from CCU1 to Director A. DU(CCU1_A1) is added to the RTS list. At a later time, DU(CCU2_A1) wishes to speak. DU(CCU2_A1) is added to the RTS list. At a later time, the Chairperson allows DU(CCU1_A1) to speak. A “Turn MIC ON” indication is sent to CCU1 and forwarded to DU(CCU1_A1). This turns on the microphone at DU(CCU1_A1). DU(CCU1_A1) sends a MIC ON confirmation to CCU1, which is forwarded to Director A. From these examples, it can be seen that the participants at DUs of both CCUs participate in a unified meeting experience.

In some examples, the two CCUs (CCU1, CCU2) will be located in the same room. It is also possible to connect two CCUs in different rooms. For example, CCU1 can be located in room 1 and CCU2 can be located in room 2. In some examples, the two CCUs can be the same type of equipment (e.g. same vendor and model) and the Director can communicate with CCU1 and CCU2 using the same interface, or instances of the same interface. In other examples, the two CCUs can be different (e.g. different vendors, or same vendor but different models) and the Director can communicate with CCU1 and CCU2 using two different (heterogeneous) interfaces. It is possible to configure a meeting which combines CCU2 and CCU1 for part of a meeting, and allows CCU2 and CCU1 to operate independently for another part of the meeting.

While this part of the disclosure describes two CCUs, it will be understood that it can be applied to any number N of CCUs.

SOFTWARE ARCHITECTURE

FIGURE 32 schematically shows some of the main functional modules of the Director 150, 250. Media module 702 is configured to perform various functions relating to media (audio, video). These include one or more of: encoding, decoding and transcoding audio and video data; synchronising audio and video; selecting video sources; processing video content for displays (e.g. picture-in-picture); local recording of media. Camera control module 704 is configured to control cameras at the local venue. Video switch module 706 is configured to switch between video sources. CCU control module 708 is configured to control CCUs in the manner described above. CCU control may require one or more vendor-specific interface layers per CCU. An inter-venue interface module 710 is configured to perform bridging between participants of a meeting. Participants may be associated with a real CCU or a virtual CCU (participants at another venue, remote attendees).

FIGURE 33 schematically shows operation of the control module 708 of the Director. Core functionality 720 interacts with a set of abstract CCU interfaces 722. Each abstract CCU interface 722 comprises a number of abstract DUs 723 served by the abstract CCU. An interface 724 lies between the abstract CCU 722 and a communication link to hardware. Interface 724 allows the abstract CCU 722 to interact with the actual hardware that the abstract CCU represents. This approach allows the core functionality to interact with a range of different equipment which connect to participants. Each abstract CCU can represent an actual hardware CCU (e.g. CCU 140(1) in FIGURE 28) or some other hardware, such as the RAS or cellular wireless dongle. The RAS can be represented as a CCU which hosts a number of DUs, where the DUs are actually remote attendees connected to the RAS. The wireless dongle can be represented as a CCU which hosts a single DU, or a plurality of DUs. Another venue (e.g. VENUE B) can be represented by an abstract CCU [bridging]. The control part of the inter-venue interface shown as 158, FIGURE 6A communicates with the abstract CCU [bridging].
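The separation between the core functionality 720, the abstract CCU interfaces 722 and the hardware-facing interfaces 724 resembles a conventional adapter pattern. A minimal Python sketch follows; all names are hypothetical and introduced purely for illustration:

```python
from abc import ABC, abstractmethod


class CCUInterface(ABC):
    """Abstract CCU (722): the core functionality sees only this API,
    whether the hardware behind it is a real CCU, a RAS or a dongle."""

    @abstractmethod
    def turn_mic_on(self, du_index):
        ...


class VendorXCCU(CCUInterface):
    """Interface 724 for a hardware CCU from vendor X: converts the
    abstract command into the vendor's own message format."""

    def turn_mic_on(self, du_index):
        return {"proto": "vendor-x", "cmd": "MIC_ON", "du": du_index}


class RASAsCCU(CCUInterface):
    """The RAS represented as a CCU whose DUs are remote attendees."""

    def turn_mic_on(self, du_index):
        return {"proto": "ras", "cmd": "unmute", "attendee": du_index}


# The core functionality issues the same call regardless of the backend.
for ccu in (VendorXCCU(), RASAsCCU()):
    ccu.turn_mic_on(1)
```

The design choice is that only the concrete adapters know any vendor-specific protocol, so new equipment (or a virtual CCU for bridging) can be added without changing the core functionality.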

Some examples of operation will now be described. The table shown in FIGURE 13 will be used to illustrate this example, with seats 1-4 representing DUs at VENUE A and VENUE B, and seat 5 representing a remote attendee. Consider that the Director wishes to turn on a microphone at seat 1. The Director determines, using stored data, that seat 1 is actually DU1 of CCU1. A command “Turn MIC ON” is a command used by the core functionality 720. The core functionality 720 instructs the abstract CCU representing CCU1 to turn on a microphone at DU1. The vendor-specific interface 724 converts this command to an actual command that CCU1 can understand. For example, CCU1 may require a particular protocol (e.g. a specific exchange of messages) to perform this action, and/or messages in a particular coded format. The interface 724 may be proprietary to vendor X.

Now consider that the Director wishes to turn on a microphone at seat 5. The Director determines, using stored data, that seat 5 is actually remote attendee DU1 connected to the RAS. The same command “Turn MIC ON” is sent to the abstract CCU representing the RAS to turn on a microphone at DU1. The vendor-specific interface 724 converts this command to an actual command that CCU [RAS] can understand. For example, CCU [RAS] may require a particular protocol (e.g. a specific exchange of messages) to perform this action, and/or messages in a particular coded format. The core functionality can use the same set of commands (calls) to perform a function at a DU, or to determine status of a DU.

Now consider that the Director wishes to turn on a microphone at seat 3. The Director determines, using stored data, that seat 3 is actually connected to a CCU at VENUE B. An instruction to turn on the microphone is sent to the abstract CCU [bridging] module and on to the inter-venue interface 158, which forwards it, as inter-venue control data, to Director B.
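The three seat examples above all follow the same lookup-then-dispatch pattern: resolve the seat against stored data, then issue the one abstract command. As an illustrative sketch (the mapping table, seat assignments and function name are hypothetical):

```python
# Stored data mapping a seat number to the abstract CCU that actually
# hosts it and the DU index on that CCU (illustrative values only).
SEAT_MAP = {
    1: ("CCU1", 1),          # seat 1 is DU1 of CCU1
    5: ("CCU[RAS]", 1),      # seat 5 is remote attendee DU1 on the RAS
    3: ("CCU[bridging]", 1), # seat 3 is hosted by a CCU at VENUE B
}


def turn_mic_on(seat):
    """Resolve a seat to its abstract CCU and issue the same abstract
    'Turn MIC ON' command regardless of the backend."""
    ccu_name, du_index = SEAT_MAP[seat]
    return {"target": ccu_name, "cmd": "Turn MIC ON", "du": du_index}


turn_mic_on(1)   # routed to the vendor-specific interface for CCU1
turn_mic_on(5)   # routed to the RAS
turn_mic_on(3)   # routed via the abstract CCU [bridging] towards Director B
```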

SERVER-BASED DIRECTOR

In the system described above, a Director 150 at VENUE A communicates with a Director 250 at VENUE B. The Directors 150, 250 can perform all of the functionality to support a bridged meeting between DUs at VENUE A and VENUE B. One advantage of providing functionality at each venue is that the local Director can continue to support a meeting between participants at the local venue in the event of a network failure, such as a failure of a communication link between the Directors 150, 250.

FIGURE 34 shows another system. At least part of the functionality of the Directors 150, 250 shown in FIGURE 3 is now implemented by a Virtual Director V-DIR 350 hosted by a server. VENUE A has an apparatus (Director) 150’. Apparatus (Director) 150’ can have reduced functionality compared to the Director 150 of FIGURE 3. Similarly, VENUE B has an apparatus (Director) 250’ which can have reduced functionality compared to the Director 250 of FIGURE 3. Apparatus 150’ and apparatus 250’ communicate with the V-DIR 350. For example, V-DIR 350 may perform one or more of: audio coding and/or transcoding; video coding and/or transcoding; maintaining a RTS list for participants at the venues; Chairperson functionality for the venues; configuration of a conference. Each of the Directors 150’, 250’ hosts a client application which communicates with V-DIR 350. Directors 150’, 250’ have an inter-venue interface. The inter-venue interface communicates with V-DIR 350.

This system has an advantage that functionality provided at the server V-DIR 350 can be shared between multiple venues. One example of shared functionality is that V-DIR 350 can perform coding, decoding and/or transcoding of audio and video data. Director 150’ can send audio data to V-DIR in a format used at VENUE A and V-DIR 350 can perform transcoding to a format used at VENUE B before sending to Director 250’. Similarly, if there are other venues, V-DIR 350 can perform transcoding to the format(s) required by each of the other venues. Another example of shared functionality is that V-DIR 350 can host a RTS list for the venues.
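The shared transcoding role of V-DIR 350 can be pictured as a hub that converts each venue's audio format once per destination. The following Python fragment is a schematic sketch only; the function names and format labels are hypothetical, and the string tagging stands in for real codec conversion:

```python
def transcode(audio_data, src_format, dst_format):
    """Placeholder for real audio transcoding (codec conversion)."""
    if src_format == dst_format:
        return audio_data
    # Tag the data to show a conversion took place (illustration only).
    return f"{audio_data}:{src_format}->{dst_format}"


def vdir_forward(audio_data, src_venue, venue_formats):
    """V-DIR receives audio in the source venue's format and sends each
    other venue a copy transcoded to that venue's own format."""
    out = {}
    for venue, fmt in venue_formats.items():
        if venue == src_venue:
            continue
        out[venue] = transcode(audio_data, venue_formats[src_venue], fmt)
    return out


venue_formats = {"A": "opus", "B": "g711", "C": "opus"}
vdir_forward("frame", "A", venue_formats)
```

Centralising this at the server means each local Director only ever handles its own venue's format, while V-DIR fans out to however many venues join.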

Director 150’ can operate as follows: receive CCU audio from the CCU 140 and send inter-venue audio data representing the CCU audio (to the V-DIR) via the inter-venue interface; receive inter-venue audio data (from the V-DIR) via the inter-venue interface and send audio representing the inter-venue audio data to the CCU 140; receive CCU control data from the CCU 140 and, optionally, send CCU control data to the CCU 140; send inter-venue control data (to the V-DIR) via the inter-venue interface which is indicative of a status or action of a DU connected to the CCU 140; and receive inter-venue control data (from the V-DIR) via the inter-venue interface which is indicative of a status or action of a DU at the second conference venue.

Each local Director 150’, 250’ may perform functionality and/or store data to permit stand-alone operation of a conference for at least the DUs connected to the local CCU (or multiple CCUs) in the event of a failure to communicate with V-DIR 350.

In FIGURE 34, the RAS 304 can communicate with the V-DIR 350 (as shown) and/or with the Directors 150’, 250’.

FIGURE 35 shows an example of a processing apparatus 800 which may implement at least part of the processing of the invention, such as one of the Directors 150, 250. The processing apparatus 800 may implement the method of FIGURE 16. Processing apparatus 800 comprises one or more processors 801 which may be any type of processor for executing instructions to control the operation of the device. The processor 801 is connected to other components of the apparatus via one or more buses 802. Processor-executable instructions 804 may be provided using any data storage device or computer-readable media, such as memory 803. The processor-executable instructions 804 comprise instructions for implementing the functionality of the described methods. The memory 803 is of any suitable type, such as non-volatile memory or a magnetic or optical storage device. Memory 805, or memory 803, stores data used by the processor. This can include one or more of: data about active DUs (DUs with a microphone turned on); data about DUs who have requested to speak (RTS list); configuration data for a meeting.

The processing apparatus 800 comprises an I/O interface 807 which allows the processing apparatus 800 to connect to a display and an input device (e.g. keyboard, mouse or pointer device). The I/O interface 807 can comprise a USB interface to communicate with a conversion unit 160 (FIGURE 6B). A user interface 812 such as the GUI shown in FIGURES 17-19, can be provided on a display connected to the I/O interface 807, or on another computing device which communicates with the processing apparatus 800 via the network interface 808 or the wireless network interface 809.

The processing apparatus 800 comprises various network interfaces 808-810. Network interface 808 connects to a wired network, such as a local area network and/or a gateway/router. Wireless network interface 809 connects to a wireless network, such as audio and/or video equipment within the venue, or other wireless equipment. The network interfaces 808, 809 can communicate audio, control and video data to equipment, and connect to another venue. Cellular wireless network interface 810 can be one or more of the devices 320 shown in FIGURE 24.

The processing operations can be grouped into two: (i) in-meeting processing operations, such as sending/receiving audio, video and control data; coding/decoding of audio, video and control data; bridging data between participants etc.; and (ii) configuration of a meeting.

FIGURE 35 shows one processing apparatus which can perform one, or both, of these groups of processing operations. Another processing apparatus may be used to execute part of the software. For example, configuration of a meeting (FIGURES 16-19) may be performed by another processing apparatus, with configuration data sent to the processing apparatus which performs the in-meeting processing operations.

FIGURE 36 shows a computer-implemented method for providing a bridged conference between conference equipment at a first conference venue and conference equipment at a second conference venue. At block 902 the method comprises receiving CCU audio from the CCU and sending inter-venue audio data representing the CCU audio to the apparatus at the second conference venue via the inter-venue interface. At block 904 the method comprises receiving inter-venue audio data via the inter-venue interface and sending audio representing the inter-venue audio data to the CCU. At block 906 the method comprises receiving CCU control data from the CCU and sending CCU control data to the CCU. At block 908 the method comprises sending inter-venue control data via the inter-venue interface which is indicative of a status or action of a DU connected to the CCU. At block 910 the method comprises receiving inter-venue control data via the inter-venue interface which is indicative of a status or action of a DU at the second conference venue.
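As a sketch, blocks 902-910 of FIGURE 36 can be read as one pass of a Director's bridging loop. The `Link` class below is a hypothetical in-memory stand-in for both the CCU connection and the inter-venue interface; none of these names come from the specification:

```python
class Link:
    """Minimal in-memory stand-in for a CCU or inter-venue interface."""

    def __init__(self, audio_in, ctrl_in):
        self.audio_in, self.ctrl_in = audio_in, ctrl_in
        self.sent_audio, self.sent_ctrl = [], []

    def receive_audio(self):
        return self.audio_in

    def receive_control(self):
        return self.ctrl_in

    def send_audio(self, audio):
        self.sent_audio.append(audio)

    def send_control(self, ctrl):
        self.sent_ctrl.append(ctrl)


def bridged_conference_step(ccu, inter_venue):
    """One illustrative pass over blocks 902-910 of FIGURE 36."""
    # Block 902: CCU audio out, as inter-venue audio data, to venue 2.
    inter_venue.send_audio(ccu.receive_audio())
    # Block 904: inter-venue audio in, delivered to the local CCU.
    ccu.send_audio(inter_venue.receive_audio())
    # Block 906: exchange control data with the CCU.
    ctrl = ccu.receive_control()
    ccu.send_control(ctrl)
    # Block 908: report local DU status/actions to the other venue.
    inter_venue.send_control(ctrl)
    # Block 910: receive DU status/actions from the second venue.
    return inter_venue.receive_control()
```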

FIGURE 37 shows a computer-implemented method for providing a conference between conference equipment. At block 912 the method comprises communicating with a first CCU and with a second CCU. At block 914 the communicating comprises: receiving first CCU audio and first CCU control data from the first CCU; and receiving second CCU audio and second CCU control data from the second CCU. At block 916 the communicating comprises: forwarding the first CCU audio to the second CCU; and forwarding the second CCU audio to the first CCU.

FIGURE 38 shows a computer-implemented method for supporting a bridged conference between conference equipment at a first conference venue and at least one remote attendee device which is external to the first conference venue. At block 922 the method comprises receiving CCU audio from the CCU and sending audio data representing the CCU audio to the remote attendee system via the remote attendee interface. At block 924 the method comprises receiving remote attendee audio data from the remote attendee system via the remote attendee interface and sending audio data representing the remote attendee audio data to the CCU. At block 926 the method comprises receiving CCU control data from the CCU and, optionally, sending CCU control data to the CCU. At block 928 the method comprises receiving remote attendee control data via the remote attendee interface which is indicative of a status or action at the remote attendee device. At block 930 the method comprises sending remote attendee control data via the remote attendee interface to control the remote attendee device.

FIGURE 39 shows a computer-implemented method for providing a conference between conference equipment at a first conference venue and a remote attendee telephone. At block 932 the method comprises communicating with a conference control unit (CCU) and with a cellular wireless apparatus which is configured to support a voice call with a remote attendee telephone, wherein the CCU is connectable to a plurality of delegate units (DUs) each having a microphone. At block 934 the communicating comprises receiving CCU audio from the CCU and sending audio data representing the CCU audio to the cellular wireless apparatus. The communicating also comprises receiving remote attendee audio data from the cellular wireless apparatus and sending audio representing the remote attendee audio data to the CCU. At block 936 the communicating comprises receiving control data from the cellular wireless apparatus.

Throughout the description and claims of this specification, the words “comprise” and “contain” and variations of the words, for example “comprising” and “comprises”, mean “including but not limited to”, and are not intended to (and do not) exclude other moieties, additives, components, integers or steps.

Throughout the description and claims of this specification, the singular encompasses the plural unless the context otherwise requires. In particular, where the indefinite article is used, the specification is to be understood as contemplating plurality as well as singularity, unless the context requires otherwise.