

Title:
METHODS, APPARATUSES, AND SYSTEMS FOR FACILITATING ELECTRONIC COMMUNICATIONS THROUGH A HAPTIC WEARABLE INTERFACE
Document Type and Number:
WIPO Patent Application WO/2016/128896
Kind Code:
A1
Abstract:
The methods, apparatuses, and systems for facilitating electronic communications through a haptic wearable interface provide a user interface for a user to send motion control commands to a computing device, which in turn executes user-desired functions. Some embodiments described herein include an apparatus that has a line of sensors installed on the side of a wearable device (e.g., in a shape similar to a wrist band, a wrist watch, etc.), such that the user can touch and glide on the line of sensors to make a selection of displayed items on the display screen of the wearable device. In this way, the user can more accurately position his/her fingers to select an item when the touch screen of the wearable device has a limited touch area.

Inventors:
TIAN SIMON (CA)
Application Number:
PCT/IB2016/050681
Publication Date:
August 18, 2016
Filing Date:
February 09, 2016
Assignee:
NEPTUNE COMPUTER INC (CA)
International Classes:
G06F3/0488; G04G21/08; G06F3/041; G06F3/0484
Foreign References:
US20140181750A12014-06-26
US20090084610A12009-04-02
US20040049743A12004-03-11
Attorney, Agent or Firm:
PERRY, Stephen (1300 Yonge Street, Suite 50, Toronto Ontario M4T 1X3, CA)
Claims:
CLAIMS

1. A user control apparatus, comprising:

a display member configured to display a plurality of selectable items;

a sensor strip including a plurality of touch sensors, disposed on a side of the display member, wherein:

each of the touch sensors is configured to sense a first contact pressure and a second contact pressure;

a processor, operatively coupled to the display member and the sensor strip, the processor configured to:

obtain a first indication of the first contact pressure at one of said sensors and in response display one of said selectable items aligned with said one of said sensors;

obtain a second indication of the second contact pressure at said one of said sensors;

compare the second contact pressure with the first contact pressure; and

if the second contact pressure is greater than the first contact pressure, instantiate said one of said selectable items.

2. The apparatus of claim 1, wherein the user control apparatus includes a wearable device.

3. The apparatus of claim 1, wherein the processor is further configured to send an instruction to the display member to display contents associated with said one of said selectable items.

4. The apparatus of claim 1, wherein said one of said selectable items is a communication contact.

5. The apparatus of claim 1, wherein the display member includes a touch screen area that is configured to receive user hand-writing gestures.

6. The apparatus of claim 1, wherein the processor is further configured to

generate a communication message based on user input; determine a destination of the communication message based on said one of said selectable items; and

send the communication message to the destination.

7. The apparatus of claim 1, wherein the plurality of selectable items are selectable contacts for the user to interact with.

8. The apparatus of claim 1, wherein the plurality of selectable items are selectable notes.

9. The apparatus of claim 1, wherein the plurality of selectable items are selectable calendar items.

10. The apparatus of claim 1, wherein the plurality of selectable items are selectable applications.

11. The apparatus of claim 1, wherein the processor displays a first plurality of selectable items in response to a user touch indication in one direction along said edge and a second plurality of selectable items in response to a user touch indication in an opposite direction along said edge.

Description:
METHODS, APPARATUSES, AND SYSTEMS FOR FACILITATING ELECTRONIC COMMUNICATIONS THROUGH A HAPTIC WEARABLE

INTERFACE

BACKGROUND

[0001] Computing devices provide one or more human-computer interfaces that allow users to perform a plurality of activities, including the execution of device operations and the reception of information. Examples of these human-computer interfaces include a monitor display, a mouse, a touch screen, or a keyboard. Human-machine communication can be performed via a touch screen user interface (UI), where a user taps, swipes, or scrolls on the touch screen to indicate different commands, such as selecting an item displayed on the touch screen, expanding a view displayed on the touch screen, or requesting more information to be displayed on the touch screen.

SUMMARY

[0002] Some embodiments described herein comprise a user control apparatus. The user control apparatus can include a mobile device or a wearable device. The user control apparatus includes a display member that is configured to display a list of selectable options, such as a first selectable item and a second selectable item. The user control apparatus has a sensor strip installed on a side of the display member such that a first touch sensor in the strip is aligned with the first selectable item displayed via the display member, and a second touch sensor in the strip is aligned with the second selectable item displayed via the display member. A user can use a fingertip to slide along the sensor strip such that the first touch sensor can sense a first contact pressure, and the second touch sensor can sense a second contact pressure. When the user intends to select the second selectable item, the user can press harder on the second touch sensor such that the second contact pressure is greater than the first contact pressure. The user control apparatus further includes a processor that is operatively coupled to the display member and the sensor strip. The processor is configured to determine that the second contact pressure is greater than the first contact pressure, and in turn instantiate the second selectable item.
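
As a non-limiting illustration of the pressure-comparison selection summarized above, the following Python sketch models how a processor might record a first contact pressure per sensor, display the aligned item, and instantiate it when a later, harder press is detected at the same sensor. The class and method names are hypothetical and are chosen only for this example; they are not part of the application.

```python
class SensorStripController:
    """Minimal model of first/second contact-pressure selection on a sensor strip."""

    def __init__(self, selectable_items):
        # One displayed selectable item is aligned with each touch sensor on the strip.
        self.selectable_items = selectable_items
        self.first_pressure = {}  # sensor index -> first contact pressure sensed

    def on_contact(self, sensor_index, pressure):
        """Handle a pressure reading reported by one sensor on the strip."""
        item = self.selectable_items[sensor_index]
        if sensor_index not in self.first_pressure:
            # First contact: remember the pressure and show the aligned item.
            self.first_pressure[sensor_index] = pressure
            return ("display", item)
        # Later contact at the same sensor: compare against the first pressure.
        if pressure > self.first_pressure[sensor_index]:
            return ("instantiate", item)
        return ("display", item)


if __name__ == "__main__":
    strip = SensorStripController(["Contacts", "Notes", "Calendar"])
    print(strip.on_contact(1, pressure=0.2))  # light touch -> ('display', 'Notes')
    print(strip.on_contact(1, pressure=0.7))  # harder press -> ('instantiate', 'Notes')
```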

[0003] It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein. In particular, all combinations of subject matter appearing in this disclosure are contemplated as being part of the inventive subject matter disclosed herein. It should also be appreciated that terminology explicitly employed herein that also may appear in any disclosure incorporated by reference should be accorded a meaning most consistent with the particular concepts disclosed herein.

BRIEF DESCRIPTION OF THE DRAWINGS

[0004] The skilled artisan will understand that the drawings primarily are for illustrative purposes and are not intended to limit the scope of the inventive subject matter described herein. The drawings are not necessarily to scale; in some instances, various aspects of the inventive subject matter disclosed herein may be shown exaggerated or enlarged in the drawings to facilitate an understanding of different features. In the drawings, like reference characters generally refer to like features (e.g., functionally similar and/or structurally similar elements).

[0005] FIG. 1 shows examples of user interface (UI) screens illustrating using side sensor strip control to send a message on a wearable device, according to one embodiment of the present invention.

[0006] FIG. 2 shows schematic UI screens illustrating using side sensor strip control gestures on a wearable device, according to one embodiment of the present invention.

[0007] FIG. 3 shows a schematic UI screen illustrating a UI for a user to compose a message on a wearable device, according to one embodiment of the present invention.

[0008] FIG. 4 shows schematic UI screens illustrating sending a text message on a wearable device, according to one embodiment of the present invention.

[0009] FIG. 5 shows schematic UI screens illustrating selecting a message service, according to one embodiment of the present invention.

[0010] FIG. 6 shows schematic UI screens illustrating sending a quick response message on a wearable device, according to one embodiment of the present invention.

[0011] FIG. 7 shows schematic UI screens illustrating sending a voice input on a wearable device, according to one embodiment of the present invention.

[0012] FIG. 8 shows schematic UI screens illustrating receiving a text message on a wearable device, according to one embodiment of the present invention.

[0013] FIG. 9 shows schematic UI screens illustrating receiving a voice input on a wearable device, according to one embodiment of the present invention.

DETAILED DESCRIPTION

[0014] Inventive embodiments of methods, apparatuses, and systems for facilitating electronic communications through a haptic wearable interface generally relate to the use of motion (e.g., hand movements and/or gestures) to control various activities relating to electronic communications (e.g., via short message service or "SMS," text, email, voice messages, various communication "apps" or Internet-based communication services, etc.) via a wearable computing and/or electronic communication device. It should be appreciated that various concepts introduced above and discussed in greater detail below may be implemented in any of numerous ways, as the disclosed concepts are not limited to any particular manner of implementation. Examples of specific implementations and applications are provided primarily for illustrative purposes.

[0015] The inventors have recognized several drawbacks of conventional computing devices and their respective human-computer interfaces. For example, a user can operate a mobile device for various purposes, such as emails, instant messages, telephone calls, gaming, information analytics, and/or the like. The mobile device can have a display screen, which can include a touch screen user interface for the user to input a command via the screen, such as making a selection of presented items on the screen, and/or the like. In some instances, the user interface display can be of limited size, which makes it difficult for user contact, e.g., the user may not be able to tap, swipe, or scroll accurately on a small touch screen such as the one on a wearable device (e.g., a wrist-band device, a Smart watch device, etc.). In another example, a user who would like to send a communication message through a conventional mobile device usually needs to perform multiple steps to compose and send a message, such as tapping on a desired messaging service icon (e.g., short message service (SMS), WhatsApp, Facebook Messenger, WeChat, LINE, Viber, Skype, e-mail, etc.) to initiate the message application, tapping on a "new message" option, tapping to type a new message via a virtual keyboard, tapping to select a contact to send the message to, and tapping a "send" button, and/or the like. The multiple steps of UI interaction on a touch screen can be of increased difficulty and inconvenience when the touch screen area is limited, e.g., when a user needs to tap accurately on a wrist-band or wrist-watch sized wearable device.

[0016] In view of the foregoing, some embodiments described herein are directed to a mobile (e.g., portable) computing and/or electronic communication apparatus that includes a touch screen display and a line of sensors installed on some portion (e.g., a side) of the apparatus, such that the user can touch and glide on the line of sensors to make a selection of displayed items on the touch screen display of the apparatus. In this way, the user can more accurately position his/her fingers on the line of sensors, as opposed to the limited touch area of the touch screen display, to select one or more displayed items. For example, a user can engage the side sensor strip to select a contact from a list of contacts, to select an action from an action menu, and/or the like. In some embodiments discussed in detail further below, the mobile computing and/or electronic communication apparatus is implemented as a wearable device (e.g., a wrist band, substantially in the shape of a wrist watch) that includes one or more processors, the touch screen display, and at least one line of sensors installed on some portion (e.g., a side) of the wearable device. It is noted that the terms "mobile device," "portable device," and "wearable device" may be used interchangeably throughout this application.

[0017] Some embodiments described herein include an instant messaging "hub" component that can be instantiated on the user's mobile device that has a line of sensors (e.g., in a shape similar to a strip) on the side of a touch screen display of the mobile device, so that a user can compose a message and slide along the line of side sensors to select a messaging platform to send the composed message. The side sensors can be haptic/touch sensors that detect a pressure when a sensor has been contacted and/or pressed against, and each haptic/touch sensor can generate a signal indicative of the level of pressure it senses. In some instances, a user can input a message to the mobile device by hand-writing on the touch screen display of the mobile device, or by articulating a message via a voice input UI of the mobile device, and/or the like. Once the content of the message has been composed, the mobile device can provide a list of messaging options to the user, e.g., a list of messaging platforms such as short message service (SMS), Internet-based messengers (e.g., WhatsApp, WeChat, LINE, Facebook Messenger, Viber, etc.), electronic mail, and/or the like. The user can slide along the line of side sensors, which is aligned with the list displayed on the UI screen, to make a selection of the messenger service to send the composed message.
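
A minimal sketch of the hub behaviour described above is given below, assuming a simple mapping between sensor positions on the strip and the platform list shown on screen. The function names and the platform list are illustrative assumptions, not part of the application.

```python
# Hypothetical sketch of the messaging "hub": once a message body has been
# composed (by handwriting or voice), sliding along the side sensor strip
# picks one of the listed messaging platforms and the hub dispatches the
# message through that platform.

PLATFORMS = ["SMS", "WhatsApp", "WeChat", "LINE", "Facebook Messenger", "E-mail"]


def platform_for_sensor(sensor_index, platforms=PLATFORMS):
    """Return the platform aligned with the touched sensor on the strip."""
    return platforms[sensor_index % len(platforms)]


def send_via_hub(message_text, contact, sensor_index):
    """Dispatch a composed message through the platform the user slid to."""
    platform = platform_for_sensor(sensor_index)
    # A real device would hand the message to the platform's messaging API here.
    return {"to": contact, "via": platform, "body": message_text}


if __name__ == "__main__":
    print(send_via_hub("On my way", contact="Alice", sensor_index=1))
```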

[0018] FIG. 1 shows examples of user interface (UI) screens illustrating using side sensor strip control to send a message on a wearable device, according to one embodiment of the present invention. In some embodiments, the wearable device can have a line of haptic/touch sensors (e.g., in a shape of a strip) 109, comprising a plurality of haptic/touch sensors which are responsive to pressure or touch inputs performed by a user. The line of sensors can be a dedicated "contact dash" button, allowing the user to scroll and select a contact to interact with. For example, when the line of sensors is located on one edge of the wearable device (e.g., see 109), a user can touch the edge surface of the wearable device to bring up the contact dash screen 101. The wearable device can transition from any current app displaying on the screen to bring up the contact dash app in response to a pressure event sensed by the pressure strip. A user can scroll through contacts, magnify contacts, and select a contact to interact with by pressing harder on the pressure strip. Upon selection, the interface displays a new screen 103 showing the selected contact name on top of the screen and/or one or more alternatives for initiating communication. A person of ordinary skill in the art will understand that the contact dash feature may be extended to items other than contacts, such as applications, notes, calendar events, etc., that can be selected and engaged with via the pressure area or strip. In one embodiment, a first plurality of selectable items, such as most recent contacts, are displayed in response to a user touch sliding in one direction along the pressure area or strip (e.g., downward), and a second plurality of selectable items, such as applications (e.g., communication services 507 discussed below, or other applications commonly executable on a computing device), are displayed in response to a user touch sliding in the opposite direction (e.g., upward). The communication options can be configured to fit multiple usage models, for example a walkie-talkie voice option, a draw/write option, conventional text messages, email, telephone conversation, and other available communication methods. For example, in a draw/write option, a user can write a message with a finger or with another device on a text input area as shown on the display screen 105; such a message will appear on the chosen contact's screen as it is displayed on the sender's screen, i.e., conveying the sender's handwriting style. Additionally, in some embodiments the user interface can display a communication status; for example, feedback can be provided upon a successful message transmission and/or failure, as shown on the display screen 107. A person of ordinary skill in the art will also understand that the contact dash feature, whether implemented to select contacts or other items such as applications, notes, calendar events, etc., can be applied to devices other than wearable devices, such as tablet computers, laptops, smartphones, etc.
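
The contact dash behaviour can be illustrated with the following hedged Python sketch. It assumes a normalized slide position along the strip, a direction-dependent item list (recent contacts versus applications), and an arbitrary pressure threshold separating a scroll/magnify gesture from a selection press; all of these values and names are hypothetical.

```python
# Hypothetical sketch of the "contact dash": a light slide along the pressure
# strip scrolls and magnifies items, a harder press selects one, and the slide
# direction chooses which list is shown.

RECENT_CONTACTS = ["Alice", "Bob", "Carol", "Dave"]
APPLICATIONS = ["Notes", "Calendar", "Walkie-talkie", "E-mail"]

SELECT_THRESHOLD = 0.6  # illustrative pressure level for a "hard" press


def items_for_direction(direction):
    """Downward slides show recent contacts; upward slides show applications."""
    return RECENT_CONTACTS if direction == "down" else APPLICATIONS


def handle_strip_event(direction, position, pressure):
    """Map a strip event (direction, 0..1 position, pressure) to a dash action."""
    items = items_for_direction(direction)
    index = min(int(position * len(items)), len(items) - 1)
    if pressure >= SELECT_THRESHOLD:
        return ("select", items[index])
    return ("magnify", items[index])


if __name__ == "__main__":
    print(handle_strip_event("down", position=0.3, pressure=0.2))  # magnify "Bob"
    print(handle_strip_event("down", position=0.3, pressure=0.8))  # select "Bob"
    print(handle_strip_event("up", position=0.0, pressure=0.2))    # magnify "Notes"
```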

[0019] FIG. 2 shows schematic UI screens illustrating using side sensor strip control gestures on a wearable device, according to one embodiment of the present invention. In one embodiment, the wearable device can keep track of the most recent contacts with whom a user has established communication. The most recent contacts can be available on the wearable device's touch screen display so they can be reached by the user in a rapid and convenient way 201. Alternatively or additionally, the user can edit the list displayed on the screen according to other criteria, including but not limited to most frequent contacts, emergency contacts, and the like.

[0020] In some embodiments, the wearable device can interpret wrist movements through gyroscopes, accelerometers, and similar sensors embedded in the body of the device. For example, a user can flick his wrist to change a screen from contact list view to time view 203. In some embodiments, the user can configure how each movement should be interpreted by the wearable device; for example, a user can record a movement as an event and associate the event with a specific command.
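
As a rough illustration of recording a movement as an event and binding it to a command, the sketch below uses a toy gyroscope-based classifier. The threshold, gesture labels, and function names are assumptions made for this example only.

```python
# Hypothetical sketch of user-configurable motion commands: a recognized wrist
# movement (here, a simple "flick" derived from an angular-velocity sample) is
# looked up in a table of user-recorded bindings to decide which command runs.

gesture_bindings = {}  # gesture label -> command name


def record_gesture(label, command):
    """Associate a recorded movement with a user-chosen command."""
    gesture_bindings[label] = command


def classify_wrist_motion(angular_velocity_z):
    """Toy classifier: a sharp rotation of the wrist is treated as a 'flick'."""
    return "flick" if abs(angular_velocity_z) > 3.0 else "none"


def on_motion_sample(angular_velocity_z):
    """Return the command bound to the detected gesture, or 'ignore'."""
    gesture = classify_wrist_motion(angular_velocity_z)
    return gesture_bindings.get(gesture, "ignore")


if __name__ == "__main__":
    record_gesture("flick", "switch_to_time_view")
    print(on_motion_sample(4.2))  # -> switch_to_time_view
    print(on_motion_sample(0.5))  # -> ignore
```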

[0021] In some embodiments, a user can select a subset of contacts, for example by specifying a letter. If the user draws a letter on the device's screen 205, the contact names starting with the specified letter will be displayed on the screen 207.
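
A short sketch of this letter-based filtering follows, assuming a handwriting recognizer has already produced the drawn character; the contact list and function name are illustrative.

```python
# Hypothetical sketch: once a drawn letter has been recognized, only contacts
# whose names start with that letter are shown on the screen.

CONTACTS = ["Alice", "Adam", "Bob", "Carol", "Charlie"]


def contacts_starting_with(letter, contacts=CONTACTS):
    """Return the subset of contacts whose names start with the drawn letter."""
    return [name for name in contacts if name.lower().startswith(letter.lower())]


if __name__ == "__main__":
    print(contacts_starting_with("A"))  # ['Alice', 'Adam']
    print(contacts_starting_with("c"))  # ['Carol', 'Charlie']
```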

[0022] FIG. 3 shows a schematic UI screen illustrating a UI 300 for a user to compose a message on a wearable device, according to one embodiment of the present invention. In some embodiments, the user interface can show a receiver name on top of the screen; one or more names can be listed as receivers of the communication 303. In some embodiments, an input area can be located below the receiver's name 301. Users can manually enter information by drawing one or more letters with their fingers and/or another device; alternatively, a popup on-screen keyboard can be displayed. In a further embodiment, the user is presented with the options to send a voice message 302 to the contact and/or send emoji/emoticon symbols 305.

[0023] FIG. 4 shows schematic UI screens illustrating sending a text message on a wearable device, according to one embodiment of the present invention. In some embodiments, the wearable device can comprise an autocorrect feature and/or autocomplete feature. For example, a user can start writing a word 403 and a list of related words can simultaneously be provided 405; the user can select any of the suggested words instead of finishing writing the text. A similar process may be available to suggest correct words when an orthographical and/or grammatical mistake is detected. In such a case, a list of correct alternatives can be presented, from which the user can correct the mistake by selecting one of the provided alternatives. In some embodiments, the wearable device keeps track of a message 410 as it is written by the user such that it can be reviewed by the user before sending 407.
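
The autocomplete behaviour of FIG. 4 can be sketched as a simple prefix match against a word list, as below. The dictionary and the suggestion limit are placeholders, and a real implementation would likely also rank suggestions by usage frequency.

```python
# Hypothetical sketch of the autocomplete behaviour: as the user writes, the
# partial word is matched against a small dictionary and a handful of
# suggestions is offered so the user can tap one instead of finishing the word.

DICTIONARY = ["meeting", "message", "meet", "morning", "minute", "minutes"]


def suggest(partial_word, dictionary=DICTIONARY, limit=3):
    """Return up to `limit` dictionary words that extend the partial word."""
    partial = partial_word.lower()
    return [word for word in dictionary if word.startswith(partial)][:limit]


if __name__ == "__main__":
    print(suggest("me"))   # ['meeting', 'message', 'meet']
    print(suggest("min"))  # ['minute', 'minutes']
```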

[0024] FIG. 5 shows schematic UI screens illustrating selecting a message service, according to one embodiment of the present invention. In some embodiments, the wearable device can be configured to utilize the most recent communication service as a default for future communications. In this screen 501, the user does not have to specify, for example, whether the communication should be performed by email, instant message, and/or another platform; instead, the message is automatically sent 503, speeding the process by eliminating at least one intermediate step.

[0025] In other embodiments, a user can select from a list of communication services 507; however, the input is performed in the same way for all the services, via the wearable device's user interface 505, and there is no need to open a specific app and/or service interface to successfully send the message 509.
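
A minimal sketch of this default-service behaviour, assuming the most recently used service is remembered and reused unless the user explicitly selects another from the list 507; the class and method names are hypothetical.

```python
# Hypothetical sketch: the router falls back to the most recently used
# communication service so the user does not have to pick a platform for
# every message, while an explicit selection overrides the default.


class ServiceRouter:
    def __init__(self, default_service="SMS"):
        self.last_used = default_service

    def send(self, contact, message, service=None):
        """Send via the chosen service, or fall back to the most recent one."""
        chosen = service or self.last_used
        self.last_used = chosen  # remember for the next message
        # A real device would invoke the platform-specific API here.
        return f"sent to {contact} via {chosen}: {message}"


if __name__ == "__main__":
    router = ServiceRouter()
    print(router.send("Alice", "Running late"))                    # uses SMS default
    print(router.send("Alice", "Call you soon", service="Email"))  # explicit choice
    print(router.send("Bob", "Hi"))                                # reuses Email
```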

[0026] FIG. 6 shows schematic UI screens illustrating sending a quick response message on a wearable device, according to one embodiment of the present invention. In some embodiments, the wearable device provides a selection of quick responses 603 that can be viewed from the user interface by pressing an icon 601. In some embodiments, a quick response can be transmitted automatically via the communication service that was used most recently.

[0027] FIG. 7 shows schematic UI screens illustrating sending a voice input on a wearable device, according to one embodiment of the present invention. As previously shown in FIG. 3, the message input UI screen can include a voice input option 302. A user can tap and hold the voice input icon 302 and speak to the wearable device to compose a message. As shown at 703, the time length of recording the audio message can be dynamically displayed, and the strength of the user's audio speech is displayed on the screen as well. Once the user has finished the audio message, the user can release the voice input icon 302 to finish recording. At screen 300, the wearable device can send the composed message, e.g., either in a text format via a voice-to-text conversion component, or as a multimedia message.
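
The press-and-hold voice input flow can be modeled as a small state holder, as in the sketch below. The structure and names are assumptions for illustration; a real device would capture audio from a microphone rather than accept level samples as arguments.

```python
# Hypothetical sketch of the press-and-hold voice input of FIG. 7: recording
# starts when the voice icon is pressed, elapsed time and input level are
# reported while the icon is held, and releasing the icon finishes the message.

import time


class VoiceRecorder:
    def __init__(self):
        self.start_time = None
        self.samples = []

    def press(self):
        """User taps and holds the voice input icon: start recording."""
        self.start_time = time.monotonic()
        self.samples = []

    def on_audio_level(self, level):
        """Called while the icon is held; returns what the screen would show."""
        self.samples.append(level)
        elapsed = time.monotonic() - self.start_time
        return {"elapsed_seconds": round(elapsed, 1), "level": level}

    def release(self):
        """User releases the icon: stop recording and hand off the message."""
        duration = time.monotonic() - self.start_time
        # The device could now send the audio as a multimedia message, or run
        # voice-to-text conversion and send the result as text.
        return {"duration_seconds": round(duration, 1), "samples": len(self.samples)}


if __name__ == "__main__":
    recorder = VoiceRecorder()
    recorder.press()
    print(recorder.on_audio_level(0.4))
    print(recorder.release())
```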

[0028] FIG. 8 shows schematic UI screens illustrating receiving a text message on a wearable device, according to one embodiment of the present invention. In some instances, when a user wearing a wearable device, or operating a mobile device, receives a message (e.g., an SMS, an e-mail message, any Internet-based instant message, etc.) on the device, a vibration alert can be generated at the device to provide notice to the user. For example, as shown at UI screen 801, the content or a part of the content of the received message can be displayed at the UI screen. The user can exercise motion control commands to view or skip the message. For example, for a wrist-wearable device, the user can flick his or her wrist away to indicate a command to ignore the message, and the displayed content on screen 801 can be dismissed. Alternatively, the user can tap on screen 801 to view and respond to the message, which leads to a message input screen 300.
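
A hedged sketch of this incoming-message flow follows: message arrival triggers a vibration alert and a preview, a wrist flick dismisses the preview, and a tap leads to the message input screen 300. The returned dictionaries simply stand in for UI state transitions; the function names are illustrative.

```python
# Hypothetical sketch of the incoming text-message flow of FIG. 8.


def on_message_received(message_preview):
    """Vibrate and show a preview of the received message (screen 801)."""
    return {"vibrate": True, "screen": "preview", "text": message_preview}


def on_user_reaction(reaction):
    """Map the user's gesture on the preview screen to the next screen."""
    if reaction == "flick":
        return {"screen": "previous"}       # ignore the message
    if reaction == "tap":
        return {"screen": "message_input"}  # reply (screen 300 in FIG. 3)
    return {"screen": "preview"}            # keep showing the preview


if __name__ == "__main__":
    print(on_message_received("Dinner at 7?"))
    print(on_user_reaction("flick"))
    print(on_user_reaction("tap"))
```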

[0029] FIG. 9 shows schematic UI screens illustrating receiving a voice input on a wearable device, according to one embodiment of the present invention. At screen 901, similar to receiving a text message at 801, the user feels a vibration alert and sees a notification of the voice message displayed on screen 901, e.g., a "play" button indicating that an audio message has been received and can be played. The user can flick his or her wrist to ignore the audio message, as discussed in connection with FIG. 8. Alternatively, the user can tap on the UI screen 901 to listen to the audio message. For example, the UI screen 903 shows that an audio message is being played, with a waveform on screen to indicate the strength of the audio message, and the time that has elapsed during playback. Upon completion of playback at screen 903, the mobile or wearable device can provide a screen 905 that allows the user to replay or respond to the audio message. Similarly, the user can flick his or her wrist away to ignore screen 905, or tap on screen 905 to replay or respond (which can lead to the message composing screen 300).
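
Similarly, the voice-message playback flow can be sketched with two small helpers (illustrative names only): one reporting playback progress for screen 903 and one mapping the user's choice on screen 905 to the next screen.

```python
# Hypothetical sketch of the voice-message playback flow of FIG. 9.


def playback_progress(elapsed_seconds, total_seconds):
    """Return what the playback screen (903) would display at this moment."""
    return {
        "screen": "playback",
        "elapsed": elapsed_seconds,
        "remaining": max(total_seconds - elapsed_seconds, 0),
    }


def after_playback(choice):
    """Screen 905: the user may replay the audio or respond to the sender."""
    if choice == "replay":
        return {"screen": "playback", "elapsed": 0}
    if choice == "respond":
        return {"screen": "message_input"}  # screen 300 in FIG. 3
    return {"screen": "previous"}           # e.g., a wrist flick dismisses it


if __name__ == "__main__":
    print(playback_progress(3, 10))
    print(after_playback("replay"))
    print(after_playback("respond"))
```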

[0030] Embodiments described throughout this disclosure can be implemented via a mobile device or wearable device, such as a Smartphone, a wrist-worn computer, a Smart watch device, and/or the like, which may be enabled with wide area network (WAN) (e.g., 2G, 3G, and 4G LTE), built-in global positioning system (GPS), Bluetooth, and WiFi connectivity mechanisms. Additionally, the wearable device can implement a unique universal communication protocol and interface (U2CPI) to use as a default to communicate with other computing devices. The wearable device may effectively be able to replace smartphones, tablets, laptops, desktops, and smart TVs by being paired with different screens and input/output devices. The wearable device may be charged wirelessly and may utilize a high-bandwidth wireless protocol (e.g., WiGig), along with Bluetooth and WiFi Direct, to stream video, audio, data, and various other contents to a variety of screen sizes. Alternatively or additionally, the wearable device can utilize the U2CPI to stream video, audio, data, and various other contents to a variety of screen sizes. It may also comprise multiple data sensors, such as an accelerometer, gyroscope, digital compass, and/or the like, as well as sizable internal storage. The wearable device may be controlled from other input devices; for example, the wearable device may receive commands from a paired device. In one embodiment, the wearable device may have a screen suitable for displaying indications like time/date, notifications, connectivity toggles, and the like, as well as tactile capabilities. Moreover, the wearable device may use a secure and passive authentication method (e.g., heart signature). Further exemplary features of the wearable device can be found in U.S. provisional application serial no. 61/985,393, entitled "Intelligent Wearable Data Processing and Control Platform Apparatuses, Methods and Systems," filed April 28, 2014; and U.S. provisional application serial no. 62/103,548, entitled "Wearable Data Processing And Control Platform Methods And Systems," filed January 14, 2015. Both aforementioned applications are herein expressly incorporated by reference.

[0031] While various inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.

[0032] The above-described embodiments of the invention can be implemented in any of numerous ways. For example, some embodiments may be implemented using hardware, software or a combination thereof. When any aspect of an embodiment is implemented at least in part in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.

[0033] In this respect, various aspects of the invention may be embodied at least in part as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other tangible computer storage medium or non-transitory medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the technology discussed above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present technology as discussed above.

[0034] The terms "program" or "software" are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of the present technology as discussed above. Additionally, it should be appreciated that according to one aspect of this embodiment, one or more computer programs that when executed perform methods of the present technology need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present technology.

[0035] Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.

[0036] Also, the technology described herein may be embodied as a method, of which at least one example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.

[0037] All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.

[0038] The indefinite articles "a" and "an," as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean "at least one."

[0039] The phrase "and/or," as used herein in the specification and in the claims, should be understood to mean "either or both" of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with "and/or" should be construed in the same fashion, i.e., "one or more" of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the "and/or" clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to "A and/or B", when used in conjunction with open-ended language such as "comprising" can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.

[0040] As used herein in the specification and in the claims, "or" should be understood to have the same meaning as "and/or" as defined above. For example, when separating items in a list, "or" or "and/or" shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as "only one of" or "exactly one of," or, when used in the claims, "consisting of," will refer to the inclusion of exactly one element of a number or list of elements. In general, the term "or" as used herein shall only be interpreted as indicating exclusive alternatives (i.e. "one or the other but not both") when preceded by terms of exclusivity, such as "either," "one of," "only one of," or "exactly one of." "Consisting essentially of," when used in the claims, shall have its ordinary meaning as used in the field of patent law.

[0041] As used herein in the specification and in the claims, the phrase "at least one," in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, "at least one of A and B" (or, equivalently, "at least one of A or B," or, equivalently "at least one of A and/or B") can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.

[0042] In the claims, as well as in the specification above, all transitional phrases such as "comprising," "including," "carrying," "having," "containing," "involving," "holding," "composed of," and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases "consisting of" and "consisting essentially of" shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.