Title:
SETTINGS FOR MOBILE ROBOT CONTROL
Document Type and Number:
WIPO Patent Application WO/2023/211644
Kind Code:
A1
Abstract:
A method of operating a mobile cleaning robot system including a mobile cleaning robot and a display device can include displaying a room cleaning settings indication selectable to set one or more cleaning settings for a previously-mapped specified room. When the room cleaning settings indication is selected, a cleaning mode indication can be displayed where the cleaning mode indication is selectable to set a cleaning mode setting of the specified room.

Inventors:
MESSENGER STEPHEN (US)
HAMEED SHEREEN (US)
ATHAVALE ADITYA (US)
BUTTERWORTH CRAIG MICHAEL (US)
JUDGE FRANK (US)
PEREIRA JORDAN (US)
BARON STEVEN J (US)
Application Number:
PCT/US2023/017449
Publication Date:
November 02, 2023
Filing Date:
April 04, 2023
Assignee:
IROBOT CORP (US)
International Classes:
A47L11/30; A47L9/04; A47L9/28; A47L11/40
Foreign References:
EP3184014A1 (2017-06-28)
US20210272471A1 (2021-09-02)
US20220000326A1 (2022-01-06)
US202217732744A (2022-04-29)
Attorney, Agent or Firm:
ARORA, Suneel et al. (US)
Claims:
CLAIMS:

1. A method of operating a mobile cleaning robot system including a mobile cleaning robot and a display device, the method comprising: displaying a room cleaning settings indication selectable to set one or more cleaning settings for a previously-mapped specified room; and displaying, when the room cleaning settings indication is selected, a cleaning mode indication selectable to set a cleaning mode setting of the specified room.

2. The method of claim 1, further comprising: performing a cleaning mission using the cleaning mode setting of the specified room when the mobile cleaning robot is located within the specified room.

3. The method of claim 1, wherein the room cleaning settings indication includes a vacuuming only or vacuuming and mopping indication.

4. The method of claim 3, further comprising: displaying, when either vacuuming only or vacuuming and mopping is selected, a cleaning passes indication selectable to set a number of vacuuming cleaning passes of a mobile cleaning robot.

5. The method of claim 4, wherein the cleaning passes indication is selectable to set the number of passes of the mobile cleaning robot to between one and three passes.

6. The method of claim 3, further comprising: displaying, when the vacuuming and mopping indication is selected, a fluid rate indication selectable to set a rate at which a fluid is discharged from a mobile cleaning robot.

7. The method of claim 6, wherein the fluid rate indication is selectable to set the rate at which the fluid is discharged between a low amount, a medium amount, and a high amount.

8. The method of claim 1, wherein the room cleaning settings indication includes a vacuum speed indication selectable to set a speed of a vacuum system of a mobile cleaning robot.

9. The method of claim 1, wherein the room cleaning settings indication includes an overlap indication selectable to set an overlap percentage of a mopping pad.

10. The method of claim 9, wherein the room cleaning settings indication includes a fluid rate indication selectable to set a rate at which a fluid is discharged from a mobile cleaning robot.

11. The method of claim 1, further comprising: presenting a recommended cleaning setting based on a cleaning history.

12. The method of claim 11, further comprising: receiving a dirt detection signal from a mobile cleaning robot; and determining the cleaning history based on the dirt detection signal.

13. A machine-readable medium including instructions for presenting a user interface, which when executed by a processor, cause the processor to: display a room cleaning settings indication selectable to set one or more cleaning settings for a previously-mapped specified room; and display, when the room cleaning settings indication is selected, a cleaning mode indication selectable to set a cleaning mode setting of the specified room.

14. The machine-readable medium of claim 13, wherein the room cleaning settings indication includes a vacuuming only or vacuuming and mopping indication.

15. The machine-readable medium of claim 14, the processor further configured to: display, when either vacuuming only or vacuuming and mopping is selected, a cleaning passes indication selectable to set a number of vacuuming cleaning passes of a mobile cleaning robot.

16. The machine-readable medium of claim 15, wherein the cleaning passes indication is selectable to set the number of passes of the mobile cleaning robot to between one and three passes.

17. A mobile cleaning robot system including a mobile cleaning robot, the system comprising: a display configured to present a user interface; and processing circuitry in communication with the mobile cleaning robot and the display, the processing circuitry configured to: display a room cleaning settings indication selectable to set one or more cleaning settings for a previously-mapped specified room; and display, when the room cleaning settings indication is selected, a cleaning mode indication selectable to set a cleaning mode setting of the specified room.

18. The mobile cleaning robot system of claim 17, wherein the room cleaning settings indication includes a vacuuming only or vacuuming and mopping indication.

19. The mobile cleaning robot system of claim 18, the processing circuitry further configured to: display, when either vacuuming only or vacuuming and mopping is selected, a cleaning passes indication selectable to set a number of vacuuming cleaning passes of a mobile cleaning robot.

20. The mobile cleaning robot system of claim 19, the processing circuitry further configured to: display, when the vacuuming and mopping indication is selected, a fluid rate indication selectable to set a rate at which a fluid is discharged from the mobile cleaning robot.

Description:
SETTINGS FOR MOBILE ROBOT CONTROL

PRIORITY APPLICATIONS

[0001] This application is a continuation of and claims priority to U.S. Patent Application Serial No. 17/732,744, filed April 29, 2022, the content of which is incorporated herein by reference in its entirety.

BACKGROUND

[0002] Autonomous mobile robots can move about an environment and can perform functions and operations in a variety of categories, including but not limited to security operations, infrastructure or maintenance operations, navigation or mapping operations, inventory management operations, and robot/human interaction operations. Some mobile robots, known as cleaning robots, can perform cleaning tasks autonomously within an environment, e.g., a home. Many kinds of cleaning robots are autonomous to some degree and in different ways. For example, a cleaning robot can conduct cleaning missions, where the robot traverses and simultaneously ingests (e.g., vacuums) debris from the floor surface of its environment.

SUMMARY

[0003] An autonomous mobile robot can be controlled locally (e.g., via controls on the robot) or remotely (e.g., via a remote handheld device) to move about an environment. A mobile application, such as an application implemented on a handheld computing device (e.g., a mobile phone), can be configured to display various information about the mobile robot organized in user interface views. A user can use the mobile application to select settings for the operation of the mobile robot. In some instances, a map of the environment in which the mobile robot operates can be produced by a network or the mobile device and can be displayed on the device. The map can include multiple rooms of the environment, which can have different characteristics such as sizes, shapes, or floor types. In such a situation, it may be desirable to select different cleaning settings for different rooms.

[0004] The devices, systems, or methods of this application can help to address this issue by presenting a room cleaning settings indication selectable by a user to set the cleaning settings of the mobile robot for one or more mapped rooms. For example, the user can select a kitchen room cleaning settings indication to change the room cleaning settings for operation of the mobile robot within the kitchen room, as indicated or defined by the map.

[0005] Once the room cleaning settings indication is selected, a room cleaning settings screen can be presented where a user can select one or more settings for the room. For example, a user can select a vacuuming only mode or a vacuuming and mopping mode (such as where the mobile robot is a two-in-one or vacuuming and mopping type robot). In either mode, the user can select a number of vacuuming passes. In the vacuuming and mopping mode, the user can select a rate at which fluid is dispensed. Also, when the mobile robot is a mopping robot, the room cleaning settings can include a mopping rank overlap percentage. The various settings can be user-selected for each mapped room of the environment, allowing the user to customize cleaning settings for one or more cleaning missions to be performed by the robot in the mapped environment.
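
By way of illustration only, the per-room settings described above can be represented as a simple data model. The following TypeScript sketch is hypothetical; the type and field names are assumptions and do not appear in the application.

```typescript
// Hypothetical data model for per-room cleaning settings; names are
// illustrative and not taken from the application.
type CleaningMode = "vacuum" | "vacuumAndMop";
type Level = "low" | "medium" | "high";

interface RoomCleaningSettings {
  roomId: string;            // identifier of the previously-mapped room
  mode: CleaningMode;        // vacuuming only, or vacuuming and mopping
  vacuumPasses: 1 | 2 | 3;   // number of vacuuming passes
  fluidRate?: Level;         // mopping fluid dispense rate (mopping modes only)
  padOverlap?: Level;        // mopping pad rank overlap (mopping robots)
  suction?: Level;           // vacuum speed / suction power
}

// Example: kitchen set to vacuum and mop with two passes and medium fluid.
const kitchen: RoomCleaningSettings = {
  roomId: "kitchen",
  mode: "vacuumAndMop",
  vacuumPasses: 2,
  fluidRate: "medium",
};
```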

[0006] In one example, a method of operating a mobile cleaning robot system including a mobile cleaning robot and a display device can include displaying a room cleaning settings indication selectable to set one or more cleaning settings for a previously-mapped specified room. When the room cleaning settings indication is selected, a cleaning mode indication can be displayed where the cleaning mode indication is selectable to set a cleaning mode setting of the specified room.

BRIEF DESCRIPTION OF THE DRAWINGS

[0007] Various embodiments are illustrated by way of example in the figures of the accompanying drawings. Such embodiments are demonstrative and not intended to be exhaustive or exclusive embodiments of the present subject matter.

[0008] FIG. 1 illustrates a plan view of a mobile cleaning robot in an environment.

[0009] FIG. 2A illustrates an isometric view of a mobile cleaning robot in a first condition.

[0010] FIG. 2B illustrates an isometric view of a mobile cleaning robot in a second condition.

[0011] FIG. 2C illustrates an isometric view of a mobile cleaning robot in a third condition.

[0012] FIG. 2D illustrates a bottom view of a mobile cleaning robot in a third condition.

[0013] FIG. 2E illustrates a top isometric view of a mobile cleaning robot in a third condition.

[0014] FIG. 3 is a diagram illustrating an example of a communication network in which a mobile cleaning robot operates and data transmission in the network.

[0015] FIG. 4 is a diagram illustrating an exemplary process of exchanging information between the mobile robot and other devices in a communication network.

[0016] FIG. 5A illustrates a user interface of a handheld device for controlling a mobile cleaning robot.

[0017] FIG. 5B illustrates a user interface of a handheld device for controlling a mobile cleaning robot.

[0018] FIG. 5C illustrates a user interface of a handheld device for controlling a mobile cleaning robot.

[0019] FIG. 6A illustrates a user interface of a handheld device for controlling a mobile cleaning robot.

[0020] FIG. 6B illustrates a user interface of a handheld device for controlling a mobile cleaning robot.

[0021] FIG. 7 illustrates a user interface of a handheld device for controlling a mobile cleaning robot.

[0022] FIG. 8 is a block diagram illustrating an example of a machine upon which one or more embodiments may be implemented.

DETAILED DESCRIPTION

Robot Operation Summary

[0023] FIG. 1 illustrates a plan view of a mobile cleaning robot 100 in an environment 40, in accordance with at least one example of this disclosure. The environment 40 can be a dwelling, such as a home or an apartment, and can include rooms 42a-42e. Obstacles, such as a bed 44, a table 46, and an island 48 can be located in the rooms 42 of the environment. Each of the rooms 42a-42e can have a floor surface 50a-50e, respectively. Some rooms, such as the room 42d, can include a rug, such as a rug 52. The floor surfaces 50 can be of one or more types such as hardwood, ceramic, low-pile carpet, medium-pile carpet, long (or high)-pile carpet, stone, or the like.

[0024] The mobile cleaning robot 100 can be operated, such as by a user 60, to autonomously clean the environment 40 in a room-by-room fashion. In some examples, the robot 100 can clean the floor surface 50a of one room, such as the room 42a, before moving to the next room, such as the room 42d, to clean the surface of the room 42d. Different rooms can have different types of floor surfaces. For example, the room 42e (which can be a kitchen) can have a hard floor surface, such as wood or ceramic tile, and the room 42a (which can be a bedroom) can have a carpet surface, such as a medium pile carpet. Other rooms, such as the room 42d (which can be a dining room) can include multiple surfaces where the rug 52 is located within the room 42d.

[0025] During cleaning or traveling operations, the robot 100 can use data collected from various sensors (such as optical sensors) and calculations (such as odometry and obstacle detection) to develop a map of the environment 40. Once the map is created, the user 60 can define rooms or zones (such as the rooms 42) within the map. The map can be presentable to the user 60 on a user interface, such as a mobile device, where the user 60 can direct or change cleaning preferences, for example.

[0026] Also, during operation, the robot 100 can detect surface types within each of the rooms 42, which can be stored in the robot or another device. The robot 100 can update the map (or data related thereto) such as to include or account for surface types of the floor surfaces 50a-50e of each of the respective rooms 42 of the environment. In some examples, the map can be updated to show the different surface types such as within each of the rooms 42.

[0027] In some examples, the user 60 can define a behavior control zone 54 using, for example, the methods and systems described herein. In response to the user 60 defining the behavior control zone 54, the robot 100 can move toward the behavior control zone 54 to confirm the selection. After confirmation, autonomous operation of the robot 100 can be initiated. In autonomous operation, the robot 100 can initiate a behavior in response to being in or near the behavior control zone 54. For example, the user 60 can define an area of the environment 40 that is prone to becoming dirty to be the behavior control zone 54. In response, the robot 100 can initiate a focused cleaning behavior in which the robot 100 performs a focused cleaning of a portion of the floor surface 50d in the behavior control zone 54.
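
For illustration, the zone-triggered behavior described above can be sketched as a simple point-in-rectangle check. The rectangular zone model and the function names below are assumptions for illustration, not part of the application.

```typescript
// Minimal sketch of triggering a focused-cleaning behavior when the robot
// enters a user-defined behavior control zone.
interface Point { x: number; y: number; }
interface Zone { xMin: number; xMax: number; yMin: number; yMax: number; }

function inZone(p: Point, z: Zone): boolean {
  return p.x >= z.xMin && p.x <= z.xMax && p.y >= z.yMin && p.y <= z.yMax;
}

function onPoseUpdate(pose: Point, zone: Zone, startFocusedClean: () => void) {
  if (inZone(pose, zone)) {
    startFocusedClean(); // e.g., slower traversal with extra passes
  }
}
```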

Robot Example

[0028] FIG. 2A illustrates an isometric view of a mobile cleaning robot 100 with a pad assembly in a stored position. FIG. 2B illustrates an isometric view of the mobile cleaning robot 100 with the pad assembly in an extended position. FIG. 2C illustrates an isometric view of the mobile cleaning robot 100 with the pad assembly in a mopping position. FIGS. 2A-2C also show orientation indicators Front and Rear. FIGS. 2A-2C are discussed together below.

[0029] The mobile cleaning robot 100 can include a body 102 and a mopping system 104. The mopping system 104 can include arms 106a and 106b (referred to together as arms 106) and a pad assembly 108. The robot 100 can also include a bumper 109 and other features such as an extractor (including rollers), one or more side brushes, a vacuum system, a controller, a drive system (e.g., motor, geartrain, and wheels), a caster, and sensors, as discussed in further detail below. A distal portion of the arms 106 can be connected to the pad assembly 108 and a proximal portion of the arms 106a and 106b can be connected to an internal drive system to drive the arms 106 to move the pad assembly 108.

[0030] FIGS. 2A-2C show how the robot 100 can be operated to move the pad assembly 108 from a stored position in FIG. 2A to a transition or partially deployed position in FIG. 2B, to a mopping or a deployed position in FIG. 2C. In the stored position of FIG. 2A, the robot 100 can perform only vacuuming operations. In the deployed position of FIG. 2C, the robot 100 can perform vacuuming operations or mopping operations. Additional components of the robot 100 are discussed with respect to FIGS. 2D-2E.

Components of the Robot

[0031] FIG. 2D illustrates a bottom view of the mobile cleaning robot 100 and FIG. 2E illustrates a top isometric view of the robot 100. FIGS. 2D and 2E are discussed together below. The robot 100 of FIGS. 2D and 2E can be consistent with FIGS. 2A-2C; FIGS. 2D-2E show additional details of the robot 100. For example, FIGS. 2D-2E show that the robot 100 can include a body 102, a bumper 109, an extractor 113 (including rollers 114a and 114b), motors 116a and 116b, drive wheels 118a and 118b, a caster 120, a side brush assembly 122, a vacuum assembly 124, memory 126, sensors 128, and a debris bin 130. The mopping system 104 can also include a tank 132 and a pump 134.

[0032] The cleaning robot 100 can be an autonomous cleaning robot that autonomously traverses the floor surface 50 (of FIG. 1) while ingesting the debris from different parts of the floor surface 50. As shown in FIG. 2D, the robot 100 can include the body 102 that can be movable across the floor surface 50. The body 102 can include multiple connected structures to which movable or fixed components of the cleaning robot 100 are mounted. The connected structures can include, for example, an outer housing to cover internal components of the cleaning robot 100, a chassis to which the drive wheels 118a and 118b and the cleaning rollers 114a and 114b (of the cleaning assembly 113) are mounted, the bumper 109 mounted to the outer housing, etc. The caster wheel 120 can support the front portion of the body 102 above the floor surface 50, and the drive wheels 118a and 118b can support the middle and rear portions of the body 102 (and can also support a majority of the weight of the robot 100) above the floor surface 50.

[0033] As shown in FIG. 2D, the body 102 includes a front portion that has a substantially semicircular shape that can be connected to the bumper 109, and a rear portion that has a substantially semicircular shape. In other examples, the body 102 can have other shapes such as a square front or straight front. The robot 100 can also include a drive system including the actuators 116a and 116b, e.g., motors. The actuators 116a and 116b can be connected to the body 102 and can be operably connected to the drive wheels 118a and 118b, which can be rotatably mounted to the body 102. The drive wheels 118a and 118b can support the body 102 above the floor surface 50. The actuators 116a and 116b, when driven, can rotate the drive wheels 118a and 118b to enable the robot 100 to autonomously move across the floor surface 50.

[0034] The vacuum assembly 124 can be carried within the body 102 of the robot 100, e.g., in a rear portion of the body 102, and can be located in other locations in other examples. The vacuum assembly 124 can include a motor to drive an impeller that generates the airflow when rotated. The airflow and the cleaning rollers 114, when rotated, can cooperate to ingest the debris into the robot 100. The cleaning bin 130 can be mounted in the body 102 and can contain the debris ingested by the robot 100. A filter in the body 102 can separate the debris from the airflow before the airflow enters the vacuum assembly 124 and is exhausted out of the body 102. In this regard, the debris can be captured in both the cleaning bin 130 and the filter before the airflow is exhausted from the body 102. In some examples, the vacuum assembly 124 and extractor 113 can be optionally included or can be of a different type.

[0035] The cleaning rollers 114a and 114b can be operably connected to an actuator 115, e.g., a motor, through a gearbox. The cleaning head 113 and the cleaning rollers 114a and 114b can be positioned forward of the cleaning bin 130. The cleaning rollers 114 can be mounted to an underside of the body 102 so that the cleaning rollers 114a and 114b engage debris on the floor surface 50 during the cleaning operation when the underside of the body 102 faces the floor surface 50.

[0036] The controller 111 can be located within the housing and can be a programmable controller, such as a single or multi-board computer, a direct digital controller (DDC), a programmable logic controller (PLC), or the like. In other examples, the controller 111 can be any computing device, such as a handheld computer, for example, a smart phone, a tablet, a laptop, a desktop computer, or any other computing device including a processor, memory, and communication capabilities. The memory 126 can be one or more types of memory, such as volatile or non-volatile memory, read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and other storage devices and media. The memory 126 can be located within the housing 102, connected to the controller 111, and accessible by the controller 111.

[0037] The controller 111 can operate the actuators 116a and 116b to autonomously navigate the robot 100 about the floor surface 50 during a cleaning operation. The actuators 116a and 116b can be operable to drive the robot 100 in a forward drive direction, in a backwards direction, and to turn the robot 100. The controller 111 can operate the vacuum assembly 124 to generate an airflow that flows through an air gap near the cleaning rollers 114, through the body 102, and out of the body 102.

[0038] The control system can further include a sensor system with one or more electrical sensors. The sensor system, as described herein, can generate a signal indicative of a current location of the robot 100, and can generate signals indicative of locations of the robot 100 as the robot 100 travels along the floor surface 50. The sensors 128 (shown in FIG. 2A) can be located along a bottom portion of the housing 102. Each of the sensors 128 can be an optical sensor that can be configured to detect a presence or absence of an object below the optical sensor, such as the floor surface 50. The sensors 128 (optionally cliff sensors) can be connected to the controller 111 and can be used by the controller 111 to navigate the robot 100 within the environment 40. In some examples, the cliff sensors can be used to detect a floor surface type which the controller 111 can use to selectively operate the mopping system 104.

[0039] The cleaning pad assembly 108 can be a cleaning pad connected to the bottom portion of the body 102 (or connected to a moving mechanism configured to move the assembly 108 between a stored position and a cleaning position), such as to the cleaning bin 130 in a location to the rear of the extractor 113. The tank 132 can be a water tank configured to store water or fluid, such as cleaning fluid, for delivery to a mopping pad 142. The pump 134 can be connected to the controller 111 and can be in fluid communication with the tank 132. The controller 111 can be configured to operate the pump 134 to deliver fluid to the mopping pad 142 during mopping operations. In some examples, the pad 142 can be a dry pad such as for dusting or dry debris removal. The pad 142 can also be any cloth, fabric, or the like configured for cleaning (either wet or dry) of a floor surface.

Operation of the Robot

[0040] In operation of some examples, the controller 111 can be used to instruct the robot 100 to perform a mission. In such a case, the controller 111 can operate the motors 116 to drive the drive wheels 118 and propel the robot 100 along the floor surface 50. The robot 100 can be propelled in a forward drive direction or a rearward drive direction. The robot 100 can also be propelled such that the robot 100 turns in place or turns while moving in the forward drive direction or the rearward drive direction. In addition, the controller 111 can operate the motor 115 to cause the rollers 114a and 114b to rotate, can operate the side brush assembly 122, and can operate the motor of the vacuum assembly 124 to generate airflow. The controller 111 can execute software stored on the memory 126 to cause the robot 100 to perform various navigational and cleaning behaviors by operating the various motors of the robot 100.

[0041] The various sensors of the robot 100 can be used to help the robot navigate and clean within the environment 40. For example, the cliff sensors can detect obstacles such as drop-offs and cliffs below portions of the robot 100 where the cliff sensors are disposed. The cliff sensors can transmit signals to the controller 111 so that the controller 111 can redirect the robot 100 based on signals from the sensors.

[0042] Proximity sensors can produce a signal based on the presence or absence of an object in front of the sensor. For example, detectable objects include obstacles such as furniture, walls, persons, and other objects in the environment 40 of the robot 100. The proximity sensors can transmit signals to the controller 111 so that the controller 111 can redirect the robot 100 based on signals from the proximity sensors. In some examples, bump sensors 139 can be used to detect movement of the bumper 109 along a fore-aft axis of the robot 100, along one or more sides of the robot 100, and optionally in a vertical direction. The bump sensors 139 can transmit signals to the controller 111 so that the controller 111 can redirect the robot 100 based on signals from the bump sensors 139.

[0043] The robot 100 can also optionally include one or more dirt sensors 144 connected to the body 102 and in communication with the controller 111. The dirt sensors 144 can be a microphone, piezoelectric sensor, optical sensor, or the like located in or near a flowpath of debris, such as near an opening of the cleaning rollers 114 or in one or more ducts within the body 102. This can allow the dirt sensor(s) 144 to detect how much dirt is being ingested by the vacuum assembly 124 (e.g., via the extractor 113) at any time during a cleaning mission. Because the robot 100 can be aware of its location, the robot 100 can keep a log or record of which areas or rooms of the map are dirtier or where more dirt is collected. This information can be used in several ways, as discussed further below.
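
As a hedged illustration of such a log, the sketch below pairs each dirt detection event with the robot's current room and totals the readings per room. All names and structures are hypothetical.

```typescript
// Sketch of a per-room dirt log, assuming the robot can pair each dirt
// detection event with its current room from the map.
interface DirtEvent { roomId: string; amount: number; timestamp: number; }

class DirtLog {
  private events: DirtEvent[] = [];

  record(event: DirtEvent): void {
    this.events.push(event);
  }

  // Total detected dirt per room, usable to rank which rooms run dirtier.
  totalsByRoom(): Map<string, number> {
    const totals = new Map<string, number>();
    for (const e of this.events) {
      totals.set(e.roomId, (totals.get(e.roomId) ?? 0) + e.amount);
    }
    return totals;
  }
}
```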

[0044] The robot 100 can also include an image capture device 140 configured to generate a signal based on imagery of the environment 40 of the robot 100 as the robot 100 moves about the floor surface 50. The image capture device 140 can transmit such a signal to the controller 111. The controller 111 can use the signal or signals from the image capture device 140 for various tasks, algorithms, or the like, as discussed in further detail below.

[0045] In some examples, obstacle following sensors can detect detectable objects, including obstacles such as furniture, walls, persons, and other objects in the environment of the robot 100. In some implementations, the sensor system can include an obstacle following sensor along the side surface, and the obstacle following sensor can detect the presence or the absence of an object adjacent to the side surface. The one or more obstacle following sensors can also serve as obstacle detection sensors, similar to the proximity sensors described herein.

[0046] The robot 100 can also include sensors for tracking a distance travelled by the robot 100. For example, the sensor system can include encoders associated with the motors 116 for the drive wheels 118, and the encoders can track a distance that the robot 100 has travelled. In some implementations, the sensor system can include an optical sensor facing downward toward a floor surface. The optical sensor can be positioned to direct light through a bottom surface of the robot 100 toward the floor surface 50. The optical sensor can detect reflections of the light and can detect a distance travelled by the robot 100 based on changes in floor features as the robot 100 travels along the floor surface 50.

[0047] The controller 111 can use data collected by the sensors of the sensor system to control navigational behaviors of the robot 100 during the mission. For example, the controller 111 can use the sensor data collected by the obstacle detection sensors of the robot 100 (e.g., the cliff sensors, the proximity sensors, and the bump sensors) to enable the robot 100 to avoid obstacles within the environment of the robot 100 during the mission.

[0048] The sensor data can also be used by the controller 111 for simultaneous localization and mapping (SLAM) techniques in which the controller 111 extracts features of the environment represented by the sensor data and constructs a map of the floor surface 50 of the environment. The sensor data collected by the image capture device 140 can be used for techniques such as vision-based SLAM (VSLAM) in which the controller 111 extracts visual features corresponding to objects in the environment 40 and constructs the map using these visual features. As the controller 111 directs the robot 100 about the floor surface 50 during the mission, the controller 111 can use SLAM techniques to determine a location of the robot 100 within the map by detecting features represented in collected sensor data and comparing the features to previously stored features. The map formed from the sensor data can indicate locations of traversable and nontraversable space within the environment. For example, locations of obstacles can be indicated on the map as nontraversable space, and locations of open floor space can be indicated on the map as traversable space.

[0049] The sensor data collected by any of the sensors can be stored in the memory 126. In addition, other data generated for the SLAM techniques, including mapping data forming the map, can be stored in the memory 126. These data produced during the mission can include persistent data that are produced during the mission and that are usable during further missions. In addition to storing the software for causing the robot 100 to perform its behaviors, the memory 126 can store data resulting from processing of the sensor data for access by the controller 111. For example, the map can be a map that is usable and updateable by the controller 111 of the robot 100 from one mission to another mission to navigate the robot 100 about the floor surface 50.
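
A minimal sketch of the traversable/nontraversable map described above follows, assuming a flat occupancy grid; real SLAM map representations are more involved, and the names are illustrative.

```typescript
// Simplified occupancy-grid sketch of a traversable/nontraversable map.
enum Cell { Unknown, Traversable, Nontraversable }

class OccupancyGrid {
  private cells: Cell[];
  constructor(readonly width: number, readonly height: number) {
    this.cells = new Array(width * height).fill(Cell.Unknown);
  }
  set(x: number, y: number, value: Cell): void {
    this.cells[y * this.width + x] = value;
  }
  get(x: number, y: number): Cell {
    return this.cells[y * this.width + x];
  }
}

// Mark an observed obstacle and an observed patch of open floor.
const grid = new OccupancyGrid(100, 100);
grid.set(10, 12, Cell.Nontraversable); // e.g., a table leg
grid.set(11, 12, Cell.Traversable);    // open floor next to it
```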

[0050] The persistent data, including the persistent map, can help to enable the robot 100 to efficiently clean the floor surface 50. For example, the map can enable the controller 111 to direct the robot 100 toward open floor space and to avoid nontraversable space. In addition, for subsequent missions, the controller 111 can use the map to optimize paths taken during the missions to help plan navigation of the robot 100 through the environment 40.

[0051] The controller 111 can also send commands to a motor (internal to the body 102) to drive the arms 106 to move the pad assembly 108 between the stored position (shown in FIGS. 2A and 2D) and the deployed position (shown in FIGS. 2C and 2E). In the deployed position, the pad assembly 108 (the mopping pad 142) can be used to mop a floor surface of any room of the environment 40.

[0052] The mopping pad 142 can be a dry pad or a wet pad. Optionally, when the mopping pad 142 is a wet pad, the pump 134 can be operated by the controller 111 to spray or drop fluid (e.g., water or a cleaning solution) onto the floor surface 50 or the mopping pad 142. The wetted mopping pad 142 can then be used by the robot 100 to perform wet mopping operations on the floor surface 50 of the environment 40. As discussed in further detail below, a user can select in which rooms of the environment which cleaning functions should be performed and can select particular settings for each cleaning function of each room.

Network Examples

[0053] FIG. 3 is a diagram illustrating by way of example and not limitation a communication network 400 that enables networking between the mobile robot 100 and one or more other devices, such as a mobile device 404 (including a controller 444), a cloud computing system 406 (including a controller 442), or another autonomous robot 408 separate from the mobile robot 100. Using the communication network 400, the robot 100, the mobile device 404, the robot 408, and the cloud computing system 406 can communicate with one another to transmit and receive data from one another. In some examples, the robot 100, the robot 408, or both the robot 100 and the robot 408 communicate with the mobile device 404 through the cloud computing system 406. Alternatively, or additionally, the robot 100, the robot 408, or both the robot 100 and the robot 408 communicate directly with the mobile device 404. Various types and combinations of wireless networks (e.g., Bluetooth, radio frequency, optical based, etc.) and network architectures (e.g., Wi-Fi or mesh networks) can be employed by the communication network 400.

[0054] In some examples, the mobile device 404 can be a remote device that can be linked to the cloud computing system 406 and can enable a user to provide inputs. The mobile device 404 can include user input elements such as, for example, one or more of a touchscreen display, buttons, a microphone, a mouse, a keyboard, or other devices that respond to inputs provided by the user. The mobile device 404 can also include immersive media (e.g., virtual reality) with which the user can interact to provide input. The mobile device 404, in these examples, can be a virtual reality headset or a head-mounted display.

[0055] The user can provide inputs corresponding to commands for the mobile robot 100. In such cases, the mobile device 404 can transmit a signal to the cloud computing system 406 to cause the cloud computing system 406 to transmit a command signal to the mobile robot 100. In some implementations, the mobile device 404 can present augmented reality images. In some implementations, the mobile device 404 can be a smart phone, a laptop computer, a tablet computing device, or other mobile device.

[0056] According to some examples discussed herein, the mobile device 404 can include a user interface configured to display a map of the robot environment. A robot path, such as that identified by a coverage planner, can also be displayed on the map. The interface can receive a user instruction to modify the environment map, such as by adding, removing, or otherwise modifying a keep-out zone in the environment; adding, removing, or otherwise modifying a focused cleaning zone in the environment (such as an area that requires repeated cleaning); restricting a robot traversal direction or traversal pattern in a portion of the environment; or adding or changing a cleaning rank, among others.

[0057] In some examples, the communication network 400 can include additional nodes. For example, nodes of the communication network 400 can include additional robots. Also, nodes of the communication network 400 can include network-connected devices that can generate information about the environment 40. Such a network-connected device can include one or more sensors, such as an acoustic sensor, an image capture system, or other sensor generating signals, to detect characteristics of the environment 40 from which features can be extracted. Network-connected devices can also include home cameras, smart sensors, or the like.

[0058] In the communication network 400, the wireless links can utilize various communication schemes, protocols, etc., such as, for example, Bluetooth classes, Wi-Fi, Bluetooth low energy (also known as BLE), 802.15.4, Worldwide Interoperability for Microwave Access (WiMAX), an infrared channel, satellite band, or the like. In some examples, wireless links can include any cellular network standards used to communicate among mobile devices, including, but not limited to, standards that qualify as 1G, 2G, 3G, 4G, 5G, or the like. The network standards, if utilized, can qualify as, for example, one or more generations of mobile telecommunication standards by fulfilling a specification or standards such as the specifications maintained by the International Telecommunication Union. For example, the 4G standards can correspond to the International Mobile Telecommunications Advanced (IMT-Advanced) specification. Examples of cellular network standards include AMPS, GSM, GPRS, UMTS, LTE, LTE Advanced, Mobile WiMAX, and WiMAX-Advanced. Cellular network standards can use various channel access methods, e.g., FDMA, TDMA, CDMA, or SDMA.

[0059] FIG. 4 is a diagram illustrating an exemplary process 401 of exchanging information among devices in the communication network 400, including the mobile robot 100, the cloud computing system 406, and the mobile device 404, and of operating the mobile robot 100. In operation of some examples, a cleaning mission can be initiated by pressing a button on the mobile robot 100 (or the mobile device 404) or can be scheduled for a future time or day. The user can select a set of rooms to be cleaned during the cleaning mission or can instruct the robot to clean all rooms. The user can also select a set of cleaning parameters to be used in each room during the cleaning mission.

[0060] During a cleaning mission, the mobile robot 100 can track 410 its status, including its location, any operational events occurring during cleaning, and time spent cleaning. The operational events can include tracking of debris ingested by the robot during vacuuming mode. For example, using the dirt sensor 144, an amount of debris, where it was ingested, at what time it was ingested, or the like, can be logged, stored, or transmitted 412 to the cloud computing system 406. Either the robot 100 or the cloud computing system 406 can use this and other data to develop a cleaning history 414 of the robot 100 within the environment 40. The robot 100 or the cloud computing system 406 can also produce recommended settings 414 for the robot in each room or in the entire environment based on the cleaning history or other variables.
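
As one hypothetical illustration of turning a cleaning history into a recommended setting, the sketch below maps a room's average dirt reading to a suggested number of vacuuming passes. The thresholds and function names are placeholders, not values from the application.

```typescript
// Hedged sketch: derive a recommended number of vacuuming passes from a
// room's dirt history; thresholds are illustrative assumptions.
function recommendPasses(avgDirtPerMission: number): 1 | 2 | 3 {
  if (avgDirtPerMission > 100) return 3; // persistently very dirty room
  if (avgDirtPerMission > 40) return 2;
  return 1;
}

function recommendForRoom(history: number[]): 1 | 2 | 3 {
  const avg = history.reduce((a, b) => a + b, 0) / Math.max(history.length, 1);
  return recommendPasses(avg);
}

console.log(recommendForRoom([80, 120, 95])); // prints 2 (average ≈ 98)
```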

[0061] The cloud computing system 406 can transmit 416 the cleaning history or the recommended settings to the mobile device 404. The mobile device 404 can present 418, such as by the controller 444, the room settings, such as cleaning settings for one or more rooms of the environment as defined by the map. The recommended settings or settings of each room can be presented on the display of the mobile device 404 as any of a number of graphical representations editable to select cleaning settings for the robot 100 within the environment 40.

[0062] A user 402 can view 420 the settings or recommended settings on the display and can input 422 new settings or can accept settings that are recommended. Various settings can be selected by the user 402 to present additional setting options, as discussed below in FIGS. 5A-8. Accordingly, any of the settings discussed in FIGS. 5A-8 below can be implemented or selected in the steps 414-422. The user 402 can also add rooms or delete rooms which can each be assigned individual settings.

[0063] The display of the mobile device 404 can be updated 424 as the user 402 changes the settings or accepts the proposed settings. For example, if the user 402 selects the kitchen cleaning settings indication, various settings can be displayed on the mobile device 404 where the user can then select the cleaning settings for that room, such as a number of vacuuming passes, a mopping overlap, a vacuum suction setting, or a mopping fluid dispense rate. The room settings indication can then be updated on the mobile device 404 based on the one or more selections by the user 402.

[0064] Based on the inputs from the user 402, the settings can be transmitted to the robot 100 or to the cloud computing system 406. The robot 100 can then execute behaviors 426 based on any of the received settings discussed above or below. For example, if a number of vacuuming passes is set to two passes in the kitchen, the robot 100 can perform two vacuuming passes each time it is instructed to complete a cleaning mission in the kitchen.
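
A minimal sketch of this behavior follows, assuming a hypothetical robot API with a single coverage-pass primitive; the interface is illustrative only.

```typescript
// Sketch of how a received room setting could drive mission execution.
interface RobotApi { cleanRoom(roomId: string): Promise<void>; }

async function runRoom(robot: RobotApi, roomId: string, passes: number) {
  for (let i = 0; i < passes; i++) {
    await robot.cleanRoom(roomId); // one full coverage pass per iteration
  }
}

// With the kitchen set to two passes, each kitchen mission runs twice:
// await runRoom(robot, "kitchen", 2);
```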

[0065] Upon executing a behavior, the robot 100 can check 430 whether the cleaning behaviors have been executed and can again track 410 its status, including its location, any operational events occurring during cleaning, and time spent cleaning using the new settings, and can update the cleaning history and optionally produce new recommended settings as the loop continues.

[0066] In some examples, communications can occur between the mobile robot 100 and the mobile device 404 directly. For example, the mobile device 404 can be used to transmit one or more instructions through a wireless method of communication, such as Bluetooth or Wi-Fi, to instruct the mobile robot 100 to perform a cleaning operation (mission). Operations for the process 401 and other processes described herein, such as one or more steps discussed with respect to FIGS. 5A-8, can be executed in a distributed manner. For example, the cloud computing system 406, the mobile robot 100, and the mobile device 404 can execute one or more of the operations in concert with one another. Operations described as executed by one of the cloud computing system 406, the mobile robot 100, and the mobile device 404 are, in some implementations, executed at least in part by two or all of the cloud computing system 406, the mobile robot 100, and the mobile device 404.

User Interface Examples

[0067] FIGS. 5A-8 show examples of a user interface for creating or performing a cleaning mission routine and controlling a mobile robot to execute the cleaning mission in an environment. The user interface can be a part of a handheld computing device, such as a smart phone, a cellular phone, a personal digital assistant, a laptop computer, a tablet, a smart watch, or other portable computing device capable of transmitting and receiving signals related to a robot cleaning mission. In an example, the handheld computing device is the mobile device 404.

[0068] The user interface can be configured to present, on a display, information about one or more robots in a user’s home and their respective operating status, one or more editable mission routines (e.g., a cleaning mission), and a progress of a mission being executed. In some examples, cleaning settings for each room can be presented for editing or selecting by the user. The user interface can also receive user instructions or settings for controlling the robot navigation and mission execution.

[0069] FIG. 5A illustrates an example of a device 500A configured to display a view of creation or editing of a new job or mission. The device 500A can include a display 502 configured to display a user interface 504A. Through a single user interface 504 (as shown in FIGS. 5A-8), a user can coordinate behaviors of a robot. The user interface 504A can include one or more user interface controls (e.g., buttons, selectable indications, checkboxes, dropdown lists, list boxes, sliders, links, tabstrips, text boxes, charts, windows, or the like) that can allow a user to select various functions or settings of mission scheduling and robot control.

[0070] The user interface 504A can be a user interface for creating a new job or new cleaning routine or can be an interface for editing a job or cleaning routine. For example, the user interface 504A can include a save as favorite indication 506 that can be user-selectable to save the settings of the new job as a new favorite job or cleaning routine. The user interface 504A can also include a start now indication 508 that can be user-selectable to start the new job now, where selection can transmit an instruction to the robot 100 to start the cleaning mission based on the selected parameters of the user interface 504A (optionally among others).

[0071] The user interface 504A can also include one or more shelves 510, which can be collections or groupings of indications, buttons, lists, text boxes, or the like. The user interface 504A can be configured to display the shelves 510A-510C representing rooms to be cleaned in a new job screen. The rooms can optionally be ordered for cleaning sequencing, e.g., 1, 2, 3, or the like, indicating an order that the rooms of the map will be cleaned. Each shelf 510 can include a name indication 512, a time estimate 514, and a room cleaning settings indication 516.

[0072] The name indication 512 can display or indicate a name of the room of the map. Each room can be previously named by a user following map generation by the robot 100 or the cloud computing system 406, which can be developed, at least in part, during cleaning or mapping routines, as discussed above.

[0073] The time estimate indication 514 can display a time or estimated time to complete a portion of a cleaning mission to be performed within the room of the shelf 510. For example, the shelf 510A for the Kitchen can have a time estimate indication 514A of about 12 minutes, which can be an estimated time to complete a portion of the cleaning mission or job that occurs within the Kitchen, based on the selected settings. Optionally, the time estimate indication 514 can be updated based on user changes to the room cleaning settings, such as by selecting the room cleaning settings indication 516 and changing or entering one or more settings, as discussed in further detail below. The user can thereby be provided with a time estimate for cleaning each room to be cleaned based on their selected cleaning settings.
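
One hedged way to compute such an estimate is to scale a base per-pass time by the selected number of passes and add an overhead for mopping, as in the sketch below; the constants and names are placeholders, not figures from the application.

```typescript
// Illustrative time-estimate update driven by the selected room settings.
function estimateMinutes(
  baseMinutes: number,   // estimated time for one vacuuming pass of the room
  passes: number,        // selected number of vacuuming passes
  mopping: boolean,      // whether mopping is included in the mode
): number {
  const moppingOverhead = mopping ? 0.5 * baseMinutes : 0;
  return Math.round(baseMinutes * passes + moppingOverhead);
}

// Kitchen at ~6 minutes per pass, two passes, with mopping: ~15 minutes.
console.log(estimateMinutes(6, 2, true)); // 15
```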

[0074] Each room cleaning settings indication 516 can be user-selectable to produce or display, on the device 500, a screen for user selection of one or more cleaning settings for that previously-mapped specified room. For example, selection (e.g., by a user) of the kitchen room cleaning settings indication 516A can produce the user interface shown in FIG. 5B, where a user can select, enter, or accept one or more cleaning mode settings, as discussed in further detail below with respect to FIG. 5B.

[0075] Each of the room cleaning settings indications 516 can include icons or images representing selected, default, or recommended settings. For example, the kitchen room cleaning settings indication 516A shows a vacuuming passes indication 518A, a vacuuming indication 520A, and a mopping indication 522A. The vacuuming passes indication 518A can represent a number of vacuuming passes that is selected as planned to be performed during the planned cleaning mission or job. The vacuuming indication 520A can represent that vacuuming will be performed in the planned cleaning mission or can represent a vacuuming speed that is selected to be performed on the planned cleaning mission or job. The mopping indication 522A can represent a rate at which fluid will be discharged from the robot during a mopping mode. Optionally, the drop icon can vary to show a small amount (or low amount), a medium amount, or a high amount. For example, an empty drop can indicate a low amount and a filled drop can represent a high amount. Further, the room cleaning settings indication 516C for the Dining Room can show a mopping indication 522C that is grayed-out or dashed, which can indicate that it has been selected, or the current setting is set, to not perform any mopping in the indicated room. Similarly, any of the indications 516 can be grayed out to indicate that the setting the indication represents is not selected or is turned off.

[0076] FIG. 5B illustrates a user interface 504B of the handheld device 500B that displays a cleaning mode indication, among others. The handheld device 500 and the display 502 can be similar to those discussed above where the user interface 504B can be changed with respect to the user interface 504A shown in FIG. 5A.

[0077] FIG. 5B shows that, when the room cleaning settings indication 516A of the user interface 504A is selected, the user interface 504B can be displayed such that the settings for the indicated or selected room of the environment are displayed and selectable. For example, a cleaning mode indication 524 can be displayed, which can include a text indication 526 that can display text to indicate which setting is selectable using the indication, such as Cleaning Mode for the cleaning mode indication 524. The cleaning mode indication 524 can be selectable to set a cleaning mode setting of the specified room. For example, a vacuuming only indication 528 can be selectable so that only vacuuming operations are performed in the room (e.g., the Kitchen) of the environment 40. Alternatively, a mopping and vacuuming indication 530 can be selected so that both vacuuming and mopping operations are performed in the room (e.g., the Kitchen) of the environment 40. The vacuuming only indication 528 can be represented by a vacuuming image, such as that of a side brush or other vacuuming component. The mopping and vacuuming indication 530 can be represented by both a vacuuming component icon and a liquid drop icon or any other mopping component icon (e.g., spray nozzle or pad).

[0078] The selected cleaning mode indication 524 can change various indications or image presentations on the user interface 504B. For example, the selected indication of the cleaning mode indication 524 can be of a different color than the non-selected indication. For example, the vacuuming only indication 528 is selected and can be displayed in white and the mopping and vacuuming indication 530 is not selected and can be displayed in gray. Other colors can be used. The selected cleaning mode indication 524 can also vary text 532, which can list in text which mode is displayed. For example, as the vacuuming only indication 528 is selected, the text 532 displays Vacuum.

[0079] Further, when the vacuuming only indication 528 of the cleaning mode indication 524 is selected, a cleaning passes indication 534 can be displayed. The user interface 504B can include a text indication 536 which can display text to indicate what is selectable using the indication, such as Cleaning Passes for the indication 534. The cleaning passes indication 534 can include indications selectable to determine or select a number of vacuuming cleaning passes of a mobile cleaning robot. For example, the cleaning passes indication 534 can include a single pass indication 538 and a two-pass indication 540. The single pass indication 538 can be user selectable to instruct the robot to perform only a single vacuuming pass in the selected room (e.g., the Kitchen). The two-pass indication 540 can be user selectable to instruct the robot to perform two vacuuming passes in the selected room. The cleaning passes indication 534 can optionally include indications for more passes, such as 3, 4, 5, 10, or the like.

[0080] The selected cleaning passes indication 534 can also vary text 542, which can list in text how many cleaning passes are displayed. For example, as the single pass indication 538 is selected, the text 542 displays “One”. The indications of the cleaning passes indication 534 can also change colors or can change appearance in other ways (similar to the cleaning mode indication 524). Other indications discussed below can also change appearance. Because vacuuming is performed when either the vacuuming only indication 528 or the mopping and vacuuming indication 530 is selected, the cleaning passes indication 534 can be displayed when either of the cleaning mode indication 524 options is selected.
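
The mode-dependent visibility described above can be sketched as follows; the control names are illustrative assumptions.

```typescript
// Cleaning passes appear for both modes; the fluid rate control appears
// only when mopping is included in the selected mode.
type Mode = "vacuumOnly" | "vacuumAndMop";

function visibleControls(mode: Mode): string[] {
  const controls = ["cleaningPasses"]; // vacuuming happens in both modes
  if (mode === "vacuumAndMop") {
    controls.push("fluidRate"); // liquid amount shown only with mopping
  }
  return controls;
}

console.log(visibleControls("vacuumOnly"));   // ["cleaningPasses"]
console.log(visibleControls("vacuumAndMop")); // ["cleaningPasses", "fluidRate"]
```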

[0081] FIG. 5C illustrates a user interface 504C of the handheld device 500C that displays a cleaning mode indication, among others. The handheld device 500 and the display 502 can be similar to those discussed above, where the user interface 504C can be changed with respect to the user interface 504A shown in FIG. 5A.

[0082] FIG. 5C shows that when the mopping and vacuuming indication 530 of the cleaning mode indication 524 is selected, the indications and settings produced can be different. The cleaning passes indication 534 can still be produced and user selectable to select vacuuming passes; however, when the mopping and vacuuming indication 530 is selected, a fluid rate indication 544 can be produced or displayed by a processor on the user interface 504C. The fluid rate indication 544 can be selectable to set a rate at which a fluid is discharged from the robot 100 during mopping in the selected (or specified) room (e.g., the Kitchen).

[0083] The user interface 504C can include a text indication 546 which can display text to indicate what is selectable using the indication, such as “Liquid Amount” for the indication 544. The fluid rate indication 544 can include a low indication 548, a medium indication 550, and a high indication 552. The low indication 548 can be selected to set a low rate at which a fluid is discharged from the robot 100 in mopping mode. The medium indication 550 can be selected to set a medium rate at which a fluid is discharged from the robot 100 in mopping mode. The high indication 552 can be selected to set a high rate at which a fluid is discharged from the robot 100 in mopping mode. The indications 548-552 can change in appearance based on which is selected, as discussed above with respect to other indications. The selected liquid amount indication 544 can also vary text 554, which can list in text the rate of fluid to be dispensed, such as “Medium.” In this way, the user interface 504C can be used by a user to set the settings for any room of the environment 40 for a vacuuming and mopping robot.

[0084] FIG. 6A illustrates an example of a device 600A configured to display a view of creation or editing of a new job or mission. The device 600A can include a display 602 configured to display a user interface 604A, which can be similar to the device 500A and user interface 504A, where like numerals can represent like components. The user interface 604A can differ in that the user interface 604A can display an interface for selecting settings for a mopping only device, such as a mopping robot. That is, the room cleaning settings indications 616 can be selectable to display mopping settings within a particular room.

[0085] Each of the room cleaning settings indications 616 can include icons or images indicating selected, default, or recommended settings. For example, the room cleaning settings indication 616A can include an overlap percentage indication 656A and a fluid rate indication 620A. The overlap percentage indication 656A can represent a selected overlap percentage.

[0086] FIG. 6B illustrates a user interface 604B of the handheld device 600B that displays a cleaning mode indication, among others. The handheld device 600 and the display 602 can be similar to those discussed above where the user interface 604B can be changed with respect to the user interface 604A shown in FIG. 6A.

[0087] For example, the user interface 604B can be changed to display one or more cleaning settings when the indication 616A is user-selected to show a fluid rate indication 644, which can be similar to the fluid rate indication 544 discussed above. The user interface 604B can also show a mopping overlap indication 660 that is selectable to set an amount that a mopping pad rank for a given pass of the mopping pad will overlap the mopping pad rank of the previous pass. Stated another way, the overlap percentage is the amount the mopping pad overlaps in adjacent mopping passes.
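
By way of a hypothetical illustration, the sketch below maps low/medium/high overlap selections to the single-value example percentages given in the next paragraph (25, 67, and 85 percent) and computes the resulting advance between adjacent mopping ranks; the pad width and function names are assumptions.

```typescript
// Mapping of overlap indications to illustrative percentages.
type OverlapLevel = "low" | "medium" | "high";

const OVERLAP_PERCENT: Record<OverlapLevel, number> = {
  low: 25,
  medium: 67,
  high: 85,
};

// Rank spacing for a pad of a given width at the selected overlap level.
function rankSpacing(padWidthMm: number, level: OverlapLevel): number {
  return padWidthMm * (1 - OVERLAP_PERCENT[level] / 100);
}

console.log(rankSpacing(200, "medium")); // ≈ 66 mm advance between ranks
```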

[0088] The mopping overlap indication 660 can include a low indication 664, a medium indication 666, and a high indication 668. The low indication 664 can be selected to set a low percentage of mopping pad overlap in the room for the selected job or mission. The medium indication 666 can be selected to set a medium percentage of mopping pad overlap. The high indication 668 can be selected to set a high percentage of mopping pad overlap. For example, the low overlap percentage can be between 5 percent and 40 percent, the medium overlap percentage can be between 25 percent and 80 percent, and the high overlap percentage can be between 60 percent and 100 percent. In one example, the low overlap percentage can be 25 percent, the medium overlap can be 67 percent, and the high overlap can be 85 percent. More or fewer overlap percentage indications can be used, such as 2, 4, 5, 6, 10, or the like. The selected mopping overlap indication 660 can also vary text 670, which can list in text the overlap amount, such as “Medium.” In this way, the user interface 604B can be used by a user to set one or more mopping performance settings for any room of the environment 40 for a mopping robot.

[0089] FIG. 7 illustrates an example of a device 700 configured to display a view of creation or editing of a new job or mission. The device 700 can include a display 702 configured to display a user interface 704, which can be similar to the user interface 504 or 604, where like numerals can represent like components. The user interface 704 can differ in that the user interface 704 can display an interface for selecting settings for a vacuuming only device, such as a vacuuming robot. However, the indications and features of the device 700 can be used with any of the previously discussed devices or interfaces.

[0090] The user interface 704 can display a settings indication including a cleaning passes indication 734, which can be similar to the indication 534 discussed above. The user interface 704 can also display a suction power indication 772 (or vacuum speed indication), which can be selectable to set a speed of a vacuum system of a mobile cleaning robot. The user interface 704 can include a text indication 774, which can display text to indicate what is selectable using the indication, such as “Suction.”

[0091] The suction power indication 772 can include a low suction indication 776, a medium suction indication 778, and a high suction indication 780. The low suction indication 776 can be selected to set a low suction rate of the vacuum system of the robot 100 in the selected room for the selected job. The medium suction indication 778 can be selected to set a medium suction rate of the vacuum system of the robot 100. The high suction indication 780 can be selected to set a high suction rate of the vacuum system of the robot 100. The indications 776-780 can change in appearance based on which is selected, as discussed above with respect to other indications. The selected suction power indication 772 can also vary text 782, which can list in text the suction setting of the robot 100, such as “Medium.” In this way, the user interface 704 can be used by a user to set one or more vacuum settings for any room of the environment 40.
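To make the relationship among these per-room selections concrete, the following sketch models the settings discussed for FIGS. 5-7 (cleaning passes, suction power, mopping overlap, and fluid rate) as a small Python data structure. The class, field, and enum names are hypothetical; only the value ranges (one to three passes; low, medium, and high levels) come from the description.

    from dataclasses import dataclass
    from enum import Enum

    class Level(Enum):
        LOW = "low"
        MEDIUM = "medium"
        HIGH = "high"

    @dataclass
    class RoomCleaningSettings:
        """Per-room selections a user might make in the interfaces above."""
        room_id: str
        vacuum: bool = True               # vacuuming only or two-in-one job
        mop: bool = False
        passes: int = 1                   # selectable between one and three
        suction: Level = Level.MEDIUM     # speed of the vacuum system
        overlap: Level = Level.MEDIUM     # mopping pad overlap percentage
        fluid_rate: Level = Level.MEDIUM  # rate at which fluid is discharged

        def __post_init__(self) -> None:
            if not 1 <= self.passes <= 3:
                raise ValueError("passes must be between 1 and 3")

Each Level value would ultimately be resolved by the robot into a concrete motor speed, overlap percentage, or dispense rate, as described for the individual indications above.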

[0092] In any of the user interfaces 504-704 discussed above, the settings can be presented as default settings before a user makes a selection. Alternatively, one or more of the presented cleaning settings can be presented based on a cleaning history of the robot 100. As discussed above with respect to FIGS. 3 and 4, the cleaning history of the robot 100 can be used to develop recommended cleaning settings, which can then be displayed on any of the interfaces 504-704 for user acceptance or modification. In some examples, the settings can be presented based on previously selected settings or based on settings of other rooms. These settings can optionally be accepted using any of the interfaces discussed above. Alternatively, the recommended settings can be presented via a single indication selectable to select all recommended cleaning settings for the robot, either in a single room or for all rooms of the environment.
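One plausible way to derive a recommended setting from a cleaning history, as described above and in Examples 11 and 12 below, is to escalate the pass count for rooms whose dirt-detection events run consistently high. The sketch below is an assumption-laden illustration rather than the disclosed algorithm; the thresholds and the function name are invented for the example.

    def recommend_passes(dirt_events_per_mission: list[int]) -> int:
        """Recommend a vacuuming pass count (1-3) from dirt-detection history.

        dirt_events_per_mission: how many dirt-detection signals the robot
        reported in this room on each past mission.
        """
        if not dirt_events_per_mission:
            return 1  # no history yet: fall back to a single default pass
        avg = sum(dirt_events_per_mission) / len(dirt_events_per_mission)
        if avg >= 10:  # thresholds are illustrative only
            return 3
        if avg >= 4:
            return 2
        return 1

    # A room averaging six dirt events per mission yields a two-pass
    # recommendation the user can accept or modify in the interface.
    recommended = recommend_passes([5, 7, 6])

The same pattern could produce recommendations for suction, overlap, or fluid rate, with the user interface presenting each result for acceptance or modification as described above.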

[0093] Any of the settings discussed above with respect to FIGS. 3-7 can be transmitted by the device (e.g., device 500) to the cloud computing system 406 or the robot 100 to control the robot in accordance with such settings during a cleaning operation of the robot 100 or during any future cleaning operation of the robot 100 using a saved job.
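The transmission step might look like the following sketch, which serializes the per-room settings (reusing the hypothetical RoomCleaningSettings class from the earlier sketch) to JSON for delivery to a cloud service or to the robot. The payload shape and the endpoint URL are assumptions; the disclosure does not specify a wire format or transport.

    import json
    import urllib.request

    def send_settings(settings: "RoomCleaningSettings", url: str) -> None:
        """POST a per-room settings payload as JSON to a hypothetical endpoint."""
        payload = {
            "room_id": settings.room_id,
            "vacuum": settings.vacuum,
            "mop": settings.mop,
            "passes": settings.passes,
            "suction": settings.suction.value,
            "overlap": settings.overlap.value,
            "fluid_rate": settings.fluid_rate.value,
        }
        req = urllib.request.Request(
            url,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:  # raises on HTTP error codes
            resp.read()

In practice the same payload could be saved with a job so that the robot applies the settings on future missions, per the description above.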

[0094] Though the settings are discussed as being cleaning settings of the robot 100, the settings can be selectable for any number of mobile cleaning robots including a vacuuming robot, a mopping robot, or a two-in-one (i.e., vacuuming and mopping) robot.

[0095] FIG. 8 illustrates a block diagram of an example machine 800 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform. Examples, as described herein, may include, or may operate by, logic or a number of components, or mechanisms in the machine 800. Circuitry (e.g., processing circuitry) is a collection of circuits implemented in tangible entities of the machine 800 that include hardware (e.g., simple circuits, gates, logic, etc.). Circuitry membership may be flexible over time. Circuitries include members that may, alone or in combination, perform specified operations when operating. In an example, hardware of the circuitry may be immutably designed to carry out a specific operation (e.g., hardwired). In an example, the hardware of the circuitry may include variably connected physical components (e.g., execution units, transistors, simple circuits, etc.) including a machine readable medium physically modified (e.g., magnetically, electrically, moveable placement of invariant massed particles, etc.) to encode instructions of the specific operation. In connecting the physical components, the underlying electrical properties of a hardware constituent are changed, for example, from an insulator to a conductor or vice versa. The instructions enable embedded hardware (e.g., the execution units or a loading mechanism) to create members of the circuitry in hardware via the variable connections to carry out portions of the specific operation when in operation. Accordingly, in an example, the machine readable medium elements are part of the circuitry or are communicatively coupled to the other components of the circuitry when the device is operating. In an example, any of the physical components may be used in more than one member of more than one circuitry. For example, under operation, execution units may be used in a first circuit of a first circuitry at one point in time and reused by a second circuit in the first circuitry, or by a third circuit in a second circuitry at a different time. Additional examples of these components with respect to the machine 800 follow.

[0096] In alternative embodiments, the machine 800 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 800 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. In an example, the machine 800 may act as a peer machine in a peer-to-peer (P2P) (or other distributed) network environment. The machine 800 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), or other computer cluster configurations.

[0097] The machine (e.g., computer system) 800 may include a hardware processor 802 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 804, a static memory (e.g., memory or storage for firmware, microcode, a basic input/output system (BIOS), unified extensible firmware interface (UEFI), etc.) 806, and mass storage 808 (e.g., hard drive, tape drive, flash storage, or other block devices), some or all of which may communicate with each other via an interlink (e.g., bus) 830. The machine 800 may further include a display unit 810, an alphanumeric input device 812 (e.g., a keyboard), and a user interface (UI) navigation device 814 (e.g., a mouse). In an example, the display unit 810, input device 812, and UI navigation device 814 may be a touch screen display. The machine 800 may additionally include a storage device (e.g., drive unit) 808, a signal generation device 818 (e.g., a speaker), a network interface device 820, and one or more sensors 816, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor. The machine 800 may include an output controller 828, such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate with or control one or more peripheral devices (e.g., a printer, card reader, etc.).

[0098] Registers of the processor 802, the main memory 804, the static memory 806, or the mass storage 808 may be, or include, a machine readable medium 822 on which is stored one or more sets of data structures or instructions 824 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 824 may also reside, completely or at least partially, within any of registers of the processor 802, the main memory 804, the static memory 806, or the mass storage 808 during execution thereof by the machine 800. In an example, one or any combination of the hardware processor 802, the main memory 804, the static memory 806, or the mass storage 808 may constitute the machine readable media 822. While the machine readable medium 822 is illustrated as a single medium, the term "machine readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 824.

[0099] The term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 800 and that cause the machine 800 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine readable medium examples may include solid-state memories, optical media, magnetic media, and signals (e.g., radio frequency signals, other photon-based signals, sound signals, etc.). In an example, a non-transitory machine readable medium comprises a machine readable medium with a plurality of particles having invariant (e.g., rest) mass, and thus is a composition of matter. Accordingly, non-transitory machine-readable media are machine readable media that do not include transitory propagating signals. Specific examples of non-transitory machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.

[00100] The instructions 824 may be further transmitted or received over a communications network 826 using a transmission medium via the network interface device 820 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone Service (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, IEEE 802.16 family of standards known as WiMax®), IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, among others. In an example, the network interface device 820 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 826. In an example, the network interface device 820 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 800, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software. A transmission medium is a machine readable medium.

NOTES AND EXAMPLES

[00101] The following non-limiting examples detail certain aspects of the present subject matter to solve the challenges and provide the benefits discussed herein, among others.

[00102] Example 1 is a method of operating a mobile cleaning robot system including a mobile cleaning robot and a display device, the method comprising: displaying a room cleaning settings indication selectable to set one or more cleaning settings for a previously-mapped specified room; and displaying, when the room cleaning settings indication is selected, a cleaning mode indication selectable to set a cleaning mode setting of the specified room.

[00103] In Example 2, the subject matter of Example 1 optionally includes performing a cleaning mission using the cleaning mode setting of the specified room when the mobile cleaning robot is located within the specified room.

[00104] In Example 3, the subject matter of any one or more of Examples 1-2 optionally include wherein the room cleaning settings indication includes a vacuuming only or vacuuming and mopping indication.

[00105] In Example 4, the subject matter of Example 3 optionally includes displaying, when either vacuuming only or vacuuming and mopping is selected, a cleaning passes indication selectable to set a number of vacuuming cleaning passes of a mobile cleaning robot.

[00106] In Example 5, the subject matter of Example 4 optionally includes wherein the cleaning passes indication is selectable to set the number of passes of the mobile cleaning robot to between one and three passes.

[00107] In Example 6, the subject matter of any one or more of Examples 3-5 optionally include displaying, when the vacuuming and mopping indication is selected, a fluid rate indication selectable to set a rate at which a fluid is discharged from a mobile cleaning robot.

[00108] In Example 7, the subject matter of Example 6 optionally includes wherein the fluid rate indication is selectable to set the rate at which the fluid is discharged between a low amount, a medium amount, and a high amount.

[00109] In Example 8, the subject matter of any one or more of Examples 1-7 optionally include wherein the room cleaning settings indication includes a vacuum speed indication selectable to set a speed of a vacuum system of a mobile cleaning robot.

[00110] In Example 9, the subject matter of any one or more of Examples 1-8 optionally include wherein the room cleaning settings indication includes an overlap indication selectable to set an overlap percentage of a mopping pad.

[00111] In Example 10, the subject matter of Example 9 optionally includes wherein the room cleaning settings indication includes a fluid rate indication selectable to set a rate at which a fluid is discharged from a mobile cleaning robot.

[00112] In Example 11, the subject matter of any one or more of Examples 1-10 optionally include present a recommended cleaning setting based on a cleaning history.

[00113] In Example 12, the subject matter of Example 11 optionally includes receive a dirt detection signal from a mobile cleaning robot; and determine the cleaning history based on the dirt detection signal.

[00114] Example 13 is a machine-readable medium including instructions for presenting a user interface, which when executed by a processor, cause the processor to: display a room cleaning settings indication selectable to set one or more cleaning settings for a previously-mapped specified room; and display, when the room cleaning settings indication is selected, a cleaning mode indication selectable to set a cleaning mode setting of the specified room.

[00115] In Example 14, the subject matter of Example 13 optionally includes wherein the room cleaning settings indication includes a vacuuming only or vacuuming and mopping indication.

[00116] In Example 15, the subject matter of Example 14 optionally includes the processor further configured to: display, when either vacuuming only or vacuuming and mopping is selected, a cleaning passes indication selectable to set a number of vacuuming cleaning passes of a mobile cleaning robot.

[00117] In Example 16, the subject matter of Example 15 optionally includes wherein the cleaning passes indication is selectable to set the number of passes of the mobile cleaning robot to between one and three passes.

[00118] In Example 17, the subject matter of any one or more of Examples 14-16 optionally include the processor further configured to: display, when the vacuuming and mopping indication is selected, a fluid rate indication selectable to set a rate at which a fluid is discharged from a mobile cleaning robot.

[00119] In Example 18, the subject matter of Example 17 optionally includes wherein the fluid rate indication is selectable to set the rate at which the fluid is discharged between a low amount, a medium amount, and a high amount.

[00120] In Example 19, the subject matter of any one or more of Examples 13-18 optionally include wherein the room cleaning settings indication includes a vacuum speed indication selectable to set a speed of a vacuum system of a mobile cleaning robot.

[00121] In Example 20, the subject matter of any one or more of Examples 13-19 optionally include wherein the room cleaning settings indication includes an overlap indication selectable to set an overlap percentage of a mopping pad.

[00122] In Example 21, the subject matter of Example 20 optionally includes wherein the room cleaning settings indication includes a fluid rate indication selectable to set a rate at which a fluid is discharged from a mobile cleaning robot.

[00123] In Example 22, the subject matter of any one or more of Examples 13-21 optionally include the processor further configured to: present a recommended cleaning setting based on a cleaning history.

[00124] In Example 23, the subject matter of Example 22 optionally includes the processor further configured to: receive a dirt detection signal from a mobile cleaning robot; and determine the cleaning history based on the dirt detection signal.

[00125] In Example 24, the subject matter of any one or more of Examples 22-23 optionally include wherein the recommended cleaning setting is one or more of a number of vacuuming passes, a mopping overlap, a vacuum suction setting, or a mopping fluid dispense rate.

[00126] Example 25 is a mobile cleaning robot system including a mobile cleaning robot, the system comprising: a display configured to present a user interface; and processing circuitry in communication with the mobile cleaning robot and the display, the processing circuitry configured to: display a room cleaning settings indication selectable to set one or more cleaning settings for a previously-mapped specified room; and display, when the room cleaning settings indication is selected, a cleaning mode indication selectable to set a cleaning mode setting of the specified room.

[00127] In Example 26, the subject matter of Example 25 optionally includes wherein the room cleaning settings indication includes a vacuuming only or vacuuming and mopping indication.

[00128] In Example 27, the subject matter of Example 26 optionally includes the processing circuitry further configured to: display, when either vacuuming only or vacuuming and mopping is selected, a cleaning passes indication selectable to set a number of vacuuming cleaning passes of a mobile cleaning robot.

[00129] In Example 28, the subject matter of Example 27 optionally includes the processing circuitry further configured to: display, when the vacuuming and mopping indication is selected, a fluid rate indication selectable to set a rate at which a fluid is discharged from the mobile cleaning robot.

[00130] In Example 29, the subject matter of any one or more of Examples 25-28 optionally include the processing circuitry further configured to: present a recommended cleaning setting based on a cleaning history.

[00131] In Example 30, the subject matter of Example 29 optionally includes the processing circuitry further configured to: receive a dirt detection signal from the mobile cleaning robot; and determine the cleaning history based on the dirt detection signal.

[00132] In Example 31, the apparatuses, systems, or methods of any one or any combination of Examples 1-30 can optionally be configured such that all elements or options recited are available to use or select from.

[00133] The above detailed description includes references to the accompanying drawings, which form a part of the detailed description. The drawings show, by way of illustration, specific embodiments in which the invention can be practiced. These embodiments are also referred to herein as “examples.” Such examples can include elements in addition to those shown or described. However, the present inventors also contemplate examples in which only those elements shown or described are provided. Moreover, the present inventors also contemplate examples using any combination or permutation of those elements shown or described (or one or more aspects thereof), either with respect to a particular example (or one or more aspects thereof), or with respect to other examples (or one or more aspects thereof) shown or described herein.

[00134] In the event of inconsistent usages between this document and any documents so incorporated by reference, the usage in this document controls. In this document, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended; that is, a system, device, article, composition, formulation, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim.

[00135] The above description is intended to be illustrative, and not restrictive. For example, the above-described examples (or one or more aspects thereof) can be used in combination with each other. Other embodiments can be used, such as by one of ordinary skill in the art upon reviewing the above description. The Abstract is provided to comply with 37 C.F.R. § 1.72(b), to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Also, in the above Detailed Description, various features can be grouped together to streamline the disclosure. This should not be interpreted as intending that an unclaimed disclosed feature is essential to any claim. Rather, inventive subject matter can lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that such embodiments can be combined with each other in various combinations or permutations. The scope of the invention should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.