Title:
MOBILE CLEANING ROBOT WITH VARIABLE CLEANING FEATURES
Document Type and Number:
WIPO Patent Application WO/2024/010712
Kind Code:
A1
Abstract:
A mobile cleaning robot can include a body movable within an environment and can define a suction duct. The cleaning assembly can be connected to the body and can be configured to at least partially define, together with a floor surface of the environment, a debris port connected to the suction duct. The cleaning assembly can be engageable with a surface of the environment to direct debris through the debris port, and the cleaning assembly can include a guide adjustable to alter suction through the debris port.

Inventors:
BURBANK ERIC J (US)
MEHEGAN CHRISTOPHER (US)
LEMIEUX JASON (US)
Application Number:
PCT/US2023/026280
Publication Date:
January 11, 2024
Filing Date:
June 27, 2023
Assignee:
IROBOT CORP (US)
International Classes:
A47L9/00; A47L9/04; A47L9/28
Foreign References:
EP3427626A1 (2019-01-16)
EP3669737A1 (2020-06-24)
DE102016115977A1 (2018-03-01)
DE102017208960A1 (2018-11-29)
EP2096972B1 (2012-02-15)
EP3696641A1 (2020-08-19)
EP1967115B1 (2010-07-28)
JP3047555B2 (2000-05-29)
JP2015000210A (2015-01-05)
US20160000289A1 (2016-01-07)
US202217859230A (2022-07-07)
Attorney, Agent or Firm:
ARORA, Suneel et al. (US)
Claims:
CLAIMS:

1. A mobile cleaning robot comprising: a body movable within an environment and defining a suction duct; and a cleaning assembly connected to the body and configured to at least partially define, together with a floor surface of the environment, a debris port connected to the suction duct, the cleaning assembly engageable with a surface of the environment to direct debris through the debris port, and the cleaning assembly including a guide adjustable to alter suction through the debris port.

2. The mobile cleaning robot of claim 1, further comprising: a plurality of debris guides connected to the body forward of the cleaning assembly and extending downward from the body to at least partially define, together with the floor surface and the cleaning assembly, the debris port.

3. The mobile cleaning robot of claim 2, wherein the guide includes a plurality of flaps, each flap extendable between adjacent debris guides of the plurality of debris guides to at least partially define, together with the floor surface and the debris guides, the debris port.

4. The mobile cleaning robot of claim 3, wherein the guide is movable between an extended position and a retracted position.

5. The mobile cleaning robot of claim 4, further comprising: a drive assembly operable to move the guide between the extended position and the retracted position.

6. The mobile cleaning robot of claim 3, wherein the guide is releasably connectable to the cleaning assembly to alter the debris port.

7. The mobile cleaning robot of any of claims 1-6, wherein the guide is a bristle guide connected to a rear portion of the cleaning assembly.

8. The mobile cleaning robot of claim 7, wherein the bristle guide extends along lateral sides of the cleaning assembly from the rear portion.

9. The mobile cleaning robot of claim 7, wherein the guide is movable between an extended position and a retracted position.

10. The mobile cleaning robot of claim 9, further comprising: a drive assembly operable to move the guide between the extended position and the retracted position.

11. The mobile cleaning robot of claim 10, wherein the guide increases suction through the suction port when the guide is in the extended position and decreases suction through the suction port when the guide is in the retracted position.

12. The mobile cleaning robot of claim 7, wherein the guide is releasably connectable to the cleaning assembly to alter the debris port.

13. A method of operating a mobile cleaning robot, the method comprising: determining a floor type of a floor surface of an environment; determining a location of the mobile cleaning robot within the environment; and adjusting a position of a guide of a cleaning assembly of the mobile cleaning robot to adjust a debris port defined, at least in part, by the guide and the floor surface.

14. The method of claim 13, wherein adjusting the position of the guide includes extending the guide toward the cleaning surface or retracting the guide away from the cleaning surface.

15. The method of any of claims 13-14, further comprising: detecting debris on the floor surface of the environment; determining a debris type of the detected debris; and adjusting the position of the guide based on the debris type.

16. The method of any of claims 13-15, further comprising: adjusting a rotational speed of one or more rollers of the cleaning assembly based on the position of the guide.

17. The method of any of claims 13-16, further comprising: determining a drive speed of the robot; and adjusting the drive speed of the mobile cleaning robot based on the determined drive speed and the position of the guide.

18. The method of any of claims 13-17, further comprising: determining a number of passes performed in a designated space of the environment; and adjusting a drive direction of the mobile cleaning robot based on the determined number of passes.

19. A non-transitory machine-readable medium including instructions, for pre-operatively operating a mobile cleaning robot, which when executed by a machine, cause the machine to: determine a floor type of a floor surface of an environment; determine a location of the mobile cleaning robot within the environment; and adjust a position of a guide of a cleaning assembly of the mobile cleaning robot to adjust a debris port defined, at least in part, by the guide and the floor surface.

20. The non-transitory machine-readable medium of claim 19, the instructions to further cause the machine to: detect debris on the floor surface of the environment; determine a debris type of the detected debris; and adjust the position of the guide based on the debris type.

21. The non-transitory machine-readable medium of claim 19, wherein adjusting the position of the guide includes extending the guide toward the cleaning surface or retracting the guide away from the cleaning surface.

Description:
MOBILE CLEANING ROBOT WITH VARIABLE CLEANING FEATURES

PRIORITY APPLICATIONS

[0001] This application is a continuation of and claims the benefit of priority to U.S. Patent Application Serial No. 17/859,230, filed July 7, 2022, the content of which is incorporated herein by reference in its entirety.

BACKGROUND

[0002] Autonomous mobile robots include autonomous cleaning robots that can autonomously perform cleaning tasks within an environment, such as a home. Many kinds of cleaning robots are autonomous to some degree and in different ways. The autonomy of mobile cleaning robots can be enabled by the use of a controller and multiple sensors mounted on the robot. In some examples, the robots can include devices for autonomously improving cleaning performance within an environment.

SUMMARY

[0003] As mobile cleaning robots (e.g., autonomous mobile cleaning robots) traverse an environment, the robots can perform cleaning operations such as vacuuming or mopping operations. During cleaning operations, the robot can operate a vacuum system, such as a blower (e.g., impeller and motor), and cleaning assembly (such as one or more rollers) to extract debris from the environment. However, because floor surfaces and debris types of the environment can vary, the vacuuming efficiency can vary between environments or between rooms of a given environment.

[0004] The devices, systems, and methods of this application can help to address these issues by providing a variable debris port that can be user-adjustable or automatically adjustable (e.g., via a controller of the robot) to improve vacuuming efficiency of the robot based on the flooring type. For example, the robot can include suction guides that are user adjustable based on a flooring type of the environment. Additionally, or alternatively, the robot can include suction guides that are adjusted by the robot based on user input or based on floor type (e.g., automatically) to improve cleaning efficiency between rooms.

[0005] For example, a mobile cleaning robot can include a body movable within an environment and can define a suction duct. The cleaning assembly can be connected to the body and can be configured to at least partially define, together with a floor surface of the environment, a debris port connected to the suction duct. The cleaning assembly can be engageable with a surface of the environment to direct debris through the debris port, and the cleaning assembly can include a guide adjustable to alter suction through the debris port.

[0006] The above discussion is intended to provide an overview of subject matter of the present patent application. It is not intended to provide an exclusive or exhaustive explanation of the invention. The description below is included to provide further information about the present patent application.

BRIEF DESCRIPTION OF THE DRAWINGS

[0007] In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.

[0008] FIG. 1 illustrates a plan view of a mobile cleaning robot in an environment.

[0009] FIG. 2A illustrates a bottom view of a mobile cleaning robot.

[0010] FIG. 2B illustrates an isometric view of a mobile cleaning robot.

[0011] FIG. 3 illustrates a cross-section view across indicators 3-3 of FIG. 2A of a mobile cleaning robot.

[0012] FIG. 4 illustrates a diagram illustrating an example of a communication network in which a mobile cleaning robot operates and data transmission in the network.

[0013] FIG. 5A illustrates a bottom perspective view of a mobile cleaning robot.

[0014] FIG. 5B illustrates a bottom perspective view of a mobile cleaning robot.

[0015] FIG. 6 illustrates a bottom perspective view of a mobile cleaning robot.

[0016] FIG. 7A illustrates a bottom perspective view of a mobile cleaning robot.

[0017] FIG. 7B illustrates a bottom perspective view of a portion of a mobile cleaning robot.

[0018] FIG. 8 illustrates a partially exploded bottom perspective view of a portion of a mobile cleaning robot.

[0019] FIG. 9 illustrates a schematic view of a method.

[0020] FIG. 10 illustrates a block diagram of a machine upon which one or more embodiments may be implemented.

DETAILED DESCRIPTION

[0021] FIG. 1 illustrates a plan view of a mobile cleaning robot 100 in an environment 40, in accordance with at least one example of this disclosure. The environment 40 can be a dwelling, such as a home or an apartment, and can include rooms 42a-42e. Obstacles, such as a bed 44, a table 46, and an island 48 can be located in the rooms 42 of the environment. Each of the rooms 42a-42e can have a floor surface 50a-50e, respectively. Some rooms, such as the room 42d, can include a rug, such as a rug 52. The floor surfaces 50 can be of one or more types such as hardwood, ceramic, low-pile carpet, medium-pile carpet, long (or high)-pile carpet, stone, or the like.

[0022] The mobile cleaning robot 100 can be operated, such as by a user 60, to autonomously clean the environment 40 in a room-by-room fashion. In some examples, the robot 100 can clean the floor surface 50a of one room, such as the room 42a, before moving to the next room, such as the room 42d, to clean the surface of the room 42d. Different rooms can have different types of floor surfaces. For example, the room 42e (which can be a kitchen) can have a hard floor surface, such as wood or ceramic tile, and the room 42a (which can be a bedroom) can have a carpet surface, such as a medium pile carpet. Other rooms, such as the room 42d (which can be a dining room) can include multiple surfaces where the rug 52 is located within the room 42d.

[0023] During cleaning or traveling operations, the robot 100 can use data collected from various sensors (such as optical sensors) and calculations (such as odometry and obstacle detection) to develop a map of the environment 40. Once the map is created, the user 60 can define rooms or zones (such as the rooms 42) within the map. The map can be presentable to the user 60 on a user interface, such as a mobile device, where the user 60 can direct or change cleaning preferences, for example.

[0024] Also, during operation, the robot 100 can detect surface types within each of the rooms 42, which can be stored in the robot or another device. The robot 100 can update the map (or data related thereto) such as to include or account for surface types of the floor surfaces 50a-50e of each of the respective rooms 42 of the environment. In some examples, the map can be updated to show the different surface types such as within each of the rooms 42.

[0025] In some examples, the user 60 can define a behavior control zone 54 using, for example, the methods and systems described herein. In response to the user 60 defining the behavior control zone 54, the robot 100 can move toward the behavior control zone 54 to confirm the selection. After confirmation, autonomous operation of the robot 100 can be initiated. In autonomous operation, the robot 100 can initiate a behavior in response to being in or near the behavior control zone 54. For example, the user 60 can define an area of the environment 40 that is prone to becoming dirty to be the behavior control zone 54. In response, the robot 100 can initiate a focused cleaning behavior in which the robot 100 performs a focused cleaning of a portion of the floor surface 50d in the behavior control zone 54.

Components of the Robot

[0026] FIG. 2A illustrates a bottom view of the mobile cleaning robot 100. FIG. 2B illustrates an isometric view of the mobile cleaning robot 100. FIG. 3 illustrates a cross-section view across indicators 3-3 of FIG. 2A of the mobile cleaning robot 100. FIG. 3 also shows orientation indicators Bottom, Top, Front, and Rear. FIGS. 2A-3 are discussed together below.

[0027] The cleaning robot 100 can be an autonomous cleaning robot that autonomously traverses the floor surface 50 while ingesting the debris 75 from different parts of the floor surface 50. As depicted in FIGS. 2A and 3, the robot 100 can include a body 102 movable across the floor surface 50. The body 102 can include multiple connected structures to which movable components of the cleaning robot 100 are mounted. The connected structures can include, for example, an outer housing to cover internal components of the cleaning robot 100, a chassis to which drive wheels 210a and 210b and the cleaning rollers 205a and 205b (of a cleaning assembly 204) are mounted, a bumper 138 mounted to the outer housing, etc.

[0028] As shown in FIG. 2A, the body 102 includes a front portion 202a that has a substantially semicircular shape and a rear portion 202b that has a substantially semicircular shape. As shown in FIG. 2A, the robot 100 can include a drive system including actuators 208a and 208b, e.g., motors, operable with drive wheels 210a and 210b. The actuators 208a and 208b can be mounted in the body 102 and can be operably connected to the drive wheels 210a and 210b, which are rotatably mounted to the body 102. The drive wheels 210a and 210b support the body 102 above the floor surface 50. The actuators 208a and 208b, when driven, can rotate the drive wheels 210a and 210b to enable the robot 100 to autonomously move across the floor surface 50.

[0029] The controller (or processor) 212 can be located within the housing and can be a programmable controller, such as a single or multi-board computer, a direct digital controller (DDC), a programmable logic controller (PLC), or the like. In other examples, the controller 212 can be any computing device, such as a handheld computer, for example, a smart phone, a tablet, a laptop, a desktop computer, or any other computing device including a processor, memory, and communication capabilities. The memory 213 can be one or more types of memory, such as volatile or non-volatile memory, read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and other storage devices and media. The memory 213 can be located within the housing 200, connected to the controller 212 and accessible by the controller 212.

[0030] The controller 212 can operate the actuators 208a and 208b to autonomously navigate the robot 100 about the floor surface 50 during a cleaning operation. The actuators 208a and 208b are operable to drive the robot 100 in a forward drive direction, in a backwards direction, and to turn the robot 100. The robot 100 can include a caster wheel 211 that supports the body 102 above the floor surface 50. The caster wheel 211 can support the rear portion 202b of the body 102 above the floor surface 50, and the drive wheels 210a and 210b support the front portion 202a of the body 102 above the floor surface 50.

[0031] As shown in FIG. 3, a vacuum assembly 118 can be located at least partially within the body 102 of the robot 100, e.g., in the front portion 202a of the body 102. The controller 112 can operate the vacuum assembly 118 to generate an airflow that flows through the air gap near the cleaning rollers 205, through the body 102, and out of the body 102. The vacuum assembly 118 can include, for example, an impeller that generates the airflow when rotated. The airflow and the cleaning rollers 205, when rotated, can cooperate to ingest debris 75 into a suction duct 348 of the robot 100. The suction duct 348 can extend down to or near a bottom portion of the body 102 and can be at least partially defined by the cleaning assembly 204.

[0032] The suction duct 348 can be connected to the cleaning head 204 or cleaning assembly and can be connected to a cleaning bin 322. The cleaning bin 322 can be mounted in the body 102 and can contain the debris 75 ingested by the robot 100, and a filter in the body 102 separates the debris 75 from the airflow before the airflow 120 enters the vacuum assembly 118 and is exhausted out of the body 102. In this regard, the debris 75 is captured in both the cleaning bin 322 and the filter before the airflow 120 is exhausted from the body 102.

[0033] The cleaning rollers 205a and 205b can be operably connected to actuators 214a and 214b, e.g., motors, respectively. The cleaning head 204 and the cleaning rollers 205a and 205b can be positioned forward of the cleaning bin 322. The cleaning rollers 205a and 205b can be mounted to a housing 124 of the cleaning head 204 and mounted, e.g., indirectly or directly, to the body 102 of the robot 100. In particular, the cleaning rollers 205a and 205b are mounted to an underside of the body 102 so that the cleaning rollers 205a and 205b engage debris 75 on the floor surface 50 during the cleaning operation when the underside faces the floor surface 50.

[0034] The housing 124 of the cleaning head 204 can be mounted to the body 102 of the robot 100. In this regard, the cleaning rollers 205a and 205b are also mounted to the body 102 of the robot 100, e.g., indirectly mounted to the body 102 through the housing 124. Alternatively, or additionally, the cleaning head 204 can be a removable assembly of the robot 100 in which the housing 124 with the cleaning rollers 205a and 205b mounted therein is removably mounted to the body 102 of the robot 100. The housing 124 and the cleaning rollers 205a and 205b can be removable from the body 102 as a unit so that the cleaning head 204 is easily interchangeable with a replacement cleaning head.

[0035] The control system can further include a sensor system with one or more electrical sensors. The sensor system can generate a signal indicative of a current location of the robot 100, and can generate signals indicative of locations of the robot 100 as the robot 100 travels along the floor surface 50.

[0036] Cliff sensors 134 (shown in FIG. 2A) can be located along a bottom portion of the housing 200. Each of the cliff sensors 134 can be an optical sensor that can be configured to detect a presence or absence of an object below the optical sensor, such as the floor surface 50. The cliff sensors 134 can be connected to the controller 212. The bumper 138 can be removably secured to the body 102 and can be movable relative to the body 102 while mounted thereto. In some examples, the bumper 138 can form part of the body 102. The bump sensors 139a and 139b (the bump sensors 139) can be connected to the body 102 and engageable or configured to interact with the bumper 138. The bump sensors 139 can include break beam sensors, capacitive sensors, switches, or other sensors that can detect contact between the robot 100, i.e., the bumper 138, and objects in the environment 40. The bump sensors 139 can be in communication with the controller 212.

[0037] An image capture device 140 can be a camera connected to the body 102 and can extend through the bumper 138 of the robot 100, such as through an opening 143 of the bumper 138. The image capture device 140 can be a camera, such as a front-facing camera, configured to generate a signal based on imagery of the environment 40 of the robot 100 as the robot 100 moves about the floor surface 50. The image capture device 140 can transmit the signal to the controller 212 for use for navigation and cleaning routines.

[0038] Obstacle following sensors 141 (shown in FIG. 2B) can include an optical sensor facing outward from the bumper 138 that can be configured to detect the presence or the absence of an object adjacent to a side of the body 102. The obstacle following sensor 141 can emit an optical beam horizontally in a direction perpendicular (or nearly perpendicular) to the forward drive direction of the robot 100. The optical emitter can emit an optical beam outward from the robot 100, e.g., outward in a horizontal direction, and the optical detector detects a reflection of the optical beam that reflects off an object near the robot 100. The robot 100, e.g., using the controller 212, can determine a time of flight of the optical beam and thereby determine a distance between the optical detector and the object, and hence a distance between the robot 100 and the object.
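The time-of-flight calculation described above reduces to a single relation: the one-way distance is half the round-trip path length. The following sketch is a minimal illustration of that relation only, assuming the sensor reports a round-trip time in seconds; the function and constant names are hypothetical and not taken from the application.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # propagation speed of the optical beam

def distance_from_time_of_flight(round_trip_time_s: float) -> float:
    """Convert a measured round-trip time of flight into a one-way distance.

    The beam travels to the object and back, so the one-way distance is
    half of the total path length covered during the measured interval.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# Example: a 2 ns round trip corresponds to roughly 0.3 m to the obstacle.
print(distance_from_time_of_flight(2e-9))  # ~0.2998 m
```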

[0039] A side brush 142 can be connected to an underside of the robot 100 and can be connected to a motor 144 operable to rotate the side brush 142 with respect to the body 102 of the robot 100. The side brush 142 can be configured to engage debris to move the debris toward the cleaning assembly 205 or away from edges of the environment 40. The motor 144 configured to drive the side brush 142 can be in communication with the controller 112. The brush 142 can be a side brush laterally offset from a center of the robot 100 such that the brush 142 can extend beyond an outer perimeter of the body 102 of the robot 100. Similarly, the brush 142 can also be forwardly offset of a center of the robot 100 such that the brush 142 also extends beyond the bumper 138.

[0040] The robot 100 can also optionally include one or more dirt sensors 145 connected to the body 202 and in communication with the controller 112. The dirt sensors 145 can each be a microphone, piezoelectric sensor, optical sensor, or the like located in or near a flow path of debris, such as near an opening of the cleaning rollers 205 or in one or more ducts within the body 202. This can allow the dirt sensor(s) 145 to detect how much dirt is being ingested by the vacuum assembly 118 (e.g., via the extractor 204) at any time during a cleaning mission. Because the robot 100 can be aware of its location, the robot 100 can keep a log or record of which areas or rooms of the map are dirtier or where more dirt is collected. This information can be used in several ways, as discussed further below.
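As a hedged illustration of the per-room dirt log described above, the sketch below simply accumulates dirt-sensor detections keyed by a room label. The class name, room labels, and counts are hypothetical; the application does not specify a data structure.

```python
from collections import defaultdict

class DirtLog:
    """Accumulate dirt-sensor detections per mapped room (illustrative only)."""

    def __init__(self) -> None:
        self._counts: dict[str, int] = defaultdict(int)

    def record_detection(self, room_id: str, detections: int = 1) -> None:
        # Called whenever the dirt sensor fires while the robot is localized
        # to a particular room on its map.
        self._counts[room_id] += detections

    def dirtiest_rooms(self) -> list[tuple[str, int]]:
        # Rooms sorted from most to least detected debris.
        return sorted(self._counts.items(), key=lambda kv: kv[1], reverse=True)

log = DirtLog()
log.record_detection("kitchen", 5)
log.record_detection("dining_room", 2)
print(log.dirtiest_rooms())  # [('kitchen', 5), ('dining_room', 2)]
```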

Operation of the Robot

[0041] In operation of some examples, the robot 100 can be propelled in a forward drive direction or a rearward drive direction. The robot 100 can also be propelled such that the robot 100 turns in place or turns while moving in the forward drive direction or the rearward drive direction.

[0042] When the controller 212 causes the robot 100 to perform a mission, the controller 212 can operate the motors 208 to drive the drive wheels 210 and propel the robot 100 along the floor surface 50. In addition, the controller 212 can operate the motors 214 to cause the rollers 205a and 205b to rotate, can operate the motor 144 to cause the brush 142 to rotate, and can operate the motor of the vacuum system 118 to generate airflow. The controller 212 can also execute software stored on the memory 213 to cause the robot 100 to perform various navigational and cleaning behaviors by operating the various motors or components of the robot 100.

[0043] The various sensors of the robot 100 can be used to help the robot navigate and clean within the environment 40. For example, the cliff sensors 134 can detect obstacles such as drop-offs and cliffs below portions of the robot 100 where the cliff sensors 134 are disposed. The cliff sensors 134 can transmit signals to the controller 212 so that the controller 212 can redirect the robot 100 based on signals from the cliff sensors 134.

[0044] In some examples, a bump sensor 139a can be used to detect movement of the bumper 138 along a fore-aft axis of the robot 100. A bump sensor 139b can also be used to detect movement of the bumper 138 along one or more sides of the robot 100. The bump sensors 139 can transmit signals to the controller 212 so that the controller 212 can redirect the robot 100 based on signals from the bump sensors 139.

[0045] The image capture device 140 can be configured to generate a signal based on imagery of the environment 40 of the robot 100 as the robot 100 moves about the floor surface 50. The image capture device 140 can transmit such a signal to the controller 212. The image capture device 140 can be angled in an upward direction, e.g., angled between 5 degrees and 45 degrees from the floor surface 50 about which the robot 100 navigates. The image capture device 140, when angled upward, can capture images of wall surfaces of the environment so that features corresponding to objects on the wall surfaces can be used for localization.

[0046] In some examples, the obstacle following sensors 141 can detect detectable objects, including obstacles such as furniture, walls, persons, and other objects in the environment of the robot 100. In some implementations, the sensor system can include an obstacle following sensor along a side surface, and the obstacle following sensor can detect the presence or the absence of an object adjacent to the side surface. The one or more obstacle following sensors 141 can also serve as obstacle detection sensors, similar to the proximity sensors described herein.

[0047] The robot 100 can also include sensors for tracking a distance travelled by the robot 100. For example, the sensor system can include encoders associated with the motors 208 for the drive wheels 210, and the encoders can track a distance that the robot 100 has travelled. In some implementations, the sensor can include an optical sensor facing downward toward a floor surface. The optical sensor can be positioned to direct light through a bottom surface of the robot 100 toward the floor surface 50. The optical sensor can detect reflections of the light and can detect a distance travelled by the robot 100 based on changes in floor features as the robot 100 travels along the floor surface 50.
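The encoder-based distance tracking described above amounts to converting tick counts into rolled distance. The sketch below shows that conversion under assumed, hypothetical parameters (encoder resolution and wheel radius); it is not the application's implementation.

```python
import math

def encoder_distance_m(tick_count: int, ticks_per_rev: int, wheel_radius_m: float) -> float:
    """Distance rolled by a wheel, derived from encoder ticks (hypothetical parameters)."""
    revolutions = tick_count / ticks_per_rev
    return revolutions * 2.0 * math.pi * wheel_radius_m

# Example: 4096 ticks on a 512-tick-per-revolution encoder with a 35 mm wheel radius.
print(encoder_distance_m(4096, 512, 0.035))  # ~1.76 m travelled
```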

[0048] The controller 212 can use data collected by the sensors of the sensor system to control navigational behaviors of the robot 100 during the mission. For example, the controller 212 can use the sensor data collected by obstacle detection sensors of the robot 100, (the cliff sensors 134, the bump sensors 139, and the image capture device 140) to enable the robot 100 to avoid obstacles within the environment of the robot 100 during the mission.

[0049] The sensor data can also be used by the controller 212 for simultaneous localization and mapping (SLAM) techniques in which the controller 212 extracts or interprets features of the environment represented by the sensor data and constructs a map of the floor surface 50 of the environment. The sensor data collected by the image capture device 140 can be used for techniques such as vision-based SLAM (VSLAM) in which the controller 212 extracts visual features corresponding to objects in the environment 40 and constructs the map using these visual features. As the controller 212 directs the robot 100 about the floor surface 50 during the mission, the controller 212 can use SLAM techniques to determine a location of the robot 100 within the map by detecting features represented in collected sensor data and comparing the features to previously stored features. The map formed from the sensor data can indicate locations of traversable and non-traversable space within the environment. For example, locations of obstacles can be indicated on the map as non-traversable space, and locations of open floor space can be indicated on the map as traversable space.
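The traversable/non-traversable map described above can be pictured as an occupancy grid. The sketch below is a minimal, hypothetical representation of such a grid, not the mapping data structure actually used by the robot.

```python
import numpy as np

FREE, OBSTACLE, UNKNOWN = 0, 1, -1

# A small hypothetical occupancy grid: each cell is a patch of the floor surface.
grid = np.full((5, 5), UNKNOWN, dtype=int)
grid[1:4, 1:4] = FREE          # explored open floor space (traversable)
grid[2, 2] = OBSTACLE          # a detected obstacle (non-traversable)

def is_traversable(cell: tuple[int, int]) -> bool:
    """A planner would only route the robot through cells marked FREE."""
    return grid[cell] == FREE

print(is_traversable((1, 1)), is_traversable((2, 2)))  # True False
```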

[0050] The sensor data collected by any of the sensors can be stored in the memory 213. In addition, other data generated for the SLAM techniques, including mapping data forming the map, can be stored in the memory 213. These data produced during the mission can include persistent data that are produced during the mission and that are usable during further missions. In addition to storing the software for causing the robot 100 to perform its behaviors, the memory 213 can store data resulting from processing of the sensor data for access by the controller 212. For example, the map can be a map that is usable and updateable by the controller 212 of the robot 100 from one mission to another mission to navigate the robot 100 about the floor surface 50.

[0051] The persistent data, including the persistent map, helps to enable the robot 100 to efficiently clean the floor surface 50. For example, the map enables the controller 212 to direct the robot 100 toward open floor space and to avoid non- traversable space. In addition, for subsequent missions, the controller 212 can use the map to optimize paths taken during the missions to help plan navigation of the robot 100 through the environment 40.

Network Examples

[0052] FIG. 4 is a diagram illustrating by way of example and not limitation a communication network 410 that enables networking between the mobile robot 100 and one or more other devices, such as a mobile device 404, a cloud computing system 406, or another autonomous robot 408 separate from the mobile robot 100. Using the communication network 410, the robot 100, the mobile device 404, the robot 408, and the cloud computing system 406 can communicate with one another to transmit and receive data from one another. In some examples, the robot 100, the robot 408, or both the robot 100 and the robot 408 communicate with the mobile device 404 through the cloud computing system 406. Alternatively, or additionally, the robot 100, the robot 408, or both the robot 100 and the robot 408 communicate directly with the mobile device 404. Various types and combinations of wireless networks (e.g., Bluetooth, radio frequency, optical based, etc.) and network architectures (e.g., mesh networks) can be employed by the communication network 410.

[0053] In some examples, the mobile device 404 can be a remote device that can be linked to the cloud computing system 406 and can enable a user to provide inputs. The mobile device 404 can include user input elements such as, for example, one or more of a touchscreen display, buttons, a microphone, a mouse, a keyboard, or other devices that respond to inputs provided by the user. The mobile device 404 can also include immersive media (e.g., virtual reality) with which the user can interact to provide input. The mobile device 404, in these examples, can be a virtual reality headset or a head-mounted display.

[0054] The user can provide inputs corresponding to commands for the mobile robot 100. In such cases, the mobile device 404 can transmit a signal to the cloud computing system 406 to cause the cloud computing system 406 to transmit a command signal to the mobile robot 100. In some implementations, the mobile device 404 can present augmented reality images. In some implementations, the mobile device 404 can be a smart phone, a laptop computer, a tablet computing device, or other mobile device.

[0055] According to some examples discussed herein, the mobile device 404 can include a user interface configured to display a map of the robot environment. A robot path, such as that identified by a coverage planner, can also be displayed on the map. The interface can receive a user instruction to modify the environment map, such as by adding, removing, or otherwise modifying a keep-out zone in the environment; adding, removing, or otherwise modifying a focused cleaning zone in the environment (such as an area that requires repeated cleaning); restricting a robot traversal direction or traversal pattern in a portion of the environment; or adding or changing a cleaning rank, among others.

[0056] In some examples, the communication network 410 can include additional nodes. For example, nodes of the communication network 410 can include additional robots. Also, nodes of the communication network 410 can include network-connected devices that can generate information about the environment 40. Such a network-connected device can include one or more sensors, such as an acoustic sensor, an image capture system, or other sensor generating signals, to detect characteristics of the environment 40 from which features can be extracted. Network-connected devices can also include home cameras, smart sensors, or the like.

[0057] In the communication network 410, the wireless links can utilize various communication schemes, protocols, etc., such as, for example, Bluetooth classes, Wi-Fi, Bluetooth-low-energy, also known as BLE, 802.15.4, Worldwide Interoperability for Microwave Access (WiMAX), an infrared channel, satellite band, or the like. In some examples, wireless links can include any cellular network standards used to communicate among mobile devices, including, but not limited to, standards that qualify as 1G, 2G, 3G, 4G, 5G, 6G, or the like. The network standards, if utilized, qualify as, for example, one or more generations of mobile telecommunication standards by fulfilling a specification or standards such as the specifications maintained by the International Telecommunication Union. For example, the 4G standards can correspond to the International Mobile Telecommunications Advanced (IMT-Advanced) specification. Examples of cellular network standards include AMPS, GSM, GPRS, UMTS, LTE, LTE Advanced, Mobile WiMAX, and WiMAX-Advanced. Cellular network standards can use various channel access methods, e.g., FDMA, TDMA, CDMA, or SDMA.

Suction Guide Examples

[0058] FIG. 5A illustrates a bottom perspective view of a mobile cleaning robot 500. FIG. 5B illustrates a bottom perspective view of the mobile cleaning robot 500. FIGS. 5A and 5B are discussed together below. The mobile cleaning robot 500 can be similar to the robots discussed above. The mobile cleaning robot 500 can include guides for varying suction in the mobile cleaning robot 500, such as by directing or focusing suction. Any of the robots discussed above or below can include the features of the mobile cleaning robot 500.

[0059] The mobile cleaning robot 500 can include a body 502 that can be movable within an environment and can define or include a suction duct 348. The mobile cleaning robot 500 can also include a cleaning assembly 504 connected to the body 502. The cleaning assembly 504 can be configured to at least partially define, together with a floor surface of the environment, a debris port 546 connected to the suction duct (e.g., 348). The cleaning assembly 504 can also include a cleaning guide assembly 548 adjustable to alter suction through the debris port.

[0060] More specifically, the guide assembly 548 can include debris guides 550a-550n and flaps 552a-552n located between the debris guides 550. The guide assembly 548 can include 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 15, 20, or the like debris guides 550. The debris guides 550 can each, or collectively, be connected to the body, such as forward of the cleaning assembly 204. The debris guides 550 can extend downward from the body 502 to at least partially define, together with the floor surface and the cleaning assembly, the debris port 546.

[0061] The flaps 552a-552n can be located adjacent the debris guides 550a-550n, and each flap 552 can optionally extend between adjacent debris guides to at least partially define, together with the floor surface and the debris guides, the debris port 546. The flaps 552 can each be rigid, semi-rigid, or flexible and can be made of one or more of metals, plastics, foams, elastomers, ceramics, composites, combinations thereof, or the like.

[0062] As shown in FIG. 5B, the flaps 552a-552n can be connected to and supported by a body 554. The body 554 can be connectable to the cleaning assembly 504 to secure the flaps to the cleaning assembly 504. The body 554 can be secured using a snap (e.g., interference) interface, one or more fasteners, adhesive, or the like. Optionally, the flaps 552 can be retractable, as discussed in further detail below.

[0063] When the flaps 552a-552n and the body 554 are removed (or retracted, as discussed in embodiments below) from the cleaning assembly 504, the debris guides 550a-550n can form gaps 556a-556n therebetween. The gaps 556a-556n can allow relatively large debris to pass into the debris port 546 such as to be extracted by the rollers (e.g., rollers 205). When the flaps 552a-552n and body 554 are inserted into the cleaning assembly 504, the gaps 556a-556n can be eliminated or can be reduced in size. This reduction or elimination of the gaps 556 can alter the geometry of the debris port 546 such as to increase suction pressure through the debris port 546, which can more effectively capture or ingest small or fine debris.
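The effect described above, where a smaller debris-port cross-section produces higher air velocity at the port for a given blower flow rate, follows from the continuity relation v = Q / A. The sketch below works a hypothetical example; the flow rate and port areas are illustrative values, not figures from the application.

```python
def port_air_velocity_m_s(flow_rate_m3_s: float, port_area_m2: float) -> float:
    """Air velocity at the debris port for a given volumetric flow (v = Q / A)."""
    return flow_rate_m3_s / port_area_m2

FLOW = 0.01  # hypothetical blower flow rate, m^3/s
open_port = port_air_velocity_m_s(FLOW, 0.0020)    # gaps open: larger port area
sealed_port = port_air_velocity_m_s(FLOW, 0.0008)  # flaps installed: smaller port area
print(open_port, sealed_port)  # 5.0 m/s vs 12.5 m/s at the port
```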

[0064] Because the guide assembly 548 can be modified by the user, the mobile cleaning robot 500 can be modified by the user (or by the factory) for improved cleaning efficiency or effectiveness depending on the environment in which the robot 500 operates. For example, users with mostly hard floor surfaces may install the body 554 and the flaps 552 to focus on extraction of fine debris from hard surfaces. Conversely, users with a lot of large debris may remove the flaps 552 and body 554 to increase cleaning efficiency for large debris. Optionally, users with high-pile carpeting may install the body 554 and the flaps 552 to increase cleaning performance in such environments.

[0065] FIG. 6 illustrates a bottom perspective view of a mobile cleaning robot 600. The mobile cleaning robot 600 can be similar to the robots discussed above. The mobile cleaning robot 600 can include rear and side guides for varying suction in the mobile cleaning robot 600, such as by directing or focusing suction. Any of the robots discussed above or below can include the features of the mobile cleaning robot 600.

[0066] The mobile cleaning robot 600 can include a guide assembly 648 that can be included in addition to the guide assembly 548 discussed above. The guide assembly 648 can include a body 654 that can be user-securable to a portion of a cleaning assembly 604, such as at or around side or rear perimeters of the cleaning assembly 604. The body 654 can include bristles 652 (or fibrous members) extending from the body and configured to extend downward relative to a body 602 of the mobile cleaning robot 600 when the guide assembly 648 is secured to the cleaning assembly 604. The bristles 652 (and the guide assembly 648) can increase suction through the suction port 646 when the guide assembly 648 is in the extended position (or connected configuration) and can decrease suction through the suction port 646 when the guide is in the retracted position (or disconnected configuration). In this way, the guide assembly 648 can also allow a user to optimize or improve cleaning efficiency for their environment. Optionally, as discussed below, the guide 648 can be movable between an extended position and a retracted position.

[0067] FIG. 7A illustrates a bottom perspective view of a mobile cleaning robot 700. FIG. 7B illustrates a bottom perspective view of a cleaning assembly 704 of a mobile cleaning robot. FIGS. 7A and 7B are discussed together below. The mobile cleaning robot 700 can include a retractable guide system for directing or focusing suction in the mobile cleaning robot 700. Any of the robots discussed above or below can include the features of the mobile cleaning robot 700. The robot 700 can be a round robot. Any of the robots discussed above or below can be round or can have other shapes and each or any can include any of the guide assemblies or systems discussed above or below.

[0068] The mobile cleaning robot 700 can include a body 702 configured to receive and retain a cleaning assembly 704 therein or thereon, including a frame 756. The cleaning assembly 704 can be similar to those discussed above, but can include a retractable guide system 748. The retractable guide system 748 and the cleaning assembly 704 (together with the floor surface) can define a suction port 746.

[0069] The retractable guide system 748 can also include a flap or seal 752 that can be affixed or mounted to the frame 756. The flap 752 can extend between debris guides 750a-750n of the cleaning assembly 704 (similar to that of the flaps 552 discussed above). The seal 752 can be a single piece as shown in FIG. 7B or can be multiple pieces (similar to the flaps 552).

[0070] The retractable guide system 748 can also include a drive system 758 that can be operable to move the seal 752 between an extended position and a retracted position. The drive system 758 can include an actuator 760 and a motor 762 each of which can be secured or mounted to the frame 756. The motor 762 can be any AC or DC motor including rotary or linear motors. Optionally, multiple motors can be used. The actuator 760 can be any device configured to translate movement of the motor 762 into movement of the seal 752. Optionally, the actuator 760 and the motor 762 can be a single component, such as a linear actuator. One or more of the actuator 760 and the motor 762 can be connected to a controller (e.g., the controller 112).

[0071] The motor 762 can be connected to the actuator 760 such that operation of the motor 762 can drive or operate the actuator 760 to operate (e.g., move) the seal 752 relative to the frame 756 and the debris guides 750a-750n between an extended position (shown in FIG. 7B) and a retracted position (shown in FIG. 7A). The flap 752 can focus suction through the suction port 746 when in the extended position and can decrease suction through the suction port 746 when in the retracted position to allow for large debris flow into the suction port.

[0072] In operation of some examples, as discussed in further detail below, a controller (e.g., the controller 112) can operate the motor 762 to move the flap 752 between the extended position and the retracted position based on the environment of the robot 700 or other variables to increase cleaning effectiveness or efficiency. For example, the controller 112 can operate the motor 762 to move the seal 752 based on one or more of a detected flooring type, a change in flooring type, a determined location of the robot, detected obstacles, detected debris type, detected debris volume, speed of the robot, or other variables. Various control methods are discussed in further detail below.

[0073] Optionally, a height or extension of the flap 752 can be adjustable. For example, depending on operating conditions, the controller 112 can extend the seal 752 downward between 0% and 100% of possible extension of the flap or seal 752. For example, 100% extension can be a distance between the bottom of the body 702 of the robot and a floor surface and 10% can be a semi-retracted state. The controller can move the seal 752 to any position between 0% and 100% based on conditions of the environment (e.g., object detection, floor type, speed, selected modes, etc.).
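A minimal sketch of the proportional extension described above follows, assuming the commanded extension is expressed as a percentage of the gap between the bottom of the body and the floor. The gap value and function name are hypothetical.

```python
def flap_extension_mm(extension_pct: float, body_to_floor_gap_mm: float) -> float:
    """Map a commanded extension (0-100 %) to a physical drop below the robot body."""
    pct = max(0.0, min(100.0, extension_pct))  # clamp to the valid range
    return body_to_floor_gap_mm * pct / 100.0

# Example: 100 % reaches the floor (8 mm gap assumed); 10 % is a semi-retracted state.
print(flap_extension_mm(100, 8.0), flap_extension_mm(10, 8.0))  # 8.0 mm, 0.8 mm
```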

[0074] FIG. 8 illustrates a partially exploded bottom perspective view of a cleaning assembly 804 of a mobile cleaning robot. The cleaning assembly 804 can be similar to any of the cleaning assemblies discussed above or below. The cleaning assembly 804 can include a retractable guide system for varying suction in a mobile cleaning robot, such as directing or focusing suction. Any of the cleaning assemblies discussed above or below can include the features of the cleaning assembly 804.

[0075] The cleaning assembly 804 can include a guide assembly 848 that can include a body 854 and bristles 852. The bristles can extend from the body, such as downward therefrom, when the guide assembly 848 is secured to the cleaning assembly 804 and the robot (e.g., the robot 100). The guide assembly 848 can also include an actuator 860 and motor 862. The motor 862 can be similar to the motor 762 discussed above. The actuator 860 can be similar to the actuator 760 discussed above, but can be a different type of actuator. For example, the actuator 860 can include a rack 864 and pinion 866. The rack 864 can be connected to the body 854 and the pinion 866 can be connected to a shaft of the motor 862. The guide assembly 848 can also include a driver 868, which can be a cable or other member extending around at least a portion of a perimeter of the body 854. The driver 868 can also be connected to the rack 864.

[0076] The body 854 can include one or more slots 870a-870n extending at least partially into (or through) the body 854. A cam 872 can be located within each slot 870 and can be connected to the frame 856 of the cleaning assembly 804. The cams 872 can each be movable within or along their slot 870. The driver 868 can be connected to each of the cams 872. The entire guide assembly 848 can be connected to, or positioned within, the frame 856 such that operation of the guide assembly 848 causes adjustment of the bristles 852 relative to the suction duct 348 and the frame 856.

[0077] In operation of some examples, as the motor 862 is operated to turn (e.g., by the controller 112), the pinion 866 can be driven to rotate while engaged with the rack 864, causing the rack 864 to translate and therefore to move the driver 868. As the driver 868 is translated, the cams 872 can be moved along their respective slots 870. For example, translation of the cams 872 in a first direction (e.g., clockwise around the perimeter of the body 854) can lower the guide assembly 848 relative to the frame 856. Translation of the cams 872 in a second direction (e.g., counterclockwise around the perimeter of the body 854) can raise the guide assembly 848 relative to the frame 856. In this way, the guide assembly 848 can be operated to raise and lower the bristles 852 to modify the suction port (e.g., the suction port 646).
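The rack-and-pinion motion described above follows the usual kinematic relation: rack travel equals the pinion pitch radius multiplied by the rotation angle in radians. The sketch below works a hypothetical example; the pinion radius is an assumed value, not a dimension from the application.

```python
import math

def rack_travel_mm(pinion_pitch_radius_mm: float, pinion_rotation_deg: float) -> float:
    """Linear travel of the rack (and hence the driver cable) for a given pinion rotation."""
    return pinion_pitch_radius_mm * math.radians(pinion_rotation_deg)

# Example: a 6 mm pitch-radius pinion turned 90 degrees moves the rack ~9.4 mm,
# which the cams then convert into raising or lowering of the bristle guide.
print(rack_travel_mm(6.0, 90.0))  # ~9.42 mm
```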

[0078] FIG. 9 illustrates a schematic view of a method 900, in accordance with at least one example of this disclosure. The method 900 can be a method of varying or controlling suction through a debris port of a cleaning assembly of a mobile cleaning robot. More specific examples of the method 900 are discussed below. The steps or operations of the method 900 are illustrated in a particular order for convenience and clarity; many of the discussed operations can be performed in a different sequence or in parallel without materially impacting other operations. The method 900 as discussed includes operations performed by multiple different actors, devices, and/or systems. It is understood that any subset of the operations discussed in the method 900 that can be attributed to a single actor, device, or system could be considered a separate standalone process or method.

[0079] At step 902, a flooring type of a floor surface of an environment can be detected. For example, a flooring type of the floor surface 50 of the environment 40 can be detected using one or more sensors (e.g., the image capture device 140). The controller 112 can use one or more algorithms to make such a detection, such as visual scene understanding (VSU), obstacle detection or obstacle avoidance (ODOA), or the like. Optionally, the controller 112 can determine flooring type or pile direction based on current draw of the vacuum system 118. For example, when the robot 100 traverses the floor in a second direction in which the current draw is higher than in a first direction, the controller 112 can determine that the floor type is carpeted and can determine the direction of the pile.
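One way to picture the current-draw heuristic described above is sketched below. The two-direction comparison and the threshold are hypothetical, hedged illustrations of the idea rather than the application's algorithm.

```python
def classify_floor(current_dir_a_amps: float, current_dir_b_amps: float,
                   ratio_threshold: float = 1.15) -> str:
    """Guess the floor type from vacuum-motor current draw in two drive directions.

    A noticeably higher draw in one direction suggests carpet pile resisting
    airflow and roller motion in that direction; a symmetric draw suggests a
    hard, uniform floor.
    """
    hi = max(current_dir_a_amps, current_dir_b_amps)
    lo = min(current_dir_a_amps, current_dir_b_amps)
    if lo <= 0.0:
        return "indeterminate"
    if hi / lo >= ratio_threshold:
        return "carpet (pile opposing the higher-draw direction)"
    return "hard floor or uniform surface"

print(classify_floor(1.8, 1.4))  # carpet (pile opposing the higher-draw direction)
print(classify_floor(1.5, 1.5))  # hard floor or uniform surface
```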

[0080] At step 904, a location of the mobile cleaning robot within the environment can be determined. For example, a location of the robot 100 can be detected using one or more sensors (e.g., the image capture device 140). The controller 112 can use one or more algorithms to make a location determination of the robot 100, such as SLAM, VSLAM, or the like.

[0081] At step 906, a position of a guide of a cleaning assembly can be adjusted to adjust a debris port defined, at least in part, by the guide and the floor surface. For example, a position of the guide 548 (or 648) can be adjusted to adjust the debris port 546, such as by the user, based on the floor surface type. Optionally, at step 906, the adjustment can be made by a controller (e.g., 112) to adjust the retractable guide system 748 (or guide assembly 848). The adjustment by the controller 112 can be based on one or more of the determined or detected location of the robot and the determined or detected flooring type.
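A minimal sketch of the kind of guide-adjustment decision described in step 906 follows, assuming the controller maps floor type and room identity to an extension percentage. The policy, names, and thresholds are hypothetical, not the application's logic.

```python
def guide_extension_for(floor_type: str, room_id: str,
                        fine_debris_rooms: set[str]) -> int:
    """Pick a guide extension (0-100 %) from floor type and robot location.

    Hypothetical policy: extend fully on hard floors to focus suction on fine
    debris, retract on high-pile carpet to avoid drag, and extend partially in
    rooms flagged for fine-debris cleaning.
    """
    if floor_type == "hard":
        return 100
    if floor_type == "high_pile_carpet":
        return 0
    return 60 if room_id in fine_debris_rooms else 30

print(guide_extension_for("hard", "kitchen", {"kitchen"}))        # 100
print(guide_extension_for("low_pile_carpet", "bedroom", set()))   # 30
```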

[0082] At step 908, debris on or from the floor surface of the environment can be detected. For example, debris 75 of the floor surface 50 of the environment 40 can be detected using one or more sensors (e.g., the image capture device 140). The controller 112 can use one or more algorithms using information or data from the sensors to make such a detection, such as visual scene understanding (VSU), obstacle detection or obstacle avoidance (ODOA), or the like. Optionally, the controller 112 can make the detection using a dirt sensor (e.g., the one or more dirt sensors 145).

[0083] At step 910, a debris type of the detected debris can be determined. For example, the controller 112 can determine a size or shape of the debris based on information collected from the one or more dirt sensors 145. The controller 112 can also determine in which rooms more or less debris is detected. Optionally, the position of the guide can be adjusted based on the detected debris type. For example, the controller 112 can adjust the retractable guide system 748 (or guide assembly 848) based on the amount or type of debris. Such adjustments can be made continuously through a cleaning mission. For example, when a large amount of hair (e.g., pet hair) within an environment is detected, the controller 112 can increase suction by lowering the guide (e.g., of retractable guide system 748) to help increase hair ingestion. Optionally, a user can select a mode (e.g., through a user interface), such as a pet hair mode, where selection of such a mode can increase suction through lowering the guide where possible.
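The debris-based adjustment described in step 910 can be pictured as a simple rule set, as in the sketch below. The particle-size threshold, hair handling, and mode flag are hypothetical illustrations, not the application's classification logic.

```python
def guide_position_for_debris(mean_particle_mm: float, hair_detected: bool,
                              pet_hair_mode: bool) -> str:
    """Choose an extended or retracted guide from detected debris characteristics.

    Hypothetical rules: large particles need the gaps open (guide retracted),
    while fine debris or hair benefits from the focused suction of an extended guide.
    """
    if pet_hair_mode or hair_detected:
        return "extended"     # lower the guide to increase suction for hair ingestion
    if mean_particle_mm > 3.0:
        return "retracted"    # keep the gaps open so large debris can enter the port
    return "extended"

print(guide_position_for_debris(5.0, hair_detected=False, pet_hair_mode=False))  # retracted
print(guide_position_for_debris(0.5, hair_detected=True, pet_hair_mode=False))   # extended
```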

[0084] At step 912, a rotational speed of one or more rollers of the cleaning assembly can be adjusted based on the position of the guide. For example, rotational speed of the rollers 205 can be adjusted based on a position of the retractable guide system 748 (or guide assembly 848), such as to optimize cleaning efficiency or effectiveness during a cleaning mission. For example, when a large amount of hair (e.g., pet hair) within an environment is detected, the controller 112 can decrease a speed of the rollers (optionally along with increasing suction by lowering the guide (e.g., retractable guide system 748)) to help increase hair ingestion. Optionally, a user can select a mode (e.g., through a user interface), such as a pet hair mode, where selection of such a mode can reduce roller speed where possible.

[0085] At step 914, a drive speed of the robot can be determined. For example, the controller 112 (such as using information from the image capture device 140 or other sensors) can perform odometry calculations (e.g., visual odometry (VO)) to determine a speed of the robot 100. At step 916, the drive speed of the mobile cleaning robot can be adjusted, such as based on the determined drive speed and the position of the guide. For example, the controller 112 can adjust a speed of the robot 100 or a position of the retractable guide system 748 (or guide assembly 848) based on the speed or the position of the guide, such as to ensure debris is captured. In another example, when the controller 112 detects that the robot 100 is not moving as fast as it is supposed to move based on drive wheel output, the controller 112 may determine that ingestion of carpeting may be reducing the speed of the robot more than desired. In such a case, the controller 112 can raise (or otherwise adjust) the retractable guide system 748 to improve robot mobility.
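The speed check described in steps 914 and 916 can be pictured as comparing the commanded speed with the measured speed and raising the guide when the shortfall is large. The sketch below is a hedged illustration with a hypothetical slowdown threshold, not the application's control law.

```python
def mobility_check(commanded_speed_m_s: float, measured_speed_m_s: float,
                   slowdown_ratio: float = 0.8) -> str:
    """Decide whether drag on the guide is likely slowing the robot.

    If the measured speed (e.g., from visual odometry) falls well below the
    commanded wheel speed, raise the guide to restore mobility; otherwise keep it.
    """
    if commanded_speed_m_s > 0 and measured_speed_m_s / commanded_speed_m_s < slowdown_ratio:
        return "raise guide"
    return "keep guide position"

print(mobility_check(0.30, 0.20))  # raise guide (only ~67 % of commanded speed)
print(mobility_check(0.30, 0.29))  # keep guide position
```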

[0086] In some examples, when a large amount of hair (e.g., pet hair) within an environment is detected, the controller 112 can decrease a speed of the robot (optionally along with increasing suction by lowering the guide (e.g., retractable guide system 748)) to help increase hair ingestion. Optionally, a user can select a mode (e.g., through a user interface), such as a pet hair mode, where selection of such a mode can reduce robot speed where possible.

[0087] At step 918, a number of passes performed in a designated space of the environment can be determined. For example, a number of passes of the robot 100 within any room or rooms 42 of the environment 40 can be calculated based on previous mission(s) or planned future mission(s). The controller 112 may receive instructions or determine that the robot 100 should pass through a room (or rooms) of the environment multiple times to perform a deep clean operation. In some instances, the retractable guide system 748 can be adjusted based on which pass is being performed. For example, in a first pass, the retractable guide system 748 can be operated to raise the guide for large debris extraction, and during a second pass the retractable guide system 748 can lower the guide to extract smaller (or fine) debris.

[0088] At step 920, a drive direction of the mobile cleaning robot can be adjusted based on the determined number of passes. For example, during a second pass the robot 100 can be navigated to traverse the environment perpendicular to the direction of the ranks or passes of the first cleaning pass. Optionally, adjusting the position of the guide can include extending the guide (e.g., the retractable guide system 748) toward the cleaning surface or retracting the retractable guide system 748 away from the cleaning surface.
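Steps 918 and 920 together suggest a simple multi-pass plan: guide raised with a first heading on the first pass, guide lowered with a perpendicular heading on later passes. The sketch below illustrates that plan with hypothetical names; it is not the application's navigation logic.

```python
def plan_pass(pass_number: int, first_pass_heading_deg: float) -> dict:
    """Choose guide position and heading for a multi-pass deep clean (illustrative).

    First pass: guide raised for large debris. Later passes: guide lowered for
    fine debris, driving perpendicular to the first pass's ranks.
    """
    if pass_number == 1:
        return {"guide": "raised", "heading_deg": first_pass_heading_deg}
    return {"guide": "lowered",
            "heading_deg": (first_pass_heading_deg + 90.0) % 360.0}

print(plan_pass(1, 0.0))  # {'guide': 'raised', 'heading_deg': 0.0}
print(plan_pass(2, 0.0))  # {'guide': 'lowered', 'heading_deg': 90.0}
```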

[0089] FIG. 10 illustrates a block diagram of an example machine 1000 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform. Examples, as described herein, may include, or may operate by, logic or a number of components, or mechanisms in the machine 1000. Circuitry (e.g., processing circuitry) is a collection of circuits implemented in tangible entities of the machine 1000 that include hardware (e.g., simple circuits, gates, logic, etc.). Circuitry membership may be flexible over time. Circuitries include members that may, alone or in combination, perform specified operations when operating. In an example, hardware of the circuitry may be immutably designed to carry out a specific operation (e.g., hardwired). In an example, the hardware of the circuitry may include variably connected physical components (e.g., execution units, transistors, simple circuits, etc.) including a machine readable medium physically modified (e.g., magnetically, electrically, moveable placement of invariant massed particles, etc.) to encode instructions of the specific operation. In connecting the physical components, the underlying electrical properties of a hardware constituent are changed, for example, from an insulator to a conductor or vice versa. The instructions enable embedded hardware (e.g., the execution units or a loading mechanism) to create members of the circuitry in hardware via the variable connections to carry out portions of the specific operation when in operation. Accordingly, in an example, the machine readable medium elements are part of the circuitry or are communicatively coupled to the other components of the circuitry when the device is operating. In an example, any of the physical components may be used in more than one member of more than one circuitry. For example, under operation, execution units may be used in a first circuit of a first circuitry at one point in time and reused by a second circuit in the first circuitry, or by a third circuit in a second circuitry at a different time. Additional examples of these components with respect to the machine 1000 follow.

[0090] In alternative embodiments, the machine 1000 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 1000 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. In an example, the machine 1000 may act as a peer machine in peer-to-peer (P2P) (or other distributed) network environment. The machine 1000 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), other computer cluster configurations.

[0091] The machine (e.g., computer system) 1000 may include a hardware processor 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 1004, a static memory (e.g., memory or storage for firmware, microcode, a basic input-output system (BIOS), unified extensible firmware interface (UEFI), etc.) 1006, and mass storage 1008 (e.g., hard drive, tape drive, flash storage, or other block devices), some or all of which may communicate with each other via an interlink (e.g., bus) 1030. The machine 1000 may further include a display unit 1010, an alphanumeric input device 1012 (e.g., a keyboard), and a user interface (UI) navigation device 1014 (e.g., a mouse). In an example, the display unit 1010, input device 1012, and UI navigation device 1014 may be a touch screen display. The machine 1000 may additionally include a storage device (e.g., drive unit) 1008, a signal generation device 1018 (e.g., a speaker), a network interface device 1020, and one or more sensors 1016, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor. The machine 1000 may include an output controller 1028, such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate with or control one or more peripheral devices (e.g., a printer, card reader, etc.).

[0092] Registers of the processor 1002, the main memory 1004, the static memory 1006, or the mass storage 1008 may be, or include, a machine readable medium 1022 on which is stored one or more sets of data structures or instructions 1024 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 1024 may also reside, completely or at least partially, within any of registers of the processor 1002, the main memory 1004, the static memory 1006, or the mass storage 1008 during execution thereof by the machine 1000. In an example, one or any combination of the hardware processor 1002, the main memory 1004, the static memory 1006, or the mass storage 1008 may constitute the machine readable media 1022. While the machine readable medium 1022 is illustrated as a single medium, the term "machine readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 1024.

[0093] The term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 1000 and that cause the machine 1000 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine readable medium examples may include solid-state memories, optical media, magnetic media, and signals (e.g., radio frequency signals, other photon based signals, sound signals, etc.). In an example, a non-transitory machine readable medium comprises a machine readable medium with a plurality of particles having invariant (e.g., rest) mass, and thus is a composition of matter. Accordingly, non-transitory machine-readable media are machine readable media that do not include transitory propagating signals. Specific examples of non-transitory machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.

[0094] The instructions 1024 may be further transmitted or received over a communications network 1026 using a transmission medium via the network interface device 1020 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, IEEE 802.16 family of standards known as WiMax®), IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, among others. In an example, the network interface device 1020 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 1026. In an example, the network interface device 1020 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 1000, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software. A transmission medium is a machine readable medium.

Notes And Examples

[0095] The following non-limiting examples detail certain aspects of the present subject matter to solve the challenges and provide the benefits discussed herein, among others.

[0096] Example 1 is a mobile cleaning robot comprising: a body movable within an environment and defining a suction duct; and a cleaning assembly connected to the body and configured to at least partially define, together with a floor surface of the environment, a debris port connected to the suction duct, the cleaning assembly engageable with a surface of the environment to direct debris through the debris port, and the cleaning assembly including a guide adjustable to alter suction through the debris port.

[0097] In Example 2, the subject matter of Example 1 optionally includes a plurality of debris guides connected to the body forward of the cleaning assembly and extending downward from the body to at least partially define, together with the floor surface and the cleaning assembly, the debris port.

[0098] In Example 3, the subject matter of Example 2 optionally includes wherein the guide includes a plurality of flaps, each flap extendable between adjacent debris guides of the plurality of debris guides to at least partially define, together with the floor surface and the debris guides, the debris port.

[0099] In Example 4, the subject matter of Example 3 optionally includes wherein the guide is movable between an extended position and a retracted position.

[00100] In Example 5, the subject matter of Example 4 optionally includes a drive assembly operable to move the guide between the extended position and the retracted position.

[00101] In Example 6, the subject matter of any one or more of Examples 3-5 optionally include wherein the guide is releasably connectable to the cleaning assembly to alter the debris port.

[00102] In Example 7, the subject matter of any one or more of Examples 1-6 optionally include wherein the guide is a bristle guide connected to a rear portion of the cleaning assembly.

[00103] In Example 8, the subject matter of Example 7 optionally includes wherein the bristle guide extends along lateral sides of the cleaning assembly from the rear portion.

[00104] In Example 9, the subject matter of any one or more of Examples 7-8 optionally include wherein the guide is movable between an extended position and a retracted position.

[00105] In Example 10, the subject matter of Example 9 optionally includes a drive assembly operable to move the guide between the extended position and the retracted position.

[00106] In Example 11, the subject matter of Example 10 optionally includes wherein the guide increases suction through the suction port when the guide is in the extended position and decreases suction through the suction port when the guide is in the retracted position.

[00107] In Example 12, the subject matter of any one or more of Examples 7-11 optionally include wherein the guide is releasably connectable to the cleaning assembly to alter the debris port.

[00108] Example 13 is a method of operating a mobile cleaning robot, the method comprising: determining a floor type of a floor surface of an environment; determining a location of the mobile cleaning robot within the environment; and adjusting a position of a guide of a cleaning assembly of the mobile cleaning robot to adjust a debris port defined, at least in part, by the guide and the floor surface.

[00109] In Example 14, the subject matter of Example 13 optionally includes wherein adjusting the position of the guide includes extending the guide toward the cleaning surface or retracting the guide away from the cleaning surface.

[00110] In Example 15, the subject matter of any one or more of Examples 13-14 optionally include detecting debris on the floor surface of the environment; determining a debris type of the detected debris; and adjusting the position of the guide based on the debris type.

[00111] In Example 16, the subject matter of any one or more of Examples 13-15 optionally include adjusting a rotational speed of one or more rollers of the cleaning assembly based on the position of the guide.

[00112] In Example 17, the subject matter of any one or more of Examples 13-16 optionally include determining a drive speed of the robot; and adjusting the drive speed of the mobile cleaning robot based on the determined drive speed and the position of the guide.

[00113] In Example 18, the subject matter of any one or more of Examples 13-17 optionally include determining a number of passes performed in a designated space of the environment; and adjusting a drive direction of the mobile cleaning robot based on the determined number of passes.

[00114] Example 19 is a non-transitory machine-readable medium including instructions for operating a mobile cleaning robot which, when executed by a machine, cause the machine to: determine a floor type of a floor surface of an environment; determine a location of the mobile cleaning robot within the environment; and adjust a position of a guide of a cleaning assembly of the mobile cleaning robot to adjust a debris port defined, at least in part, by the guide and the floor surface.

[00115] In Example 20, the subject matter of Example 19 optionally includes wherein the instructions further cause the machine to: detect debris on the floor surface of the environment; determine a debris type of the detected debris; and adjust the position of the guide based on the debris type.

[00116] In Example 21, the subject matter of any one or more of Examples 19-20 optionally include wherein adjusting the position of the guide includes extending the guide toward the cleaning surface or retracting the guide away from the cleaning surface.

[00117] In Example 22, the apparatuses or method of any one or any combination of Examples 1-21 can optionally be configured such that all elements or options recited are available to use or select from.

[00118] The above detailed description includes references to the accompanying drawings, which form a part of the detailed description. The drawings show, by way of illustration, specific embodiments in which the invention can be practiced. These embodiments are also referred to herein as “examples.” Such examples can include elements in addition to those shown or described. However, the present inventors also contemplate examples in which only those elements shown or described are provided. Moreover, the present inventors also contemplate examples using any combination or permutation of those elements shown or described (or one or more aspects thereof), either with respect to a particular example (or one or more aspects thereof), or with respect to other examples (or one or more aspects thereof) shown or described herein.

[00119] In the event of inconsistent usages between this document and any documents so incorporated by reference, the usage in this document controls.

[00120] In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of “at least one” or “one or more.” In this document, the term “or” is used to refer to a nonexclusive or, such that “A or B” includes “A but not B,” “B but not A,” and “A and B,” unless otherwise indicated. In this document, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended, that is, a system, device, article, composition, formulation, or process that includes elements in addition to those listed after such a term in a claim are still deemed to fall within the scope of that claim. Moreover, in the following claims, the terms “first,” “second,” and “third,” etc. are used merely as labels, and are not intended to impose numerical requirements on their objects.

[00121] The above description is intended to be illustrative, and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with each other. Other embodiments can be used, such as by one of ordinary skill in the art upon reviewing the above description. The Abstract is provided to comply with 37 C.F.R. §1.72(b), to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Also, in the above Detailed Description, various features may be grouped together to streamline the disclosure. This should not be interpreted as intending that an unclaimed disclosed feature is essential to any claim. Rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that such embodiments can be combined with each other in various combinations or permutations. The scope of the invention should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.