


Title:
METHOD AND DEVICE FOR CREATING A ROBOT CONTROL PROGRAM
Document Type and Number:
WIPO Patent Application WO/2020/070287
Kind Code:
A1
Abstract:
The invention relates to a method for creating a robot control program for operating a machine tool (110, 148), in particular a bending machine, having the steps: - generating (500) image material of a machining operation of a workpiece (116, 146, 200, 300, 400) on the machine tool (110, 148) by means of at least one optical sensor; - extracting (520) at least one part of the workpiece (116, 146, 200, 300, 400) and/or at least one part of a hand (202) of an operator (118) handling the workpiece (116, 146, 200, 300, 400) from the image material; - generating (530) a trajectory and/or a sequence of movement points of at least one part of the workpiece (116, 146, 200, 300, 400) and/or at least one part of a hand (202) of an operator (118) from the extracted image material; and - creating (550) a robot control program by reverse transformation of the trajectory and/or the sequence of movement points.

Inventors:
SEIBERT FRIEDEMANN (DE)
Application Number:
PCT/EP2019/076913
Publication Date:
April 09, 2020
Filing Date:
October 04, 2019
Assignee:
BYSTRONIC LASER AG (CH)
International Classes:
B25J9/16; G05B19/42
Foreign References:
DE 112016006116 T5 (2018-09-13)
Other References:
JENS LAMBRECHT ET AL: "Spatial Programming for Industrial Robots Through Task Demonstration", INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, vol. 10, no. 254, 23 May 2013 (2013-05-23), AT, pages 1 - 10, XP055302454, ISSN: 1729-8806, DOI: 10.5772/55640
Attorney, Agent or Firm:
NIEPELT PATENTANWALTSGESELLSCHAFT MBH (DE)
Claims

1. A method for creating a robot control program for operating a machine tool (110, 148), in particular a bending machine, having the steps:

- generating (500) image material of a machining operation of a workpiece (116, 146, 200, 300, 400) on the machine tool (110, 148) by means of at least one optical sensor;

- extracting (520) at least one part of the workpiece (116, 146, 200, 300, 400) and/or at least one part of a hand (202) of an operator (118) handling the workpiece (116, 146, 200, 300, 400) from the image material;

- generating (530) a trajectory and/or a sequence of movement points of at least one part of the workpiece (116, 146, 200, 300, 400) and/or at least one part of a hand (202) of an operator (118) from the extracted image material; and

- creating (550) a robot control program by reverse transformation of the trajectory and/or the sequence of movement points.

2. The method according to claim 1, characterised in that during the extraction the spatial position of the workpiece (116, 146, 200, 300, 400), at least the spatial position of a gripping region of the workpiece (116, 146, 200, 300, 400) and/or the spatial position of the hand (202) is determined.

3. The method according to one of the preceding claims, characterised in that the trajectory and/or the sequence of movement points is present in machine coordinates and that, before the reverse transformation, a transformation into robot coordinates occurs.

4. The method according to one of the preceding claims, characterised in that finger positions of the operator (118) are detected and used to control gripping tools of the robot (144).

5. The method according to one of the preceding claims, characterised in that the image material is divided into handling sections, in which the hand (202) is in contact with the workpiece (116, 146, 200, 300, 400), and into machining sections, in which the hand (202) is not in contact with the workpiece (116, 146, 200, 300, 400), and that gripping tools of the robot (144) are controlled accordingly at the transitions between the sections.

6. The method according to one of the preceding claims, characterised in that, for the extraction, a machining plan, in particular a bending plan and/or a 3D object of the workpiece (116, 146, 200, 300, 400) is used for an image comparison.

7. The method according to claim 6, characterised in that, after completion of a machining step, in particular a bending operation, the shape of the 3D object is updated accordingly.

8. The method according to one of the preceding claims, characterised in that the trajectory and/or the sequence of movement points are saved and that the saved trajectory and/or sequence of movement points is transmitted to at least one robot controller for creating a robot control program.

9. The method according to one of the preceding claims, characterised in that the image material is generated by means of a camera (120) and that a spatial position of the workpiece (116, 146, 200, 300, 400) is determined on the basis of at least two optical features, such as edges (200a; 300a; 400a, 400b), corners, surfaces and combinations thereof.

10. The method according to one of claims 1 to 8, characterised in that the image material is generated by means of two cameras and that a spatial position of the workpiece (116, 146, 200, 300, 400) is determined on the basis of at least one optical feature, such as an edge (200a; 300a; 400a, 400b), corner or surface.

11. A device (100) for creating a robot control program for operating a machine tool (110, 148), in particular a bending machine, comprising

- a machine tool (110, 148) configured for machining a workpiece (116, 146, 200, 300, 400);

- at least one optical sensor configured for generating image material of a machining operation of a workpiece (116, 146, 200, 300, 400) on the machine tool (110, 148);

- at least one computing unit (130, 142) configured for extracting at least one part of the workpiece (116, 146, 200, 300, 400) and/or at least one part of a hand (202) of an operator (118) handling the workpiece (116, 146, 200, 300, 400) from the image material, for generating a trajectory and/or a sequence of movement points from the extracted image material and for creating a robot control program by reverse transformation of the trajectory and/or the sequence of movement points.

12. The device (100) according to claim 11, characterised in that the computing unit (130, 142) is connected to a machine tool (110, 148) and the optical sensor is configured for extracting at least one part of the workpiece (116, 146, 200, 300, 400) and/or at least one part of a hand (202) of an operator (118) handling the workpiece (116, 146, 200, 300, 400) from the image material and for generating a trajectory and/or a sequence of movement points from the extracted image material and that a robot controller is configured for creating a robot control program by reverse transformation of the trajectory and/or the sequence of movement points.

13. The device (100) according to claim 11 or 12, characterised in that the computing unit (130, 142) is configured for dividing the image material into handling sections, in which the hand (202) is in contact with the workpiece (116, 146, 200, 300, 400), and into machining sections, in which the hand (202) is not in contact with the workpiece (116, 146, 200, 300, 400), and that gripping tools of the robot (144) are controlled accordingly at the transitions between the sections.

14. The device (100) according to one of claims 11 to 13, characterised in that, for the extracting, the computing unit (130, 142) is configured for using a machining plan, in particular a bending plan and/or a 3D object of the workpiece (116, 146, 200, 300, 400) for an image comparison.

15. The device (100) according to claim 14, characterised in that the computing unit (130, 142) is configured for updating, after completion of a machining step, in particular a bending operation, the shape of the 3D object accordingly.

16. The device (100) according to one of claims 11 to 15, comprising a camera (120) configured for generating the image material, wherein the computing unit (130, 142) is configured for determining a spatial position of the workpiece (116, 146, 200, 300, 400) on the basis of at least two optical features, such as edges (200a; 300a; 400a, 400b), corners, surfaces and combinations thereof.

Description

Method and device for creating a robot control program

The invention relates to a method and device for creating a robot control program for operating a machine tool. In particular, the invention relates to a method for creating a robot control program according to claim 1 and a device for creating a robot control program for operating a machine tool according to claim 11.

A machine tool is used to manufacture and machine workpieces using tools. For example, sheet metal working machines, in particular bending machines or presses such as press brakes or laser cutting machines are considered to be machine tools here.

Workpieces are increasingly inserted into or removed from machine tools by robots. As a result, the step of robot control programming, that is the creation of a program for controlling the movement of the robot, must be additionally incorporated into the set-up process of a machine tool.

Robots are typically programmed using operating consoles or portable control consoles. In this case the operator is often in the danger zone of the robot and can therefore only move it at creep speed. In some cases, robots are also "taught" by manual guidance, but here, too, the operator stands next to the robot, so dangers can arise. In addition, the programming is time-consuming.

The aim of the invention is to avoid the disadvantages of the prior art and to provide improved robot control programming for a machine tool. Alternative aims are to provide an improved method for creating a robot control program or an improved device for creating a robot control program.

This aim is achieved by a method according to claim 1 or a device according to claim 11.

The method according to the invention for creating a robot control program for operating a machine tool, in particular a bending machine, comprises the steps:

- generating image material of a machining operation of a workpiece on the machine tool by means of at least one optical sensor; - extracting at least one part of the workpiece and/or at least one part of a hand of an operator handling the workpiece from the image material;

- generating a trajectory and/or a sequence of movement points of at least one part of the workpiece and/or at least one part of a hand of an operator from the extracted image material; and

- creating a robot control program by reverse transformation of the trajectory and/or the sequence of movement points.
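The four steps above can be sketched as a minimal processing loop. This is an illustrative sketch only; the function names (`extract`, `to_robot_coords`, `inverse_kinematics`) are hypothetical placeholders standing in for the image processing and reverse transformation the patent describes, not part of it:

```python
# Illustrative sketch of the four-step method; all names are hypothetical.

def create_robot_program(frames, extract, to_robot_coords, inverse_kinematics):
    """Turn recorded frames of a manual machining operation into a
    robot control program, here a list of joint-angle tuples."""
    program = []
    for frame in frames:                          # step 500: image material
        point = extract(frame)                    # step 520: workpiece/hand
        if point is None:                         # nothing tracked this frame
            continue
        pose = to_robot_coords(point)             # optional coordinate change
        program.append(inverse_kinematics(pose))  # step 550: reverse transform
    return program
```

The loop builds the trajectory (step 530) implicitly as the sequence of per-frame poses before converting each into joint angles.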

In contrast to known systems, the robot here references or reacts to the movement of the operator or the position of the workpiece in space. The operator can bend a sample part, and the movements of the operator or the change in position of the bending part in space are converted into a movement of the robot. The robot does not necessarily have to be present. The operator does not have to specify how the robot moves its axes; this can be calculated by the robot controller. In this case, the trajectory of the workpiece or the hand, which can correspond to a trajectory of the tool centre point of the robot, is converted into the motion sequence of the robot. The tool centre point is the attachment point of the robot for the gripper and can be seen as a counterpart to the wrist of the operator.

The method according to the invention for creating a robot control program proposes that a camera system can record the movements of an operator or of the workpiece and the movements of the operator or of the workpiece are converted by means of a computer system into direct movement of the robot. For this purpose, the arm of the operator can be divided into a plurality of sub-sections, corresponding to or correlating with the joints of the robot, and the hand can be understood to be a gripper. Each movement of the sub-sections of the arm or the hand is then performed online and at the same speed so that the operator can teach the robot from behind a safety fence or at a safe distance.

Alternatively, the operator can also program the robot via a video recording. The operator bends a part while being observed by the camera. This recording, i.e. the image material, is then converted into movements of the robot, so that a similar situation arises as if an experienced operator were showing a colleague the hand movements.

The method according to the invention for creating a robot control program has the advantage of substantially simplifying and speeding up the programming of a robot. The robot control programs can be created without any knowledge of a robot controller; the journeyman shows the apprentice once how to bend a sheet by bending a part, and the apprentice, here the robot, learns this action the first time.

It is possible, during the extraction, for the spatial position of the workpiece, at least the spatial position of a gripping region of the workpiece, and/or the spatial position of the hand to be determined. Instead of a relative reference, for example to a bending tool, a general referencing in Cartesian coordinates can be determined, which can be easier to process and to transform.

It is also possible for the trajectory and/or the sequence of movement points to be present in machine coordinates and, before the reverse transformation, for a transformation into robot coordinates to occur. Thus, the image recording environment, the camera, etc. can be measured and calibrated in the context of the machine. Instead of machine coordinates, another coordinate system, for example that of the camera, can also be used.
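Such a change from machine coordinates to robot coordinates can be sketched as a homogeneous 4x4 transform. The rotation `R` and translation `t` are assumed to come from calibrating the setup; the patent does not prescribe a particular representation:

```python
import numpy as np

def machine_to_robot(point_machine, R, t):
    """Transform a 3D point from machine coordinates into robot
    coordinates using a calibrated rigid transform (rotation R,
    translation t). A sketch; calibration itself is not shown."""
    T = np.eye(4)
    T[:3, :3] = R                       # orientation of machine frame in robot frame
    T[:3, 3] = t                        # position of machine origin in robot frame
    p = np.append(point_machine, 1.0)   # homogeneous coordinates
    return (T @ p)[:3]
```

Every movement point of the trajectory would be mapped through this transform before the reverse transformation into joint angles.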

It is possible for finger positions of the operator to be detected and used to control gripping tools of the robot. Gestures can be identified by means of which special functions of the robot can also be adopted. For example, the detected "hand on" gesture can be translated into the "gripper on" command.
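A gesture-to-command translation of this kind can be as simple as a lookup table. Only the "hand on" to "gripper on" pair comes from the text above; the other entries and all names are hypothetical:

```python
# Hypothetical mapping from recognised hand gestures to gripper commands.
GESTURE_TO_COMMAND = {
    "hand_on":     "gripper_on",     # the example given in the description
    "hand_open":   "gripper_open",   # assumed additional gestures
    "hand_closed": "gripper_close",
}

def gripper_command(gesture, default="hold"):
    """Translate a recognised gesture into a robot gripper command,
    falling back to a neutral command for unknown gestures."""
    return GESTURE_TO_COMMAND.get(gesture, default)
```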

It is also possible for the image material to be divided into handling sections, in which the hand is in contact with the workpiece, and into machining sections, in which the hand is not in contact with the workpiece, and for gripping tools of the robot to be controlled accordingly at the transitions between the sections. Thus, the opening state of the gripper can also be derived by means of simple image recognition.
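The division into sections can be sketched as a run-length grouping of frames by a hand-contact predicate; the predicate itself (image recognition) is assumed and not shown:

```python
def split_into_sections(frames, hand_touches_workpiece):
    """Group consecutive frames into (is_handling, frames) sections.
    A transition between sections is where a gripper command would
    be issued (close when handling starts, open when it ends)."""
    sections = []
    for frame in frames:
        touching = hand_touches_workpiece(frame)
        if sections and sections[-1][0] == touching:
            sections[-1][1].append(frame)      # same section continues
        else:
            sections.append((touching, [frame]))  # transition: new section
    return sections
```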

For the extraction, it is possible for a machining plan, in particular a bending plan, and/or a 3D object of the workpiece to be used for an image comparison. Thus, the bending sequence of the part to be machined can be created in advance and stored in the controller. For example, the bending angle amount, the bending force and the sequence of bending steps can be adopted from the machining plan. This information can be used both for the robot controller and for image processing.

It is also possible, after completion of a machining step, in particular a bending operation, for the shape of the 3D object to be updated accordingly. The modified shape of the workpiece after the machining step is then updated by the new 3D object from the bending plan. The updated shape can be used for image recognition and also for the robot controller.

It is possible for the trajectory and/or the sequence of movement points to be saved and for the saved trajectory and/or sequence of movement points to be transmitted to at least one robot controller for creating a robot control program. This allows offline programming of one or more robots, even those from different manufacturers.

It is also possible for the image material to be generated by means of a camera and for a spatial position of the workpiece to be determined on the basis of at least two optical features, such as edges, corners, surfaces and combinations thereof. By means of two optical features, the spatial position of the workpiece can be clearly and accurately determined.

It is possible for the image material to be generated by means of two cameras and for a spatial position of the workpiece to be determined on the basis of at least one optical feature, such as an edge, corner or surface. Two cameras can be used to capture and evaluate a stereo or 3D image, so that one optical feature is sufficient for determining the spatial position of the workpiece.
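For two calibrated cameras, the depth of a single feature follows from its disparity between the two images. The sketch below assumes the common simplification of parallel, rectified cameras with known focal length and baseline; the patent does not fix a camera model:

```python
def triangulate(x_left, x_right, y, focal_length, baseline):
    """Recover the 3D point (X, Y, Z) of one feature, in the left-camera
    frame, from its pixel position in a rectified stereo pair.
    Assumes parallel cameras; units follow the calibration."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    Z = focal_length * baseline / disparity   # depth from disparity
    X = x_left * Z / focal_length             # back-project to 3D
    Y = y * Z / focal_length
    return X, Y, Z
```

With a single camera, as in the preceding paragraph, two such features would be needed instead to fix the pose.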

The device according to the invention for creating a robot control program for operating a machine tool, in particular a bending machine, comprises

- a machine tool configured for machining a workpiece;

- at least one optical sensor configured for generating image material of a machining operation of a workpiece on the machine tool;

- at least one computing unit configured for extracting at least one part of the workpiece and/or at least one part of a hand of an operator handling the workpiece from the image material, for generating a trajectory and/or a sequence of movement points from the extracted image material and for creating a robot control program by reverse transformation of the trajectory and/or the sequence of movement points.

The device is configured to carry out the method described above. The same advantages and modifications otherwise apply as described above.

It is possible for a computing unit to be connected to a machine tool and the optical sensor to be configured for extracting at least one part of the workpiece and/or at least one part of a hand of an operator handling the workpiece from the image material and for generating a trajectory and/or a sequence of movement points from the extracted image material and that a robot controller is configured for creating a robot control program by reverse transformation of the trajectory and/or the sequence of movement points. In this way, the resources can be used efficiently for programming the robot on the basis of the movement of at least one part of the workpiece and/or at least one part of a hand of an operator.

It is possible that the computing unit is configured for dividing the image material into handling sections, in which the hand is in contact with the workpiece, and into machining sections, in which the hand is not in contact with the workpiece, and that gripping tools of the robot are controlled accordingly at the transitions between the sections. Thus, the opening state of the gripper can also be derived by means of simple image recognition.

It also is possible that, for the extracting, the computing unit is configured for using a machining plan, in particular a bending plan and/or a 3D object of the workpiece for an image comparison. Thus, the bending sequence of the part to be machined can be created in advance and stored in the controller. For example, the bending angle amount, the bending force and the sequence of bending steps can be adopted from the machining plan. This information can be used both for the robot controller and for image processing.

It is possible that the computing unit is configured for updating, after completion of a machining step, in particular a bending operation, the shape of the 3D object, accordingly. The modified shape of the workpiece after the machining step is then updated by the new 3D object from the bending plan. The updated shape can be used for image recognition and also for the robot controller.

It also is possible that the device comprises a camera configured for generating the image material and that the computing unit is configured for determining a spatial position of the workpiece on the basis of at least two optical features, such as edges, corners, surfaces and combinations thereof. By means of two optical features, the spatial position of the workpiece can be clearly and accurately determined.

Further preferred embodiments of the invention arise from the remaining features mentioned in the dependent claims.

The various embodiments of the invention mentioned in this application can, unless otherwise stated in individual cases, be advantageously combined with one another.

The invention is explained below in exemplary embodiments with reference to the corresponding drawings. The following is shown in the figures:

Figure 1 is a schematic representation of a device for creating a robot control program for operating a machine tool.

Figure 2 is a schematic representation of image material representing a workpiece when inserted into the machine tool.

Figure 3 is a schematic representation of image material representing a workpiece during machining in the machine tool.

Figure 4 is a schematic representation of image material representing a workpiece during removal from the machine tool.

Figure 5 shows a flow chart of a method for creating a robot control program.

Figure 1 shows a schematic representation of a device 100 for creating a robot control program for operating a machine tool 110. The machine tool 110 is, for example, a sheet metal working machine, in particular a bending machine or press brake.

The machine tool 110 is shown in a greatly simplified sectional view. The machine tool 110 comprises an upper tool 112 and a lower tool 114. A workpiece 116, such as a thin sheet, is bent by the machine tool 110. An operator 118 inserts the workpiece 116 in the machine tool 110, where it is then bent between the upper tool 112 and the lower tool 114. If the workpiece 116 is bent several times, the workpiece 116 is removed from the machine tool 110 by the operator 118 and inserted again until the bending operation is completed.

This machining operation or bending operation is captured by an optical sensor here in the form of a camera 120. A laser scanner can also be used as an optical sensor. It is also possible to use a plurality of optical sensors for generating three-dimensional data.

The camera 120 generates image material of the workpiece 116 and/or at least the hand of the operator 118. In this case, a video stream or even individual images can be generated by the camera 120. The image material is transmitted to a computing unit 130, which can be designed as an independent unit or can be a component of the machine tool 110.

The image material is further processed in the computing unit 130 by means of image processing, which can be performed in hardware and/or software. In this case, individual coordinates or a trajectory of coordinates are generated. This generation is described in more detail with reference to the following figures.

If the coordinates or the trajectory are present in the coordinate system of the machine tool 110, the coordinates or the trajectory are transformed into a coordinate system of a robot system 140, for which the robot control program is to be created for operating a machine tool.

Accordingly, the coordinates or the trajectory are transmitted to a computing unit 142 of the robot system 140. The computing unit 142 creates the robot control program by means of a reverse transformation of the coordinates or of the trajectory. By means of this robot control program, an industrial robot 144 is then controlled, which handles the machining of a workpiece 146 in a machine tool 148. In this case, the workpiece 146 corresponds to the workpiece 116. The machine tool 148 can correspond to the machine tool 110. Another type of machine tool can also be used.

During the reverse transformation or inverse kinematics, the last link of the kinematic chain, the gripper, is moved and brought into the desired position. The arm elements of the robot, as the remaining links of the chain, must then take appropriate positions according to the degrees of freedom of their joints. This movement of the gripper or tool centre point (TCP), i.e. the coordinates or trajectory, is created by generating and processing the image material. From these Cartesian coordinates, the joint angles for the individual joints of the robot are then calculated during the reverse transformation, for example by applying transformation matrices. The robot control program is then created from a sequence of these joint angles.
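The reverse transformation can be illustrated on the simplest non-trivial case, a planar two-link arm, for which the joint angles have a closed form. This is a minimal stand-in for the general transformation-matrix computation described above, with hypothetical link lengths:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Reverse transformation for a planar two-link arm: given the tool
    centre point (x, y) and link lengths l1, l2, return the joint
    angles (theta1, theta2) for the elbow-down solution."""
    d = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    theta2 = math.acos(d)                    # elbow angle from cosine rule
    theta1 = math.atan2(y, x) - math.atan2(  # shoulder angle
        l2 * math.sin(theta2), l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

A real six-axis robot has more joints and redundancy, but the principle is the same: each Cartesian movement point of the trajectory yields one set of joint angles, and the program is the sequence of these sets.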

In this way, the industrial robot 144 can be programmed without knowledge of the programming language of the robot 144 by filming and evaluating the operator 118. In addition, offline programming is possible, during which the device 100 for recording the movement of the operator 118 or of the workpiece 116 and the robot system 140 are spatially distant from one another. Temporally speaking, a time interval between the recording and processing of the data in the device 100 and the creation of the robot control program in the robot system 140 can be provided by caching the data.

Figure 2 shows a schematic representation of image material taken by the camera 120. The image material can be a single photographic image; it can also be part of a film or video stream. The image material in figure 2 depicts a workpiece 200 being inserted into the machine tool, i.e. between the upper tool 112 and the lower tool 114. A hand 202 of the operator 118 holds the workpiece 200 at an edge 200a during insertion into the machine tool.

This edge 200a can be used, for example, as a reference edge for generating a trajectory or a sequence of movement points. Other features of the workpiece 200 can also be used for referencing the position of the workpiece 200 in space or with respect to the upper tool 112 or the lower tool 114. For example, the two end points of the edge 200a or the entire outer contour of the workpiece 200 can be used. The reference feature, here in the form of the reference edge 200a, is traced through the image material by an algorithm, such as image processing, so that spatial changes can be tracked and the changing coordinates can be determined.
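Tracing a reference edge through the image material reduces, once the edge endpoints are detected per frame, to turning them into one movement point per frame. A minimal sketch, using the edge midpoint as the tracked coordinate (an assumption; any consistent point on the edge would do):

```python
def edge_trajectory(edge_per_frame):
    """From the reference edge, given as two endpoints per frame,
    derive a trajectory of edge midpoints: one 2D movement point
    per frame of the image material."""
    trajectory = []
    for (x1, y1), (x2, y2) in edge_per_frame:
        trajectory.append(((x1 + x2) / 2, (y1 + y2) / 2))
    return trajectory
```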

For the purpose of illustration, figures 2 to 4 each show a coordinate system to which the coordinates of the workpiece or of parts of the workpiece relate. The coordinate system is not present in the image material recorded by the camera.

Likewise, the hand 202 of the operator can be used as a reference for generating a trajectory or a sequence of movement points.

Individual fingers of the operator's hand 202 can also be detected and used as inputs to control the robot's gripper. This can comprise simple commands such as gripper open or close, but also more complex gripper actions where individual gripper fingers can be controlled according to the operator's fingers.

To improve optical recognition during generation of the image material, the workpiece 200 and/or the hand 202 can be colour-coded. For this purpose, the workpiece 200 can be painted accordingly and the operator can wear a differently coloured glove. It is also possible for optical patterns to be provided on the workpiece 200 and/or on the glove, which allow accurate tracking of the corresponding object in the image material. To discriminate between the hand and the workpiece, special sensors or cameras, such as thermal cameras, could be used.

Figure 3 shows the workpiece 300 during the bending operation, in which it is clamped between the upper tool 112 and the lower tool 114. The hand of the user is therefore not on the workpiece 300 and thus not in the image material, as shown in figure 3.

In this way, the image material can be divided into handling sections, in which the hand is in contact with the workpiece (figure 2), and into machining sections, in which the hand is not in contact with the workpiece (figure 3). This can take place by image identification or image processing. Thus, the gripping tool of the robot can be controlled accordingly at the transitions between the sections. While the gripper should be closed in the section of figure 2 (which includes further images that are not shown here), it is opened in the section of figure 3 (which also includes other images not shown here).

Figures 2 to 4 merely show snapshots or individual representations in a series of images or in a continuous image sequence. Each individual representation is evaluated, in this example by determining the reference edge 300a and testing for the presence of the operator's hand. For the reference features, such as the edge 300a or the operator's hand, the coordinates and/or status (hand present, opened, etc.) are captured frame by frame. Thus, a sequence of coordinates, or a trajectory, can be generated along which the tool centre point or the gripper of the robot is to move.
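The frame-by-frame capture of coordinates and status can be sketched as a small record per frame; the detectors `find_feature` and `detect_hand` are assumed to exist (image processing, not shown):

```python
from dataclasses import dataclass

@dataclass
class MovementPoint:
    coords: tuple       # position of the reference feature in this frame
    hand_present: bool  # per-frame status, used later for gripper control

def capture_sequence(frames, find_feature, detect_hand):
    """Evaluate every frame of the image material and build the
    sequence of movement points with their status."""
    return [MovementPoint(find_feature(f), detect_hand(f)) for f in frames]
```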

In figure 3, the edge 300a of the workpiece 300 is already moved slightly upward by the bending operation. In the subsequent images (not shown here) of the image material, the edge 300a of the workpiece 300 will continue to move until a contour of the workpiece is reached, as shown in figure 4 after completion of the bending step.

Since the robot does not have to grip the workpiece 300 during the bending process, image capturing and/or image processing need not necessarily be carried out during a bending operation.

When removed as shown in figure 4, an interaction of the robot similar to the movement of the hand 202 is then necessarily required again and thus also the image capturing and image processing.

Figure 4 shows a representation of the image material with the workpiece 400 after completion of the first bending operation. The contour of the workpiece 400 is thereby changed in comparison with the unfinished state of the workpiece 200 in figure 2. The modified 3D contours can be provided from the bending plan to the image processing and also to the robot controller to improve the accuracy of the image processing, such as, for example, the extraction of the workpiece or a reference feature of the workpiece or to facilitate a check of the running program.

As shown in figure 4, the operator's hand 202 holds an edge 400b, as in this example the previous reference edge 400a is to be bent in a next bending step and therefore has to be inserted between the upper tool 112 and the lower tool 114.

Therefore, the edge 400b can become the new reference edge, for example. Alternatively and additionally to the reference edges, the hand 202 or a part of the hand 202, such as fingers or a marker (also on a glove), can be used as a reference for the coordinates of the tool centre point or the gripper.

In the image material which follows the representation of figure 4 and is no longer shown here, the operations of inserting (similar to figure 2) the workpiece, machining (similar to figure 3) the workpiece and removing (similar to figure 4) the workpiece are repeated until the machining or bending has been fully completed.

Figure 5 shows a flowchart of the method for creating the robot control program.

In a first step 500, image material of a machining operation of a workpiece on the machine tool is generated by means of at least one optical sensor.

In a second step, it is checked whether further image material should be produced, for example because the machining of the workpiece has not yet been completed. If this is the case, the method jumps back to step 500, so that a continuous generation and, optionally, recording of the image material occurs.

If the generation of image material is completed, on the other hand, the method jumps to step 520, from which the image material is processed and used. This can happen once the image material exists in full, in the manner of post-processing, or already during creation, in real time or quasi real time.

In step 520, at least one part of the workpiece, for example the reference edge, and/or at least one part of a hand of an operator handling the workpiece is extracted from the image material. In this case, corresponding matching or comparison algorithms can be used which, for example, compare the 3D object of the workpiece known from the bending plan with the image material. During the extraction, the spatial position of the workpiece, at least the spatial position of a gripping region of the workpiece, and/or the spatial position of the hand can be determined.

In step 530, a trajectory and/or a sequence of movement points of at least one part of the workpiece and/or at least one part of a hand of an operator is produced from the extracted image material. The trajectory or the movement points in each case comprise coordinates that can then be used to guide the robot.

In step 540, an optional transformation of the trajectory and/or the sequence of movement points from machine coordinates into robot coordinates occurs, if these are present in machine coordinates.

In step 550, a robot control program is created by reverse transformation of the trajectory and/or of the sequence of movement points.

In an optional step 560, this robot control program is executed by a robot. A robot can now process the workpieces accordingly. This is done on the basis of the recorded images of a human operator and the image processing, but without the classic programming of the robot.

In an optional step 570, this robot control program is executed by another robot.

Advantageously, a multiplication in the programming can be achieved in this way, which can significantly increase efficiency. A database can also be created using sequences of motion associated with a particular product which have been created once. This can be refined with even more detail to store individual bending operations of a product or workpiece. In this way, for example, programs for variants of a workpiece can be easily created in which, for example, only one edge is bent differently.
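Such a database of recorded motion sequences, refined down to individual bending operations, can be sketched as a two-level store. The class and method names are hypothetical; the variant mechanism mirrors the idea of changing only the one edge that is bent differently:

```python
# Hypothetical store of motion sequences, keyed by product and bend step.
class ProgramDatabase:
    def __init__(self):
        self._store = {}  # product -> {bend_step -> motion sequence}

    def save(self, product, bend_step, motion):
        """Record the motion sequence for one bending operation."""
        self._store.setdefault(product, {})[bend_step] = motion

    def variant(self, product, overrides):
        """Build a program for a workpiece variant by replacing only
        the bending steps that differ from the stored product."""
        steps = dict(self._store[product])
        steps.update(overrides)
        return [steps[k] for k in sorted(steps)]
```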

The method presented here for creating a robot control program for operating a machine tool allows simple and precise programming of robots without knowledge of special programming language. In addition, by learning the movements of the part to be processed and/or the hand of the operator, the degrees of freedom of the robot can be much better used and the fastest or shortest path can always be selected.