Title:
METHOD AND SYSTEM FOR REAR STATUS DETECTION
Document Type and Number:
WIPO Patent Application WO/2018/227301
Kind Code:
A1
Abstract:
A method for identifying a loading bay at a facility to which a vehicle is reversing, the method including determining, at a sensor apparatus connected with the vehicle, that the vehicle is reversing; capturing an image of the loading bay; and determining, from the image, the loading bay at the facility to which the vehicle is reversing.

Inventors:
SEAMAN CONRAD DELBERT (CA)
KUHN DEREK JOHN (CA)
WEST STEPHEN (CA)
Application Number:
PCT/CA2018/050722
Publication Date:
December 20, 2018
Filing Date:
June 14, 2018
Assignee:
BLACKBERRY LTD (CA)
International Classes:
B65G69/34; B60R11/04; B60W30/00
Foreign References:
US20150029416A12015-01-29
US20170015813A12017-01-19
EP2474944A12012-07-11
US20130011690A12013-01-10
US20100026504A12010-02-04
US5938710A1999-08-17
Other References:
See also references of EP 3638609A4
Attorney, Agent or Firm:
MOFFAT & CO. (CA)
Claims:
CLAIMS

1. A method for identifying a loading bay at a facility to which a vehicle is reversing, the method comprising:

determining, at a sensor apparatus connected with the vehicle, that the vehicle is reversing;

capturing, at the sensor apparatus, an image of the loading bay; and

determining, from the image, the loading bay at the facility to which the vehicle is reversing.

2. The method of claim 1, wherein the determining the loading bay comprises processing the image.

3. The method of claim 2, wherein the processing comprises using image recognition to identify at least one marking on the loading bay.

4. The method of claim 3, wherein the at least one marking is selected from the group consisting of letters, numbers or code.

5. The method of claim 2, wherein the processing further comprises utilizing supplementary information during the processing.

6. The method of claim 2, further comprising sending results of the processing to a server.

7. The method of claim 1, wherein the determining the loading bay comprises sending the image to a server.

8. The method of claim 7, wherein the sending the image further comprises sending supplementary information along with the image.

9. The method of claim 1, further comprising determining a location of the vehicle based on the determined loading bay.

10. A sensor apparatus connected with a vehicle configured for identifying a loading bay at a facility to which the vehicle is reversing, the sensor apparatus comprising:

a processor;

an image capture sensor; and

a communications subsystem,

wherein the sensor apparatus is configured to:

determine that the vehicle is reversing;

capture an image of the loading bay; and

determine, from the image, the loading bay at the facility to which the vehicle is reversing.

11. The sensor apparatus of claim 10, wherein the sensor apparatus is further configured to determine the loading bay by:

processing the image at the sensor apparatus; and

signaling results of the processing to a server.

12. The sensor apparatus of claim 11, wherein the sensor apparatus is further configured to process by using image recognition to identify at least one marking on the loading bay.

13. The sensor apparatus of claim 12, wherein the at least one marking is selected from the group consisting of letters, numbers or code.

14. The sensor apparatus of claim 11, wherein the sensor apparatus is further configured to process by utilizing supplementary information during the processing.

15. The sensor apparatus of claim 12, wherein the supplementary information includes at least one information element selected from the group of: a last position fix for the vehicle; stored information regarding types of markings on loading bays at the facility; stored information regarding location of markings on loading bays at the facility; physical characteristics of the facility; or a bearing the loading bay is facing.

16. The sensor apparatus of claim 10, wherein the sensor apparatus is further configured to determine the loading bay by sending the image to a server.

17. The sensor apparatus of claim 16, wherein the sending the image further comprises sending supplementary information along with the image.

18. The sensor apparatus of claim 17, wherein the supplementary information includes at least one information element selected from the group of: a last position fix for the vehicle; stored information regarding types of markings on loading bays at the facility; stored information regarding location of markings on loading bays at the facility; physical characteristics of the facility; or a bearing the loading bay is facing.

19. The sensor apparatus of claim 10, wherein the sensor apparatus is further configured to determine a location of the vehicle based on the determined loading bay.

20. A computer readable medium for storing instruction code, which when executed by a processor on a sensor apparatus connected with a vehicle cause the sensor apparatus to identify a loading bay at a facility to which the vehicle is reversing, the instruction code configured to:

determine that the vehicle is reversing;

capture an image of the loading bay; and

determine, from the image, the loading bay at the facility to which the vehicle is reversing.

Description:
METHOD AND SYSTEM FOR REAR STATUS DETECTION

FIELD OF THE DISCLOSURE

[0001] The present disclosure relates to vehicles, and in particular relates to rear status detection for vehicles.

BACKGROUND

[0002] When trailers are backed into loading bays, several challenges arise related to location and driver awareness. One problem concerns the status of the doors on a truck or trailer: the doors may not be secured properly and thus may be only partially open. During reversing maneuvers, such unsecured doors may be damaged.

[0003] Further, it is sometimes desirable for fleet tracking purposes, cargo management purposes and/or facility management purposes to know the location of a vehicle or trailer. Knowing the location of the trailer or vehicle while it is backing up to a loading or unloading facility is sometimes problematic. Specifically, while the vehicle is backing into the loading bay, an overhang over the loading bay may obscure GPS readings and make it impossible for the trailer to report its location.

[0004] Further, even without an overhang, loading bay doors are often very close together and therefore an accurate location of the trailer with regard to the particular loading bay that the trailer has backed into may not be readily apparent.

BRIEF DESCRIPTION OF THE DRAWINGS

[0005] The present disclosure will be better understood with reference to the drawings, in which:

[0006] Figure 1 is a block diagram of an example sensor apparatus for use with the embodiments of the present disclosure;

[0007] Figure 2 is a block diagram showing an example environment for a sensor apparatus in accordance with the embodiments of the present disclosure;

[0008] Figure 3 is a process diagram showing a process to determine whether doors of a vehicle are secured;

[0009] Figure 4 is a perspective view of an example facility with three loading bays;

[0010] Figure 5 is a process diagram showing a process for determining which loading bay a vehicle is backing into, where the processing is performed on a sensor apparatus;

[0011] Figure 6 is a process diagram showing a process for determining which loading bay a vehicle is backing into, where the processing is performed at a server;

[0012] Figure 7 is a process diagram showing a process for determining which loading bay a vehicle is backing into performed at a server;

[0013] Figure 8 is a perspective view of an example facility with three loading bays having electronic signals to provide identification of the loading bays; and

[0014] Figure 9 is a process diagram showing a process for determining which loading bay a vehicle is backing into using the electronic signals from Figure 8.

DETAILED DESCRIPTION OF THE DRAWINGS

[0015] The present disclosure provides a method for identifying a loading bay at a facility to which a vehicle is reversing, the method comprising determining, at a sensor apparatus connected with the vehicle, that the vehicle is reversing; capturing an image of the loading bay; and determining, from the image, the loading bay at the facility to which the vehicle is reversing.

[0016] The present disclosure further provides a sensor apparatus connected with a vehicle configured for identifying a loading bay at a facility to which the vehicle is reversing, the sensor apparatus comprising: a processor; an image capture sensor; and a communications subsystem, wherein the sensor apparatus is configured to: determine that the vehicle is reversing; capture an image of the loading bay; and determine, from the image, the loading bay at the facility to which the vehicle is reversing.

[0017] The present disclosure further provides a computer readable medium for storing instruction code, which when executed by a processor on a sensor apparatus connected with a vehicle cause the sensor apparatus to identify a loading bay at a facility to which the vehicle is reversing, the instruction code configured to: determine that the vehicle is reversing; capture an image of the loading bay; and determine, from the image, the loading bay at the facility to which the vehicle is reversing.

[0018] In vehicle operations, sensor systems may be included on the vehicle and include a plurality of sensor apparatuses operating remotely from a central monitoring station to provide remote sensor data to a management or monitoring hub. For example, one sensor system involves fleet management or cargo management systems. In fleet management or cargo management systems, sensors may be placed on a trailer, shipping container or similar product to provide a central station with information regarding the container. Such information may include, but is not limited to, the current location of the trailer or shipping container, the temperature inside the shipping container or trailer, whether the doors on the shipping container or trailer are closed, whether a sudden acceleration or deceleration event has occurred, and the tilt angle of the trailer or shipping container, among other data.

[0019] In other embodiments the sensor apparatus may be secured to a vehicle itself. As used herein, the term vehicle can include any motorized vehicle such as a truck, tractor, car, boat, motorcycle, snow machine, among others, and can further include a trailer, shipping container or other such cargo moving container, whether attached to a motorized vehicle or not.

[0020] In accordance with the embodiments described herein, a sensor apparatus may be any apparatus that is capable of providing data or information from sensors associated with the sensor apparatus to a central monitoring or control station. Sensors associated with the sensor apparatus may either be physically part of the sensor apparatus, for example a built-in global positioning system (GPS) chipset, or may be associated with the sensor apparatus through short range wired or wireless communications. For example, a tire pressure monitor may provide information through a Bluetooth™ Low Energy (BLE) signal from the tire to the sensor apparatus. In other cases, a camera may be part of the sensor apparatus or may communicate with a sensor apparatus through wired or wireless technologies. Other examples of sensors are possible.

[0021] A central monitoring station may be any server or combination of servers that are remote from the sensor apparatus. The central monitoring station can receive data from a plurality of sensor apparatuses, and in some cases may have software to monitor such data and provide alerts to operators if data is outside of the predetermined boundaries.

[0022] One sensor apparatus is shown with regard to Figure 1. The sensor apparatus of Figure 1 is however merely an example and other sensor apparatuses could equally be used in accordance with the embodiments of the present disclosure.

[0023] Reference is now made to Figure 1, which shows an example sensor apparatus 110. Sensor apparatus 110 can be any computing device or network node. Such computing device or network node may include any type of electronic device, including but not limited to, mobile devices such as smartphones or cellular telephones. Examples can further include fixed or mobile devices, such as internet of things devices, endpoints, home automation devices, medical equipment in hospital or home environments, inventory tracking devices, environmental monitoring devices, energy management devices, infrastructure management devices, vehicles or devices for vehicles, fixed electronic devices, among others.

[0024] Sensor apparatus 110 comprises a processor 120 and at least one communications subsystem 130, where the processor 120 and communications subsystem 130 cooperate to perform the methods of the embodiments described herein. Communications subsystem 130 may, in some embodiments, comprise multiple subsystems, for example for different radio technologies.

[0025] Communications subsystem 130 allows sensor apparatus 110 to communicate with other devices or network elements. Communications subsystem 130 may use one or more of a variety of communications types, including but not limited to cellular, satellite, Bluetooth™, Bluetooth™ Low Energy, Wi-Fi, wireless local area network (WLAN), near field communications (NFC), ZigBee, wired connections such as Ethernet or fiber, among other options.

[0026] As such, a communications subsystem 130 for wireless communications will typically have one or more receivers and transmitters, as well as associated components such as one or more antenna elements, local oscillators (LOs), and may include a processing module such as a digital signal processor (DSP). As will be apparent to those skilled in the field of communications, the particular design of the communication subsystem 130 will be dependent upon the communication network or communication technology on which the sensor apparatus is intended to operate.

[0027] Processor 120 generally controls the overall operation of the sensor apparatus 110 and is configured to execute programmable logic, which may be stored, along with data, using memory 140. Memory 140 can be any tangible, non-transitory computer readable storage medium, including but not limited to optical (e.g., CD, DVD, etc.), magnetic (e.g., tape), flash drive, hard drive, or other memory known in the art.

[0028] Alternatively, or in addition to memory 140, sensor apparatus 110 may access data or programmable logic from an external storage medium, for example through communications subsystem 130.

[0029] In the embodiment of Figure 1, sensor apparatus 110 may utilize a plurality of sensors, which may either be part of sensor apparatus 110 in some embodiments or may communicate with sensor apparatus 110 in other embodiments. For internal sensors, processor 120 may receive input from a sensor subsystem 150.

[0030] Examples of sensors in the embodiment of Figure 1 include a positioning sensor 151, a vibration sensor 152, a temperature sensor 153, one or more image sensors 154, accelerometer 155, light sensors 156, gyroscopic sensors 157, and other sensors 158. Other sensors may be any sensor that is capable of reading or obtaining data that may be useful for sensor apparatus 110. However, the sensors shown in the embodiment of Figure 1 are merely examples, and in other embodiments different sensors or a subset of sensors shown in Figure 1 may be used.
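By way of a non-limiting illustration, sensor subsystem 150 could be organized as a simple registry that processor 120 polls. The following sketch is illustrative only; the class, reader callables and sensor names are assumptions made for the example rather than part of the embodiments.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict


@dataclass
class SensorSubsystem:
    """Illustrative registry for the sensors of Figure 1; all names are assumed."""
    readers: Dict[str, Callable[[], Any]] = field(default_factory=dict)

    def register(self, name: str, read_fn: Callable[[], Any]) -> None:
        # e.g. register("positioning", gps.read_fix) or register("image", camera.capture)
        self.readers[name] = read_fn

    def read_all(self) -> Dict[str, Any]:
        # Poll every registered sensor once and return a snapshot for the processor.
        return {name: read() for name, read in self.readers.items()}
```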

[0031] Communications between the various elements of sensor apparatus 110 may be through an internal bus 160 in one embodiment. However, other forms of communication are possible.

[0032] Sensor apparatus 110 may be affixed to any fixed or portable platform. For example, sensor apparatus 110 may be affixed to shipping containers, truck trailers, or truck cabs in one embodiment. In other embodiments, sensor apparatus 110 may be affixed to any vehicle, including motor vehicles (e.g., automobiles, cars, trucks, buses, motorcycles, etc.), aircraft (e.g., airplanes, unmanned aerial vehicles, unmanned aircraft systems, drones, helicopters, etc.), spacecraft (e.g., spaceplanes, space shuttles, space capsules, space stations, satellites, etc.), watercraft (e.g., ships, boats, hovercraft, submarines, etc.), railed vehicles (e.g., trains and trams, etc.), and other types of vehicles including any combinations of any of the foregoing, whether currently existing or later arising, among others.

[0033] In other cases, sensor apparatus 110 could be carried by a user.

[0034] In other cases, sensor apparatus 110 may be affixed to stationary objects including buildings, lamp posts, fences, cranes, among other options.

[0035] Such sensor apparatus 110 may be a power limited device. For example, sensor apparatus 110 could be a battery operated device that can be affixed to a shipping container or trailer in some embodiments. Other limited power sources could include any limited power supply, such as a small generator or dynamo, a fuel cell, or solar power, among other options.

[0036] In other embodiments, sensor apparatus 110 may utilize external power, for example from the engine of a tractor pulling the trailer, from a land power source such as on a plugged-in recreational vehicle, or from a building power supply, among other options.

[0037] External power may further allow for recharging of batteries to allow the sensor apparatus 110 to then operate in a power limited mode again. Recharging methods may also include other power sources, such as, but not limited to, solar, electromagnetic, acoustic or vibration charging.

[0038] The sensor apparatus from Figure 1 may be used in a variety of environments. One example environment in which the sensor apparatus may be used is shown with regard to Figure 2.

[0039] Referring to Figure 2, three sensor apparatuses, namely sensor apparatus 210, sensor apparatus 212, and sensor apparatus 214, are provided.

[0040] In the example of Figure 2, sensor apparatus 210 may communicate through a cellular base station 220 or through an access point 222. Access point 222 may be any wireless communication access point.

[0041] Further, in some embodiments, sensor apparatus 210 could communicate through a wired access point such as Ethernet or fiber, among other options.

[0042] The communication may then proceed over a wide area network such as Internet 230 and proceed to servers 240 or 242.

[0043] Similarly, sensor apparatus 212 and sensor apparatus 214 may communicate with servers 240 or server 242 through one or both of the base station 220 or access point 222, among other options for such communication.

[0044] In other embodiments, any one of sensors 210, 212 or 214 may communicate through satellite communication technology. This, for example, may be useful if the sensor apparatus is travelling to areas that are outside of cellular coverage or access point coverage.

[0045] In other embodiments, sensor apparatus 212 may be out of range of access point 222, and may communicate with sensor apparatus 210 to allow sensor apparatus 210 to act as a relay for communications.

[0046] Communication between sensor apparatus 210 and server 240 may be one directional or bidirectional. Thus, in one embodiment sensor apparatus 210 may provide information to server 240 but server 240 does not respond. In other cases, server 240 may issue commands to sensor apparatus 210 but data may be stored internally on sensor apparatus 210 until the sensor apparatus arrives at a particular location. In other cases, two-way communication may exist between sensor apparatus 210 and server 240.

[0047] A server, central server, processing service, endpoint, Uniform Resource Identifier (URI), Uniform Resource Locator (URL), back-end, and/or processing system may be used interchangeably in the descriptions herein. The server functionality typically represents data processing/reporting that is not closely tied to the location of the movable sensor apparatuses 210, 212, 214, etc. For example, the server may be located essentially anywhere so long as it has network access to communicate with sensor apparatuses 210, 212, 214, etc.

[0048] Server 240 may, for example, be a fleet management centralized monitoring station. In this case, server 240 may receive information from sensor apparatuses associated with various trailers or cargo containers, providing information such as the location of such cargo containers, the temperature within such cargo containers, any unusual events including sudden decelerations, temperature warnings when the temperature is either too high or too low, among other data. The server 240 may compile such information and store it for future reference. It may further alert an operator. For example, a sudden deceleration event may indicate that a trailer may have been in an accident and the operator may need to call emergency services and potentially dispatch another tractor to the location.

[0049] In other embodiments, server 240 may be a facilities management server, and direct loading and unloading of goods to vehicles in particular bays of the facility.

[0050] Other examples of functionality for server 240 are possible.

[0051] In the embodiment of Figure 2, servers 240 and 242 may further have access to third-party information or information from other servers within the network. For example, a data services provider 250 may provide information to server 240. Similarly, a data repository or database 260 may also provide information to server 240.

[0052] For example, data services provider 250 may be a subscription based service used by server 240 to obtain current road and weather conditions.

[0053] Data repository or database 260 may for example provide information such as image data associated with a particular location, aerial maps, or other such information.

[0054] The types of information provided by data service provider 250 or the data repository or database 260 are not limited to the above examples, and the information provided could be any data useful to server 240.

[0055] In some embodiments, information from data service provider 250 or the data repository or database 260 can be provided to one or more of sensor apparatuses 210, 212, or 214 for processing at those sensor apparatuses.

[0056] A sensor apparatus such as that described in Figures 1 and 2 above may be used during the backing up of a trailer, shipping container or other vehicle. Various embodiments are provided below. In a first embodiment, a camera associated with the sensor apparatus may be used. However, in other embodiments low-power communication methods, including BLE, ZigBee and near field communications, among other short-range wireless communications, could be utilized, as described below.

[0057] In accordance with embodiments described below, a method and system are provided to supply accurate loading or unloading location information for trailers to improve yard management. Specifically, if the exact location for a trailer becomes known, a trailer yard becomes similar to an airport, and the management of trailers into and out of the facility can be done more efficiently, for example through a central monitoring station.

[0058] Under normal operation, a trailer and a truck typically have unobstructed views of the sky to obtain accurate GPS location fixes. In the embodiments described, when a truck is backing up, a second set of location management algorithms may be engaged to improve accuracy and safety. Such algorithms may assist an operator to avoid damage to the trailer or doors by providing alerts. Further, using such algorithms, location accuracy may be improved at loading bays.

[0059] Therefore, in accordance with a first embodiment of the present disclosure, a detection of whether a door is secured during backing up is provided. Reference is now made to Figure 3.

[0060] The process of Figure 3 starts at block 310 and proceeds to block 312, in which a check is made to determine whether or not the trailer is backing up. The check at block 312 may comprise various techniques. In a first embodiment, sensors may be located on the engine or transmission of the tractor pulling the trailer. In this case, when the transmission is placed into reverse, the sensors may indicate to a controller or sensor apparatus that the trailer is backing up. In other embodiments, the trailer may know its orientation and utilize GPS to determine whether or not it is backing up. In other embodiments, a sensor apparatus may have a camera which detects that objects behind the trailer are getting larger, indicating that the trailer is backing up. In other cases, the fact that the trailer has entered a geo-fenced area may indicate that the trailer is, or soon will be, backing up. Other options are possible.
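As a rough, non-limiting illustration of the check at block 312, the indicators above could be combined along the following lines. The function name, arguments and thresholds are assumptions made for the sketch rather than requirements of the embodiments.

```python
def is_reversing(gear_position=None, heading_deg=None, track_deg=None, in_dock_geofence=False):
    """Illustrative heuristic for block 312: any one positive indicator counts as 'backing up'.

    gear_position    -- e.g. "R" from an engine/transmission bus, if available
    heading_deg      -- direction the trailer is facing (compass/gyro)
    track_deg        -- direction the trailer is actually moving (successive GPS fixes)
    in_dock_geofence -- True if the last position fix falls inside a loading-yard geofence
    """
    if gear_position == "R":
        return True
    if heading_deg is not None and track_deg is not None:
        # Moving roughly opposite to the facing direction suggests reversing.
        diff = abs((track_deg - heading_deg + 180) % 360 - 180)
        return diff > 150
    # A geofence alone only suggests the vehicle may soon reverse; treat it as a weak signal.
    return in_dock_geofence
```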

[0061] If, at block 312, the check determines that the trailer is not backing up, then the process loops back to block 312 until a determination is made that the trailer is backing up.

[0062] Once a determination is made at block 312 that the trailer is backing up, the process proceeds to block 320, in which a check is made to determine whether the doors of the trailer are secured. In particular, the check at block 320 may be accomplished in various ways. In a first embodiment, sensors on the doors may indicate whether the doors are completely closed or are anchored in an open position. For example, in some embodiments, the design of a loading bay may necessitate the trailer reversing with the doors pinned to the sides of the trailer in a fully open position. In other cases, the trailer should reverse with the doors fully closed.

[0063] Various sensors may be part of the doors or latching system or anchoring system which may provide feedback to a sensor apparatus to indicate that the doors are in such a position.

[0064] In other embodiments, a camera on the door may indicate whether the door is fully closed or fully open. For example, if the camera shows that objects behind the trailer are not completely perpendicular to the direction of motion, then the doors may not be fully closed. Similarly, if objects on the side of the trailer are not moving parallel to the direction of motion, then the doors may not be in a fully open position. Various image processing software may be used to make the determination of whether the doors are fully open or fully closed.
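One possible, heavily simplified reading of this camera-based heuristic is sketched below. It assumes an upstream routine already supplies per-point motion vectors from the rear camera image; the vector convention, tolerance and function name are illustrative assumptions only and do not limit the embodiments.

```python
import math


def door_state_from_flow(flow_vectors, tolerance_deg=15):
    """Illustrative classification of door state from apparent motion in the rear camera.

    flow_vectors -- (dx, dy) motion vectors for tracked points, with +y taken as the
                    direction of reverse travel in the image plane (assumed input).
    Returns "closed", "open" or "unknown".
    """
    perpendicular = parallel = 0
    for dx, dy in flow_vectors:
        angle = abs(math.degrees(math.atan2(dy, dx))) % 180   # fold to 0..180 degrees
        if abs(angle - 90) <= tolerance_deg:
            perpendicular += 1      # motion across the travel axis
        elif angle <= tolerance_deg or angle >= 180 - tolerance_deg:
            parallel += 1           # motion along the travel axis
    if perpendicular > parallel:
        return "closed"   # scene behind the trailer behaves as expected with doors shut
    if parallel > perpendicular:
        return "open"     # scene beside the trailer dominates: doors likely pinned fully open
    return "unknown"
```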

[0065] In other embodiments, a camera may be mounted to the back of the trailer but above the doors. This camera may then be used to detect the position of the doors through image recognition.

[0066] Other options for the determination of whether the door is secured would be apparent to those skilled in the art having regard to the present disclosure.

[0067] From block 320, if the doors are secured then the process proceeds to block 330 and ends.

[0068] Conversely, if the doors are not secured and the vehicle is reversing, then the process proceeds from block 320 to block 340, in which an alarm is raised. The alarm at block 340 may, in one embodiment, be one or a combination of an audio, visual or sensory alarm within a cab of a tractor to alert the operator that the doors are not closed. In other embodiments, the alarm may be a signal to a central controller, which may then provide an alert, for example to the loading bay to which the trailer is being backed up, to allow visual or auditory indicators to be provided to the driver or loading bay staff. In other cases, a text or data notification to an in-cab solution or mobile phone may be made. Other options for alerts would be apparent to those skilled in the art having regard to the present disclosure.

[0069] Further, in some cases, in addition to an alarm, or instead of an alarm, a processor may take positive action to prevent potential damage to the doors. This may include, for example, actuating or overriding braking functions on a vehicle to stop the motion of the vehicle and/or trailer. If the doors can be opened or closed electronically, the functionality may further include closing the doors or opening them fully.

[0070] From block 340, the process proceeds to block 330 and ends.
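Putting the blocks of Figure 3 together, the monitoring loop could, purely by way of example, be sketched as follows; the sensor, alert and brake-override interfaces are assumed placeholders rather than defined elements of the disclosure.

```python
import time


def monitor_doors_while_reversing(sensor, alert, brake_override=None, poll_s=0.5):
    """Illustrative sketch of the Figure 3 flow (blocks 312-340); all callables are assumed.

    sensor.is_reversing() -> bool     sensor.doors_secured() -> bool
    alert(message)                    brake_override() -- optional positive action
    """
    while not sensor.is_reversing():
        time.sleep(poll_s)                 # block 312: wait until the vehicle backs up
    if sensor.doors_secured():             # block 320
        return                             # block 330: doors secured, nothing to do
    alert("Trailer doors are not secured while reversing")   # block 340
    if brake_override is not None:
        brake_override()                   # optional positive action to stop the vehicle
```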

[0071] Thus, in accordance with the various embodiments of the present disclosure described above, a sensor apparatus on a vehicle may contain a camera or other sensors, including a gyroscope, time-of-flight (TOF), radar, or light detection and ranging (LIDAR) sensors, among other options. This sensor apparatus may be used to determine if the door is open. If the door is open, a series of alerts can be sent to the driver or loading bay operator to stop backing up in order to reduce potential door damage, or positive action may be taken without driver input.

[0072] In a further embodiment of the present disclosure, knowledge of the exact bay that a trailer is being reversed into may be desirable at a central station. Such knowledge may allow management of the loading or unloading of the trailer, provide end-to-end goods tracking at a more refined level, identify which bay the goods were loaded at and which bay they were unloaded at, and provide for terminal management functionality, among other benefits.

[0073] However, it may be difficult to detect the exact bay that a trailer is being backed into merely by using a GPS sensor on a sensor apparatus. In particular, loading bay doors are often located very close to each other, and the level of granularity of the GPS system may not be sufficient to allow the exact determination of the loading bay that the truck or trailer is backing into. For example, in some cases GPS may only be accurate to 3 meters, 10 meters or even 100 meters, depending on the quality of the position fix.

[0074] In other cases, loading bays are located under an overhang or other obstruction. In this case, the sky is obscured and a GPS fix may be impossible to obtain.

[0075] Therefore, in accordance with one embodiment of the present disclosure, a visual sensor on the rear of the trailer may be used to take photos or images of the loading bay that is being approached. Specifically, bay doors may be labelled with a letter, number or code that is highly visible. Optical recognition of the image may then be used to determine which loading bay is being approached. In this case, the image-based identification may be combined with imperfect GPS location information to determine the exact bay that the trailer is being backed into.

[0076] In particular, reference is now made to Figure 4, which shows a perspective view of a facility having three loading bays. Such loading bays are labeled as loading bays 410, 412, and 414 in the embodiment of Figure 4.

[0077] In the example of Figure 4, a letter is placed on a canopy above each door, shown as letters A, B and C. The use of letters on a canopy however is only one example. In other cases, the door marking may be placed on the building, awning or on a canopy at any location, including above, at the sides or below a door. Further, the marking may be placed on the door itself in some embodiments.

[0078] The marking may be a letter, number or other code that could be detected and would be visible in an image of the door.

[0079] Reference is now made to Figure 5, which shows one process for the determination of a location at the sensor apparatus. In particular, the process of Figure 5 starts at block 510 and proceeds to block 512 in which a determination is made as to whether the trailer or vehicle is backing up. The determination of block 512 may be done similarly to the determination made at block 312 above.

[0080] If the vehicle is not backing up, the process proceeds back to block 512 until a determination is made that the vehicle is backing up.

[0081] Once a determination is made that the vehicle is backing up, the process may optionally proceed to block 514. In block 514, a determination is made of whether the vehicle is within a threshold proximity to a bay door. This may be done, for example, through visual processing indicating that there is a bay door behind the trailer and that the trailer is within a certain distance of the bay door. In particular, in some cases the trailer may be backing up for other reasons, including parking in a trailer yard, among other options. In this case the determination of a bay door is not relevant.

[0082] Further, in some cases the vehicle may not be able to back straight into a bay. For example, the backup path may involve turning the trailer during reversing. In this case, image capture while the bay door is not within a captured image may be irrelevant. Therefore, in the determination at block 514, if the vehicle is not within proximity to a bay door or not facing the door, then the process may proceed back to block 512 to determine whether or not the vehicle is continuing to back up.

[0083] Once the vehicle is within a proximity and visual range of the door, or if block 514 is omitted, the process proceeds to block 520 in which an image is captured. The image capture may occur, for example, utilizing a camera sensor on the trailer. However, in other embodiments the camera sensor may be located on the tractor with a view behind the vehicle. The camera may be part of the sensor apparatus or may be communicating with the sensor apparatus in some embodiments.

[0084] Once the image is captured, in accordance with the embodiment of Figure 5, the sensor apparatus may process the image at block 530. The processing of the image may utilize any image processing techniques including symbol detection to detect the symbol around the bay door. Alternatively, the sensor apparatus may send the image to a server for processing, and as discussed herein, processing steps may be performed at the sensor apparatus and/or a server system.

[0085] Further, in some cases supplementary information may be used to assist the image processing. For example, information such as the location of the vehicle may be utilized to focus the image processing. Specifically, the sensor apparatus, through sensors on a trailer, may know at least a rough estimate of the location of the trailer, and therefore may know the facility that it is at. Supplemental information may further include information such as the orientation of the trailer, a tilt angle of the trailer, or other images recently captured by the image capture device, among other supplemental information. The use of such information may be beneficial in image processing to narrow the possible bay doors that the trailer may be at. Supplemental information may further include a map overlay of information relating to the specific location and identification of bay doors on a more general purpose map. Such supplemental information may be used to allow the image processing to locate the bay markings. For example, the facility may have bay markings on the left of the bay door, and the markings may be numbers. Knowledge of such information may allow for better accuracy in identifying the bay. Other examples of supplementary information may include a last position fix for the vehicle, stored information regarding types of markings on loading bays at the facility, stored information regarding location of markings on loading bays at the facility, physical characteristics of the facility, or a bearing the loading bay is facing.
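As one non-limiting sketch of the processing at block 530, an optical-recognition step could be narrowed by supplementary facility information roughly as follows. The OCR callable, dictionary keys and bay labels are assumptions made for the example; any symbol recognition technique could be substituted.

```python
def identify_bay(image, ocr, facility_info=None):
    """Illustrative sketch of block 530: pick a bay marking, optionally narrowed
    by supplementary facility information.

    ocr           -- assumed callable returning candidate strings with confidences,
                     e.g. [("B", 0.92), ("8", 0.40)]
    facility_info -- optional dict such as {"marking_type": "letters",
                                            "known_bays": ["A", "B", "C"]}
    """
    candidates = ocr(image)
    if facility_info:
        known = set(facility_info.get("known_bays", []))
        if known:
            candidates = [(text, conf) for text, conf in candidates if text in known]
        if facility_info.get("marking_type") == "letters":
            candidates = [(t, c) for t, c in candidates if t.isalpha()]
    if not candidates:
        return None                      # fall back, e.g. send the raw image to a server
    return max(candidates, key=lambda tc: tc[1])[0]
```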

[0086] In other cases, if the facility is not known, data from the image may be compared with data stored locally or in a remote database. In this case, a lookup may be performed based on information within the image, such as the type of marking, the location of the marking, facility color, whether the bay doors include a canopy or awning, or whether the driveway is paved, among other information that may assist the image processing.

[0087] The information about the facility may be stored at the sensor apparatus, or may be obtained from a remote store.

[0088] In some cases, no supplementary information is available, and the image processing at block 530 relies solely on the image captured at block 520.

[0089] Any symbol recognition software or algorithm may be utilized to facilitate the image processing at block 530. In some embodiments, user input may assist such image processing.

[0090] Based on the processing at block 530, a bay is identified and the process proceeds to block 540. At block 540 the identification of the bay into which the trailer is reversing is sent/signaled/transmitted to a central location. The signaling at block 540 is however optional. In some cases, instead of signaling, the information may be stored on the sensor apparatus for later processing. Other options are possible.

[0091] From block 540 the process proceeds to block 550 and ends.

[0092] In other embodiments, the image processing may occur at a remote location. Reference is now made to Figure 6.

[0093] In the embodiment of Figure 6, a sensor apparatus on the vehicle starts the process at block 610 and proceeds to block 612 in which a determination is made of whether the trailer or truck or vehicle is backing up. Such determination is similar to the determinations made at blocks 312 and 512 described above.

[0094] The process may then optionally proceed to block 614, in which a determination is made whether the trailer is within a proximity and visual range of the loading bay. This is similar to the process described above with regard to block 514.

[0095] From block 614, the process proceeds to block 620 in which an image of the loading bay door that the trailer or vehicle is backing into is captured.

[0096] From block 620 the process proceeds to block 630, in which the image is transmitted to a central server. In particular, the image data may comprise one or more pictures or may comprise a video of the trailer backing into the loading bay.

[0097] The image data may, in some embodiments, have other supplemental information included. Such supplemental information may include location data for the last position fix of the trailer, which may be utilized to narrow the potential locations that the trailer is backing into. Supplemental information may further include information such as the orientation of the trailer, a tilt angle of the trailer, other images recently captured by the image capture device, among other supplemental information.
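For illustration only, the report assembled at block 630 might resemble the following sketch. The JSON field names, transport and endpoint are assumptions made for the example and are not defined by the embodiments.

```python
import base64
import json
import urllib.request


def send_bay_image(server_url, image_bytes, last_fix=None, orientation_deg=None, tilt_deg=None):
    """Illustrative sketch of block 630: upload the captured image with optional
    supplemental data; field names and endpoint are assumed."""
    payload = {
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "last_position_fix": last_fix,          # e.g. {"lat": 43.47, "lon": -80.54, "age_s": 42}
        "orientation_deg": orientation_deg,     # direction the trailer is facing
        "tilt_deg": tilt_deg,
    }
    req = urllib.request.Request(
        server_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:   # a real deployment would add error handling
        return resp.status
```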

[0098] From block 630 the process proceeds to block 650 and ends.

[0099] Referring to Figure 7, on the central monitoring side the process starts at block 710 and proceeds to block 712. At block 712, the central server receives the image data, which may in some embodiments include supplemental data, such as location data, sent by the trailer.

[00100] The process then proceeds to block 720, in which the image is processed. Such processing may in some embodiments involve deriving information from the image. For example, through a database at the central monitoring station or accessible to the central monitoring station, a lookup may be performed based on information within the image, such as the type of marking, the location of the marking, facility color, whether the bay doors include a canopy or awning, or whether the driveway is paved, among other information that may assist the image processing.

[00101] The processing at block 720 may further use information that was sent in conjunction with the image data, such as the last position fix from the vehicle, previous image data, identity of the sensor apparatus unit, or other such information to further facilitate the processing.
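A minimal sketch of the server-side resolution at block 720, assuming a facility database keyed by approximate position and bay markings, might look as follows; the record layout, distance comparison and function name are illustrative assumptions only.

```python
def resolve_bay(marking, last_fix, facilities):
    """Illustrative sketch of block 720: map an image-derived marking plus a rough
    position fix to a specific bay.

    facilities -- assumed list of records such as
        {"name": "...", "lat": ..., "lon": ..., "bays": {"A": "bay-1", "B": "bay-2"}}
    Returns (facility_name, bay_id) or None if nothing matches.
    """
    def rough_distance(facility):
        # Coarse comparison only; a real lookup would use a proper geodesic distance.
        return abs(facility["lat"] - last_fix["lat"]) + abs(facility["lon"] - last_fix["lon"])

    for facility in sorted(facilities, key=rough_distance):
        bay_id = facility["bays"].get(marking)
        if bay_id is not None:
            return facility["name"], bay_id
    return None
```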

[00102] The processing at block 720 allows the bay that the trailer has backed into to be determined. This information may be stored or associated with the particular trailer for future use. It may also be used by management or coordination software for the particular facility to enable operations or optimize utilization of the loading bays at that facility.

[00103] The process then proceeds to block 750 and ends.

[00104] In still further embodiments, instead of using visual data, other sensor data may be utilized to determine which bay the vehicle is backing into. For example, short range communications may be utilized to provide information on the bay that the vehicle is backing into.

[00105] Reference is now made to Figure 8 which shows a perspective view of an example loading bay facility having three loading bays.

[00106] In the embodiment of Figure 8, short range communications transmitters 810, 812 and 814 are placed above each loading bay door. For example, such transmitters may be short range wireless transmitters such as Wi-Fi, Bluetooth™, BLE, ZigBee or near field communication transmitters, among other options. However, in other embodiments, the transmitters may be placed at other locations relative to the loading bays.

[00107] A sensor apparatus on a trailer may look for the signals transmitted by transmitters 810, 812 and 814 and determine which signal is the strongest. Based on the signal strength indicator, the sensor apparatus on the trailer may know which is the closest transmitter and therefore which bay the trailer has backed into.

[00108] Specifically, in one embodiment, transmitter 812 may be a Bluetooth™ Low Energy transmitter in which a preamble or other signal characteristic is different from the preambles or signal characteristics of transmitters 810 and 814.

[00109] Transmitters 810, 812 and 814 may be calibrated to provide the same power level when transmitting. Alternatively, the transmitters at a location may be fingerprinted, or otherwise characterized for later processing. For example, certain transmitters may be stronger or weaker than others. Knowing this information, the sensor apparatus may understand that each location has a particular profile and that adjustments can be applied. Moreover, the system may feed back information about a location, which may be analyzed and sent to other devices that later arrive at the location. The profiles/fingerprints for locations may be stored at a server and sent to the sensor apparatus. If a destination for a sensor apparatus is determined, then profiles for that location may be sent to the sensor apparatus before arrival, en route, or after arrival. Alternatively, the sensor apparatus may request a location's profile.

[00110] Therefore, a sensor on the trailer may detect signals from the three transmitters but determine that the signal from transmitter 812 is the strongest. The sensor may therefore read the preamble from transmitter 812 and report to a central controller that this signal was received with the strongest signal power level.

[00111] Reference is now made to Figure 9. In particular, the process of Figure 9 starts at block 910 and proceeds to block 912 in which a determination is made that the trailer is backing up. Such determination is similar to the determination made in blocks 312, 512, and 612 above.

[00112] From block 912 the process proceeds to block 914, in which a determination is made whether the trailer detects one or more signals from a bay. If the trailer is not backing into a facility that includes wireless transmitters to indicate bay doors, no signal will be detected at block 914. Therefore, if no signal is detected the process proceeds to block 920 and ends.

[00113] If one or more location signals are detected at block 914 then the process proceeds to block 930 in which a strongest signal is determined.

[00114] From block 930 the process may proceed to block 940 in one embodiment, in which the strongest signal and information from within that signal are reported to a central monitoring station. Alternatively, the process may proceed from block 930 to block 950, in which the determination of the bay is made at the sensor apparatus and the particular bay that the trailer is backing into is then reported.

[00115] From blocks 940 or 950 the process proceeds to block 920 and ends.
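As an illustrative sketch of blocks 914 to 930, the strongest-signal selection, optionally adjusted by the per-transmitter calibration or fingerprinting discussed with respect to Figure 8, could look like the following. The scan format, bay identifiers and offsets are assumptions made for the example.

```python
def strongest_bay_beacon(scans, power_offsets=None):
    """Illustrative sketch of blocks 914-930: pick the bay whose transmitter is received strongest.

    scans         -- assumed list of (bay_id, rssi_dbm) pairs from a short-range scan,
                     e.g. [("A", -71), ("B", -54), ("C", -80)]
    power_offsets -- optional per-bay calibration, e.g. {"B": -3} if that transmitter
                     is known to transmit more strongly than the others
    """
    if not scans:
        return None                             # block 914: no bay signals at this facility
    offsets = power_offsets or {}
    adjusted = [(bay, rssi + offsets.get(bay, 0)) for bay, rssi in scans]
    return max(adjusted, key=lambda item: item[1])[0]   # block 930: strongest signal wins
```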

[00116] Referring again to Figure 8, in some embodiments, low-range transmitters such as transmitters 820, 830 and 840 may be used. Such low-range transmitters may only have a range of a few centimeters, inches, meters or feet. In this case, a corresponding sensor on the trailer may be located in a position such that, when the trailer is close enough to transmitters 820, 830 or 840, the sensor apparatus can detect such transmitters and report that it is close to that particular bay. In this case, the distance at which the transmitter can be detected may be smaller than the width of the bay door, and thus only the transmitter on the bay door that the trailer is reversing into is detected.

[00117] Therefore, in accordance with the embodiments of Figures 8 and 9, instead of a camera, a low-power local network such as BLE, ZigBee, Wi-Fi, Bluetooth™, near field communications, or other radio frequency identification may be used to determine the bay at which the vehicle is now parked. The cost of installing such small transmitter devices, which broadcast the door name or identity and its GPS location, may be low in many cases.

[00118] The information provided based on the location could therefore substitute for inaccurate GPS location information and provide for the accurate determination of which loading bay the trailer has backed into. Again, such information may be utilized by a central controller or an operations management controller for a particular facility in order to optimize operations, provide end-to-end tracking of goods, or be combined with other factors useful to the shipping or trailer industry.

[00119] The embodiments described herein are examples of structures, systems or methods having elements corresponding to elements of the techniques of this application. This written description may enable those skilled in the art to make and use embodiments having alternative elements that likewise correspond to the elements of the techniques of this application. The intended scope of the techniques of this application thus includes other structures, systems or methods that do not differ from the techniques of this application as described herein, and further includes other structures, systems or methods with insubstantial differences from the techniques of this application as described herein.

[00120] While operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be employed. Moreover, the separation of various system components in the implementation described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.

[00121] Also, techniques, systems, subsystems, and methods described and illustrated in the various implementations as discrete or separate may be combined or integrated with other systems, modules, techniques, or methods. Other items shown or discussed as coupled or directly coupled or communicating with each other may be indirectly coupled or communicating through some interface, device, or intermediate component, whether electrically, mechanically, or otherwise. Other examples of changes, substitutions, and alterations are ascertainable by one skilled in the art and may be made.

[00122] While the above detailed description has shown, described, and pointed out the fundamental novel features of the disclosure as applied to various implementations, it will be understood that various omissions, substitutions, and changes in the form and details of the system illustrated may be made by those skilled in the art. In addition, the order of method steps is not implied by the order in which they appear in the claims.

[00123] When messages are sent to/from an electronic device, such operations may not be immediate or from the server directly. They may be synchronously or asynchronously delivered, from a server or other computing system infrastructure supporting the devices/methods/systems described herein. The foregoing steps may include, in whole or in part, synchronous/asynchronous communications to/from the device/infrastructure. Moreover, communication from the electronic device may be to one or more endpoints on a network. These endpoints may be serviced by a server, a distributed computing system, a stream processor, etc. Content Delivery Networks (CDNs) may also provide communication to an electronic device. For example, rather than a typical server response, the server may also provision or indicate data for a content delivery network (CDN) to await download by the electronic device at a later time, such as a subsequent activity of the electronic device. Thus, data may be sent directly from the server, or other infrastructure, such as a distributed infrastructure, or a CDN, as part of or separate from the system.

[00124] Except where otherwise described as such, a server, central server, service, processing service, endpoint, Uniform Resource Identifier (URI), Uniform Resource Locator (URL), back-end, and/or processing system may be used interchangeably in the descriptions and examples herein. Mesh networks and processing may also be used alone or in conjunction with other types as well as fog computing. Moreover, communication may be from device to device, wherein they may use low power communication (e.g., Bluetooth, Wi-Fi), and/or a network, to communicate with other devices to get information.

[00125] Typically, storage mediums can include any or some combination of the following: a semiconductor memory device such as a dynamic or static random access memory (a DRAM or SRAM), an erasable and programmable read-only memory (EPROM), an electrically erasable and programmable read-only memory (EEPROM) and flash memory; a magnetic disk such as a fixed, floppy and removable disk; another magnetic medium including tape; an optical medium such as a compact disk (CD) or a digital video disk (DVD); or another type of storage device. Note that the instructions discussed above can be provided on one computer-readable or machine-readable storage medium, or alternatively, can be provided on multiple computer-readable or machine-readable storage media distributed in a large system having possibly plural nodes. Such computer-readable or machine-readable storage medium or media is (are) considered to be part of an article (or article of manufacture). An article or article of manufacture can refer to any manufactured single component or multiple components. The storage medium or media can be located either in the machine running the machine-readable instructions, or located at a remote site from which machine-readable instructions can be downloaded over a network for execution.

[00126] In the foregoing description, numerous details are set forth to provide an understanding of the subject disclosed herein. However, implementations may be practiced without some of these details. Other implementations may include modifications and variations from the details discussed above. It is intended that the appended claims cover such modifications and variations.