

Title:
VEHICLE CAMERA SYSTEM
Document Type and Number:
WIPO Patent Application WO/2018/025273
Kind Code:
A1
Abstract:
Methods and systems are disclosed for using in-vehicle cameras coupled with mobile computing devices, for vehicle safety, vehicle checks and vehicle security.

Inventors:
KAHLON YAIR (IL)
Application Number:
PCT/IL2017/050867
Publication Date:
February 08, 2018
Filing Date:
August 06, 2017
Assignee:
KAHLON YAIR (IL)
GOLAN YUVAL (IL)
KAHLON YOSEF (IL)
DE LEVI YOSEF (IL)
International Classes:
B60W30/00; H04M1/72412; H04W4/02
Foreign References:
US20160150070A12016-05-26
Attorney, Agent or Firm:
FRIEDMAN, Mark (IL)
Claims:

1. A computerized method for providing a continuous view of a location comprising:

continuously obtaining a representation of a location from an imaging device as the imaging device travels along a path;

converting the obtained representation into digital data for a mobile computing device including a display screen; and,

electronically transmitting the digital data to the mobile computing device, such that the digital data is received by the mobile computing device and displays on a screen display of the mobile computing device in a visual format as the continuously obtained representation of the location.

2. The computerized method of claim 1, where the continuously obtaining a representation of a location from an imaging device as the imaging device travels along a path is performed in real time.

3. The computerized method of claim 2, wherein the electronically transmitting the digital data to the mobile computing device, such that the digital data is received by the mobile computing device, is in real time.

4. The computerized method of claim 1, wherein the imaging device is a camera, and the continuously obtained representation from the camera is a continuous image obtained by the camera while a vehicle including the camera travels along the path.

5. The computerized method of claim 3, wherein the visual format includes an image.

6. The computerized method of claim 5, wherein the image which displays on the screen display of the mobile computing device is a background image.

7. The computerized method of claim 6, wherein the background image is configured for supporting a graphic indication of an application as a foreground image, which at least a portion thereof overlays the background image.

8. The computerized method of claim 7, wherein the foreground image displays transparently over the at least a portion of the background image.

9. The computerized method of claim 7, wherein the graphic indication of the application as a foreground image remains displayed until the event associated with the application ends.

10. The computerized method of claims 1, 7 or 9, wherein the mobile computing device includes a smartphone.

11. The computerized method of claim 1, wherein the electronically transmitting the digital data to the mobile computing device is via at least one of Bluetooth®, WiFi® or a data communication over one or more communications networks.

12. A computerized method for alerting a user of a mobile computing device via the mobile computing device of a situation, comprising:

detecting a first event associated with a physical activity performed on a vehicle, and responding to the detection of the first event by obtaining a reference image of a portion of the cabin of the vehicle;

detecting a second event associated with a physical activity performed on a vehicle, and responding to the detection of the second event by obtaining a subsequent image of substantially the same portion of the cabin of the vehicle;

analyzing the first image against the second image for differences; and,

if there are differences between the reference image and the subsequent image, transmitting the subsequent image to the mobile computing device associated with the user.

13. The computerized method of claim 12, wherein the obtaining the reference image and the subsequent image is performed by an in-vehicle camera.

14. The computerized method of claim 12, wherein the mobile computing device is a smartphone.

15. A computerized method for vehicle security comprising:

detecting a physical action associated with the vehicle; activating an imaging device in response to the detected physical action to obtain an image of a subject;

analyzing the subject of the obtained image against images of known subjects associated with the vehicle; and,

if the subject of the obtained image is different than the subjects from the known images associated with the vehicle, transmitting the obtained image of the subject to at least one mobile computing device.

16. The computerized method of claim 15, wherein the detected physical action is analyzed to determine whether the physical action is an abnormal event, and coupled with the subject of the obtained image being different than the subjects from the known images associated with the vehicle, transmitting the obtained image of the subject to at least one mobile computing device.

17. The computerized method of claim 15 or 16, wherein the at least one mobile computing device includes a mobile computing device of a user associated with the vehicle.

18. The computerized method of claim 17, wherein the mobile computing device includes a smartphone.

19. A system for providing a continuous view of a location comprising:

an imaging device for continuously obtaining a representation of a location as the imaging device travels along a path, the imaging device including:

a processor for converting the obtained representation into digital data for a mobile computing device including a display screen, the mobile computing device linked to the imaging device; and, a transmitter for electronically transmitting the digital data to the mobile computing device, such that the digital data is received by the mobile computing device and displays on the display screen of the mobile computing device in a visual format as the continuously obtained representation of the location.

20. The system of claim 19, wherein the mobile device includes a processor programmed to place a graphic indication of an application over the representation of the location as displayed on the display screen of the mobile computing device, the digital data of the representation of the location for supporting the indication of the application placed over the representation of the location.

21. The imaging device of claim 19, wherein the imaging device continuously obtains the representation in real time, the processor converts the obtained representation to digital data in real time, and the transmitter electronically transmits the digital data to the mobile computing device in real time.

22. The system of claim 20 or 21, wherein the imaging device includes a camera, configured for use in a vehicle.

23. A system for alerting a user of a mobile computing device via the mobile computing device of a situation, comprising:

a storage medium for storing computer components; and,

a processor for executing the computer components, the computer components comprising:

a first computer component for detecting a first event associated with a physical activity performed on a vehicle, and responding to the detection of the first event by causing a reference image of a portion of the cabin of the vehicle to be obtained from an imaging device, and detecting a second event associated with a physical activity performed on a vehicle, and responding to the detection of the second event by causing a subsequent image of substantially the same portion of the cabin of the vehicle to be obtained from the imaging device;

a second computer component analyzing the first image against the second image for differences; and,

a third computer component such that, if there are differences between the reference image and the subsequent image, the third computer component for transmitting the subsequent image to the mobile computing device associated with the user.

24. A system for vehicle security comprising:

a storage medium for storing computer components; and,

a processor for executing the computer components, the computer components comprising:

a first computer component for detecting a physical action associated with the vehicle; a second computer component for activating an imaging device in response to the detected physical action to obtain an image of a subject;

a third computer component for analyzing the subject of the obtained image against images of known subjects associated with the vehicle; and,

a fourth computer component, such that if the subject of the obtained image is different than the subjects from the known images associated with the vehicle, the fourth computer component for transmitting the obtained image of the subject to at least one mobile computing device.

25. A computer usable non-transitory storage medium having a computer program embodied thereon for causing a suitable programmed system to provide a continuous view of a location to a linked mobile computing device, by performing the following steps when such program is executed on the system, the steps comprising:

continuously obtaining a representation of a location from an imaging device as the imaging device travels along a path;

converting the obtained representation into digital data for a mobile computing device including a display screen; and,

electronically transmitting the digital data to the mobile computing device, such that the digital data is received by the mobile computing device and displays on a display screen of the mobile computing device in a visual format as the continuously obtained representation of the location.

26. The computer usable non-transitory storage medium of claim 25, wherein the steps are performed in real time.

27. The computer usable non-transitory storage medium of claim 25, wherein the mobile computing device includes a smartphone.

28. The computer usable non-transitory storage medium of claim 26, wherein the steps additionally comprise: placing a graphic indication of an application over the representation of the location as displayed on the display screen of the mobile computing device, the digital data of the representation of the location for supporting the indication of the application placed over the representation of the location.

29. A computer usable non-transitory storage medium having a computer program embodied thereon for causing a suitable programmed system to alert a user of a mobile computing device, via the mobile computing device, of a situation, by performing the following steps when such program is executed on the system, the steps comprising:

detecting a first event associated with a physical activity performed on a vehicle, and responding to the detection of the first event by obtaining a reference image of a portion of the cabin of the vehicle;

detecting a second event associated with a physical activity performed on a vehicle, and responding to the detection of the second event by obtaining a subsequent image of substantially the same portion of the cabin of the vehicle;

analyzing the first image against the second image for differences; and,

if there are differences between the reference image and the subsequent image, transmitting the subsequent image to the mobile computing device associated with the user.

30. A computer usable non-transitory storage medium having a computer program embodied thereon for causing a suitable programmed system to alert a user of a mobile computing device, via the mobile computing device, of a situation associated with a vehicle, by performing the following steps when such program is executed on the system, the steps comprising:

detecting a physical action associated with the vehicle;

activating an imaging device in response to the detected physical action to obtain an image of a subject;

analyzing the subject of the obtained image against images of known subjects associated with the vehicle; and,

if the subject of the obtained image is different than the subjects from the known images associated with the vehicle, transmitting the obtained image of the subject to at least one mobile computing device.

31. The computer usable non-transitory storage medium of claim 30, wherein the at least one mobile computing device includes a smartphone linked to the imaging device and associated with a driver of the vehicle.

Description:
VEHICLE CAMERA SYSTEM

CROSS REFERENCES TO RELATED APPLICATIONS

This application is related to and claims priority from commonly owned US Provisional Patent Application Serial No. 62/370,734, entitled: Vehicle Camera System, filed on August 4, 2016, the disclosure of which is incorporated by reference in its entirety herein.

TECHNICAL FIELD

The present invention relates to vehicle camera systems.

BACKGROUND OF THE INVENTION

The use of mobile devices, such as smartphones and the like, while driving is dangerous, as the driver becomes distracted and accident-prone. Typically, should the driver receive a call, text, message or other communication, the smartphone makes a sound indicative of the call, text, message or communication. This sound and subsequent attention to responding to the communication by operating the smartphone draws the driver's attention to the smartphone. The driver may look toward the smartphone, and thus, look away from the road, creating a momentarily distracted driver and, for that time, rendering the driver accident-prone.

SUMMARY OF THE INVENTION

The present invention relates to a vehicle camera system which uses a dashboard camera coordinated with handheld devices, such as mobile computing devices, e.g., smartphones, configured to obtain images of locations, such as the street on which the vehicle is traveling, and display the images on the handheld device in real time.

The camera can be configured to transmit a streaming video of the road ahead of the driver to the handheld device, such that the person using the handheld device can view the streaming video on the screen of the handheld device, and accordingly, always have a view of the road, in order to avoid accidents caused when the driver takes his eye off of the road.

Embodiments of the invention are such that the handheld device can be configured to display the images from the camera on the background of the screen as wallpaper. If the images are in the form of a video stream, the handheld device can be configured to display the video as animated wallpaper, such that the user can use applications on the handheld device while images of the road are displayed on the background. The animated wallpaper can be configured to display at the background of all the applications running on the handheld device, or on selected ones. The running applications are displayed "transparently", typically as overlays, minimizing the applications in the foreground and maximizing the background image of the street on which the vehicle is traveling.
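The "transparent" overlay behavior described above can be illustrated with a simple alpha-blending sketch. This code is not part of the patent disclosure; the function name `composite_overlay` and the blending weight are illustrative assumptions, and a real implementation would run inside the handheld device's display pipeline rather than on raw arrays.

```python
import numpy as np

def composite_overlay(background, foreground, alpha=0.4):
    """Blend a semi-transparent foreground (e.g., an application's graphic
    indicator) over a background frame (e.g., the live road image),
    pixel-wise: out = alpha*fg + (1 - alpha)*bg."""
    bg = background.astype(np.float32)
    fg = foreground.astype(np.float32)
    blended = alpha * fg + (1.0 - alpha) * bg
    return blended.astype(np.uint8)

# Hypothetical 2x2 grayscale "frames", purely for illustration.
road_frame = np.full((2, 2), 200, dtype=np.uint8)  # bright road image
app_icon = np.full((2, 2), 0, dtype=np.uint8)      # dark app indicator
out = composite_overlay(road_frame, app_icon, alpha=0.4)
# Each output pixel is 0.4*0 + 0.6*200 = 120, so the road stays visible
# behind the indicator.
```

With a lower `alpha` the application indicator becomes fainter and the road image dominates, which matches the stated goal of keeping the street view maximized behind foreground applications.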

For example, the animated wallpaper displaying streaming video of the road can be displayed on the smartphone, as imaged by the vehicle camera, which is linked to the smartphone, either over a communications network, or a short range transmission, such as Bluetooth®. As a result, while the user is dialing a number, or searching for information, receiving a call, text, message or the like, a real-time dynamic view of the road is displayed on the background of the touchscreen display of the smartphone. An indicator of the dialed number, information, incoming call, text, message or communication displays on the phone, but the background image of the road remains displayed in the foreground, for example, as an overlay, in real time. Similarly, when the user is using a texting application on the mobile computing device, such as WhatsApp® (www.whatsapp.com) and the like, the real-time dynamic view of the road is displayed in the background, allowing the user to be alerted to any hazard on the road ahead of the vehicle.

Embodiments of the invention are such that the dashboard camera can be further configured to obtain images of the interior of the vehicle, and to transmit the same to a handheld device, e.g., a mobile computing device, such as a smartphone. For example, the dashboard camera can be configured to obtain an image of the interior of the vehicle when the driver has left the vehicle, for example, when the driver's door is closed. The image is sent to a handheld device associated with the driver.

Embodiments of the invention are such that the system can analyze the image of the interior of the vehicle and determine whether there is a passenger left in the vehicle. The camera can thus obtain a reference image of the interior of the vehicle when the driver opens the door to get in the car; when the driver closes the door after leaving the car, the system obtains a second image. The system then compares the second image to the reference image, and if there are any objects in the image which were not detected in the reference image, the system sends the image to the handheld device of the driver, alerting him to review the image and decide whether or not a passenger was left in the vehicle.
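The reference-image comparison described above can be sketched as a simple frame-differencing check. This is a hedged illustration, not the implementation the patent specifies: the name `cabin_changed` and the threshold value are assumptions, and a production system would likely use more robust change or object detection than a mean pixel difference.

```python
import numpy as np

DIFF_THRESHOLD = 10.0  # assumed mean per-pixel difference threshold

def cabin_changed(reference, subsequent, threshold=DIFF_THRESHOLD):
    """Compare a reference cabin image (taken when the driver entered)
    with a subsequent image (taken after the driver left). Returns True
    when the mean absolute pixel difference exceeds the threshold,
    i.e., when an object or passenger may remain in the cabin."""
    diff = np.abs(reference.astype(np.int16) - subsequent.astype(np.int16))
    return float(diff.mean()) > threshold

# Simulated 4x4 grayscale cabin images, purely for illustration.
empty_cabin = np.zeros((4, 4), dtype=np.uint8)
with_object = empty_cabin.copy()
with_object[1:3, 1:3] = 255  # a left-behind object in the cabin

# An unchanged cabin produces no alert; a changed one does.
```

When `cabin_changed` returns True, the flow in the text follows: the subsequent image is transmitted to the driver's handheld device for review.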

Embodiments of the invention use a Bluetooth® protocol, WiFi®, or any other short distance communication protocol. This way, as soon as the driver has left the vehicle, the image of the interior of the vehicle is obtained, analyzed and, if deemed necessary, sent to the driver.

Embodiments of the invention are such that the camera can be equipped with a long distance communication transceiver, such as a cellular communication SIM or the like. The camera can be configured to obtain a video clip of the interior of the car upon detection of movement in the vehicle and send same to the driver's handheld device. This way, the system can alert the driver of burglary, theft, or unauthorized movement and unauthorized drivers of the vehicle. The system can also be configured to allow the driver to initiate a request from the handheld device which is sent to the camera. The request can include instructions to obtain an image, or streaming video of the interior of the vehicle.
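The two flows just described, a motion-triggered push alert and a driver-initiated pull request, can be sketched as a small event handler. Every name here (`VehicleCamera`, `on_motion`, `handle_request`, the placeholder capture strings) is hypothetical, not from the disclosure; real capture and transmission would go through the camera hardware and the cellular transceiver.

```python
class VehicleCamera:
    """Illustrative sketch of the camera's alert/request flow."""

    def __init__(self, send_to_handheld):
        # send_to_handheld stands in for the long-distance uplink
        # (e.g., cellular via the SIM mentioned above).
        self.send_to_handheld = send_to_handheld

    def on_motion(self):
        # Movement detected in the cabin: capture a clip and push it
        # to the driver's handheld device unprompted.
        self.send_to_handheld(self.capture("video"))

    def handle_request(self, kind):
        # Driver-initiated request from the handheld device:
        # kind is "image" or "video" of the interior.
        self.send_to_handheld(self.capture(kind))

    def capture(self, kind):
        # Placeholder for real image/clip acquisition.
        return f"{kind}-of-interior"

sent = []                       # records what reaches the handheld
cam = VehicleCamera(sent.append)
cam.on_motion()                 # burglary/theft alert path
cam.handle_request("image")     # driver "check the cabin" path
```

The design point is that both paths share one transmit channel; only the trigger differs (sensor event versus user request).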

Embodiments of the invention are directed to a computerized method for providing a continuous view of a location. The method comprises: continuously obtaining a representation of a location from an imaging device as the imaging device travels along a path; converting the obtained representation into digital data for a mobile computing device including a display screen; and, electronically transmitting the digital data to the mobile computing device, such that the digital data is received by the mobile computing device and displays on a screen display of the mobile computing device in a visual format as the continuously obtained representation of the location.

Optionally, the continuously obtaining a representation of a location from an imaging device as the imaging device travels along a path is performed in real time.

Optionally, the electronically transmitting the digital data to the mobile computing device, such that the digital data is received by the mobile computing device, is in real time.

Optionally, the imaging device is a camera, and the continuously obtained representation from the camera is a continuous image obtained by the camera while a vehicle including the camera travels along the path. Optionally, the visual format includes an image.

Optionally, the image which displays on the screen display of the mobile computing device is a background image.

Optionally, the background image is configured for supporting a graphic indication of an application as a foreground image, which at least a portion thereof overlays the background image.

Optionally, the foreground image displays transparently over the at least a portion of the background image.

Optionally, the graphic indication of the application as a foreground image remains displayed until the event associated with the application ends.

Optionally, the mobile computing device includes a smartphone.

Optionally, the electronically transmitting the digital data to the mobile computing device is via at least one of Bluetooth®, WiFi® or a data communication over one or more communications networks.

Embodiments of the invention are directed to a computerized method for alerting a user of a mobile computing device, via the mobile computing device, of a situation. The method comprises: detecting a first event associated with a physical activity performed on a vehicle, and responding to the detection of the first event by obtaining a reference image of a portion of the cabin of the vehicle; detecting a second event associated with a physical activity performed on a vehicle, and responding to the detection of the second event by obtaining a subsequent image of substantially the same portion of the cabin of the vehicle; analyzing the first image against the second image for differences; and, if there are differences between the reference image and the subsequent image, transmitting the subsequent image to the mobile computing device associated with the user.

Optionally, the obtaining the reference image and the subsequent image is performed by an in-vehicle camera. Optionally, the mobile computing device is a smartphone.

Embodiments of the invention are directed to a computerized method for vehicle security. The method comprises: detecting a physical action associated with the vehicle; activating an imaging device in response to the detected physical action to obtain an image of a subject; analyzing the subject of the obtained image against images of known subjects associated with the vehicle; and, if the subject of the obtained image is different than the subjects from the known images associated with the vehicle, transmitting the obtained image of the subject to at least one mobile computing device.
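One way the subject-matching step above might be realized, purely as a hedged illustration and not as the method the patent specifies, is to compare a feature vector extracted from the obtained image against stored vectors for subjects known to be associated with the vehicle. The embedding step itself, the function name `is_known_subject`, and the distance threshold are all assumptions.

```python
import numpy as np

MATCH_THRESHOLD = 0.5  # assumed distance below which a subject is "known"

def is_known_subject(subject_vec, known_vecs, threshold=MATCH_THRESHOLD):
    """Return True when the imaged subject's feature vector is close
    (in Euclidean distance) to any stored vector for a known subject,
    e.g., the vehicle's owner. Any recognizer producing fixed-length
    feature vectors could supply the inputs."""
    for known in known_vecs:
        if np.linalg.norm(subject_vec - known) < threshold:
            return True
    return False  # unknown subject: transmit the image to the owner

# Toy 3-dimensional "feature vectors", purely for illustration.
owner = np.array([1.0, 0.0, 0.0])
stranger = np.array([0.0, 1.0, 0.0])
```

If `is_known_subject` returns False, the flow in the text follows: the obtained image of the subject is transmitted to the at least one mobile computing device.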

Optionally, the detected physical action is analyzed to determine whether the physical action is an abnormal event, and coupled with the subject of the obtained image being different than the subjects from the known images associated with the vehicle, transmitting the obtained image of the subject to at least one mobile computing device.

Optionally, the at least one mobile computing device includes a mobile computing device of a user associated with the vehicle.

Optionally, the mobile computing device includes a smartphone.

Embodiments of the present invention are directed to a system for providing a continuous view of a location. The system comprises: an imaging device for continuously obtaining a representation of a location as the imaging device travels along a path. The imaging device includes: a processor for converting the obtained representation into digital data for a mobile computing device including a display screen, the mobile computing device linked to the imaging device; and, a transmitter for electronically transmitting the digital data to the mobile computing device, such that the digital data is received by the mobile computing device and displays on the display screen of the mobile computing device in a visual format as the continuously obtained representation of the location.

Optionally, the mobile device includes a processor programmed to place a graphic indication of an application over the representation of the location as displayed on the display screen of the mobile computing device, the digital data of the representation of the location for supporting the indication of the application placed over the representation of the location. Optionally, the imaging device continuously obtains the representation in real time, the processor converts the obtained representation to digital data in real time, and the transmitter electronically transmits the digital data to the mobile computing device in real time.

Optionally, the imaging device includes a camera, configured for use in a vehicle.

Embodiments of the present invention are directed to a system for alerting a user of a mobile computing device, via the mobile computing device, of a situation. The system comprises: a storage medium for storing computer components; and, a processor for executing the computer components. The computer components comprise: a first computer component for detecting a first event associated with a physical activity performed on a vehicle, and responding to the detection of the first event by causing a reference image of a portion of the cabin of the vehicle to be obtained from an imaging device, and detecting a second event associated with a physical activity performed on a vehicle, and responding to the detection of the second event by causing a subsequent image of substantially the same portion of the cabin of the vehicle to be obtained from the imaging device; a second computer component analyzing the first image against the second image for differences; and, a third computer component such that, if there are differences between the reference image and the subsequent image, the third computer component for transmitting the subsequent image to the mobile computing device associated with the user.

Embodiments of the present invention are directed to a system for vehicle security. The system comprises: a storage medium for storing computer components; and, a processor for executing the computer components. The computer components comprise: a first computer component for detecting a physical action associated with the vehicle; a second computer component for activating an imaging device in response to the detected physical action to obtain an image of a subject; a third computer component for analyzing the subject of the obtained image against images of known subjects associated with the vehicle; and, a fourth computer component, such that if the subject of the obtained image is different than the subjects from the known images associated with the vehicle, the fourth computer component for transmitting the obtained image of the subject to at least one mobile computing device.

Embodiments of the present invention are directed to a computer usable non-transitory storage medium having a computer program embodied thereon for causing a suitable programmed system to provide a continuous view of a location to a linked mobile computing device, by performing the following steps when such program is executed on the system. The steps comprise: continuously obtaining a representation of a location from an imaging device as the imaging device travels along a path; converting the obtained representation into digital data for a mobile computing device including a display screen; and, electronically transmitting the digital data to the mobile computing device, such that the digital data is received by the mobile computing device and displays on a display screen of the mobile computing device in a visual format as the continuously obtained representation of the location.

Optionally, the computer usable non-transitory storage medium is such that the steps are performed in real time.

Optionally, the computer usable non-transitory storage medium is such that the mobile computing device includes a smartphone.

Optionally, the computer usable non-transitory storage medium is such that the steps additionally comprise: placing a graphic indication of an application over the representation of the location as displayed on the display screen of the mobile computing device, the digital data of the representation of the location for supporting the indication of the application placed over the representation of the location.

Embodiments of the present invention are directed to a computer usable non-transitory storage medium having a computer program embodied thereon for causing a suitable programmed system to alert a user of a mobile computing device, via the mobile computing device, of a situation, by performing the following steps when such program is executed on the system. The steps comprise: detecting a first event associated with a physical activity performed on a vehicle, and responding to the detection of the first event by obtaining a reference image of a portion of the cabin of the vehicle; detecting a second event associated with a physical activity performed on a vehicle, and responding to the detection of the second event by obtaining a subsequent image of substantially the same portion of the cabin of the vehicle; analyzing the first image against the second image for differences; and, if there are differences between the reference image and the subsequent image, transmitting the subsequent image to the mobile computing device associated with the user.

Embodiments of the present invention are directed to a computer usable non-transitory storage medium having a computer program embodied thereon for causing a suitable programmed system to alert a user of a mobile computing device, via the mobile computing device, of a situation associated with a vehicle, by performing the following steps when such program is executed on the system. The steps comprise: detecting a physical action associated with the vehicle; activating an imaging device in response to the detected physical action to obtain an image of a subject; analyzing the subject of the obtained image against images of known subjects associated with the vehicle; and, if the subject of the obtained image is different than the subjects from the known images associated with the vehicle, transmitting the obtained image of the subject to at least one mobile computing device.

Optionally, the computer usable non-transitory storage medium is such that the at least one mobile computing device includes a smartphone linked to the imaging device and associated with a driver of the vehicle.

This document references terms that are used consistently or interchangeably herein. These terms, including variations thereof, are as follows.

A "computer" includes machines, computers and computing or computer systems (for example, physically separate locations or devices), servers, computer and computerized devices, processors, processing systems, computing cores (for example, shared devices), and similar systems, workstations, modules and combinations of the aforementioned. The aforementioned "computer" may be of various types, such as a personal computer (e.g., laptop, desktop, tablet computer), or any type of computing device, including mobile devices (mobile computing devices) that can be readily transported from one location to another location, these mobile devices or mobile computing devices including, for example, smartphones, personal digital assistants (PDAs), other mobile telephones and cellular telephones. A server is typically a remote computer or remote computer system, or computer program (herein, in accordance with the "computer" defined above), that is accessible over a communications medium, such as a communications network or other computer network, including the Internet. A "server" provides services to, or performs functions for, other computer programs (and their users), in the same or other computers. A server may also include a virtual machine, a software-based emulation of a computer.

An "application" includes executable software, and optionally, any graphical user interfaces (GUI), through which certain functionality may be implemented.

A "client" is an application that runs on a computer, workstation or the like and relies on a server to perform some of its operations or functionality.

Unless otherwise defined herein, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein may be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.

BRIEF DESCRIPTION OF THE DRAWINGS

Some embodiments of the present invention are herein described, by way of example only, with reference to the accompanying drawings. With specific reference to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced.

Attention is now directed to the drawings, where like reference numerals or characters indicate corresponding or like components. In the drawings:

FIG. 1 is a diagram of an exemplary environment for the system in which embodiments of the invention are performed;

FIG. 2A is a block diagram of the vehicle camera of the system of the invention;

FIG. 2B is a block diagram of the home server of the system of the invention;

FIGs. 3A, 3B and 3C are flow diagrams of exemplary processes performed by the system of the invention;

FIGs. 4A and 4B are images of screen displays on a smartphone in accordance with embodiments of the invention;

FIGs. 5A and 5B are images of screen displays on a smartphone in accordance with other embodiments of the invention; and,

FIG. 6 is an image of a screen display of still another embodiment of the invention.

DETAILED DESCRIPTION OF THE DRAWINGS

Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways.

As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more non-transitory computer readable (storage) medium(s) having computer readable program code embodied thereon. Throughout this document, numerous textual and graphical references are made to trademarks and domain names. These trademarks and domain names are the property of their respective owners, and are referenced only for explanation purposes herein.

The present invention provides methods and systems for using in-vehicle cameras coupled with mobile computing devices, for vehicle safety, vehicle checks and vehicle security.

The present invention provides methods and systems for providing a real time view of the road, over which a vehicle is traveling, to mobile computing devices, such as smartphones, as taken from a camera in or otherwise associated with the traveling vehicle. The smartphone is typically the smartphone of the driver of the vehicle. The real time image of the road is taken along the direction of travel of the vehicle. This real time image is displayed on the smartphone display, typically as a background image. The system is such that other images, such as those indicative of an incoming or outgoing communication, such as a phone call, text message, or other application, for example, overlie the background image, or a portion thereof, or are to a side of the background image of the road, over which the vehicle is traveling. The background image and the overlay image (in the foreground) (as well as the side displays of the overlay) are displayed on the display screen of the smartphone. The display of the communication, such as a phone call, text message, or other application, is typically transparent, so that the background image of the road remains fully visible, even though there is an overlying (foreground) image representative of the communication, application or the like.
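By way of illustration only, the transparent overlay behavior described above can be modeled as simple alpha blending of a foreground (event indication) over the background (road image). The following sketch is not taken from any disclosed implementation; the function names, image representation (lists of rows of RGB tuples) and opacity value are illustrative assumptions:

```python
def blend_pixel(background, overlay, alpha):
    """Alpha-blend one RGB overlay pixel onto a background pixel.

    alpha is the overlay opacity in [0, 1]; a low value keeps the
    road image (background) clearly visible through the indication.
    """
    return tuple(
        round(alpha * o + (1.0 - alpha) * b)
        for b, o in zip(background, overlay)
    )


def composite(background, overlay, top, left, alpha=0.3):
    """Place a semi-transparent overlay (e.g., an incoming-call banner)
    over a background frame at the given row/column offset."""
    result = [row[:] for row in background]
    for i, row in enumerate(overlay):
        for j, px in enumerate(row):
            result[top + i][left + j] = blend_pixel(
                result[top + i][left + j], px, alpha)
    return result
```

A real system would perform this compositing in the display hardware or UI toolkit; the arithmetic above only illustrates why the road image remains visible beneath the indication.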

Reference is now made to FIG. 1, which shows an exemplary operating environment, including a network(s) 50 (also known as a communications network(s)), to which is linked a camera 100 in a vehicle 102, as the vehicle, e.g., an automobile, travels along a path, such as a street 104, road, highway or other thoroughfare (these terms are all interchangeable herein). The vehicle 102 is operated by a driver 106, who is in possession of a mobile computing device, such as a smartphone 110 (the smartphone 110 including a touchscreen display 110a (the touchscreen display also known as a screen display or display screen, these terms used interchangeably herein), shown in FIGs. 4A, 4B, 5A, 5B and 6). The network 50 is also linked to cellular towers 120, 121, which link to the smartphone 110 and camera 100 respectively, so these devices can access the network 50. The network 50 links to a home server (HS) 130, as well as an application server 132. The network 50 is, for example, a communications network, such as a Local Area Network (LAN), or a Wide Area Network (WAN), including public networks such as the Internet. The network 50 is either a single network or a combination of networks and/or multiple networks, including also (in addition to the aforementioned communications networks such as the Internet), for example, cellular networks. "Linked" as used herein includes both wired or wireless links, either direct or indirect, and placing the camera 100, mobile computing devices, such as smartphones 110, and computers, including servers 130, 132, components and the like, in electronic and/or data communications with each other.

The camera 100 is, for example, in the vehicle 102, and, for example, is mounted on the dashboard 102a of the vehicle 102. The camera 100 is, for example, operable as both a front and rear camera, with imaging apparatus for obtaining representations of a location, by capturing front (forward) 103f images, for example, of the location, for example, the vehicle's route of travel (which is continuously changing due to the vehicle's continued movement during travel), and rear (back) 103r images, for example, of the vehicle cabin or the vehicle's route of travel. The aforementioned images (representations of the location, e.g., the vehicle's route of travel) from the camera 100 are, for example, captured in real time, and are continuously updated, such that the camera images (obtains images of) the exact location at each instant (e.g., predetermined interval, such as every one-half second) along the path over which the vehicle is traveling. The camera 100 may also include laterally positioned imaging apparatus, so as to also be a side camera. The camera 100 links to the network 50, and the cellular phone 110 of or associated with the driver 106, via the cellular towers 120, 121. Should the camera 100 be in close enough proximity to the smartphone 110, these devices may communicate directly via short range transmissions, such as Bluetooth® (or WiFi®).
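The choice between the direct short range link (Bluetooth®/WiFi®) and the cellular path described above can be sketched, purely illustratively, as a proximity test; the RSSI threshold and function name below are assumptions, not part of the disclosure:

```python
def choose_link(rssi_dbm, rssi_threshold_dbm=-70):
    """Pick a transport for camera-to-smartphone traffic.

    Illustrative only: if a short-range signal reading (e.g., a
    Bluetooth RSSI in dBm) is available and strong enough, the devices
    talk directly; otherwise traffic falls back to the cellular path
    (towers 120, 121 and network 50). The -70 dBm cutoff is a
    placeholder assumption, not a disclosed value.
    """
    if rssi_dbm is not None and rssi_dbm >= rssi_threshold_dbm:
        return "short-range"   # direct Bluetooth/WiFi transmission
    return "cellular"          # via the cellular towers and network 50
```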

The home server (HS) 130, also known as a main server, includes a system 130', which, either alone or with other computers, including servers, components, and applications, receives communications and data from the camera 100, and sends the various data, such as that of real time images from the camera 100, both front and rear images, to the mobile computing device, e.g., smartphone 110, and vice versa.

The home server 130 and/or the application server 132 store and make accessible, for example, by downloading, the application (APP) 140 of the present invention, by mobile computing devices, such as the smartphone 110 (associated with the driver 106 of the vehicle 102). The application 140 includes executable software, and graphical user interfaces (GUI), through which the functionalities of the invention are implemented. The application 140 includes code segments which map to the camera 100 and the system 130' of the home server 130, for allowing the user's mobile computerized device, e.g., smartphone 110, to interact with the camera 100 and the system 130' of the home server 130.

There may also be additional servers, such as those for cloud storage 150, which can store all of the camera 100 images associated with each smartphone 110. This stored image data is useful, for example, by insurance companies, law enforcement, and the like, to determine the exact route of travel of the vehicle 102.

FIG. 2A shows a block diagram of the camera 100. The camera 100 includes a central processing unit (CPU) 202 formed of one or more processors, including data processors electronically connected, including in electronic and/or data communication with storage/memory 204, storage media 206, an application program interface (API) 208, a rules and policies module 210, an internal measurement unit (IMU) 212, an imaging module 214, a display module 216, an image analysis module 218, a communications module 220, a transceiver 222, and a messaging module 224. All of the aforementioned components of the camera 100 are linked so as to be in direct and/or indirect communications with each other.

The Central Processing Unit (CPU) 202 is formed of one or more processors, such as data processors, and includes microprocessors, for performing the camera 100 functions and operations detailed herein, as well as controlling the storage media 206, the application program interface (API) 208, the rules and policies module 210, the internal measurement unit (IMU) 212, the imaging module 214, the display module 216, the image analysis module 218, the communications module 220, the transceiver 222, and the messaging module 224, and performing the processes and subprocesses shown in FIGs. 3A, 3B and 3C, as detailed below. The processors are, for example, conventional processors, such as those used in servers, computers, and other computerized devices. For example, the processors may include x86 Processors from AMD and Intel, Xeon® and Pentium® processors from Intel, as well as any combinations thereof.

The storage/memory 204 is any conventional storage media. The storage/memory 204 stores machine executable instructions for execution by the CPU 202, to perform the processes of the invention. The storage/memory 204 also includes machine executable instructions associated with the operation of the components, including the storage media 206, the application program interface (API) 208, the rules and policies module 210, the internal measurement unit (IMU) 212, the imaging module 214, the display module 216, the image analysis module 218, the communications module 220, the transceiver 222, and the messaging module 224.

The storage media 206, for example, stores various data associated with the camera 100, including the images, both front and rear, taken by the camera 100, as well as the mobile computing devices, e.g., smartphones 110, and servers, such as the home server 130, to which the camera 100 is linked.

The interface module 208 is, for example, an Application Programming Interface (API) module, which functions to specify how the camera 100 requests, retrieves, and may share data from servers and other computers, e.g., mobile computing devices such as smartphones 110, linked to the network 50, as well as linked by short range transmissions. The API module 208 also functions to facilitate the interaction of the various software components, modules, and databases with each other, and also interactions with other servers and the like outside of the camera 100, for operation of the disclosed processes.

The rules and policies module 210 provides various rules and policies applied by the CPU 202, in performing the processes of the invention. This module 210 also determines "events" which cause action by the various components of the camera 100, smartphone 110 and/or home server 130. For example, a device event may be the use of an application, or the sending/receiving of a call, a message or other communication. It may also be a sound, such as a door opening, closing, and/or unlocking/locking, detected by the IMU 212, or sounds and forces associated with a forced entry, a crash of the vehicle, or the like, also as detected by the IMU 212.

The internal measurement unit (IMU) 212 includes, for example, a gyrometer, an accelerometer and a magnetometer, in order to detect movements, sounds, vibrations, shocks, jolts, forces, angular rates, accelerations, magnetic fields, and the like in the vehicle 102, and its surroundings.
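A minimal, purely illustrative sketch of how such IMU readings might be turned into an "event" is a threshold on acceleration magnitude; the threshold value and names below are assumptions standing in for criteria that would come from the rules and policies module 210:

```python
import math


def detect_shock(accel_samples, threshold_g=2.5):
    """Flag an IMU 'event' when any acceleration sample's magnitude
    exceeds a threshold (e.g., a jolt from a forced entry or crash).

    accel_samples: iterable of (x, y, z) readings in units of g.
    The 2.5 g cutoff is an illustrative placeholder, not a disclosed
    value; real rules would come from the rules/policies module.
    """
    for x, y, z in accel_samples:
        if math.sqrt(x * x + y * y + z * z) >= threshold_g:
            return True
    return False
```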

The imaging module 214 obtains and digitizes the images (converts the images into digital data/digital media) from the front and rear cameras (and optionally, side and top/bottom cameras, if present) of the camera 100. The display module 216 obtains the digitized images and formats the images from the camera 100 for the various devices and their display screens, such as that for the touchscreen display 110a of the smartphone 110.

The image analysis module 218 analyzes images against each other in accordance with the rules and policies of the rules and policies module 210. When appropriate, the module 218 signals the messaging module 224 to issue messages to users, such as the smartphone 110, via the communications module 220 or the transceiver 222.

The communications module 220 is designed to handle communications over networks 50, such as the Internet, cellular networks and the like. For example, the communications module handles data exchanges (transmission and receipt of data) between the camera 100 and the home server 130, as well as other computers, servers and the like on the network 50, and communications, including data exchanges, between the various mobile computing devices, e.g., smartphones 110, and the home server 130, as well as other computers, servers, and the like on the network 50. This module 220 includes, for example, transmitters and receivers for handling data exchanges and communications over the network(s) 50.

The transceiver 222, formed of a transmitter and a receiver, handles communications, including data exchanges, between the camera 100 and mobile computing devices, e.g., smartphone 110. The communication and data exchanges are, for example, in accordance with short distance protocols, such as Bluetooth® and WiFi®.

The messaging module 224 handles administration and operation of the messaging processes, and applies various rules and policies (from the rules and policies module 210) to the messaging processes when necessary.

The home server (HS) 130 serves to facilitate communications and data exchanges between the camera 100 and the mobile computing device, e.g., the smartphone 110. The home server 130 may include one or more servers, components and the like, and may be or include a server associated with the carrier of the smartphone 110, such as Sprint®, Verizon®, T-Mobile®, Orange®, Partner®, and the like. The home server 130 includes, for example, components for facilitating data exchanges between one or more of the home server 130, the camera 100, mobile computing devices, e.g., the smartphone 110, and other servers and components, such as cloud storage 150, linked to the network 50. The components germane to the home server 130 are described below, although the home server 130 may include numerous other components, modules and engines for performing numerous other functions. As shown in FIG. 2B, the home server 130 and its system 130' include, for example, a CPU 252, storage/memory 254, storage media 256, and a communications module 258. All of the aforementioned components of the home server 130 are linked so as to be in direct and/or indirect communications with each other.

The Central Processing Unit (CPU) 252 is formed of one or more processors, such as data processors, and includes microprocessors, for performing the home server 130 and system 130' functions and operations detailed herein, as well as controlling the storage media 256 and the communications module 258, and performing the processes and subprocesses shown in FIGs. 3A, 3B and 3C, as detailed below. The processors are, for example, conventional processors, such as those used in servers, computers, and other computerized devices. For example, the processors may include x86 Processors from AMD and Intel, Xeon® and Pentium® processors from Intel, as well as any combinations thereof.

The storage/memory 254 is any conventional storage media. The storage/memory 254 stores machine executable instructions for execution by the CPU 252, to perform the processes of the invention. The storage/memory 254 also includes machine executable instructions associated with the operation of the components, including the storage media 256, and the communications module 258.

The storage media 256, for example, stores various data associated with the home server 130, including the images, both front 103f and rear 103r, taken by the camera 100, as well as data associated with the mobile computing devices, e.g., smartphones 110, and servers, such as the home server 130, cloud storage servers 150 and the like, to which the camera 100 is linked.

The communications module 258 is designed to handle communications over networks 50, such as the Internet, cellular networks and the like. For example, the communications module handles data exchanges (transmission and receipt of data) between the camera 100 and the home server 130, as well as other computers, servers and the like on the network 50, and communications, including data exchanges, between the various mobile computing devices, e.g., smartphones 110, and the home server 130, as well as other computers, servers and the like on the network 50.

Attention is now directed to FIGs. 3A, 3B and 3C, which show flow diagrams detailing computer-implemented processes in accordance with embodiments of the present invention. Reference is also made to the screen diagrams of FIGs. 4A, 4B, 5A, 5B and 6, which appear on the display, e.g., touchscreen, of the smartphone 110. The processes and subprocesses of FIGs. 3A, 3B and 3C are computerized processes performed by the camera 100 and/or the system 130' of the home server 130, with, for example, the application 140, deployed on the user mobile computing devices, e.g., the smartphone 110 of the driver 106. The aforementioned processes and subprocesses are, for example, performed automatically, and, for example, in real time.

Prior to the START block 300 of the process detailed in FIG. 3A, the application 140 has been installed by the driver 106 on his associated smartphone 110. The process begins at the START block 300, when the application 140 is open, or otherwise activated (running and executing), and, for example, a connection or pipe from the smartphone 110 to the camera 100 and/or the home server 130, is opened.

The process moves to block 302, where the camera 100, such as via its front camera, shows the front camera image 103f, which is a view of the street 104 along which the vehicle 102 is traveling. This image, as taken by the front camera of the camera 100, is transmitted to (e.g., by the communications module 220 or transceiver 222, depending on the distance between the camera 100 and the smartphone 110) and displayed on the display 110a of the smartphone 110. As shown in FIG. 4A, the image 400 on the display 110a of the smartphone 110 is displayed, for example, in real time. For example, this image 400 is taken at a first time, or time 1, which is 12:43 pm. This image 400 is displayed, for example, as the background, or wallpaper, of the touchscreen display 110a of the smartphone 110, for example, during the time the application 140 is running on the smartphone 110.

The process moves to block 304, where the smartphone 110 is monitored for a device event. If a device event is not detected, the process returns to block 302. Should a device event be detected at block 304, for example, the use of an application, or the sending/receiving of a call, a message or other communication, the process moves to block 306. At block 306, it is determined whether to display the event, for example, as an overlay or a lateral placement, or combination thereof, over the background image, which is the view of the street 104, over which the vehicle 102 is traveling. If no, at block 306, the process moves to block 302, from where it resumes. If yes at block 306, the process moves to block 308, where the event, e.g., an indication of the event, is displayed as a graphic indication (or representation) of the application, for example, as an overlay or a lateral placement, or combination thereof, over (placed over) the background image, which is the view of the street 104, over which the vehicle 102 is traveling. FIG. 4B shows the indication of the event 402, an incoming telephone call from G. Brett of telephone number 816-333-5555, received at time 2, 12:44 pm, which is subsequent to time 1. This indication of the event 402 is visual (graphic), and is displayed within the background image 400 (view of the street 104 along which the vehicle 102 is traveling), for example, in real time. The indication of the event 402 is transparent (a transparent overlay, e.g., foreground image), allowing for maximization of the background image 400. The indication of the event 402 is, for example, placed centrally within the background image 400, but may also be placed anywhere within the background image 400, as well as positioned partially within the background image 400, or as a side image, outside of the background image 400.

The process moves to block 310, where it is determined whether the event is complete. If no, the process remains at block 310, and the touchscreen display remains as shown in FIG. 4B. If yes at block 310, the event is complete, as it has ended, e.g., the phone call is closed, or disconnected, and the process moves to block 312. At block 312, the event indicator 402 (foreground image) disappears, as it is removed from the background image 400, such that the image on the touchscreen display 110a of the smartphone 110 returns to that similar to the image 400 of FIG. 4A.
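For illustration only, one pass through the flow of blocks 302-312 can be summarized in the following sketch; all callables (frame capture, event detection, display policy, rendering) are stubbed placeholders rather than disclosed components:

```python
def run_display_pass(get_frame, get_event, should_display, show):
    """One pass of the FIG. 3A flow: show the road image as the
    background (block 302); if a device event is detected (block 304)
    and policy says to display it (block 306), render its transparent
    indication over the background (block 308). All four arguments are
    caller-supplied callables; names here are illustrative."""
    frame = get_frame()              # block 302: real-time road image
    event = get_event()              # block 304: call, message, app use
    if event is None:
        show(frame, overlay=None)    # background only
        return "background-only"
    if not should_display(event):    # block 306: policy says no
        show(frame, overlay=None)
        return "event-suppressed"
    show(frame, overlay=event)       # block 308: transparent indication
    return "event-displayed"
```

A real implementation would loop this pass continuously and, per blocks 310-312, keep the overlay until the event completes, then remove it.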

FIG. 3B shows another process in accordance with the present invention. Prior to the START block 330 of the process detailed in FIG. 3B, the application 140 has been installed by the driver 106 on his associated smartphone 110. The process begins at the START block 330, when the application 140 is open or otherwise activated (running and executing), and, for example, a connection or pipe from the smartphone 110 to the camera 100 and/or the home server 130, is opened.

At block 332, the IMU 212 detects an event, in accordance with the rules and policies (of module 210). The event is, for example, a vehicle 102 door opening, such as an unlocking sound and the movement of the door being opened being detected. The IMU signals the camera 100, which causes the rear camera to take an image of the cabin of the vehicle 102, at block 334. This image is known as a reference image, as it is the initial or first image taken. For example, the image 500 is the reference image, and is displayed on the touchscreen display 110a, and shows the back seat of the vehicle 102 (in the vehicle cabin), and is taken at a first time, time 1, e.g., 13:02, as shown in FIG. 5A. The process moves to block 336, where this first or reference image at time 1 (13:02) is placed into the storage media 206 of the camera 100. At block 338, the IMU 212 detects another event, subsequent to the previous event being detected (at block 332), this subsequent event also in accordance with the rules and policies (of module 210). The subsequent event is, for example, a vehicle 102 door opening and subsequently being closed, and/or locked, as detected by the IMU 212. The detection of this subsequent event causes the IMU 212 to signal the camera 100, which causes the rear camera to take an image (subsequent/second image) of the cabin of the vehicle 102, at block 340. For example, the subsequent image 502 is taken at time 2, e.g., 14:03, which is subsequent to time 1, and is displayed on the touchscreen display 110a of the driver's smartphone 110. The process moves to block 342, where the subsequent image is stored.

The process moves to block 344, where the stored images, the reference image 500 and the subsequent image 502, are compared by the image analysis module 218. At block 346, the image analysis module 218, in accordance with the rules and policies (of rules and policies module 210), determines whether there are differences in the two images, e.g., the first or reference image 500 at time 1 of FIG. 5A and the subsequent (second) image 502 from time 2 of FIG. 5B. If no, the process moves to block 350, where it ends. If yes, at block 348, the second image, with or without a message, is sent to the smartphone 110 associated with the driver 106, and linked to the camera 100, so the driver can see the content of the subsequent (second) image. For example, from FIG. 5B, the driver has left his backpack 504 in the vehicle, which the driver 106 can see as image 502 is displayed on the touchscreen display 110a of his associated smartphone 110, as shown in FIG. 5B.
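For illustration only, the comparison of blocks 344-346 might, in its simplest form, count pixels that differ between the reference and subsequent images; the grayscale representation (lists of rows of intensity values) and both thresholds below are assumptions standing in for the criteria of the rules and policies module 210:

```python
def images_differ(reference, subsequent, per_pixel_tol=10, min_changed=5):
    """Naive stand-in for the block 344/346 comparison: count pixels
    whose grayscale values differ by more than a tolerance, and report
    a difference when enough pixels changed (e.g., a backpack left on
    the back seat). Both thresholds are illustrative placeholders."""
    changed = 0
    for ref_row, sub_row in zip(reference, subsequent):
        for ref_px, sub_px in zip(ref_row, sub_row):
            if abs(ref_px - sub_px) > per_pixel_tol:
                changed += 1
    return changed >= min_changed
```

A deployed system would likely use a more robust comparison (registration, lighting normalization, or the AI-based matching mentioned below) rather than raw pixel counts.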

Alternately, the image analysis of block 346 may be based on artificial intelligence (AI), where the differences in the reference 500 and subsequent 502 images are compared against images of known conditions, to determine whether the subsequent image 502 should be sent (the process moves to block 348) or not be sent (the process moves to block 350).

This process of FIG. 3B is also useful in determining whether people, such as children, and/or pets were accidentally left in the vehicle. This process allows for immediate attention to the situation, so that the children or pets can be attended to before being left too long in the vehicle cabin.

FIG. 3C shows another process in accordance with the present invention. Prior to the START block 360 of the process detailed in FIG. 3C, the application 140 has been installed by the driver 106 on his associated smartphone 110. The process begins at the START block 360, when the application 140 is open or otherwise activated (running and executing), and, for example, a connection or pipe from the smartphone 110 to the camera 100 and/or the home server 130, is opened.

At block 362, the IMU 212 detects an event, in accordance with the rules and policies (of module 210). The event is, for example, an abnormal action (an abnormal event), such as sounds and forces of a lock being tampered with, a window being broken, a vehicle door subsequently opening, and a person entering the vehicle 102. The process moves to block 364, where the IMU 212 signals the camera 100 to activate, and causes the rear camera to obtain (take) an image of the cabin of the vehicle 102, at block 366. The process moves to block 368, where the image is analyzed by the image analysis module 218.

The process moves to block 370, where it is determined, by the image analysis module 218, whether the image is of an unknown entity. For example, the image analysis module 218 compares the image of the driver 106 to stored images of previous drivers. This comparison is optionally coupled with rules and policies for the IMU 212 event (from the rules and policies module 210). For example, an analysis may be based on whether this is the first time for this driver's image, and optionally coupled with rules and policies for the IMU 212 (from the rules and policies module 210, e.g., the IMU detected event is indicative of a break-in). If no, as per block 370, the process moves to block 362, from where it resumes, as detailed above. If yes, as per block 370, the process moves to block 372, where the messaging module 224 and the communications module 220 and/or transceiver 222 are signaled to transmit the image, for example, obtained from the rear camera of the camera 100, to the smartphone 110 (e.g., touchscreen display 110a of the smartphone 110), to which the camera 100 is linked, the smartphone 110 associated with the driver 106. The transmitted image is of the cabin of the vehicle 102, and optionally, may also include a message indicating the unauthorized person in the vehicle 102. An example of the image 600 taken by the rear camera of the camera 100, of the unauthorized person, along with a message 602 indicating the same, as shown on the touchscreen display 110a of the smartphone 110 associated with the driver 106, is shown in FIG. 6.
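The unknown-entity determination of block 370 can be sketched, again purely illustratively, as a nearest-match test over stored driver signatures; the feature-vector representation, Euclidean distance metric and threshold below are assumptions (the feature extraction itself is not shown):

```python
def is_unknown_entity(new_signature, known_signatures, max_distance=0.5):
    """Block 370 in miniature: treat the occupant as unknown when the
    image signature (a feature vector; extraction not shown) is far from
    every stored driver signature. With no stored signatures, the
    occupant is unknown by definition. Metric and threshold are
    illustrative assumptions, not disclosed values."""
    def distance(a, b):
        # Euclidean distance between two equal-length feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    return all(distance(new_signature, s) > max_distance
               for s in known_signatures)
```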

Alternately, at block 372, the image and/or message may be transmitted to third parties, such as law enforcement, first responders and the like, either instead of the smartphone 110 of the driver/user, or in addition thereto, when the application 140 is programmed accordingly. In alternative embodiments, the system of the invention and the processes of the invention can also be used with streaming video, in accordance with the descriptions provided above.

The implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.

For example, hardware for performing selected tasks according to embodiments of the invention could be implemented as a chip or a circuit. As software, selected tasks according to embodiments of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In an exemplary embodiment of the invention, one or more tasks according to exemplary embodiments of the method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions. Optionally, the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, non-transitory storage media such as a magnetic hard-disk and/or removable media, for storing instructions and/or data. Optionally, a network connection is provided as well. A display and/or a user input device such as a keyboard or mouse are optionally provided as well.

For example, any combination of one or more non-transitory computer readable (storage) medium(s) may be utilized in accordance with the above-listed embodiments of the present invention. The non-transitory computer readable (storage) medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.

A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.

As will be understood with reference to the paragraphs and the referenced drawings, provided above, various embodiments of computer-implemented methods are provided herein, some of which can be performed by various embodiments of apparatuses and systems described herein and some of which can be performed according to instructions stored in non-transitory computer-readable storage media described herein. Still, some embodiments of computer-implemented methods provided herein can be performed by other apparatuses or systems and can be performed according to instructions stored in computer-readable storage media other than that described herein, as will become apparent to those having skill in the art with reference to the embodiments described herein. Any reference to systems and computer-readable storage media with respect to the foregoing computer-implemented methods is provided for explanatory purposes, and is not intended to limit any of such systems and any of such non-transitory computer-readable storage media with regard to embodiments of computer-implemented methods described above. Likewise, any reference to the foregoing computer-implemented methods with respect to systems and computer-readable storage media is provided for explanatory purposes, and is not intended to limit any of such computer-implemented methods disclosed herein.

The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
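As an illustration of the point above, the following minimal Python sketch shows how two flowchart blocks drawn in succession can nonetheless execute substantially concurrently. The block names and their bodies are hypothetical placeholders, not drawn from the application.

```python
# Illustration only: two "blocks" from a flowchart, drawn as A -> B,
# executed concurrently via a thread pool rather than strictly in order.
from concurrent.futures import ThreadPoolExecutor

def block_a():
    # Hypothetical first block, e.g. preparing image data.
    return "frame-prepared"

def block_b():
    # Hypothetical second block, e.g. opening a network connection.
    return "link-ready"

def run_concurrently():
    # Both blocks are submitted before either result is awaited,
    # so their execution may overlap in time.
    with ThreadPoolExecutor(max_workers=2) as pool:
        future_a = pool.submit(block_a)
        future_b = pool.submit(block_b)
        return future_a.result(), future_b.result()

print(run_concurrently())
```

The observable results are the same as for sequential execution; only the scheduling differs, which is the point the flowchart convention leaves open.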

The descriptions of the various embodiments of the present invention have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

As used herein, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise.

The word "exemplary" is used herein to mean "serving as an example, instance or illustration". Any embodiment described as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.

It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements. The above-described processes, including portions thereof, can be performed by software, hardware and combinations thereof. These processes and portions thereof can be performed by computers, computer-type devices, workstations, processors, micro-processors, other electronic searching tools and memory and other non-transitory storage-type devices associated therewith. The processes and portions thereof can also be embodied in programmable non-transitory storage media, for example, compact discs (CDs) or other discs including magnetic, optical, etc., readable by a machine or the like, or other computer usable storage media, including magnetic, optical, or semiconductor storage, or other sources of electronic signals.

The processes (methods) and systems, including components thereof, herein have been described with exemplary reference to specific hardware and software. The processes (methods) have been described as exemplary, whereby specific steps and their order can be omitted and/or changed by persons of ordinary skill in the art to reduce these embodiments to practice without undue experimentation. The processes (methods) and systems have been described in a manner sufficient to enable persons of ordinary skill in the art to readily adapt other hardware and software as may be needed to reduce any of the embodiments to practice without undue experimentation and using conventional techniques.

Although the invention has been described in conjunction with specific embodiments thereof, it. is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.