

Title:
SYSTEM AND METHODS FOR INDOOR NAVIGATION
Document Type and Number:
WIPO Patent Application WO/2015/083150
Kind Code:
A1
Abstract:
Indoor navigation methods and a system for navigating a mobile device coupled with a user inside a particular indoor facility. The system includes an indoor-navigation-services server and at least one mobile device having an activated indoor-navigation-management module in communication flow with the indoor-navigation-services server. The method includes identifying the particular indoor facility and sending that identity to the indoor-navigation-services server; receiving data associated with the particular indoor facility, including site-maps and beacons, including the location and dimensions of the beacons; receiving a plurality of sensed data readings from selected sensors of the mobile device; determining the position of the mobile device, inside the particular indoor facility, from an acquired image frame; tracking movements of the mobile device, using elements detected in an acquired sequence of image frames; and updating the position of the mobile device, using the tracked movements of the mobile device.

Inventors:
DEVORA GIL (IL)
Application Number:
PCT/IL2014/000064
Publication Date:
June 11, 2015
Filing Date:
December 08, 2014
Assignee:
SHOP CLOUD LTD (IL)
International Classes:
G01C21/20; G01C21/10
Foreign References:
US20130297205A12013-11-07
US20130158941A12013-06-20
US20100135527A12010-06-03
US20110135207A12011-06-09
Attorney, Agent or Firm:
PRESENTI, Eran (16 Abba Hillel St, 02 Ramat Gan, IL)
Claims:
WHAT IS CLAIMED IS:

1. An indoor navigation method for navigating a mobile device (22) coupled with a user, having a camera (24), inside a particular indoor facility, comprising the steps of:

a) providing:

i) an indoor-navigation-services server (102) having a main-processor (110) and a database (130);

ii) an activated indoor-navigation-management module (120) being in communication flow with said indoor-navigation-services server;

b) identifying the particular indoor facility by said indoor-navigation-management module;

c) sending said identity of the particular indoor facility to said main-processor by said indoor-navigation-management module;

d) receiving site data associated with the particular indoor facility, said site data comprising site-maps and beacons/landmarks, including data associated with said beacons/landmarks, wherein said data associated with said beacons/landmarks includes the location and dimensions of said beacons/landmarks;

e) receiving a plurality of sensed data readings from selected sensors of said mobile device;

f) acquiring at least one positioning image frame by said camera;

g) identifying at least one of said beacons/landmarks in said at least one acquired positioning image frame, using said at least one acquired positioning image frame and said data associated with said identified beacon/landmark;

h) computing the distance and elevation of said mobile device, inside the particular indoor facility, to said identified beacon;

i) acquiring a sequence of tracking image frames by said camera;

j) tracking movements of said mobile device inside the particular indoor facility, using detected elements in said acquired sequence of tracking image frames; and

k) updating the position of said mobile device, using said tracked movements of said mobile device.

2. The indoor navigation method of claim 1, wherein said indoor-navigation-services server is a remote server.

3. The indoor navigation method of claim 1, wherein said identifying of the particular indoor facility is obtained using the GPS of said mobile device or obtained from the cellular services provider using a location device driver of said mobile device.

4. The indoor navigation method of claim 1 further comprising a step of calibrating/initializing said selected mobile device sensors.

5. The indoor navigation method of claim 4, wherein said selected mobile device sensors are selected from the group consisting of a camera, a gyroscope and a magnetometer.

6. The indoor navigation method of claim 4, wherein said selected mobile device sensors are selected from the group including a front camera, a rear camera, a magnetometer, an accelerometer, a gyroscope, an absolute height sensor and a relative height sensor.

7. The indoor navigation method of claim 1, wherein said computing the position of said mobile device comprises the steps of:

a) detecting at least one object in said at least one acquired positioning image frame; and

b) matching detected objects with shapes received from said data associated with said beacons/landmarks.

8. The indoor navigation method of claim 1, wherein said computing the position of said mobile device comprises the steps of:

a) detecting at least one object in said at least one acquired positioning image frame;

b) performing OCR analysis on said at least one detected object; and

c) matching detected text with text data included in said data associated with said beacons/landmarks.

9. The indoor navigation method of claim 1, wherein said computing the position of said mobile device comprises the steps of:

a) detecting at least one object in said at least one acquired positioning image frame;

b) performing color separation analysis on said at least one detected object to thereby compute analyzed color data of said at least one detected object; and

c) matching said analyzed color data with color data included in said data associated with said beacons/landmarks.

10. The indoor navigation method of claim 1, wherein said tracking movements of said mobile device uses optical flow analysis, comprising the steps of:

a) detecting contours in the image frames;

b) analyzing said detected contours to thereby determine at least one trackable element that appears in successive image frames of said acquired sequence of tracking image frames;

c) calculating the size of said at least one trackable element, using pitch and height data sensed by said selected mobile device sensors;

d) comparing size and image frame location of said at least one trackable element in said successive image frames, to thereby determine distance and direction of motion of said mobile device; and

e) updating the position of said mobile device.

11. The indoor navigation method of claims 1 or 10, wherein upon detection of accelerated motion of said mobile device that is at a rate faster than a pre-configured threshold, said tracking movements of said mobile device comprises the steps of:

a) detecting motion vector using one or more accelerometers and a gyroscope;

b) detecting motion direction using a gyroscope; and

c) updating the position of said mobile device.

12. The indoor navigation method of claims 10 or 11, wherein said tracking of movements of said mobile device further comprises the steps of:

a) counting the user's steps to thereby determine distance of motion;

b) detecting motion direction using a gyroscope; and

c) updating the position of said mobile device.

13. The indoor navigation method of claim 12, wherein said counting the user's steps is performed by a step counting module running on said mobile device.

14. The indoor navigation method of claim 1, wherein when a beacon/landmark is detected by said indoor-navigation-management module in said step of identifying at least one of said beacons/landmarks in said at least one acquired positioning image frame, and said indoor-navigation-management module does not find a matched beacon/landmark in said site data, the indoor navigation method further includes the step of updating said database with said detected beacon/landmark.

15. The indoor navigation method of claim 1 further includes the steps of:

a) sending said at least one acquired positioning image frame to said main-processor of said indoor-navigation-services server;

b) detecting elements in said at least one acquired positioning image frame;

c) matching said elements with beacons/landmarks in said site data stored in said database; and

d) updating said database with said detected elements for which no match was found.

16. An indoor navigation system comprising:

a) an indoor-navigation-services server (102) having a main-processor (110) and a database (130); and

b) at least one mobile device (22) coupled with a user, having an activated indoor-navigation-management module (120) being in communication flow with said indoor-navigation-services server,

wherein said indoor-navigation-management module is configured to identify a desired indoor facility;

wherein said indoor-navigation-management module sends said identity of the indoor facility to said indoor-navigation-services server and receives site data associated with the indoor facility, said site data comprising site-maps and beacons/landmarks, including data associated with said beacons/landmarks, and wherein said data associated with said beacons/landmarks includes the location and dimensions of said beacons/landmarks;

wherein said indoor-navigation-management module is configured to receive a plurality of sensed data readings from selected sensors of said mobile device;

wherein said indoor-navigation-management module is configured to determine the position of said mobile device using an identified beacon/landmark;

wherein said indoor-navigation-management module is configured to track movements of said mobile device inside the indoor facility; and

wherein said indoor-navigation-management module is configured to update the current position of said mobile device using said tracked movements.

17. The indoor navigation system of claim 16, wherein said indoor-navigation-services server is a remote server.

18. The indoor navigation system of claim 16, wherein said identifying of the particular indoor facility is obtained using the GPS of said mobile device or obtained from the cellular services provider using a location device driver of said mobile device.

19. The indoor navigation system of claim 16, wherein said selected mobile device sensors are selected from the group consisting of a camera, a gyroscope and a magnetometer.

20. The indoor navigation system of claim 16, wherein said selected mobile device sensors are selected from the group including a front camera, a rear camera, a magnetometer, an accelerometer, a gyroscope, an absolute height sensor and a relative height sensor.

Itz Katz, Patent Attorney,

M. Firon & Co.

Aurec House, 16 Abba Hillel Silver Road Ramat-Gan 52506, Israel

For the applicant

Description:
SYSTEM AND METHODS FOR INDOOR NAVIGATION

ABSTRACT

Indoor navigation methods and a system for navigating a mobile device coupled with a user inside a particular indoor facility. The system includes an indoor-navigation-services server and at least one mobile device having an activated indoor-navigation-management module in communication flow with the indoor-navigation-services server. The method includes identifying the particular indoor facility and sending that identity to the indoor-navigation-services server; receiving data associated with the particular indoor facility, including site-maps and beacons, including the location and dimensions of the beacons; receiving a plurality of sensed data readings from selected sensors of the mobile device; determining the position of the mobile device, inside the particular indoor facility, from an acquired image frame; tracking movements of the mobile device, using elements detected in an acquired sequence of image frames; and updating the position of the mobile device, using the tracked movements of the mobile device.

FIELD OF THE INVENTION

The present invention generally relates to the field of indoor navigation and, more particularly, to a system and methods that provide an indoor navigation service using the built-in sensors of a personal mobile device to detect recognized beacons such as signs and logos of chain stores.

BACKGROUND OF THE INVENTION

Outdoor navigation has recently become common using mobile devices having a satellite-based GPS. However, the accuracy and availability of GPS dramatically deteriorate indoors and therefore, conventional methods become unavailable. Since GPS signals are ineffective in indoor environments, many other methods using other signals such as WiFi, RFID, Bluetooth, ultrasonic waves, compass, G-meter or the like have been proposed. The term "indoor navigation", as used herein, refers to navigation in any environment in which conventional, satellite-based GPS accuracy and/or availability are dramatically deteriorated.

There is therefore a need, and it would be advantageous, to have an indoor navigation system and methods that rely only on sensors built into conventional mobile devices, such as cameras, a magnetometer, accelerometers, a gyroscope and the like.

SUMMARY OF THE INVENTION

The principal intentions of the present invention include providing a system and methods for indoor navigation using a personal mobile device, such as a smartphone or a tablet, and a remote server, without a need to install additional hardware within the indoor facility itself.

The indoor navigation includes three main processes:

a) Data downloading from the remote server;

b) Positioning the mobile device within the indoor facility; and

c) Tracking the motion of the user/mobile device within the indoor facility.

The indoor facility site may be located automatically by a GPS/device locating system when approaching the site, or using the last known GPS location of the mobile device. The data associated with that site is then automatically downloaded to the mobile device. Otherwise, the management module asks the user to identify the site manually. The downloaded data includes beacons, floor maps, store data, landmarks and the like.

The term "beacon", as used herein refers to a visible object in the environment of a given indoor facility, wherein the absolute location of the object, within the indoor facility, is known, as well as other features such as the shape, dimensions and height (above the floor, for example). For example, with no limitations, beacons may be in the form of a store sign, a logo and gates. Landmarks may be in the form of Exit signs, patterns on the ceiling such as patterns of lighting, other signs (textual or images), pictures on the walls, signs on the floor and so on and so forth. A camera of the mobile device acquires a sequence of image frames from which one or more beacons/landmarks are identified. Since the location and dimensions of the identified beacon/landmark is known, the distance from the beacon and the angle at which the image frame was taken from, in relation with the identified beacon/landmark, can be calculated.

An aspect of the indoor navigation methods is the inclusion of a learning process. That is, when the positioning process is over, the acquired data is sent to the server for analysis, and if the server identifies a new beacon/landmark (such as a store change), the server updates the database accordingly.

According to the teachings of the present invention, there are provided methods for navigating a mobile device, operatively coupled with a user and having a camera, inside a particular indoor facility. The method includes the step of providing an indoor-navigation-services server, having a main-processor and a database, and an activated indoor-navigation-management module being in communication flow with the indoor-navigation-services server.

The method further includes the steps of identifying the particular indoor facility by the indoor-navigation-management module; sending the identity of the particular indoor facility to the main-processor by the indoor-navigation-management module; receiving site data associated with the particular indoor facility, the site data including site-maps and beacons/landmarks, including data associated with the beacons/landmarks, wherein the data associated with the beacons/landmarks includes the location and dimensions of the beacons/landmarks.

The method further includes the steps of receiving a plurality of sensed data readings from selected sensors of the mobile device; acquiring at least one positioning image frame by the camera; and determining the position of the mobile device inside the particular indoor facility at the time the at least one positioning image frame was acquired.

The step of determining the position of the mobile device includes the steps of identifying at least one of the beacons/landmarks in the at least one acquired positioning image frame, using the at least one acquired positioning image frame and the data associated with the identified beacon/landmark; and computing the distance and elevation of the mobile device, inside the particular indoor facility, to the identified beacon.

Optionally, the computing the position of the mobile device includes the steps of detecting at least one object in the at least one acquired positioning image frame, and matching detected objects with shapes received from the data associated with the beacons/landmarks.

Optionally, the computing the position of the mobile device includes the steps of detecting at least one object in the at least one acquired positioning image frame, performing OCR analysis on the at least one detected object, and matching detected text with text data included in the data associated with the beacons/landmarks.

Optionally, the computing the position of the mobile device includes the steps of detecting at least one object in the at least one acquired positioning image frame, performing color separation analysis on the at least one detected object to thereby compute analyzed color data of the at least one detected object, and matching the analyzed color data with color data included in the data associated with the beacons/landmarks.

The identifying of the particular indoor facility is obtained using the GPS of the mobile device or obtained from the cellular services provider using a location device driver of the mobile device.

Optionally, the indoor navigation method further includes a step of calibrating/initializing the selected mobile device sensors.

The selected mobile device sensors are selected from the group consisting of a camera, a gyroscope and a magnetometer.

Optionally, the selected mobile device sensors are selected from the group including a front camera, a rear camera, a magnetometer, an accelerometer, a gyroscope, an absolute height sensor and a relative height sensor.

The method for navigating a mobile device further includes the steps of acquiring a sequence of tracking image frames by the camera; tracking movements of the mobile device inside the particular indoor facility, using detected elements in the acquired sequence of tracking image frames; and updating the position of the mobile device, using the tracked movements of the mobile device. Optionally, the tracking movements of the mobile device uses optical flow analysis, including the steps of detecting contours in the image frames; analyzing the detected contours to thereby determine at least one trackable element that appears in successive image frames of the acquired sequence of tracking image frames;

calculating the size of the at least one trackable element, using pitch and height data sensed by the selected mobile device sensors; comparing size and image frame location of the at least one trackable element in the successive image frames, to thereby determine distance and direction of motion of the mobile device; and updating the position of the mobile device.

Optionally, upon detection of accelerated motion of the mobile device that is at a rate faster than a pre-configured threshold, the tracking movements of the mobile device includes the steps of detecting motion vector using one or more accelerometers and a gyroscope, detecting motion direction using a gyroscope, and updating the position of the mobile device.

Optionally, the tracking of movements of the mobile device further includes the steps of: counting the user's steps to thereby determine distance of motion, detecting motion direction using a gyroscope, and updating the position of the mobile device.

Optionally, counting the user's steps is performed by a step counting module running on the mobile device.

Preferably, the indoor navigation method further includes the steps of sending the at least one acquired positioning image frame to the main-processor of the indoor-navigation-services server, detecting elements in the at least one acquired positioning image frame, matching the elements with beacons/landmarks in the site data stored in the database, and updating the database with the detected elements for which no match was found.

Optionally, when a new beacon/landmark is detected by the indoor-navigation-management module in the step of identifying at least one of the beacons/landmarks in the at least one acquired positioning image frame, and the indoor-navigation-management module does not find a matched beacon/landmark in the site data, the indoor navigation method further includes the step of updating the database with the detected beacon/landmark.

It is an aspect of the present invention to provide an indoor navigation system including an indoor-navigation-services server, having a main-processor and a database, and at least one mobile device coupled with a user, having an activated indoor-navigation-management module being in communication flow with the indoor-navigation-services server.

The indoor-navigation-management module is configured to identify a desired indoor facility.

The indoor-navigation-management module sends the identity of the indoor facility to the indoor-navigation-services server and receives site data associated with the indoor facility, the site data including site-maps and beacons/landmarks, including data associated with the beacons/landmarks, and wherein the data associated with the beacons/landmarks includes the location and dimensions of the beacons/landmarks.

The indoor-navigation-management module is configured to receive a plurality of sensed data readings from selected sensors of the mobile device.

The indoor-navigation-management module is configured to determine the position of the mobile device using an identified beacon/landmark, to track movements of the mobile device inside the indoor facility, and to update the current position of the mobile device using the tracked movements.

Preferably, the indoor-navigation-services server is a remote server.

The identifying of the particular indoor facility is obtained using the GPS of the mobile device or obtained from the cellular services provider using a location device driver of the mobile device.

Although the present invention has been described with reference to the preferred embodiment and examples thereof, it will be understood that the invention is not limited to the details thereof. Various substitutions and modifications are suggested in the foregoing description, and others will occur to those of ordinary skill in the art. Therefore, all such substitutions and modifications are intended to be embraced within the scope of the invention as defined in the claims.

BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will become fully understood from the detailed description given herein below and the accompanying drawings, which are given by way of illustration and example only and thus not limitative of the present invention, and wherein:

Fig. 1 is a general schematic block diagram illustration of the components of an indoor navigation system, according to an embodiment of the present invention.

Fig. 2 shows a schematic flowchart diagram of a method for an indoor navigation method, according to an embodiment of the present invention.

Fig. 3 shows a schematic flowchart diagram of a method for downloading the map and floor plan data of a visited indoor site, used in the indoor navigation method outlined in Fig. 2.

Fig. 4 shows a schematic flowchart diagram of a method for initializing the mobile device sensors that may participate in the indoor navigation method outlined in Fig. 2.

Fig. 5 shows a schematic flowchart diagram of a method for deriving the position of a mobile device, used in the indoor navigation method outlined in Fig. 2.

Fig. 6 shows a schematic flowchart diagram of a method of tracking the mobile device motion, used in the indoor navigation method outlined in Fig. 2.

Fig. 7 shows a schematic flowchart diagram of a method of detecting motion of the mobile device, using optical flow analysis.

Fig. 8 shows a schematic flowchart diagram of a method of detecting mobile device motion using step counting.

DETAILED DESCRIPTION OF THE INVENTION

The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which preferred embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided, so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. An embodiment is an example or implementation of the inventions. The various appearances of "one embodiment," "an embodiment" or "some embodiments" do not necessarily all refer to the same embodiments. Although various features of the invention may be described in the context of a single embodiment, the features may also be provided separately or in any suitable combination. Conversely, although the invention may be described herein in the context of separate embodiments for clarity, the invention may also be implemented in a single embodiment.

Reference in the specification to "one embodiment", "an embodiment", "some embodiments" or "other embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least one embodiment, but not necessarily all embodiments, of the inventions. It is understood that the phraseology and terminology employed herein is not to be construed as limiting and are for descriptive purpose only.

Methods of the present invention may be implemented by performing or completing, manually, automatically, or a combination thereof, selected steps or tasks. The order of performing some method steps may vary. The descriptions, examples, methods and materials presented in the claims and the specification are not to be construed as limiting but rather as illustrative only.

Meanings of technical and scientific terms used herein are to be commonly understood, unless otherwise defined. The present invention can be implemented for testing or practice with methods and materials equivalent or similar to those described herein.

Reference is now made to the drawings. Fig. 1 is a general schematic block diagram illustration of the components of an indoor navigation system 100, according to an embodiment of the present invention. Indoor navigation system 100 includes a server that may be a server 102 of a provider of indoor navigation services. The server of indoor navigation system 100 may be an in-site server that can be a standalone server, or operatively coupled with an indoor-navigation-services server 102. The in-site server may be integrated into an organizational computerized system of the site (or a site-chain). The present invention will be described, by way of example, with no limitations, in terms of the server of indoor navigation system 100 being indoor-navigation-services server 102. Indoor-navigation-services server 102 includes a main-processor 110 and a database unit 130. Main-processor 110 includes a facilities-management unit 112, a users'-management unit 114 and a processor 116. Database unit 130 includes a maps & floor plans DB 132, a beacons & landmarks DB 136 and, optionally, a users' DB 134. The databases may be merged into one or two databases, split into more databases, or have additional databases added thereto.
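As an illustrative sketch only (the types and field names below are hypothetical assumptions, not part of this disclosure), the site data served from database unit 130 to the mobile device might be modelled along the following lines, capturing the site maps together with the beacon/landmark records, including each beacon's location and dimensions as required by the method:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Beacon:
    """A beacon/landmark record from the beacons & landmarks DB (136)."""
    beacon_id: str
    kind: str                 # e.g. "store sign", "logo", "exit sign", "ceiling light"
    x_m: float                # absolute location within the facility, in metres
    y_m: float
    height_m: float           # height above the floor
    width_m: float            # physical dimensions, used for range estimation
    aspect_ratio: float = 1.0 # simple shape descriptor used for shape matching
    text: str = ""            # optional text used for OCR matching
    dominant_colors: List[str] = field(default_factory=list)  # for color matching

@dataclass
class FloorPlan:
    """A floor map record from the maps & floor plans DB (132)."""
    floor_id: str
    image_uri: str            # rasterized floor map
    scale_px_per_m: float

@dataclass
class SiteData:
    """Site data downloaded to the mobile device for one indoor facility."""
    facility_id: str
    floor_plans: List[FloorPlan]
    beacons: List[Beacon]
```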

Indoor navigation system 100 may be used by users 20i, each coupled with a personal mobile device 22i having an indoor-navigation-management module 120i activated thereon, and at least one image sensor (camera) 24i. Personal mobile device 22i is in communication flow with indoor-navigation-services server 102 over a wireless network 50, such as an internet network.

Floor plans data is stored in maps & floor plans DB 132, to which it is uploaded offline.

To use indoor navigation system 100, a shopper 20i logs into indoor-navigation-services server 102 over a network 50, such as an internet network, a cellular network or any other network, by activating a dedicated indoor-navigation-management module 120i running on his/her smart mobile device 22i, while visiting an indoor facility or before entering an indoor facility.

Reference is now made to Fig. 2, showing a schematic flowchart diagram of an example indoor navigation method 200 for serving a user 20i situated in a particular indoor facility, according to embodiments of the present invention. Once shopper 20i has activated dedicated indoor-navigation-management module 120i, indoor navigation method 200 proceeds as follows:

Step 210: retrieving geographical location to facilitate identification of the indoor site.

Typically, indoor-navigation-management module 120i retrieves the global geographical location of smart mobile device 22i, using the GPS of smart mobile device 22i. However, the geographical location of smart mobile device 22i may also be entered manually, or retrieved from the internet or in any other method.

Step 300: downloading map & floor plan data.

Indoor-navigation-management module 120i downloads the map and/or floor plan data of the indoor facility from database unit 130 into smart mobile device 22i, based on the retrieved global geographical location of smart mobile device 22i.

Step 400: initializing mobile device sensors.

The processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to initialize the sensors of smart mobile device 22i that may participate in the indoor navigation.

Step 500: deriving the mobile device positioning.

The processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to derive the position of mobile device 22i within the indoor facility.

Step 600: tracking mobile device motion.

The processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to track the position of mobile device 22i, within the indoor facility, in near real time.

Step 250: checking if the indoor navigation application has lost the position of the mobile device.

The processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to check if the position of mobile device 22i, within the indoor facility, is lost. If the position of mobile device 22i, within the indoor facility, is lost, go to step 500.

Else, repeat step 600 to thereby continue tracking the position of mobile device 22i, within the indoor facility.

(end of indoor navigation method 200)
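For illustration only, the overall control flow of method 200 (steps 210, 300, 400, 500, 600 and 250 above) can be sketched as a simple loop. The function names below are hypothetical placeholders for the steps described in this disclosure, not an actual implementation:

```python
def indoor_navigation_method_200(module):
    """Sketch of the top-level flow of indoor navigation method 200."""
    location = module.retrieve_geographical_location()         # step 210
    site_data = module.download_map_and_floor_plan(location)   # step 300
    module.initialize_sensors()                                # step 400

    position = module.derive_position(site_data)               # step 500
    while module.is_running():
        position = module.track_motion(position, site_data)    # step 600
        if module.position_is_lost(position):                  # step 250
            # Re-derive the position from beacons/landmarks and resume tracking.
            position = module.derive_position(site_data)       # back to step 500
```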

Reference is now also made to Fig. 3, a schematic flowchart diagram of step 300, outlining the downloading of the site maps and/or floor plan data of a visited indoor site, used in the indoor navigation method 200. Method 300 proceeds as follows:

Step 310: obtaining facility map & floor plans data from maps & floor plans DB.

The processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to obtain the facility map & floor plans data from maps & floor plans DB 132.

Step 320: obtaining beacons & landmarks from beacons & landmarks DB.

The processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to obtain the beacons & landmarks from beacons & landmarks DB 136.

The data may contain beacons, store data, landmarks (e.g., emergency exits, gates and so on and so forth) and other spatial stationary marks.

(end of downloading site data method 300)

Reference is now also made to Fig. 4, a schematic flowchart diagram of step 400, outlining the initialization of the sensors of mobile device 22i, used in the indoor navigation method 200. Method 400 proceeds as follows:

Step 410: calibrating compass (magnetometer).

Preferably, the processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to calibrate the built-in magnetometer of smart mobile device 22i.

Step 420: opening front camera.

The processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to activate the built-in front camera of smart mobile device 22i.

Step 430: acquiring an image frame.

The processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to acquire an image frame using the front camera of smart mobile device 22i, if such a front camera exists, to thereby determine the availability of the front camera.

Step 440: opening rear camera.

The processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to activate the built-in rear camera of smart mobile device 22i.

Step 450: acquiring an image frame.

The processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to acquire an image frame using the rear camera of smart mobile device 22i, to thereby determine the availability of the rear camera.

Step 460: selecting a camera.

The processing unit of smart mobile device 22i activates indoor-navigation-management module 120i to select between the front camera and the rear camera.

The selection process determines which camera to operate, depending on availability and, if both are available, on the environmental conditions. For example, the selection process determines the number of trackable elements (objects) detectable within the field of view of a camera. Typically, the camera that at least partially faces the floor is chosen. However, if the floor is homogeneous, the algorithm chooses the ceiling lights as trackable objects, using the other camera.
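Purely as an illustrative sketch of the selection criterion described above (OpenCV is used here only as one possible toolkit; the function names and thresholds are assumptions, not a prescribed implementation), one could count trackable corner features in a trial frame from each available camera and pick the richer view:

```python
import cv2

def count_trackable_elements(frame_bgr, max_corners=200):
    """Rough measure of how many trackable elements a camera currently sees."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=10)
    return 0 if corners is None else len(corners)

def select_camera(front_frame, rear_frame):
    """Step 460 sketch: prefer the camera whose field of view is richer in
    trackable elements (e.g. floor texture vs. ceiling lights)."""
    if front_frame is None:
        return "rear"
    if rear_frame is None:
        return "front"
    return ("front" if count_trackable_elements(front_frame)
            >= count_trackable_elements(rear_frame) else "rear")
```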

(end of sensors initialization method 400)

Reference is now also made to Fig. 5, a schematic flowchart diagram of positioning method 500, being used in the indoor navigation method 200. The flowchart diagram outlines the method of deriving the position of mobile device 22i within the indoor facility. Method 500 proceeds as follows:

Step 510: capturing a sequence of image frames.

Indoor-navigation-management module 120i selects and activates a camera 24i of smart mobile device 22i, and acquires a sequence of image frames viewed in the field of view (FOV) of the selected camera 24i.

Step 520: detecting objects in the acquired images.

Indoor-navigation-management module 120i detects at least one object in at least one acquired image frame.

Step 530: matching detected objects with shapes received from Beacons & landmarks DB.

Indoor-navigation-management module 120i matches the at least one detected object with shapes received from beacons & landmarks DB 136.

Step 535: checking if a match was found to the at least one detected object.

Indoor-navigation-management module 120i checks if a match was found between the at least one detected object and a shape fetched from beacons & landmarks DB 136.

If a match was found between the at least one detected object and the shape of an object fetched from beacons & landmarks DB 136, go to step 560.

Step 540: performing OCR analysis.

Indoor-navigation-management module 120i performs an OCR analysis on the at least one detected object. If no text is found, go to step 550.

Step 545: checking if a match was found to the text detected in the at least one detected object.

Indoor-navigation-management module 120i checks if a match was found between the text detected in the at least one detected object and text in objects fetched from beacons & landmarks DB 136.

If a match was found between the detected text and the text of an object fetched from beacons & landmarks DB 136, go to step 560.

Step 550: performing color separation analysis.

Indoor-navigation-management module 120i performs a color separation analysis on the at least one detected object.

Step 555: checking if a match was found to the color characteristics detected in the at least one detected object.

Indoor-navigation-management module 120i checks if a match was found between the color characteristics detected in the at least one detected object and the color characteristics of an object fetched from beacons & landmarks DB 136.

If no match was found between the detected color characteristics and the color characteristics of objects fetched from beacons & landmarks DB 136, go to step 510.

Step 560: calculating distance and elevation to beacon.

The processing unit of smart mobile device 22i calculates the distance and elevation to the matched beacon.

(end of position determining method 500)
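By way of illustration only, the shape/OCR/color matching cascade of steps 530-555 can be sketched as follows, reusing the hypothetical Beacon record from the earlier sketch; the descriptor comparisons here are simplified assumptions, not the disclosed matching algorithms:

```python
def match_detected_object(detected_aspect_ratio, detected_text, detected_colors,
                          beacons, shape_tolerance=0.1):
    """Sketch of the matching cascade of method 500 (steps 530-555).

    detected_aspect_ratio -- simple scalar shape descriptor of the detected object
    detected_text         -- text returned by OCR, or "" if none was found
    detected_colors       -- set of dominant color names from color separation
    beacons               -- Beacon records from beacons & landmarks DB 136
    Returns the matched beacon record, or None if no match was found (step 510).
    """
    # Step 530/535: shape matching against the shapes stored in the beacons DB.
    for beacon in beacons:
        if abs(detected_aspect_ratio - beacon.aspect_ratio) <= shape_tolerance:
            return beacon

    # Step 540/545: OCR text matching, attempted only if text was detected.
    if detected_text:
        for beacon in beacons:
            if beacon.text and detected_text.lower() == beacon.text.lower():
                return beacon

    # Step 550/555: color separation matching as a last resort.
    for beacon in beacons:
        if detected_colors and detected_colors & set(beacon.dominant_colors):
            return beacon

    return None
```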

Another aspect of the indoor navigation methods is the inclusion of a learning process. As the positioning process is completed, the data acquired in the process is sent to main-processor 110 for analysis, and if processor 116 identifies a new beacon/landmark (such as a store change), processor 116 updates database unit 130 accordingly.

Reference is now also made to Fig. 6, a schematic flowchart diagram of step 600, being used in the indoor navigation method 200. The flowchart diagram outlines a continuous method of tracking the motion of mobile device 22i within the indoor facility. Method 600 proceeds as follows:

Step 610: capturing a sequence of image frames.

Indoor-navigation-management module 120i selects and activates a camera 24i of smart mobile device 22i, and acquires a sequence of image frames viewed in the field of view (FOV) of the selected camera 24i. It should be noted that the selection of the camera depends on the conditions of the viewed terrain.

Step 620: detecting substantial changes in sensors of the mobile device being tracked.

Indoor-navigation-management module 120i analyzes the acquired sequence of image frames to detect if mobile device 22i has moved at a motion rate that is faster (for example, while running or turning around) than the image capturing frame rate, such that the image frame analysis is impaired.

Step 630: checking if substantial changes were found in the sensors of the mobile device being tracked.

Indoor-navigation-management module 120i checks if substantial changes were found in the sensors of mobile device 22i being tracked.

If no substantial changes were found in the sensors of mobile device 22i being tracked, go to step 700, to perform motion verification using optical flow.

Step 640: analyzing sensors data for motion.

Since substantial changes in the sensors were found, indoor-navigation-management module 120i analyzes the sensors data for motion, using sensors selected from the group of sensors including accelerometers to identify acceleration vectors, a magnetometer to identify the direction and a gyroscope to identify the device position in the environment.
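Purely as an illustrative sketch of the sensor-based analysis in step 640 (a simplified dead-reckoning model under assumed sensor inputs, not the disclosed algorithm itself), acceleration samples can be integrated over the sampling interval to estimate the distance moved, while the gyroscope/magnetometer supply the heading:

```python
import math

def integrate_motion(accel_samples_ms2, heading_deg, dt_s):
    """Very simplified dead reckoning for the sensed-motion case of step 640.

    accel_samples_ms2 -- forward acceleration samples (m/s^2), gravity removed
    heading_deg       -- heading from the magnetometer/gyroscope, in degrees
    dt_s              -- sampling interval of the accelerometer, in seconds
    Returns the travelled distance and its (east, north) displacement.
    """
    velocity_ms = 0.0
    distance_m = 0.0
    for a in accel_samples_ms2:
        velocity_ms += a * dt_s           # integrate acceleration -> velocity
        distance_m += velocity_ms * dt_s  # integrate velocity -> distance
    east_m = distance_m * math.sin(math.radians(heading_deg))
    north_m = distance_m * math.cos(math.radians(heading_deg))
    return distance_m, (east_m, north_m)
```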

Step 800: counting steps.

Optionally, indoor-navigation-management module 120i performs a step counting procedure to count the steps of user 20i carrying mobile device 22i, to thereby facilitate calculation of motion distance and direction.

Step 660: calculating distance of movement.

Based on steps 640 and/or 800, indoor-navigation-management module 120i calculates the distance and direction of movement of mobile device 22i and/or of user 20i carrying mobile device 22i.

Step 670: updating current position.

Based on the motion calculations, indoor-navigation-management module 120i updates the current position of mobile device 22i.

(end of mobile device motion tracking method 600)

Reference is now also made to Fig. 7, a schematic flowchart diagram of step 700, outlining a motion verification method, using optical flow, used by tracking method 600. Method 700 proceeds as follows:

Step 710: detecting contours in the image frames.

Indoor-navigation-management module 120i analyzes the image frames acquired in step 610 to detect contours.

Step 720: detecting trackable elements in identified objects.

Indoor-navigation-management module 120i analyzes the contours found in step 710 to determine trackable elements that appear in successive image frames.

Step 730: calculating element size according to pitch and height.

The indoor-navigation-management module 120i calculates the size of the trackable elements, as appearing in a respective image frame, according to pitch and height, as sensed by sensors of mobile device 22i such as a gyroscope, using a calculation engine. For example, when the trackable element is of an object situated on the floor surface, the pitch and the height of mobile device 22i define the distance of mobile device 22i from the respective object. Therefore, the motion of mobile device 22i, between the respective successive image frames, can be calculated.

Optionally, to reduce computational cost, the computation is performed on low resolution image frames.
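As a minimal illustrative sketch of the geometry in step 730 (an assumption-laden simplification, not the disclosed calculation engine): for an element lying on the floor, the horizontal distance to it follows from the camera height above the floor and the angle below the horizon at which the element is seen, which combines the device pitch with the element's offset from the image centre.

```python
import math

def floor_element_distance_m(camera_height_m, pitch_below_horizon_deg,
                             element_offset_px, focal_length_px):
    """Horizontal distance to a trackable element lying on the floor.

    camera_height_m         -- height of the camera above the floor
    pitch_below_horizon_deg -- how far the optical axis points below the horizon
    element_offset_px       -- vertical offset of the element from the image centre
                               (positive = lower in the frame, i.e. closer)
    focal_length_px         -- camera focal length in pixels
    """
    # Angle of the ray through the element, measured below the horizon.
    ray_angle_deg = pitch_below_horizon_deg + math.degrees(
        math.atan2(element_offset_px, focal_length_px))
    if ray_angle_deg <= 0:
        return float("inf")  # the ray does not hit the floor
    return camera_height_m / math.tan(math.radians(ray_angle_deg))

# Example: camera 1.3 m above the floor, pitched 35 degrees down,
# element 100 px below the image centre with a 1000 px focal length.
print(floor_element_distance_m(1.3, 35.0, 100, 1000))
```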

Step 740: analyzing element movement.

The indoor-navigation-management module 120i calculates the motion of a tracked element between image frames, based on the frame acquisition rate (FPS, frames per second) and on the relative translation of the element between the respective image frames.

Optionally, each image frame passes through an HSV (hue, saturation, value) filter, in order to find the right contrast (for example, to find only lights, specific colors and the like), resulting in a binary image having one or more white spots, wherein the position and the movement of the spots reflect the movement of user 20i. For example, if the front camera is selected and the spots move down the image, it implies that the mobile device/user is moving forward.
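By way of illustration only (OpenCV is used here as one possible toolkit; the threshold values and function names are assumptions, not the disclosed filter), the HSV filtering described above can be sketched as a bright-spot mask whose centroid is compared between successive frames to infer the direction of motion:

```python
import cv2
import numpy as np

def bright_spot_centroid(frame_bgr):
    """Return the centroid (x, y) of bright spots (e.g. ceiling lights), or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Keep only low-saturation, high-value pixels -- bright, whitish lights.
    mask = cv2.inRange(hsv, np.array((0, 0, 200)), np.array((179, 60, 255)))
    moments = cv2.moments(mask, binaryImage=True)
    if moments["m00"] == 0:
        return None
    return moments["m10"] / moments["m00"], moments["m01"] / moments["m00"]

def vertical_spot_motion(prev_frame, curr_frame):
    """Positive result = spots moved down the image between the two frames."""
    prev_c = bright_spot_centroid(prev_frame)
    curr_c = bright_spot_centroid(curr_frame)
    if prev_c is None or curr_c is None:
        return 0.0
    return curr_c[1] - prev_c[1]  # pixel displacement along the image's y axis
```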

It should be noted that in order to filter motion artifacts of mobile device 22i, such as movement of mobile device 22i by user 20i with respect to his/her body, indoor-navigation-management module 120i uses other sensors of mobile device 22i, such as, with no limitations, a gyroscope, an accelerometer and a magnetometer. For example, user 20i moves the hand holding mobile device 22i while walking ("walking mode"), and the motion of that hand affects the optical flow of elements between respective image frames.

Step 750: calculating the distance of movement.

Indoor-navigation-management module 120i calculates the movement of user 20i based on the calculated optical flow of selected elements between successive image frames.

(end of optical flow method 700)

Reference is now also made to Fig. 8, a schematic flowchart diagram of step 800, outlining a step counting method, optionally used by tracking method 600. Method 800 proceeds as follows:

Step 810: checking the availability of a steps counting API (Application program interface).

Indoor-navigation-management module 120i checks the availability of a steps counting API.

Step 820: checking if a steps counting API was found.

Indoor-navigation-management module 120i checks if a steps counting API was found.

If no steps counting API was found, go to step 840.

Step 830: activating the steps counting API.

Indoor-navigation-management module 120i activates the found steps counting API.

Go to step 899 (exit).

Step 840: analyzing the Gyro graph for step detection.

Having not found a steps counting API, indoor-navigation-management module 120i analyzes the gyro graph for step detection, since the steps of user 20i imprint a walking-steps signature on the gyro graph.
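Purely as an illustrative sketch of step 840 (a naive threshold-crossing detector over assumed gyroscope samples, not the disclosed analysis), the walking-steps signature on the gyro graph can be counted by detecting peaks in the angular-rate magnitude:

```python
def count_steps_from_gyro(gyro_magnitude, threshold=1.2, refractory_samples=15):
    """Count walking steps from a sequence of gyroscope magnitude samples.

    gyro_magnitude     -- angular-rate magnitudes (rad/s), one per sample
    threshold          -- magnitude above which a sample may belong to a step peak
    refractory_samples -- minimum samples between two counted steps
    """
    steps = 0
    cooldown = 0
    for i in range(1, len(gyro_magnitude) - 1):
        if cooldown > 0:
            cooldown -= 1
            continue
        # A local maximum above the threshold is treated as one step.
        if (gyro_magnitude[i] > threshold
                and gyro_magnitude[i] >= gyro_magnitude[i - 1]
                and gyro_magnitude[i] >= gyro_magnitude[i + 1]):
            steps += 1
            cooldown = refractory_samples
    return steps
```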

Step 850: checking the mobile device pitch.

Indoor-navigation-management module 120i checks the pitch of mobile device 22i, to verify the pitch of mobile device 22i with respect to the horizon.

Step 860: detecting shoe.

Indoor-navigation-management module 120i analyzes the image frames to detect an element representing the shoe of user 20i, or a blob representing the shoe of user 20i.

Step 870: checking if a user's shoe was found.

Indoor-navigation-management module 120i checks if an element representing the shoe of user 20i, or a blob representing the shoe of user 20i, was found.

If no element or blob representing the shoe of user 20i was found, go to step 899 (exit).

Step 880: counting the number of times that the shoe is not in frame.

Indoor-navigation-management module 120i counts the number of times that the shoe is not in an image frame between successive appearances in image frames.

Using the frame acquisition rate, the walking speed of user 20i can be determined.
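As a small illustrative calculation only (the stride length is an assumed parameter, not from this disclosure): if the shoe is absent from the frame for a counted number of frames between successive appearances, the frame acquisition rate gives the step period, and hence a walking-speed estimate.

```python
def walking_speed_mps(frames_shoe_absent, fps, stride_length_m=0.7):
    """Estimate walking speed from shoe cadence (sketch of steps 880/890).

    frames_shoe_absent -- frames counted between successive shoe appearances
    fps                -- camera frame acquisition rate (frames per second)
    stride_length_m    -- assumed step length; a real system would calibrate this
    """
    if frames_shoe_absent <= 0 or fps <= 0:
        return 0.0
    step_period_s = frames_shoe_absent / fps   # time for one step
    return stride_length_m / step_period_s

# Example: the shoe reappears every 12 frames at 30 fps -> 0.4 s per step,
# i.e. about 1.75 m/s with the assumed 0.7 m stride.
print(walking_speed_mps(12, 30))
```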

Step 890: counting the number of steps for a predetermined time interval.

Steps 860 and 880 are repeated for a predetermined time interval.

Go to step 899 (exit).

(end of step counting method 800)

Preferably, the indoor navigation method further includes the steps of sending the at least one acquired positioning image frame to the main-processor of the indoor-navigation-services server, detecting elements in the at least one acquired positioning image frame, matching the elements with beacons/landmarks in the site data stored in the database, and updating the database with the detected elements for which no match was found.

Optionally, when a new beacon/landmark is detected by the indoor-navigation-management module in the step of identifying at least one of the beacons/landmarks in the at least one acquired positioning image frame, and the indoor-navigation-management module does not find a matched beacon/landmark in the site data, the indoor navigation method further includes the step of updating the database with the detected beacon/landmark.

Although the present invention has been described with reference to the preferred embodiment and examples thereof, it will be understood that the invention is not limited to the details thereof. Various substitutions and modifications have been suggested in the foregoing description, and others will occur to those of ordinary skill in the art. Therefore, all such substitutions and modifications are intended to be embraced within the scope of the invention as defined in the following claims.