

Title:
AUTOMATED HARDWARE AND SOFTWARE FOR MOBILE MICROSCOPY
Document Type and Number:
WIPO Patent Application WO/2016/061563
Kind Code:
A1
Abstract:
An automated slide scanning system comprising one or more optical elements, including an objective, having an optical path configured to be disposed within view of a camera of a portable device; an automated stage disposed within the optical path, the automated stage comprising a platform configured for receiving a slide containing a biological sample and having a drive mechanism for translating the stage in at least one direction with respect to the objective; and a communications interface coupled to the automated stage and configured for receiving a command from the portable device to control operation of the stage.

Inventors:
FLETCHER DANIEL (US)
D AMBROSIO MICHAEL (US)
SKANDARAJAH ARUNAN (US)
MYERS III FRANK (US)
REBER CLAY (US)
Application Number:
PCT/US2015/056114
Publication Date:
April 21, 2016
Filing Date:
October 16, 2015
Assignee:
UNIV CALIFORNIA (US)
International Classes:
G02B21/36; G02B26/10; H01J37/26
Foreign References:
US20140063226A12014-03-06
US20120050853A12012-03-01
JP2013050613A2013-03-14
US20130038931A12013-02-14
JP4477714B22010-06-09
Attorney, Agent or Firm:
O'BANION, John (400 Capitol Mall Suite 155, Sacramento California, US)
Claims:
CLAIMS

What is claimed is:

1. An automated slide scanning system, comprising:

one or more optical elements having an optical path configured to be disposed within view of a camera of a portable device;

the one or more optical elements comprising an objective;

an automated stage disposed within the optical path;

the automated stage comprising a platform configured for receiving a slide containing a biological sample;

the automated stage comprising a drive mechanism for translating the stage in at least one direction with respect to said objective; and

a communications interface coupled to the automated stage;

wherein the communications interface is configured for receiving a command from the portable device to control operation of the automated stage.

2. A system as recited in claim 1, the portable device comprising a cellular device having a processor and memory, the system further comprising: application programming configured to be stored as instructions in non-transitory computer-readable memory on the portable device; and

wherein the application programming is executable on the processor for executing the command.

3. A system as recited in claim 2:

wherein the automated stage comprises an x-y stage comprising independent drive mechanisms for x-axis translation and y-axis translation of the slide with respect to the objective; and

wherein the application programming is configured for controlling motion of the x-y stage via one or more commands initiated from the portable device.

4. A system as recited in claim 3:

wherein the automated stage comprises a z-axis stage comprising an independent drive mechanism for z-axis translation of the slide with respect to the objective; and

wherein the application programming is configured for controlling motion of the stage via one or more commands initiated from the portable device to focus imaging of the sample on the slide.

5. A system as recited in claim 1, further comprising:

an illumination source coupled to the automated stage, the illumination source configured to direct light at the slide; and

wherein the programming is further configured to control operation of the illumination source via one or more commands initiated at the portable device.

6. A system as recited in claim 1, wherein the objective comprises an infinity objective, and the optical elements further comprise a plurality of mirrors folding the light path into a compact light path in line with an achromatic tube lens and the infinity objective.

7. A system as recited in claim 2:

wherein the portable device comprises a touch screen; and

wherein the application programming comprises a gesture recognition module configured to control motion of the automated stage in response to one or more hand gestures applied to the touch screen.

8. A system as recited in claim 7:

wherein the communications interface comprises a wireless communication interface, and

wherein motion of the automatic stage is controlled wirelessly from one or more hand gestures initiated at the touch screen.

9. A system as recited in claim 7, wherein translation of the automatic stage is in proportion to the one or more gestures.

10. A system as recited in claim 7, wherein translation of the automatic stage is effected as an automated sequence of motions in response to the one or more gestures.

11. A system as recited in claim 5, wherein the programming is further configured to control one or more of bright field, dark field, phase, or intensity of the light directed by the illumination source.

12. A system as recited in claim 5:

wherein the illumination source comprises an LED array; and

wherein the application programming is further configured to vary the orientation and numerical aperture of the illumination light directed by the illumination source or initiate the acquisition of a series of illumination states with respect to the light directed by the illumination source.

13. A system as recited in claim 2, wherein the application programming is configured to vary one or more of the numerical aperture, resolution, and field of view of the objective.

14. A system as recited in claim 2, wherein the application programming is configured to vary a digital magnification of a region of interest of the sample and display a corresponding size calibration corresponding to the magnification.

15. A system as recited in claim 5, wherein the illumination source and application programming are configured for fluorescence imaging of the sample.

16. A system as recited in claim 2, further comprising:

a calibration sample positioned within the optical path; wherein the application programming is configured to perform a calibration routine to provide calibration images of the sample for one or more of: focus, field flatness, illumination intensity, illumination uniformity, exposure, contrast, optical resolution, or stage translation/backlash.

17. A system as recited in claim 2, wherein the application programming is configured to apply one or more shape metrics to classify features within an image of the sample.

18. A system as recited in claim 2, wherein the application programming is configured to process an image of the sample during motion of the automatic stage to determine a distance traveled and provide feedback for pixel-scale positioning of the automatic stage.

19. A system as recited in claim 2, further comprising:

one or more proximity sensors coupled to the automatic stage; and wherein the one or more proximity sensors provide position data of the automatic stage for establishing one or more of an origin point or limit for translation of the automatic stage.

20. A system as recited in claim 1, wherein the drive mechanism comprises a stepper motor coupled to a rack and pinion.

21. A system for automatically scanning and imaging a slide containing a biological sample, comprising:

(a) one or more optical elements having an optical path configured to be disposed within view of a camera of a portable device;

(b) the one or more optical elements comprising an objective;

(c) an automated stage disposed within the optical path, the automated stage comprising a platform configured for receiving a slide containing a biological sample and a drive mechanism for translating the stage in at least one direction with respect to said objective;

(d) a communications interface coupled to the automated stage;

(e) a computer processor; and

(f) a non-transitory computer-readable memory storing instructions executable by the computer processor;

(g) wherein said instructions, when executed by the computer processor, perform steps comprising:

(i) receiving a command from the portable device;

(ii) sending the command via the communications interface to the automatic stage; and

(iii) translating the automated stage in response to said command.

22. A system as recited in claim 21:

wherein the automated stage comprises an x-y stage comprising independent drive mechanisms for x-axis translation and y-axis translation of the slide with respect to the objective; and

wherein said instructions when executed by the computer processor are further configured to control motion of the x-y stage via one or more commands initiated from the portable device.

23. A system as recited in claim 22:

wherein the automated stage comprises a z-axis stage comprising an independent drive mechanism for z-axis translation of the slide with respect to the objective; and

wherein said instructions when executed by the computer processor are further configured to control motion of the stage via one or more commands initiated from the portable device to focus imaging of the sample on the slide.

24. A system as recited in claim 21, further comprising:

an illumination source coupled to the automated stage, the illumination source configured to direct light at the slide; and

wherein said instructions when executed by the computer processor are further configured to control operation of the illumination source via one or more commands initiated at the portable device.

25. A system as recited in claim 21, wherein the objective comprises an infinity objective, and the optical elements further comprise a plurality of mirrors folding the light path into a compact light path in line with an achromatic tube lens and the infinity objective.

26. A system as recited in claim 21:

wherein the portable device comprises a touch screen; and

wherein said instructions when executed by the computer processor are further configured to control motion of the automated stage in response to one or more hand gestures applied to the touch screen.

27. A system as recited in claim 26:

wherein the communications interface comprises a wireless communication interface; and

wherein motion of the automatic stage is controlled wirelessly from one or more hand gestures initiated at the touch screen.

28. A system as recited in claim 26, wherein translation of the automatic stage is in proportion to the one or more gestures.

29. A system as recited in claim 26, wherein translation of the automatic stage is effected as an automated sequence of motions in response to the one or more gestures.

30. A system as recited in claim 24, wherein said instructions when executed by the computer processor are further configured to control one or more of bright field, dark field, phase, or intensity of the light directed by the illumination source.

31. A system as recited in claim 24:

wherein the illumination source comprises an LED array; and

wherein said instructions when executed by the computer processor are further configured to vary the orientation and numerical aperture of the illumination light directed by the illumination source or initiate the acquisition of a series of illumination states with respect to the light directed by the illumination source.

32. A system as recited in claim 21, wherein said instructions when executed by the computer processor are further configured to vary one or more of the numerical aperture, resolution, and field of view of the objective.

33. A system as recited in claim 21, wherein said instructions when executed by the computer processor are further configured to vary a digital magnification of a region of interest of the sample and display a corresponding size calibration corresponding to the magnification.

34. A system as recited in claim 24, wherein the illumination source is configured for fluorescence imaging of the sample.

35. A system as recited in claim 21, further comprising:

a calibration sample positioned within the optical path;

wherein said instructions when executed by the computer processor are further configured to calibrate images of the sample for one or more of: focus, field flatness, illumination intensity, illumination uniformity, exposure, contrast, optical resolution, or stage translation/backlash.

36. A system as recited in claim 21, wherein said instructions when executed by the computer processor are further configured to apply one or more shape metrics to classify features within an image of the sample.

37. A system as recited in claim 21, wherein said instructions when executed by the computer processor are further configured to process an image of the sample during motion of the automatic stage to determine a distance traveled and provide feedback for pixel-scale positioning of the automatic stage.

38. A system as recited in claim 21, further comprising:

one or more proximity sensors coupled to the automatic stage; and wherein the one or more proximity sensors provide position data of the automatic stage for establishing one or more of an origin point or limit for translation of the automatic stage.

39. A method for automatically scanning and imaging a slide containing a biological sample, comprising:

providing an automated stage having an optical path and an objective disposed within the optical path, the automated stage comprising a platform configured for receiving a slide containing a biological sample and a drive mechanism for translating the stage in at least one direction with respect to said objective;

positioning a camera of a portable device in view of the optical path;

sending a command from the portable device to the automatic stage;

translating the automated stage in response to said command; and

imaging the biological sample with the camera.

40. A method as recited in claim 39, wherein the portable device comprises a touch screen, the method further comprising:

controlling motion of the automated stage in response to one or more hand gestures applied to the touch screen.

41. A method as recited in claim 39, the method further comprising:

controlling motion of the stage via one or more commands initiated from the portable device to focus imaging of the sample on the slide.

Description:
AUTOMATED HARDWARE AND SOFTWARE FOR MOBILE MICROSCOPY

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority to, and the benefit of, U.S. provisional patent application serial number 62/065,560 filed on October 17, 2014, incorporated herein by reference in its entirety.

STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

[0002] This invention was made with Government support under OAA-A-13-00002-12 awarded by the United States Agency for International Development. The Government has certain rights in the invention.

INCORPORATION-BY-REFERENCE OF COMPUTER PROGRAM APPENDIX

[0003] Not Applicable

NOTICE OF MATERIAL SUBJECT TO COPYRIGHT PROTECTION

[0004] A portion of the material in this patent document is subject to copyright protection under the copyright laws of the United States and of other countries. The owner of the copyright rights has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the United States Patent and Trademark Office publicly available file or records, but otherwise reserves all copyright rights whatsoever. The copyright owner does not hereby waive any of its rights to have this patent document maintained in secrecy, including without limitation its rights pursuant to 37 C.F.R. § 1.14.

BACKGROUND

[0005] 1. Technical Field

[0006] This technology pertains generally to digital microscopy, and more particularly to a portable imaging system compatible with the camera of a smart phone or tablet to capture and analyze microscopic images of a sample in an automated manner.

[0007] 2. Background Discussion

[0008] Traditional microscopes are powerful measurement tools that are used in biology to understand fundamental processes and used clinically to determine whether a disease is present in a sample. These devices normally require significant investment to purchase, skilled technicians to use them, and regular power for operation. While there are mechanical systems that can be attached to microscopes to automatically acquire images, thereby reducing the time required of an operator, current implementations add significantly to cost and still require someone with a reasonable amount of skill to use. This has confined microscopes, both manual and automated, to centralized settings where these resources are available. The domains in which microscopes are used, including clinical diagnostics and laboratories in low-resource settings, would benefit from simplified automated instrumentation that requires a lower level of operator skill and offers a convenient interface with the technology, such as that provided by a touch screen. In clinical diagnostic use, for example, making a measurement at the point of care could enable a faster and more actionable diagnosis, while measurements for water quality would benefit from utilizing freshly prepared samples.

[0009] One limitation of current portable imaging systems is that a user must collect images one at a time under manual control. Many microscopy applications demonstrate increased sensitivity when scanning a larger area and inspecting more of the sample. Because of this relationship, it takes the user a significant amount of time to reach a target level of sensitivity and specificity for certain microscopic assays using current systems.

[0010] Another limitation of current microscopy systems capable of distributed use is that it can be challenging for someone without significant training to bring the sample into focus before acquisition. Being able to execute this procedure establishes a minimum level of skill to capture usable images from the sample. Even for trained users, focusing on relevant features in each field adds to the time required to capture images.

[0011] Finally, current systems are also unable to immediately analyze the sample and provide an assessment of the sample content on the device itself. They traditionally require a trained interpreter to be on-site or require remote transmission for telemedicine. Applications such as counting tuberculosis bacilli using on-board image processing are thus beyond the capabilities of fully integrated, portable solutions.

[0012] In conclusion, there is a need for a low cost portable device with hardware and software automation that can achieve automated sample scanning and focused imaging and be used by a minimally trained operator. Such a system would enable decentralized point-of-collection analysis of clinical and research samples for transmission or on-board analysis.

BRIEF SUMMARY

[0013] One aspect is a mobile microscope system for decentralized microscopic data collection and analysis using automation of hardware and software.

[0014] The disclosed system comprises a mobile device with a touch screen and camera integrated into a microscope with an automated stage that can adjust the position of a sample relative to the imaging optics. The system enables automated microscopic disease screening and environmental monitoring, applications that currently have long lead times because samples must be transported to a central facility where a skilled microscopist is available to do the tedious task of imaging the slide.

[0015] By automating image capture and either digital transmission or onboard analysis, the features of this system enable rapid decision-making for treatment. The user interacts with the system by using a custom touch interface in which gestures and buttons trigger commands including stage motion, image acquisition, and illumination mode. These commands are then relayed to the stage hardware over a wireless or wired interface for a responsive user experience. The user interface also presents opportunities for on-board review and annotation. The device can be configured to image in bright field or multi-color fluorescence. The device can carry out fully automated image capture, improving over other implementations by using a unique set of low-cost hardware and selectively capturing fields of interest through real-time processing of the visible field. To make a decision about the captured data, the system can either upload images for remote review or provide analysis carried out on the device to the user. Finally, the system can respond to changes in its image capture capabilities during field use through the use of on-board calibration targets, adjusting camera properties as necessary or sending out a service request if required.

[0016] Diseases that are diagnosed by skilled microscopists employing conventional, manual microscopes could be targeted for replacement by this automated approach. Similarly, diseases diagnosed in centralized facilities by large, fully-automated imaging devices could be replaced by deploying these devices at the location of the patient for data collection and transfer. Specific diagnostic uses include automated screening of slides prepared from: brush biopsy samples of the oral cavity for oral cancer; brush biopsy samples of the cervix for cervical cancer; sputum for tuberculosis presence and drug resistance; blood for differential blood counting, malaria, or parasitic worms; or fecal samples for helminth eggs.

[0017] Other commercial applications revolve around rapid, field-response to environmental changes.

[0018] Microscopic indicators of biome health could be monitored for damage and recovery after chemical contamination, for example in a marine environment after an oil spill.

[0019] By utilizing the imaging, computational processing, and communication capabilities of a mobile phone or tablet in conjunction with automated mechanical components, the system can provide the precise motion, high-quality imaging, and real-time analysis capabilities that enable automated screening in low resource as well as laboratory settings.

[0020] Further aspects of the technology described herein will be brought out in the following portions of the specification, wherein the detailed description is for the purpose of fully disclosing preferred embodiments of the technology without placing limitations thereon.

BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)

[0021] The technology described herein will be more fully understood by reference to the following drawings which are for illustrative purposes only:

[0022] FIG. 1 illustrates a perspective-view schematic diagram of a fully automated slide scanning system in accordance with the present description.

[0023] FIG. 2 shows a schematic diagram of an embodiment of a light path and optics module of the system of FIG. 1.

[0024] FIG. 3 illustrates a top-view schematic diagram of an embodiment of an x-y stage of the system of FIG. 1.

[0025] FIG. 4A and FIG. 4B show top and side views, respectively, of a schematic diagram of an embodiment of a focusing mechanism of the system of FIG. 1.

[0026] FIG. 5 illustrates a schematic diagram of an embodiment of software organization which may be implemented as application software to be installed on the portable device for control of the system of FIG. 1.

[0027] FIG. 6 shows a schematic diagram of an exemplary implementation of the hardware and software of the automated slide scanning system of FIG. 1 with a mobile device in accordance with the present description.

[0028] FIG. 7 shows a plot comparing resolution at various numerical apertures for an iPhone and scientific camera with respect to Rayleigh resolution.

[0029] FIG. 8A and the magnified inset of FIG. 8B show exemplary SEM images of a Wright-stained blood smear at 100 μm.

[0030] FIG. 8C and the magnified inset of FIG. 8D show exemplary SEM images of a Wright-stained blood smear at 25 μm.

[0031] FIG. 9 is a plot of normalized illumination intensity vs. pixel value illustrating a mobile phone's ability to provide linear measurements of intensity after gamma correction.

DETAILED DESCRIPTION

[0032] FIG. 1 illustrates a perspective-view schematic diagram of a fully automated slide scanning system 10 in accordance with the present description. System 10 is configured to be adapted for use with various portable devices 12 (e.g. portable cellular devices such as a cellular phone (iPhone depicted in FIG. 1) or tablet (iPad shown in FIG. 2)) having a camera 14 (see FIG. 2). A platform 16 is provided such that when the portable device 12 is positioned on the platform 16, the lens L of the portable device 12 camera 14 is in line with the optical path 30, the optics module 20, and associated components. A sample slide 40 containing the subject specimen is positioned on an x-y stage 50 (for in-plane scanning and positioning) and is illuminated by a light source 44 (e.g. LED light source) and plastic diffuser 42. A z-axis stage 70 is also included for vertical alignment of the slide 40 and focusing. It is also appreciated that the camera 14 may be separate from the mobile device 12 and controlled via a wireless or wired connection.

[0033] FIG. 2 shows a schematic diagram of an embodiment of a light path 30 and optics module 20. The portable device 12 camera 14 is optically coupled to the system 10 by an eyepiece 22. A series of mirrors 24 direct the light path 30 to an achromatic tube lens 26. An infinity objective 28 is used to image the sample 40. The sample 40 is illuminated by a diffuser 42 disposed in front of an LED or LED array 44.

[0034] FIG. 3 illustrates a top-view schematic diagram of an embodiment of an x-y stage 50 (i.e. a 2-D X-Y translation mechanism). In a preferred embodiment, rotational motion of the stepper motor shafts is converted to linear motion of the x-y stage 50 through use of a rack and pinion system, which is fabricated and meshed utilizing the complementary laser cutter kerfs to ensure precise meshing. A pair of x-axis stepper motors 52 translate the x-axis via a rack and pinion mechanism 54. A single y-axis stepper motor 62 translates the y-axis, also by a rack and pinion mechanism 64. A y-axis sample holder 58 is mounted on top of the x-axis platform 56, and is coupled to a holder 66 that grips and translates the sample slide 40. The x-axis is mounted to the base 60 of the device. An LED 44 in line with the y-axis sample holder 58 illuminates the sample, as described in FIG. 1 and FIG. 2.
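
As a rough illustration of the rack-and-pinion drive described above, the sketch below converts a requested slide translation into whole stepper steps; the pinion pitch diameter, steps per revolution, and microstepping factor are illustrative assumptions rather than values taken from the patent.

```python
import math

# Hedged sketch (not from the patent): convert a requested axis translation
# in millimeters into whole stepper microsteps for a rack-and-pinion drive.
STEPS_PER_REV = 200              # typical 1.8-degree stepper (assumption)
MICROSTEPS = 16                  # driver microstepping (assumption)
PINION_PITCH_DIAMETER_MM = 12.0  # laser-cut pinion (assumption)

STEPS_PER_MM = (STEPS_PER_REV * MICROSTEPS) / (math.pi * PINION_PITCH_DIAMETER_MM)

def mm_to_steps(distance_mm: float) -> int:
    """Round a requested linear travel to the nearest whole microstep."""
    return round(distance_mm * STEPS_PER_MM)

print(mm_to_steps(0.5))  # microsteps needed to translate the slide 0.5 mm
```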

[0035] FIG. 4A and FIG. 4B show top and side views, respectively, of a schematic diagram of an embodiment of a focusing mechanism 70. A z-axis stepper motor 74 is coupled to motor pulleys 88 via a drive belt 86, which drives a pair of pulleys 82, each attached to lead screws 84. The pulleys 82 are coupled to one another by a timing belt 76 and a pair of idler pulleys 80 that direct the timing belt 76 around the objective 78. The stepper 74 and lead screws 84 are anchored to a non-moving top platform 72. The objective 78 is attached to a translating bottom platform 90, which translates along a pair of linear shafts (not shown) when the lead screws turn via a pair of fixed nuts. Each axis may be guided by ball-bearing rails (not shown).

[0036] System 10 may also include vibration-isolation feet attached to the device housing or stages 50, 70 of the device.

[0037] While not required, stepper motors are preferably used to provide a precise, inexpensive, simple automated translation mechanism. Stepper motors allow for acceptable accuracy and precision in open-loop operation, and high torque, low speed movement without gearing. In general, the smallest stepper motors that are able to provide an adequate amount of torque at reasonable power consumption are preferred (smaller stepper motors are often notably less efficient than larger ones).

[0038] In one embodiment, system 10 measures 155 mm x 225 mm x 135 mm. In order to achieve such a small form factor, several design decisions were made to minimize size. First, the light path 30 is folded by four mirrors 24, allowing a 200 mm optical path length (the minimum required to achieve acceptable magnification at the imaging sensor, e.g. of an iPad, or similar tablet, or portable cellular device) to be fit into the system 10 with the sample 40 centered. In order to allow for a compact focusing mechanism that involves only moving the objective 28, an infinity system with an achromatic lens 26 is used, with an eyepiece 22 to couple optics to the portable device camera 14. Second, tall components (the objective 28 and the stepper motors 52, 62 and 74) share vertical space to minimize the device's height. Third, elements of the focusing mechanism 70 are in parallel vertical space to the objective 28, occupying otherwise empty space instead of contributing to the device's height.

[0039] FIG. 5 illustrates a schematic diagram of an embodiment of software organization which may be implemented as application software 100 to be installed on the portable device 12 for control of system 10. A user interface 110 connects the actions of the user to the rest of the software, and performs some basic tasks like taking a picture with camera 14. User settings 144 and other persistent data 144 are stored in the data model 106. When performing a scan of a slide 40, information from the data model 106 is relayed to the other classes through the user interface 110, although it can also be accessed directly by the other modules, if necessary. The event sequencer module 102 returns to the user interface 110 a sequence of hardware and software events 140 that are to be executed in order to conduct a scan. The user interface 110 then executes these steps. If a step requires manipulation of the device's hardware (like x-y stage control 162 or z-stage control 164 for moving either of the stages 50, 70, respectively), the user interface hands this task to the hardware control module 108. The hardware control module 108 may also be configured to control sample illumination via LED control 160 for operating LED 44 wirelessly (e.g. via Bluetooth LE). Illumination source 44 may also include LEDs or LED arrays for both bright field and fluorescent illumination. These LEDs may be driven by current-mode buck/boost converters (not shown) that are controlled by pulse-width modulated signals, allowing software-controlled brightness. For fluorescence, one or more high-power blue LEDs 44 may be imaged onto the sample plane (critical illumination), although other illumination configurations may be used, including Köhler and oblique. Spectral selectivity may be achieved via excitation and emission filters placed within the illumination optics and collection optics, respectively.

[0040] If the step requires an image processing step (like calculating the focus metric via an autofocus module 154), the user interface 110 hands this task to the image processor module 104.

[0041] In order to enable the system to intelligently scan a slide, a number of software control techniques may be implemented in application software 100. One obstacle to scanning a large area is maintaining focus. To overcome this obstacle, an autofocusing system may be used based around the standard deviation of the Laplacian of the image (e.g. as a focus metric). Numerous other implementations of the focus metric are possible. By moving the z-axis a set amount (via stage 70), and recalculating and comparing the focus metric, proper focus can be converged upon.
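
A minimal sketch of such a focus metric is shown below, assuming OpenCV and a grayscale frame captured from the camera; the function name is illustrative and not part of the patent's software.

```python
import cv2
import numpy as np

def focus_metric(gray: np.ndarray) -> float:
    """Standard deviation of the Laplacian of the image; larger values
    indicate sharper edges and therefore better focus."""
    laplacian = cv2.Laplacian(gray, cv2.CV_64F)
    return float(laplacian.std())
```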

[0042] In a preferred embodiment, autofocus module 154 sweeps the objective through 5 positions and calculates the focus at each. If a satisfactory peak is not found, the sweep is repeated with a larger distance between each position. In order to limit the possibility of focus being incorrectly moved at a position on the slide 40 where there is little or no sample present, the autofocus module 154 calculates a content score, which is simply the mean of the image, and will only attempt to focus if it is above a user-set threshold. Note that this safeguard generally works when the system is already near focus, such as when it is in the middle of scanning a large area where it found focus at a previous position.
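
The coarse-to-fine sweep and content-score gate described above might be organized along the lines of the following sketch; the `move_z` and `capture` callbacks, step sizes, and thresholds are hypothetical stand-ins for the actual stage and camera interfaces.

```python
import numpy as np

def content_score(gray: np.ndarray) -> float:
    """Mean pixel value; a cheap gate so focusing is only attempted where
    enough sample material is visible."""
    return float(gray.mean())

def autofocus(move_z, capture, metric, step_um=10.0, positions=5,
              content_threshold=20.0, max_rounds=3, growth=3.0):
    """Evaluate the focus metric at `positions` z offsets, keep the best,
    and widen the sweep if no clear interior peak is found."""
    for _ in range(max_rounds):
        offsets = (np.arange(positions) - positions // 2) * step_um
        scores = []
        for off in offsets:
            move_z(off)
            frame = capture()
            move_z(-off)                  # return to the sweep origin
            if content_score(frame) < content_threshold:
                scores.append(-np.inf)    # too little sample to judge focus
            else:
                scores.append(metric(frame))
        best = int(np.argmax(scores))
        if 0 < best < positions - 1 and np.isfinite(scores[best]):
            move_z(offsets[best])         # settle at the best position found
            return offsets[best]
        step_um *= growth                 # widen the search and retry
    return None
```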

[0043] The autofocus module 154 may also be configured to analyze the relationship of the focus metric scores to determine how likely it is that focus is achieved, and optionally increases the search area. As the content score can be calculated rapidly, it also enables a mode of acquisition where the device rapidly scans an area at low resolution, only slowing to produce full-resolution images where there is sufficient content. Alternate implementations of focusing can include capturing focus information from both bright field and fluorescence channels and using some combination as a focus metric.

[0044] In order to make the system 10 intuitive to use, a gesture recognition-based control module 146 may be implemented for control of the stages 50, 70. The gesture recognition module 146 is configured to move the stages 50, 70 corresponding to dragging or swiping gestures on the screen of the portable device (e.g. iPad). The speed and positional movement (in pixels) of these gestures may be used to calculate the rate and distance that the stage 50 or 70 will move. The rate and distance are then relayed to the stage via wireless communication (e.g. Bluetooth LE). Gesture controls may also be used to allow zooming and panning, both on live and previously taken images, to display the magnified images, and to provide an interface for controlling data collection, analysis, and transmission. The gesture recognition-based control module 146 may be configured to operate with hand gestures either in contact with the screen of device 12 or detected by a non-contact method (e.g. front-facing camera) to control microscope, stage, and data functions. The gesture recognition-based control module 146 may also control stage movement through a wired or wireless (e.g. Bluetooth) connection to the motors that control sample movement and focus through specific sequences (e.g. double taps, long taps, etc.).
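
One way such a gesture-to-stage mapping could work is sketched below; the conversion factor from screen pixels to sample-plane micrometers and the speed ceiling are illustrative assumptions, not values from the patent.

```python
# Hedged sketch: map a pan gesture (pixel displacement and velocity reported
# by the touch-screen framework) onto an x-y stage move.
UM_PER_SCREEN_PIXEL = 0.5      # sample-plane distance per display pixel (assumption)
MAX_STAGE_SPEED_UM_S = 2000.0  # stage speed ceiling (assumption)

def gesture_to_stage_move(dx_px, dy_px, vx_px_s, vy_px_s):
    """Return (dx_um, dy_um, speed_um_s) for the x-y stage. Dragging the
    image right should move the stage so the sample appears to follow the
    finger, hence the sign flip."""
    dx_um = -dx_px * UM_PER_SCREEN_PIXEL
    dy_um = -dy_px * UM_PER_SCREEN_PIXEL
    speed = min(MAX_STAGE_SPEED_UM_S,
                (vx_px_s**2 + vy_px_s**2) ** 0.5 * UM_PER_SCREEN_PIXEL)
    return dx_um, dy_um, speed
```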

[0045] Application software 100 may also be configured to enhance the accuracy of the stages 50, 70. When scanning large objects of which only a piece fits in one field of view, it is desirable to reconstruct these pieces to be able to interact with the entire object. With micron-scale resolution, as is preferred for tuberculosis diagnosis, this would require sub-micron scale movement of the sample 40 if done purely mechanically. Instead, an image registration routine 152 may be included within processing module 104 and/or hardware control module 108 that is configured for stitching together images taken with some positional ambiguity based on the content of the image. Image registration module 152 may also be used to provide closed-loop control to the z-stage motor 74 and x-y stage motors 52 and 62 in near real-time. By tracking the motion of features on the screen, it can be determined how far the sample 40 has actually translated with sub-micron accuracy. Furthermore, when attempting to move the sample 40 back to a previous location, image registration module 152 can be used to reposition the stage with enhanced accuracy if sufficient optical content is present.
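
One way to obtain such image-based position feedback is phase correlation between consecutive frames, sketched below with OpenCV as a stand-in for whatever registration method the software actually uses; the calibration from pixels to micrometers is assumed to come from elsewhere in the software.

```python
import cv2
import numpy as np

def measure_translation(prev_gray: np.ndarray, curr_gray: np.ndarray):
    """Estimate the (dx, dy) image shift in pixels between two frames using
    phase correlation; multiplying by the calibrated micrometers-per-pixel
    value gives the actual sample translation for closed-loop stage feedback."""
    (dx, dy), response = cv2.phaseCorrelate(np.float32(prev_gray),
                                            np.float32(curr_gray))
    return dx, dy, response  # `response` indicates how reliable the match is
```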

[0046] Image processing module 104 may also include algorithms 150 for the automated detection of features within an image, such as bacteria, that are relevant for diagnosis. These features can be presented to a trained user for evaluation or can be automatically evaluated using a computational diagnosis algorithm.

[0047] It is appreciated that software 100 may be configured for implementing a number of various automated algorithms for calibration, analysis, and diagnosis. By using a known reference sample (not shown) embedded within the instrument, automated self-testing and calibration can be performed on a regular basis.

[0048] For example, a calibration sample may be automatically positioned under the collection optics during a "self test" routine in order to provide calibration images for focus, field flatness, illumination intensity, illumination uniformity, exposure, contrast, optical resolution, or translation/backlash measurements, among other features.

[0049] Furthermore, the results from the self test routine may be compared with thresholds set for performance along dimensions of calibration images for focus, field flatness, illumination intensity, illumination uniformity, exposure, contrast, optical resolution, signal-to-noise ratio, and stage translation/backlash to alert the user for maintenance and automatically order replacement parts to be delivered to the location of the system 10.
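
A self-test comparison of this kind reduces to checking each measured calibration dimension against an acceptance limit, as in the sketch below; the metric names and limit values are illustrative assumptions, not specified in the patent.

```python
# Hedged sketch: compare self-test measurements against acceptance limits so
# the application can alert the user or request service. Names and limits
# below are illustrative assumptions.
LIMITS = {
    # metric name: (comparison, limit)
    "focus_metric":            (">=", 50.0),
    "illumination_uniformity": (">=", 0.85),
    "field_flatness":          ("<=", 0.10),
    "optical_resolution_um":   ("<=", 1.5),
    "stage_backlash_um":       ("<=", 5.0),
}

def evaluate_self_test(measurements: dict) -> list:
    """Return [(metric, value, limit), ...] for every check that failed."""
    failures = []
    for metric, (op, limit) in LIMITS.items():
        if metric not in measurements:
            continue
        value = measurements[metric]
        ok = value >= limit if op == ">=" else value <= limit
        if not ok:
            failures.append((metric, value, limit))
    return failures

print(evaluate_self_test({"focus_metric": 32.0, "stage_backlash_um": 2.1}))
```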

[0050] An embodiment of software 100 for device control is provided in Table 1.

[0051] FIG. 6 shows a schematic diagram of an exemplary implementation of the hardware and software of the automated slide scanning system 10 with a mobile device in accordance with the present description. In a preferred embodiment, application software is loaded into memory 120 of the portable device 12. The onboard processor 122 of the device 12 may be used to execute the application programming 100 for execution of various functions detailed in FIG. 5, operate the camera 14, and communicate via wireless communication hardware 124 (e.g. Bluetooth, Wi-Fi, etc.) with the various hardware components of the system 10.

[0052] In one embodiment, one or more wireless communication devices 126 may be coupled to the system hardware (e.g. x-y stage 50, z-stage 70, LED 44) for transmission of commands or data between the portable device 12, individual components, or other external devices. In one embodiment, communication device 126 may comprise a Bluetooth LE board, which communicates via SPI to an Arduino and instructs downstream electronics driving the motors (e.g. a stepper motor controller featuring an Allegro A4983, or on-board servo control hardware (both not shown)). Bluetooth LE is an ideal protocol for battery-powered applications such as system 10 where high communication bandwidth is not required.
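
The patent does not specify a wire protocol, but a command relayed over such a link could be framed as a small binary packet, as in the hedged sketch below; the opcodes, packet layout, and the `ble_write` transport callback are hypothetical.

```python
import struct

# Hedged sketch of a possible command framing scheme for the wireless link
# between the portable device and the stage electronics (all values assumed).
OP_MOVE_X, OP_MOVE_Y, OP_MOVE_Z, OP_SET_LED = 0x01, 0x02, 0x03, 0x04

def encode_move(opcode: int, steps: int, speed: int) -> bytes:
    """Pack a motion command as opcode (1 byte), signed step count (4 bytes),
    and step rate in steps/s (2 bytes), little-endian."""
    return struct.pack("<BiH", opcode, steps, speed)

def send_command(ble_write, opcode: int, steps: int, speed: int = 500):
    """`ble_write` is a hypothetical callback that writes raw bytes to the
    stage's BLE characteristic via the platform's Bluetooth stack."""
    ble_write(encode_move(opcode, steps, speed))
```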

[0053] As seen in FIG. 6, batteries 128 (e.g. an internal rechargeable battery or a switchable external battery) may be used to power any of the stage components 50, 70 or the illumination source 44. Sensors, e.g. hall sensors, may also be employed in one or more of stage components 50, 70 to provide position feedback.

[0054] It is appreciated that the components shown in system 10 of FIG. 1 through FIG. 6 may be interchangeably used with various complexity to accommodate different uses or applications.

[0055] For example, FIG. 1 through FIG. 6 generally show a first embodiment with a fully automated slide scanning system with three-axis motion control, where two of the axes control the x-y positioning of the slide under the imaging optics having an infinity-corrected objective and a tube lens, and a third axis positions the imaging optics to bring the sample into focus. In one embodiment, an iPad with a built-in camera, positioned above the imaging optics, captures images and controls the three motion axes via Bluetooth Low Energy (LE).

[0056] A particular implementation of this first embodiment is for imaging of stained cells from brush biopsies of suspected oral cancer lesions. Early diagnosis of oral cancer dramatically improves the outcome of clinical interventions. In parts of the world where oral cancer is most prevalent, there is often also insufficient infrastructure to detect patients at early stages. Patients are unable to reach centralized facilities with qualified staff and the necessary instrumentation because of cultural and economic barriers. There are local providers of healthcare, but they are not equipped to make a determination and are ineffective at encouraging patients to pursue further care. A distributed system for microscopic analysis, if paired with appropriate sample collection and preparation, could enable broader screening of at-risk members of the population at this local level. This first embodiment preferably includes transmission and remote analysis of images collected from the system 10, although diagnosis or risk-stratification may also be implemented on system 10 through on-board approaches similar to those described in the other embodiments below.

[0057] A second embodiment includes the features of the first embodiment, and alternatively incorporates an image processing module 104 that includes modules for fluorescence imaging 156, resolution calibration, and on-board image analysis 158 for the detection and quantification of particles in the image.

[0058] One implementation of this second embodiment includes quantification and/or detection of Mycobacterium tuberculosis bacilli in human sputum smears (sample 40) collected from patients in various healthcare settings. In this second configuration, the smears are stained with Auramine-O, a fluorescent dye commonly used for TB diagnosis. The smear slide 40 is then inserted into the system 10, which automatically focuses and images (via application software 100) several hundred fields across the smear. An image analysis module 158 may include a support vector machine (SVM) algorithm that has been trained to detect the rod-shaped bacilli, analyze each image, and then return a quantitative estimate of the number of bacilli present in the sample. This modified system enables the automated diagnosis of tuberculosis from a symptomatic patient in under an hour. The second configuration is preferably portable, battery powered, and requires minimal user training. As such, it is ideally positioned to address the estimated three million TB cases that go undiagnosed each year due to inadequate diagnostic resources in remote and low income settings.
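
A shape-metric-plus-SVM classifier of this kind might look like the following sketch, assuming OpenCV, scikit-learn, and a classifier `clf` already trained on labeled bacillus/debris examples; the particular features and thresholds are illustrative and not the patent's.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def shape_features(contour) -> list:
    """Area, aspect ratio of the minimum-area rectangle, and solidity --
    crude descriptors that help separate rod-like objects from debris."""
    area = cv2.contourArea(contour)
    (_, _), (w, h), _ = cv2.minAreaRect(contour)
    aspect = max(w, h) / (min(w, h) + 1e-6)
    hull_area = cv2.contourArea(cv2.convexHull(contour)) + 1e-6
    return [area, aspect, area / hull_area]

def count_bacilli(gray: np.ndarray, clf: SVC) -> int:
    """Count objects the classifier labels as bacilli in one fluorescence field."""
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    feats = [shape_features(c) for c in contours if cv2.contourArea(c) > 4]
    if not feats:
        return 0
    return int(np.sum(clf.predict(np.array(feats)) == 1))
```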

[0059] In a third embodiment, the system alternatively uses a one-dimensional scanning system (e.g. stage 50 of FIG. 1 comprises x-stage translation only) that uses a hobby servo motor as the drive mechanism and the built-in focusing mechanism of the portable device 12 camera 14 to eliminate the need for a separate focusing motor (e.g. z-stage mechanism 70). The imaging optics module 20 in this third embodiment may comprise a conventional finite imaging system based on an objective, or a second, inverted mobile phone lens group. The configuration of this third embodiment is capable of scanning one row of a slide or capillary at user-defined increments and includes automated image processing of images and movies to calculate results. One particular implementation or application of this third embodiment is for quantification of the filarial parasite Loa loa (i.e. quantifying Loa mf load) in whole blood.

[0060] In this third configuration, application software 100 may contain programming or a user interface 110 that gives simple instructions to the user on loading the device, and automatically acquires 5-second videos of 5 fields of view. When the acquisition is complete, the application software 100 runs an image-processing algorithm within module 104 on the videos to quantify Loa mf/ml. To start a test, the operator simply loads blood from a finger prick into a capillary pre-fitted into a sample holder 40, inserts the holder into the system, and presses a single button on the portable device (e.g. iPhone or the like).

[0061] A fourth embodiment of system 10 alternatively combines the compact inverted lens imaging system of the third embodiment with three-axis motion control. This fourth embodiment is distinct from the first embodiment in its use of simplified optics, and a simple, less accurate drive mechanism that is nonetheless sufficient for imaging at the lower magnification of the simplified optics. One implementation of this device is for analysis of blood smears to quantify malaria load.

[0062] The fourth embodiment preferably runs an application 100 that allows gesture recognition-based slide control 146 with optical feedback, and can use multiple autofocus algorithms 154. This fourth embodiment is preferably configured to exchange images and data with a remote server via wireless communication module 126 (see FIG. 6), which preferably has an interface to review previously taken data and annotate images. To use this fourth embodiment, the operator inserts a slide 40, secures the slide with a magnetic holder, and presses a button to load the slide and optionally start a full scan.

[0063] In the simplified third and fourth configurations above, where less precision is required due to lower magnification and resolution, hobby servo motors with built-in feedback are preferably used in place of stepper motors. Servo motors provide higher maximum torque at lower power, but suffer from torque limitations when approaching the set point, leading to less reproducibility and an inability to move in small increments. In order to enable gross reproducibility when scanning multiple slides, one absolute position is encoded using a pair of hall-effect sensors 130 on the system housing (one for x and one for y), which sense a corresponding pair of magnets on the x-y stage. While other sensor types may be employed, contactless hall-effect sensors were selected because of their long lifetimes and high precision.
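
A homing routine against such a hall-effect origin sensor can be as simple as the sketch below; `read_hall` and `step_axis` are hypothetical hardware callbacks, and the step size and travel limit are illustrative assumptions.

```python
# Hedged sketch: step an axis toward its hall-effect sensor until the sensor
# trips, then treat that trip point as the absolute origin for the axis.
def home_axis(read_hall, step_axis, step_um=50, max_travel_um=60000):
    """Return the total distance moved (um) before the sensor triggered,
    or raise if the end of travel is reached without finding the magnet."""
    moved = 0
    while not read_hall():
        if moved >= max_travel_um:
            raise RuntimeError("homing failed: hall sensor never triggered")
        step_axis(-step_um)   # move toward the sensor end of the axis
        moved += step_um
    return moved
```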

[0064] With respect to compatibility of system 10 with differing portable devices 12, and particularly mobile devices such as iPhones or iPads, a significant advantage of using an iPad or iPhone in the above implementation is connectivity. Because these devices 12 are capable of connecting to the internet and accessing remote servers, images and data can be sent to and from the system 10. This allows for remote diagnosis of images, remote quality control, offloading complex computational tasks from the device, and pushing of software and algorithmic updates to the system 10. In addition, the GPS sensor in the iPad or iPhone 12 can be used to geotag diagnoses, enabling disease monitoring and epidemiological studies.

[0065] Finally, thanks to the rich, intuitive user interface of the iPad or iPhone 12, training materials may be directly embedded into the application software 100. For example, language-specific videos can be used to demonstrate sample collection, sample preparation, slide loading, and diagnosis interpretation, further simplifying user training.

[0066] When implemented with system 10 of the present description, these devices also provide advantageous optics/imaging qualities. FIG. 7 shows a plot comparing resolution at various numerical apertures for an iPhone and scientific camera with respect to Rayleigh resolution. FIG. 8A and the magnified inset of FIG. 8B show exemplary SEM images of a Wright-stained blood smear at 100 μm. FIG. 8C and the magnified inset of FIG. 8D show exemplary SEM images of a Wright-stained blood smear at 25 μm. FIG. 9 is a plot of normalized illumination intensity vs. pixel value, illustrating a mobile phone's ability to provide linear measurements of intensity after gamma correction.
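
For reference, the Rayleigh criterion compared in FIG. 7 and the gamma linearization behind FIG. 9 reduce to the short sketch below; the single power-law gamma of 2.2 is an assumption, since real devices may apply a piecewise sRGB curve.

```python
import numpy as np

def rayleigh_resolution_um(wavelength_um: float, numerical_aperture: float) -> float:
    """Rayleigh criterion r = 0.61 * wavelength / NA, the diffraction-limited
    resolution referenced in FIG. 7 (e.g. ~0.67 um at 550 nm and NA 0.5)."""
    return 0.61 * wavelength_um / numerical_aperture

def linearize(pixels: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Undo the camera's display gamma so pixel values scale linearly with
    illumination intensity (cf. FIG. 9)."""
    normalized = np.clip(pixels.astype(np.float64) / 255.0, 0.0, 1.0)
    return normalized ** gamma
```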

[0067] All of the above embodiments are preferably complemented by an intuitive user interface 110 for gesture-based control 146 and on-board video tutorials for instruction. The system may also operate in an automatic mode that includes slide loading, autofocusing, and taking pictures over a large sample area, often from a slide.

[0068] In accordance with the above embodiments, application software 100 may be configured to include gesture control to: move one or more stages in proportion to the gesture and/or initiate an automated sequence of motions; control imaging features, including focus, exposure, color balance, contrast, etc.; control sample illumination mode (bright field, dark field, phase, etc.) or intensity; control an arrayed LED illumination source to vary the orientation and numerical aperture of the illumination or initiate the acquisition of a series of illumination states; vary the digital magnification of the region of interest and display a corresponding size calibration (the dynamic scale bar); and switch the light collecting optic (i.e. the objective 28) to vary the numerical aperture, resolution, and field of view, etc.
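
The dynamic scale bar mentioned above can be derived from the calibrated sample-plane pixel size and the current digital zoom, as in the sketch below; the pixel-size constant and the 1/2/5 rounding scheme are illustrative assumptions.

```python
import math

SAMPLE_UM_PER_PIXEL = 0.35   # sample-plane size of one screen pixel at 1x zoom (assumption)

def scale_bar(digital_zoom: float, target_px: float = 150.0):
    """Return (length_um, length_px) for a scale bar close to `target_px`
    on-screen pixels, snapped to a 1/2/5 x 10^n micrometer value."""
    um_per_screen_px = SAMPLE_UM_PER_PIXEL / digital_zoom
    raw_um = target_px * um_per_screen_px
    exponent = math.floor(math.log10(raw_um))
    mantissa = raw_um / 10 ** exponent
    nice = min((1.0, 2.0, 5.0, 10.0), key=lambda m: abs(m - mantissa))
    length_um = nice * 10 ** exponent
    return length_um, length_um / um_per_screen_px

print(scale_bar(4.0))  # e.g. scale bar length for 4x digital zoom
```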

[0069] Furthermore, application software 100 may be configured to include modules for analyzing the image captured, and to additionally process the image during stage motion to determine the distance traveled (e.g. by autocorrelation), providing feedback on motor control for pixel-scale positioning accuracy. The application software 100 may be configured to discern the amount of relevant material in the field of view and automatically capture fields above a user-defined threshold while passing over the remainder, and additionally process the image to determine a reference point based on feature recognition within the sample.

[0070] Application software 100 may also incorporate electronic medical record keeping, and GPS geotagging of acquired images, where geotagging is used for disease epidemiology, outbreak monitoring, or patient follow-up planning. Functionality for SMS text-messages or voice calls may be implemented, which relay information about the data collected by the device to stakeholders (users, supervisors, doctors, patients, etc.). The application software 100 may be configured to transmit images, videos, analysis results, device usage data, and medical record data to a cloud server for analysis, storage, or dissemination, as well as analyze collected images or videos and provide an automated clinical diagnosis.

[0071] Further aspects of application software 100 include functionality to detect bacteria in stained specimens; provide sub-images of interest to the user for evaluation; apply a support-vector machine algorithm which uses shape metrics to classify features within an image; automatically align and stitch neighboring fields, creating a larger image mosaic; acquire a mosaic of the slide area at a lower capture resolution and present it to the user for selection of a region of interest to be captured at a higher resolution; and detect a calibration target and automatically adjust camera, stage, or analysis settings depending on the features of said target.

[0072] As a result of this novel combination of hardware and software features, the system 10 represents a platform for microscopic data acquisition in low-resource or non-traditional locations where infrastructure and user skill may be insufficient for existing technologies.

[0073] Embodiments of the present technology may be described herein with reference to methods and systems according to embodiments of the technology, and/or procedures, algorithms, steps, operations, formulae, or other computational depictions, including flowchart illustrations thereof, which may also be implemented as computer program products. In this regard, each block or step of a flowchart, and combinations of blocks (and/or steps) in a flowchart, as well as any procedure, algorithm, step, operation, formula, or computational depiction can be implemented by various means, such as hardware, firmware, and/or software including one or more computer program instructions embodied in computer-readable program code. As will be appreciated, any such computer program instructions may be executed by one or more computer processors, including without limitation a general purpose computer or special purpose computer, or other programmable processing apparatus to produce a machine, such that the computer program instructions which execute on the computer processor(s) or other programmable processing apparatus create means for implementing the function(s) specified.

[0074] Accordingly, blocks of the flowcharts, and procedures, algorithms, steps, operations, formulae, or computational depictions described herein support combinations of means for performing the specified function(s), combinations of steps for performing the specified function(s), and computer program instructions, such as embodied in computer-readable program code logic means, for performing the specified function(s). It will also be understood that each block of the flowchart illustrations, as well as any procedures, algorithms, steps, operations, formulae, or computational depictions and combinations thereof described herein, can be implemented by special purpose hardware-based computer systems which perform the specified function(s) or step(s), or combinations of special purpose hardware and computer-readable program code.

[0075] Furthermore, these computer program instructions, such as embodied in computer-readable program code, may also be stored in one or more computer-readable memory or memory devices that can direct a computer processor or other programmable processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory or memory devices produce an article of manufacture including instruction means which implement the function specified in the block(s) of the flowchart(s). The computer program instructions may also be executed by a computer processor or other programmable processing apparatus to cause a series of operational steps to be performed on the computer processor or other programmable processing apparatus to produce a computer-implemented process such that the instructions which execute on the computer processor or other programmable processing apparatus provide steps for implementing the functions specified in the block(s) of the flowchart(s), procedure(s), algorithm(s), step(s), operation(s), formula(e), or computational depiction(s).

[0076] It will further be appreciated that the terms "programming" or "program executable" as used herein refer to one or more instructions that can be executed by one or more computer processors to perform one or more functions as described herein. The instructions can be embodied in software, in firmware, or in a combination of software and firmware. The instructions can be stored local to the device in non-transitory media, or can be stored remotely such as on a server, or all or a portion of the instructions can be stored locally and remotely. Instructions stored remotely can be downloaded (pushed) to the device by user initiation, or automatically based on one or more factors.

[0077] It will further be appreciated that, as used herein, the terms processor, computer processor, central processing unit (CPU), and computer are used synonymously to denote a device capable of executing the instructions and communicating with input/output interfaces and/or peripheral devices, and that the terms processor, computer processor, CPU, and computer are intended to encompass single or multiple devices, single core and multicore devices, and variations thereof.

[0078] From the description herein, it will be appreciated that the present disclosure encompasses multiple embodiments which include, but are not limited to, the following:

[0079] 1. An apparatus comprising a mobile device (phone, tablet, etc.) with a touch screen and camera, integrated into a microscope with an automated stage that moves a sample below an objective.

[0080] 2. The apparatus of any preceding embodiment, with an internal rechargeable battery or a switchable external battery.

[0081] 3. The apparatus of any preceding embodiment, where the touch screen is used to display the magnified images, control the stage movement through hand gestures, and/or control data collection, analysis, and transmission.

[0082] 4. The apparatus of any preceding embodiment, where hand gestures either in contact with the screen or detected by a non-contact method (e.g. front-facing camera) control microscope, stage, and data functions.

[0083] 5. The apparatus of any preceding embodiment, where gestures control stage movement through a wired or wireless (e.g. Bluetooth) connection to motors that control sample movement and focus through specific sequences (e.g. double taps, long taps, etc).

[0084] 6. The apparatus of any preceding embodiment, where the motors move the sample and/or the optical system (in the case of an infinity optics system) to collect data.

[0085] 7. The apparatus of any preceding embodiment, where software on the mobile device under gesture control is used to move the stage in proportion to the gesture and/or initiate an automated sequence of motions.

[0086] 8. The apparatus of any preceding embodiment, where software on the mobile device under gesture control is used to control imaging features, including focus, exposure, color balance, contrast, etc.

[0087] 9. The apparatus of any preceding embodiment, where software on the mobile device under gesture control is used to control sample illumination mode (bright field, dark field, phase, etc) or intensity.

[0088] 10. The apparatus of any preceding embodiment, where software on the mobile device under gesture control is used to control an arrayed LED illumination source to vary the orientation and numerical aperture of the illumination or initiate the acquisition of a series of illumination states.

[0089] 11. The apparatus of any preceding embodiment, where software on the mobile device under gesture control is used to vary the digital magnification of the region of interest and display a corresponding size calibration (the dynamic scale bar).
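By way of non-limiting illustration, the on-screen length of a dynamic scale bar follows directly from the optical magnification, the sensor pixel pitch, and the current digital zoom. The constants in the following C++ sketch are placeholders only and are not values taken from the disclosure:

#include <cstdio>

// Illustrative sketch: compute how many screen pixels a 25 micrometer scale bar
// should span. All numeric values below are placeholder assumptions.
int main() {
    const double sensorPixelUm   = 1.5;   // camera pixel pitch at the sensor (um)
    const double opticalMag      = 10.0;  // objective plus relay magnification
    const double digitalZoom     = 2.0;   // current pinch-zoom factor
    const double displayPerImage = 1.0;   // screen pixels per image pixel at zoom 1

    // Sample-plane distance covered by one image pixel.
    const double umPerImagePixel = sensorPixelUm / opticalMag;
    const double barUm = 25.0;
    const double barPixels = (barUm / umPerImagePixel) * digitalZoom * displayPerImage;
    std::printf("A %.0f um bar spans %.1f screen pixels\n", barUm, barPixels);
    return 0;
}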

[0090] 12. The apparatus of any preceding embodiment, where software on the mobile device under gesture control is used to switch the light collecting optic (the objective) to vary the numerical aperture, resolution, and field of view.

[0091] 13. The apparatus of any preceding embodiment, where the illumination and/or collection optics utilize a compact folded optical path.

[0092] 14. The apparatus of any preceding embodiment, where the camera is separate from the mobile device and controlled wirelessly.

[0093] 15. The apparatus of any preceding embodiment, where activation of proximity sensors (e.g., Hall effect or limit switches) on each axis is used to represent an origin point for the sample position in the microscope.

[0094] 16. The apparatus of any preceding embodiment, where software for analyzing the image captured can additionally process the image during stage motion to determine the distance traveled (e.g. by autocorrelation), providing feedback on motor control for pixel-scale positioning accuracy.
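By way of non-limiting illustration, the image shift between two frames captured during stage motion can be estimated with phase correlation, a frequency-domain form of cross-correlation. The following C++ sketch assumes OpenCV as a dependency and single-channel (grayscale) frames; the helper name is a placeholder:

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

// Illustrative sketch: estimate the sub-pixel displacement of currGray relative
// to prevGray using phase correlation. Inputs are grayscale frames.
cv::Point2d estimateShift(const cv::Mat& prevGray, const cv::Mat& currGray) {
    cv::Mat prevF, currF;
    prevGray.convertTo(prevF, CV_32F);
    currGray.convertTo(currF, CV_32F);
    cv::Mat window;
    cv::createHanningWindow(window, prevF.size(), CV_32F);
    double response = 0.0;
    cv::Point2d shift = cv::phaseCorrelate(prevF, currF, window, &response);
    return shift;  // (dx, dy) in pixels
}

Multiplying the returned pixel shift by a calibrated micrometers-per-pixel factor yields the distance actually traveled, which can then be compared against the commanded step count to correct the motor control loop.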

[0095] 17. The apparatus of any preceding embodiment, where software on-board can discern the amount of relevant material in the field of view and automatically capture fields above a user-defined threshold while passing over the remainder.
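By way of non-limiting illustration, the amount of relevant material in a field of view can be estimated by thresholding the image and measuring the foreground fraction. The following C++ sketch assumes OpenCV, an 8-bit grayscale field, Otsu thresholding, and stained material darker than background; these choices and the function name are assumptions, not the patent's specific method:

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

// Illustrative sketch: decide whether a field of view contains enough relevant
// material to be worth capturing, given a user-defined foreground fraction.
bool fieldWorthCapturing(const cv::Mat& gray, double minForegroundFraction) {
    cv::Mat binary;
    // THRESH_BINARY_INV assumes stained material is darker than the background.
    cv::threshold(gray, binary, 0, 255, cv::THRESH_BINARY_INV | cv::THRESH_OTSU);
    const double fraction =
        static_cast<double>(cv::countNonZero(binary)) / static_cast<double>(gray.total());
    return fraction >= minForegroundFraction;
}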

[0096] 18. The apparatus of any preceding embodiment, where onboard software incorporates tutorials to guide the user of the instrument.

[0097] 19. The apparatus of any preceding embodiment, where software for analyzing the image captured can additionally process the image to determine a reference point in the sample based on feature recognition.

[0098] 20. The apparatus of any preceding embodiment, where illumination optics and interference filters are used for fluorescence imaging.

[0099] 21. The apparatus of any preceding embodiment, where the illumination optics and filters are changed for multi-channel fluorescence imaging.

[00100] 22. The apparatus of any preceding embodiment, incorporating a calibration sample which is automatically positioned under the collection optics during a "self test" routine in order to provide calibration images for focus, field flatness, illumination intensity, illumination uniformity, exposure, contrast, optical resolution, or stage translation/backlash.

[00101] 23. The apparatus of any preceding embodiment, where results from the self test routine are compared with thresholds set for performance along dimensions of calibration images for focus, field flatness, illumination intensity, illumination uniformity, exposure, contrast, optical resolution, and stage translation/backlash to alert the user for maintenance and automatically order replacement parts to be delivered to the location of the instrument.

[00102] 24. The apparatus of any preceding embodiment, with vibration-isolation feet attached to the base.

[00103] 25. The apparatus of any preceding embodiment, incorporating electronic medical record keeping.

[00104] 26. The apparatus of any preceding embodiment, incorporating GPS geotagging of acquired images.

[00105] 27. The device and GPS geotagging feature above, in which geotagging is used for disease epidemiology, outbreak monitoring, or patient follow-up planning.

[00106] 28. The apparatus of any preceding embodiment, incorporating software for SMS text-message or voice calls which relay information about the data collected by the device to stakeholders (users, supervisors, doctors, patients, etc.).

[00107] 29. The apparatus of any preceding embodiment, incorporating software to transmit images, videos, analysis results, device usage data, and medical record data to a cloud server for analysis, storage, or dissemination.

[00108] 30. The apparatus of any preceding embodiment, where the mobile device incorporates analysis software to analyze collected images or videos.

[00109] 31. The apparatus of any preceding embodiment, where the software provides an automated clinical diagnosis.

[00110] 32. The apparatus of any preceding embodiment, where the software is a support-vector machine algorithm which uses shape metrics to classify features within an image.
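By way of non-limiting illustration, a support-vector machine operating on shape metrics can be built from detected object contours. The following C++ sketch assumes OpenCV's ml module; the particular metrics (area, perimeter, circularity), the RBF kernel, and the function names are assumptions for illustration only:

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/ml.hpp>
#include <vector>

// Illustrative sketch: 1x3 feature row of simple shape metrics for one contour.
static cv::Mat shapeFeatures(const std::vector<cv::Point>& contour) {
    const double area = cv::contourArea(contour);
    const double perim = cv::arcLength(contour, true);
    const double circularity = (perim > 0) ? 4.0 * CV_PI * area / (perim * perim) : 0.0;
    cv::Mat f = (cv::Mat_<float>(1, 3) << (float)area, (float)perim, (float)circularity);
    return f;
}

// Train an SVM classifier from labeled contours (e.g. organism vs. debris).
cv::Ptr<cv::ml::SVM> trainShapeClassifier(const std::vector<std::vector<cv::Point>>& contours,
                                          const std::vector<int>& labels) {
    cv::Mat samples, responses;
    for (size_t i = 0; i < contours.size(); ++i) {
        samples.push_back(shapeFeatures(contours[i]));
        responses.push_back(labels[i]);
    }
    cv::Ptr<cv::ml::SVM> svm = cv::ml::SVM::create();
    svm->setType(cv::ml::SVM::C_SVC);
    svm->setKernel(cv::ml::SVM::RBF);
    svm->train(samples, cv::ml::ROW_SAMPLE, responses);
    return svm;
}

// Predict the class label of a single detected object.
int classifyObject(const cv::Ptr<cv::ml::SVM>& svm, const std::vector<cv::Point>& contour) {
    return static_cast<int>(svm->predict(shapeFeatures(contour)));
}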

[00111] 33. The apparatus of any preceding embodiment, where the software detects bacteria in stained specimens.

[00112] 34. The apparatus of any preceding embodiment, where the software presents sub-images of interest to the user for evaluation.

[00113] 35. The apparatus of any preceding embodiment, where the software automatically aligns and stitches neighboring fields, creating a larger image mosaic.
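By way of non-limiting illustration, neighboring fields can be aligned and blended with OpenCV's high-level stitching pipeline. The sketch below is an assumption-laden stand-in rather than the disclosed implementation: it assumes OpenCV is available, and it uses the SCANS mode, which models mostly translational overlap of a flat specimen as in slide scanning:

#include <opencv2/core.hpp>
#include <opencv2/stitching.hpp>
#include <vector>

// Illustrative sketch: combine overlapping fields into one mosaic image.
bool buildMosaic(const std::vector<cv::Mat>& fields, cv::Mat& mosaic) {
    cv::Ptr<cv::Stitcher> stitcher = cv::Stitcher::create(cv::Stitcher::SCANS);
    return stitcher->stitch(fields, mosaic) == cv::Stitcher::OK;
}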

[00114] 36. The apparatus of any preceding embodiment, where the software can acquire a mosaic of the slide area at a lower capture resolution and present it to the user for selection of the region of interest to be captured at a higher resolution.

[00115] 37. The apparatus of any preceding embodiment, where the software detects a calibration target and automatically adjusts camera, stage, or analysis settings depending on the features of said target.

[00116] 38. The apparatus of any preceding embodiment, where low-cost rack and pinion parts are fabricated and meshed utilizing the complementary laser cutter kerfs to ensure precise meshing.

[00117] 39. An automated slide scanning system, comprising; one or more optical elements having an optical path configured to be disposed within view of a camera of a portable device; the one or more optical elements comprising an objective; an automated stage disposed within the optical path; the automated stage comprising a platform configured for receiving a slide containing a biological sample; the automated stage comprising a drive mechanism for translating the stage in at least one direction with respect to said objective; and a communications interface coupled to the automated stage; wherein the communication interface is configured for receiving a command from the portable device to control operation of the mechanical stage.

[00118] 40. The system of any preceding embodiment, the portable device comprising a cellular device having a processor and memory, the system further comprising: application programming configured to be stored as instructions in non-transitory computer-readable memory on the portable device; and wherein the application programming is executable on the processor for executing the command.

[00119] 41. The system of any preceding embodiment: wherein the automated stage comprises an x-y stage comprising independent drive mechanisms for x-axis translation and y-axis translation of the slide with respect to the objective; and wherein the application programming is configured for controlling motion of the x-y stage via one or more commands initiated from the portable device.

[00120] 42. The system of any preceding embodiment: wherein the automated stage comprises a z-axis stage comprising an independent drive mechanism for z-axis translation of the slide with respect to the objective; and wherein the application programming is configured for controlling motion of the stage via one or more commands initiated from the portable device to focus imaging of the sample on the slide.

[00121] 43. The system of any preceding embodiment, further comprising: an illumination source coupled to the automated stage, the illumination source configured to direct light at the slide; and wherein the programming is further configured to control operation of the illumination source via one or more commands initiated at the portable device.

[00122] 44. The system of any preceding embodiment, wherein the objective comprises an infinity objective, and the optical elements further comprise a plurality of mirrors folding the light path into a compact light path in line with an achromatic tube lens and the infinity objective.

[00123] 45. The system of any preceding embodiment: wherein the portable device comprises a touch screen; and wherein the application software comprises a gesture recognition module configured to control motion of the automated stage in response to one or more hand gestures applied to the touch screen.

[00124] 46. The system of any preceding embodiment: wherein the communications interface comprises a wireless communication interface, and wherein motion of the automatic stage is controlled wirelessly from one or more hand gestures initiated at the touch screen.

[00125] 47. The system of any preceding embodiment, wherein translation of the automatic stage is in proportion to the one or more gestures.

[00126] 48. The system of any preceding embodiment, wherein translation of the automatic stage is effected as an automated sequence of motions in response to the one or more gestures.

[00127] 49. The system of any preceding embodiment, wherein the programming is further configured to control one or more of bright field, dark field, phase, or intensity of the light directed by the illumination source.

[00128] 50. The system of any preceding embodiment: wherein the illumination source comprises an LED array; and wherein the application programming is further configured to vary the orientation and numerical aperture of the illumination light directed by the illumination source or initiate the acquisition of a series of illumination states with respect to the light directed by the illumination source.

[00129] 52. The system of any preceding embodiment, wherein the application programming is configured to vary one or more of the numerical aperture, resolution, and field of view of the objective.

[00130] 53. The system of any preceding embodiment, wherein the application programming is configured to vary a digital magnification of a region of interest of the sample and display a corresponding size calibration corresponding to the magnification.

[00131] 54. The system of any preceding embodiment, wherein the illumination source and application programming are configured for fluorescence imaging of the sample.

[00132] 55. The system of any preceding embodiment, further comprising: a calibration sample positioned within the optical path; wherein the application software is configured to perform a calibration routine to provide calibration images of the sample of one or more of: focus, field flatness, illumination intensity, illumination uniformity, exposure, contrast, optical resolution, or stage translation/backlash.

[00133] 56. The system of any preceding embodiment, wherein the application programming is configured to apply one or more shape metrics to classify features within an image of the sample.

[00134] 57. The system of any preceding embodiment, wherein the application programming is configured to process an image of the sample during motion of the automatic stage to determine a distance traveled and provide feedback for pixel-scale positioning of the automatic stage.

[00135] 58. The system of any preceding embodiment, further comprising: one or more proximity sensors coupled to the automatic stage; and wherein the one or more proximity sensors provide position data of the automatic stage for establishing one or more of an origin point or limit for translation of the automatic stage.

[00136] 59. The system of any preceding embodiment, wherein the drive mechanism comprises a stepper motor coupled to a rack and pinion.

[00137] 60. An apparatus for automatically scanning and imaging a slide containing a biological sample, comprising: (a) one or more optical elements having an optical path configured to be disposed within view of a camera of a portable device; (b) the one or more optical elements comprising an objective; (c) an automated stage disposed within the optical path, the automated stage comprising a platform configured for receiving a slide containing a biological sample and a drive mechanism for translating the stage in at least one direction with respect to said objective; (d) a communications interface coupled to the automated stage; (e) a computer processor; and (f) a non-transitory computer-readable memory storing instructions executable by the computer processor; (g) wherein said instructions, when executed by the computer processor, perform steps comprising: (i) receiving a command from the portable device; (ii) sending the command via the communication interface to the automatic stage; and (iii) translating the mechanical stage in response to said command.

[00138] 61. The system of any preceding embodiment: wherein the automated stage comprises an x-y stage comprising independent drive mechanisms for x-axis translation and y-axis translation of the slide with respect to the objective; and wherein said instructions when executed by the computer processor are further configured to control motion of the x-y stage via one or more commands initiated from the portable device.

[00139] 62. The system of any preceding embodiment: wherein the automated stage comprises a z-axis stage comprising an independent drive mechanism for z-axis translation of the slide with respect to the objective; and wherein said instructions when executed by the computer processor are further configured to control motion of the stage via one or more commands initiated from the portable device to focus imaging of the sample on the slide.

[00140] 63. The system of any preceding embodiment, further comprising: an illumination source coupled to the automated stage, the illumination source configured to direct light at the slide; and wherein said instructions when executed by the computer processor are further configured to control operation of the illumination source via one or more commands initiated at the portable device.

[00141] 64. The system of any preceding embodiment, wherein the objective comprises an infinity objective, and the optical elements further comprise a plurality of mirrors folding the light path into a compact light path in line with an achromatic tube lens and the infinity objective.

[00142] 65. The system of any preceding embodiment: wherein the portable device comprises a touch screen; and wherein said instructions when executed by the computer processor are further configured to control motion of the automated stage in response to one or more hand gestures applied to the touch screen.

[00143] 66. The system of any preceding embodiment: wherein the communications interface comprises a wireless communication interface; and wherein motion of the automatic stage is controlled wirelessly from one or more hand gestures initiated at the touch screen.

[00144] 67. The system of any preceding embodiment, wherein translation of the automatic stage is in proportion to the one or more gestures.

[00145] 68. The system of any preceding embodiment, wherein translation of the automatic stage is effected as an automated sequence of motions in response to the one or more gestures.

[00146] 69. The system of any preceding embodiment, wherein said instructions when executed by the computer processor are further configured to control one or more of bright field, dark field, phase, or intensity of the light directed by the illumination source.

[00147] 70. The system of any preceding embodiment: wherein the illumination source comprises an LED array; and wherein said instructions when executed by the computer processor are further configured to vary the orientation and numerical aperture of the illumination light directed by the illumination source or initiate the acquisition of a series of illumination states with respect to the light directed by the illumination source.

[00148] 71. The system of any preceding embodiment, wherein said instructions when executed by the computer processor are further configured to vary one or more of the numerical aperture, resolution, and field of view of the objective.

[00149] 72. The system of any preceding embodiment, wherein said instructions when executed by the computer processor are further configured to vary a digital magnification of a region of interest of the sample and display a corresponding size calibration corresponding to the magnification.

[00150] 73. The system of any preceding embodiment, wherein the illumination source is configured for fluorescence imaging of the sample.

[00151] 74. The system of any preceding embodiment, further comprising: a calibration sample positioned within the optical path; wherein said instructions when executed by the computer processor are further configured to calibrate images of the sample of one or more of: focus, field flatness, illumination intensity, illumination uniformity, exposure, contrast, optical resolution, or stage translation/backlash.

[00152] 75. The system of any preceding embodiment, wherein said instructions when executed by the computer processor are further configured to apply one or more shape metrics to classify features within an image of the sample.

[00153] 76. The system of any preceding embodiment, wherein said instructions when executed by the computer processor are further configured to process an image of the sample during motion of the automatic stage to determine a distance traveled and provide feedback for pixel-scale positioning of the automatic stage.

[00154] 77. The system of any preceding embodiment, further comprising: one or more proximity sensors coupled to the automatic stage; wherein the one or more proximity sensors provide position data of the automatic stage for establishing one or more of an origin point or limit for translation of the automatic stage.

[00155] 78. A method for automatically scanning and imaging a slide containing a biological sample, comprising: providing an automated stage having an optical path and an objective disposed within the optical path, the automated stage comprising a platform configured for receiving a slide containing a biological sample and a drive mechanism for translating the stage in at least one direction with respect to said objective; positioning a camera of a portable device in view of the optical path; sending a command from the portable device to the automatic stage; translating the mechanical stage in response to said command; and imaging the biological sample with the camera.

[00156] 79. The method of any preceding embodiment, wherein the portable device comprises a touch screen, the method further comprising: controlling motion of the automated stage in response to one or more hand gestures applied to the touch screen.

[00157] 80. The method of any preceding embodiment, the method further comprising: controlling motion of the stage via one or more commands initiated from the portable device to focus imaging of the sample on the slide.

[00158] Although the description herein contains many details, these should not be construed as limiting the scope of the disclosure but as merely providing illustrations of some of the presently preferred embodiments. Therefore, it will be appreciated that the scope of the disclosure fully encompasses other embodiments which may become obvious to those skilled in the art.

[00159] In the claims, reference to an element in the singular is not intended to mean "one and only one" unless explicitly so stated, but rather "one or more." All structural, chemical, and functional equivalents to the elements of the disclosed embodiments that are known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the present claims. Furthermore, no element, component, or method step in the present disclosure is intended to be dedicated to the public regardless of whether the element, component, or method step is explicitly recited in the claims. No claim element herein is to be construed as a "means plus function" element unless the element is expressly recited using the phrase "means for". No claim element herein is to be construed as a "step plus function" element unless the element is expressly recited using the phrase "step for".

Table 1

Example Software Embodiment

Firmware for Arduino Mega:

ArduinoFirmware_sent.ino

#include <SPI.h>
#include <boards.h>
#include <ble_shield.h>
#include <services.h>
#include <Servo.h>

// Stepper driver step/direction/enable pins, limit switches, LEDs, and microstep select.
#define STEP_PIN 28
#define STEP_PIN_2 31
#define STEP_PIN_3 30
#define STEP_OBCHANGE 29
#define EN_1 24
#define EN_2 33
#define EN_3 34
#define DIR_PIN 35
#define DIR_PIN_2 26
#define DIR_PIN_3 32
#define xLimit 44
#define yLimit 38
#define zLimit 40
#define LED_1 46
#define LED_2 47
#define MICRO_STEP_PIN 41

Servo myservo;

void setup()
{

  pinMode(STEP_PIN, OUTPUT);
  pinMode(STEP_PIN_2, OUTPUT);
  pinMode(STEP_PIN_3, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  pinMode(STEP_OBCHANGE, OUTPUT);
  pinMode(DIR_PIN_2, OUTPUT);
  pinMode(DIR_PIN_3, OUTPUT);
  pinMode(EN_1, OUTPUT);
  pinMode(EN_2, OUTPUT);
  pinMode(EN_3, OUTPUT);
  pinMode(zLimit, INPUT_PULLUP);
  pinMode(xLimit, INPUT_PULLUP);
  pinMode(yLimit, INPUT_PULLUP);
  pinMode(LED_1, OUTPUT);
  pinMode(LED_2, OUTPUT);
  pinMode(MICRO_STEP_PIN, OUTPUT);
  digitalWrite(STEP_PIN, HIGH);
  digitalWrite(STEP_PIN_2, HIGH);
  digitalWrite(STEP_PIN_3, HIGH);
  digitalWrite(DIR_PIN, HIGH);
  digitalWrite(STEP_OBCHANGE, HIGH);
  digitalWrite(DIR_PIN_2, HIGH);
  digitalWrite(DIR_PIN_3, HIGH);
  digitalWrite(EN_1, HIGH);
  digitalWrite(EN_2, HIGH);
  digitalWrite(EN_3, HIGH);
  digitalWrite(MICRO_STEP_PIN, HIGH);
  digitalWrite(LED_1, LOW);
  digitalWrite(LED_2, HIGH);
  SPI.setDataMode(SPI_MODE0);
  SPI.setBitOrder(LSBFIRST);
  SPI.setClockDivider(SPI_CLOCK_DIV16);
  SPI.begin();
  ble_begin();
}

void loop()
{
  static boolean analog_enabled = false;
  static byte old_state = LOW;
  // If data is ready
  while (ble_available())
  {
    // read out command and data
    byte data0 = ble_read();
    byte data1 = ble_read();
    byte data2 = ble_read();
    if (data0 == 0x02) // Command is to step (motor 1 by data1 counts, motor 2 by data2 counts)
    {
      int i = 0;
      while (i < data1) {
        digitalWrite(STEP_PIN, HIGH);
        delayMicroseconds(500);
        digitalWrite(STEP_PIN, LOW);
        delayMicroseconds(500);
        i = i + 1;
      }
      i = 0;
      while (i < data2) {
        digitalWrite(STEP_PIN_2, HIGH);
        delayMicroseconds(500);
        digitalWrite(STEP_PIN_2, LOW);
        delayMicroseconds(500);
        i = i + 1;
      }
    }

    else if (data0 == 0x03) // Command is to set dir high
      digitalWrite(DIR_PIN, HIGH);
    else if (data0 == 0x13) // Command is to set dir low
      digitalWrite(DIR_PIN, LOW);
    else if (data0 == 0x04) // Command is to set EN1 to high
      digitalWrite(EN_1, HIGH);
    else if (data0 == 0x14) // Command is to set EN1 to low
      digitalWrite(EN_1, LOW);
    else if (data0 == 0x05) // Command is to set EN2 to high
      digitalWrite(EN_2, HIGH);
    else if (data0 == 0x15) // Command is to set EN2 to low
      digitalWrite(EN_2, LOW);
    else if (data0 == 0x06) // Command is to set EN3 to high
      digitalWrite(EN_3, HIGH);
    else if (data0 == 0x16) // Command is to set EN3 to low
      digitalWrite(EN_3, LOW);
    else if (data0 == 0x07) // Command is to set obchange dir high
      digitalWrite(DIR_PIN_2, HIGH);
    else if (data0 == 0x17) // Command is to set obchange dir low
      digitalWrite(DIR_PIN_2, LOW);
    else if (data0 == 0x18) // Command is to obchange step (objective changer motor)
    {
      int i = 0;
      while (i < data1) {
        digitalWrite(STEP_OBCHANGE, HIGH);
        delay(1);
        digitalWrite(STEP_OBCHANGE, LOW);
        delay(1);
        i = i + 1;
      }
    }
    else if (data0 == 0x19) // Command is to step motor 2
    {
      int i = 0;
      while (i < data1) {
        digitalWrite(STEP_PIN_2, HIGH);
        delayMicroseconds(500);
        digitalWrite(STEP_PIN_2, LOW);
        delayMicroseconds(500);
        i = i + 1;
      }
    }
    else if (data0 == 0x20) // Command is to step motor 3
    {
      int i = 0;
      while (i < data1) {
        digitalWrite(STEP_PIN_3, HIGH);
        delayMicroseconds(500);
        digitalWrite(STEP_PIN_3, LOW);
        delayMicroseconds(500);
        i = i + 1;
      }
    }

    else if (data0 == 0x21) // Command is to set to 1/16th step
      digitalWrite(MICRO_STEP_PIN, LOW);
    else if (data0 == 0x22) // Command is to half step
    {
      digitalWrite(MICRO_STEP_PIN, HIGH);
    }
    else if (data0 == 0x23) // Command is to home the x and y axes against their limit switches
    {
      // move x to limit
      digitalWrite(DIR_PIN_2, HIGH);
      while (digitalRead(xLimit) == HIGH)
      {
        digitalWrite(STEP_PIN, HIGH);
        delayMicroseconds(500);
        digitalWrite(STEP_PIN, LOW);
        delayMicroseconds(500);
      }
      // move x away from limit
      digitalWrite(DIR_PIN_2, LOW);
      while (digitalRead(xLimit) == LOW)
      {
        digitalWrite(STEP_PIN, HIGH);
        delayMicroseconds(500);
        digitalWrite(STEP_PIN, LOW);
        delayMicroseconds(500);
      }
      // move y to limit
      digitalWrite(DIR_PIN, LOW);
      while (digitalRead(yLimit) == HIGH)
      {
        digitalWrite(STEP_PIN_2, HIGH);
        delayMicroseconds(500);
        digitalWrite(STEP_PIN_2, LOW);
        delayMicroseconds(500);
      }
      // move y away from limit
      digitalWrite(DIR_PIN, HIGH);
      while (digitalRead(yLimit) == LOW)
      {
        digitalWrite(STEP_PIN_2, HIGH);
        delayMicroseconds(500);
        digitalWrite(STEP_PIN_2, LOW);
        delayMicroseconds(500);
      }
    }

    else if (data0 == 0x24) // Command is to home the z axis against its limit switch
    {
      // move z to limit
      digitalWrite(DIR_PIN_3, HIGH);
      while (digitalRead(zLimit) == LOW) {
        digitalWrite(STEP_PIN_3, HIGH);
        delayMicroseconds(500);
        digitalWrite(STEP_PIN_3, LOW);
        delayMicroseconds(500);
      }
      // move z away from limit
      digitalWrite(DIR_PIN_3, LOW);
      while (digitalRead(zLimit) == HIGH) {
        digitalWrite(STEP_PIN_3, HIGH);
        delayMicroseconds(500);
        digitalWrite(STEP_PIN_3, LOW);
        delayMicroseconds(500);
      }
    }
    else if (data0 == 0x08) // Command is to set dir3 high
      digitalWrite(DIR_PIN_3, HIGH);
    else if (data0 == 0x09) // Command is to set dir3 low
      digitalWrite(DIR_PIN_3, LOW);
    else if (data0 == 0x25) // Command is to set LED 1 low
      digitalWrite(LED_1, LOW);
    else if (data0 == 0x26) // Command is to set LED 1 high
      digitalWrite(LED_1, HIGH);
    else if (data0 == 0x27) // Command is to set LED 2 low
      digitalWrite(LED_2, LOW);
    else if (data0 == 0x28) // Command is to set LED 2 high
      digitalWrite(LED_2, HIGH);
  }
  if (!ble_connected())
  {
  }
  // Allow BLE Shield to send/receive data
  ble_do_events();
}
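As the loop() above shows, the firmware consumes each command as a three-byte packet: an opcode byte (data0) followed by two payload bytes (data1, data2); for example, 0x02 steps motor 1 by data1 counts and motor 2 by data2 counts, 0x03/0x13 set the direction pin, and 0x23 triggers x-y homing. A minimal host-side sketch of that packing is shown below; the sendOverBle() function is a hypothetical placeholder for whatever BLE write call the host platform provides, and only the opcodes are taken from the listing above:

#include <array>
#include <cstdint>

// Hypothetical placeholder for the platform-specific BLE characteristic write.
void sendOverBle(const std::array<uint8_t, 3>& packet) {
    (void)packet;  // platform BLE write would go here
}

// Opcodes taken from the firmware listing above.
constexpr uint8_t CMD_STEP_XY  = 0x02;  // step motor 1 by data1, motor 2 by data2
constexpr uint8_t CMD_DIR_HIGH = 0x03;  // set DIR_PIN high
constexpr uint8_t CMD_DIR_LOW  = 0x13;  // set DIR_PIN low
constexpr uint8_t CMD_HOME_XY  = 0x23;  // home x and y against their limit switches

std::array<uint8_t, 3> makePacket(uint8_t opcode, uint8_t data1 = 0, uint8_t data2 = 0) {
    return {opcode, data1, data2};
}

void exampleMove() {
    sendOverBle(makePacket(CMD_DIR_HIGH));         // choose direction first
    sendOverBle(makePacket(CMD_STEP_XY, 40, 25));  // 40 steps on motor 1, 25 on motor 2
}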

iOS Software as part of application:

AppDelegate.h

#import <UIKit/UIKit.h>

@interface AppDelegate : UIResponder <UIApplicationDelegate> {
    NSManagedObjectModel *managedObjectModel;
    NSManagedObjectContext *managedObjectContext;
    NSPersistentStoreCoordinator *persistentStoreCoordinator;
}

@property (strong, nonatomic) UIWindow *window;
@property (nonatomic, retain, readonly) NSManagedObjectModel *managedObjectModel;
@property (nonatomic, retain, readonly) NSManagedObjectContext *managedObjectContext;
@property (nonatomic, retain, readonly) NSPersistentStoreCoordinator *persistentStoreCoordinator;

- (NSString *)applicationDocumentsDirectory;

@end

AppDelegate.m

//
//  AppDelegate.m
//  MiniScope
//
//  Created by Mike D'Ambrosio on 2/22/14.
//  Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.
//

#import "AppDelegate.h"
#import "ViewController.h"
#import <CoreData/CoreData.h>

@interface AppDelegate ()
@end

@implementation AppDelegate

// Lazily create the Core Data managed object context.
- (NSManagedObjectContext *)managedObjectContext {
    if (managedObjectContext != nil) {
        return managedObjectContext;
    }
    NSPersistentStoreCoordinator *coordinator = [self persistentStoreCoordinator];
    if (coordinator != nil) {
        managedObjectContext = [[NSManagedObjectContext alloc] init];
        [managedObjectContext setPersistentStoreCoordinator:coordinator];
    }
    return managedObjectContext;
}

- (NSManagedObjectModel *)managedObjectModel {
    if (managedObjectModel != nil) {
        return managedObjectModel;
    }
    managedObjectModel = [NSManagedObjectModel mergedModelFromBundles:nil];
    return managedObjectModel;
}

- (NSPersistentStoreCoordinator *)persistentStoreCoordinator {
    if (persistentStoreCoordinator != nil) {
        return persistentStoreCoordinator;
    }
    NSURL *storeUrl = [NSURL fileURLWithPath:[[self applicationDocumentsDirectory]
        stringByAppendingPathComponent:@"<Project Name>.sqlite"]];
    NSError *error = nil;
    persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc]
        initWithManagedObjectModel:[self managedObjectModel]];
    if (![persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
        configuration:nil URL:storeUrl options:nil error:&error]) {
    }
    return persistentStoreCoordinator;
}

- (NSString *)applicationDocumentsDirectory {
    return [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
}

- (void)dealloc {
}

// Hand the Core Data context to the root view controller at launch.
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    UINavigationController *rootView = (UINavigationController *)self.window.rootViewController;
    ViewController *wtf = (ViewController *)rootView.visibleViewController;
    wtf.managedObjectContext = self.managedObjectContext;
    return YES;
}

- (void)applicationWillResignActive:(UIApplication *)application
{
}

- (void)applicationDidEnterBackground:(UIApplication *)application
{
}

- (void)applicationWillEnterForeground:(UIApplication *)application
{
}

- (void)applicationDidBecomeActive:(UIApplication *)application
{
}

- (void)applicationWillTerminate:(UIApplication *)application
{
}

@end

ImageProcessor.h

//
//  ImageProcessor.h
//  EyeScope2
//
//  Created by Mike D'Ambrosio on 6/17/13.
//  Copyright (c) 2013 Mike D'Ambrosio. All rights reserved.
//

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AssetsLibrary/AssetsLibrary.h>
#import <AVFoundation/AVFoundation.h>

@interface ImageProcessor : NSObject

// Returns a focus score for the image (variance of its Laplacian).
- (double)calcFocus:(UIImage *)image;

@end

ImageProcessor.mm

//
//  ImageProcessor.mm
//  EyeScope2
//
//  Created by Mike D'Ambrosio on 6/17/13.
//  Copyright (c) 2013 Mike D'Ambrosio. All rights reserved.
//

#import "ImageProcessor.h"
#import <opencv2/opencv.hpp>

@implementation ImageProcessor

- (id)init {
    self = [super init];
    return self;
}

// Focus metric: downsample to 640x480, apply the Laplacian, and use the
// variance of the Laplacian response as the focus score (higher means sharper).
- (double)calcFocus:(UIImage *)image {
    CGSize size = CGSizeMake(640, 480);
    image = [self imageWithImage:image convertToSize:size];
    cv::Mat src = [self cvMatFromUIImage:image];
    cv::Mat lap;
    cv::Laplacian(src, lap, CV_64F);
    cv::Scalar ave, std;
    cv::meanStdDev(lap, ave, std);
    double focusMeasure = std.val[0] * std.val[0];
    return focusMeasure;
}

// Convert a UIImage into an OpenCV matrix (RGBA, 8 bits per channel).
- (cv::Mat)cvMatFromUIImage:(UIImage *)image2
{
    CGColorSpaceRef colorSpace = CGImageGetColorSpace(image2.CGImage);
    CGFloat cols = image2.size.height;
    CGFloat rows = image2.size.width;
    cv::Mat cvMat(rows, cols, CV_8UC4);
    CGContextRef contextRef = CGBitmapContextCreate(cvMat.data, cols, rows, 8,
        cvMat.step[0], colorSpace, kCGImageAlphaNoneSkipLast | kCGBitmapByteOrderDefault);
    CGContextTranslateCTM(contextRef, cols, rows);
    CGContextRotateCTM(contextRef, (-180 * M_PI / 180));
    CGContextDrawImage(contextRef, CGRectMake(0, 0, cols, rows), image2.CGImage);
    CGContextRelease(contextRef);
    return cvMat;
}

// Resize a UIImage to the given size.
- (UIImage *)imageWithImage:(UIImage *)image convertToSize:(CGSize)size {
    UIGraphicsBeginImageContext(size);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *destImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return destImage;
}

@end

Settings.h

//
//  Settings.h
//  MiniScope
//
//  Created by Mike D'Ambrosio on 5/6/14.
//  Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.
//

#import <Foundation/Foundation.h>
#import <CoreData/CoreData.h>

// Core Data entity holding user-adjustable scan settings.
@interface Settings : NSManagedObject

@property (nonatomic, retain) NSNumber *highResScanFields;
@property (nonatomic, retain) NSNumber *fieldsPerFocus;
@property (nonatomic, retain) NSNumber *fineFocusOnly;

@end

Settings.m

//
//  Settings.m
//  MiniScope
//
//  Created by Mike D'Ambrosio on 5/6/14.
//  Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.
//

#import "Settings.h"

@implementation Settings

@dynamic highResScanFields;
@dynamic fieldsPerFocus;
@dynamic fineFocusOnly;

@end

ViewController.h

//
//  ViewController.h
//  MiniScope
//
//  Created by Mike D'Ambrosio on 2/22/14.
//  Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import "BLE.h"
#import "Sequencer.h"
#import "ImageProcessor.h"
#import <CoreData/CoreData.h>
#import "Stage.h"

@interface ViewController : UIViewController
    <UIScrollViewDelegate, UIGestureRecognizerDelegate,
     NSFetchedResultsControllerDelegate, UIAlertViewDelegate> {
    NSFetchedResultsController *fetchedResultsController;
    NSManagedObjectContext *managedObjectContext;
    UIButton *linkButton;
    UIViewController *photoViewController;
}

@property (nonatomic, strong) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureDevice *device;
@property (nonatomic, strong) AVCaptureDeviceInput *input;
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoPreviewOutput;
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoHDOutput;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillOutput;
@property (strong, nonatomic) UIImage *image;
@property (strong, nonatomic) Sequencer *sequence;
@property (strong, nonatomic) ImageProcessor *stitcher;
@property (nonatomic, retain) NSFetchedResultsController *fetchedResultsController;
@property (nonatomic, retain) NSManagedObjectContext *managedObjectContext;
@property (nonatomic, retain) NSNumber *incomingX;
@property (nonatomic, retain) NSNumber *incomingY;
@property (nonatomic, retain) NSString *imageTitle;
@property (nonatomic, retain) Stage *stage;

@end

ViewController.m

//
//  ViewController.m
//  MiniScope
//
//  Created by Mike D'Ambrosio on 2/22/14.
//  Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.
//

#import "ViewController.h"
#import <AssetsLibrary/AssetsLibrary.h>
#import "Sequencer.h"
#import "ImageProcessor.h"
#import "Stage.h"
#import "SettingsViewController.h"
#import "Settings.h"
#import <ImageIO/ImageIO.h>

@interface ViewController ()
@end

@implementation ViewController {
    UIView *previewView;
    UIScrollView *preview;
    UITextField *textField;
    UITextField *title;
    UIButton *focusButton;
    UIButton *bleButton;
    UIButton *highResScanButton;
    UIButton *settingsButton;
    UIButton *reExposeButton;
    UIButton *loadSlideButton;
    UIButton *focusUpButton;
    UIButton *focusDownButton;
    UIButton *resetFocusButton;
    UIButton *snapButton;
    // Current stage position (in motor steps) and the last commanded position.
    float zPos;
    float xPos;
    float yPos;
    float lastYPos;
    float lastXPos;
    UISlider *coarseFocusSlider;
    UISlider *fineFocusSlider;
    // Autofocus bookkeeping: focus score per z position and the best seen so far.
    double focusScore;
    double bestFocusPos;
    double lastSliderVal;
    double bestFocusScore;
    int fineFocus;
    NSMutableArray *focusPositions;
    int sendHalt;
    int currHalfStepping;
    int localSequenceCounter;
    int fieldsPerFocus;
    int continueGhd;
    int fineFocusOnly;
    NSTimer *focusUpTimer;
    NSTimer *focusDownTimer;
    NSMutableArray *imageUrlsForCopy;
}

@synthesize fetchedResultsController, managedObjectContext;
@synthesize session;
@synthesize device;
@synthesize input;
@synthesize videoPreviewOutput, videoHDOutput, stillOutput;
@synthesize captureVideoPreviewLayer;
@synthesize image;
@synthesize sequence;
@synthesize stitcher;
@synthesize imageTitle;
@synthesize stage;

// Allow the stage-control gestures to run alongside the scroll view's own recognizers.
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    return YES;
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}

// Ignore touches that land on buttons so gestures only drive the stage.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldReceiveTouch:(UITouch *)touch {
    if ([touch.view isKindOfClass:[UIButton class]]) {
        return NO;
    }
    return YES;
}

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Initial z position (in steps) and BLE connection to the stage controller.
    zPos = 11200 / 6;
    stage = [[Stage alloc] init];
    [stage connectBLE];
    focusPositions = [[NSMutableArray alloc] init];
    imageUrlsForCopy = [[NSMutableArray alloc] init];
    if (&UIApplicationWillEnterForegroundNotification != nil)
    {
        [[NSNotificationCenter defaultCenter] addObserver:self
            selector:@selector(foregroundSel:)
            name:UIApplicationWillEnterForegroundNotification object:nil];
    }
    // Tap gestures: single and double taps are routed to stage/data functions.
    UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleSingleTap:)];
    [singleTap setNumberOfTapsRequired:1];
    [self.view addGestureRecognizer:singleTap];
    UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleDoubleTap:)];
    [doubleTap setNumberOfTapsRequired:2];
    [self.view addGestureRecognizer:doubleTap];
    UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleLongPress:)];
    //[self.view addGestureRecognizer:longPress];
    //longPress.delegate=self;

    // Zoomable preview of the camera feed.
    CGRect scrollViewFrame = CGRectMake(0, 0, 1024, 768);
    preview = [[UIScrollView alloc] initWithFrame:scrollViewFrame];
    preview.maximumZoomScale = 5;
    preview.zoomScale = 1;
    preview.minimumZoomScale = 1;
    preview.bounces = NO;
    preview.bouncesZoom = NO;
    preview.panGestureRecognizer.cancelsTouchesInView = NO;
    [self.view addSubview:preview];
    previewView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 724, 768)];
    [preview addSubview:previewView];
    [previewView setUserInteractionEnabled:YES];
    preview.bounces = NO;
    preview.delegate = self;
    // Pan gestures over the preview drive x-y stage motion.
    UIPanGestureRecognizer *panGestureRec = [[UIPanGestureRecognizer alloc]
        initWithTarget:self action:@selector(panGestureListener:)];
    panGestureRec.delegate = self;
    [preview addGestureRecognizer:panGestureRec];
    // Dynamic scale bar (red bar plus text label).
    UIView *verticalBar = [[UIView alloc] init];
    [verticalBar setBackgroundColor:[UIColor redColor]];
    [verticalBar setFrame:CGRectMake(750, 675, 100, 5)];
    [[self view] addSubview:verticalBar];
    textField = [[UITextField alloc] initWithFrame:CGRectMake(750, 625, 100, 50)];
    [[self view] addSubview:textField];
    textField.text = @"25 μm";

U I Button * session BorderButton = [U I Button

button WithType : U I ButtonTypeCustom] ;

[session BorderButton setBackgroundColor:[UIColor whiteColor]];

[session BorderButton setTitleColor:[UIColor blackColor]

forState:UIControlStateNormal];

sessionBorderButton.layer.borderWidth = 1 ;

sessionBorderButton.layer.borderColor = [UlColor NghtGrayColor].CGColor; sessionBorderButton.layer.cornerRadius = 8;

sessionBorderButton.layer.masksToBounds = YES;

session BorderButton. userlnteractionEnabled=NO;

session BorderButton. frame = CGRectMake(875, 20, 150, 100);

[session BorderButton setTitle: @"" forState: UIControlStateNormal];

[self.view addSubview:sessionBorderButton];

UlTextField * titlel_abel = [[UITextField alloc] initWithFrame:CGRectMake(900, 25, 150, 50)];

titleLabel.textColor = [UlColor blackColor];

titlel_abel.backgroundColor=[UIColor clearColor];

titlel_abel.text=@"Session Title";

[titleLabel setUserlnteractionEnabled:NO];

[self.view addSubview:titlel_abel];

title= [[UITextField alloc] initWithFrame:CGRectMake(900, 50, 150, 75)];

title.textColor = [UlColor blueColor];

[title setUserlnteractionEnabled:YES];

title.text=@"default";

[self.view addSubview:title]; loadSlideButton = [UIButton buttonWithType:UIButtonTypeCustom];

[loadSlideButton setBackgroundColor:[UIColor whiteColor]];

[loadSlideButton setTitleColor:[UIColor blueColor] forState:UIControlStateNormal];

loadSlideButton. layer.borderWidth = 1 ;

loadSlideButton. layer.borderColor = [UlColor NghtGrayColor].CGColor;

loadSlideButton. layer.cornerRadius = 8;

loadSlideButton. layer.masksToBounds = YES;

[loadSlideButton addTarge self action:@selector(loadSlideButtonAction:) forControl Events : U IControl EventTouch Down] ;

loadSlideButton. frame = CGRectMake(0, 570, 150, 75);

[loadSlideButton setTitle: @"Load Slide" forState: UlControlStateNormal]; [self.view addSubview:loadSlideButton]; focusButton = [UIButton buttonWithType:UIButtonTypeCustom];

[focusButton setBackgroundColor:[UIColor whiteColor]];

[focusButton setTitleColor:[UIColor blueColor] forState:UIControlStateNormal] focusButton. layer.borderWidth = 1 ;

focusButton. layer.borderColor = [UlColor NghtGrayColor].CGColor;

focusButton. layer.cornerRadius = 8;

focusButton. layer.masksToBounds = YES;

[focusButton addTarge self action:@selector(focusButtonAction:)

forControl Events : U IControl EventTouch Down] ;

focusButton. frame = CGRectMake(0, 675, 150, 75);

[focusButton setTitle: @"Autofocus" forState: UlControlStateNormal];

[self.view addSubview:focusButton];

UIButton * focusBorderButton = [UIButton

button WithType : U I ButtonTypeCustom] ;

[focusBorderButton setBackgroundColor:[UIColor whiteColor]];

[focusBorderButton setTitleColor:[UIColor blackColor]

forState:UIControlStateNormal];

focusBorderButton. layer.borderWidth = 1 ;

focusBorderButton. layer.borderColor = [UlColor NghtGrayColor].CGColor; focusBorderButton. layer.cornerRadius = 8;

focusBorderButton. layer.masksToBounds = YES;

focusBorderButton. userlnteractionEnabled=NO;

focusBorderButton. frame = CGRectMake(0, 65, 150, 500);

[focusBorderButton setTitle: @"" forState: UlControlStateNormal];

[self.view addSubview:focusBorderButton]; focusDown Button = [UIButton buttonWithType:UIButtonTypeCustom];

[focusDown Button setBackgroundColor:[UIColor whiteColor]];

[focusDown Button setTitleColor:[UIColor blueColor]

forState:UIControlStateNormal];

focusDown Button. layer.borderWidth = 1 ;

focusDown Button. layer.borderColor = [UlColor NghtGrayColor].CGColor; focusDown Button. layer.cornerRadius = 8; focusDownButton.layer.masksToBounds = YES;

[focusDown Button addTarge self action:@selector(focusDownButtonUp:) forControlEvents:UIControlEventTouchUplnside];

[focusDown Button addTarge self action:@selector(focusDownButtonDown:) forControl Events : U IControl EventTouch Down] ;

focusDown Button. frame = CGRectMake(40, 510, 75, 50);

[focusDown Button setTitle: @"Down" forState: UIControlStateNormal];

//[self.view addSubview:focusDownButton]; focusUpButton = [UIButton buttonWithType:UIButtonTypeCustonn];

[focusUpButton setBackgroundColor:[UIColor whiteColor]];

[focusUpButton setTitleColor:[UIColor blueColor]

forState:UIControlStateNormal];

focusUpButton. layer.borderWidth = 1 ;

focusUpButton. layer.borderColor = [UlColor NghtGrayColor].CGColor;

focusUpButton. layer.cornerRadius = 8;

focusUpButton. layer.masksToBounds = YES;

[focusUpButton addTarge self action:@selector(focusUpButtonUp:) forControlEvents:UIControlEventTouchUplnside];

[focusUpButton addTarge self action:@selector(focusUpButtonDown:) forControl Events : U IControl EventTouch Down] ;

focusUpButton. frame = CGRectMake(40, 70, 75, 50);

[focusUpButton setTitle: @" Up" forState: UIControlStateNormal];

//[self.view addSubview:focusUpButton];

UITextField * focusLabel = [[UITextField alloc] initWithFrame:CGRectMake(45, 25, 50, 40)];

focusLabel.textColor = [UlColor blackColor];

focusLabel.backgroundColor=[UIColor clearColor];

focusLabel .text=@"Focus";

[focusLabel setUserlnteractionEnabled:NO];

[self.view addSubview:focusLabel];

UITextField * coarseFocusLabel = [[UITextField alloc]

initWithFrame:CGRectMake(5, 1 10, 60, 40)];

coarseFocusLabel.textColor = [UlColor blackColor];

coarseFocusLabel.backgroundColor=[UIColor clearColor];

coarseFocusLabel.text=@"Coarse";

[coarseFocusLabel setUserlnteractionEnabled:NO];

[self.view addSubview:coarseFocusLabel];

UITextField * fineFocusLabel = [[UITextField alloc]

initWithFrame:CGRectMake(95, 1 10, 50, 40)];

fineFocusLabel.textColor = [UlColor blackColor];

fineFocusLabel.backgroundColor=[UIColor clearColor];

fineFocusLabel.text=@"Fine";

[fineFocusLabel setUserlnteractionEnabled:NO]; [self.view addSubview:fineFocusLabel];

UITextField * scanLabel = [[UITextField alloc] initWithFrame:CGRectMake(920, 215, 150, 50)];

scanLabel.textColor = [UlColor blackColor];

scanLabel.backgroundColor=[UIColor clearColor];

scanLabel .text=@"Scans";

[scanLabel setUserlnteractionEnabled:NO];

[self.view addSubview:scanl_abel]; snapButton = [UIButton buttonWithType:UIButtonTypeCustonn];

[snapButton setBackgroundColor:[UIColor whiteColor]];

[snapButton setTitleColor:[UIColor blueColor] forState:UIControlStateNormal]; snapButton. layer.borderWidth = 1 ;

snapButton. layer.borderColor = [UlColor NghtGrayColor].CGColor;

snapButton. layer.cornerRadius = 8;

snapButton. layer.masksToBounds = YES;

[snapButton addTarge self action:@selector(snapButtonAction:)

forControl Events : U IControl EventTouch Down] ;

snapButton. frame = CGRectMake(875, 125, 150, 75);

[snapButton setTitle: @"Acquire Image" forState: UlControlStateNormal];

[self.view addSubview:snapButton]; highResScanButton = [UIButton buttonWithType:UIButtonTypeCustom];

[highResScanButton setBackgroundColor:[UIColor whiteColor]];

[highResScanButton setTitleColor:[UIColor blueColor]

forState:UIControlStateNormal];

highResScanButton. layer.borderWidth = 1 ;

highResScanButton. layer.borderColor = [UlColor NghtGrayColor].CGColor; highResScanButton. layer.cornerRadius = 8;

highResScanButton. layer.masksToBounds = YES;

[highResScanButton addTargetself

action : @selector(h igh ResScan Button Action : )

forControl Events : U IControl EventTouch Down] ;

highResScanButton.frame = CGRectMake(875, 250, 150, 75);

[highResScanButton setTitle: @"Hi-Res Scan" forState: UlControlStateNormal]

[self.view addSubview:highResScanButton]; resetFocusButton = [UIButton buttonWithType:UIButtonTypeCustom];

[resetFocusButton setBackgroundColor:[UIColor whiteColor]];

[resetFocusButton setTitleColor:[UIColor blueColor]

forState:UIControlStateNormal];

resetFocusButton. layer.borderWidth = 1 ;

resetFocusButton. layer.borderColor = [UlColor NghtGrayColor].CGColor;

resetFocusButton. layer.cornerRadius = 8;

resetFocusButton. layer.masksToBounds = YES;

[resetFocusButton addTargetself action:@selector(resetFocusButtonAction:) forControl Events : U IControl EventTouch Down] ;

resetFocusButton. frame = CGRectMake(875, 425, 150, 75);

[resetFocusButton setTitle: @"Reset Focus" forState: UlControlStateNornnal]; [self.view addSubview:resetFocusButton]; reExposeButton = [UIButton buttonWithType:UIButtonTypeCustom];

[reExposeButton setBackgroundColor:[UIColor whiteColor]];

[reExposeButton setTitleColor:[UIColor blueColor]

forState:UIControlStateNormal];

reExposeButton. layer.borderWidth = 1 ;

reExposeButton. layer.borderColor = [UlColor NghtGrayColor].CGColor;

reExposeButton. layer.cornerRadius = 8;

reExposeButton. layer.masksToBounds = YES;

[reExposeButton addTarge self action :@selector(reExposeButtonAction:) forControl Events : U IControl EventTouch Down] ;

reExposeButton. frame = CGRectMake(875, 525, 150, 75);

[reExposeButton setTitle: @" Re-Expose" forState: UlControlStateNornnal]; [self.view addSubview:reExposeButton]; settingsButton = [UIButton buttonWithType:UIButtonTypeCustom];

[settingsButton setBackgroundColor:[UIColor whiteColor]];

[settingsButton setTitleColor:[UIColor blueColor]

forState:UIControlStateNormal];

settingsButton. layer.borderWidth = 1 ;

settingsButton. layer.borderColor = [UlColor NghtGrayColor].CGColor;

settingsButton. layer.cornerRadius = 8;

settingsButton. layer.masksToBounds = YES;

[settingsButton addTarge self action :@selector(settingsButtonAction:) forControl Events : U IControl EventTouch Down] ;

settingsButton. frame = CGRectMake(875, 675, 150, 75);

[settingsButton setTitle: ©"Settings" forState: UlControlStateNornnal];

[self.view addSubview:settingsButton];

CGAffineTransform trans = CGAffineTransformMakeRotation(M_PI_2);

CGRect frame = CGRectMake(-135.0, 275.0, 350.0, 100.0);

coarseFocusSlider = [[UlSlider alloc] initWithFrame:frame];

[coarseFocusSlider addTarge self action:@selector(coarseSliderAction:) forControlEvents:UIControlEventValueChanged];

[coarseFocusSlider addTarget:self action:@selector(coarseSliderUpAction:) forControlEvents:UIControlEventTouchUplnside

|UIControlEventTouchUpOutside];

coarseFocusSlider.transform = trans;

[coarseFocusSlider setBackgroundColor:[UIColor clearColor]];

coarseFocusSlider.minimumValue = 0.0; coarseFocusSlider.maximunnValue = 22400.0/6.0;

coarseFocusSlider.continuous = YES;

coarseFocusSlider.value = 1 1400.0/6.0;

[self.view addSubview:coarseFocusSlider];

CGRect frame2 = CGRectMake(-60.0, 275.0, 350.0, 100.0);

fineFocusSlider = [[UlSlider alloc] initWithFrame:franne2];

[fineFocusSlider addTarge self action:@selector(fineSliderAction:)

forControlEvents:UIControlEventValueChanged];

[fineFocusSlider addTarge self action:@selector(fineSliderUpAction:) forControlEvents:UIControlEventTouchUpOutside |

UIControlEventTouchUplnside];

fineFocusSlider.transform = trans; [fineFocusSlider setBackgroundColor:[UIColor clearColor]];

fineFocusSlider.minimumValue = 0.0;

fineFocusSlider.maximunnValue = 254.0;

fineFocusSlider.continuous = YES;

fineFocusSlider.value = 122.0;

[self.view addSubview:fineFocusSlider];

lastSliderVal=122+1 1400.0/6.0;

CGPoint p;

p.x= .5;

p.y= -5;

[session beginConfiguration];

[device addObserver:self

forKeyPath:@"adjustingExposure"

options:NSKeyValueObservingOptionNew

contexfcnil];

[device lockForConfiguration:nil];

NSLog(@"calculating exposure");

[self .device setFocusPointOflnteres p];

[device setExposureMode:AVCaptureExposureModeContinuousAutoExposure] ; [device unlockForConfiguration];

[session commitConfiguration];

[NSTimer scheduledTimerWithTinnelnterval:(float)1 .9 target:self

selector:@selector(lockExposureTimer:) userlnfo:nil repeats:NO]; [session beginConfiguration];

[device lockForConfiguration:nil];

NSLog(@"locking white balance");

[device setWhiteBalanceMode:AVCaptureWhiteBalanceModeLocked];

[device unlockForConfiguration];

[session commitConfiguration];

} - (void)viewWillAppear:(BOOL)animated

{

[self.navigationController setNavigationBarHidden:YES animated:aninnated]; [super viewWillAppearanimated];

[self centerScrollViewContents];

}

- (void)viewWillDisappear:(BOOL)animated

{

[self.navigationController setNavigationBarHidden:NO animated:aninnated]; [super viewWillDisappear:animated];

}

-(void) viewDidAppear:(BOOL)animated{ self.sequence = [[Sequencer alloc] in it] ;

self.stitcher = [[ImageProcessor alloc] in it] ;

if(![session isRunning])

{

self.session = [[AVCaptureSession alloc] in it] ;

self.session.sessionPreset = AVCaptureSession PresetPhoto;

self.device = [AVCaptureDevice

defaultDeviceWithMedia†ype:AVMediaTypeVideo];

NSError * error = nil;

self.input = [AVCaptureDevicelnput devicelnputWithDevice:self.device error:&error];

if (!input) {

NSLog(@"Error trying to open camera: %@", error);

}

self.stillOutput = [[AVCaptureStilllmageOutput alloc] in it];

NSDictionary * outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys AVVideoCodecJPEG, AWideoCodecKey, nil];

[self.stillOutput setOutputSettings:outputSettings];

[self.session addlnput:self.input];

[self.session addOutpu self.stillOutput];

[self.session startRunning];

dispatch_async(dispatch_get_main_queue(), Λ {

captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

captureVideoPreviewLayer.connection.videoOrientation =

AVCaptureVideoOrientationLandscapeLeft;

[captureVideoPreviewLayer setBackgroundColor:[[UIColor blackColor] CGColor]];

[captureVideoPreviewLayer

setVideoGravity:AVLayerVideoGravityResizeAspectFill];

CALayer * rootl_ayer = [previewView layer]; [rootLayer setMasksToBounds:YES];

[captureVideoPreviewLayer setFrame: [rootLayer bounds]];

[rootLayer addSublayer:captureVideoPreviewLayer];

});

}

CGPoint p;

p.x= .5;

p.y= -5;

[session beginConfiguration];

[device lockForConfiguration:nil];

[device setFocusPointOflnterest:p];

[device setFocusMode:AVCaptureFocusModeLocked];

[device unlockForConfiguration];

[session commitConfiguration];

}

// Home the stage, move to the slide-loading position, and prompt the user.
- (void)loadSlideButtonAction:(id)sender
{
    [stage homeAndLoad];
    [[[UIAlertView alloc]
        initWithTitle:@"" message:@"Press OK when slide is loaded"
        delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
}

- (void)focusUpButtonUp:(id)sender
{
    [focusUpTimer invalidate];
}

- (void)focusUpButtonDown:(id)sender
{
    focusUpTimer = [NSTimer scheduledTimerWithTimeInterval:(float)0.2 target:self
        selector:@selector(focusUpTimer:) userInfo:nil repeats:YES];
}

- (void)focusDownButtonUp:(id)sender
{
    [focusDownTimer invalidate];
}

- (void)focusDownButtonDown:(id)sender
{
    focusDownTimer = [NSTimer scheduledTimerWithTimeInterval:(float)0.2 target:self
        selector:@selector(focusDownTimer:) userInfo:nil repeats:YES];
}

// Nudge focus down while the button is held.
- (void)focusDownTimer:(NSTimer *)timer {
    float zDist = -255;
    [stage setMicroStep:0];
    [stage moveStage:0:0:(zDist / 8)];
    [stage setMicroStep:1];
}

// Nudge focus up while the button is held.
- (void)focusUpTimer:(NSTimer *)timer {
    float zDist = 255;
    [stage setMicroStep:0];
    [stage moveStage:0:0:(zDist / 8)];
    [stage setMicroStep:1];
}

// After the user confirms the slide is loaded, home and center the stage.
- (void)alertView:(UIAlertView *)alertView clickedButtonAtIndex:(NSInteger)buttonIndex {
    [stage homeAndMiddle];
}

- (void)settingsButtonAction:(id)sender
{
    [self performSegueWithIdentifier:@"settingsSegue" sender:self];
}

// Re-run auto exposure at the center of the field, then lock it again shortly after.
- (void)reExposeButtonAction:(id)sender
{
    [session beginConfiguration];
    CGPoint p;
    p.x = .5;
    p.y = .5;
    [session commitConfiguration];
    [device lockForConfiguration:nil];
    NSLog(@"calculating exposure");
    [self.device setFocusPointOfInterest:p];
    [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
    [device unlockForConfiguration];
    [session commitConfiguration];
    [NSTimer scheduledTimerWithTimeInterval:(float)1.9 target:self
        selector:@selector(lockExposureTimer:) userInfo:nil repeats:NO];
}

- (void)resetFocusButtonAction:(id)sender
{
    [stage homeZ];
}

- (void)snapButtonAction:(id)sender
{
    [self snap];
}

// Begin a high-resolution scan: read scan settings from Core Data, build a grid
// of stage positions, re-expose, and start the capture timer.
- (void)highResScanStart {
    [sequence clearSequence];
    imageTitle = title.text;
    xPos = 0;
    yPos = 0;
    localSequenceCounter = 0;
    [stage setxPosyPos:0:0];
    NSFetchRequest *fetchRequest = [[NSFetchRequest alloc] init];
    [fetchRequest setEntity:[NSEntityDescription entityForName:@"Settings"
        inManagedObjectContext:managedObjectContext]];
    NSError *error;
    int fields = 0;
    NSArray *fetchedObjects = [managedObjectContext
        executeFetchRequest:fetchRequest error:&error];
    for (Settings *setting in fetchedObjects) {
        fields = [setting.highResScanFields intValue];
        fieldsPerFocus = [setting.fieldsPerFocus intValue];
        fineFocusOnly = [setting.fineFocusOnly intValue];
    }
    //@todo- defaults should be set when entering values into core data.
    if (fields == 0) { fields = 1; }
    else if (fields > 25) { fields = 25; }
    if (fieldsPerFocus == 0) { fieldsPerFocus = 2; }
    [sequence generateGridSequence:fields];
    [session beginConfiguration];
    CGPoint p;
    p.x = .5;
    p.y = .5;
    [session commitConfiguration];
    [device lockForConfiguration:nil];
    [self.device setFocusPointOfInterest:p];
    [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
    [device unlockForConfiguration];
    [session commitConfiguration];
    [NSTimer scheduledTimerWithTimeInterval:(float)1.9 target:self
        selector:@selector(lockExposureTimer:) userInfo:nil repeats:NO];
    [imageUrlsForCopy removeAllObjects];
    [NSTimer scheduledTimerWithTimeInterval:(float)2 target:self
        selector:@selector(snapForGridTimer:) userInfo:nil repeats:NO];
}

- (void)highResScanButtonAction:(id)sender
{
    [stage homeAndEdge];
    //[NSTimer scheduledTimerWithTimeInterval:(float)2 target:self
    //    selector:@selector(highResScanStartTimer:) userInfo:nil repeats:NO];
    [NSTimer scheduledTimerWithTimeInterval:(float)2 target:self
        selector:@selector(timer2:) userInfo:nil repeats:NO];
    //[self autofocusSequence];
}

// After homing, autofocus first, then start the grid scan a few seconds later.
- (void)timer2:(NSTimer *)timer {
    [self autofocusSequence];
    [NSTimer scheduledTimerWithTimeInterval:(float)6 target:self
        selector:@selector(highResScanStartTimer:) userInfo:nil repeats:NO];
    //[NSTimer scheduledTimerWithTimeInterval:(float)8 target:self
    //    selector:@selector(timer3:) userInfo:nil repeats:NO];
}

- (void)timer3:(NSTimer *)timer {
}

- (void)focusButtonAction:(id)sender
{
    NSLog(@"focusbuttonaction");
    [self autofocusSequence];
}

- (void)coarseSliderAction:(id)sender
{
    [self updateZPos];
}

- (void)communicationHaltTimer:(NSTimer *)timer {
    sendHalt = 0;
}

- (void)highResScanStartTimer:(NSTimer *)timer {
    [self highResScanStart];
}

- (void)updateZPos {
    [NSTimer scheduledTimerWithTimeInterval:(float)0 target:self
        selector:@selector(updateZPosTimer:) userInfo:nil repeats:NO];
}

// Translate the focus sliders into z-stage motion, throttling commands so the
// BLE link is not flooded while a previous move is still in flight.
- (void)updateZPosTimer:(NSTimer *)timer {
    if (sendHalt == 0) {
        sendHalt = 1;
        double coarseVal = roundf([coarseFocusSlider value]);
        double fineVal = roundf([fineFocusSlider value]);
        float zDist = lastSliderVal - (coarseVal + fineVal);
        lastSliderVal = coarseVal + fineVal;
        if (abs(zDist) >= 16 && currHalfStepping) {
            [stage setMicroStep:0];
            [stage moveStage:0:0:(zDist / 8)];
            [NSTimer scheduledTimerWithTimeInterval:(float)(.04 * 3 + .002 * zDist / 8) target:self
                selector:@selector(communicationHaltTimer:) userInfo:nil repeats:NO];
            [stage setMicroStep:1];
        }
        else {
            [stage moveStage:0:0:(zDist)];
            [NSTimer scheduledTimerWithTimeInterval:(float)(.04 + .002 * zDist) target:self
                selector:@selector(communicationHaltTimer:) userInfo:nil repeats:NO];
        }
    }
}

- (void)fineSliderAction:(id)sender
{
    [self updateZPos];
}

- (void)fineSliderUpAction:(id)sender
{
    [self resetSliders];
}

- (void)coarseSliderUpAction:(id)sender
{
    [self resetSliders];
}

-(void) resetSliders {

zPos=1 1400.0/6.0+122.0;

coarseFocusSlider.value = 1 1400.0/6.0;

fineFocusSlider.value = 122.0;

lastSliderVal= 1 1400.0/6.0+122;

}
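// Autofocus overview: autofocusSequence offsets the focus by three coarse z steps
// in one direction, then queues six equal z moves back the other way; autoFocus
// walks that queue, scoring each captured frame with the stitcher's focus metric
// and recording the best-scoring z position.  When the queue is exhausted the
// stage returns to the best position and fineAutofocusSequence repeats the sweep
// with smaller steps before normal imaging resumes.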

//@todo: this should be moved to the sequencer object

-(void) autofocusSequence{

fineFocus=1 ;

[sequence clearSequence];

[stage powerMotors];

[stage setMicroStep:0];

float zStep= 128/8; [stage moveStage:0:0:zStep * 3];

float y=0; float x=0; float z=-zStep; float r=0;

[sequence addToSequence:x:y:z:r];

y=0; x=0; z=-zStep; r=0;

[sequence addToSequence:x:y:z:r];

y=0; x=0; z=-zStep; r=0;

[sequence addToSequence:x:y:z:r];

y=0; x=0; z=-zStep; r=0;

[sequence addToSequence:x:y:z:r];

y=0; x=0; z=-zStep; r=0;

[sequence addToSequence:x:y:z:r];

y=0; x=0; z=-zStep; r=0;

[sequence addToSequence:x:y:z:r]; [NSTimer scheduledTimerWithTinnelnterval:(float).25 targe self selector:@selector(autoFocusTimer:) userlnfo:nil repeats:NO]; focusScore=0;

bestFocusPos=0;

bestFocusScore=0;

[focusPositions removeAIIObjects];

}

-(void) fineAutofocusSequence{
    [sequence clearSequence];
    [stage setMicroStep:0];
    [stage powerZMotor];
    float zDist=32/8;
    // Smaller sweep around the position found by the coarse pass.
    [stage moveStage:0:0:zDist*3];
    for (int i = 0; i < 6; i++) {
        [sequence addToSequence:0:0:-zDist:0];
    }
    [NSTimer scheduledTimerWithTimeInterval:(float).2 target:self selector:@selector(autoFocusTimer:) userInfo:nil repeats:NO];
    focusScore=0;
    bestFocusPos=0;
    bestFocusScore=0;
    [focusPositions removeAllObjects];
}

-(void) autoFocusTimer:(NSTimer *)timer {
    [self autoFocus];
}

-(void) autoFocus {
    AVCaptureConnection * videoConnection = nil;
    for (AVCaptureConnection * connection in stillOutput.connections)
    {
        for (AVCaptureInputPort * port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }
    [stillOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError * error) {
        NSData * imageData = [AVCaptureStillImageOutput
            jpegStillImageNSDataRepresentation:imageSampleBuffer];
        image = [[UIImage alloc] initWithData:imageData];
        double currFocus=[stage getzPos];
        focusScore=[stitcher calcFocus:image];
        if (focusScore>bestFocusScore){
            bestFocusPos=currFocus;
            bestFocusScore=focusScore;
        }
        NSArray * currSequence=[sequence getCurrentValue];
        if ([currSequence count]>1){
            float x = [[currSequence objectAtIndex:0] floatValue];
            float y = [[currSequence objectAtIndex:1] floatValue];
            float z = [[currSequence objectAtIndex:2] floatValue];
            [NSTimer scheduledTimerWithTimeInterval:(float).3 target:self selector:@selector(autoFocusTimer:) userInfo:nil repeats:NO];
            [stage moveStage:x:y:z];
        }
        else {
            int currHalfStep=[stage getMicrostep];
            if (currHalfStep==0) [stage moveStage:0:0:bestFocusPos-[stage getzPos]];
            else [stage moveStage:0:0:(bestFocusPos-[stage getzPos])];
            if (fineFocus==1){
                [self fineAutofocusSequence];
                fineFocus++;
            }
            else {
                [stage setMicroStep:1];
                [stage moveStage:0:0:32];
                [session beginConfiguration];
                self.session.sessionPreset = AVCaptureSessionPresetPhoto;
                [session commitConfiguration];
                CGPoint p;
                p.x= .5;
                p.y= .5;
                [device lockForConfiguration:nil];
                [self.device setFocusPointOfInterest:p];
                [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
                [device unlockForConfiguration];
                [session commitConfiguration];
                [NSTimer scheduledTimerWithTimeInterval:(float)1.9 target:self selector:@selector(lockExposureTimer:) userInfo:nil repeats:NO];
                [sequence clearSequence];
                if (continueGrid==1){
                    continueGrid=0;
                    [sequence restoreSequence];
                    /* NSArray * currSequence=[sequence getCurrentValue];
                    if ([currSequence count]>1){
                        float x = [[currSequence objectAtIndex:0] floatValue];
                        float y = [[currSequence objectAtIndex:1] floatValue];
                        float z = [[currSequence objectAtIndex:2] floatValue];
                        [stage moveStage:x:y:z];
                        xPos=xPos+x;
                        yPos=yPos+y;
                    } */
                    [NSTimer scheduledTimerWithTimeInterval:(float).3 target:self selector:@selector(snapForGridTimer:) userInfo:nil repeats:NO];
                }
            }
        }
    }];
}

-(void) lockExposureTimer:(NSTimer *)timer {

[session beginConfiguration];

[session commitConfiguration];

[device lockForConfiguration:nil];

[device setExposureMode:AVCaptureExposureModeLocked];

[device unlockForConfiguration];

[session commitConfiguration];

}

-(void) snapForGridTimer:(NSTimer * )timer {

[self snapForGrid];

}
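// Grid acquisition: snapForGrid captures a still frame, saves it to the photo
// library, records its asset URL, and advances the stage to the next position
// supplied by the sequencer.  Every fieldsPerFocus fields the current sequence is
// saved and an autofocus pass is interleaved; when the sequence is exhausted the
// collected URLs are copied to the pasteboard as a comma-separated string and the
// motors are powered down.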

-(void) snapForGrid {
    AVCaptureConnection * videoConnection = nil;
    for (AVCaptureConnection * connection in stillOutput.connections)
    {
        for (AVCaptureInputPort * port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }
    [stillOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError * error)
    {
        NSData * imageData = [AVCaptureStillImageOutput
            jpegStillImageNSDataRepresentation:imageSampleBuffer];
        image = [[UIImage alloc] initWithData:imageData];
        ALAssetsLibrary * library = [[ALAssetsLibrary alloc] init];
        [library writeImageToSavedPhotosAlbum:image.CGImage
            orientation:(ALAssetOrientation)[image imageOrientation]
            completionBlock:^(NSURL * assetURL, NSError * error) {
            //add the url of the saved image to the array
            [imageUrlsForCopy addObject:[assetURL absoluteString]];
            localSequenceCounter++;
            NSArray * currSequence=[sequence getCurrentValue];
            if ([currSequence count]>1){
                float x = [[currSequence objectAtIndex:0] floatValue];
                float y = [[currSequence objectAtIndex:1] floatValue];
                float z = [[currSequence objectAtIndex:2] floatValue];
                [stage moveStage:x:y:z];
                if ((localSequenceCounter % fieldsPerFocus)==0){
                    continueGrid=1;
                    [sequence saveSequence];
                    if (fineFocusOnly==0){
                        [self autofocusSequence];
                    }
                    else {
                        [self fineAutofocusSequence];
                    }
                    return;
                }
                else {
                    [NSTimer scheduledTimerWithTimeInterval:(float).3 target:self selector:@selector(snapForGridTimer:) userInfo:nil repeats:NO];
                }
                xPos=xPos+x;
                yPos=yPos+y;
            }
            else {
                //grid acquisition is finished, copy urls to pasteboard as a csv string
                NSString * concatURLs=[imageUrlsForCopy componentsJoinedByString:@","];
                UIPasteboard * pb=[UIPasteboard generalPasteboard];
                [pb setString:concatURLs];
                [stage moveStage:8:8:0];
                [stage disableMotors];
                [stage setMicroStep:1];
                [session beginConfiguration];
                self.session.sessionPreset = AVCaptureSessionPresetPhoto;
                [session commitConfiguration];
                CGPoint p;
                p.x= .5;
                p.y= .5;
                [session commitConfiguration];
                [device lockForConfiguration:nil];
                NSLog(@"calculating exposure");
                [self.device setFocusPointOfInterest:p];
                [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
                [device unlockForConfiguration];
                [session commitConfiguration];
                [NSTimer scheduledTimerWithTimeInterval:(float)1 target:self selector:@selector(lockExposureTimer:) userInfo:nil repeats:NO];
                [sequence clearSequence];
            }
        }];
    }];
}

- (void)scrollViewDidZoom:(UIScrollView * )scrollView

{

    double currZoom=25/scrollView.zoomScale;
    int currZoomI=round(currZoom);
    NSString * currZoomS = [NSString stringWithFormat:@"%d", currZoomI];
    NSString * s=[NSString stringWithFormat:@"%@%@", currZoomS, @" um"];
    textField.text=s;

[self centerScrollViewContents];

}

-(void)centerScrollViewContents {
    CGSize boundsSize=CGSizeMake(1024,768);
    CGRect contentsFrame = previewView.frame;
    if (contentsFrame.size.width < boundsSize.width) {
        contentsFrame.origin.x = (boundsSize.width - contentsFrame.size.width) / 2.0f;
    } else {
        contentsFrame.origin.x = 0.0f;
    }
    if (contentsFrame.size.height < boundsSize.height) {
        contentsFrame.origin.y = (boundsSize.height - contentsFrame.size.height) / 2.0f;
    } else {
        contentsFrame.origin.y = 0.0f;
    }
    previewView.frame = contentsFrame;
}

- (void)scrollViewDidScroll:(UIScrollView * )scrollView { }

- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView

{

return previewView;

}

- (void)panGestureListener:(UIPanGestureRecognizer *)panGesture {
    if (panGesture.numberOfTouches==1){
        float scrollViewHeight = preview.frame.size.height;
        float scrollViewWidth = preview.frame.size.width;
        float scrollContentSizeHeight = preview.contentSize.height;
        float scrollContentSizeWidth = previewView.frame.size.width;
        float scrollOffsetY = preview.contentOffset.y;
        float scrollOffsetX = preview.contentOffset.x;
        UIView * piece = [panGesture view];
        // Only drive the stage once the preview scroll view has been panned to an edge.
        if (scrollOffsetY == 0 || scrollOffsetY + scrollViewHeight > scrollContentSizeHeight-1 ||
            scrollOffsetX + scrollViewWidth > scrollContentSizeWidth-1 || scrollOffsetX==0){
            CGPoint translation = [panGesture translationInView:[piece superview]];
            int panTransX = (int) translation.x;
            int panTransY = (int) translation.y;
            if ([panGesture state] == UIGestureRecognizerStateBegan) {
                lastYPos=0;
                lastXPos=0;
            }
            else if ([panGesture state] == UIGestureRecognizerStateChanged) {
                [stage moveStage:panTransX-lastXPos:panTransY-lastYPos:0];
                lastYPos=panTransY;
                lastXPos=panTransX;
                NSLog(@"moving stage");
            }
        }
    }
}

- (void)handleLongPress:(UIGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan) {
        zPos=11400.0/6.0+122.0;
        coarseFocusSlider.value = 11400.0/6.0;
        fineFocusSlider.value = 122.0;
        lastSliderVal= 11400.0/6.0+122;
    }
}

- (void) snap{
    AVCaptureConnection * videoConnection = nil;
    for (AVCaptureConnection * connection in stillOutput.connections)
    {
        for (AVCaptureInputPort * port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }
    [stillOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError * error) {
        NSData * imageData = [AVCaptureStillImageOutput
            jpegStillImageNSDataRepresentation:imageSampleBuffer];
        image = [[UIImage alloc] initWithData:imageData];
        ALAssetsLibrary * library = [[ALAssetsLibrary alloc] init];
        NSMutableDictionary * tiffDictionary= [[NSMutableDictionary alloc] init];
        NSMutableDictionary * myMetadata = [[NSMutableDictionary alloc] init];
        [tiffDictionary setObject:title.text
            forKey:(NSString *)kCGImagePropertyTIFFImageDescription];
        [myMetadata setObject:tiffDictionary
            forKey:(NSString *)kCGImagePropertyTIFFDictionary];
        [myMetadata setObject:[NSNumber numberWithInt:3]
            forKey:(NSString *)kCGImagePropertyTIFFOrientation];
        [library writeImageToSavedPhotosAlbum:image.CGImage
            metadata:myMetadata completionBlock:^(NSURL * assetURL, NSError * error){
            UIPasteboard * pb=[UIPasteboard generalPasteboard];
            [pb setString:assetURL.absoluteString];
            //[[UIPasteboard generalPasteboard] setImage:image];
            //[[UIPasteboard generalPasteboard] setImages:[NSArray
            //    arrayWithObjects:myFirstImage, mySecondImage, nil]];
            //[[UIPasteboard generalPasteboard] setImages:myImagesArray];
        }];
    }];
}

- (void)handleSingleTap:(UIGestureRecognizer *)gestureRecognizer {
    [sequence endSequence];
    [stage setMicroStep:1];
    [stage moveStage:8:8:0];
}

- (void)handleDoubleTap:(UIGestureRecognizer *)gestureRecognizer {
    [self snap];
}

- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
    if ([segue.identifier isEqualToString:@"settingsSegue"]) {
        SettingsViewController * viewController =
            (SettingsViewController *)segue.destinationViewController;
        viewController.managedObjectContext=managedObjectContext;
    }
}

- (void)foregroundSel:(NSNotification *)notification {
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
}

@end
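The autofocus routines above score each captured frame with [stitcher calcFocus:image]; the stitcher class itself is not reproduced in this portion of the listing. The fragment below is a minimal, illustrative sketch of one common sharpness metric (mean absolute difference between horizontally adjacent pixels of a grayscale copy of the frame) that such a method could compute. The FocusScorer class and its method are assumptions introduced for illustration only and are not part of the original application code.

// FocusScorer.h / FocusScorer.m (illustrative sketch; not part of the original listing)
#import <UIKit/UIKit.h>

@interface FocusScorer : NSObject
+ (double)focusScoreForImage:(UIImage *)image;
@end

@implementation FocusScorer

// Returns the mean absolute difference between horizontally adjacent pixels of a
// downscaled grayscale copy of the frame.  Sharper (better focused) frames yield
// larger scores, which is the only property the autofocus loop above depends on.
+ (double)focusScoreForImage:(UIImage *)image {
    const size_t width = 256;    // downscale for speed
    const size_t height = 256;
    CGColorSpaceRef gray = CGColorSpaceCreateDeviceGray();
    CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, width,
                                             gray, (CGBitmapInfo)kCGImageAlphaNone);
    CGColorSpaceRelease(gray);
    if (ctx == NULL) return 0.0;
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), image.CGImage);
    const uint8_t *pixels = (const uint8_t *)CGBitmapContextGetData(ctx);
    double total = 0.0;
    for (size_t y = 0; y < height; y++) {
        for (size_t x = 0; x + 1 < width; x++) {
            int d = (int)pixels[y * width + x + 1] - (int)pixels[y * width + x];
            total += (d < 0) ? -d : d;
        }
    }
    CGContextRelease(ctx);
    return total / (double)(height * (width - 1));
}

@end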

SettingsViewController.h

//

// SettingsViewController.h

// MiniScope

//

// Created by Mike D'Ambrosio on 4/23/14.

// Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.
//

#import <UIKit/UIKit.h>
#import <CoreData/CoreData.h>

@interface SettingsViewController : UIViewController <UITextFieldDelegate, NSFetchedResultsControllerDelegate> {
    NSFetchedResultsController * fetchedResultsController;
    NSManagedObjectContext * managedObjectContext;
}

@property (nonatomic, retain) NSFetchedResultsController * fetchedResultsController;
@property (nonatomic, retain) NSManagedObjectContext * managedObjectContext;
@property (weak, nonatomic) IBOutlet UITextField * highResScanFields;
@property (weak, nonatomic) IBOutlet UITextField * fieldsPerFocus;
@property (weak, nonatomic) IBOutlet UITextField * fineFocusOnly;

@end
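SettingsViewController.m below imports Settings.h, the Core Data managed-object subclass backing the "Settings" entity; that file is not reproduced in this portion of the listing. Assuming a conventionally generated subclass whose three attributes match the keys used in the fetch requests, it would look approximately as follows (an illustrative reconstruction, not the original file):

// Settings.h (illustrative reconstruction)
#import <Foundation/Foundation.h>
#import <CoreData/CoreData.h>

@interface Settings : NSManagedObject

// Number of rows to acquire in a high-resolution scan.
@property (nonatomic, retain) NSNumber * highResScanFields;
// Number of fields to image between autofocus passes.
@property (nonatomic, retain) NSNumber * fieldsPerFocus;
// Nonzero to skip the coarse sweep and run only the fine autofocus.
@property (nonatomic, retain) NSNumber * fineFocusOnly;

@end

// Settings.m (illustrative reconstruction)
#import "Settings.h"

@implementation Settings

@dynamic highResScanFields;
@dynamic fieldsPerFocus;
@dynamic fineFocusOnly;

@end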

SettingsViewController.m

//

// SettingsViewController.m

// MiniScope

//

// Created by Mike D'Ambrosio on 4/23/14.

// Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.

//

#import "SettingsViewController.h"

#import "Settings. h"

©interface SettingsViewController () ©end

©implementation SettingsViewController

©synthesize fetchedResultsController, managedObjectContext;

©synthesize highResScanFields;

©synthesize fieldsPerFocus;

©synthesize fineFocusOnly; - (id)initWithNibName:(NSString * )nibNameOrNil bundle:(NSBundle

*)nibBundleOrNil

{

self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil]; if (self) {

}

return self;

}

- (void)viewDidLoad

{

[super viewDidLoad];

highResScanFields.delegate =self;

fieldsPerFocus.delegate=self;

    fineFocusOnly.delegate=self;
}

-(void) viewDidAppear:(BOOL)animated {
    NSFetchRequest * fetchRequest = [[NSFetchRequest alloc] init];
    [fetchRequest setEntity:[NSEntityDescription entityForName:@"Settings" inManagedObjectContext:managedObjectContext]];
    NSError * error;
    NSArray * fetchedObjects = [managedObjectContext
        executeFetchRequest:fetchRequest error:&error];
    for (Settings * setting in fetchedObjects) {
        highResScanFields.text=[setting.highResScanFields stringValue];
        fieldsPerFocus.text=[setting.fieldsPerFocus stringValue];
        fineFocusOnly.text=[setting.fineFocusOnly stringValue];
    }
}

- (IBAction)highResScanFields:(id)sender {
    NSNumberFormatter * f = [[NSNumberFormatter alloc] init];
    [f setNumberStyle:NSNumberFormatterDecimalStyle];
    NSFetchRequest * fetchRequest = [[NSFetchRequest alloc] init];
    [fetchRequest setEntity:[NSEntityDescription entityForName:@"Settings" inManagedObjectContext:managedObjectContext]];
    NSError * error;
    NSArray * fetchedObjects = [managedObjectContext
        executeFetchRequest:fetchRequest error:&error];
    int obNum=0;
    for (Settings * setting in fetchedObjects) {
        [setting setValue:[f numberFromString:highResScanFields.text] forKey:@"highResScanFields"];
        obNum++;
    }
    if (obNum==0) {
        Settings * settingsDB = [NSEntityDescription
            insertNewObjectForEntityForName:@"Settings"
            inManagedObjectContext:managedObjectContext];
        [settingsDB setValue:[f numberFromString:highResScanFields.text] forKey:@"highResScanFields"];
    }
    [managedObjectContext save:&error];
}

- (IBAction)fineFocusOnly:(id)sender {
    NSNumberFormatter * f = [[NSNumberFormatter alloc] init];
    [f setNumberStyle:NSNumberFormatterDecimalStyle];
    NSFetchRequest * fetchRequest = [[NSFetchRequest alloc] init];
    [fetchRequest setEntity:[NSEntityDescription entityForName:@"Settings" inManagedObjectContext:managedObjectContext]];
    NSError * error;
    NSArray * fetchedObjects = [managedObjectContext
        executeFetchRequest:fetchRequest error:&error];
    int obNum=0;
    for (Settings * setting in fetchedObjects) {
        [setting setValue:[f numberFromString:fineFocusOnly.text] forKey:@"fineFocusOnly"];
        obNum++;
    }
    if (obNum==0) {
        Settings * settingsDB = [NSEntityDescription
            insertNewObjectForEntityForName:@"Settings"
            inManagedObjectContext:managedObjectContext];
        [settingsDB setValue:[f numberFromString:fineFocusOnly.text] forKey:@"fineFocusOnly"];
    }
    [managedObjectContext save:&error];
}

- (IBAction)fieldsPerFocus:(id)sender {
    NSNumberFormatter * f = [[NSNumberFormatter alloc] init];
    [f setNumberStyle:NSNumberFormatterDecimalStyle];
    NSFetchRequest * fetchRequest = [[NSFetchRequest alloc] init];
    [fetchRequest setEntity:[NSEntityDescription entityForName:@"Settings" inManagedObjectContext:managedObjectContext]];
    NSError * error;
    NSArray * fetchedObjects = [managedObjectContext
        executeFetchRequest:fetchRequest error:&error];
    int obNum=0;
    for (Settings * setting in fetchedObjects) {
        [setting setValue:[f numberFromString:fieldsPerFocus.text] forKey:@"fieldsPerFocus"];
        obNum++;
    }
    if (obNum==0) {
        Settings * settingsDB = [NSEntityDescription
            insertNewObjectForEntityForName:@"Settings"
            inManagedObjectContext:managedObjectContext];
        [settingsDB setValue:[f numberFromString:fieldsPerFocus.text] forKey:@"fieldsPerFocus"];
    }
    [managedObjectContext save:&error];
}

-(BOOL)textField:(UITextField *)textField
    shouldChangeCharactersInRange:(NSRange)range replacementString:(NSString *)string {
    if([string length]==0){
        return YES;
    }
    NSCharacterSet * myCharSet = [NSCharacterSet
        characterSetWithCharactersInString:@"0123456789"];
    for (int i = 0; i < [string length]; i++) {
        unichar c = [string characterAtIndex:i];
        if ([myCharSet characterIsMember:c]) {
            return YES;
        }
    }
    return NO;
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
}

@end

Stage.h

//

// Stage.h

// MiniScope

//

// Created by Mike D'Ambrosio on 4/21/14.

// Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.

//

#import <UIKit/UIKit.h>

#import "BLE.h"

@interface Stage : UIViewController <BLEDelegate> {
}

@property (strong, nonatomic) BLE * ble;
@property (strong, nonatomic) NSTimer * shutdownTimer;

-(void) connectBLE;

-(void) moveStage:(float) x: (float) y: (float) z;

-(void) setMicroStep:(int) state;

-(void) setMove:(NSNumber * ) xNSNum : (NSNumber * ) yNSNum;

-(void) setxPosyPos:(int) setX:(int) setY;

-(void) powerZMotor;

-(void) disableZMotor;

-(void) powerMotors;

-(void) disableMotors;

-(void) home;

-(void) homeZ;

-(void) homeAndMiddle;

-(void) homeAndLoad;

-(void) homeAndEdge;

-(int) getzPos;

-(int) getMicrostep;

@end

Stage.m

//

// Stage.m

// MiniScope

//

// Created by Mike D'Ambrosio on 4/21/14.

// Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.

//

#import "Stage.h" ©interface Stage () {

int connected;

float zPos;

float xPos;

float yPos;

float realxPos;

float realyPos;

float xLimit;

float yLimit;

float xRem;

float yRem;

float zRem;

float lastYPos;

float lastXPos;

float xBacklash;

float yBacklash;

float xBacklashComp;

float yBacklashComp;

int xDir;

int yDir;

int zDir;

int xyMotorsOn;

int zMotorOn;

int communicationHalt;

int stepsSinceLastxDirChange;

int stepsSinceLastyDirChange;

int halfStep;

NSTimer * connectionTimerTimer;

}

@end

@implementation Stage

@synthesize shutdownTimer;
@synthesize ble;

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil

{

    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {

}

return self;

}

-(id)init {
    self = [super init];
    connected=0;
    zPos=11200/6;
    xPos=0;
    yPos=0;
    xRem=0;
    yRem=0;
    zRem=0;
    xBacklash=16;
    yBacklash=96;
    xLimit=2500;
    yLimit=7500;
    ble = [[BLE alloc] init];
    [ble controlSetup];
    ble.delegate = self;
    return self;
}

- (void)viewDidLoad

{

[super viewDidLoad];

}

-(void) connectBLE {

    [NSTimer scheduledTimerWithTimeInterval:(float)5 target:self selector:@selector(connectBLETimer:) userInfo:nil repeats:YES];

}

-(void) viewDidAppear:(BOOL)animated {

    [NSTimer scheduledTimerWithTimeInterval:(float)5 target:self selector:@selector(connectBLETimer:) userInfo:nil repeats:YES];

}

-(void) connectBLETimer:(NSTimer *)timer {
    if (connected==1) [timer invalidate];
    else {
        [self scanForPeripherals];
        NSLog(@"connectBLETimer fired");
    }

}

- (void)scanForPeripherals
{
    if (ble.activePeripheral)
        if (ble.activePeripheral.state == CBPeripheralStateConnected)
        {
            [[ble CM] cancelPeripheralConnection:[ble activePeripheral]];
            return;
        }

    if (ble.peripherals)
        ble.peripherals = nil;

    [ble findBLEPeripherals:2];
    connectionTimerTimer=[NSTimer scheduledTimerWithTimeInterval:(float)2.0 target:self selector:@selector(connectionTimer:) userInfo:nil repeats:NO];
}

-(void) home{
    [self powerMotors];
    UInt8 buf[3] = {0x23, 0x00, 0x00};
    NSData * data = [[NSData alloc] initWithBytes:buf length:3];
    [ble write:data];
}

-(void) homeZ{
    [self powerZMotor];
    UInt8 buf[3] = {0x24, 0x00, 0x00};
    NSData * data = [[NSData alloc] initWithBytes:buf length:3];
    [ble write:data];
    sleep(8);
    zDir=2;
    [self setMicroStep:0];
    [self moveStage:0:0:1500/8];
    [self setMicroStep:1];
}

-(void) connectionTimer:(NSTimer *)timer
{
    if (ble.peripherals.count > 0) {
        [ble connectPeripheral:[ble.peripherals objectAtIndex:0]];
    }
}

#pragma mark - BLE delegate

NSTimer * rssiTimer;

- (void)bleDidDisconnect

{

NSLog(@"->Disconnected");

[rssiTimer invalidate];

connected=0;

    [NSTimer scheduledTimerWithTimeInterval:(float)5 target:self selector:@selector(connectBLETimer:) userInfo:nil repeats:YES];

}

-(void) shutdownTimer:(NSTimer *)timer {

[self disableMotors];

[self disableZMotor];

}

-(void) communicationHaltTimer:(NSTimer * )timer {

communicationHalt=0;

}

// When RSSI is changed, this will be called

-(void) bleDidUpdateRSSI:(NSNumber * ) rssi

{

    //lblRSSI.text = rssi.stringValue;

}

-(void) readRSSITimer:(NSTimer *)timer

{

[ble readRSSI];

}

// When connected, this will be called

-(void) bleDidConnect

{

[connectionTimerTimer invalidate];

NSLog(@"->Connected");

connected=1 ;

[self powerMotors];

[self moveStage:xBacklash * 3:yBacklash * 3:0];

[self setMicroStep:1 ];

    [self disableMotors];
    // Schedule reading RSSI every 1 sec.
    rssiTimer = [NSTimer scheduledTimerWithTimeInterval:(float)1.0 target:self selector:@selector(readRSSITimer:) userInfo:nil repeats:YES];

}

// When data is coming in, this will be called

-(void) bleDidReceiveData:(unsigned char * )data length:(int)length

{

NSLog(@"Length: %d", length);

// parse data, all commands are in 3-byte

for (int i = 0; i < length; i+=3)

{

NSLog(@"0x%02X, 0x%02X, 0x%02X", data[i], data[i+1], data[i+2]); if (data[i] == OxOA)

{

}

else if (data[i] == OxOB)

{

}

}

}

-(void) powerMotors{

//enable y drive

    UInt8 buf3[3] = {0x14, 0x00, 0x00};

NSData * data = [[NSData alloc] initWithBytes:buf3 length:3];

[ble write:data];

//enable x drive

    UInt8 buf2[3] = {0x15, 0x00, 0x00};

data = [[NSData alloc] initWithBytes:buf2 length:3];

[ble write:data];

xyMotorsOn=1 ;

}

-(void) powerZMotor{

//enable focus motor

    UInt8 buf[3] = {0x16, 0x00, 0x00};

NSData * data = [[NSData alloc] initWithBytes:buf length:3];

[ble write:data];

zMotorOn=1 ;

}

-(void) disableZMotor{

    //disable focus motor
    UInt8 buf[3] = {0x06, 0x00, 0x00};
    NSData * data = [[NSData alloc] initWithBytes:buf length:3];
    [ble write:data];

zMotorOn=0;

}

-(void) disableMotors{

    //disable x drive
    UInt8 buf2[3] = {0x05, 0x00, 0x00};
    NSData * data = [[NSData alloc] initWithBytes:buf2 length:3];
    [ble write:data];
    //disable y drive
    UInt8 buf3[3] = {0x04, 0x00, 0x00};

data = [[NSData alloc] initWithBytes:buf3 length:3];

[ble write:data];

xyMotorsOn=0;

}

-(void) setMicroStep:(int) state {
    if (state==0) {
        UInt8 buf[3] = {0x21, 0x00, 0x00}; //set pin 12 to low, go to half step
        NSData * data = [[NSData alloc] initWithBytes:buf length:3];
        [ble write:data];
        halfStep=1;
    }
    else if (state==1) {
        UInt8 buf[3] = {0x22, 0x00, 0x00}; //set pin 12 to high, go to sixteenth step
        NSData * data = [[NSData alloc] initWithBytes:buf length:3];
        [ble write:data];
        halfStep=0;
    }
}

-(void) setMove:(NSNumber * ) xNSNum : (NSNumber * ) yNSNum {

    [self moveStage:[xNSNum floatValue]-xPos:[yNSNum floatValue]-yPos:0];

}

-(void) homeAndMiddle {

[self setMicroStep:0];

[self home];

xDir=2;

yDir=2;

[self moveStage:-xLimit/2/8:-yLimit/2/8:0];

[self setMicroStep:1 ];

}

-(void) homeAndLoad {

[self setMicroStep:0];

[self home];

xDir=2; yDir=2;

[self moveStage:-xLimit/8:-yLimit/8:0];

[self setMicroStep:1 ];

}

-(void) homeAndEdge {

[self setMicroStep:0];

[self home];

sleep(1 );

//[self moveStage:xLimit/10/8:-yLimit/2/8:0];

xDir=2;

[self moveStage:-xLimit/10/8:0:0];

sleep(1 );

communicationHalt=0;

yDir=2;

    [self moveStage:0:-yLimit/2/8:0];
    [self setMicroStep:1];

}

-(void) setxPosyPos:(int) setX:(int) setY {

realxPos=realxPos-(xPos-setX);

realyPos=realyPos-(yPos-setY);

xPos=setX;

yPos=setY;

}

//@todo- these should probably be properties instead of having manual getters

-(int) getzPos {

return zPos;

}

-(int) getMicrostep {

return halfStep;

}
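// moveStage issues the 3-byte motion packets over BLE: requests larger than 255
// steps per axis are split into multiple packets, direction changes emit the
// corresponding direction opcodes together with backlash compensation, sub-step
// remainders accumulate in xRem/yRem/zRem, and every call re-arms a 5-second
// timer that powers the motors down once the stage goes idle.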

-(void) moveStage:(float) x: (float) y: (float) z{
    UInt8 xybuf[3] = {0x02, 0x00, 0x00};
    int move=0;
    if (xyMotorsOn==0) {
        [self powerMotors];
    }
    if (zMotorOn==0){
        [self powerZMotor];
    }
    xRem=0;
    yRem=0;
    // Requests larger than one packet can carry (255 steps per axis) are split
    // into recursive calls before the remainder is sent below.
    if (x>255) {
        [self moveStage:x-255:0:0];
        communicationHalt=0;
        x=255;
    }
    else if (x<-255) {
        [self moveStage:x+255:0:0];
        communicationHalt=0;
        x=-255;
    }
    if (y>255) {
        [self moveStage:0:y-255:0];
        communicationHalt=0;
        y=255;
    }
    else if (y<-255) {
        [self moveStage:0:y+255:0];
        communicationHalt=0;
        y=-255;
    }
    if (communicationHalt==0) {
        if (abs(x)>=1){
            move=1;
            float totalX=x;
            if (abs(totalX)>=1) {
                //change x direction for positive travel
                if (x>=1 && xDir!=1) {
                    NSData * data;
                    UInt8 buf[3] = {0x07, 0x00, 0x00};
                    data = [[NSData alloc] initWithBytes:buf length:3];
                    [ble write:data];
                    xDir=1;
                    if (abs(stepsSinceLastxDirChange)<xBacklash){
                        if (halfStep==0) xBacklashComp=abs(stepsSinceLastxDirChange);
                        else xBacklashComp=0;
                    }
                    else {
                        if (halfStep==0) xBacklashComp=xBacklash;
                        else xBacklashComp=1;
                    }
                    stepsSinceLastxDirChange=0;
                }
                //change x direction for negative travel
                else if (x<=-1 && xDir!=0) {
                    NSData * data;
                    UInt8 buf[3] = {0x17, 0x00, 0x00};
                    data = [[NSData alloc] initWithBytes:buf length:3];
                    [ble write:data];
                    xDir=0;
                    if (abs(stepsSinceLastxDirChange)<xBacklash){
                        if (halfStep==0) xBacklashComp=-abs(stepsSinceLastxDirChange);
                        else xBacklashComp=0;
                    }
                    else {
                        if (halfStep==0) xBacklashComp=-xBacklash;
                        else xBacklashComp=-1;
                    }
                    stepsSinceLastxDirChange=0;
                }
                xybuf[1]=abs(totalX+xBacklashComp);
                xBacklashComp=0;
                if (halfStep==0) {
                    xPos=xPos+x;
                    stepsSinceLastxDirChange=stepsSinceLastxDirChange+x;
                }
                else {
                    xPos=xPos+x*8;
                    stepsSinceLastxDirChange=stepsSinceLastxDirChange+x*8;
                }
            }
            else {
                xRem=x;
            }
        }
        if (abs(y)>=1){
            move=1;
            float totalY=y;
            if (abs(totalY)>=1) {
                //change y direction for positive travel
                if (y>=1 && yDir!=1) {
                    NSData * data;
                    UInt8 buf[3] = {0x13, 0x00, 0x00};
                    data = [[NSData alloc] initWithBytes:buf length:3];
                    [ble write:data];
                    yDir=1;
                    if (abs(stepsSinceLastyDirChange)<yBacklash){
                        if (halfStep==0) yBacklashComp=abs(stepsSinceLastyDirChange);
                        else yBacklashComp=0;
                    }
                    else {
                        if (halfStep==0) yBacklashComp=yBacklash;
                        else yBacklashComp=1;
                    }
                    stepsSinceLastyDirChange=0;
                }
                //change y direction for negative travel
                else if (y<=-1 && yDir!=0) {
                    NSData * data;
                    UInt8 buf[3] = {0x03, 0x00, 0x00};
                    data = [[NSData alloc] initWithBytes:buf length:3];
                    [ble write:data];
                    yDir=0;
                    if (abs(stepsSinceLastyDirChange)<yBacklash){
                        if (halfStep==0) yBacklashComp=-abs(stepsSinceLastyDirChange);
                        else yBacklashComp=0;
                    }
                    else {
                        if (halfStep==0) yBacklashComp=-yBacklash;
                        else yBacklashComp=-1;
                    }
                    stepsSinceLastyDirChange=0;
                }
                xybuf[2]=abs(totalY+yBacklashComp);
                yBacklashComp=0;
                if (halfStep==0) {
                    yPos=yPos+y;
                    stepsSinceLastyDirChange=stepsSinceLastyDirChange+y;
                }
                else {
                    yPos=yPos+y*8;
                    stepsSinceLastyDirChange=stepsSinceLastyDirChange+y*8;
                }
            }
            else {
                yRem=y;
            }
        }
        if (move==1) {
            NSData * data;
            data = [[NSData alloc] initWithBytes:xybuf length:3];
            [ble write:data];
            communicationHalt=1;
            [NSTimer scheduledTimerWithTimeInterval:(float).025 target:self selector:@selector(communicationHaltTimer:) userInfo:nil repeats:NO];
            move=0;
            if (abs(z)>=1){
                //change z direction for positive travel
                if (z>0 && zDir!=0) {
                    NSData * data;
                    UInt8 buf[3] = {0x08, 0x00, 0x00};
                    data = [[NSData alloc] initWithBytes:buf length:3];
                    [ble write:data];
                    zDir=0;
                }
                //change z direction for negative travel
                else if (z<0 && zDir!=1) {
                    NSData * data;
                    UInt8 buf[3] = {0x09, 0x00, 0x00};
                    data = [[NSData alloc] initWithBytes:buf length:3];
                    [ble write:data];
                    zDir=1;
                }
                float totalZ=z;
                if (abs(totalZ)>=1) {
                    NSData * data;
                    //step
                    UInt8 buf4[3] = {0x20, 0x00, 0x00};
                    buf4[1]=abs(totalZ);
                    data = [[NSData alloc] initWithBytes:buf4 length:3];
                    [ble write:data];
                    zPos=zPos+z;
                    zRem=0;
                    if (totalZ>255) {
                        [self moveStage:0:0:totalZ-255];
                    }
                    else if (totalZ<-255) {
                        [self moveStage:0:0:totalZ+255];
                    }
                }
                else {
                }
            }
        }
    }
    else {
        xRem=xRem+x;
        yRem=yRem+y;
        zRem=zRem+z;
    }
    if ([shutdownTimer isValid]){
        [shutdownTimer invalidate];
    }
    shutdownTimer=[NSTimer scheduledTimerWithTimeInterval:(float)5 target:self selector:@selector(shutdownTimer:) userInfo:nil repeats:NO];
}

- (void)didReceiveMemoryWarning

{

[super didReceiveMemoryWarning];

}

@end
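For reference, the single-byte opcodes that Stage.m writes (each padded to a 3-byte packet, with the step counts carried in the second and third bytes of the step commands) are collected below. The enumeration and the role descriptions are an editorial summary inferred from the code and comments above, not part of the original listing; a minimal sketch:

#import <Foundation/Foundation.h>

// Editorial summary of the 3-byte BLE commands used by Stage.m (roles inferred).
typedef NS_ENUM(uint8_t, StageCommand) {
    StageCommandStepXY        = 0x02,  // step x/y; buf[1] = |x| steps, buf[2] = |y| steps
    StageCommandYDirNegative  = 0x03,  // set y direction for negative travel
    StageCommandDisableY      = 0x04,  // disable y drive
    StageCommandDisableX      = 0x05,  // disable x drive
    StageCommandDisableZ      = 0x06,  // disable focus (z) motor
    StageCommandXDirPositive  = 0x07,  // set x direction for positive travel
    StageCommandZDirPositive  = 0x08,  // set z direction for positive travel
    StageCommandZDirNegative  = 0x09,  // set z direction for negative travel
    StageCommandYDirPositive  = 0x13,  // set y direction for positive travel
    StageCommandEnableY       = 0x14,  // enable y drive
    StageCommandEnableX       = 0x15,  // enable x drive
    StageCommandEnableZ       = 0x16,  // enable focus (z) motor
    StageCommandXDirNegative  = 0x17,  // set x direction for negative travel
    StageCommandStepZ         = 0x20,  // step z; buf[1] = |z| steps
    StageCommandHalfStep      = 0x21,  // set pin 12 low: half-step mode
    StageCommandSixteenthStep = 0x22,  // set pin 12 high: sixteenth-step mode
    StageCommandHomeXY        = 0x23,  // home the x/y axes
    StageCommandHomeZ         = 0x24,  // home the z axis
};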

Sequencer.h

//

// Sequencer.h

// MiniScope

//

// Created by Mike D'Ambrosio on 4/1/14.

// Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.

//

#import <UIKit/UIKit.h>

@interface Sequencer : UIViewController

@property (strong, nonatomic) NSMutableArray * sequences;
@property (strong, nonatomic) NSMutableArray * savedSequences;
@property (strong, nonatomic) NSArray * sequence;

-(void) clearSequence;

-(void) addToSequence:(float) x: (float) y: (float) z: (float) r;

-(void) endSequence;

-(NSArray * ) getCurrentValue;

-(void) generateGridSequence:(int) totalNumRows;

-(void) saveSequence;

-(void) restoreSequence;

@end
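Based on the calls made from ViewController.m above, the sequencer is driven roughly as sketched below. The RunGridScanSketch function and the locally created stage and sequence objects are illustrative assumptions standing in for the instances the application creates elsewhere; in the app itself a frame is captured, and autofocus is interleaved, between moves.

#import "Sequencer.h"
#import "Stage.h"

// Illustrative only: mirrors the grid-scan flow in ViewController.m above.
static void RunGridScanSketch(void) {
    Sequencer * sequence = [[Sequencer alloc] init];
    Stage * stage = [[Stage alloc] init];

    [sequence clearSequence];
    [sequence generateGridSequence:5];            // queue a 5-row serpentine grid

    NSArray * step = [sequence getCurrentValue];  // each entry holds x, y, z, r
    while ([step count] > 1) {                    // a 1-element array marks the end
        float x = [[step objectAtIndex:0] floatValue];
        float y = [[step objectAtIndex:1] floatValue];
        float z = [[step objectAtIndex:2] floatValue];
        [stage moveStage:x:y:z];                  // a frame would be captured here
        step = [sequence getCurrentValue];
    }
}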

Sequencer.m

//

// Sequencer.m

// MiniScope
//
// Created by Mike D'Ambrosio on 4/1/14.
// Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.
//

#import "Sequencer.h"

@interface Sequencer () {
    int currSequence;
    int savedCurrSequence;
}
@end

@implementation Sequencer

@synthesize sequence;
@synthesize sequences;

@synthesize savedSequences;

// generateGridSequence builds a serpentine (boustrophedon) scan path: 25 equal x
// steps across each row, alternating direction on successive rows, with a single
// negative y step between rows.
-(void) generateGridSequence:(int) totalNumRows{
    float stepSize=10*8;
    NSNumber * n1 = [NSNumber numberWithFloat:stepSize];
    NSNumber * mn1 = [NSNumber numberWithFloat:stepSize*-1];
    NSNumber * zero = [NSNumber numberWithFloat:0];
    int rows =0;
    int dir =1;
    int i=0;
    while (rows<totalNumRows){
        if (dir==1) dir =0;
        else if (dir==0) dir =1;
        i=0;
        while (i<25){
            if (dir==1){
                sequence = [NSArray arrayWithObjects: n1,zero,zero,zero, nil];
            }
            else {
                sequence = [NSArray arrayWithObjects: mn1,zero,zero,zero, nil];
            }
            [sequences addObject:sequence];
            i++;
        }
        sequence = [NSArray arrayWithObjects: zero,mn1,zero,zero, nil];
        [sequences addObject:sequence];
        rows = rows + 1;
    }
}

-(void) addToSequence:(float) x: (float) y: (float) z: (float) r {
    NSNumber * n1 = [NSNumber numberWithFloat:x];
    NSNumber * n2 = [NSNumber numberWithFloat:y];
    NSNumber * n3 = [NSNumber numberWithFloat:z];
    NSNumber * n4 = [NSNumber numberWithFloat:r];
    sequence = [NSArray arrayWithObjects:n1, n2, n3, n4, nil];
    [sequences addObject:sequence];
}

-(void) saveSequence {

savedSequences=[sequences mutableCopy];

savedCurrSequence=currSequence;

}

-(void) restoreSequence {

sequences=[savedSequences mutableCopy];

currSequence=savedCurrSequence;

}

-(void) clearSequence{

    [sequences removeAllObjects];

currSequence=0;

}

-(void) endSequence {

currSequence=[sequences count];

savedSequences=[sequences mutableCopy];

savedCurrSequence=currSequence;

}

-(NSArray *) getCurrentValue {
    NSArray * currArray;
    currSequence++;
    if ((currSequence-1)<[sequences count]) {
        currArray= [sequences objectAtIndex:(currSequence-1)];
        return currArray;
    }
    else {
        NSNumber * n1 = [NSNumber numberWithFloat:0];
        currArray = [NSArray arrayWithObjects:n1, nil];
    }
    return currArray;
}

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        sequence= [[NSArray alloc] init];
        sequences = [[NSMutableArray alloc] init];
        savedSequences = [[NSMutableArray alloc] init];
    }
    return self;
}

- (void)viewDidLoad {
    [super viewDidLoad];
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
}

@end

main.m

//

// main.m

// MiniScope

//

// Created by Mike D'Ambrosio on 2/22/14.

// Copyright (c) 2014 Mike D'Ambrosio. All rights reserved.

//

#import <UIKit/UIKit.h>

#import "AppDelegate.h" int main(int argc, char * argv[])

{

@autoreleasepool {

return UIApplicationMain(argc, argv, nil, NSStringFromClass([AppDelegate class]));

}

}