

Title:
SYSTEMS AND METHODS FOR INSPECTION ANIMATION
Document Type and Number:
WIPO Patent Application WO/2024/036076
Kind Code:
A1
Abstract:
The application discloses various systems and methods for generating and utilizing computer animation transitions in the leveling of a video to an upright, earth normal orientation, which may be used in pipe inspections. Such computer animation transitions may be generated from a single inspection image at one time in the inspection video or from subsequent inspection images at different times and points in space in the inspection video. Some leveling methods may include rotating a square GL surface in ninety degree increments that includes one or more computer animation transitions. Other leveling methods may include rotating a three dimensional GL surface that is shaped to match the shape of the pipe and that includes one or more computer animation transitions. Further methods may include rotating a GL surface that is a regular polygon in shape and that includes one or more computer animation transitions.

Inventors:
NICHOLS MICHAEL (US)
WARREN ALEXANDER (US)
MARTIN MICHAEL (US)
Application Number:
PCT/US2023/071634
Publication Date:
February 15, 2024
Filing Date:
August 04, 2023
Assignee:
SEESCAN INC (US)
International Classes:
G01N21/954; G03B37/00; H04N5/262; H04N5/272
Claims:
CLAIMS

We Claim:

1. A method for video leveling that includes computer animation transitions for use in pipe inspection systems, comprising: generating, via one or more image sensors disposed in an inspection camera, a video signal comprising a plurality of sequential inspection images representing the field of view captured by the camera and, via one or more orientation sensors disposed in the inspection camera, an orientation signal measuring the orientation of the inspection images relative to an upright, earth normal orientation in inspection images of the video signal; receiving the video signal and orientation signal at a processing element having one or more graphics processing units (GPUs) or other processors; determining an orientation correction for the video signal describing rotations of the inspection images in ninety degree increments that reorients the images to most closely resemble an upright, earth normal orientation; generating a computer animation transition illustrating sequential steps in the degree and direction of orientation correction rotations comprising one or more inspection images from the video signal at one or more points in time and space in the inspection video; outputting a corrected video signal that includes an upright, earth normal oriented video along the most upright ninety degree rotation increment that further includes one or more computer animation transitions where orientation correction rotations occur; and displaying, on a display interface on one or more electronic display devices, the corrected upright video having computer animation transitions from the corrected video signal.
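The orientation-correction and transition-generation steps of Claim 1 can be sketched as follows. This is an illustrative sketch only, not the applicants' implementation; the function names and the sign convention for the roll angle are assumptions.

```python
# Pick the ninety degree rotation that brings an image closest to upright,
# given a roll angle (in degrees) from the orientation sensor, and list the
# intermediate angles a computer animation transition could step through.

def nearest_quarter_turn(roll_deg: float) -> int:
    """Correction, in degrees, as the nearest multiple of 90 (mod 360)."""
    return (round(roll_deg / 90.0) * 90) % 360

def transition_angles(correction_deg: int, steps: int = 10) -> list:
    """Sequential angles for animating the rotation from 0 to the correction."""
    return [correction_deg * i / steps for i in range(1, steps + 1)]
```

In a real system the correction would be applied against the measured roll (rotating the image opposite the camera's twist), and the per-frame animation angles would drive the GL surface rotation described in the later claims.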

2. The method of Claim 1, wherein the inspection images captured in the video signal are cropped, reduced in size, or stretched to fit the inspection area represented on the display interface in the rotation correction.

3. The method of Claim 1, wherein the computer animation transition includes cropping, reducing in size, or stretching to fit the inspection area represented on the display interface at each rotation step.

4. The method of Claim 1, wherein the inspection video images are mapped onto a graphics library (GL) surface that is further rotated in rotation corrections and associated computer animation transitions.

5. The method of Claim 1, wherein orientation corrections are generated from a single inspection image from the video signal captured at a single point in time and space in the inspection video.

6. The method of Claim 1, wherein orientation corrections are generated from a plurality of inspection images from the video signal captured at successive points in time and space.

7. The method of Claim 1, wherein the method is carried out and displayed in real-time or near real-time.

8. The method of Claim 1, wherein the corrected upright video having computer animation transitions is stored in one or more non-transitory memories.

9. The method of Claim 8, wherein the method is carried out in post-processing in one or more electronic display devices.

10. The method of Claim 1, wherein the inspection image rotates independently of other non-rotating additional information on a display interface during computer animation transitions for leveling video to upright, earth normal orientation.

11. A method for video leveling that includes computer animation transitions for use in pipe inspection systems, comprising: generating, via one or more image sensors disposed in an inspection camera, a video signal comprising a plurality of sequential inspection images and, via one or more orientation sensors disposed in the inspection camera, an orientation signal measuring the orientation of the inspection images captured by the inspection camera relative to an upright, earth normal orientation; receiving the video signal and orientation signal at a processing element having one or more graphics processing units (GPUs) or other processors; determining a graphics library (GL) surface mapped on the processing element onto which the video signal images are rendered such that the GL surface represents a field of view captured in the video signal from the inspection camera that is or is made to be square in shape; determining, via the processing element, an orientation correction that describes rotations of the inspection images captured in the GL surface in ninety degree increments such that a corrected video includes orientation corrections that most closely resemble an upright, earth normal orientation; generating a computer animation transition illustrating sequential rotation steps in the degree and direction of orientation correction rotations comprising one or more inspection images from the video signal at one or more points in time and space in the inspection video; outputting a corrected video signal that includes an upright, earth normal oriented video along the most upright ninety degree rotation increment that further includes one or more computer animation transitions where orientation correction rotations occur; and displaying, on a display interface on one or more electronic display devices, the corrected upright video having computer animation transitions from the corrected video signal.

12. The method of Claim 11, wherein the GL surface is a three dimensional shape matching the cylindrical shape of the inspected pipe and rotation corrections occur about the axis of the cylindrical pipe shape.

13. The method of Claim 11, wherein the GL surface is a non-square regular polygon in shape as mapped onto the processing element and rotation corrections occur at increments equal to the central angle of that polygonal shape.
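The geometry behind Claim 13 is that a regular n-sided polygon maps onto itself under rotation by its central angle, 360/n degrees. A minimal sketch (hypothetical helper names, not from the application):

```python
# For a GL surface shaped as a regular polygon with num_sides sides, rotation
# corrections occur in increments equal to the polygon's central angle.

def central_angle_deg(num_sides: int) -> float:
    """Central angle of a regular polygon; the rotation increment."""
    return 360.0 / num_sides

def snap_to_increment(roll_deg: float, num_sides: int) -> float:
    """Nearest multiple of the central angle (mod 360), as a correction."""
    step = central_angle_deg(num_sides)
    return (round(roll_deg / step) * step) % 360.0
```

A square (Claims 1 and 11) is the n = 4 case, giving the ninety degree increments used elsewhere; a hexagonal surface would rotate in sixty degree increments.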

14. The method of Claim 11, wherein inspection images are cropped, reduced in size, stretched, or otherwise resized to fit the GL surface shape.

15. The method of Claim 11, wherein orientation corrections are generated from a single inspection image from the video signal captured at a single point in time and space in the inspection video.

16. The method of Claim 11, wherein orientation corrections are generated from a plurality of inspection images from the video signal captured at successive points in time and space.

17. The method of Claim 11, wherein the method is carried out and displayed in real-time or near real-time.

18. The method of Claim 11, wherein the corrected upright video having computer animation transitions is stored in one or more non-transitory memories.

19. The method of Claim 18, wherein the method is carried out in post-processing in one or more electronic display devices.

20. The method of Claim 11, wherein the GL surface rotates independently of other non-rotating additional information on the display interface during computer animation transitions for leveling video to upright, earth normal orientation.

21. An inspection system for video leveling that includes computer animation transitions, comprising: an inspection camera, comprising: an imaging element having one or more imaging sensors positioned behind one or more lenses and disposed in a housing for generating a video signal; an orientation element having one or more orientation sensors for generating an orientation signal describing the orientation of the inspection images relative to upright, earth normal orientation; an electronic display device, comprising: a processing element having one or more processors receiving video signals and orientation signals from the inspection camera and outputting a corrected video signal having a corrected, upright, earth normal video that includes one or more video leveling computer animation transitions; a memory element having one or more non-transitory memories for storing instructions related to generating a corrected video signal including one or more video leveling computer animation transitions and the resulting corrected video signal including one or more video leveling computer animation transitions; and a display interface to display the corrected, upright, earth normal video including one or more video leveling computer animation transitions of the corrected video signal.

22. The inspection system of Claim 21, wherein the inspection camera is disposed on one end of a push-cable to move the inspection camera into a pipe or other void while distributing power and communicating data with the inspection camera.

23. The inspection system of Claim 21, wherein the inspection image rotation corrections associated with the corrected video signal are generated from a single inspection image from the video signal captured at a single point in time and space.

24. The inspection system of Claim 21, wherein the inspection image rotation corrections associated with the corrected video signal are generated from a plurality of inspection images from the video signal captured at successive points in time and space.

25. The inspection system of Claim 21, wherein the processing element includes one or more graphics processing units (GPUs).

26. The inspection system of Claim 21, wherein the electronic display device is or includes a camera control unit (CCU) used in pipe inspection systems.

27. The inspection system of Claim 21, wherein the electronic display device includes a smart phone, tablet or laptop computer, PC, or other remotely connected computing device.

28. The inspection system of Claim 21, wherein inspection images are cropped, reduced in size, or stretched to fit a GL surface shape in the processing element.

29. The inspection system of Claim 21, wherein the GL surface rotates independently of other non-rotating additional information on the display interface during computer animation transitions for leveling video to upright, earth normal orientation.

30. A method for smoothing the boundary transition between an image and non-image background on a display used in pipe inspection systems with digital self-leveling, comprising: generating, via one or more image sensors disposed in an inspection camera, a video signal comprising a plurality of sequential inspection images (original video image stream) representing the field of view captured by the camera; receiving the video signal at a processing element having one or more graphics processing units (GPUs) or other processors; rotating images of the original video stream to be used as a display background image to align with a display screen; scaling the background image to fully cover the display screen; editing the background image; overlaying the rotated original video image stream on top of the background image; and displaying the overlaid images on a display interface on one or more electronic display devices.
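The overlay step of Claim 30 centers the rotated video frame over a background that has been scaled to cover the whole display. The sketch below shows that compositing step in the simplest possible terms, with images as nested lists of pixel values; the function name is hypothetical, and a real system would use GPU textures or an image library.

```python
# Paste a (rotated, smaller) video frame at the center of a display-sized
# background image. Both images are lists of rows of pixel values.

def overlay_center(background, frame):
    """Return a copy of background with frame pasted at its center."""
    out = [row[:] for row in background]          # copy, leave input intact
    top = (len(background) - len(frame)) // 2
    left = (len(background[0]) - len(frame[0])) // 2
    for r, row in enumerate(frame):
        for c, px in enumerate(row):
            out[top + r][left + c] = px
    return out
```

The blurred background then shows through only around the edges of the frame, smoothing the boundary between image and non-image regions of the display.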

31. The method of Claim 30, wherein scaling the background image comprises enlarging, stretching, extending, cropping, or otherwise editing the background image.

32. The method of Claim 30, wherein editing the background image comprises applying a Gaussian Blur filter.

33. The method of Claim 30, wherein editing the background image comprises blurring the background image by applying PhotoShop blurring effects, or other image blurring programs, algorithms, techniques, or filters.

34. The method of Claim 30, wherein editing the background image comprises applying pixel color averaging.
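One way to read the "pixel color averaging" of Claim 34 is as a box blur: each background pixel is replaced with the mean of its 3x3 neighborhood. This is a sketch on a grayscale grid under that assumption, not the applicants' implementation.

```python
# 3x3 box blur via pixel averaging; the neighborhood is clipped at the
# image border so edge pixels average fewer values.

def box_blur(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            vals = [img[rr][cc]
                    for rr in range(max(0, r - 1), min(h, r + 2))
                    for cc in range(max(0, c - 1), min(w, c + 2))]
            out[r][c] = sum(vals) / len(vals)
    return out
```

A Gaussian Blur (Claim 32) works the same way but weights nearby pixels more heavily than distant ones.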

35. The method of Claim 30, further comprising determining via the processing element, a GL surface that is square in shape mapped onto the processing element containing all or a portion of the field of view captured in the video signal from the inspection camera.

36. The method of Claim 30, further comprising adjusting the focus view of the background to compensate for a camera or an optical sensor which has dynamic zoom functionality.

37. The method of Claim 30, wherein the one or more electronic display devices comprise a smart phone, tablet or laptop computer, PC, or other remotely connected computing device.

38. The method of Claim 30, further comprising creating a composite image by mirroring the image around a central image of itself to create multiple frames of the image, aligning the edges of the multiple images that overlap to create a single frame, and then applying blurring effects to areas of the composite image in overlapping regions that were not part of the central image.

39. The method of Claim 38, further comprising applying bending effects to one or more of the multiple frames to create a visual perception to a viewer of moving with the camera as it moves through a pipe, conduit, or void.

40. A pipe inspection system for smoothing the boundary transition between an image and non-image background on a display, the system comprising: a cable drum; a self-leveling camera for capturing data relating to one or more images; a push-cable, wherein a proximal end of the push-cable is attached to the cable drum, and a distal end of the push-cable is attached to the camera; an editing module comprising: a processing element including one or more processors, and one or more memories programmed with instructions to smooth the transition between a background image and an overlaid actual image by editing the background image; a display for rendering the actual image and the background image.

41. The system of Claim 40, wherein the editing module further includes instructions for scaling the background image by enlarging, stretching, extending, cropping, or otherwise editing the background image to fit the display.

42. The system of Claim 40, wherein editing the background image comprises applying a Gaussian Blur filter.

43. The system of Claim 40, wherein editing the background image comprises blurring the background image by applying PhotoShop blurring effects, or other image blurring programs, algorithms, techniques, or filters.

44. The system of Claim 40, wherein editing the background image comprises applying pixel color averaging.

45. The system of Claim 40, further comprising determining, via the processing element, a GL surface that is square in shape mapped onto the processing element containing all or a portion of the field of view captured in the video signal from the inspection camera.

46. The system of Claim 40, further comprising adjusting the focus view of the background to compensate for a camera or an optical sensor which has dynamic zoom functionality.

47. A method for smoothing the boundary transition between an image and non-image background on a display used in pipe inspection systems with digital self-leveling, comprising: generating, via one or more image sensors disposed in an inspection camera, a video signal comprising a plurality of sequential inspection images (original video image stream) representing the field of view captured by the camera; receiving the video signal at a processing element having one or more graphics processing units (GPUs) or other processors; rotating images of the original video stream to align with a display screen; selecting a background image to fully cover the display screen; blurring the background image; overlaying the rotated original video image stream on top of the background image; and displaying the overlaid images on a display interface on one or more electronic display devices.

48. The method of Claim 47, wherein the background selected includes a live real-time or near real-time image(s) that is the same as the original video image stream, a live real-time or near real-time image(s) that is different than the original video image stream, or a single image, multiple images, or a video image stream that has been previously stored in one or more memories, a database, or a cloud.

49. A method for video leveling that includes computer animation transitions for use in pipe inspection systems, comprising: generating, via one or more image sensors disposed in an inspection camera, image data representing the field of view captured by the camera and, via one or more orientation sensors disposed in the inspection camera, an orientation signal measuring the orientation of the inspection images relative to an upright, earth normal orientation in inspection images of the video signal; receiving the video signal and orientation signal at a processing element having one or more graphics processing units (GPUs) or other processors; determining an orientation correction for the video signal describing rotations of the inspection images in ninety degree increments that reorients the images to most closely resemble an upright, earth normal orientation; generating a computer animation transition illustrating sequential steps in the degree and direction of orientation correction rotations comprising one or more inspection images from the video signal at one or more points in time and space in the inspection video; outputting a corrected video signal that includes an upright, earth normal oriented video along the most upright ninety degree rotation increment that further includes one or more computer animation transitions where orientation correction rotations occur; and displaying, on a display interface on one or more electronic display devices, the corrected upright video having computer animation transitions from the corrected video signal.

50. The method of Claim 49, wherein the corrected upright video having computer animation transitions is stored in one or more non-transitory memories.

51. The method of Claim 49, further comprising storing, in one or more non-transitory memories, un-righted video with a righting sequence so that it can be righted during playback.

Description:
SYSTEMS AND METHODS FOR INSPECTION ANIMATION

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority under 35 U.S.C. § 119(e) to co-pending United States Provisional Patent Application Serial No. 63/370,760 entitled SYSTEMS AND METHODS FOR INSPECTION ANIMATION, filed on August 8, 2022, and Serial No. 63/485,905 entitled SYSTEMS AND METHODS FOR INSPECTION ANIMATION, filed on February 18, 2023, the contents of which are hereby incorporated by reference herein in their entirety for all purposes.

FIELD

[0002] This disclosure relates generally to computer animation transition methods and systems in the leveling of a video to an upright, earth normal orientation. More specifically, but not exclusively, the disclosure relates to computer animation transition methods and systems in the leveling of a video to an upright, earth normal orientation as used in pipe inspections.

BACKGROUND

[0003] There are many applications where a video camera may be used to generate a video of a target or scene that may be viewed either in real-time or in playback. In many such applications, it may be advantageous for the observed video to have an “upright” or earth normal orientation in order to facilitate ease of viewing, help visually orient the viewer, and help prevent the neck strain of cocking one's head to the side to bring an image closer to normal. For instance, in video pipe inspection systems, a camera may be affixed to one end of a resilient, flexible push-cable such that the inspection camera may be deployed into a pipe or other void that may otherwise be difficult to access for the purposes of inspection. The push-cable may distribute electrical power to the inspection camera as well as communicate a video signal to an electronic display device on the opposite end of the push-cable for the purpose of displaying and recording the video and, in some known systems, other in-pipe data gathered by sensors in or near the camera. The inspection camera may twist and turn as it is moved through bends and turns in the pipe or other void. As the inspection camera twists and turns, the orientation of the frame of view captured by the inspection camera may likewise twist and turn, causing a viewer to become disoriented relative to the upright, earth normal orientation.

[0004] Solutions to rectify the orientation of a video to being upright and earth normal are known in the art though such known solutions are less than ideal. In at least one solution known in the art, the camera may be configured to mechanically self-level. For instance, such inspection cameras may include a camera body that is mounted for free rotation within a camera housing and a leveling weight that is physically attached to the camera body. The center of mass of the weight is displaced from the axis of rotation of the camera body so that the camera body may be leveled via gravitational forces. However, this design does not lend itself to easy removal and/or repair of the video camera and associated electronics within the camera head. Likewise, such designs may lack the ruggedness desirable in an inspection camera commonly used in harsh environments such as one might find inside a corroded pipe.

[0005] In the past, there have also been electronic solutions to the problem of orienting a video image from a remote video camera. If one is only interested in a rotation of one hundred and eighty degrees (the coarsest rotation, commonly called a screen flip), this can easily be done in one of two ways. The video transmitted by the camera can be converted to a digital format and remapped so that what was originally the lower, right-most pixel is presented at the upper, left-most corner, and so on. The remapped digital data can then be reconverted to analog form. Alternatively, in the case of a monitor having a cathode ray tube, the vertical and horizontal gun polarity can be reversed: instead of scanning from left to right, top to bottom, the guns scan right to left, bottom to top. Either approach yields the same effect of rotating the video from the camera by one hundred and eighty degrees.
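The pixel-remapping screen flip described above amounts to reversing both the row and the column order of the digital frame, which can be sketched in a couple of lines (illustrative only; the function name is hypothetical):

```python
# 180-degree screen flip: the lower, right-most pixel becomes the upper,
# left-most pixel, and so on. img is a list of rows of pixel values.

def flip_180(img):
    return [row[::-1] for row in img[::-1]]
```

Applying the flip twice returns the original frame, which is why this remapping offers only the two positions (0 and 180 degrees) noted below.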

[0006] A digital flip and/or mirror is also commonly used both in LCD monitors and in some CCD cameras. One advantage of performing the flip before recording is that the corrected image is then recorded; pipe inspection systems in use today that invert the image on the monitor do not allow the inverted image to be recorded. The main advantages of the flip approach are low cost and preservation of the original aspect ratio of the video (typically 4:3). The primary disadvantage is the limited rotational resolution (only two positions: 0 degrees and 180 degrees of rotation).

[0007] Some manufacturers of video equipment have taken a video stream, converted it into a digital format, performed a matrix operation on the digital data to rotate the entire image by a predetermined amount, and then re-converted the digital data to an analog signal. This approach is optimal in terms of rotational resolution; however, it is extremely computationally intense and therefore requires a significant cost in parts and power. It also suffers from the drawback that the rectangular 4:3 array is clipped, so that some video content is lost at any angle other than zero and one hundred and eighty degrees. At rotations of ninety and two hundred and seventy degrees, the entire right and left lobes of the source video are lost.
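The matrix-rotation approach and its clipping drawback can be demonstrated with a small inverse-mapping sketch: each destination pixel is rotated back into the source frame, and destinations whose source falls outside the rectangular array are lost. This is a simplified illustration (nearest-neighbor sampling, hypothetical function name), not a production rotation routine.

```python
import math

# Rotate img by angle_deg about its center; destination pixels whose source
# lies outside the original rectangle are clipped (filled with 0).

def rotate_clipped(img, angle_deg):
    h, w = len(img), len(img[0])
    a = math.radians(angle_deg)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            # inverse-rotate (r, c) about the center to find the source pixel
            sr = cy + (r - cy) * math.cos(a) - (c - cx) * math.sin(a)
            sc = cx + (r - cy) * math.sin(a) + (c - cx) * math.cos(a)
            sr, sc = round(sr), round(sc)
            if 0 <= sr < h and 0 <= sc < w:
                out[r][c] = img[sr][sc]
    return out
```

Rotating a wide (e.g., 4:3) frame by ninety degrees with this scheme visibly zeroes out the regions corresponding to the clipped left and right lobes of the source.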

[0008] In further solutions known in the art of video pipe inspection, a video signal may be converted to a digital format and a rectangular rotation in ninety degree increments may be performed via a processor (e.g., a GPU or other processor) based on the output signal from the orientation sensor. Such solutions further disclose various methods of cropping, reducing video image sizes, or stretching video images to handle aspect ratios of video images having differing horizontal and vertical image size measurements. Whereas such solutions are far less computationally intense, and therefore require far less expense in parts and power, they tend to momentarily disorient the viewer when rotations in ninety degree increments occur. Further, such solutions tend to be carried out in hardware and may be tailored to a specific electronic display device. For instance, such an approach in a pipe inspection system may be carried out in a camera control unit (CCU) or similar display customized for rotation based on specific applications and hardware; thus, such solutions fail to be readily portable to other electronic display devices. Further, such solutions are limited to rectangular areas of rotation. These problems are compounded in pipe inspection systems and similar systems where rotating inspection video images may share the same display interface as other non-rotating information.
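The reason the ninety degree increment approach is so much cheaper than the matrix rotation is that a quarter turn needs no trigonometry at all: it is a transpose plus a reversal of the pixel array. A minimal sketch (the clockwise direction and function name are assumptions):

```python
# Rotate a frame one quarter turn clockwise by reversing the row order and
# transposing. img is a list of rows of pixel values.

def rotate_90_cw(img):
    return [list(row) for row in zip(*img[::-1])]
```

Four successive applications return the original frame, and repeated calls yield the 0, 90, 180, and 270 degree positions that these solutions rotate between.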

[0009] Also, there is often a mismatch between the aspect ratio of the camera and the aspect ratio of the display to be used. This tends to prevent the original, as-captured image from completely covering the display. This problem may be compounded if the viewer changes display modes from portrait (vertical) to landscape (horizontal). Previous attempts to correct or reconcile this deficiency have included scaling the image to fit the display, which leaves regions of the display uncovered by the original image (blank or covered by background). Alternatively, the image may be sized to fit the full display area by stretching or otherwise extending it to cover the entire display area, which results in distortion of the original image. Other attempts have included scaling the original image in a manner that prevents the distortion by extending the image completely over the display area, and then discarding any portion of the original image that does not fit the display. This results in the viewer seeing an inaccurate image, that is, an image that is not the same as the original captured image.

[00010] Accordingly, there is a need in the art to address the above-described as well as other problems.

SUMMARY

[00011] In accordance with one aspect of the present invention, a method for leveling of video that includes one or more computer animation transitions as used in inspection systems is disclosed. The method may include generating, via one or more image sensors disposed in an inspection camera, a video signal comprising a plurality of sequential inspection images and, via one or more orientation sensors disposed in the inspection camera, an orientation signal measuring the orientation of the inspection image relative to an upright, earth normal orientation. The method may also include receiving the video signal and orientation signal at a processing element having one or more processors. The method may further include determining, via the processing element, an orientation correction that describes rotations of the inspection image(s) in ninety degree increments about a centroid such that a corrected video includes orientation corrections that most closely resemble an upright, earth normal orientation. The method embodiments may further include generating a computer animation transition illustrating sequential rotation steps in the degree and direction of the orientation correction rotations comprising one or more inspection images from the video signal at one or more points in time and space in the inspection video. In some method embodiments, the rotation correction may use the same inspection image at each sequential rotation step. In other embodiments, the method may use different inspection images at each rotation step from different points in time or space in the inspection. In some such embodiments, each different inspection image may be the current inspection image generated at the inspection camera. The method may include cropping, reducing in size, stretching, or otherwise resizing the inspection image to fit the display interface on the electronic display device at each rotation interval.
Method embodiments may further include outputting a corrected video signal that includes an upright, earth normal oriented video along the most upright ninety degree rotation increment that further includes one or more computer animation transitions where orientation correction rotations occur. The method may further include displaying, on one or more electronic display devices, the corrected upright video having computer animation transitions from the corrected video signal.
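The quarter-turn correction described above can be sketched in a few lines of code. The following is an illustrative example only, not the claimed implementation: the `level_frame` helper name is hypothetical, and a simple array rotation stands in for the disclosed rendering pipeline.

```python
import numpy as np

def level_frame(frame: np.ndarray, roll_deg: float) -> np.ndarray:
    """Rotate a frame in ninety degree steps so that it most closely
    matches an upright, earth normal orientation. The roll angle is
    assumed to come from the camera's orientation sensor."""
    # Snap the measured roll to the nearest quarter turn (0..3).
    quarter_turns = round((roll_deg % 360.0) / 90.0) % 4
    # np.rot90 rotates counter-clockwise about the array centre,
    # countering a clockwise camera roll by the same increment.
    return np.rot90(frame, k=quarter_turns)
```

A roll of, say, 92 degrees snaps to a single quarter turn; the computer animation transition would then display the intermediate angles between 0 and 90 degrees before settling on the corrected frame.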

[00012] In accordance with one aspect of the present invention, a method of computer animation transitions as used in inspection systems is disclosed. The method may include generating, via one or more image sensors disposed in an inspection camera, a video signal comprising a plurality of sequential inspection images and, via one or more orientation sensors disposed in the inspection camera, an orientation signal measuring the orientation of the inspection image relative to an upright, earth normal orientation. The method may also include receiving the video signal and orientation signal at a processing element having one or more processors. Further, the method may include determining, via the processing element, a GL surface that is square in shape mapped onto the processing element (e.g., one or more GPUs or the like) containing all or a portion of the field of view captured in the video signal from the inspection camera. The method may further include determining, via the processing element, an orientation correction that describes rotations of the field of view captured in the GL surface in ninety degree increments about a centroid such that a corrected video includes orientation corrections that most closely resemble an upright, earth normal orientation. In some method embodiments in keeping with the present disclosure, the GL surface may be any regular polygon in shape and such orientation corrections may occur at increments equal to the central angle measurement of the polygonal shape about a centroid thereof. The method embodiments may further include generating a computer animation transition illustrating sequential steps of the degree and direction of orientation correction rotations comprising one or more inspection images from the video signal at one or more points in time and space in the inspection video.
Method embodiments may further include outputting a corrected video signal that includes an upright, earth normal oriented video along the most upright ninety degree rotation increment that further includes one or more computer animation transitions where orientation correction rotations occur. Such methods of the present disclosure may further include displaying, on one or more electronic display devices, the corrected upright video having computer animation transitions from the corrected video signal.

[00013] In accordance with another aspect of the invention, an inspection system including video leveling having computer animation transitions is disclosed. The inspection system may include an inspection camera having an imaging element that includes one or more imaging sensors positioned behind one or more lenses and disposed in a housing for generating a video signal. The inspection camera may further include an orientation element having one or more orientation sensors for generating an orientation signal describing the orientation of the field of view of the video relative to an upright, earth normal orientation.

[00014] The inspection system may include an electronic display device having a processing element that includes one or more processors receiving the video signals and orientation signals from the inspection camera and may be configured to output a corrected video signal having a corrected, upright, earth normal video that includes one or more video leveling instances that each include a computer animation transition. The display device may further include a memory element having one or more non-transitory memories for storing instructions related to generating a corrected video signal including one or more video leveling computer animation transitions and the resulting corrected video signal including one or more video leveling computer animation transitions. The electronic display device may include a display interface to display the corrected, upright, earth normal video including one or more video leveling computer animation transitions of the corrected video signal.

[00015] In accordance with another aspect of the invention, an inspection system may include video leveling, and more specifically digital self-leveling, which may be or share aspects with those disclosed in, for example, United States Patent 8,587,648, issued November 19, 2013, entitled SELF-LEVELING CAMERA HEAD; and United States Patent 9,927,368, issued March 27, 2018, entitled SELF-LEVELING INSPECTION SYSTEMS AND METHODS, and/or other incorporated patents and applications.

[00016] When using digital self-leveling in a pipe inspection camera, images may be captured through a full 360 degrees of rotation of the camera head, which may be actively rotating during an inspection. Note that this provides for image capture through a full 360 degrees of rotation, not merely 180 degrees of rotation with the camera head then inverted to pick up the symmetry. The utility pipe inspection camera head, therefore, may be dynamically (actively) rotating in a pipe, conduit, or other void, thereby capturing a spinning image.

[00017] Typically, based on the position or pose of the camera head when an image is captured, the image will need to be rotated in order to fit the orientation of the display. Images may also need to be scaled to fit the display. The steps of rotating and scaling one or more images may be done in either order, or simultaneously, by one or more processors. There often exists a mismatch between the aspect ratio of the camera or image sensors and that of the display to be used. This mismatch may cause regions of the display (gaps) to be uncovered by the image and left blank, that is, filled in with a background that is not part of the image. The contrast between the actual image and the background at the transition, i.e., the point where the image stops and the background begins, will be harsh, i.e., not smooth or visually pleasing to the viewer. The transition between the image and the display background may be smoothed out by filling in the non-image regions (boundary extension) with a chosen image or background, which makes the transition seem to fade away. That is, the transition will be more pleasing to the eye of a viewer.
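The aspect-ratio mismatch discussed above can be quantified with a simple fit computation. The sketch below is illustrative only (the helper name and return shape are assumptions, not from the disclosure); it reports the uniform scale factor that preserves aspect ratio and the resulting uncovered gap on each axis.

```python
def fit_scale_and_gaps(img_w: int, img_h: int,
                       disp_w: int, disp_h: int):
    """Scale an image to fit inside a display while preserving its
    aspect ratio, and report the uncovered gap (in pixels) on each
    axis. Illustrative sketch only."""
    # The limiting axis determines the uniform scale factor.
    scale = min(disp_w / img_w, disp_h / img_h)
    fit_w, fit_h = round(img_w * scale), round(img_h * scale)
    # Gaps are the display pixels left uncovered by the fitted image.
    return scale, (disp_w - fit_w, disp_h - fit_h)
```

For example, a 640x480 (4:3) image on a 1080x1920 portrait display scales to 1080x810, leaving a 1110-pixel vertical gap to be filled with background.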

[00018] Another way to look at it is that extending the image into the non-image regions makes the contrast of the barrier between the image and the background (non-image area) less disruptive. Extending the boundary of an image (boundary extension) may include applying editing techniques to the image such as scaling, pixel averaging, image mirroring, and any other image extension techniques. Gaps between the image and the display area may be filled in with real-time images or stored images, and may include displaying a real-time image while filling in the gap with a previous image from the same stream or sequence as the real-time image being displayed.
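Filling gaps with a previous frame from the same stream, as described above, might be sketched as follows. This is illustrative only; the `fill_gaps` helper and the boolean gap mask are assumptions, not the disclosed implementation.

```python
import numpy as np

def fill_gaps(current: np.ndarray, previous: np.ndarray,
              gap_mask: np.ndarray) -> np.ndarray:
    """Fill display regions not covered by the current frame using
    pixels from a previous frame of the same stream or sequence.
    gap_mask is True where the current frame leaves the display
    uncovered. Illustrative sketch only."""
    out = current.copy()
    out[gap_mask] = previous[gap_mask]
    return out
```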

[00019] In one aspect of the invention, a background image is chosen. Images may be chosen from the image or image stream to be displayed, or another desired background image may be used. Background images that are closest to the actual image will provide the most natural transition because they will be similar in characteristics such as color, tone, texture, angle, lighting, etc. The chosen background image is then fit to the size of the image display.

[00020] Fitting of the background image to the display screen may include rotating, aligning, extending (stretching), scaling (enlarging or minimizing), cropping, or otherwise editing the image to completely cover the display area. Then the background image may be blurred using a Gaussian filter, or other well-known blurring techniques, algorithms, or filters, such as provided in Adobe Photoshop or other readily available applications, to blur, soften, smooth, or distort the image. The actual image, or image stream, is then overlaid on the background to render a single image to the viewer. The blurring effect will ensure a smooth image transition between the actual image and the background while preventing the viewer from confusing the background image data with real image data. As an alternative to using a blurring effect, pixel averaging techniques may be used to fill in the background with the average color (e.g., computing an average or arithmetic mean of the intensity values for each pixel position in a set of captured images from the same scene or view field) to smooth the transition.
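One way to realize the blur-and-overlay step above is sketched below. A repeated box blur stands in for the Gaussian filter mentioned (repeated box blurs approximate a Gaussian), and all names are illustrative assumptions rather than taken from the disclosure.

```python
import numpy as np

def blur(img: np.ndarray, passes: int = 3) -> np.ndarray:
    """Cheap separable box blur, used here as a stand-in for a
    Gaussian filter. Edges wrap around, which is acceptable for a
    background layer. Illustrative sketch only."""
    out = img.astype(float)
    for _ in range(passes):
        # Average each pixel with its vertical, then horizontal
        # neighbours; repeating approximates a Gaussian kernel.
        out = (np.roll(out, 1, 0) + out + np.roll(out, -1, 0)) / 3.0
        out = (np.roll(out, 1, 1) + out + np.roll(out, -1, 1)) / 3.0
    return out

def compose(actual: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Overlay the actual image, centred, on a blurred background
    that has already been fit to the display size."""
    canvas = blur(background)
    h, w = actual.shape[:2]
    top = (canvas.shape[0] - h) // 2
    left = (canvas.shape[1] - w) // 2
    canvas[top:top + h, left:left + w] = actual
    return canvas
```

The overlaid centre region remains sharp while the surrounding background is softened, so the viewer is unlikely to mistake background pixels for real image data.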

[00021] The actual image may be a single image or a plurality of sequential images (a video stream) generated via one or more image sensors disposed in a utility pipe inspection camera. Different inspection images from a rotating camera head may be taken from different points in time or space in the inspection. The background image or images may be the same actual captured image, a different captured image, an image captured from a different camera or photo sensor, or an image retrieved from a memory, database, or the cloud.

[00022] In accordance with another aspect of the invention, the display may include a mobile device such as a cellular phone (cell phone), mobile phone, or smart phone, and processing of any images may be performed by hardware existing on the mobile device. Live video may be sent from a pipe inspection camera, and received as sensor data by a mobile device in real-time or near real-time. The mobile phone or smart phone may have its own orientation sensors, and data from the inspection camera sensors and the phone sensors may be combined to provide a righted view on a display, especially a view that minimizes computational complexity to maximize processor and memory efficiency.

[00023] In accordance with another aspect of the invention, smoothing of the transition between the actual image and the background may include creating a composite image by mirroring a single image around itself to create multiple frames, and then editing the image using blurring and other techniques to smooth the transition between the edges of the image itself and the mirrored frames. Mirroring an image provides a memory-efficient implementation because multiple frames do not have to be stored in memory; each mirrored frame can be generated from the single stored frame as needed.
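The mirrored composite described above can be sketched as a 3x3 tiling in which every frame adjacent to the original is its mirror image, so pixel values are continuous across each seam. This is an illustrative sketch, not the claimed method.

```python
import numpy as np

def mirror_tile(img: np.ndarray) -> np.ndarray:
    """Create a 3x3 composite by mirroring a single frame around
    itself; the centre tile is the original image, and every seam
    is continuous because adjacent tiles are reflections."""
    flip_v = np.flipud(img)     # mirrored vertically (above/below)
    flip_h = np.fliplr(img)     # mirrored horizontally (left/right)
    flip_b = np.flipud(flip_h)  # mirrored on both axes (corners)
    top = np.hstack([flip_b, flip_v, flip_b])
    mid = np.hstack([flip_h, img, flip_h])
    return np.vstack([top, mid, top])
```

Because each surrounding tile is derived from the centre frame by a flip, only the single original frame needs to be stored; the mirrored tiles can be generated on demand.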

[00024] In accordance with another aspect of the invention, a visual perception view may be provided using the image previously created from multiple frames, by editing the final image to look like it is bending in towards the viewer, thereby creating an effect that the viewer would see if moving at the same rate as the camera as it moves through a pipe, conduit, or void.

[00025] In accordance with another aspect of the invention, smoothing of the transition between an actual image or image stream, and the background, may also include adjusting the focus view of the background to compensate for a camera or optical sensor which has dynamic zoom functionality.

[00026] Various additional aspects, features, and functionality are further described below in conjunction with the appended Drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

[00027] The present application may be more fully appreciated in connection with the following detailed description taken in conjunction with the accompanying drawings, wherein:

[00028] FIG. 1A is an illustration of an embodiment of an inspection system which may include video leveling computer animation transitions in keeping with the present disclosure.

[00029] FIG. 1B is a diagram of an embodiment of the inspection system from FIG. 1A.

[00030] FIG. 2A is an illustration of an embodiment of an inspection image relative to upright, earth normal orientation.

[00031] FIG. 2B is an illustration of an embodiment of a series of sequential rotation steps of a video leveling computer animation transition.

[00032] FIG. 3 is an embodiment of a method for rotation corrections that includes video leveling computer animation transitions.

[00033] FIG. 4A is a method for generating video leveling computer animation transitions from a single inspection image.

[00034] FIG. 4B is an embodiment of a method for generating video leveling computer animation transitions from multiple inspection images from different points in time and space in the inspection.

[00035] FIG. 5A is an illustration of an embodiment of misalignment between a pre-rotation GL surface as compared to the upright, earth normal orientation.

[00036] FIG. 5B is an illustration of an embodiment of the post-rotation alignment between the GL surface from FIG. 5A and the upright, earth normal orientation.

[00037] FIG. 6A is an illustration of an embodiment of a display interface demonstrating a computer animation transition from a single inspection image captured at one point in time in the inspection video.

[00038] FIG. 6B is an embodiment of a method for generating a computer animation transition from a single inspection image captured at one point in time in the inspection video.

[00039] FIG. 7A is an illustration of an embodiment of a display interface demonstrating a computer animation transition from multiple inspection images generated at multiple points in time and space in the inspection video.

[00040] FIG. 7B is an embodiment of a method for generating a computer animation transition from multiple inspection images generated at multiple points in time and space in the inspection video.

[00041] FIG. 8 is an embodiment of a method for leveling of an inspection video that includes computer animation transitions.

[00042] FIG. 9A is an illustration of an embodiment of misalignment between a pre-rotation GL surface that is a three dimensional shape matching the cylindrical shape of the inspected pipe as compared to the upright, earth normal orientation.

[00043] FIG. 9B is an illustration of an embodiment of the post-rotation alignment between the GL surface from FIG. 9A and the upright, earth normal orientation.

[00044] FIG. 10A is an illustration of an embodiment of misalignment between a pre-rotation GL surface that is a regular polygon shape as compared to the upright, earth normal orientation.

[00045] FIG. 10B is an illustration of an embodiment of the post-rotation alignment between the GL surface from FIG. 10A and the upright, earth normal orientation.

[00046] FIG. 11 is an illustration of an embodiment of various regular polygonal shapes of GL surfaces.

[00047] FIG. 12 is an embodiment of a method for leveling of an inspection video that includes computer animation transitions.

[00048] FIG. 13 is an embodiment of a method for editing an image to smooth the transition between the edges of the actual image and regions of a display that are filled with background using a blurring algorithm or other technique.

[00049] FIG. 14 is an illustration of an embodiment of rotating and scaling an image to be displayed.

[00050] FIG. 15 is an illustration of an embodiment of an image being displayed that does not completely cover the display area.

[00051] FIG. 16 is an illustration of an embodiment of an image before it is edited using a blurring algorithm or technique.

[00052] FIG. 17 is an illustration of an embodiment of an image after it is edited using a blurring algorithm or technique.

[00053] FIG. 18 is an embodiment of a method for editing an image created by mirroring a frame around itself to create a single composite image with smoother edge transitions by aligning the images of the created multiple frames.

[00054] FIG. 19 is an illustration of an embodiment of a composite image created by mirroring a single image around itself.

[00055] FIG. 20 is an illustration of an embodiment of a display view with image sensation of motion and depth in a pipe, void, or conduit.

DETAILED DESCRIPTION OF EMBODIMENTS

Overview

[00056] In accordance with one aspect of the present invention, a method for leveling of video that includes one or more computer animation transitions as used in inspection systems is disclosed. The method may include generating, via one or more image sensors disposed in an inspection camera, a video signal comprising a plurality of sequential inspection images and, via one or more orientation sensors disposed in the inspection camera, an orientation signal measuring the orientation of the inspection image relative to an upright, earth normal orientation. The method may also include receiving the video signal and orientation signal at a processing element having one or more processors. The method may further include determining, via the processing element, an orientation correction that describes rotations of the inspection image(s) in ninety degree increments about a centroid such that a corrected video includes orientation corrections that most closely resemble an upright, earth normal orientation. The method embodiments may further include generating a computer animation transition illustrating sequential rotation steps in the degree and direction of the orientation correction rotations comprising one or more inspection images from the video signal at one or more points in time and space in the inspection video. In some method embodiments, the rotation correction may use the same inspection image at each sequential rotation step. In other embodiments, the method may use different inspection images at each rotation step from different points in time or space in the inspection. In some such embodiments, each different inspection image may be the current inspection image generated at the inspection camera. The method may include cropping, reducing in size, stretching, or otherwise resizing the inspection image to fit the display interface on the electronic display device at each rotation interval.
Method embodiments may further include outputting a corrected video signal that includes an upright, earth normal oriented video along the most upright ninety degree rotation increment that further includes one or more computer animation transitions where orientation correction rotations occur. The method may further include displaying, on one or more electronic display devices, the corrected upright video having computer animation transitions from the corrected video signal.

[00057] In accordance with one aspect of the present invention, a method for video leveling computer animation transitions as used in inspection systems is disclosed. The method may include generating, via one or more image sensors disposed in an inspection camera, a video signal comprising a plurality of sequential inspection images and, via one or more orientation sensors disposed in the inspection camera, an orientation signal measuring the orientation of the inspection images captured by the inspection camera relative to an upright, earth normal orientation. It should be noted that orientations other than an upright, earth normal orientation could be used, if so desired.

[00058] The method may also include receiving the video signal and orientation signal at a processing element having one or more processors which may include one or more graphical processing units (GPUs).

[00059] Further, the method may include determining a graphics library (GL) surface mapped on the processing element onto which the inspection images are rendered such that the GL surface represents a field of view captured in the video signal from the inspection camera that is or is made to be square in shape. Here, graphics library refers broadly to scaling, translation, rotation, and other common transformations and techniques performed in image processing. Inspection images may be cropped, stretched, reduced in size, or otherwise resized to fit the GL surface. Images may likewise be augmented in other ways to enhance suitability for continuous rotation, such as extending or extrapolating boundaries by computational photography or artificial intelligence techniques. Such extensions may be obtained by operations on previously acquired images, or by a machine learning model.
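Of the resizing options listed above, a centre crop to a square is perhaps the simplest. The sketch below is a hypothetical helper for illustration, not the disclosed GL-surface mapping:

```python
import numpy as np

def to_square(frame: np.ndarray) -> np.ndarray:
    """Centre-crop a frame to a square so it can be rendered onto a
    square GL surface. Illustrative sketch only; cropping is one of
    several options (stretching, padding, boundary extension)."""
    h, w = frame.shape[:2]
    side = min(h, w)
    top = (h - side) // 2
    left = (w - side) // 2
    return frame[top:top + side, left:left + side]
```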

[00060] The method may further include determining, via the processing element, an orientation correction that describes rotations of the GL surface in ninety degree increments about a centroid such that a corrected video includes orientation corrections that most closely resembles an upright, earth normal orientation. In some method embodiments in keeping with the present disclosure, the GL surface may be a three dimensional shape matching the cylindrical shape of the inspected pipe and rotation corrections occur about the axis of the cylindrical pipe shape. In further method embodiments in keeping with the present disclosure, the GL surface may be any regular polygon in shape and such orientation corrections may occur at increments equal to the central angle measurement of the polygonal shape about a centroid thereof. The method embodiments may further include generating a computer animation transition illustrating sequential rotation steps in the degree and direction of orientation correction rotations comprising one or more inspection images from one or more points in times and space in the inspection video. Method embodiments may further include outputting a corrected video signal that includes an upright, earth normal oriented video along the most upright ninety degree rotation increment. Each rotation correction of the corrected video may further include one or more computer animation transitions where orientation correction rotations occur. Such methods of the present disclosure may further include displaying, on one or more electronic display devices, the corrected upright video having computer animation transitions from the corrected video signal. The corrected upright video having computer animation transitions and associated data may optionally be stored in one or more non-transitory memories.
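The generalization from the square's ninety-degree increments to a regular polygon's central angle can be sketched as follows. For an n-sided regular polygon the central angle is 360/n degrees; the helper name is illustrative, not from the disclosure.

```python
def polygon_correction(roll_deg: float, n_sides: int) -> float:
    """Snap a measured roll angle to the nearest multiple of a
    regular polygon's central angle (360/n degrees). With n_sides=4
    this reduces to the square GL surface's ninety-degree case.
    Illustrative sketch only."""
    central = 360.0 / n_sides
    return (round((roll_deg % 360.0) / central) * central) % 360.0
```

For example, a 100-degree roll snaps to 90 degrees on a square surface but to 120 degrees on a hexagonal one, since the hexagon's central angle is 60 degrees.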

[00061] In another aspect, in some method embodiments the orientation corrections are generated from a single inspection image from the video signal from a single point in time and space in the inspection video. In other method embodiments, orientation corrections may be generated from a plurality of inspection images from the video signal captured at successive points in time and space.

[00062] In another aspect, the method may be carried out and displayed in real-time or near real-time. In other method embodiments, the corrected upright video having computer animation transitions may be stored in one or more non-transitory memories and the method may be carried out in post-processing in one or more electronic display devices.

[00063] In another aspect, the display interface may allow the inspection image to rotate independently of other non-rotating additional information during computer animation transitions for leveling video to an upright, earth normal orientation. For instance, in some such embodiments inspection data may appear in such a non-rotating display area (e.g., information regarding status of system devices, battery power gauges, system device temperature, in-pipe or other inspection environment temperatures, mapping or position of the inspection camera, or the like).

[00064] In accordance with another aspect of the invention, an inspection system including video leveling computer animation transitions is disclosed. The inspection system may include an inspection camera having an imaging element that includes one or more imaging sensors positioned behind one or more lenses and disposed in a housing for generating a video signal. The inspection camera may further include an orientation element having one or more orientation sensors for generating an orientation signal describing the orientation of the field of view of the video relative to an upright, earth normal orientation.

[00065] The inspection system may include an electronic display device having a processing element that includes one or more processors receiving the video signals and orientation signals from the inspection camera and may be configured to output a corrected video signal having a corrected, upright, earth normal video that includes one or more video leveling computer animation transitions. In some embodiments the processing element may include one or more graphical processor units (GPUs). Further, in some embodiments, such a processing element may be disposed in, and processing of data may fully or partially be carried out in, a remotely connected computing device. The display device may further include a memory element having one or more non-transitory memories for storing instructions related to generating a corrected video signal including one or more video leveling computer animation transitions and the resulting corrected video signal including one or more video leveling computer animation transitions. The electronic display device may include a display interface to display the corrected, upright, earth normal video including one or more video leveling computer animation transitions of the corrected video signal.

[00066] Details of example devices, systems, and methods that may be used in or combined with the computer animation transition methods and system in the leveling of a video to an upright, earth normal orientation as used in pipe inspections embodiments described herein, are disclosed in co-assigned patents and patent applications including: United States Patent 5,808,239, issued August 17, 1999, entitled VIDEO PUSH-CABLE; United States Patent 6,545,704, issued July 7, 1999, entitled VIDEO PIPE INSPECTION DISTANCE MEASURING SYSTEM; United States Patent 6,831,679, issued December 14, 2004, entitled VIDEO CAMERA HEAD WITH THERMAL FEEDBACK LIGHTING CONTROL; United States Patent 6,958,767, issued October 25, 2005, entitled VIDEO PIPE INSPECTION SYSTEM EMPLOYING NON-ROTATING CABLE STORAGE DRUM; United States Patent 6,862,945, issued March 8, 2005, entitled CAMERA GUIDE FOR VIDEO PIPE INSPECTION SYSTEM; United States Patent 7,009,399, issued March 7, 2006, entitled OMNIDIRECTIONAL SONDE AND LINE LOCATOR; United States Patent 7,136,765, issued November 14, 2006, entitled A BURIED OBJECT LOCATING AND TRACING METHOD AND SYSTEM EMPLOYING PRINCIPAL COMPONENTS ANALYSIS FOR BLIND SIGNAL DETECTION; United States Patent 7,221,136, issued May 22, 2007, entitled SONDES FOR LOCATING UNDERGROUND PIPES AND CONDUITS; United States Patent 7,276,910, issued October 2, 2007, entitled A COMPACT SELF-TUNED ELECTRICAL RESONATOR FOR BURIED OBJECT LOCATOR APPLICATIONS; United States Patent 7,288,929, issued October 30, 2007, entitled INDUCTIVE CLAMP FOR APPLYING SIGNAL TO BURIED UTILITIES; United States Patent 7,298,126, issued November 20, 2007, entitled SONDES FOR LOCATING UNDERGROUND PIPES AND CONDUITS; United States Patent 7,332,901, issued February 19, 2008, entitled LOCATOR WITH APPARENT DEPTH INDICATION; United States Patent 7,336,078, issued February 26, 2008, entitled MULTI-SENSOR MAPPING OMNIDIRECTIONAL SONDE AND LINE LOCATOR; United States Patent 7,498,797, issued March 3, 2009, 
entitled LOCATOR WITH CURRENT-MEASURING CAPABILITY; United States Patent 7,498,816, issued March 3, 2009, entitled OMNIDIRECTIONAL SONDE AND LINE LOCATOR; United States Patent 7,518,374, issued April 14, 2009, entitled RECONFIGURABLE PORTABLE LOCATOR EMPLOYING MULTIPLE SENSOR ARRAYS HAVING FLEXIBLE NESTED ORTHOGONAL ANTENNAS; United States Patent 7,557,559, issued July 7, 2009, entitled COMPACT LINE ILLUMINATOR FOR BURIED PIPES AND CABLES; United States Patent 7,619,516, issued November 17, 2009, entitled SINGLE AND MULTI-TRACE OMNIDIRECTIONAL SONDE AND LINE LOCATORS AND TRANSMITTER USED THEREWITH; United States Patent 7,619,516, issued November 17, 2009, entitled SINGLE AND MULTI-TRACE OMNIDIRECTIONAL SONDE AND LINE LOCATORS AND TRANSMITTER USED THEREWITH; United States Patent 7,733,077, issued June 8, 2010, entitled MULTI-SENSOR MAPPING OMNIDIRECTIONAL SONDE AND LINE LOCATORS AND TRANSMITTER USED THEREWITH; United States Patent 7,741,848, issued June 22, 2010, entitled ADAPTIVE MULTICHANNEL LOCATOR SYSTEM FOR MULTIPLE PROXIMITY DETECTION; United States Patent 7,755,360, issued July 13,

2010, entitled PORTABLE LOCATOR SYSTEM WITH JAMMING REDUCTION; United States Patent 7,825,647, issued November 2, 2010, entitled METHOD FOR LOCATING BURIED PIPES AND CABLES; United States Patent 7,830,149, issued November 9, 2010, entitled AN UNDERGROUND UTILITY LOCATOR WITH A TRANSMITTER, A PAIR OF UPWARDLY OPENING POCKET AND HELICAL COIL TYPE ELECTRICAL CORDS; United States Patent 7,864,980, issued January 4,2011, entitled SONDES FOR LOCATING UNDERGROUND PIPES AND CONDUITS; United States Patent 7,948,236, issued May 24,
2011, entitled ADAPTIVE MULTICHANNEL LOCATOR SYSTEM FOR MULTIPLE PROXIMITY DETECTION; United States Patent 7,969,151, issued June 28, 2011, entitled PRE-AMPLIFIER AND MIXER CIRCUITRY FOR A LOCATOR ANTENNA; United States Patent 7,990,151, issued August 2, 2011, entitled TRI-POD BURIED LOCATOR SYSTEM; United States Patent 8,013,610, issued September 6, 2011, entitled HIGH Q SELF-TUNING LOCATING TRANSMITTER; United States Patent 8,035,390, issued October 11, 2011, entitled OMNIDIRECTIONAL SONDE AND LINE LOCATOR; United States Patent 8,106,660, issued January 31, 2012, entitled SONDE ARRAY FOR USE WITH BURIED LINE LOCATOR; United States Patent 8,203,343, issued June 19, 2012, entitled RECONFIGURABLE PORTABLE LOCATOR EMPLOYING MULTIPLE SENSOR ARRAYS HAVING FLEXIBLE NESTED ORTHOGONAL ANTENNAS; United States Patent 8,248,056, issued August 21, 2012, entitled A BURIED OBJECT LOCATOR SYSTEM EMPLOYING AUTOMATED VIRTUAL DEPTH EVENT DETECTION AND SIGNALING; United States Patent 8,264,226, issued September 11, 2012, entitled SYSTEM AND METHOD FOR LOCATING BURIED PIPES AND CABLES WITH A MAN PORTABLE LOCATOR AND A TRANSMITTER IN A MESH NETWORK; United States Patent 8,289,385, issued October 16,
2012, entitled PUSH-CABLE FOR PIPE INSPECTION SYSTEM; United States Patent Application 13/769,202, filed February 15, 2013, entitled SMART PAINT STICK DEVICES AND METHODS; United States Patent Application 13/793,168, filed March 11, 2013, entitled BURIED OBJECT LOCATORS WITH CONDUCTIVE ANTENNA BOBBINS; United States Patent 8,395,661, issued March 12, 2013, entitled PIPE INSPECTION SYSTEM WITH SELECTIVE IMAGE CAPTURE; United States Patent 8,400,154, issued March 19, 2013, entitled LOCATOR ANTENNA WITH CONDUCTIVE BOBBIN; United States Patent Application 14/027,027, filed September 13, 2013, entitled SONDE DEVICES INCLUDING A SECTIONAL FERRITE CORE STRUCTURE; United States Patent Application 14/033,349, filed September 20, 2013, entitled AN UNDERGROUND UTILITY LOCATOR WITH A TRANSMITTER, A PAIR OF UPWARDLY OPENING POCKET AND HELICAL COIL TYPE ELECTRICAL CORDS; United States Patent 8,547,428, issued October 1, 2013, entitled PIPE MAPPING SYSTEM; United States Patent 8,564,295, issued October 22, 2013, entitled METHOD FOR SIMULTANEOUSLY DETERMINING A PLURALITY OF DIFFERENT LOCATIONS OF THE BURIED OBJECTS AND SIMULTANEOUSLY INDICATING THE DIFFERENT LOCATIONS TO A USER; United States Patent 8,587,648, issued November 19,
2013, entitled SELF-LEVELING CAMERA HEAD; United States Patent Application 14/148,649, filed January 6, 2014, entitled MAPPING LOCATING SYSTEMS & METHODS; United States Patent 8,635,043, issued January 21, 2014, entitled LOCATOR AND TRANSMITTER CALIBRATION SYSTEM; United States Patent 8,717,028, issued May 6,
2014, entitled SPRING CLIPS FOR USE WITH LOCATING TRANSMITTERS; United States Patent 8,773,133, issued July 8, 2014, entitled ADAPTIVE MULTICHANNEL LOCATOR SYSTEM FOR MULTIPLE PROXIMITY DETECTION; United States Patent 8,841,912, issued September 23, 2014, entitled PRE-AMPLIFIER AND MIXER CIRCUITRY FOR A LOCATOR ANTENNA; United States Patent 8,908,027, issued December 9, 2014, entitled ASYMMETRIC DRAG FORCE BEARING FOR USE WITH A PUSH-CABLE STORAGE DRUM; United States Patent 8,970,211, issued March 3, 2015, entitled PIPE INSPECTION CABLE COUNTER AND OVERLAY MANAGEMENT SYSTEM; United States Patent 8,984,698, issued March 24, 2015, entitled LIGHT WEIGHT SEWER CABLE; United States Patent 9,041,794, issued May 26, 2015, entitled PIPE MAPPING SYSTEMS AND METHODS; United States Patent 9,057,754, issued June 16, 2015, entitled ECONOMICAL MAGNETIC LOCATOR APPARATUS AND METHOD; United States Patent 9,066,446, issued June 23, 2015, entitled THERMAL EXTRACTION ARCHITECTURE FOR CAMERA HEADS, INSPECTION SYSTEMS, AND OTHER DEVICES AND SYSTEMS; United States Patent 9,081,109, issued July 14, 2015, entitled GROUND-TRACKING DEVICES FOR USE WITH A MAPPING LOCATOR; United States Patent 9,080,992, issued July 14, 2015, entitled ADJUSTABLE VARIABLE RESOLUTION INSPECTION SYSTEMS AND METHODS; United States Patent 9,082,269, issued July 14, 2015, entitled HAPTIC DIRECTIONAL FEEDBACK HANDLES FOR LOCATION DEVICES; United States Patent 9,085,007, issued July 21, 2015, entitled MARKING PAINT APPLICATOR FOR PORTABLE LOCATOR; United States Patent 9,207,350, issued December 8, 2015, entitled BURIED OBJECT LOCATOR APPARATUS WITH SAFETY LIGHTING ARRAY; United States Patent 9,222,809, issued December 29,
2015, entitled PORTABLE PIPE INSPECTION SYSTEMS AND APPARATUS; United States Patent 9,341,740, issued May 17, 2016, entitled OPTICAL GROUND TRACKING APPARATUS, SYSTEMS, AND METHODS; United States Patent 9,372,117, issued June 21,
2016, entitled OPTICAL GROUND TRACKING APPARATUS, SYSTEMS, AND METHODS; United States Patent Application 15/187,785, filed June 21, 2016, entitled BURIED UTILITY LOCATOR GROUND TRACKING APPARATUS, SYSTEMS, AND METHODS; United States Patent 9,411,066, issued August 9, 2016, entitled SONDES & METHODS FOR USE WITH BURIED LINE LOCATOR SYSTEMS; United States Patent 9,411,067, issued August 9, 2016, entitled GROUND-TRACKING SYSTEMS AND APPARATUS; United States Patent 9,435,907, issued September 6, 2016, entitled PHASE SYNCHRONIZED BURIED OBJECT LOCATOR APPARATUS, SYSTEMS, AND METHODS; United States Patent 9,448,376, issued September 20, 2016, entitled HIGH BANDWIDTH PUSH-CABLES FOR VIDEO PIPE INSPECTION SYSTEMS; United States Patent 9,465,129, issued October 11, 2016, entitled IMAGE-BASED MAPPING LOCATING SYSTEM; United States Patent 9,468,954, issued October 18, 2016, entitled PIPE INSPECTION SYSTEM WITH JETTER PUSH-CABLE; United States Patent 9,477,147, issued October 25, 2016, entitled SPRING ASSEMBLIES WITH VARIABLE FLEXIBILITY FOR USE WITH PUSH-CABLES AND PIPE INSPECTION SYSTEMS; United States Patent 9,488,747, issued November 8, 2016, entitled GRADIENT ANTENNA COILS AND ARRAYS FOR USE IN LOCATING SYSTEMS; United States Patent 9,494,706, issued November 15, 2016, entitled OMNI-INDUCER TRANSMITTING DEVICES AND METHODS; United States Patent 9,521,303, issued December 13, 2016, entitled CABLE STORAGE DRUM WITH MOVABLE CCU DOCKING APPARATUS; United States Patent Application 15/846,102, filed December 16, 2016, entitled SYSTEMS AND METHODS FOR ELECTRONICALLY MARKING, LOCATING AND VIRTUALLY DISPLAYING BURIED UTILITIES; United States Patent Application 15/866,360, filed January 9, 2017, entitled TRACKABLE DISTANCE MEASURING DEVICES, SYSTEMS, AND METHODS; United States Patent 9,927,368, issued March 27, 2018, entitled SELF-LEVELING INSPECTION SYSTEMS AND METHODS; United States Patent 9,571,326, issued February 14, 2017, entitled METHOD AND APPARATUS FOR HIGH-SPEED DATA TRANSFER
EMPLOYING SELF-SYNCHRONIZING QUADRATURE AMPLITUDE MODULATION (QAM); United States Patent 9,599,449, issued March 21, 2017, entitled SYSTEMS AND METHODS FOR LOCATING BURIED OR HIDDEN OBJECTS USING SHEET CURRENT FLOW MODELS; United States Patent 9,599,740, issued March 21, 2017, entitled USER INTERFACES FOR UTILITY LOCATORS; United States Patent 9,625,602, issued April 18, 2017, entitled SMART PERSONAL COMMUNICATION DEVICES AS USER INTERFACES; United States Patent 9,632,202, issued April 25, 2017, entitled ECONOMICAL MAGNETIC LOCATOR APPARATUS AND METHODS; United States Patent 9,634,878, issued April 25, 2017, entitled SYSTEMS AND METHODS FOR DATA TRANSFER USING SELF-SYNCHRONIZING QUADRATURE AMPLITUDE MODULATION (QAM); United States Patent Application, filed April 25, 2017, entitled SYSTEMS AND METHODS FOR LOCATING AND/OR MAPPING BURIED UTILITIES USING VEHICLE-MOUNTED LOCATING DEVICES; United States Patent 9,638,824, issued May 2, 2017, entitled QUAD-GRADIENT COILS FOR USE IN LOCATING SYSTEMS; United States Patent Application, filed May 9, 2017, entitled BORING INSPECTION SYSTEMS AND METHODS; United States Patent 9,651,711, issued May 16, 2017, entitled HORIZONTAL BORING INSPECTION DEVICE AND METHODS; United States Patent 9,684,090, issued June 20, 2017, entitled NULLED-SIGNAL LOCATING DEVICES, SYSTEMS, AND METHODS; United States Patent 9,696,447, issued July 4, 2017, entitled BURIED OBJECT LOCATING METHODS AND APPARATUS USING MULTIPLE ELECTROMAGNETIC SIGNALS; United States Patent 9,696,448, issued July 4, 2017, entitled GROUND-TRACKING DEVICES AND METHODS FOR USE WITH A UTILITY LOCATOR; United States Patent 9,703,002, issued June 11, 2017, entitled UTILITY LOCATOR SYSTEMS & METHODS; United States Patent Application 15/670,845, filed August 7, 2016, entitled HIGH FREQUENCY AC -POWERED DRAIN CLEANING AND INSPECTION APPARATUS & METHODS; United States Patent Application 15/681,250, filed August 18, 2017, entitled ELECTRONIC MARKER DEVICES AND SYSTEMS; United States 
Patent Application 15/681,409, filed August 20, 2017, entitled WIRELESS BURIED PIPE & CABLE LOCATING SYSTEMS; United States Patent 9,746,572, issued August 29, 2017, entitled ELECTRONIC MARKER DEVICES AND SYSTEMS; United States Patent 9,746,573, issued August 29, 2017, entitled WIRELESS BURIED PIPE AND CABLE LOCATING SYSTEMS; United States Patent 9,769,366, issued September 29, 2017, entitled SELF-GROUNDING TRANSMITTER PORTABLE CAMERA CONTROLLER FOR USE WITH PIPE INSPECTION SYSTEMS; United States Patent 9,784,837, issued October 10, 2017, entitled OPTICAL GROUND TRACKING APPARATUS, SYSTEMS & METHODS; United States Patent 9,798,033, issued October 24, 2017, entitled SONDE DEVICES INCLUDING A SECTIONAL FERRITE CORE; United States Patent Application 15/811,361, filed November 13, 2017, entitled OPTICAL GROUND-TRACKING APPARATUS, SYSTEMS, AND METHODS; United States Patent 9,835,564, issued December 5, 2017, entitled MULTICAMERA PIPE INSPECTION APPARATUS, SYSTEMS, AND METHODS; United States Patent 9,841,503, issued December 12, 2017, entitled OPTICAL GROUND-TRACKING APPARATUS, SYSTEMS, AND METHODS; United States Patent Application 15/846,102, filed December 18, 2017, entitled SYSTEMS AND METHODS FOR ELECTRONICALLY MARKING, LOCATING AND VIRTUALLY DISPLAYING BURIED UTILITIES; United States Patent Application 15/866,360, filed January 9, 2018, entitled TRACKED DISTANCE MEASURING DEVICES, SYSTEMS, AND METHODS; United States Patent Application 16/255,524, filed January 23, 2018, entitled RECHARGEABLE BATTERY PACK ONBOARD CHARGE STATE INDICATION METHODS AND APPARATUS; United States Patent 9,891,337, issued February 13, 2018, entitled UTILITY LOCATOR TRANSMITTER DEVICES, SYSTEMS, AND METHODS WITH DOCKABLE APPARATUS; United States Patent 9,914,157, issued March 13, 2018, entitled METHODS AND APPARATUS FOR CLEARING OBSTRUCTIONS WITH A JETTER PUSH-CABLE APPARATUS; United States Patent Application 15/925,643, filed March 19, 2018, entitled PHASE-SYNCHRONIZED BURIED OBJECT
TRANSMITTER AND LOCATOR METHODS AND APPARATUS; United States Patent Application 15/925,671, filed March 19, 2018, entitled MULTIFREQUENCY LOCATING SYSTEMS AND METHODS; United States Patent Application 15/936,250, filed March 26, 2018, entitled GROUND TRACKING APPARATUS, SYSTEMS, AND METHODS; United States Patent 9,927,545, issued March 27, 2018, entitled MULTIFREQUENCY LOCATING SYSTEMS & METHODS; United States Patent 9,928,613, issued March 27, 2018, entitled GROUND TRACKING APPARATUS, SYSTEMS, AND METHODS; United States Patent Application 15/250,666, filed March 27, 2018, entitled PHASE-SYNCHRONIZED BURIED OBJECT TRANSMITTER AND LOCATOR METHODS AND APPARATUS; United States Patent 9,880,309, issued March 28, 2018, entitled UTILITY LOCATOR TRANSMITTER APPARATUS & METHODS; United States Patent Application 16/382,136, filed April 11, 2018, entitled GEOGRAPHIC MAP UPDATING METHODS AND SYSTEMS; United States Patent Application 15/954,486, filed April 16, 2018, entitled UTILITY LOCATOR APPARATUS, SYSTEMS, AND METHODS; United States Patent 9,945,976, issued April 17, 2018, entitled UTILITY LOCATOR APPARATUS, SYSTEMS, AND METHODS; United States Patent 9,959,641, issued May 1, 2018, entitled METHODS AND SYSTEMS FOR SEAMLESS TRANSITIONING IN INTERACTIVE MAPPING SYSTEMS; United States Patent 9,989,662, issued June 5, 2018, entitled BURIED OBJECT LOCATING DEVICE WITH A PLURALITY OF SPHERICAL SENSOR BALLS THAT INCLUDE A PLURALITY OF ORTHOGONAL ANTENNAE; United States Patent Application 16/443,789, filed June 18, 2018, entitled MULTI-DIELECTRIC COAXIAL PUSH-CABLES AND ASSOCIATED APPARATUS; United States Patent 10,001,425, issued June 19, 2018, entitled PORTABLE CAMERA CONTROLLER PLATFORM FOR USE WITH PIPE INSPECTION SYSTEM; United States Patent 10,009,582, issued June 26, 2018, entitled PIPE INSPECTION SYSTEM WITH REPLACEABLE CABLE STORAGE DRUM; United States Patent Application 16/036,713, filed July 16, 2018, entitled UTILITY LOCATOR APPARATUS AND SYSTEMS; United States
Patent 10,027,526, issued July 17, 2018, entitled METHOD AND APPARATUS FOR HIGH-SPEED DATA TRANSFER EMPLOYING SELF-SYNCHRONIZING QUADRATURE AMPLITUDE MODULATION; United States Patent 10,024,994, issued July 17, 2018, entitled WEARABLE MAGNETIC FIELD UTILITY LOCATOR SYSTEM WITH SOUND FIELD GENERATION; United States Patent 10,031,253, issued July 24, 2018, entitled GRADIENT ANTENNA COILS AND ARRAYS FOR USE IN LOCATING SYSTEMS; United States Patent 10,042,072, issued August 7, 2018, entitled OMNI-INDUCER TRANSMITTING DEVICES AND METHODS; United States Patent 10,059,504, issued August 28, 2018, entitled MARKING PAINT APPLICATOR FOR USE WITH PORTABLE UTILITY LOCATOR; United States Patent Application 16/049,699, filed July 30, 2018, entitled OMNI-INDUCER TRANSMITTING DEVICES AND METHODS; United States Patent 10,069,667, issued September 4, 2018, entitled SYSTEMS AND METHODS FOR DATA TRANSFER USING SELF-SYNCHRONIZING QUADRATURE AMPLITUDE MODULATION (QAM); United States Patent Application 16/121,379, filed September 4, 2018, entitled KEYED CURRENT SIGNAL UTILITY LOCATING SYSTEMS AND METHODS; United States Patent Application 16/125,768, filed September 10, 2018, entitled BURIED OBJECT LOCATOR APPARATUS AND METHODS; United States Patent 10,073,186, issued September 11, 2018, entitled KEYED CURRENT SIGNAL UTILITY LOCATING SYSTEMS AND METHODS; United States Patent Application 16/133,642, filed September 17, 2018, entitled MAGNETIC UTILITY LOCATOR DEVICES AND METHODS; United States Patent 10,078,149, issued September 18, 2018, entitled BURIED OBJECT LOCATORS WITH DODECAHEDRAL ANTENNA NODES; United States Patent 10,082,591, issued September 25, 2018, entitled MAGNETIC UTILITY LOCATOR DEVICES & METHODS; United States Patent 10,082,599, issued September 25, 2018, entitled MAGNETIC SENSING BURIED OBJECT LOCATOR INCLUDING A CAMERA; United States Patent 10,090,498, issued October 2, 2018, entitled MODULAR BATTERY PACK APPARATUS, SYSTEMS, AND METHODS INCLUDING VIRAL DATA AND/OR CODE
TRANSFER; United States Patent Application 16/160,874, filed October 15, 2018, entitled TRACKABLE DIPOLE DEVICES, METHODS, AND SYSTEMS FOR USE WITH MARKING PAINT STICKS; United States Patent 10,100,507, issued October 16, 2018, entitled PIPE CLEARING CABLES AND APPARATUS; United States Patent 10,105,723, issued October 23, 2018, entitled TRACKABLE DIPOLE DEVICES, METHODS, AND SYSTEMS FOR USE WITH MARKING PAINT STICKS; United States Patent Application 16/222,994, filed December 17,
2018, entitled UTILITY LOCATORS WITH RETRACTABLE SUPPORT STRUCTURES AND APPLICATIONS THEREOF; United States Patent 10,162,074, issued December 25, 2018, entitled UTILITY LOCATORS WITH RETRACTABLE SUPPORT STRUCTURES AND APPLICATIONS THEREOF; United States Patent Application 16/241,864, filed January 7,
2019, entitled TRACKED DISTANCE MEASURING DEVICES, SYSTEMS, AND METHODS; United States Patent Application 16/255,524, filed January 23, 2019, entitled RECHARGEABLE BATTERY PACK ONBOARD CHARGE STATE INDICATION METHODS AND APPARATUS; United States Patent Application 16/810,788, filed March 5, 2019, entitled MAGNETICALLY RETAINED DEVICE HANDLES; United States Patent 10,247,845, issued April 2, 2019, entitled UTILITY LOCATOR TRANSMITTER APPARATUS AND METHODS; United States Patent Application 16/382,136, filed April 11, 2019, entitled GEOGRAPHIC MAP UPDATING METHODS AND SYSTEMS; United States Patent 10,274,632, issued April 20, 2019, entitled UTILITY LOCATING SYSTEMS WITH MOBILE BASE STATION; United States Patent Application 16/390,967, filed April 22, 2019, entitled UTILITY LOCATING SYSTEMS WITH MOBILE BASE STATION; United States Patent 10,288,997, issued May 14, 2019, entitled ROTATING CONTACT ASSEMBLIES FOR SELF-LEVELING CAMERA HEADS; United States Patent Application 29/692,937, filed May 29, 2019, entitled BURIED OBJECT LOCATOR; United States Patent Application 16/436,903, filed June 10, 2019, entitled OPTICAL GROUND TRACKING APPARATUS, SYSTEMS, AND METHODS FOR USE WITH BURIED UTILITY LOCATORS; United States Patent 10,317,559, issued June 11, 2019, entitled GROUND-TRACKING DEVICES AND METHODS FOR USE WITH A UTILITY LOCATOR; United States Patent Application 16/449,187, filed June 21, 2019, entitled ELECTROMAGNETIC MARKER DEVICES FOR BURIED OR HIDDEN USE; United States Patent Application 16/455,491, filed June 27, 2019, entitled SELFSTANDING MULTI-LEG ATTACHMENT DEVICES FOR USE WITH UTILITY LOCATORS; United States Patent 10,353,103, issued July 16, 2019, entitled SELF-STANDING MULTI-LEG ATTACHMENT DEVICES FOR USE WITH UTILITY LOCATORS; United States Patent 10,371,305, issued August 6, 2019, entitled DOCKABLE TRIPODAL CAMERA CONTROL UNIT; United States Patent Application 16/551,653, filed August 26, 2019, entitled BURIED UTILITY MARKER DEVICES, SYSTEMS, AND METHODS; 
United States Patent 10,401,526, issued September 3, 2019, entitled BURIED UTILITY MARKER DEVICES, SYSTEMS, AND METHODS; United States Patent 10,324,188, issued October 9, 2019, entitled OPTICAL GROUND TRACKING APPARATUS, SYSTEMS, AND METHODS FOR USE WITH BURIED UTILITY LOCATORS; United States Patent Application 16/446,456, filed June 19, 2019, entitled DOCKABLE TRIPODAL CAMERA CONTROL UNIT; United States Patent Application 16/520,248, filed July 23, 2019, entitled MODULAR BATTERY PACK APPARATUS, SYSTEMS, AND METHODS; United States Patent Application 16/559,576, filed September 3, 2019, entitled VIDEO PIPE INSPECTION SYSTEMS WITH VIDEO INTEGRATED WITH ADDITIONAL SENSOR DATA; United States Patent Application 16/588,834, filed September 30, 2019, entitled VIDEO INSPECTION SYSTEM WITH WIRELESS ENABLED CABLE STORAGE DRUM; United States Patent 10,440,332, issued October 8, 2019, entitled INSPECTION CAMERA DEVICES AND METHODS WITH SELECTIVELY ILLUMINATED MULTISENSOR IMAGING; United States Patent Application 16/676,292, filed November 6, 2019, entitled ROBUST IMPEDANCE CONTROLLED SLIP RINGS; United States Patent 10,490,908, issued November 26, 2019, entitled DUAL ANTENNA SYSTEMS WITH VARIABLE POLARIZATION; United States Patent Application 16/701,085, filed December 2, 2019, entitled MAP GENERATION BASED ON UTILITY LINE POSITION AND ORIENTATION ESTIMATES; United States Patent 10,534,105, issued January 14, 2020, entitled UTILITY LOCATING TRANSMITTER APPARATUS AND METHODS; United States Patent Application 16/773,952, filed January 27, 2020, entitled MAGNETIC FIELD CANCELING AUDIO DEVICES; United States Patent Application 16/780,813, filed February 3, 2020, entitled RESILIENTLY DEFORMABLE MAGNETIC FIELD CORE APPARATUS AND APPLICATIONS; United States Patent 10,555,086, issued February 4, 2020, entitled MAGNETIC FIELD CANCELING AUDIO SPEAKERS FOR USE WITH BURIED UTILITY
LOCATORS OR OTHER DEVICES; United States Patent Application 16/786,935, filed February 10, 2020, entitled SYSTEMS AND METHODS FOR UNIQUELY IDENTIFYING BURIED UTILITIES IN A MULTI-UTILITY ENVIRONMENT; United States Patent 10,557,824, issued February 11, 2020, entitled RESILIENTLY DEFORMABLE MAGNETIC FIELD TRANSMITTER CORES FOR USE WITH UTILITY LOCATING DEVICES AND SYSTEMS; United States Patent Application 16/791,979, filed February 14, 2020, entitled MARKING PAINT APPLICATOR APPARATUS; United States Patent Application 16/792,047, filed February 14, 2020, entitled SATELLITE AND MAGNETIC FIELD SONDE APPARATUS AND METHODS; United States Patent 10,564,309, issued February 18, 2020, entitled SYSTEMS AND METHODS FOR UNIQUELY IDENTIFYING BURIED UTILITIES IN A MULTI-UTILITY ENVIRONMENT; United States Patent 10,571,594, issued February 25, 2020, entitled UTILITY LOCATOR DEVICES, SYSTEMS, AND METHODS WITH SATELLITE AND MAGNETIC FIELD SONDE ANTENNA SYSTEMS; United States Patent 10,569,952, issued February 25, 2020, entitled MARKING PAINT APPLICATOR FOR USE WITH PORTABLE UTILITY LOCATOR; United States Patent Application 16/827,672, filed March 23, 2020, entitled DUAL ANTENNA SYSTEMS WITH VARIABLE POLARIZATION; United States Patent Application 16/833,426, filed March 27, 2020, entitled LOW COST, HIGH PERFORMANCE SIGNAL PROCESSING IN A MAGNETIC-FIELD SENSING BURIED UTILITY LOCATOR SYSTEM; United States Patent 10,608,348, issued March 31, 2020, entitled DUAL ANTENNA SYSTEMS WITH VARIABLE POLARIZATION; United States Patent Application 16/837,923, filed April 1, 2020, entitled MODULAR BATTERY PACK APPARATUS, SYSTEMS, AND METHODS INCLUDING VIRAL DATA AND/OR CODE TRANSFER; United States Patent Application 17/235,507, filed April 20, 2021, entitled UTILITY LOCATING DEVICES EMPLOYING MULTIPLE SPACED APART GNSS ANTENNAS; United States Provisional Patent Application 63/015,692, filed April 27, 2020, entitled SPATIALLY AND PROCESSING-BASED DIVERSE REDUNDANCY FOR RTK POSITIONING; United
States Patent Application 16/872,362, filed May 11, 2020, entitled BURIED LOCATOR SYSTEMS AND METHODS; United States Patent Application 16/882,719, filed May 25, 2020, entitled UTILITY LOCATING SYSTEMS, DEVICES, AND METHODS USING RADIO BROADCAST SIGNALS; United States Patent 10,670,766, issued June 2, 2020, entitled UTILITY LOCATING SYSTEMS, DEVICES, AND METHODS USING RADIO BROADCAST SIGNALS; United States Patent 10,677,820, issued June 9, 2020, entitled BURIED LOCATOR SYSTEMS AND METHODS; United States Patent Application 16/902,245, filed June 15, 2020, entitled LOCATING DEVICES, SYSTEMS, AND METHODS USING FREQUENCY SUITES FOR UTILITY DETECTION; United States Patent Application 16/902,249, filed June 15, 2020, entitled USER INTERFACES FOR UTILITY LOCATORS; United States Provisional Patent Application 63/212,713, filed June 20, 2021, entitled DAYLIGHT VISIBLE AND MULTI-SPECTRAL LASER RANGEFINDER AND ASSOCIATED METHODS AND UTILITY LOCATOR DEVICES; United States Patent Application 16/908,625, filed June 22, 2020, entitled ELECTROMAGNETIC MARKER DEVICES WITH SEPARATE RECEIVE AND TRANSMIT ANTENNA ELEMENTS; United States Patent 10,690,795, issued June 23, 2020, entitled LOCATING DEVICES, SYSTEMS, AND METHODS USING FREQUENCY SUITES FOR UTILITY DETECTION; United States Patent 10,690,796, issued June 23, 2020, entitled USER INTERFACES FOR UTILITY LOCATORS; United States Patent Application 16/921,775, filed July 6, 2020, entitled AUTOTUNING CIRCUIT APPARATUS AND METHODS; United States Provisional Patent Application 63/055,278, filed July 22, 2020, entitled VEHICLE-BASED UTILITY LOCATING USING PRINCIPAL COMPONENTS; United States Patent Application 17/397,940, filed August 9, 2021, entitled INSPECTION SYSTEM PUSH-CABLE GUIDE APPARATUS; United States Patent Application 16/995,801, filed August 17, 2020, entitled UTILITY LOCATOR TRANSMITTER DEVICES, SYSTEMS, AND METHODS; United States Patent Application 17/001,200, filed August 24, 2020, entitled MAGNETIC SENSING BURIED UTILITY
LOCATOR INCLUDING A CAMERA; United States Patent Application 16/995,793, filed August 17, 2020, entitled UTILITY LOCATOR APPARATUS AND METHODS; United States Patent 10,753,722, issued August 25, 2020, entitled SYSTEMS AND METHODS FOR LOCATING BURIED OR HIDDEN OBJECTS USING SHEET CURRENT FLOW MODELS; United States Patent 10,754,053, issued August 25, 2020, entitled UTILITY LOCATOR TRANSMITTER DEVICES, SYSTEMS, AND METHODS WITH DOCKABLE APPARATUS; United States Patent 10,761,233, issued September 1, 2020, entitled SONDES AND METHODS FOR USE WITH BURIED LINE LOCATOR SYSTEMS; United States Patent 10,761,239, issued September 1, 2020, entitled MAGNETIC SENSING BURIED UTILITY LOCATOR INCLUDING A CAMERA; United States Patent 10,764,541, issued September 1, 2020, entitled COAXIAL VIDEO PUSH-CABLES FOR USE IN INSPECTION SYSTEMS; United States Patent Application 17/013,831, filed September 7, 2020, entitled MULTIFUNCTION BURIED UTILITY LOCATING CLIPS; United States Patent Application 17/014,646, filed September 8, 2020, entitled INTEGRATED FLEX-SHAFT CAMERA SYSTEM AND HAND CONTROL; United States Patent 10,777,919, issued September 15, 2020, entitled MULTIFUNCTION BURIED UTILITY LOCATING CLIPS; United States Patent Application 17/020,487, filed September 14, 2020, entitled ANTENNA SYSTEMS FOR CIRCULARLY POLARIZED RADIO SIGNALS; United States Patent Application 17/068,156, filed October 12, 2020, entitled DUAL SENSED LOCATING SYSTEMS AND METHODS; United States Patent 10,809,408, issued October 20, 2020, entitled DUAL SENSED LOCATING SYSTEMS AND METHODS; United States Patent 10,845,497, issued November 24, 2020, entitled PHASE-SYNCHRONIZED BURIED OBJECT TRANSMITTER AND LOCATOR METHODS AND APPARATUS; United States Patent 10,848,655, issued November 24, 2020, entitled HEAT EXTRACTION ARCHITECTURE FOR COMPACT VIDEO CAMERA HEADS; United States Patent Application 17/110,273, filed December 2, 2020, entitled INTEGRAL DUAL CLEANER CAMERA DRUM SYSTEMS AND METHODS; United States Patent 10,859,727,
issued December 8, 2020, entitled ELECTRONIC MARKER DEVICES AND SYSTEMS; United States Patent 10,908,311, issued February 2, 2021, entitled SELF-STANDING MULTI-LEG ATTACHMENT DEVICES FOR USE WITH UTILITY LOCATORS; United States Patent 10,928,538, issued February 23, 2021, entitled KEYED CURRENT SIGNAL LOCATING SYSTEMS AND METHODS; United States Patent 10,935,686, issued March 2, 2021, entitled UTILITY LOCATING SYSTEM WITH MOBILE BASE STATION; United States Patent Application 17/190,400, filed March 3, 2021, entitled DOCKABLE CAMERA REEL AND CCU SYSTEM; United States Patent 10,955,583, issued March 23, 2021, entitled BORING INSPECTION SYSTEMS AND METHODS; United States Patent 9,927,368, issued March 27, 2018, entitled SELF-LEVELING INSPECTION SYSTEMS AND METHODS; United States Patent 10,976,462, issued April 13, 2021, entitled VIDEO INSPECTION SYSTEMS WITH PERSONAL COMMUNICATION DEVICE USER INTERFACES; United States Patent Application 17/501,670, filed October 14, 2021, entitled ELECTRONIC MARKER-BASED NAVIGATION SYSTEMS AND METHODS FOR USE IN GNSS-DEPRIVED ENVIRONMENTS; United States Patent Application 17/528,956, filed November 17, 2021, entitled VIDEO INSPECTION SYSTEM, APPARATUS, AND METHODS WITH RELAY MODULES AND CONNECTION PORT; United States Patent Application 17/541,057, filed December 2, 2021, entitled COLOR-INDEPENDENT MARKER DEVICE APPARATUS, METHODS, AND SYSTEMS; United States Patent 11,193,767, issued December 7, 2021, entitled SMART PAINT STICK DEVICES AND METHODS; United States Patent 11,199,510, issued December 14, 2021, entitled PIPE INSPECTION AND CLEANING APPARATUS AND SYSTEMS; United States Provisional Patent Application 63/293,828, filed December 26, 2021, entitled MODULAR BATTERY SYSTEMS INCLUDING INTERCHANGEABLE BATTERY
INTERFACE APPARATUS; United States Patent 11,209,115, issued December 28, 2021, entitled PIPE INSPECTION AND/OR MAPPING CAMERA HEADS, SYSTEMS, AND METHODS; United States Patent Application 17/563,049, filed December 28, 2021, entitled SONDE DEVICES WITH A SECTIONAL FERRITE CORE; United States Provisional Patent Application 63/306,088, filed February 2, 2022, entitled UTILITY LOCATING SYSTEMS AND METHODS WITH FILTER TUNING FOR POWER GRID FLUCTUATIONS; United States Patent Application 17/687,538, filed March 4, 2022, entitled ANTENNAS, MULTI-ANTENNA APPARATUS, AND ANTENNA HOUSINGS; United States Patent 11,280,934, issued March 22, 2022, entitled ELECTROMAGNETIC MARKER DEVICES FOR BURIED OR HIDDEN USE; United States Patent 11,300,597, issued April 12, 2022, entitled SYSTEMS AND METHODS FOR LOCATING AND/OR MAPPING BURIED UTILITIES USING VEHICLE-MOUNTED LOCATING DEVICES; United States Patent 11,333,786, issued May 17, 2022, entitled BURIED UTILITY MARKER DEVICES, SYSTEMS, AND METHODS; United States Patent Application 17/845,290, filed June 21, 2022, entitled DAYLIGHT VISIBLE AND MULTI-SPECTRAL LASER RANGEFINDER AND ASSOCIATED METHODS AND UTILITY LOCATOR DEVICES; United States Patent 11,366,245, issued June 21, 2022, entitled BURIED UTILITY LOCATOR GROUND TRACKING APPARATUS, SYSTEMS, AND METHODS; United States Provisional Patent Application 63/368,879, filed July 19, 2022, entitled NATURAL VOICE UTILITY ASSET ANNOTATION SYSTEM; United States Patent 11,397,274, issued July 26, 2022, entitled TRACKED DISTANCE MEASURING DEVICES, SYSTEMS, AND METHODS; United States Patent Application 17/815,387, filed July 27, 2022, entitled INWARD SLOPED DRUM FACE FOR PIPE INSPECTION CAMERA SYSTEM; United States Patent 11,404,837, issued August 2, 2022, entitled ROBUST IMPEDANCE CONTROLLED SLIP RINGS; United States Patent 11,402,237, issued August 2, 2022, entitled VIDEO PIPE INSPECTION SYSTEMS WITH VIDEO INTEGRATED WITH ADDITIONAL SENSOR DATA; United States Provisional Patent
Application 63/370,760, filed August 8, 2022, entitled SYSTEMS AND METHODS FOR INSPECTION ANIMATION; United States Patent 11,418,761, issued August 16, 2022, entitled INSPECTION CAMERA DEVICES AND METHODS WITH SELECTIVELY ILLUMINATED MULTISENSOR IMAGING SYSTEMS; United States Patent 11,428,814, issued August 30, 2022, entitled OPTICAL GROUND TRACKING APPARATUS, SYSTEMS, AND METHODS FOR USE WITH BURIED UTILITY LOCATORS; United States Patent Application 17/930,029, filed September 6, 2022, entitled GNSS POSITIONING METHODS AND DEVICES USING PPP-RTK, RTK, SSR, OR LIKE CORRECTION DATA; United States Patent 11,448,600, issued September 20, 2022, entitled MULTI-CAMERA PIPE INSPECTION APPARATUS, SYSTEMS, AND METHODS; United States Patent Application 17/935,564, filed September 26, 2022, entitled SYSTEMS AND METHODS FOR DETERMINING AND DISTINGUISHING BURIED OBJECTS USING ARTIFICIAL INTELLIGENCE; United States Patent 11,460,598, issued October 4, 2022, entitled USER INTERFACES FOR UTILITY LOCATORS; United States Patent 11,467,317, issued October 11, 2022, entitled ELECTROMAGNETIC MARKER DEVICES WITH SEPARATE RECEIVE AND TRANSMIT ANTENNA ELEMENTS; United States Patent 11,468,610, issued October 11, 2022, entitled METHODS AND SYSTEMS FOR GENERATING INTERACTIVE MAPPING DISPLAYS IN CONJUNCTION WITH USER INTERFACE DEVICES; United States Patent 11,476,539, issued October 18, 2022, entitled MODULAR BATTERY PACK APPARATUS, SYSTEMS, AND METHODS INCLUDING VIRAL DATA AND/OR CODE TRANSFER; United States Patent 11,474,276, issued October 18, 2022, entitled SYSTEMS AND METHODS FOR UTILITY LOCATING IN A MULTI-UTILITY ENVIRONMENT; United States Patent 11,476,851, issued October 18, 2022, entitled MAGNETICALLY SENSED USER INTERFACE DEVICES; United States Provisional Patent Application 63/380,375, filed October 20, 2022, entitled LINKED CABLE-HANDLING AND CABLE-STORAGE DRUM DEVICES AND SYSTEMS FOR THE COORDINATED MOVEMENT OF A PUSH-CABLE; United States Provisional Patent Application 63/435,148,
filed December 23, 2022, entitled SYSTEMS, APPARATUS, AND METHODS FOR DOCUMENTING UTILITY POTHOLES AND ASSOCIATED UTILITY LINES; United States Patent Application 18/089,266, filed December 27, 2022, entitled MODULAR BATTERY SYSTEMS INCLUDING INTERCHANGEABLE BATTERY INTERFACE APPARATUS; United States Patent Application 18/162,663, filed January 31, 2023, entitled UTILITY LOCATING SYSTEMS AND METHODS WITH FILTER TUNING FOR POWER GRID FLUCTUATIONS; United States Provisional Patent Application 63/485,905, filed February 18, 2023, entitled SYSTEMS AND METHODS FOR INSPECTION ANIMATION; United States Provisional Patent Application 63/492,473, filed March 27, 2023, entitled VIDEO INSPECTION AND CAMERA HEAD TRACKING SYSTEMS AND METHODS; United States Patent 11,614,613, issued March 28, 2023, entitled DOCKABLE CAMERA REEL AND CCU SYSTEM; United States Patent 11,649,917, issued May 16, 2023, entitled INTEGRATED FLEX-SHAFT CAMERA SYSTEM WITH HAND CONTROL; United States Patent 11,665,321, issued May 30, 2023, entitled PIPE INSPECTION SYSTEM WITH REPLACEABLE CABLE STORAGE DRUM; and United States Patent 11,674,906, issued June 13, 2023, entitled SELF-LEVELING INSPECTION SYSTEMS AND METHODS. The content of each of the above-described patents and applications is incorporated by reference herein in its entirety. The above applications may be collectively denoted herein as the “co-assigned applications” or “incorporated applications.”

[00067] The following exemplary embodiments are provided for the purpose of illustrating examples of various aspects, details, and functions of apparatus and systems; however, the described embodiments are not intended to be in any way limiting. It will be apparent to one of ordinary skill in the art that various aspects may be implemented in other embodiments within the spirit and scope of the present disclosure.

[00068] It is noted that as used herein, the term, "exemplary" means "serving as an example, instance, or illustration." Any aspect, detail, function, implementation, and/or embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other aspects and/or embodiments.

Example Embodiments

[00069] Referring to FIGs. 1A and 1B, an inspection system 100 is illustrated which may include video leveling computer animation transitions in keeping with the present disclosure. The inspection system 100 may include an inspection camera 110 having an imaging element 112 (FIG. 1B) that includes one or more imaging sensors positioned behind one or more lenses 114 and disposed in a housing 116 in generating a video signal 120 (FIG. 1B). One or more lights 118 may illuminate the field of view captured by the inspection camera 110 in generating the video signal 120 (FIG. 1B). The inspection camera 110 may be or share aspects with the cameras disclosed in United States Patent 6,831,679, entitled VIDEO CAMERA HEAD WITH THERMAL FEEDBACK LIGHTING CONTROL, issued December 14, 2004; United States Patent 8,587,648, entitled SELF-LEVELING CAMERA HEAD, issued November 19, 2013; United States Patent 9,066,446, entitled THERMAL EXTRACTION ARCHITECTURE FOR CAMERA HEAD, INSPECTION SYSTEMS, AND OTHER DEVICES AND SYSTEMS, issued June 23, 2015; United States Patent 9,277,105, entitled SELF-LEVELING CAMERA HEADS, issued March 1, 2016; United States Patent 9,824,433, entitled PIPE INSPECTION SYSTEM CAMERA HEADS, issued November 21, 2017; United States Patent 9,835,564, entitled MULTI-CAMERA PIPE INSPECTION APPARATUS, SYSTEM, AND METHODS, issued December 5, 2017; United States Patent 10,288,997, entitled ROTATING CONTACT ASSEMBLIES FOR SELF-LEVELING CAMERA HEADS, issued May 14, 2019; United States Patent 10,715,703, entitled SELF-LEVELING CAMERA HEADS, issued July 14, 2020; United States Patent 10,848,655, entitled HEAT EXTRACTION ARCHITECTURE FOR COMPACT VIDEO CAMERA HEADS, issued November 24, 2020; United States Patent 11,209,115, entitled PIPE INSPECTION AND/OR MAPPING CAMERA HEADS, SYSTEMS, AND METHODS, issued December 28, 2021; and/or others disclosed in the incorporated patents and applications. 
The content of each of these applications is incorporated by reference herein in its entirety. The inspection camera 110 may further include an orientation element 130 (FIG. 1B) having one or more orientation sensors (e.g., accelerometers, gyroscopic sensors, or other orientation determining sensors and mechanisms) in generating an orientation signal 140 (FIG. 1B) describing the orientation of an inspection image representing the field of view captured by the inspection camera 110 in the video signal 120 (FIG. 1B) relative to upright, earth normal orientation.

[00070] The inspection camera 110 may be disposed on the end of a push-cable 146 dispensed from a cable storage drum 150 allowing the inspection camera 110 to be moved through a pipe 148 or other inspection environment. The push-cable 146 may be configured to communicate the video signals 120 (FIG. 1B) and orientation signals 140 (FIG. 1B) to one or more electronic display devices 160 such as, but not limited to, a camera control unit (CCU) 162, smart phone 164, remote computing device 166, or the like. The push-cable 146 may be or share aspects with those disclosed in United States Patent 5,939,679, entitled VIDEO PUSH CABLE, issued August 17, 1999; United States Patent Application 14/970,362, entitled COAXIAL VIDEO PUSH-CABLES FOR USE IN INSPECTION SYSTEMS, filed December 15, 2015; United States Patent 9,448,376, entitled HIGH BANDWIDTH PUSH-CABLES FOR VIDEO INSPECTION SYSTEMS, issued September 20, 2016; United States Patent 9,468,954, entitled PIPE INSPECTION SYSTEM INCLUDING JETTER PUSH-CABLE, issued October 18, 2016; United States Patent 10,764,541, entitled COAXIAL VIDEO PUSH-CABLES FOR USE IN INSPECTION SYSTEMS, issued September 1, 2020; and/or others disclosed in the incorporated patents and applications. The content of each of these applications is incorporated by reference herein in its entirety.

[00071] Referring to FIG. 1B, the cable storage drum 150 may include a communication element 152 (e.g., Wi-Fi, Bluetooth, or the like) to wirelessly communicate the video signals 120 and orientation signals 140 to the smart phone 164, the remote computing device 166, or other electronic display devices. Uncorrected video may also be output with metadata describing the orientation instead of using an orientation signal 140. In such cable-storage drum embodiments as the cable storage drum 150, a power element 154 may provide electrical power to the communication element 152 and/or other powered element that may be on or coupled with the cable storage drum 150. The power element 154 may be or include one or more batteries, grid tied power, or the like. In some such embodiments, the power element 154 may be or include one or more batteries that may provide electrical power to the communication element 152. The one or more batteries of power element 154 may be or share aspects with those disclosed in United States Patent 10,090,498, entitled MODULAR BATTERY PACK APPARATUS, SYSTEMS, AND METHODS INCLUDING VIRAL DATA AND/OR CODE TRANSFER, issued October 2, 2018; United States Patent Application 16/255,524, entitled RECHARGEABLE BATTERY PACK ONBOARD CHARGE STATE INDICATION METHOD AND APPARATUS, filed January 23, 2019; United States Patent 11,171,369, entitled MODULAR BATTERY PACK APPARATUS, SYSTEMS, AND METHODS, issued November 9, 2021; and/or others disclosed in the incorporated patents and applications. The content of each of these applications is incorporated by reference herein in its entirety. The communication element 152 may be or share aspects with the devices disclosed in United States Patent Application 17/528,956, entitled VIDEO INSPECTION SYSTEM APPARATUS AND RELAY MODULES AND CONNECTION PORTS, filed November 17, 2021, and/or others disclosed in the incorporated patents and applications. 
The content of each of these applications is incorporated by reference herein in its entirety. Further, the cable storage drum 150 may be or share aspects with those disclosed in United States Patent 6,958,767, entitled VIDEO PIPE INSPECTION SYSTEM EMPLOYING NON-ROTATING CABLE STORAGE DRUM, issued October 25, 2005; United States Patent 8,908,027, entitled ASYMMETRIC DRAG FORCE BEARING FOR USE IN PUSH-CABLE STORAGE DRUM, issued December 9, 2014; United States Patent 10,009,582, entitled PIPE INSPECTION SYSTEM WITH REPLACEABLE CABLE STORAGE DRUM, issued June 26, 2018; United States Patent 10,084,945, entitled CABLE STORAGE DRUM WITH MOVEABLE CCU DOCKING APPARATUS, issued September 25, 2018; United States Patent Application 16/588,834, entitled VIDEO INSPECTION SYSTEM WITH WIRELESS CABLE STORAGE DRUM, filed September 30, 2019; United States Patent Application 17/110,273, entitled INTEGRAL DUAL CLEANER CAMERA DRUM SYSTEMS AND METHODS, filed December 2, 2020; United States Provisional Patent Application 63/227,974, entitled INWARD SLOPED DRUM FACE FOR DRUM INSPECTION CAMERA SYSTEMS, filed July 30, 2021; and/or others disclosed in the incorporated patents and applications. The content of each of these applications is incorporated by reference herein in its entirety. The CCU 162 may be or share aspects with those disclosed in United States Patent 10,084,945, entitled CABLE STORAGE DRUM WITH MOVEABLE CCU DOCKING APPARATUS, issued September 25, 2018; United States Patent Application 17/190,400, filed March 3, 2021, entitled DOCKABLE CAMERA REEL AND CCU SYSTEM; and/or others disclosed in the incorporated patents and applications. The content of each of these applications is incorporated by reference herein in its entirety.

[00072] Still referring to FIG. 1B, each of the electronic display devices 160 (e.g., the CCU 162, smart phone 164, remote computing device 166, and the like) may include a processing element 168 having one or more processors receiving the video signals 120 and orientation signals 140 from the inspection camera 110 via a wired connection or the communication element 152 of the cable storage drum 150 (e.g., communicating with another communication element 170 disposed in the smart phone 164 or remote computing device 166 or the like). The processing element 168 may be configured to determine and output a corrected video signal 180 having a corrected, upright, earth normal oriented video that includes one or more video leveling computer animation transitions (e.g., using the method 300 of FIG. 3, the method 400 of FIG. 4A, the method 450 of FIG. 4B, the method 630 of FIG. 6B, the method 730 of FIG. 7B, the method 800 of FIG. 8, or the method 1200 of FIG. 12). In some embodiments the processing element 168 (FIG. 1B) may be or include one or more GPUs or like processors. Further, in some embodiments, such a processing element as the processing element 168 may be disposed in a different computing device and the processing of data may fully or partially be carried out in the different computing device (e.g., the remote computing device 166 which may be a cloud computing device or the like). The output corrected video signal 180 may be communicated to an interface display 172 for displaying the upright, earth normalized inspection video, video leveling computer animation transitions, and optional additional data relating to the inspection and/or inspection system devices. In some embodiments, the inspection image rotations associated with the corrected video signal may be generated from a single inspection image from the video signal captured at a single point in time and space in the inspection video. 
In other method embodiments, the inspection image rotations associated with the corrected video signal may be generated from a plurality of inspection images from the video signal captured at successive points in time and space. Each of the display devices 160 (e.g., the CCU 162, smart phone 164, remote computing device 166, and others that may not be illustrated) may also include a memory element 174 having one or more non-transitory memories storing instructions related to generating a corrected video signal including one or more video leveling computer animation transitions and the resulting corrected video signal including one or more video leveling computer animation transitions.

[00073] An inspection system in keeping with the present disclosure, such as the system 100 of FIGs. 1A and 1B, may generate one or more orientation corrections in ninety degree increments to reorient inspection images to best realign with the upright, earth normal orientation. So as to not disorient a user, each orientation correction may include a video leveling computer animation transition (also referred to herein as a “computer animation transition”) animating the change in orientation.

[00074] Turning to FIG. 2A, an inspection image ninety degrees out of rotation to upright 210 relative to an upright, earth normal orientation 220 is illustrated. Prior to any rotation correction, such an inspection image may be rendered on a display interface displaying the resized inspection video image ninety degrees out of rotation to upright 230, allowing it to be viewed by a user. As the inspection image ninety degrees out of rotation to upright 210 would be inconvenient for a user to view in such a scenario as that rendered on the display interface displaying the resized inspection video image ninety degrees out of rotation to upright 230, a rotation correction that includes a video leveling computer animation transition may be applied to facilitate ease of viewing for the user.

[00075] Turning to FIG. 2B, a series of sequential rotation steps of a computer animation transition 250 are illustrated. In such a computer animation transition, the inspection image may rotate incrementally from a first rotation step 260 to a second rotation step 270 and further onto a third rotation step 280 and finally onto a fourth rotation step 290. At each rotation step 260, 270, 280, and 290 the inspection image may be resized (e.g., cropped, reduced in size, stretched, or otherwise resized) to fit the shape of a display interface. Though the series of sequential rotation steps of the computer animation transition 250 is illustrated in FIG. 2B as having four steps (e.g., the first rotation step 260, the second rotation step 270, the third rotation step 280, and the fourth rotation step 290), in other embodiments a rotation correction may be divided into any number of sequential rotation steps for use in the computer animation transition. For instance, in some embodiments, a computer animation transition of the present invention animating a rotation correction may include one or more sequential rotation steps based on the refresh rate of the electronic display device or the inspection camera (e.g., one sequential rotation step for every refresh instance or the like).
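The per-refresh division of a rotation correction described above may be sketched as follows. This is an illustrative sketch only, not part of the disclosed apparatus; the function name and the half-second transition duration in the usage example are assumptions.

```python
def rotation_schedule(total_degrees, refresh_hz, duration_s):
    # One sequential rotation step per display refresh instance over
    # the transition duration; each step advances the image by an
    # equal angular increment until the full correction is reached.
    steps = max(1, round(refresh_hz * duration_s))
    return [total_degrees * (i + 1) / steps for i in range(steps)]

# A ninety degree correction on a 60 Hz display over an assumed 0.5 s
# transition yields 30 steps of 3 degrees each.
angles = rotation_schedule(90, 60, 0.5)
```

On this sketch, the four-step transition of FIG. 2B would simply correspond to `rotation_schedule(90, refresh_hz, duration_s)` producing four entries.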

[00076] Still referring to FIG. 2B, in some embodiments each sequential rotation step may use the same inspection image (e.g., the first rotation step 260, the second rotation step 270, the third rotation step 280, and the fourth rotation step 290 may all rotate the same inspection image). In other embodiments, the sequential rotation steps may use different sequential inspection images. For instance, the first rotation step 260 may use a first inspection image from a first place in time and space, the second rotation step 270 may use a second inspection image from a second place in time and space, the third rotation step 280 may use a third inspection image from a third place in time and space, and the fourth rotation step 290 may use a fourth inspection image from a fourth place in time and space. In some such embodiments, the inspection image at each rotation step may be the current inspection image generated by the inspection camera.

[00077] Turning to FIG. 3, a method 300 for rotation corrections in ninety degree increments (e.g., ninety or one hundred eighty degree rotation corrections) that include video leveling computer animation transitions is disclosed. In step 305, the method 300 may include generating, via one or more image sensors disposed in an inspection camera (e.g., imaging element 112 of the inspection camera 110 of FIG. 1B), a video signal comprising a plurality of sequential inspection images representing the field of view captured by the camera and, via one or more orientation sensors disposed in the inspection camera (e.g., orientation element 130 of the inspection camera 110 of FIG. 1B), an orientation signal measuring the orientation of each inspection image of the video signal relative to an upright, earth normal orientation. In a step 310, the method 300 may include receiving the video signal and orientation signal at a processing element having one or more graphics processing units (GPUs) or other processors (e.g., processing element 168 of FIG. 1B). In decision step 315, the method 300 may determine whether the inspection image most closely resembles an upright, earth normal orientation. In other words, the method 300 may determine in decision step 315 whether a rotation correction would better reorient the inspection image to an upright, earth normal orientation. If the inspection image does not most closely resemble an upright, earth normal orientation in step 315, the method 300 may continue onto step 320. In step 320, an orientation correction for the video signal describing rotations of the video signal images in ninety degree increments that reorients the images to most closely resemble an upright, earth normal orientation may be determined. 
Continuing onto step 325, the method 300 may include generating a computer animation transition animating sequential steps of the degree and direction of orientation correction rotations comprising one or more inspection images from the video signal at one or more points in time and space in the inspection video. Such a computer animation transition may be generated via the method 400 of FIG. 4A or the method 450 of FIG. 4B. In step 330, the method 300 may include outputting video that may be corrected to include computer animation transitions and rotation corrections. In another step 335, the method 300 may include storing, in one or more non-transitory memories, the corrected video that may include computer animation transitions and rotation corrections. In a step 340, the method 300 may include displaying, on a display interface on one or more electronic display devices, the corrected upright video which may include rotation corrections and associated computer animation transitions.

[00078] Returning to step 315, if the inspection image does most closely resemble an upright, earth normal orientation, the method 300 may continue onto decision step 345. In decision step 345, it may be determined if the inspection has concluded. If the inspection has concluded, the method may continue onto step 350 where the inspection will end. If the inspection has not concluded, the method may cycle back to step 305, where inspection images and orientation signals will continue to be generated, and onto step 330, where the corrected video signal is output.
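The decision logic of the method 300 may be sketched as follows. This is a minimal illustrative sketch, assuming an orientation signal expressed as a degree offset from upright; the function names and the placeholder frame objects are assumptions, and no real video or sensor API is used.

```python
def nearest_90_correction(offset_deg):
    # Snap a measured offset from upright, earth normal orientation to
    # the nearest multiple of ninety degrees. A result of 0 means the
    # image already most closely resembles upright (decision step 315);
    # a nonzero result is the correction to animate (step 320).
    return (round(offset_deg / 90.0) % 4) * 90

def level_stream(frames_with_orientation):
    # Sketch of the method-300 loop over (frame, offset) pairs from the
    # video and orientation signals. Emits a "correct" event where a
    # rotation correction (and its animation transition) is warranted,
    # and passes the frame through otherwise.
    events = []
    for frame, offset_deg in frames_with_orientation:
        correction = nearest_90_correction(offset_deg)
        if correction:
            events.append(("correct", correction, frame))
        else:
            events.append(("pass", 0, frame))
    return events
```

An offset of five degrees produces no correction, while an offset of one hundred degrees snaps to a ninety degree correction.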

[00079] Turning to FIG. 4A, a method 400 for generating a video leveling computer animation transition via a single inspection image is disclosed. In method 400, step 405 may include identifying an inspection image from the inspection video signal that corresponds with an orientation signal describing the orientation of the field of view captured in the inspection image relative to an upright, earth normal orientation that fits predetermined criteria to generate a rotation correction. In step 410, the method 400 may include determining the appropriate rotation to achieve an upright, earth normal orientation. In step 415, the method 400 may include dividing the rotation interval (e.g., a ninety or one hundred eighty degree rotation interval) into a number of sequential rotation steps. In step 420, the method 400 may include rotating the inspection image selected in step 405 to each sequential rotation step until reaching the final corrected rotation (optionally based on the refresh rate of the electronic display device or the frame rate of the inspection camera). In step 425, the method 400 may include cropping, reducing in size, stretching, or otherwise resizing the inspection image to fit the display interface on the electronic display device at each rotation interval. In step 430, the method 400 may include outputting the computer animation transition to one or more non-transitory memories and/or one or more electronic display devices for displaying the computer animation transition in real-time, near real-time, or in playback.
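The single-image transition of the method 400 may be sketched as follows. This sketch is illustrative only: the tuples stand in for real raster operations (no pixels are actually rotated), and the function names, the center-square crop choice, and the image dimensions are assumptions.

```python
def center_square_crop(width, height):
    # Crop box (left, top, right, bottom) for the largest centered
    # square region, e.g. a 640 x 480 image yields a 480 x 480 crop,
    # one way of resizing to fit a square display region (step 425).
    side = min(width, height)
    left = (width - side) // 2
    top = (height - side) // 2
    return (left, top, left + side, top + side)

def single_image_transition(image, correction_deg, steps, width, height):
    # Method-400-style sketch: the same captured inspection image is
    # carried through evenly divided rotation steps (steps 415/420),
    # paired with the crop applied at each step (step 425).
    crop = center_square_crop(width, height)
    return [(image, correction_deg * i / steps, crop)
            for i in range(1, steps + 1)]
```

For a ninety degree correction in three steps, the same image appears at thirty, sixty, and ninety degrees.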

[00080] Turning to FIG. 4B, a method 450 for generating a video leveling computer animation transition via multiple successive inspection images is disclosed. In method 450, step 455 may include identifying an inspection image from the inspection video signal that corresponds with an orientation signal describing the orientation of the field of view captured in the inspection image relative to an upright, earth normal orientation that fits predetermined criteria to generate a rotation correction. In step 460, the method 450 may include determining the appropriate rotation to achieve an upright, earth normal orientation. In step 465, the method 450 may include dividing the rotation interval into a number of sequential rotation steps. In step 470, the method 450 may include selecting the initial inspection image from the inspection video for the first of the sequential rotation steps. In step 475, the method 450 may include cropping, reducing in size, stretching, or otherwise resizing the inspection image of the inspection video to fit the display interface of the electronic display device. In step 480, the method 450 may include rotating the selected inspection image to the next sequential rotation step. In a decision step 485, the method 450 may determine if all sequential rotation steps have been carried out. If all sequential rotation steps have not been carried out in step 485, the method 450 may continue onto a step 490 where the method 450 may include selecting the next inspection image from the inspection video, which may be the current inspection image in time/space or another sequential inspection image in time/space in the inspection video. The method 450 may then repeat back at step 475.

[00081] Returning to step 485, if all sequential rotation steps have been carried out, the method 450 may continue onto a step 495. In step 495, the method 450 may include outputting the computer animation transition including the rotation correction to one or more non-transitory memories and/or one or more electronic display devices for displaying the computer animation transition in real-time, near real-time, or in playback.
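The multi-image transition of the method 450 may be sketched as follows, in contrast with a single-image transition: each sequential rotation step pulls the then-current inspection image from the stream, so the video keeps updating while the leveling animation plays. This is an illustrative sketch; frames are placeholders, and the names are assumptions.

```python
def multi_image_transition(frame_iter, correction_deg, steps):
    # Method-450-style sketch: each sequential rotation step uses the
    # next inspection image in time/space from the stream (step 490)
    # rather than reusing a single frozen frame. Each (frame, angle)
    # pair stands in for the resize (step 475) and rotation (step 480)
    # applied at that step.
    events = []
    for i in range(1, steps + 1):
        frame = next(frame_iter)
        angle = correction_deg * i / steps
        events.append((frame, angle))
    return events
```

With three successive frames and a ninety degree correction, each step pairs a newer frame with a larger rotation angle.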

[00082] In some system and method embodiments disclosed herein, the rotations of inspection images may occur via rendering the inspection images onto a graphics library (GL) surface and the GL surface may be rotated in ninety degree increments. Such a GL surface may be or may be made to be square in shape in some embodiments. For instance, an inspection video may be applied as a texture to a square GL surface; thus, the output corrected video signal may be accessible for viewing on various different electronic display devices having a shared graphics library without relying upon a field-programmable gate array (FPGA) or like circuit dedicated for rotations based on specific applications and associated hardware as known in the art. As illustrated further herein, the GL surface may have other shapes (e.g., the pipe-shaped GL surface of FIGs. 9A and 9B, the regular polygon shaped GL surfaces of FIGs. 10A, 10B, 11, and 12, or other shapes that have not been illustrated). Likewise, it should be noted that orientation corrections and associated computer animation transitions may be achieved in processing using techniques other than through the use of a GL surface. The use of GL surfaces described herein is provided to exemplify a technique by which orientation corrections and associated computer animation transitions in inspection systems may be achieved. Other techniques may be included in systems and methods in keeping with the present disclosure. Instead of utilizing a graphics library such as those publicly available through OpenGL, WebGL, CUDA, or the like, one skilled in the art may, for instance, utilize non-GPU processing (e.g., CPU processing or the like), programmable logic (e.g., an FPGA or the like), specially designed hardware (e.g., an ASIC or the like), or other techniques, methods, and hardware in generating rotations of inspection images and associated computer animation transitions.
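The rotation of a textured GL surface described above amounts to applying a rotation about the surface centroid to its corner vertices, as a model matrix or vertex shader would. The following is a pure-math sketch of that transform; no actual graphics library is invoked, and the names are assumptions.

```python
import math

def rotate_quad(vertices, degrees):
    # Rotate the corner vertices of a GL surface counterclockwise
    # about its centroid. In an actual GL pipeline this rotation would
    # be expressed as a model/view matrix applied to the textured quad.
    cx = sum(x for x, _ in vertices) / len(vertices)
    cy = sum(y for _, y in vertices) / len(vertices)
    t = math.radians(degrees)
    c, s = math.cos(t), math.sin(t)
    return [(cx + (x - cx) * c - (y - cy) * s,
             cy + (x - cx) * s + (y - cy) * c) for x, y in vertices]

# A unit square GL surface; a ninety degree rotation maps each corner
# onto an adjacent corner position, leaving the square footprint fixed.
square = [(-1.0, -1.0), (1.0, -1.0), (1.0, 1.0), (-1.0, 1.0)]
```

Because a square maps onto itself under ninety degree rotations, the rotated surface always fills the same display region, which is one motivation for the square-surface approach.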

[00083] Turning to FIG. 5A, a pre-rotation misalignment 500 is illustrated between a square GL surface orientation 510 capturing the field of view of the inspection video generated by an inspection camera (e.g., the inspection camera 110 of FIGs. 1A and 1B) and an upright, earth normal orientation 520. Where a difference between the GL surface and upright orientation 530 is beyond a predetermined threshold, greater than forty-five degrees for instance, a rotation of the inspection image may occur in ninety degree increments about a centroid 540 to best align the field of view captured in the GL surface orientation 510 with the upright, earth normal orientation 520. Such rotations of a GL surface may occur in either clockwise or counterclockwise directions and may occur in any increment multiple of ninety degrees (e.g., ninety degrees, one hundred eighty degrees, or two hundred seventy degrees).

[00084] Turning to FIG. 5B, a post rotation alignment 550 between a corrected GL surface orientation 560, having experienced one or more rotations in ninety degree increments 580, and the upright, earth normal orientation 520 is illustrated. As shown, the corrected GL surface orientation 560 and the upright, earth normal orientation 520 may align to a much greater degree versus the pre-rotation misalignment 500 illustrated in FIG. 5A. It should be noted that though the corrected GL surface orientation 560 and the upright, earth normal orientation 520 may align to a much greater degree in the post rotation alignment 550 of FIG. 5B, the alignment may not be exact. Instead, rotating the GL surface having the field of view captured in an inspection image by an inspection camera at ninety degree increments represents a low cost and robust approach to achieve a degree of video leveling that would be useful to an observer of an inspection video. In using such an approach to video leveling, an observer may be provided a computer animation transition where such video leveling events occur in a corrected inspection video that is generated in real-time, near real-time, or generated in post processing.
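The observation that the post rotation alignment is close but not exact can be quantified: after correcting in ninety degree increments, the residual misalignment from upright is bounded by forty-five degrees. The following sketch illustrates that bound; it is illustrative only and the function name is an assumption.

```python
def residual_after_snap(offset_deg):
    # Remaining misalignment after snapping a measured offset from
    # upright to the nearest multiple of ninety degrees. Its magnitude
    # never exceeds forty-five degrees, which is why the corrected GL
    # surface orientation closely, but not exactly, matches upright.
    snapped = round(offset_deg / 90.0) * 90
    return offset_deg - snapped
```

For example, an offset of one hundred degrees leaves a ten degree residual after a ninety degree correction.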

[00085] Turning to FIG. 6A, a display interface 600 is illustrated demonstrating a computer animation transition 610 in applying a rotation correction in keeping with the present disclosure. As illustrated, a pre-rotation GL surface 620a may include a visual rendering of an inspection video (e.g., via a video signal generated by an inspection camera such as the video signal 120 of FIG. 1B generated by the inspection camera 110 of FIGs. 1A and 1B). Further, in some embodiments, the display interface, such as display interface 600, may be configured to display additional information 622 outside the GL surface such as the pre-rotation GL surface 620a. For instance, the additional information 622 may include but is not limited to a battery status indicator 623 of the electronic display device or other system devices, an internal inspection camera temperature status 624, an external inspection environment temperature 625, a date of inspection 626, a time of inspection 627, a Sonde indicator 628 to notate one or more pipe Sondes associated with the inspection system, and/or the like.

[00086] Still referring to FIG. 6A, the computer animation transition 610 may include a number of sequential rotation steps (e.g., a first sequential rotation step 612, a second sequential rotation step 614, and a third sequential rotation step 616 through to an nth sequential rotation step 618). As shown, the sequential rotation steps 612, 614, 616, and 618 of the computer animation transition 610 may rotate independently of the additional information 622 along the periphery of the display interface 600. It should be noted that at each of the sequential rotation steps (e.g., the first sequential rotation step 612, the second sequential rotation step 614, and the third sequential rotation step 616 through to the nth sequential rotation step 618), the inspection image from the inspection video may optionally be cropped, reduced in size, stretched, or otherwise resized to fit the shape of the GL surface at each rotation interval. For instance, a 640 x 480 video image may be resized to become 640 x 640 or 480 x 480. The computer animation transition 610, as well as other computer animation transition embodiments in keeping with the present disclosure, may include any number of rotation steps. For instance, in some embodiments, a computer animation transition of a rotation correction may include one or more sequential rotation steps based on the refresh rate of the electronic display device (e.g., one sequential rotation step for every refresh instance or the like). In the computer animation transition 610, each rotation step (e.g., the first sequential rotation step 612, the second sequential rotation step 614, and the third sequential rotation step 616 through to the nth sequential rotation step 618) may be from a single inspection image captured in the inspection video at a single point in time and space until reaching the corrected rotation GL surface 620b orientation. 
In other embodiments, rotation steps may be from multiple inspection images captured in the inspection video from different points in time and/or space.

[00087] Turning to FIG. 6B, a method 630 for generating a computer animation transition that includes a rotation correction is disclosed. In step 635, the method 630 may include identifying and rendering on the GL surface an inspection image from the inspection video signal that corresponds with an orientation signal describing the orientation of the inspection image relative to an upright, earth normal orientation that fits predetermined criteria to generate a rotation correction. For instance, the inspection video may be applied as a texture to a GL surface having a square, regular polygon, three dimensional pipe shape, or other shape. Such predetermined criteria for rotation may, for instance, be any orientation measurement exceeding half of a rotation interval as compared to the upright, earth normal orientation (e.g., anything greater than a forty five degree offset from the upright, earth normal orientation in embodiments having a square GL surface). In step 640, the method 630 may determine the appropriate rotation to achieve an upright, earth normal orientation. For instance, such a rotation may be a ninety degree, one hundred eighty degree, or two hundred seventy degree rotation where a GL surface mapped on the processing element onto which the video signal images are rendered is or may be made to be square in shape. Likewise, other intervals of rotation may occur in some embodiments (e.g., the rotation intervals described in conjunction with the regular polygons of FIGs. 10A, 10B, 11, and 12). In step 645, the method 630 may include dividing the rotation interval into a number of sequential rotation steps. 
Optionally, the step 645 may be carried out based on the refresh rate of the electronic display device (e.g., one sequential rotation step for every refresh instance of the electronic display device or every few refresh instances or the like) or the frame rate of the inspection camera (e.g., one sequential rotation step for every subsequent inspection image or every few subsequent inspection images or the like). In step 650, the GL surface containing the selected inspection image from step 635 may be rotated to each sequential rotation step until reaching the final corrected rotation. Further, the step 650 may optionally include cropping, reducing in size, stretching, or otherwise resizing the inspection image to fit the shape of the GL surface at each sequential rotation step. Where the GL surface is square, for instance, a 640 x 480 video image may be resized to become 640 x 640 or 480 x 480. In step 655, the computer animation transition may be output to one or more non-transitory memories and/or one or more electronic display devices for displaying the computer animation transition in real-time, near real-time, or in playback.

[00088] Turning to FIG. 7A, a display interface 700 is illustrated demonstrating a computer animation transition 710 in applying a rotation correction in keeping with the present disclosure. As illustrated, a pre-rotation GL surface 720a may include a visual rendering of an inspection video (e.g., via a video signal generated by an inspection camera such as the video signal 120 of FIG. 1B generated by the inspection camera 110 of FIGs. 1A and 1B). Further, in some embodiments, the display interface, such as display interface 700, may be configured to display additional information 722 outside the GL surface such as the pre-rotation GL surface 720a. 
For instance, the additional information 722 may include but is not limited to a battery status indicator 723 of the electronic display device or other system devices, an internal inspection camera temperature status 724, an external inspection environment temperature 725, a date of inspection 726, a time of inspection 727, a Sonde indicator 728 to notate one or more pipe Sondes associated with the inspection system, and/or the like.
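A minimal Python sketch of the rotation logic in steps 635 through 645 of method 630, assuming the orientation signal supplies an offset in degrees in the range zero to three hundred sixty; the helper names are hypothetical and not part of the disclosure:

```python
import math

def rotation_correction(offset_deg, interval_deg=90.0):
    # Snap the measured offset to the nearest rotation interval
    # (ninety degrees for a square GL surface).  Offsets within half
    # an interval (forty five degrees) snap back to zero, matching
    # the predetermined criteria of step 635.
    return (math.floor(offset_deg / interval_deg + 0.5) * interval_deg) % 360.0

def sequential_steps(correction_deg, n_steps):
    # Step 645: divide the rotation interval into equal sequential
    # rotation steps, e.g. one per display refresh or camera frame.
    return [correction_deg * i / n_steps for i in range(1, n_steps + 1)]
```

For example, a measured offset of one hundred degrees snaps to a ninety degree correction, which three animation steps would traverse as thirty, sixty, and ninety degrees.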

[00089] Still referring to FIG. 7A, the computer animation transition 710 may include a number of sequential rotation steps (e.g., a first sequential rotation step 712, a second sequential rotation step 714, and a third sequential rotation step 716 through to an nth sequential rotation step 718). It should be noted that at each of the sequential rotation steps 712 through 718, the inspection image may be from a different point in time/space in the video signal than that used in the other sequential rotation steps. As shown, the sequential rotation steps 712, 714, 716, and 718 of the computer animation transition 710 may rotate independently of the additional information 722 along the periphery of the display interface 700. It should also be noted that at each of the sequential rotation steps 712 through 718, the inspection image may optionally be cropped, reduced in size, stretched, or otherwise resized to fit the shape of the GL surface at each rotation interval. The computer animation transition 710, as well as other video leveling computer animation transition embodiments in keeping with the present disclosure, may include any number of rotation steps.
In the computer animation transition 710, each sequential rotation step (e.g., the first sequential rotation step 712, the second sequential rotation step 714, and the third sequential rotation step 716 through to the nth sequential rotation step 718) may be from a different inspection image captured in the inspection video at multiple locations in time and space until reaching the corrected rotation GL surface 720b orientation. For instance, the rotation steps 712 through 718 may use subsequent inspection images, or a sequence from every other or every few inspection images, from the inspection video captured at different times and locations in the inspection video.

[00090] Turning to FIG. 7B, another method 730 for generating a computer animation transition that includes a rotation correction is disclosed. In step 735, the method 730 may begin with identifying and rendering on a GL surface an inspection image from the inspection video signal that corresponds with an orientation signal describing the orientation of the inspection image relative to an upright, earth normal orientation that fits predetermined criteria to generate a rotation correction. For instance, the inspection video may be applied as a texture to a GL surface having a square, regular polygon, three dimensional pipe shape, or other shape. Such predetermined criteria for rotation may, for instance, be any orientation measurement greater than half of a rotation interval as compared to the upright, earth normal orientation (e.g., anything greater than a forty five degree offset from the upright, earth normal orientation in embodiments having a square GL surface). In step 740, the method 730 may include determining the appropriate rotation to achieve an upright, earth normal orientation. For instance, such a rotation may be a ninety degree, one hundred eighty degree, or two hundred seventy degree rotation where a GL surface is or may be made to be square in shape. Likewise, other intervals of rotation may occur in some embodiments (e.g., the rotation intervals described in conjunction with regular polygons of FIGs. 10A, 10B, 11, and 12). In step 745, the method 730 may include dividing the rotation interval into a number of sequential rotation steps. In some embodiments, such sequential rotation steps may be based on the refresh rate of the electronic display device or the frame rate of the inspection camera. In a step 750, the initial inspection image from step 735 may be selected for the first of the sequential rotation steps. In a step 755, the currently selected inspection image may be cropped, reduced in size, stretched, or otherwise resized to fit the GL surface shape.
Where the GL surface is square, for instance, a 640 x 480 video image may be resized to become 640 x 640 or 480 x 480. In a step 760, the GL surface may be rotated to the next sequential rotation step. For instance, an inspection video may be applied as a texture to the GL surface. In a decision step 770, a decision may be made as to whether all sequential rotation steps have been carried out. If all sequential rotation steps have not been carried out, the method 730 may continue on to step 775. In step 775, the next inspection image may be selected from the inspection video. In some embodiments, this next inspection image may be the current inspection image in time/space in the inspection video. In other embodiments, the next inspection image may be any other sequential inspection image in time/space in the inspection video. Subsequent to step 775, the method 730 may repeat back at step 755. Referring back to step 770, if all sequential rotation steps have been carried out, the method 730 may continue on to step 780. In step 780, the computer animation transition including the rotation correction may be output to one or more non-transitory memories and/or one or more electronic display devices for displaying the computer animation transition in real-time, near real-time, or in playback.
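The per-step image selection of steps 750 through 775 can be sketched as follows; `fit_to_square` and `animate_rotation` are hypothetical names, and the fitting function is only a placeholder for the crop/resize of step 755:

```python
def fit_to_square(image):
    # Hypothetical placeholder for step 755: a real implementation
    # would crop a 640 x 480 frame to 480 x 480 or stretch it to
    # 640 x 640 to fit the square GL surface shape.
    return image

def animate_rotation(frames, correction_deg, n_steps):
    # Each sequential rotation step may render a *different*
    # inspection image from the live video (steps 750-775), so the
    # pipe view keeps updating while the leveling animation plays.
    frame_iter = iter(frames)
    image = next(frame_iter)                  # step 750: initial image
    animation = []
    for i in range(1, n_steps + 1):
        fitted = fit_to_square(image)         # step 755: fit GL shape
        angle = correction_deg * i / n_steps  # step 760: next rotation
        animation.append((angle, fitted))
        image = next(frame_iter, image)       # step 775: next frame
    return animation                          # step 780: output
```

Note how the last frame pulled simply repeats if the video runs out, so the transition always completes its full correction.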

[00091] Turning to FIG. 8, a method 800 is disclosed for leveling of an inspection video that includes computer animation transitions of the present disclosure. In step 810, the method 800 may include generating, via one or more image sensors disposed in an inspection camera, a video signal comprising a plurality of sequential inspection images. In parallel step 820, the method 800 may include generating, via one or more orientation sensors disposed in the inspection camera, an orientation signal measuring the orientation of each inspection image captured by the inspection camera relative to an upright, earth normal orientation. In step 830, subsequent to steps 810 and 820, the method 800 may include receiving the video signal and orientation signal at a processing element having one or more processors, which may include one or more graphics processing units (GPUs). In step 840, the method 800 may include determining a GL surface that is square in shape mapped onto the processing element (e.g., one or more GPUs or the like) onto which the video signal images are rendered such that the GL surface represents a field of view captured in the video signal from the inspection camera that is square in shape or is made to be square in shape via cropping, reducing video image sizes, or stretching of video images to fit the GL surface shape. For instance, in some method embodiments, inspection video images may be applied to the square GL surface as a texture. In step 850, the method 800 may further include determining, via the processing element, an orientation correction that describes rotations of the inspection image captured in the GL surface in ninety degree increments about a centroid such that a corrected video includes orientation corrections that most closely resemble an upright, earth normal orientation (e.g., via the video leveling approach disclosed with FIGs. 5A and 5B).
In step 860, the method 800 may further include generating a computer animation transition illustrating sequential rotation steps in the degree and direction of orientation correction rotations. Such a computer animation transition may include one or more inspection images from the video signal at one or more points in time and space in the inspection video rotating from a first orientation of the field of view that may not best align with the upright, earth normal orientation to the field of view orientation best aligning with the upright, earth normal orientation in one or more sequential rotation steps. In step 870, the method 800 may further include outputting a corrected video signal that includes an upright, earth normal oriented video that includes one or more computer animation transitions where orientation correction rotations occur. In step 880, the corrected upright video having computer animation transitions and associated data may optionally be stored in one or more non-transitory memories (e.g., the memory element 174 of FIG. 1B). In step 890, the method 800 may further include displaying, on one or more electronic display devices, the corrected upright video having computer animation transitions from the corrected video signal. For instance, the corrected upright video having computer animation transitions from the corrected video signal may be displayed on the CCU 162 illustrated in FIGs. 1A and 1B, the smart phone 164 illustrated in FIGs. 1A and 1B, the remote computing device 166 illustrated in FIGs. 1A and 1B, and/or other display devices not illustrated.
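The leveling loop of method 800 can be sketched as follows, assuming the orientation signal yields one offset in degrees per inspection image; `level_video` is a hypothetical helper name, and the returned tuples mark the frames where a computer animation transition would be inserted:

```python
import math

def level_video(orientation_stream, interval_deg=90.0):
    # For each frame's orientation reading, pick the ninety degree
    # multiple that best restores an upright view (step 850), and
    # record the frames where the correction changes: these are the
    # points where a leveling transition (step 860) would play.
    current = 0.0
    transitions = []
    for frame_idx, offset_deg in enumerate(orientation_stream):
        correction = (math.floor(offset_deg / interval_deg + 0.5)
                      * interval_deg) % 360.0
        if correction != current:
            transitions.append((frame_idx, current, correction))
            current = correction
    return transitions
```

For example, a camera that rolls gradually past forty five degrees triggers a single transition from a zero to a ninety degree correction at the frame where the threshold is crossed.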

[00092] In some embodiments in keeping with the present disclosure, the GL surface may be any three-dimensional shape including but not limited to a three-dimensional pipe shape. For instance, an inspection video may be applied as a texture to a three-dimensional pipe shaped GL surface (e.g., a cylindrical shape or the like), and thus the output corrected video signal may be accessible for viewing on various different electronic display devices having a shared graphics library without relying upon a field-programmable gate array (FPGA) or like circuit dedicated for rotations based on specific applications and associated hardware as known in the art. In some such embodiments, orientation corrections may occur in ninety degree increments where the cross-section is square or, where there is a different polygonal cross-section, in increments equal to the central angle measurement of the cross-sectional polygonal shape about a centroid thereof. [00093] Turning to FIG. 9A, a pre-rotation misalignment 900 is illustrated between a GL surface orientation 910, a three dimensional shape matching the cylindrical shape of an inspected pipe that is capturing the field of view of the inspection video generated by an inspection camera (e.g., the inspection camera 110 of FIGs. 1A and 1B), and an upright, earth normal orientation 920. As shown, in some applications a difference between the GL surface and the upright orientation 930 may exist. Where the difference between the GL surface and the upright orientation 930 is beyond a predetermined threshold, a rotation of the GL surface orientation 910 capturing the field of view of the camera may occur in ninety degree increments about an axis 940 of the cylindrical pipe shape to best align the GL surface orientation 910 to the upright, earth normal orientation 920 to the extent possible.
It should be noted that in other embodiments, rotations may occur at increments equal to the central angle measurement of a polygonal cross- sectional plane (e.g., rotations at increments equal to a central angle 1035 illustrated in FIGs. 10A and 10B). Such rotations of a GL surface may occur in either clockwise or counterclockwise directions.

[00094] Turning to FIG. 9B, a post rotation alignment 950 between a corrected GL surface orientation 960, having experienced one or more rotations in central angle measurement increments 980, and the upright, earth normal orientation 920 is illustrated. As shown, the corrected GL surface orientation 960 and the upright, earth normal orientation 920 may align to a much greater degree versus the pre-rotation misalignment 900 illustrated in FIG. 9A. It should be noted that though the corrected GL surface orientation 960 and the upright, earth normal orientation 920 may align to a much greater degree in the post rotation alignment 950 of FIG. 9B, the alignment may not be exact. Instead, rotating the GL surface represented in the GL surface orientation 910 at ninety degree increments (or at increments equal to the central angle measurement of the particular polygon in embodiments having a polygonal plane cross-section) is a low cost, robust approach that is less taxing on processors while achieving a degree of leveling useful to an observer of an inspection video. In using such an approach to leveling video, an observer may be provided a computer animated transition where such video leveling events occur in a corrected inspection video that is generated in real-time, near real-time, or generated in post processing. [00095] In some embodiments in keeping with the present disclosure, the GL surface may be any regular polygon shape. For instance, an inspection video may be applied as a texture to a polygonal GL surface (e.g., pentagon, hexagon, heptagon, and through to an nth sided polygon), and thus the output corrected video signal may be accessible for viewing on various different electronic display devices having a shared graphics library without relying upon a field-programmable gate array (FPGA) or like circuit dedicated for rotations based on specific applications and associated hardware as known in the art.
In some such embodiments, orientation corrections may occur at increments equal to the central angle measurement of the polygonal shape about a centroid thereof.

[00096] Turning to FIG. 10A, a pre-rotation misalignment 1000 is illustrated between a polygonal shaped GL surface orientation 1010 that is capturing the field of view of the inspection video generated by an inspection camera (e.g., the inspection camera 110 of FIGs. 1A and 1B) and an upright, earth normal orientation 1020 (illustrated as an octagon but may be any regular polygon). As shown, in some applications a difference between the GL surface and the upright orientation 1030 may exist. Where the difference between the GL surface and the upright orientation 1030 is beyond a predetermined threshold, a rotation of the GL surface from the GL surface orientation 1010 may occur in increments equal to the central angle measurement (e.g., a central angle 1035) about a centroid 1040 to best align the GL surface orientation 1010 to the upright, earth normal orientation 1020. Such rotations of a GL surface may occur in either clockwise or counterclockwise directions and may occur in any increment multiple of the measure of the polygon's central angle (e.g., the central angle 1035).
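The central angle arithmetic behind the polygonal rotation increments can be sketched as follows; the helper names are hypothetical and assume the offset is measured in degrees:

```python
import math

def central_angle(n_sides):
    # Central angle of a regular n-gon, e.g. forty five degrees for
    # the octagonal GL surface of FIG. 10A.
    return 360.0 / n_sides

def polygon_correction(offset_deg, n_sides):
    # Snap the measured offset to the nearest multiple of the
    # polygon's central angle, the rotation increment used to align
    # the GL surface to the upright, earth normal orientation.
    step = central_angle(n_sides)
    return (math.floor(offset_deg / step + 0.5) * step) % 360.0
```

For an octagonal GL surface the increment is forty five degrees, so a fifty degree offset snaps to forty five while a twenty degree offset snaps back to zero; a hexagonal surface uses sixty degree increments instead.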

[00097] Turning to FIG. 10B, a post rotation alignment 1050 between a corrected GL surface orientation 1060, having experienced one or more rotations in central angle measurement increments 1080, and the upright, earth normal orientation 1020 is illustrated. As shown, the corrected GL surface orientation 1060 and the upright, earth normal orientation 1020 may align to a much greater degree versus the pre-rotation misalignment 1000 illustrated in FIG. 10A. It should be noted that though the corrected GL surface orientation 1060 and the upright, earth normal orientation 1020 may align to a much greater degree in the post rotation alignment 1050 of FIG. 10B, the alignment may not be exact. Instead, rotating the GL surface orientation 1010 at increments equal to the central angle measurement of that particular polygon represents a low cost, robust approach that is less taxing on processors while achieving a degree of leveling useful to an observer of an inspection video. In using such an approach to leveling video, an observer may be provided a computer animated transition where such video leveling events occur in a corrected inspection video that is generated in real-time, near real-time, or generated in post processing.

[00098] Turning to FIG. 11, a number of different display interfaces 1100a, 1100b, 1100c, and 1100d are illustrated demonstrating different polygonal shaped GL surfaces 1120a, 1120b, 1120c, and 1120d. As illustrated, the GL surface 1120a is a pentagonal shaped GL surface, the GL surface 1120b is a hexagonal shaped GL surface, the GL surface 1120c is a heptagonal shaped GL surface, and the GL surface 1120d is an octagonal shaped GL surface. It should be noted that a GL surface may be shaped via any regular polygon shape (through to an nth sided polygon).

[00099] Turning to FIG. 12, a method 1200 is disclosed for employing computer animation transitions as used in inspection systems of the present disclosure. In step 1210, the method 1200 may include generating, via one or more image sensors disposed in an inspection camera, a video signal comprising a plurality of sequential inspection images. In a parallel step 1220, the method 1200 may include generating, via one or more orientation sensors disposed in the inspection camera, an orientation signal measuring the orientation of the field of view captured by the inspection camera relative to an upright, earth normal orientation. In step 1230, subsequent to steps 1210 and 1220, the method 1200 may include receiving the video signal and orientation signal at a processing element having one or more processors, which may include one or more graphics processing units (GPUs) and/or other processors. In step 1240, the method 1200 may include determining a graphics library (GL) surface mapped on the processing element onto which the video signal images are rendered such that the GL surface represents a field of view captured in the video signal from the inspection camera that is regular polygonal in shape or is made to be a regular polygon in shape via cropping, reducing image sizes, or stretching of video images to fit the GL surface shape. For instance, in some method embodiments, inspection video images may be applied to the polygonal GL surface as a texture. The step 1240 may utilize the computer animation transition method 630 of FIG. 6B or the computer animation transition method 730 of FIG. 7B that includes rotation correction.
In step 1250, the method 1200 may further include determining, via the processing element, an orientation correction that describes rotations of the GL surface in increments equal to the measure of the central angle such that a corrected video includes orientation corrections that most closely resemble the field of view having an upright, earth normal orientation. In step 1260, the method 1200 may further include generating a computer animation transition illustrating sequential rotation steps in the degree and direction of orientation correction rotations comprising one or more inspection images from the video signal at one or more points in time and space in the inspection video (e.g., using the method 630 of FIG. 6B or the method 730 of FIG. 7B). In step 1270, the method 1200 may further include outputting a corrected video signal that includes an upright, earth normal oriented video along the most upright rotation increment and further includes one or more computer animation transitions where orientation correction rotations occur. In step 1280, the corrected upright video having computer animation transitions and associated data may optionally be stored in one or more non-transitory memories (e.g., the memory element 174 of FIG. 1B). In step 1290, the method 1200 may further include displaying, on one or more electronic display devices, the corrected upright video having computer animation transitions from the corrected video signal. For instance, the corrected upright video having computer animation transitions from the corrected video signal may be displayed on the CCU 162 illustrated in FIGs. 1A and 1B, the smart phone 164 illustrated in FIGs. 1A and 1B, the remote computing device 166 illustrated in FIGs. 1A and 1B, and/or other display devices not illustrated.

[000100] FIG. 13 illustrates details of an exemplary embodiment 1300 of a method for editing an image to smooth the transition between the edges of an actual image and regions of a display that are filled with background using image editing filters, algorithms, and techniques such as blurring, pixel averaging, etc. The method starts by generating images (an original video stream) 1310, which may be captured from a camera, photosensor(s), or other image capturing devices, and then receiving the video signal at a processing element 1320. Next, various image/photo editing techniques are applied. Background image(s) to be displayed are rotated to align with a display screen 1330. The image(s) selected to be used as a background may be a single image or multiple images, and may be still images or a video stream. Next, the background image is edited to fit (fully cover) the display area of the display screen in step 1340 by enlarging, stretching, extending, or otherwise scaling the images. Scaling may be proportional (i.e., keeping the original image proportions) or nonproportional. In step 1350, the background image is blurred using a Gaussian filter or other blurring filters, algorithms, or techniques. As an alternative to blurring, pixel averaging may be used. In step 1360, the rotated images to be viewed from step 1330 are overlaid over the background image created in step 1350, and then the images are rendered as a single image on a display screen 1370. As an alternative to rendering as a single image, images may be composed of discrete layers, and rendered to look like a single image.
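As a concrete illustration of the pixel-averaging alternative to a Gaussian filter mentioned in step 1350, here is a minimal box blur over a 2D grid of grayscale values; this is a sketch only (hypothetical helper name), and a production implementation would use an optimized image library or GPU filter:

```python
def box_blur(pixels, radius=1):
    # Replace each pixel with the average of its neighborhood,
    # clamping at the grid edges.  `pixels` is a 2D list of
    # grayscale values.
    h, w = len(pixels), len(pixels[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        total += pixels[yy][xx]
                        count += 1
            out[y][x] = total / count
    return out
```

Applying such a blur only to the background regions softens the seam between the actual image and the fill, which is the goal of the method of FIG. 13.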

[000101] FIG. 14 illustrates details of an exemplary embodiment 1400 of an illustration of rotating and scaling an image to be displayed. An image 1410 is rotated as necessary to line up the image 1410 with a display 1420 in a desired position. Then, the image 1410 is scaled by resizing both the horizontal and vertical sides of the image frame 1430 to fit the display. A chosen background image or video stream, as well as an actual image to be overlaid and displayed on the background image, may both be fitted to the display screen in a like manner. Note that the sequence of image editing steps to fit the display screen (e.g., rotating, scaling, etc.) may be done in any order or simultaneously.

[000102] FIG. 15 illustrates details of an exemplary embodiment 1500 of an illustration of an image being displayed that does not completely cover the display area. Image 1410 is shown aligned and scaled to fit display screen 1420. However, uncovered/unfilled non-image background areas or regions 1510 of display screen 1420 may remain due to different aspect ratios between the camera that captured image 1410 and the display screen 1420. Different images to be displayed 1410 may have uncovered/unfilled areas or regions 1510 of different sizes, shapes, and locations than shown.
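The size of the uncovered regions 1510 follows directly from the aspect-ratio mismatch between camera and display; a minimal sketch (hypothetical helper name) of proportional fitting and the resulting background bars:

```python
def fit_and_bars(img_w, img_h, disp_w, disp_h):
    # Scale the image proportionally so it fits entirely inside the
    # display, then report the uncovered background bars left over
    # on each side (the regions 1510 of FIG. 15).
    scale = min(disp_w / img_w, disp_h / img_h)
    fit_w, fit_h = round(img_w * scale), round(img_h * scale)
    bar_w = (disp_w - fit_w) // 2   # vertical bars, left and right
    bar_h = (disp_h - fit_h) // 2   # horizontal bars, top and bottom
    return (fit_w, fit_h), (bar_w, bar_h)
```

For example, a 640 x 480 frame on an 800 x 480 display leaves eighty-pixel bars on the left and right, which the background-fill method of FIG. 13 would cover with a blurred copy of the image.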

[000103] FIG. 16 illustrates details of an exemplary embodiment 1600 of an illustration of an image 1610 before it has been edited by overlaying the image 1610 on another similar image and then applying a blurring algorithm or technique. An actual image to be displayed 1610 is shown on a display screen 1620 with uncovered/unfilled non-image background areas or regions 1630.

[000104] FIG. 17 illustrates details of an exemplary embodiment 1700 of an illustration of an image 1610 after it has been edited by applying a Gaussian filter to provide a blurring effect. An actual image to be displayed 1610 is shown on a display screen 1620 with previously uncovered/unfilled non-image background areas or regions 1630 that have now been blurred to provide a more appealing, natural view by smoothing out the transition between the actual image 1610 and the non-image background 1630. FIG. 18 illustrates details of an exemplary embodiment 1800 of a method for mirroring a frame around itself to create a single composite image with smoother edge transitions by aligning the images of the created multiple frames. The method begins by generating, via one or more image sensors disposed in an inspection camera, a video signal comprising a plurality of sequential inspection images (an original video stream) representing the field of view captured by a camera 1810. Next, the video signal is received at a processing element having one or more graphics processing units (GPUs) or other processors 1820. In block 1830, an image to be displayed is rotated to align it at an angle offset from the display screen orientation. Next, a composite image is created by mirroring multiple frames of the image around the image itself 1840. The composite image from step 1840 is then cropped to create a single final image that completely fits (fully covers) the area of the display screen 1850. In step 1860, blurring is applied to regions of the final composite image that were not part of the image to be displayed. Lastly, the final composite image is rendered on a display screen 1870.
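The mirroring of step 1840 can be sketched on a 2D pixel grid: surrounding a frame with flipped copies of itself makes every tile boundary continuous, which is why the composite's edge transitions look smooth before cropping. `mirror_composite` is a hypothetical name, and a real implementation would operate on GL surfaces or image buffers rather than nested lists:

```python
def mirror_composite(tile):
    # Build a 3 x 3 composite: the original frame in the center,
    # left/right mirrors beside it, top/bottom mirrors above and
    # below, and doubly-flipped copies in the corners.  Adjacent
    # tiles share their boundary pixels in reverse, so seams are
    # continuous.
    flip_h = [row[::-1] for row in tile]      # left/right mirror
    flip_v = tile[::-1]                       # top/bottom mirror
    flip_hv = [row[::-1] for row in flip_v]   # mirrored both ways
    top = [a + b + c for a, b, c in zip(flip_hv, flip_v, flip_hv)]
    mid = [a + b + c for a, b, c in zip(flip_h, tile, flip_h)]
    return top + mid + top  # bottom band mirrors the top band
```

Cropping this composite back down to the display area (step 1850) then yields a full-coverage image whose fill regions blend into the original frame.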

[000105] FIG. 19 illustrates details of an exemplary embodiment 1900 of a composite image 1910 created by mirroring the single image 1910 around itself, and then cropping the mirrored result so that only the composite image 1910 is rendered on display screen 1920. As an alternative to using a single image 1910, each image may actually be a separate GL surface.

[000106] FIG. 20 illustrates details of an exemplary embodiment 2000 of a variation of FIG. 19 in which the mirrored images are projected by any of various means in the art so that they appear at an angle to the main display image that is consistent with looking directly into a section of pipe. This may give the sensation of motion and depth in the pipe and ameliorate any disorienting visual-perceptive qualities of digital image rotation. A composite image 1910 is created by mirroring a single image around itself to produce mirrored images 2010. Then, blurring effects are applied to the regions of the mirrored images 2010 that overlap with the composite image 1910. Bending effects are then applied to the composite image 1910 to create an image with a moving forward visual perception view. The final composite image 1910 may then be cropped to fit a display screen 1920.

[000107] In some configurations, the apparatus or systems described herein may include means for implementing features or providing functions described herein. In one aspect, the aforementioned means may be a module including a processor or processors, associated memory and/or other electronics in which embodiments of the invention reside, such as to implement image and/or video signal processing, switching, transmission, or other functions to process and/or condition camera outputs, control lighting elements, control camera selection, or provide other electronic or optical functions described herein. These may be, for example, modules or apparatus residing in camera assemblies, camera and lighting assemblies, or other assemblies disposed on or within a push-cable or similar apparatus.

[000108] Those of skill in the art would understand that information and signals, such as video and/or audio signals or data, control signals, or other signals or data may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.

[000109] Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, electro-mechanical components, or combinations thereof. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.

[000110] The various illustrative functions and circuits described in connection with the embodiments disclosed herein with respect to tools, instruments, and other described devices may be implemented or performed in one or more processing elements using elements such as a general or special purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Processing elements may include hardware and/or software/firmware to implement the functions described herein in various combinations.

[000111] The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use various embodiments. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure.

[000112] Accordingly, the presently claimed invention is not intended to be limited to the aspects shown herein, but is to be accorded the full scope consistent with the specification and drawings, wherein reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more.” Unless specifically stated otherwise, the term “some” refers to one or more. A phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover: a; b; c; a and b; a and c; b and c; and a, b and c.

[000113] The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the spirit or scope of the disclosure. Thus, the scope of the present disclosure is not intended to be limited to only the specific aspects shown herein but should be accorded the widest scope consistent with the embodiments herein and their equivalents.