

Title:
SOLAR POINTING SYSTEM
Document Type and Number:
WIPO Patent Application WO/2015/107559
Kind Code:
A1
Abstract:
A pointing sensor for solar tracking systems comprising image acquisition means (1), a shading sensor (2) integral and aligned with said image acquisition means (1), a processing unit (11) and a containing body (7) opaque to light radiation, which forms the divider of the shading sensor (2).

Inventors:
VINCENZI DONATO (IT)
BARICORDI STEFANO (IT)
OCCHIALI MASSIMILIANO (IT)
Application Number:
PCT/IT2015/000005
Publication Date:
July 23, 2015
Filing Date:
January 16, 2015
Assignee:
VINCENZI DONATO (IT)
BARICORDI STEFANO (IT)
OCCHIALI MASSIMILIANO (IT)
International Classes:
G01S3/786
Foreign References:
KR20070092695A2007-09-13
CN201724736U2011-01-26
Attorney, Agent or Firm:
NATALI, Dario et al. (Via Cola di Rienzo 265, Roma RM, IT)
Claims:
Claims

1. Pointing sensor for solar tracking system comprising image acquisition means (1), a shading sensor (2) integral and aligned with the image acquisition means (1), a processing unit (11), and a container body (7) opaque to light radiation, which defines the septum of said shading sensor (2).

2. Pointing sensor for solar tracking system according to claim 1, in which said acquisition means (1) comprise at least one imaging system and a sensor (4) provided with a photosensitive area (9), and are characterized by a first field of view (101) having an optical axis (100) substantially parallel to the axis of said container body (7).

3. Pointing sensor for solar tracking system according to claim 1 or 2, in which said shading sensor (2) comprises at least two pairs of photosensitive elements (5, 6) separated by a septum opaque to light radiation, consisting of all or part of the opaque structure (7).

4. Pointing sensor for solar tracking system according to claim 3, in which the pairs of photosensitive elements (5, 6) are arranged symmetrically with respect to the opaque structure (7) that forms the body of the camera of the image system (1).

5. Pointing sensor for solar tracking system according to claim 3 or 4, in which said opaque septum (7) and said pairs of photosensitive elements (5, 6) together define a field of view (102) substantially aligned with the first field of view (101) but having a significantly greater amplitude.

6. Pointing sensor for solar tracking system according to any one of claims 3 to 5, in which said processing unit (11) processes the electrical signals output from the sensor (4) and from the pairs of photosensitive elements (5, 6) in order to identify and communicate the direction from which the sun's rays come.

7. Pointing sensor for solar tracking system according to any one of claims 1 to 6, in which the data processed by the processing unit (11) are used to move a mobile mechanical structure, integral with said sensor, with respect to the direction from which the sun's rays come.

Description:
TITLE

" SOLAR POINTING SYSTEM

DESCRIPTION

The present invention concerns a solar pointing sensor for solar tracking systems, capable of precisely assessing the angle of misalignment between the direction of the solar rays and that of a reference axis, and of providing the actuators of the solar tracking system with the steering signal necessary for correcting this misalignment. Said pointing sensor incorporates a shading sensor and a videocamera which make it possible to identify the position of the sun within a first field of view and to precisely measure the pointing error when the sun is located within a second field of view, substantially more restricted than said first field of view.

Some solar systems, photovoltaic or thermal, are moved by suitable mechanical structures so that they are always oriented perpendicularly to the direction of the solar radiation, thus allowing more effective collection of the power emitted by the sun.

Photovoltaic and thermal solar systems are designed to convert solar energy into electrical or thermal energy.

In the northern hemisphere said systems are commonly exposed to solar radiation by orienting their active surface towards the south and inclining it by a suitable angle (called 'elevation') with respect to the horizon, so as to optimise the angle between the solar radiation and the perpendicular to the active surface. This solution makes it possible to maximise the efficiency of collection of the solar radiation in the hours of greatest irradiation.

In some cases solar systems are mounted on mechanical or electromechanical structures capable of tracking the sun in its apparent movement, and thus allow such systems to be always oriented perpendicularly to the direct solar radiation, i.e. to the radiation which arrives directly from the solar disc. This solution, too, makes it possible to maximise the efficiency of collection of the solar radiation over the course of the entire day. Certain particular solar systems called "concentration systems" make use of optical systems such as lenses or mirrors to concentrate the solar radiation onto active elements of small dimensions. Such active elements can be photovoltaic cells capable of generating electricity, or thermal receivers capable of converting solar radiation into heat.

In both cases, the optical concentration system makes it possible to collect very effectively only the solar radiation which falls within an imaginary cone whose semi-aperture is called angular "acceptance". The axis of this cone is typically referred to as the optical axis of the system. Compared with flat solar systems, concentration solar systems have an extremely small angular acceptance, and to ensure optimal efficiency of collection it is indispensable that the optical axis of the system is precisely aligned to the direction from which the solar rays come.

In photovoltaic systems based on flat panels the angle of acceptance is typically several degrees; nevertheless, the need for correct alignment with the solar radiation also applies to flat solar systems.

Solar tracking systems are typically equipped with electronic control logic capable of providing signals to the actuators such as to enable the conversion systems to be constantly at the optimal angle of inclination, i.e. to align the optical axis of the systems with the direction from which the solar rays come. Such electronic control systems can make use of the astronomical coordinates (ephemerides) of the sun for a given day and a given time, or they can use alignment sensors which provide a direct indication of the inclination with respect to the direction of the solar radiation.

The typical precision of tracking systems depends strongly on the type of conversion systems used: in the case where the tracking system is coupled to flat panels, whether photovoltaic or thermal, the tolerable error is typically of several degrees, whereas in the case of concentration systems the tolerable error is typically of several tenths of a degree.

Manufacturers of solar tracking systems usually specify the precision with which the system is capable of aligning itself to the direction of the solar radiation, but instrumental proofs of this information are very rarely provided. In addition, the alignment error may depend on environmental conditions (wind load, thermal deformations, clouds) or on contingent conditions (dust accumulated on optical positioning sensors, errors in the timetable used for calculating the solar ephemerides).

The use of an instrument responsible for measuring the angular precision with which the movement system tracks the sun is therefore of extreme importance for providing a quantitative indication of the error committed and of the temporal course of this error, so as to be able to correlate it to particular environmental or contingent conditions. Ideally, this instrument must make it possible to be able to move the solar tracking system and correct the pointing error even when the angle of misalignment is of several tens of degrees, as can happen at sunrise or on days with variable weather.

There are on the market a number of instruments which provide this measurement and generally make use of videocameras, i.e. optical systems which form the image of the solar source on CMOS (complementary metal-oxide semiconductor) sensors, CCDs (charge coupled devices), or PSDs (position sensitive devices).

All systems based on the formation of the image of the sun on a sensor are characterised by a field of view having a certain width, typically expressed in degrees. To carry out a precise measurement of the angle of misalignment between the direction of the solar rays and the optical axis, it is necessary that small angular movements of the solar source are associated with relatively large movements of the position of the image of this source on the sensor. This is made possible by using optical image-forming systems having a relatively large focal length compared with the lateral dimension of the sensor (CMOS, CCD or PSD) used. The field of view of these optical systems is typically less than +/- 5° and offers an accuracy in measuring the angle of misalignment of +/- 0.01°.
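The relation between focal length, pixel pitch and angular resolution described above can be sketched numerically. The following Python snippet is illustrative only and is not part of the patent; the 25 mm focal length and 5 µm pixel pitch are assumed values chosen to show how a sub-0.02° resolution can arise from simple thin-lens geometry.

```python
import math

def image_shift_mm(focal_length_mm: float, angle_deg: float) -> float:
    """Lateral shift of the solar image on the sensor for a given
    misalignment angle (pinhole/thin-lens geometry: d = f * tan(a))."""
    return focal_length_mm * math.tan(math.radians(angle_deg))

def angular_resolution_deg(focal_length_mm: float, pixel_pitch_mm: float) -> float:
    """Smallest misalignment angle that moves the image by one pixel."""
    return math.degrees(math.atan(pixel_pitch_mm / focal_length_mm))

# Assumed values (not from the patent): 25 mm focal length, 5 um pixel pitch.
print(round(angular_resolution_deg(25.0, 0.005), 4))  # about 0.01 degrees per pixel
print(round(image_shift_mm(25.0, 5.0), 2))            # shift at the edge of a 5-degree half-FOV
```

A longer focal length improves resolution but, for a fixed sensor size, narrows the field of view — which is exactly the trade-off the rest of the description addresses.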

These systems offer excellent precision in measuring the pointing error but cannot be used for directly steering solar tracking systems because if the sun is located outside the field of view of the image-forming system, it is not possible to have an indication of the direction towards which the solar tracking system should be moved in order to be able to frame it once more.

There are examples of pointing sensors which use two distinct videocameras (see for example M. Davis, J. Lawler, J. Coyle, A. Reich, T. Williams, "Machine vision as a method for characterizing", Photovoltaic Specialists Conference, 2008. PVSC '08, 33rd IEEE, 11-16 May 2008), each having a different field of view; in this way it is possible to exploit the videocamera with the narrowest field of view to measure the angle of misalignment with greater resolution, while the videocamera with the widest field of view makes it possible to roughly identify the position of the sun even for considerably larger angles of misalignment. This solution makes it possible to use this type of pointing sensor for direct steering of solar tracking systems, but it is characterised by an intrinsically high cost. Each videocamera, in fact, has image-forming optics, a sensor and a steering unit for the sensor, which significantly increase the total cost.

Solar pointing systems are also known to experts in the field, based on shading sensors, commonly called four-quadrant sensors. These sensors are based on four photosensitive elements exposed to the solar rays and separated from each other by an opaque divider of suitable shape. This divider projects onto the plane in which the four photosensitive elements lie a shadow whose shape depends on the direction in which the sun is located with respect to a reference direction, typically perpendicular to the plane on which the four photosensitive elements lie. We cite as examples patents US 4146785, US 4151408, US 4302710 and CN 202351713. Thanks to the selective shading of a subset of the photosensitive elements, it is possible to identify the direction from which the solar rays come and therefore to have indications on how to move the solar tracker in order to eliminate the angle of misalignment.

A similar function can be performed by using three or, more frequently, four flat photosensitive elements (typically photodiodes or small photovoltaic cells) arranged on the inclined faces of a pyramid with a triangular or square base. The effect of the different relative inclination between the surfaces of the photosensitive elements and the direction of the solar rays makes it possible to identify the direction in which the sun is located.

In some cases, the components used as photosensitive elements are the flat photovoltaic modules themselves, which are arranged symmetrically with respect to a directrix direction. We cite as examples patents EP 2557431 A1 and CN 202109902 U.

This type of shading sensor has a relatively low cost and makes it possible to identify in which quadrant of the sky the sun is located, with respect to a reference direction of its own for each sensor; because of its intrinsic characteristics, however, it does not make it possible to precisely measure the angle between the direction of the solar rays and said reference direction.

From the detailed description of the pointing sensor which is the subject of this patent application, the advantages of this solution may be understood, compared with the sensors hitherto developed and known to experts in the field. The subject of the present invention is therefore a pointing sensor for solar tracking systems comprising image acquisition means, a shading sensor integral and aligned with said image acquisition means, a processing unit and a containing body opaque to light radiation.

These and other characteristics of the invention, and the advantages which derive from them, will appear evident in the detailed description which follows of a preferred embodiment thereof, presented by way of non-limiting example, with reference to the attached plates of drawings, in which:

figure 1 represents the solar pointing sensor according to the present invention;

figure 2 represents the field of view of the image acquisition means of the pointing sensor depicted in Fig. 1;

figure 3 represents the field of view of the shading sensor of the pointing sensor depicted in Fig. 1;

figure 4A represents the pointing sensor according to the present invention in the case where the solar source lies inside the field of view of the acquisition means;

figure 4B represents the pointing sensor according to the present invention in the case where the solar source lies outside the field of view of the acquisition means.

Figure 1 represents the preferred embodiment of the pointing sensor which is the subject of this patent application. The videocamera 1 is enclosed within an opaque structure which functions as a divider 7 for the shading sensor 2.

Said shading sensor 2 consists of said opaque divider 7 and two pairs 5, 6 of photosensitive elements mounted on the same electronic board 15 that carries the sensor 4 of the videocamera 1.

The opaque divider 7 can advantageously be used for enclosing not only the videocamera 1 but also a series of filters 16 which make it possible to attenuate the intensity or modify the spectrum of the light radiation falling on the instrument.

The processing unit 11 is located in proximity to the electronic board 15 and electrically connected thereto.

Figure 2 represents the field of view 101 of the videocamera 1. The image-forming optics 3, 8 and the sensor 4 together determine the optical axis 100 and the width of the field of view 101.

All the rays which enter the optical system through the aperture 8 at angles equal to or less than the angle which defines the field of view 101 are focused on the sensitive area 9 of sensor 4.

The sensor defines a plane 103 on which, preferably, the pairs of photosensitive elements 5, 6 also lie; together with the divider 7, these constitute the shading sensor 2.

The divider 7 also constitutes the body of the videocamera 1 which contains optical elements 3,8 and sensor 4.

The pointing sensor for solar tracking systems according to the present invention consists of image acquisition means 1, substantially a videocamera, having a first field of view 101; a shading sensor 2, integral and aligned with said videocamera 1, having a second field of view 102, substantially wider than said first field of view 101; and an electronic processing unit 11.

The field of view 101 of videocamera 1 is defined as the portion of space identified by all the rays which, passing through the aperture 8 and the image-forming system 3, are focused on one of the points which constitute the photosensitive area 9 of sensor 4.

Said videocamera 1 comprises an aperture 8, an image-forming system 3 and a sensor 4 consisting of a plurality of photosensitive elements arranged as a matrix 9, called 'pixels'.

Preferably, but without limitation, the sensor can be of CMOS or CCD type, or made with another technology known to experts in the field.

The sensor 4 is mounted on an electronic board 15, preferably perpendicular to the optical axis 100, which defines the plane of the sensor 103.

The image-forming system 3 and the sensor 4 together define an optical axis 100 such that a ray of light parallel to said optical axis 100 is focused substantially at the centre of sensor 4.

The optical components 3, 4, 8 of the videocamera 1 are enclosed in a structure opaque to solar radiation which extends in the direction of the optical axis 100 and is constituted by the divider 7 of the shading sensor 2.

In a possible embodiment of the videocamera 1 the image-forming system 3 consists simply of the aperture 8, whose diameter is reduced so as to behave as a "pinhole". The diameter of this aperture 8, in the case where it is used as a "pinhole", is typically less than 1 mm, and preferably of a few tenths of a millimetre.
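In the pinhole embodiment, the field of view is fixed purely by geometry: the half-width of the field of view is the largest angle at which a ray through the aperture still lands on the sensitive area. The sketch below is illustrative and not from the patent; the sensor half-width and pinhole-to-sensor distance are assumed values chosen to land near the sub-5° half-width preferred later in the description.

```python
import math

def half_fov_deg(sensor_half_width_mm: float, pinhole_distance_mm: float) -> float:
    """Half-width of the field of view of a pinhole camera: the largest
    angle at which a ray through the pinhole still reaches the sensor."""
    return math.degrees(math.atan(sensor_half_width_mm / pinhole_distance_mm))

# Assumed values (not from the patent): a sensor of 3 mm half-width
# placed 35 mm behind the pinhole.
print(round(half_fov_deg(3.0, 35.0), 1))  # roughly a 5-degree half-FOV
```

Moving the sensor further from the pinhole narrows the field of view and improves angular resolution, which is why a restricted field of view and fine measurement go together in this design.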

The shading sensor 2 is made up of at least two pairs of photosensitive elements 5,6 and of an opaque divider 7. Each of the pairs 5,6 of photosensitive elements is separated by at least one portion of said opaque divider 7 which extends along an axis substantially coinciding with the optical axis 100.

The photosensitive elements which constitute the pairs 5, 6 are preferably mounted on the same electronic board 15 as the sensor 4 and are arranged symmetrically with respect to the divider 7 in proximity to its perimeter.

The opaque divider 7 and the pairs of photosensitive elements 5, 6 together identify a field of view 102, defined as the portion of space within which the solar source produces a differential electrical signal in the pairs of photosensitive elements 5, 6 such as to make possible the indirect identification of the direction from which the solar rays are coming.

The position of the photosensitive elements which constitute the pairs 5 and 6 and the shape of the divider 7 ensure that, in the case of perfect alignment between the optical axis 100 and the direction of the solar rays, said photosensitive elements are, in pairs, illuminated with the same intensity and therefore produce an electrical signal, in pairs, of equal amplitude. If the direction of the solar rays is not perfectly aligned with the optical axis 100 the photosensitive elements which constitute pairs 5 and 6 produce an electrical signal of greater amplitude for the more greatly illuminated elements, indirectly indicating the direction in which the sun is located.
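The pairwise comparison just described can be sketched as a simple differential computation. This Python fragment is an illustrative reading of the principle, not the patent's implementation; the element names, the normalisation and the threshold are all assumptions.

```python
def shading_direction(e: float, w: float, n: float, s: float, threshold: float = 0.05):
    """Infer the direction of the sun from two orthogonal pairs of
    photosensitive elements separated by an opaque divider.
    e, w, n, s: photocurrents of the east/west and north/south elements
    (hypothetical labels). Returns the normalised differential signals
    and a coarse move command per axis in {-1, 0, +1}; the sign tells
    the tracker which way to move, the magnitude is only qualitative."""
    dx = (e - w) / (e + w) if (e + w) else 0.0
    dy = (n - s) / (n + s) if (n + s) else 0.0
    move_x = 0 if abs(dx) < threshold else (1 if dx > 0 else -1)
    move_y = 0 if abs(dy) < threshold else (1 if dy > 0 else -1)
    return dx, dy, move_x, move_y

# Perfect alignment: equal illumination in each pair, no movement needed.
print(shading_direction(0.8, 0.8, 0.8, 0.8))
# East element more illuminated than the shaded west element: move east.
print(shading_direction(1.0, 0.6, 0.8, 0.8))
```

Note that the differential signal identifies only the portion of space in which the sun lies, not the precise misalignment angle — consistent with the limits of shading sensors stated earlier.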

When the sun lies within the field of view 101 of the videocamera 1, the image-forming system 3 makes it possible to form an image 12 of the solar source 10 inside the sensitive area 9 of the sensor 4, as in figure 4A.

Figure 3 represents the field of view 102 of the shading sensor 2. Said shading sensor is made up of at least two pairs of photosensitive elements 5, 6 arranged symmetrically with respect to an opaque divider 7. When the direction from which the light rays emitted by the solar source 10 come is not perfectly aligned with the axis 100 of the opaque divider 7, a shaded region 18 is created which causes one of the two photosensitive elements that constitute pairs 5 and 6 to be less illuminated than the respective element positioned diametrically opposite with respect to the opaque divider 7. The electronic board 15 measures the electrical signal generated by the photosensitive elements of the pairs 5, 6, obtaining an indication of the portion of space from which the solar rays come. Because of the particular conformation of the shading sensor 2 it is not possible to effect a precise measurement of the angle between the direction from which the solar rays come and the optical axis 100, but it is possible to identify the portion of space in which the solar source 10 is located within a field of view 102 very much wider than field of view 101, and typically of a half-width greater than 60°.

Figure 4A represents the sensor 4 of the videocamera 1 and the matrix of photosensitive elements 9 called 'pixels'. If the solar source 10 lies within the field of view 101 of the videocamera 1, the optical components 3, 8 bring about the formation of the image 12 of said solar source 10 within the sensitive area 9 of sensor 4.

In these conditions, by analysing successive images it is possible to calculate the angular displacement which the solar source 10 undergoes with respect to the optical axis 100 of the videocamera 1.

This image 12 is processed by the processing unit 11 which, by analysing consecutive images, associates the position of said image 12 within the sensitive area 9 of sensor 4 with an angular displacement, to be attributed to a misalignment between the optical axis 100 of the videocamera 1 and the direction from which the solar rays come.
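The patent does not specify how the processing unit maps the image position to an angle; one common approach, sketched here in Python under the same thin-lens geometry used above, is to take the intensity-weighted centroid of the solar image and convert its offset from the sensor centre into two misalignment angles. Function names and numerical values are illustrative assumptions.

```python
import math

def centroid(image):
    """Intensity-weighted centroid (x, y) of the solar image on the
    pixel matrix, given as a list of rows of intensities."""
    total = tx = ty = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            total += v
            tx += x * v
            ty += y * v
    return (tx / total, ty / total) if total else None

def misalignment_deg(cx, cy, width, height, pixel_pitch_mm, focal_mm):
    """Convert the centroid offset from the sensor centre into an
    angular misalignment about the two axes (thin-lens geometry)."""
    dx_mm = (cx - (width - 1) / 2) * pixel_pitch_mm
    dy_mm = (cy - (height - 1) / 2) * pixel_pitch_mm
    return (math.degrees(math.atan(dx_mm / focal_mm)),
            math.degrees(math.atan(dy_mm / focal_mm)))
```

For example, a single bright pixel one column to the right of the centre of a 5x5 matrix, with an assumed 5 µm pitch and 25 mm focal length, corresponds to a misalignment of roughly 0.01° about the horizontal axis and none about the vertical one.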

Said signal processing unit 11 is capable of communicating to the outside the direction towards which the solar tracking system should be moved, the pointing error, and all the results of processing the signals coming from the videocamera 1 and from the shading sensor 2.

Preferably, but without limitation, the signal processing unit 11 communicates to the outside by means of electrical signals.

Alternatively, communication to the outside by the signal processing unit 11 can be by a wireless communication protocol (WiFi, Bluetooth, ZigBee, or another protocol known to experts in the field) or by optical signals, possibly conveyed in optical fibre.

In operating conditions it can happen that the misalignment between the direction of the solar rays and the optical axis 100 of the videocamera 1 becomes greater than the maximum aperture of the field of view 101, and the image 12 of the solar source is formed outside the sensitive area 9 of the sensor 4 as in figure 4B. In this case the processing unit 11 is not able to utilise the output signal from the videocamera 1 to obtain information about the position of the sun with respect to the optical axis 100. Without the aid of a complementary alignment system 2, it would not be possible to identify the direction towards which the solar tracking system should be moved to bring the image 12 of the solar source 10 within the sensitive area 9 of the sensor 4.

In the pointing sensor here described, the processing unit 11 acquires not only the output signal from the videocamera 1 but also the electrical signal generated by the shading sensor 2. The comparison between the electrical signal generated by each of the photosensitive elements which constitute pairs 5 and 6 makes it possible for the processing unit 11 to determine in any event in which direction the sun is located with respect to the axis 100, even if the sun is outside field of view 101.

As regards the general characteristics of the shading sensor 2, it does not generally permit precise measurements of the angle of misalignment between the direction of the solar rays and axis 100, but it allows the processing unit 11 to identify the direction towards which the solar tracking system should be moved to bring the solar source within the field of view 101 of videocamera 1. Once the solar source comes within the field of view 101 of the videocamera 1, the processing unit 11 makes it possible to measure precisely the angle of misalignment and to make fine corrections to the pointing direction of the solar tracking system.
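The coarse/fine hand-over just described can be summarised as a small piece of control logic. The following Python sketch is an illustration of the principle only: the step size of the coarse movement, the function names and the signal shapes are assumptions, not values from the patent.

```python
def steering_command(camera_angle, shading_dir, fov_half_deg=5.0, coarse_step_deg=1.0):
    """Two-stage pointing logic combining the two sensors:
    - if the videocamera sees the sun, return the precise misalignment
      angles for a fine correction;
    - otherwise fall back on the shading sensor, which gives only the
      direction (sign) of the required coarse movement.
    camera_angle: (ax, ay) in degrees, or None when the image of the
    sun falls outside the sensitive area of the sensor.
    shading_dir: (sx, sy) with each component in {-1, 0, +1}."""
    if camera_angle is not None:
        ax, ay = camera_angle
        if abs(ax) <= fov_half_deg and abs(ay) <= fov_half_deg:
            return ("fine", ax, ay)
    sx, sy = shading_dir
    return ("coarse", coarse_step_deg * sx, coarse_step_deg * sy)

# Sun outside the camera's field of view: coarse pre-positioning.
print(steering_command(None, (1, -1)))
# Sun framed by the camera: fine correction with the measured angles.
print(steering_command((0.3, -0.1), (0, 0)))
```

Iterating this logic drives the tracker coarsely until the sun enters the narrow field of view, after which the precise camera measurement takes over — the behaviour the integrated instrument is designed for.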

Combining the shading sensor 2 and the videocamera 1 into a single integrated instrument offers various advantages.

The use of the body of the videocamera 1 as the opaque divider 7 of the shading sensor 2 makes it possible to perfectly align the optical axis of the two sensors to form a single optical axis 100.

Otherwise, the two separate sensors would require micrometric adjustment of the direction of the divider 7 in order to obtain the condition of parallelism between the respective optical axes.

Figure 4B represents the same sensor 4 of the videocamera 1 and the same matrix of photosensitive elements 9 in the case where the solar source 10 lies outside the field of view 101 of said videocamera 1.

In this case, the optical components 3, 8 form an image 12 of the solar source 10 outside the photosensitive area 9. This entails that an analysis of the image collected by the sensor 4 does not make it possible to have any information on the position of the solar source 10 with respect to the optical axis 100 of the videocamera 1.

To get good angular resolution in the measurement of the angle of misalignment between the direction from which the solar rays come and the optical axis 100, it is necessary for the field of view 101 of the videocamera 1 to be sufficiently restricted.

Preferably the half-width of the field of view of the videocamera 1 must be less than 10°, and even more preferably less than 5°.

This entails that even a modest angular displacement between the direction of the solar rays and the optical axis 100 takes the image 12 out of the sensitive area 9 of the sensor 4, with the effect that it is no longer possible to identify the direction in which the solar tracking system must be moved to bring the sun 10 within the field of view 101 of the videocamera.

This condition would preclude the use of a videocamera with a restricted field of view as a sensor for solar tracking, unless it were complemented by a further sensor providing rough pre-positioning, so as to ensure that the solar source 10 always lies inside the field of view 101 of the videocamera 1.