<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN"
"http://www.w3.org/TR/REC-html40/loose.dtd">
<HTML>
<META NAME="GENERATOR" CONTENT="TtH 2.67">

<title> Camera calibration for CIIPS Glory soccer programs 1998/1999</title>

<H1 align="center">Camera calibration for CIIPS Glory soccer programs 1998/1999 </H1>

<H3 align=center>Petter Reinholdtsen &lt; pere@td.org.uit.no &gt; </H3>

<H3 align=center>2000-03-01 </H3>

<p>
<H2><A NAME="tth_sEc1">
1</A>&nbsp;&nbsp;Purpose of this paper</H2>

<p>
The image processing functions in the CIIPS Glory soccer programs for
the years 1998 and 1999 use lookup tables to determine distances.
These lookup tables depend on the camera's physical position, the
servo configuration in the hardware description table (HDT) and the
lens view angle.  The tables assume a tilting camera.

<p>
The original programs ran on Eyebot robots with the Color QuickCam,
and the current tables reflect the settings that were valid then.
The current robots have different HDT settings, some of them have
different lenses, and others use a different camera (the EyeCam).

<p>
This document gives a short description of how to calibrate the
tables.  The method was developed by Birgit Graf for her diploma
thesis.

<p>
<H2><A NAME="tth_sEc2">
2</A>&nbsp;&nbsp;Camera focusing</H2>

<p>

<p><A NAME="tth_fIg1">
</A> <center> <img src="lensfocus.png" alt="lensfocus.png"><br> <center>Figure 1: Focus pattern</center>
<A NAME="fig:lensfocus">
</A>
</center><p>
<p>
Before the camera can be used, we need to make sure the lens is in
focus.  Focusing is done by turning the lens in its socket.  The
QuickCam lenses can be turned right away, while the EyeCam lenses
have a screw on the side which must be loosened before they will
move.

<p>
The simplest way to focus the cameras is to connect them to a PC to
see the color images at full frame rate.  Using a simple pie drawing
with 2.5 degree black and white arcs, focusing is done by turning the
lens until the blurry center is as small as possible<a href="#tthFtNtAAB" name="tthFrefAAB"><sup>1</sup></a>.  Figure <A href="#fig:lensfocus">1</A>
gives an example of this pattern.

<p>
<H2><A NAME="tth_sEc3">
3</A>&nbsp;&nbsp;Lookup tables</H2>

<p>
The soccer code uses two lookup tables to calculate different
distances and pixel widths.  Each table has three different settings,
reflecting the three camera angles used: middle=0, up=1 and down=2.
The two tables are used for horizontal and vertical calculations.

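<p>
As a concrete sketch, the two tables could be declared in C as shown
below.  The image height of 62 rows and the <tt>float</tt> element type
are my assumptions for illustration, not values taken from the real
soccer code:

```c
#include <assert.h>

/* Sketch of the two lookup tables described in the text.  IMAGEROWS = 62
 * is an assumption based on the Eyebot image size; check the real soccer
 * code for the actual dimensions. */
enum { IMAGEROWS = 62 };

/* The three camera servo settings, as listed above. */
enum campos { CAM_MIDDLE = 0, CAM_UP = 1, CAM_DOWN = 2 };

/* Horizontal table: meters-per-pixel factor for each row (section 3.1). */
float yfact[3][IMAGEROWS];

/* Vertical table: distance in meters from the camera for each pixel row
 * (section 3.2). */
float x2m[3][IMAGEROWS];
```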
<p>
<H3><A NAME="tth_sEc3.1">
3.1</A>&nbsp;&nbsp;The horizontal <tt>yfact</tt> table</H3>

<p>
The horizontal table <tt>yfact[3][imagerows]</tt> gives the
multiplication factor for each row to convert pixels to meters.  It
can be used to convert m meters to p pixels on a given row.  <tt>
campos</tt> is the numeric representation of the current camera
position.

<p>
<center>
p = m / yfact[campos][row] &nbsp;&nbsp;&nbsp;&nbsp;(1)
</center>

<p>
This is used to predict the ball width in pixels when searching for
the ball in the images.
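<p>
A minimal sketch of this use of equation (1) in C; the table entry and
the ball diameter below are made-up illustration values, the real ones
come from calibration:

```c
#include <assert.h>
#include <math.h>

enum { IMAGEROWS = 62 };      /* assumed image height */
float yfact[3][IMAGEROWS];    /* filled in by calibration */

/* Equation (1): an object m meters wide seen on this row covers
 * p = m / yfact[campos][row] pixels. */
float object_width_pixels(float m, int campos, int row)
{
    return m / yfact[campos][row];
}
```

If a given row maps to 5&nbsp;mm per pixel, a 45&nbsp;mm ball should
appear roughly 9 pixels wide on that row.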

<p>
It can also be used to convert pixels to meters.  This assumes that
the camera is mounted in the center of the robot, and calculates
pixels to the left or to the right of the robot axis.  The field pixel
located at (xpos,ypos) will then be located m<sub>y</sub> meters to the left or
right of the robot.

<p>
<center>
m<sub>y</sub> = (imagecolumns / 2 - 1 - ypos) &times; yfact[campos][xpos] &nbsp;&nbsp;&nbsp;&nbsp;(2)
</center>

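<p>
Equation (2) could be coded like this; the image dimensions and the
table entry used below are again assumptions for illustration:

```c
#include <assert.h>
#include <math.h>

enum { IMAGEROWS = 62, IMAGECOLUMNS = 82 };  /* assumed Eyebot image size */
float yfact[3][IMAGEROWS];                   /* filled in by calibration */

/* Equation (2): lateral offset in meters of the field pixel at
 * (xpos, ypos), measured from the robot axis.  The row index xpos
 * selects the scale factor; ypos is the column. */
float lateral_offset_m(int campos, int xpos, int ypos)
{
    return (IMAGECOLUMNS / 2 - 1 - ypos) * yfact[campos][xpos];
}
```

A pixel in the center column gives an offset of zero; columns further
out give larger offsets, scaled by the row's <tt>yfact</tt> entry.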
<p>
The <tt>yfact</tt> table is generated using a rectangular white or light
sheet of paper, placed perpendicular to the view direction.  The robot
is placed on the soccer field, and the paper is placed in the upper
center of the image.  The camera servo must be set to the correct
angle.  Check <tt>servos.c</tt> for the correct values.

<p>
Make sure the paper is visible in the upper rows of the image, and
that the edges are visible.  Take a snapshot.  Move the robot closer
to the paper.  The paper will now cover rows further down in the
image.  Take a new snapshot.  Continue with this procedure until all
rows are covered.  You will need three sets of snapshots, one for each
camera servo setting.  To take the snapshots I used the ImACam Eyebot
program, which produces PPM images and uploads them to the PC.

<p>
Using these images, you then measure the pixel width of the piece of
paper for each row in the image.  This width, p, is then used
together with the paper width, w, to calculate the <tt>yfact</tt>
value, f<sub>y</sub>.

<p>
<center>
f<sub>y</sub> = w / p &nbsp;&nbsp;&nbsp;&nbsp;(3)
</center>

<p>
To measure the pixel width, I used xv to display the image, ' &gt; ' to
enlarge the image, and the middle mouse button to read the pixel width.
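<p>
Each measurement then fills one slot of the table.  A sketch of
equation (3) in C follows; the function and parameter names are mine,
not taken from the soccer code:

```c
#include <assert.h>
#include <math.h>

enum { IMAGEROWS = 62 };   /* assumed image height */
float yfact[3][IMAGEROWS];

/* Equation (3): the paper of width w meters covered p pixels on this
 * row, so every pixel on the row corresponds to w / p meters. */
void set_yfact(int campos, int row, float w_meters, float p_pixels)
{
    yfact[campos][row] = w_meters / p_pixels;
}
```

For example, an A4 sheet (0.21&nbsp;m wide) covering 42 pixels on a row
gives a factor of 0.005&nbsp;m per pixel for that row.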

<p>
<H3><A NAME="tth_sEc3.2">
3.2</A>&nbsp;&nbsp;The vertical <tt>x2m</tt> table</H3>

<p>
To find the distance from the robot to an object along the view
direction, the soccer programs use the table <tt>x2m[3][imagerows]</tt>.
It translates from pixel row to distance in meters from the camera.
To make this table, the distance to the edge of a sheet of paper is
measured, together with the row number it appears in.  This needs to
be done for each row, and for each camera servo setting.

<p>
To find the distance m<sub>x</sub> from the robot, a simple table lookup is
performed:

<center>
m<sub>x</sub> = x2m[campos][xpos] &nbsp;&nbsp;&nbsp;&nbsp;(4)
</center>

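<p>
The lookup itself is trivial; a C sketch of equation (4), together with
the way a measurement would be recorded during calibration (the
function names are mine):

```c
#include <assert.h>
#include <math.h>

enum { IMAGEROWS = 62 };   /* assumed image height */
float x2m[3][IMAGEROWS];

/* Calibration: the paper edge measured at d meters appeared on this row. */
void record_distance(int campos, int row, float d_meters)
{
    x2m[campos][row] = d_meters;
}

/* Equation (4): distance along the view direction for pixel row xpos. */
float distance_m(int campos, int xpos)
{
    return x2m[campos][xpos];
}
```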
<p>
<H2><A NAME="tth_sEc4">
4</A>&nbsp;&nbsp;Interpolation</H2>

<p>
When the measured values are collected, one can use various tools to
find a formula which closely matches the measured values.  I used
Mathematica, with the help of Thomas Hanselmann, to fit one set of
readings to a simple formula.

<p>
From the dataset, I made a text file <tt>data.txt</tt> with the
coordinates (x,y) as a space-separated list: ``x1 y1 x2 y2 ...''.

<p>
I then used the following Mathematica commands to make the formula.
You might have to adjust the parameters to <tt>Fit</tt> to generate a
more accurate formula.  The data file must have all values on one line
to make Mathematica happy.

<p>

<blockquote>
<pre>
dataXY = ReadList["data.txt",
                  {Number, Number},
                  RecordLists -&#62; True][[1]];
func = Fit[dataXY, {1, x, x^1.1}, x];
ymax = Max[Transpose[dataXY][[2]]];
pOriginal = ListPlot[dataXY,
                     PlotJoined-&#62;True,
                     PlotRange-&#62;{{0,61},{0,ymax}},
                     PlotStyle-&#62;{Hue[0.1]}];
pFit = Plot[func, {x,0,61},
            PlotRange-&#62;{{0,61},{0,ymax}},
            PlotStyle-&#62;{Hue[0.6]} ];
Show[pFit,pOriginal]; func
</pre>
</blockquote>

<p>
<H2><A NAME="tth_sEc5">
5</A>&nbsp;&nbsp;The hard way</H2>

<p>
The best way to do this kind of camera calibration would be to use a
mathematical model of the camera, taking the known properties of the
camera and the servo into account.  If we make sure the HDT contains
enough information to calibrate the cameras, the programs should be
more generic and adapt better to changing settings.  I hope to find
time to investigate this further.

<p>

<H2>Appendix A</H2>

<p>
Complete PostScript file to make the lens focus pattern.

<p>
<A NAME="appendix:lensfocus">
</A>
<pre>
%!PS-Adobe-1.0
%%Title: Camera lens focusing sheet
%%Creator: Petter Reinholdtsen &lt;pere@td.org.uit.no&#62;
%%CreationDate: 1999-12-04
%%BoundingBox: 13 14 574 575
%%Pages: 1
% Place this in front of the camera, and change
% focus until the black center spot is as small
% as possible.  The camera should then be in
% focus.
/cm { 28 mul } def .00001 setlinewidth
% Center of circle
10.5 cm 10.5 cm translate
% scale 1 to fill page
10 cm 10 cm scale
newpath 0 0 1 0 360 arc stroke
72 { newpath 0 0 moveto
  0 0 1 0 2.5 arc closepath fill 5 rotate
} repeat showpage
</pre>

<p>
<hr><H3>Footnotes:</H3>

<p><a name="tthFtNtAAB"></a><a href="#tthFrefAAB"><sup>1</sup></a>Thanks
to Mark Gaynor for this method.
<p><hr><small>File translated from
T<sub><font size="-1">E</font></sub>X
by <a href="http://hutchinson.belmont.ma.us/tth/">
T<sub><font size="-1">T</font></sub>H</a>,
version 2.67.<br>On 23 Apr 2000, 20:18.</small>
</HTML>