\documentclass[a4paper,10pt,twocolumn]{article}
\title{Camera calibration for CIIPS Glory soccer programs 1998/1999}
\author{Petter Reinholdtsen $<$pere@td.org.uit.no$>$}
\section{Purpose of this paper}
The image processing functions in the CIIPS Glory soccer programs for
1998 and 1999 use lookup tables to determine distances.  These lookup
tables depend on the camera's physical position, the servo
configuration in the hardware description table (HDT) and the lens
view angle.  The tables assume a tilting camera.
The original programs ran on EyeBot robots with the Color QuickCam,
and the current tables reflect the settings which were valid then.
The current robots have different HDT settings, some of them have
different lenses and others use different cameras (EyeCam).
This document gives a short description of how to calibrate the
tables.  The method was developed by Birgit Graf for her diploma
thesis.
\section{Camera focusing}

\begin{figure}
\includegraphics[height=2.7cm,width=2.7cm]{lensfocus}
\caption{Focus pattern}
\label{fig:lensfocus}
\end{figure}
Before the camera can be used, we need to make sure the lens is in
focus.  Focusing is done by turning the lens in its socket.  The
QuickCam lenses can be turned right away, while the EyeCam lenses
need to have a screw on the side loosened before they will move.
The simplest way to focus the cameras is to connect them to a PC to
see the color images at full frame rate.  Using a simple pie drawing
with 2.5 degree black and white arcs, focusing is done by turning the
lens until the blurry center is as small as
possible\footnote{Thanks to Mark Gaynor for this method.}.
Figure~\ref{fig:lensfocus} gives an example of this pattern.
\section{Lookup tables}
The soccer code uses two lookup tables to calculate different
distances and pixel widths.  Each table has three different settings,
reflecting the three camera angles used: middle=0, up=1 and down=2.
The two tables are used for horizontal and vertical calculations.
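In C, the two tables might be declared roughly as follows.  This is
only a sketch: the image dimensions and all names except {\tt yfact},
{\tt x2m} and the camera positions are assumptions, not the actual
soccer code.

```c
/* Sketch of the lookup tables.  The 62x82 image size and all names
 * other than yfact, x2m and the camera positions are assumptions. */
#define IMAGEROWS    62
#define IMAGECOLUMNS 82

/* the three camera servo settings, used as the campos index */
enum { CAM_MIDDLE = 0, CAM_UP = 1, CAM_DOWN = 2 };

/* pixel/meter conversion factor per row, one set per camera angle */
float yfact[3][IMAGEROWS];

/* distance in meters from the camera per row, one set per angle */
float x2m[3][IMAGEROWS];
```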
\subsection{The horizontal {\tt yfact} table}
The horizontal table {\tt yfact[3][imagerows]} gives the
multiplication factor for each row to convert pixels to meters.  It
can be used to convert $m$ meters to $p$ pixels on a given
row.  {\tt campos} is the numeric representation of the current
camera position.
\begin{displaymath}
p = {m \over yfact[campos][row]}
\end{displaymath}
This is used to predict the ball width in pixels when searching for
the ball in the images.
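A minimal C sketch of this prediction, assuming a ball diameter of
4.5~cm and a stand-in table; only the formula itself comes from the
soccer code:

```c
/* Sketch: predict the ball width in pixels on a given row using
 * p = m / yfact[campos][row].  The 0.045 m ball diameter, the table
 * contents and the function name are assumptions for illustration. */
#define BALL_DIAMETER_M 0.045f

float yfact[3][62];   /* stand-in for the real yfact table */

float predict_ball_width(int campos, int row)
{
    return BALL_DIAMETER_M / yfact[campos][row];
}
```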
It can also be used to convert pixels to meters.  This assumes that
the camera is mounted in the center of the robot, and calculates
pixels to the left or to the right of the robot axis.  The field
pixel located at (xpos,ypos) will then be located $m_y$ meters to the
left or right of the axis:
\begin{displaymath}
m_y = \left({imagecolumns \over 2} - 1 - ypos\right) \times yfact[campos][xpos]
\end{displaymath}
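The same conversion as a C sketch; the 82-column image width and the
table contents are assumptions, the formula is the one above:

```c
/* Sketch: lateral offset in meters of the field pixel (xpos, ypos),
 * using m_y = (imagecolumns/2 - 1 - ypos) * yfact[campos][xpos].
 * The 82-column image width and the table contents are assumed. */
#define IMAGECOLUMNS 82

float yfact[3][62];   /* stand-in for the real yfact table */

float lateral_offset(int campos, int xpos, int ypos)
{
    return (IMAGECOLUMNS / 2 - 1 - ypos) * yfact[campos][xpos];
}
```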
The {\tt yfact} table is generated using a rectangular white or light
sheet of paper, placed perpendicular to the view direction.  The
robot is placed on the soccer field, and the paper is placed in the
upper center of the image.  The camera servo must be set to the
correct angle.  Check {\tt servos.c} for the correct values.
Make sure the paper is visible in the upper rows of the image, and
that the edges are visible.  Take a snapshot.  Move the robot closer
to the paper.  The paper will now cover rows further down in the
image.  Take a new snapshot.  Continue with this procedure until all
rows are covered.  You will need three sets of snapshots, one for
each camera servo setting.  To take snapshots I used the ImACam
EyeBot program.  This will produce PPM images and upload them to the
PC.
Using these images, you then measure the pixel width of the piece of
paper for each row in the image.  This width, $p$, is then used
together with the paper width, $w$, to calculate the {\tt yfact}
value for that row:
\begin{displaymath}
yfact[campos][row] = {w \over p}
\end{displaymath}
To measure the pixel width, I used xv to display the image, '$>$' to
enlarge the image and the middle mouse button to read the pixel
width.
\subsection{The vertical {\tt x2m} table}
To find the distance to the robot along the view direction, the
soccer programs use the table {\tt x2m[3][imagerows]}.  It translates
from pixel row to distance in meters from the camera.  To make this
table, the distance to the edge of a sheet of paper is measured,
together with the row number it appears in.  This needs to be done
for each row, and for each camera servo setting.
To find the distance $m_x$ from the robot, a simple table lookup is
enough:
\begin{displaymath}
m_x = x2m[campos][xpos]
\end{displaymath}
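Combining the two lookups gives the full position of a field pixel
relative to the robot.  A C sketch; the table contents, the
82-column width and all names other than {\tt x2m}, {\tt yfact} and
{\tt campos} are made up for illustration:

```c
/* Sketch: combine both tables to get the position of a field pixel
 * relative to the robot: m_x along the view direction, m_y sideways.
 * The 82-column image width and the table contents are assumed. */
float x2m[3][62];
float yfact[3][62];

void pixel_to_position(int campos, int xpos, int ypos,
                       float *m_x, float *m_y)
{
    *m_x = x2m[campos][xpos];
    *m_y = (82 / 2 - 1 - ypos) * yfact[campos][xpos];
}
```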
\section{Interpolation}
When the measured values are collected, one can use various tools to
find a formula which closely matches the measured values.  I used
Mathematica, with the help of Thomas Hanselmann, to interpolate one
set of readings into a polynomial.
From the dataset, I made a text file {\tt dataXY.txt} with the
coordinates (x,y) as a space-separated list: ``x1 y1 x2 y2 ...''.
I then used the following Mathematica commands to make the formula.
You might have to adjust the parameters to {\tt Fit} to generate a
more accurate formula.  The data file must have all values on one
line to make Mathematica happy.
\begin{verbatim}
dataXY = ReadList["dataXY.txt",
           RecordLists -> True][[1]];
func = Fit[dataXY, {1, x, x^1.1}, x];
ymax = Max[Transpose[dataXY][[2]]];
pOriginal = ListPlot[dataXY,
    PlotRange -> {{0,61},{0,ymax}},
    PlotStyle -> {Hue[0.1]}];
pFit = Plot[func, {x,0,61},
    PlotRange -> {{0,61},{0,ymax}},
    PlotStyle -> {Hue[0.6]}];
Show[pFit, pOriginal]; func
\end{verbatim}
\section{The hard way}
The best way to do such camera calibration would be to use a
mathematical model of the camera, taking the known properties of the
camera and the servo into account.  If we make sure the HDT contains
enough information to calibrate the cameras, the programs should be
more generic and adapt better to changing settings.  I hope to find
time to investigate this further.
\section*{Appendix A}
\label{appendix:lensfocus}

Complete PostScript file to make the lens focus pattern.

\input{lensfocus.tex}