{"id":1791,"date":"2017-10-13T22:44:31","date_gmt":"2017-10-13T20:44:31","guid":{"rendered":"https:\/\/www.adimec.com\/camera-requirements-for-3d-metrology\/"},"modified":"2018-07-27T16:09:23","modified_gmt":"2018-07-27T14:09:23","slug":"camera-requirements-for-3d-metrology","status":"publish","type":"post","link":"https:\/\/www.adimec.com\/ja\/camera-requirements-for-3d-metrology\/","title":{"rendered":"Camera Requirements for 3D Metrology"},"content":{"rendered":"
The increasing trend from 2D to 3D measurements in semiconductor and electronics metrology and inspection systems results in more stringent requirements for the OEM camera.

Beyond resolution and frame speed, there are further camera parameters to consider for accurate 3D metrology. Here are machine vision camera parameters to include when analyzing the fit of a camera within an automation system:

- Resolution: Today, 4 to 25 megapixel cameras are common. Higher resolution allows for larger inspection areas and/or higher accuracy.
- Pixel size: Whereas consumer cameras have pixels smaller than 1 µm, machine vision cameras typically have pixel sizes ranging from 4 to 10 µm. Smaller pixels may result in cheaper cameras at the cost of poorer measurement accuracy.
- Frame speed: System speed is one of the most important selling points for in-line equipment manufacturers; it is directly improved by increasing frame speed.
- Interface: The choice of interface greatly influences system design, as it determines cable length, flexibility, and overall system cost.
- Reliability: This is important during the design phase, but it also influences maintenance costs during the lifetime of the equipment.
- Spectral response: The image sensor's spectral response has to match the spectrum of the light source used. A poor match between illuminator wavelength and sensor quantum efficiency (QE) results in poor measurement accuracy due to excess noise in the image, or forces the selection of a more expensive light source.
- Dark noise: This determines the noise floor in dark image areas and influences the accuracy of the measurement.
- Full well capacity: This determines the noise in bright areas of the image and influences the accuracy of the measurement.
- Linearity: Most measurement methods assume a linear response of the pixel to light, but this may not hold for the raw image sensor output.
- Uniformity and defects: Dark signal non-uniformity (DSNU), photo response non-uniformity (PRNU), striping, shading, and defective pixels, columns, and rows all influence measurement accuracy.
- MTF: The modulation transfer function (MTF) determines how "sharp" the image is, and with it the smallest errors that can be detected and the measurement accuracy. Note that the MTF of an image sensor is wavelength-dependent and in general deteriorates at longer wavelengths (such as near-infrared light).

Not all of these parameters can be optimized at the same time, and the most important specifications depend on the application. With the most critical specifications prioritized, unnecessary costs can be avoided. Several measurement methods commonly used in semiconductor and electronics manufacturing provide good examples of this.
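To illustrate how resolution and pixel size translate into measurement scale, the object-space sampling and field of view can be estimated from sensor geometry and optical magnification. The sketch below uses hypothetical numbers (a 5120 × 5120 sensor with 4.5 µm pixels at 0.5× magnification), not the specifications of any particular camera:

```python
# Estimate object-space sampling from sensor and optics parameters.
# All numbers are illustrative, not taken from a specific camera.

def object_space_resolution(pixel_pitch_um: float, magnification: float) -> float:
    """Size on the object covered by one pixel, in micrometers."""
    return pixel_pitch_um / magnification

def field_of_view_mm(pixels: int, pixel_pitch_um: float, magnification: float) -> float:
    """Field of view along one sensor axis, in millimeters."""
    return pixels * pixel_pitch_um / magnification / 1000.0

# Hypothetical 26 MP sensor (5120 x 5120), 4.5 um pixels, 0.5x magnification
print(object_space_resolution(4.5, 0.5))   # 9.0 um on the object per pixel
print(field_of_view_mm(5120, 4.5, 0.5))    # 46.08 mm field of view
```

This makes the trade-off in the list above concrete: for a fixed sensor, a larger field of view (lower magnification) directly coarsens the object-space sampling, and vice versa.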
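Dark noise and full well capacity can be folded into a single signal-to-noise estimate. A minimal sketch of a pixel noise budget, assuming Poisson shot noise plus Gaussian read noise and dark current added in quadrature (the electron counts below are made up for illustration):

```python
import math

def snr_db(signal_e: float, read_noise_e: float, dark_current_e: float = 0.0) -> float:
    """SNR in dB for a pixel collecting `signal_e` electrons.

    Shot noise variance equals the signal (Poisson); dark current adds its own
    shot noise; read noise adds in quadrature.
    """
    noise = math.sqrt(signal_e + dark_current_e + read_noise_e ** 2)
    return 20.0 * math.log10(signal_e / noise)

# Bright area near full well: shot-noise limited
print(snr_db(20000, read_noise_e=10))   # about 43 dB
# Dark area: read noise dominates, so SNR drops sharply
print(snr_db(100, read_noise_e=10))     # about 17 dB
```

This is why both ends of the list matter for measurement accuracy: full well capacity caps the SNR in bright regions, while dark noise sets the floor in shadowed ones.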
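When the raw sensor output is not linear, a calibration curve measured at known light levels can be inverted to restore a linear scale. A minimal sketch using a synthetic response model and `numpy.interp` (a real calibration would sweep exposure on the actual camera):

```python
import numpy as np

# Synthetic calibration: known relative light levels vs. a mildly
# nonlinear sensor response (illustrative model, not real camera data).
exposure = np.linspace(0.0, 1.0, 11)
measured = 1000.0 * (exposure - 0.1 * exposure**2)   # raw DN, slightly compressed

def linearize(raw_dn):
    """Map raw digital numbers back to a linear scale via the calibration curve."""
    return np.interp(raw_dn, measured, 1000.0 * exposure)

# The sensor outputs 475 DN at half exposure; linearization restores 500.
print(linearize(475.0))  # 500.0
```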
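DSNU and PRNU are commonly compensated with a two-point flat-field correction: subtract a dark frame to remove per-pixel offsets, then divide by a dark-corrected flat frame to remove per-pixel gain differences. A minimal sketch on synthetic data (the frame values are fabricated for illustration):

```python
import numpy as np

def flat_field_correct(raw, dark, flat):
    """Two-point correction: subtract the dark frame (DSNU), divide by the
    dark-corrected flat field (PRNU), and rescale to the flat's mean level."""
    dark = dark.astype(np.float64)
    gain = flat.astype(np.float64) - dark
    return (raw.astype(np.float64) - dark) * gain.mean() / gain

# Synthetic 4x4 sensor with per-pixel gain and offset errors
rng = np.random.default_rng(0)
gain = rng.uniform(0.9, 1.1, (4, 4))     # PRNU: per-pixel gain spread
offset = rng.uniform(0.0, 5.0, (4, 4))   # DSNU: per-pixel dark offset
scene = np.full((4, 4), 100.0)           # uniform target
raw = scene * gain + offset              # what the sensor would report
dark = offset                            # dark reference frame
flat = 200.0 * gain + offset             # flat reference frame
out = flat_field_correct(raw, dark, flat)
print(out.std())  # ~0: the uniform scene is recovered
```

After correction the uniform scene comes back uniform, which is exactly the property the non-uniformity items in the list above protect: fixed-pattern errors otherwise masquerade as height or position errors in the 3D reconstruction.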