Most image sensors in use
today are made of silicon. Silicon has some
amazing properties. So here, you see a silicon atom. And when you hit a
silicon atom with a photon of sufficient energy,
it releases an electron. And what's created is called
an electron-hole pair. So now, if you have a
silicon crystal that's a lattice of silicon
atoms, and you can make this with very high
purity, you hit it with light. You have photon flux coming in. And you have electron flux,
which is being generated. There's going to be an
equilibrium between the photon flux and the electron flux. So really, silicon does most of the work for you when it comes to image sensing: you hit it with light, and it generates electrons.
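To make that flux picture concrete, here is a minimal sketch assuming a simple quantum-efficiency model. The flux, pixel area, exposure time, and quantum efficiency values below are illustrative assumptions, not numbers from the lecture:

```python
# Sketch: electrons generated in one pixel during one exposure,
# under a simple quantum-efficiency model (all numbers are made up).
photon_flux = 1.0e18        # photons per m^2 per second arriving at the pixel
pixel_area = (1.25e-6) ** 2 # a 1.25 micron x 1.25 micron pixel, in m^2
exposure_time = 0.01        # seconds
quantum_efficiency = 0.6    # fraction of photons that free an electron

photons = photon_flux * pixel_area * exposure_time
electrons = quantum_efficiency * photons
print(f"photons hitting the pixel: {photons:.0f}")
print(f"electrons generated:       {electrons:.0f}")
```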
And the work that remains to be done, which is really challenging, is to be able to read out these electrons, that is, to convert them into a voltage and read that voltage out. And also, not to
forget that you're not looking at a single pixel,
just one lattice of silicon. You actually have
millions of pixels that you want to be able to
read these charges out from. That's where a lot of
the work has gone in to create these image sensors. Now, this is what an image
sensor actually looks like. This is an 18
megapixel image sensor. And each pixel here is roughly a micron on each side, 1.25 microns in this case. That's really small, so with today's technology you can easily pack 100 million pixels onto an image sensor.
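Just to get a feel for the numbers, here is a quick back-of-the-envelope sketch. The 1.25 micron pitch and the 100 megapixel count come from the lecture; the square-sensor shape is only for illustration:

```python
# Back-of-the-envelope: how much silicon does a sensor with
# micron-scale pixels actually need?
pixel_pitch_um = 1.25          # pixel size along one side, in microns
num_pixels = 100_000_000       # 100 megapixels

pixel_area_um2 = pixel_pitch_um ** 2
sensor_area_mm2 = num_pixels * pixel_area_um2 * 1e-6   # 1 mm^2 = 1e6 um^2
side_mm = sensor_area_mm2 ** 0.5                       # if the sensor were square

print(f"active area ~ {sensor_area_mm2:.0f} mm^2")     # ~ 156 mm^2
print(f"square sensor side ~ {side_mm:.1f} mm")        # ~ 12.5 mm
```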
Now, this isn't quite like Moore's law. You know, in
computations, according to Moore's law, every 18 months
you're going to be able to, with the same real
estate, double your computational power. Well, that doesn't happen in
the case of image sensors. In this case, you come down to around the wavelength of light, which is, say, around half a micron. Once your pixel is in that range, making it smaller doesn't really help you, because resolution is now limited by diffraction, by the wavelength of light itself. Making pixels any smaller doesn't really buy you anything. So image resolution will continue to grow a little bit, but at some point the only way you can increase resolution is by making the chips larger and larger. And we're almost there.
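As a rough sketch of why that is: the smallest spot a lens can focus is about the Airy-disk diameter, roughly 2.44 times the wavelength times the f-number. The f/2.8 aperture below is an illustrative assumption, not a number from the lecture; pixels much smaller than this spot add little real resolution.

```python
# Rough diffraction-limit check: smallest resolvable spot (Airy disk)
# has diameter d ~ 2.44 * wavelength * f-number.
wavelength_um = 0.55   # green light, about half a micron as in the lecture
f_number = 2.8         # illustrative lens aperture (assumption)

airy_diameter_um = 2.44 * wavelength_um * f_number
print(f"diffraction-limited spot ~ {airy_diameter_um:.1f} microns")  # ~ 3.8 um
```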
So let's talk about the first technology that's used to create image sensors. This is called CCD, or
charge coupled devices. So here, you see your pixels. These are all your pixels. Think of each pixel as a bucket. We call these potential wells. These are wells in which
photons arrive and get converted into electrons. So it's photon to
electron conversion. And like I said before, the
real challenge is reading out, converting these electrons
into a voltage that's proportional to the
number of electrons. So the way CCD works
is that each row-- actually, let's take a look at this row-- the photon to electron conversion happens in the pixel itself. And each row passes
its electron counts, all its electrons
to the next row. And that passes it to the
next row and the next row and the next one. And finally it comes
down to this bottom row, where it is read out
horizontally one pixel at a time. That is, the electrons in
each pixel are converted to a voltage right here. And this voltage is, of course, an analog voltage, which is then converted by analog-to-digital (A-to-D) conversion to get your digital output right here. So that's the process by which these pixels are read out. That sounds simple, but the transfer of charges from one row to the next is the real innovation here. And this is a technique
called bucket brigade. So imagine that you
have a string of people. Each one has a bucket of water. And I would pass on my bucket to the next person and at the same time take a bucket from the person before me. So that's the way bucket
brigade actually works. And so in this case, how do
you actually move these charges from row to row? Well, the way you do it is
you apply electric fields to appropriate positions
underneath these buckets to slide or to
shift these charges from one row to the next. And that is a really
sophisticated piece of technology
because along the way you don't want to
lose any electrons. And you don't want to collect
any spurious electrons either. So that's CCD technology.
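Here is a tiny simulation of the readout pattern just described, written as a sketch: each row passes its charge to the row below, the bottom row is shifted out horizontally one pixel at a time, and each charge packet is converted to a digital value. The full-well capacity, the 8-bit converter, and the electron counts are made-up illustration values, not real sensor specifications.

```python
import numpy as np

def ccd_readout(charge, full_well=1000, adc_bits=8):
    """Toy model of CCD readout: vertical bucket-brigade row shifts, then the
    bottom row is read out horizontally one pixel at a time and digitized."""
    rows, cols = charge.shape
    image = np.zeros((rows, cols), dtype=int)
    charge = charge.astype(float)

    for r in range(rows - 1, -1, -1):
        bottom_row = charge[-1]                         # row currently at the bottom
        for c in range(cols):                           # read it out one pixel at a time
            voltage = float(bottom_row[c]) / full_well  # electrons -> analog "voltage"
            image[r, c] = int(round(voltage * (2 ** adc_bits - 1)))  # A-to-D step
        charge = np.roll(charge, 1, axis=0)             # every row passes its charge down
        charge[0] = 0                                   # the top row is now empty

    return image

# Example: a tiny 4x4 "exposure" with made-up electron counts per pixel
electrons = np.random.default_rng(0).integers(0, 1000, size=(4, 4))
print(ccd_readout(electrons))
```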
And then you have CMOS image sensors. CMOS stands for complementary
metal-oxide semiconductor, another type of technology. In this case, again, you
have a potential well, where you're collecting light,
for instance this one right here. But sitting right
next to it is also the circuit that converts
your electrons to a voltage. So it's an electron to voltage conversion circuit sitting at each pixel. Each pixel has its own
circuit, and it's not one circuit being shared
by the entire chip. So in this particular
technology, what you can do is simply address or
pull one particular pixel and be able to read
its voltage out. So you can go to this
pixel right here, or you can go to
another pixel and so on. And for that matter,
if you were interested not in the entire image, but
a small region in the image, you can read out those
pixels at a much faster rate, because there are fewer pixels and fewer values to read out. You can actually increase the frame rate of the camera substantially by reading out just that region of interest.
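As a sketch of that idea, here is a toy addressable pixel array. The array size, the region of interest, and the assumption that frame rate scales with the number of pixels read are all illustrative, not figures from the lecture:

```python
import numpy as np

# Toy CMOS-style readout: every pixel can be addressed directly, so we can
# read just a region of interest (ROI) instead of the whole array.
rng = np.random.default_rng(1)
full_frame = rng.integers(0, 256, size=(3000, 4000))    # pretend 12 MP sensor

def read_roi(sensor, row0, row1, col0, col1):
    """Address only the requested rows and columns and return their values."""
    return sensor[row0:row1, col0:col1]

roi = read_roi(full_frame, 1000, 1200, 2000, 2300)      # a 200 x 300 window

# If readout time is roughly proportional to the number of pixels read,
# the achievable frame rate scales up by about this factor:
speedup = full_frame.size / roi.size
print(f"ROI is {roi.shape[0]}x{roi.shape[1]} pixels, ~{speedup:.0f}x faster readout")
```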
So there's a lot of flexibility in the case of CMOS technology. But the price that you pay is that your light sensitive area, the area you're exposing to the world, is smaller, because sitting next to it you need the circuit that converts electrons to voltage. So CMOS and CCD are both
very popular technologies in consumer cameras today. I would say that CMOS
technology dominates because of its flexibility. And it's really come a long
way in terms of its quality. And there's more
to an image sensor. So here, you see the potential
wells corresponding to pixels. So this right here is one pixel. This is the next pixel. So that's the array of pixels. And we'll call them photodiodes. But then sitting on top of
each one is a color filter. You see, a pixel doesn't really
know which color of light is arriving there. It's just counting photons. And so in order
to measure color, you're going to use
color filters which sit above the pixel
itself, the potential well. And at any given location,
you can only measure one color because, again, the pixel can't
differentiate between colors. So you use one particular
filter right here, let's say a red one
here, a green one here, a blue one here. And after you've
captured your image, you can actually take these red,
green, and blue values, which are scattered around the
image and interpolate them to figure out what
red, green, and blue would be at each point. And we'll talk about this later. So those are your color filters. It's called the color mosaic.
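Just to make the interpolation idea concrete, here is a minimal sketch of that fill-in step for the green channel of a Bayer-style mosaic. The checkerboard green layout and the simple neighbor averaging are assumptions for illustration only; the real demosaicing methods come later in the course:

```python
import numpy as np

def interpolate_green(mosaic, green_mask):
    """Fill in missing green values by averaging the measured 4-neighbors.
    mosaic: raw sensor values; green_mask: True where a green filter sits."""
    h, w = mosaic.shape
    green = np.where(green_mask, mosaic.astype(float), 0.0)
    out = green.copy()
    for y in range(h):
        for x in range(w):
            if green_mask[y, x]:
                continue                          # green was measured here
            neighbors = [green[ny, nx]
                         for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                         if 0 <= ny < h and 0 <= nx < w and green_mask[ny, nx]]
            out[y, x] = sum(neighbors) / len(neighbors)
    return out

# Tiny example: green is measured on a checkerboard, as in an RGGB mosaic
raw = np.arange(16).reshape(4, 4)
gmask = (np.indices((4, 4)).sum(axis=0) % 2) == 1   # assumed green positions
print(interpolate_green(raw, gmask))
```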
And then you may be interested in knowing that each pixel actually has a lens sitting on top of it; this is called a microlens. This is not the lens
that's forming the image. What this lens does is
that it just takes light from the main lens, and
it focuses this light onto the active, light-sensitive area of the pixel, which
is shown down here. So this lens focuses light onto
this tiny little window here. And the reason that
this window is smaller than the size of the pixel is
because, often, like I said, there's circuitry, and there
are leads and so on that are sitting around the pixel. And you don't want to
waste the light that's falling on that region. So you take all the
light, and you channel it down to the active area.
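As a rough sketch of why that matters, suppose, purely as an illustration, that the light-sensitive window covers only part of each pixel; the fractions below are assumptions, not figures from the lecture. Without a microlens, light falling outside that window is lost; with one, most of it is funneled onto the active area:

```python
# Illustrative fill-factor arithmetic (all numbers are assumptions):
pixel_area = 1.0            # take the full pixel area as 1
active_fraction = 0.5       # fraction of the pixel that is actually light sensitive
microlens_efficiency = 0.9  # fraction of incoming light a microlens redirects onto it

light_without_microlens = active_fraction * pixel_area
light_with_microlens = microlens_efficiency * pixel_area

print(f"collected without microlens: {light_without_microlens:.0%} of the light")
print(f"collected with microlens:    {light_with_microlens:.0%} of the light")
```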
And here, you see a scanning electron microscope image, a beautiful cross
section of an image sensor. You see right here
the microlenses. This is one microlens for
this particular pixel. Here, you see the color filter. In this case, it happens
to be a blue filter. Sitting next to it is a red
filter for the next pixel. And underneath that is
your potential well, the pixel itself where
charges are being collected and circuitry to go with it. And note that the
distance between the top of this microlens, right
here, and all the way down here to the bottom of the
circuitry, is 9.6 micrometers. So there's a lot
of stuff happening. There's a lot of action going
on in this very thin layer of silicon. And that's why I
said to you earlier that you will see with
the passage of time that there's going to be
more and more circuitry built underneath these. So that in a single
wafer you can have perhaps image processing
and computer vision happening with the image sensor layer. And then your color layer
and then your microlens layer, all being grown on one
single wafer of silicon. And in fact, the main lens being
grown on top of that as well. This is optics on wafer. This is all coming in
the decades to come.