Below is a rough schematic of the workflow Android uses to synchronize the different hardware components. The most important part is to capture a frame while the pattern is being displayed via the MHL bridge. Tricky, but doable!
The image processing is shifted to OpenCV Android Native (C++), which gives faster and better prototyping. It was also possible to run parts of the pipeline – for example the FFT – on the phone's GPU, which increases speed by a factor of about 5-6. I can publish the code eventually if required!
One thing in advance! I’m a really bad programmer!
Light detectors are not yet able to record at the speed of light; they are always integrating over time. The frequency of an emitted light wave is in the terahertz range, corresponding to wavelengths of 400-750 nm in the visible region. A specimen can change the amplitude or the phase of a wave coming from a source. A phase is always a relative change of a wave compared to another part of the wave. There are two major techniques to sense this change in phase. One is interferometry, which compares two waves and can measure the phase directly; the other is a phase-recovery procedure which was – as far as I know – introduced with the Gerchberg-Saxton algorithm for reconstructing the phase of an X-ray image.
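The Gerchberg-Saxton iteration mentioned above fits in a few lines. This is a minimal NumPy sketch of the textbook algorithm, not the code used in this project; `source_amp` and `target_amp` stand for the measured amplitudes in the object plane and the Fourier plane:

```python
import numpy as np

def gerchberg_saxton(source_amp, target_amp, n_iter=200):
    """Classic Gerchberg-Saxton: find an object-plane phase such that the
    field has amplitude source_amp in the object plane and amplitude
    target_amp in the Fourier plane."""
    rng = np.random.default_rng(0)
    phase = rng.uniform(0.0, 2.0 * np.pi, source_amp.shape)  # random start phase
    for _ in range(n_iter):
        field = source_amp * np.exp(1j * phase)               # enforce object amplitude
        spectrum = np.fft.fft2(field)
        # keep the Fourier phase, replace the amplitude by the measured one
        spectrum = target_amp * np.exp(1j * np.angle(spectrum))
        phase = np.angle(np.fft.ifft2(spectrum))              # keep only the phase
    return phase
```

A nice property of this iteration is that the amplitude error in the Fourier plane never increases from one step to the next, so it is easy to check whether a run is behaving.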
So, what does this mean? Super-resolution? It simply means that it's possible to extend the support of available information coming from the sensor by slightly shifting the object's image. In the first section of the lensless project I mentioned that one captures an interference pattern of the object wave multiplied with the reference wave, plus the reference wave itself – a so-called hologram. Waves are continuous, the detectors used are not, so there is always a sampling step which depends – in this case – on the size of the pixels.
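To illustrate the idea (this is not this project's implementation), here is a minimal shift-and-add sketch in NumPy: each low-resolution frame, captured at a known sub-pixel shift, is slotted into its position on a finer grid. The function name and the shift convention are my own:

```python
import numpy as np

def shift_and_add(frames, shifts, factor):
    """Place each low-res frame on a 'factor'-times finer grid at its known
    sub-pixel shift (in low-res pixel units) and average overlapping samples."""
    h, w = frames[0].shape
    hi = np.zeros((h * factor, w * factor))
    weight = np.zeros_like(hi)
    for frame, (dy, dx) in zip(frames, shifts):
        # round the sub-pixel shift to the nearest fine-grid offset
        oy = int(round(dy * factor)) % factor
        ox = int(round(dx * factor)) % factor
        hi[oy::factor, ox::factor] += frame
        weight[oy::factor, ox::factor] += 1
    hi[weight > 0] /= weight[weight > 0]
    return hi
```

With shifts covering all sub-pixel offsets (e.g. 0 and 1/2 pixel in each direction for a factor of 2), every position of the fine grid gets at least one sample, which is exactly the "extended support" mentioned above.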
In the last post I was talking about the efficiency of the setup. When illuminating the sample with a butt-coupled LED, one loses about 95% of the LED's light power. With the DMD it's a bit less, and without the trouble of fitting the light guide into the LED's housing.
The spectral bandwidth of the in-line microscope should not exceed 20 nm, measured at full width half maximum (FWHM). One can therefore use a notch filter. To get the highest power out of the LED after filtering, the LED whose spectrum has the smallest bandwidth has to be determined first.
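To check a measured spectrum against the 20 nm limit, the FWHM can be estimated numerically. A small NumPy sketch (assuming a single-peaked spectrum; all names are illustrative):

```python
import numpy as np

def fwhm(wavelengths, intensity):
    """Full width at half maximum of a single-peaked spectrum, with linear
    interpolation of the wavelength at the two half-maximum crossings."""
    half = intensity.max() / 2.0
    above = np.where(intensity >= half)[0]
    i0, i1 = above[0], above[-1]
    # left edge: intensity rises through the half-maximum level
    if i0 > 0:
        x0 = np.interp(half, [intensity[i0 - 1], intensity[i0]],
                       [wavelengths[i0 - 1], wavelengths[i0]])
    else:
        x0 = wavelengths[i0]
    # right edge: intensity falls, so feed np.interp in increasing order
    if i1 < len(intensity) - 1:
        x1 = np.interp(half, [intensity[i1 + 1], intensity[i1]],
                       [wavelengths[i1 + 1], wavelengths[i1]])
    else:
        x1 = wavelengths[i1]
    return x1 - x0
```

For a Gaussian emission line, FWHM = 2·sqrt(2·ln 2)·σ ≈ 2.355·σ, which makes a handy sanity check for the function.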
The graph below shows the normalized emission spectrum of the projector (with RGB LEDs). The peak intensity of the blue line is at 445 nm. A smaller wavelength also gives the opportunity to image smaller details, as will be seen later.
From the figures above it can be seen directly that either the distance from the object to the sensor has to be small or the interference partners have to be close together to get the required fringes.
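For reference, a quick sanity check on the sampling side, assuming two plane waves meeting at an angle θ on a sensor with pixel pitch p (the 1.4 µm value below is an assumed typical phone-sensor pitch, not a measurement from this setup): the fringe spacing Λ must stay above the Nyquist limit of two pixels,

```latex
\Lambda = \frac{\lambda}{2\sin(\theta/2)} \geq 2p
\quad\Rightarrow\quad
\sin\!\left(\frac{\theta}{2}\right) \leq \frac{\lambda}{4p}.
```

With λ = 445 nm and p = 1.4 µm this limits θ to roughly 9°, i.e. the interfering waves have to stay nearly parallel, which is what the geometric constraints above enforce.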
Lensless Microscope – New Approach (Version 2)
When building the first version of the Holoscope using butt-coupled LEDs, it turned out to be really hard to put the fibres into the right position on the LEDs. The fibres are really thin, and the coupling efficiency degrades dramatically if the fibre is not positioned directly on the LED's die.