Realizing the microscope with a smartphone's camera requires removing all of its lenses, which means destroying a perfectly working phone.
Even though it's open-source, it's really expensive. So why not find a solution that still uses a smartphone, but only for processing the raw data coming from a third-party device? Looking for a good solution, keeping in mind that the pixel size should be as small as possible and accessible from a smartphone, I found the Sony QX10 digital still camera. „Holoscop V3 – Sony Wifi-Cam as lensless Microscope (R.I.P)“ – Read more
So the next-generation Holoscope is coming. And it looks promising! Everything is open-source. The concept works as follows:
- The Raspberry Pi acts as a server; it handles the communication between the smartphone and the hardware, namely: synchronization of the camera/illumination pattern, driving the motor, and sending the images to the phone.
- The smartphone shows a live view of the stream coming from the Picamera and lets you set up things like ISO, exposure time, etc. Triggering an acquisition set is also possible. It will also do the reconstruction processing, as it has – at least in my case – a better CPU/GPU than the Pi.
- So far a router is needed, as peer-to-peer communication is not implemented yet.
- The camera sensor is the one from the Pi camera module.
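The server/client split above can be sketched as a single HTTP endpoint on the Pi. This is a minimal sketch, not the project's actual code: the `grab_frame` mock stands in for the real Picamera capture, and the `/frame` route name is my own assumption.

```python
# Hedged sketch: Pi serves camera frames over HTTP, phone fetches them.
# grab_frame() is a stand-in for the real Picamera capture call.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def grab_frame():
    # On the real Pi this would return JPEG bytes from the camera;
    # here we return fake JPEG bytes so the sketch runs anywhere.
    return b"\xff\xd8fake-jpeg-payload\xff\xd9"

class HoloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/frame":
            frame = grab_frame()
            self.send_response(200)
            self.send_header("Content-Type", "image/jpeg")
            self.send_header("Content-Length", str(len(frame)))
            self.end_headers()
            self.wfile.write(frame)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the console quiet
        pass

# Port 0 lets the OS pick a free port; the phone would use the Pi's Wi-Fi address.
server = HTTPServer(("127.0.0.1", 0), HoloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/frame" % server.server_address[1]
data = urllib.request.urlopen(url).read()
server.shutdown()
```

The phone side then only needs plain HTTP GETs, which is exactly why a router (or later a peer-to-peer link) is the only network requirement.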
„Raspberry Pi + Picamera + LCD + Zoomlens .. Holoskop V4“ – Read more
I stopped counting. Each iteration or revision of the Holoscope gave me trouble. This release looks promising: a Raspberry Pi creates a server and streams the camera to an Android device.
The mobile phone sends and receives HTTP requests and downloads the images from the Raspi. The first stream with a source shift looks like this:
I’ll go into the details soon!
Below is a rough schematic of the workflow Android uses to synchronize the different hardware components. The most important thing is to capture a frame while displaying the pattern with the MHL bridge. Tricky, but doable!
The image processing has been moved to OpenCV Android Native (C++) for faster and better prototyping. It was also possible to run – for example – the FFT on the phone's GPU, which increases speed by a factor of about 5–6. I can publish the code eventually if required!
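The synchronization step above boils down to: show a pattern, wait for the display to settle, then grab a frame while the pattern is still up. A rough sketch of that loop – `display`, `camera`, and the pattern names are placeholders for the real MHL/Picamera interfaces, not the project's API:

```python
# Hedged sketch of the pattern/capture synchronization loop.
# display.show() and camera.capture() are assumed placeholder interfaces.
import time

PATTERNS = ["led_0", "led_1", "led_2"]  # illustrative illumination states

def acquire_set(display, camera, settle_s=0.05):
    """Show each pattern, let the display settle, then capture a frame
    while that pattern is still being displayed."""
    frames = []
    for p in PATTERNS:
        display.show(p)
        time.sleep(settle_s)  # wait for the LCD / MHL bridge to update
        frames.append((p, camera.capture()))
    return frames
```

The `settle_s` delay is the tricky part in practice: it has to cover the display latency of the MHL bridge, otherwise the captured frame belongs to the previous pattern.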
One thing in advance! I’m a really bad programmer!
Function – A New Approach
Light detectors are not yet able to record at the speed of light; they always integrate over time. The frequency of visible light is in the range of hundreds of terahertz, corresponding to wavelengths of 400–750 nm. A specimen can change a wave coming from a source in its amplitude or phase. A phase is always a relative change of a wave compared to another part of the wave. There are two major techniques to sense this change in phase. One is interferometry, which compares two waves and can measure the phase directly; the other is a phase-recovery procedure, which was – as far as I know – introduced with the Gerchberg-Saxton algorithm for reconstructing the phase of an X-ray image. „Phaserecovery – A Finite Support Approach“ – Read more
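A minimal sketch of that Gerchberg-Saxton iteration, assuming amplitude measurements in both the object plane and the Fourier plane (the finite-support variant additionally zeroes the field outside a known support mask, which I leave out here):

```python
# Hedged sketch of the classic Gerchberg-Saxton phase-recovery loop.
import numpy as np

def gerchberg_saxton(source_amp, target_amp, iterations=50):
    """Recover a phase consistent with two measured amplitudes:
    source_amp in the object plane, target_amp in the Fourier plane.
    Each iteration keeps the computed phase but re-imposes the
    measured amplitude in whichever plane we are in."""
    rng = np.random.default_rng(0)
    phase = rng.uniform(0, 2 * np.pi, source_amp.shape)  # random start
    field = source_amp * np.exp(1j * phase)
    for _ in range(iterations):
        far = np.fft.fft2(field)
        far = target_amp * np.exp(1j * np.angle(far))      # Fourier constraint
        field = np.fft.ifft2(far)
        field = source_amp * np.exp(1j * np.angle(field))  # object constraint
    return np.angle(field)
```

Because each step is a projection onto one of the two amplitude constraints, the Fourier-domain amplitude error is non-increasing from iteration to iteration, which is what makes this error-reduction scheme usable despite the random starting phase.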
Flower at a wavelength of 650 nm, sensor–object distance ca. 9 mm (Logitech)
„Testing the best – well.. at least testing“ – Read more
Example of the subpixel super-resolution
So, what does this mean, super-resolution? It simply means that it is possible to extend the support of the information available from the sensor by simply shifting the object's image. In the first section of the lens-less project I mentioned that one captures an interference pattern of the object wave multiplied with the reference wave, plus the reference wave itself: a so-called hologram. Waves are continuous; the detectors used are not, so there is always sampling, which depends – in this case – on the size of the pixels.
„Super-Resolution is a super solution!“ – Read more