
Viewing google photosphere
For my research project I'm trying to distinguish between hydra plant (the larger, amoeba-looking orange things) and their brine shrimp feed (the smaller orange specks) so that we can automate the cleaning of petri dishes using a pipetting machine. An example of a snap image of the petri dish taken by the machine looks like so:

I have so far applied a circle mask and an orange color-space mask to create a cleaned-up image, so that it's mostly just the shrimp and hydra. There are some residual light artifacts left in the filtered image, but I have to bite that cost or else I lose the resolution of the very thin hydra, such as in the top left of the original image.
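One way such an orange color-space mask could be implemented is with Core Image's CIColorCube filter. The following is only a sketch (assuming iOS/UIKit); the hue and saturation cutoffs are placeholder values, not numbers from the original post:

```swift
import CoreImage
import UIKit

// Hypothetical sketch of an "orange color space mask": a CIColorCube lookup
// table that keeps orange-ish pixels and turns everything else transparent.
// The hue/saturation thresholds below are assumptions, not values from the post.
func makeOrangeMaskFilter() -> CIFilter? {
    let size = 32
    var cube = [Float](repeating: 0, count: size * size * size * 4)
    var offset = 0
    for b in 0..<size {
        for g in 0..<size {
            for r in 0..<size {
                let red   = CGFloat(r) / CGFloat(size - 1)
                let green = CGFloat(g) / CGFloat(size - 1)
                let blue  = CGFloat(b) / CGFloat(size - 1)
                var hue: CGFloat = 0, sat: CGFloat = 0, bri: CGFloat = 0, alp: CGFloat = 0
                _ = UIColor(red: red, green: green, blue: blue, alpha: 1)
                    .getHue(&hue, saturation: &sat, brightness: &bri, alpha: &alp)
                // Keep pixels whose hue falls roughly in the orange band.
                let keep: Float = (hue >= 0.02 && hue <= 0.13 && sat >= 0.3) ? 1 : 0
                cube[offset]     = Float(red)   * keep   // premultiplied RGBA
                cube[offset + 1] = Float(green) * keep
                cube[offset + 2] = Float(blue)  * keep
                cube[offset + 3] = keep
                offset += 4
            }
        }
    }
    let data = cube.withUnsafeBufferPointer { Data(buffer: $0) }
    let filter = CIFilter(name: "CIColorCube")
    filter?.setValue(size, forKey: "inputCubeDimension")
    filter?.setValue(data, forKey: "inputCubeData")
    return filter
}
```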

If you could make all pixels outside of the contour transparent, then you could use the CIKMeans filter with inputCount equal to 1 and inputExtent set to the extent of the frame to get the average color of the area inside the contour (the output of the filter will be a 1-pixel image, and the color of that pixel is what you are looking for).
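As a rough sketch of that step, assuming the image has already been masked so that everything outside the contour is transparent (the helper name and the pixel read-back are illustrative):

```swift
import CoreImage
import CoreGraphics

// Hedged sketch: average color of a CIImage whose pixels outside the contour
// are already transparent, using CIKMeans with a single cluster.
// `maskedImage` and `context` are placeholders supplied by the caller.
func averageColor(of maskedImage: CIImage, using context: CIContext) -> [Float]? {
    guard let kMeans = CIFilter(name: "CIKMeans") else { return nil }
    kMeans.setValue(maskedImage, forKey: kCIInputImageKey)
    kMeans.setValue(CIVector(cgRect: maskedImage.extent), forKey: "inputExtent")
    kMeans.setValue(1, forKey: "inputCount")    // one cluster == the average color
    kMeans.setValue(5, forKey: "inputPasses")   // a few refinement passes

    // The output is a 1-pixel image; render it into a 4-byte RGBA buffer to read the color.
    guard let output = kMeans.outputImage?
        .settingAlphaOne(in: CGRect(x: 0, y: 0, width: 1, height: 1)) else { return nil }
    var pixel = [UInt8](repeating: 0, count: 4)
    context.render(output,
                   toBitmap: &pixel,
                   rowBytes: 4,
                   bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                   format: .RGBA8,
                   colorSpace: CGColorSpaceCreateDeviceRGB())
    return pixel.map { Float($0) / 255 }
}
```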

Now, to make all pixels transparent outside of the contour, you could do something like this:

- Create a mask image by setting all pixels inside the contour white and all pixels outside black (set the background to black and fill the path with white).
- Feed it into a blend filter such as CIBlendWithMask, where:
  - inputBackgroundImage is a fully transparent (clear) image.
  - inputMaskImage is the mask you created above.

The output of that filter will give you the image with all pixels outside the contour fully transparent, and now you can use the CIKMeans filter with it as described at the beginning.

CIFilters can only work with CIImages, so the mask image has to be a CIImage as well. One way to do that is to create a CGImage from a CAShapeLayer containing the mask and then create a CIImage out of it.

BTW, if you want to play with every single one of the 230 filters out there, check this app out:

Here is how the code could look:
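This is only a sketch, assuming the blend is done with CIBlendWithMask (the filter whose inputBackgroundImage and inputMaskImage parameters are described above) and UIKit/CAShapeLayer for rendering the mask; `contourPath` is a hypothetical UIBezierPath, and `averageColor(of:using:)` is the helper sketched earlier:

```swift
import CoreImage
import UIKit

// Sketch only: make everything outside `contourPath` transparent, then average
// the remaining pixels via the averageColor(of:using:) helper above.
func maskedAverageColor(frame: CIImage,
                        contourPath: UIBezierPath,
                        context: CIContext) -> [Float]? {
    let extent = frame.extent

    // 1. Render the contour into a CAShapeLayer: black background, white fill.
    let shapeLayer = CAShapeLayer()
    shapeLayer.frame = CGRect(origin: .zero, size: extent.size)
    shapeLayer.backgroundColor = UIColor.black.cgColor
    shapeLayer.path = contourPath.cgPath
    shapeLayer.fillColor = UIColor.white.cgColor

    // 2. Turn the layer into a CGImage, then a CIImage (CIFilters only accept CIImages).
    let renderer = UIGraphicsImageRenderer(size: extent.size)
    let maskUIImage = renderer.image { ctx in
        shapeLayer.render(in: ctx.cgContext)
    }
    guard let maskCGImage = maskUIImage.cgImage else { return nil }
    let maskImage = CIImage(cgImage: maskCGImage)

    // 3. Blend: white mask pixels keep the frame, black mask pixels show the fully
    //    transparent background, so everything outside the contour becomes clear.
    let clearBackground = CIImage(color: .clear).cropped(to: extent)
    guard let blend = CIFilter(name: "CIBlendWithMask") else { return nil }
    blend.setValue(frame, forKey: kCIInputImageKey)
    blend.setValue(clearBackground, forKey: kCIInputBackgroundImageKey)
    blend.setValue(maskImage, forKey: kCIInputMaskImageKey)
    guard let maskedFrame = blend.outputImage else { return nil }

    // 4. Average color of the area inside the contour, as described at the beginning.
    return averageColor(of: maskedFrame, using: context)
}
```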














