1 Short Answers
1.A
Padding an $x_n \times y_n$ image with $p$ zeros on each side grows it to $(x_n + 2p) \times (y_n + 2p)$, so the number of added zeros is $(x_n + 2p)(y_n + 2p) - x_n y_n = 2p\,x_n + 2p\,y_n + 4p^2$; with $p = 2$ this gives the number of zeros as: \begin{equation} nz = 4x_n + 4y_n + 4p^2 \end{equation}
1.B Separable Kernels
The outer product of a column vector and a row vector always has rank 1, so the only requirement for a kernel to be separable is that the kernel itself must be rank 1. Here is an example: \begin{equation} x = \begin{bmatrix} 1 \\ 2 \\ 1 \end{bmatrix} \begin{bmatrix} 1 & 2 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 2 & 1 \\ 2 & 4 & 2 \\ 1 & 2 & 1 \end{bmatrix} \end{equation}
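This rank condition is easy to check numerically. A minimal sketch with NumPy (the helper name `is_separable` is mine, not part of the assignment):

```python
import numpy as np

def is_separable(kernel):
    """A kernel is separable iff it is (numerically) rank 1."""
    return np.linalg.matrix_rank(kernel) == 1

# Rank-1 example: outer product of a column and a row vector.
col = np.array([[1], [2], [1]])
row = np.array([[1, 2, 1]])
k = col @ row              # 3x3 rank-1 kernel
print(is_separable(k))     # True

# Counter-example: the identity matrix has full rank.
print(is_separable(np.eye(3)))  # False
```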
1.C Speed-up of Separable Kernels
We follow the formula from the reference book: convolving an $M \times N$ image with a full $m \times n$ kernel costs $MNmn$ multiplications, while two 1-D passes cost $MN(m + n)$, so: \begin{equation} speed = \frac{MNmn}{MNm+MNn} = \frac{mn}{m+n} \end{equation}
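Plugging a few square kernel sizes into this formula shows how quickly the saving grows (a small illustrative script, not part of the assignment):

```python
# Speed-up of separable vs. full convolution for an m x n kernel,
# per the formula above: mn / (m + n); the image size M x N cancels out.
def speedup(m, n):
    return (m * n) / (m + n)

for size in (3, 7, 11, 31):
    print(f"{size}x{size} kernel: {speedup(size, size):.2f}x faster")
```

For an 11x11 kernel the separable form is already 5.5x cheaper, and the gain grows linearly with kernel size.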
2 Functionality of the Separable Components of the Sobel Kernel
\begin{equation} \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \end{equation}
2.A Getting Separable Kernels
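One way to obtain the separable factors is via the SVD of the kernel: for a rank-1 matrix, the first singular triplet reconstructs it exactly. A sketch, assuming the horizontal Sobel kernel shown above:

```python
import numpy as np

# Horizontal Sobel kernel from above.
sobel = np.array([[-1, 0, 1],
                  [-2, 0, 2],
                  [-1, 0, 1]])

# Any rank-1 matrix factors as an outer product; the SVD recovers the factors.
U, S, Vt = np.linalg.svd(sobel)
col = U[:, 0] * S[0]   # column (smoothing) component, up to sign/scale
row = Vt[0, :]         # row (differencing) component, up to sign/scale

# The outer product of the two factors reproduces the kernel.
print(np.allclose(np.outer(col, row), sobel))  # True
```

Up to scaling and sign, these factors match the classic decomposition $[1, 2, 1]^T$ and $[-1, 0, 1]$.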
2.B Functionalities
2.B.a
\begin{equation} \begin{bmatrix} 1 \\ 2 \\ 1 \end{bmatrix} \end{equation}
Smooths the image (decreases the frequency), for example reducing a difference of 5x between neighbouring pixels to about 2x.
2.B.b
\begin{equation} \begin{bmatrix} -1 & 0 & 1 \end{bmatrix} \end{equation}
Intensifies the difference between low- and high-valued pixels.
3 Calculate on Paper
3.1 Norm
3.2 Inner Product
They are orthogonal, not orthonormal: their inner product is zero, but their norms are not equal to 1.
3.3 Find Degree
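The quantities in sections 3.1 to 3.3 are easy to verify numerically. The vectors below are hypothetical stand-ins (the exercise's actual vectors are not reproduced in this post):

```python
import numpy as np

# Hypothetical orthogonal-but-not-orthonormal pair.
u = np.array([1.0, 2.0, 2.0])
v = np.array([-2.0, -1.0, 2.0])

norm_u = np.linalg.norm(u)   # 3.0
norm_v = np.linalg.norm(v)   # 3.0
dot = np.dot(u, v)           # 0.0 -> orthogonal

# Orthogonal means dot == 0; orthonormal additionally requires unit norms.
print(dot == 0, norm_u == 1 and norm_v == 1)  # True False

# Angle between the vectors, in degrees.
cos_theta = dot / (norm_u * norm_v)
angle = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
print(angle)  # 90.0
```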
4 Implementations
4.1 Implement make_gaussian(std, size)
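A minimal sketch of how `make_gaussian(std, size)` could be implemented (the exact conventions the assignment expects, e.g. requiring an odd `size`, may differ):

```python
import numpy as np

def make_gaussian(std, size):
    """Return a (size, size) 2-D Gaussian kernel normalised to sum to 1."""
    ax = np.arange(size) - (size - 1) / 2.0           # coordinates centred on 0
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2.0 * std**2))
    return kernel / kernel.sum()                      # preserve overall brightness

k = make_gaussian(std=1.0, size=5)
print(k.shape)  # (5, 5)
```

Normalising by the sum ensures a constant image passes through unchanged; the kernel is also symmetric, which is what makes it separable.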
4.2 Implement convolve2d(image, kernel)
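A straightforward (and deliberately naive) sketch of `convolve2d(image, kernel)` with zero padding; the two nested Python loops are exactly what makes this slow compared to OpenCV:

```python
import numpy as np

def convolve2d(image, kernel):
    """Naive 2-D convolution with zero padding; output has the input's shape."""
    m, n = kernel.shape
    pad_y, pad_x = m // 2, n // 2
    padded = np.pad(image, ((pad_y, pad_y), (pad_x, pad_x)), mode="constant")
    flipped = kernel[::-1, ::-1]          # true convolution flips the kernel
    out = np.zeros(image.shape, dtype=float)
    for i in range(image.shape[0]):       # O(M*N*m*n) multiplications
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + m, j:j + n] * flipped)
    return out

img = np.arange(16.0).reshape(4, 4)
identity = np.zeros((3, 3))
identity[1, 1] = 1.0
print(np.allclose(convolve2d(img, identity), img))  # True
```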
4.3 Implement opencv_median_blur(image, size) from cv2
In OpenCV, the `size` argument (the kernel aperture) also exists and is mandatory, which you had omitted.
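A sketch of the wrapper with `size` made mandatory; the pure-NumPy fallback is only meant to show what `cv2.medianBlur` computes (the median of each size x size neighbourhood, with replicated edges), not to match it bit-for-bit:

```python
import numpy as np

def opencv_median_blur(image, size):
    """Median blur; `size` must be an odd aperture and is mandatory in OpenCV."""
    try:
        import cv2
        return cv2.medianBlur(image, size)
    except ImportError:
        pad = size // 2
        padded = np.pad(image, pad, mode="edge")   # replicate the borders
        out = np.empty_like(image)
        for i in range(image.shape[0]):
            for j in range(image.shape[1]):
                out[i, j] = np.median(padded[i:i + size, j:j + size])
        return out

img = np.array([[10, 10, 10],
                [10, 255, 10],   # single "salt" pixel
                [10, 10, 10]], dtype=np.uint8)
print(opencv_median_blur(img, 3))  # the outlier is removed
```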
4.4 Implement opencv_gaussian_blur(image, std, size) from cv2
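A sketch of `opencv_gaussian_blur(image, std, size)`; the fallback branch applies the Gaussian as two 1-D passes, i.e. the separable trick from section 1.C:

```python
import numpy as np

def opencv_gaussian_blur(image, std, size):
    """Gaussian blur; in OpenCV the kernel size is a (width, height) tuple."""
    try:
        import cv2
        return cv2.GaussianBlur(image, (size, size), std)
    except ImportError:
        # Separable fallback: one 1-D Gaussian pass per axis.
        ax = np.arange(size) - (size - 1) / 2.0
        g = np.exp(-ax**2 / (2.0 * std**2))
        g /= g.sum()                                  # 1-D kernel sums to 1
        out = image.astype(float)
        out = np.apply_along_axis(lambda r: np.convolve(r, g, mode="same"), 1, out)
        out = np.apply_along_axis(lambda c: np.convolve(c, g, mode="same"), 0, out)
        return out

blurred = opencv_gaussian_blur(np.full((8, 8), 100.0), std=1.0, size=5)
```

Interior pixels of a constant image are unchanged because the kernel is normalised; the two 1-D passes cost $MN(m + n)$ instead of $MNmn$.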
4.5 Analyze Time and Results
In OpenCV, the `size` argument also exists for `median` and is mandatory, which you had omitted; so I changed the `for` loop, even though you said not to, in order to pass `size=11` to `median`. The reason I chose
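The timings in this section could be collected with a small best-of-n timer along these lines (the two filters here are hypothetical stand-ins, not the assignment's functions):

```python
import time
import numpy as np

def time_it(func, *args, repeats=3):
    """Best-of-n wall-clock time of func(*args), in seconds."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        func(*args)
        best = min(best, time.perf_counter() - start)
    return best

img = np.random.rand(128, 128)

def naive_mean3(a):
    # Python-loop 3x3 mean filter: slow, like our own convolve2d.
    return np.array([[a[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2].mean()
                      for j in range(a.shape[1])]
                     for i in range(a.shape[0])])

def vectorised_mean3(a):
    # Vectorised 3-point vertical mean: the kind of code fast libraries
    # run, orders of magnitude quicker than the Python loops above.
    return (a + np.roll(a, 1, axis=0) + np.roll(a, -1, axis=0)) / 3.0

print(f"naive:      {time_it(naive_mean3, img):.4f} s")
print(f"vectorised: {time_it(vectorised_mean3, img):.4f} s")
```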
Comparing Time Consumption
First of all, our implementation's time is much larger than OpenCV's, which is entirely expected: our `convolve2d` has not been implemented efficiently, the main reason being the 2 nested loops. If we instead compare Gaussian and median times, the median operation is order-based and needs a sort in every window; furthermore, it cannot be expressed with kernel (dot-product) logic, which means most of the optimizations based on parallel computing or efficient vectorization are lost. That is why we see roughly a 10x increase in time. Note: as you can see, one of the times
Comparing Quality
The first two rows of images correspond to images with normal (Gaussian) noise. A Gaussian blur handles this kind of noise easily by reducing the frequency differences between neighbouring pixels without making the image too blurry. The median filter, however, cannot find a good pixel to retain information without making the image completely blurry: because the noise is distributed across almost all values, the median cannot do much, and as you can see in the last column of the first two rows, the images are merely blurred without the noise being removed.

The last two rows show salt-and-pepper noise, and this time the Gaussian cannot do much: when it tries to reduce the intensity differences, the convolution input contains 0 and 255 values that simply dominate the other values, so the output is a blurry image in which the salt and pepper are still completely visible. But when we have this kind of noise, we know the corrupted pixels are a minority, so taking the median focuses on the values that are most common in each window; the 0 and 255 values barely affect the median because they are rare within each window, so the output is a slightly blurred image without the salt and pepper.
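The salt-and-pepper argument can be demonstrated on a synthetic image. This sketch uses a 3x3 box blur as a stand-in for the Gaussian and a 3x3 median built from shifted views (the image and noise level are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Flat grey image with salt-and-pepper noise on ~5% of the pixels.
img = np.full((64, 64), 128.0)
noisy = img.copy()
mask = rng.random(img.shape) < 0.05
noisy[mask] = rng.choice([0.0, 255.0], size=mask.sum())

# Stack the nine shifted views of each pixel's 3x3 neighbourhood.
pad = np.pad(noisy, 1, mode="edge")
windows = np.stack([pad[i:i + 64, j:j + 64]
                    for i in range(3) for j in range(3)])

median_out = np.median(windows, axis=0)   # 3x3 median filter
mean_out = windows.mean(axis=0)           # 3x3 box blur (Gaussian stand-in)

# The median restores the flat image almost exactly; the blur only
# smears the 0/255 outliers into their neighbourhoods.
print("median error:", np.abs(median_out - img).mean())
print("blur error:  ", np.abs(mean_out - img).mean())
```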