Talk:Kernel (image processing)
Add these Concepts
Concepts that should be added to the article.
Details
Separability of the kernel, which can significantly increase algorithmic efficiency (though memory requirements also increase); see the sketch after the link below.
http://www.songho.ca/dsp/convolution/convolution2d_separable.html
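For illustration, a minimal sketch of the separability trick, assuming NumPy and SciPy are available; the array and variable names (img, v, h) are mine, not from the article or the linked page:

```python
import numpy as np
from scipy.signal import convolve2d

img = np.random.default_rng(0).random((256, 256))

# A 2D kernel is separable when it is the outer product of a column
# vector and a row vector, e.g. the 3x3 box blur below.
v = np.array([[1.0], [1.0], [1.0]])   # 3x1 column vector
h = np.array([[1.0, 1.0, 1.0]])       # 1x3 row vector
kernel_2d = (v @ h) / 9.0             # full 3x3 kernel

# One 2D pass: m*n multiplications per pixel for an m x n kernel.
full = convolve2d(img, kernel_2d, mode='same')

# Two 1D passes: only m + n multiplications per pixel, but the
# intermediate image must be stored -- the memory cost noted above.
tmp = convolve2d(img, v / 3.0, mode='same')
sep = convolve2d(tmp, h / 3.0, mode='same')

assert np.allclose(full, sep)
```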
Convolution
Flipping of the kernel, which preserves commutativity and associativity (evidently...); see the sketch after the link below.
http://s000.tinyupload.com/index.php?file_id=00035872171331523574
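A minimal sketch of that claim, assuming SciPy's full-mode convolution; all array names here are illustrative:

```python
import numpy as np
from scipy.signal import convolve2d, correlate2d

rng = np.random.default_rng(0)
f = rng.random((8, 8))
g = rng.random((3, 3))
h = rng.random((3, 3))

# With the flip built in, convolution commutes and associates ...
assert np.allclose(convolve2d(f, g), convolve2d(g, f))
assert np.allclose(convolve2d(convolve2d(f, g), h),
                   convolve2d(f, convolve2d(g, h)))

# ... while correlation, which skips the flip, does not commute:
assert not np.allclose(correlate2d(f, g), correlate2d(g, f))
```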
Terminology/relationship
Why is this termed a "kernel"? Is it simply an example (applied to image processing) of precisely something that was already termed a kernel in other pre-existing fields of mathematics? Is there an agreed definition? For example, does it cease to be a kernel if the input image is as small as the matrix being convolved with it? Cesiumfrog (talk) 02:09, 11 October 2015 (UTC)
And "convolution" seems to have a generic meaning, and a specific meaning of "flipping" (as described above in this talk article). A "convolution matrix" is not flipped? I am guessing here but note there is no point "flipping" the symetric martricies used in this article. Can someone who knows about image processing check that.
- I teach image processing and my students dug up this source. I have never heard of "flipping" being necessary and neither have five colleagues of mine. But apparently, I learn something new every day. It is important to know that "flipping" means "mirror both ways" – not "transpose the matrix". Some sources like to hide this by relocating the origin and reversing the axis directions. Others show the already flipped kernel and do not mention the change at all.
- It becomes obvious only if you compare the mathematical definitions of Cross-correlation and Convolution: cross-correlation reads $(f \star \omega)(x, y) = \sum_i \sum_j \omega(i, j)\, f(x + i, y + j)$, whereas convolution reads $(f * \omega)(x, y) = \sum_i \sum_j \omega(i, j)\, f(x - i, y - j)$.
- In convolution, the kernel's elements are read in reverse direction. This affects commutativity and separability, which in turn is important for fast implementations. I am still trying to figure out what this means for the practical aspects of image processing and how to teach this in the future. --Hoehermann (talk) 13:36, 13 June 2018 (UTC)
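A minimal sketch of the relationship, under SciPy's conventions; note that np.flip mirrors both axes ("flipping"), which is not a transpose. Names here are illustrative:

```python
import numpy as np
from scipy.signal import convolve2d, correlate2d

f = np.random.default_rng(1).random((8, 8))
k = np.arange(9.0).reshape(3, 3)   # deliberately asymmetric kernel

# Convolution equals correlation with the kernel mirrored both ways:
assert np.allclose(convolve2d(f, k), correlate2d(f, np.flip(k)))

# Mirroring both ways is not the same as transposing:
assert not np.allclose(np.flip(k), k.T)
```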
If someone reading this feels ownership over this article, I urge you to define $a$ and $b$ better in the "Details" section – specifically the set of values they can take. If you already understand convolutions for image processing, it will probably be obvious, but the target audience should be those who do not. No reputable text or article omits defining any term that appears in a mathematical expression. Chafe66 (talk) 21:10, 25 October 2019 (UTC)
Adding on to Chafe66's and Hoehermann's points: if one actually attempts to substitute the values of the summation range into the definition, one starts off immediately with $\omega(-a, -b)$, where $-a$ and $-b$ are nowhere defined. Does this imply negative indices in the kernel matrix? How are negative indices interpreted, e.g., using a Python-esque convention where x[-1] refers to the last element in an array? That would be consistent with rotation of the kernel in 2D convolution, but if that's the case, it should be explained.
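To make the indexing question concrete, here is a sketch assuming the definition's sums run over $i \in [-a, a]$ and $j \in [-b, b]$; the shift-by-(a, b) convention shown is an assumption about what the article intends, not Python's wrap-around indexing:

```python
import numpy as np

kernel = np.arange(9.0).reshape(3, 3)   # stored with array indices 0..2
a = kernel.shape[0] // 2                # 1, so i runs over -1, 0, 1
b = kernel.shape[1] // 2

def omega(i, j):
    # Assumed convention: the mathematical index (i, j) is shifted by
    # (a, b) to get the array index, so omega(-a, -b) is the stored
    # element kernel[0, 0]. This is not wrap-around negative indexing.
    return kernel[i + a, j + b]

assert omega(-a, -b) == kernel[0, 0]
assert omega(a, b) == kernel[-1, -1]    # Python's x[-1] happens to agree here
```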
Unsharp masking kernel might not be correct
I believe the Unsharp Masking kernel should have a central element of 220/256, not 476/256, so that the sum of the elements is 1 (not 2), just like the Sharpen kernel. The fix, if I am correct, is to replace "-476" with "-220". The answer depends on how the author used the Unsharp Masking operator. Engineer editor (talk) 19:24, 5 February 2018 (UTC)
The unsharp operation is: identity + amount * (identity - gaussian blur). Engineer editor (talk) 19:28, 5 February 2018 (UTC)
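A quick arithmetic check of that operation, as a sketch assuming amount = 1 and the usual 5x5 binomial approximation of the Gaussian; the variable names are mine:

```python
import numpy as np

# 5x5 binomial approximation of a Gaussian blur kernel; sums to 1.
w = np.array([1, 4, 6, 4, 1], dtype=float)
gauss = np.outer(w, w) / 256.0

identity = np.zeros((5, 5))
identity[2, 2] = 1.0

amount = 1.0
unsharp = identity + amount * (identity - gauss)

# identity sums to 1 and (identity - gauss) sums to 0, so the result
# sums to 1 for any 'amount'. With amount = 1 the central element is
# 2 - 36/256 = 476/256, i.e. the -476 entry once the -1/256 factor
# mentioned above is pulled out.
print(unsharp.sum())        # ~1.0
print(unsharp[2, 2] * 256)  # 476.0
```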