Graph kernel
In structure mining, a graph kernel is a kernel function that computes an inner product on graphs.[1] Graph kernels can be intuitively understood as functions measuring the similarity of pairs of graphs. They allow kernelized learning algorithms such as support vector machines to work directly on graphs, without having to do feature extraction to transform them to fixed-length, real-valued feature vectors. They find applications in bioinformatics, in chemoinformatics (as a type of molecule kernels[2]), and in social network analysis.[1]
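To make the "work directly on graphs" idea concrete, here is a minimal sketch (an illustration, not drawn from the cited sources): a toy vertex-label histogram kernel is evaluated on every pair of graphs to build a Gram matrix, which a support vector machine consumes through scikit-learn's precomputed-kernel option. The names `graphs`, `y` and `label_histogram_kernel` are placeholders for this example only.

```python
import numpy as np
import networkx as nx
from collections import Counter
from sklearn.svm import SVC

def label_histogram_kernel(g1, g2):
    """Toy graph kernel: inner product of vertex-label count histograms."""
    h1 = Counter(nx.get_node_attributes(g1, "label").values())
    h2 = Counter(nx.get_node_attributes(g2, "label").values())
    return sum(h1[c] * h2[c] for c in h1.keys() & h2.keys())

def gram_matrix(graphs, kernel):
    """Pairwise kernel evaluations arranged as a symmetric Gram matrix."""
    n = len(graphs)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            K[i, j] = K[j, i] = kernel(graphs[i], graphs[j])
    return K

# Hypothetical usage: `graphs` is a list of node-labelled nx.Graph objects,
# `y` the class labels; the SVM consumes the Gram matrix directly.
# K = gram_matrix(graphs, label_histogram_kernel)
# clf = SVC(kernel="precomputed").fit(K, y)
```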
Concepts of graph kernels have been around since 1999, when D. Haussler[3] introduced convolution kernels on discrete structures. The term graph kernel was coined in 2002 by R. I. Kondor and J. Lafferty[4] for kernels on graphs, i.e. similarity functions between the nodes of a single graph, with the World Wide Web hyperlink graph as a suggested application. In 2003, Gärtner et al.[5] and Kashima et al.[6] defined kernels between graphs. In 2010, Vishwanathan et al. gave their unified framework.[1] In 2018, Ghosh et al.[7] described the history of graph kernels and their evolution over two decades.
Applications
The marginalized graph kernel has been shown to allow accurate predictions of the atomization energy of small organic molecules.[8]
Example kernels
An example of a kernel between graphs is the random walk kernel,[5][6] which conceptually performs random walks on two graphs simultaneously and then counts the number of matching walks produced by both. This is equivalent to doing random walks on the direct product of the pair of graphs, and from this, a kernel can be derived that can be efficiently computed.[1]
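The sketch below illustrates the direct-product construction for a simple geometric random-walk kernel on unlabelled graphs. It is a simplified illustration rather than the exact formulation of the cited papers, and the damping parameter `lam` is an assumed constant that must stay below the reciprocal of the spectral radius of the product graph's adjacency matrix for the series to converge.

```python
import numpy as np
import networkx as nx

def geometric_random_walk_kernel(g1, g2, lam=0.01):
    """Sum of geometrically damped simultaneous walk counts on the product graph."""
    a1 = nx.to_numpy_array(g1)
    a2 = nx.to_numpy_array(g2)
    # Adjacency matrix of the direct (tensor) product graph: a walk in it
    # corresponds to a pair of simultaneous walks in g1 and g2.
    ax = np.kron(a1, a2)
    n = ax.shape[0]
    ones = np.ones(n)
    # Closed form of sum_k lam^k * (number of common walks of length k):
    # 1^T (I - lam * Ax)^(-1) 1, valid while lam < 1 / spectral_radius(Ax).
    return ones @ np.linalg.solve(np.eye(n) - lam * ax, ones)

# Example: compare a 4-cycle with a path on 4 nodes.
# k = geometric_random_walk_kernel(nx.cycle_graph(4), nx.path_graph(4))
```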
Another example is the Weisfeiler-Leman graph kernel,[9] which runs multiple rounds of the Weisfeiler-Leman algorithm and then computes the similarity of two graphs as the inner product of their histogram vectors. These histogram vectors collect, for every iteration, the number of times each color occurs in the graph. In theory the Weisfeiler-Leman kernel has infinite dimension, since the number of possible colors assigned by the Weisfeiler-Leman algorithm is infinite; by restricting the computation to the colors that occur in both graphs, it remains feasible.
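A minimal sketch of this idea, assuming node-labelled (or unlabelled) NetworkX graphs and using plain string concatenation in place of the compressed hash labels of the published algorithm:

```python
from collections import Counter
import networkx as nx

def wl_color_histogram(g, iterations=3):
    """Counts of every color seen in the initial labelling and each WL iteration."""
    colors = {v: str(g.nodes[v].get("label", "")) for v in g}
    hist = Counter(colors.values())
    for _ in range(iterations):
        # Relabel each node by its own color plus the sorted multiset of
        # neighbour colors (real implementations compress this with a hash).
        colors = {
            v: colors[v] + "|" + ",".join(sorted(colors[u] for u in g.neighbors(v)))
            for v in g
        }
        hist.update(colors.values())
    return hist

def wl_kernel(g1, g2, iterations=3):
    h1 = wl_color_histogram(g1, iterations)
    h2 = wl_color_histogram(g2, iterations)
    # Only colors that occur in both graphs contribute to the inner product,
    # which keeps the conceptually infinite feature space tractable.
    return sum(h1[c] * h2[c] for c in h1.keys() & h2.keys())

# Example:
# k = wl_kernel(nx.cycle_graph(5), nx.path_graph(5))
```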
See also
- Tree kernel, as a special case of non-cyclic graphs
- Molecule mining, as a special case of small multi-label graphs
References
- ^ a b c d S. V. N. Vishwanathan; Nicol N. Schraudolph; Risi Kondor; Karsten M. Borgwardt (2010). "Graph kernels" (PDF). Journal of Machine Learning Research. 11: 1201–1242.
- ^ L. Ralaivola; S. J. Swamidass; H. Saigo; P. Baldi (2005). "Graph kernels for chemical informatics". Neural Networks. 18 (8): 1093–1110. doi:10.1016/j.neunet.2005.07.009. PMID 16157471.
- ^ Haussler, David (1999). Convolution Kernels on Discrete Structures. CiteSeerX 10.1.1.110.638.
- ^ Risi Imre Kondor; John Lafferty (2002). Diffusion Kernels on Graphs and Other Discrete Input Spaces (PDF). Proc. Int'l Conf. on Machine Learning (ICML).
- ^ a b Thomas Gärtner; Peter A. Flach; Stefan Wrobel (2003). On graph kernels: Hardness results and efficient alternatives. Proc. the 16th Annual Conference on Computational Learning Theory (COLT) and the 7th Kernel Workshop. doi:10.1007/978-3-540-45167-9_11.
- ^ a b Hisashi Kashima; Koji Tsuda; Akihiro Inokuchi (2003). Marginalized kernels between labeled graphs (PDF). Proc. the 20th International Conference on Machine Learning (ICML).
- ^ Ghosh, Swarnendu; Das, Nibaran; Gonçalves, Teresa; Quaresma, Paulo; Kundu, Mahantapas (2018). "The journey of graph kernels through two decades". Computer Science Review. 27: 88–111. doi:10.1016/j.cosrev.2017.11.002.
- ^ Yu-Hang Tang; Wibe A. de Jong (2019). "Prediction of atomization energy using graph kernel and active learning". The Journal of Chemical Physics. 150 (4): 044107. arXiv:1810.07310. Bibcode:2019JChPh.150d4107T. doi:10.1063/1.5078640. PMID 30709286.
- ^ Shervashidze, Nino; et al. (2011). "Weisfeiler-Lehman graph kernels". Journal of Machine Learning Research. 12 (9).