Novelty detection using kernel null Foley-Sammon transform
Babel Raïssa GUEMDJO KAMDEM
Higher School of Economics and Commerce, University of Douala, Cameroon
September 11, 2024
1 Introduction
This research project follows the call for proposals "Science by Women" launched by the African Women's Foundation. The call for proposals consists of a 6-month postdoctoral fellowship program in collaboration with recognized Spanish Centers of Excellence. The research project that we propose matches closely the themes explored at the Instituto de Ciencias Matemáticas (ICMAT), Madrid.
During this postdoctoral stay, we wish to use multi-class open set recognition with the probability of inclusion (PI-SVM) introduced in [1] to extend novelty detection [2] to multi-class recognition. Support Vector Machines are popular for solving problems in classification, regression, and novelty detection [3].
2 The (kernel) null space methods
We consider input features $X = \{x_1, x_2, \ldots, x_N\} \subset \mathbb{R}^n$ clustered into $c$ different classes $\mathcal{C} := \{C_1, C_2, \ldots, C_c\}$, and denote by $N_j$ the number of samples in the class $C_j$. We denote by $m_j$ the mean of the samples in class $C_j$ and by $m$ the mean of all samples in $X$. The between-class scatter matrix $S_b$ and the within-class scatter matrix $S_w$ are defined by:
$$S_b := \sum_{j=1}^{c} N_j (m_j - m)(m_j - m)^T, \qquad S_w := \sum_{j=1}^{c} \sum_{k=1}^{N_j} (x_{j,k} - m_j)(x_{j,k} - m_j)^T,$$
where $x_{j,k}$ belongs to the class $C_j$ for $k = 1, \ldots, N_j$. The total scatter matrix is $S_t := S_b + S_w$.
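As a concrete illustration, the scatter matrices above can be computed directly from a sample matrix. The following is a minimal NumPy sketch; the array names X and y are assumptions for the illustration, not notation from this proposal:

import numpy as np

def scatter_matrices(X, y):
    """Return (S_b, S_w, S_t) for samples X of shape (N, n) with class labels y."""
    N, n = X.shape
    m = X.mean(axis=0)                      # global mean m of all samples
    S_b = np.zeros((n, n))
    S_w = np.zeros((n, n))
    for cls in np.unique(y):
        X_j = X[y == cls]                   # samples x_{j,k} of class C_j
        N_j = X_j.shape[0]
        m_j = X_j.mean(axis=0)              # class mean m_j
        d = (m_j - m).reshape(-1, 1)
        S_b += N_j * d @ d.T                # N_j (m_j - m)(m_j - m)^T
        D = X_j - m_j
        S_w += D.T @ D                      # sum_k (x_{j,k} - m_j)(x_{j,k} - m_j)^T
    return S_b, S_w, S_b + S_w              # S_t = S_b + S_w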
Linear Discriminant Analysis (LDA) aims to maximize the Fisher discriminant criterion
$$J(W) := \frac{W^T S_b W}{W^T S_w W}, \tag{1}$$
with $W \in \mathbb{R}^{n \times (c-1)}$, that is, to maximize the between-class scatter while minimizing the within-class scatter. The idea of frequently used LDA algorithms is to find a matrix $W$ that simultaneously diagonalizes $S_w$ and $S_b$. Since the diagonalization procedure starts with $S_w$, it requires $S_w$ to be non-singular, which is not the case in general. For instance, $S_w$ is necessarily singular when the number $N$ of training samples is smaller than the dimension $n$ of the sample space, as happens in most face recognition tasks.
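This singularity can be checked numerically. The short sketch below reuses the hypothetical scatter_matrices helper above with made-up dimensions, and shows that S_w cannot have full rank when N < n:

import numpy as np

rng = np.random.default_rng(0)
N, n, c = 20, 100, 3                        # fewer samples than dimensions
X = rng.normal(size=(N, n))
y = rng.integers(0, c, size=N)

S_b, S_w, S_t = scatter_matrices(X, y)
print("rank(S_w) =", np.linalg.matrix_rank(S_w), "< n =", n)   # S_w is singular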
Null space methods, or NFST (for Null Foley-Sammon Transform), are usually used to overcome this singularity problem. NFST consists of projecting the image space onto
a lower-dimensional subspace. This procedure removes the null space from both $S_w$ and $S_b$, and therefore potentially loses useful information. It is not needed when the number of training samples exceeds the dimension of the sample space, so NFST is only suited to the small sample size problem (SSSP). Let us recall that the null spaces of $S_b$, $S_w$ and $S_t$ are defined by:
$$Z_b := \{z \in \mathbb{R}^n \,;\, S_b z = 0\}, \tag{2}$$
$$Z_w := \{z \in \mathbb{R}^n \,;\, S_w z = 0\}, \tag{3}$$
$$Z_t := \{z \in \mathbb{R}^n \,;\, S_t z = 0\} = Z_b \cap Z_w. \tag{4}$$
We refer to $Z_t$ as the null space, and the null space of a given class $C_j$ is defined by $N(C_j) = Z_t \cap C_j$.
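A linear NFST step can be sketched as follows: we look for directions $w$ with $w^T S_w w = 0$ but $w^T S_b w > 0$, i.e. directions in the null space of $S_w$ that do not lie in the joint null space $Z_t$. The code below is only an illustrative sketch built on the hypothetical scatter_matrices helper above, intended for the small sample size setting, not the algorithm to be developed during the stay:

import numpy as np

def nfst_directions(X, y, tol=1e-10):
    """Columns of the returned matrix W satisfy w^T S_w w = 0 and w^T S_b w > 0."""
    S_b, S_w, _ = scatter_matrices(X, y)
    # Orthonormal basis Q of the null space Z_w of S_w (eigenvalues ~ 0).
    w_vals, w_vecs = np.linalg.eigh(S_w)
    Q = w_vecs[:, w_vals < tol * max(w_vals.max(), 1.0)]
    # Inside Z_w, keep only directions that still carry between-class scatter,
    # which discards the joint null space Z_t = Z_b ∩ Z_w.
    M = Q.T @ S_b @ Q
    b_vals, b_vecs = np.linalg.eigh(M)
    V = b_vecs[:, b_vals > tol * max(b_vals.max(), 1.0)]
    return Q @ V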
To overcome the restriction on the training sample size and allow more flexibility in the model, the Kernel Null Foley-Sammon Transform (KNFST) has been introduced [4, 5]. In this method, features are mapped implicitly to a kernel feature space through a kernel function.
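The kernel function itself is not fixed by the proposal; as an assumed example, a Gaussian (RBF) kernel produces the Gram matrix on which a kernelized variant such as KNFST [4, 5] operates instead of the raw features:

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2) between two sample sets."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)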
3 Novelty detection with null space methods
The task in multi-class novelty detection is to compute a score indicating whether a test sample belongs to one of the classes in $\mathcal{C}$, no matter which one. For each class $C_j$ in $\mathcal{C}$, we determine one target point $t_j$ corresponding to the projection of the class samples into the null space. To compute the score of a test sample $x$, we first project $x$ to a point $t$ in the null space, and the novelty score of $x$ is given by
$$\mathrm{MultiClassNovelty}(x) := \min_{1 \le j \le c} \mathrm{dist}(t, t_j). \tag{5}$$
The larger the score, and thus the minimum distance in the null space, the more novel the test sample. Note that an arbitrary distance measure can be incorporated; we use Euclidean distances in our experiments.
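A direct reading of Eq. (5) in code, assuming a null-space projection W (e.g. from the hypothetical nfst_directions sketch above) and Euclidean distances, could look as follows. In the null space all samples of one class collapse to a single target point, so each $t_j$ can be taken as the projected class mean:

import numpy as np

def class_targets(X, y, W):
    """Target points t_j: projections of the class means into the null space."""
    return np.vstack([W.T @ X[y == cls].mean(axis=0) for cls in np.unique(y)])

def multi_class_novelty(x, W, targets):
    """Eq. (5): minimum distance between the projected test sample and the t_j."""
    t = W.T @ x                                   # projection t of the test sample x
    dists = np.linalg.norm(targets - t, axis=1)   # dist(t, t_j), Euclidean
    return dists.min()                            # the larger, the more novel x is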
4 The work to do during the stay
The work to do during the stay in Madrid includes:
- describing a clear algorithm to detect a novelty class using the kernel null Foley-Sammon transform;
- applying this pattern recognition method to detect chromosomal abnormalities;
- studying the convergence of a regression problem.
References
[1] L. P. Jain, W. J. Scheirer, and T. E. Boult, “Multi-class open set recognition using probability of
inclusion,” in European Conference on Computer Vision, pp. 393–409, Springer, 2014.
[2] P. Bodesheim, A. Freytag, E. Rodner, M. Kemmler, and J. Denzler, "Kernel null space methods for novelty detection," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 3374–3381, 2013.
[3] C. M. Bishop, Pattern Recognition and Machine Learning. Springer, 2006.
[4] W. Zheng, L. Zhao, and C. Zou, "Foley-Sammon optimal discriminant vectors using kernel approach," IEEE Transactions on Neural Networks, vol. 16, no. 1, pp. 1–9, 2005.
[5] G. Gu, H. Liu, J. Shen, et al., "Kernel null Foley-Sammon transform," in 2008 International Conference on Computer Science and Software Engineering, vol. 1, pp. 981–984, IEEE, 2008.