
ISSN ONLINE(2320-9801) PRINT (2320-9798)


Regularized Sparse Kernel SFA with Decorrelation Filtering For Separating Correlated Sources

Rekha P1, S. Shobana, M.E. 2
  1. Second M.E CSE, Dept. of CSE, Dhanalakshmi Srinivasan College of Engineering, Coimbatore, India
  2. Assistant Professor, Dept. of CSE, Dhanalakshmi Srinivasan College of Engineering, Coimbatore, India

Published in the International Journal of Innovative Research in Computer and Communication Engineering.


Advances in digital image processing have accelerated in the past few years. Blind source separation is an important research area with numerous applications in signal processing, image processing, telecommunication, and speech recognition. In this paper, blind source separation is performed using Slow Feature Analysis (SFA). It is necessary to use multivariate SFA instead of univariate SFA for separating multi-dimensional signals. This paper makes use of Regularized Sparse Kernel SFA (RSK-SFA) instead of multivariate SFA and applies it to the problem of blind source separation, in particular to image separation. Here the kernel trick is used in combination with sparsification to provide a powerful function class for large data sets. Sparsity is achieved by a novel matching pursuit approach that can be applied to other tasks as well. For small but complex data sets, the kernel SFA approach leads to over-fitting and numerical instabilities. To enforce a stable solution, we introduce regularization into the SFA objective. If the original sources are correlated, perfect separation is not possible; therefore, a decorrelation filter is applied to the image mixtures before SFA to separate the correlated sources. For SFA, when the number of mixtures is greater than or equal to the number of sources, the paper demonstrates how to determine the actual number of sources via a regularization technique.


Image separation, Blind Source Separation, Slow Feature Analysis, Regularized Sparse Kernel SFA, Decorrelation Filter.


In signal and image processing, there are many instances where a set of observations is available and we wish to recover the sources generating these observations. This problem, known as blind source separation (BSS) ([3][20]), is an important and exciting domain of research, with numerous potential applications in image processing, telecommunication, speech recognition, etc. The linear blind source separation problem seeks to unmix unknown signals that have been mixed by an unknown linear method. It refers to a wide class of methods in signal and image processing which extract the underlying sources from a set of mixtures with almost no prior knowledge of either the sources or the mixing process. Firstly, we show how to use Slow Feature Analysis (SFA) [6] for the problem of linear blind source separation for signals of any finite dimension. Secondly, we introduce a technique, decorrelation filtering, for separating correlated sources.
Slow feature analysis (SFA) ([10][17][19]) is a method for learning invariant or slowly varying features from a vectorial input signal. It is based on a nonlinear expansion of the input signal and application of principal component analysis to this expanded signal and its time derivative. It is guaranteed to find the optimal solution within a family of functions directly and can learn to extract a large number of decorrelated features, which are ordered by their degree of invariance. In this work, we generalize the multivariate formulation of SFA to regularized sparse kernel SFA, which is suitable for vector-valued sources in several variables. This allows us to accurately unmix several color images.
Slow feature analysis [6] is based on the slowness principle: given a multidimensional input signal, the aim is to find scalar input-output functions that generate output signals which vary as slowly as possible while still carrying significant information, with the outputs being uncorrelated. SFA is guaranteed to find a globally optimal solution within the finite-dimensional function space and thus does not suffer from local optima. It yields a set of uncorrelated output signals ordered by slowness. The existing work generalizes univariate SFA to multivariate SFA, which allows several color images to be unmixed. For standard SFA, if the original sources are correlated, accurate separation is not possible. In this case, decorrelation filtering is applied before slow feature analysis to obtain an accurate separation.
In this paper, image separation is performed in the following steps. First, a linear filter is used to decorrelate the sources and their derivatives in the mixtures. Regularized sparse kernel SFA is then applied to the filtered mixtures to obtain an unmixing matrix. Finally, this unmixing matrix is applied to the original mixtures to recover the original sources. The decorrelation filtering reduces correlations between the original sources and their derivatives, leading to a significant improvement of SFA.
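The linear-SFA unmixing at the core of this scheme can be sketched in NumPy. This is a minimal illustration with two synthetic one-dimensional sources in place of images; the sphering step and the derivative-covariance eigendecomposition follow the standard linear SFA recipe, and all variable names are our own.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 2000)

# Two uncorrelated synthetic sources varying at different speeds.
s = np.vstack([np.sin(2 * t), np.sin(9 * t)])   # shape (2, T)
M = rng.normal(size=(2, 2))                     # unknown invertible mixing
x = M @ s                                       # observed mixtures

# Linear SFA: sphere the mixtures, then find the directions whose
# time derivative has the smallest variance.
x = x - x.mean(axis=1, keepdims=True)           # zero mean
d, U = np.linalg.eigh(x @ x.T / x.shape[1])
Sph = U / np.sqrt(d)                            # sphering matrix (unit variance)
z = Sph.T @ x                                   # whitened mixtures
dz = np.diff(z, axis=1)                         # finite-difference derivative
_, V = np.linalg.eigh(dz @ dz.T / dz.shape[1])  # ascending: slowest first
W = (Sph @ V).T                                 # unmixing matrix
y = W @ x                                       # recovered sources, slowest first
```

For uncorrelated sources like these, each recovered component correlates almost perfectly (up to sign and scale) with one original source; the decorrelation filter of the paper is needed precisely when this assumption fails.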


The existing system generalizes the original univariate SFA, which is suitable for scalar-valued sources in one variable, to a multivariate SFA, which is suitable for vector-valued functions in several variables. Two main mathematical approaches are presented in [18]. The first is Slow Feature Analysis for linear blind source separation. The second is Decorrelation Filtering, which aims to minimize the correlations between the original sources and their derivatives.
Many results are available in the literature on the problem of image separation. Some algorithms utilizing ICA for image separation include: [11] for separating lighting and reflections; [12][13] for separating astrophysical images; and [4] for separating real-life, nonlinear mixtures of documents acquired through scanning, using both linear and nonlinear ICA. This approach to image separation is different from ours. There have recently been many examples of using sparsity for source separation, but their underlying assumption, namely a sparse representation of the signals, is different from those in our method. Algorithms related to our decorrelation filtering technique have been presented in [14] and [20]. In [14], ICA is applied to the innovation processes of time-dependent stochastic processes. In [20], under the assumption that the original sources can be dependent but their sub-components in some frequency bands are mutually independent, the authors estimate a filter to extract the independent sub-components and apply ICA to the filtered sources; the filter is estimated by minimizing the mutual information between outputs.


This paper provides such an extension in the form of a kernel SFA algorithm. Such an approach has previously been taken by Bray and Martinez [2] and is reported to work well with a large image data set. Small and complex sets, however, lead to numerical instabilities in any kernel SFA algorithm. Our goal is to provide an algorithm that can be applied to both of the above cases.
There are several drawbacks to a kernel approach to SFA. First, choosing feature mappings from a powerful Hilbert space is naturally prone to over-fitting. Moreover, kernel SFA shows numerical instabilities due to its unit variance constraint. This tendency has been shown analytically for the related kernel canonical correlation analysis [10]. We introduce a regularization term into the SFA objective to enforce a stable solution. Second, kernel SFA is based on a kernel matrix whose size grows with the training set, which is not feasible for large training sets. Our approach approximates the optimal solution by projecting onto a sparse subset of the data. In the following section, we first introduce the general SFA optimization problem and derive a regularized sparse kernel SFA algorithm.


Slow feature analysis is an unsupervised learning algorithm for extracting slowly varying features from a quickly varying input signal. It has been successfully applied to the self-organization of complex-cell receptive fields, the recognition of whole objects invariant to spatial transformations, the self-organization of place cells, the extraction of driving forces, and nonlinear blind source separation. Slow feature analysis is based on the slowness principle: given a multidimensional input signal, the aim is to find scalar input-output functions that generate output signals which vary as slowly as possible while still carrying significant information, with the outputs being uncorrelated. It is guaranteed to find a globally optimal solution within the finite-dimensional function space and thus does not suffer from local optima. It yields a set of uncorrelated output signals ordered by slowness.
The objective of SFA is to minimize, for each output component y, the Δ-value
Δ(y) = ⟨ẏ²⟩t ,
subject to zero mean ⟨y⟩t = 0, unit variance ⟨y²⟩t = 1, and decorrelation of the outputs. The angular brackets, ⟨·⟩t, indicate averaging over time, and ẏ is the derivative of y with respect to time. The Δ-value is the objective of the optimization problem and measures the slowness of an output signal as the time average of its squared derivative. A low value indicates small variations over time, and therefore a slowly varying signal.
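As a small numerical illustration, the Δ-value can be approximated with finite differences; the function name and test signals below are our own choices, not the paper's.

```python
import numpy as np

def delta_value(y):
    """Slowness of a signal: time average of its squared derivative,
    after normalizing the signal to zero mean and unit variance."""
    y = (y - y.mean()) / y.std()
    return np.mean(np.diff(y) ** 2)

t = np.linspace(0, 2 * np.pi, 1000)
slow, fast = np.sin(t), np.sin(20 * t)   # same energy, different speed
```

Because both signals are normalized to unit variance, the faster sinusoid has a Δ-value roughly 20² = 400 times larger, so Δ ranks signals by slowness as intended.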


In blind source separation, the input signal is assumed to be a mixture of some sources and the task is to recover the sources without any detailed knowledge about the sources or the mixing. Since nothing is known about the sources in detail, the separation can only be done on the basis of some general statistical properties. Assume you had two correlated images and a linear mixing matrix available. Then the normal procedure of mixing and unmixing would be to mix the images with the mixing matrix and then apply Regularized sparse kernel SFA to it to recover the images. This would not work, since the original images are correlated but the extracted images are uncorrelated by construction.
To avoid this, in this paper we apply the decorrelation filter to the images, then mix the filtered images and apply SFA to do the unmixing. This yields a proper unmixing matrix that can also be applied to the mixtures of the original images. The procedure is as follows:
1) Filter the mixed original images to obtain the mixed filtered images.
2) Apply the linear unmixing algorithm.
3) Apply the unmixing matrix to the mixed original images to extract the estimated original images.
We illustrate this procedure in the schematic diagram shown in Figure 1. To make this all work, the filter must be designed such that it actually decorrelates the images.
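The three-step procedure above can be sketched as follows, again with synthetic one-dimensional signals in place of images, and with a simple moving-average kernel as a stand-in for an actual learned decorrelation filter. The helper sfa_unmixing is a plain linear SFA, not the paper's RSK-SFA; the key structural point is that the unmixing matrix is estimated on the filtered mixtures but applied to the original ones.

```python
import numpy as np

def sfa_unmixing(x):
    """Plain linear SFA: return an unmixing matrix, slowest component first."""
    x = x - x.mean(axis=1, keepdims=True)
    d, U = np.linalg.eigh(x @ x.T / x.shape[1])
    Sph = U / np.sqrt(d)                        # sphering (unit variance)
    dz = np.diff(Sph.T @ x, axis=1)
    _, V = np.linalg.eigh(dz @ dz.T / dz.shape[1])
    return (Sph @ V).T

rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 2000)
s = np.vstack([np.sin(3 * t), np.sin(11 * t)])
x = rng.normal(size=(2, 2)) @ s                 # mixed original signals

# Step 1: filter the mixtures (a moving average stands in for a
# learned decorrelation filter; linear filtering and mixing commute).
kernel = np.ones(5) / 5.0
fx = np.vstack([np.convolve(row, kernel, mode='same') for row in x])

# Step 2: estimate the unmixing matrix on the *filtered* mixtures.
W = sfa_unmixing(fx)

# Step 3: apply W to the *original* mixtures to recover the sources.
y = W @ x
```

Because filtering commutes with the mixing, the matrix W estimated in step 2 also unmixes the unfiltered mixtures in step 3.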
In this paper, Regularized sparse kernel SFA is used in combination with decorrelation filtering. It yields perfect separation if there are more mixtures than sources. In this case we make use of a regularization parameter. Consider the case where the number of distinct sources is N1 and the number of mixtures is N2, with N2 > N1. We can consider x as a mixture of N2 sources, only N1 of which are distinct, mixed by a random, invertible matrix M of size N2 × N2.
Theorem 1: Assume that the number of mixtures in x is greater than or equal to the number of distinct sources in s, and that the distinct sources satisfy E(si) = 0, E(si sj) = δij, and E(ṡi ṡj) = 0 for i ≠ j, 1 ≤ i, j ≤ N1. Assume further that the mixing matrix satisfies M Mᵀ = I, that is, M is an orthogonal matrix. Then the number of distinct sources N1 is the number of strictly positive eigenvalues of the generalized eigenvalue problem
Aw = λ(B + γI)w ………(1)
for any regularization parameter γ > 0. Furthermore, the solution y is independent of M.
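The source-counting claim of Theorem 1 can be illustrated numerically. The matrices A and B below are hypothetical stand-ins, taken as the covariance of the mixtures and of their derivatives respectively, rather than the paper's exact definitions; the point of the sketch is that, with B + γI positive definite, the number of strictly positive generalized eigenvalues equals the rank of A and hence the number of distinct sources.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)
t = np.linspace(0, 2 * np.pi, 2000)

# N1 = 2 distinct sources observed through N2 = 4 mixtures.
s = np.vstack([np.sin(2 * t), np.sin(7 * t)])
M = rng.normal(size=(4, 2))                 # tall mixing matrix
x = M @ s
x = x - x.mean(axis=1, keepdims=True)

# Hypothetical stand-ins for A and B in Eq. (1): the covariance of
# the mixtures and of their derivatives, respectively.
A = x @ x.T / x.shape[1]
dx = np.diff(x, axis=1)
B = dx @ dx.T / dx.shape[1]

gamma = 1e-3                                # regularization parameter
vals = eigh(A, B + gamma * np.eye(4), eigvals_only=True)
n_sources = int(np.sum(vals > 1e-8))        # strictly positive eigenvalues
```

With only two distinct sources feeding four mixtures, A has rank 2, and the generalized eigenvalue count recovers N1 = 2 regardless of the particular mixing matrix.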
Decorrelation is a general term for any process used to reduce autocorrelation within a signal, or cross-correlation within a set of signals, while preserving other aspects of the signal. A frequently used method of decorrelation is a matched linear filter that reduces the autocorrelation of a signal as far as possible. Since the minimum possible autocorrelation for a given signal energy is achieved by equalizing the power spectrum of the signal to be similar to that of white noise, this is often referred to as signal whitening. The primary purpose of a decorrelation stretch is visual enhancement: decorrelation stretching enhances the color differences in an image. In image processing, decorrelation techniques can thus be used to enhance, or stretch, the colour differences found in each pixel of an image; this is generally termed 'decorrelation stretching'.
S = decorrstretch(A)
S = decorrstretch(A, name, value, ...)
S = decorrstretch(A) applies a decorrelation stretch to an m-by-n-by-nBands image A and returns the result in S. S has the same size and class as A, and the mean and variance in each band are the same as in A. A can be an RGB image or can have any number of spectral bands. To achieve perfect separation, decorrelation filtering for separating correlated sources is applied before multivariate SFA. The key idea behind decorrelation filtering is the commutativity between linear mixing and linear filtering: for every linear filter operator F and any mixing matrix M, FMs = MFs. Applying the decorrelation filter to the image mixture yields the filtered mixtures, as in Fig. 2. These are then given as input to SFA, so that SFA can work well with the filtered mixtures.
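The commutativity FMs = MFs can be checked directly for any linear filter; the difference kernel below is just one example, and the names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
s = rng.normal(size=(2, 500))               # two source signals
M = rng.normal(size=(2, 2))                 # mixing matrix

def F(z, kernel=np.array([1.0, -1.0])):
    """Apply the same linear filter to every channel (row)."""
    return np.vstack([np.convolve(row, kernel, mode='valid') for row in z])

lhs = F(M @ s)      # filter the mixtures
rhs = M @ F(s)      # mix the filtered sources
```

Both orders produce identical results because convolution is linear, which is exactly why an unmixing matrix estimated on filtered mixtures remains valid for the unfiltered ones.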
The RSK-SFA algorithm (Algorithm 1) [3] is closely related to the linear SFA algorithm of Wiskott and Sejnowski [19]. It consists of three phases: (1) fulfilling zero mean by centering, (2) fulfilling unit variance and decorrelation by sphering, and (3) minimizing the objective by rotation. For unit variance and decorrelation, analogous to linear SFA, we first project into the normalized eigenspace of (1/n) KKᵀ =: UΛUᵀ. This procedure is called sphering or whitening.
Algorithm 1 Regularized Sparse Kernel Slow Feature Analysis (RSK-SFA)
Given a mixture x = Ms:
1) Apply a decorrelation filter F to x.
2) Apply the regularized sparse kernel SFA algorithm to the filtered mixture Fx to obtain an unmixing matrix W.
3) Output: y = Wᵀx = WᵀMs.
The Hilbert spaces corresponding to some of the most popular kernels are equivalent to the infinite-dimensional space of continuous functions [17]. Depending on the hyper-parameter σ and the data distribution, this can obviously lead to over-fitting. Less obvious, however, is the tendency of kernel SFA to become numerically unstable for large σ, i.e. to violate the unit variance constraint. Fukumizu et al. [10] have shown this analytically for the related kernel canonical correlation analysis. Note that both problems do not affect sufficiently sparse solutions, as sparsity reduces the function complexity and sparse kernel matrices KKᵀ are more robust with respect to eigenvalue decompositions. One countermeasure is to introduce a regularization term to stabilize the sparse kernel SFA algorithm, which is thereafter called regularized sparse kernel SFA (RSK-SFA).


First apply the decorrelation filter to the images, then mix the filtered images and apply SFA to do the unmixing. This yields a proper unmixing matrix that can also be applied to the mixtures of the original sources. If the original images are not available, but only the mixed images and the mixed filtered images, one can proceed as follows: first determine the unmixing matrix on the mixture of the filtered images, then apply the unmixing matrix to the mixture of the original images to extract the sources. The experimental results are shown in the following figures. An image mixture is given as input to the decorrelation filter, which produces a decorrelated output. SFA is then applied to it, yielding the original image sources.
When we compare regularized sparse kernel SFA with multivariate SFA, it gives perfect separation, that is, a correlation factor almost equal to one. The graph comparing multivariate SFA and regularized sparse kernel SFA is given below:


In this paper two main mathematical approaches are presented. The first is regularized slow feature analysis for image separation of general vector-valued, multivariate signals. The method does not make any assumption on the statistical distributions of the signals except for spatial smoothness. It is simple to implement and provides a unique global solution under mild conditions. Since SFA perfectly separates two signals if and only if they and their derivatives are uncorrelated, a second method is developed, namely decorrelation filtering, which aims to minimize the correlations between the original sources and their derivatives. Decorrelation filtering can be applied as a preprocessing step not just to SFA but, in general, to any linear separation technique whose outputs are uncorrelated, such as ICA. The decorrelation filters perform well on images of different types, namely textures, man-made objects, and astronomical images, showing the generalization capability of the proposed approach. For SFA, when the number of mixtures is greater than or equal to the number of sources, we have demonstrated how to determine the actual number of sources via a simple regularization technique. To provide a powerful but easily operated algorithm that performs nonlinear slow feature analysis, we propose a method that makes use of a kernelized SFA algorithm (RSK-SFA). Our results, both theoretical and numerical, show that SFA and decorrelation filtering are promising techniques that can be exploited in various blind source separation problems in signal and image processing.


[1] M. Abramowitz and I. Stegun, Handbook of Mathematical Functions: With Formulas, Graphs, and Mathematical Tables, 2nd ed. Boston, MA, USA: National Bureau of Standards, 1964.

[2] A. Bray and D. Martinez, “Kernel-based extraction of slow features: Complex cells learn disparity and translation invariance from natural images,” in Advances in Neural Information Processing Systems, vol. 15, 2002, pp. 253–260.

[3] W. Böhmer, S. Grünewälder, H. Nickisch, and K. Obermayer, “Regularized sparse kernel slow feature analysis,” in Proc. ECML PKDD, 2011.

[4] L. B. Almeida, “Separating a real-life nonlinear image mixture,”J. Mach. Learn. Res., vol. 6, pp. 1199–1230, May 2005.

[5] S. Amari, A. Cichocki, and H. Yang, “A new learning algorithm for blind source separation,” in Advances in Neural Information Processing Systems, vol. 8. Cambridge, MA, USA: MIT Press, 1996, pp. 757–763.

[6] T. Blaschke, P. Berkes, and L. Wiskott, “What is the relation between slow feature analysis and independent component analysis?” Neural Comput., vol. 18, no. 10, pp. 2495–2508, Oct. 2006.

[7] S. Choi, A. Cichocki, and Y. Deville, “Differential decorrelation for nonstationary source separation,” in Proc. Independ. Compon. Anal., 2001, pp. 319–322.

[8] P. Comon, “Separation of stochastic processes,” in Proc. Workshop Higher-Order Spectral Anal., Jun. 1989, pp. 174–179.

[9] S. Dähne, J. Höhne, M. Schreuder, and M. Tangermann, “Slow feature analysis: A tool for extraction of discriminating event-related potentials in brain-computer interfaces,” in Proc. Int. Conf. Artif. Neural Netw., 2011, pp. 36–43.

[10] A. N. Escalante-B. and L. Wiskott, “Slow feature analysis: Perspectives for technical applications of a versatile learning algorithm,” Künstl. Intell., vol. 26, no. 4, pp. 341–348, 2012.

[11] H. Farid and E. H. Adelson, “Separating reflections and lighting using independent components analysis,” in Proc. Comput. Vis. Pattern Recognit., 1999, pp. 262–267.

[12] M. Funaro, E. Oja, and H. Valpola, “Independent component analysis for artefact separation in astrophysical images,” Neural Netw., vol. 16, nos. 3– 4, pp. 469–478, 2003.

[13] A. Homayounzadeh and M. Yazdi, “Astrophysical image separation using independent component analysis,” in Proc. Int. Conf. Digit. Image Process., Mar. 2009, pp. 275–278.

[14] A. Hyvärinen, “Independent component analysis for time-dependent stochastic processes,” in Proc. Int. Conf. Artif. Neural Netw., 1998, pp. 541–546.

[15] A. Olmos and F. Kingdom. (2004). McGill Calibrated Colour Image Database[Online]. Available:

[16] R. Szeliski, S. Avidan, and P. Anandan, “Layer extraction from multiple images containing reflections and transparency,” in Proc. Comput. Vis. Pattern Recognit., 2000, pp. 246–253.

[17] L. Wiskott, “Slow feature analysis: A theoretical analysis of optimal free responses,” Neural Comput., vol. 15, no. 9, pp. 2147–2177, 2003.

[18] L. Wiskott and H. Q. Minh, “Multivariate slow feature analysis and decorrelation filtering for separating correlated sources,” IEEE Trans. Image Process., vol. 22, no. 9, Jul. 2013.

[19] L. Wiskott and T. Sejnowski, “Slow feature analysis: Unsupervised learning of invariances,” Neural Comput., vol. 14, no. 4, pp. 715–770, 2002.

[20] K. Zhang and L. Chan, “Enhancement of source independence for blind source separation,” in Proc. Independ. Compon. Anal., 2006, pp. 731–738.