Doktoranden-/Diplomandenseminar WS 08 /09

Wednesday, 11.02.09 14.30 FR6046

Thomas Vanck

Asset pricing via optimal Importance Sampling


Pricing complex derivatives is one of the main goals in financial mathematics. In the Black-Scholes model, for example, a closed-form price exists for European options. In many applications, however, no closed-form solution is available, and numerical methods must be used instead. Monte Carlo simulation is an appropriate method: path-dependent options, for instance, can easily be priced by generating a number of sample paths of the underlying asset. These sample paths are generated according to the distribution under an equivalent martingale measure. One then calculates the corresponding payoffs and averages over them; by the strong law of large numbers, this average approximates the fair price of the derivative. Pricing can become problematic for derivatives such as options that pay off only in a specific event of interest to the buyer. Suppose the payoff in such an event is very high, but the event itself is very rare. With a low number of simulated paths, the important event may never be hit, which yields a poor or incorrect price; a large number of simulations, on the other hand, incurs a huge computational cost and may still produce no samples hitting the important event. The solution to this dilemma is importance sampling, which pays more attention to crucial events; in the case of an option, such a crucial or rare event is one that generates a high payoff. The goal of importance sampling is variance reduction, achieved by a change of measure that accounts for important outcomes. Importance sampling has two advantages: it needs fewer samples and it is very accurate. Its drawback is that it is not straightforward to implement. In this talk, an approach for path-dependent options will be presented.
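The variance reduction can be illustrated with a minimal sketch: a deep out-of-the-money European call under Black-Scholes, priced once with plain Monte Carlo and once with importance sampling via a drift shift of the sampling distribution. All parameter values are made up for this sketch, and the simple drift shift used here is one common choice, not necessarily the optimal importance sampling scheme of the talk.

```python
import numpy as np

# Illustrative Black-Scholes parameters for a deep out-of-the-money
# European call (all values are assumptions for this sketch).
S0, K, r, sigma, T = 100.0, 160.0, 0.05, 0.2, 1.0
n = 100_000
rng = np.random.default_rng(0)

def discounted_payoff(z):
    """Discounted call payoff for a terminal log-normal draw driven by z."""
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    return np.exp(-r * T) * np.maximum(ST - K, 0.0)

# Plain Monte Carlo: most samples miss the rare in-the-money event.
plain = discounted_payoff(rng.standard_normal(n))

# Importance sampling: shift the sampling drift so that paths end near
# the strike, then reweight with the likelihood ratio
# dN(0,1)/dN(mu,1)(z) = exp(-mu*z + mu^2/2).
mu = (np.log(K / S0) - (r - 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
z = rng.standard_normal(n) + mu
is_est = discounted_payoff(z) * np.exp(-mu * z + 0.5 * mu**2)

print(f"plain MC: {plain.mean():.4f} (std err {plain.std(ddof=1) / np.sqrt(n):.4f})")
print(f"IS      : {is_est.mean():.4f} (std err {is_est.std(ddof=1) / np.sqrt(n):.4f})")
```

Both estimators are unbiased for the same price, but the importance sampling estimator concentrates its samples in the payoff region and therefore has a much smaller standard error.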

Wednesday, 14.01.09 15.30 FR6046

Leo Jugel

Wednesday, 14.01.09 14.30 FR6046

Felix Bießmann

Investigating neurovascular coupling using temporal cross-correlation between pharmacological MRI and electrophysiology


Despite its young age, functional Magnetic Resonance Imaging (fMRI) has become one of the most popular brain imaging techniques. However, the relationship between brain activity and the blood oxygen level dependent (BOLD) contrast measured with fMRI, the so-called neurovascular coupling, is not yet fully understood. Using pharmacological interventions to manipulate the neurovascular coupling mechanisms, in combination with simultaneous measurements of the electrophysiological and BOLD responses, we investigate the dependencies between neural activity and the fMRI signal. This imposes additional demands on the data analysis: electrophysiological data is very high dimensional in time, whereas fMRI data is typically very high dimensional in space. We developed a method based on canonical correlation analysis (CCA) that can deal with the high dimensionality of the fMRI signal while exploiting the high temporal resolution of the electrophysiological signal. After a short introduction to the experimental setup, I will present an extension of linear kernel CCA that finds the maximally correlated representation of the two data streams (electrophysiology and fMRI). The algorithm computes a temporal cross-correlation function between the two variables, instead of only a correlation coefficient as in classical CCA. This gives new insights into the coupling mechanisms between the two modalities, neural and vascular response, in particular with respect to their temporal dynamics. The filters computed with temporal CCA have a temporal dimension and are easy to interpret. Moreover, we show how the computed cross-correlation function can be used as a feature for classification tasks, e.g. for diagnostic purposes.
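The core idea, obtaining a correlation profile over time lags rather than a single coefficient, can be sketched with plain linear CCA on synthetic data. The lag embedding and the two toy signals below (a fast "electrophysiology-like" signal and a delayed "BOLD-like" response) are illustrative assumptions, not the actual fMRI pipeline or the kernelized algorithm of the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
T, lags, delay = 2000, 5, 3
s = rng.standard_normal(T + lags)

# x: fast signal; y: the same underlying source, delayed by 3 samples.
x = s[lags:] + 0.1 * rng.standard_normal(T)
y = s[lags - delay : T + lags - delay] + 0.1 * rng.standard_normal(T)

# Time-lag embedding of x: column k holds x shifted by k samples, so the
# canonical weights over columns form a temporal filter.
X = np.column_stack([np.roll(x, k) for k in range(lags)])[lags:]
Y = y[lags:, None]

def cca(A, B):
    """Plain linear CCA via whitening + SVD; returns weights and correlation."""
    A = A - A.mean(0); B = B - B.mean(0)
    Ua, Sa, Va = np.linalg.svd(A, full_matrices=False)
    Ub, Sb, Vb = np.linalg.svd(B, full_matrices=False)
    U, S, Vt = np.linalg.svd(Ua.T @ Ub)
    wa = Va.T @ (U[:, 0] / Sa)      # weights in original A coordinates
    wb = Vb.T @ (Vt[0] / Sb)        # weights in original B coordinates
    return wa, wb, S[0]

w_x, w_y, rho = cca(X, Y)
# The weight profile over lags peaks at the true delay: it plays the role
# of a temporal cross-correlation function, not just a single coefficient.
print("canonical correlation:", round(float(rho), 3))
print("peak lag:", int(np.argmax(np.abs(w_x))))
```

The recovered peak at lag 3 corresponds to the built-in delay between the two signals; in the neurovascular setting, such a temporal filter is what makes the coupling dynamics interpretable.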

Wednesday, 07.01.09 14.30 FR6046

Patrick Düssel & Christian Gehl

Incorporation of Application Layer Protocol Syntax into Anomaly Detection


The syntax of application layer protocols carries valuable information for network intrusion detection. Hence, the majority of modern IDS perform some form of protocol analysis to refine their signatures with application layer context. Protocol analysis, however, has been mainly used for misuse detection, which limits its application for the detection of unknown and novel attacks. In this contribution we address the issue of incorporating application layer context into anomaly-based intrusion detection. We extend a payload-based anomaly detection method by incorporating structural information obtained from a protocol analyzer. The basis for our extension is computation of similarity between attributed tokens derived from a protocol grammar. The enhanced anomaly detection method is evaluated in experiments on detection of web attacks, yielding an improvement of detection accuracy of 49%. While byte-level anomaly detection is sufficient for detection of buffer overflow attacks, identification of recent attacks such as SQL and PHP code injection strongly depends on the availability of application layer context.
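A minimal sketch of the idea of combining protocol structure with byte-level similarity is given below. The attributed tokens are (attribute, value) pairs as a protocol parser might produce for HTTP requests; the specific kernel (cosine similarity of byte n-grams, averaged over shared attributes) and the example requests are illustrative assumptions, not the authors' exact method.

```python
from collections import Counter

def ngrams(s, n=3):
    """Byte n-gram counts of a token value."""
    return Counter(s[i:i + n] for i in range(len(s) - n + 1))

def value_sim(a, b, n=3):
    """Cosine similarity between the n-gram counts of two values."""
    ca, cb = ngrams(a, n), ngrams(b, n)
    dot = sum(ca[g] * cb[g] for g in ca)
    na = sum(v * v for v in ca.values()) ** 0.5
    nb = sum(v * v for v in cb.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def token_kernel(req1, req2):
    """Average value similarity over attributes shared by both requests."""
    shared = set(req1) & set(req2)
    return sum(value_sim(req1[k], req2[k]) for k in shared) / max(len(shared), 1)

def score(req, train):
    """Anomaly score: one minus mean similarity to the normal training set."""
    return 1.0 - sum(token_kernel(req, t) for t in train) / len(train)

normal = [
    {"method": "GET", "path": "/index.html", "agent": "Mozilla/5.0"},
    {"method": "GET", "path": "/images/logo.png", "agent": "Mozilla/5.0"},
    {"method": "GET", "path": "/about.html", "agent": "Mozilla/5.0"},
]
probe = {"method": "GET", "path": "/contact.html", "agent": "Mozilla/5.0"}
attack = {"method": "GET", "path": "/item?id=1' OR '1'='1", "agent": "Mozilla/5.0"}

print("benign score:", round(score(probe, normal), 3))
print("attack score:", round(score(attack, normal), 3))
```

Because similarity is computed per protocol attribute, an injection payload confined to one field stands out even when the rest of the request looks perfectly normal, which is the intuition behind adding application layer context to payload-based detection.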

Thursday, 18.12.08 14.30 FR6046

Masanori Shimono

A prediction of success/failure of controlling perceptual switching


While several studies have shown classification of mental content from neural activity, no reports have succeeded in predicting future mental content from present MEG (magnetoencephalography) signals. We introduce research on predicting success or failure of intentional control of perceptual switching for the DDQ (Dynamical Dot Quartet) using single-trial MEG. This research consists of (1) an investigation of a new cognitive phenomenon and its neural correlate with MEG, and (2) an application of single-trial analysis to the MEG data.

In detail: (1) the investigation of a new cognitive phenomenon; (2) an application of the single-trial classification technique. In the second analysis, we used Signal Space Projection or ICA for single-trial classification.

Furthermore, I will introduce our ongoing projects using a MEG real-time neurofeedback system.

Wednesday, 17.12.08 14.30 FR6046

Alex Binder

Current Research Question in Theseus: Multiclass Classification with Taxonomies


The problem of image classification over semantic categories is usually solved by one-vs-all classifiers, which treat the semantic categories as a set without relations. We incorporate semantic information via multiclass classification over a taxonomy, which at the current stage is provided by hand. We show preliminary results on a 45-class problem (Caltech 101 Lifeforms), which are not yet satisfying, and present potential extensions to replace the hand-crafted taxonomy with a learned directed acyclic graph.

Wednesday, 03.12.08 16.00(!!) FR6046

Arno Onken

Modeling Short-term Noise Dependence of Spike Counts with Copulas


Correlations between spike counts are often used to analyze neural coding. The noise is typically assumed to be Gaussian. Yet this assumption is often inappropriate, especially for low spike counts. In this talk, I will present copulas as an alternative approach. With copulas it is possible to use arbitrary marginal distributions, such as Poisson or negative binomial, that are better suited for modeling the noise distributions of spike counts. Furthermore, copulas place a wide range of dependence structures at one's disposal and can be used to analyze higher-order interactions. I will introduce a framework for analyzing spike count data by means of copulas. Methods for parameter inference based on maximum likelihood estimates and for the computation of mutual information are provided.
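The construction can be sketched with the simplest member of the family: a Gaussian copula coupling two Poisson marginals. Correlated Gaussian draws are mapped through the normal CDF to correlated uniforms, which are then pushed through the Poisson inverse CDFs. The rates and the latent correlation below are illustrative values, not fitted spike data.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

def norm_cdf(x):
    """Standard normal CDF, applied elementwise."""
    return 0.5 * (1.0 + np.vectorize(erf)(x / sqrt(2.0)))

def poisson_ppf(u, lam, kmax=200):
    """Inverse CDF of a Poisson(lam) marginal via the accumulated pmf."""
    pmf = np.empty(kmax)
    pmf[0] = np.exp(-lam)
    for k in range(1, kmax):
        pmf[k] = pmf[k - 1] * lam / k
    return np.searchsorted(np.cumsum(pmf), u)

rho = 0.6                  # latent Gaussian correlation (illustrative)
lams = (2.0, 5.0)          # Poisson rates of two hypothetical neurons
n = 50_000
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(np.zeros(2), cov, size=n)
u = norm_cdf(z)            # correlated uniforms: the Gaussian copula
counts = np.column_stack([poisson_ppf(u[:, j], lams[j]) for j in range(2)])

print("marginal means:", counts.mean(0).round(2))   # close to the rates
print("count correlation:", round(float(np.corrcoef(counts.T)[0, 1]), 2))
```

The marginals stay exactly Poisson while the copula alone controls the dependence, which is precisely the separation that allows non-Gaussian spike count noise to be modeled with a chosen dependence structure.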

Wednesday, 26.11.08 14.30 FR6046

Siamac Fazli

Subject independent mental state classification in single trials


The current state of the art in Brain Computer Interfacing (BCI) involves tuning classifiers to subject-specific training data acquired from calibration sessions prior to functional BCI use. Using a large database of EEG recordings from 45 subjects who took part in movement imagination task experiments, we construct an ensemble of classifiers derived from subject-specific temporal and spatial filters. The ensemble is then sparsified using ℓ1 regularization such that the final classifier generalizes reliably to data of subjects not included in the ensemble. Our offline results indicate that BCI-naive users could start real-time BCI use without any prior calibration at only a very limited loss of performance.
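The sparsification step can be sketched on synthetic data: each column below stands in for the output of one subject-specific classifier, and an ℓ1-regularized least-squares fit (lasso, solved by iterative soft-thresholding) selects the few ensemble members that actually transfer. The data, the solver, and all parameter values are illustrative assumptions, not the study's EEG pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_classifiers = 400, 45
F = rng.standard_normal((n_trials, n_classifiers))  # ensemble member outputs
true_w = np.zeros(n_classifiers)
true_w[[3, 17, 30]] = [1.5, -2.0, 1.0]              # only a few members transfer
y = np.sign(F @ true_w + 0.5 * rng.standard_normal(n_trials))

def ista(F, y, alpha, iters=3000):
    """l1-regularized least squares (lasso) via iterative soft-thresholding."""
    lr = 1.0 / np.linalg.norm(F, 2) ** 2            # step size from spectral norm
    w = np.zeros(F.shape[1])
    for _ in range(iters):
        w = w - lr * (F.T @ (F @ w - y))            # gradient step
        w = np.sign(w) * np.maximum(np.abs(w) - lr * alpha, 0.0)  # shrinkage
    return w

w = ista(F, y, alpha=40.0)
support = np.flatnonzero(np.abs(w) > 1e-3)
acc = float(np.mean(np.sign(F @ w) == y))
print("selected ensemble members:", support)
print("training accuracy of sparse ensemble:", round(acc, 3))
```

The ℓ1 penalty drives the weights of uninformative members exactly to zero, so the final classifier only needs the outputs of a handful of subject-specific filters, which is what makes calibration-free operation for a new subject plausible.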

Wednesday, 19.11.08 11.30 (!!) FR6046

Frank Meinecke

Stationary Subspace Analysis


Non-stationarities are a ubiquitous phenomenon in real-world data, yet they challenge standard machine learning methods: if training and test distributions differ, we cannot, in principle, generalise from the observed training sample to the test distribution. This affects both supervised and unsupervised learning algorithms. We present a new method to decompose multivariate time series into a stationary and a non-stationary subspace and show (on a BCI example) that restricting the classification to the stationary subspace can improve the classification performance.
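A simplified two-epoch, covariance-only variant of this idea can be sketched as follows (the full SSA algorithm is more general; the sources and mixing matrix below are synthetic): mix one stationary and one non-stationary source, whiten with the average covariance, and diagonalize the covariance difference between the two epochs. Directions with near-zero eigenvalue have equal variance in both epochs and thus span the stationary subspace.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 20_000
s_stat = rng.standard_normal(T)                      # stationary source
scale = np.where(np.arange(T) < T // 2, 1.0, 3.0)    # variance jump at T/2
s_nonstat = scale * rng.standard_normal(T)           # non-stationary source
A = np.array([[1.0, 0.5], [0.3, 1.0]])               # mixing matrix
X = np.column_stack([s_stat, s_nonstat]) @ A.T       # observed mixtures

# Covariances in the two epochs.
C1 = np.cov(X[: T // 2].T)
C2 = np.cov(X[T // 2 :].T)

# Whiten with the average covariance, then diagonalize the difference:
# an eigenvector with (near-)zero eigenvalue has equal variance in both
# epochs, i.e. it points into the stationary subspace.
d, E = np.linalg.eigh(0.5 * (C1 + C2))
W = E @ np.diag(d ** -0.5) @ E.T                     # symmetric whitening
lam, V = np.linalg.eigh(W @ (C2 - C1) @ W)
w_stat = W @ V[:, np.argmin(np.abs(lam))]            # most stationary direction

s_hat = X @ w_stat
ratio = s_hat[T // 2 :].var() / s_hat[: T // 2].var()
print("variance ratio of recovered source across epochs:", round(ratio, 2))
```

The projection onto this direction recovers the stationary source up to scale, so its variance is (approximately) the same in both halves of the recording; a classifier restricted to such directions sees a distribution that does not drift between training and test.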

IDA Wiki: Main/DoktorandenSeminarWS0809 (last edited 2009-04-27 13:52:40 by MikioBraun)