Stupidity
Arrogance
Laziness
Carefreeness
Beauty
Rebellion
Humor
You're probably here to check what kind of research I have done and am currently doing. For that, feel free to scroll!
For everything else, check left.
Except for my CV... CVs are boring.
Variational autoencoders
Latent variable models
Synthetic smart meter data
User privacy
TL;DR: Every household's PV generation is unique. Or is it? We cannot say for sure without looking at their VERY LONG measurements spanning years. But how do we handle such data? How do we find similarities between solar generation patterns full of missing values? Maybe you are not as similar to your neighbor as you think. There is one way to find out: unleash the power of entity embeddings!
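For a sense of what "entity embeddings" can mean here, a minimal sketch in PyTorch (toy data, network sizes, and the `EntityModel` name are all invented for illustration; this is not the paper's implementation): each household gets a learned embedding vector, missing measurements are simply masked out of the loss, and households can afterwards be compared via cosine similarity of their embeddings.

```python
# Minimal sketch: learn one embedding per household jointly with a
# profile-prediction task, mask missing values in the loss, then compare
# households by cosine similarity of their learned embeddings.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_households, n_days, horizon, emb_dim = 20, 60, 24, 8

# Synthetic PV profiles with missing values, for illustration only.
base = torch.rand(n_households, 1, 1)
profiles = base * torch.sin(torch.linspace(0, 3.14, horizon)).clamp(min=0)
profiles = profiles.expand(n_households, n_days, horizon).clone()
profiles += 0.05 * torch.randn_like(profiles)
mask = torch.rand_like(profiles) > 0.3            # True where a measurement exists

class EntityModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(n_households, emb_dim)  # one vector per household
        self.head = nn.Sequential(nn.Linear(emb_dim, 64), nn.ReLU(),
                                  nn.Linear(64, horizon))
    def forward(self, hh_idx):
        return self.head(self.emb(hh_idx))

model = EntityModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

hh_idx = torch.arange(n_households).repeat_interleave(n_days)
targets = profiles.reshape(-1, horizon)
valid = mask.reshape(-1, horizon).float()

for _ in range(200):
    pred = model(hh_idx)
    # Missing values simply drop out of the loss via the mask.
    loss = ((pred - targets) ** 2 * valid).sum() / valid.sum()
    opt.zero_grad(); loss.backward(); opt.step()

# Households with similar generation patterns end up with similar embeddings.
emb = nn.functional.normalize(model.emb.weight.detach(), dim=1)
similarity = emb @ emb.T                           # cosine similarity matrix
print(similarity[0, :5])
```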
TL;DR: Are you still living in 2024, and do you have to train a separate forecasting model for each entity in your dataset to make entity-specific predictions? Good news for you. Now, you can transform your favourite GUIDE-VAE model into an entity-specific forecaster that gives probabilistic forecasts at 4 (four) different levels. What a time we live in.
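To make "entity-specific probabilistic forecaster" concrete, here is a hedged sketch that is not GUIDE-VAE and does not reproduce its four forecast levels: a single model conditions on a learned entity embedding plus a past window and outputs a Gaussian predictive distribution. The `EntityForecaster` name, sizes, and dummy data are assumptions made up for this example.

```python
# Minimal sketch: one forecaster serves all entities by conditioning on a
# learned per-entity embedding; the output is a Gaussian predictive
# distribution (mean and standard deviation) over the forecast horizon.
import torch
import torch.nn as nn

n_entities, past, horizon, emb_dim = 30, 48, 24, 8

class EntityForecaster(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(n_entities, emb_dim)
        self.net = nn.Sequential(nn.Linear(past + emb_dim, 128), nn.ReLU(),
                                 nn.Linear(128, 2 * horizon))  # mean and log-std

    def forward(self, history, entity_idx):
        h = torch.cat([history, self.emb(entity_idx)], dim=-1)
        mean, log_std = self.net(h).chunk(2, dim=-1)
        return mean, log_std.exp()

model = EntityForecaster()
history = torch.rand(4, past)                 # dummy batch of past windows
entities = torch.tensor([0, 3, 3, 7])         # the same model, different entities
mean, std = model(history, entities)
# Train by maximizing the predictive log-likelihood of the observed future.
target = torch.rand(4, horizon)
loss = torch.distributions.Normal(mean, std).log_prob(target).mean().neg()
```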
TL;DR: Imagine you have a multi-user dataset and you want to train your favourite generative model on it. How are you planning to generate a datapoint for a specific user? I give you a method to condition your model on users... with some extra realism.
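A minimal sketch of the general idea of user conditioning (this is not the paper's architecture; `UserConditionedVAE`, the layer sizes, and the toy batch are invented for illustration): a conditional VAE in which each user has a learned embedding fed to both the encoder and the decoder, so generating a datapoint "for user u" just means decoding a latent sample together with u's embedding.

```python
# Minimal sketch: a user-conditioned VAE. The user embedding conditions the
# encoder and decoder, and generation for a specific user re-uses that
# embedding with fresh latent samples.
import torch
import torch.nn as nn

n_users, x_dim, z_dim, u_dim = 50, 24, 4, 8

class UserConditionedVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, u_dim)
        self.enc = nn.Sequential(nn.Linear(x_dim + u_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 2 * z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim + u_dim, 64), nn.ReLU(),
                                 nn.Linear(64, x_dim))

    def forward(self, x, user_idx):
        u = self.user_emb(user_idx)
        mu, logvar = self.enc(torch.cat([x, u], dim=-1)).chunk(2, dim=-1)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization
        x_hat = self.dec(torch.cat([z, u], dim=-1))
        # Standard ELBO: reconstruction term plus KL to the standard normal prior.
        rec = ((x_hat - x) ** 2).sum(-1).mean()
        kl = (-0.5 * (1 + logvar - mu ** 2 - logvar.exp()).sum(-1)).mean()
        return rec + kl

    @torch.no_grad()
    def generate(self, user_idx, n_samples=1):
        u = self.user_emb(user_idx).expand(n_samples, -1)
        z = torch.randn(n_samples, z_dim)
        return self.dec(torch.cat([z, u], dim=-1))

model = UserConditionedVAE()
x = torch.rand(8, x_dim)                      # dummy batch for illustration
users = torch.randint(0, n_users, (8,))
loss = model(x, users)
samples = model.generate(torch.tensor(3), n_samples=5)  # datapoints "for user 3"
```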
TL;DR: Data-copying (overfitting to a single data point) is an optimal yet degenerate solution for probabilistic modelling. For kernel density estimation-based models, this can be mitigated by excluding the effects of "self-kernels" from the maximum log-likelihood objective.
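A toy illustration of the self-kernel effect (not the paper's exact objective; the data, bandwidth grid, and the `kde_log_likelihood` helper are made up for this sketch): fitting a Gaussian KDE bandwidth by maximum likelihood on the training data itself collapses toward zero bandwidth because each point's own kernel K(x_i, x_i) blows up as the bandwidth shrinks; dropping the self-kernel turns the objective into a leave-one-out likelihood with a sensible optimum.

```python
# Minimal sketch: KDE bandwidth selection by maximum likelihood on the
# training set, with and without the self-kernel terms.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)                      # 1-D toy training set

def kde_log_likelihood(x, h, exclude_self=False):
    n = len(x)
    d2 = (x[:, None] - x[None, :]) ** 2
    k = np.exp(-d2 / (2 * h ** 2)) / (np.sqrt(2 * np.pi) * h)  # Gaussian kernels
    if exclude_self:
        np.fill_diagonal(k, 0.0)              # remove K(x_i, x_i) from each density
        dens = k.sum(axis=1) / (n - 1)
    else:
        dens = k.sum(axis=1) / n
    return np.log(dens + 1e-300).sum()

bandwidths = np.logspace(-3, 0, 50)
ll_with_self = [kde_log_likelihood(x, h) for h in bandwidths]
ll_loo = [kde_log_likelihood(x, h, exclude_self=True) for h in bandwidths]

print("best h, self-kernels kept:    ", bandwidths[int(np.argmax(ll_with_self))])
print("best h, self-kernels excluded:", bandwidths[int(np.argmax(ll_loo))])
# The first tends to the smallest bandwidth tried (data-copying);
# the second stays at a finite, reasonable bandwidth.
```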
TL;DR: High-dimensional data can be clustered more efficiently in the embedding space of an autoencoder. Interval Type-2 Fuzzy Sets can be used to model the uncertainty within the clusters and improve clustering performance.
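A rough sketch of such a pipeline (not the paper's method; the autoencoder sizes, the fuzzy c-means loop, and the two-fuzzifier interval trick are all assumptions for illustration): compress the data with a small autoencoder, cluster in the latent space with fuzzy c-means, and use two fuzzifier values to get interval-valued memberships as a crude stand-in for an interval type-2 footprint of uncertainty.

```python
# Minimal sketch: autoencoder embedding + fuzzy c-means in the latent space,
# with interval memberships obtained from two fuzzifier values.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(500, 50)                      # toy high-dimensional data

# Tiny autoencoder: cluster in its 2-D latent space instead of 50 dimensions.
enc = nn.Sequential(nn.Linear(50, 16), nn.ReLU(), nn.Linear(16, 2))
dec = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 50))
opt = torch.optim.Adam([*enc.parameters(), *dec.parameters()], lr=1e-3)
for _ in range(300):
    loss = ((dec(enc(X)) - X) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
Z = enc(X).detach().numpy()

def memberships(Z, centers, m):
    # Standard fuzzy c-means membership update with fuzzifier m.
    d = np.linalg.norm(Z[:, None, :] - centers[None, :, :], axis=-1) + 1e-9
    u = d ** (-2.0 / (m - 1.0))
    return u / u.sum(axis=1, keepdims=True)

c = 3
centers = Z[np.random.default_rng(0).choice(len(Z), c, replace=False)]
for _ in range(30):                           # plain fuzzy c-means updates (m = 2)
    u = memberships(Z, centers, m=2.0)
    centers = (u.T ** 2 @ Z) / (u.T ** 2).sum(axis=1, keepdims=True)

# Interval memberships from two fuzzifiers: wide intervals mark uncertain points.
u_a, u_b = memberships(Z, centers, m=1.5), memberships(Z, centers, m=3.0)
lower, upper = np.minimum(u_a, u_b), np.maximum(u_a, u_b)
print("mean membership interval width:", (upper - lower).mean())
```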
TL;DR: There is this thing called "disentanglement" in variational autoencoders that uncouples the generative factors of the data. Sometimes these generative factors are very interpretable and can be modelled with linguistic variables. Fuzzy sets are here to help with this modelling and give qualitative insight into the interpretations.
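A tiny illustration of the linguistic-variable part (the terms, ranges, and the `triangular` helper are purely hypothetical, not taken from the paper): triangular fuzzy membership functions map a disentangled latent coordinate to degrees of "low", "medium", and "high".

```python
# Minimal sketch: read a disentangled latent value through fuzzy linguistic terms.
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical linguistic terms for a latent dimension living roughly in [-3, 3].
terms = {
    "low":    lambda z: triangular(z, -3.0, -2.0, 0.0),
    "medium": lambda z: triangular(z, -2.0,  0.0, 2.0),
    "high":   lambda z: triangular(z,  0.0,  2.0, 3.0),
}

z_value = 1.3                                  # a sampled latent coordinate
degrees = {name: float(mf(z_value)) for name, mf in terms.items()}
print(degrees)   # {'low': 0.0, 'medium': 0.35, 'high': 0.65}
```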