About me
This is a page not in the main menu.
Published:
This post will show up by default. To disable scheduling of future posts, edit `_config.yml` and set `future: false`.
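In a stock Jekyll setup (assuming the theme's default `_config.yml` at the repository root), this is a single top-level key:

```yaml
# _config.yml (repository root, standard Jekyll layout)
future: false   # do not build or list posts dated in the future
```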
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Short description of portfolio item number 1
Short description of portfolio item number 2 
Published in 31st International Conference on Artificial Neural Networks (ICANN), 2022
Recently, neural network architectures have been widely applied to the problem of time series forecasting. Most of these models are trained by minimizing a loss function that measures predictions’ deviation from the real values.
Recommended citation: Kosma, C., Nikolentzos, G., Xu, N. and Vazirgiannis, M. (2022). "Time Series Forecasting Models Copy the Past: How to Mitigate." In Proceedings of the 31st International Conference on Artificial Neural Networks (pp. 366-378). Cham: Springer International Publishing. https://link.springer.com/chapter/10.1007/978-3-031-15919-0_31
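As a generic illustration (not this paper's method), the deviation-based objective is typically mean squared error, and a naive forecaster that simply copies the past can still score deceptively well on it:

```python
import numpy as np

def mse(y_pred, y_true):
    """Mean squared error: the canonical deviation-based forecasting loss."""
    return float(np.mean((np.asarray(y_pred) - np.asarray(y_true)) ** 2))

# A naive "copy the past" forecaster repeats the last observed value.
series = np.array([1.0, 2.0, 3.0, 4.0])
naive_forecast = series[:-1]          # predict y_t = y_{t-1}
print(mse(naive_forecast, series[1:]))  # 1.0
```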
Published in arXiv (Preprint), 2023
In this paper, we parameterize convolutional layers by employing time-explicitly initialized kernels. Such general functions of time enhance the learning process of continuous-time hidden dynamics and can be efficiently incorporated into convolutional kernel weights. We thus propose the time-parameterized convolutional neural network (TPCNN), which shares similar properties with vanilla convolutions but is carefully designed for irregularly sampled time series. We evaluate TPCNN on both interpolation and classification tasks involving real-world irregularly sampled multivariate time series datasets.
Recommended citation: Kosma, C., Nikolentzos, G., & Vazirgiannis, M. (2023). Time-parameterized convolutional neural networks for irregularly sampled time series. arXiv preprint arXiv:2308.03210.
Published in Transactions on Machine Learning Research (TMLR), 2023
We focus on the susceptible-infectious-recovered (SIR) epidemic model on networks. Given that this model can be expressed by an intractable system of ordinary differential equations, we devise a simpler system that approximates the output of the model. Then, we capitalize on recent advances in neural ordinary differential equations and propose a neural architecture that can effectively predict the course of an epidemic on the network.
Recommended citation: Kosma, C., Nikolentzos, G., Panagopoulos, G., Steyaert, J.M. and Vazirgiannis, M. (2023). "Neural Ordinary Differential Equations for Modeling Epidemic Spreading." Transactions on Machine Learning Research. https://openreview.net/pdf?id=yrkJGne0vN
Published in International Conference on Complex Networks and Their Applications, 2023
Graph neural network approaches, which jointly learn a graph structure based on the correlation of raw values of multivariate time series while forecasting, have recently seen great success. However, such solutions are often costly to train and difficult to scale. In this paper, we propose TimeGNN, a method that learns dynamic temporal graph representations that can capture the evolution of inter-series patterns along with the correlations of multiple series.
Recommended citation: Xu, N., Kosma, C. and Vazirgiannis, M. (2023). "TimeGNN: Temporal Dynamic Graph Learning for Time Series Forecasting." In Proceedings of the 12th International Conference on Complex Networks and Their Applications.
Published in HAL Online Archive, 2023
Doctoral dissertation in Machine Learning (Mathematics and Computer Science), completed at LIX, École Polytechnique. A comprehensive analysis of modern deep learning approaches for time series modeling and their limitations with respect to robustness, along with novel architectural contributions.
Recommended citation: Kosma, C. (2023). Towards Robust Deep Learning Methods for Time Series Data and their Applications (Doctoral dissertation, Institut Polytechnique de Paris). https://theses.hal.science/tel-04606898/
Published in International Conference on Artificial Intelligence and Statistics (AISTATS), 2025
The Signed Graph Archetypal Autoencoder (SGAAE) framework extracts node-level representations that express node memberships over distinct extreme profiles, referred to as archetypes, within the network. This is achieved by projecting the graph onto a learned polytope, which governs its polarization. The framework employs a recently proposed likelihood for analyzing signed networks based on the Skellam distribution, combined with relational archetypal analysis and GNNs. We introduce the 2-level network polarization problem and show how SGAAE is able to characterize such a setting. The proposed model achieves high performance in different tasks of signed link prediction across four real-world social network datasets, outperforming several baseline models.
Recommended citation: Nakis, N., Kosma, C., Nikolentzos, G., Chatzianastasis, M., Evdaimon, I., & Vazirgiannis, M. (2025). "Signed Graph Autoencoder for Explainable and Polarization-aware Network Embeddings." In Proceedings of The 28th International Conference on Artificial Intelligence and Statistics. 258:496-504. https://proceedings.mlr.press/v258/nakis25a.html
Published in OUP Bioinformatics, 2025
To accurately represent biological relationships, we present the Signed Two-Space Proximity Model (S2-SPM) for signed PPI networks, which explicitly incorporates both types of interactions, reflecting the complex regulatory mechanisms within biological systems. This is achieved by leveraging two independent latent spaces to differentiate between positive and negative interactions while representing protein similarity through proximity in these spaces. Our approach also enables the identification of archetypes representing extreme protein profiles.
Recommended citation: Nakis, N., Kosma, C., Brativnyk, A., Chatzianastasis, M., Evdaimon, I., & Vazirgiannis, M. (2025). The signed two-space proximity model for learning representations in protein–protein interaction networks. Bioinformatics, 41(6), btaf204.
Published in International Conference on Machine Learning (ICML), 2025
We mathematically formulate and technically design efficient and hard-coded invariant convolutions for specific group actions applicable to the case of time series. We construct these convolutions by considering specific sets of deformations commonly observed in time series, including scaling, offset shift, and trend. We further combine the proposed invariant convolutions with standard convolutions in single embedding layers, and we showcase the layer capacity to capture complex invariant time series properties in several scenarios.
Recommended citation: Germain, T., Kosma, C. & Oudre, L. (2025). "Time Series Representations with Hard-Coded Invariances." In Proceedings of the 42nd International Conference on Machine Learning, Proceedings of Machine Learning Research 267:19172-19195. https://proceedings.mlr.press/v267/germain25a.html
Published:
This is a description of your talk, which is a markdown file that can be markdown-ified like any other post. Yay markdown!
Published:
This is a description of your conference proceedings talk; note the different `type` field. You can put anything in this field.
Executive Education, École Polytechnique, IPP, 2020
Teaching coordinator, Lab assistant (55 hours, 2020 - now). Machine and Deep Learning for Time Series, X-EXED program.
Computer Science M1 programme, École Polytechnique, IPP, 2021
Lab assistant (14 hours - autumn 2021).
Computer Science M1 programme, École Polytechnique, IPP, 2022
Lab assistant (36 hours - autumn 2022).
Master in Ergonomics, Université Paris Cité (UPC), 2025
Lab assistant (6 hours, autumn 2025). Introduction to signal processing and e-health.