

Date Friday, 7 May 2021
Location The workshop will be held virtually.
Submission link OpenReview
Submission deadline Sunday, 7 March 2021 (23:59 AoE), extended from Friday, 26 February 2021

This is the homepage for our ICLR 2021 workshop on ‘Geometric and Topological Representation Learning’.

You can find our Gather.Town instance here. Please note that you need to be logged in with your ICLR account to access the site.

The workshop has concluded, but please consider joining our Slack community, Geometry and Topology in ML, to stay up to date with upcoming events, ask questions, and be part of a vibrant, diverse scientific community.


Over the past two decades, high-throughput data collection technologies have become commonplace in most fields of science and technology, and with them an ever-increasing amount of high-dimensional data is being generated by virtually every real-world system. While such systems are highly diverse in nature, the underlying data analysis and exploration tasks give rise to common challenges at the core of modern representation learning. For example, even though modern real-world data typically live in high-dimensional ambient measurement spaces, they often exhibit low-dimensional intrinsic structures that can be uncovered by geometry-oriented methods, such as those encountered in manifold learning, graph signal processing, geometric deep learning, and topological data analysis. As a result, recent years have seen significant interest and progress in geometric and topological approaches to representation learning, which enable tractable exploratory analysis by domain experts who are often not computation-oriented.

Our overarching goal in this workshop is to deepen our understanding of the challenges and opportunities in this field, while breaking down the barriers between the typically disjoint computational approaches (or communities) working in it, with emphasis on topological data analysis, graph representation learning, and manifold learning, on which we comment briefly below.