Academic article
Authors: Charlotte Merzbacher & Liam Carpenter-Urquhart
The futures cone is a heuristic framework describing which future possibilities people judge to be probable, plausible, and desirable from the perspective of the present moment. When used to guide collective storytelling about the future (e.g. in a futures workshop), the model helps participants articulate their assumptions about what is most (un)likely. However, it has been criticized for privileging dominant assumptions about what counts as "probable."
This paper extends the futures cone into high-dimensional spaces: by framing futures as points in a high-dimensional possibility space, it highlights how different worldviews generate overlapping yet distinct projections of potential outcomes. Concepts such as the "curse of dimensionality" are reinterpreted to explain the tendency of foresight and futures storytelling to privilege certain "desirable" narratives while neglecting the full breadth of possibility. The paper proposes that adopting a multidimensional lens can improve reflexivity in futures thinking and open new pathways for inclusive scenario construction.
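The "curse of dimensionality" invoked above can be made concrete with a short sketch: as dimensionality grows, the pairwise distances between randomly placed points concentrate around a common value, so no single future-point is meaningfully "nearer" to the present than another. The function name and parameters below are illustrative, not taken from the paper.

```python
import math
import random

def pairwise_distance_spread(dim, n_points=100, seed=0):
    """Sample random points in the unit cube [0,1]^dim and return the
    relative spread (std / mean) of their pairwise Euclidean distances."""
    rng = random.Random(seed)
    pts = [[rng.random() for _ in range(dim)] for _ in range(n_points)]
    dists = []
    for i in range(n_points):
        for j in range(i + 1, n_points):
            dists.append(math.dist(pts[i], pts[j]))
    mean = sum(dists) / len(dists)
    var = sum((d - mean) ** 2 for d in dists) / len(dists)
    return math.sqrt(var) / mean

# In 2 dimensions distances vary widely; in 1000 dimensions they
# concentrate tightly around a common value ("distance concentration").
low = pairwise_distance_spread(2)
high = pairwise_distance_spread(1000)
```

The relative spread shrinks roughly like 1/√dim, which is one way to read the paper's claim that high-dimensional possibility spaces resist simple "near vs. far" orderings of futures.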
Essay
Joseph Voros's canonical account of the futures cone framework — its origins in foresight practice and how it maps possibility space from the present moment outward. The starting point for this paper's argument.
Academic Paper
Terry et al. (2024) introduce the Entangled Time Tree as a decolonial alternative to the futures cone — centering non-linear, relational, and Indigenous conceptions of time and future imaginaries of nature. Environmental Science & Policy.
Lecture Notes
Jonathan Shewchuk's notes on reasoning about high-dimensional geometry — covering counterintuitive properties of high-dimensional spaces including the curse of dimensionality, concentration of measure, and the geometry of hyperspheres.
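One counterintuitive property covered in notes like these is easy to verify directly: in high dimensions, almost all of a ball's volume lies in a thin shell near its surface, because volume scales as radius raised to the dimension. A minimal sketch (the function name is ours, not Shewchuk's):

```python
def shell_fraction(dim, eps):
    """Fraction of a unit ball's volume lying within distance eps of its
    surface. Since volume scales as radius**dim, the inner ball of radius
    (1 - eps) holds (1 - eps)**dim of the total."""
    return 1.0 - (1.0 - eps) ** dim

# A 5% shell holds ~14% of a 3-ball's volume, but essentially
# all of a 500-dimensional ball's volume.
near_surface_3d = shell_fraction(3, 0.05)
near_surface_500d = shell_fraction(500, 0.05)
```

This is the same concentration-of-measure behaviour that makes "typical" points in a high-dimensional possibility space sit far from its centre.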
Academic Paper
A tutorial paper on how algorithms like t-SNE and Least-Square Projection map high-dimensional data into visual 2D/3D spaces — covering the mathematics of Laplacian matrices, Kullback-Leibler divergence, and PCA that underlie these methods.
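As a rough illustration of the linear end of that toolbox, the leading principal component of a point cloud can be estimated by power iteration on the covariance matrix, then used to project high-dimensional points onto a line. This is a simplified stand-in for the PCA machinery the tutorial covers, not its method; all names and parameters here are our own.

```python
import random

def top_principal_component(data, iters=200, seed=0):
    """Estimate the leading principal component of `data` (a list of
    equal-length rows) via power iteration on the covariance matrix,
    applied implicitly as X^T (X v) on mean-centered rows."""
    n, d = len(data), len(data[0])
    means = [sum(row[k] for row in data) / n for k in range(d)]
    centered = [[row[k] - means[k] for k in range(d)] for row in data]
    rng = random.Random(seed)
    v = [rng.random() for _ in range(d)]
    for _ in range(iters):
        xv = [sum(row[k] * v[k] for k in range(d)) for row in centered]
        w = [sum(centered[i][k] * xv[i] for i in range(n)) / n
             for k in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]  # renormalize each iteration
    return v

def project_1d(data, v):
    """Project each row of `data` onto the unit direction `v`."""
    d = len(v)
    return [sum(row[k] * v[k] for k in range(d)) for row in data]
```

Nonlinear methods like t-SNE replace this single linear direction with an embedding that preserves local neighbourhoods, at the cost of the global geometry a linear projection keeps.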