NO.265 Extreme Analytics: Analytics at Bad Times!
December 6 - 9, 2027 (Check-in: December 5, 2027)
Organizers
- Maxime Cordeil, The University of Queensland, Australia
- Christophe Hurter, French Civil Aviation University, France
- Kiyoshi Kiyokawa, Nara Institute of Science and Technology, Japan
Overview
Introduction
Data visualization is often summarized by the adage "A picture is worth a thousand words," yet its power extends far beyond illustration: well-crafted visualizations reveal patterns, anomalies, and insights that tabular displays simply cannot (Card, Mackinlay, & Shneiderman, 1999; Tufte, 2001). Whereas data collection once represented the principal challenge, today the bottleneck lies in processing and interpreting massive, heterogeneous streams of information (Witten, Frank, Hall, Pal, & Foulds, 2025). Over the past two decades, visual analytics research has largely focused on developing sophisticated interfaces for desktop and laptop systems (Shneiderman & Plaisant, 2010), but the rise of wearable technologies and concurrent advances in artificial intelligence now create opportunities to extend these insights into non-conventional, on-the-move contexts. AI promises to render every bit of data, no matter how noisy or unstructured, automatically analysable, classifiable, and actionable (LeCun, Bengio, & Hinton, 2015), while wearables enable continuous streams of physiological and environmental metrics that traditional dashboards cannot accommodate.
We contend that this burgeoning research direction holds promise across a spectrum of physical activities, beyond elite sport, where users must engage with their data under movement and environmental stress. The recent "Visualization in Motion" agenda pushes this boundary further, exploring how users can access, visualize, analyse, and understand their data while in motion (Yao, Bezerianos, Vuillemot, & Isenberg, 2022). For example, Li et al. (2025) demonstrated that mixed reality overlays can help runners maintain consistent pacing by embedding smartwatch metrics directly into their surrounding environment. Yet despite this progress, most recreational athletes (joggers, cyclists, hikers) remain constrained by small, wrist-mounted displays, limiting both visualization and interaction.
Physical activity spans a spectrum from everyday movement, like walking or casual cycling, to more demanding pursuits such as trail running or mountain biking, and even to truly extreme endeavours like high-altitude mountaineering, deep-water diving, or desert ultrarunning. Each point along this gradient introduces its own environmental challenges (motion-induced blur, limited display real estate, rapidly changing lighting, temperature extremes, or connectivity blackouts) that strain both data capture and visualization. "Extreme Analytics: Analytics at Bad Times!" is therefore not limited to traditional "extreme sports," but embraces any context where external conditions make it difficult to sense, process, and present information in real time. By examining use cases from simple urban mobility to rescue operations in hostile environments, we will identify common pain points and share solutions (combining robust wearable sensing, AI-driven filtering, and context-aware visualization) to support better decisions whenever and wherever the data must flow.
In our seminar "Extreme Analytics: Analytics at Bad Times!", "bad times" denotes any situation in which harsh environmental conditions, rapid motion, constrained interfaces, or volatile data streams undermine traditional analysis. We will explore the key challenges, investigate cutting-edge methods, and develop data-processing and visualization techniques to make analytics truly ubiquitous.
Seminar Objectives
The seminar is built around four core aims that together will advance the field of extreme sport analytics and lay the groundwork for future collaboration and discovery. It will bring into sharp focus the latest breakthroughs in how we collect, process, visualize, and apply AI to data gathered under the most challenging conditions, forging a unified view of state-of-the-art methods. By surveying these technologies side by side, we can pinpoint the gaps and bottlenecks that still hinder end-to-end analysis when every second or every meter counts. At the same time, we will chart the open research questions that emerge where ubiquitous sensing meets immersive analytics and remote teamwork. From ensuring reliable data capture in hostile environments to designing interfaces that let distributed teams explore results together, these challenges demand fresh ideas and shared expertise. To tackle them, the seminar will deliberately mix disciplines, pairing computer scientists with statisticians, engineers with sports scientists, and even philosophers of data with application-domain specialists. This cross-pollination will spark new approaches that neither group could have developed in isolation. Finally, all discussions will funnel into a concrete research roadmap, complete with targeted publication strategies, whether through comprehensive survey articles, challenge papers that define benchmarks, special-issue proposals, or individual studies. In doing so, we will ensure that the insights and partnerships born here translate into lasting scholarly and practical impact.
Structure and Format
The seminar unfolds over four days, carefully balancing individual presentations, collaborative problem-solving, and roadmap development to ensure both depth and momentum.
On the first morning, each participant delivers a rapid "lightning" introduction, five to seven minutes apiece, to share their background, tools, and research interests. This whirlwind round sets the stage for mutual understanding and highlights complementary expertise. After lunch, we transition into a facilitated brainstorming session where the group surfaces the most pressing challenges in extreme sport analytics. Participants then vote to prioritize and select the top topics that will shape our subsequent work.
Days two and three are structured around short, focused lightning talks (10-15 minutes) that seed ideas across the seminar's themes. These presentations are immediately followed by breakout group sessions, where teams drill down on problem statements, methods, datasets, and evaluation metrics. Each day concludes with a plenary in which groups distil their progress into five-minute reports, ensuring that insights are shared across the entire cohort and that discussions remain connected.
The final day is devoted to crafting deliverables and planning next steps. Through an open discussion, the seminar collectively refines a joint white paper or survey article, explores the feasibility of a special-issue proposal, and establishes working groups charged with pursuing follow-up studies. This structure ensures that the seminar's energy translates into concrete outcomes and sustained collaboration.
Conclusion and Expected Outcomes
The deliverables produced during the seminar will form the backbone of a broader research roadmap, prioritising the most urgent questions and outlining cross-institutional, multidisciplinary collaborations. Participants will also leave with concrete publication plans, whether survey articles, benchmark challenge papers, or special-issue proposals, and with a network of working groups ready to pursue these lines of inquiry. The surge in wearable sensing and AI has opened new frontiers in sport and physical-activity analytics, but "bad times" expose the limitations of existing tools. By uniting computer scientists, data scientists, sports engineers, and field practitioners, this seminar will reframe those limitations as opportunities, charting a path toward robust data capture, intelligent filtering, and context-aware visualization that empower users exactly when and where insights are hardest to obtain.
References
Baca, A., Kornfeind, P., & Müller, E. (2015). Wearable sensors in sports and rehabilitation. Journal of Sports Science & Medicine, 14(3), 475–479.
Card, S. K., Mackinlay, J., & Shneiderman, B. (Eds.). (1999). Readings in information visualization: Using vision to think. Morgan Kaufmann.
Cook, K. A., & Thomas, J. J. (2005). Illuminating the path: The research and development agenda for visual analytics (PNNL-SA-45230). Pacific Northwest National Laboratory.
de Coubertin, P. (1894). Olympic Charter.
Düking, P., Fuss, F. K., Holmberg, H.-C., & Sperlich, B. (2016). Recommendations for assessment of the reliability, sensitivity, and validity of data provided by wearable sensors designed for monitoring physical activity. JMIR mHealth and uHealth, 4(3), e1.
Jordan, M. I., & Mitchell, T. M. (2015). Machine learning: Trends, perspectives, and prospects. Science, 349(6245), 255–260.
LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436–444.
Li, A., Perin, C., Knibbe, J., Demartini, G., Viller, S., & Cordeil, M. (2025). Embedded and situated visualisation in mixed reality to support interval running. Computer Graphics Forum, e70133.
Marriott, K., Schreiber, F., Dwyer, T., Klein, K., Riche, N. H., Itoh, T., … & Thomas, B. H. (Eds.). (2018). Immersive analytics (Vol. 11190). Springer.
Rabelais, F. (1552). Pantagruel.
Shneiderman, B., & Plaisant, C. (2010). Designing the user interface: Strategies for effective human-computer interaction (5th ed.). Pearson.
Tufte, E. R. (2001). The visual display of quantitative information (2nd ed.). Graphics Press.
Weiser, M. (1991). The computer for the 21st century. Scientific American, 265(3), 94–104.
Witten, I. H., Frank, E., Hall, M. A., Pal, C. J., & Foulds, J. (2025). Data mining: Practical machine learning tools and techniques (4th ed.). Elsevier.
Yao, L., Bezerianos, A., Vuillemot, R., & Isenberg, P. (2022). Visualization in motion: A research agenda and two evaluations. IEEE Transactions on Visualization and Computer Graphics, 28(10), 3546–3562.