Human-Computer Interaction, Information Theory & Learning
Understanding and designing better ways for humans to think, learn, and create with computational tools. As AI capabilities grow, the primary bottleneck for effective systems is shifting from model performance to the interface between humans and computers. This research area tackles that bottleneck, emphasizing embodied cognition, community agency, and hyperlocal knowledge systems. Drawing on Geoffrey Litt's framework, we are particularly interested in systems that preserve "good friction" and serendipity in workflows rather than over-optimizing for efficiency, and we explore malleable interfaces that communities can adapt rather than rigid systems.
Core Research Areas
Embodied Cognition & Tangible Interfaces
Understanding through physical manipulation: moving interaction off screens and into the material world.
Research Focus: How physical manipulation and embodied interaction enhance learning, traditional knowledge transfer, and community decision-making. We investigate tangible interfaces that leverage the link between bodily action and spatial cognition to support skill development and ecological understanding.
See touching computers and Spencer's story
Notation Systems & Visual Languages
How symbolic systems shape thought, enable discovery, and make abstract structures manipulable
Research Focus: Understanding how notation design influences cognition and creativity. Key examples demonstrate this power: juggling notation (siteswap) enabled discovery of entirely new tricks, Conway's knot notation reduced enumeration work from years to hours, and the shift from Roman to Arabic numerals transformed mathematical reasoning. We study notations across domains—mathematics, music, dance, logic, programming—to extract design principles. This work is guided by the Congruence Principle (a representation should be congruent with its underlying data) and the concept of "free rides" (new information generated through notational rules). Following Andy Matuschak's research on tools that enable "previously unthinkable thoughts."
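The siteswap example above illustrates a "free ride" concretely: the notation's rules let you check whether a throw sequence is jugglable without ever simulating the pattern. A minimal sketch of the standard siteswap validity test (a sequence of throw heights is valid iff all landing beats, taken modulo the pattern length, are distinct):

```python
def is_valid_siteswap(pattern):
    """Check siteswap validity: throw i of height t lands on beat
    (i + t) mod n, and no two throws may land on the same beat."""
    n = len(pattern)
    landings = {(i + t) % n for i, t in enumerate(pattern)}
    return len(landings) == n

def ball_count(pattern):
    """Another free ride: the number of balls is simply the average
    throw height of a valid pattern."""
    return sum(pattern) // len(pattern)

# The classic 3-ball cascade and the "441" trick are valid; "543" is not.
print(is_valid_siteswap([3]))        # True
print(is_valid_siteswap([4, 4, 1]))  # True
print(is_valid_siteswap([5, 4, 3]))  # False (two throws collide)
print(ball_count([4, 4, 1]))         # 3
```

New tricks like 441 were in fact discovered by enumerating number sequences that pass this check, rather than by experimenting with props — the notation generates candidate tricks for free.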
We are also documenting and studying notation systems from Northeast India—textile and weaving pattern languages, indigenous scripts (Meitei Mayek, Ahom, etc.), and traditional music and dance notation—to understand how these regional systems encode knowledge and enable transmission across generations.
See notation history and notation & cognition
Transformative Tools for Thought & Augmented Cognition
Developing systems that don't just manage information, but actively augment and shape the process of thinking itself, drawing from cognitive science and AI.
Research Focus: This area moves beyond simple information management to investigate "transformative tools for thought," systems designed to help people have new, more powerful kinds of thoughts. Inspired by Matuschak and Nielsen's critique of passive media, we explore how to create dynamic environments for thinking that demand active engagement. We also draw on Barbara Tversky's work on the "cognitive design of tools," studying how external representations (diagrams, sketches, gestures) can be augmented with AI. The goal is to create systems that act as true cognitive partners, enhancing memory, decision-making, and creativity.
Research Questions:
How can we design "executable books" or learning environments that, like "Quantum Country," integrate active recall and spaced repetition to ensure deep understanding of complex topics?
What are the "graphic semantics" (per Tversky) of AI-native interfaces? How can we use embedding-based graphs, generative diagrams, and other AI-generated representations to create novel visual languages for thought?
Can AI be used to facilitate "constructive perception" in messy sketches and diagrams? For instance, could a system suggest novel interpretations or highlight emerging patterns in a designer's rough sketch to spur creativity?
How can we apply the concept of "spraction" (action in space to create abstractions) to digital environments? What would a tangible or gestural interface for generative AI look like, allowing users to physically "sculpt" ideas?
What does an AI-augmented Zettelkasten or personal wiki look like? Can generative models automate the linking of ideas, suggest new connections, and synthesize information from a user's entire digital footprint (notes, browser history, etc.)?
What new forms of decision-making and creative ideation become possible with generative interfaces that can instantly produce customized visualizations, summaries, or counter-arguments?
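One of the questions above — automating link suggestion in a Zettelkasten — can be sketched as nearest-neighbor search over note embeddings. The sketch below uses a toy bag-of-words vector as a stand-in for a real embedding model; the function names (`embed`, `suggest_links`) are illustrative, not an existing API:

```python
from collections import Counter
import math

def embed(text):
    # Toy stand-in for a neural embedding: a sparse term-count vector.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse vectors.
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def suggest_links(notes, source_id, k=3):
    """Rank the other notes by similarity to the source note and
    return the ids of the top-k candidates for linking."""
    src = embed(notes[source_id])
    scored = [(cosine(src, embed(text)), note_id)
              for note_id, text in notes.items() if note_id != source_id]
    return [note_id for _, note_id in sorted(scored, reverse=True)[:k]]

notes = {
    "a": "spaced repetition and memory",
    "b": "scheduling spaced repetition for long-term memory",
    "c": "tangible interfaces built from clay",
}
print(suggest_links(notes, "a", k=1))  # ['b']
```

Swapping the toy `embed` for a real sentence-embedding model turns this into the "embedding-based graphs" idea: the same ranking loop, applied across a whole note collection, yields candidate edges that the user accepts or rejects — keeping the human in the loop rather than auto-linking everything.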
Key Research Connections & References
Leading Research Labs & Institutions
- MIT Media Lab - Tangible Media Group: Hiroshi Ishii's research on Tangible Bits, Radical Atoms, and shape-changing interfaces
- Stanford HCI Group: Sean Follmer's work on physical telepresence and embodied interaction
- Ink & Switch: Independent research lab pioneering local-first software and collaborative tools that preserve data ownership
- Dynamicland: Bret Victor's community research space exploring physical computing environments
- Morphing Matter Lab (CMU, now UC Berkeley): Lining Yao's work on environmental sensing and adaptive interfaces
- University of Delaware Sensify Lab: Environmental sustainability and human-building interaction
- University of Tokyo: Hiroki Kobayashi's HCBI research extending HCI to biosphere interactions
- Lossfunk: Independent minds exploring foundational questions
Key Researchers
- Bret Victor: Dynamicland founder, "Inventing on Principle" vision for humane computing
- Andy Matuschak: Independent researcher on tools for thought, transformative notation systems
- Geoffrey Litt: Research on malleable interfaces, AI as augmentation vs replacement
- Hiroshi Ishii: Pioneer of tangible user interfaces, MIT Tangible Media Group
- Martin Kleppmann: Co-author of local-first software manifesto, distributed systems researcher
- Hiroki Kobayashi: University of Tokyo, leading HCBI research on non-human-centric interaction and environmental sustainability
Relevant Conferences & Venues
- UIST (User Interface Software and Technology) - Innovative interface research
- DIS (Designing Interactive Systems) - Participatory and community-centered design
- CSCW (Computer-Supported Cooperative Work) - Collaborative systems and communities
- TEI (Tangible, Embedded & Embodied Interaction) - Physical computing interfaces
- VizChitra - Data visualization community and conference (India)