The Creative Coding Research Group is a team of interdisciplinary researchers and artists working in the Electronic Visualization Laboratory, housed within the Department of Computer Science at the University of Illinois at Chicago. We focus on applied research in interaction and visualization, as well as on explorations of experimental and creative works based on current techniques in human-computer interaction, scientific and information visualization, computer vision, immersive environments, and machine learning. A core philosophy of the Creative Coding Research Group is that by incorporating research methodologies from media arts, design, and computer science, we can develop novel solutions to interdisciplinary problems. Moreover, we believe that the creative outputs generated at the intersections of artistic and empirical research can meaningfully elucidate issues in science and technology relevant to contemporary culture.

Recent projects have been published in or presented at top-tier journals and conferences in computer science and media arts, including: IEEE Transactions on Visualization and Computer Graphics, IEEE Computer Graphics & Applications, SAGE Information Visualization, Leonardo Journal, ACM SIGGRAPH, IEEE VIS, ACM SIGSPATIAL, and ACM Multimedia, among others. Our research group is currently funded by multi-year federal grants from the Defense Advanced Research Projects Agency, the National Institute of Biomedical Imaging and Bioengineering, and the U.S. Department of Agriculture, as well as by a corporate grant from Keysight Technologies, a UIC College of Engineering award, and a National Endowment for the Humanities “Common Good” award.

 
Novel methods for visualizing and interacting with large, complex, heterogeneous datasets are needed in order to make sense of the increasing amount of information that researchers have access to. Interactive visualization tools can help users understand, trust, and validate computational models generated by sophisticated data analysis techniques. We are particularly interested in developing interactive visualization systems that enable domain experts to collaborate effectively with machine learning and data mining algorithms. Our research explores the opportunities that arise when moving beyond traditional computing environments, and we investigate visualization techniques on mobile devices as well as within immersive virtual reality environments.
 
Funded by DARPA’s “Big Mechanism” program, we are developing a variety of novel interaction and visualization methods that will allow systems biologists and cancer researchers to more easily navigate and discover patterns at different levels of the biological hierarchy of signaling pathways. We have recently developed new techniques involving interactive animation that make it easier for biologists to explore and test hypotheses about the causality inherent to these pathways. Understanding the downstream effects of specific biochemical reactions could make it easier to design drugs that target cancer and other diseases that involve issues with intracellular communication. Additionally, we have developed new network visualization techniques that make it possible to interactively investigate the provenance of biological data culled from different experiments. Our research has led to the development of a series of open source visualization projects, including ReactionFlow, PathwayMatrix, BranchingSets, and ProvenanceMatrix, each of which is available online at our code repository. Articles have recently appeared in BMC Proceedings and been presented at VOILA’15 and BioVis’15, and we currently have articles in submission to EuroVis’16 and PacificVis’16.
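As a rough illustration of the downstream-effect reasoning described above, the sketch below models a signaling pathway as a directed graph and computes the set of entities reachable from a perturbed node. The entity names and edges are hypothetical and the traversal is far simpler than the interactive animation techniques in our tools.

```python
# Illustrative sketch only: a toy pathway graph and a downstream-effect query.
# Entity names and edges are hypothetical, not taken from ReactionFlow or
# PathwayMatrix data.
from collections import deque

# Directed edges: each key is a biochemical entity, each value lists the
# entities it directly activates or regulates (one simple encoding of causality).
pathway = {
    "EGFR": ["RAS"],
    "RAS": ["RAF"],
    "RAF": ["MEK"],
    "MEK": ["ERK"],
    "ERK": ["MYC", "FOS"],
    "MYC": [],
    "FOS": [],
}

def downstream(graph, source):
    """Return every entity reachable from `source`, i.e., the candidate
    downstream effects of perturbing that entity."""
    seen, queue = set(), deque([source])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(downstream(pathway, "RAS")))  # ['ERK', 'FOS', 'MEK', 'MYC', 'RAF']
```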
Magnetic resonance imaging techniques enable neuroimagers to collect and derive data about how different brain regions connect, both structurally and functionally. Analogous to the genome for genetic data, a brain connectome is a comprehensive whole-brain map of neural connections. Because these connections exhibit complex patterns of function and structure, the field of brain connectomics has emerged to make sense of these large imaging datasets. The brain connectome is typically represented mathematically using connectivity matrices that describe the interactions among different brain regions. Recent work, developed in collaboration with Drs. Alex Leow and Olusola Ajilore of the Collaborative Neuroimaging Environment for Connectomics group at UIC, introduces an innovative visualization technology for reconstructing and analyzing the intrinsic geometry of brain data, that is, the topological space where brain connectivity natively resides, independent of anatomy. Understanding this intrinsic geometry could sharpen the distinctions between clinical cohorts and help track longitudinal changes in individual brains in order to better deliver precision medicine. Additional projects, developed in collaboration with Drs. Tanya Berger-Wolf, Robert Kenyon, and Daniel Llano, introduce new visualization techniques for exploring the community dynamics of brain regions in response to stimuli. Our work on interactive visual analytics tools for connectome analysis has been presented at BIH’15 and EuroVis’15, and has recently been published in Brain Informatics and the Journal of Imaging Sciences & Technology.
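To make the notion of a connectivity matrix and its anatomy-independent geometry concrete, the following sketch applies classical multidimensional scaling to a small, hypothetical matrix; it is a generic stand-in for illustration only and not the embedding method developed in the collaboration described above.

```python
# Illustrative sketch only: recovering a low-dimensional layout from a
# connectivity matrix via classical multidimensional scaling (MDS).
import numpy as np

# Hypothetical symmetric connectivity matrix for four brain regions (values in [0, 1]).
C = np.array([
    [1.0, 0.8, 0.2, 0.1],
    [0.8, 1.0, 0.3, 0.2],
    [0.2, 0.3, 1.0, 0.7],
    [0.1, 0.2, 0.7, 1.0],
])

D = 1.0 - C                              # treat weak connectivity as large distance
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
B = -0.5 * J @ (D ** 2) @ J              # double-centered squared distances
eigvals, eigvecs = np.linalg.eigh(B)
top = np.argsort(eigvals)[::-1][:2]      # indices of the two largest eigenvalues
coords = eigvecs[:, top] * np.sqrt(np.maximum(eigvals[top], 0))

# 2-D coordinates that depend only on connectivity, not on anatomical location.
print(coords)
```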
The use of electronic health records (EHRs) in clinical environments provides new opportunities for clinicians to integrate data analyses into their practice. While having access to these records has many benefits, the act of recording, retrieving, and analyzing this data can nonetheless introduce communication issues: navigating and interpreting large amounts of heterogeneous data can be difficult, and conclusions can be hard to validate. In collaboration with Jane Carrington and Mihai Surdeanu from the University of Arizona, our lab has created a series of integrated visual interfaces to help nurses document and reason about patient data and about clinicians’ understanding of patient data. The interfaces present the output of a predictive algorithm that makes use of historical EHR data, patient vital signs, and nurse handoff reports in order to classify a patient in terms of their likelihood of experiencing clinical events. The interfaces enable the nurses to quickly explore the original data and to examine other nurses’ interpretations of patient activity during previous shifts. This work, funded by the National Institutes of Health as part of the NSF/NIH Smart and Connected Health Program, has been presented at TextVis’13 and VAHC’15, and is currently being evaluated in the context of real-world healthcare situations.
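As a toy stand-in for the predictive component mentioned above, the sketch below trains a simple classifier on hypothetical vital-sign features; the features, labels, and model choice are illustrative assumptions and do not reflect the deployed system.

```python
# Illustrative sketch only: a toy risk classifier over patient vital signs.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per patient: [heart rate, respiratory rate, temperature (C)].
X = np.array([
    [72, 16, 36.8],
    [88, 18, 37.1],
    [110, 24, 38.5],
    [120, 28, 39.2],
])
y = np.array([0, 0, 1, 1])  # 1 = a clinical event was recorded for that patient

model = LogisticRegression().fit(X, y)

# Probability of a clinical event for a new patient; an interface could surface
# this score alongside the underlying vitals and handoff notes.
new_patient = np.array([[105, 22, 38.2]])
print(model.predict_proba(new_patient)[0, 1])
```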
While contemporary artists often look toward recent developments in science and engineering to inspire new artworks, novel techniques introduced in interactive arts installations may also prove beneficial to scientific applications. Although it may not be initially clear during the creation of these artistic techniques how they could lead to “useful” functionalities, rigorously pursuing creative interactions as a means to explore new ideas invariably produces new insights and new perspectives. Artistic research and scientific research can be complementary to each other, and the Creative Coding Research Group offers an interdisciplinary space that welcomes empirical and creative thinkers. We have presented investigations of these hybrid “art-science” approaches at SIGGRAPH’15 and ISEA’15, and in the Leonardo Journal of Arts, Sciences and Technology. Through collaborations with a range of artists, including George Legrady, Andres Burbano, Javier Villegas, Brett Balogh, and Christopher Jette, among others, we have created a series of large-scale public interactive artworks that have been shown at galleries and festivals throughout the world. In order to promote dialogue about the relation of aesthetics and design to visualization research, Angus Forbes co-chaired the IEEE VIS Arts Program from 2013 through 2016. Angus also chaired the Expressive’15 arts exhibition and is the papers chair for the Computational Aesthetics track for Expressive’16.
Reading and Assembling Contextual and Holistic Big Mechanisms, 2014-2018. DARPA Big Mechanism, Grant #BAA-14-14. PI: M. Surdeanu; Co-PIs: K. Barnard, C. Morrison, A. G. Forbes, R. Gutenkunst, G. Yao.

Visualizing the Structure and Function of Biological Pathways to Accelerate Discovery in Cancer Research, 2015-2016. UIC College of Engineering Seed Funding Award. PI: A. G. Forbes.

iAnimal: Cyberinfrastructure Enabling Animal Breeding, Genetics, And Genomics, 2013-2016. USDA Agriculture and Food Research Initiative (AFRI), Grant #2013-67015-21231. PI: E. Lyons; Co-PIs: F. McCarthy, A. G. Forbes, J. Koltes.

Enhancing Nurse Effectiveness via Augmented Communication Tools, 2014-2017. NIH/NIBIB R01, Grant #R01EB020395 (part of the NSF/NIH Smart and Connected Health Program). PI: J. Carrington; Co-PIs: A. G. Forbes, M. Surdeanu.

Interactive and Immersive Electronic Measurement Visualization, 2015-2016. Keysight Technologies, Inc. Program for University Research. PI: A. G. Forbes.

Creative Challenges at the Intersections of Visualization Research and New Media Arts, 2015-2016. UIC Office of the Vice Provost for Faculty Affairs, Faculty Scholarship Support Fund. PI: A. G. Forbes.

Making the West Side: Community Conversations on Neighborhood Change, 2016. NEH The Common Good: Humanities in The Public Square. PI: J. Scott; Key Personnel: A. G. Forbes.