Faculty


Winslow Burleson

Associate Professor
Affiliated Associate Professor, Tandon School of Engineering
Affiliated Associate Professor, Courant Institute
Affiliated Associate Professor, College of Global Public Health

1 212 998 5376

433 First Avenue
Room 740
New York, NY 10010
United States

Professional overview

Winslow Burleson joined the New York University College of Nursing as Associate Professor in September 2014. Win leads the NYU-X Lab (www.nyu-x.org), advancing transformations in Health, Technology, Education, and Innovation. He is the Principal Investigator for the five-year, $4 million NSF Major Research Infrastructure project, "Development of Experiential Supercomputing: A Transdisciplinary Research and Innovation Holodeck."

Previously, Win was an Assistant Professor in the School of Computing, Informatics, and Decision Systems Engineering within the Ira A. Fulton Schools of Engineering, and a Senior Sustainability Scientist at the Julie Ann Wrigley Global Institute of Sustainability, at Arizona State University (ASU). He was recently selected as a Fulbright Specialist Roster Candidate in Computer Science. As a social inventor, he led Motivational Environments, a 20-person research group advancing human-centered technology and design strategy to improve quality of life through increased creativity, learning, and health. Win was recognized by ASU as an outstanding professor and by the National Academy of Engineering (NAE) as “one of the nation’s leading engineering researchers and educators” for service and leadership roles in advancing world-class transdisciplinary research and education for minority and underserved learners.

Win has raised over $11 million in collaborative funding and received the best paper award at the 2009 International Conference on Artificial Intelligence, the field’s top conference, for work demonstrating Affective Learning Companions’ ability to have large-scale societal impact on thousands of students by bringing cyber-enabled learning research into classroom settings. He has been awarded 10 patents and four inventor and innovator awards from IBM Research, received two Time Magazine awards for the top 10 and top 50 inventions of the year, authored over 100 scientific publications, exhibited at the Pompidou Centre, and performed in Carnegie Hall.

Honors and awards

White House Fellow Regional Finalist (New York, NY) (2015)

Specialties

Gerontology
Technology

Professional membership

Association for Computing Machinery

Publications

An assistive technology system that provides personalized dressing support for people living with dementia: Capability study

Burleson, W., Lozano, C., Ravishankar, V., Lee, J., & Mahoney, D. (2018). Journal of Medical Internet Research, 20(5). 10.2196/medinform.5587
Abstract
Background: Individuals living with advancing stages of dementia (persons with dementia, PWDs) or other cognitive disorders do not have the luxury of remembering how to perform basic day-to-day activities, which in turn makes them increasingly dependent on the assistance of caregivers. Dressing is one of the most common and stressful activities provided by caregivers because of its complexity and the privacy challenges posed during the process. Objective: In preparation for in-home trials with PWDs, the aim of this study was to develop and evaluate a prototype intelligent system, the DRESS prototype, to assess its ability to provide automated assistance with dressing that can afford independence and privacy to individual PWDs and potentially provide additional freedom to their caregivers (family members and professionals). Methods: This laboratory study evaluated the DRESS prototype's capacity to detect dressing events. These events were engaged in by 11 healthy participants simulating common correct and incorrect dressing scenarios. The events ranged from donning a shirt and pants inside out or backwards to partial dressing, typical issues that challenge a PWD and their caregivers. Results: A set of expected detections for correct dressing was prepared via video analysis of all participants' dressing behaviors. In the initial phases of donning either shirts or pants, the DRESS prototype missed only 4 out of 388 expected detections. The prototype's ability to recognize other missing detections varied across conditions. There were also some unexpected detections, such as detection of the inside of a shirt as it was being put on. Throughout the study, detection of dressing events was adversely affected by the relatively smaller effective size of the markers at greater distances. Although the DRESS prototype incorrectly identified 10 of 22 cases for shirts, it performed significantly better for pants, incorrectly identifying only 5 of 22 cases. Further analyses identified opportunities to improve the DRESS prototype's reliability, including increasing the size of markers, minimizing garment folding or occlusions, and optimally positioning participants with respect to the DRESS prototype. Conclusions: This study demonstrates the ability to detect clothing orientation and position and to infer the current state of dressing using a combination of sensors, intelligent software, and barcode tracking. With the improvements identified by this study, the DRESS prototype has the potential to provide a viable option for automated dressing support that assists PWDs in maintaining their independence and privacy, while potentially providing their caregivers with much-needed respite.

Affect Measurement

Gonzalez-Sanchez, J., Baydogan, M., Chavez-Echeagaray, M. E., Atkinson, R. K., & Burleson, W. (2017). In Emotions and Affect in Human Factors and Human-Computer Interaction: A Roadmap Through Approaches, Technologies, and Data Analysis (pp. 255-288). Elsevier. 10.1016/B978-0-12-801851-4.00011-2
Abstract
Affect signals what humans care about and what matters to them. By providing computers with the capability to measure affect, researchers aspire to narrow the communication gap between the emotional human and the emotionally detached computer, with the ultimate aim of enhancing human-computer interactions. This chapter explores the multidisciplinary foundations of affective state measurement as a multimodal process. Specifically, it: (1) describes popular sensing technologies, including brain-computer interfaces, face-based emotion recognition systems, eye-tracking systems, physiological sensors, body language recognition, and text-based language processing; (2) explores the data gathered from each technology and its key characteristics; (3) outlines the pros and cons of each technology; (4) examines sampling, filtering, and multimodal affective data integration methodologies; and (5) presents the tools and algorithms used to analyze affective data off-line, seeking to make inferences regarding the meaning of that data and to correlate it with stimuli.

Development of a responsive emotive sensing system (DRESS) to aid persons with dementia dress independently

Mahoney, D. F., Burleson, W., Lozano, C., Ravishankar, V., & Mahoney, E. (2014). Gerontechnology, 13(2). 10.4017/gt.2014.13.02.020.00
Abstract
Purpose: The goal of this study is to develop a 'smart dresser' for in-home use by people with moderate memory loss. This device is designed to provide individualized audio and visual task prompting, enabling people with moderate memory loss to dress while giving a respite to their caregivers. Prior research indicates that there has been insufficient attention to the stressors associated with dementia-related dressing issues [1], which include stigmatizing clothing [2] and wearable technology challenges [3]. Other researchers have suggested the need for innovative gerontechnology initiatives [4]. The DRESS system uniquely combines interactive context-aware/skeletal movement sensing, wrist-based affective emotion sensing, and fiducial fabric tag components to assess and respond to users in real time. Method: Mixed methods approach. To critique the DRESS design and provide usability recommendations, qualitative inductive focus group research was conducted with 25 family caregivers of persons who displayed dressing difficulties. System development followed an iterative path incorporating caregivers' feedback. Quantitative technical feasibility testing occurred in a controlled lab with ten actors portraying nine different standardized dressing scenarios. Results & Discussion: Caregivers validated the need for tangible dressing assistance to reduce frustration from time spent in repetitive cueing and from struggles over dressing [5]. They contributed six changes that influenced the smart dresser's conceptual-stage prototype development, most notably adding a dresser-top iPad to mimic a familiar 'TV screen' for the audio and visual cueing (Figure 1). DRESS demonstrated reliable operation, but the accuracy of clothing identification ranged from 16% for the most difficult inside-out pants layout to 100% for shirts. Adjustments were made to Kinect and fiducial threshold values, which increased the pants orientation accuracy rate to 81%. The findings demonstrate proof of feasibility and validate the conceptual development phase. Beta-stage development will follow this initial work.

Developing interactive and emergency response devices for people with disabilities and their canine assistants

Kussalanant, C., Takamura, J., Shin, D., & Burleson, W. (2012). In Advances in Human Aspects of Healthcare (pp. 721-732). CRC Press. 10.1201/b12318
Abstract
This study explores the extent to which we can trust and rely on dogs to use technology in order to perform critical behaviors that enhance health, safety, and well-being. Through the use of such technologies, humans and dogs foster a more robust and supportive environment than either assistive technologies or canine assistants can provide alone. We implement a user-centered approach focused on natural observation, respondent field surveys, scenario-based design prototyping, and ethnographic case studies, including interviews and photo journals as key methods. This approach presents opportunities for further advances in human-animal interaction, new strategies for advancing assistive technologies, and richer human-computer interaction (HCI) experiences. These scenarios are increasingly inclusive of dogs as users of, and contributors to, more fulfilling interactions with technology.

Bayesian networks and linear regression models of students’ goals, moods, and emotions

Arroyo, I., Cooper, D. G., Burleson, W., & Woolf, B. P. (2010). In Handbook of Educational Data Mining (pp. 323-338). CRC Press. 10.1201/b10274
Abstract
If computers are to interact naturally with humans, they should recognize students’ affect and express social competencies. Research has shown that learning is enhanced when empathy or support is provided, and that improved personal relationships between teachers and students lead to increased student motivation [1-4]. Therefore, if tutoring systems can embed affective support for students, they should be more effective. However, previous research has tended to privilege the cognitive over the affective and to view learning as information processing, marginalizing or ignoring affect [5]. This chapter describes two data-driven approaches to the automatic prediction of affective variables by creating models from students’ past behavior (log data). The first case study shows the methodology and accuracy of an empirical model that helps predict students’ general attitudes, goals, and perceptions of the software; the second develops empirical models for predicting students’ fluctuating emotions while using the system. The vision is to use these models to predict students’ learning and positive attitudes in real time. Special emphasis is placed in this chapter on understanding and inspecting these models, to understand how students express their emotions, attitudes, goals, and perceptions while using a tutoring system.