This project integrates AI-based visual models with augmented reality (AR) to address fall-from-height (FFH) risks in high-rise construction. In collaboration with Lendlease, we explore how these technologies can improve the speed, accuracy, and scalability of hazard detection compared to manual inspections. We developed proof-of-concept prototypes that combine AI-driven computer vision with AR to enable real-time analysis, automated reporting, and immersive inspection and training.
This study explored how Generative AI can support productive group discussions by acting as a low-interference regulatory partner rather than a content generator. Using a Design-Based Research approach, we developed and evaluated ProDAIS, a lightweight AI facilitator grounded in SSRL and APT frameworks, with results showing high usability, low cognitive load, and effective support for participation balance and time management.
This project focuses on the comparison and potential implementation of Large Language Models (LLMs) to assist the clinical handover process in an Intensive Care Unit (ICU) environment. This process is commonly defined as “the transfer of professional responsibility and accountability for some or all aspects of care for a patient” and generally occurs between the shifts of supervising nursing staff.
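As a rough illustration of how an LLM might slot into this workflow, the sketch below builds a prompt that restructures free-text shift notes into an ISBAR-style summary (a common clinical handover framework; whether the project uses ISBAR is an assumption here). `call_llm` is a hypothetical stand-in for whichever model API is being compared, not part of any real pipeline.

```python
# A minimal sketch (not the project's actual pipeline) of prompting an LLM to
# restructure free-text nursing notes into an ISBAR-style handover summary.
# `call_llm` is a hypothetical stand-in for whichever model API is compared.

ISBAR_PROMPT = """You are assisting an ICU nurse handover.
Restructure the notes below into ISBAR sections:
Identify, Situation, Background, Assessment, Recommendation.
Quote vital signs and medications verbatim; do not invent values.

Notes:
{notes}
"""

def build_handover_prompt(notes: str) -> str:
    """Fill the ISBAR template with de-identified shift notes."""
    return ISBAR_PROMPT.format(notes=notes)

def summarise_handover(notes: str, call_llm) -> str:
    """`call_llm` is any callable mapping a prompt string to model text."""
    return call_llm(build_handover_prompt(notes))
```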
This project uses a comparative experimental design to evaluate a baseline smartphone-only AR navigation system against a hybrid system supported by sparse infrastructure cameras. Through a phased approach, first extending the Monash campus app into an AR prototype, then integrating stereo camera-based 3D reconstructions, the study investigates whether periodic infrastructure “checkpoints” can reduce drift and improve robustness in indoor AR navigation under controlled lab conditions.
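The checkpoint idea can be illustrated with a toy pose-fusion loop: dead-reckoned odometry drifts between fixes, and each infrastructure-camera fix pulls the estimate back toward an absolute position. The 2D state and blending weight below are illustrative assumptions, not the study's actual method.

```python
import numpy as np

# Toy sketch: smartphone odometry drifts over time; a sparse infrastructure
# camera occasionally provides an absolute pose fix that re-anchors it.

class CheckpointFusedPose:
    def __init__(self, blend: float = 0.8):
        self.position = np.zeros(2)   # estimated (x, y) in metres
        self.blend = blend            # trust placed in an absolute fix

    def apply_odometry(self, delta: np.ndarray) -> None:
        """Dead-reckon from relative motion; error accumulates here."""
        self.position += delta

    def apply_checkpoint(self, absolute_fix: np.ndarray) -> None:
        """Pull the estimate toward an infrastructure-camera fix."""
        self.position = (self.blend * absolute_fix
                         + (1.0 - self.blend) * self.position)

# Usage: drift grows between fixes; each checkpoint bounds the error.
pose = CheckpointFusedPose()
for step in range(100):
    pose.apply_odometry(np.array([0.5, 0.0]) + np.random.normal(0, 0.02, 2))
    if step % 25 == 24:                      # sparse "checkpoint" moments
        pose.apply_checkpoint(np.array([(step + 1) * 0.5, 0.0]))
```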
This project aims to address the limitations identified in the current state of the art by designing and evaluating a context-aware computing framework for distributed vision-based AR systems.
This project proposes a simulation-based research framework that investigates how VLA-inspired decision-making can support adaptive robotic behaviours in tasks such as storage, retrieval, sorting, and replenishment. By focusing on system integration, realistic task modelling, and evaluation under dynamic operational conditions, this research aims to contribute toward more practical and scalable intelligent manufacturing systems in the agentic era.
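A skeleton of the perceive-decide-act loop such a controller would sit inside might look like the following; the observation fields, action names, and rule-based fallback policy are assumptions for the sketch, not the framework's actual interfaces.

```python
from dataclasses import dataclass
from typing import Callable

# Illustrative perceive-decide-act loop. A VLA-style model would replace the
# rule-based policy, mapping a language-grounded observation to an action.

@dataclass
class Observation:
    scene_description: str      # e.g. from a vision-language model
    pending_tasks: list[str]    # storage / retrieval / sorting / replenishment

@dataclass
class Action:
    name: str                   # e.g. "pick", "place", "navigate"
    target: str

def rule_based_policy(obs: Observation) -> Action:
    """Placeholder policy; a VLA model would map obs -> Action instead."""
    if obs.pending_tasks:
        return Action(name="pick", target=obs.pending_tasks[0])
    return Action(name="idle", target="home")

def run_episode(policy: Callable[[Observation], Action], steps: int = 5):
    obs = Observation("bin A holds three unsorted parts",
                      ["sort bin A", "replenish shelf 2"])
    for _ in range(steps):
        action = policy(obs)
        print(f"{action.name} -> {action.target}")
        if action.target in obs.pending_tasks:
            obs.pending_tasks.remove(action.target)

run_episode(rule_based_policy)
```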
This research examines text input methods in virtual and mixed reality, highlighting limitations in speed, accuracy, and usability across keyboards, speech, and gesture-based approaches, and identifies stylus-based input as a promising alternative. It proposes a system that converts 3D stylus trajectories into text using machine learning, aiming to provide a more natural, efficient, and adaptable input method for immersive environments.
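By way of illustration, a recogniser for stylus-written text typically needs raw strokes normalised and resampled to a fixed length before classification; the sketch below shows only that preprocessing step (the model architecture itself is not specified by this project and is out of scope here).

```python
import numpy as np

# Preprocessing sketch for stylus input: normalise a raw 3D pen trajectory
# and resample it to a fixed length before feeding it to a recogniser.

def normalise(points: np.ndarray) -> np.ndarray:
    """Centre the stroke and scale it into a unit bounding box."""
    centred = points - points.mean(axis=0)
    scale = np.abs(centred).max() or 1.0
    return centred / scale

def resample(points: np.ndarray, n: int = 64) -> np.ndarray:
    """Resample an (m, 3) trajectory to n points, evenly spaced by arc length."""
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, cum[-1], n)
    out = np.empty((n, 3))
    for dim in range(3):
        out[:, dim] = np.interp(targets, cum, points[:, dim])
    return out

stroke = np.cumsum(np.random.normal(0, 0.01, size=(200, 3)), axis=0)
features = resample(normalise(stroke))   # (64, 3) input for a classifier
```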
This systematic literature review examines how Augmented Reality (AR) and Artificial Intelligence (AI) are being integrated to support medical clinical trials, with a focus on where AI augments AR (e.g., real-time sensing, decision support, adaptive guidance) across trial workflows. It synthesises current integration patterns, application areas, and research gaps to inform the design of more intelligent, immersive trial tools.
This research explores how context-dependent memory influences users’ ability to interpret and recall insights in prox-situated data analytics. This work examines how physical and spatial context, such as environment, viewpoint, and interaction modality, affects understanding and memory retention during analysis. The goal is to inform the design of immersive and situated analytics systems that better support cognition and decision-making.
The SafePhARm project explores the use of Augmented Reality (AR) to provide context-aware information directly within pharmacists’ field of view. It aims to identify workflow inefficiencies, co-design AR interfaces with pharmacists, and develop a proof-of-concept prototype. The system will be evaluated in a controlled environment to assess its impact on task efficiency, usability, and cognitive load, ultimately generating design guidelines and evidence for AR-enabled digital health solutions.
The project tests flexible virtual personas that vary in tone, emotion, and resistance to create adaptive role play. It aims to improve the quality, accessibility, and scalability of simulation training, especially for regional and remote learners. Educators can supervise sessions in real time or review interactions through secure recordings and analytics, strengthening feedback loops.
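One plausible way to parameterise such personas is as a small configuration rendered into an LLM system prompt; the field names and wording below are illustrative assumptions, not the project's actual schema.

```python
from dataclasses import dataclass

# Sketch of a role-play persona as a config object rendered into a system
# prompt. Fields and phrasing are assumptions for illustration only.

@dataclass
class Persona:
    name: str
    tone: str        # e.g. "curt", "warm"
    emotion: str     # e.g. "anxious", "calm"
    resistance: int  # 0 (cooperative) .. 10 (highly resistant)

def system_prompt(p: Persona) -> str:
    return (
        f"You are role-playing {p.name} in a training simulation. "
        f"Speak in a {p.tone} tone and present as {p.emotion}. "
        f"On a 0-10 scale, resist the trainee's suggestions at level "
        f"{p.resistance}: concede only when they address your concerns."
    )

print(system_prompt(Persona("Alex", tone="curt", emotion="anxious", resistance=7)))
```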
This project develops an AI-driven system to automate brain tumour segmentation from complex 3D MRI scans, reducing the need for time-consuming and error-prone manual annotation by medical experts. It combines deep learning models with immersive visualisation (e.g., AR/VR), allowing clinicians to interact with and refine tumour predictions directly in 3D space.
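To give a flavour of the visualisation step, the sketch below converts a predicted 3D mask into a surface mesh of the kind an AR/VR engine can render, using marching cubes; the toy probability blob stands in for a real model's output, and the export path is an assumption.

```python
import numpy as np
from skimage import measure

# Turn a predicted 3D tumour mask into a renderable surface mesh. The blob
# below is a stand-in for a real segmentation model's output volume.

prediction = np.zeros((64, 64, 64), dtype=float)
prediction[24:40, 24:40, 24:40] = 1.0          # toy "tumour" probability blob

# Extract an isosurface at probability 0.5 via marching cubes.
verts, faces, normals, _ = measure.marching_cubes(prediction, level=0.5)
print(f"mesh: {len(verts)} vertices, {len(faces)} triangles")
# verts/faces could then be serialised (e.g. to OBJ) for a Unity/Unreal scene.
```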
This project addresses the challenges in the nurse handover process in critical care settings by designing an integrated solution that leverages immersive and AI technologies. We propose an integrated approach consisting of the following innovative concepts: in-situ augmented reality (AR) overlays, an AI-assisted conversational agent, and a cross-reality collaborative model.
Published in 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 2020
We explore the adaptation of 2D small-multiples visualisation on flat screens to 3D immersive spaces. We use a “shelves” metaphor for layout of small multiples and consider a design space across a number of layout and interaction dimensions. We demonstrate the applicability of a prototype system informed by this design space to data sets from different domains. We perform two user studies comparing the effect of the shelf curvature dimension from our design space on users’ ability to perform comparison and trend analysis tasks. Our results suggest that, with fewer multiples, a flat layout is more performant despite the need for participants to walk further. With an increase in the number of multiples, this performance difference disappears due to the time participants had to spend walking. In the latter case, users prefer a semi-circular layout over either a fully surrounding or a flat arrangement.
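For concreteness, the curvature dimension of this design space reduces to simple layout geometry: panel positions for a flat wall versus a semicircle around the viewer. The panel width and radius below are arbitrary illustrative values, not the prototype's settings.

```python
import numpy as np

# Layout geometry for the curvature dimension: n small multiples placed flat
# along a wall versus on a semicircle around the viewer (at the origin).

def flat_layout(n: int, width: float = 0.5, depth: float = 2.0):
    """Panels in a row on a wall `depth` metres away; all face the viewer."""
    xs = (np.arange(n) - (n - 1) / 2) * width
    return [(float(x), depth) for x in xs]

def semicircular_layout(n: int, radius: float = 2.0):
    """Panels spaced along a half-circle; each would be rotated to face the centre."""
    angles = np.linspace(0.0, np.pi, n)
    return [(radius * float(np.cos(a)), radius * float(np.sin(a))) for a in angles]

print(flat_layout(5))           # horizontal spread grows linearly with n
print(semicircular_layout(5))   # angular spacing shrinks as n grows
```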
Recommended citation: Liu, J., Prouzeau, A., Ens, B. and Dwyer, T., 2020, March. Design and evaluation of interactive small multiples data visualisation in immersive spaces. In 2020 IEEE conference on virtual reality and 3D user interfaces (VR) (pp. 588-597). IEEE.
Download Paper | Download Bibtex
Published in Cartography and Geographic Information Science (CGIS), 2021
With the increasing availability of head-mounted displays for virtual reality and augmented reality, we can create immersive maps in which the user is closer to the data. Embodiment is a key concept, allowing the user to act upon virtual objects in an immersive environment. Our work explores the use of embodied interaction for immersive maps. We propose four design considerations for embodied maps and embodied gesture interaction with immersive maps: object presence, consistent physics, human body skills, and direct manipulation. We present an example of an immersive flow map with a series of novel embodied gesture interactions, which adhere to the proposed design considerations. The embodied interactions allow users to directly manipulate immersive flow maps and explore origin-destination flow data in novel ways. Authors of immersive maps can use the four proposed design considerations for creating embodied gesture interactions. The discussed example interactions apply to diverse types of immersive maps and will hopefully incite others to invent more embodied interactions for immersive maps.
Recommended citation: Newbury, R., Satriadi, K. A., Bolton, J., Liu, J., Cordeil, M., Prouzeau, A., & Jenny, B. (2021). Embodied gesture interaction for immersive maps. Cartography and Geographic Information Science, 48(5), 417–431. https://doi.org/10.1080/15230406.2021.1929492
Download Paper | Download Bibtex
Published in CHI EA '22: CHI Conference on Human Factors in Computing Systems Extended Abstracts, 2022
Immersive Analytics is now a fully emerged research topic that spans several research communities, including Human-Computer Interaction, Data Visualisation, Virtual Reality and Augmented Reality. Immersive Analytics research has identified and validated benefits of using embodied, 3D spatial immersive environments for visualisation and has shown benefits in the effective use of space and 3D interaction to explore complex data. As of today, most studies in Immersive Analytics have focused on exploring novel visualisation techniques in 3D embodied immersive environments. Thus far, there is a lack of fundamental studies in this field that clearly compare immersive versus non-immersive platforms for analytics purposes, and firmly delineate the benefits of immersive environments for analytic tasks. We feel that it is time to establish an agenda to assess the benefits and potential of immersive technologies, spatial interfaces, and embodied interaction to support sensemaking, data understanding, and collaborative analytics. This workshop aims to put this agenda together by gathering international experts from Immersive Analytics and related fields to define which studies need to be conducted to assess the effect of embodied interaction on cognition in data analytics.
Recommended citation: Barrett Ens, Maxime Cordeil, Chris North, Tim Dwyer, Lonni Besançon, Arnaud Prouzeau, Jiazhou Liu, Andrew Cunningham, Adam Drogemuller, Kadek Ananta Satriadi, and Bruce H Thomas. 2022. Immersive Analytics 2.0: Spatial and Embodied Sensemaking. In Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems (CHI EA '22). Association for Computing Machinery, New York, NY, USA, Article 91, 1–7. https://doi.org/10.1145/3491101.3503726
Download Paper | Download Bibtex
Published in Proceedings of the ACM on Human-Computer Interaction (ISS), 2022
In immersive environments, positioning data visualisations around the user in a wraparound layout has been advocated as advantageous over flat arrangements more typical of traditional screens. However, other than limiting the distance users must walk, there is no clear design rationale behind this common practice, and little research on the impact of wraparound layouts on visualisation tasks. The ability to remember the spatial location of elements of visualisations within the display space is crucial to support visual analytical tasks, especially those that require users to shift their focus or perform comparisons. This ability is influenced by the user’s spatial memory, but how spatial memory is affected by different display layouts remains unclear. In this paper, we perform two user studies to evaluate the effects of three layouts with varying degrees of curvature around the user (flat-wall, semicircular-wraparound, and circular-wraparound) on a visuo-spatial memory task in a virtual environment. The results show that participants are able to recall spatial patterns with greater accuracy and report more positive subjective ratings using flat than circular-wraparound layouts. While we did not find any significant performance differences between the flat and semicircular-wraparound layouts, participants overwhelmingly preferred the semicircular-wraparound layout, suggesting it is a good compromise between the two extremes of display curvature.
Recommended citation: Liu, J., Prouzeau, A., Ens, B. and Dwyer, T., 2022. Effects of display layout on spatial memory for immersive environments. Proceedings of the ACM on Human-Computer Interaction, 6(ISS), pp.468-488.
Download Paper | Download Bibtex
Published in Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 2023
Recent studies have explored how users of immersive visualisation systems arrange data representations in the space around them. Generally, these have focused on placement centred at eye-level in absolute room coordinates. However, work in HCI exploring full-body interaction has identified zones relative to the user’s body with different roles. We encapsulate the possibilities for visualisation view management into a design space (called “DataDancing”). From this design space we extrapolate a variety of view management prototypes, each demonstrating a different combination of interaction techniques and space use. The prototypes are enabled by a full-body tracking system including novel devices for torso and foot interaction. We explore four of these prototypes, encompassing standard wall and table-style interaction as well as novel foot interaction, in depth through a qualitative user study. Learning from the results, we improve the interaction techniques and propose two hybrid interfaces that demonstrate interaction possibilities of the design space.
Recommended citation: Liu, J., Ens, B., Prouzeau, A., Smiley, J., Nixon, I.K., Goodwin, S. and Dwyer, T., 2023, April. Datadancing: An exploration of the design space for visualisation view management for 3d surfaces and spaces. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1-17).
Download Paper | Download Bibtex
Published in Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 2023
This paper presents the design and evaluation of GestureExplorer, an Immersive Analytics tool that supports interactive exploration, classification, and sensemaking with large sets of 3D temporal gesture data. GestureExplorer features 3D skeletal and trajectory visualisations of gestures combined with abstract visualisations of clustered sets of gestures. By leveraging the large immersive space afforded by a Virtual Reality interface, our tool allows free navigation and control of viewing perspective for users to gain a better understanding of gestures. We explored a selection of classification methods to provide an overview of the dataset that was linked to a detailed view of the data that showed different visualisation modalities. We evaluated GestureExplorer with two user studies and collected feedback from participants with diverse visualisation and analytics backgrounds. Our results demonstrated the promising capability of GestureExplorer for providing a useful and engaging experience in exploring and analysing gesture data.
Recommended citation: Li, A., Liu, J., Cordeil, M., Topliss, J., Piumsomboon, T. and Ens, B., 2023, April. Gestureexplorer: Immersive visualisation and exploration of gesture data. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1-16).
Download Paper | Download Bibtex
Published in 2024 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2024
Augmented Reality (AR) is touted to be beneficial in supporting situated information display, allowing virtual information panels to be overlaid on real-world scenes. People must then use their spatial memory to navigate among these virtual panels effectively. While spatial memory has been studied in physical environments (wall displays) and virtual reality environments, there has been little research on how physical surroundings might affect memorisation of virtual content in a mixed environment like AR. Therefore, we provide the first AR study of spatial memory, comparing two different room settings with two different situated layouts of virtual targets on an abstract spatial memory task. We find that participants recall spatial patterns with greater accuracy and higher subjective ratings in a room with furniture compared to an empty room. Our findings lead to important design implications for mixed-reality user interfaces, particularly in information-rich applications like situated analytics and small-multiples information visualisation.
Recommended citation: Liu, J., Satriadi, K.A., Ens, B. and Dwyer, T., 2024, October. Investigating the effects of physical landmarks on spatial memory for information visualisation in augmented reality. In 2024 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 289-298). IEEE.
Download Paper | Download Bibtex
Published in 2024 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 2024
Together with industry experts, we are exploring the potential of head-mounted augmented reality to facilitate safety inspections on high-rise construction sites. A particular concern in the industry is inspecting perimeter safety screens on higher levels of construction sites, intended to prevent falls of people and objects. We aim to support workers performing this inspection task by tracking which parts of the safety screens have been inspected. We use machine learning to automatically detect gaps in the perimeter screens that require closer inspection and remediation and to automate reporting. This work-in-progress paper describes the problem, our early progress, concerns around worker privacy, and the possibilities to mitigate these.
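A simplified sketch of the inspection-tracking logic is shown below: each perimeter screen segment is marked inspected once viewed, and a detector flags segments whose frames show gaps. `detect_gaps` is a hypothetical stand-in for the trained model, not its real interface.

```python
from dataclasses import dataclass, field

# Sketch: track which perimeter screen segments have been inspected and
# which a (hypothetical) gap detector has flagged for remediation.

@dataclass
class ScreenSegment:
    segment_id: str
    inspected: bool = False
    flagged: bool = False        # detector reported a possible gap

@dataclass
class InspectionLog:
    segments: dict[str, ScreenSegment] = field(default_factory=dict)

    def record_view(self, segment_id: str, frame, detect_gaps) -> None:
        seg = self.segments.setdefault(segment_id, ScreenSegment(segment_id))
        seg.inspected = True
        if detect_gaps(frame):   # e.g. run on the headset camera frame
            seg.flagged = True

    def report(self) -> dict:
        return {
            "uninspected": [i for i, s in self.segments.items() if not s.inspected],
            "needs_remediation": [i for i, s in self.segments.items() if s.flagged],
        }

# Usage: register the level's segments, then log views as the worker moves.
log = InspectionLog({f"seg-{i}": ScreenSegment(f"seg-{i}") for i in range(4)})
log.record_view("seg-0", frame=None, detect_gaps=lambda f: True)
print(log.report())   # seg-1..3 uninspected; seg-0 needs remediation
```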
Recommended citation: J. Liu, A. S. Rao, F. Ke, T. Dwyer, B. Tag and P. D. Haghighi, "AR-Facilitated Safety Inspection and Fall Hazard Detection on Construction Sites," 2024 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Bellevue, WA, USA, 2024, pp. 12-14, doi: 10.1109/ISMAR-Adjunct64951.2024.00010.
Download Paper | Download Bibtex
Published in IEEE VIS 2025 Poster, 2025
Brain tumours differ significantly in shape, size, location, and contrast imperfections. Reliable segmentation is crucial for accurate identification and effective treatment. Recent advances in AI-based 3D medical image segmentation, such as SAM-Med3D, have improved automation; however, expert supervision remains vital for brain imaging. We present a virtual reality probe that integrates an expert-in-the-loop approach with SAM-Med3D, allowing users to iteratively enhance brain tumour segmentation by selecting points for AI refinement.
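The refinement cycle can be sketched as follows: the clinician drops positive or negative points in VR, the model re-segments with all prompts accumulated so far, and an overlap score tracks progress. `segment_fn` and `pick_point_fn` are hypothetical stand-ins; the real SAM-Med3D API is not shown here.

```python
import numpy as np

# Expert-in-the-loop sketch: points accumulate across rounds and a
# promptable segmenter (stand-in `segment_fn`) re-segments each time.

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    """Overlap score used to track whether each refinement round helps."""
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum() + 1e-8)

def refine(volume, truth, segment_fn, pick_point_fn, rounds: int = 3):
    points, labels = [], []                 # prompts accumulate across rounds
    mask = np.zeros_like(truth, dtype=bool)
    for _ in range(rounds):
        # The expert marks one correction: positive where tumour was missed,
        # negative where healthy tissue was wrongly included.
        point, label = pick_point_fn(mask, truth)
        points.append(point)
        labels.append(label)
        mask = segment_fn(volume, points, labels)
        print(f"Dice after {len(points)} point(s): {dice(mask, truth):.3f}")
    return mask
```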
Recommended citation: Pooryousef, V., Peiris, H., Liang, H., Li, A., Chen, Z., Dwyer, T., & Liu, J. (2025, October). SAMMed-VR: Integrated Segment Anything Model in Virtual Reality for Supervised Brain Tumour Segmentation. IEEE VIS 2025 Poster.
Download Paper
Published in International Journal of Human-Computer Studies (IJHCS), 2025
Many medical procedures, including pedicle screw placement, require intricate hand-sight coordination. In recent years, immersive virtual reality (VR) technologies have gained traction in supporting training of such complex tasks. To effectively perform the task, the ability to see the screw position from different angles during the procedure is crucial, as it meets the user’s need for comprehensive spatial information to guide their actions. Yet, current literature lacks guidelines for designing view layouts for VR simulators in this context. We conducted a repeated-measures experiment investigating various layout parameters (8 layouts and 2 view sizes). We gathered behavioral metrics, eye-tracking data, and subjective ratings from 27 participants. We found that layout design significantly impacts task performance, with placing views on the left of the visual field in a vertical arrangement reducing task response time. Furthermore, we found effects of view arrangements on the flow of visual search patterns. Our study provides design guidelines to inform future design of VR pedicle screw placement simulators and other types of simulators requiring the combination of manual tasks and multiple-perspective views.
Recommended citation: Qin, L., Satriadi, K.A., Liu, J., Zhan, Y., Shao, J., Liu, P., Chen, Z. and Liu, Y., 2025. Effects of interface layouts on cognitive performance for pedicle screw placement simulator in immersive environments. International Journal of Human-Computer Studies, p.103650.
Download Paper | Download Bibtex
Published in IEEE VIS 2025 Workshop Proposal, 2026
It has been ten years since the term ‘Immersive Analytics’ (IA) was coined and research interest in the topic remains strong. Researchers in this field have produced practical and conceptual knowledge concerning the use of emerging immersive spatial display and interaction technologies for sense-making tasks through a number of papers, surveys, and books. However, a lack of truly physically and psychologically ergonomic techniques, as well as standardized human-centric validation protocols for these, remains a significant barrier to wider acceptance of practical IA systems in ubiquitous applications. Building upon a series of workshops on immersive analytics at various conferences, this workshop aims to explore new approaches and establish standard practices for evaluating immersive analytics systems from a human factors perspective. We will gather immersive analytics researchers and practitioners to look closely at these human factors – including cognitive and physical functions as well as behaviour and performance – to see how they inform the design and deployment of immersive analytics techniques and applications and to inform future research.
Recommended citation: Li, Y., Satriadi, K.A., Liu, J., Khurana, A., Wu, Z., Tag, B. and Dwyer, T., 2026. Human Factors in Immersive Analytics. arXiv preprint arXiv:2601.11365.
Download Paper | Download Bibtex
Published in ACM CHI 2026 XR4CE Workshop, 2026
Effective nurse handover is crucial in high-pressure environments like Intensive Care Units (ICUs), where accurate communication of patient-specific information directly shapes patient care and clinical decision-making. We conducted seven interviews with ICU nurses to understand current handover practices. Preliminary findings reveal significant challenges, including high cognitive load from fragmented EMR data, the risk of technology hindering interpersonal rapport, and the loss of nuanced data during shift transitions. These issues lead to cognitive overload and information omission, particularly during fast-paced shift transitions when staff fatigue is prevalent. We explore the potential for in-situ Augmented Reality (AR) overlays and Artificial Intelligence (AI) agents to support ICU nurse handover by enabling hands-free information access, procedure guidance and documentation assistance.
Recommended citation: Li, M., Zhang, P., Liu, J., Haryanto, A., Satriadi, K.A., Nguyen, T., Mehta, D., Lokmic-Tomkins, Z. and Dwyer, T., 2026. HandovAR: Towards AR and AI Support for ICU Nurse Handover. In ACM CHI 2026 XR4CE Workshop.
Download Paper | Download Bibtex
Undergraduate course, Monash University, 2019
Unit Code: FIT3164 2019S2
Undergraduate course, Monash University, 2020
Unit Code: FIT3179 2020S2
Undergraduate course, Monash University, 2021
Unit Code: FIT3163 2021S2
Postgraduate course, Monash University, 2021
Unit Code: FIT5152 2021S2
Undergraduate course, Monash University, 2021
Unit Code: FIT3179 2021S2
Undergraduate course, Monash University, 2022
Unit Code: FIT3164 2022S1
Undergraduate course, Monash University, 2022
Unit Code: FIT3175 2022S1
Postgraduate course, Monash University, 2022
Unit Code: FIT5147 2022S1
Undergraduate course, Monash University, 2022
Unit Code: FIT3179 2022S2
Undergraduate course, Monash University, 2023
Unit Code: FIT3179 2023S2
Postgraduate course, University of Melbourne, 2024
Unit Code: COMP90044 2024S1
Undergraduate course, Monash University, 2024
Unit Code: FIT2081 2024S1
Undergraduate course, Monash University, 2024
Unit Code: FIT1050 2024S1
Postgraduate course, University of Melbourne, 2024
Unit Code: COMP90044 2024S2
Undergraduate course, Monash University, 2024
Unit Code: FIT2095 2024S2
Undergraduate course, Monash University, 2024
Unit Code: FIT3179 2024S2
Undergraduate course, Monash University, 2025
Unit Code: FIT2081 2025S1
Undergraduate course, Monash University, 2025
Unit Code: FIT2095 2025S2
Undergraduate course, Monash University, 2025
Unit Code: FIT3179 2025S2
Undergraduate course, Monash University, 2026
Unit Code: FIT2081 2026S1
Undergraduate course, Monash University, 2026
Unit Code: FIT2095 2026S2