Research

Most of my research falls into one of four areas: visualization, the Leeds Virtual Microscope, virtual reality, and the world-wide web. My current projects are Event Sequences, QuantiCode and QualDash (all visualization) and the Leeds Virtual Microscope. I maintain an up-to-date list of my publications, which includes links to the videos that illustrate my research (most of them are on my YouTube channel). You can also look at my papers on Google Scholar.

I have also created some writing tips to help you overcome writer’s block and structure reports, papers and dissertations.

Visualization Research

Orchestral

This proof-of-concept tool demonstrated the advantages of using purpose-designed visualization software on Powerwalls to analyse genomics data. The primary benefit came from letting users see detailed data for an entire chromosome and up to 100 patients at once, which enabled them to identify data processing errors and understand key limitations of a new statistical algorithm. Funding: Yorkshire Cancer Research. Key paper: Ruddle, R. A., Fateen, W., Treanor, D., Sondergeld, P., & Quirke, P. (2013). Leveraging wall-sized high-resolution displays for comparative genomics analyses of copy number variation. Proceedings of the IEEE Symposium on Biological Data Visualization (BioVis), 89-96.
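
To give a flavour of the layout, here is a minimal sketch of binning copy-number values into a patients-by-genome heatmap matrix, one row per patient and one column per genomic bin; the bin size, display width and data are illustrative assumptions, not Orchestral's actual parameters.

```python
import numpy as np

def bin_copy_number(positions, values, chrom_length, n_bins):
    """Average per-probe copy-number values into fixed-width genomic bins;
    bins with no probes come out as NaN (rendered as gaps)."""
    idx = np.minimum(positions * n_bins // chrom_length, n_bins - 1)
    sums = np.bincount(idx, weights=values, minlength=n_bins)
    counts = np.bincount(idx, minlength=n_bins)
    with np.errstate(invalid="ignore"):
        return sums / counts

# Illustrative numbers: a 7680-pixel-wide display gives one bin per pixel
# column, so a 250 Mb chromosome is summarised at ~32 kb per bin and
# 100 patients stack as 100 rows of the heatmap.
N_BINS, N_PATIENTS, CHROM_LEN = 7680, 100, 250_000_000
rng = np.random.default_rng(0)
heatmap = np.vstack([
    bin_copy_number(rng.integers(0, CHROM_LEN, 5000),
                    rng.normal(2.0, 0.5, 5000), CHROM_LEN, N_BINS)
    for _ in range(N_PATIENTS)
])
print(heatmap.shape)  # (100, 7680): one row per patient, ready to render
```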

Paramorama

Paramorama helps users to design and optimize data processing pipelines, which is very different from the established use of visualization to explore data and communicate findings. Paramorama was developed by Hannes Pretorius during the WelMec project. Funding: Wellcome Trust & EPSRC. Key paper: Pretorius, A. J., Zhou, Y., & Ruddle, R. A. (2015). Visual parameter optimisation for biomedical image processing. BMC Bioinformatics 2015, 16(Suppl 11):S9.
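
The underlying idea, sampling a pipeline's parameter space so that the outputs can be inspected visually, can be sketched as follows; the pipeline, parameter names and grid are hypothetical, not Paramorama's API.

```python
from itertools import product

def sweep(pipeline, param_grid):
    """Run the pipeline once for every combination of parameter values
    and record (settings, result) pairs for later visual inspection."""
    names = list(param_grid)
    results = []
    for combo in product(*(param_grid[n] for n in names)):
        settings = dict(zip(names, combo))
        results.append((settings, pipeline(**settings)))
    return results

# Hypothetical two-parameter segmentation pipeline: the score stands in
# for whatever quality measure the analyst inspects visually.
def segment(threshold, min_area):
    return {"score": 1.0 - abs(threshold - 0.4) - min_area / 1000}

grid = {"threshold": [0.2, 0.4, 0.6, 0.8], "min_area": [50, 100, 200]}
for settings, result in sweep(segment, grid):
    print(settings, result["score"])
```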

Powerwall Interfaces

Chris Rooney developed a novel multi-modal visualization tool for Powerwalls, and interaction techniques that were fast because of their emphasis on low-precision movement. Funding: University of Leeds PhD Scholarship. Key paper: Rooney, C. & Ruddle, R. A. (2012). Improving window manipulation and content interaction on high resolution, wall-sized displays. International Journal of Human-Computer Interaction, 28, 423-432.

Tangible User Interfaces

Shiroq Al-Megren developed a wonderful tangible user interface (TUI) for visualizing genomic data on a tabletop display. Users analysed the data significantly faster and more efficiently with the TUI than with a touch display. Video. Funding: PhD Scholarship from the Saudi Arabian Government. Key paper: Al-Megren, S. & Ruddle, R. A. (2016). Comparing tangible and multi-touch interaction for interactive data visualization tasks. ACM International Conference on Tangible, Embedded and Embodied Interaction (TEI), 279-286.

PETMiner

PETMiner is a visualization tool for the analysis of petrophysics data. The tool is novel because of the methods it uses to dramatically reduce the cost of user interaction. Those methods centre on the interaction style, the integration of subjective and objective data, the exploitation of ultra-high-definition displays, and making repetitive operations fast. Funding: Joint Industry Project (for details, see Industry). Key paper: Harrison, D. G., Efford, N. D., Fisher, Q. J., & Ruddle, R. A. (in press). PETMiner – A visual analysis tool for petrophysical properties of core sample data. IEEE Transactions on Visualization and Computer Graphics.

Event Sequences

This research with the Fraunhofer Institute for Computer Graphics Research (IGD) is investigating techniques for visualizing multiple attributes in the electronic health records (EHRs) of patient cohorts, and evaluating the effect of different visualization techniques on people's ability to judge the similarity of event sequences. Funding: Alexander von Humboldt Foundation. Key paper: Ruddle, R., Bernard, J., May, T., Luecke-Tieke, H., & Kohlhammer, J. (2016). Methods and a research agenda for the evaluation of event sequence visualization techniques. Proceedings of the IEEE VIS 2016 Workshop on Temporal & Sequential Event Analysis. Available online on GitHub.
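
As one concrete baseline for the kind of similarity judgement being evaluated, edit (Levenshtein) distance over event codes is a standard generic measure; this sketch is illustrative and is not the measure proposed in the paper.

```python
def edit_distance(a, b):
    """Minimum number of insertions, deletions and substitutions needed
    to turn event sequence a into event sequence b (dynamic programming)."""
    prev = list(range(len(b) + 1))
    for i, ea in enumerate(a, 1):
        curr = [i]
        for j, eb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # delete ea
                            curr[j - 1] + 1,             # insert eb
                            prev[j - 1] + (ea != eb)))   # substitute
        prev = curr
    return prev[-1]

# Two patients' coded event sequences (hypothetical codes).
p1 = ["admit", "scan", "surgery", "discharge"]
p2 = ["admit", "scan", "scan", "discharge"]
print(edit_distance(p1, p2))  # -> 1 (one substitution)
```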

QuantiCode

QuantiCode (to give the project its full name, “Intelligent infrastructure for quantitative, coded longitudinal data”) aims to develop novel data mining and visualization tools and techniques that will transform people’s ability to analyse quantitative and coded longitudinal data. Such data are common in sectors such as health (where clinical events are recorded using a hierarchy of hundreds of thousands of Read Codes) and retail (supermarkets such as Sainsbury’s sell 50,000+ types of product). We have seven partners: NHS Digital, Leeds City Council, J Sainsburys PLC, Bradford Institute for Health Research, Leeds North Clinical Commissioning Group, Consumerdata Ltd and aql Ltd. Funding: EPSRC.
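
To illustrate what analysing coded longitudinal data involves, here is a minimal sketch of rolling event counts up a code hierarchy so the data can be examined at any level of granularity; the codes and hierarchy are hypothetical, not actual Read Codes.

```python
from collections import Counter

def roll_up(counts, parent):
    """Propagate event counts from each code up through every ancestor,
    so any level of the hierarchy can be analysed at its own granularity."""
    total = Counter()
    for code, n in counts.items():
        node = code
        while node is not None:
            total[node] += n
            node = parent.get(node)
    return total

# Hypothetical three-level fragment of a clinical code hierarchy.
parent = {"X1...": None, "X12..": "X1...",
          "X121.": "X12..", "X122.": "X12.."}
counts = {"X121.": 12, "X122.": 5, "X12..": 2}
print(roll_up(counts, parent))
# Counter({'X1...': 19, 'X12..': 19, 'X121.': 12, 'X122.': 5})
```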

QualDash

QualDash (“Designing and evaluating an interactive dashboard to improve quality of care”) is developing visualization dashboards to allow clinical teams, quality sub-committees, NHS Trust boards, and commissioners to better understand and make use of National Clinical Audit (NCA) data, thereby leading to improved quality of care and clinical outcomes. Our partners are Leeds Teaching Hospitals NHS Trust (LTHT), Manchester University NHS Foundation Trust (MFT), East Lancashire Hospitals NHS Trust (ELHT), Mid Yorkshire Hospitals NHS Trust, and University Hospitals of North Midlands NHS Trust (UHNM). Funding: NIHR.

Leeds Virtual Microscope Research

The Leeds Virtual Microscope (LVM) is half-way between visualization and virtual reality. The LVM was originally conceived as a “VR Microscope for Diagnostic Pathology”, because it used techniques similar to a flight simulator to allow users to pan and zoom gigantic images in real time. However, the LVM evolved to become a sophisticated image visualization tool, which is used in several hospitals and is being commercialised (see Industry). Each pathology slide (a scanned biopsy) is about 10 gigapixels in size (similar to a Landsat image of the whole of the Amazon rainforest), and multi-slide patient cases often involve one trillion pixels of image data.
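
Viewers of this kind typically store each slide as a tiled, multi-resolution pyramid, so that only the tiles visible at the current pan/zoom position are ever loaded. Here is a minimal sketch of the arithmetic, with an illustrative tile size rather than the LVM's actual parameters.

```python
import math

TILE = 256  # illustrative tile edge length in pixels

def pyramid_levels(width, height):
    """Number of power-of-two downsampling levels from full resolution
    down to a single tile."""
    return max(0, math.ceil(math.log2(max(width, height) / TILE))) + 1

def visible_tiles(view_x, view_y, view_w, view_h, level):
    """Tile coordinates covering a viewport at a given pyramid level,
    where level 0 is full resolution and each level halves both axes."""
    scale = 2 ** level
    x0, y0 = view_x // (TILE * scale), view_y // (TILE * scale)
    x1 = (view_x + view_w - 1) // (TILE * scale)
    y1 = (view_y + view_h - 1) // (TILE * scale)
    return [(x, y) for y in range(y0, y1 + 1) for x in range(x0, x1 + 1)]

# A 10-gigapixel slide (e.g. 125,000 x 80,000 pixels) needs only ~10
# levels, and a 4K viewport touches at most a few hundred tiles, which
# is why panning and zooming can stay interactive.
print(pyramid_levels(125_000, 80_000))          # -> 10
print(len(visible_tiles(0, 0, 3840, 2160, 0)))  # -> 135
```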

The LVM project is a collaboration with LTHT. The fundamental research was funded by the Pathological Society of Great Britain & Ireland and NIHR, with follow-on support from the Medical Technologies Innovation and Knowledge Centre, an Impact Acceleration Award from the EPSRC, and the Yorkshire & Humber NHS Deanery.

Powerwall

The LVM project started with the development of novel Powerwall software and a proof-of-concept evaluation, which indicated that pathologists could diagnose cancer as quickly as when using a conventional light microscope. That was in marked contrast to existing desktop digital pathology systems, which were 60% slower. The success of this evaluation led to a major NIHR grant and the installation of Powerwalls in LTHT and the Leeds Institute of Cancer and Pathology (LICAP). Video. Key paper: Treanor, D., Jordan Owers, N., Hodrien, J., Quirke, P., & Ruddle, R. A. (2009). Virtual reality Powerwall versus conventional microscope for viewing pathology slides: an experimental comparison. Histopathology, 55, 294-300.

Medical-grade Desktop

Three years of design & development led to a version of the LVM that ran on high-definition medical-grade displays. We then demonstrated that pathologists could diagnose long cancer cases as efficiently with the LVM as with a conventional microscope. Industrial interest led to commercialisation funding. Video. Key paper: Randell, R., Ruddle, R. A., Thomas, R. G., Mello-Thoms, C., & Treanor, D. (2014). Diagnosis of major cancer resection specimens with virtual slides: Impact of a novel digital pathology workstation. Human Pathology, 45, 2101-2106.

Teaching

When the Powerwall version was used to teach pathology to second-year medical students, the giant display and cooperative navigation led to many serendipitous expansions of the tutorial content. Key paper: Randell, R., Hutchins, G., Sandars, J., Ambepitiya, T., Treanor, D., Thomas, R., Ruddle, R. (2012). Using a high-resolution wall-sized virtual microscope to teach undergraduate medical students. CHI ’12 Extended Abstracts on Human Factors in Computing Systems, 2435-2440.

User Interface

The LVM’s design and evaluation won the inaugural ACM ToCHI Best Paper Award. Key paper: Ruddle, R. A., Thomas, R. G., Randell, R., Quirke, P., & Treanor, D. (2016). The design and evaluation of interfaces for navigating gigapixel images in digital pathology. ACM Transactions on Computer-Human Interaction, 23(1), Article No. 5.

Virtual Reality Research

I have conducted many user experiments using desktop and immersive virtual reality (VR) worlds. Below are key examples; for a full list, see my publications. External funding for this research has come from the EPSRC, the British Council and the Alexander von Humboldt Foundation.

Long-term Navigation

We recreated the classic Rand Building study of Thorndyke & Hayes-Roth, bringing our users into the lab 10 times to learn the building’s layout, using a ‘day at the office’ metaphor. No other study has come close to studying spatial learning in VR over such a long period of time. Key paper: Ruddle, R. A., Payne, S. J., & Jones, D. M. (1997). Navigating buildings in “desk-top” virtual environments: Experimental investigations using extended navigational experience. Journal of Experimental Psychology: Applied, 3, 143-159.

Landmarks

Vision is our dominant sense in navigation, but landmarks produce surprisingly little benefit when people are struggling to find their way in VR. Video. Key paper: Lessels, S., & Ruddle, R. A. (2005). Movement around real and virtual cluttered environments. Presence: Teleoperators and Virtual Environments, 14, 580-596.

Walking

Users get lost much more often navigating in VR than in the real world. We showed that when users physically walk through a VR world on a linear or 2D treadmill they develop significantly more accurate spatial knowledge than users who navigate using an ordinary (tethered) head-mounted display (HMD) system or desktop VR. Video. Key paper: Ruddle, R. A., Volkova, E., & Buelthoff, H. H. (2011). Walking improves your cognitive map in environments that are large-scale and large in extent. ACM Transactions on Computer-Human Interaction, 18, 2, Article 10.

Field of view

Widening the field of view from 48 to 144 degrees significantly reduced the amount of time that users spent standing in one place, planning where to travel when searching a VR world. Key paper: Lessels, S., & Ruddle, R. A. (2004). Changes in navigational behaviour produced by a wide field of view and a high fidelity visual scene. Proceedings of the 10th Eurographics Symposium on Virtual Environments (EGVE’04), 71-78.

Trails

We studied the benefits of trails during first-time navigation, and developed an algorithm to create summary trails that helped users to navigate when they revisited a VR world after an interval of 5-8 weeks. Key paper: Ruddle, R. A. (2008). Generating trails automatically, to aid navigation when you revisit an environment. Presence: Teleoperators and Virtual Environments, 17, 562-574.
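
As a loose illustration of the summarization idea (not the published algorithm), a trail can be collapsed to its first-visit backbone by cutting out loops whenever it returns to a place it has already visited.

```python
def remove_loops(trail):
    """Collapse a trail to its first-visit backbone: whenever the trail
    returns to a previously visited location, cut out the loop between
    the two visits."""
    summary, seen = [], {}
    for point in trail:
        if point in seen:
            del summary[seen[point] + 1:]  # drop the loop
            seen = {p: i for i, p in enumerate(summary)}
        else:
            seen[point] = len(summary)
            summary.append(point)
    return summary

# Grid-cell trail with one loop (B -> C -> D -> B) that a summary skips.
print(remove_loops(["A", "B", "C", "D", "B", "E"]))  # -> ['A', 'B', 'E']
```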

Maps

Using virtual seascapes, we investigated the effects on navigation of local and global maps. Once users became familiar with a seascape, a global map was sufficient by itself. Key paper: Ruddle, R. A., Payne, S. J., & Jones, D. M. (1999). The effects of maps on navigation and search strategies in very-large-scale virtual environments. Journal of Experimental Psychology: Applied, 5, 54-75.

Collaborative Navigation

In his PhD, Trevor Dodds developed some novel ‘mobile group dynamics’ techniques that helped users to collaborate as they navigated around large-scale VR worlds. Funding: EPSRC Doctoral Training Grant Studentship. Key paper: Dodds, T. J., & Ruddle, R. A. (2009). Using mobile group dynamics and virtual time to improve teamwork in large-scale collaborative virtual environments. Computers & Graphics, 33, 130-138.

Object Manipulation

I have carried out a variety of studies investigating how users manipulate objects individually and collaboratively in VR worlds. Video. Key paper: Ruddle, R. A., Savage, J. C., & Jones, D. M. (2002). Symmetric and asymmetric action integration during cooperative object manipulation in virtual environments. ACM Transactions on Computer-Human Interaction, 9, 285-308.

Non-Euclidean worlds

We have investigated how users’ mental models and navigational ability are affected when VR worlds contain non-Euclidean features such as spatial overlap or hyperlinks. Key paper: Ruddle, R. A., Howes, A., Payne, S. J., & Jones, D. M. (2000). The effects of hyperlinks on navigation in virtual environments. International Journal of Human Computer Studies, 53, 551-581.

Virtual Ballet

Using his unique professional and academic experience, Royce Neagle developed a virtual ballet dancer that understood the Laban choreography notation and interpreted different emotional themes. Funding: University of Leeds scholarship. Key paper: Neagle, R. J., Ng, K., & Ruddle, R. A. (2004). Developing a virtual ballet dancer to visualise choreography. Proceedings of the AISB 2004 Symposium on Language, Speech and Gesture for Expressive Characters, 86-97. Leeds, UK: Society for the Study of Artificial Intelligence and the Simulation of Behaviour.
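
As a rough illustration of how a notation-driven dancer might work (the mapping and numbers here are invented, not the model in the paper), Laban effort qualities can be translated into parameters for a keyframe interpolator.

```python
# Each Laban effort factor ranges between two poles; this toy mapping
# turns an effort profile into animation parameters.
EFFORT_POLES = {
    "weight": ("light", "strong"),
    "time":   ("sustained", "sudden"),
    "space":  ("indirect", "direct"),
    "flow":   ("free", "bound"),
}

def effort_to_motion(effort):
    """Map an effort profile (0.0 = first pole, 1.0 = second pole)
    to simple motion parameters (illustrative ranges)."""
    return {
        "acceleration":   0.2 + 0.8 * effort["time"],    # sudden = abrupt
        "amplitude":      0.3 + 0.7 * effort["weight"],  # strong = emphatic
        "path_curvature": 1.0 - effort["space"],         # direct = straight
        "damping":        effort["flow"],                # bound = controlled
    }

# A 'sad' theme might be sustained, light, indirect and bound.
sad = {"weight": 0.2, "time": 0.1, "space": 0.3, "flow": 0.8}
print(effort_to_motion(sad))
```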

VR Sickness

We published a meta-analysis of the levels of VR sickness that our users suffered, drawing on data from several of the above experiments. Key paper: Ruddle, R. A. (2004). The effect of environment characteristics and user interaction on levels of virtual environment sickness. Proceedings of IEEE Virtual Reality (VR’04), 141-148. Video.

World-Wide Web Research

MyWebSteps

Trien Van Do’s PhD research culminated in MyWebSteps, a Firefox add-on that uses novel interactive visualization to make it much easier for you to revisit webpages. Funding: Leeds Fully-funded International Research Scholarships (FIRS). Key paper: Do, T. V. & Ruddle, R. A. (2017). MyWebSteps: Aiding revisiting with a visual web history. Interacting with Computers.

Mapping Websites

We rely too much on ‘search’ to navigate the WWW. This exploratory research developed a method for creating 2D and 3D isometric maps of websites, using webcrawl data and a city metaphor. Key paper: Ruddle, R. A. (2010). INSPIRE: A new method of mapping information spaces. Proceedings of the 14th International Conference on Information Visualisation (IV’10), 273-279.
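
As a loose sketch of the city-metaphor idea (the grouping rule is my assumption, not the published method), crawled pages can be grouped into ‘districts’ by top-level URL path, with building heights driven by in-link counts.

```python
from collections import defaultdict
from urllib.parse import urlparse

def city_map(crawled_links):
    """Group crawled pages into 'districts' by top-level URL path and
    give each page a building 'height' based on its in-link count."""
    in_links = defaultdict(int)
    pages = set()
    for src, dst in crawled_links:
        pages.update((src, dst))
        in_links[dst] += 1
    districts = defaultdict(list)
    for page in pages:
        path = urlparse(page).path.strip("/")
        district = path.split("/")[0] or "home"
        districts[district].append((page, 1 + in_links[page]))
    return dict(districts)

links = [("https://example.org/", "https://example.org/research/vis"),
         ("https://example.org/", "https://example.org/teaching/hci"),
         ("https://example.org/research/vis", "https://example.org/teaching/hci")]
for district, buildings in city_map(links).items():
    print(district, buildings)
```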

Revisiting Websites

This is one of very few user experiments that have investigated revisiting using a website with which the users were very familiar. In our case it was an intranet that they had used for 8-20 months. Key paper: Ruddle, R. A. (2009). How do people find information on a familiar website? Proceedings of the 23rd BCS Conference on Human-Computer Interaction (HCI’09), 262-268.

Other Research

Cognitive Route Recommendation

Sarah Cook’s PhD research made good progress toward a ‘cognitive’ method that recommends routes for everyday, leisure and tourist journeys, based on attributes such as route length, turns, decision points, vegetation, land use, dwellings and points of interest. Funding: EPSRC Doctoral Training Grant Studentship. Key paper: Cook, S., & Ruddle, R.A. (2014). Effect of simplicity and attractiveness on route selection for different journey types. Proceedings of Spatial Cognition (SC 2014), 190-205.
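
A minimal sketch of attribute-weighted route scoring, with invented weights and attribute values, illustrates how journey type can change which route is recommended; this is an illustration, not the method developed in the PhD.

```python
def route_score(route, weights):
    """Score a candidate route as a weighted sum of its attributes;
    lower is better, because most attributes here are costs."""
    return sum(weights[attr] * value for attr, value in route.items())

# Hypothetical weights per journey type: a tourist trip tolerates extra
# length and turns, but rewards point-of-interest density far more.
weights = {
    "everyday": {"length_km": 1.0, "turns": 0.5, "decision_points": 0.8, "poi": -0.1},
    "tourist":  {"length_km": 0.3, "turns": 0.2, "decision_points": 0.4, "poi": -1.0},
}
routes = {
    "direct": {"length_km": 2.0, "turns": 2, "decision_points": 2, "poi": 1},
    "scenic": {"length_km": 3.5, "turns": 8, "decision_points": 6, "poi": 8},
}
for journey, w in weights.items():
    best = min(routes, key=lambda r: route_score(routes[r], w))
    print(journey, "->", best)  # everyday -> direct, tourist -> scenic
```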
