Summary
Skeletal output images and joint coordinate and angle CSV data files from the Timed Up and Go (TUG) study undertaken as part of the IMACTIVE Project.
Keywords:
Human-Robot Interaction
Creators:

Contributors:

Academic units:
Faculty of Science, Technology and Arts (STA) > Academic Departments > Department of Computing
Funders:
| Funder Name | Grant Number | Funder ID |
|---|---|---|
| EPSRC | EP/W031809/1 | |

Copyright Holders:
Sheffield Hallam University, EPSRC
Publisher of the data:
SHU Research Data Archive (SHURDA)
Publication date:
31 July 2024
Data last accessed:
No data downloaded yet
DOI:
http://doi.org/10.17032/shu-0000000208
SHURDA URI:
https://shurda.shu.ac.uk/id/eprint/208
Types of data:
Dataset
Collection period:
| From | To |
|---|---|
| 5 March 2024 | 21 March 2024 |

Geographic coverage:
Sheffield, United Kingdom
Data collection method:
Participants from the Sheffield region, aged between 30 and 80 years, completed the Timed Up and Go (TUG) test. Participants were recorded using Turtlebot4 and PAL ARI robots. The joint coordinates were obtained using ROS4HRI and MoveNet, and the skeletal output images were created by MoveNet.
The study was conducted in the Motion Analysis Laboratory at the Advanced Wellbeing Research Centre in Sheffield, UK. Participants were provided with a Participant Information Sheet prior to the study and, after an initial briefing, were asked to sign a consent form. During the briefing, participants were shown the two robots and given a description of the TUG test and the data that the robots would be collecting.

The TUG test involved the participant starting from a seated position; when instructed, they would stand up, walk forward 3 metres to a cone, turn around (in either direction), walk back to the chair, and sit down. Participants were asked to complete the TUG test a minimum of 5 times and were offered the chance to rest between tests. Participants were also offered the opportunity to complete additional tests if they felt physically able to do so, and were informed that this was entirely voluntary. Participants completed a trial run of the TUG test before data collection commenced.

After any questions had been answered, participants were provided with the QTUG sensors and straps to secure the sensors to their legs. Participants were instructed to place the sensors on their legs just below the knee, with the sensors facing forwards, and to tighten the straps so that they were comfortable and remained in this position during the study.
The skeleton tracking software was initiated on the Turtlebot4, followed by the PAL ARI. Once these were confirmed to be stable, participants were given a 3-second countdown to begin the TUG test, at which point the QTUG recording was started via the tablet application. As the participant walked towards and away from the cone, the Turtlebot was controlled by teleoperation and rotated to keep the person within the field of view. The ARI has a built-in tracking function that rotates its head to keep the person in frame; however, its field of view was wide enough that head rotation was not required. Once the participant returned to the seat, the QTUG recording was stopped and the data saved. The joint coordinates were stored in CSV files.
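The record does not document the schema of the joint coordinate and angle CSV files. As a hedged illustration of how they might be used, the sketch below loads one file and computes a per-frame knee angle from hip, knee, and ankle keypoints; the file name and column headers (e.g. left_hip_x) are assumptions for illustration only, not the archive's actual layout.

```python
import numpy as np
import pandas as pd

def joint_angle(a, b, c):
    """Angle at vertex b (degrees) formed by points a-b-c, e.g. hip-knee-ankle."""
    ba = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bc = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos = np.dot(ba, bc) / (np.linalg.norm(ba) * np.linalg.norm(bc) + 1e-9)
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical file name and column headers -- check the archived CSVs for the real schema.
df = pd.read_csv("tug_participant01_run1.csv")

# Per-frame left knee flexion angle from assumed hip/knee/ankle coordinate columns.
knee_angles = [
    joint_angle(
        (row["left_hip_x"], row["left_hip_y"]),
        (row["left_knee_x"], row["left_knee_y"]),
        (row["left_ankle_x"], row["left_ankle_y"]),
    )
    for _, row in df.iterrows()
]
print(f"Mean left knee angle: {np.mean(knee_angles):.1f} degrees over {len(df)} frames")
```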
Data processing and preparation activities:
The raw data are unmodified; the analysed data have been filtered to remove anomalies.
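The archive does not state how anomalies were identified. As one plausible, clearly hypothetical approach, the sketch below drops frames whose MoveNet keypoint confidence falls below a threshold and median-filters single-frame coordinate spikes; the threshold, score column, and coordinate column names are all assumptions.

```python
import pandas as pd
from scipy.signal import medfilt

CONF_THRESHOLD = 0.3  # assumed cut-off; MoveNet reports a per-keypoint confidence score in [0, 1]

def clean_track(df: pd.DataFrame, score_col: str = "left_knee_score") -> pd.DataFrame:
    """Drop low-confidence frames and smooth isolated coordinate spikes."""
    df = df[df[score_col] >= CONF_THRESHOLD].copy()   # discard low-confidence detections
    for col in ("left_knee_x", "left_knee_y"):        # assumed coordinate column names
        df[col] = medfilt(df[col].to_numpy(dtype=float), kernel_size=5)
    return df
```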
Resource language:
English
Metadata language:
English
Depositing User:
Matthew Story
Date Deposited:
18 Sep 2024 09:12
Last Modified:
18 Sep 2024 09:12
Files
Full Archive
- Description: UNSPECIFIED
- Downloadable by: On request only
- License: Non Disclosure Agreement
- Type: Archive
- Mime-Type: application/zip
- File size: 149MB
- Software Application: UNSPECIFIED
- Version of Software Application: UNSPECIFIED