Vicon, the Oxford-headquartered world leader in motion capture, has released a significant update to its motion analysis mobile app, Capture.U 1.4, to cater for the growing demand from the biomechanics community for support in applying inertial technology to their research and studies.
Inertial technology measures and reports a body's specific force and angular rate, and sometimes its orientation.
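As an illustration of what that means in practice, a gyroscope inside an IMU reports angular rate, and an orientation estimate can be obtained by integrating that rate over time. The sketch below is purely illustrative; the sample values, names and sample rate are hypothetical and do not reflect Vicon's hardware or API:

```python
import math

# Hypothetical gyroscope samples: angular rate about one axis in rad/s,
# recorded at a fixed sample rate. Illustrative values, not real IMU data.
SAMPLE_RATE_HZ = 100.0
angular_rates = [0.5] * 100  # a constant 0.5 rad/s held for one second


def integrate_orientation(rates, sample_rate_hz, initial_angle=0.0):
    """Estimate orientation (radians) by integrating angular rate over time."""
    dt = 1.0 / sample_rate_hz
    angle = initial_angle
    for rate in rates:
        angle += rate * dt  # simple rectangular (Euler) integration
    return angle


angle = integrate_orientation(angular_rates, SAMPLE_RATE_HZ)
print(f"Estimated rotation: {math.degrees(angle):.1f} degrees")
```

Real IMU processing adds accelerometer and magnetometer data and sensor-fusion filtering to correct the drift that accumulates in a raw integration like this one.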
Established in 1984, Vicon is an award-winning provider of motion capture systems for the life sciences, media and entertainment, location-based virtual reality and engineering industries.
Vicon is a subsidiary of Oxford Metrics, the international software company servicing government, life sciences, entertainment and engineering markets, which also owns Yotta, a provider of software and services for infrastructure asset management.
With the release of Capture.U 1.4, Vicon is offering an accessible and practical way to learn about inertial measurement units (IMUs). For universities and schools, it provides a means to build on theory-based teaching with applied learning, as students apply the sensors directly to human movement.
For the physiotherapist or athletic trainer, it can be the springboard to taking objective, dynamic measurements of patients or athletes in their natural environment. All prospective users of IMUs, regardless of application, will appreciate the interactive manner in which the content is presented and will find it easier to use inertial technology in their research or studies.
Felix Tsui, Vicon’s Life Sciences Product Manager, said: “We’ve seen an increasing demand for more tools and resources that can help users of all technical backgrounds better understand what is still a relatively novel technology. Long gone are the days when IMUs were only of interest to high-level engineers — IMUs are more prominent in monitoring human movement and we need resources on them to reflect that.
“This latest development of the app makes it much simpler for users to not only understand the type of data being captured by IMUs but also to learn how they can analyse that data to interpret human movements.”