Finding your lab feet

Today I want to write a little about my frustrations in learning how to use different versions of equipment you can already use. I’m currently working on advancing a method of identifying the Achilles tendon moment arm using ultrasound and motion capture. In both my undergrad and masters training, we learnt how to use MoCap – with the Qualisys system in a very guided, step-by-step process in undergrad, and with Vicon and a lot more freedom in my masters. I felt I had a relatively competent grasp of how to collect movement data using a passive 3D motion capture system. And then I started my PhD.


I came into a lab at a time when very little data collection was actually going on, and had to figure out how to use the systems here predominantly solo (with a little guidance from our lab tech). We run Eagle cameras with the Cortex software from Motion Analysis Corp, neither of which I had prior experience with.


As with all motion systems, the first step was calibration. After opening all the cupboards in the lab, I found our calibration L-frame and wand (in hindsight, asking our lab tech where things are should have been my first step. Oh well, you live and learn!). I figured calibrating would be the same as before – place the L-frame at the center of the capture space (normally on the force platform), then enter the capture space and wave the wand high and low, making sure all the cameras see it sufficiently. This is how it works, but it took a few failed attempts before I found the screen in the software that shows all camera views and turns green when each camera is properly calibrated (data views, all cameras, for those of you querying this).


Next up, of course, is data capture. My current testing involves markers on rods and an ultrasound probe, so this is what I learnt the system with. A few trial-and-error attempts later, I’d figured out the system, though I’ll feel more confident once I’ve run a few more pilot tests. My main apprehension at the minute is processing the data in Cortex after it’s been collected – especially with dynamic tasks, which carry the risk of marker dropout. I’ve been doing my analysis in MATLAB (which, again, is a whole new set of skills I’ve had to learn), and have been surprisingly pleased with how I’ve been getting on. It’s always a wonderful feeling when you show your advisor something they didn’t previously know – even if it’s something as obscure as playing a video in MATLAB!
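Marker dropout leaves gaps in a trajectory, and short gaps are usually bridged by interpolating between the last good frame before the gap and the first good frame after it. Cortex (like most mocap packages) has its own, more sophisticated gap-filling tools; this is just a minimal sketch of the idea in plain Python, with a hypothetical function name and a linear fill on a single marker coordinate:

```python
def fill_gaps(trajectory, max_gap=10):
    """Linearly interpolate short runs of None in one marker coordinate.

    Gaps longer than max_gap frames are left unfilled, since linear
    interpolation becomes unreliable over long dropouts.
    """
    filled = list(trajectory)
    n = len(filled)
    i = 0
    while i < n:
        if filled[i] is None:
            start = i                      # first missing frame of the gap
            while i < n and filled[i] is None:
                i += 1
            end = i                        # first valid frame after the gap
            # only fill interior gaps that are short enough
            if 0 < start and end < n and (end - start) <= max_gap:
                a, b = filled[start - 1], filled[end]
                span = end - start + 1
                for k in range(start, end):
                    t = (k - start + 1) / span
                    filled[k] = a + (b - a) * t
        else:
            i += 1
    return filled


# e.g. fill_gaps([0.0, 1.0, None, None, 4.0]) -> [0.0, 1.0, 2.0, 3.0, 4.0]
```

Leading and trailing gaps are left alone here, since there is nothing to interpolate between; in practice those frames usually just get cropped.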


We also had to figure out the ultrasound system – a Telemed LogicScan 128. This is a PC-based system that plugs in via USB, with all the controls on the computer, which makes saving data and editing settings nice and straightforward. For the most part, learning this has been fairly intuitive despite my having very little prior experience with ultrasound (observing its use in a single class in my masters). Synchronizing the two systems is still a problem we’re figuring out – while we have an analog signal that shows when the ultrasound is collecting data (on/off), it would be excellent to get the actual sample rate of the ultrasound system (frames/second) so it can be linked to the MoCap data accurately.
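To make the on/off signal concrete: the mocap system samples that trigger line at its analog rate, so the sample where the signal first goes high marks the ultrasound start time on the mocap clock, and from there each ultrasound frame can be stamped if you assume a fixed frame rate. A rough sketch of that alignment in plain Python – the function name, 2.5 V threshold, and the frame rate in the example are illustrative assumptions, not our actual settings:

```python
def ultrasound_frame_times(trigger, analog_rate_hz, us_fps, n_frames,
                           threshold=2.5):
    """Estimate a timestamp for each ultrasound frame on the mocap clock.

    trigger        -- analog trigger voltages sampled at analog_rate_hz
    analog_rate_hz -- mocap analog sampling rate (samples/second)
    us_fps         -- assumed ultrasound frame rate (frames/second)
    n_frames       -- number of ultrasound frames recorded
    Returns times in seconds relative to the start of mocap capture.
    """
    # find the first analog sample where the trigger goes high
    start_idx = next(i for i, v in enumerate(trigger) if v > threshold)
    t0 = start_idx / analog_rate_hz        # ultrasound start on mocap clock
    # stamp each frame assuming a constant frame interval
    return [t0 + k / us_fps for k in range(n_frames)]


# e.g. trigger high from sample 100 at 1000 Hz analog rate:
# the first ultrasound frame lands at roughly t = 0.1 s on the mocap clock,
# and subsequent frames follow at 1/us_fps intervals.
```

The weak link is exactly the one in the paragraph above: if the true frame rate drifts from the assumed `us_fps`, the timestamps drift with it, which is why getting the real rate out of the ultrasound system matters.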


That’s a little overview of the equipment I’m using and some of the challenges I’ve faced in getting to grips with it. I’m hoping my next post will also be informative – I recently submitted an IRB (ethics) proposal, so I’ll detail that process soon.