Maritime Jobs

Training Tips for Ships #14: Collect, Analyze Data to Improve Training

July 24, 2020

© Pixelbliss/AdobeStock


As I wrote in last month’s Training Tips for Ships, “If you do not measure it, you cannot manage it. This is especially true with training because the inputs (the training provided) are often far removed in time and apparent direct causality from the key output (performance). Therefore, it is important that we measure everywhere we can.”

To address this problem, we discussed the value of student evaluations of their learning experiences and included some advice on best practices for their delivery. But typical student evaluations are just the beginning. In fact, there are many excellent sources of data which are easy to access and collect, and which can yield tremendous improvement in training experiences and outcomes. Let’s look at a few of these now.

Ask Your Students (Again)!

Traditional student evaluations are delivered at the end of a course and (as suggested in last month’s TTforS) about halfway through it. One advantage of that timing is that instructors have good access to their learners – they are still in the class! However, this timing is not necessarily the best for producing insightful answers. Instead, or in addition, some advocate asking for student evaluations well after the course is finished – say, three to nine months later.

The value of conducting another evaluation after some time has passed is easy to understand. By waiting, the learner has had an opportunity to determine whether the material could be applied helpfully to their job activities. This is a critically important measure of any learning: while students can speculate about the answer right after a course, the truth is only known once they have had time to apply what was taught and reflect on its value. Therefore, when able, it is a very useful practice to conduct a short follow-up evaluation some months after the training. Questions should focus on how applicable the learning proved to be on the job, any lasting benefits of the course, and any remaining advice for improvement.
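Once follow-up responses are collected, turning them into a simple metric is straightforward. The sketch below is a minimal, hypothetical example in Python: the field names and the 1–5 usefulness scale are illustrative assumptions, not a standard survey schema.

```python
# Hypothetical follow-up evaluation records, one dict per respondent.
# Field names and the 1-5 "usefulness" scale are illustrative assumptions.
responses = [
    {"course": "ECDIS Refresher", "applied_on_job": True,  "usefulness": 4},
    {"course": "ECDIS Refresher", "applied_on_job": True,  "usefulness": 5},
    {"course": "ECDIS Refresher", "applied_on_job": False, "usefulness": 2},
]

def summarize(responses):
    """Return the share of respondents who applied the training on the job
    and their average usefulness rating (1-5 scale)."""
    n = len(responses)
    applied_rate = sum(r["applied_on_job"] for r in responses) / n
    avg_usefulness = sum(r["usefulness"] for r in responses) / n
    return {"applied_rate": applied_rate, "avg_usefulness": avg_usefulness}

print(summarize(responses))
```

Tracked over successive course offerings, even two numbers like these make it easy to see whether changes to a course are actually improving on-the-job outcomes.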

Ask the Trainee’s Supervisors:

A supervisor is often the best judge of the competency of their reports. Therefore, an often overlooked but very useful source of information on the effectiveness of training is the learner’s supervisor. As above, some time should elapse between the end of training and the questions posed to the supervisor; this gives the effects of training time to reveal themselves. Once it has, we can ask the supervisor whether there are any observable improvements in professionalism, competency, capabilities, knowledge, etc., relevant to the subject of the course. This is also an excellent opportunity to ask the supervisor to identify gaps they see in their reports that training might be able to close.

Asking the supervisor has the additional benefit of making the supervisors first-order participants in the quest for excellence through improved training. Training of an individual is something that benefits the entire organization and all employees should not only have a voice, but should feel listened to and involved. Asking a supervisor to comment on the training of one of his or her reports is an excellent way to gather data, to demonstrate inclusivity and to generate broader interest in training optimization.

Ask the Instructor:

Finally, it has always surprised me that it is exceedingly uncommon to ask the instructor for feedback at the end of the course. That is a missed opportunity.

First, the instructor is very well positioned to know what went well and what requires improvement. Second, he or she often has the benefit of being able to compare the most recent offering with past offerings they have taught – a perspective students lack. Third, instructors can usually make concrete suggestions on how to address the shortcomings observed.

By asking the instructor for their feedback, including asking how the organization can better support excellence in the course, we are reinforcing the notion of shared responsibility for the learning process. We are reminding all parties that they are accountable for excellence in learning and we are creating a document that identifies ways in which this can be achieved.

Conclusion:

These are just a few examples of data sources that are incredibly useful, but often overlooked in our quest to improve training experiences and outcomes. Incorporating them as metrics in your continuous improvement process is easy and highly effective.
Stay healthy and sail safe!
