I don't think it is down to luck, but we have certainly been fortunate in the way our training is structured.
Looking at how individuals best acquire knowledge, much depends on the relevance the attendee places on the topic at hand.
As a result, the trainer must deliver war stories along with a demonstration of the shrapnel scars.
As you grow the number of trainers, it becomes harder to ensure consistency in the delivery of war stories unless you script them into the material as generic case studies. Instantly, there can be a perceived lack of relevance due to them being prompted rather than spontaneous anecdotes. It is as if the more ad hoc a story is, the more believable it is, while metrics collected over years that demonstrate the value of a particular technique (think software inspections) somehow lose their validity because of the way they are repeated.
Perhaps it is a question of why metrics are treated with scepticism, but equally perhaps it demonstrates that transparent and open storytelling about the pros and cons of an approach, with the emotional attachment that comes with it, is what convinces people during training to adopt a particular technique.
With that in mind, delivering training around a case study relies on finding a case study with a fluffy, 'and finally' news-item quality about it. It needs to engage people on an emotional level.
As the diagram above suggests, you provide students with a case study that poses a puzzle or problem. You then suggest a high-level solution to the problem. For example, the problem is that the benefits of testing are questioned. The high-level solution is to provide examples of how testing helps the organisation, and since most organisations focus on revenue and costs, the benefit of testing is expressed by identifying cost savings. The detailed solution is found by optimising testing to address the most critical areas, and hence a solution emerges.
This allows a natural flow, as the next problem would be "you are asked to define a process to identify the most critical areas of the system to help ensure the effectiveness of testing". High-level solution: risk assessment. Detailed solution: use of quality attributes, risk catalogues, technical evaluation and so on, and then apply EDIP (Explanation, Demonstration, Imitation, Practice) to one technique.
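To make the flow concrete, here is a minimal sketch in Python of how one of these case-study modules might be structured as data. The names (`CaseStudyModule`, `EDIP_STEPS`, `lesson_plan`) are my own illustration, not anything from the course material; it simply shows how each module chains a problem to a high-level solution, a detailed solution, and the four EDIP steps applied to one technique.

```python
from dataclasses import dataclass

# The four EDIP stages applied to a single chosen technique.
EDIP_STEPS = ["Explanation", "Demonstration", "Imitation", "Practice"]

@dataclass
class CaseStudyModule:
    problem: str
    high_level_solution: str
    detailed_solution: str
    practised_technique: str

    def lesson_plan(self):
        """Expand the module into an ordered list of delivery steps."""
        plan = [
            f"Problem: {self.problem}",
            f"High-level solution: {self.high_level_solution}",
            f"Detailed solution: {self.detailed_solution}",
        ]
        # Finish by walking one technique through all four EDIP stages.
        plan += [f"{step}: {self.practised_technique}" for step in EDIP_STEPS]
        return plan

# The risk-assessment module described above, as an example.
risk_module = CaseStudyModule(
    problem="Define a process to identify the most critical areas of the system",
    high_level_solution="Risk assessment",
    detailed_solution="Quality attributes, risk catalogues, technical evaluation",
    practised_technique="Risk catalogues",
)

for line in risk_module.lesson_plan():
    print(line)
```

Each module's detailed solution then naturally poses the next problem, which is what gives the course its flow.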
It's got me thinking, as the benefits include an improved flow through a training course, as well as an improved learning outcome due to the increase in the perceived relevance (notice I added the word perceived) of the training content, making the material more, for want of a better word, believable.
I'm sure I'm missing a trick with the various educational approaches out there, considering humans have taught each other for thousands of years, but there is a huge shortage of competent trainers in IT. I can remember a trainer called William who had worked with some of my colleagues to develop HLPlex (High Level Language for Exchanges), and I had the utmost respect for him; not only did he know his subject matter like he knew his bowel movements, but he had the ability to inject learning into brains. So many trainers lack either the subject matter expertise or the capability of putting information out there and inspiring others.
Too many people consider training, especially IT training, an easy gig, but we are change merchants. Somehow, we have to overcome the cultural inertia that exists within organisations, groups and the minds of individuals. Good trainers have to make their ideas so palatable through relevance, emotion and approachability that the concepts presented are accepted without question.
Like I said, there is a dearth of good presenters, hence we need to come up with a way for average testers to be able to deliver content while overcoming that inertia. If there is a reliable process or technique for getting information across to a variety of individuals, then I have to find it.