Sunday, 21 March 2010

Hot Pixels

After having my Nikon D90 for about 14 months now, I've noticed two hot pixels. At ISO 200 with shutter speeds slower than 1/125s they are visible. Above ISO 1600 you notice a few more, but below ISO 800 only the two are obvious.

This isn't a surprise. I was more surprised that there hadn't been any issues with the pixels before now. What got me was that, apart from taking the camera to a Nikon service centre, there is no way to re-map the sensor so that the offending pixels are turned off and the surrounding pixels interpolated. The Olympus cameras apparently have a built-in system, and Nikon have a system that removes such artefacts, but it only works for long exposures.
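
Just to make the idea concrete, here is a rough sketch in Python of what re-mapping amounts to in principle; this is a post-processing illustration, not what the camera firmware actually does, and the coordinates and the simple neighbour-mean are my own assumptions for the example.

    import numpy as np

    def remap_hot_pixels(image, hot_coords):
        """Replace each listed hot pixel with the mean of its neighbours.

        image      : 2D numpy array of raw sensor values
        hot_coords : list of (row, col) tuples for the known hot pixels
        """
        fixed = image.astype(float)
        for r, c in hot_coords:
            # 3x3 window around the pixel, clipped at the image edges
            window = image[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2].astype(float)
            total = window.sum() - image[r, c]   # leave out the hot pixel itself
            fixed[r, c] = total / (window.size - 1)
        return fixed

    # Hypothetical usage with two made-up pixel positions:
    # dark = load_raw_somehow()                  # 2D array from a raw file
    # clean = remap_hot_pixels(dark, [(1203, 2411), (987, 150)])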

Nikon don't provide any utilities for pixel re-mapping, which, considering it would save their service centre staff a lot of aggravation, doesn't make business sense. A guy from Russia has written his own remapping software for the Nikon Coolpix, which is available here, and there was a tool around for the D100, yet there isn't anything available for the current digital SLRs.

I'm almost thinking of writing my own tool for this. The Russian gent has posted details about the protocol to view and perform the pixel mapping here, but it is probably just too much effort. To be honest, if Nikon service technicians already have this, what is preventing Nikon from providing it to consumers, albeit with a more user-friendly interface? A simple confirmation step before changing the EEPROM would be sensible for your average user: display a zoomed area of a black frame beforehand showing the hot pixels, and the expected behaviour afterwards, with the correction hopefully applied to the correct pixels.
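
The detection half of that confirmation step is also easy enough to sketch. Assuming the dark frame can be read into a numpy array, candidate hot pixels are simply values that sit well above the rest of the frame; the threshold here is an arbitrary assumption, not anything Nikon uses.

    import numpy as np

    def find_hot_pixels(dark_frame, threshold=None):
        """Return (row, col) positions of suspiciously bright pixels in a dark frame.

        dark_frame : 2D numpy array captured with the lens cap on
        threshold  : absolute cut-off; defaults to mean + 10 standard deviations,
                     which is an arbitrary assumption for this sketch
        """
        frame = dark_frame.astype(float)
        if threshold is None:
            threshold = frame.mean() + 10 * frame.std()
        rows, cols = np.where(frame > threshold)
        return list(zip(rows.tolist(), cols.tolist()))

    # Hypothetical usage: print a small crop around each suspect so the user can
    # confirm what they are looking at before anything touches the EEPROM
    # dark = load_raw_somehow()
    # for r, c in find_hot_pixels(dark):
    #     print((r, c), dark[max(r - 2, 0):r + 3, max(c - 2, 0):c + 3])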

Anyways, I'll keep hunting in the hope that someone has posted the Nikon software online somewhere.




Wednesday, 3 March 2010

Training Steps

I had a bit of a think while at the ANZTB conference (which has nothing to do with infectious diseases of the lungs) about how to approach training.

I don't think it is down to luck, but we have certainly been fortunate in the way our training is structured.

Looking at the best way for individuals to acquire knowledge, much rests on how relevant the attendee considers the topic at hand.

As a result, the trainer must deliver war stories along with a demonstration of the shrapnel scars.

As you grow the number of trainers, it becomes harder to ensure consistency in the delivery of war stories unless you start inserting them deliberately and turning the stories into generic case studies. Instantly there can be a perceived lack of relevance, due to them being prompted rather than spontaneous anecdotes. It is as if the more ad hoc a story is, the more believable it is, while metrics collected over years that demonstrate a particular technique (think software inspections) somehow lose their validity because of the way they are repeated.

Perhaps it is a question of why metrics are treated with scepticism, but equally perhaps it demonstrates that transparent and open storytelling about the pros and cons of an approach, with the emotional attachment that comes with it, is what convinces people during training to adopt a particular technique.

With that in mind, delivering training around a case study relies on finding a case study that has a fluffy, 'and finally' news-item quality about it. It needs to engage people on an emotional level.

As the diagram above suggests, you provide a case study to students, which poses a puzzle or problem. You then suggest a high-level solution to the problem. For example, the problem is that the benefits of testing are questioned. The high-level solution is to provide examples of how testing helps the organisation; with most organisations focusing on revenue and costs, the benefit of testing is shown by identifying cost savings. The detailed solution is found by optimising testing to address the most critical areas, and hence a solution is reached.

This allows a natural flow, as the next problem would be "you are asked to define a process to identify the most critical areas of the system to help ensure the effectiveness of testing". High-level solution: risk assessment. Detailed solution: use of quality attributes, risk catalogues, technical evaluation etc., and then EDIP (Explanation, Demonstration, Imitation, Practice) on one technique.
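
Purely to make that flow concrete for myself, the steps could be captured in something like the structure below; the field names and the second example's EDIP technique are my own shorthand, not a formal model.

    from dataclasses import dataclass

    @dataclass
    class TrainingStep:
        """One pass through the case study flow: problem, then solutions."""
        problem: str               # the puzzle posed to the students
        high_level_solution: str   # the direction suggested by the trainer
        detailed_solution: str     # the concrete technique(s) that resolve it
        edip_technique: str = ""   # the one technique taken through EDIP

    course = [
        TrainingStep(
            problem="The benefits of testing are questioned",
            high_level_solution="Show how testing helps the organisation via cost savings",
            detailed_solution="Optimise testing to address the most critical areas",
        ),
        TrainingStep(
            problem="Define a process to identify the most critical areas of the system",
            high_level_solution="Risk assessment",
            detailed_solution="Quality attributes, risk catalogues, technical evaluation",
            edip_technique="Risk assessment workshop",  # assumed example
        ),
    ]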

It's got me thinking, as the benefits include an improved flow through a training course, as well as an improved learning outcome due to the increased perceived relevance (notice I said perceived) of the training content, making the material more, for want of a better word, believable.

I'm sure I'm missing a trick with the various educational approaches that are out there, considering humans have taught each other for thousands of years, but there is a huge shortage of competent trainers in IT. I can remember a trainer called William who had worked with some of my colleagues to develop HLPlex (High Level Language for Exchanges), and I had the utmost respect for him; not only did he know his subject matter like he knew his own bowel movements, but he had the ability to inject learning into brains. So many trainers lack either the subject matter expertise or the capability of putting information out there and inspiring others.

Too many people consider training, especially IT training, an easy gig, but we are change merchants. Somehow, we have to overcome the cultural inertia that exists within organisations, groups and the minds of individuals. Good trainers have to make their ideas so palatable, through relevance, emotion and approachability, that the concepts presented are accepted without question.

Like I said, there is a dearth of good presenters, hence we need to come up with a way for average testers to be able to deliver content while overcoming the inertia aspect. If there is a reliable process or technique for getting information across to a variety of individuals, then I have to find it.


Tuesday, 2 March 2010

eHealth security

I've had a fair bit of involvement with IT solutions for health, looking at everything from process improvement of the development methodology for a system integrating patient diagnostic imagery, to VoIP upgrades, to compliance and conformance of eHealth systems.

And while the vast majority of the work is considered from a testing perspective, there is always an element where I have to consider operational procedures and compliance with any standards that cover such things.

This article highlights the issue. Even with all the debate regarding patient safety and security, and even with what you might consider a professional and responsible organisation, Medicare has shot itself in the foot by not ensuring patient privacy.

Having had my notebook stolen by a Federal Government employee once, I might be predisposed to expecting all employees to possess questionable ethics, but actually my expectation is that those who work for government fully understand the significance of the role of a public servant. The responsibility is in the title itself: the person is there to serve the public. This person is entrusted with the records of citizens to provide a variety of services, from health, education and aged care to welfare payments.

A single act unfortunately tarnishes the group as a whole, plus it places doubt in people's minds regarding the intent of the initiatives that public servants try to implement. In this case, people will question the advertised benefits of introducing the Health Identifier if those proposing, implementing, controlling or using it are likely to misuse the information provided to them in confidence.

So while I continue to look at the conformance of new eHealth systems, I have to keep a beady eye on the compliance, and the risk associated with non-compliance, of operational procedures and guidelines; not because it's fun, but because the benefits of eHealth systems will never be realised if the scope is scaled back or additional guidelines are put in place to reduce the risk of a few individuals misusing the system.