Really good article on the issues and solutions around syncing. More importantly, it has pretty pictures.
An English immigrant IT consultant in Australia. This is my journey learning to surf, consult, mountain-bike, train, hold down a decent relationship, hold my weight down, purchase as many CDs as possible before they become obsolete, grow up, grow facial hair, develop a stand-up comedy act and stand up on my own two feet.
Wednesday, 20 October 2010
Synchronize Outlook, Google Calendar, Gmail with Funambol/ScheduleWorld : clipclip
Tuesday, 19 October 2010
From closed Windows to fresh air
I've finally taken the plunge and purchased a Mac. From the ease of image manipulation to support my expensive photography hobby, to the aggregation of my personal data in one place, within a few hours I was up and running... I don't recall any MS Windows migration being so painless, even between machines, let alone between OSs.
Some top tips if you are thinking of doing it.
- iPhone - follow links in new page. Pressing and holding a link triggers a pop-up asking whether you want to open the link in a new page or the current page.
- iMac - iTunes migration. Keep all your music in one place, e.g. a NAS drive, and do not move it.
- iMac - iTunes migration. Move your iTunes library XML file across, then find/replace all references to your NAS music store location. For example, my PC path was file://localhost/X:/Music, which I replaced with file://localhost/Jupiter/Public/Shared Music/Music/.
- iMac - iTunes migration. Once the XML file from your decrepit PC has been COMPLETELY updated with the new absolute path on your sexy new iMac, import this into iTunes.
- iMac - iPhone App migration for iTunes. Move the folder "Mobile Applications" in your iTunes base directory somewhere else, then drag and drop them from Finder into iTunes. Otherwise you will receive an "Error -50" and start getting pissed off.
- iMac - general migration. Ensure you have gigabit Ethernet. Any network file transfer will occur in minutes rather than hours, and you know how many films from the skin end of the spectrum you have; it would take aeons to transfer them over 100Mb Ethernet.
- iMac - SW install. Don't do it after a glass or two of wine; you end up purchasing Lightroom 3 and other things. The package arrives tomorrow.
- iMac - SW install. Don't install the OSX updates immediately; that can wait while you get your core apps done. Kick it off after you are done, or in the morning, and let it do its thing; it might require 1-2GB of updates.
- iMac - SW config. Set Google Chrome on your PC to sync with your Google account (if you haven't got one, get one). Install Chrome on your Mac and allow it to sync too. Hey presto, Bob's your uncle, Fanny's your aunt, wudubileevit, you now have matching bookmarks across devices.
- iMac - iTunes/iPhone/Outlook. First, export your contacts from MS Outlook to CSV (bye bye!), then import them into Gmail. Gmail will give you some dialogues to help ensure you have minimal, or in my case zero, duplicates. Now update the contacts sync settings in iTunes to sync with your Gmail contacts.
- If you are tired then go to bed. No good will come of configuring your electronic life when tired. Tiredness leads to mistakes. Mistakes lead to errors. Errors lead to data loss. The road to data loss is dark and one way, and we can all feel the disturbance in the force when someone has data loss.
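The iTunes library rewrite in the tips above is just a find/replace, so it can be scripted. A minimal Python sketch, using my example paths (substitute your own; iTunes may percent-encode spaces in its Location URLs, so check the exact form in your own XML file first):

```python
from pathlib import Path

# Example prefixes from my migration; substitute your own locations.
OLD_PREFIX = "file://localhost/X:/Music"
NEW_PREFIX = "file://localhost/Jupiter/Public/Shared Music/Music"

def rewrite_library(xml_text: str) -> str:
    """Swap every music-store reference from the old prefix to the new one."""
    return xml_text.replace(OLD_PREFIX, NEW_PREFIX)

def rewrite_file(src: Path, dst: Path) -> int:
    """Rewrite the library XML, returning how many references were changed."""
    text = src.read_text(encoding="utf-8")
    changed = text.count(OLD_PREFIX)
    dst.write_text(rewrite_library(text), encoding="utf-8")
    return changed
```

Write to a copy of the XML file rather than the original, so a botched path still leaves you with the decrepit PC's library to fall back on.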
Thursday, 7 October 2010
Product Risk Management
I'm updating content for a training course. I know I spend a lot of time conveying the importance of testing, but the challenge is to do so in the most succinct and digestible way, to ensure the thought or lesson objective is retained.
I'm currently trying to communicate the importance of risk-based test strategies, and one of the biggest concepts for me is that "the main purpose of testing is to mitigate risk, therefore our testing strategy should be focused on risk mitigation, rather than purely on requirements conformance".
The basis for this is my perception that the majority of testing is founded upon the idea that the requirements as defined are to be treated as gospel, and that checking for deviation from those requirements alone is all that is required to ensure successful delivery.
In actual fact, it is the tacit knowledge, the undocumented requirements that are seemingly obvious to end users and seemingly invisible to everyone else, that should be captured and assessed. Such things as performance, usability, security, compatibility, in fact all the other -ilities, should be factored in.
So, my question: if testing is about finding defects in a system, is there a more elegant and/or accurate way to describe a test strategy than as the artefact that documents the focus of testing, namely identifying the most critical defects, the ones that would expose the system to the biggest risks?
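One way to make "focus on risk mitigation" concrete is the standard risk exposure formula, likelihood multiplied by impact, as a crude test-prioritisation driver. The feature areas and 1-5 scores below are invented purely for illustration:

```python
# Rank candidate test areas by risk exposure = likelihood x impact.
# Areas and 1-5 scores are invented purely for illustration.
areas = [
    ("payment processing", 3, 5),  # (name, likelihood, impact)
    ("report formatting",  4, 1),
    ("login/security",     2, 5),
    ("batch import",       4, 3),
]

def by_exposure(items):
    """Highest-exposure areas first: these get tested earliest and deepest."""
    return sorted(items, key=lambda a: a[1] * a[2], reverse=True)
```

The point of the exercise is less the arithmetic than the conversation it forces: the tacit, undocumented -ilities have to be scored alongside the documented requirements before anything gets ranked.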
Thursday, 23 September 2010
Medicare e-health contract in limbo | The Australian
The (U/O) HI story continues.
I don't know what the problems are, but it seems government organisations work to different time-scales than private enterprise, who are all ready to press ahead in this case.
Tuesday, 7 September 2010
ongoing GMail
Down to 41 pst files left, and I'm starting to enjoy Gmail, especially with some of the new features like Priority Inbox.
Wednesday, 1 September 2010
Moving slowly into the past
After years of trying to avoid it, I have finally bitten the bullet, faced the music and kept up with the Joneses by moving to Gmail. For a start, it runs far more quickly than MS Outlook, especially with the amount of email I have kept, but it also means I can access it anywhere, rather than having to boot up a PC to view it.
The only challenge is migrating 61 pst files, yes that's right, 61 dirty fat pst files, from my PC to the interwebs. For each project I work on, I create a new PST file, and move pertinent material from Inbox to the project PST file. This works great as I know everything will be in one place, and at the end of the project, I can close that PST from Outlook and reduce the number of emails in my face.
Now however, it is getting harder and harder to manage all of this so a cloud-based solution makes sense.
Unfortunately, unless you have a Google Docs domain, you can only upload slowly via Outlook itself, copying files into the IMAP mirror folder. There are a few scripts that can assist, but given their hit-and-miss success rate, it is easier, if rather time-consuming, to upload project by project.
16 down, 45 to go. . . . .
Thursday, 26 August 2010
Agile recipe
One of the most interesting things about Agile is that it means so many different things to so many people. The first thing I have to say is that it is just like waterfall or rigorous software methodology (RSM) projects. Yes, you heard me right; there is no difference between Agile and Waterfall.
The fundamentals of software development remain the same regardless of the process, and I tire of the Luddites who espouse the virtues of Waterfall or the evangelists who reckon there is no way other than the Agile way.
What is software development?
- A Customer requests a product.
- A Developer creates a product.
- A Tester verifies and validates that the product has integrity and is suitable to be presented back to the customer.
I've obviously simplified these roles, so the same person may wear the hat of customer and tester, plus each role might be decomposed further, but isn't the above the essence of software development?
Let's take one part of the process. What is the underlying process for a customer requesting a product?
- The customer speaks to the developer and asks for a product.
- The developer says whether or not it can be done and, if it can, agrees a time for expected delivery.
Let's look at the two approaches in their ideal implementations.
In Waterfall, or more accurately an RSM, requirements are captured and documented in a clear and unambiguous fashion, with extensive formal reviews to ensure quality prior to agreement by all parties.
In Agile, the requirements are captured individually on cards and verbally agreed between the customer and developer. Confirmatory questions and regular feedback ensure mutual understanding.
Now looking at them realistically:
In Waterfall, or more accurately an RSM, requirements are partly captured and poorly documented by people with little understanding of how to capture true business requirements or communicate the value of each feature requested, followed by a cast of thousands demanding sign-off, followed by endless changes, updates and political positioning as departments jockey to get what they want out of the budget.
In Agile, the requirements are captured individually on cards and verbally agreed between the customer and developer. Confirmatory questions and regular feedback ensure mutual understanding.
So there we have it. I've been fortunate enough to work on projects run with RSMs, and where the domain is well known and stable, say a telecommunications protocol (Q.931 for example), it has been really clear what the requirements are, and hence easy to implement and test. However, it seems the majority of IT systems involve a customer who isn't sure, a developer who doesn't always understand the domain, and a tester who is dropped in the deep end and is primarily an end user.
What Agile does is give immature IT shops, or fast-moving industries where time to delivery is key, an approach to allow the customer constant feedback to ensure the project stays on track.
What Agile does is address the inadequate implementations of RSMs employed by software firms, by ensuring communication, courage, and response to change operates more effectively. The thing is, if people followed the process correctly, then they wouldn't need to try out Agile.
My biggest concern for companies moving to Agile is not what it gives them but what it doesn't. If you don't have the staff that are capable of following a detailed rigorous process with documentation that ensures communication is maintained, what makes you think they can deliver in an Agile world?
I see a lot of organisations struggle with Agile, and in most cases it is because the calibre of the people involved is not high enough to deliver in a courageous and fast-moving environment. There are some amazing individuals in organisations; I met some testers yesterday working towards integrating testing into hybrid Scrum teams, but you can tell some of the people they deal with are better suited to a strict process regime.
My point is that Agile is not lacking in rigour; it is just that the rigour has to be applied by the individual rather than by the process. It is just as 'expensive' (the issue of cost comparison between Agile and RSM is something else that bugs me), just as hard work, but the onus is on skilled and motivated individuals, rather than average SW engineers and end-user testers.
I now worry that my wrap-up will sound more like a rant, but I suppose it summarises, or at least demonstrates, the strengths of the many different approaches to SW development.
Firstly, I do not think, nor do I believe anyone else stands by, there being only one way to develop software. There are many Agile and many RSM processes, and like a person's sexuality, there are many positions along the scale. Terrible analogy I know, but it makes sense. Actually, it doesn't, because people can't change their sexuality. I believe the process should adapt to the conditions of the project and the needs of the application. As a result, we have as many different processes as we have religions (that's better: sex, religion and software development processes in the same paragraph).
I feel an epiphany coming on.
What are the conditions for a project moving to Agile?
- Do you have highly skilled, motivated staff? (be honest now; are they really that motivated? Does drinking 6 Ouzo and cokes at lunch really count as a skill?)
- Is the domain ABSOLUTELY clear? (does the customer really know what they want?)
- Is there a formal process in place that already works? (if it is working for you, why change it?)
- Do you have the skills and infrastructure to support highly automated testing? (you ARE going to automate aren't you?)
- Do you have the support of management to move away from heavyweight documentation? (or are you really going to get your BAs, developers and testers to produce 148-page Word documents like test summary reports, negating any benefit achieved by using story cards?)
- Do you have an engaged customer? (without one you are just doing Waterfall without requirements; you're on your own now.)
- Are the developers doing unit testing? (If they aren't doing unit testing, why do you employ them?)
Now let's get back to my original point. How many of the issues above can affect both Agile and heavyweight processes alike? I think success comes down to the rigour with which an approach is implemented, rather than the approach itself.
Thursday, 29 July 2010
Cheap equipment
Unreal deal at Harris Technology for a 5 port Gigabit switch. I bought mine last week from Officeworks, only because I showed them the ht.com.au website; Officeworks were selling the D-Link DGS-1005D for $80, compared to only $48 here.
HT have the same price for the DES-1005D (10/100) as the DGS-1005D (10/100/1000) so I can only assume it is a typo.
Considering that the majority of other 5 port and 8 port switches are around 90 to 130 AUD, this is an awesome deal.
Happy days.
ATM pokie
In what the presenter suggested was a more successful attack on an ATM than John Connor's, Barnaby Jack gave a successful second-time presentation at the 2010 Black Hat Conference in Las Vegas, after last year's was pulled at the request of one of the ATM vendors. He demonstrated two exploits, one via the network connection, the other via USB. You might ask how you can get at the ATM's USB slot; well, the vendors provide a standard lock. As so often happens, it seems that only a combination of physical and software security will provide a robust system.
Really interesting demonstration and a great example of thinking outside the (strong) box.
Thursday, 22 July 2010
iPhone 4 antenna continued
The lack of testing of the iPhone's software and design is continuing to affect customers. Apple has taken steps with a news conference, and by handing out the rubber band-aid to reduce the likelihood of attenuation from shorting the two antennas and changing the resonant frequency, but there are still plenty of articles analysing the problem here and here.
Thursday, 15 July 2010
iPhone 4
Weak signal strength issues seem to be growing from a few isolated incidents, with an associated software fix, into a full-blown design issue. If, as looks increasingly likely, the antenna design means that the signal can be attenuated by shorting the two antennas out, then a redesign or some other fix is going to become necessary. The Telegraph has estimated the cost of a recall and/or fix in their article.
Apple originally claimed that the issue was down to an incorrect signal strength algorithm, thereby showing users an incorrect number of signal strength bars. That's now been shown to be a tiny part of the problem, with the real issue being in hardware, not software. Nevertheless, it is another example of where more comprehensive testing would have revealed the problem. One news article identifies that only a very small number of altered prototype phones were available for testing in the network.
Apple now claims that all phones suffer attenuation when held in a particular fashion, and that the iPhone 4 issues are just a demonstration of that. However, not all phones place an uninsulated antenna on the outside of the phone, so this explanation seems a little weak, considering the analysis this article goes into explaining the design flaw. It made me consider the possibility that, somehow, the entire phone could accidentally be destroyed by an electric charge touching the antenna. I'm surprised the whole thing is exposed, as any grounding or connection to another electrical source could damage the phone. Most phone antennas are internal these days, but when they were external (think Ericsson GH337) the antenna was housed in rubber.
Regardless of any internal circuit protection, the fact that you can change the length of the GSM/UMTS antenna by shorting it with the WiFi antenna, thereby changing the resonant frequency, is more than just an issue of your fat-fingered phone-holding style; it's bad design. Sadly it seems the aesthetics were more important than the functionality, according to another story.
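The physics the articles describe can be sketched with the textbook quarter-wave monopole formula, f = c / 4L. This is a toy model with made-up lengths, not the iPhone's actual antenna geometry, but it shows why lengthening an antenna element detunes it:

```python
C = 299_792_458.0  # speed of light, m/s

def quarter_wave_resonance(length_m: float) -> float:
    """Resonant frequency of an idealised quarter-wave monopole antenna."""
    return C / (4.0 * length_m)

# An element of roughly 8.3 cm resonates near the GSM 900 band...
f_tuned = quarter_wave_resonance(0.083)

# ...but electrically lengthening it, e.g. by bridging it to a second
# element with a damp finger, drags the resonance well below that band.
f_detuned = quarter_wave_resonance(0.083 * 1.5)
```

A real handset antenna is a folded, loaded structure rather than a straight monopole, but the direction of the effect is the same: longer effective length, lower resonant frequency, weaker signal at the band you actually care about.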
Monday, 28 June 2010
Unit testing and training; a day to reflect and update.
I can't believe how long it has taken me to get around to converting the JUnit 3 tests to JUnit 4, and to transferring the exercises on our training courses to be set around Eclipse. Sad, I know, but without an appropriate lull in other work, and without a forthcoming course that would benefit from the change, I never got around to doing the updates.
It didn't even take that long, but I'm off to PNG at the end of the week, and I thought it was an opportunity to update the pracs on the unit testing exercises. Previously, these were arranged to run from the command line, using Ant. It was a simple transfer to Eclipse, with my main concern being how Eclipse managed the project.
If any of the classpath definitions were absolute rather than relative, then anything I set up in a project would fail once I transferred Eclipse from my PC to another. In the end, the .classpath and .project files look simple and are based on the current node as the root.
This way, I can take my install of Eclipse, transfer to memory sticks, along with the exercises, and not worry about configuration problems.
As I only look at Java a few times a year, I end up ramping up over time to get back to the same level of comfort. The tool integration provides a great opportunity to demonstrate further approaches with greater ease, such as easier refactoring, clearer code coverage reports, and greater relevance to attendees' work experience.
With that in mind, I am reviewing the code coverage tools I added as plugins, in the hope that there is something as sophisticated as Clover but that costs the same as a coffee; at the moment, most of the tools cost the same as a coffee machine.
Ideally I'd like to have centralised training servers with SVN and CI tools running, but without a regular need, it makes it hard to justify.
Now I've made the change with the tools, I can look at enhancing the content: increasing the information on mock objects, providing greater coverage of refactoring, and demonstrating the ease with which you can write unit test cases, rather than having to assist students in editing code using Notepad.
Friday, 25 June 2010
skulim tisa wokim studens amamas, or 'How to keep students happy'
Off to PNG in a few days to do some CSTP and developer training. One of the most interesting aspects is the genuine and honest nature of the folks there. They are incredibly polite, to the point that it can be hard to extract information from them when they are placed as students in a classroom. The poorly translated title is my question.
It may well be that I'm not reading the situation correctly, but I'm interpreting their reticence to speak as respect for the teacher, or perhaps a product of the way education is delivered in PNG. I like to engage with students for a number of reasons.
- To confirm they are assimilating the information.
- To identify areas, concepts and topics they are not comfortable with.
- To ensure they are within their comfort zone, and are able to ask me further questions.
- To be able to gauge when a break is required.
- To understand if the material presented is relevant to the role they perform.
- To keep my ego inflated.
The last point stems from the perspective of the trainer as an entertainer. In order to provide the right level of energy, and to pace the delivery correctly, I must understand how the students learn, and what makes them tick.
Without any feedback, this becomes almost impossible.
This will be particularly hard when I have to run one training session on a Saturday. This presents two issues: I won't be able to go diving, and I face a far greater challenge than normal to ensure the students are engaged and excited about the training.
Ice-breakers, energisers and other techniques for engagement are ok but . . . . actually typing those last few words has pinpointed perhaps what I should be looking for.
They are not comfortable voicing their opinion, so perhaps mime might be the way to go. One of the easiest ways to engage people is through humour. Good comedians make you laugh because they provide a mirror to an embarrassing event or situation that you, the audience, have faced, and use themselves as the protagonist at the centre of the situation. Through exaggeration of the circumstances, and self-deprecation, they allow us to laugh at them, and also to relate to being embarrassed by a situation we might have come across ourselves.
So humour is the weapon to break down the silence, mime is the vehicle, and I must be the target.
Simple. Bugger.
Wednesday, 9 June 2010
Risk Analysis
I've got my head immersed in risk assessments at the moment. The organisation I'm currently working with is developing a risk-based approach for testing and auditing, which will affect a large number of organisations and individuals in the domain, and the consequences in some areas reach into high safety integrity levels.
The standards and processes around risk identification itself, though, are the hard part. There are multitudinous ways to identify risk: FMEA, FTA, HAZOP & CHAZOP (the Chas and Dave of the risk world?), SWIFT, etc.
- Which one do you choose?
- Which one would you choose if your life depended on it?
- Which one would you choose if you were a vendor developing a system and didn't want to spend much money?
So I'm having to evaluate all the risk assessment standards and approaches. Thank goodness for ISO. In fact, thank goodness for the AS/NZS bodies, as AS/NZS 4360 was adopted as the new ISO 31000 standard for risk management. The accompanying standard, the catchy ISO 31010, goes into some wonderful detail about the many approaches you might take to identify risks, including weighing up the pros and cons of each approach.
Yet there isn't a one-size-fits-all. While I'm not surprised by this, my single model for risk assessment is turning into a 3 or 4 part model.
Step 1 - Generic high level top down approach using risk catalogue.
Step 2 - Mandatory analysis of the standard.
Step 2a - choose from FMEA, SWIFT or CHAZOP
Step 3 - Optional. Do a specialist risk assessment such as Privacy Impact Assessment.
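As a sketch of what the FMEA branch of Step 2a boils down to: each failure mode gets a Risk Priority Number (severity multiplied by occurrence multiplied by detectability, each conventionally scored 1-10, with 10 worst), and treatment effort follows the ranking. The failure modes and scores below are invented examples, not from any real assessment:

```python
# FMEA-style Risk Priority Number: severity x occurrence x detection,
# each scored 1-10 (10 = worst). Failure modes here are invented examples.
failure_modes = [
    {"mode": "message lost in transit",  "sev": 8, "occ": 3, "det": 5},
    {"mode": "duplicate record created", "sev": 4, "occ": 7, "det": 2},
    {"mode": "identifier mismatch",      "sev": 9, "occ": 2, "det": 8},
]

def rpn(fm):
    """Risk Priority Number for one failure mode."""
    return fm["sev"] * fm["occ"] * fm["det"]

def prioritised(fms):
    """Highest RPN first: these failure modes get treated (and tested) first."""
    return sorted(fms, key=rpn, reverse=True)
```

The arithmetic is trivial; the value, and the difficulty, lies in getting a group to agree the scores consistently, which is exactly the repeatability problem the multi-step model above is trying to structure.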
For once the standards bodies have got it right, recognising that no one approach is satisfactory under all conditions. While that is a blindingly obvious statement, many standards documents contradict this viewpoint and appear to be fairly rigid in their structure. On a side note, it is all the more surprising considering how few SW testing tools adopt IEEE 829 terminology, instead coming up with all kinds of weird shizzle.
This has led to an interesting consulting cul-de-sac. Most of the time we are able to recommend one particular course of action, based on the information we have. In this instance I am having to recommend that you can do all sorts of things. The challenge is to put some structure in place so that whoever evaluates the approach against the target employs a consistent, repeatable method that delivers the same outcome time after time.
Two things I'd like to invest in; cloning and foresight.
Tuesday, 6 April 2010
Hot Pixels continued
I've searched the internet and the hot pixel thing is big.
There seem to be many many postings of Nikon users with hot pixel stories:
- http://www.flickr.com/groups/nikond90club/discuss/72157607795021505/
- http://photo.net/nikon-camera-forum/00Tx71
- http://prashchopra.blogspot.com/2009/01/another-d90-hot-pixel-story.html
- http://digitalphotographer.com.ph/forum/showthread.php?p=849679
- http://www.broadbandreports.com/forum/r21331974-Hot-Pixel-Problem-with-new-camera-what-to-do
- http://nikonheadache.blogspot.com/ This article is particularly interesting, as it covers the poor service Nikon provides to address this issue. Sending customers to service centres to fix a common problem is unnecessary when there is software out there to fix it. Turn the servicing software into an end-user product, and Nikon would reduce service costs.
- http://www.nikonians.org/forums/dcboard.php?az=show_topic&forum=312&topic_id=1239&mesg_id=1239&page=
- http://www.dpchallenge.com/forum.php?action=read&FORUM_THREAD_ID=391316
- http://www.dvxuser.com/V6/showthread.php?t=158350
- http://www.planetnikon.com/forums/index.php?showtopic=11414
While the Minolta, Konica & Sony and Olympus cameras have a built-in solution:
- http://ylovephoto.com/en/2009/05/12/minoltasony-solution-against-hot-pixels-stuck-pixels/
- http://www.dvxuser.com/V6/showpost.php?p=1515619&postcount=2
I've contacted Nikon twice to find out if they can provide a more convenient solution than sending your camera away to a service centre, and I hope to get a better answer than being pushed back to the service centre.
Sunday, 21 March 2010
Hot Pixels
After having my Nikon D90 for about 14 months, I've noticed two hot pixels. At ISO200 with shutter speeds slower than 1/125, they are visible. Above ISO1600 you notice a few more, but below ISO800 the two become obvious.
This isn't a surprise. I was more surprised that there had been no issues with any of the pixels until now. What got me was that, apart from taking the camera to a Nikon service centre, there is no way to re-map the CCD so that the offending pixels are turned off and the surrounding pixels interpolated. The Olympus cameras apparently have a built-in system, and Nikon have a system that removes such artefacts, but it only works for long exposures.
Nikon don't provide any utilities for pixel re-mapping, which, considering it would save their service centre staff a lot of aggravation, doesn't make business sense. A guy from Russia has written his own remapping software for the Nikon Coolpix, available here, and there was a tool around for the D100, yet there isn't anything available for current digital SLRs.
I'm almost thinking of writing my own tool for this. The Russian gent has posted details about the protocol to view and perform the pixel mapping here, but it is probably just too much effort. To be honest, if Nikon service technicians already have this, what is preventing Nikon from providing it to consumers, albeit with a more user-friendly interface? A simple confirmation step before changing the EEPROM would be sensible for the average user: display a zoomed area of a black frame with the hot pixels before, and the expected behaviour after, so you can check the correction is being applied to the correct pixel.
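The actual remap lives in the camera's EEPROM and its protocol, but the detect-and-interpolate idea is simple enough to sketch. Assuming a dark frame (lens cap on), any pixel well above black is a hot-pixel candidate, and the cosmetic fix is interpolation from its neighbours; the threshold below is an arbitrary illustration, not a calibrated value:

```python
from statistics import median

def find_hot_pixels(dark_frame, threshold=50):
    """In a dark frame every pixel should sit near zero; anything well above
    the (arbitrary) threshold is a hot-pixel candidate."""
    return [(r, c)
            for r, row in enumerate(dark_frame)
            for c, value in enumerate(row)
            if value > threshold]

def interpolate(frame, r, c):
    """Overwrite one pixel with the median of its in-bounds neighbours."""
    neighbours = [frame[rr][cc]
                  for rr in range(r - 1, r + 2)
                  for cc in range(c - 1, c + 2)
                  if (rr, cc) != (r, c)
                  and 0 <= rr < len(frame)
                  and 0 <= cc < len(frame[0])]
    frame[r][c] = int(median(neighbours))
```

An in-camera remap does this once, permanently, at the sensor-readout level; doing it in post means running every frame through the correction, which is exactly the aggravation a consumer remap tool would remove.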
Anyway, I'll keep hunting in case someone has posted the Nikon software online somewhere.
Wednesday, 3 March 2010
Training Steps
I had a bit of a think while at the ANZTB conference (which has nothing to do with infectious diseases of the lungs) about how to approach training.
I don't think it is luck, but we have certainly been fortunate in the way our training is structured.
Looking at the best way for individuals to acquire knowledge, much depends on the relevance the attendee places on the topic at hand.
As a result, the trainer must deliver war stories along with demonstration of the shrapnel scars.
As you grow the number of trainers, it becomes harder to ensure consistency in the delivery of war stories, unless you start inserting them as generic case studies. Instantly, there can be a perceived lack of relevance due to them being scripted rather than spontaneous anecdotes. It is as if the more ad hoc a story is, the more believable it is, while metrics collected over years that demonstrate a particular technique (think software inspections) somehow lose their validity due to the way they are repeated.
Perhaps it is a question of why metrics are treated with scepticism, but equally perhaps it demonstrates that transparent and open storytelling about the pros and cons of an approach, with the emotional attachment that comes with it, is what convinces people during training to adopt a particular technique.
With that in mind, delivering training around a case study relies upon finding a case study that has a fluffy, 'and finally' news-item quality about it. It needs to engage people on an emotional level.
As the diagram above suggests, you provide students with a case study, which poses a puzzle or problem. You then suggest a high-level solution to the problem. For example, the problem is that the benefits of testing are questioned. The high-level solution is to provide examples of how testing helps the organisation, and with most organisations focusing on revenue and costs, the benefit of testing is framed as identifying cost savings. The detailed solution is found by optimising testing to address the most critical areas.
This allows a natural flow, as the next problem would be "you are asked to define a process to identify the most critical areas of the system to help ensure the effectiveness of testing". High level solution - risk assessment. Detailed solution - Use of Quality Attributes, risk catalogues, technical eval etc, and then do EDIP (Explanation, Demonstration, Imitation, Practice) on one technique.
It's got me thinking, as the benefits include an improved flow through a training course, as well as an improved learning outcome, due to the increase in perceived relevance (notice I added the word perceived) of the training content, making the material more, for want of a better word, believable.
I'm sure I'm missing a trick among the various educational approaches out there, considering humans have taught each other for thousands of years, but there is a huge shortage of competent trainers in IT. I can remember a trainer called William who had worked with some of my colleagues to develop HLPlex (High Level Language for Exchanges), and I had the utmost respect for him; not only did he know his subject matter like he knew his bowel movements, but he had the ability to inject learning into brains. So many trainers lack either the subject matter expertise or the capability of putting information out there and inspiring others.
Too many people consider training, especially IT training, an easy gig, but we are change merchants. Somehow, we have to overcome the cultural inertia that exists within organisations, groups and the minds of individuals. Good trainers have to make their ideas so palatable, through relevance, emotion and approachability, that the concepts presented are accepted without question.
Like I said, there is a dearth of good presenters, hence we need to come up with a way for average testers to deliver content while overcoming the inertia. If there is a reliable process or technique for getting information across to a variety of individuals, then I have to find it.
Tuesday, 2 March 2010
eHealth security
I've had a fair bit of involvement with IT solutions for Health, looking at everything from process improvement of methodology for development of a system integrating patient diagnostic imagery, to upgrades of VoIP, to compliance and conformance of eHealth systems.
And while the vast majority of work is considered from a testing perspective, there is always an element where I have to consider operational procedural perspectives, and compliance with any standards that cover such things.
This article highlights the issue. Even with all the debate regarding patient safety and security, and even within what you might consider a professional and responsible organisation, Medicare has shot itself in the foot by not ensuring patient privacy.
Having had my notebook stolen by a Federal Government employee once, I might be predisposed to expect all employees to possess questionable ethics, but actually my expectation is that those who work for government fully understand the significance of the role of a public servant. The responsibility is in the title itself; the person is there to serve the public. This person is entrusted with the records of citizens to provide a variety of services: health, education, aged care, welfare payments, and so on.
A single act unfortunately tarnishes the group as a whole, and it places doubt in people's minds regarding the intent of the initiatives that public servants try to implement. In this case, people will question the advertised benefits of introducing the Health Identifier if those proposing, implementing, controlling or using it are likely to misuse the information provided to them in confidence.
So while I continue to look at the conformance of new eHealth systems, I have to keep a beady eye on the compliance, and the risk associated with non-compliance, of operational procedures and guidelines; not because it's fun, but because the benefits of eHealth systems will never be realised if the scope is scaled back, or additional guidelines are put in place, to reduce the risk of a few individuals misusing the system.
Thursday, 25 February 2010
There has been an interesting court case I have been tracking. It has led to the conviction of three former Google executives for violating Italy's privacy code, over a video uploaded by some school kids showing themselves bullying another child who has Down Syndrome.
While ISPs are not held accountable, the hosts of material that infringes such laws are. This seems to cross over into the same territory that led Anna Bligh to complain about Facebook allowing allegedly offensive content to be published on websites dedicated to two schoolchildren murdered in separate incidents.
One of the big attractions of the internet, and hence a catalyst for its growth, was the perceived freedom that allowed individuals to indulge their fantasies. Whilst the content posted onto the dedication pages was most probably offensive and possibly illegal, we seem to have reached, or at least be nearing, a crossroads for the internet.
If we allow freedom of expression, then be prepared to be trolled, and do not be surprised at the input that some individuals might provide. If you dislike the input that some individuals provide, then feel free to lock down the interwebs. If you lock down the internet, do not be surprised when your online services are attacked.
While I understand Mrs Bligh's sympathy regarding the irreverence shown towards the victims, the internet is not the preserve of the political classes for communicating with the population. Rather, the internet was developed by individuals to aid communication for academic, social and at times nefarious ends, and has, from one perspective, been hijacked for commercial and political benefit. The twitterati should not be surprised when IRL incidents are impacted by Anonymous.
The internet is not regulated, controlled, logical, rational, sociable, ethical, nor civilised. It transcends national boundaries, crosses cultural divides and makes money where none is found physically. Internet users in every single country can be contacted. Chinese dissidents are supplied with the tools to communicate freely from internal and external sources. Russian gangsters acquire identity information. A girl in the US posts her musings about cute Japanese culture. A boy remixes the wav files from Windows to make a song.
The rule about the internet is that there are no rules. That is, until governments have complete control over it, including content, access and speed. I think governments are starting to realise that the easiest way to control public opinion is to control what is accessible through the internet. A sanitised, healthy version will appear, and I don't think I'm delusional in this opinion; Mr Conroy has the plan going forward, and dangerous websites peddling harm to our children will be excluded from public view.
The thing is, the government doesn't own the internet, so why should they control it?
The Falkland Islands
It seems Argentina has got the backing of such wholly respected politicians and statesmen as Mr Chavez.
With a British company now drilling for oil in what some would claim is British territorial water, Argentina is looking to the UN to approve its claim that the geography of the continental shelf gives it territorial rights over the drilling area.
The thing is, I'm still not sure how Argentina can claim sole ownership of the islands and surrounding waters. They were occupied by Argentinians during the early 1800s, but prior to that they had been variously explored, mapped and claimed by French, Spanish, Patagonian, Portuguese and British seamen.
Simple argument: as it is closer to Argentina, they can have it. But if we are going by the continental shelf ruling, then Australia now owns Papua and Papua New Guinea, while Malaysia now owns Singapore, Indonesia and Brunei.
While we are at it, let's hand the Canary Islands (Spanish) and Madeira (Portuguese) to Morocco, Gibraltar (British) to the Spanish, Cyprus (erm) to Turkey, Jersey (British) to France, Bermuda (British) to the USA, Clipperton Island (French) to Mexico, Hawaii (USA) to Kiribati, American Samoa (er, USA obviously) to Samoa, and Christmas Island (Aus) to Indonesia.
If we are going to do this seriously, then let's hand over Japan to the Chinese, Ireland to the UK, the UK to France, Indonesia to Australia or vice versa, Sri Lanka to India, the Seychelles to Madagascar, and Taiwan to the Philippines.
3000 British people call the Falkland Islands home. /discussion.
Friday, 19 February 2010
Training - What is it?
The K. J. Ross & Associates Summer School has been a good opportunity to reflect on what makes a good trainer. I developed a peer review form ages ago to record a trainer's performance, yet there are so many qualities essential to training and presentation that I'm finding it hard to put adequate measures in place to determine if a trainer is ready for the big wide world. In other words, "Who am I to say!".
- Firstly, as a trainer you end up with, or have to have, such an in-depth knowledge of the subject you are teaching, in order to gain trust and be perceived as an authentic purveyor of the information. If the class don't believe you, then they won't believe what you teach them and you will fail.
- Secondly, you have to be able to relax. If you look under pressure, then chances are you are stressed, and the class will read this as a lack of confidence.
- You. Must. Engage. With. The. Audience. You are on-stage. You are a performer. It doesn't matter if you have the knowledge and expertise if your delivery style is as dry as a Tanqueray Gin Martini without a smidge of Noilly Prat.
Point 3 covers so much.
- Gauge the pace of delivery to meet the brain capacities of the audience. Is this startlingly new material for them? Don't say that it is, but slow down. Does it appear they know it? Speed up, but confirm they are keeping pace.
- Vary your voice. Your voice is a very useful tool for getting information across, but use it the same way too often and it's like a waterfall to people's ears; they hear it, but it's a constant stream without detail or clear distinction.
- That means varying pace and volume, talking quietly and loudly, and inserting pauses. Don't be afraid of dead air, and don't fill it with "umm, err".
- Be careful about moving around too much. The audience are focused on your voice and face. It is helpful to gauge whether they are awake by whether their eyes follow you around the room, but make sure you are walking either to check the attendees' alertness or to move closer and engage with them subtly.
- PowerPoint. It's incredibly useful yet deadly. Use it to display paraphrased ideas rather than lengthy details or descriptions. Use it to prompt you and to project key messages, rather than letting it contain more than you are saying. And whatever you do, never, ever read each slide word for word. You will lose the crowd within 5-10 minutes.
- Use of props. This can be both useful and a distraction. I haven't worked out if there is a magic rule for when the former becomes the latter. Certainly, they are a visual cue when doing a call-back to a previous discussion point, and they might inject humour into the introduction of a topic, but I'm sure there are limits.
There are other aspects too. Some people have a natural ability for quick thinking, and I also think it can be developed in some individuals over time. This allows you to adapt to particular situations, perhaps with difficult students or when an exercise doesn't go according to plan. That said, all adaptability provides is a back-up plan, where others might have prepared more fully and hence not needed to adapt.
Training has become one of my main areas of focus over the past 4 or 5 years, and it still challenges me; there is always a better way to get your message across. Many more people now consider online training critical to how training will be rolled out in the future. Having sat CBT courses when I first joined Ericsson 14 or 15 years ago, I know they are far from replacing a classroom and a skilled trainer. Yet if the skilled trainer provides guidance on how CBT might add a bit of variation to the delivery and present the most appropriate information in an online manner, then we might have some success.
One of the aspects I see attendees value in IRL courses is having a skilled trainer relate and explain content to them individually, and enthuse them with the same passion for the topic that the trainer has. CBT has no passion, so what content is best suited to that delivery? Perhaps content that would otherwise provoke heated debate, with CBT removing the opportunity for contentious discussion, or content that is simply bland?
I'm going to continue looking at some open source and commercial LMSs and see how well they integrate with our CRM, and while there seems to be something fundamental missing when discussing online delivery, I'm hoping a great epiphany will strike me and push me ahead with CBT.
Friday, 22 January 2010
He's mad about e-health
I got emailed an article in Computerworld this week.
This article seems to be based on the rants of one individual who blogs on the topic. The fact that the main objectives of this person's blog are to "provide commentary on what seems to have become the lamentable state of e-Health in Australia" and "to foster improvement" is hilarious. Obviously there will be a negative perspective on a topic irrespective of any merits or benefits there might be.
The blogger seems to be taking aim at NeHTA, but he has failed to recognise the challenges of being handed a poisoned chalice. Having been involved a little at NeHTA, it was easy to see that while every effort is being made to define pragmatic Australian standards for e-health information systems, numerous other parties directly influence the likelihood of success. One instance that springs to mind: NeHTA had developed a simple directory service to identify nodes on the network. It was put forward as an Australian standard, but one vendor, which had a hugely complex and excessively featured commercial non-standard product, vetoed the standard, forcing the marketplace to use the vendor's product instead.
When the interests of individual stakeholders trump the benefits to the health industry, which by its very nature is expected to be humanitarian and altruistic, you have to question the probability of success.
By no means do I think NeHTA is exceeding expectations; as a quasi-government organisation there is a lack of commercial awareness, a work pace that would make chess players fall asleep, and an immaturity of project management, but I don't think this is any different from most organisations, and they recognise the importance of being pragmatic with the suggested implementations. If a vendor already has a product out, or has some of the features, they won't be penalised by the standards; rather, their level of conformance will be established.
Dr More also seems unaware that as well as defining technical implementations, operational standards are also being defined to ensure the technical implementations are not compromised by poor work practices. Certainly, some areas in NeHTA (although I can't speak for all) recognise that no matter how well the system is designed, if the configuration, operational practices, support processes and policies are weak, the technical systems will be vulnerable.
But hey, I haven't read any suggestions on his blog as to what should happen.
Friday, 15 January 2010
What does Apple do next?
Some interesting conjecture around what Apple will deliver next. The latest idea, or at least the one that hasn't been delivered yet, is an Apple Tablet.
Big iPhone or keyboard-less slow MacBook? Who knows. It is hard to think that Apple would get it wrong though. As a Tablet user (Toshiba M750) I like the concept of the tablet, but there are limitations.
- I like having a large screen
- I get frustrated at having to flip the screen around
- I don't like having to reach over the keyboard, hold the screen still with one hand whilst writing on it with the other.
Yet there are so many benefits and opportunities with a Tablet. I don't know if it comes down purely to limitations of the underlying OS, or to understanding what problem a tablet is meant to solve. For me, I can take notes while consulting, and have an electronic whiteboard when I am training or mentoring. It offers a more accurate control experience: we are more used to holding a pen and 'pointing' at things than to the slight separation of controlling a mouse to guide an arrow around a screen. I certainly noticed a difference between mouse and stylus navigation, and the ease with which I could focus on the task in hand with the latter.
So then I find an article which mentions an Apple patent for use of eye-tracking software to allow users to navigate and control a computing device.
Perhaps this will change the way we work. One can hope.
Tuesday, 5 January 2010
Acceptance Test Driven Development
Interesting presentation on tools and process for Acceptance testing in an Agile environment.
Acceptance Test Driven Development
View more documents from Naresh Jain.
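To give a flavour of the idea behind the presentation: in ATDD, the acceptance criterion is written as an executable test before the feature exists, and the implementation is then written to make it pass. This is only a minimal sketch of the pattern; the `ShoppingCart` class and its behaviour are hypothetical examples of mine, not taken from the slides.

```python
# ATDD in miniature: the acceptance test below is agreed with the
# customer first, and the ShoppingCart implementation is written
# afterwards, driven by that test. (Hypothetical example.)

class ShoppingCart:
    """Implementation written after, and driven by, the acceptance test."""

    def __init__(self):
        self._items = []

    def add(self, name, price, quantity=1):
        # Record each line item as (name, unit price, quantity).
        self._items.append((name, price, quantity))

    def total(self):
        # Order total is the sum of price x quantity across line items.
        return sum(price * qty for _, price, qty in self._items)


def test_customer_sees_correct_order_total():
    # Acceptance criterion, phrased from the customer's perspective:
    # "Given a cart with 2 books at $10 each and 1 pen at $2,
    #  the order total shown is $22."
    cart = ShoppingCart()
    cart.add("book", 10.00, quantity=2)
    cart.add("pen", 2.00)
    assert cart.total() == 22.00


test_customer_sees_correct_order_total()
print("acceptance test passed")
```

In a real Agile team the test would typically live in a framework such as pytest or FitNesse rather than a bare script, but the workflow is the same: the failing acceptance test defines "done" before any production code is written.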