Less than a decade ago I wrote a piece for a software application development title urging all programmers to adopt the "always develop with mobile in mind" mantra - it was 2004, in fact.
Not quite ten years on, then... and what seemed like a potential overstatement at the time now appears to be nothing less than stock-standard common sense.
If we have to make the same kind of proclamation now in 2012, surely our "enabling technologies" of focus should be speech and touch. It's less of a shockwave to say this kind of thing now, but perhaps that's because our comprehension of the pace of technology innovation has changed too.
Touch is of course driving the still-skeptically-received Windows 8 operating system, and we are all increasingly accustomed to this method of user input thanks to our widespread love and affection for tablet PCs in their various forms.
Apple's Mountain Lion release officially reached us last week, and it includes a reasonably capable speech dictation function.
TEST: "This sentence is written with mountain lion speech recognition to show clarity and accuracy." OK so that appears to work pretty well.
So what's coming next?
After our "Nuance pushes speech recognition towards full Star Trek-ness" piece earlier this year, this week we see the company release version 12 of Dragon NaturallySpeaking, with what are claimed to be more than 100 new features and enhancements.
But what does that mean?
This technology is engineered to adapt to:
• a person's preferred style of writing
• the audio characteristics of their voice
• pitch, speaking style, accents
• ...and even speech impediments.
"Dragon 12 brings in Smart Format Rules, a new technology that adapts to the way the user prefers to format their words. The software detects word, phrase and format corrections, including abbreviations and numbers, so dictated letters, emails and documents reflect a person's own writing style," said the company, in a press statement.
The software prompts users to adapt their profile's vocabulary based on documents or emails of their choosing, so that the words and phrases each person uses most often are recognised.
Dragon 12 supports the free Dragon Remote Mic app for iOS and, now, Android devices, letting users turn their mobile phone into a microphone for use over a WiFi network.
Nuance CTO Vlad Sejnoha explains that what his company has done with this software is work to take advantage of multi-core processors, looking for ways to exploit the opportunities of concurrency and parallel programming.
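To make that multi-core point concrete, here is a minimal illustrative sketch in Python of the general pattern Sejnoha describes: independent chunks of work (here, fake audio "frames") fanned out across CPU cores. The per-frame energy calculation is a stand-in of my own, not Nuance's actual algorithm, and none of these names come from the Dragon software.

```python
# Illustrative sketch only: per-frame work distributed across CPU cores.
# frame_energy() is a hypothetical stand-in for real acoustic feature
# extraction - the point is the parallel fan-out, not the maths.
from multiprocessing import Pool

def frame_energy(frame):
    # Stand-in for per-frame acoustic analysis: sum of squared samples.
    return sum(sample * sample for sample in frame)

def analyse(frames):
    # Each frame is independent, so map them across all available cores.
    with Pool() as pool:
        return pool.map(frame_energy, frames)

if __name__ == "__main__":
    # Four fake "frames" of audio samples.
    frames = [[0.1, 0.2], [0.3, 0.4], [0.0, 0.5], [0.2, 0.2]]
    print(analyse(frames))
```

The design choice worth noting is that speech processing decomposes naturally into per-chunk work, which is exactly the shape of problem that rewards programming with concurrency in mind.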
So OK, I can hardly suggest that mobile is important for programmers any more - that would just be stating the obvious.
Can I suggest instead that programmers need to consider parallel programming on multi-core processors, aligned to take advantage of speech, touch and mobile, as fundamental architectural concerns for every application they produce?
Or would that sound out of place too at this stage?