October 2010 Archives

"Operationalizing" analytics: IBM's new UK Cloud Computing Lab

bridgwatera | 1 Comment

Spending a few days with IBM at the company's Information on Demand conference comes with three guarantees: you'll get a tightly run event (they've been doing this for a few years now after all); you'll get plenty of news; and you'll be hit with plenty of new Americanised terms such as the "templatizing" of business intelligence tools and the "operationalizing" of data analytics.

Well, cheesy as these terms may be, they are used with such gung-ho gusto that you at least get to see that the company truly believes in the software technologies it is currently developing in the data management space.

Supporting the furtherance of IBM's investment into this market is the launch of the company's new UK Cloud Computing lab at the Hursley IBM Innovation Centre in Hampshire.


Image: courtesy of IBM

The company says that in a recent IBM developerWorks survey of 2,000 IT professionals from 87 countries, 91 percent of respondents said they expect cloud computing to overtake on-premise computing as the primary way organisations acquire IT by 2015. Industry analysts are also predicting significant growth for cloud computing services, estimating that this year's US$68 billion market opportunity will reach nearly US$150 billion in four years.

Partners at the new Cloud Computing Lab in the Hursley IBM Innovation Center can access the latest in IBM cloud technologies to develop and test new cloud services and work with industry experts to build a go-to-market plan. IBM says that a typical project at the lab will help a partner explore a wide variety of cloud computing models and become cloud builders, application, technology and infrastructure providers -- as well as cloud resellers and aggregators, depending on their individual business.

Partners can access the lab from any of IBM's network of 38 Innovation Centres worldwide. As a result, partners at these centres can work virtually with the cloud experts at the Hursley IBM Innovation Centre to enable their technologies and gain the skills they need to build and deliver new cloud services.

"Our business partners are ready to cash-in on cloud computing, and they are looking to IBM for hands-on assistance to drive new business opportunities," said Jim Corgel, general manager, IBM ISV and developer relations. "The new Cloud Computing Lab will help our partners gain the skills they need to build next generation business applications and services for the cloud using IBM technologies."

IBM: 80 percent of our global data is unstructured (so what do we do?)

bridgwatera | No Comments

IBM has thrown a lot of stats out to support its Information on Demand conference being held in Las Vegas this week. Of quirky interest perhaps was the slightly random fact that 92 percent of monetary transactions in India take place using cash, so there could be a major impact on global data if and when India starts to use electronic payments at any major level.


Indian Rupees: more popular than chip-and-pin by far

Also thrown out as a data-driven trend to ruminate on this week was IBM's comment that 80 percent of data is unstructured -- and this is mainly down to video, email and texting. That figure may be inflated given IBM's willingness to tell us how well Cognos 10 business intelligence (BI) will save us from unstructured data hell, with its new collaborative tools and new support for mobile devices.

Senior VP for middleware software and IBM veteran Robert LeBlanc talks profusely about all the problems associated with unstructured data and how IBM wants to help manage it, but he doesn't explain why this modern phenomenon has come about, so why should we trust him?

Is it enough to simply say that this is IBM, or does that old adage (nobody ever got fired for buying IBM) not apply any more? It's almost like IBM is talking up the spiraling global unstructured data mountain and saying well hey guys, 'it's not OUR fault, but we're here to help you fix it'.

"A new car now has around 30 processors in it, so you are going to see information being created everywhere at levels never previously imagined - so performing analytics on that data and scaling it to the level where it becomes big data analytics will be very important - Hadoop is one of the biggest things in the industry right now and IBM is there to support the growth of these new technologies," said LeBlanc.

"We have to get to a world where we get to workload optimised systems that are focused on a particular set of capabilities or a particular process -- and you are going to see a lot more of that from IBM," said LeBlanc.

Ah-ha, so finally we get some gutsy, hard-core opinion from Big Blue. In fact, the opening sessions do get better once you get past the "showboating" of day one with the "Dale Winton" style ultra-upbeat presenter Mark Jeffries.

The keynote sessions themselves are enticingly titled: Smarter Systems - Powering A Flexible Information Platform; Empowering Information Governance; and Maximising the Business Analytics Vision -- and you can watch them all online here throughout this week.


We started this story with global data stats, so let's end it the same way with some interesting notes on what's happening to our planet's appetite for information. These stats come from an internal IBM source and were shared for the first time this week.

  • This year alone, 1,200 exabytes of data will be generated from a variety of sources.
  • According to industry analysts, enterprise data growth over the next five years is estimated at 650 percent.
  • 80% of this data will be unstructured, generated from a variety of sources such as blogs, web content and email.
  • In fact, 70% of this data is stale after ninety days.
  • The mobile workforce is expected to reach more than 1.19 billion by 2013, and mobile transactions are expected to grow 40 times by 2015, generating even more data for businesses.
  • Today, 70% of first customer interactions with a product or service start on the web.

More as it happens throughout the week...

IBM Information on Demand Las Vegas: notes from a keynote

bridgwatera | No Comments

This week sees IBM host its Information on Demand conference in (fabulous) Las Vegas. But this is not an extension of the IBM Rational Software Developer Conference, so what is this event all about? I initially thought we were going to be fed an endless stream of business intelligence (BI) content, as there is some fairly heavy Cognos branding emblazoned across some of the welcome areas.

In truth, this event is rather more focused on 'real-time data analytics with scalability'. Now that was five words if you don't count the 'with' -- and if IBM is doing one thing this week so far, it is using a long list of descriptive terms to talk to the 10,014 registered attendees for this conference.


Robert LeBlanc, IBM VP for software and middleware, took to the stage for a broad-brush definition of what the company's recent CEO survey had uncovered in terms of 'insight' into current data trends.

CEOs want creative leadership, according to LeBlanc -- something he says translates into new and innovative business models where customer relationships are approached from a data-centric viewpoint, i.e. exactly HOW should a company share information with its customers?

But if that sounds like marketing-speak then LeBlanc's next comments were peppered with so many 'flowery' tech-business terms that he started to lose me to be honest. "We're looking at data environments with new additional volatility," -- "We're looking at fundamental economic enablement," -- and finally (wait for it) IBM and its partners want to explain how we should, "Operationalize the benefits of a single view of scaled integrated data."

Thankfully, LeBlanc spent some time explaining what IBM is really doing to make this "operationalization" actually happen. It is work to enhance the company's massive storage compression technologies, innovation within the IBM information server, fault tolerance provisioning and more.

Some more hard facts for you: IBM has invested US$14bn in data analytics over the last four years, over which time it has made 24 acquisitions. The company also employs more than 200 IBM mathematicians who focus exclusively on data analytics -- and this has given rise to over 500 patents in the field to date.

Somehow you just wish the speakers could start off with more hard facts and less big picture postulating - or am I being picky? More to come throughout the week...

Geeks urged to "cook up" better development in the kitchen

bridgwatera | 1 Comment

In the spirit of a good Friday blog, I want to mention the recent release of a new book called "Cooking for Geeks: Real Science, Great Hacks, and Good Food" from O'Reilly. The company so well known for its 'CookBook' series of developer-centric books normally focuses on C++, Java or .Net as its main ingredient source - but this new geek cookbook is actually directed at software programmers' interest in the kitchen and all its mechanics.

But hang on - developers don't cook do they? They eat pizza and drink copious amounts of Dr Pepper and Mountain Dew don't they?


Well, the opposing theory is that software engineers ought to be interested in the mechanics of, well, pretty much everything really and that these guys (and girls) ought to be fascinated by the science behind what happens to food while it's cooking - right?

"Cooking for Geeks applies your curiosity to discovery, inspiration, and invention in the kitchen. Why is medium-rare steak so popular? Why do we bake some things at 350° F/175° C and others at 375° F/190° C? And how quickly does a pizza cook if we overclock an oven to 1,000° F/540° C? Author and cooking geek Jeff Potter (@cookingforgeeks) provides the answers and offers a unique take on recipes--from the sweet (a "mean" chocolate chip cookie) to the savory (duck confit sugo)," says O'Reilly.

"Readers of Cooking for Geeks will be much more comfortable walking into the kitchen, picking up a frying pan, and trying something new after reading the book," says the book's author Potter, who has been cooking since he was a child growing up in California. "Cooking for Geeks shows you how to have fun in the kitchen by blending science with cooking and takes a playful, quirky approach to teaching you how to be a better cook."

The publicity information accompanying this book's launch says that the book will help you: "initialise your kitchen and calibrate your tools," as well as give you the chance to "learn about the important reactions in cooking, such as protein denaturation, Maillard reactions and caramelisation, and how they impact the foods we cook."

So what's your take on this? Should programmers be interested in the science behind their food? Will they want to know the perfect temperature at which to broil a Philly cheesesteak sandwich?

Or will they just order the Domino's/Papa John's double pepperoni meatzanator and not think about how it came to be as they get back to the command line?

Open Kernel Labs' gung-ho Americana approach to secure mobile

bridgwatera | No Comments

Open Kernel Labs (OK Labs) describes itself as a specialist in embedded virtualisation software for mobile phones. Its SecureIT Mobile offering is based on the company's OKL4 Microvisor and is said to give mobile OEMs, mobile network operators (MNOs) and integrators the option to build secure wireless communications devices from commercial off-the-shelf hardware and software.

The company contends that in the past, secure communications devices emerged from highly proprietary design and acquisition cycles, resulting in systems that were, "Hard to build, expensive to acquire, difficult to maintain and impossible to upgrade."

OK Labs attempts to justify its position in the market using a selection of warm, fuzzy terminology, stating that government agencies such as 'homeland security' (as if it were a global term and not an Americanism) need "secure communications in a user-friendly form-factor" in the shape of "commercial-off-the-shelf (COTS) solutions".

That all sounds safe, simple, pre-packaged and secure doesn't it?


Ideally says OK Labs, secure mobile communication builds on commercially available devices that deploy off-the-shelf software platforms and applications (running Android, Symbian, etc.).

The company also says that devices need to support regular communications and applications for "normal" conversations (personal communications, social networking, etc.) but also support secure exchanges (encrypted voice, text, even video) among similarly equipped devices and/or infrastructure, in keeping with "COTS Initiatives" launched by the US and other governments.

That was the US and 'others' if you missed that. The 'rest of the world' as it is sometimes known. Or 'the other guys' if you prefer, you know - not American.

"Police, firefighters (that's firemen) and other government workers need a single device for both secure and personal communications," said Steve Subar, president and CEO of OK Labs. "Previously, such devices were only seen on TV or in movies, with actual secure handsets requiring costly development of custom hardware and software. Now, OK Labs SecureIT Mobile solution brings together the technology and knowhow to streamline the supply chain and deliver secure smartphones built on COTS hardware and software."

OK Labs has produced a SecureIT Mobile White Paper at http://www.ok-labs.com/landing/secure-it-mobile/ if you wish to read more on this subject.

Join the security debate: an ongoing challenge with new dimensions

bridgwatera | No Comments

Software security has been and continues to be a top-line issue for most organisations. Yet software development teams still produce and deploy insecure code and applications, with serious consequences for their brand, reputation and, of course, the finances of their customers and their own organisation.

So where can we draw the line? Where are the real truisms to be uncovered here? What are the indisputable industry axioms, tenets and best practices that we should all be aware of?


The following 'textual debate' is based on an original piece by Bola Rotibi CEng, who is research director of software development, delivery and lifecycle management for Creative Intellect Consulting Ltd.

We need to ask those involved in the software development, delivery and lifecycle management processes questions that can really expose and uncover the mechanics of their current practices. More importantly, we need to look at how they recognise and address the software security challenges that are presented by a variety of deployment platforms and application architectures in play today (e.g. web, mobile, virtualised desktops and production environments etc.)

Essentially, we must look at how software security risks are identified and how much importance is placed on the governance, education and training process.

Software Security Survey: evolution of role, delivery and deployment

Fundamentally, the premise of our recent research survey and stream is to understand the security challenges facing software security architects, software developers and in general the software delivery team in building applications deploying to multiple runtime platforms and environments. This is especially necessary as an increasingly mobile user audience is accessing software applications in multiple ways (using multiple devices) with high expectations for engagement and experience.

Editor's note: This point should be reinforced to say that users demand a consistent experience across all these new devices (desktop, mobile, tablet PC - and let's add virtualised desktop in there too). The challenge here is that the responsibility for consistency across the application landscape falls to several people: the architect who lays out the initial development model, the GUI designers who work on look and feel - and the programmers who build the mechanics. Ensuring consistency across these levels, then across devices, while still retaining security and application effectiveness and integrity - well, that's a tough call.

Our survey will address whether software security is handled better within certain industries and, if so, why. What are the trigger points and drivers for actively engaging in, improving or evolving a software security strategy, and how important a role do tooling and automation play? These are important questions for determining how capable IT organisations are of dealing with software and application security effectively, now and in the future.

The answers will allow many interested parties to anticipate the gaps and holes that are currently preventing IT organisations from tackling software security appropriately and successfully. The resulting report will also offer suggested strategies to improve an organisation's ability to do so in the light of evolving deployment environments and delivery models.

It is vital that software is developed correctly and effectively but also securely, not least because the alternative creates a barrier to future innovation and has a detrimental impact on the end user's overall experience and capacity to trust.

The survey link is shown right here: http://www.surveymonkey.com/s/SecuritySurvey-CIC

All respondents will get a copy of the full report and will be entered into a draw to win a half day consulting session with Creative Intellect Consulting Ltd in the field of software delivery and application lifecycle management.

Sennheiser plugs into Cisco Developer Network

bridgwatera | 1 Comment

Think back 10 or even 15 years and the technology newswires were alight with news of DECT, Unified Comms and new PBX technologies, all of which were designed to push a new envelope in telephony and communications and change the way we enjoyed crystal-clear voice and sound over the airwaves.

Since that time we may have all become too preoccupied with VoIP, handset operating systems and open source contenders to the proprietary behemoths in the telephony industry to pay enough attention to the ground level work still being carried out.

Well, that and wondering when Microsoft is going to sort out Windows Phone 7 at least!


Anyway, Sennheiser, darling of the quality headset market, is still with us, and it's interesting to note that manufacturers like this are pushing to extend compatibility with their products outward to the latest IP phone ranges.

As such, Sennheiser Communications has just announced that it has joined the Cisco Developer Network in the Unified Communications technology category. In line with this, the company has completed interoperability verification testing of its latest wireless DECT headset product with Cisco's IP Phones and IP Communicator 7.0.3.

Sennheiser's official statement on the news says that, "The Cisco Developer Network unites Cisco with third-party developers of hardware and software to deliver tested interoperable solutions to joint customers. With offerings such as DW Office headsets, customers can more quickly deploy a broad range of Cisco compatible business applications, devices, or services that can enhance the capabilities, performance, and management of their Cisco network."

We hear so much today about interoperability at the software level from a developer/programmer perspective that it's quite refreshing to pick up on some hardware-driven, software-related developer interoperability.

Don't you think?


Semantic elements: it's all in the meaning

bridgwatera | No Comments

This is the fourth guest post by Mat Diss who is founder of bemoko, a British mobile Internet software company that aims to pioneer new ways for web designers to quickly construct better websites that can be delivered across all platforms from desktop to mobile.

Here Mat continues his series of posts demonstrating the advantages of designing with HTML5. In this blog, he demonstrates the significance of the latest semantic elements...


There is also another significant evolution in HTML that is improving the foundations upon which sites are created. This is the inclusion of some new semantic elements, which more effectively describe the content of the page. The page at http://bemoko.com/html5demo/semantic uses new elements such as <header>, <footer>, <nav> and <article>.

The designers of the HTML5 specification have identified a handful of elements that most pages use. By standardising these, a site developer can describe the page in a common way.

This makes the site easier to maintain (other site developers can clearly identify specific parts of the page) and easier for other systems to interact with - for example, text-to-voice representation for the visually impaired, or extraction of an article for repurposing the content on another platform (e.g. a mobile app).

Previously a site developer might have simply used a <div> with appropriate classes set - however, the class names would be an arbitrary choice, so use by other services would not be possible in a standard way.
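To make the idea concrete, here is a minimal illustrative page sketch (my own, not taken from the bemoko demo page) showing the semantic elements in place of anonymous <div> blocks:

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Semantic elements demo</title>
  </head>
  <body>
    <!-- Site-wide banner and navigation, identified by standard tags -->
    <header>
      <h1>My News Site</h1>
      <nav>
        <a href="/">Home</a>
        <a href="/archive">Archive</a>
      </nav>
    </header>

    <!-- A self-contained piece of content any other system can extract -->
    <article>
      <header><h2>Article headline</h2></header>
      <p>Article body copy goes here.</p>
      <footer>Posted by the author.</footer>
    </article>

    <footer>Site copyright notice and links.</footer>
  </body>
</html>
```

Because <article> has a standard meaning, a screen reader or a mobile repurposing service can pull out the article content without knowing anything about this particular site's class names.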

Ignoring the cloud is akin to staying with DOS

bridgwatera | 4 Comments

Magic Software MD David Akka has used his corporate blog to liken a refusal to embrace the cloud to sticking with DOS and ignoring Windows (and other operating systems, of course) all those years back.

"For those who ignore the cloud and carry on regardless, in around two to four years' time their solutions will NOT survive," said Akka. "Dramatic I know, but as I see it the move to the cloud is very much like the move from DOS to Windows all those years ago. Simply ask yourself where would you be right now if you had elected to stick with DOS?"


There's a lot of cloud propaganda out there right now and, as Akka also points out, not all companies will be shifting their mission-critical business applications and databases off their own servers and into a vendor-hosted cloud offering tomorrow.

The option to consider hybrid clouds that provide the flexibility for organisations to move at their own pace -- deploying applications in the cloud as and when they feel comfortable -- should perhaps be discussed more openly.

The mechanics of cloud computing connections

bridgwatera | 1 Comment

CA Technologies used its recent appearance at VMworld 2010 Europe to hang out the flags for a new agreement to integrate its own CA IT Process Automation Manager with the VMware vCloud Application Programming Interface (API).

The integration, which according to official sources is being developed using the VMware vCloud API in BlueLock's VMware-based public cloud, is intended to "enable process automation and orchestration" for providing and consuming virtual resources in the cloud.

For the record, BlueLock is an Infrastructure-as-a-Service (IaaS) provider of cloud hosting solutions.

So what can the industry surmise from this triumvirate of cloud computing mechanics?


Photo: Adrian Bridgwater - Durdle Door, Dorset 16 Oct 2010

Perhaps initially we can see that real collaboration across the cloud is starting to happen? Perhaps we can see that true enterprise deployment across private, public and hybrid clouds may take more than one company's technology? Perhaps we can see that virtualised workload management across the cloud is also complex enough to warrant integration agreements between vendors?

After all, you don't just click to turn a cloud solution on - even if it is on demand - you need to think about planning, modelling, instantiation, testing, deployment and operation/management.

"As a certified VMware vCloud™ Datacentre Services provider, BlueLock is able to provide secure and agile public cloud resources using the robust VMware vCloud functionality," said Pat O'Day, CTO, BlueLock. "CA Technologies use of the VMware vCloud API demonstrates the powerful cloud bursting capabilities of its process and workload automation solutions."

Developer cheatsheet: Getting Started With ALM

bridgwatera | No Comments

Software application development professionals love to learn and, for the most part, any opportunity to increase their skill sets is also viewed as an opportunity to increase their earning potential.

Application Lifecycle Management (ALM) company Aldon has produced a new developer cheat sheet for software architects unfamiliar with ALM to get them started.

In the new cheat sheet written by (or at least attributed to) Daniel Magid, chief technology strategist for Aldon, ALM is defined as follows:

Application Lifecycle Management encompasses the automation, control and tracking of all activity involved in delivering software solutions for resolving business challenges and taking advantage of emerging opportunities. When appropriately implemented, ALM simplifies the lives of everyone involved in the change process, improves communication, automates regulatory compliance and supports the organisation in rapidly providing high quality solutions to the highest priority business requirements.

The cheat sheet continues: "Within, we cover everything from End-to-End ALM to Process and Deployment Automation so that you can better understand the way that you can leverage ALM to your advantage."


This cheat sheet is one of a number of "Refcardz" hosted by the DZone website.

Qt Developer Days - Day One Keynote Notes

bridgwatera | No Comments

Qt has clearly grown. Back in 2008, this Qt Developer Days event drew 520 developers and I was there to participate in the proceedings. In 2009 the company (post-Nokia acquisition as it was) had clearly outgrown the hotel it had been using. So in its new location for 2010, we now see over 1,000 attendees here in Munich.

So what's a keynote without a special guest? Well, Rich Green is CTO for Nokia now that he's moved on from his previous tenure at Sun. Green took the stage this morning to explain that his track record has encompassed the creation of J2EE and he ran Solaris during the dot com era.

Green then tried to explain what the role of the CTO really is in day-to-day practice. For Green, it's all about applications and he seems to understand the difference between an API and a boardroom table.

As an ex-developer himself, he promised us that this time next year he will also be 'hands-on' with Qt itself and will have spent time at the command line with the framework and toolkit - something that surely all CTOs should really do if they are going to come good on the promise to 'eat their own dog food', right?

Note: "dogfooding" seems to have been coined to describe the use of one's own technology; you may not like it, but it's probably going to stick.


It's all about "upward and forward compatibility" right now, said Green: new toolkits and frameworks may result in new apps that stretch devices, but these should all work until the device itself physically cannot support the software any longer - at which point a hardware upgrade becomes essential.

"Open governance and community engagement is key to open source success," said Green.

Qt president Sebastian Nyström also took to the stage this morning and his 'leave behind' was that the latest iteration of the Qt application development framework (version 4.7, in fact) has been focused on performance refinements rather than feature augmentation. Yes, there are new features too, but it's interesting to see an app framework reach a watershed point where the vendor's focus returns to usability and being 'performant' rather than simply ramping up new functionality.

More as it happens...

What to expect from Qt Developer Days 2010

bridgwatera | No Comments

It's hard work coming to Munich and bypassing the city centre, the bierkellers and the Gothic spires of the Marienplatz to head straight for an out-of-town 'conference hotel' where the Bavarian flavour is limited to the 'Oompah-Band-Burger' on room service. But such is life and such is Qt Developer Days 2010, which I am secretly pleased to attend as I have been at the last four years' events and know the team to be straight-talking, developer-focused 'Trolls', as they like to call themselves.

The 'Troll' reference is a throwback to the days of Trolltech and the company's initial iteration in its pre-Nokia acquisition form. Speaking to developer evangelists last night, it appears that Nokia has been mindful in terms of brand and culture awareness with Qt and allowed the company to retain much of its original identity and approach to cross-platform application framework development - situated as it is in its Oslo headquarters.

So what to expect from the week ahead?


Photo credit: David James Stone

Well, it's a refreshing start this morning. Rather than kicking off with a corporate keynote, we're straight into a training day to match the newly announced Nokia Qt 'Specialist' certification. There are 50 technical tracks, almost all of which are being presented by Qt's own developers, making up a total of 62 hours of training. Qt hosts videos of sessions from last year's event as well as the new training content that it creates throughout the year, and the last twelve months have seen 35,000 hours of e-learning clocked up.

Speaker Line Up:

Up on the podium this week once again is Sebastian Nyström, VP of application service frameworks for Nokia, Qt Development Frameworks - and he'll be joined by Qt director of R&D Lars Knoll who between them will 'tag team' the roadmap ahead for the next year to eighteen months.

Nokia has brought out the big guns for the week ahead and we do get to hear from (and meet) Rich Green, who is senior VP and CTO of Nokia itself. It won't quite be a press one-on-one session, rather more of a seven-on-one, but it'll be interesting to hear what the big man has to say one day after the launch of Windows Phone 7.

As well as building the GUI and the application structure for the Air Traffic Control system at Munich airport, Qt is also being used by DreamWorks Animation for a new application and lighting system.

As for more, I'll keep a few things for other blogs. Suffice to say for now that this year looks completely different to previous years: substantially bigger, with many new attendees showing new interest in Qt - and a lot more dedicated developer training.

Here's a link to some photos from last year that I took myself - they have been illegally hosted on this Chinese website, so feel free to click the link and have a look if you wish.

Developers, could the future of online apps be, well, offline?

bridgwatera | 1 Comment

The Holy Grail of application development is online apps that work with great WiFi connections right? There's nothing like being able to download all your favourite data and surf the web, especially on a mobile device, right?

The thing is, national city-wide WiFi zones haven't quite arrived yet, have they? Also, what about all those times when coverage is hard to get or prohibitively expensive - think hotels and planes as the prime examples.

So what we really need are downloadable apps for, say, iPhone and iPad users, that work in full when not connected - yet emulate the full functionality of an app that we would normally expect to use only online, right? A street map would be a great example here.

A street map of England perhaps, or even better the whole of the UK - or even better than that, the whole of Europe. As a download, with a one off payment fee, fully searchable and touchscreen enabled.

It's not as fanciful as it sounds: skobbler has just released ForeverMap Europe for the iPhone and iPad (an Android version is coming soon) using data from the open source OpenStreetMap project.


Berlin-based skobbler has been independently developing navigation software for mobile phone platforms since 2008. Since its UK launch in June 2010, skobbler says it has topped both the free and paid navigation category in the UK Apple App Store for four consecutive months with UK downloads exceeding 200,000 to date.

"This is It's something that hasn't been done before (at least not for an entire continent) and we believe that there's a real need for people who either travel a lot or who don't have a mobile data connection in their device (i.e. all iPod and non-3G-iPad users). So we developed ForeverMap from scratch, using our own map compression and routing algorithms," said Marcus Thielking, co-founder of skobbler. "We want to create an array of 'lighthouse' products that show what's possible based on the OpenStreetMap in all sorts of categories on various mobile device platforms."

ForeverMap Europe costs £3.49, while TomTom Europe will set you back around £60.

Dictionary.com provides new word power for developers

bridgwatera | No Comments

Although undeniably American in its general influence, Dictionary.com is attempting to extend its global reach by launching a new API Development Centre, opening its Application Programming Interface (API) to software developers everywhere.


Text-based applications - and, to be honest, any application that lets a user work with and manipulate text - can now be built using the site's word definition power, which also extends to a thesaurus, quotes, an encyclopedia and a translation tool.

There are also etymologies, pronunciations, slang and word of the day.

"Our robust API enables developers to leverage Dictionary.com's comprehensive offerings to enhance word games, create learning language applications and other word-related apps for online, mobile, eReaders and other connected environments. The API will empower the developer community to deliver more exciting content and experiences to their users," says the site.

Dictionary.com sits under the ownership of Ask.com and had previously opened the API to its preferred partners last year. This latest move, should developers wish to adopt the site's offerings, will allow apps to be built with word definition power baked in - sidestepping the need for the user to open a browser at all.

The API is offered in a selection of free options for non-commercial apps and paid options, which can be based on a revenue-sharing deal or a license fee.
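As a purely hypothetical sketch of how a developer might call such an API (the endpoint, parameter names and key below are invented for illustration, not Dictionary.com's documented interface), the client's job is essentially to build a request URL and parse the JSON response:

```javascript
// Hypothetical sketch - endpoint and parameter names are NOT
// Dictionary.com's real API; they stand in for whatever the
// API Development Centre documents.
function buildDefinitionUrl(word, apiKey) {
  return 'http://api.example-dictionary.com/v1/define' +
    '?word=' + encodeURIComponent(word) +
    '&key=' + encodeURIComponent(apiKey);
}

// In an app you would then fetch the URL and render the result,
// roughly like this (browser code, shown here as a comment):
//   var xhr = new XMLHttpRequest();
//   xhr.open('GET', buildDefinitionUrl('serendipity', 'demo-key'));
//   xhr.onload = function () { showDefinition(JSON.parse(xhr.responseText)); };
//   xhr.send();
```

The point for developers is that the definition arrives as data inside the app, with no browser round trip for the user.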

HTML guru: canvas & CSS3 advances deliver sophisticated animation & drawing power

bridgwatera | No Comments

This is the third guest post by Mat Diss who is founder of bemoko, a British mobile Internet software company that aims to pioneer new ways for web designers to quickly construct better websites that can be delivered across all platforms from desktop to mobile.

Here Mat continues his series of posts demonstrating the advantages of designing with HTML5. In this blog, Mat looks at the canvas concept for the creation of graphics...

HTML5 provides a whole new canvas concept which allows you to draw graphics on your web page. CSS3 also brings in some great new features such as transitions (http://www.w3.org/TR/css3-transitions/), which allow CSS properties to be changed smoothly from one value to another - for example, rotating an image when the user clicks on it.
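To make the transition idea concrete, here is a minimal sketch (the helper name and timings are my own illustration, not taken from the spec) of how a script might apply a click-to-rotate effect by setting transition properties from JavaScript:

```javascript
// Illustrative helper (not from the article): returns the style
// declarations for a smooth rotation. The vendor-prefixed property
// reflects the state of browser support in 2010.
function buildRotateStyle(degrees) {
  return {
    '-webkit-transition': '-webkit-transform 0.5s ease-in-out',
    'transition': 'transform 0.5s ease-in-out',
    'transform': 'rotate(' + degrees + 'deg)'
  };
}

// In the browser you would wire it to a click, roughly:
//   img.onclick = function () {
//     var style = buildRotateStyle(90);
//     for (var prop in style) { img.style.setProperty(prop, style[prop]); }
//   };
```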

With the combination of the canvas and CSS3 advances, an HTML developer has a lot more power at their fingertips. This can include drawing a button (with text generated on the fly), sophisticated animation and interactive gaming. A lot of UIs that were once the realm of platforms like Flash are now accessible via native HTML.

The canvas example - at http://bemoko.com/html5demo/canvas - draws a few gradient-shaded circles when the user clicks a button. This is done with the following JavaScript and HTML snippet:


<script type="text/javascript"> <br / />
  function draw(){ <br / />
    varctx = document.getElementById('whiteboard').getContext('2d');    <br / />
    // Create gradients  <br / />
    varradgrad = ctx.createRadialGradient(45,45,10,52,50,30);  <br / />
    radgrad.addColorStop(0, '#A7D30C');  <br / />
    radgrad.addColorStop(0.9, '#019F62');  <br / />
    radgrad.addColorStop(1, 'rgba(1,159,98,0)'); <br / />
    <br / />
    varlingrad = ctx.createLinearGradient(0,0,0,150);  <br / />
    lingrad.addColorStop(0, '#00ABEB');  <br / />
    lingrad.addColorStop(0.5, '#fff'); <br / />
    <br / />
    var radgrad4 = ctx.createRadialGradient(0,150,50,0,140,90);  <br / />
    radgrad4.addColorStop(0, '#F4F201');  <br / />
    radgrad4.addColorStop(0.8, '#E4C700');  <br / />
    radgrad4.addColorStop(1, 'rgba(228,199,0,0)');  <br / />
      <br / />
    // draw shapes  <br / />
    ctx.fillStyle = lingrad;  <br / />
    ctx.fillRect(0,0,150,150);<br / />
    ctx.fillStyle = radgrad4;  <br / />
    ctx.fillRect(0,0,150,150);  <br / />
    ctx.fillStyle = radgrad;  <br / />
    ctx.fillRect(0,0,150,150);  <br / />
  }    <br / />
</script> </p>


<div class="content">
  <div><button type="button" onclick="draw();">Draw Something</button></div>
  <canvas id="whiteboard" width="150" height="150">

See http://9elements.com/io/projects/html5/canvas/ for quite an impressive experience built using the HTML5 canvas.

Visit the demo site at: http://bemoko.com/html5demo/i

Pulling together: interoperability and the user at the heart of business computing

bridgwatera | No Comments

This is a guest blog written by Phil Lewis, a business consulting director with Alpharetta, Georgia-headquartered enterprise software company Infor.

Without blatantly promoting his own company's brand, Lewis gives us a relatively impartial view of how interoperability - and connecting applications to execute cohesive business processes - should always stem from user perception of business needs and real user control.


It is incredible to think that, despite over two decades of business computing and the efforts of the multi-billion dollar software industry, companies in 2010 still struggle with the same issues they started tackling when they adopted their first computer:

• Complex integration projects with expensive and unfamiliar technology
• Rigid business processes which are difficult to re-configure
• Proprietary technology creating barriers to interoperability
• Poor visibility of business problems, forcing a reactive approach instead of proactive improvement
• Islands of isolated information and data causing ineffective decision making and complex reporting

Alongside these "traditional" issues we can now add new challenges such as the ongoing debate between on-premise and cloud applications, where businesses face not just the need to make a choice, but a fresh integration issue once they have made their selection.

The key to solving this history of challenges is to implement technology in a way that does not prohibit or restrict future options.

For example, connecting applications to execute cohesive business processes, with a lightweight, standards-based, technology architecture, not only pulls together related business activities such as production and stock, but keeps the door to the future open with a set of connectors capable of integrating with third party applications and cloud services.

To do this without drowning in new technology investment, businesses need to align business processes to put the user in control. That is not to say that the technology should be entirely passive - it should help guide users to execute critical tasks, which support overarching company goals.

This eye on the future and focus on the end user mean that business network reporting will become critical, whereby a repository subscribes to messages published by all connected applications. This provides a single data source representing data from all business systems, giving a view of the entire business as opposed to an individual application.
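The reporting pattern just described can be sketched roughly as follows (all names here are illustrative, not Infor's API): a central repository subscribes to each connected application and accumulates whatever they publish, tagged by source.

```javascript
// Minimal publish/subscribe sketch of business network reporting.
// Repository and App are invented names for illustration only.
function Repository() {
  this.records = [];
}
Repository.prototype.subscribeTo = function (app) {
  var self = this;
  app.onPublish = function (message) {
    // every message lands in one place, tagged with its source system
    self.records.push({ source: app.name, body: message });
  };
};

function App(name) { this.name = name; }
App.prototype.publish = function (message) {
  if (this.onPublish) this.onPublish(message);
};

// A stock system publishing an event into the shared repository:
var repo = new Repository();
var stock = new App('stock');
repo.subscribeTo(stock);
stock.publish({ sku: 'A1', qty: 40 });
```

Because every application publishes into the same store, reports query one data source rather than each system in turn.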

Of course, finding applications that are delivered with this level of interoperability built-in is a whole new challenge...

Explaining "Intelligent Workload Management" in simple terms

bridgwatera | No Comments

This is a guest post by Mark Oldroyd, a senior technology identity and security specialist with Novell.

In this piece Oldroyd discusses the risks and challenges of computing across multiple environments. Despite the hurdles and obstacles ahead, he says, companies can achieve their aims if they strike a balance between flexibility and control.


If a business adopts a highly flexible approach - increasing its use of virtualisation and allowing employees to work remotely - the corresponding risk to confidential data rises. But control application usage and the movement of data too closely, and IT managers risk damaging their organisation's competitive advantage and its ability to respond quickly to market changes.

According to IDC, a new approach to managing IT environments is required - that approach is Intelligent Workload Management. A workload is an integrated stack of applications, middleware and operating system. These workloads can be made "intelligent" by incorporating security, identity-awareness and demonstrable compliance. Once made intelligent, workloads are able to:

  • Understand security protocols and processing requirements
  • Recognise when they are at capacity
  • Maintain security by ensuring access controls move with the workload between environments
  • Work with existing and new management frameworks
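As a rough illustration of the idea (the names below are invented, not Novell's), an "intelligent" workload can be thought of as an object that carries its access controls and capacity awareness with it wherever it moves:

```javascript
// Illustrative sketch only - a workload whose security and capacity
// metadata travel with it between environments.
function Workload(name, allowedUsers, capacity) {
  this.name = name;
  this.allowedUsers = allowedUsers; // identity-awareness baked into the stack
  this.capacity = capacity;
  this.load = 0;
}
Workload.prototype.canAccess = function (user) {
  return this.allowedUsers.indexOf(user) !== -1;
};
Workload.prototype.atCapacity = function () {
  return this.load >= this.capacity;
};
Workload.prototype.migrateTo = function (environment) {
  // the same object - access controls included - moves wholesale,
  // whether the target is physical, virtual or cloud
  this.environment = environment;
  return this;
};

var payroll = new Workload('payroll', ['alice'], 10);
payroll.migrateTo('cloud');
```

After migration the access checks still hold: the controls did not stay behind in the original environment.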

Intelligent workloads offer organisations multiple benefits. They ensure security and compliance; reduce IT labour costs in configuration; improve capital utilisation; reduce provisioning cycle times and reduce end user IT support costs.

Adopting an Intelligent Workload Management approach to IT isn't just a nice thing to do - it's essential in today's complex IT landscape. Organisations must deliver seamless computing across any environment - whether physical, virtual or cloud - if they are to successfully achieve a balance between security and control.


About this Archive

This page is an archive of entries from October 2010 listed from newest to oldest.

September 2010 is the previous archive.

November 2010 is the next archive.
