BIM + IoT + GIS = well, something interesting.

Clive Longbottom
I have just returned from EsriUK's user event in London.  Some 2,500 users, channel and Esri people were comparing ideas and showing what they were doing with Esri's GIS tools - and the place did seem to be buzzing.

To my mind, the world is moving towards a greater need for spatially intelligent tools. However, for many, such a need is still nascent - there is little understanding of what a fully intelligent GIS tool can do within a business. To many, Esri (and its main competitor, PBBI's MapInfo) are just mapping tools - maybe with greater detail than Google or Bing Maps, with better overlays and so on, but nothing much more than that. That perception is common among those who have heard of Esri or MapInfo - and too many organisations that could make use of these tools remain unaware of the companies altogether.

So, what can GIS do for a business?  In the retail space, it can be used to optimise logistics; to decide where a new distribution warehouse or a new outlet should be positioned; or to decide when seasonal perishable goods should be delivered to different shops around the country to minimise waste and maximise sales.  In utilities, it can be used to log assets and to help in planning such things as where to dig trenches for laying pipes and cables, ensuring that existing underground items are not impacted.  In education, it can be used as an educational tool to help students learn more about the world around them.

All of these are common usage cases and have been around for some time.  But GIS is also seeing other technologies come through that will play to its strengths.

Consider building information modelling (BIM) systems. Here, vendors such as Arup, Autodesk and Bentley provide asset-focussed systems for managing buildings. For example, a BIM system will contain details of all the items that make up a specific building: depending on the system, this could be at a high level (covering things like the chairs, tables and other fixed and non-fixed items placed within a building) through to highly granular systems that cover the components that went into constructing the building in the first place - the type of concrete, the position of lintels, the type of glass used and so on.

BIM systems have historically been pretty much self-contained. Items that were moved from one building to another, broken and disposed of, or replaced as part of standard maintenance procedures have had to be entered into the system manually. The internet of things (IoT) may be in a good position to help change this, though.

Some time back, there was the thought that radio frequency identification (RFID) tags would be used to track assets, but a lack of true standardisation, along with cost, meant that except in certain areas this did not take off. Now, as the IoT gathers pace, low-cost sensors and monitors that are visible directly through existing networks become possible. IoT tags can be attached to pretty much anything and monitored through the right systems - already, disposable temperature sensors are being dropped into concrete to optimise how it is poured, and the military is looking at using IoT devices in battlefield situations on a massive scale. BIM will be an obvious application for these - but it will also require spatial context. Where, in both a 2D and a 3D world, do these assets actually exist? Can they be plotted in a meaningful way in real time?

And the assets may not just be the inanimate objects of chairs, tables, computers, vending machines and so on.  People are just as important when looking at the security needs of today's highly dynamic environments.

Esri demonstrated an example of this at the event. Replicating the Harry Potter "Marauder's Map" - the footprints of people moving around a building shown on a 2D map in real time - was a pretty nifty demonstration. However, taking this to the next level and tilting the map to one side so that it became 3D showed the real power of GIS.

Whereas the 2D map showed two people apparently close together, the 3D map revealed that they were several storeys apart in the building.
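
The point generalises: proximity on a flat map can be wildly misleading inside a multi-storey building. A minimal sketch (with made-up coordinates and an assumed storey height of roughly 4 m) of how adding the vertical axis changes an apparent distance:

```python
import math

def distance(p, q):
    """Euclidean distance between two points of equal dimension."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Hypothetical positions: (x, y) in metres on the floor plan,
# z = height above ground, assuming ~4 m per storey.
alice = (10.0, 20.0, 0.0)   # ground floor
bob   = (12.0, 21.0, 12.0)  # three storeys up

flat = distance(alice[:2], bob[:2])   # what a 2D map shows
true = distance(alice, bob)           # what the 3D view shows

print(f"2D separation: {flat:.1f} m")   # ~2.2 m - they look adjacent
print(f"3D separation: {true:.1f} m")   # ~12.2 m - storeys apart
```

The same one-line change - including or dropping the z coordinate - is all that separates the misleading flat view from the useful 3D one.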

Now let's take this a little further. A personal example: I had to take my wife to hospital to have a broken ankle seen to. She was obviously not that mobile - yet the fracture clinic was some way from the entrances. When we checked in at reception, we were told that we needed to go to the X-ray clinic first - a distance away. We got to the X-ray clinic, and were sent back to get paperwork that the fracture clinic had neglected to give us. We bounced backwards and forwards between various departments for a couple of hours. Equipment whose usage could have been optimised sat idle; time was wasted - the process was a mess.

It could have been so different with the smart use of technology: track our phones using a GIS system, overlay this onto a BIM, and analyse the results to see where the process is broken. Then use a BIM project planning tool, such as Asta Powerproject BIM, to run a project that optimises the whole process.

In discussion with EsriUK's managing director, Stuart Bonthrone, we talked about how GIS is misunderstood. Where GIS is really aiming now is to be a data aggregator: combining its own data sets with data from other systems, such as BIM, and acting as a full business intelligence analysis tool, ensuring that the results are visualised flexibly and effectively to suit the needs of as many users as possible.

It's a long way from mapping - but it is a potentially massive market; and a very exciting one.

A report from Splunk Live 2015: the real world use of machine data

Bob Tarzey

Given the chance to address customers, partners, staff and the media en masse, any company likes to lay out its vision. This was certainly true when Splunk's CEO Godfrey Sullivan spoke to an audience of almost 600 at Splunk Live in London on May 13th 2015. Vision is all well and good, but only if it chimes with the problems faced by customers and prospects. In a well-orchestrated event, there was plenty of evidence that Splunk's customers endorsed and benefited from initiatives being undertaken by the vendor.


In a nutshell, Splunk turns all the data churned out by computing infrastructure, applications and security systems into operational intelligence, aiding both IT and security management. The volume of this machine data has increased markedly as infrastructure has been extended to include cloud services; the number of users has increased as online applications are opened up to outsiders; the layers of security have multiplied and the internet of things has taken shape. Splunk says it has moved from the static review of machine data to dynamic big data analytics; i.e. more insight from more data, with the capability to respond in real time.


As the landscape Splunk collects data from has changed, the tools it provides need to evolve too. Two initiatives help with this. First, along with many others in the industry, Splunk has moved to DevOps, enabling agile development and making new features available as soon as possible. This applies to its core Splunk Enterprise product and is native to the way the new Splunk Cloud service is delivered. Splunk has also extended its reach, with Splunk Light for smaller businesses and Hunk, which enables its tools to be used directly against Hadoop big data clusters. Second, it encourages customers to develop and share their own applications, testing and certifying the most popular ones for download from its app store.


So, what do Splunk's customers think? The vendor is not shy to talk about them; big European names came up in presentations again and again, including John Lewis, Tesco, the NHS, Sky and VW - the last of these using Splunk to help manage its connected cars programme, a true internet-of-things challenge. Three customers presented during the morning sessions, with more time being given to them than to Splunk's own spokespeople. Their testimonies underlined the reality of the vision outlined by the CEO.


First up was Paddy Power, a familiar name in the UK and a service provider to many who gamble online. It has all the IT challenges of a 21st Century online company: thousands of virtual machines, a mainly mobile customer base and huge spikes in demand - for example, up to 12,000 bets per minute to process during the recent Grand National steeplechase. Splunk helps address all sorts of worries about performance and security. Perhaps most interesting was Paddy Power's approach to development - agile DevOps, mirroring Splunk's own need to get new innovations to customers as soon as possible. The company's use of Splunk was initially in the area of security, but it is now providing business insight to senior execs via smartphones through analysis of machine data.


Next was Ticketmaster, another quintessential online operator, with thousands of virtual machines and over a quarter of a billion registered users. It experiences huge peaks in demand when tickets for popular events first become available; sales can top $1M a minute! Application failure is expensive and unacceptable. In Ticketmaster's own words "life was not pleasant before Splunk!" Initially Splunk was used for incident investigation, forensics, security/compliance reporting and monitoring known threats. In line with Splunk's own vision Ticketmaster has moved on to real time advanced threat and fraud detection and monitoring the insider threat.


Finally came CERT-EU, not an end-user organisation but one providing security intelligence to a community of sixty-plus opt-in European Union institutions. Here, in partnership with a range of IT security vendors, including Splunk, CERT-EU monitors threats in real time across all its members and is therefore able to provide much broader protection than any individual organisation could achieve for itself. Whilst this includes crime detection, nation-state and terrorist activity is now an ever-present threat for government bodies and a target of CERT-EU's monitoring.


In 2014 Quocirca worked with Splunk to get a better idea of the extent to which EU-based businesses recognised the value of machine data and were able to collect and analyse it. The results were published in a free report, Masters of Machines. A new report, to be published in June 2015, will look at how similar businesses are using operational intelligence derived from machine data to manage IT complexity, improve the cross-channel customer experience (or omni-channel, as some call it) and tighten security. Some are as advanced as Paddy Power, Ticketmaster and CERT-EU, but the research shows that for the majority machine data is a free resource they have yet to fully exploit.

The changing face of project management

Clive Longbottom

Historically, a project has been defined as a time-bounded set of tasks.  That is, a project has a start date, a desired set of deliverables and an end date, along with human and cash resources that need to be allocated to it to make it happen.

So easy to describe, so difficult to do. A whole market has grown up around project management, from the likes of Microsoft Project through to high-end project portfolio management tools such as Artemis, Oracle Primavera or CA Clarity. There are also more targeted packages, such as Deltek (aimed more at civil engineering) or AVEVA in the heavy engineering sector.

However, with organisations requiring continuous delivery of incremental improvements, and more people across different skill sets and levels needing to be included in "projects", it is time for project management software to change.

The way companies work has changed, and this is forcing a redefinition of what is meant by a project. In many cases, the definition given above no longer applies: a project may be an ongoing sequence of tasks with less of a defined end point. Instead, there may well be a series of desired outcomes spaced out along an evolving timeline. Project members will join and leave as the project goes along in an increasingly ad-hoc manner. The success of various tasks along the project timeline will define what the next steps are - and whether a project should continue or be brought to a halt. Some projects will still work to deadlines - many will have review points that act as decision-making points - but the project itself may have no direct end point.

For example, consider something like sales enablement. A company could create a whole set of disparate projects, overseen by someone watching how the salesforce, product development, marketing, logistics and so on are working, who tries to pull the different projects and approaches together to create a desired end. Or an ongoing project could be put in place that leads to continuous reviews, where the success or otherwise of a campaign is measured against how many items sales have sold, with a feedback loop in place so that product development can change what is provided to sales to meet customers' needs.

High-end project management tools can't do this easily. Users generally need to understand the language of a project manager - Gantt charts, critical paths and so on - and must come out of the systems they are used to working with in order to use specific project management tools. All of this counts against broad use of project management - yet this may be about to change.

I have been talking with a couple of interesting project management vendors; they have certain things in common, but each has its own way of operating.

Firstly, Clarizen is looking to make project management far easier to use and understand. It provides all the standard views and tools that full project managers want - Gantt charts, resource management tools, overall portfolio management and so on - but also brings in social collaboration. Users can choose to see projects in terms of tasks, Gantt charts, critical paths and so on if they wish, or can see a simple timeline with progress markers showing how complete different tasks within the project are. A full, audited track of everything to do with a project is maintained in a manner that makes it easy for users to see exactly what is happening - not just at a "task 50% complete" level, but also at a "this person has severe issues around this area for these reasons, and here is what others think" level. Through the use of Twitter-like hashtags and "@" identifiers, Clarizen makes it easy for people to track activities and people throughout an organisation's work.

As such, it can replace those horrendous email trails that many people involved in projects have had to deal with: trails that are not integrated into the project management software itself, and so lead to issues when decisions are being made at project management meetings. Indeed, it can also include outsiders, who can be sent static or dynamic views of what is happening in a project with granular security controls.

The way that Clarizen is being used by some of its clients shows how it is enabling users to manage their day-to-day lives.  Many users are seeing Clarizen not as a project management tool, but as a life management tool.

The other company I have talked with is InLoox. It too realises that collaboration is part and parcel of how an organisation operates these days, and that the best way to get people to use project management is to embed it within an environment they are already familiar with. Therefore, InLoox made the decision not to be a stand-alone project management tool, instead embedding itself into the tool that the vast majority of people use on a day-to-day basis - email. InLoox is an Outlook-native system: projects are created, managed and updated from within Outlook, and notifications are made through email messages that take the recipient straight through to the InLoox project environment. Again, InLoox retains a full record of social comments and interactions around a project - including all emails and attachments associated with it.

By taking old-style project management and making it more user-centric and easier to use, Clarizen and InLoox stand a far better chance of democratising project management. However, it will take either a change in perception among prospective users as to what project management means to them, or some heavy marketing from the vendors to convince prospective users that these solutions will change the way they work for the better.

The rise and rise of bad bots - part 2 - beyond web-scraping

Bob Tarzey

Anyone who listened to Aleks Krotoski's 5 short programmes on Radio 4 titled Codes that Changed the World will have been reminded that applications written in COBOL, despite dating from the late 1950s, remain in widespread use. Although organisations are reliant on these applications they are often impossible to change as the original developers are long gone and the documentation is poor. With the advent of Windows and then web browsers, there was a need to re-present the output of old COBOL applications. This led to the birth of screen-scraping, the reading of output intended for dumb terminals and repurposing it for alternative user interfaces.

The concepts of screen-scraping have been reborn in the 21st Century as web-scraping. Web scrapers are bots that scan web sites for information, where necessary manipulating I/O to get what they need. This is not necessarily a bad activity: price comparison sites rely on the technique, and an airline or hotel wants its pricing information shared in the hope that its services will appear on as many sites as possible. However, there are also less desirable applications of web-scraping, such as competitive intelligence. So, how do you tell good bots from bad?

This was the original business of Distil Networks. It developed technology that can be deployed as an on-premises appliance or invoked as a cloud service, enabling bots to be identified and policies defined about what they can or cannot do. So, if you sell airline tickets, it can recognise bots from approved price comparison sites, but block those that are from competitors or are simply unknown.

Distil does this by developing signatures that allow good bots to be whitelisted (i.e. allowed). It recognises bots in the first place by checking for the absence of a real web browser (and therefore a real user) and challenging suspects with CAPTCHAs (Completely Automated Public Turing tests to tell Computers and Humans Apart). It has plans to extend this to the APIs (application programming interfaces) embedded in the native apps that are increasingly being used to access online resources from mobile devices.
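
As a rough illustration of the approach (the signatures, names and policy values here are hypothetical - this is not Distil's actual detection logic), a whitelist-based classifier might look like this:

```python
# Minimal sketch of whitelist-based bot policy. Signature strings and the
# browser check are illustrative assumptions, not any vendor's real product.
APPROVED_BOT_SIGNATURES = {
    "Googlebot": "allow",           # search-engine crawler
    "ApprovedPriceBot": "allow",    # partner price-comparison site
}

def classify_request(user_agent: str, is_browser: bool) -> str:
    """Return 'allow', 'block' or 'challenge' for an incoming request."""
    if is_browser:
        return "allow"              # looks like a real user
    for signature, policy in APPROVED_BOT_SIGNATURES.items():
        if signature in user_agent:
            return policy           # matches a known good bot
    return "challenge"              # unknown automation: present a CAPTCHA

print(classify_request("Mozilla/5.0 Googlebot/2.1", is_browser=False))  # allow
print(classify_request("ScraperX/0.1", is_browser=False))               # challenge
```

In a real deployment the "is this a browser?" decision is itself the hard part (JavaScript challenges, fingerprinting and so on); the sketch simply takes it as an input.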

With the ability to recognise and block bots, Distil Networks has realised it has the ability to block other unwanted attention being received by its customers. For example:

- Brute-force logins are perpetrated using bots; these can be identified and blocked, and if necessary challenged with a CAPTCHA

- Man-in-the-middle (MITM) attacks, where a user's communication with a resource is interfered with, often involve bots; these can be detected and blocked

- Online ad fraud/click fraud relies on bots clicking many times to mimic user interest, potentially costing advertisers dearly; such activity can be identified and blocked

- Bot-based vulnerability scanners can be limited to authorised products and services, blocking those used by hackers to find weaknesses in target systems and giving resource owners back the initiative in the race to patch or exploit
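
The first of these - spotting bot-driven brute-force logins - can be sketched as a simple sliding-window threshold check (an illustrative toy with made-up thresholds, not any vendor's product logic): too many failed logins from one source within a short window triggers a CAPTCHA challenge.

```python
import time
from collections import defaultdict, deque

# Hypothetical thresholds: more than 5 failures in 60 seconds looks bot-like.
WINDOW_SECONDS = 60
MAX_FAILURES = 5

_failures = defaultdict(deque)  # source IP -> timestamps of failed logins

def record_failure(ip, now=None):
    """Record a failed login; return 'ok' or 'challenge' (serve a CAPTCHA)."""
    now = time.time() if now is None else now
    q = _failures[ip]
    q.append(now)
    while q and now - q[0] > WINDOW_SECONDS:  # drop entries outside the window
        q.popleft()
    return "challenge" if len(q) > MAX_FAILURES else "ok"

# Six rapid failures from one address trip the threshold:
for i in range(5):
    assert record_failure("203.0.113.7", now=float(i)) == "ok"
print(record_failure("203.0.113.7", now=5.0))  # challenge
```

A human mistyping a password a couple of times never reaches the threshold; a credential-stuffing bot does so within seconds.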

Distil charges by the volume of page requests, so if, for example, you were worried about ad fraud and a botnet was used to generate millions of clicks, costs could spiral out of control. The answer is to use DDoS controls that can detect volume attacks (as discussed in part 1 of this blog post) in conjunction with Distil's bot detection and blocking capability.

Distil seems to be onto something. It has received $13M in VC funding so far and has an impressive and growing list of customers. Unlike many security vendors, it seems happy to name them - perhaps just knowing such protection is in place will encourage the bad guys to move on? In the UK its customers include EasyJet. Distil is set to make life harder for bad bots - though, as ever, there will surely be a fight back from the dark side.

The rise and rise of bad bots - part 1 - little DDoS

Bob Tarzey

Many will be familiar with the term bot, short for web robot. Bots are essential to the effective operation of the web: web crawlers are a type of bot, automatically trawling sites looking for updates and making sure search engines know about new content. To this end, web site owners need to allow access to bots, but they can (and should) lay down rules. The standard here is a file associated with any web server, called robots.txt, which the owners of good bots should read and adhere to.
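
Python's standard library happens to include a parser for exactly these rules, which makes the good-bot contract easy to demonstrate (the robots.txt content below is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, as a site owner might publish it:
rules = """
User-agent: *
Disallow: /private/

User-agent: BadBot
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved crawler checks before fetching; a bad bot simply ignores this.
print(parser.can_fetch("GoodBot", "https://example.com/index.html"))  # True
print(parser.can_fetch("GoodBot", "https://example.com/private/x"))   # False
print(parser.can_fetch("BadBot", "https://example.com/index.html"))   # False
```

Crucially, robots.txt is a request, not an enforcement mechanism - nothing stops a bot from fetching the disallowed URLs anyway, which is the whole problem this pair of posts is about.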

However, not all bots are good; bad bots can simply ignore the rules! Most will also have heard of botnets: arrays of compromised users' devices and/or servers running illicit background tasks to send spam or generate high volumes of traffic that can bring web servers to their knees through DDoS (distributed denial of service) attacks. A Quocirca research report, Online Domain Maturity, published in 2014 and sponsored by Neustar (a provider of DDoS mitigation and web site protection/performance services), shows that the majority of organisations say they have either permanent or emergency DDoS protection in place, especially if they rely on websites to interact with consumers.

However, Neustar's own March 2015 EMEA DDoS Attacks and Protection Report shows that in many cases organisations are still relying on intrusion prevention systems (IPS) or firewalls rather than dedicated DDoS protection. The report, based on interviews with 250 IT managers, shows that 7-10% of organisations believe they are attacked at least once a week. Other research suggests the situation may actually be much worse than this, with IT managers simply unaware of it.

Corero (another DDoS protection vendor) shows in its Q4 2014 DDoS Trends and Analysis report, which uses actual data on observed attacks, that 73% of attacks last less than five minutes. Corero says these are specifically designed to be short-lived and go unnoticed - a fine-tuning of the so-called distraction attack.

Arbor (yet another DDoS protection vendor) finds distraction to be the motivation for about 19-20% of attacks in its 2014 Worldwide Infrastructure Security Report. However, as with Neustar, this is based on what IT managers know, not what they do not. The low-level, sub-saturation DDoS attacks reported by Corero are designed to go unnoticed, disrupting IPS and firewalls just long enough to perpetrate a more insidious targeted attack before anything is noticed. Typically it takes an IT security team many minutes to observe and respond to a DDoS attack, especially if it is relying on an IPS. That might sound fast, but in network time it is eons; attackers can easily insert their actual attack during the short minutes of the distraction.

So there is plenty of reason to put DDoS protection in place (other vendors include Akamai/Prolexic, Radware and DOSarrest). However, that is not the end of the bot story. Cyber-criminals are increasingly using bots to perpetrate a whole further series of attacks. That story starts with another, sometimes legitimate and positive, activity of bots - web-scraping - the subject of a follow-on blog: The rise and rise of bad bots - part 2 - beyond web-scraping.

Smartwatch - wearable companion or independent app platform?

Rob Bamforth

Wearable technologies have started to pop up in a variety of form factors from a wide variety of sources, but the wrist has become a clear favourite in terms of acceptable and accessible location for deployment. It is familiar, convenient and yet something worn there can be unobtrusively masked by a well styled or tailored cuff.

Many wearable devices, however, have been less elegant and, perhaps understandably, a lot of early ones are quite clumsy. Some styles and functionality are a bit too 'cyberpunk' and have led to backlash and ridicule - face-worn devices such as Google Glass, for example. Some are plagued with laughably short battery lives (did anybody think to talk to the Swiss about self-winding watches?) and most are parasitically reliant on an external host to feed off for communications.

Largely, this means that many wearable devices are simply 'IoT on the wrist'. They can identify, track and monitor, and with a small screen or nearby smartphone can alert or inform the user. For many this means simple but potentially useful applications tracking health and lifestyle - or, for the overly socially connected, simply tracking, tagging and broadcasting 'life'.

Stepping beyond health and fitness means more application and device sophistication, which is the point at which Apple has decided to move further into the wearable market (small iPods did have clips, so have arguably been wearable for some time) with the much-heralded Apple Watch.

Apple's huge strength in the mobile sector with both iPhone and iPad has been driven specifically by its app store. Sure, the devices are beautifully styled and have cool cachet and appeal, but as many would point out, they are more expensive than most and there is plenty of innovation elsewhere. However, Apple followed others in understanding that the key to the success of any platform - as Microsoft, Sun and others have found in the past - is fuelling the virtuous circle of user adoption, developer engagement and app availability (and repeating).

Apple would surely like to replicate this iPhone/iPad success with its smart watch, which already looks - even prior to mass release - like a very desirable, functional and app-friendly platform. It will no doubt do very well as a lifestyle accessory for the rich (with the high-end variants) and the aspirational (with the colourful Swatch-alike styles), but there is one element of mobile functionality even this device is missing - independence.

Connectivity to the wider network will still rely on the proximity, availability and battery life of another device, which may be a drawback in certain applications - especially those with unencumbered, unobtrusive and unfettered business in mind: field, logistics and healthcare workers needing hands-free access to IT, or shop floor sales staff not wanting to hide behind screens.

This is one area where Samsung has taken an interesting direction with its Gear S smart watch, which has its own SIM-card slot and can therefore have an independent mobile connection. The big upside of this cellular connectivity is that, unlike all other smart watches - which are companion devices to mobile phones (although thankfully they can tell the time on their own) - this is a fully capable wearable application platform whose connection is no longer dependent on another device.

It might be based on an operating system considered unusual in some parts of the world (Tizen), but it does open up opportunities for developers to provide wearable applications of value - this could be very significant for enterprise apps, if not for consumers. Many workplace tasks require IT support, but the devices used to access it over-encumber the user; an independent wearable device removes this headache.

Wearable devices are still in their infancy, but already a variety of interesting usage models are emerging. It may look a little futuristic, but applications can be deployed on wearable devices today and make a real difference to both an organisation and the wearer. Any business engaging in evaluating mobile IT needs to seriously consider how wearable devices could have a positive impact - it's no longer only phones that have to be smart.

Wearable devices - now a reality for the workplace

Rob Bamforth

There has been a lot of hype about wearable technologies, mostly focusing on smart glasses, watches and health wristbands for consumers, but it is likely that there will be far more compelling reasons to use wearable devices in the enterprise. Here, wearable devices - often in combination with sensors and other interactions from the internet of things (IoT) - will provide greater context and granularity of information and control.

The overall reasons that should drive this are less to do with fad and fashion and much more oriented around efficiency and effectiveness for the organisation. That is not to say that employees will not gain some advantage. Safety, security and wellbeing are all potential benefits for wearers of smart devices, and these might be recognised as such by employees if presented correctly - or indeed if the employees' consumer experiences motivate them to buy in to the style and innovation.

At a recent event hosted by TBS Mobility and Samsung, several very interesting case studies emerged that demonstrated a combination of the innovation and clear business value from wearable device use. These could be effectively applied in many other sectors and organisations, but there are some challenges to address.

The primary reason for wearable devices is to gain access to IT resources without encumbering the user or getting in the way of the task in hand. So many other items of technology demand a significant physical commitment: sitting down to use a desktop or laptop, two hands to use a tablet while standing; even cradling a smartphone requires a hand and at least one eye or ear.

Something worn on the wrist and accessed by a glance, tap or spoken word not only fits a Dick Tracy wish-list, it also frees up the hands, stays out of sight and allows the user to be 'footloose'. In challenging environments this means the user no longer has to be as concerned about losing the device, or even about being seen to carry something of value that might be stolen.

Tasks that require both hands, looking customers in the eye rather than via the back of a screen, or getting help without being noticed all become much simpler. The case studies featured at the aforementioned event - wearable applications already in use by snow clearance workers at Heathrow airport, shop floor staff at Dixons and lone workers in public sector housing - highlighted that this is not about glitz and glamour, but about getting a job done easily and well.

Another related reason to wear is that these devices can be unobtrusive and operate quietly in the background. They can be used to identify, track and monitor automatically without intervention from the user. The information gathered can be analysed on an individual or aggregated level to provide insights into operational processes, or potentially more valuably, used in real time to effect improvements based on changing circumstances as they happen.

Both of these reasons to wear depend on something quite critical: the wearable device must always be worn, otherwise it cannot perform its expected function. While it may seem pretty obvious that smartphones and tablets need to be remembered and carried in order to be used, their actual use is intermittent rather than constant. While there may be an expectation of an alert or incoming message at any moment, most applications - email, messaging, browsing - are dipped into periodically and are not in constant use.

Applications designed to make use of wearable devices, on the other hand, expect to be worn and in use for the entire period of operation. The problem is that employees may forget, may lack the incentive or, more worryingly, might even be hostile to the idea of being constantly 'plugged in' and potentially personally monitored. Or, in the case of recording technologies such as Google Glass, others may be hostile to being recorded.

This highlights that the benefits of using wearable devices need to be felt and understood by both employer and employee. For the organisation the benefits should be easily quantifiable as efficiency gains, but for the individual the value might be indirectly related to the job - e.g. increased safety for lone workers, or the elimination of boring or laborious tasks - or employers could offer the whole concept of wearing devices on the basis of incentives.

There has been some initial research conducted in this area, especially around promoting health and lifestyle benefits. Employees with devices might get reduced insurance premiums, offers of free gym membership or other incentives based on levels of activity. If that all sounds too Orwellian, some of the research found that employees would be quite happy to be bribed by offers of extra days' leave, a straightforward bonus or shorter working hours as payback for wearing a device that tracks performance and productivity.

Whatever the incentive - financial or personal wellbeing - the principle is the same: technology that doesn't get in the way, gathers insights and is devastatingly simple to use can and should bring value to both individual and organisation. There may be some fine-tuning of form factors, a shakeout of frivolous chaff from the worthwhile wheat, and some application development to do, but most organisations should no longer just be thinking 'mobile', but also 'wearable' when considering how employees, customers and partners interact with IT.

Delivery by drone or dead duck?

Rob Bamforth | No Comments
| More

Tiny autonomous delivery drones dropping off everything from your book orders to pizza? There have been trials in many parts of the world by serious names - Amazon, DHL, Domino's Pizza - but is this a serious post-cyberpunk instant gratification-by-delivery dream or a dystopian nightmare?

It is easy to think of many 'opportunities' that need to be overcome first for this to 'take-off', for example:

  • Noise - one drone is a buzzing irritant you could live with, especially if you've ever owned a radio-controlled aircraft, but dozens constantly flying back and forth? Really, you'd be happy with that? OK, it might not be noticeable in busy city centres, but those octo-copter blades aren't exactly silent.
  • Safety - no problem, say the pro camp: there are failsafe options built in, add a little regulation here and air traffic control there and it should be fine - and after all it will be keeping delivery drivers off the streets, so it should be even safer replacing vans and motorbikes with parcelcopters.

Then there's fog, wind, snow, rain, lightning, malfunction, range, battery life, shotguns, slingshots, birds, humans, traffic, aircraft, overhead power and phone lines, not to mention landing sites, a tiny carrying capacity and huge legal issues.

OK, we're a pretty innovative race and those clever folk in Silicon Valley will surely be able to fix all that stuff won't they?


But looking at this a different way, perhaps the problem is less about finding a different/faster/cheaper way of delivering things and more about doing something much more radical: not delivering them at all.

Newspapers (still not dead by the way) are rarely physically delivered to the door on a paper round anymore - they might be physically picked up, but now they are sold from many more outlets than they used to be, or they are digitally delivered. No need for a drone to replace the paper boy or girl; there are alternatives and they are cheap and effective.

Many things have headed that way. Music and video? Digital download, not DVD, CD, tape or vinyl (much). Anything digitisable is easier to shift over the wire or fibre, and with 3D printing that could apply to physical goods too, so maybe delivery won't be necessary at all? Well, there are many things that 3D printing can and does offer, but it's probably not a complete substitute for everything - fresh food, for example, since 3D printers are not quite Star Trek replicators.

Far too often really clever innovations appear yet get applied in an old-fashioned way. They then try to fix a problem that isn't there, isn't worth expending any sort of effort on, or will disappear when things turn out differently (as they inevitably do at some point). Many technologies fall into this trap, when really, if they were effectively employed and targeted where they add significant value, they could have great success and make a real difference rather than coming across like PR stunts.

For example, drones could be used to deliver consignments into dangerous locations without putting a delivery person at risk (some might say that's where wars are heading anyway); to rapidly deliver something vital to a critical situation - a defibrillator to a first responder at a casualty; or to deliver precious cargo - transplant organs, or medicine to an offshore rig.

These are still on the edge of practical and commercial, let alone legal, considerations, but probably much more credible than pizza. Marketing messages have to avoid being too outrageous, otherwise they end up sounding like the boy who cried wolf and no one will believe the sensible ones anymore.

Goodbye "paperless", hello "clueless"

Rob Bamforth | No Comments
| More

Scott Adams' Dilbert cartoons are often too close to workplace reality for comfort. A favourite strip from his book "I'm not anti-business, I'm anti-idiot" (another sentiment many would readily recognise and endorse) has the pointy haired boss having an emailed document faxed in case the recipient doesn't read his email and the original copy posted to ensure he receives a 'clean' version.

The cartoon sums up so many business outbound communications problems. Was what was sent received? Was it in a readily useful format? Does the recipient believe it came from the purported sender? Can it be readily shared and worked upon? Which version is 'the master copy'? Was it changed or replaced after it was copied or sent? And finally, why is technology depleting rather than saving precious resources?

Information and data management should be such a simple thing for 'information technology' to deal with, however the weakness that IT often fails to overcome is in the 'wetware peripherals' (people).

Companies and people have preferences as to how information should be communicated. These are generally not 'whims', but relate to existing, and often difficult to change, business processes. 

Even in small companies (small and mid-sized enterprises, or SMEs), which may on the face of it be more flexible, change is difficult. It costs time and money and is often a distraction, as there is little scope, capacity or appetite for making anything other than changes that are essential or forced by regulatory bodies. Many processes are cumbersome and require an audit trail, or at least some checks and balances to ensure mistakes are not overlooked. This is especially important with commercial and financial matters.

So what can a business do to automate and improve its business communications?

Organisations may want to shift everything online or to electronic only documents when it is convenient and offers a cost saving, but will be unlikely to be able to force this onto all their business suppliers and customers (unless perhaps they are a high street bank...).

According to recent research conducted by Opinion Way on behalf of communications and logistics provider Neopost, the online-or-paper decision is still a relatively complex one. From a survey of 280 UK SMEs it was discovered that communications media preferences varied, especially between documents used for different purposes.

Just to be sure they get there, invoices and contracts are sent both by email and by post by half of UK SMEs, while three fifths of SMEs send purchase orders only by email. The document most likely to be sent only by post is the pay slip, by two in five SMEs. Type and importance of document are cited as major reasons, along with the recipient's preference.

Much of what is contained in all of these document types is financial and potentially critical information, and yet it is often handled in a haphazard way. Unfortunately the Dilbert pointy haired boss with his ad hoc fax, email and post scenario is more common than many would like to think. 45% of those surveyed were concerned about the risk of human error, around a third had trouble keeping track of communications across channels with any given client, and a quarter thought there was a lack of visibility, traceability and security in outbound documents.

These are all important concerns, but many organisations will try muddling through hoping the costs of outbound communications mistakes do not cause problems or go unnoticed. Despite any environmental or green policies, few SMEs will worry about the overuse of a little bit of extra paper either.

However, the survey did give pause for thought in one area - almost half of SMEs thought they were wasting time on repetitive tasks where outgoing communications were concerned. If the reduction of risks and errors were not good enough reasons for adding a bit of automation to this communications process, surely doing it to save precious time should be?

Doing more with video conferencing

Rob Bamforth | No Comments
| More

While video conferencing is not a new technology, several factors appear to be driving forward the adoption of video as a communications tool, especially for personal use. Whether Skyping with distant relatives, using video in gameplay or creating recordings to share on YouTube, video has become much more broadly accepted and valued. So, just as in other areas of technology, is this consumer appetite for video translating into more widespread business usage? Not really, which raises the question: why?

First, any investment in technology must demonstrate clear benefits. With video conferencing there are generally two that are often quoted - saving travel costs and environmental benefits. The current economic climate means that the second of these is no longer the burning issue it once was, and the former is more difficult to measure than it seems at first glance. The problem is that while travel costs can be measured, the savings made because video conferencing was used instead are harder to quantify.

This can mean that the business incentive for increasing the use of video could be muted from a narrow financial perspective. However, take a wider look at the value proposition and the position changes.

Among existing users of video conferencing, both frequently quoted benefits are recognised, but only travel cost savings are really thought of as important. What is more interesting are the secondary benefits, which all revolve around more effective and efficient collaborative and individual working. These benefits in particular appear to be magnified as the adoption of video becomes more widespread across an organisation and as frequency of its use increases.

Once the financial imperative is understood, the things that are holding adoption back fall into two categories - technology and people.

Many organisations have taken a half-hearted or tentative approach to installing video conferencing systems; they have put them in the boardrooms where only a select few people have access; they buy different, often incompatible, systems when they do broaden deployments to other areas; or they have been slow to roll out video to regular desktop, laptop and mobile devices. Generally, training is after the event rather than getting everyone up to speed and familiar at the time of installation. In many cases there are disconnects between the needs of the business, those in IT running the video conferencing setups and the facilities groups that manage the rooms and the workplace.

From the individual's perspective, the technology is perceived as hard to use, unreliable and probably not worth trying without a fair amount of hand holding. Many are already uncomfortable being 'on camera', something that's often not helped if the primary use of video conferences is for mass meetings, where the screen enhances a feeling of formality and not even the friendliness of a handshake or shared cup of coffee are on offer.

No wonder the ones who embrace video the most are those most comfortable with the technology and those in senior positions or with the greatest access. Digital natives - the young group most likely to be very comfortable with video - are much less likely to be embracing video in the workplace. Why? Probably simply because they are typically not in sufficiently senior positions to take advantage. Given the potential productivity benefits, this would seem to be a huge mistake.

Most of the technology challenges surrounding video have a direct impact on the people issues too. Past experiences colour current judgment, and many will have had bad experiences with video conferencing, which knock their confidence.

Despite these negative perceptions, current technology has moved on a great deal. Video and audio quality can be incredibly high, interoperability issues are largely contained, and it can be easy and reliable to set up video calls if the right investment decisions have been taken.

Video can be used on every device from a smartphone to big screens in the boardroom and the formality of call setup in advance should be a thing of the past, making it simpler to encourage more frequent usage and familiarity on all screens. This breadth and increased frequency of use is when the real benefits of productivity and collaboration really kick in, but of course requires funding.

Those with video already will often find a mix of systems, some of which will require upgrading to gain consistency across all their video platforms and almost all will require further investment in software platforms to ensure that even the smallest mobile devices can be added into the video mix.

Consistency, widespread availability and encouragement to use video regularly will have a real impact on adoption. If this is mixed with induction education to build early familiarity, and further training on how to feel comfortable, relaxed and proficient on camera, both the people and the technology challenges will have been adequately addressed.

It might require some investment in a mix of hardware, some upgrades, software and services, but creating a democratic and pervasive communication culture that is comfortable with using video will pay off.

For some thoughts on how to finance the changes required to address an enterprise video conferencing strategy as a whole, download this free report.


Redundant array of inexpensive 'things'

Rob Bamforth | No Comments
| More

There has been plenty of hype surrounding the internet of things (IoT), and especially super smart things such as the iconic Nest (now Google) thermostat. While many of these devices are interesting, they often come with premium pricing for an expansive set of features that doesn't fit all needs - some things are a bit less smart (and sexy) but could bring significant value.

This is where the term 'machine to machine' (M2M) has historically been applied, especially by mobile telcos desperate to seek out new revenue streams from connecting lots of remote, low-data-demand devices. Here, it is often only one or two attributes or states of each relatively dumb 'thing' that are of interest - a sensor taking a measurement, a system being switched on. One sensor on its own is not really interesting, but using the concept en masse, with dozens, hundreds or thousands of sensors or devices, makes things much more interesting.

This does not mean suddenly flooding the internet with masses of data - no matter how much the telcos might like that idea. The term IoT conjures up an image that all of these things, smart and dumb, will be connected to a single network, when in reality most have very little in common, except their ability to converse using a basic universal protocol set, based on IP. An effective IoT application is one that might take advantage of some connectivity to the wider internet, but is also built on internet technologies that exploit the economies of scale of standard components and common protocols.
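The 'en masse' value of individually dumb sensors can be illustrated with a short sketch (not from the original article - the sensor IDs, zone names and reading format here are all hypothetical). Each device reports just one attribute, a temperature, and the interesting picture only emerges when many readings are aggregated:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical readings, e.g. decoded from simple JSON-over-IP messages:
# (sensor_id, zone, temperature in Celsius)
readings = [
    ("t-001", "warehouse", 4.2),
    ("t-002", "warehouse", 4.8),
    ("t-003", "office", 21.1),
    ("t-004", "office", 19.9),
]

def aggregate_by_zone(readings):
    """Average per-zone temperature from many individually dumb sensors."""
    zones = defaultdict(list)
    for _sensor_id, zone, temp in readings:
        zones[zone].append(temp)
    return {zone: round(mean(temps), 2) for zone, temps in zones.items()}

print(aggregate_by_zone(readings))
# {'warehouse': 4.5, 'office': 20.5}
```

Note that the aggregation can happen locally - at a gateway or on-site server - so only the per-zone summaries ever need to cross the wider internet, which is rather the point.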

When this concept of simple, mass connectivity crosses into the physical aspects of the 'things' as well, the proposition becomes even more interesting, even in what seems like the simple use of under floor heating.

At the industrial research centre SPECIFIC, in Swansea, a combination of academic research from Swansea University and industrial skills from Tata Steel, NSG Pilkington and BASF is leading to some interesting developments of 'things'. The SPECIFIC consortium has the concept of 'buildings as power stations' at its core, and is creating low cost, robust items to capture, store and release energy.

One example that is currently being commercialised is a heated floor tile. This is a standard 600mm metal wrapped square of chipboard, designed to be stood on free-standing posts to provide a raised access flooring system, typical of most office, business and educational environments. The only difference is a coating on the top and a power connector, and with power applied, the upper surface of the tile warms up.

Under floor heating is not new, but fine-grained control of a tiny area at very low cost, is. However, being able to individually heat every single floor tile is only of real interest when intelligent controls can be applied. What is the temperature in the office, how many people are currently in there, which tiles are exposed or covered by furniture, which rooms are they in, which tiles is the sun currently shining on etc.?

A single smart system that detects some of this data from sensors, has other elements filled in from information in employees' calendars or room booking systems and then access to weather sites and other external data sources over the internet can start to be very effective, efficient and comfortable.
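As a minimal sketch of what such per-tile control might look like (purely illustrative - the field names, thresholds and rules below are assumptions, not taken from SPECIFIC's actual system), each tile's heating decision can be a simple function of sensed and contextual data:

```python
# A toy decision rule per tile, combining hypothetical sensor data
# (surface temperature, sunlight, furniture coverage) with context
# that might come from room bookings or occupancy systems.
def tile_should_heat(tile):
    if tile["covered_by_furniture"]:
        return False  # no point heating under a cabinet
    if tile["in_sunlight"]:
        return False  # the sun is already doing the work
    if not tile["room_occupied"]:
        return False  # empty room, per booking/occupancy data
    return tile["surface_temp_c"] < tile["target_temp_c"]

tiles = [
    {"id": "A1", "surface_temp_c": 17.0, "target_temp_c": 20.0,
     "covered_by_furniture": False, "in_sunlight": False, "room_occupied": True},
    {"id": "A2", "surface_temp_c": 18.0, "target_temp_c": 20.0,
     "covered_by_furniture": True, "in_sunlight": False, "room_occupied": True},
]

active = [t["id"] for t in tiles if tile_should_heat(t)]
print(active)  # ['A1']
```

The intelligence lives in the rules and the data feeds, not in the tile itself - which is exactly the dumb-component, smart-assembly argument.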

This type of application requires some integration and might not look as elegant as the smart Nest thermostat, but could deliver significant benefits, such as cost and energy reduction, by applying heating when and where required. This would not be a smart IoT object that 'learns', but a distributed one that constantly takes into account the current and forthcoming situation and applies those requirements.

Some IoT applications focus too much on putting smarts into every individual 'thing', but without fine-grained control of the related physical attributes (in this case heating and energy) being managed, the cost of deploying some IoT applications might be higher than the value of the benefits they deliver.

Smart applications can be built by intelligently assembling and connecting a great many dumb components, not simply by adding expensive smarts into something that was historically dumb. This will be increasingly true for enterprise IoT applications, which have many legacy components and systems to accommodate.

With so much attention on sexy consumer IoT applications, there is a danger that the skills necessary for integrating commercial IoT with physical things, as well as the requirements for IoT applications that deliver real value for businesses, will be overlooked. It would be a shame if the current IoT bandwagon leaves great masses of 'dumb', but worthwhile, things behind.

Extending unified communications

Rob Bamforth | No Comments
| More

In the last couple of decades the number of digital communications options for most workers has soared, bringing with them information overload and post-holiday inbox anxiety, now only offset by taking mobile devices everywhere. Where once simply not answering the phone or opening the mail would cut off interruptions, most now have multiple forms of telephony - online, mobile, desktop - instant messaging, email and a plethora of social networks clamouring for their attention.

It might seem easier if one preferred mechanism for communication was to become the default for each individual, however, things are not so simple. Individual communications preferences vary depending on the task being performed, and while this can clearly be seen with personal communications, very similar behaviours are just as prevalent in the workplace.

This means that employers need to provide a broad kit-bag of communications tools which will undoubtedly be added to by the ones employees bring themselves.

However, having to switch from one task to another can be very disruptive, hence the emergence of the idea of pulling together the multiple strands of communication, known as unified communications (UC) or to some vendors as unified communications and collaboration (UC&C).

Unfortunately, unifying the communications networks and 'plumbing' was initially seen as the most important aspect of this, especially by most vendors and those in IT managing the infrastructure. However, the critical element for the individual (and the business) is dealing with the flow of work and interruptions, as these affect personal productivity. It also impacts working together with colleagues and third parties, and so introduces a pressing need to sort out the collaboration element, as this affects overall business process productivity.

So what's the best strategy for addressing unified communications? The most important thing is to recognise that, ultimately, it is about connecting people, not just networks. UC is therefore as fundamental as the heart of traditional business telephone communications: the private branch exchange, or PBX.

When many organisations were initially sold on the idea of UC, it was the PBX element that got their attention, as UC was often pitched as a cheaper way to make phone calls and simplify network management through IP telephony. These are not a true reflection of the benefit of UC, nor do they generally justify the total cost of investment, especially when new network equipment is needed.

The real benefits come from how UC simplifies tasks for individuals, not just simplifies the network. UC has to provide flexible mechanisms and choices, enabling the right tools to be selected for productive and efficient communication and collaboration.

It is clear that business use of UC encompasses a huge set of capabilities from message immediacy to media rich content sharing. Its effective use, however, requires agile user behaviour to be able to seamlessly glide between and among these different strands of media. This requires consistency in approach and deployment otherwise employees will spend more time focussing on the tool rather than the job.

Why is it now important to push UC into all corners of the business?

First, because working patterns are changing; more people are working from home, on the move or flexibly in multiple locations across their workplaces. Many are working on remote sites, and even those sitting at the traditional desks, offices and areas of their 'static' workplaces often have many more connections to remote co-workers than in the past.

Second, there is a need for 'friction free' collaboration. The internet and globalisation are great levellers and so gaining any edge or just staying ahead of the competition is getting harder. Economic pressures mean that big budgets are no longer easy to come by, and organisations need to sweat more of their assets and this includes getting the best out of their workforce - not just individually, but as a collective team.

The challenge with UC is it needs to be applied everywhere and consistently which leads to a need for significant investment in a broad range of elements - software for UC clients, diverse hardware from new IP phones to servers and services to ensure deployment success. All elements are important to avoid the pitfalls encountered when moving to an enterprise wide full-scale production roll out.

Getting all elements in sync will require the certainty of funding to completion. It might be fine to trial different elements to see where preferences lie with different features or tools, but once the decision is made, UC needs to be a giant leap, not a timorous small step. The 'U' might stand for unified, but it might just as easily stand for 'universal', given that it is by achieving this that its real benefits will be realised.

For some thoughts on how to finance the changes required to address a unified communications strategy as a whole, download this free report.


Online security in insurance sector

Bob Tarzey | No Comments
| More

Much of Quocirca's research looks at the differing attitudes to IT between various business sectors. For example, a 2014 report titled Online domain maturity, which was sponsored by Neustar, showed that retailers and financial services were the most likely to interact online with consumers. Another 2015 report, Room for improvement: Building confidence in data security, which was sponsored by Digital Guardian, showed that by some measure, financial services were the most confident about data security.

Such comparisons are useful as they show what one sector is achieving and how another sector might benefit by taking similar measures. However, even within a given sector there are extremes; whilst more than half of financial services organisations are very confident about data security, 4% are not that confident. More granular research is needed to tease out where in a sector such differences lie.

Quocirca was recently invited to attend an insurance industry round table focussed on IT security. The event was hosted by Entrust Datacard, a provider of strong authentication tools, digital certificates and online fraud prevention products. If the views of the dozen or so attendees, who represented some of the best known names in the UK insurance industry, are anything to go by, their sub-sector has a lower level of confidence about data security than banks (some organisations have a foot in both camps, so called bancassurance).

Why should this be so? For a start, whereas banks deal directly with their customers' money, for insurance companies it is largely secondary: if your bank account is hacked, money may be transferred out, whereas an online insurance account is harder to exploit. Secondly, it was evident that one of the biggest concerns for insurers is insurance fraud, however carried out, and it was not clear whether this had become harder or easier to deal with as the industry has moved online.

Before the round table Quocirca had considered in what areas insurance companies may be vulnerable. It was agreed that the two obvious ones were the protection of personal and payment card data. Protecting both is of course a regulatory requirement, but also makes good business sense. An insurance company may be targeted for such data, not because it is an insurance company per se, but because its defences are weaker.

However, during the discussion some interesting insurance-specific threats emerged. Stealing lists of policy holders would be useful for planning crimes, for example the targeted theft of high value cars; the task would be much easier with a current list of owners and their addresses than having to travel the streets searching for targets. Another involved intellectual property (IP): as quoting for insurance has moved online, the industry has become highly competitive. Appearing high in the listings of comparison sites, where many insurance buyers end up, involves quoting via tightly guarded algorithms; some felt there was a possibility of industrial espionage in this area.

Another area of concern was the insurance supply chain; many policies are sold via agents and brokers. However good a given insurance company's own data security is, its Achilles' heel could well turn out to be a smaller partner. It was noted that some well publicised data breaches relied on compromising smaller partners to find a way into a larger organisation's IT systems. There should be an onus on insurers to advise and certify the security of their supply chain partners.

There are of course many benefits of being able to safely transact online. Quocirca research, to be published later this year, shows that confidence in the omni-channel (the mix and match of mobile apps, web sites, telephone, face-to-face etc. for communication with customers) goes hand-in-hand with higher levels of confidence in data security. All agreed the insurance industry had to further embrace the omni-channel. Another benefit is being able to verify the ownership of insured assets, many of which can now be certified electronically via the internet of things (IoT), reducing the possibility of fraud.

Another opportunity for some insurance companies is insuring their business customers against online risk. Just as in other areas, those who have taken measures to mitigate the risk will get cheaper premiums. As the sector relies more and more on online interaction to keep up with its customers, insurers cannot afford to be seen to fall short of the IT security standards they expect of those they insure.


Changing the world - company-by-company, city-by-city, person-by-person?

Clive Longbottom | No Comments
| More
I had a good conversation with a large computer company recently about its corporate social responsibility (CSR) programme.  Alongside all the normal stuff of recycling, minimising the use of chemical and elemental nasties and minimising the environmental impact on the surrounding environment, we got on to the subject of how a large organisation should view the use of its technology in the wider sense of being "good for the world".

This uncovered some interesting ground.  The first is that there are three main approaches to what is "good for the world":
1. minimising the impact of the company, its suppliers and its customers on the usage of non- (or slowly) renewable resources
2. creating a sustainable ecosystem such that the impact of the company, its suppliers and customers is net neutral
3. creating a means of ensuring that the overall impact of the company, its suppliers and customers on the planet is net positive.

From there, we came down to the issue of what is "good"?  In today's global village, there are many different cultural cliques; providing technology that helps one group may be completely against the aims, beliefs or cultural norms of another group.  This doesn't even have to be by country: the increasing mobility of global citizens can lead to these issues being seen across very small areas, such as within individual cities.

From there, we also have to look at what it is that technology can do, and what those with access to the technology are aiming to do - which can be completely different things.  For example, big data analysis to identify people missing from home with medical needs can also be used to track other people for other reasons as they go about their daily business.  This can get into problems around the concept of "one man's freedom fighter is another man's terrorist".

However, the main discussions centred on what technology vendors can do for the greater good of the planet, rather than the greater good of individual commercial concerns.  This is where it becomes apparent that the cultural differences between different groups can get in the way.

During the Industrial Revolution, there was mass migration from the countryside into the cities - the saying that London's streets were "paved with gold" being a major draw pulling workers from the villages and towns into the high-growth cities.  However, not everyone found wealth - workhouses flourished and deaths from malnutrition were commonplace.  The cities grew, and people suffered as land lay fallow and unused in the countryside for lack of workers.

Wind forward a couple of hundred years, and we see the same happening: not only in emerging countries such as India, China and Brazil, but still within the UK, as workers from within and without the UK still try to follow the money to London.

The company I was talking to took the approach that it could provide technology to ameliorate the problems caused by this sort of mass movement within the cities themselves - dealing with areas such as energy grids, intelligent water usage, traffic and logistics movement and so on.  It wasn't particularly aimed at ensuring that there would be jobs available for everyone, which could be the bigger issue.  My view was slightly more radical - technology should be used to try to stop the mass movement of people from one area to another.

Why?  The world is increasingly dependent on a smaller number of people within agrarian systems to provide enough food for a growing number of people who are increasingly removed from any understanding of the food chain.  Agrarian economies are being ripped apart as high-tech, retail and manufacturing companies grow within their regions.  Families find themselves saving up to send a child away to a city, in the hope that they can make their fortune - even though it is unlikely that they will.

Why not use technology to keep families and communities together?  Make a farmer a better, more effective farmer; encourage farmers to get together as co-operatives to provide a better mix of farm produce to buyers; increase economies of scale of purchasing and production and share workforces.  Encourage artisans to work from their own towns and villages, providing work for others in their communities.  Use technology, such as video conferencing and screen sharing alongside IoT monitoring, for education and medical support for those towns and villages that are remote - and should stay viable as remote communities.

It seems to me that many tech vendors see technology as something that fits a series of small problems that are not always joined up into a bigger picture - intelligent cities, organisations and so on.  The knock-on impact is that technology is doing the exact opposite of what it should: it is creating a two-tier environment of technology haves and have-nots, with the haves concentrated in specific centres.  These centres then create problems - increased energy distribution needs, travel congestion, rising housing and food costs - where yet more technology is needed to minimise the impact, and the spiral continues.

No - let's improve the planet one person at a time: let's find out what an individual truly wants out of their life and apply technology to help them.  As each person gets what they need, they will work more closely together.  As communities become empowered, they will be more efficient and effective.  As communities build to provide support for themselves and to others outside the community, the growing world population may find that it can be adequately supported.

And only then can technology be seen to have had a nett positive impact on the planet.

Supporting serious software investment

Rob Bamforth | No Comments
| More

The language of IT is not only peppered with technical jargon and the ubiquitous TLAs (three letter acronyms), it also afflicts the purchasing processes. The term 'big iron' is not just for the benefit of the 'tin-shifters' involved in selling computer hardware, it also helps those 'server huggers' and 'bean counters' in control of IT budgets understand they are getting something substantial.

Indeed, when open systems burst through the cosy proprietary computer industry in the 1990s, many buyers were a bit put off to be told that their relatively few-year-old IT hardware systems, which were the size of a couple of bulky washing machines, could easily be replaced by a 'pizza box' whose performance and capacity would also provide future proofing for up to a decade or so.

Despite the frequently encountered resistance to spending any budget, IT managers generally find getting approval to buy hardware easier than getting it for software. Those in control of the purse strings, as well as the IT department itself, want to see something tangible for their large-scale investment. The fact that software covers so many things - from the hidden depths of operating systems, through application servers and databases, to the applications users recognise - does not help its investment case.

Software may also suffer from being seen not only as a bit ephemeral, but also somewhat simpler than hardware, which after all needs to be fabricated in sheets of silicon, hot dipped in solder, cased in steel and eventually enveloped in plastic (thankfully less often beige than in the past).

Surely, software can be knocked up in someone's bedroom, as most people outside of the software industry have a young niece or nephew who has been writing programs at home in their spare time. And it is so easy and cheap to copy, so perhaps is not worth that much after all?

This hobbyist view of software dates back to the teenage geek owners of early home computers and persists in the present-day image of casual software developers creating mobile apps at home. In reality the industry is massively more complex and sophisticated, despite never quite living up to the term 'software engineering', coined back in the late 1960s.

Software is complicated and takes time and effort to get right, even with the latest moves towards more rapid application development and shortening the develop/test/deploy cycles through 'DevOps' processes. Despite being a virtual product, good commercial software with sufficient industrial strength to run a business costs real money.

Software is just as worthy of investment, yet it has been viewed differently to hardware, and many companies find it difficult to fund through a financial package, as many finance providers also view software as separate and different.  There is no reason why this should be so, and there are opportunities to spread the cost of software investment through external financing; yet too often companies will try to cut corners to keep software costs down. These approaches are mistaken when alternatives allow software investment decisions to be made for business reasons rather than spurious financial ones.

So what are the potentially harmful software investment avoidance practices that might be dealt with through proper financing?

Delaying software upgrades. While many vendors might rightly be accused of capitalising on maintenance and frequent major re-releases of their software, there comes a time when delays will simply increase long term costs and could expose an organisation to other problems. A recent case in point has been the procrastination by many companies over leaving behind Windows XP, and the other major one is Windows Server 2003. There will always be extra costs incurred, and mass upgrades should not be made too early to avoid making mistakes, but leaving it too late costs more, opens up security risks and makes staff less productive because they have to use out-of-date software.

Delaying hardware upgrades with software dependencies. Often a problem where companies struggle to find a way of financing both the hardware and software elements of a major upgrade programme. Rather than delaying the whole process, which will have been started for good business reasons, there is a need to find a way to finance the whole project.

Bearing down on license numbers. Restricting licenses on some arbitrary basis simply to keep software costs down is a false economy. While some software products may be available through a more flexible software-as-a-service model that allows for incremental users, not all will be. Forcing users to work less efficiently through restricted access to certain software products, rather than deciding on what is best for the business, hurts productivity. Spend all the way up to the numbers required, look for opportunities where other users could be added to the mix, and there may also be ways for the business overall to gain from the economies of scale of enterprise or site licenses.

Good enough? In many areas of software there are products that approximately compete or overlap in certain functionality. They may on the face of it appear comparable, but without meeting the complete set of user or business needs. Those without in-depth knowledge of the requirements may assume they are good enough, especially if they are cheaper than a more functional alternative, which could end up causing problems for the business or leaving users frustrated and less productive. Good business value is returned from investing in what is needed rather than cheap and cheerful alternatives.

Hardware may appear for many to be the driver for IT investment, but it is software that delivers the business value in the end. Getting the right software to do the job is far more important than reducing the cost of investment. It needs to be fully and appropriately funded, and can be externally financed just like hardware to ensure that the right tools are delivered, at the right time, to get the right business outcomes.

For some thoughts on how to finance the changes required to address an enterprise software strategy as a whole, download this free report.


The software game - Perforce's new Helix Platform

Bob Tarzey | No Comments
| More

A few years ago a friend of mine told me he was concerned that his son wanted to take a degree and find employment in the software games industry. I think my friend thought it might be a frivolous career path. Not at all, I reassured him: the games industry is deadly serious; it is to software what Formula 1 is to the motor industry - at the bleeding edge - and the UK is a world leader.

A 2014 report from Nesta, a charity that promotes innovation in the UK, estimates there are nearly 2,000 games companies in the UK contributing billions to the economy. The industry is estimated to be growing at 22% per year. Some may consider playing games frivolous, but building and selling them is not.

However exciting a game may be to play, building robust, performant and attractive packages requires the same rigorous processes that must be applied to any software development project. For games software the project management tools must be capable of handling not just software source code, with all the version control, configuration building and testing that entails, but also the wide range of other content, not least high definition video.

The team members that build games generally fall into two camps: the more technical software developers (generally working in C/C++, with some scripting in Python and Lua) and the more artistic types working on the models and textures that make up the game's landscape. The management tools must co-ordinate between them and bring their efforts together.

When it comes to software configuration management (SCM) in the gaming sector, the market leader is Perforce Software, which claims 18 of the top 20 games developers as its customers, including Electronic Arts and Ubisoft.  So, although Perforce is active across all industry sectors, it is not surprising that its biggest ever product announcement, on March 6th, had plenty of new stuff for its gaming customers, including extended support for multi-media.

The headline was effectively a re-branding of Perforce's SCM product set as what it now calls the Helix Platform. Perforce's aim is for the Helix Platform to sit at the heart of its customers' development operations as a centre for co-ordination, like the role DNA plays in a cell - the structure of which the new platform's name alludes to.

New capabilities include:

  • Extended support for multiple content repositories: this goes beyond just those used for software, such as Perforce's own P4D (now renamed the Helix Versioning Engine) and the popular Git. Helix also enables the sharing of assets from cloud storage systems such as Dropbox, which are often used to share large multi-media files, and their inclusion in controlled software builds - something Perforce says has been tricky to do in the past.

  • Collaboration for multiple developers is enhanced through Helix Swarm and GitSwarm, which connect widely dispersed contributors and improve project workflows.

  • Protection for software IP (intellectual property) is added through Helix Threat Detection, a kind of SIEM (security information and event management) capability that is specific to Helix and the digital assets it holds. It looks for unusual behaviour, such as a user downloading an abnormal (for them) number of files. This might be a sign of a compromised account, or of an insider stealing IP as they head off to a new job (not uncommon, as Quocirca reported in its 2014 report, What Keeps Your CEO Up At Night? The Insider Threat: Solved With DRM).
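Behavioural detection of this sort often boils down to comparing a user's current activity against their own historical baseline. A minimal sketch in Python - not Perforce's actual algorithm, which is proprietary; the function name, threshold and data are all invented for illustration:

```python
from statistics import mean, stdev

def flag_unusual_downloads(history, today, threshold=3.0):
    """Flag users whose download count today is far above their own baseline.

    history: dict mapping user -> list of past daily download counts
    today:   dict mapping user -> today's download count
    Returns the set of users exceeding mean + threshold * stdev of their history.
    """
    flagged = set()
    for user, counts in history.items():
        if len(counts) < 2:
            continue  # not enough baseline data to judge this user
        baseline, spread = mean(counts), stdev(counts)
        # floor the spread at 1.0 so very stable users aren't over-flagged
        if today.get(user, 0) > baseline + threshold * max(spread, 1.0):
            flagged.add(user)
    return flagged

# A user who normally fetches a handful of files suddenly pulls hundreds
history = {"alice": [5, 7, 6, 8, 5], "bob": [20, 22, 19, 21, 23]}
today = {"alice": 250, "bob": 22}
print(flag_unusual_downloads(history, today))  # {'alice'}
```

A real product would of course model many more signals (time of day, file sensitivity, peer-group behaviour), but the per-user baseline idea is the core of it.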

If the games and other software-based industries in the UK are to continue to be a source of growth, they need to continue to produce high quality products, and that requires good management tools. To capture start-ups, Perforce is planning a cloud-based free Community Edition, due to be available later in 2015. This will be missing some key capabilities - for example, it will not have threat detection, which is only available via a paid upgrade to Helix Cloud Premium Edition or to Helix Enterprise, which can be managed on-premise or hosted by Perforce.

As for my friend's son - he got a First in Gaming Technology and has gone travelling. As it happens he has never heard of Perforce, although they did cover project management on his course. When he starts what should be a great career in the software industry, in gaming or elsewhere, he will get fully involved in the rigours of software configuration management and may well discover Helix.

Boosting video conferencing confidence - simple and small steps

Rob Bamforth | No Comments
| More

The spread and adoption of video conferencing in the office might have been held back in the past when the technology was expensive and network capacity was limited. However, today high definition screens are available on devices as small as smartphones, along with high quality cameras, putting video into almost everyone's pocket. In addition, with the widespread availability of fibre and high-speed mobile networks, sufficient bandwidth to reach even remote locations is less and less of a barrier.

With many of the technical limitations largely removed, it is often the human issues that remain. Using and being filmed on video is something that many people have become much more familiar with as consumers, especially on smaller, mobile and non-traditional IT devices like tablets, but how does this familiarity turn into acceptance of video as a tool for the workplace?

New worldwide research surveying over 800 business video conferencing users shows that in many companies video usage has spread well beyond the boardroom to general meeting rooms, desktops and even mobile devices. Several interesting questions remain, especially around getting employees to feel really comfortable with using video, and I asked Roger Farnsworth, a senior director of services at Polycom, the sponsor of the research, how he felt these issues were being addressed.

Rob: Why do users still feel they need handholding?

Roger: Often users find purpose-built video environments most intimidating.  Users are not reluctant to click links and mash buttons on their mobile devices or laptops; however, when entering a panelled boardroom with chic electronics they fear breaking something.  We've found that if a user is exposed to the group conference tools, either through training or simple how-to videos, they are far more likely to jump in and give it a go. In fact those users that have personal access to video as a regular part of their portfolio of business tools, personal Virtual Meeting Rooms (VMRs) for example, quickly catch on and begin using ad-hoc video more often.

So, some of the issue is a reluctance to let go from IT. Our research found that those whose video calls are more often assisted by IT are the ones most likely to also blame it for being complicated to use or unreliable. The survey found more than 50 percent of people who regularly use video rarely or never need IT to help them place a call. This is because video solutions have come a long way from their original iterations. Often, those with the most trepidation have previously been burned by a poor user experience which is why it is so important to get it right first time round.

Familiarity, coupled with regular usage, normalises the use of video, and this seems to have the biggest impact on whether employees can manage without IT assistance or not. From the research it was also clear that user confidence rises with informal use of video, and it seemed better when video was put to use in a variety of communications applications - not simply replacing regular meetings such as team meetings, but also personal communications like interviews and one-to-ones.

In this way, video becomes just another day-to-day communication and not a 'special event' conference.

This normalisation is further improved when location restrictions are removed. At one time, video conferencing systems were only available in boardrooms or other special meeting rooms and customer briefing centres. Desktop conferencing systems widened access further, but often only to the desks of senior management. The move to general desktop PCs with low cost cameras, laptops and now smartphones and tablets completely removes location as a barrier to video usage - except perhaps for reasons of privacy - and video on the move can add a new dimension to its value.

Rob: What does small screen mobile usage contribute to video collaboration?

Roger:  Almost every individual today uses a mobile device of one kind or another; in fact it is hard to find a person who doesn't. A majority of companies equip their workforce with mobile devices and support a BYOD culture. Companies are embracing the mobility of their workforce because it means that business-as-usual can be conducted flexibly from any location at any time. Flexible working, made possible by small screen devices, allows workers to deliver their work from outside the office premises and will soon become the norm. Its benefits extend to emergencies, such as extreme weather conditions or train strikes, when video collaboration over mobile devices ensures business continuity.

This is where the proliferation of small screen devices converges with the advances in video collaboration. Ultimately, this is how the investment in technology is paying off for businesses and shaping the future of the workplace. It really is a case of #videoforall, and the real question is: why should your business be left behind?

Ultimately the success of mobile devices has been less to do with technology and more about people and in the work context, process. For individuals there is increased choice - from BYOD, but also from the variety of form factors and sizes of mobile devices available. Good design, styling and even just fashion branding have helped foster personal connections with mobile devices.

The business process impact is more on the organisation than the individual, but it is important to both. At one time the constraints of technology imposed themselves too much on the process and the person. You must return to your desk to communicate with someone (or visit them in person), you must also return to your desk to access the wealth of data stored by your organisation's IT systems. For those who are not office workers - on a factory floor, treating patients in hospitals, on site in the field, travelling with goods - this distraction of having to go somewhere special to use a communications tool affects productivity, and let's face it, even office workers are not glued to their desks.

Mobile devices and the shifting of IT access and communications tools directly to where the individual needs to be to work on their business tasks helps them be more productive, responsive and to collaborate better. Now that video can be an intrinsic part of that approach too, there is no reason why it should not be adding similar value to the business process. To read more about the impact of video adoption, download this free report.

Making more of print

Rob Bamforth | No Comments
| More

Mobile, cloud, the internet of things and other fast expanding technologies might be making dramatic changes in many organisations, but the much heralded 'paperless office' is still not only an out-of-reach concept, but for most it is way out-of-sight. The reality is that a great many companies - large and small - still operate a hybrid mix of paper and digital workflows. And both of these workflow models need effective management.

Many organisations and especially their employees have bought into the concept of bring your own device (BYOD), where some of the complexities and costs of mobile working are essentially borne by the individual. When it comes to getting any mobile thoughts and experiences onto paper however, even among small and medium enterprises (SMEs), the onus quickly moves back to the organisation. Quocirca research shows that the business activities of three quarters of SMEs depend on printing and over half are struggling to control costs.

It is clear that trying to cut costs by simply and blindly limiting access to printing is not the right approach, yet without some controls the entire process of managing print resources and using paper workflows becomes unnecessarily problematic.

This means all companies need a well thought out strategy for print, and a complete plan for how to accomplish what is required. Major elements to address include:

  • Print reliability - On the face of it, printing might seem like a simple, low cost investment - after all, who hasn't been amazed at the low cost of a personal printer for their home computing needs? However, paper workflows are integral to most business processes and the consequences of downtime can be serious; cutting corners with poor quality devices that too often break down, or lax service with too many interruptions, will disrupt business processes and frustrate employees.

  • Software and services - While this seems like a heavily hardware dependent part of the business support environment, there are plenty of key roles for software and services to play in the print environment, either installed and managed directly by a company or as part of a set of managed print services (MPS). Not only is there a need to extend the availability of printer access to different types of devices and provide suitable queuing controls and management, but many organisations will also want to be able to limit who has access to which print resources and to have a way of accounting for usage.

  • Asset management - In addition to this operational management requirement, there is a need to manage the entire lifecycle of individual assets, including hardware systems - printers and multifunction printer (MFP) devices for document scanning, copying and print - as well as a wide range of consumables such as the ink, toner and paper which will require processes for ordering, stocking and effective deployment and installation.

  • Appropriate hardware - It is generally a false economy to keep any old IT equipment for too long. All technology improves, becomes cheaper to run and offers increased flexibility to users over time. Print is no different. New MFPs are far more efficient than older printers, both in power consumption and in more sophisticated paper, ink and toner handling, reducing the use of precious resources and keeping running costs down.

  • Network connectivity - MFP devices also provide greater flexibility through their additional functionality, especially when connected to the internet. Increasingly they have features such as the ability to scan, perform OCR to determine content and then route automatically to other locations - for example, sending expenses or invoices to be processed. Connectivity means they can share 'upwards' to offer storage of scanned information in a cloud service and 'downwards' to provide wireless printing from mobile devices and on demand printing.
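The scan-OCR-route workflow in that last point amounts to matching recognised text against a set of routing rules. A minimal sketch, assuming the OCR step has already produced plain text; the keywords and destination names are hypothetical, not any vendor's actual configuration:

```python
def route_scanned_document(text):
    """Pick a destination for a scanned document based on OCR'd text.

    Rules are checked in order; the first matching keyword wins.
    Keywords and destinations here are purely illustrative.
    """
    rules = [
        ("invoice", "accounts-payable"),
        ("expense", "expenses-processing"),
        ("contract", "legal"),
    ]
    lowered = text.lower()
    for keyword, destination in rules:
        if keyword in lowered:
            return destination
    return "general-inbox"  # fall back when nothing matches

print(route_scanned_document("INVOICE #4471 - due 30 days"))  # accounts-payable
print(route_scanned_document("Team meeting notes"))           # general-inbox
```

Real MFP and MPS software will layer metadata, confidence scores and approval steps on top, but keyword-driven routing of this kind is the basic mechanism being described.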

Printing has long been a key part of IT systems and, despite advances in displays and other technologies and the appeal of paperless (or at least 'less paper') offices, paper workflows still drive and support significant elements of almost all business processes.

Purchasing the right hardware to meet all user print needs will make employees more able to reach their productivity objectives and help ensure business continuity.

It should keep costs down with more efficient use of energy and other resources and have the management attributes that allow the organisation to not only securely control access, but also understand and identify what is being done at a sufficiently fine-grained level to properly account for usage.

A casual investigation of any print environment reveals a complex mix of requirements and the need for serious investment in many areas. However, the load can be spread to match usage through intelligent use of external financing and by investigating managed print services. The key is to ensure that available financing covers all aspects - increasingly complex hardware, software demands and managed services - to keep the ink and paper flowing.

For some thoughts on how to finance the changes required to address an enterprise print strategy as a whole, download this free report.

Stepping up to the cloud

Rob Bamforth | No Comments
| More

The concept of services provided from 'out there' over the network is often portrayed as what cloud is all about. Indeed, the commonly accepted definition of cloud is that it offers infrastructure, platform or software as a service. It might be a public cloud provider - that is, one offering services to anyone - an internal service from the organisation's own data centre, or a hybrid of both. There might often be reasons to start out exclusively with either public or private cloud, but the pragmatic approach will generally be balanced between the two, with an accompanying agility to allow for the movement of workloads.

A key reason behind cloud is flexibility. With a virtualised core, any operating platform or system can be provided on demand, either as a pay-as-you-use public commercial offering or as an adaptable base for a private cloud model.

A lower total cost of ownership is often an important driver for those considering adopting cloud. However, cost saving alone is rarely the whole picture, as organisations are also looking to drive efficiency and new working practices, and here cloud at the core is complementary to the adoption of mobile at the edge.

There are drivers offering opportunities to move into new services, such as extending access to core infrastructure to external users, again often for business efficiency, or gaining access to new market sectors, as well as adding functionality or extending applications.

The cloud model also permits experimentation - not only technical proofs of concept, but also business trials, for example to extend into new territories or markets or to try out alternative business models. This is a prime case for public cloud, to minimise the impact on internal resources.

A public cloud model has an impressively low upfront commitment or investment, and that leads to several strong use cases in addition to the ability to trial or experiment with new ideas - for example, being able to deal with planned and unplanned peaks or dips in capacity, as well as offering lower cost redundancy and software testing. Taking advantage of this will, however, require investment in existing internal systems to architect them to take full advantage of the external resources on offer.

Many organisations will have known peak times at year or quarter end, or at particular stages of their business cycle. Owning the extra capacity required to handle these will rarely be cost effective, so the ability to call upon public cloud resources is very valuable. Unplanned changes in demand could also occur on the back of an unexpected success that requires a rapid ramp up of capability, perhaps only for a short duration.

Rare use of additional capability is also why on demand public cloud services are ideal as a failover platform. Again, owning and maintaining an entire standby system for business continuity reasons is an unnecessary expense if use can be made of a public cloud provider, scaled up only when actually required in the event of a failure of primary systems.
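The economics of bursting to public cloud for known peaks can be sketched as a simple capacity-planning calculation: rent only the shortfall above owned capacity, in the provider's billing increments. A hedged illustration - the figures and the `burst_step` parameter are invented for the example, not any provider's actual pricing model:

```python
def plan_capacity(demand, owned_capacity, burst_step=10):
    """Decide how much public-cloud capacity to rent for each forecast period.

    demand:         list of forecast load (arbitrary units) per period
    owned_capacity: fixed on-premise capacity in the same units
    burst_step:     the provider's smallest rentable increment
    Returns a list of rented units per period, rounded up to burst_step.
    """
    rented = []
    for load in demand:
        shortfall = max(0, load - owned_capacity)
        # ceiling-divide to round up to the next rentable increment
        units = -(-shortfall // burst_step) * burst_step
        rented.append(units)
    return rented

# Quarter-end peak exceeds owned capacity; quiet periods need nothing rented
print(plan_capacity([80, 90, 140, 95], owned_capacity=100))  # [0, 0, 40, 0]
```

The same shape of calculation underpins the failover case: owned capacity covers normal running, and the rented figure only becomes non-zero in the periods when primary systems cannot carry the load.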

Private cloud on the other hand is sometimes viewed more sceptically from a cost benefit perspective, but it is here that investment in architecting a cloud-like core and transforming the data centre can also pay dividends over time. Not only is it a pre-requisite foundation for the fully flexible hybrid cloud model where workloads can be moved from private to public data centre capacity as demand arises, but it also encourages a more effective internal IT infrastructure capitalising on server virtualisation and therefore more efficient pooling of resources.

Putting aside sufficient resources and finding the right skills is difficult with any technology investment, and cloud is no different; it is these resourcing issues that often hold back cloud projects, rather than security per se.

Even those with an enthusiastic attitude to cloud adoption recognise this; despite believing they have more of the skills, they understand there is a need for significant investment in resources to make safe and secure use of the potential that cloud brings.

The cloud model of service delivery to an increasingly diverse and often mobile edge offers huge flexibility and agility advantages in both public and private capabilities to support workloads. As with mobile, this is a difficult task to perform incrementally, since it requires investment and a re-architecting at the heart of IT, but with the right strategy it ultimately reduces both costs and risks while delivering greater capability for IT to provide increased value to the business.

There are often incremental improvements as technology advances - the oft-quoted marketing mantra of smaller, faster, more. However, every so often there are bigger changes that involve a different structure or way of thinking about what has been built before. There may be a need to rip up old systems and throw some things out - although with planned migrations this can be minimised - and there will certainly be a need to invest in something new.

Rather than always trying to take this route in a gradual fashion, some innovations like cloud imply a step change in thinking in order to take full advantage of the opportunities offered. The diversity of access to IT in an increasingly small, smart and mobile fashion, coupled with a cloud-based core using flexible service provision is one such instance where change is significant and needs significant attention and investment to maximise its value.

For some thoughts on how to finance the changes required to address an enterprise cloud strategy as a whole, download this free report.

Avoiding a piecemeal mobile strategy

Rob Bamforth | No Comments
| More

It is easy to see that the world is 'going mobile', from smartphones and tablets to radical innovations such as wearable technologies and the highly connected internet of things. The impact on consumers is wide-ranging and fast-changing, but despite this, some businesses seem to think that this is a phenomenon they can take their time to evaluate, to see how it 'pans out'. Or their IT departments think that it will be OK to edge slowly towards decisions, perhaps by dealing with those shouting loudest (often senior executives) first.

This would be a mistake.

While it is true to say that early mobile adoption was often the domain of the 'pink collar' executives (so termed because they might buy their shirts at Thomas Pink in London or New York) with devices such as the business-like BlackBerry, mobile usage, acceptance and even eagerness has spread to all job roles through a multitude of desirable smartphones and tablets from Apple, Samsung and others.

This led to the trend of bring your own device (BYOD) among many employees. But rather than reducing company expenditure by removing the need for the organisation to buy the hardware, BYOD transfers investment demands into other areas, particularly IT management and security.

In previous deployments of mobile technology, select individuals could be given, say, a corporate laptop, and costs could be foreseen and planned through incremental increases in the number of employees to whom laptops were issued.

In the modern mobile world the challenge rapidly expands to cover all employees and all devices. Some organisations try to contain the growth of mobile access, but employees know the technology options from their consumer lives and often find that what they have at home is better than what they get in the workplace. That situation is neither efficient nor effective, so organisations need to embrace the mass adoption of mobile and apply the right resources to make it safe, secure and sustainable.

Accepting user choice, whether by permitting a subset of BYO devices or by the organisation buying devices more in line with employee preferences, essentially means that almost any type of device will need to be supported and managed. This means investment in a mobile device management (MDM) system, ideally closely related to existing desktop management tools, but with extra capabilities to deal with mobile-specific issues such as cellular and public Wi-Fi network access and airtime contracts.

The risk of loss or theft of mobile devices is not insignificant, although it is somewhat reduced when personal choice is exercised or the device belongs to the individual. Insurance can cover the hardware cost, but the loss of working time and data matters more. A suitably sophisticated MDM system makes it simple not only to apply uniform protection and configuration controls across the whole fleet of devices, but also to quickly reinstate a replacement device with the setup of one that has been lost or stolen.

However just protecting the hardware is no longer sufficient, especially as employees expect to be able to use mobile devices for personal as well as business purposes. Organisations must also take a strong interest in mobile applications and apply suitable levels of security to both what apps are on the devices and what corporate applications and data can be created, used or accessed on the move.

It might be that the best way to serve a large fleet of mobile users in mixed roles is to put a corporate app store in place and let employees self-provision access to central services and applications, with appropriate controls to ensure that only those employees who are permitted certain applications are allowed them.
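The control behind such a store reduces to an entitlement check: each catalogue entry lists the job roles allowed to self-provision it. A sketch, with an invented catalogue and role names purely for illustration:

```python
# Hypothetical corporate app store catalogue: app name -> roles entitled to it.
APP_CATALOGUE = {
    "expenses":   {"sales", "finance", "exec"},
    "crm-mobile": {"sales", "exec"},
    "payroll":    {"finance"},
}

def can_self_provision(app: str, role: str) -> bool:
    """Allow self-service installation only if the employee's role is entitled
    to the app; unknown apps are denied by default."""
    return role in APP_CATALOGUE.get(app, set())
```

Denying unknown apps by default matters here: the store should fail closed, so a catalogue mistake withholds an app rather than exposing one.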

To gain enough future flexibility to bring the best out of mobile working, this type of strategy and support model needs to be put in place today, rather than letting the ad hoc creep of diverse mobile devices, users and applications chip away at IT support resources.

Getting it right may require more investment upfront, but it will save time and money in the long run and ensure that employees are as productive as possible from the outset.

In addition to controlling who has access to what apps and on what devices, the vulnerability and integrity of data accessed and used on the move needs to be assessed and securely managed.

This is the issue most likely to be keeping IT managers awake at night, and no wonder. There have been plenty of high-profile losses of data, and mobile makes the task of data security much harder. There are software tools that will protect against data loss and leakage, as well as applying digital rights management. Alongside the technology, though, there is one vital ingredient that is frequently overlooked - training. All too often only simple 'how to use' training is put in place, but mobile technologies encourage such a dynamic shift in work practices that employees would benefit greatly from coaching on how to get the best out of the tools at their disposal, safely and securely.
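At its core, the kind of policy such a data-leakage tool applies before letting corporate data onto a device is a comparison of the data's sensitivity against the device's trust level. A deliberately simplified sketch, with invented labels and device states rather than any real product's taxonomy:

```python
# Hypothetical sensitivity labels and device trust levels, ranked numerically.
SENSITIVITY_RANK = {"public": 0, "internal": 1, "confidential": 2}
DEVICE_CLEARANCE = {"managed": 2, "byod-enrolled": 1, "unmanaged": 0}

def may_sync(doc_label: str, device_state: str) -> bool:
    """Allow a document onto a device only when the device's clearance
    meets or exceeds the document's sensitivity."""
    return DEVICE_CLEARANCE[device_state] >= SENSITIVITY_RANK[doc_label]
```

Real DLP products layer far more on top (content inspection, encryption, remote revocation), but the rank comparison is the decision they all ultimately make.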

Effective mobility benefits from a wide strategy that encompasses productivity and training. It is thus not a piecemeal approach, but an all-embracing one. It doesn't necessarily mean providing everyone with the latest gadget from California, but it does mean having a way to cope with managing a portfolio of technologies, dealing with complexity, and making a step change from thinking small (focusing on the devices) to thinking big (how it changes the business).

Overall, mobile brings huge benefits but also significant changes to organisations, and the old model of small proofs of concept and slow rollouts is no longer valid. Employees are well aware of what technology is available and want to participate in selecting what works best for them as individuals. However, the collective needs of the organisation mean that controls need to be put in place, and IT departments need a strategy for the safe management of the devices, apps and, most critically, data used by employees on the move.

For some thoughts on how to finance the changes required to address an enterprise mobile strategy as a whole, download this free report.

