Recently in CIO Category

CIOs need to nurture raw IT talent to power the future economy


Each year, with the publication of A-level results, hundreds of thousands of school leavers embark on a journey that ultimately fuels the talent pool feeding the UK economy.

Some argue that a generation brought up on the internet has different expectations of work than previous generations. But while the worldwide web was never part of their early education, the people who started their careers in the 1990s have only ever worked in the internet era. It is these people who are now the middle and senior managers in business and government.

There is an opportunity for these managers to think differently about how work should be organised and consider the real value of staff and how to motivate them, in a bid to help their organisation become the next Facebook or Google.

Empires are not created overnight; revolution is never instant. But there is a groundswell, a shift in public opinion. The trusted brands of the past look prehistoric compared with the likes of Amazon or Apple.

Two decades after the 30-year-old Jeff Bezos established Amazon, a new generation of business leaders are realising the tools of the internet era - social, cloud, analytics and mobile - offer the potential to rewrite business rules. In this respect, IT can make a difference. Amazon digitised book selling in 1995; today digitisation is set to propel old-school business ideas into the internet era.

How often has IT been greeted by the remark: "I didn't know it could do that?" More often than not, users complain the application does not do something they want. Today's school leavers instinctively know what "right" feels like from an application perspective.

Rather than hire to fulfil a current business need, there is a case for CIOs to nurture the skills their organisations will need in the future through job placements, apprenticeships and graduate recruitment programmes. In doing so, the CIO can build an IT talent pool to inform and support senior management on the journey to digitisation.

Blame the Poisson


I recently met Mark Rodbert, CEO of Idax Software, who has an interesting theory on statistics. We often see the 'normal' bell-shaped distribution, where the top of the bell represents the most likely outcomes and the left and right tails (outliers) are rare events. Rodbert believes real-world events are more likely to follow a Poisson distribution - and this has implications for IT. In this guest blog, Rodbert explains the theory:

At idax we spend a lot of time demonstrating that maths really can help describe the real world. As idax uses mathematics to identify individuals with unusual access, it's pretty important that our clients share our understanding.

Of course, people are used to getting on planes, making phone calls or using Amazon, all of which require pretty sophisticated analytics, but in the realms of big data some things are still counterintuitive. If we got two sales leads last week and one the week before, we're on an upward trend; if my train was late twice last week, it will be late this week; and, most importantly for us, if I find several people with a high-risk profile in their access, then it must be someone's fault.


But how likely are these events, really? Well, it turns out that what we need is not someone to blame, but the Poisson distribution. The Poisson is a very versatile statistical tool, rather like a lopsided normal distribution, that is good for estimating event frequency, especially if the events are rare. And my all-time favourite Poisson example concerns the distribution of gold medals for Team GB at the London Olympics. It seems strange to remember that at the start of the Games we went a whole three days without a British gold medal. As the press shrieked that we were heading for disaster, unable to meet our targets despite massive investment, the nation held its breath. So what really were Mo Farah's chances?

Well, as we all now know, actually pretty good. Of course only an idiot would assume that winning 29 gold medals over 16 days should equate to two every day with Sundays off, but how likely was a medal-less day? Well, if you assume a Poisson distribution and take an average of 1.8 a day, the chance of a day with no medals is 16%. The chances of a super Saturday with six medals were actually 7%.
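
For readers who want to check the arithmetic, here is a minimal sketch of the Poisson calculation described above, using only Python's standard library. The 1.8-golds-a-day rate is the figure quoted in the post; the function and variable names are purely illustrative.

```python
import math

def poisson_pmf(k: int, rate: float) -> float:
    """Probability of exactly k events per interval under a Poisson(rate) model."""
    return math.exp(-rate) * rate ** k / math.factorial(k)

# Team GB's London 2012 rate, as quoted in the post: roughly 29 golds over 16 days
rate = 29 / 16  # ~1.8 golds per day

for k in range(7):
    print(f"P({k} golds in a day) = {poisson_pmf(k, rate):.1%}")

# P(0) comes out at roughly 16%, the figure quoted above: even with nearly
# two golds a day on average, a blank day is far from surprising.
```
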

[Chart: Poisson prediction of daily gold-medal counts compared with Team GB's actual results at London 2012]

The bad news is that, as you can see from the chart above, the Poisson doesn't quite fit what actually happened. The good news is that a day without any golds was actually more likely, at 38% of all days. The least likely outcome (below 5%) was a single-gold day, which only happened once - the last day of the boxing, since you ask. So why does any of this matter? Because it shows that human beings are very bad at estimating how frequently things are likely to happen. We assume that events are evenly distributed and get confused when they're not. Not much of a problem with gold medals; quite a big problem when you're trying to detect fraud, rogue trading and high levels of access risk. We assume that because unusual failures are, well, unusual, they are also uniformly infrequent.

So when it comes to access and identity management, it's clear that an approach which defines cumulative controls by exception management - otherwise known as "my boss checks my access" - will perform well against the frequent-but-not-so-bad cases, but does nothing to stop the infrequent-but-high-risk ones. So the good news is that if you ask your staff why they have access to something, you'll probably remove a few copies of Visio, but you're unlikely to spot the guy with access to the general ledger and the payments system who's ripping the company off. Which just goes to show that what companies need is real analytical capability - and, of course, a bit of mathematics.

Mark Rodbert is CEO of Idax Software, the identity analytics software provider

Big data technology has its work cut out to harness web analytics




What can we learn from companies such as eBay and Amazon? These internet businesses are at the cutting edge of technology.

The recent Gartner CRM summit gave delegates an understanding of what CRM means to a web-only retailer. The processing eBay conducts to understand customers better, for example, is eye-watering. The web gives retailers incredible insights into customer service. It is not only possible to track a customer's identity but, thanks to smart web analytics, eBay can follow the buyer's journey.

David Stephenson, head of global business analytics at eBay, says it's a bit like strapping a video camera to a customer's head. Recording every interaction a customer makes means the auction site collects millions of hours of web analytics. Making sense of it all is a big data problem. In fact, eBay produces 50TB of machine-generated data daily. It also needs to process 100PB of data every day to understand what its customers are doing. Sampling this data may have worked in the past, but this only gives a statistical snapshot.
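
As a toy illustration of why sampling only gives a snapshot, the hedged sketch below (not eBay's actual pipeline - the event, the rates and the sample size are all invented) compares a full count of a rare behaviour in a synthetic clickstream with an estimate from a 1% sample.

```python
import random

random.seed(42)

# Synthetic clickstream: one record per page view, flagging a rare event
# we care about (say, abandoning a basket on the payment page).
N = 1_000_000
clicks = [{"abandoned_at_payment": random.random() < 0.0005} for _ in range(N)]

# Full scan: the "collect and analyse everything" approach.
true_count = sum(c["abandoned_at_payment"] for c in clicks)

# 1% sample: much cheaper, but only a statistical snapshot.
sample = random.sample(clicks, N // 100)
estimated_count = sum(c["abandoned_at_payment"] for c in sample) * 100

print(f"full scan : {true_count} abandonments")
print(f"1% sample : ~{estimated_count} abandonments (estimate)")
# For rare events the sampled estimate can be well off the true figure,
# which is why analysing complete journey data is worth the engineering cost.
```
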

In the era of customer focus, eBay strives to analyse all the data it collects. With this information, Stephenson believes eBay can offer its customers intuitive, almost intelligent, recommendations. The technology supporting the web analytics eBay undertakes does not come cheap. Nor is it available off the shelf. There is no such thing as a "big data solution" for the level of data processing eBay shoulders.

The company needs to work with suppliers to build bespoke hardware and software for its requirements, because using a traditional data warehouse would be too slow and prohibitively expensive to scale. But even a custom data processing engine cannot comprise the whole answer.

The firm uses three systems: a traditional data warehouse appliance, a NoSQL database and the custom appliance to analyse its customers' journeys. So while it makes perfect sense for businesses of all sizes to use web analytics to understand customer interaction, an immense amount of technical investment and expertise is required to do so effectively.


Will Microsoft pursue a single Windows kernel?


At the Build developer conference in April, Microsoft is expected to reveal more details of a future version of Windows, codenamed Threshold.

Rumours on the web suggest that Threshold could become Windows 9. The OS is set to bring together the Windows Phone, Windows 8 and Xbox One operating systems.

Microsoft's previous attempts at simplifying its various operating systems have had varying degrees of success.

Windows 2000 Professional and Windows Millennium Edition (Me) merged into Windows XP, with a single kernel for home and professional users.

On the mobile side, Microsoft attempted to provide a common look and feel with Windows CE, and cross-platform development, but at the time, a desktop-like GUI on a smartphone did not gain acceptance.

With the evolution of the Windows Phone OS, Microsoft introduced a touch UI with tiles, which has made its way onto Windows 8. This time, however, the touchscreen UI has not sat well in the corporate market, and home users have generally preferred cheaper Android-powered tablets over Windows 8-powered ones.

The fact that Microsoft is looking to rebrand Windows RT, its low-end ARM-powered Windows operating system, suggests the company is moving towards a single Windows OS across all devices. Interestingly, the Xbox One also runs a version of Windows.

A single core OS would greatly simplify application development and integration. For Microsoft, it would mean core services such as Skype, Windows Live and Office 365 would work seamlessly across the Xbox One, Windows-powered tablets and smartphones, and traditional PCs.

The wider Microsoft ecosystem would benefit too - in theory, B2C companies could develop services once and target customers across all three platforms.

All will be revealed at Build 2014, but Microsoft has some big changes to make this year, not least hiring a new CEO to take over from Steve Ballmer.

Video interview: Kim Stevenson, CIO of Intel, on BYOD


I recently spoke to Kim Stevenson, CIO at Intel. When she started in IT, people used to try to issue one device per employee. She says Intel has been running BYOD for three years. "People like choice and they pick devices for the work they wish to accomplish." At Intel, this means becoming a more productive employee. So as an IT professional, she says it is important to understand this driver, rather than try to resist the change and loss of control that occurs through BYOD.

She says: "We are in an era of business productivity. It is perfectly reasonable for employees to have seven devices. Through our BYOD programme we have documented a gain of 57 minutes in productivity per employee."

For IT professionals, she says the number one issue they face is "velocity". Business unit managers can buy a service directly from a service provider, and the consumer IT experience is better than enterprise IT. She says IT must address how to deliver the consumer IT experience within the confines of the enterprise.

Tackling the Big Blue software licensing challenge


In this guest post, Martin Prendergast, CEO and co-founder of Concorde Solutions and board member of the Cloud Industry Forum, writes about issues to consider when licensing IBM software.

Enterprise software can represent as much as 30% of an organisation's IT spend, so at a time when budgets are still being squeezed like never before, CIOs are understandably being careful to ensure that their investment in software represents value for money.

However, software licensing costs can be a real bugbear for CIOs, with the potential to quickly ratchet up the overall price through painful non-compliance fines, unwittingly incurred as a result of software vendors' complex and convoluted terms.

The challenge is exacerbated as each software vendor has its very own unique brand of complexity, which makes the jobs of the IT asset manager, the CIO and the CFO even more taxing. In this article, we examine some of the key challenges and solutions for dealing with IBM's software licensing.

The problematic portfolio position

IBM has more than 1,500 products on offer, available under around 30 licensing metrics; each metric may differ only very slightly, but can still have a significant impact on licensing requirements and position. Historically, the picture has been further complicated by IBM's well-known practice of acquisition, which expands the product portfolio and the set of licensing metrics even further. IBM may choose to retain the licensing metrics of the company it acquires - and sometimes it may choose not to.

For customers this can be incredibly difficult to track, and without careful management and analysis of their IT estate, businesses can find themselves operating under altered metrics and contracts without realising. It goes without saying that non-compliance fines are often the result - and large software vendors, as we know, have found a lucrative income stream in such levies. It is relatively widely known that IBM tends to be one of the most aggressive vendors on the market when it comes to non-compliance. IBM's fines, which can include a two-year back penalty on maintenance clauses in addition to the cost of 'missing' licenses, are considered harsh even in comparison with other large vendors.

Indeed, just a few years ago, IBM sought to audit all of its corporate customers without warning and with huge audit teams, which netted it a considerable amount of income. Of course, IBM isn't the only vendor that is a fan of the surprise audit, and there are a few things that businesses can do to ensure that, if an audit arrives, they are not caught unaware.

1. Preparation is the first line of defence - ideally, businesses should seek independent third-party confirmation of their licensing position both pre- and post-audit.

2. Check sub-capacity eligibility - IBM has now lengthened the list of its products that are eligible for its sub-capacity licensing.

3. Deal with sub-capacity licensing properly - irrespective of how enterprises partition their machines, without a sub-capacity license in place they may still be charged for full capacity (a rough illustration follows this list).
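
To make the full- versus sub-capacity distinction concrete, here is a rough, simplified sketch using IBM-style Processor Value Units (PVUs). The 70-PVUs-per-core figure and the machine sizes are illustrative assumptions only, not a substitute for IBM's own PVU tables or for ILMT reporting data.

```python
# Simplified illustration of full-capacity vs sub-capacity PVU licensing.
# The PVU-per-core figure and the server sizes below are assumptions for
# illustration only; real entitlements depend on IBM's published PVU tables.

PVU_PER_CORE = 70

def full_capacity_pvus(physical_cores: int) -> int:
    """Full capacity: license every activated core in the physical host."""
    return physical_cores * PVU_PER_CORE

def sub_capacity_pvus(physical_cores: int, vm_cores: int) -> int:
    """Sub-capacity: license only the cores available to the partitions
    running the product, capped at the full capacity of the host."""
    return min(vm_cores, physical_cores) * PVU_PER_CORE

# A 32-core host running the IBM product in a 4-core virtual machine.
print("Full capacity :", full_capacity_pvus(32), "PVUs")    # 2,240
print("Sub-capacity  :", sub_capacity_pvus(32, 4), "PVUs")  # 280
# Without a sub-capacity agreement (and the required reporting) in place,
# the full-capacity figure is what an audit will assume.
```
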

Five facts on Dell (Denali)


Denali is the holding company through which Michael Dell hopes to reinvent Dell. A US Securities and Exchange Commission (SEC) filing at the end of March shows the company will actively move away from the PC and high-volume servers.

  1. Denali hired John Swainson, former CEO of CA, to run the software business. The company will look at expanding into areas such as BI and storage software, which present a big opportunity, although software as a service will eat away at revenue.
  2. While Denali has benefited from the trend to migrate workloads from expensive Unix systems to commodity x86 servers, Gartner notes that this is potentially short term. According to Gartner, the move to virtualisation and server consolidation will enable businesses to defer server purchases.
  3. A section of the SEC filing prepared by J P Morgan reflects this challenge. The investment bank highlighted Denali management's plans to reduce margins on end-user computing devices, servers and storage to reflect increasingly aggressive competition and buyers spending less.
  4. The acquisition of Force10 in 2011 will help drive networking sales. IDC expects the networking business to grow 7.3%. Gartner expects its software revenue to grow 7.7%, due to the acquisition of Quest. Storage is expected to suffer as a result of the decline in Dell's long-standing relationship with EMC.
  5. On the services side, J P Morgan notes that Denali should see modest growth in its PC maintenance business, but competitive pricing will put pressure on traditional outsourcing.

Oracle should heed warnings from the trends in enterprise software


The findings from Forrester's latest research on Oracle point to a worrying trend in the enterprise software landscape. Businesses are not generally doing large, transformational IT projects built around traditional enterprise resource planning (ERP).

The key suppliers are adapting their enterprise software portfolios in a bid to drive more sales. But the CIOs Forrester spoke to are not convinced it is a strategy that is working for Oracle.

In Forrester's Oracle's Dilemma: Applications Unlimited report, many customers say they are happy with the software they are running and have no real plans to migrate onto Oracle's future enterprise platform. Since Oracle is a strategic supplier to many, there is little interest among CIOs in migrating away. There are concerns, however, that Oracle may turn some of the products they have deployed into cash cows, potentially with high annual maintenance fees and licensing costs.

Members of the IT directors' group, the Corporate IT Forum, are angered by the changes to Oracle licensing. Head of research Ollie Ross told Computer Weekly that members were being pushed into taking certain technical directions, such as OVM (Oracle VM) rather than VMware. The forum's executive director, David Roberts, believes many CIOs are reacting negatively to Oracle's exceptionally high-pressure sales techniques. This is reflected in the supplier's poor software licence revenue compared with its nearest rival, SAP. If businesses are not upgrading at a rate that looks good on the company's balance sheet, Oracle will need to take a different approach.

Newham Borough CIO Geoff Connell is concerned that Oracle (and other top-tier vendors) will increase licensing costs, because their customers are "locked into" their products due to historical investments. He argues that many software suppliers appear to be ignoring the financial climate and are attempting to make up for reduced sales volumes with higher unit costs.

Coercing customers to buy more software is not the right way to go. But Oracle executives have not shown much willingness to go wholeheartedly down the software as a service (SaaS) route, or even to offer a roadmap for integrating SaaS and on-premise enterprise IT. Nor has Oracle been willing to adapt software licensing to make it more virtual machine-friendly. The research shows customers are unhappy, and the time for Oracle to make some tough decisions is long overdue.

Connell believes if Oracle and other leading suppliers continue to hike prices, users will abandon commercial enterprise software for open source alternatives.


Virtual Modelling: A new IT optimisation tool


A few weeks ago I interviewed Paul Michaels, CEO of business technology consultancy ImprovIT, about a methodology for modelling decision-making. In this guest blog post Robert Saxby, consulting director at ImprovIT, explains a bit more about how the methodology, called Virtual Modelling, works, and the business benefits.


When it comes to re-engineering IT environments to save money or achieve best practice, a trial-and-error approach can be both complicated and costly. Virtual Modelling is a new business tool that uses 'what if?' scenarios to simulate real-world outcomes and identify efficiencies, future strategies and best sourcing options without chopping, changing or disrupting ongoing operations.

The challenge

CIOs today are caught between a rock and a hard place: having to slash IT costs while retaining productivity and service quality - often due to government mandate. Of course, cost-cutting pressures are nothing new, and for many there is little blood left in the stone. The question now is: "How and where can we make further reductions without knee-capping the entire operation?" There are plenty of apocryphal tales about organisations axing staff and abandoning efficiency-enabling technology projects, only to discover their actions have mortally wounded deliverables and reputation. The result: a panicked and costly rehiring and/or re-purchasing exercise to redress the balance.

Finding the cost/quality balance

Wouldn't it be great if you could work out the exact cost and productivity balance without the cost and disruption of making changes on a trial-and-error basis? Virtual Modelling creates scenarios based on real, current and accurate data mined from your own ICT operation, and can predict real-world outcomes without impacting current operations. But it can only do this based on available KPI data, and if that data doesn't already exist it must be generated via benchmarking studies. For as Lord Kelvin, the 19th-century physicist, once said: 'If you cannot measure it, you cannot improve it.'

Measure it first

Once created, this baseline data provides the tools to compare performance against other public service (and commercial) entities of a similar size and complexity in terms of things like value for money, quality of service, best practice and competitive pricing. Digging a bit deeper, you can also find out where your organisation stands in relation to best practice standards for staffing (quality and quantity), process complexity, outsourcers (scope and service levels) and IT governance.

All of this information is then used to create 'what if?' scenarios, typically dealing with areas such as cost/price, volumes, staffing, quality and service levels, service scope, complexity, project efficiency and process maturity. Provided the model has been mapped with well-researched data, the outcomes obtained offer accurate indicators that can be used to make decisions about outsourcing, staffing, process re-engineering, cloud migration or anything else.

In the video below, Paul Michaels, CEO of ImprovIT, discusses how our Virtual Modelling methodology helps decision-making within a project.


Building up the model

Virtual Modelling is designed to pinpoint the impact of one or more project parameters upon all the others. For example: if I change 'Service Quality' (SLAs) and/or 'Service Scope', what effect will this have on 'Cost'? Or: if I reduce 'Complexity', what effect will this have on 'Processes'? It also shows the changing balance of the whole picture when one or more parameters are altered. For example: if I want to increase 'Volumes' or 'Service Quality', what changes do I need to make to all the other segments, and how will this impact the enterprise as a whole?
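
The following is a minimal, made-up sketch of the kind of parameter linkage described above. It is not ImprovIT's model: the baseline figures, coefficients and relationships are invented purely to show how a 'what if?' scenario propagates a change in one parameter through to the others.

```python
# Toy 'what if?' model: not ImprovIT's methodology, just an illustration of
# how changing one parameter ripples through the others. All figures invented.

BASELINE = {
    "cost_m": 10.0,           # annual IT cost, in £m
    "service_quality": 0.95,  # SLA attainment
    "complexity": 1.0,        # index, 1.0 = the current estate
}

def what_if(service_quality: float, complexity: float) -> dict:
    """Estimate cost for a scenario, relative to the baseline.

    Assumed (illustrative) relationships: raising SLA attainment above the
    baseline costs disproportionately more; simplifying the estate reduces
    cost roughly in proportion."""
    quality_premium = 1 + 4 * max(0.0, service_quality - BASELINE["service_quality"])
    complexity_factor = complexity / BASELINE["complexity"]
    cost = BASELINE["cost_m"] * quality_premium * complexity_factor
    return {"cost_m": round(cost, 2),
            "service_quality": service_quality,
            "complexity": complexity}

# Scenario: raise SLA attainment to 99% while cutting estate complexity by 20%.
print(what_if(service_quality=0.99, complexity=0.8))
```
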

Virtual Modelling System

So, to find the Goldilocks balance between IT cost and service quality, let's start by feeding staffing metrics into the simulation model, given the high impact of staffing on cost. But this isn't just about a straightforward set of numbers: it also has to allow for a range of 'soft' factors such as varying levels of knowledge, skill sets and the specialist expertise that can make an individual or team difficult to replace.

Next, let's look at complexity - typically the highest contributor to an IT department's spend after staffing. This involves anything and everything from security and data confidentiality to high-availability requirements, legacy system integration and the number of nodes in the enterprise network. Rule of thumb: the greater the complexity, the higher the cost. A Virtual Modelling analysis determines where simplifications can be made without jeopardising mission-criticality. Once it is established that these changes are advisable, modelling can also provide an accurate estimate of cost, timelines and the impact on staffing and service levels.

Then there is the question of outsourcing. Will it save money? What services should be outsourced? And if we are to outsource, what kind of service - a traditional provider or a cloud-based service? And what business model: IaaS, SaaS or PaaS? Data fed into a simulation model can provide an accurate estimate of the likely ROI and TCO - with timescales - of each option.
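
As a rough illustration of the kind of output such a model produces, the sketch below compares the multi-year total cost of ownership of keeping a workload in-house against a cloud option. Every number in it is an invented assumption for the sake of the example, not a benchmark and not ImprovIT's figures.

```python
# Illustrative TCO comparison over a planning horizon. All inputs are
# invented assumptions; a real model would be fed with benchmarked KPI data.

YEARS = 5

def tco_on_premise(hardware=200_000, annual_run=150_000, refresh_year=3,
                   refresh_cost=120_000) -> float:
    """Up-front hardware, annual running cost, plus a mid-life refresh."""
    total = hardware + annual_run * YEARS
    if refresh_year <= YEARS:
        total += refresh_cost
    return total

def tco_cloud(migration=80_000, annual_subscription=160_000,
              annual_saving_rate=0.03) -> float:
    """One-off migration plus a subscription assumed to fall slightly each year."""
    total = migration
    fee = annual_subscription
    for _ in range(YEARS):
        total += fee
        fee *= 1 - annual_saving_rate
    return total

print(f"On-premise 5-year TCO: £{tco_on_premise():,.0f}")
print(f"Cloud      5-year TCO: £{tco_cloud():,.0f}")
# The point is not these invented numbers but that, once the inputs are real,
# the same comparison can be rerun for IaaS, PaaS and SaaS options.
```
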

Process maturity also impacts the cost/performance balance. There are industry standards that provide best practice guidelines, such as ITIL (IT Infrastructure Library), 'Agile' and 'Lean' (a production practice that looks to reduce resource expenditure to the minimum required to deliver value to the customer). Comparisons with these guidelines can indicate where improvements can be made, but Virtual Modelling can determine what they will cost and whether they are worth the disruption to operations. It's also worth noting that achieving process maturity is rarely a quick win: it takes time and requires clear, unequivocal goals and plans led from the top.

Cloud migration

Given the chequered history of public sector IT projects, and the challenges so many ICT departments face in deciding whether, when and how to migrate to the cloud, and in optimising resources on an ever-diminishing budget, using Virtual Modelling to run scenarios on all the available options provides new decision-making tools that help to identify the best roadmap ahead, while avoiding wrong roads and dead-ends.

Robert Saxby is consulting director, ImprovIT


Software licence audits: Confidence in Your Choices


Over the last few weeks Computer Weekly has written about software licensing and how suppliers are demanding IT departments run costly software audits. At the same time, we have started looking at the complexities of licensing, such as in a virtualised environment.

In this guest blog post, Martin Thompson, a SAM consultant and founder of The ITAM Review and The ITSM Review, provides some top tips on what to do when you receive an audit letter:


Payment Protection Insurance (PPI) spam is in vogue.

You may have received one or two of these recently:

"You are entitled to £2,648 in compensation from mis-sold PPI on credit cards or loans."

PPI claims and other spam solicitations are the bane of our inboxes. The vast majority of us know to simply ignore them. Unfortunately the handful of those who do respond justifies the exercise to the spammers. 

This mass-marketing technique is used in exactly the same fashion by trade bodies such as the BSA and FAST to push their agenda and trigger software audit activity.

Supplier audits are a fact of life. Some software audit requests are serious and expensive; some are merely spoof marketing campaigns. How can IT professionals distinguish between the two?

Whilst I'm not a legal expert, fifteen years in this industry has taught me that there are instances when you should respond to an audit request and instances when you should simply walk away.

When to Take Software Audit Requests Seriously

In my opinion there are two instances when you should take software audits seriously:

  1. When you are approached by a software publisher directly with reference to a signed contract
  2. When you are approached by an organisation with real proof of a breach of intellectual property law.

Contracts with software publishers have 'audit clauses' - the right to come and audit you periodically, at your own cost. Your company either signed this and agreed to it, or will need to fight against it. Smart companies negotiate it out of the contract by demonstrating maturity in their internal processes.

Breaches of intellectual property supported by evidence are a legal dispute and should be treated as such - by passing the issue over to your legal team in the first instance.

When to Ignore Software Audit Requests

Requests for 'Self-Audit' or other direct mail fishing exercises can be ignored.

Trade bodies such as BSA and FAST commonly write letters to companies requesting them to 'Self-Audit' or declare a 'Software Amnesty'.

These organisations are masters at crafting well-written, legal-sounding letters, but they have no legal authority whatsoever. Nor do they have the resources to follow up on every letter sent.

Just like any other complaint made to your business, it should only be taken seriously if there is firm evidence or the organisation issuing the dispute is supported by the appropriate government agency. For example, the Federation Against Software Theft (FAST) has no teeth whatsoever unless accompanied by HM Customs and Excise.

Confidence in Your Choices

IT departments with the appropriate Software Asset Management (SAM) processes in place have both the confidence and the supporting data to discriminate between bogus claims and genuine supplier audit requests.

Whilst much noise is made in the industry about senior management being sent to prison or the company name being dragged through the gutter, the real and compelling downside to a lack of software management is UNBUDGETED cost and DISRUPTION: surprise license costs, and massive disruption whilst IT staff are diverted from key projects to attend to an audit or hunt down the appropriate data.

Unexpected software audits can be good for your health in the longer term if they allow the organisation to realise it is out of control.

SAM is so much more than compliance and counting licenses. Organisations with a solid SAM practice are more nimble, competitive and dynamic. No more stalling on that virtualisation project because we're unsure of the licensing costs, no more uncertainty about moving to the cloud because we don't know how that leaves us contractually. SAM provides the business intelligence to innovate and take action.

Martin is an independent software industry analyst, SAM consultant and founder of The ITAM Review and The ITSM Review. Learn more about him here and connect with him on Twitter or LinkedIn.


About this Archive

This page is an archive of recent entries in the CIO category.

Database Notes and Queries is the next category.

Find recent content on the main index or look in the archives to find all content.
