In this guest post, Martin Prendergast, CEO and co-founder of Concorde Solutions and board member of the Cloud Industry Forum, writes about issues to consider when licensing IBM software.
Enterprise software can represent as much as 30% of an organisation's IT spend, so at a time when budgets are still being squeezed like never before, CIOs are understandably being careful to ensure that their investment in software represents value for money.
However, software licensing costs can be a real bugbear for CIOs, with the potential to quickly ratchet up the overall price through painful non-compliance fines, unwittingly incurred as a result of software vendors' complex and convoluted terms.
The challenge is exacerbated because each software vendor has its own unique brand of complexity, which makes the jobs of the IT asset manager, the CIO and the CFO even more taxing. In this article, we examine some of the key challenges and solutions for dealing with IBM's software licensing.

The problematic portfolio position
IBM has over 1,500 products on offer, available under around 30 licensing metrics; each metric may differ only very slightly, but can still have a significant impact on licensing requirements and position. Historically, the picture has been further complicated by IBM's well-known practice of acquisitions, which has expanded the product portfolio and licensing metrics even further. IBM may choose to retain the licensing metrics of the company it acquires, or it may choose not to.
For customers this can be incredibly difficult to track, and without careful management and analysis of their IT estate, businesses can find themselves operating under altered metrics and contracts without realising it. Non-compliance fines are often the result, and large software vendors, as we know, have found a lucrative income stream in such levies.

It's relatively widely known that IBM tends to be one of the most aggressive vendors on the market when it comes to non-compliance. IBM's fines, which can include a two-year back charge for maintenance in addition to the cost of 'missing' licences, are considered harsh even in comparison with other large vendors.
Indeed, just a few years ago, IBM sought to audit all of its corporate customers without warning and with huge audit teams, which netted them a considerable amount of income. Of course, IBM isn't the only vendor that is a fan of the surprise audit and there are a couple of things that businesses can do to ensure that if an audit arrives, they're not caught unaware.
1. Preparation is the first line of defence - ideally businesses should seek independent third-party confirmation of their licensing position both pre and post audit.
2. Check sub-capacity eligibility - IBM has now lengthened the list of its products that are eligible for its sub-capacity licensing.
3. Dealing with sub-capacity licensing - irrespective of how enterprises partition their machines, without a sub-capacity licence in place, they may still be charged at full capacity.
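To make the full-capacity versus sub-capacity distinction concrete, here is a minimal sketch of an IBM-style PVU (Processor Value Unit) calculation. The PVU-per-core figure is invented for illustration; real values come from IBM's published PVU tables and your own agreement.

```python
# Hypothetical illustration of PVU-based licensing. The PVU-per-core value
# below is an assumption for the example, not a real IBM figure.

def pvu_required(cores: int, pvu_per_core: int) -> int:
    """Licence requirement in PVUs for a given number of cores."""
    return cores * pvu_per_core

# A 16-core host where the product actually runs in a 4-core partition:
PVU_PER_CORE = 70  # assumed value for this processor family

full_capacity = pvu_required(16, PVU_PER_CORE)  # licensed on the whole machine
sub_capacity = pvu_required(4, PVU_PER_CORE)    # licensed on the partition only

print(f"Full capacity: {full_capacity} PVUs")   # 1120 PVUs
print(f"Sub-capacity:  {sub_capacity} PVUs")    # 280 PVUs
```

Without a sub-capacity agreement (and the monitoring tooling it requires), the full-capacity figure applies even though the software only runs in the smaller partition.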
Denali is the holding company through which Michael Dell hopes to reinvent Dell. A US Securities and Exchange Commission (SEC) filing at the end of March shows the company will actively move away from PCs and high-volume servers.
Denali has hired John Swainson, former CEO of CA, to run its software business, which it will look to expand into areas such as business intelligence (BI) and storage software. These present huge opportunities, although software as a service will eat away at revenue.
While Denali has benefited from the trend to migrate workloads from expensive Unix systems to commodity x86 servers, Gartner notes that this is potentially short term. According to Gartner, the move to virtualisation and server consolidation will enable businesses to defer server purchases.
A section of the SEC filing prepared by J P Morgan reflects this challenge. The investment bank highlighted Denali's management plans around reducing margins from end user computing devices, servers and storage to reflect increasingly aggressive competition and buyers spending less.
The acquisition of Force10 in 2011 will help drive networking sales. IDC expects the networking business to grow 7.3%. Gartner expects its software revenue to grow 7.7%, due to the acquisition of Quest. Storage is expected to suffer as a result of the decline in Dell's long-standing relationship with EMC.
On the services side, J P Morgan notes that Denali should see modest growth in its PC maintenance business, but competitive pricing will put pressure on traditional outsourcing.
The latest generation of laptops and hybrid devices use solid-state disks to boost performance and speed up the time it takes for the operating system to boot. In this guest blog post, Robert Winter from Kroll Ontrack writes about some of the challenges of attempting to recover data from a damaged SSD.
When choosing a storage media type, companies should understand how this decision can affect the ease of retrieving data when there is a data loss.
A lot of businesses are investing in solid-state drives (SSDs) to leverage their numerous benefits, but users beware. Although SSDs are more robust than traditional hard disk drives (HDDs), data loss can still occur - and when it does, the data is also more complex to recover.
Unlike HDDs, SSDs store data in memory chips which have no moving parts, eliminating hardware damage like head crashes or motor defects. Yet, data loss can occur with SSD storage devices because the flash chips are susceptible to physical damage and the way data is stored is complex. SSDs are also exposed to the usual traditional data loss events such as human error, computer viruses, natural disasters, and software/programme corruption.
Recovering data from the common sources of SSD failures requires expertise in overcoming technical challenges that are unique to SSD and flash technology, such as decoding complex SSD data structures, specialised controller chips and numerous other SSD specific issues. Data is stored on SSD dynamically, and this complexity makes data recovery highly specialised and time consuming. Also a single SSD memory structure can be as complex as an enterprise RAID (redundant array of independent disks) with eight, 16 or even 32 drives!
Only a handful of data recovery experts have data reconstruction programmes in place to identify, separate and reassemble SSD memory so that data can be extracted and high-quality results achieved. At Kroll Ontrack the recovery process involves the following actions:
Accessing and reading the data at chip level
Overcoming any encryption
Rebuilding data striping (much like RAID)
Overcoming any file system problems such as corruption or parts missing
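The "rebuilding data striping" step can be illustrated with a toy sketch: blocks dumped from individual flash chips are re-interleaved into the original byte stream, much as RAID-0 stripes are reassembled. This is a deliberate simplification with an assumed round-robin layout; real controller layouts are proprietary and far more complex.

```python
# Toy de-striping sketch: reassemble a round-robin (RAID-0 style) stripe
# from per-chip dumps. Real SSD controllers use far more complex,
# vendor-specific layouts, so treat this purely as an illustration.

def destripe(chips: list[bytes], block_size: int) -> bytes:
    """Interleave fixed-size blocks from each chip back into one stream."""
    out = bytearray()
    n_blocks = max(len(c) for c in chips) // block_size + 1
    for i in range(n_blocks):
        for chip in chips:
            out += chip[i * block_size:(i + 1) * block_size]
    return bytes(out)

# Data "ABCDEFGH" striped across two chips in 2-byte blocks:
chip0, chip1 = b"ABEF", b"CDGH"
print(destripe([chip0, chip1], 2))  # b'ABCDEFGH'
```

In practice the recovery lab must first work out the block size, chip order and any remapping before a reassembly like this can even begin.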
The time it takes for Kroll Ontrack to recover data from an SSD is difficult to determine, because the recovery time depends on factors including the extent of data loss and the effort required to decode the data from the particular SSD in the device - which is the biggest challenge in the recovery process. The way data is configured also varies greatly between manufacturers and models of SSD. Each model requires Kroll Ontrack to work out the configuration before data decoding can begin. In most cases this is done with no help from the manufacturers.
Performing secure data disk sanitisation techniques on SSDs is equally tricky since it's difficult to specify the exact location of where the data is stored to overwrite it. Therefore, the best way to permanently destroy the data is through physical media destruction. This typically involves shredding the media into small pieces so not a single chip escapes destruction. If the shredding process misses a chip, it's still possible to recover data from it, so care needs to be taken to destroy everything.
SSDs are durable, but it's difficult to assess their lifespan because it varies depending on the manufacturer. To get an idea of how long a solid-state drive will last in an application, the following calculations can be used to determine its lifespan:
It should be noted that these calculations are valid only for products that use either dynamic or static wear levelling. Use the solid-state memory component specifications for products that do not use wear levelling.
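As an illustration of the kind of calculation involved, here is one common back-of-envelope endurance estimate for a wear-levelled drive. All figures (capacity, P/E cycle rating, daily writes, write amplification) are invented for the example rather than taken from any particular datasheet.

```python
# Rough endurance estimate for a wear-levelled SSD. The inputs below are
# illustrative assumptions; use the figures from your drive's specification.

def ssd_lifespan_years(capacity_gb: float, pe_cycles: int,
                       gb_written_per_day: float,
                       write_amplification: float = 2.0) -> float:
    """Estimated years until the rated programme/erase cycles are used up."""
    total_writes_gb = capacity_gb * pe_cycles           # total endurance budget
    daily_wear_gb = gb_written_per_day * write_amplification
    return total_writes_gb / daily_wear_gb / 365

# A 256GB MLC drive rated for 3,000 P/E cycles, writing 40GB a day:
print(round(ssd_lifespan_years(256, 3000, 40), 1))  # roughly 26.3 years
```

The write amplification factor matters: a controller that rewrites two gigabytes of flash for every gigabyte the host sends halves the estimate.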
There are various things a user can do to attempt to maximise an SSD's lifespan. The best way to find the right methods is to look at reliable forums and at manufacturer recommendations.
SSD is a new technology and very few people have learned enough about it to expertly navigate its RAID-like and SSD-specific layers and successfully find data when there's a failure. Best practice is to contact a data recovery specialist before choosing to use it, for more information about the impact on data recovery for the specific environment and technologies you are investigating.
Robert Winter is responsible for all operations within the area of disaster recovery in the Kroll Ontrack labs, based at the UK headquarters in Epsom.
The findings from Forrester's latest research on Oracle
point to a worrying trend in the enterprise software landscape. Businesses are
not generally doing large, transformational IT projects built around
traditional enterprise resource planning (ERP).
The key suppliers are adapting their enterprise software portfolios
in a bid to drive more sales. But the CIOs Forrester spoke to are not convinced
it is a strategy that is working for Oracle.
In Forrester's Oracle's Dilemma:
Applications Unlimited report, many people are happy with the software they are
running and have no real plans to migrate onto Oracle's future enterprise platform.

Since Oracle is a strategic supplier to many, there is little interest among
CIOs for migrating away. There are concerns that Oracle may turn some of the
products they have deployed into cash cows, potentially with high, annual
maintenance fees and licensing costs.
Members of the IT director's group, the Corporate IT Forum,
are angered by the changes to Oracle licensing. Head of research Ollie Ross
told Computer Weekly that members were being pushed into taking certain technical
directions like OVM (Oracle VM), rather than VMware. The forum's executive
director, David Roberts, believes many CIOs are reacting negatively to Oracle's
exceptionally high-pressured sales techniques. This is reflected in the
supplier's poor software licence revenue when compared with its nearest rival,
SAP. If businesses are not upgrading at a rate that looks good on the company's
balance sheet, Oracle will need to take a different approach.
Newham Borough CIO Geoff Connell is concerned that Oracle (and other top-tier vendors) will increase licensing costs, because their customers are "locked into" their products due to historical investments. He argues that many software suppliers appear to be ignoring the financial climate and are attempting to make up for reduced sales volumes with higher unit costs.
Coercing customers to buy more software is not the right way
to go. But Oracle executives have not shown much willingness to go wholeheartedly
down the software as a service (SaaS) route, or even offer a roadmap for
integrating SaaS and on-premise enterprise IT. Nor has Oracle been willing to
adapt software licensing to make it more virtual machine friendly. The research
shows customers are unhappy and the time for Oracle to make some tough
decisions is long overdue.
Connell believes if Oracle and other leading suppliers continue to hike prices, users will abandon commercial enterprise software for open source alternatives.
A few weeks ago I interviewed Paul Michaels, CEO of business technology consultancy ImprovIT, about a methodology for modelling decision-making. In this guest blog post Robert Saxby, consulting director at ImprovIT, explains a bit more about how the methodology, called Virtual Modelling, works, and the business benefits.
When it comes to re-engineering IT environments to save money or achieve best practice, a trial-and-error approach can be both complicated and costly. Virtual Modelling is a new business tool that uses 'what if?' scenarios to simulate real-world outcomes and identify efficiencies, future strategies and best sourcing options without chopping, changing or disrupting ongoing operations.
CIOs today are caught between a rock and a hard place: Having to slash IT costs while retaining productivity and service quality - often due to government mandate. Of course cost cutting pressures are nothing new, and for many there is little blood left in the stone. The question now is: "How and where can we make further reductions without knee-capping the entire operation?" There are plenty of apocryphal tales about organisations axing staff and abandoning efficiency enabling technology projects only to discover their actions have mortally wounded deliverables and reputation. The result: a panicked and costly rehiring and/or re-purchasing exercise to redress the balance.
Finding the cost/quality balance
Wouldn't it be great if you could work out the exact cost and productivity balance without the cost and disruption of making changes on a trial-and-error basis? Virtual Modelling creates scenarios based on real, current and accurate data mined from your own ICT operation that can predict real-world outcomes without impacting current operations. But it can only do this with available KPI data, and if that data doesn't already exist it must be generated via benchmarking studies. For, as Lord Kelvin, the 19th-century physicist, once said: 'If you cannot measure it, you cannot improve it.'
Measure it first
Once created, this baseline data provides the tools to compare performance against other public service (and commercial) entities of a similar size and complexity in terms of things like value for money, quality of service, best practice and competitive pricing. Digging a bit deeper, you can also find out where your organisation stands in relation to best practice standards for staffing (quality and quantity), process complexity, outsourcers (scope & service levels) and IT governance.
All of this information is then used to create 'what if' scenarios, typically dealing with areas such as: Cost/Price, Volumes, Staffing, Quality & Service Levels, Service Scope, Complexity, Project Efficiency and Process Maturity. Providing the model has been mapped with well-researched data, the outcomes obtained offer accurate indicators that can be used to make decisions about outsourcing, staffing, process re-engineering, cloud migration or anything else.
In the video below, Paul Michaels, CEO of ImprovIT, discusses how our Virtual Modelling methodology helps decision-making within a project.
Building up the model
Virtual modelling is designed to pinpoint the impact of one or more project parameters upon all the others. For example: if I change 'Service Quality' (SLAs) and/or 'Service Scope', what effect will this have on 'Cost'? Or: if I reduce 'Complexity', what effect will this have on 'Processes'? It also shows the changing balance of the whole picture when one or more parameters are altered. For example: if I want to increase 'Volumes' or 'Service Quality', what changes do I need to make to all the other segments, and how will this impact the enterprise as a whole?
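As a very rough illustration of the idea (not ImprovIT's actual model), a what-if scenario can be expressed as a parameterised cost function. The coefficients below are invented; a real virtual model would be calibrated against benchmarked KPI data before any outcome could be trusted.

```python
# Deliberately simplified 'what if?' sketch: annual IT cost as a function of
# a few of the parameters discussed above, all indexed to 1.0 = current state.
# The functional form and coefficients are illustrative assumptions only.

def annual_cost(baseline: float, service_level: float,
                complexity: float, volume: float) -> float:
    """Cost rises with SLA tightness and complexity; volume shows some
    economy of scale (exponent < 1)."""
    return baseline * (0.5 + 0.3 * service_level) * complexity * volume ** 0.8

current = annual_cost(10_000_000, 1.0, 1.0, 1.0)

# Scenario: relax SLAs by 20% while volumes grow 50%.
scenario = annual_cost(10_000_000, 0.8, 1.0, 1.5)
print(f"Change in annual cost: {scenario / current - 1:+.1%}")
```

Even in this toy form the point survives: the model answers "what happens to cost if I move these two dials?" without touching the live operation.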
Virtual Modelling System
So, to find the Goldilocks balance between IT cost and service quality let's start by feeding staffing metrics into the simulation model, given the high impact of staffing on cost. But this isn't just about a straightforward set of numbers: it also has to allow for a range of 'soft' factors such as varying levels of knowledge, skill sets and the specialist expertise that can make an individual or team difficult to replace.
Next let's look at complexity - typically the highest contributor to an IT department's spend after staffing. This involves any and everything from security and data confidentiality to high availability requirements, legacy system integration and the number of nodes in the enterprise network. Rule of thumb: The greater the complexity the higher the cost. A virtual modelling analysis determines where simplifications can be made without jeopardising mission-criticality. Once established that these changes are advisable, modelling can also provide an accurate estimate of cost, timelines and impact on staffing and service levels.
Then there is the question of outsourcing. Will it save money? What services should be outsourced? And if we are to outsource, what kind of service - a traditional provider or a cloud-based service? And what business model: IaaS, SaaS or PaaS? Data fed into a simulation model can provide an accurate estimate of the likely ROI and TCO - with timescales - of each option.
Process maturity also impacts the cost/performance balance. There are industry standards which provide best-practice guidelines, such as ITIL (IT Infrastructure Library), 'Agile' and 'Lean' (a production practice that looks to reduce resource expenditure to the minimum required to deliver value to the customer). Comparisons with these guidelines can indicate where improvements can be made, and virtual modelling can determine what they will cost and whether they are worth the disruption to operations. It's also worth noting that achieving process maturity is rarely a quick win: it takes time and requires clear, unequivocal goals and plans led from the top.
Given the chequered history of public sector IT projects, and the challenges so many ICT departments face in deciding whether, when and how to migrate to the cloud and how to optimise resources on an ever-diminishing budget, using Virtual Modelling to run scenarios on all the available options provides new decision-making tools that help identify the best roadmap while avoiding wrong roads and dead-ends.
Over the last few weeks Computer Weekly has written about software licensing and how suppliers are demanding IT departments run costly software audits. At the same time, we have started looking at the complexities of licensing, such as in a virtualised environment.
In this guest blog post, Martin Thompson, a SAM consultant and founder of The ITAM Review and The ITSM Review, provides some top tips on what to do when you receive an audit letter:
Payment Protection Insurance (PPI) spam is in vogue.
You may have received one or two of these recently:
"You are entitled to £2,648 in compensation from mis-sold PPI on credit cards or loans."
PPI claims and other spam solicitations are the bane of our inboxes. The vast majority of us know to simply ignore them. Unfortunately the handful of those who do respond justifies the exercise to the spammers.
This mass-marketing technique is used in exactly the same fashion by trade bodies such as BSA and FAST to force their agenda and start software audit activity.
Supplier audits are a fact of life. Some software audit requests are serious and expensive, some are merely spoof marketing campaigns - so how can IT professionals distinguish between the two?
Whilst I'm not a legal expert, fifteen years in this industry has taught me that there are instances when you should respond to an audit request and instances when you should simply walk away.
When to Take Software Audit Requests Seriously
In my opinion there are two instances when you should take software audits seriously:
When you are approached by a software publisher directly with reference to a signed contract
When you are approached by an organisation with real proof of a breach of intellectual property law.
Contracts with software publishers contain 'audit clauses' - the right to come and audit you periodically at your own cost. Your company either signed and agreed to this, or will need to fight against it. Smart companies negotiate it out of the contract by demonstrating maturity in their internal processes.
Breaches of intellectual property supported by evidence are a legal dispute and should be treated as such - by passing the issue over to your legal team in the first instance.
When to Ignore Software Audit Requests
Requests for 'Self-Audit' or other direct mail fishing exercises can be ignored.
Trade bodies such as BSA and FAST commonly write letters to companies requesting them to 'Self-Audit' or declare a 'Software Amnesty'.
These organisations are masters at crafting well-written, legal-sounding letters, but have no legal authority whatsoever. Nor do they have the resources to follow up on every letter sent.
Just like any other complaint made to your business, it should only be taken seriously if there is firm evidence or the organisation issuing the dispute is supported by the appropriate government agency. For example, the Federation Against Software Theft (FAST) has no teeth whatsoever unless accompanied by HM Customs and Excise.
Confidence in Your Choices
IT departments with the appropriate Software Asset Management (SAM) processes in place have both the confidence and the supporting data to discriminate between bogus claims and genuine supplier audit requests.
Whilst much noise is made in the industry about senior management being sent to prison or the company name being dragged through the gutter, the real and compelling downside to a lack of software management is UNBUDGETED cost and DISRUPTION: surprise licence costs, and massive disruption whilst IT staff are diverted from key projects to attend to an audit or hunt down the appropriate data.
Unexpected software audits can be good for your health in the longer term if they allow the organisation to realise it is out of control.
SAM is so much more than compliance and counting licenses. Organisations with a solid SAM practice are more nimble, competitive and dynamic. No more stalling on that virtualisation project because we're unsure of the licensing costs, no more uncertainty about moving to the cloud because we don't know how that leaves us contractually. SAM provides the business intelligence to innovate and take action.
The changes Microsoft has made to client access licences (CALs) reflect a change in how people use the company's software. Today, people expect to have access to Microsoft Exchange Server from their Android or iOS device. This is not added value: email access from any device is essential to enable people to use their own devices at work. So why does Microsoft want to charge extra?
The problem Microsoft faces is that its traditional business model, where people would run out and buy a new Windows PC every time it released a new OS, is broken. Windows 8 is a massive departure from previous OSes, and it will take an awfully long time before people feel the need to upgrade. In the meantime, Microsoft is losing out, because Apple and Google devices are able to connect to its servers.
It still makes money: users who connect to Microsoft systems have to pay for a Microsoft client access licence. But the fact that a user may have more than one device does not give Microsoft the right to charge more. After all, most of the time they will only ever use one device at a time to access a Microsoft system. How often will someone want to access email simultaneously on a PC, a tablet and a smartphone? Come on Microsoft, we only have two eyes, two hands and one brain.
Forrester analyst Duncan Jones says per-device licensing for software is obsolete in the mobile and virtual world.
So rather than charging a premium for user-based CALs, Microsoft should make device CALs cheaper, since they are more restrictive.
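The argument can be illustrated with a toy calculation using invented prices: once each person connects from several devices, per-user CALs become the cheaper option even at a higher unit price, which is why per-device licensing looks obsolete in a bring-your-own-device world.

```python
# Hypothetical comparison of per-user vs per-device CAL costs.
# Both prices below are invented for illustration; real CAL pricing
# comes from your own Microsoft agreement.

def cal_cost(count: int, unit_price: float) -> float:
    """Total licence cost for a given number of CALs."""
    return count * unit_price

users, devices_per_user = 500, 3        # each person uses a PC, tablet and phone
USER_CAL, DEVICE_CAL = 70.0, 55.0       # assumed list prices per CAL

per_user = cal_cost(users, USER_CAL)                     # one CAL per person
per_device = cal_cost(users * devices_per_user, DEVICE_CAL)  # one per device

print(f"User CALs:   ${per_user:,.0f}")    # covers all of a user's devices
print(f"Device CALs: ${per_device:,.0f}")  # multiplies with every new gadget
```

With three devices per user, the device-CAL bill is more than double the user-CAL bill despite the lower unit price, which is the crux of Jones's point.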
A couple of weeks ago I received a telephone call at home claiming to be from the Windows Support team. The lady on the phone asked me if my PC was running slow (which it was!) and put me through to a tech lead.
"How did you get my number," I asked.
The tech support man said he worked for a company that had been approved by Microsoft to provide customer support. He then asked me to open the Windows Event Viewer. "Your PC has been infected," he said, when I told him what the event log was showing.
I guessed his next question would be to ask me to grant him remote access to the PC... The penny dropped: this was a phishing scam. Had I agreed, the caller would probably have been able to install rogue software on my PC.
Okay so he very nearly got me. Lesson learnt.
But it is worrying how easily we can be tricked. And with more of us using our own computers for work, there is a very real risk that hackers will target us at home, claiming to be tech support.
I have been a guest of Emirates today at its network control centre in Dubai, and I am currently on an Airbus A380 flying back to the UK.
There is Wi-Fi on this flight, connecting via satellite internet. Emirates charges just $5 for 20MB, which is not bad. I have been able to access Skype messaging and connect to the corporate VPN. The connection works, but it is not fast (Broadband Speed Checker failed to start) and it is intermittent. However, I have been able to connect to our Exchange server and write this blog.
The network control centre manages the fleet of aircraft and the crew. The engineers have remote access to the aircraft in flight, allowing them to access telemetry data, reinitialise systems, or schedule replacement parts to be available at the destination airport.
Data from the flight is shared with suppliers, and fed into maintenance systems that use predictive analytics and forecasting to ensure the aircraft is kept operational and passengers are not delayed.
The £399 Surface is Microsoft's first foray into PC hardware. The tablet runs Windows RT, which means it is unable to run x86 applications. This is a bit limiting if you would like to choose your own browser, since the only one available at the moment is Internet Explorer 10, which ships with Windows RT.
However, you can download applications from the Windows Store: Skype, eBay and the Kindle app are there, but sadly no BBC iPlayer. There are several tools to improve the tablet, but there does not appear to be much in the way of enterprise software in the Windows Store.
Still, the relatively limited functionality means the Surface tablet could be deployed where people simply need email, tasks, calendar and basic Word, Excel and PowerPoint functionality. It is probably suitable as a device in education and wherever a relatively locked-down environment is preferred.
I personally like Windows 8 Professional, which is only available on x86-based tablets and hybrid devices, as I prefer to run my own applications, rather than be restricted by the choice in the Windows Store.
Windows 8 is out, the Microsoft Store in New York is open and Surface has surfaced. Times are certainly changing for Microsoft as it takes aim at Apple with a consumer-friendly device and OS. Microsoft needs to win hearts and minds. But the winning formula that has given it a licence to print money with Windows licences is no longer compelling.
Certainly, it seems there is no compelling reason to switch over from Windows 7 to Windows 8. But Windows 8 devices and Microsoft's own Surface tablets will find their way into the enterprise, thanks to the touch UI.
Personally, I'm not convinced Windows 8 makes a good desktop OS, for a non-touchscreen PC, but Microsoft says it is 30% faster than Windows 7.
Oracle is well and truly pushing engineered (ie proprietary) systems.

Speaking yesterday in London, Oracle president Mark Hurd claimed that Oracle's vertically integrated stack, combining hardware, middleware, database and enterprise applications, has been pre-integrated so customers do not require expensive IT consultants to connect the system together.
This may be true of the bits within the Oracle stack, but most businesses connect systems across complex, heterogeneous environments.
Hurd also stated that Oracle will spend $5bn this year on R&D. Now that it is in the hardware race, how far will that go? After all, Intel is expected to spend over $18bn on R&D.
For many years, Microsoft has been building its credentials in the enterprise with Windows Server, providing an alternative to costly Unix systems. It has taken over 20 years since it divorced IBM to build this reputation.
On the desktop, it has had no competition until Apple finally got its act together with the iPad. Beyond professional graphics and multimedia workstations, Apple has so far not shown much enthusiasm for products and services for enterprise users. But its devices are being used within businesses, and some companies are even contemplating supporting Mac OS. It is against this backdrop that Microsoft is setting the stage for the next battle over desktop IT.
Windows 8 shows where the company is heading. It can be used as an upgrade to Windows 7 for traditional PC desktop computing. But desktop computing is not what it used to be. Recognising the threat and opportunity of IT consumerisation, Microsoft has made Windows more like Android and iOS, even though these lack the enterprise heritage the company has worked hard to earn.
This new operating system shows where Microsoft is heading with Windows: it will be increasingly consumer-focussed. Windows 8 does work with a mouse and keyboard, but it is certainly not the same user experience as Windows 7. The move from Windows XP to Windows 7 was a comparatively small step; the move to Windows 8 will be a giant leap.
An article on Forbes has quoted the latest research from Forrester, which predicts Apple will sell $7 billion worth of Macs and $10 billion of iPads in the enterprise in 2012. Forrester analyst David Johnson believes Macs can make good corporate citizens in a Windows-centric environment.
With IT planning migrations off Windows XP, the roll-out of new Microsoft server products and Office 2013, supporting Macs is probably the last thing IT admins need.
The biggest issue with Apple in the enterprise is how to engage with a company whose primary goal is to entice consumers with shiny gadgets. Apple's reseller channel certainly does not look like it is growing. Can we honestly expect the Genius Bar to provide a business with an enterprise-class SLA given the Apple Store is consumer focused?
Oracle has said "no comment" to the question I posed on when it would release a patch for a serious security hole in its Java runtime environment that is currently being exploited. At the time of writing, there was absolutely no information or advice on the company's security blog.
Internet users are at the mercy of Oracle, as reports have emerged of a zero-day vulnerability that is capable of infecting PCs that run Java within their web browsers.
The next patch scheduled for release by Oracle is 16 October.
Java, the write-once, run-anywhere runtime environment, is used on websites to add sophisticated interactivity. Running it in the browser requires a runtime plug-in, and it is this plug-in that has been exploited.
Symantec said: "In our tests, we have confirmed that the zero-day vulnerability works on the latest version of Java (JRE 1.7), but it does not work on the older version JRE 1.6. A proof of concept for the exploit has been published and the vulnerability."
The FireEye site warned: "It will be interesting to see when Oracle plans for a patch, until then most of the Java users are at the mercy of this exploit. Our investigation is not over yet; more details will be shared on a periodic basis."
F-Secure added: "There being no latest patch against this, the only solution is to totally disable Java. Since this is the most successful exploit kit + zero-day... qué horror. Please, for the love of your computer, disable Java on your browser."
During the summer Microsoft gave developers who attended its TechEd developer conference a Samsung 700T tablet preloaded with a pre-release version of Windows 8 to try out. The new operating system is key to Microsoft's strategy to bridge the gap between the corporate world and the consumer space, which is dominated by the likes of the iPad running iOS and the Samsung Note 10.1 running Android.
Windows is likely to remain king of the enterprise desktop and laptop PC market for the foreseeable future, but how well can it run on a tablet?
The Samsung 700T is what used to be called a "Slate PC". It was originally released in 2011, and is effectively a full-blown 11.6 inch touch-screen PC without a keyboard, which is currently selling on Amazon for £766.
In terms of spec, it is powered by a 1.6GHz Intel Core i5 2467M, with 4GB of RAM and a 64GB solid-state disk. The screen looks amazing. With a Bluetooth keyboard and wireless mouse, the Samsung 700T can easily replace a notebook PC - the elegant docking station, which measures 11 x 10 x 1.5cm and doubles as a stand, has an Ethernet connector and an HDMI port.
The device is well-suited to running the final shipping version of Windows 8, with its touch-screen user interface. As expected, thanks to Microsoft ActiveSync, connecting the Samsung 700T to an Exchange email server takes a matter of seconds, which should not burden the IT support desk. It requires a Windows Live account and connects seamlessly to Hotmail and Gmail.
There are not yet enough applications in the Microsoft Store: no YouTube or Dropbox, and no security apps, VPN apps or even BBC iPlayer. Hopefully this will change when Windows 8 ships.
Weighing just under a kilo, and with limited battery life of around four to five hours, it is certainly not a tablet that could be used on the road all day. But the Kindle app works well, and the large screen makes reading in landscape format particularly comfortable.
It will be interesting to see how the Samsung 700T works in a full enterprise environment, as and when VPN software, anti-virus software, ERP, BI and apps like Citrix Receiver are certified for Windows 8.
I met John Abel, chief technology architect for Oracle EMEA today, to talk about Oracle's so-called "red stack". Far from being an Oracle-only strategy, Abel said Oracle's main objective with the strategy was to tackle the complex integration issues that exist between different layers of a customer's IT strategy. He says people want agility - ie speed of implementation. Vendor lock-in is less of an issue.
So while Oracle may indeed have a product to fit across all tiers of an IT architecture, Abel sees customers buying specific components and using Oracle's preferred orchestration methodologies to potentially lower the cost of IT integration.
From the hour-long conversation - the first I have had with Oracle for several months - it seems that the company's strategy is to give businesses the same software, whether they wish to deploy in the cloud, or on-premise.
Abel believes that IT needs to change how applications and infrastructure are procured. People today are building private clouds, so they buy each layer of their IT architecture separately: the hardware and infrastructure is separate from the platform, which is separate from the applications and business logic. He feels that IT must consider business benefits throughout all layers of the IT stack - but the constituent components do not have to be exclusively Oracle's.
In this guest blog post, Neil Colquhoun, business sales director, Epson UK, writes about how CIOs and IT managers can improve meeting room etiquette.
You know the feeling. You've been in the same chair for the past two hours, looking at the same faces and being shown slide after slide of bullet points. I'm sure I'm not alone in the desire for shorter, more productive meetings...
In a recent survey we ran, nearly half of office workers admitted to using a tablet, laptop or smartphone for non-work-related purposes during meetings. Alongside this, 68% are distracted when others use tablets, smartphones or laptops during meetings, and 16% blame technology failure for wasted time in meetings. I can understand the power a smartphone can hold in the middle of a dull meeting.
One of the most impactful steps could be to suggest that management ban mobile technology in meetings for all but urgent phone interruptions or minute taking. This might sound controversial, but it's rare that an email will arrive that couldn't wait 30 minutes to be addressed - after all, only a minority of office workers are involved in life-and-death situations...
Second, make your meeting room technology fool-proof. Whilst projection and AV technology is hardly rocket science, I've seen many a poor soul try and fail to clone their display while others sit expectantly waiting for the meeting to begin. Nor do people necessarily realise the 3.5mm audio jack should go in the headphone socket of their laptop. That's why I think every IT manager should provide framed, laminated step-by-step guidelines to save the time wasted in meetings by staff trying and failing to get their presentations fired up.
Finally, don't give people an excuse for cancelling: Scheduling everyone's time can be a challenge in itself, and so I'm a big fan of Doodle.com, which allows participants to vote on their preferred time for a meeting. This saves employees from trying to work multiple calendars around each other. It also works wonders for arranging a stag night with a group of disorganised mates.
In this podcast, recorded at the Forrester CIO Summit in Paris, Adriana Karaboutis, global CIO at Dell, explains how her IT management team learns from summer interns. She says: "The Gen Ys have grown up with the technology. They provide insights that many of us have never thought about."
Among the sessions at Forrester's CIO Forum EMEA was one that looked at sourcing, and in particular the role of the big systems integrators (SIs).
According to Forrester, the major US, Indian and European SIs focus on helping clients lower the cost of IT. This may be fine if your job is to run IT services cheaply. But what happens when the SIs are asked to innovate?
All the experts looking at the role of IT discuss the need to build new businesses empowered by IT. Forrester calls this "digital disruption", and it involves a recipe of mobile development, social media and IT consumerisation.
If your supplier is focussed on lowering cost, will you get the best developers in these areas from a major SI? It is highly unlikely. As Forrester points out, most large SIs are publicly listed companies, and will save their very best people for their largest, most lucrative contracts. It is not a litmus test, but the speed with which a request for proposals is delivered may indicate how seriously the SI takes the contract.
So where does that leave everyone else?
For everyone else, the best third-party suppliers may, in fact, be small local specialists, who are able to deliver expertise in a narrow niche. But these niche players may be unknown and the due diligence process to assess their suitability and financial stability will be harder.