Another video interview from yesterday's launch event, this time with David Hobbs-Mallyon, SQL Server Product Manager at Microsoft.
February 2008 Archives
Microsoft’s response to the EU’s £680 million fine:
“We feel that the fine concerns events which are in the past and we’re very much looking to move forward on announcements like opening up code on APIs.”
I'm sitting in a presentation with some Microsoft guy talking about how important it is to get the right data to the right people at the right time with the right processes, but in the back of my mind I'm thinking: the EU has just fined you £680 million!
While getting the right data to the right person is important, it's probably better not to break the law.
Microsoft will be launching several versions of Windows Server 2008. These include:
Datacenter Edition
Windows Web Server 2008
an Itanium edition for the Intel/HP 64-bit architecture
It will also launch Small Business Server 2008 as a beta in Q2 of 2008 and a full release in Q3.
Reportedly, this will let small businesses build websites that can capture customer data right out of the box – including things like registering domain names.
Microsoft execs say that businesses just want something out of the box – something that they don’t have to think about or worry about. While this might be true for some small businesses, I don’t think it’s true for the majority, who are already turning to alternatives like MySQL and Linux.
Reporting live from the launch today.
Confirmed customers who are using the new products include:
Easyjet – using all three
John Lewis Partnership – Windows Server 2008
McLaren Racing – SQL Server 2008
If you want me to pose a question in near real time to execs about the new products, post a comment and I’ll make sure it is heard.
BT Openreach has been criticised by attendees at the CMA conference here today for not guaranteeing SLAs to end users.
The funny thing is that end users aren’t Openreach customers – it’s the ISPs that resell broadband services to end-user businesses that are BT’s customers, and they [the ISPs] can’t guarantee service quality on a network they don’t own.
If you’ve been affected by BT Openreach problems, let me know or post a comment.
Intellect has announced it will conduct a study with the University of Warwick to help business users go green.
John Higgins, Director General of Intellect, said there are four main areas where businesses need education when buying ICT equipment:
1. How much does the UK IT industry contribute to carbon emissions?
2. What practical steps can IT managers take to become green?
3. What green criteria should an IT manager specify in an RFP when buying equipment?
4. How do you break the psychology of the upgrade cycle?
The last one is the most interesting for the ICT industry to consider. Maybe part of going green isn’t specifying energy efficiency in products but making sure the ICT equipment you buy lasts longer with the applications you want to use five years down the line.
Matt Yardley at Analysys used the following quote on net neutrality from AT&T CEO Edward Whitacre:
“How do you think they [Internet Content Providers] going to get to customers? Through a broadband pipe. Cable companies have them. We have them. Now what they would like to do is use my pipes free, but I ain't going to let them do that because we have spent this capital and we have to have a return on it. So there's going to have to be some mechanism for these people who use these pipes to pay for the portion they're using. Why should they be allowed to use my pipes?”
Maintaining quality networks while delivering innovative new services will be the next challenge, but should content providers who generate traffic foot the bill?
The annual Communications Management Association conference takes place today.
The CMA has been supporting telecoms and ICT professionals and organisations within private and public sectors for almost 50 years.
Stephen Timms MP, Minister of State for Competitiveness, was due to give a presentation, but had to pull out because of the cabinet reshuffle.
Sources close to the CMA told me that they had extended the invitation to Shriti Vadera, the broadband minister, but that she turned them down.
Now it could have been that the baroness was busy, but given that the government launched a major review of broadband last Friday, the CMA conference would have been an opportune place to meet the people she represents.
She said last week:
"We must be ready to respond to future technological developments, which will place unprecedented challenges for our communications networks over the coming decade.”
If there are really “unprecedented challenges for our communications networks” ahead, then why isn’t she attending a conference for the network managers who might be best placed to advise her?
It takes a big man to admit when he is wrong, but luckily, I am not a big man.
Case in point: the post I made last week about the government dragging its heels over improving broadband speed and access for businesses in rural areas.
Then last Friday, The Department for Business, Enterprise & Regulatory Reform said it would launch a review to see how government could work with broadband providers to roll out faster networks.
A cause for me doing a complete 180, perhaps?
Well, no, frankly.
The Pipe Dreams report, published by the Broadband Stakeholder Group in 2007 on how to make fast broadband a reality, said:
“For next generation broadband to move from pipe dream to reality in the UK, steps need to be taken now. The issues are complex and there are few clear or obvious solutions at this stage. However, there is a limited window of opportunity between now and April 2009 to get this right.”
This means that the government has until April 2009 to get its act together on broadband.
Given that we’ve already lost Stephen Timms, who was in the process of talking with ISPs, but published no recommendations publicly, and given that we have a new minister who is in the process of learning her brief, I’ll continue not to hold my breath.
So the average UK broadband speed is just under 3Mbps, eh?
It’s not surprising given that the majority of providers still use ADSL connections, which, given the web applications people are using today, has a limited lifespan.
As a consolation, some providers have begun using ADSL2+ (Asymmetric Digital Subscriber Line 2+) technology, which delivers more bandwidth, at least on lines close to the exchange.
But both of these technologies pale in comparison to fibre optic connections, which need to be rolled out nationwide for us to compete in a global economy, especially in the case of SMBs.
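To put the gap in perspective, here is a rough back-of-the-envelope comparison of download times at these speeds (the ADSL2+ and fibre figures are illustrative assumptions, not measured averages):

```python
# Rough download-time comparison for a 700MB file (roughly a CD image).
# Speeds other than the ~3Mbps UK average are illustrative assumptions.
FILE_SIZE_MB = 700

def download_minutes(speed_mbps: float, size_mb: float = FILE_SIZE_MB) -> float:
    """Minutes to transfer size_mb megabytes over a speed_mbps link."""
    size_megabits = size_mb * 8          # 8 bits per byte
    return size_megabits / speed_mbps / 60

for label, mbps in [("ADSL (~3Mbps)", 3),
                    ("ADSL2+ (assumed 24Mbps)", 24),
                    ("Fibre (assumed 100Mbps)", 100)]:
    print(f"{label}: {download_minutes(mbps):.1f} minutes")
```

At 3Mbps that single file ties up the line for over half an hour; on a fibre link it takes about a minute.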
Small and medium-sized UK businesses aren’t competing with the business down the road any more; they are competing with businesses worldwide.
Shriti Vadera, minister for business and competitiveness (and broadband), said in a recent Business Zone article: “Enterprise, especially small businesses, start ups and growing companies are the heart of the UK's economy.”
If that’s the case, why is she doing nothing to help them get decent broadband?
Get in touch if you still have problems with your broadband where you live.
Competition in broadband markets may be increasing, which is good news, as it means potentially lower broadband prices. But is the quality of broadband connections getting any better?
The European Commission’s approval of telecom regulator Ofcom’s proposal to de-regulate markets illustrates how fast Ofcom can act on information.
Ofcom identified markets with strong competition and has relaxed regulations that require broadband operators who have built their own networks to open them up to other providers.
I think this is fair, since making an investment in network infrastructure requires that these operators can make a return which they can then feed back into building more networks with better quality connections.
But if Ofcom can act with such swiftness on deregulation, why can’t it work equally fast on issuing regulation to improve access where it is poor? Especially since it was revealed today that the average broadband speed in the UK is only 3Mbps.
You have to wonder…
Four separate, confirmed failures of undersea cables serving the Middle East and North Africa over the last week have inspired conspiracy theories about the cause.
Whether or not any of them have validity, the fact is that undersea cable networks - despite multiple layers of built-in reliability - are highly vulnerable to deliberate attacks.
As the economic importance of undersea systems grows, risks from sabotage must be considered.
Matt Walker, senior analyst at Ovum, comments:
“There is an unspoken assumption that the networks are safe from deliberate human sabotage. The recent spate of cable failures in a politically volatile region has called this assumption into question."
“If ports, railways, gas pipelines, and other types of networks are being secured against possible sabotage, we must similarly increase the security of undersea optical highways. Guaranteeing reliability is impossible, but an improvement on the current hands-off approach is long overdue."
“The economic cost of losing, or even just slowing down, international communications is extremely high. This risk has to be factored into the calculations behind the investment level and design of undersea optical networks.”
There are three spatial dimensions: length, width and height. In physics, the fourth dimension is often identified with time.
But Markus Nordberg, resources co-ordinator on the ATLAS project at CERN, said their research could prove there are a total of either 11 or 26 dimensions. The long and short of this is that our universe is one among many, and if that’s true, you can throw away traditional thoughts on probability, because everything that can happen does happen, somewhere.
“When you toss a coin, you assume there is a fifty-fifty chance of getting heads or tails. But the act of throwing a coin creates branching probabilities. Both outcomes occur but in different universes,” said Nordberg.
There’s an old joke about a physicist who goes to Vegas and puts all his money on black, only for the roulette wheel to come up red. The physicist smiles, and another player asks how he can be happy.
“Don’t worry, I lost here, but in another world I won.”
Today I’ll be viewing first-hand how the Large Hadron Collider (LHC) - a particle accelerator at CERN - will be used in an experiment that will generate petabytes (a petabyte is 1,000 terabytes) of information.
Dealing with petabytes of information today is not common practice, but will be ten years down the line, according to experts.
Managing unstructured data over the network will play a big part in this, and if there’s one place to start, it’s CERN.
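To get a feel for the scale, a quick sanity check on what a petabyte means in everyday 2008 terms (the DVD and hard-drive capacities are typical figures I'm assuming, not anything CERN quoted):

```python
# How big is a petabyte? Decimal units: 1 PB = 1,000 TB = 1,000,000 GB.
# The comparison capacities below are assumed, typical-for-2008 figures.
PETABYTE_GB = 1_000_000

DVD_GB = 4.7        # single-layer DVD
HDD_GB = 500        # typical desktop hard drive of the day

dvds = PETABYTE_GB / DVD_GB
drives = PETABYTE_GB / HDD_GB
print(f"1 PB is roughly {dvds:,.0f} DVDs, or {drives:,.0f} 500GB drives")
```

That is a stack of hundreds of thousands of DVDs per petabyte, which is why nobody plans to manage this data by hand.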
I’m just hoping none of my particles get accelerated.
From a CERN dinner conversation I just had:
“Microsoft has traditionally been the window through which we have viewed the desktop. And Google is now the window through which we view the web. Microsoft has had its cage rattled.”
Google has, arguably, better search engine technology than Microsoft.
But for any network to function properly, information must be structured and organised properly, so that communities of users and search engines can find exactly what they are looking for.
The value of a network grows with the number of its users – Metcalfe’s law puts it at roughly the square of that number. Users will be more inclined to use one method of search over another if it offers them the ability to classify information in a meaningful way.
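Metcalfe's law – the idea that a network's value scales with the square of its user count, because that is how the number of possible connections grows – can be illustrated with a toy calculation (the user counts are arbitrary):

```python
# Toy illustration of Metcalfe's law: a network's potential value tracks the
# number of possible user-to-user connections, which grows as ~n^2, not n.
def pairwise_links(n: int) -> int:
    """Distinct user-to-user connections possible among n users."""
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(f"{n} users -> {pairwise_links(n)} possible connections")
```

Ten times the users gives roughly a hundred times the connections, which is why scale matters so much to search and social sites alike.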
This is the next big challenge for the internet: organising unstructured data.
Information is only going to grow, and we, as users of and contributors to the world wide web, are going to have to find a way to index it if it is to remain meaningful.
If Microsoft was smart, it would start building features into IE to allow users to tag information that could only be read in IE and tie that to a web search service like Yahoo. An IE user using a Microsoft search would then have access to search features competing sites wouldn’t.
But then that would be evil, wouldn’t it?
I am reporting on hallowed ground this week.
CERN is where the World Wide Web began. In this current climate of billion dollar internet takeovers, it pays to examine where it all started and what can be learned.
Berners-Lee and Cailliau began their hypertext project as a way of sharing information faster between researchers. Period.
No grandiose business case, no consultants; just a motivating force to get access to the files they needed across the network.
A couple of years later CERN announced that the World Wide Web would be free for anyone to use. Hypertext doesn’t ask you to pay five cents every time you connect through to a link. It could, just like a telephone call. The fact that it doesn’t shouldn’t be taken for granted.
Compare the design of the web to the design of MySpace and the rise of blogging software.
Most people emailed and instant messaged once the web was up and running.
But if they didn’t have the HTML skills to build a website (or didn’t know a fourteen-year-old computer science student who would work for peanuts), then the extent to which they could fully engage with the web – sharing pictures and video, posting their own news and views – would always be limited.
When sites like MySpace and software like WordPress came along, they really didn’t innovate. They just made it easier for the majority to use what was already there.
In all the reports I’ve seen about Microsoft buying Yahoo, no one has really picked up on the fact that combining one bad search engine with another won’t be worth squat to web users - and ultimately advertisers - if it doesn’t make things easier for people to network within the world wide web.
If you can’t make friends by networking, then buy them.
Eyeballs and attention spans make money in the new economy and Microsoft has never been an innovator.
It bought Hotmail, it didn’t invent it. The first edition of Bill Gates’ The Road Ahead famously neglected to mention the world wide web.
Public computer networking – independent of whatever operating system or browser you were running - as far as Microsoft was concerned, was never on the agenda.
The world wide web was always going to be the Microsoft wide web as far as it was concerned; a world in which you had to use Microsoft software to access certain features – just as Internet Explorer does with Hotmail now.
What Microsoft doesn’t realise is that what made sites like MySpace and Facebook worthy of top dollar was their adherence to open standards. Both sites work if you use a PC or a Mac or Explorer or Firefox.
It’s an all embracing network and not a closed one that wins in the end.