Microsoft drives open AI, ups Azure cloud layers & is hot for bots

On the eve of Microsoft’s third major developer event this year — Connect() 2016 in New York — the firm has taken another open stride in its Artificial Intelligence (AI) efforts. Microsoft has announced a new partnership with OpenAI, the nonprofit research organisation co-founded by Elon Musk, Sam Altman, Greg Brockman and Ilya Sutskever.

In line with this news, Microsoft also introduced a selection of Azure cloud releases designed to drive what it hopes will be advances in AI. But why did the firm do any of this?

We wanna democratize, with a zee

“We want to democratize AI,” asserts the usually fairly gutsy non-marketing official Microsoft blog channel.

Microsoft says its partnership with OpenAI will be focused on making ‘significant contributions’ to advancing the field of AI. Yeah… like what?

The firm points to advances in fields such as ‘reinforcement learning’ and says that this technique helps make tangible progress in the effort to build systems that have true AI. This is an area of AI in which software ‘agents’ (computer programs that act for a user, or indeed for another program, in what should be thought of as a relationship of agency) learn how to complete a task by being given a lot of room for trial and error… and are then rewarded when they make the right decision.
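To make that trial-and-error idea concrete, here is a minimal reinforcement-learning sketch in Python: a hypothetical five-state corridor in which an agent learns, purely by exploring and being rewarded at the goal, that stepping right is the right decision. The environment, parameters and names here are illustrative assumptions for this article — not anything from Microsoft or OpenAI.

```python
import random

# A toy environment for the example: a 5-state corridor, states 0..4,
# with a reward waiting at the far right end (state 4).
N_STATES = 5
ACTIONS = [-1, +1]    # step left, step right

def train(episodes=500, alpha=0.5, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    # Q-table: the agent's running estimate of future reward per (state, action)
    q = [[0.0, 0.0] for _ in range(N_STATES)]
    for _ in range(episodes):
        state = 0
        while state != N_STATES - 1:
            # trial and error: mostly exploit what is known, sometimes explore
            if rng.random() < epsilon:
                a = rng.randrange(2)
            else:
                a = 0 if q[state][0] > q[state][1] else 1
            nxt = min(max(state + ACTIONS[a], 0), N_STATES - 1)
            reward = 1.0 if nxt == N_STATES - 1 else 0.0  # rewarded only at the goal
            # the "rewarded when it makes the right decision" update rule
            q[state][a] += alpha * (reward + gamma * max(q[nxt]) - q[state][a])
            state = nxt
    return q

q = train()
# after training, the learned policy prefers "right" in every interior state
policy = ["right" if q[s][1] >= q[s][0] else "left" for s in range(N_STATES - 1)]
print(policy)
```

Nothing here is deep learning — it is tabular Q-learning, about the smallest honest demonstration of an agent improving through reward alone.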

Azure AI, aye aye

Microsoft points to AI work inside the Azure cloud platform, including Azure Batch, Azure Machine Learning and the Microsoft Cognitive Toolkit (formerly CNTK).

OpenAI has also chosen Microsoft Azure as its preferred cloud platform. OpenAI’s use of Azure will help the organisation advance its research and create new tools and technologies that are only possible with the cloud.

So what else is new in Microsoft AI Azure cloud land?

Azure bot, hot to trot?

The firm is now bringing forward a preview of Azure Bot Service: this service is supposed to allow developers to accelerate the development of bots using the Microsoft Bot Framework and to deploy and manage them in a serverless environment on Azure. Also note the following:

  • General Availability of Azure Functions: Azure Functions can be used to maximise the development agility and operational efficiency of nearly any app or service at lower cost — the Azure Bot Service itself runs on Azure Functions, which allows bots to scale on demand.
  • Azure N-series VMs: Generally available starting in December, the Azure N-series is designed for intensive compute workloads, including deep learning, simulations, rendering and the training of neural networks.

“Microsoft is infusing AI into all of its products and everything it does across hardware and software to accelerate the path from research into products. Today’s announcements are part of Microsoft’s continued commitment to democratize AI – making AI more accessible to all,” said the firm.

Can Microsoft beat IBM in the AI wars?

So can Microsoft beat IBM and the monolithic Watson brand in the AI wars?

Well, the answer kind of comes down to a) whether it can successfully reposition the Azure cloud platform, fabric, layers and tools as some kind of AI supercomputer and b) whether it can continue to shift its focus successfully enough away from the ‘old’ PC market (yes, apparently the new Surface is very good and of course it still has a massive interest in PCs) to reaffirm a focus on IoT, mobile and the cloudy backend that drives all the new devices.

One might argue that Microsoft could see better developer- and/or consumer-level adoption of its cognitive APIs for AI areas such as natural language understanding, speech recognition and text analytics — including sentiment analysis and key phrase extraction — and other associated areas of machine learning. Why? Simply because the firm has broader popular reach at the developer level, some would say… although that may be grossly unfair to IBM, which has no problem at all packing out a whole raft of developer-centric conventions on a year-on-year basis.
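For a feel of what a sentiment-analysis service conceptually does, here is a deliberately naive lexicon-based scorer in Python. This is purely an illustration of the idea — not Microsoft’s Cognitive Services implementation, and the word lists are invented for the example.

```python
# Invented toy opinion lexicons -- a real service learns these from data.
POSITIVE = {"good", "great", "excellent", "love", "happy"}
NEGATIVE = {"bad", "terrible", "awful", "hate", "sad"}

def sentiment(text: str) -> float:
    """Return a score in [0, 1]: 0 = negative, 0.5 = neutral, 1 = positive."""
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    if pos + neg == 0:
        return 0.5  # no opinion words found: call it neutral
    return pos / (pos + neg)

print(sentiment("I love this great product"))   # 1.0
print(sentiment("terrible, awful experience"))  # 0.0
```

A commercial API replaces the hand-built word lists with models trained on large corpora, but the input/output contract — text in, a bounded score out — is the same.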

Look even further ahead at areas like Project Catapult — the technology behind Microsoft’s hyperscale acceleration fabric. The firm says that this sits at the centre of a set of investments Microsoft is making to build a supercomputing substrate that can accelerate its networking, security, cloud services and Artificial Intelligence.

Have we learned enough yet about AI? No sir, not by a long way.

Harry Shum, Microsoft AI and Research Group executive vice president, and Sam Altman, co-chair of OpenAI. (Photo by Brian Smale)