May 2013 Archives

SAP hires autistic software testers and programmers

bridgwatera | No Comments
| More

SAP is working with software testers and programmers who sit on the autistic spectrum.

Often noted for a heightened sense of analytical awareness and perception, people on the autistic spectrum could have untapped potential in software application development and testing.

SAP is working alongside Specialisterne to potentially hire hundreds of autistic staff around its global office locations.

Specialisterne describes itself as a foundation that works to enable one million jobs for people with autism and similar challenges through social entrepreneurship, corporate sector engagement and a global change in mindset. The foundation works with stakeholders around the world to bring about a vision of a world where people are given equal opportunities in the labour market.

"We are very excited by this opportunity to enable SAP global access to a huge pool of untapped talent and therefore help strengthen SAP's position as a global leader in innovation. The partnership will position SAP as a thought-leader and motivate the ecosystem to follow its example," says Thorkil Sonne, founder of Specialisterne and chairman of the SPF Board.

Based on a successful pilot project in India where (again working with Specialisterne) SAP Labs hired six people with autism, SAP now plans to take the collaboration to a global level.

Pilots in North America, including Montreal and Vancouver in Canada and Palo Alto in California, as well as at its worldwide HQ in Walldorf, Germany, are planned for 2013. Pilots are currently running in SAP Ireland and SAP India.

"By concentrating on the abilities that every talent brings to the table we can redefine the way we manage diverse talents," said Luisa Delgado, Member of the Executive Board of SAP AG, Human Resources. "With Specialisterne we have truly had a meeting of the minds with our common belief that innovation comes from the 'edges'. Only by employing people who think differently and spark innovation will SAP be prepared to handle the challenges of the 21st century. "


You can visit the National Autistic Society here for more information.

Programmers to embed video conferencing deeper into real life

bridgwatera

At some point during the 2000s (or thereabouts) what we used to know as video conferencing became known as presence, telepresence, or even immersive telepresence if you will.

Telepresence is in fact much the same as video conferencing, but carried out with the support of high-end "codecs" that let users speak to and view each other over high-bandwidth connections supporting HD-resolution images.

One of the higher profile players in this market is Polycom -- and the company is launching its new RealPresence CloudAXIS suite on the 6th June 2013.

The firm claims to have created a new piece of video software here to enable businesses to collaborate "independent of application or device" being used.

Attendees are sent a URL via email, IM or calendar invitation and can "simply click" to join a video session without downloading or purchasing software. It is billed as the industry's first global, presence-aware directory that integrates contacts from Skype, GoogleTalk and Facebook.

The application developer angle

Polycom ran its first developer contest earlier this year and challenged programmers to submit solutions using Polycom RealPresence open application programming interfaces (APIs).

NOTE: X2O Media won the grand prize with an application that delivers live and on-demand video content to digital signage devices, Polycom VVX IP phones, and video kiosks.

The firm now runs a free portal and community for developers, where they can search for and share code and app examples, as well as ask questions of Polycom and independent developers.

The Polycom RealPresence CloudAXIS Suite is supplied as an "off the shelf" retail product, so some customers simply buy it and use it as is. But the suite also includes APIs, which can be tailored and customised to create bespoke products that integrate into the workflows and processes of individual organisations.

There are two APIs:

1. The scheduling API, which is included in the price of the product.
2. The Meeting Experience Application API, which controls the in-call video client experience and can be purchased for an extra fee.

The APIs can be used to create bespoke applications, or a (developer) customer can purchase professional services (developer expertise) from Polycom in one of two ways: hours of development time, in a 'pay as you go' format; or project scope, working towards a customer-defined specification.
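Polycom's actual endpoints and field names are not spelled out here, so the sketch below is purely hypothetical: it shows the general shape a call to a meeting-scheduling API might take, with every URL, field name and function in it invented for illustration.

```python
import json

def build_meeting_request(host, attendees, start_iso, duration_mins):
    """Assemble a request for a hypothetical video-meeting scheduling API.

    The URL and field names are illustrative assumptions only -- the real
    CloudAXIS scheduling API is defined by Polycom's own documentation.
    """
    return {
        "method": "POST",
        "url": "https://video-platform.example.com/api/v1/meetings",  # hypothetical
        "body": json.dumps({
            "host": host,
            "attendees": attendees,          # each would receive a join URL
            "start": start_iso,
            "durationMinutes": duration_mins,
        }),
    }

req = build_meeting_request("alice@example.com",
                            ["bob@example.com"],
                            "2013-06-06T10:00:00Z", 30)
print(req["method"])  # -> POST
```

A bespoke application would wrap calls like this inside its own workflow, for instance firing one off when a CRM record reaches a certain stage.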


The idea is that, with the open APIs, the suite can be adapted to suit the needs of customers in the various vertical sectors and industries that Polycom already services, including the examples below:

Recruitment consultants, for example, could integrate the suite with a CRM tool so that metadata on a candidate, including CV and LinkedIn profile, can be accessed simultaneously. Or contact centres could include a video call button on contact pages so that calls can be escalated from voice to video at the choice of the customer or the centre. In complaints management, for example, seeing a customer eye-to-eye could help reassure them that their complaint is being dealt with.

Where are all the real cloud toolkit technologies?

bridgwatera

Not enough companies are talking about the guts of cloud mechanics.

Areas we need more detail on include the implementation of automation controls, new security considerations for cloud environments, the real world deployment of governance layers -- and, crucially, application provisioning, scaling and integration.

Cloud-centric software development technology vendors talk about these fundamental aspects of new cloud installations and cloud migrations, but we rarely get down to the guts of the deal.
Given this shortfall, we need to sharpen our radar and look to companies producing "cloud toolkit" technologies.

Silicon Valley last night pumped out news of TIBCO's Cloud Bus, a subscription-based Integration Platform-as-a-Service (iPaaS) designed to give cloud DevOps staff a means of migrating applications and workloads to the cloud.

It's all about trying to combine the deployment flexibility of the cloud with enterprise-class software integration features that existed long before cloud arrived.

"[This is a] single subscription service that customers can run anywhere -- on-premise, in the cloud, in bare metal or virtualised environments," said Matt Quinn, CTO for TIBCO Software.

"TIBCO Cloud Bus provides ready-made integrations across popular SaaS and critical on-premise applications, while allowing subscribers the ability to identify, configure and extend integration templates for their own business context with ease."

This is real-time integration such that changes are reflected in all connected applications as they happen, without waiting for the next batch update.
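The "as it happens" behaviour described above is the classic publish/subscribe pattern. A minimal sketch (in no way TIBCO's actual implementation) of how one change fans out to every connected application immediately, rather than waiting for a batch run:

```python
class EventBus:
    """Toy publish/subscribe hub: each published change is pushed to every
    connected subscriber the moment it happens, instead of being queued up
    for a periodic batch update."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, event):
        for callback in self.subscribers:   # fan out immediately
            callback(event)

# Two "connected applications" that both want to see customer changes.
crm_updates, billing_updates = [], []
bus = EventBus()
bus.subscribe(crm_updates.append)
bus.subscribe(billing_updates.append)

bus.publish({"customer": 42, "field": "address", "value": "10 Main St"})
print(crm_updates == billing_updates)  # -> True: both saw the change at once
```

Real integration buses add durable queues, transformations and delivery guarantees on top, but the propagation model is the same.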

NOTE: VMware and TIBCO are working together to offer this technology as a service on VMware vCloud Hybrid Service.

"Once implemented, customers will be able to deploy TIBCO Cloud Bus and extend application integration from private to public clouds on the same infrastructure, with common management and orchestration," said Mercer Rowe, director of cloud services partner strategy, VMware.

TIBCO's Quinn asserts that this "brings a new level of maturity to the cloud integration market" - but then he would say that, wouldn't he? What it does do is push forward more hands-on cloud toolkit intelligence -- and this is what we need to hear more about.

The cloud developer's toolkit takes shape

bridgwatera

It's official. The software application development discipline now extends to a wider definition and the cloud software application developer has arrived.

Call him (or her) a 'cloud developer', call him a 'service-based computing programmer', call him a 'hosting coder' or call him a Software-as-a-Service application developer (SaaSAD) if you will -- but the cloud developer has arrived.

NOTE: By virtue of this statement we must also be recognising the existence of the cloud DevOps role, so the Cloud DevOps pro also comes of age. But that's another story.

So what's different about cloud programming? Well, developers will have to learn to code to and with a new set of APIs for sure, but that's not bad news.

It's important to realise that coding to Microsoft Azure environments (for example) is not the same as regular web development. Cloud doesn't always accommodate multiple languages inside any one instance, as might be seen elsewhere, so the landscape will look different for sure.

Columns, stored procedures, runtimes -- these all behave differently in the cloud so a re-learning process is called for.

Developers approaching cloud deployments should be wary of the industry's cloudy (sorry!) marketing fluff, which has instilled a perception that "unlimited resources are available for all" in the cloud. Unless a system has been architected to accommodate this kind of extensibility, the programmer may well find himself or herself shut out when they simply try to turn the volume up to 11.
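When the platform does refuse to go up to 11, the pragmatic coding response is to retry with exponential backoff rather than hammer the service. A generic sketch, not tied to any particular cloud provider's API:

```python
import itertools
import random

class ThrottledError(Exception):
    """Stands in for whatever error a platform raises when it refuses a
    request because a quota or resource limit has been hit."""

def call_with_backoff(operation, max_retries=5, base_delay=0.5):
    """Retry a possibly-throttled cloud call, doubling the wait each time
    (plus a little jitter so parallel clients don't retry in lockstep)."""
    delay = base_delay
    for attempt in range(max_retries):
        try:
            return operation()
        except ThrottledError:
            # real code would time.sleep(delay) here; skipped for brevity
            delay = delay * 2 + random.uniform(0, 0.1)
    raise RuntimeError("still throttled after %d attempts" % max_retries)

# Simulate a call that is refused twice before the platform lets it through.
attempts = itertools.count()
def flaky():
    if next(attempts) < 2:
        raise ThrottledError()
    return "ok"

print(call_with_backoff(flaky))  # -> ok
```

The point is the same as the paragraph above: code for the limits the architecture actually has, not the unlimited capacity the brochure implies.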

Vendors are now lining up to serve the new cloud programming space. NetSuite this week used its SuiteWorld conference to announce Built for NetSuite, a new programme and method to verify the strength of applications and integrations built using the NetSuite SuiteCloud Computing Platform.

Built for NetSuite involves partners documenting and verifying their practices for architecture, development, privacy and security.

After the Built for NetSuite team reviews the partner's SuiteApp submission for completeness and appropriateness, partners must provide a demonstration of the product and/or positive customer references.

The SuiteApp may then be awarded the Built for NetSuite badge.
OK so this is an example of controls being put in place across the industry to try and lock down quality in terms of cloud development.

If we trust the efficacy and general worth of NetSuite's control layers (and we have no particular reason not to) then this is good news.

How many developers use in-memory computing?

bridgwatera

Big enterprise data vendors can't say enough about in-memory computing and the databases serving these new development streams right now -- but how much usage goes on in the real world?

The number of developers using in-memory databases in their software application development work has risen worldwide from 18% to 26% over the last six months, an increase of more than 40%, according to Evans Data's new Global Development Survey.

Hardly gospel by which all data devotees should now live, but a potentially worthwhile estimation of the market as it stands.

The Evans technology adoption survey encompasses the views of more than 1,300 developers worldwide.

An additional 39% globally say they plan to incorporate in-memory databases into their development work within the next 12 months.
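For developers who want a first taste of the idea before those plans firm up, an in-memory database is easy to sample at toy scale: SQLite, for example, can hold an entire database in RAM, making the same speed-for-durability trade that enterprise in-memory platforms make at far larger scale.

```python
import sqlite3

# ":memory:" keeps the whole database in RAM -- nothing touches disk,
# so reads and writes avoid I/O at the cost of durability.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
db.executemany("INSERT INTO readings VALUES (?, ?)",
               [("a", 1.5), ("a", 2.5), ("b", 10.0)])

avg = db.execute(
    "SELECT AVG(value) FROM readings WHERE sensor = 'a'"
).fetchone()[0]
print(avg)  # -> 2.0
```

The enterprise products differ enormously in scale and features, but the programming model -- standard SQL over data that lives in memory -- is recognisably the same.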

Developers in North America and the Asia Pacific region show the strongest upturn in adoption...

... and the EMEA region is the slowest to adopt the technology - oh dear, but it's only a survey so don't actually worry.

"In all regions we see a strong correlation between planned or current use of in-memory databases and the perceived importance of big data in the organisation so that is obviously a strong driver" said Janel Garvin, CEO of Evans Data Corp. "But the other thing that's interesting is the equally strong correlation between in-memory database use and use or plans for development in the cloud."

So then, it appears that interest centred on in-memory computing and cloud is on the rise, but it's higher in some regions than others... and big data is also quite important.


Intel 'Silvermont' Inside microserver microarchitecture

bridgwatera

New software application development stream possibilities have come to light this week after Intel announced its Silvermont microarchitecture.

This new technology targets tablets, smartphones, microservers, network infrastructure products, storage and other areas such as entry-level laptops and in-vehicle infotainment.

NOTE: Microservers are smaller form factor, scaled-back servers that can be grouped into clusters. These units come into their own for tasks that require only a small amount of processing power (so not a multi-core behemoth) but need to be carried out in large numbers, often in individual, compartmentalised streams.

Silvermont sports five times lower power consumption than current-generation Intel Atom processor cores in terms of its energy efficiency rating.

The new chip microarchitecture also (unsurprisingly) represents a boost in performance, being engineered on Intel's 22nm (nanometre) Tri-Gate SoC process.

NOTE: A nanometre is a unit of spatial measurement equal to 10⁻⁹ metres, or one billionth of a metre, commonly used in nanotechnology, the building of extremely small machines.

Intel's Silvermont technology is aimed at low-power requirements in market segments from smartphones to the datacentre.

Web application developers may have their interest piqued here as Silvermont-powered microservers could be used to serve HTML web page components to millions of users.
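The workload meant here is small and repetitive rather than computationally heavy. As a rough, Intel-agnostic illustration, a node that does nothing but serve one HTML fragment fits in a few lines; a cluster of low-power machines each running something like this is the microserver proposition in miniature:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

class FragmentHandler(BaseHTTPRequestHandler):
    """Serves a single HTML fragment -- the kind of small, repetitive job
    suited to many low-power nodes rather than one multi-core behemoth."""

    def do_GET(self):
        body = b"<div class='widget'>hello</div>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # silence per-request console logging
        pass

server = HTTPServer(("127.0.0.1", 0), FragmentHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/" % server.server_port
print(urllib.request.urlopen(url).read().decode())
server.shutdown()
```

In production the interesting work is in the load balancer spreading requests across such nodes, but each individual node really can be this simple.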

"Silvermont is a leap forward and an entirely new technology foundation for the future that will address a broad range of products and market segments," said Dadi Perlmutter, Intel executive vice president and chief product officer.

"Early sampling of our 22nm SoCs, including "Bay Trail" and "Avoton" is already garnering positive feedback from our customers. Going forward, we will accelerate future generations of this low-power microarchitecture on a yearly cadence."


A secure global desktop through HTML5

bridgwatera

Oracle is combining the back-end database territory it is well known for with some front-end, application-centric advancements hinged around simplified remote access to enterprise apps through HTML5.

The firm has announced a new release of Secure Global Desktop.

This product sits as part of Oracle's Desktop Virtualization portfolio, and the new version bids to extend "secure access to cloud-hosted and on-premise enterprise applications" and desktops from Apple iPad and iPad mini tablets, without the need for a VPN client.

So basically this is Oracle supporting the HTML5 standard to allow users to access enterprise applications with just a web browser.

The theory is that this technology provides access to a range of server-based applications and desktops, including those running on Windows, Solaris, Linux, as well as legacy mainframe and midrange systems.

In the Oracle Secure Global Desktop architecture, applications and desktops are deployed in the cloud (centrally managed servers) and can be securely accessed simply using a web browser on the client device.

This model attempts to shift the complexity of IT management away from individual desktop computers and into the datacentre, where it is (in theory) more-easily controlled and monitored.

"Enterprise users expect increasingly more mobile access to applications which are often designed to run on desktop PCs. Oracle Secure Global Desktop provides IT with a highly secure remote access solution for such applications, and even full desktop environments, from tablets," said Wim Coekaerts, Oracle senior vice president for Linux and virtualisation engineering.


Image: Oracle Secure Global Desktop is a solution for accessing hosted workspaces (diverse application and desktop environments) resident in the cloud from a single web browser.

How to avoid being a cash cow for cybercriminals

bridgwatera | 1 Comment

In this contributed piece for the Computer Weekly Developer Network, principal consultant Paco Hope at software risk management company Cigital explains his security-centric approach to software application development.

Security from the start

For many years I have been telling organisations of all sectors, sizes and ages about the importance of building security into software early. The simple reason is that vulnerabilities, like any software defect, are significantly more expensive to fix the further down the software lifecycle you find them.

This alone hasn't been motivation enough to get everybody building security in. In this piece, I will add a new piece of evidence to the argument, one that goes to the heart of many businesses' reluctance to change what they do or how they do it.

Why me?

So maybe you don't believe someone would bother with your software. Many organisations simply don't believe that cyber criminals have any reason to exploit their systems - and perhaps there was some truth to that for some firms in the past.

Today, however, cyber criminals do not care who you are or what your company stands for. If you have vulnerabilities in your software, they have a real financial incentive to find them and build exploits for them.

In recent years a very real and very large market has developed, where organisations (criminal, political or military) can buy and sell the knowledge of vulnerabilities and their corresponding exploits.

The shocking truth about hackers

A hacker may not care at all about your company, what it sells, or what happens to your company as a result of the vulnerability they find. They simply know that if they package that vulnerability with working exploit code, they can get paid real money for it. Although the money is in proportion to the ubiquity of your software (so exploits in software with smaller user bases may fetch a lower price), it's still money.

This is a phenomenon that is already happening. A prime example of this is the AT&T breach, whereby a security researcher was able to exploit a flaw in security and reveal the email addresses and details of 114,000 iPad users, including the White House Chief of Staff, Rahm Emanuel, as well as chief executives and military officials.

In this case, the perpetrator was jailed, but the point is that he was not out to make money; this was a politically motivated incident.

Had the hacker been financially motivated, he could easily have remained anonymous and sold the data to identity thieves. A few email addresses aren't worth much, but knowing that they belong to 114,000 iPad users on AT&T makes them slightly more valuable.

For years we've talked about creating good, solid software with the main goal of saving cost and time, but now, with the threat landscape being what it is, the incentive is to create good, secure software because there is a vigorous market and groups of people out to exploit deficiencies in your software and you will suffer the consequences as a side-effect.

Run fast without tripping

The other argument against doing it "later" in the lifecycle is that sometimes there is no "later."

Companies in fast-moving industries are growing from start up to multi-billion dollar enterprise within a matter of years. Going back and patching old software defects is simply not an option. By the time you know where your defects were, the defective version is on its way out.

Some of these companies that have experienced such rapid exponential growth are releasing new software so fast that instead of patching bugs in the software, they just completely replace it within six months. Building security in at the start allows them to retain the security lessons and propagate them into new versions.

This startup psyche is a relatively new phenomenon, which has come from the birth of massive companies such as Facebook, Instagram, Pinterest and the like. There are definitely companies that, when it comes to patching software defects, think "we're moving so fast that going back to fix this is not an option."

Core design and architecture

However, if they get the key security principles right in the first place, they can run fast without tripping over. The design and architecture that you create early on will remain at the core of your business, and will be the foundation for your future.

So whether it's to keep running fast, to avoid being someone else's cash cow, or to keep traditional costs down, there are more reasons than ever to do security from the beginning, not just at the end.

What should a data evangelist's tattoo slogan say?

bridgwatera

Big data on its own is just big data, but big data with analytics equals insight -- and insight equals business value.

The industry is reverberating with this mantra and you can reasonably expect a few of the more committed data-centric evangelists to have the above slogan tattooed onto their forearms in the coming months and years.

Better still perhaps:

"Big data with real time analytics equals ACTIONABLE insight."

TIBCO Software (or, as the purists among us know the firm, The Information Bus COmpany) is pushing for more recognition in this space through its partnership this month with Teradata -- TIBCO Spotfire event analytics is now supplied as a component of the Teradata Unified Data Architecture (UDA).

So what we have here is big data capture combined with real-time event processing and analytics.

This means:

• simultaneously analysing historical data (data at rest) ...
• and real-time events (data in motion).

The Unified Data Architecture element here applies what is known as "closed-loop processing" to big data, collecting and analysing real-time event streams.

The theory is that the insights extracted from the architecture enable business leaders to anticipate future events and take immediate automated action.
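As a toy illustration of that closed loop (and nothing like the real Teradata/TIBCO machinery), the sketch below computes a baseline from data at rest, compares each live event against it, and fires an automated action when an event deviates:

```python
from collections import deque

class ClosedLoopMonitor:
    """Toy closed loop: a historical baseline ("data at rest") is compared
    against each live event ("data in motion"), and an automated action
    fires when an event deviates far enough from the baseline."""

    def __init__(self, history):
        self.baseline = sum(history) / len(history)   # data at rest
        self.alerts = deque()

    def on_event(self, value):                        # data in motion
        if value > 2 * self.baseline:
            self.alerts.append(value)                 # automated action

monitor = ClosedLoopMonitor(history=[10, 12, 11, 9])  # baseline = 10.5
for reading in [10, 13, 30, 11]:
    monitor.on_event(reading)

print(list(monitor.alerts))  # -> [30]
```

Swap the mean for a proper model and the list for an event-processing engine and you have the shape of the real thing: insight from the historical side driving action on the streaming side.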

TIBCO's marketing glitterati like to call this the "two-second advantage" -- and they even have an ® for that term.

The value of the Teradata UDA to customers is the ability to leverage the tight integration of Teradata data warehousing, Teradata Aster discovery platforms and Apache Hadoop.

The two companies explain that this workload-specific architecture enables data centric developers to give business users a chance to gain competitive advantage through better insights from any data source.

Of interest for developers here is the intelligent filtering of "enormous streams" of event data before it is captured in Hadoop or other systems, such as Teradata's own products.
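The filtering idea itself is simple to sketch: drop uninteresting events lazily, before anything lands in storage. A generic Python illustration, unrelated to the actual Spotfire implementation:

```python
def prefilter(events, keep):
    """Lazily yield only the events worth capturing -- the point of
    pre-storage filtering is that the bulk of an enormous stream is
    discarded before it ever reaches Hadoop or a warehouse."""
    for event in events:
        if keep(event):
            yield event

# A stream of 1,000 events, of which only 1 in 100 is a "trade".
raw = ({"type": "heartbeat"} if i % 100 else {"type": "trade", "id": i}
       for i in range(1, 1001))

interesting = list(prefilter(raw, lambda e: e["type"] == "trade"))
print(len(interesting))  # -> 10
```

Because both the source and the filter are generators, nothing is buffered: 99% of the stream is dropped in flight, and only the remaining 1% would be written to storage.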

So our real tattoo tagline here then is... "Big data with real time analytics equals ACTIONABLE insight for a two-second advantage."

Ouch! That's going to hurt going on.

"Teradata and TIBCO work with hundreds of the same customers, many of which have expressed the need for real-time stream processing within the Teradata UDA," said Peter Lee, senior vice president, TIBCO. "Our extended partnership with Teradata is fulfilling CIOs' long-time vision of a real-time analysis solution for big data. Not only can enterprises realise value with data discovery and analytics but now they are able to anticipate what will happen next and can act in real-time as new events stream in - truly delivering a two-second advantage."


IBM Impact 2013: a pictorial view

bridgwatera

IBM held its Impact 2013 conference, exhibition and unconference in Las Vegas this week and you can read news of the firm's latest IBM MessageSight M2M technology for connected cars and other implementations here at your leisure.

To wrap the week up, there have been enough words for now - so let's go with a pictorial view.


O'Reilly Media's Tim O'Reilly and IBM Fellow Grady Booch


Lenovo with Intel Core i5 Inside but no IBM badge outside


Lights, camera, RedMonk action: James Governor at the "unconference" sessions


Ah OK we get it, mobile is FIRST (not second) then


"Hey Vinnie, that's two prime ribs medium rare and an update on IBM's focus towards Message Queuing Telemetry Transport (MQTT) technology - you got that?"


IBM speaker in full flow, almost 9000 attendees listened this week


The show's over guys, but there's more in 2014


About this Archive

This page is an archive of entries from May 2013 listed from newest to oldest.

April 2013 is the previous archive.

June 2013 is the next archive.

Find recent content on the main index or look in the archives to find all content.