Intel XDK HTML5 cross-platform development tool

bridgwatera

Memory maker turned chip maker (and now maker of a great deal of software too) Intel has released an updated Intel XDK -- an all-in-one HTML5 development environment.


Not an SDK, an XDK

Intel uses the X to denote cross-platform -- the DK still means Development Kit.

The latest XDK is aimed at accelerating the deployment of new mobile games.

This software application developer tool can be used by game developers to write a single app and automatically build versions for Android, iOS and Windows app stores.


"Intel XDK gives gamers the ability to enjoy the same experiences across their PCs, phones, and tablets, and lets developers focus on creating new experiences instead of porting games to different platforms," said the firm, in a press statement.

Nice Asset, Management

This version adds support for a range of popular game engines and code libraries, and includes an Asset Manager to access, manipulate and manage game assets.

There is support here for the W3C Gamepad API, and the tool also supports Google Play Game Services and Apple Game Center.

Intel XDK is available as a free download for Windows 7 & 8, Apple OS X and Ubuntu Linux.


"Intel XDK HTML5 Cross-platform Development Tool provides a simplified workflow to enable developers to easily design, debug, build, and deploy HTML5 web and hybrid apps across multiple app stores, and form factor devices," said the firm.

What to expect from QCon 2015

bridgwatera

QCon is coming to London again on March 4 2015 -- this is a software development conference with guts, not glitz.


The event is described as a "practitioner-driven conference" designed for technical team leads, architects, engineering directors and project managers who influence innovation in their teams.

"This year's QCon continues our passion and commitment to providing a conference where software development innovators can learn more about innovator-stage topics from the best teams in the world, and from each other," said Floyd Marinescu, QCon conference Chair.

Highlights include a session entitled 'Java - Not Dead Yet'.

The blurb says that Java is evolving to meet developer and business needs, from lambdas in Java 8 to built-in support for money types rumoured for Java 9.

Mobile is no longer the Next Big Thing, say the organisers, but a requirement for your business -- attendees can hear from those who have implemented successful mobile systems.

Emmanuel Marchal, managing director for EMEA at Basho Technologies, states that although QCon is primarily a 'party' for developers, architects, engineers and other techie types to trade thoughts on innovation and how they're achieving it, there is no rule in the charter that states actual users of the tech are banned from proceedings.

"In fact, a great deal of value can be found for those looking to spot the next innovative piece of software that may revolutionise their business. The speaker line-up, for example, sees an array of top end-user spokespeople, from CIOs to researchers, take to the stage with developers to discuss their use of tech," said Marchal.

He points to his session with bet365 and says that this should appeal not only to the developers but also the users, addressing the technical intricacies of next generation databases and the business value to be gained from a move to NoSQL.

What better for IT managers looking to tackle a persistent pain point than hearing how their peers addressed a similar issue?

Data Gravity

"Furthermore there are presentations from software providers that aim to enlighten both users and developers, with speakers such as Dave McCrory tackling issues such as data gravity and how very large volumes of data can negatively impact network performance. Data gravity is an issue that can seriously impact a business, and such presentations can be vital learning tools for users seeking to prepare themselves for the next onslaught of IT issues," said Marchal.

Other sessions include a look at how theories from neuroscience and psychology can help us better understand IT professionals and discover what really motivates them.

Speakers will also look at how creating reactive systems is about more than simply learning a framework -- thinking in a reactive way helps you to design responsive architectures.

"QCon is a good event that acts as a pragmatic barometer, showing the evolution of software development. It does this not just in terms of technology innovation, but also by demonstrating how developers are becoming more 'connected' into the business as a whole through the growth of DevOps and Continuous Delivery," said Tulin Green, marketing manager, Perforce Software.

Perforce will present on the topic of 'High Performance Continuous Delivery - Versioning and Release Management Aligned'; the session looks at the key requirements for optimising the pipeline from the developers' desktop to the customer.



Ericsson: The (optimised) network is the (app experience) computer

bridgwatera

The jury is out on application experience optimisation, or AEO if you prefer.

Web pages now automatically deliver content to mobile devices in what they think is a 'mobile optimised' format, but not all sites offer the option to revert to a 'Desktop Version' as Wikipedia does.


Given the size of an iPad or BlackBerry Passport screen, we don't always need it optimised thank you very much.

But application experience optimisation goes beyond screen size adjustments and websites -- it's a network level issue at the core.

Operators need to perfect network performance if they want customer loyalty -- and they know it.

But things are changing; Ericsson says that conventional network-related key performance indicators (KPIs) alone may no longer paint an accurate picture of the true user experience.

Fast-evolving app ecosystem

App Experience Optimization (note the caps and the Z to denote the branded product name) is a new service from the firm that claims to be able to "transform how operators optimise their networks" to meet the new demands created by a fast-evolving app ecosystem.

Not directly used by software application developers as such, but of interest to those who want to know how their apps are being served from the back end -- this service aims to create a picture of the local app experience and correlate it with network-related KPIs, which can then be acted upon.


Jason Marcheck, service director for service provider infrastructure at analyst house Current Analysis, has said that Ericsson has always paid attention to how user interaction with the network impacts its operators' customers.

"This latest launch brings end-users' app experiences into the mix, marrying network optimisation services with insights from collaborations with over-the-top service providers to help networks perform better in ways that end users value most," said Marcheck.

Mendix CTO: If developers want RAD, get aPaaS

bridgwatera

Mendix is a company that offers what we call an aPaaS: an Application Platform as a Service.


That doesn't tell us much.

So what is an aPaaS?

An aPaaS is meant for software application development pros to build business applications faster than by traditional methods.

Why and how?

Because an aPaaS is hosted, the operating system can be updated and upgraded more frequently by automatic controls, maintenance can be performed from more of a backend position, and there is potentially better fault tolerance and (obviously) scalability, because this is cloud computing.

Mendix says programmers can use its aPaaS to design multi-device, multi-channel enterprise applications with their own data model, complex business logic, process flows and integrations using visual models and directly deploy to users.

Mendix's CTO Johan den Haan says that if programmers want to work in Rapid Application Development (RAD) environments today, they need aPaaS.

"Early RAD was great in theory, difficult in practice," says den Haan.

He argues that this is because the notion of fast, iterative development involving end users was years ahead of technology's ability to support it.

Thanks to the convergence of social, mobile, analytics and cloud, the promise of RAD is finally being realised.

Guest speaker content follows:

The following commentary comes directly from Mendix's CTO Johan den Haan.

Rapid application development's resurgence can be traced to the iPhone.


The explosion of mobile devices has given rise to app companies that are disrupting industries and forcing traditional businesses to reinvent themselves as software companies. Businesses need more apps to compete and they can't afford to wait years for them to be built.

In addition, our experiences as consumers have radically transformed our expectations for business software.

We're now used to apps that are built rapidly and updated frequently; that work across any device; and that are simple and intuitive. When IT can't deliver, business users simply take matters into their own hands.

Rapid Application Development is the 'new black', and it's on everyone's radar, from analysts like Forrester and Gartner to the big enterprise software vendors.

For proof, look no further than the arms race between major cloud platform providers. Some are bringing rapid application development capabilities to market organically, while others are partnering with established vendors to round out their cloud portfolios.

Clocked creates socially-holistic profile matchmaking app

bridgwatera

A London-based software application development shop has produced a new app designed to tackle online matchmaking with an altogether more integrated and socially-holistic approach.


Clocked connects a user's various online profiles to form a 'compatibility score' for potential matches.

Each day users are matched with potential suitors, with a couple of wildcards thrown in for good measure.

Okay, so what, you say?

Clocked's predominantly Agile-centric team says it is the first app to reveal compatible people partially based on information automatically pulled from users' various online profiles:

  • Instagram,
  • Facebook and
  • LinkedIn.

So how was it built?

"We are an Agile shop," confirms Ben Lambert, CEO and founder of Clocked.

"The development was also done in Poland in two-week sprints... and using scrum. I have followed the whole thing on Trello (a free web-based project management application) and project managed from the UK."


"The back end is built on Ruby/PostgreSQL for high scaling potential. The front end is in ObjC and Swift, giving the nice UX/UI," added Lambert.

Cinderella twist

The team also built in a function which it calls the Cinderella twist, i.e. users need to act on suggested profiles by midnight or they disappear.
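As a rough sketch (the function name and date logic here are hypothetical, not from Clocked's actual codebase), the midnight-expiry rule can be expressed in a few lines:

```python
from datetime import datetime

def is_expired(suggested_at: datetime, now: datetime) -> bool:
    """A suggested profile 'turns into a pumpkin' once the calendar
    day it was offered on has passed, i.e. at the stroke of midnight."""
    return now.date() > suggested_at.date()
```

A suggestion offered at 9am stays actionable until 23:59 that day and has expired a minute after midnight.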

In addition to location & age, Clocked takes into account values and other online profile information when suggesting how suitable a match is -- when browsing through profiles users will be able to see where shared similarities exist, making it easier to start those initial conversations.

This is shown by the 'Clocked Compatibility Rating' out of 5 stars.
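Clocked hasn't published how the rating is computed, but one plausible sketch (entirely hypothetical names and weighting, not the app's real formula) is a Jaccard-style overlap of interests pulled from the linked profiles, scaled to five stars:

```python
def compatibility_stars(profile_a: dict, profile_b: dict) -> float:
    """Hypothetical sketch: rate two users out of 5 stars by the
    overlap of interests gathered across their linked profiles
    (e.g. Instagram, Facebook, LinkedIn), each given as a set."""
    a = set().union(*profile_a.values()) if profile_a else set()
    b = set().union(*profile_b.values()) if profile_b else set()
    if not a or not b:
        return 0.0
    overlap = len(a & b) / len(a | b)  # Jaccard similarity, 0..1
    return round(overlap * 5, 1)
```

The shared items (`a & b`) are also what the app could surface as conversation starters when users browse a match.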

Clocked is free and can be downloaded from Apple iTunes; an Android version is coming soon.


HP Haven Predictive Analytics: operationalising large-scale machine learning

bridgwatera

HP wants its new Haven Predictive Analytics product to be viewed as a route to operationalising large-scale machine learning and statistical analysis for today's big data volumes -- the technology is powered by HP's Distributed R programming language offering.

But isn't that all a bit of a mouthful?

Let's break it down.

Why predictive analytics and predictive modeling?

Because determining future outcomes and trends from existing data sets (potentially) allows firms to predict everything from customer buying behaviour to fraud detection to industrial plant machine downtime.
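To make that concrete, here is a toy predictive model -- a plain least-squares trend line in Python, nothing to do with HP's actual algorithms -- that extrapolates, say, monthly machine downtime one period ahead:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x, the simplest
    possible predictive model over historical observations."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

def predict(a, b, x):
    """Extrapolate the fitted trend to a future point x."""
    return a + b * x
```

Real predictive analytics at HP's scale swaps this toy for parallel algorithms over distributed data, but the shape of the task -- fit on history, score the future -- is the same.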

Why Distributed R?

Because Distributed R is R itself, with new language extensions and a runtime to manage distributed execution, i.e. in bigger enterprise environments.

Why is operationalising big data volumes a big deal?

Because none of this technology is easy from the get-go, HP is trying to kick-start its use with out-of-the-box algorithms (yes, sorry, that is a thing): a set of proven parallel algorithms that produce accurate and consistent (so says HP) results with mature standard R algorithms.

The software itself enjoys native integration with the HP Vertica columnar massively parallel processing (MPP) database, which is supposed to increase overall data access performance and allow software application development professionals to start building software with predictive analytics inside.

ODBC parallel data loaders for dummies

Shilpa Lawande, GM of platform at HP's Software Big Data Business Unit, suggests that when HP Distributed R is deployed with HP Vertica, overall data access performance is boosted by as much as five times over standard R ODBC (open database connectivity) parallel data loaders. According to a press statement, "Since Vertica fully supports industry-standard SQL queries, it enables a much broader community of developers and DBAs to employ the power of predictive analytics without the burden of learning an entirely new technology or tool."

HP reminds us that the open source R language is used by "millions of data scientists around the globe" to interpret, interact with and visualize data. It has been a powerful tool in tackling predictive modeling tasks such as drug discovery and financial modeling.

"Unfortunately, due to its inherent design, it has been challenged to process large data sets. HP worked out of HP Labs and HP Software to create its Distributed R extension and the result of this strategic initiative is the industry's first open source version of a distributed platform for R that is explicitly designed to address today's demanding Big Data predictive analytic tasks," said the company, in a press statement.

Comfortable warm R feelings

Now the global developer community can employ R to scale to more than a billion predictive records of data - and this is said to be 'an order of magnitude improvement' over traditional R-based performance. This offering from HP also retains the consistency with R and enables data scientists to use their familiar R console and RStudio to work with Distributed R -- and this could indeed be important for R converts.

Microsoft not cloudy on ISO cloud privacy

bridgwatera

Microsoft this week claims to be the first company to adopt the world's first international standard for cloud privacy.

The standard (known to its friends as ISO/IEC 27018) was developed by the International Organization for Standardization (ISO).


The standard exists with the intention of establishing a uniform international approach to protecting privacy for personal data stored in the cloud.

The British Standards Institute (BSI) has now independently verified that in addition to Microsoft Azure, both Office 365 and Dynamics CRM Online are aligned with the standard's code of practice.

Personally Identifiable Information (PII)

The code of practice is meant to oversee the protection of Personally Identifiable Information (PII) in the public cloud.

Microsoft promises that its adherence to the standard ensures that the firm only processes personally identifiable information "according to the instructions" that users provide.

According to the company, "Adherence to the standard ensures transparency about our policies regarding the return, transfer, and deletion of personal information you store in our data centers. We'll not only let you know where your data is, but if we work with other companies who need to access your data, we'll let you know who we're working with."

If we have a break-in, we will tell you

In addition, if there is unauthorised access to personally identifiable information or processing equipment or facilities resulting in the loss, disclosure or alteration of this information, Microsoft says it will let you know about this.

Do you feel safer knowing that?

Okay not really, but it does get better...

Microsoft confirms that it will inform you about government access to data.

"The standard requires that law enforcement requests for disclosure of personally identifiable data must be disclosed to you as an enterprise customer, unless this disclosure is prohibited by law. We've already adhered to this approach (and more), and adoption of the standard reinforces this commitment," said the company.

Progress then? Mostly yes - keep it up Microsoft.

Codeship fires Continuous Delivery rocket boosters

bridgwatera

Codeship, a continuous delivery platform, has announced ParallelCI -- its latest product designed to help 'get software to market', as they say.

What's inside a continuous delivery platform then?


Codeship turns a manual product release process into an automated one.

This type of product centres on automating the process of running and testing product releases.

The firm claims this can now be as much as 10 times faster than Codeship previously allowed.

Aggressive production schedules

"As Codeship continues to grow, so does the diversity of our customer base. Companies with upwards of 30 developers, complex applications and aggressive production schedules simply cannot afford to waste time with a slow test environment," said Moritz Plassnig, CEO of Codeship.

"With the launch of ParallelCI, all our customers will substantially speed up their build times with little effort. This directly impacts the bottom line, as developers are more productive and engineering resources can be reallocated."

Sequential loveliness

Previously, build or test commands ran sequentially, one after another.

With ParallelCI, the developer can now set up extra "pipelines" so they can run multiple commands in parallel, creating faster feedback loops -- and ultimately getting to the finish line in the building process faster.
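Conceptually, the change looks like the sketch below (a generic illustration in Python, not Codeship's implementation): each pipeline still runs its own commands in order, but the pipelines themselves execute concurrently.

```python
from concurrent.futures import ThreadPoolExecutor

def run_sequential(commands, run):
    """Old model: each build/test command waits for the previous one."""
    return [run(cmd) for cmd in commands]

def run_pipelines(pipelines, run):
    """ParallelCI-style model: each pipeline is an ordered list of
    commands, but the pipelines run side by side, so total wall-clock
    time approaches that of the slowest pipeline, not the sum."""
    with ThreadPoolExecutor(max_workers=len(pipelines)) as pool:
        futures = [pool.submit(run_sequential, p, run) for p in pipelines]
        return [f.result() for f in futures]
```

Splitting a long test suite into, say, a lint pipeline and a unit/integration pipeline is exactly the kind of feedback-loop shortening described above.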

Ah, so not quite x10 faster then?

Early users of ParallelCI have reportedly already doubled the speed of their tests and builds, says the firm.

ParallelCI is currently included in Codeship's paid plans. Existing customers will receive access to ParallelCI automatically.

Sencha CEO: one (application deployment) size does not fit all

bridgwatera

HTML5 desktop and mobile web application development company Sencha has released its Space 1.3 product.

The software is intended to help deploy, manage and analyse business applications.

Here's the justification for the product's existence:

Organisations are increasingly tasked with meeting the IT needs of a diversified and extended workforce, in which many employees work from home, regionally in the field or all over the world. These employees may not have immediate or guaranteed access to a network connection, yet are dependent on their organisations' applications to perform their job -- certain industries, such as the healthcare and energy sector, require employees to continuously handle mission-critical and sensitive real-time data at field locations. This data must be readily and securely available to everyone else in the company with the right level of user authorisation.

One size does not fit all


"In today's high speed and diverse world, there is no single application or device that works for every person in an organisation - IT environments and end users are too varied and complex," said Art Landro, CEO of Sencha.

Multiple devices and operating systems

"With Space 1.3, we are extending capabilities to address crucial usage differences in the modern workforce, empowering organizations to deliver the ideal user experience no matter where employees are located or what applications they are using."

Sencha Space 1.3 introduces a new offline functionality for control over app versioning, allowing for multiple and different versions of the same application based on the user's profile.

Updates are also simplified, as Sencha Space provides HTML5 application development technology, with which IT can push application updates and edits directly to all platforms and devices.

A new white-label client application also features so that organisations can tailor a common look and feel across all platforms and devices.

Reinvention at the core: SAP S/4HANA

bridgwatera

SAP hung out the flags (literally) this month to celebrate the arrival of its SAP Business Suite 4 SAP HANA (SAP S/4HANA) product.

Did you get the idea yet that the marketing people like you to mention SAP by name?


Optimal optimisation?

This is the company's set of business software fully built on (and therefore optimised for) its own HANA in-memory platform -- which, in itself, is optimised for the Intel chipset.

SAP punts the optimisation charge one step further and says that this software is aligned to the "most modern design principles" with its own Fiori user experience (UX) for mobile devices.

Cue lots of use of ® symbols and mentions of SAP before all brand names, obviously.

The newly updated software is offered in cloud, on-premise and hybrid deployment options, naturally -- and comes with guided configuration for adoption.

So this is "on-the-fly insight at the highest level of granularity and re-imagined real-time business processes", or at least that's what it says on the back of the packet.

SAP CEO Bill McDermott has said that SAP Business Suite has been reinvented for the digital age.

"At a moment when businesses around the world need to enter new markets and engage with their consumers in any channel, there's now an innovation platform designed to drive their growth. This is an historic day and we believe it marks the beginning of the end for the 20th century IT stack and all the complexity that came with it," he said.

The new suite is built only for SAP HANA -- but this is a good thing says the company... because it allows the firm to centralise upon its own technology and 'fully leverage' (they mean 'use') the latest in-memory and real-time capabilities of HANA itself.

What the customers think

"Businesses today are awash in data and faced with increasingly complex markets, customer engagement channels and business processes. Transforming technology systems can help simplify business processes, while providing more value to customers. SAP S/4HANA can help users connect processes, devices, big data and networks in real time."

"SAP has combined its expertise in business software applications with the unique power of SAP HANA to help businesses jump-start innovation and manage processes smoothly." - Rodney Seligmann, advisory principal and SAP global Alliance leader at PwC.

Editorial Disclosure: Adrian Bridgwater works for ISUG-TECH, the wholly and completely independent non-profit technical user group dedicated to SAP programming and data management technologies -- he is not an employee of SAP and receives no remuneration from the company.

Image credit: SAP

MuleSoft: The Internet of... woah! hold on there just a moment

bridgwatera

The Computer Weekly Developer Network blog talks to Ross Mason, founder of MuleSoft.

The firm aims to connect applications, data and devices with its Anypoint Platform featuring the Anypoint Platform for Mobile.


Mason has said that when we think about the so-called Internet of Things, we should slow down and ask: is the Internet of Things really here?

How will we know?

More and more things around us have sensors that passively collect information about people and activities, then this data is used by our apps on smartphones and tablets. Yet these things around us don't yet feel like they are making a big impact.

At least, not yet they don't.

The challenge is all these 'things' are still unconnected.

We need to integrate the applications, data, clouds, APIs, supply chains and partner ecosystems that make it possible for start ups and enterprises to deliver new business and consumer value.

CWDN -- With more sensors collecting information about people and activities, how do we move on from here?

Mason -- Developers in the IoT space need to think very differently about scale, reliability, security and dealing with many more connected consumers. Our traditional architectures need a rethink. IoT brings in a new era for edge computing and developers working on IoT projects have to think about more layers to enable 100,000s or millions of sensors to exchange information with the back end systems and also with each other.

One layer emerging is termed the Fog. Unlike the cloud, the Fog layer is concerned with connecting the sensors to backend or cloud systems. The Fog layer is essentially a collection of hubs that sensors connect to, which can be managed remotely but also have enough smarts to communicate with each other.
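A minimal sketch of that idea (class and method names are illustrative, not from any vendor's product): a fog hub buffers raw readings from local sensors and forwards only compact aggregates upstream, sparing the backend from millions of individual messages.

```python
class FogHub:
    """Illustrative fog-layer hub: collects raw sensor readings
    locally and ships only summaries to the backend/cloud."""

    def __init__(self):
        self.buffer = {}  # sensor id -> list of raw readings

    def ingest(self, sensor_id, value):
        """Accept a raw reading from a nearby sensor."""
        self.buffer.setdefault(sensor_id, []).append(value)

    def flush(self):
        """Summarise and clear the buffer; the summary is all that
        would actually travel over the network to the cloud."""
        summary = {
            sid: {"count": len(vals), "mean": sum(vals) / len(vals)}
            for sid, vals in self.buffer.items() if vals
        }
        self.buffer.clear()
        return summary
```

Because each hub holds state and logic of its own, hubs can also exchange these summaries with one another, which is the "enough smarts to communicate with each other" property described above.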

CWDN -- What's involved and how much 'heavy lifting' is needed at the developer architecture end of the API spectrum?

Mason -- Generally, because IoT-type architectures are new for most people, there is a lot of heavy lifting around device management, data management, architecture and connectivity to other systems. APIs are typically used to a) provide an interface to hub devices that sensors or smaller devices connect to, or b) provide access to the server side, where developers can access data and maybe control some aspect of the devices either directly or through a hub.

Building APIs has gotten a lot easier with open languages to express APIs, such as RAML, and web-based tooling like MuleSoft's API platform (disclaimer: I founded MuleSoft) to enable developers, architects and product managers to take a design-first approach to APIs. These tools allow you to create repeatable patterns - traits of an API - and re-use them in all of your API implementations. This saves time, reduces usability problems and solves the major issue of creating consistent APIs across teams.

CWDN -- Highlight the advantage of creating APIs on the fly / seeing the outcomes in real-time whilst designing the API - can you explain why this is important for developers?

Mason -- This point about API-first design is important. It introduces the concept of APX (Application Programming eXperience) to API development, borrowed from user interface design. It changes the way APIs are built today. It puts the focus squarely on the consumer of the API rather than the more technical aspects of building APIs. Most enterprise APIs are coded directly and then the code is annotated to describe the API interface (i.e. Java's JAX-RS). This is fraught with problems, since the bit that the end consumer sees is slapped on as the code is written and there is no real design process to create an API around the requirements of the consumers.

A real example of this (a company I can't name) has a mobile team and an API services team. When the mobile team needed a new search API, they spec'd it on paper and gave it to the API team, who took it away and then spent two months creating this new API. When the new API was available, it wasn't what the mobile team needed. Partly it was the fault of the mobile team for not specifying everything properly, and partly it was the fault of the API team, which made assumptions and misinterpreted some requirements. Now wouldn't it have been better if they could have created the API by simply defining it with a simple language like RAML, collaboratively, in a couple of hours?

What if they could then quickly mock out the service so the mobile team could actually get a feel for it? Then, once they agreed on a design, they could lock it down and the API team could invest the time in building it while the mobile team built their application against the agreed mock API. Introducing the design phase up front and using tools like MuleSoft's Anypoint Platform for APIs allows teams to work together in this way and focus on building APIs that are designed with the user experience first.
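Stripped of tooling, the mock-out step Mason describes needs very little machinery. This generic Python sketch (not RAML or Anypoint, whose syntax differs, and every name here is invented for illustration) writes the agreed contract down, stubs the responses, and gives the client team a cheap conformance check to build against:

```python
# Agreed "contract" for a hypothetical search endpoint, written
# down before any real implementation exists.
SEARCH_CONTRACT = {
    "path": "/search",
    "params": ["q", "limit"],
    "response": {"results": list, "total": int},
}

def mock_search(q, limit=10):
    """Stub that honours the agreed response shape; the mobile team
    can code against this while the API team builds the real thing."""
    results = ["canned result for '%s'" % q][:limit]
    return {"results": results, "total": len(results)}

def conforms(payload, contract):
    """Cheap contract check: right keys, right value types."""
    spec = contract["response"]
    return set(payload) == set(spec) and all(
        isinstance(payload[k], t) for k, t in spec.items()
    )
```

When the real service lands, the same `conforms` check can run against its responses, so both teams stay honest to the design they locked down.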

CWDN -- Will we be able to turn websites in their entirety into API channels?

Mason -- It's possible, but the tools for creating websites are so good and easy, and there are so many developers with those skills, that we'll keep doing a mix of traditional and API-driven web sites. New companies are already thinking API-first or mobile-first for everything, so the shift away from traditional 3-tier web sites is gradually happening. Note that API strategies in the enterprise are being driven by mobile, not web sites.

Women in data

bridgwatera

Just when there aren't enough women in data, along comes a whole book of them.

Well, in all seriousness, last week's AnsibleFest in London was a day-long conference for sysadmins, developers, DBAs and related engineering professionals to dig deep on configuration management but...


... of the 300 attendees, only a single handful were female.

Thankfully though, the wider imbalance is slowly and deliberately being rectified piece by piece.

O'Reilly writer Cornelia Lévy-Bencheton's new 2015 book is simply titled 'Women in Data', and she says the gender gap in tech is shrinking.

An underrepresented minority

Women are still an underrepresented minority in the disciplines of science, technology, engineering and maths (STEM), but women in data and technology are no longer outliers or anomalies.

NOTE: She's American, she said "math" as a shortening for mathematics, the above text has been corrected.

For this book, author and data warrior Cornelia Lévy-Bencheton interviewed 15 women in data to learn how they achieved their current level of success, what motivated them to get there, and their views about opportunities for women.

Lévy-Bencheton says that introducing women to STEM is now a nationwide crusade (she's American, what she meant to say was 'international' and 'global'), but advancing the idea that gender diversity fuels creativity, innovation and economic growth is still a challenge.

The stories in this book are inspiring, revealing insights that will widen the path for even more women in tech.

These interviews explore:

• The expanding role of the contemporary data scientist
• New attitudes towards women in data among Millennials
• Benefits of the data and STEM fields as a career choice for women
• Much needed and increasingly sought-after remedies for closing the gender gap.

The Computer Weekly Developer Network blog spoke to Carla Gentry, who is based in Louisville, Kentucky, USA, in her role as data scientist at Analytical Solution -- a firm created to assist small companies that don't have an analytics department, or companies that need an analyst to come in for a few hours on a per-project contract. Gentry's comments follow below:

"I hope my story inspires someone who hasn't had the 'normal' career to see there are many ways to push through adversity. Just because you have kids or are divorced, you can still be a successful woman in tech, business, or whatever you want. Never let anyone tell you that you CAN'T do anything and if they do, then make it your life's mission to prove them wrong! Best wishes to all the ladies out there that want to better themselves and stand on their own two feet :o)..."

F5 framework targets slipshod IT delivery

bridgwatera

Fragmentation is everywhere.

Applications are fragmenting into cloud-based services, the Internet of Things is fragmenting (some would say 'cracking up') around a disparate set of disconnected systems with no clear base of standards, and mobile is fragmenting around the constant battle for market shipment dominance.


In other words, the data and application delivery landscape is (at times) a slipshod surface with the potential for potholes and pitfalls.

F5 Networks is aiming to target the slipshod and slapdash and bring order where there is chaos.

The firm used its appearance at Cisco Live this week to announce the next version of its F5 BIG-IQ, the company's management framework.

The product is intended to provide a single point of integration for administering and orchestrating security and application delivery services.

Role Based Access Control (RBAC)

BIG-IQ promises collaboration between the network operations centre and DevOps teams (if they actually exist as one unit) by centrally managing application delivery and by employing role based access control (RBAC).

The resulting workflow simplifies operations, says F5 -- it also boosts efficiency and frees the network team from having to set individual application delivery policies.
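The RBAC idea underpinning this is simple enough to sketch in a few lines of Python. This is a toy illustration of the general pattern, not F5's implementation -- the role and permission names here are invented for the example.

```python
# Toy role-based access control (RBAC): roles map to sets of permissions,
# and a user may perform an action only if one of their roles carries the
# matching permission. All names below are invented for illustration.

ROLE_PERMISSIONS = {
    "network_ops": {"view_policy", "edit_network_config"},
    "devops": {"view_policy", "deploy_app", "edit_app_policy"},
    "auditor": {"view_policy"},
}

def is_allowed(user_roles, permission):
    """Return True if any of the user's roles grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in user_roles)

# A DevOps engineer can change their own app's delivery policy...
print(is_allowed(["devops"], "edit_app_policy"))      # True
# ...but cannot touch the underlying network configuration.
print(is_allowed(["devops"], "edit_network_config"))  # False
```

The point of the pattern is exactly the one F5 makes: application teams get self-service within their own remit, while the network team keeps control of the network-wide settings.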

"As companies continue to invest in hybrid infrastructures to enable more nimble application delivery, BIG-IQ integrates both RBAC and a system for staging and scheduling configuration changes," said Karl Triebes, EVP of product development and CTO at F5.

Key capabilities include role-based central management of application delivery functions across the network to increase agility with software-defined orchestration of application services.

There is also 'orchestrated application delivery in the cloud' to enhance connectivity and partner integration with expanded orchestration and management of cloud platforms via third-party developers, as well as improved customer experience via workflows and integrations.

Rackspace #PowerOfSearch: defragging the data supply chain with contextual intelligence

bridgwatera | No Comments
| More

As you will know, journalists love nothing better than a really early start -- as such, the global PR industry is fond of staging what are known as 'power breakfasts' now and again.

So it came to pass this week that Rackspace hosted another in its series of coffee- and croissant-fuelled get-togethers.

Rackspace #PowerOfSearch


Attendees at this event included: Nigel Beighton, VP technology, Rackspace; Chris Harris, VP of international at Hortonworks; Mark Harwood, a developer from Elasticsearch; Peter Owlett from Capgemini and Tony Duffy, e-commerce manager at Oddbins.

Beighton contended that most ecommerce websites have moved on very little (or not at all) since the turn of the millennium.

Silos vs. centralised architectures

The problem here (the speakers suggest) is down to fragmentation and the fact that the "data supply chain" fails to exist as one single solid stream because it has not been unified.

This issue is further compounded by the fact that elements of IT sit so separately from each other:

• transactional processing sits apart from...
• analytics which sits apart from...
• bricks and mortar IT... and so on

According to a recent survey from Rackspace, almost half (45%) of UK consumers actually prefer to shop on the high street instead of online.

"Frustrated by long-winded search functions and too much choice, over a third of shoppers will give up browsing a website after just 10 minutes if they can't find what they want. This demonstrates consumer frustrations at search capabilities and retailers' inability to use big data to offer a truly useful experience online, beyond the best price," said the company, in a press statement.

Looking at routes to defragmentation

So looking at data streams now, and thinking about how software application development professionals will code to a data landscape that is fragmented with too many imprecise, undefined elements -- where do we go next?

Rackspace's Beighton argues that much of the challenge comes down to TRUST -- and the question of where we are happier to provide access to personal information.

Contextual intelligence

Computer Weekly technology editor Cliff Saran suggested that Amazon's search frustrates him: the web services driving the site's offers keep promoting products he has already purchased.

Surely the solution here comes down to contextual intelligence:

• If I buy an exercise bike (a high value item that may be a once in a lifetime purchase) then I should not see ads for this item again.
• If I buy a birthday present, then the system should be able to learn that it is a once-a-year purchase.
• If I buy a food item or other similarly regularly repeatable product, then flagged promotions are more permissible.
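Those three rules boil down to a small piece of filtering logic. Here is a minimal sketch in Python -- the category names and repurchase windows are invented for illustration, not anything Rackspace or Amazon actually runs.

```python
from datetime import date, timedelta

# Invented repurchase windows per purchase category: None means
# "effectively never bought again", so suppress further promotions.
REPURCHASE_WINDOW = {
    "one_off": None,                   # e.g. an exercise bike
    "annual": timedelta(days=365),     # e.g. a birthday present
    "consumable": timedelta(days=14),  # e.g. groceries
}

def should_promote(category, last_purchased, today):
    """Only re-promote an item once its repurchase window has elapsed."""
    window = REPURCHASE_WINDOW[category]
    if window is None:
        return False  # once-in-a-lifetime purchase: never re-promote
    return today - last_purchased >= window

today = date(2015, 2, 1)
print(should_promote("one_off", date(2015, 1, 1), today))     # False
print(should_promote("consumable", date(2015, 1, 1), today))  # True
print(should_promote("annual", date(2015, 1, 1), today))      # False
```

The hard part, of course, is not the rule itself but knowing the category and purchase history in the first place -- which is exactly the data supply chain problem the speakers described.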

The lesson here for CTOs is one that should make them look back at their own data supply chain and their internal approach to information sharing -- and we could be talking about in-company data usage.

Coming back to Beighton's point... we cannot always expect all of this information to be available and it comes down to trust -- and perhaps, privacy, identity and security.

This debate may have highlighted some of the issues impacting search; the trouble is, even if we do know how to make things better, privacy and trust may represent a barrier here.

The Chief Data Officer's job will always be a tough one.

Gut instinct drives more firms than big data analytics

bridgwatera | No Comments
| More

Big data is great, isn't it?

The rise of advanced analytics exerted upon big data stores and the opportunities for software application developers to engineer new 'insight-empowered' applications (and embedded application sub-elements) is taking the IT industry by storm.

This of course means that firms up and down the land must surely now be 'leveraging' the new opportunities that exist for operational intelligence, making more money and breaking into new markets, right?


Data platform company Rosslyn Analytics suggests that perhaps only a quarter (23%) of decision makers closely align business strategy to data already held by their organisation.

Machine learning

A specialist in human-driven machine learning and NoSQL technologies such as Hadoop, MongoDB and ElasticSearch, Rosslyn further states that less than half (44%) of the business leaders it spoke to thought that data was considered a strategic asset.

This perhaps suggests that there is still some way to go for the importance of data to achieve widespread recognition.

"Most organisations continue to make decisions without data," said Charlie Clark, CEO, Rosslyn Analytics. "We believe business leaders need to renew their efforts and focus on improving the accessibility and quality of data required to make informed decisions that are aligned to business objectives. In today's age of intelligent, self-service data technologies, there is no excuse for data not to be in the hands of decision-makers."

The research coincides with the launch of Rosslyn Analytics' new business user report entitled, "Data: The Art of the Possible," a first-of-its-kind guide to understanding how all decision-makers can easily create value from multiple different data sources.

Other statistics identified by the recent Rosslyn research include:

  • When asked to identify the biggest barriers to using data, the single biggest challenge -- cited by 43% of respondents -- was that data comes from too many sources and in too many different types.

  • Poor quality of data was cited as the second biggest challenge to data being used within the organisation.

  • Only 40% of respondents believe their organisation effectively exploits its internal data to gain competitive advantage.

  • When asked to rate what type of data was most valuable to the organisation, "product data" was considered, on average, the most valuable.

  • "Customer data" was rated as second most valuable type of data; "financial data" and "spend data" were seen as third and fourth most valuable respectively; "employee data" was seen as least valuable of the data categories.

  • Rosslyn Analytics' "Data: The Art of the Possible" report claims to detail how data already held by an organisation can be enriched to provide tangible value.

    "Our research shows that only 30% of business leaders explore data with a set question," Clark continues. "Understanding data is key to achieving data-business alignment, where data not only informs business strategy but the business strategy also dictates the type of data owned by the organisation."

    For example, by combining finance data with other types of internal and external data, the answers generated can propel growth, increase profitability and meet compliance standards. Enriching product data in a similar way should result in more efficient and more innovative software application development and, eventually, product development.

    Synthetic DevOps: IT operations with bendiness

    bridgwatera | No Comments
    | More

    The new DevOps is more synthetic.

    In terms of real world workflows, the new approach to developer-operations intersections comes from a world that is more malleable, more pliable and altogether more bendy by far.

    DevOps: we have a problem


    Rob Markovich is a networking technology executive at IT operations intelligence company Moogsoft.

    Markovich has said that software application developers and IT Ops managers have agreed that while DevOps is speeding the software deployment process, it's also becoming difficult to maintain service assurance -- so we have a problem.

    What he means is, as new code is being deployed, seeing how it might adversely impact other parts of software is becoming more tricky -- especially when speed is of the essence and Continuous Delivery principles rule.

    Mode 2 DevOps

    "IT support teams are recognising they must embrace new approaches in terms of how they operate as Agile DevOps practices help to speed the time it takes to develop and put new code into production. Gartner's Will Cappelli refers to this era as 'Mode 2' and calls for a new generation of enterprise IT tools to support this pace of change," said Markovich.

    Markovich identifies five characteristics that consistently come up as he talks to enterprise IT shops about what is needed from the next generation of DevOps tools.

    The five principles of Synthetic DevOps

    The new DevOps needs to be...

    Data-driven -- utilising the growing volume of data from throughout the IT stack (and potentially other systems) to layer intelligence on top of intelligence.

    Collaborative -- facilitating cross-team interaction to solve complex, cross-domain challenges.

    Self-learning -- incorporating algorithms that "learn" and further hone themselves over time, lessening the need for manual re-tooling.

    Anti-monolithic/proprietary -- taking a fully open, modular approach, easily integrated with other software and systems within the IT stack.

    Automated -- short-cutting operational processes that were previously performed manually, e.g. discerning service-affecting situations out of clustered alerts and alerting the right personnel to come together, collaborate and remediate the problem.
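To make that last point concrete, here is a minimal sketch of clustering raw alerts into "situations" by service and time window. This is an illustration of the general idea only, not Moogsoft's algorithm -- the window size and alert format are invented for the example.

```python
# Group raw alerts into "situations": alerts on the same service arriving
# within `window` seconds of each other are treated as one incident, so
# operators see a handful of situations rather than an alert storm.
def cluster_alerts(alerts, window=300):
    situations = {}
    for ts, service, message in sorted(alerts):
        current = situations.get(service)
        if current and ts - current["last_seen"] <= window:
            current["alerts"].append(message)   # same incident, same service
            current["last_seen"] = ts
        else:
            situations[service] = {"alerts": [message], "last_seen": ts}
    return situations

alerts = [
    (100, "checkout", "latency high"),
    (160, "checkout", "error rate up"),
    (200, "search", "node down"),
]
grouped = cluster_alerts(alerts)
print(len(grouped["checkout"]["alerts"]))  # 2 -- one situation, two alerts
```

A production system would obviously cluster on far richer signals (topology, text similarity, causality), but the operational payoff is the same: fewer, more meaningful things for humans to look at.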


    Open Norse: what to expect from Monki Gras

    bridgwatera | No Comments
    | More

    No, not Las Vegas, but Shoreditch instead.

    Developer-focused analyst house Red Monk stages Monki Gras in London on January 29-30 2015.


    Nordic scalability

    This year's event is themed around Nordic craft and culture, that zippy zeitgeist cornerstone of software application development.

    We jest -- there's a reason for the Scandinavian slant and it's not just an excuse to chow down on some salmon and aquavit (although those are probably planned too).

    The Nordics have shown a distinctive ability to scale their tech start-ups (arguably) better than London and perhaps even Silicon Valley, where the urge to sell all too often overtakes the desire to refine, blossom, grow and develop.

    Vital Nordic Internet infrastructure, these guys build the web

    The Nordic region has given us Qt, Skype, Kazaa and Spotify -- to name just a few.

    Let's also not forget that Linus Torvalds grew up in Finland.

    What makes Nordic culture so productive, asks Red Monk?

    "Education is clearly fantastic in the region, and the winters are long and cold, perfect for heads down coding," say the organisers.

    "Design and lighting are both very important in Scandinavian culture too. This all comes together. Code, design, liberal social values, education, great taste, modesty, skill, practice. This conference is going to explore all of these themes, and it's going to rock."

    Viking craftwork

    Red Monk's own Donnie Berkholz will use his knowledge and position as an open source developer and leader of Gentoo Linux to present 'Viking Reprise: Nordic Undercurrents in US Tech Culture'.

    Other speakers include Jason Hoffman, head of cloud at Ericsson; Joonas Lehtinen, founder and CEO of the Vaadin project, a framework for easily building great web UIs in Java -- and Patrik Sallner, CEO of MariaDB (formerly SkySQL), who will be presenting 'Sibelius, Sauna and Sisu - Why The Nordics Trust the Community'.

    "Last year's Monki Gras inspired me to brew my own craft beer, so it follows that this year's event taps into the software craftwork coming from the Nordics," commented Per Buer, founder & CTO, Varnish Software -- the company behind Varnish Cache, a trusted open source web accelerator enhancing web performance for businesses online.

    Per will present 'Fighting with Polar Bears and Other Challenges You Encounter When Running a Startup In Norway'.

    "Nordic companies can be a bit low key and bury their innovations under the snow. I'm glad that Monki Gras has come along with its shovel to expose these and share with the community. I'm really looking forward to the event, and also sampling new craft beer!"

    How many tech conferences have all-girl post-punk bands, roast reindeer and deep code goodness on the menu?

    Answer: not many, but there is one.

    Perforce's Über-cluster developer team: 10s of thousands of concurrent users

    bridgwatera | No Comments
    | More

    Version management and collaboration platform company Perforce is envisioning the shape of developer teams to come, which will feature tens of thousands of concurrent users.

    To do this, the firm has added server clustering capabilities to its version management engine, which now enjoys a more horizontally scalable architecture.


    Tales from topographic software oceans

    The latest release also features new high availability (HA) deployment topology.

    Additionally, a new cluster management utility, catchily named P4CMGR, bids to streamline the addition of new Perforce servers and provides monitoring for automated switchover in case of failure.

    This server cluster distributes workloads across multiple nodes, but workload distribution is invisible to users, who connect normally while the cluster intelligently assigns workload to nodes.
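One common way to make workload distribution invisible to users is to hash each client deterministically onto a cluster node, so the assignment is stable and needs no client-side configuration. A generic sketch of the idea, not Perforce's actual scheme:

```python
import hashlib

# Toy workload distribution: map each client to a cluster node by hashing
# its identifier. The same client always lands on the same node, and the
# mapping lives entirely on the server side. Node names are invented.
NODES = ["node-1", "node-2", "node-3"]

def assign_node(client_id, nodes=NODES):
    """Deterministically pick a node for this client."""
    digest = hashlib.sha256(client_id.encode()).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

# Stable assignment: repeated calls give the same answer.
print(assign_node("alice-workspace") == assign_node("alice-workspace"))  # True
```

Real cluster managers layer load awareness and rebalancing on top of this, but the principle is the same: the user connects normally and never sees which node does the work.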

    Standby depot server

    High availability is made possible by a standby depot server that ensures all users keep working when the master depot server must be taken offline for maintenance or fails.

    Automated switchover ensures a new level of uptime confidence vital for financial services, health care, manufacturing and other highly regulated industries.
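The switchover logic behind that promise can be sketched very simply: poll the master's health, and promote the standby the moment a check fails. This is a toy illustration of the pattern, not Perforce's P4CMGR -- the class and server names are invented.

```python
# Toy automated switchover: route traffic to the master depot server while
# it is healthy, and promote the standby when a health check fails.
class DepotCluster:
    def __init__(self, master, standby):
        self.master = master
        self.standby = standby
        self.active = master  # all traffic goes here

    def health_check(self, healthy):
        """Called by the cluster manager with the master's health status."""
        if not healthy and self.active is self.master:
            self.active = self.standby  # automated switchover

cluster = DepotCluster("depot-a", "depot-b")
cluster.health_check(healthy=True)
print(cluster.active)  # depot-a
cluster.health_check(healthy=False)
print(cluster.active)  # depot-b
```

The interesting engineering in a real system is elsewhere -- keeping the standby's data in sync and avoiding split-brain -- but the user-facing effect is exactly what the paragraph above describes: everyone keeps working.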

    "Businesses today need a 24/7 development platform that supports distributed teams working together around the clock," said Christopher Seiwald, founder and CEO of Perforce. "Our server clustering brings the zero down-time ideal within reach while letting administrators expand capacity incrementally for on-demand growth at any scale."

    Optoma projectors: time for games programmers to think big again

    bridgwatera | No Comments
    | More

    The software application development industry's games segment has spent a good portion of the last decade (and more) looking to deliver playable games on ever-smaller devices.


    The move to mobile is of course responsible for this trend.

    Mobile gaming requires that games are reengineered and re-architected with a consideration for devices with:

    • Less memory
    • Smaller screen sizes
    • Compromised input controls
    • Comparatively slower processing power etc.

    This is all great, but hard-core gamers still like consoles and dedicated gaming PCs.

    With this in mind, it came as something of a revelation to test out an Optoma 'short throw' gaming projector which produces a 100-inch picture on a screen or wall from just over a metre away.

    Usability note: As an Xbox 360 gamer, I use a 50-inch Panasonic television for gaming and find this to be quite an explosive experience... the move to an Optoma machine was a pleasant shock. It's fairly impressive and the pixel quality is superb -- you can use the projection screen to get a pretty big image, but if you have a white wall in your house, then the size is mind-blowing.

    This full 3D 1080p projector has built-in speakers, and its vivid colour reproduction is backed by 2,800 lumens of brightness -- and yes, an HDMI connection allows you to watch TV shows and movies.

    Optoma's head of product marketing Justin Halls spins his sales line on the basis that few of us have the space for a large-screen TV, but bigger is better when it comes to gaming, he says -- particularly for high-definition systems.

    "Two HDMI inputs are a huge benefit of the GT1080. Games consoles, Blu-ray players or digital television set top boxes can be simultaneously connected via HDMI, making switching between sources literally the push of a button. It can even be turned into a smart display by connecting a smartphone or tablet with a single cable using MHL to play games, stream videos and share photos on the big screen," said Halls.

    The gaming mode setting is designed to optimise the projector for lightning response times, maximum contrast and vivid colours.


    Other personal thoughts

    There's a real wow factor with this unit. We tested it with Tomb Raider Anniversary (OK, sorry, we like some old titles) and also Battlefield 3. The game experience is certainly immersive, but if you do project up to (for example) 8 or 10 feet wide on a wall, then the game almost becomes too much for your brain to take in -- the 3D shapes in Tomb Raider are hard to compute in your own head and the action in Battlefield is almost too real. But that's not really a criticism: you have to try this and blow your mind just because you know you can. Does it replace a big TV (it's around the same cost)? For some, yes -- this is a very interesting unit.

    Editorial disclosure: Adrian Bridgwater was lent an Optoma unit for one week.

    How to reboot a Smart Car software system

    bridgwatera | 1 Comment
    | More

    Editorial clarification: The Computer Weekly Developer Network blog primarily exists to cover enterprise-centric software application development and data management industry news and analysis -- we occasionally look at hardware products from an internal software perspective, hence the reason for today's post.



    As we know, we have spent the last couple of decades and more putting more and more electronics into our automobiles.

    From an embedded software perspective this should all be good news i.e. more in-car entertainment, electronic central locking, satellite navigation units and system health intelligence... these are all good things.

    Well they are, if they work.

    But there comes a time when you just kind of wish that your car locks worked by plain old key-and-lock mechanisms, rather than via the 'point-and-click' wireless 'key fob' systems that we have all got so used to now.

    The story here is simple enough.

    A Smart Car's lock system goes haywire when started on Boxing Day after a trip from London to Salisbury, UK -- the ignition key cannot be removed from the car without the central locking going into a sort of 'possessed dance' with itself, constantly clicking on and off.

    Your driver (that's me) has the foresight to a) initially leave the ignition key in the car so that the locks don't burn out and then b) disconnect one of the battery terminals so that the key can be removed.

    A dead car with a disconnected battery is not an attractive option for a thief, it appears -- the car sat like this, untouched, for two weeks.

    So what to do?

    Green Flag Breakdown's superb fleet of (often Polish -- thankfully, these guys are great) engineers are always on call, and our mechanic suggested that it could be down to a 'rear lock solenoid burnout' -- he was close and in the right ballpark.

    The problem was down to what is called the Smart's 'Zee control unit' -- and this is basically a piece of embedded software (firmware, if you like) that resides on a small motherboard located under the dash quite near to the fuses.


    The Zee unit is accessible via a large port similar to the power input socket on the back of an Xbox 360.

    Mercedes bends

    But here's the real killer part of this story: I took my car to the main London Smart centre, Mercedes Brentford -- and these guys knew what was wrong.

    They presented me with an estimated bill of £960 to fix everything that was wrong with the car, including the Zee unit (estimated individually at £264 inc VAT, plus £156 plus VAT for key recoding -- a total of around £420).

    What is gut-wrenching is that the fix was so simple, but I was not offered the option of a simple quick reboot.

    To Mercedes Brentford's credit, they could have charged me for the consultation, but as my car is only worth £1,000 and the estimate was £960, even they felt bad.

    So to the fix

    After many web searches and Facebook discussion exchanges (thank you everyone!) it turns out that your best option (if you need to get any automotive software reinstalled) is not a local garage, but an independent specialist that is not a main dealer.


    Our saviour in this case was the Smart Clinic in Harrow.

    Has (that's his name) at the Smart Clinic saw our almost-dead car, plugged in a Toshiba ToughBook thing, clamped in an Xbox 360 cable to connect it to the car, set about rebooting the whole automobile, started the engine -- problem solved.

    Less than half an hour and £80 including VAT.

    So the moral of this story is...

    ... electric locks are more trouble than they are worth, always trust a Polish mechanic, never go to a main dealer if you can find an independent dedicated specialist and ask your friends on social media what to do if you have major mechanical issues with any piece of equipment because community knowledge is all-powerful.

    Smart Car? Well, sometimes.

    Subscribe to blog feed

    Find recent content on the main index or look in the archives to find all content.