June 2013 Archives

How fast is the developer population expanding?

bridgwatera

Evans Data has made some big-picture projections, offering up its estimate of the future size of the worldwide developer population.

The firm says the total developer population worldwide is expected to increase to 26.4 million by 2019, up from 18.2 million today.

The strongest growth is expected in the APAC region with India and China leading all countries in expected developer population growth.

Russia is also expected to provide strong growth and dominate the Eastern European region.

Slower growth is predicted for North America and Western Europe, while Latin America is expected to continue on a moderate growth cycle.

"APAC has shown the strongest growth for several years," said Janel Garvin, CEO of Evans Data. "India is, of course, the main powerhouse that's driving such a strong surge in developers in that region, but China is also starting to come online and we expect that once it hits its potential China will become a rival to India on sheer numbers of developers."


Einstein's secret answer to unstructured data

bridgwatera

Software application developers today are struggling (we are constantly told) with the task of juggling unstructured (usually big) data.

Grafting unstructured data onto application and data management landscapes populated with other more structured (and even semi-structured) data makes the task even more troublesome.

One data store to bind them all

Mark Warren, EMEA marketing director at Perforce, suggests that the answer here will come (in part) through more automation (de-duping, mining content) and tooling that helps either avoid or reduce duplication.

"For example, centralised repositories which are shared by all and links or references get shipped around not the data. Actually the implementation could be decentralised or distributed but the appearance to the human being is a 'one data store to bind them all' as it were," said Perforce's Warren.

If email is the most visible example of unstructured data growth in most organisations, then shouldn't developers address the applications they build that touch this massively popular communication channel?

Sergio Galindo, head of global product management at GFI Software, reminds us that data from the Radicati Group reports that, on average:

• 144.8bn emails are sent every day,
• 90bn of these are business-related messages and,
• 60% of all email is spam

Galindo comments as follows, "Many users treat their email inbox as a database and repository, as well as a transmission medium, for everything from links to videos to Word docs and spreadsheets. The thing is, email silos are not built for the way they are increasingly being used, and using them as ad-hoc unstructured databases may provide users with a short-term productivity fix, but in the longer term it is both a major hindrance to workflow and places company data at significant risk of loss, corruption and theft."

According to GFI's Galindo, unstructured data management needs a tough, disciplined approach that incorporates good policy and stringent enforcement, alongside training and technology adoption, to ensure that unstructured data sources like email are backed up, archived and retained to enable fast future access.

Strict policy enforcement is important, but we must be careful not to go too far, lest we lose the context for the unstructured data we seek to find and pin down.

Tony Speakman, director at FileMaker, suggests that the key to handling unstructured data lies in the old Einstein quote:
"Make everything as simple as possible, never simpler."

Speakman says that we cannot force people to create information to fit a database, or strip down what they create, or we'll lose the meaning of the data.

"We need to make technology expand to fit around what people create. There will never be total order but we can herd data in a way that people can understand and therefore use effectively. One example is a FileMaker developer working with the Bodleian library to digitise the writings of Voltaire. This involved creating a system of search engine like tags to order transcripts and letters, making it all completely searchable for the end user, without the need for Voltaire to alter his behaviour," said Speakman.

When to use rapid software application deployment

bridgwatera

Software development (and deployment) is changing.

As software application development processes get a) more dynamically refined b) more imperatively pushed towards the need for continuous development c) more continuously integrated and d) more rapidly delivered -- we are naturally seeing a reshaping of the application processes available.

Specifically, it's no longer just a question of Rapid Application Development (RAD), but also a case of rapid-deployment solution modules.

Software firms are now producing deployment suites for specific industries.

SAP technology partner Rolta targets its OneView rapid-deployment suite specifically at the process manufacturing and utility industries.

These are pre-configured solution modules with operations, asset management and maintenance tools to "combine SAP (and non-SAP) software and content with software and services" from Rolta for a final solution.

Rolta says it will "rapidly install" these OneView rapid-deployment modules within predetermined time, cost and service scope.

TECHNICAL DEFINITION: Rapid-deployment solutions are typically ready-to-use combinations of pre-configured software packaged with fixed-price implementation services, content and so-called "end-user enablement" technology that have been specifically tailored to industries or line-of-business needs.

"[This type of solution is] designed to offer a quick, cost-effective method for standardising processes and adopting the latest innovations with fewer migration risks. The average timeframe for a typical deployment is eight weeks or less, which helps customers lower the cost of implementation and speed time to value, while retaining the flexibility to extend the solution according to individual needs," said the company, in a press statement.


Wearable tech? Don't throw away your smartphone yet!

bridgwatera

In this guest post for the Computer Weekly Developer Network we hear from Gary Calcott, technical marketing manager at Progress Software, with an explanation of why exponents of 'wearable' technology shouldn't throw away their smartphones just yet.

Could wearable tech start replacing the smartphone?

The announcement of Google Glass a little over a year ago turned so many heads that it wasn't long before some even began to question whether it, and other similar pieces of wearable hardware, were on the verge of replacing the smartphone.

So do these devices spell the end for our use of smartphones as business tools?

The simple answer is 'not yet'.

Wearable tech's "limited effectiveness"

The fact is that the ability of technology like Google Glass to add discernible business value will be judged on the usability and effectiveness of the applications such devices can run. However, with limited screen real estate and processing power, there has to be a limit to the effectiveness that bespoke apps for these devices can provide.

Perhaps the real value in these devices will come from using smartphones in a slightly different way?

By using the Google Glass API to connect to other mobile devices, more data-intensive applications can be made available to users on the wearable. In turn, the wearable device will act as a conduit for notifications generated by applications running on the smartphone sitting in the user's pocket.

NOTE: As a result (if the above suggestion holds), the role of the developer will become increasingly critical in allowing wearable devices to access the information they need from a smartphone at the back-end.
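As a rough illustration of that conduit role, the phone-side application might boil a rich notification down to the small "card" a constrained wearable screen can usefully show, keeping the heavy payload on the handset. This is a hypothetical sketch, not the actual Google Glass (Mirror) API:

```python
def to_wearable_card(notification: dict, max_chars: int = 80) -> dict:
    """Reduce a rich smartphone notification to the minimal card a small,
    low-power wearable can display. The full payload stays on the phone;
    the wearable only receives a summary plus a reference it can hand back
    when the user taps the card."""
    body = notification.get("body", "")
    return {
        "title": notification.get("title", "")[:40],
        "summary": body[:max_chars] + ("..." if len(body) > max_chars else ""),
        "ref": notification["id"],  # phone keeps the full payload under this id
    }

# Hypothetical notification produced by an app running on the smartphone.
incoming = {
    "id": "msg-9182",
    "title": "Build failed",
    "body": "Job #442 failed on step 'integration tests' after 14m 02s. Full logs attached.",
}
print(to_wearable_card(incoming))
```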

At the same time, it's clear that it will provide a significant business opportunity for ISVs who have the expertise to enable smartphones to use the APIs for these devices to add even further value. This, in turn, could create a future API economy of sorts, with ISVs playing a key role in the ability of businesses to successfully use these devices.

Wearable tech version 2.0 already

We could even see the next generation of smartphones being built with the specific purpose of allowing greater usability through APIs to newer, even more portable devices for businesses.

There's no doubt that we're at the beginning of a new frontier in wearable technology, and that devices such as Google Glass are only the tip of the iceberg.


IBM and Microsoft build cloud application lie detector

bridgwatera

Microsoft Research and IBM have developed a new project with the aim of verifying computations carried out in cloud environments to provide a higher level of validation.

Project Pinocchio for "Nearly Practical Verifiable Computation" exists to address privacy and security concerns for computations carried out in third-party cloud application environments.
More specifically, Pinocchio seeks to provide the user with confirmation that their data has been handled correctly and to verify the correctness of the results returned.

According to the Microsoft Research blog, "We introduce Pinocchio, a built system for efficiently verifying general computations while relying only on cryptographic assumptions. With Pinocchio, the client creates a public evaluation key to describe her computation; this setup is proportional to evaluating the computation once."

EDITORIAL COMMENTARY: Why is Microsoft Research ruining a perfectly good story with an overtly politically correct use of the term "HER" when no sex needs to be mentioned? *Sigh*. Oh well, back to the interesting part.

Pinocchio will work as a "kind of lie detector" to check whether a cloud service carried out the workload it was supposed to -- or, importantly, whether it might at any stage have been maliciously compromised and diverted or subverted.

Microsoft researcher Bryan Parno has said that the verification key "behaves like a digital signature", in that you can provide it to any third party to check a result.

"The proof [of the cloud application validation] is only 288 bytes, regardless of the computation performed or the size of the inputs and outputs. Anyone can use a public verification key to check the proof," says Microsoft.

The firms jointly confirm that Pinocchio's verification time is typically 10ms: 5-7 orders of magnitude less than previous work.
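The division of labour in a verifiable-computation scheme of this kind looks roughly like the sketch below. To be clear, the function names and the "proof" are hypothetical stand-ins for illustration only; the real Pinocchio system is built on quadratic arithmetic programs and pairing-based cryptography, and its verifier does not re-run the computation as this toy verifier does.

```python
# Illustrative workflow only -- not Pinocchio's real cryptography.

def setup(computation):
    """Client runs this once per computation; the cost is roughly one evaluation."""
    evaluation_key = ("EK", computation)    # handed to the untrusted cloud worker
    verification_key = ("VK", computation)  # public, usable like a digital signature key
    return evaluation_key, verification_key

def prove(evaluation_key, inputs):
    """Cloud worker computes the result plus a short proof (288 bytes in the real system)."""
    _, computation = evaluation_key
    result = computation(*inputs)
    proof = hash((id(computation), inputs, result))  # toy stand-in for the real proof
    return result, proof

def verify(verification_key, inputs, result, proof):
    """Anyone with the public verification key can check the claimed result."""
    _, computation = verification_key
    # Toy check: re-derive the expected proof. (The real verifier does NOT
    # re-execute the computation -- that is the whole point of the scheme.)
    return proof == hash((id(computation), inputs, computation(*inputs)))

square = lambda x: x * x
ek, vk = setup(square)
result, proof = prove(ek, (7,))
print(result, verify(vk, (7,), result, proof))  # 49 True for an honest worker
```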

Parno confirms that Pinocchio is not yet ready for real world usage and deployment.

The birth of the data programmer

bridgwatera

The term "data programmer" now appears more readily on job posting boards serving the software application development community.

More accurately, the job tends to be referred to as data programmer / data analyst.

A random job listing taken today reads:

"The Data Programmer / Data Analyst must be a flexible team player able to use skills in programming and software development, data mining, as well as database management. Work directly with client managers and technical staff to understand business problem, develop predictive models and deploy/implement models into client database/data warehouse system..."

So will this role require new agility and/or skillsets of the individual?

ReadSoft UK's Adam Chapman says that we need to have some way of joining up all the data sources we face today and be able to get a single view of the information within the business -- and that this is the task ahead for the data programmer.

"This means extracting data in a meaningful way from any format of document, tagging it and processing it so that business analytics can then be applied in a meaningful and valid manner," said Chapman.

But while data control (tagging and processing) skills are important, so are modeling and management competencies.

Modeling and management

Anthony Saxby, data platform product marketing manager at Microsoft UK, argues that analytics is increasingly being used to assist organisations in making decisions through the application of modelling to determine responses to trends, identify underserved segments and pursue opportunities for product innovation.

"Whilst a level of analytics has always existed in the computer industry -- in fact the very first application developed for computers was for rudimentary weather forecasting -- the wide availability of huge amounts of processing power and storage has opened opportunities for even the smallest organisation with the required foresight to use data to build a deeper understanding of where to direct their attention."

Deeper circumstantial challenges

SMB owners take note, that was "even the smallest organisation" there, but there are deeper circumstantial challenges ahead.

F5 EMEA product manager Nathan Pearce argues that the key to using data analytics effectively is context.

"There's a huge amount that can be understood about the user and usage -- such as time of day, geographic location, access policy, device, operating system etc. -- to better understand a business and its customers."

"This information can be used to optimise the experience every single time, by routing traffic internally to serve up the app to an end user in the right way. At its most basic level, for example, mobile devices can be directed to a separate web server for a mobile interface while sending desktop connections to full-function applications that are expecting users with full screens and high-speed access," said F5's Pearce.

So the new data programmer cum analyst has a big role ahead. Understanding what "type" of data is actually in hand will be key.

Access, security, control, contextual meaning and interpretation, wider system management and database connection point skills will all come to the fore now.

This is going to be a tough (although interesting) job -- annual salary indicators show that pay is going up in this sector, but perhaps not enough if we take on board some of the vendors' comments above.


SAP: analytics builds "almost neural" future computer systems

bridgwatera

Analytics is one step away from artificial intelligence, and advanced analytics will be used to build "almost neural" future computer systems that can learn complex patterns and control our world.
This is the visionary view of SAP sales director Rob Coyne, who writes on his firm's blog this month in a piece of commentary rather than news.

Descriptive vs. predictive analytics

Coyne defines the difference between "descriptive analytics" and "predictive analytics", and says that descriptive analytics has been used to drive traditional Business Intelligence (BI) up until now, i.e. using historical data to explain what has already happened.

Predictive analytics has been used to identify what might happen in future, based on existing patterns and relationships.
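The distinction is easy to see in miniature: descriptive analytics summarises what the historical numbers already say, while predictive analytics fits a model to them and extrapolates forward. A toy illustration using entirely made-up monthly sales figures:

```python
# Entirely made-up monthly sales figures for months 1..6.
months = [1, 2, 3, 4, 5, 6]
sales = [100, 110, 125, 130, 150, 160]

# Descriptive analytics: explain what has already happened.
print("average monthly sales:", sum(sales) / len(sales))
print("best month so far:", months[sales.index(max(sales))])

# Predictive analytics: fit a simple least-squares trend and project it forward.
n = len(months)
mean_x, mean_y = sum(months) / n, sum(sales) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(months, sales)) / sum(
    (x - mean_x) ** 2 for x in months
)
intercept = mean_y - slope * mean_x
print("forecast for month 7:", round(slope * 7 + intercept, 1))
```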

But a new dawn may yet rise...

Neither descriptive nor predictive analytics has quite set the world alight so far, because when either is focused on (for example) isolated transactional data, such as credit card purchases or other types of data that change over time, those data flows contain very little insight into the behaviour of the individual who generated the transaction.

But, says Coyne, when you can combine multiple data sources and analyse them in real time, predictive analytics starts to get a whole lot more compelling.

SAP's Coyne blogs as follows:

"The advent of real-time data virtualisation, aggregation and processing has enabled automated, actionable insights through predictive modelling, decision analysis and optimisation, and transaction profiling. This is already leading to the creation of advanced, almost "neural" systems, which can learn complex patterns amid large data sets to predict the probability that an entity will exhibit behaviours that are of interest to the business. It's not confined to structured data - embedded social insight is allowing enterprises to embrace a streaming, crowdsourcing architecture to influence their strategy."


IBM Innovate: Ghostbusters is the best startup movie ever

bridgwatera

IBM senior VP Robert Le Blanc has used his speaking opportunities at this year's Innovate 2013 technical summit to reinforce his firm's now quite heavily reverberating DevOps message.

The IBM developerWorks software application development blog describes DevOps as a movement, rooted in the "agile" (as in Agile software delivery) community, to improve the collaboration between the development and operations teams, with an ultimate goal focused on speeding software delivery and improving quality.

For a clearly labelled "technical summit", IBM is focusing on really big picture messages here: initial keynotes and presentations doffed a deferential hat to customers, customers, customers with a side order of enterprise predictability and a weighty after dinner portion of business level application monitoring and optimisation.

Languages, platforms, plug-ins, components, code dependencies, libraries and developer-centric code listings did not feature at the top level morning sessions, as they did perhaps half a decade ago at this event.

This is perplexing on one level, but perhaps an unfair criticism if you take in the whole picture here.

In an interview with the Computer Weekly Developer Network, IBM's Le Blanc highlighted the fact that there are 450 technical sessions being presented here during this week. Even more interesting perhaps is the fact that 350 of these sessions are presented by customers/clients who do so for free, gratis, without payment, for the good of wider information exchange throughout the software application development community.

In terms of where IBM has been (acquisition) shopping to bring DevOps (and, crucially, additional continuous integration and application integration) technologies into its software stack, the firm has most recently acquired:

  • Worklight for mobile development
  • Greenhat for development and testing
  • Urbancode for release and deployment technology
  • Coremetrics for monitoring
  • Tealeaf for optimisation technology

IBM product set

New products announced in the areas of application code creation, testing and delivery include IBM Rational Test Workbench, IBM Worklight, IBM SmartCloud Application Services Trial, IBM SmartCloud Analytics: Log Analysis, plus IBM SmartCloud for Application Insight and IBM SmartCloud Application Services.

So integration and delivery aside, this event was called 'Innovate', so IBM logically had to spend time talking about innovation -- and the firm used two special guests to paint this picture.

First up was Eric Ries, author of "The Lean Startup", who delivered a refreshing presentation in which he talked about innovation and entrepreneurship. Firstly, he asked the audience to turn their phones back on: "I don't want anyone to be disconnected from the Internet on my behalf. I mean, is that even living?"

Ghostbusters startup
Ries then went on to detail his approach to entrepreneurship and explained how Ghostbusters is his favourite startup movie -- they just happened to start up their company at exactly the right time, as the evil Zuul was about to invade Manhattan -- so it's all about agility and the ability to validate a market proposition in the right place at the right time.

"A start up is a human institution designed to deliver a new product or service under conditions of extreme uncertainty," he said -- it is nothing to do with the size of the company or the shape of the product.

So how to be successful? It's all about automation and systems and the ability to pivot (in an agile way, of course)... as we see traditional (MBA-style) management now becoming redundant. In 1911, Frederick Winslow Taylor said, "In the past, the man was first. In the future, the system will be first." Modern startups have to have the ability to pivot on the axis of what Intel's Andy Grove would call strategic inflexion points.

... enter Wozniak

Ries was followed by everybody's favourite computing innovator, Steve Wozniak.
Wozniak explained how he used to innovate in the early days of Apple. "I just used to like creating things, but then every time I did, Steve Jobs would find a way of making money out of it."

Greatness comes from disruptive innovation when you are a young startup entrepreneur, said Wozniak. He cited the example of a class of schoolchildren all given a challenge to work out how long a given number of canoes would take to cross a river, given a certain number of men, their power and other determining factors. Wozniak said that what he would look for is the kid who would ignore all those known variables and tangentially comment that the river might not flow in a normal, congruent pattern, i.e. the individual capable of looking completely outside of the problem -- this is where true disruptive innovation occurs, he said.

Wozniak told his dad, "Someday I am going to own my own computer." His father said that they cost as much as a house. "OK, so I'll live in an apartment," he said. Wozniak asked his employer HP to make the Apple I a total of five times and was turned down each of those five times... he was then approached by Steve Jobs to start a company and the rest is history.

IBM's Le Blanc had some wider innovation messages for developers to take away from all of this. He explained how developers today develop, but that in the future they may be more focused on design and creation, as they become able to concentrate on innovation rather than the mechanics of the software application development process.

If we accept that an increasing proportion of the components used by the typical programmer may be more commoditised in the future -- almost like a car producer which does not actually "fabricate" individual components -- then this higher level "innovate" name for this developer conference may just start to make sense.

Look at what it says on an Apple product box, said IBM's Le Blanc: "Designed by Apple in California. Assembled in China."

So will software application development one day be known as software application design or software application innovation? Not this year maybe, but soon perhaps.

IBM's software language today: Innovate 2013

bridgwatera

Once upon a time, IBM used to make computers. Of course IBM still makes 'some' computers, for internal use and at the higher System z server level, but these days we think of IBM as a software company.

What used to be known as the IBM Rational developer conference has been renamed IBM Innovate.

Although this more commercially tagged moniker almost forewarns you that someone is going to say "leverage" and/or something like "system of interaction" at the morning keynote, IBM has (arguably) got a worthy bushel of deeply technical content to deliver that programmers still come to Orlando every year to drink in.


Same same, but different

IBM is doing some things the same and some things differently on several levels. The programmers are here in droves, it is true -- but over 50 percent of this audience is new to this event. IBM talks about "new trends" when it focuses on social, cloud, so-called "smarter products" and big data. But the company also uses this event to evidence wider software engineering value by focusing on Agile programming methodologies, DevOps issues and data analytics.

So how have things changed in the software-defined business app environment? This is a question that cloud-computing-focused developers will (arguably) want an answer to more than any other... Well, initially, APIs need to have "self-evident value" in the new API Economy as business applications now evolve, says IBM.

If it is a measure of value (self-evident or otherwise) that the API can now bring to the programming team as it works to build and connect new cloud-based applications, then we may view a software engineering landscape more quantifiably analysed than ever before.

IBM BlueMix - how exactly do we develop cloud applications?

IBM now talks about its BlueMix initiative under the jStart element of its Emerging Technologies Group.

This division of IBM appears to have been established to try to answer a question that we don't appear to be discussing openly enough just now, i.e. "just HOW exactly do software programmers develop cloud applications and, even more importantly, WHICH applications should be architected for a cloud-based existence?"

IBM hosts a BlueMix Scenario Definition Workshop service through its jStart/IBM Software Services team, bringing architects and developers together with a customer's own team to identify a candidate application for developing/deploying as a proof of concept with BlueMix.


In terms of new news this week, IBM is augmenting its cloud application development products with an eye on both the development and testing phases of the lifecycle. These technologies will be supported by IBM's Jazz platform, which is designed to bring DevOps power to those applications being developed.

NOTE: IBM explains DevOps as an integrated approach to software delivery that integrates an organisation's culture, processes and tools. It spans the entire lifecycle, from business planning and creation to delivery and feedback -- the goal of DevOps is to enable continuous software delivery allowing businesses to grab market opportunities.

"Software is the invisible thread driving transformations in businesses of all industries and sizes," said Kristof Kloeckner, general manager IBM Rational Software. "As organisations and the dynamic markets in which they conduct business become more complex, it is critical that they adopt a DevOps approach to continuously delivery software-driven innovations to their clients."

What does big data insight actually look like?

Newest among the cloud developer tools here is Log Analysis. This product is designed to give developers a look into (or "insight", to use the industry term) terabytes of unstructured data through automated analysis of IT asset logs. The concept here is that software teams can give organisations "actionable insights" (there's that term again) which a human expert might only deliver if given an unlimited time window.
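The unglamorous core of that kind of log analysis is parsing semi-structured log lines into fields and aggregating them, so that patterns surface without a human reading terabytes of text. A minimal sketch with a hypothetical log format (nothing to do with the internals of IBM's product):

```python
import re
from collections import Counter

# Hypothetical application log lines; real IT asset logs vary wildly in format.
log_lines = [
    "2013-06-04 10:02:11 ERROR payment-service timeout after 30s",
    "2013-06-04 10:02:14 WARN  payment-service retrying request",
    "2013-06-04 10:02:19 ERROR payment-service timeout after 30s",
    "2013-06-04 10:03:01 INFO  catalogue-service request served in 120ms",
]

LINE = re.compile(
    r"^(?P<date>\S+) (?P<time>\S+) (?P<level>\w+)\s+(?P<service>\S+) (?P<message>.*)$"
)

events = [m.groupdict() for line in log_lines if (m := LINE.match(line))]

# Aggregate: which services are producing errors, and how often?
errors_by_service = Counter(e["service"] for e in events if e["level"] == "ERROR")
print(errors_by_service.most_common())  # [('payment-service', 2)]
```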

Also announced here is IBM SmartCloud Monitoring Application Insight: this tool is designed to help businesses monitor the real-time performance and availability of applications hosted on a cloud.

A better way to look at how cloud apps behave

"The ability to embed monitoring capabilities during the development process, makes it easier for companies to understand how an application is being used once it has been deployed," says IBM.

This developer event is also looking at the announcement of the newly expanded IBM SmartCloud Application Services. Developers can use SmartCloud Application Services to deploy and manage applications written in the PHP language using Zend Server 6.

NOTE: PHP is a popular general-purpose scripting language designed for web development. IBM is hoping that support for this language will provide firms with greater choice in development options for programmers creating cloud-native applications.

As much as this event might have moved towards a slightly higher level message set in parts, the firm is also pushing forward efforts to support the global software application developer community here with an expansion of its developerWorks resources. New sites within the developerWorks network include mobile, cloud, big data, WebSphere application development and the new developerWorks Labs.

Core developer event & technical summit

Make no mistake, although IBM will repeat the words "leverage", "insight" and "innovation" this week ad nauseam, this is a core developer event and technical summit -- and the firm is (arguably in some areas) doing a far better job than some others (think about a firm based in the North West region of the USA known for its quite popular PC operating system) in terms of messages and outreach aimed at the developer press.

We wouldn't get up for 7am Birds of a Feather pre-keynote UML get-together sessions if we didn't want to now, would we?
