December 2011 Archives

SAP's 2012 vision: gas meters, Twitter and Aktiengesellschaft

bridgwatera

SAP AG, or as I like to call them, Systemanalyse And Programmentwicklung Aktiengesellschaft (it's just catchier), has had a pretty good year it seems -- in terms of media recognition if not commercially, or indeed both.

Despite not always garnering the most positive headlines around, the company completed a positively received acquisition of Sybase, bolstered its HANA in-memory computing platform and hosted its SAP TechEd event in Madrid in November.

So, as is the way at this time of year, what does SAP see on the road ahead?

No surprise for a company that produces ERP-centric "systems of record" at the scale that it does, SAP sees data analytics (of Big Data) as key to producing business benefits -- or (and it's an expression that's already becoming slightly hackneyed) "real-time insight" into applications and data.

Away from its corporate message set for a moment, SAP suddenly starts to get really interesting with practical real-world examples.

The company talks about homes equipped with smart energy meters that transmit readings back to utility companies every 15 minutes. "Utilities will have to find a way to suddenly manage and analyse billions of meter readings, a phenomenal amount of data," says SAP.
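A quick back-of-the-envelope calculation shows why: a meter reporting every 15 minutes produces 96 readings a day, so even a modest ten million such meters would generate close to a billion readings a day -- around 350 billion a year.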

Retailers can now use GPS data from customers' smart phones to know when they've entered a store and where they are -- and then send them promotions appropriate to their browsing and buying behaviour.

Companies will be plugging into Facebook and Twitter to analyse sentiment and then using this insight to plot out and adapt marketing campaigns.

Sounds like a valid point, doesn't it?

SAP also points to the UK government "midata" project, which is designed to transform the relationship between consumers and corporations.

According to SAP, "The plan is that all sorts of companies will make their data available and then other firms will help consumers to manage it and build useful applications and services on the back of it."

Yet again, more "data analysis" being brought to bear.

Stream of consciousness/tag-cloud...

In other predictions SAP talks about tablets, mobile, iPads, mobile commerce, Windows 8, 6.5 billion mobile workers by 2014, green tech and sustainability, challenges and emerging economies and guess what else?

Go on, guess what?

Cloud computing!

Well, you'd have been disappointed not to see that one on the list right?

But snide, cheap journalistic naysaying aside, the data analytics stuff looks to be well thought through and is clearly going to impact programmers, administrators of all kinds and users in every niche.

So as they say in Germany -- Frohe Weihnachten und ein gutes neues Jahr (Merry Christmas and a happy New Year).

IBM hypothesises on next five years of innovation

bridgwatera

When's the next big thing coming then? When can we expect the next paradigm shift? If you ever get asked this question, the answer is simple -- about five years.

It's a safe bet -- just think about it: five years ago we were barely talking about cloud computing, mass broadband rollout or tablet computers.

IBM is pretty good at hypothesising on future trends; the company's alphaWorks division blends pure and applied research and has produced tangible technology advancements in Unstructured Information Management Architecture (UIMA), autonomic computing and grid computing before now.

So what does IBM think we'll be doing with computing technologies in five years' time?

The company has just released its 6th annual IBM Five in Five predictions list (see what they did there?) to look forward thusly.

If these predictions progress to become real products, they could have a significant impact both on software developers, who will gain new opportunities to build with these technologies, and on us, the users who will use them.

Here's a summary:

You will make your own energy: Anything that moves has the potential to create energy. Your running shoes, your bicycle and even the water flowing through your pipes can create energy.

You will not need a password: Your biological makeup is the key to your individual identity and soon, it will become the key to safeguarding it. Biometrics will witness a new renaissance, says IBM.

Mind reading is no longer science fiction: Scientists are researching how to link your brain to your devices, such as a computer or a smartphone, so you just need to think about calling someone and it happens.


The digital divide will cease to exist: In five years, the gap between information haves and have-nots will narrow considerably due to advances in mobile technology.

Junk mail will become priority mail: In five years, unsolicited advertisements may feel so personalised and relevant it may seem spam is dead. At the same time, spam filters will be so precise you'll never be bothered by unwanted sales pitches again.


Santa outsources North Pole toy factory B2B environment

bridgwatera

Everyone loves a Christmas and Santa-themed technology story (don't they?), so what better time than now to look at how dear old Father Christmas is dealing with outsourcing the management of his B2B environment so that he can spend more time managing the day-to-day running of his toy factory at the North Pole.

E-commerce and integration services company GXS knows a "story crowbar" when it sees one and has creatively drawn up a tale detailing Santa's "Present Delivery Network" (or Santa PDN as it is known), which uses GXS Intelligent Web Forms to allow children to enter their present requests into a simple to use web portal.

Yes, very good, we see what they're doing -- OK, let's go with it for a couple more chunks.


According to Elf Workshop News (dot com), Santa was keen to remove paper-based present requests as he was trying to adopt a greener way of working with both his trading partners and 'little customers' around the world.

The following is an extract from Elf reporter "Buddy The Elf" --

Santa soon realised that the use of social media tools could completely transform not only how he engaged with the children of the world but also with trading partners connected to his Present Delivery Network. In 2009 GXS announced a partnership with a company called RollStream who offered a web-based trading partner collaboration platform.

Following a number of pilot projects across his business, Santa decided that he would like to revisit this in 2011. It was actually on the back of these successful projects that GXS decided to acquire RollStream in order to provide a supply chain collaboration solution for other customers of GXS.


In July 2009, Santa approached Jaguar Land Rover (JLR), a GXS customer, to see if it could come up with a couple of design concepts for a new sledge. The design departments of both Jaguar and Land Rover built working prototypes for Santa to evaluate -- and due to the number of presents that he had to carry, combined with the varying conditions that he flies in, he decided to award the new sledge contract to Land Rover.

OK, Christmas-themed techie fun aside, does Santa's model have anything that actually resonates with software development today -- the cloud, social media and web 2.0 driven computing?

The answer is probably yes, isn't it? As we now "connect and collaborate" in more interconnected ways than ever before, every business delivery model is becoming exposed to modernisation.

From the oldest profession in the world (it begins with a P, if you have to look it up then you're too young to be reading this anyway) -- to the most fanciful elements of Santa's kingdom... we've gone online in a big way.


IBM's "new" Internet: full of toasters, earrings & electronic T-shirts

bridgwatera

Embedded software application development could be a significantly increasing trend for 2012 and onward if IBM's latest thinking is borne out in tangible product development.

This is the upshot of IBM's latest moves to produce what could effectively be a whole new Internet -- or the "Internet of Things" as it is known: one made up of data and intercommunication exchanges between digitally empowered devices, from fridges and toasters to cars, electronically intelligent sports clothing and plant pots.

So how will Big Blue do the do?

For a start, the company recently joined forces with Italian hardware architecture specialist Eurotech to donate a complete draft protocol for what it describes as "asynchronous inter-device communication" to the Eclipse Foundation.

So just how many "connected devices" might we expect to see?

Estimates have hovered around truly massive predictions in the range of 24 billion electronically enabled machines (with Internet connectivity) by 2020.

This is where we start seeing RFID tags on cartons of milk that tell the fridge when they are out of date. The fridge then communicates with the "Household Shopping" application and this subsequently emails the user's PDA with a shopping list and so on...


How far do the possibilities extend here? RFID tags on earrings seem to be about the most off-the-wall application so far.

As part of this news, IBM is releasing Java and C versions of its MQTT technology as an open-sourced Eclipse release under the codename Paho.

NOTE: MQTT stands for Message Queuing Telemetry Transport; essentially this is the machine-to-machine counterpart of HTTP.
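For a flavour of just how lightweight MQTT is to work with, here is a minimal subscriber sketch using the Paho project's Python client, paho-mqtt (a sibling of the Java and C code mentioned above; the broker address and topic name here are purely illustrative assumptions):

import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    # subscribe once the broker has acknowledged the connection
    client.subscribe("home/fridge/milk")

def on_message(client, userdata, msg):
    # fires for every message published to the subscribed topic
    print(msg.topic, msg.payload.decode())

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("test.mosquitto.org", 1883, 60)
client.loop_forever()  # blocks, dispatching network traffic and callbacks

The publishing side is shorter still -- connect in the same way and call client.publish("home/fridge/milk", "use-by date passed") -- which is rather the point of a protocol designed for small devices.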

As RedMonk analyst James Governor points out, the MQTT spec was actually already available, but pushing it forward to a fully blown open source release is a whole different ball game.

Governor writes, "IBM contributes plenty of code to projects like the Apache web server and Linux. But in many respects I see this latest drop as IBM's most significant since it open sourced Eclipse ten years ago. Why? Because the Eclipse Public License is designed to support derivative works and embedding, while the Eclipse Foundation can provide the stewardship of same. One of the main reasons Eclipse has been so successful is that rather than separate software from specification it brings them together - in freely available open source code - while still allowing for proprietary extensions which vendors can sell."

So you'd never heard of MQTT until today?

Well, this is the protocol used by Facebook to drive its chat/messenger service and if IBM's best intentions for these technologies evolve healthily, then you might be rebooting your microwave oven before you know it.

Data analytics for the drag-and-drop generation

bridgwatera

Computer science purists have long baulked at some of the 'dumbed down' drag-and-drop technologies that have proliferated in recent years.

While true programmers have generally cut their teeth on a command line interface and know their way around a keyboard, the rise of drag-and-drop functionality has continued unabated.

Given this truth, it is perhaps logical that business intelligence (BI) company Jaspersoft has built drag-and-drop data analytics (and reporting) functionality (with the non-technical user in mind) into the 4.5 release of its eponymously named core product.

Designed to support BI analysis on so-called 'Big Data' sets including Apache Hadoop and NoSQL stores, the release offers options to output to Excel as well as an improved in-memory engine with intelligent query push-down.


Frustratingly, Jaspersoft's web site and press materials fail to provide a clear explanation of what "query push-down" is. IBM, on the other hand, does provide the below explanation.

"Push-down analysis is performed on relational data sources. Non-relational [data sources] use the request-reply-compensate protocol, while push-down analysis tells the query optimiser if a remote data source can perform an operation. An operation can be a function, such as relational operator, system or user functions, or an SQL operator (GROUP BY, ORDER BY, and so on)."

Justifying its existence - as indeed it should - Jaspersoft asserts that the growth in data "volume, variety and velocity" has led to an explosion in new data stores, all of which work to capture and process that data.

"However, the opportunity for new insight from this data is hampered by today's BI solutions that require an SQL interface. An SQL interface such as Hadoop Hive is often not suitable for real-time data exploration because of its high latency. For non-SQL interfaces like MongoDB or Cassandra, other BI tools have to extract and load the data into a relational database for analysis," said the company, in a press statement.

The upshot of this type of development (Jaspersoft hopes) is that greater use of its tools will be seen within, say, the analyst community and beyond.

The desert island procedural set-based thinking challenge

bridgwatera

The below text is contributed content to the Computer Weekly Developer Blog written by Michael Vessey, senior consultant with software quality specialist SQS.

One of the biggest challenges for modern day developers is dealing with the number of platforms thrown at them by technology providers. Depending on the structure of their team and the size of the project, a typical Microsoft developer may have to work with SQL Server, C#, WCF, WPF, CSS, LINQ, ASP and any number of other tools of the trade.

It's rare to find a developer that will advertise on their C.V. that they only work with one of these technologies; developers by nature are curious creatures that want to play with the latest toys and value any form of diversity in their work.

With this in mind, it's hard to imagine how a developer can be a master of all of these tools and still manage to keep abreast of the most recent version of each. Throw into this development minefield the mentality differences between different platform specialists and you have a recipe for disaster.

To highlight the issue let's look at the differences between set-based-thinking database professionals and procedural-thinking development professionals.

In 1971 Edgar F. Codd defined the term "Third Normal Form" (3NF), which is regarded as "THE" major design rule when building databases. With the assistance of Raymond F. Boyce and a few others, modern-day "set-based" approaches were formed that dealt with the optimal way of retrieving data from databases.

A difference of "mindsets"

The idea behind set-based thinking is that you gather all of the data you need in one large operation and attempt to touch any database table only once (or as few times as possible) and with as few round-trips as possible.

But these techniques go strongly against the mentality of procedural developers, who work in a world of iterations and loops that allow them to apply complex business logic to single instances of a class. For an application developer, sitting and waiting 30 seconds for data to load is simply not on the agenda.

A real-world example

To provide a real-world visualisation of this, let's consider a less conceptual scenario: a desert island with 50 people stranded because of an airplane crash. They have one bucket, five cups and a pool of fresh water a short walk from their campsite.

In order to make sure the 50 stranded people don't dehydrate, they need to move water from the pool to the campsite. The bucket is set up at the campsite, one cup is left with the bucket and, in true multi-threaded application style, two of the crew go off to the pool each holding two cups (one in each hand).

They make several trips and slowly fill the bucket, while the cup that has been left behind allows the poor unfortunate souls on the beach to each in turn take drinks from the limited water supply available at camp (perhaps even depleting the bucket to its original empty state).

This embodies the procedural developer's approach to the solution, often automatically implemented by Domain Entity Modelling systems such as NHibernate, where the database code can be written by the computer and not by a database expert. Data is fetched row by row or, as it is referred to on many database forums, RBAR - "row by agonising row" - a term coined by Jeff Moden.

This technique looks absolutely amazing when you have a development system with the bare minimum of data, but can become woefully inadequate when looking at larger data sets.

How would a database pro do it?

The approach of the database professional is very different. They pick up the bucket, walk to the pool and fill the bucket. It's very heavy, so they walk very slowly back to camp. When they arrive there are five cups available and 50 people can drink (five at a time). The survivors had to wait five times as long for the bucket to arrive, but the crew didn't need to make 13 trips to the pool in order to provide the last man a drink.

Does this happen in the real world?

To show how this really does apply in the development world, I went to a site where I was asked to look at a purge process for old data. The system had worked absolutely fine when it was released; however, over the course of a year it had become very slow and was taking up to eight hours to run.

The developer was at his wits' end trying to figure out how to solve the problem and was looking at Windows performance statistics trying to figure out if it was a hardware- or software-related issue. The project manager confided in me that he was already in talks with the operations team about improving the server hardware at a cost of about £18,000.

Fortunately the problem was very easy to spot. Looking at the code for the purge I found the following:

declare @id int
-- cursor over every row flagged for deletion
declare curs1 cursor for select id from dbo.mytable where status=3
open curs1
fetch next from curs1 into @id
while @@fetch_status = 0  -- 0 means the previous fetch succeeded
begin
    -- delete one row at a time, then fetch the next id
    delete from dbo.mytable where id = @id
    fetch next from curs1 into @id
end
close curs1
deallocate curs1

A cursor loop inside the database code was loading up a list of records to delete and then finding each one individually (amongst several million) and deleting it.

I suggested the following, much simpler, line of code:

delete from dbo.mytable where status = 3

The developer was sceptical; he suggested several reasons why he did not think this would improve things and proceeded to line up a demonstration on the development system with 50 records of data. The demonstration showed no measurable difference between the sub-second performance of the two blocks of code.

I asked at that point to re-run the test with one million rows of data, promising that if there was a performance issue then we could abort the code execution and look for other solutions. The developer was astonished when the million-record delete that had been taking up to eight hours in the live environment completed in just under eight seconds with the new code.

So what lessons have we learned?

After leaving the slightly confused developer to re-run the test and after reporting back to the project manager that the issue was resolved, I began to think about lessons learned and how such poor code had gotten into the database.

The developer wasn't bad at his job; in fact his middle tier code was very tidy and lean. The trouble was that he was working in multiple technologies and one of them was not his forte. A peer review of the code by a technical expert would have identified the issue before any testing. In addition, if the test and development environments had been data-heavy then the issue would have been detected before go-live.

Neither way of thinking is right or wrong, but if you can think like both then you'll have a great career in software.

Will our future be shaped by "polyglot" programmers?

bridgwatera

If you ask a software programmer what language they write in, the answer is sometimes simple and you may get a one-word reply, e.g. Java, Python, PHP, Visual Basic, C# or C++.

More often I find that programmers define themselves as "all of the above" and more. Essentially they have become polyglots, i.e. individuals capable of speaking or writing more than one language.

Polyglot [pol-ee-glot]
adjective
1. able to speak or write several languages; multilingual.

One developer made the following specific comment when questioned on this subject:

"As long as you understand logic, then the language is really just the syntax."


The web site http://www.polyglotprogramming.com/ takes this concept even further and posits that developers may be working with multiple languages across multiple paradigms, i.e. functional programming, object-oriented programming and so on.

This combination is referred to as "polyglot and poly-paradigm programming" or PPPP for short.
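To illustrate the poly-paradigm half of that equation, here is a toy sketch -- the same job written with two different mindsets inside a single language (Python, chosen purely for illustration):

from functools import reduce

numbers = [1, 2, 3, 4, 5, 6]

# imperative/procedural mindset: an explicit loop and mutable state
total = 0
for n in numbers:
    if n % 2 == 0:
        total += n

# functional mindset: filter and fold, with no mutation at all
functional_total = reduce(lambda acc, n: acc + n,
                          (n for n in numbers if n % 2 == 0), 0)

print(total, functional_total)  # 12 12

Same logic, same language, two paradigms -- which rather supports the "language is really just the syntax" comment above.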

This set of realities is now shaping the way programmers serve the application-hungry public at all levels. There are innumerable "write once, run anywhere" programming platforms, but the truth is that the proliferation of form factors and devices is only helping to further fuel the diversity which currently exists.

The polyglot programmer term was allegedly coined by ThoughtWorks architect Neal Ford in a blog post written way back in the mists of time on December 5, 2006.

ThoughtWorks' Ford writes, "We are entering a new era of software development. For most of our (short) history, we've primarily written code in a single language. Of course, there are exceptions: most applications now are written with both a general-purpose language and SQL. Now, increasingly, we're expanding our horizons."

Now if only the business community were as multilingual, right?

Games: the 'most popular' form of mobile application

bridgwatera

There's a TED seminar by Jane McGonigal which extols the virtues of games, gaming and so-called "gamification". Let's just say that again -- games have become a seminar discussion topic.

McGonigal says that we invest three billion hours a week playing online games. So games, as we know, have become mainstream big business and now enjoy as much publicity hype as new movie releases.

Edinburgh-based Runtime Revolution Ltd (RunRev to its friends) clearly recognises this and has used games to promote its visual development system by launching the LiveCode Game Academy promo this month.

Stating that games are the most popular form of mobile apps today, comprising 70-80% of all mobile downloads, RunRev's new programme reaches out to "both new and experienced developers" who are being encouraged to use the company's natural language programming environment where application changes appear in real-time.

NB: natural language programming and other high-level programming languages are characterised by their proximity to English and (conversely) their "abstraction" from the machine code which, at its heart, a computer natively speaks -- so to speak.

RunRev's academy (well, no obligation promotional vehicle really) runs from December 6 to January 31 and offers would-be (and experienced) mobile app developers an "accelerated course in the principles of programming and game creation".

Areas covered will include:

• Game design considerations
• Basic gameplay coding
• Animating game pieces (the ability to manipulate sprites)
• Working with layers and backgrounds
• Deploying to your iOS or Android device
• Marketing your game

"Our initial LiveCode Summer Academy was a tremendous success and a great help to hundreds of developers who walked away with a mobile app after seven weeks," said Kevin Miller, CEO of RunRev. "The mobile gaming industry generated nearly £5 billion this year and is expected to reach £7 billion by 2014. This market prospect offers developers a tremendous opportunity for growth and success, and LiveCode will help them get there faster and easier than any other development environment."


The LiveCode development environment enables shared code across multiple platforms and devices. In addition to LiveCode for mobile devices, deployment packs include Windows desktops, Mac OS X desktops, Linux desktops and web browsers on Windows, Mac and Linux.

Handling "complex" web sites, it's complicated

bridgwatera

While a good chunk of application development is aligning towards mobile apps that are in many cases initially scaled back (as updates follow) and/or even disposable -- somewhere, in the cloud data centre and on the web, things are getting more complex.

But what makes a complex web site anyway?

Issues here include the operation and management of "bigger" back end servers and the systems integration challenges that naturally come with these blocks of data.

So-called Big Data as well as Complex Event Processing technologies will also complicate the delivery of data to a web site (in many cases via a Content Management System (CMS) such as WordPress, Drupal or Joomla). The challenges mount, so naturally we start to look for the path of least resistance: how do we get complex but still stay online?

Onward from the web CMS, there is also a need to look at the WCO factor i.e. Web Content Optimisation.

This is, in a word (or three), "application-specific resource management" and if that app happens to be on the web then so be it.

One solution in this space is Riverbed Technology's Stingray Traffic Manager. This is a virtual Application Delivery Controller (ADC) that provides "flexibility and a single point of management" for accelerating enterprise applications hosted on servers. It provides developers with what the company refers to as "advanced scripting and enterprise-level functionality" in the form of the branded TrafficScript code.

TrafficScript manages the content that the app is fed. So, for example, if it were a web site with a lot of images, the technology would work to scale back the images and just send the text first.

This is global load balancing, with a sprinkling of bandwidth management, a side order of "rate shaping" and some service level monitoring to boot.
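TrafficScript itself is Riverbed's own proprietary language, so purely as an illustration of the "send the text first" idea, here is a hypothetical sketch in Python (the attribute name and page snippet are invented):

import re

def defer_images(html):
    # swap src for a placeholder attribute; a small client-side script
    # would later copy data-deferred-src back into src to load images
    return re.sub(r'<img\s+([^>]*?)src=',
                  r'<img \1data-deferred-src=',
                  html)

page = '<h1>News</h1><img src="big-photo.jpg"><p>The text arrives first</p>'
print(defer_images(page))
# <h1>News</h1><img data-deferred-src="big-photo.jpg"><p>The text arrives first</p>

A traffic manager applying a rewrite of this sort at the network edge lets the text render immediately while the heavyweight assets follow on.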


Also from Riverbed in this space comes Stingray Aptimizer, a WCO solution built to deliver both internal web applications, like Microsoft SharePoint... and external web applications, like e-commerce and highly customised websites.

According to Riverbed, "Typical websites can have 50-200 file requests and for each request, in addition to the network latency, the performance of websites can be severely impacted. Stingray Aptimizer reduces page load times and reduces bandwidth by transforming the content (multiple image formats, JavaScript, CSS files, etc.) that is delivered from a web server to the web site viewer, accelerating website performance in some cases by up to 400%."

Riverbed wants to see the Application Delivery Controller now sit in the application stack rather than the network stack. The company's vision sees Web Content Optimisation become a tool for operating any public web site or "connected" enterprise application.

Riverbed LIKES this. I wonder what the rest of us think. Consider this a POKE.

Sound: the next (social media) killer app?

bridgwatera

We're all looking for the next killer application. All of us are, all of the time -- everyone wants to know what the next Twitter is going to be and we're all frantically downloading increasingly lightweight (almost disposable) apps in the hope of getting ahead of the curve.

But what form will the next killer app actually take? Perhaps, just perhaps, the answer is sitting right in front of our faces.

Could it be sound?

Let's look at the evidence. Apple's voice-based "natural language user interface" Siri is of course based around the concept of human voice and spoken commands. Part of iOS 5, this technology is surrounded by hype that is (almost) outstripping that which surrounds the iPhone itself.

Then there is HD voice. While mobile phone voice technology has remained largely unchanged for over a quarter of a century, HD voice heralds the use of so-called "wideband technology" to provide a deeper clarity and a better audio experience in VoIP-based communications.

It should feel like callers are almost in the same room.

Apple users are also benefiting from an app called Sound App, which is designed to manipulate sound files and create playlists.

Finally, there is SoundCloud, and this is where sound starts to impact social media.


According to its developers, "SoundCloud is the world's leading social sound platform where anyone can create sounds and share them everywhere. Recording and uploading sounds to SoundCloud lets people easily share them privately with their friends or publicly to blogs, sites and social networks."

SoundCloud can be accessed anywhere using the official iPhone and Android apps. There are also hundreds of "creation and sharing" apps built on the SoundCloud platform.

Could something as simple as audio (combined with social media driven dissemination channels) really be the next killer app?

I'm saying nothing...
