November 2013 Archives

Is BYOD really a developer problem?

bridgwatera

Is it fair to refer to the term Bring Your Own Device (BYOD) as "hackneyed" or even over-used, already?

Given the inexorable rise of consumer-driven IT adoption, there really isn't much getting away from this phrase -- not today, not next week and not next year.

But whose problem is BYOD? Purchasing managers, security managers, networking managers or others?

Gartner has stepped aside from its dogged focus on Magical Quadrangles for just a moment and said that BYOD is a developer issue.

So is this worthy comment, or worthless spin of the (no doubt distinguished) analyst variety?

Gartner says the key decision about BYOD is one of applications architecture and solutions design.

"Designing your applications to meet the demands of BYOD is not the same as setting usage policies or having strategic sourcing plans that mandate a particular platform," said Darryl Carlton, research director at Gartner.

"BYOD should be a design principle that provides you with a vendor neutral applications portfolio and a flexible future-proof architecture. If the applications exhibit technical constraints that limit choice and limit deployment, then the purchasing policy is irrelevant."

If software application developers are no longer developing applications for deployment to an exclusive user base over which (in theory) they can exert standards and control, perhaps BYOD should indeed form part of the architectural programming consideration.

This development (if it is real) is leading to what the analyst firm has called "global class" computing -- an approach to designing systems and architectures that extends computing processes outside the business and into the cultures of the consumer, mobile worker and business partners.

It's not a bad suggestion -- certainly better than a Magic Roundabout or Quadrant any day of the week.


Are cloud developers different?

bridgwatera

Is cloud impacting real programming in software application development environments and are we creating a new breed of cloud programmer that behaves differently?

Bereft of a 'Magic Quadrant' to reference, Forrester earlier this year proffered the claim that, in its estimation, "less than a quarter" of all developers are using cloud.

There was some substance to validate this opinion; the research company explained that so-called cloud developers jump on new desktop technologies faster.

Cloud developers use Windows 8 (19%) far more than non-cloud developers (3%), said the Forrester Forrsights Developer Survey, Q1 2013.

The Computer Weekly Developer Network likes Forrester, a bit -- but we like cloud developers themselves more.


"Speaking as a developer who has worked on both sides of the cloudy divide, I've seen all of the personality types listed in both camps (cloud developers and non-cloud developers) -- but it's not the personality mix of developers that varies, so much as the rigidity of the framework they are working within," said Tomas Soukup, product development manager at Samepage.io, a social collaboration tool company focused on online product development.

Soukup says he can understand that cloud developers are seen as 'risk takers' because they are able to innovate and test new ideas quickly on their customers in real time.

But what are they really risking with cloud? If they get it wrong, they can usually find the bug and fix it the next day.

Why cloud programming is different

"By contrast, someone working on the latest version of software that is downloaded by millions of people, Microsoft Office for example, has to be more risk-averse for two main reasons. Firstly when the software is downloaded, they cannot easily see exactly what the customer's problem is which makes locating & fixing a bug more difficult. Secondly, the correction is more difficult to distribute widely due to the issues of delivery via download," he said.

Does cloud developing really attract 'trouble makers' as Forrester's research suggests?

Soukup says no, he doesn't think so.

He does at least contend that perhaps it's true that a cloud developer is more willing to question conventional ways of doing things and to debate with fellow team members, but that's probably more a characteristic of a younger age profile than of the work itself.

"This speed of deployment certainly does mean cloud developers can 'feel more challenged and successful', though whether they actually are, is also questionable. On a final note, the way the personality types are presented seems to suggest one type of developer is better than the other. I can't agree with that. Same horses, different courses," he concludes.


Unifying social media APIs for our video conferencing future

bridgwatera

This is a guest post for the Computer Weekly Developer Network by A.E. Natarajan, executive vice president of worldwide engineering at telepresence and voice communication solutions company Polycom.

A new method of choice?

Videoconferencing is increasingly becoming the communication method of choice for organisations needing to communicate with partners and customers.

According to a recent survey by Redshift Research, videoconferencing is currently the third most favoured method of communication (47 percent) and it is expected to rise to be the preferred business communications tool by 2016 (52 percent), rising above e-mail (51 percent) and voice/conference calls (37 percent).

What needs to happen

In order for this to happen it is important to consider the need for unified social media APIs to make videoconferencing as simple and easy to use as making a voice call.

This is where cloud-based social media APIs can play a vital role, combined with the ability to easily arrange ad hoc meetings by automatically sending web link invites via instant messaging, or the flexibility to arrange scheduled meetings by automatically sending email and calendar invitations with meeting details, including a web link, for simplified browser-based 'click-to-connect' convenience.

By working with standards-based XMPP presence proxies, or with client APIs and protocols, it is simple to derive contact lists from sources including Google Talk or Facebook.

This gives a user the ability to connect any contact into a secure enterprise-grade videoconference and share content. Through XMPP and third-party APIs, it is possible to collect and aggregate presence information easily.
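
As a purely illustrative sketch of that last point (not Polycom's own code), the short Python snippet below uses the SleekXMPP library -- one of several standards-based XMPP clients of that era -- to log in, publish presence and pull back a contact list (roster); the account and server details are placeholders.

import sleekxmpp

class RosterFetcher(sleekxmpp.ClientXMPP):
    """Minimal XMPP client: connect, announce presence, print the roster."""

    def __init__(self, jid, password):
        super(RosterFetcher, self).__init__(jid, password)
        self.add_event_handler("session_start", self.on_start)

    def on_start(self, event):
        self.send_presence()   # presence information that can be aggregated
        self.get_roster()      # pull the contact list from the server
        for jid in self.client_roster:
            print(jid)
        self.disconnect()

# Placeholder credentials and server -- e.g. a Google Talk account circa 2013.
xmpp = RosterFetcher('someone@example.com', 'secret')
if xmpp.connect(('talk.google.com', 5222)):
    xmpp.process(block=True)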

How the cloud connects

It is also important to have the 'cloud' element for videoconferencing, allowing users to connect with others using Google Talk or similar services, who might not be using Outlook.

Cloud also refers to the infrastructure needed to enable all of this in a scalable way and to provide the connectivity between browser-based and room system participants.

This means that through a browser and web camera users have the ability to reach millions of participants outside of an organisation, including options for connecting with them through room systems by leveraging an infrastructure - located in any data centre and in any geography - in private, public or hybrid cloud models.

Connecting social media

Simply by adopting a unified approach to social media APIs and providing the means to easily connect with social media platforms, videoconferencing can be extended to mobile and desktop participants, making high quality video collaboration with room systems a simple, seamless and effective process.

Apple iOS 7.1 developers' dark (keyboard) thoughts

bridgwatera

Apple has rolled out its iOS 7.1 mobile operating system update to registered software application developers.

News of iOS 7.1 is scant at this time because:

a) Apple covers most things with an NDA
b) Apple won't give press access to its developer network
c) Apple is Apple

There are a few details flying around the web though and (stone the crows would you believe it?) you can expect some "bug fixes and performance enhancements" in your next iPad/iPhone etc. OS update when it comes.

Of more substance perhaps is news that Apple developers will now find support for iOS SDK 7.1 as well as Xcode tools and compilers.


Reports also suggest that there is a new toggle for a "dark keyboard" in the Accessibility settings. Although some users (on official Apple discussion forums) had initially suggested that the dark keyboard might be used to offer a different UI when typing in secure login details, it does appear that this will simply be (as mentioned above) an Accessibility option for all users if they prefer it.

CA Expo UK 2013: the Japanese schoolgirl enterprise application evolution curve

bridgwatera

So CA Technologies' UK CEO Mike Gregoire thinks he knows how DevOps and software application development methodologies are evolving right now.

It's all very well for a CEO to stand up on stage and say that today we exist in a so-called "application centric" world -- it's good stock standard fodder for any keynote to say this kind of thing after all.


But Gregoire had some substance to back this statement up, given his career history at PeopleSoft and other companies.

Perhaps 15 years or so ago we were focused on custom application development, but after this period the drive to this model of software was lessened...

... this is because we saw the rise of wide scale ERP software and a push from vendors to adopt these meaty packages of software.

But mobile, BYOD and ubiquitous web connection to cloud happened in the last decade and this shook up the way we all started to regard custom built enterprise applications again.

NOTE: Every knowledge worker now has an average of 3.3 connected devices.

So then, at this point, the true birth of the "application centred" world came about... and we continue to see the application landscape evolve in this direction.

Japanese Schoolgirl Watch

This train of thought was augmented colourfully by the next speaker at this event: Tom Standage, digital editor at The Economist, reminded us that Wired magazine used to have a column called Japanese Schoolgirl Watch (which obviously sounds a bit dodgy)... a column in fact purely devoted to looking at what gadgets Japanese schoolgirls were using on a day-to-day basis.

The theory being that what technology the average Harajuku girl is using today, we will all be using in the West in 10 years' time.

Not mobile second

He also said that thinking MOBILE FIRST is of course important -- or that NOT MOBILE SECOND might at least be more realistic.

On both points above, we again see that the application evolution curve is shifting.

The major influences being (once again)... mobile, social and cloud -- and now with cloud we also see a tipping point i.e. for the first time the top concern listed for cloud is no longer security; as more cloud is actually deployed, security concerns are reducing... security concerns now in fact sit level with complexity concerns.

The old term "groupware" never really got it right did it? But Facebook has successfully pulled off this model, so we must look to consumers to see where our next usage models will take us. The consumerisation of IT really is this real.

So how should we consider new technologies and the wave of current development if it's not through studying the user adoption habits of Japanese schoolgirls?

How to understand the future

Standage also quoted the writer William Gibson, who said: "The future is already here -- it's just not evenly distributed."

The point being that we can already see fragments of the future laid out before us, it's just that they are not all fully developed, adopted or embraced as yet.

Standage's next book -- Writing on the Wall: Social Media - The First 2,000 Years -- talks about the way we used to witness social interaction in the old coffee houses of London, which used to be subject-specific and/or culturally specific. Today we have Facebook, so nothing changes that much.

What this means for developers...

In break-out interviews with the Computer Weekly Developer Network, CA's Shridhar Mittal explained that there are deeper changes afoot in enterprise applications which are manifesting themselves at the programmer level.

As CA's DevOps lead, Mittal explains that today's world is all about composite applications i.e. a user credit card check can be used inside various different apps, so why reinvent it when you can make it a composite element in its own right?
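
To make that composite idea concrete, here is a minimal, hypothetical Python sketch (nothing to do with CA's actual tooling): one reusable card-number sanity check -- the standard Luhn algorithm -- consumed by two different 'apps' rather than being rewritten in each.

def luhn_valid(card_number: str) -> bool:
    """Reusable card-number sanity check (Luhn algorithm) shared by many apps."""
    digits = [int(d) for d in card_number if d.isdigit()]
    checksum = 0
    # Double every second digit from the right, subtracting 9 when it exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d = d * 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def checkout_app(card_number):
    # One consuming app: a web checkout flow.
    return "accepted" if luhn_valid(card_number) else "rejected"

def mobile_wallet_app(card_number):
    # Another consuming app reusing the exact same component.
    return luhn_valid(card_number)

print(checkout_app("4539 1488 0343 6467"))  # a commonly used valid test number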

So onward, the lifecycle inside composite app structures should also be different i.e. faster in this scenario -- theoretically at least.

But the problem is that the upstream and downstream impact on other components can be disruptive unless tested properly. Adherence to standards-based technologies will indeed mean that these components work with each other... but that doesn't guarantee that the resultant apps will meet the required business function.

Mittal also asserts that the DevOps gap needs some cultural and technical evolution.

Operations workers naturally feel ownership not of apps, but of the infrastructure, he says.

This means that we need to bring operations in earlier in the application design process so that FUNCTIONAL REQUIREMENTS are fed back into the development cycle.

Also, Agile software application development needs to be able to test individual components before they go into production - think about an aircraft wing being tested in a wind tunnel before being attached to the plane itself i.e. we don't just throw the whole plane together and see if it flies.
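
As a small illustrative sketch of that wind-tunnel principle (the pricing component and its tax service are invented for the example), Python's built-in unittest and mock modules let a single component be exercised in isolation, with a stand-in for its downstream dependency:

import unittest
from unittest.mock import Mock

def price_order(quantity, tax_service):
    """Component under test: depends on a downstream tax service."""
    subtotal = quantity * 10.0
    return subtotal + tax_service.tax_for(subtotal)

class PriceOrderWindTunnel(unittest.TestCase):
    def test_component_in_isolation(self):
        # The real tax service stays on the ground; a mock stands in for it.
        fake_tax = Mock()
        fake_tax.tax_for.return_value = 2.0
        self.assertEqual(price_order(3, fake_tax), 32.0)
        fake_tax.tax_for.assert_called_once_with(30.0)

if __name__ == "__main__":
    unittest.main()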

CA has Production Data Mining technologies (including the Pathfinder product) to bring forward real data into the test environment as well as performance characteristics of the application in the face of the data in question... plus performance of the data itself (such as how often it is updated).


In short, the way we have adopted application centred worlds and the way we use them today is having an onward impact upon the way applications themselves should be developed and the way data should be regarded.

CA rounded out this event with a presentation by everyone's favourite particle physicist Professor Brian Cox with his talk entitled Disruptive Ideas In Science -- stirring stuff and, thankfully, not a Japanese schoolgirl in sight.

Visual Studio 2013 is for the cloud, in the cloud

bridgwatera

Microsoft has officially taken the shiny wrappers off of Visual Studio 2013 and the .NET 4.5.1 platform.

Along with additional Windows Azure developer services, now called "Visual Studio Online", Microsoft asserts that its "devices & services transformation" is working with its vision for cloud-based operating systems and for how programmers will fit into this model and make it work.

New with this release is Visual Studio Online Monaco -- a "coding environment for the cloud, in the cloud".

Microsoft says that Monaco offers lightweight, friction-free developer usage in the browser for targeted Azure development scenarios.

Newly included are hosted source control, work item tracking, agile planning and build and load testing services (readers will note that these functions were all part of Team Foundation Service), all now available in "public preview" as part of Visual Studio Online.

"For teams, Visual Studio 2013 and Team Foundation Server 2013 offer new capabilities from Agile Portfolio Management and a rich set of Release Management features to automate deployment and do continuous delivery on the cloud, to support for Git source control, to the new Team Room feature," said Microsoft Developer Division VP Soma Somasegar.


IDC's Al Hilwa comments on the release as follows:

I want to take the time to highlight the interesting pivot that Microsoft is gradually making to broaden the Microsoft developer ecosystem to encompass other developers.

VS2013 has better support for web ecosystem technologies such as HTML, JavaScript and dynamic languages like Python than any prior release. Microsoft also has better integration with a set of cloud services than ever before.

Support for Git and ALM capabilities in the cloud is also attractive for serious developers outside of the direct Microsoft ecosystem. I think the changes recognise the changing dynamics in platforms, and my sense is that this is a sign of more to come in that direction.

This is a truly massive release for Microsoft and there are also debugging and optimisation improvements as well as expanded ALM support.

As Mr. Somasegar would say, Namaste!

Forget big data, welcome crunchy data

bridgwatera

Gartner's Doug Laney has identified what he calls "dark data" as a new kind of asset class worthy of being put on the corporate books.

Andy Green, content specialist at Varonis, also likes this new term and says that dark data is a subset of big data i.e. enormous but without the formal boundaries defined by database schemas.

"In other words, it's the human generated content in documents, presentations, spreadsheets, notes, and other readable formats that make up the bits and bytes of a corporate file system," said Green.

Crunchy data cometh

For data-centric developers, DevOps staff, database administrators (DBAs) and programmers focused on so-called "content-centric" applications (is there any other kind of application?), the arrival of dark data is perhaps only overshadowed by the next big thing on the data scene...

... get ready for crunchy data.

While corporate dark data comes about as a natural by-product of employees creating content to communicate ideas (as Green reminds us, every document is after all just a thought that's been converted to bits), the inexorable rise of crunchy data is just one step away.

Crunchy data defined

So by way of definition, the origin of crunchy data stems from a classical economic definition as laid down by the late Nico Colchester, who was Deputy Editor of The Economist and before that Foreign Editor of the Financial Times.


Colchester described "crunchy economics" as the antithesis of soggy economics -- crunchiness being that state of total transactional being where "small changes have big effects" -- and so therefore...

... crunchy data is that most valuable of data where small changes to its values have the biggest effects.

Crunchy data is, if you will, distilled big data and cleaned dark data such that it has real value in the modern ERP system.

Colchester's original thoughts, listed in full below:

"Crunchiness brings wealth. Wealth leads to sogginess. Sogginess brings poverty. Poverty creates crunchiness. From this immutable cycle we know that to hang on to wealth, you must keep things crunchy. Crunchy systems are those in which small changes have big effects, leaving those affected by them in no doubt whether they are up or down, rich or broke, winning or losing, dead or alive. The going was crunchy for Captain Scott as he plodded southwards. He was either on top of the snow-crust and smiling, or floundering thigh-deep."

If you're dealing with applications and data streams that have been defined within the big data universe, then pull away from dark data all you can and make sure you move towards crunchy data from now on.
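
For a toy illustration of that "small changes, big effects" idea -- the revenue formula and field names below are entirely made up -- you can rank which fields in a dataset are 'crunchiest' by nudging each one slightly and watching how far the output moves:

def revenue(price, volume, churn):
    # Toy model: a made-up revenue formula purely for illustration.
    return price * volume * (1.0 - churn)

baseline = {"price": 9.99, "volume": 120000, "churn": 0.05}

def crunchiness(model, inputs, bump=0.01):
    """Rank inputs by how much a 1% nudge to each one changes the output."""
    base = model(**inputs)
    scores = {}
    for key, value in inputs.items():
        nudged = dict(inputs, **{key: value * (1 + bump)})
        scores[key] = abs(model(**nudged) - base) / abs(base)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(crunchiness(revenue, baseline))  # the "crunchiest" field comes first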


Top 10 reasons for developer crowdtesting

bridgwatera

The Computer Weekly Developer Network this week caught up with Mithun Sridharan of Passbrains, a company that describes itself as an on-demand real-world testing specialist for mobile, web and desktop applications.

Sridharan has commented that with Google, Apple and Microsoft "practically giving away" their development tools for free (does he really mean to include Microsoft in that list?), there is a growing developer base creating mobile apps and responsive web sites for Android, iOS and Windows platforms.

But, says Sridharan, it is easy to underestimate the costs of building and monetising an app successfully.

Passbrains staunchly advocates that programmers consider crowdtesting.

NOTE: There aren't many public definitions of crowdtesting available, so we will rely on a public wiki-based one here: Crowdtesting (or crowdsource testing) is an emerging trend in software testing which exploits the benefits, effectiveness and efficiency of crowdsourcing and the cloud platform. It differs from traditional testing methods in that the testing is carried out by a number of different testers from different places and not by hired consultants and professionals. The software is put to test under diverse realistic platforms which makes it more reliable, cost-effective, fast, and bug-free.


Sridharan's top 10 benefits of crowd testing are as follows:

1. Maximum value and focus

The law of comparative advantage states that maximum value is derived if each party focuses on its core competence. Developers should focus on building products and leave software testing to testing experts to derive maximum value.

2. Improved cash flow

The crowdtesting pay-per-bug pricing model means that you pay only for unique bugs that testers discover - which is better for cash flow compared to an in-house team.

3. Sanity check

Developers can quickly test whether their application addresses customers' pain points and validate the business hypothesis by targeting a select audience, gathering feedback and estimating the chances of adoption.

4. Usability

Crowdtesting helps uncover usability issues and developers' blind spots much earlier. Once an app is on the Apple App Store or Google Play, most customers won't bother to submit crash reports, but simply give the app a low rating, which is incredibly hard to repair.

5. Test coverage

In-the-lab software testing, normally performed alongside development, cannot cover all devices and system configurations, deployment scenarios and usage patterns. Many technical issues and bottlenecks only come to light when the product is 'in the wild'.

6. Speed

The crowd will help discover the most critical bugs in a very short timespan due to the much larger number of testers (often a factor of 10-20) engaged in a test. Many case studies show that crowdtesting can discover hundreds of issues in less than two days in applications which have passed internal QA!

7. Comprehensiveness

Crowdtesting eliminates the intrinsic bias characteristic of a local testing team and adds richness and diversity. As different testers may follow different permutations, a multipath approach leads to more bugs discovered in a shorter timespan.

8. Expertise

Crowdtesters are professionally qualified enthusiasts marked by an intrinsic motivation to discover bugs and improve software quality. These professionals are unsparing in their feedback and make excellent development partners.

9. Flexibility

Crowdtesting is a flexible and smart way of scaling your test workforce at short notice.

10. Well proven

Crowdtesting has already been used by the likes of eBay, Amazon, GE, Microsoft, Google and Facebook to develop user-centric products. Google regularly deploys crowdtesting for 14 of its major product lines.

Zuckerberg, developers & the child video game privacy debate

bridgwatera

Newly formed company AgeCheq has produced a free multi-platform API to help developers adhere to the U.S. government's COPPA regulations.

The Children's Online Privacy Protection Act (COPPA) was passed in 1998 to protect children under the age of 13 as they use the Internet.

NOTE: Although this is essentially an American law, the Federal Trade Commission has said that requirements and bindings of COPPA do apply to foreign-operated websites if they: "are directed to children in the U.S. or knowingly collect information from children in the U.S."

COPPA was revised in 2012 to extend privacy protections to mobile games and applications with specific relevance to online data collection and behavioural marketing.

AgeCheq says it gives parents "unparalleled visibility and control" over the mobile apps and games their children use.

The basic AgeCheq service is completely free for developers and for parents.


"The process of validating a parent's ID, providing complete disclosures that parents can understand, and gaining parental approval for a child to play an app are completely new friction points for app and game publishers," said Roy Smith, founder and CEO of AgeCheq.

Kids' details: device ID, geolocation, or phone number

A December 2012 FTC study on mobile apps for kids stated: "most apps fail to provide any information about the data collected through the app, let alone the type of data collected, the purpose of the collection, and who would obtain access to the data. The results (of the study) showed that many of the apps shared certain information -- such as device ID, geolocation, or phone number -- with third parties without disclosing that fact to parents."
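
Purely as a hypothetical Python sketch of the kind of gate a developer would need to build (this is not AgeCheq's actual API, whose details are not documented here), the logic comes down to this: no device ID, geolocation or similar data collection until a verified parent has seen the disclosures and approved the app.

from dataclasses import dataclass

@dataclass
class ConsentRecord:
    child_id: str
    parent_verified: bool
    disclosures_shown: bool
    approved: bool

def may_collect_personal_data(record: ConsentRecord) -> bool:
    """COPPA-style gate: all three friction points must be cleared first."""
    return record.parent_verified and record.disclosures_shown and record.approved

def start_session(record: ConsentRecord):
    if may_collect_personal_data(record):
        return "session with analytics (device ID, geolocation) enabled"
    return "session with all personal-data collection disabled"

print(start_session(ConsentRecord("kid-42", True, True, False)))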

Facebook CEO Mark Zuckerberg is against COPPA, saying that he wants kids to be allowed on social networks (like Facebook, for example).

Quoted on Fortune and CNN, Zuckerberg has said: "That will be a fight we take on at some point. My philosophy is that for education you need to start at a really, really young age."

IBM Watson: 2 + 2 does not only equal 4

bridgwatera

So there are product-related news announcements from IBM this week detailing the company's work with its IBM BLU Acceleration portfolio running on the SoftLayer infrastructure.

But let's forget the corporate stuff for a moment.
IBM's Watson (just in case you had missed it) is a "cognitive computing intelligence and predictive analytics" supercomputer system that beat human opponents on the US quiz show Jeopardy!

Watson now has massive potential to affect the way we analyse healthcare information and may soon also impact a wide range of other areas including (for example) the legal trade as we start to analyse human and systems-based data/information to a more granular level than ever before.

So Watson can obviously add up.

Ask Watson (in human spoken natural language) what the answer to 2 + 2 is and you may be surprised.

One of the answers is 4.

But Watson views the world in a probabilistic sense, a lot like a doctor i.e. if you go to the doctor and tell him or her that you have a bad back, he or she might surmise that you have:

a) a back strain
b) sore kidneys and back pain after a severe cold
c) some kind of tumour or something more serious
d) some other ailment like an infection

The doctor will ask you a few other questions to get a wider picture of your health and then produce a prognosis based upon the most likely outcomes i.e. if you happen to work on a building site, a back strain is immediately more likely.

So back to 2 + 2 ... while 2 + 2 = 4 as a start, there are other answers also:

a) 2 + 2 = 4
b) 2 + 2 = the standard configuration of a car seating plan
c) 2 + 2 = the most likely "parent and child" family environment
d) 2 + 2 = other answers in this vein

However, the deep semantic information in this essentially cognitive system (which means that it will learn more over time) should allow Watson to give you the right answer to 2 + 2 based on the contextual semantics and nuances of the scenario in which the question is asked in the first place.
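
A toy Python sketch of that probabilistic idea -- the candidate answers and 'evidence' words are invented here, and the real Watson is vastly more sophisticated -- scores each interpretation of 2 + 2 against the words surrounding the question and picks the best-supported one:

# Toy candidate answers, each tagged with context words that support it.
candidates = {
    "4": {"maths", "sum", "arithmetic", "homework"},
    "a 2+2 car seating layout": {"car", "coupe", "seats", "passenger"},
    "a two-parent, two-child family": {"family", "household", "parents", "children"},
}

def rank_answers(question_context: str):
    """Score each candidate by how much contextual evidence supports it."""
    words = set(question_context.lower().split())
    scored = []
    for answer, evidence in candidates.items():
        overlap = len(words & evidence)
        confidence = overlap / len(evidence)  # crude normalised evidence score
        scored.append((confidence, answer))
    return sorted(scored, reverse=True)

print(rank_answers("what does 2 plus 2 mean for a coupe with rear passenger seats"))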

Watson learns like a child: it learns by reading, it learns by listening to what others are telling it, and it also learns by doing.

This type of cognitive computing intelligence and predictive analytics has the potential to change the way software application development and data management is carried out today - and it's real and it's now.


IBM has confirmed that its next steps with Watson will see the opening up of the Watson API, the creation of Watson as a software application development platform (in order for software application developers to code to "and on top of" the Watson cognitive intelligence engine) and the wider development of the Watson cloud offering.


Can we survive 1 million big data events per day?

bridgwatera

IBM execs have spent a good portion of collective shoe leather this week hosting the Information on Demand Forum event in Las Vegas.

There are "cognitive computing intelligence and predictive analytics" technologies being launched here that expand the IBM BLU Acceleration portfolio.

Just how bad is big data?

IBM estimates that an average firm's servers, networks and applications can generate more than 1.3 terabytes of data per day in the form of log files, software error alerts, IT service tickets and network configuration updates.

NOTE: This can result in more than one million "events" or system alerts per day - and this spells living hell for systems administrators and the software application developers they help to serve.

IBM says we are on the cusp of a new era in computing where systems can learn, reason, sense, predict and enhance decision making.

"As the value of data continues to grow, the differentiator for clients will be around predicting what could happen to help transform their business with speed and conviction," said Steve Mills, senior vice president and group executive, software and systems at IBM.

"Cognitive systems, such as IBM's Watson, can "understand" the context within users' questions, uncover answers from big data, and improve performance by continuously learning from experiences," added Mills.

With IBM SmartCloud Analytics - Predictive Insights, users are promised the option to sift through terabytes of IT operations data in real time, spotting only the trends that are critical to IT network performance -- this new technology will run on the SoftLayer infrastructure, which will be the foundation of IBM's cloud portfolio.
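
To illustrate the underlying idea in miniature (this is not IBM's algorithm, just a crude Python sketch), you can surface "only the trends that matter" by flagging hours whose event counts sit several standard deviations away from a rolling baseline:

from collections import deque

def flag_anomalies(hourly_event_counts, window=24, threshold=3.0):
    """Flag hours whose event count deviates sharply from the rolling mean."""
    recent = deque(maxlen=window)
    anomalies = []
    for hour, count in enumerate(hourly_event_counts):
        if len(recent) == window:  # only judge once the window is full
            mean = sum(recent) / window
            variance = sum((c - mean) ** 2 for c in recent) / window
            stdev = variance ** 0.5 or 1.0
            if abs(count - mean) / stdev > threshold:
                anomalies.append((hour, count))
        recent.append(count)
    return anomalies

# Roughly 40,000 events per hour with one sudden spike.
counts = [40000 + (i % 5) * 50 for i in range(48)]
counts[30] = 90000
print(flag_anomalies(counts))  # expect hour 30 to be flagged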

Storage decisions, made more intelligent?

IBM is also applying machine learning and analytics to storage with a new version of SmartCloud Virtual Storage Center. The firm says that organisations can save time and money by automating complex storage tiering decisions and moving to cloud storage.

By analysing data usage patterns, this intelligent software identifies the type of storage best suited for an organisation's data -- and automatically makes the change without interruption to the user or applications.
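
A deliberately crude Python sketch of that tiering decision (the thresholds and dataset names are invented, and SmartCloud Virtual Storage Center's real logic is far richer) might look like this:

def pick_tier(accesses_last_30_days: int) -> str:
    """Crude tiering rule: hot data to flash, warm to disk, cold to cloud archive."""
    if accesses_last_30_days >= 1000:
        return "flash"
    if accesses_last_30_days >= 10:
        return "disk"
    return "cloud-archive"

datasets = {"orders_db": 250000, "hr_scans_2009": 2, "marketing_assets": 340}
for name, hits in datasets.items():
    print(f"{name}: move to {pick_tier(hits)}")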


IBM: data is the new (crude) oil

bridgwatera

So the question is, what is a technology conference keynote without a pithy takeaway catchphrase or analogy?

Answer: nothing.

So then, IBM started its Information On Demand symposium this week in Las Vegas with an intro delivered by Jake Porway (apparently he has a show on the National Geographic channel but no, I've never heard of him either) who said the following:

Data is like oil... and there is a new rush to mine the flow of data.

Take that idea further if you like: data is the new CRUDE oil if we consider that data is now big data i.e. there is a huge amount of raw data/oil out there that needs mining, refining, treating and onward transportation before we can do anything with it.

In the 21st Century - big data is sexy

If you want more takeaway soundbites... The Harvard Business Review apparently ran an article on how data scientists are the sexiest, coolest people in the IT industry -- Data Scientist: The Sexiest Job of the 21st Century.

Man: "Hello wife, I have just completed an analysis of a huge unstructured data stream emanating from a post-relational big data database connected to an Internet of Things style sensor-based data creation and capture unit attached to an industrial turbine in a petrochemicals factory in Southern Sudan."

Wife: "Husband, you are hot."

Hmmm, maybe the Harvard Business Review wanted to get the word "sexy" into a headline, what do you think?

The problem with big data

But it's not all roses, whiz-bang keynotes and cheesy soundbites. The real problem is that CIOs don't know where they should be directing individual software application developers (and teams) to start to actually "engineer in" big data analytics products.

IBM's forthcoming product announcements related to this show are obviously aimed at addressing this issue and "packaging" data analytics - look for news on the BLU Acceleration portfolio here.

Things are turning for the positive though, says IBM, and companies are now actually starting to appoint Chief Data Officers. So would that be CDOs? Yes it would.

Ray Wang of Constellation Research says that we're moving from the point of having data storage to new "data decisions" inside the new world of data visionaries.


Big data and The Matrix

Ready for one more keynote style soundbite?

Advanced predictive analytics is like the freeze-frame effect used in The Matrix movie.

... and here's why:

When the camera freezes and pans around the scene, our mind starts to take in extra information about the scenario being played out and, as we see more and more, we start to make predictions about what will happen next. This analogy goes some way to explaining why we need to try and "stop data" for analysis. Of course we cannot actually stop data, as it flows continuously in the real world, so we need to be able to perform this analysis in real time instead.
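
As a minimal Python sketch of that "freeze frame" point -- the event values here are randomly generated stand-ins -- one common approach is to summarise the endless stream in small tumbling windows as it flows past:

import itertools
import random

def event_stream():
    """Endless stream of (timestamp, value) readings, standing in for real data."""
    t = 0
    while True:
        yield t, random.gauss(100, 15)
        t += 1

def tumbling_windows(stream, size=10):
    """'Freeze frame' the flow: summarise each block of `size` events in turn."""
    while True:
        window = list(itertools.islice(stream, size))
        if not window:
            return
        values = [v for _, v in window]
        yield min(values), sum(values) / len(values), max(values)

stream = event_stream()
for low, avg, high in itertools.islice(tumbling_windows(stream), 3):
    print(f"window summary: min={low:.1f} avg={avg:.1f} max={high:.1f}")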

There is more news here in terms of product-related announcements, so hopefully this diversion will whet your appetite for more big data analytics.

How to host a big data analytics event

bridgwatera

As we know, big data is everywhere.

More specifically, the drive to architect software application development constructs inside which big data analysis and predictive analytics can be performed is ubiquitously spreading across the data services landscape such that every vendor worth its salt now proffers either a direct solution or a closely coupled touchpoint to this technology stream.

Time then for more big data analysis events, conferences, symposia, developer conventions and exhibitions.

An event of this kind needs various parameters covered and building blocks included.

Content analytics must feature prominently, but then so must "information lifecycle governance" (or ILG if you wish to give it an acronym) and of course social content management.

The social data/content stream needs to be close to the Bring Your Own Device stream, naturally - you wouldn't expect anything else right?

Also in the core "topics we really need to cover" zone is data quality management, information governance, risk analytics and cross-platform data management.

Once you've got those aspects covered off you can look at industry-specific solutions and start to list transport, retail, health etc...

Round about now you probably want to book the band for the party night.

Above all make sure they are fun... and if possible make sure they are actually the band FUN from New York -- http://www.youtube.com/watch?v=Sv6dMFF_yts

All going well so far?

OK so this is a good time to throw in a relevant data analysis statistic from somewhere like the Wall Street Journal - and luckily we have one.

In a survey of 1,000 middle managers of large companies in the U.S. and U.K., 59% said they miss important information almost every day because it exists within the company but they cannot find it.
[Accenture, Wall Street Journal, 5/14/2007]

Excellent work. OK so after lining up the Gold, Platinum and 'other' sponsors (who decided to sponsor the lunch sessions again?) and organising the EXPO theatre layout it's time for some creative audience planning.

Why not offer "themed guided walkthroughs" of the EXPO hall rather than just allowing attendees to bounce from one stand to the next scooping up sponsored packs of Jelly-Belly jellybeans (OK, yes, they are nice though) so that delegates get more value?

This is a data analytics conference after all right?

So Tour 1 could be for example - speeding up and simplifying data and analytics.
For Tour 2 - let's call it simply 'Managing Risk' with CAPS.
... and for Tour 3, let's call this one something like - transforming data identity in real world use cases.

To carry forward the hands-on feel from Tour 3, we could also have a 'Usability Sandbox' area with sessions to get developers and data engineers used to real tools and see what works, what feels clunky and what we perhaps need to feed back into the enhancements process.

Last but not least, the real meat i.e. morning keynotes and subsequent sessions.

Pre-conference sessions, core day 1, 2 and 3 product sessions and then, if you have the stomach for it, post-conference 'unconference' sessions too. Oh go on then, throw in a few Birds of a Feather sessions too just so we don't look stupid.

Finally you'll need to cover off social media and you'll need a memorable and snappy conference name.
As a pure finger in the air illustration here, you could use #IBMIOD, #IBMBigData, #IBManalytics and #IBM and call the event IBM Information On Demand for example.

The working title for this story was: What to expect from IBM Information On Demand 2013 Las Vegas November 3 - 7 2013, but why be specific?

If you had enough big data analytics in place you would have known what we were talking about before we got to the end, right?

Editorial disclosure: The event brochure for the IBM Information On Demand conference may have been 'casually leafed through' and perhaps 'painstakingly cross-referenced' during the drafting of the above content.

Where do you want your integration to go today?

bridgwatera

Integration is on the move.

Well, it could be, it can be, it should be and ....

... it now has the opportunity to be on the move given the changing state of the developer/data/analysis/cloud landscape(s) plural.

What we mean is, integration (in the cloud sense at least) has traditionally existed as a SERVICE element i.e. a SaaS consideration.

But, in cloud, integration is viewed (or at least it should be) as very much a reusable component resource.

If we look at the current push towards Agile software application development and so-called "Continuous Delivery" (with CAPS) then we can see how integration in the cloud might potentially be moving out of the S (service) layer and subsequently featuring that much more prominently in the P (platform) layer, for greater overall macro-level control.

Integration news is almost creating its own sub-category worthy of a genre in its own right.

TIBCO used the fading embers of its recent TUCON event to announce a range of new approaches to integration for small and large businesses for its flagship product ActiveMatrix BusinessWorks.

NOTE: Joining the core product were TIBCO ActiveMatrix BusinessWorks Express (for web and mobile integration projects) and Project Austin, a cloud-based integration application for non-technical business users.

The firm says the rise in the need for integration services is significant and it has expanded its integration portfolio to allow for major technology shifts and different consumption model upgrades.

"Web and mobile projects are looking to cost-effectively integrate increasing amounts of data coming from both external and internal sources while still providing a rewarding user experience. ActiveMatrix BusinessWorks Express addresses the needs of developers of web and mobile projects, and ensures these projects will get to results much faster, with much lower TCO," said Matt Quinn, chief technical officer, TIBCO.

Interestingly, the release of the Express edition recognises that web and mobile projects in organisations of all sizes require a different approach to integration.

"IT budget funds are allocated to web and mobile projects to quickly deliver value, and ActiveMatrix BusinessWorks Express will shorten this time to value. We are essentially putting powerful TIBCO technologies into the hands of everyone," said Quinn.

NOTE: According to a November 2012 Gartner report titled, Predicts 2013: Application Integration -- by 2016, the integration of data on mobile devices will represent 20 percent of integration spending.

... once again, integration is on the move.

"Project Austin is a stand-alone cloud service that represents yet another approach to integration from TIBCO and is aimed at business users. Departments like Human Resources are in many instances still processing critical data from one application to the other using spreadsheets. They have an integration need that Project Austin addresses," said the company, in a press statement.

Non-technical integration is here, cloud integration is here, Agile integration is here, Continuous Delivery integration is here.

Integration is on the move, we're just sayin' OK?


About this Archive

This page is an archive of entries from November 2013 listed from newest to oldest.
