Wipro tech evangelist: machine learning should be channelled to application health

bridgwatera | No Comments

Wipro's Kumudha Sridharan dropped the Computer Weekly Developer Network blog a line this week and insisted she had a point to make.


You say Bengaluru...

The Bengaluru-based (we used to say Bangalore) technologist works for India's biggest services consultancy and IT outsourcing company.

Sridharan argues that while machine learning and analytics have taken major strides, both have received relatively little attention in the QA (Quality Assurance) function -- yet the two have the ability to inject intelligence into QA, dynamically.

"The case for using cognitive computing in QA is rock solid. QA can fix the data (sources, types, extraction, sample size, labels etc.) and cognitive systems can continue to use the data to train the system and continuously improve quality levels," she said.

Why is this useful?

Because, says Sridharan, it can be used to pro-actively monitor the health of an application.

"Using cognitive computing, the health of an application can be pro-actively monitored by a variety of bots. The bots observe patterns in the data, check on trends and then use algorithms and models to predict the impact of an application on related infrastructure, along with the allied risks and the vulnerabilities," she said.
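TECHNICAL NOTE: to make the idea concrete, here is a minimal, entirely illustrative Python sketch of the kind of statistical check such a bot might run over application response times -- nothing here reflects Wipro's actual tooling; the threshold and the sample data are invented for the example.

```python
from statistics import mean, stdev

def detect_anomalies(samples, threshold=2.0):
    """Flag response-time samples more than `threshold`
    standard deviations above the overall baseline."""
    mu, sigma = mean(samples), stdev(samples)
    return [s for s in samples if sigma and (s - mu) / sigma > threshold]

# Mostly-normal response times (ms) with one spike
latencies = [120, 118, 125, 122, 119, 121, 117, 123, 900]
print(detect_anomalies(latencies))  # prints [900]
```

A real health-monitoring bot would obviously use rolling windows and far richer models, but the pattern-spotting step reduces to something like this.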

We can apply some of the same ideas here when we look deeper at software application testing.

Optimising testing

According to Sridharan, today's statistical techniques for optimising testing -- reducing the number of test cases and eliminating redundancies -- tend to become inadequate, especially when changes to applications are frequent.

"Manual intervention becomes problematic and poses a major challenge to QA. Cognitive Computing, that uses continuous learning systems, can be applied to dynamic, risk-based testing, solving the problem," she said.
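By way of illustration only (this is not Wipro's method), dynamic risk-based test selection can be sketched as a ranking over failure history and change coverage -- the test names, modules and weighting below are invented for the example:

```python
def prioritise(test_cases, changed_modules):
    """Rank test cases by risk: past failure rate, weighted up
    when a test covers a recently changed module."""
    def risk(tc):
        change_hit = len(set(tc["covers"]) & set(changed_modules))
        return tc["failure_rate"] * (1 + change_hit)
    return sorted(test_cases, key=risk, reverse=True)

tests = [
    {"name": "login_flow",   "covers": ["auth"],        "failure_rate": 0.10},
    {"name": "checkout",     "covers": ["cart", "pay"], "failure_rate": 0.30},
    {"name": "search_index", "covers": ["search"],      "failure_rate": 0.05},
]
ranked = prioritise(tests, changed_modules=["pay"])
print([t["name"] for t in ranked])  # prints ['checkout', 'login_flow', 'search_index']
```

A continuous learning system would keep updating those failure rates from each run, which is where the 'dynamic' part comes in.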

Sridharan sums up

The following commentary is attributed directly to Wipro's Kumudha Sridharan.

"Defect management is time-consuming. It takes enormous effort in terms of daily calls, meetings, communication and an exchange of updates between teams to accurately identify, isolate and fix defects -- typical ones being tickets raised by users for IT applications. Applying learning-based systems, which look for patterns in the past and leverage them, is the equivalent of making ticketing systems intelligent.

"Enable self-healing: Cognitive Computing can be used to identify situations where self-healing processes can be developed and applied. This would eliminate a huge amount of effort that current QA systems necessarily entail."

Mamma Mia! Italian coders win PayPal hackfest with smart car app

bridgwatera | No Comments

PayPal and Braintree have today announced the winner of their 2015 BattleHack Series and its £65,593.13 prize.


Braintree who?

... some Essex connection?

Ah yes, sorry -- Braintree provides "payment processing" options for devices.

Braintree's full-stack payment platform provides businesses with the ability to accept payments online or within their mobile application.

Essentially, it replaces the traditional model of sourcing a payment gateway and merchant account from different providers.
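For illustration, a server-side charge against a full-stack payment platform tends to look something like the sketch below -- the class, method and field names here are invented for the example and are not Braintree's actual SDK:

```python
# Hypothetical gateway client -- illustrative only, not Braintree's real API.
class PaymentGateway:
    def __init__(self, merchant_id, api_key):
        self.merchant_id, self.api_key = merchant_id, api_key

    def charge(self, amount, currency, payment_nonce):
        """A real gateway would POST this to its API over HTTPS;
        here we just validate and echo back a fake transaction record."""
        if amount <= 0:
            raise ValueError("amount must be positive")
        return {"status": "submitted", "amount": amount,
                "currency": currency, "nonce": payment_nonce}

gateway = PaymentGateway("my-merchant-id", "sk_test_123")
print(gateway.charge(10.00, "GBP", "nonce-from-client"))
```

The point of the full-stack model is that the client app only ever handles a one-time payment nonce, never raw card details.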

BattleHack World Finals

The 24-hour BattleHack World Finals took place at PayPal HQ in Silicon Valley and hosted 14 teams of developers from across the globe, all winners of their regional BattleHack competitions.

Competitors were tasked with building an application that incorporates the PayPal, Braintree or Venmo APIs, encouraging hacks that include an element of social good.

The US$100,000 prize (see exchange rate above) was awarded to the team from Venice, Italy.

Team Venice's winning hack, called ifCar, tapped both hardware and software to make cars smarter using a combination of sensors, environmental and contextual data and user preferences.
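A toy sketch of the ifCar idea -- combining sensor readings with user preferences to drive actions. The rules, sensor names and thresholds below are invented for illustration and are not Team Venice's actual code:

```python
def smart_car_actions(sensors, prefs):
    """Map sensor + preference data to car actions -- a toy rule engine."""
    actions = []
    if sensors["rain"] and prefs.get("auto_wipers", True):
        actions.append("wipers_on")
    if sensors["cabin_temp_c"] < prefs.get("target_temp_c", 21):
        actions.append("heating_on")
    if sensors["fuel_pct"] < 15:
        actions.append("suggest_fuel_stop")
    return actions

print(smart_car_actions(
    {"rain": True, "cabin_temp_c": 12, "fuel_pct": 40},
    {"target_temp_c": 20}))  # prints ['wipers_on', 'heating_on']
```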

"We're thrilled with what Team Venice built. Their technology has the potential to democratise access to the technical features of high-end cars, making it possible to turn any car into a smart car," said John Lunn, senior director of developer and startup relations at Braintree.

Picture credit: TopSpeed

Informatica big data management, more 'integrated' than some?

bridgwatera | 1 Comment

"Informatica launches industry's first integrated platform for big data management," said the press release headline, in what is, arguably, something of an overstatement all round.


There are of course several types, layers, breeds, sizes and species of big data management platforms out there -- and almost all of them are fairly 'integrated' in one form or another.

Where Informatica is going with this spin is its own-brand 'Big Data Management' offering -- a set of software intelligence intended to offer big data integration, big data quality and governance and big data security in a single integrated solution.

Which, as a combined set of big data 'things', is, arguably, more rounded than some.

The new product claims to reduce the need for hand-coding and for big data skill sets that are expensive and hard to come by.

"Data is the lifeblood of business, and only Informatica does end-to-end data management for big data," said Anil Chakravarthy, acting chief executive officer, Informatica.

"Big data represents the next frontier of competitive differentiation, superior customer experiences and business innovation. From driving rapid project implementations to ensuring confidence in the data and the safety of sensitive information, Informatica Big Data Management empowers business and IT leadership with unparalleled automation, pre-built tools and optimised capabilities. This allows for quick experimentation and seamless, mission-critical production deployments that deliver maximum business value from big data."

Microsoft's cloud vision widens -- Windows 10 will be an Operating System-as-a-Service

bridgwatera | No Comments

As reported here on Computer Weekly, just a few days after AWS set out its plans to open a UK datacentre, Microsoft announced a move to support the delivery of its commercial cloud services at its Future Decoded event in London on 10 November 2015.


What does this mean for developers?

In simple terms, this at least means that there are more on-the-ground, cloud-developer-related resources for UK-based (or perhaps even UK-centric) programmers to connect with.

Microsoft CEO Satya Nadella presented part of the keynote at this year's Future Decoded event with the following points of note:

• Nadella emphasised that this latest datacentre expansion effort centres on Microsoft's desire to build and provide the breadth of cloud infrastructure and resources needed for programmers in the UK to drive truly cloud-native applications.

• Nadella used his time on stage to present actual hands-on demo materials.

• Nadella talked about how (with that whole cloud-centricity factor in mind) Windows 10 will now represent the first stage of the firm moving to provide what we could call an Operating System-as-a-Service.

• Nadella wore a suit -- well, we had to mention it.

Nadella speaketh

"At Microsoft, our mission is to empower every person and organisation on the planet to achieve more," said Satya Nadella, chief executive officer of Microsoft.

"By expanding our data centre regions in the UK, Netherlands and Ireland we aim to give local businesses and organisations of all sizes the transformative technology they need to seize new global growth," he added.

As one of the largest cloud operators in the world, Microsoft insists that it has invested more than $15 billion in building a resilient cloud infrastructure and cloud services that deliver high availability and security while lowering overall costs.

There are now 24 Azure regions around the world.

Where there's cloud, there's IoT


Where there's cloud, there's always the Internet of Things -- so Microsoft showcased some of the 'coolest' IoT technology around with its robot bartenders. The machines mixed cocktails for attendees to order with very little spillage.

Shaken, not stirred, but definitely 'positively disrupted' as they would say.

Carina by Rackspace: a cloud developer portability conduit

bridgwatera | No Comments

Rackspace has announced the free beta offering of 'Carina by Rackspace' -- an instant-on native container environment.

Did you say Ai No Corrida?


No... it's Carina.

This is a technology offering that focuses on portability, i.e. allowing a customer to create and deploy a cluster for their containerised applications faster (claims Rackspace) than they could do it themselves.

Carina is a container cluster service that 'leverages' bare-metal performance, Docker Engine, native container tooling and Docker Swarm orchestration to make container clusters accessible to everyone.

Why is portability so important to developers?

As recently explained here, "Containers are means of transporting software (in a reliable state) from one computing environment to another."

The developer factor

In practice this could mean software that has to move from a development team's server into a test environment, or from a staging environment into 'live production', i.e. full deployment -- containers could also be used in the journey from a physical machine to a virtual machine in a private or public cloud.

Rackspace reminds us that container technology consumes a fraction of the compute resources of typical virtual machines, allowing for near-instant availability, application scaling and increased application density, allowing customers to save time and money.

Zero infrastructure

"At Rackspace, our mission is to give customers industry-leading service and expertise on the world's leading technologies. Carina extends this mission as part of our strategy to support OpenStack's position as a leading choice for enterprise clouds," said Scott Crenshaw, SVP, strategy and product at Rackspace.

"Carina design makes containers fast, simple and accessible to developers using native container interfaces, while leveraging the infrastructure capabilities of OpenStack," added Crenshaw.

With Carina, developers get a 'zero infrastructure' container environment where Rackspace manages the infrastructure and Docker environment for customers.

IBM expands 'Studio' line of facilities for coders

bridgwatera | No Comments

IBM tells us it likes developers, but don't be fooled -- everyone has been saying that since a certain bald-headed CEO started bouncing around the stage screaming the word.

But down at the guts level, we know IBM's intentions are pure enough, i.e. the firm has spent years validating its work with the Rational brand and has a solid developer stream running through almost every perceptible aspect of its entire stack, from its z Systems hardware beasts upwards to its Watson cognitive computing 'decision engine' platform.


Stu-Stu-Studio Line (not from L'oreal)

The firm has this week announced further expansion of the 'IBM Studios' across Europe with the opening of new facilities in Dublin and Hursley (UK).

By the end of 2015, new IBM Studios in Europe will also open in Warsaw, Prague, Hamburg and Paris, adding to IBM's more than 20 Studios around the world.

But what are IBM Studios?

These places are meant to play host to what IBM calls 'multi-disciplinary teams' and that means:

  • designers,
  • strategists,
  • coders and
  • other industry experts.

... all of whom come together to develop products and digital marketing services around cloud, analytics, Watson and collaboration.

Experts from IBM Design and IBM Interactive Experience will work together at these places.

"People's expectations of enterprise tech have changed because of innovative design they see in devices and apps used at work and play," said Phil Gilbert, general manager, IBM Design. "These studios will join a global network that is transforming how tech is created with user experience at its core."

Focus, fire a design-gun

Each new IBM Studio will have a core focus...

... so what this means is that Dublin and Hursley will focus on designing IBM products and user experiences around Watson, security, collaboration and Internet of Things.

The Hamburg and Paris Studios will focus on mobile and web application and digital transformation projects.

"This crop of new Studios in Europe reinforces IBM's continuing commitment to great design and innovation," said Matt Candy, Vice President & European Leader, IBM Interactive Experience. "IBM has been at the forefront of design-led thinking for decades and is now busy building the biggest design team in the world. With these six new openings -- and more to come next year -- we'll continue to break old models and create a new way to work."

IBM Interactive Experience's 9,700 designers, developers and consultants work with IBM clients to create data-driven design for everything from virtual showrooms and immersive customer experiences to business apps, content and more.

How to find real DevOps: look for binary artifact repository control

bridgwatera | No Comments

DevOps (as the coming together of both the 'developer' and IT 'operations' functions) has been unfortunately propelled upwards by the force of the technology trigger and driven onwards towards the peak of inflated expectations (to coin a phrase from Gartner).


As Gartner would now warn us... the so-called trough of disillusionment is the next logical stage.

So is DevOps heading for a fall?

The problem with DevOps is that tangential ancillary IT vendors have sought to nail their worthy-in-their-own-right technologies to the DevOps mast as a tactic to:

• Hype the PR cycle for their own brand
• Follow current tech trends
• Get 'developer-centric', coz that's always good
• Some other less than valid spin-related reason

This level of insubstantial peddling has led to a natural apprehension when we hear about DevOps today.

JFrog is a firm that (by most people's yardstick) does what we can arguably call real DevOps i.e. its Mission Control product exists to accelerate software delivery with monitoring and management functions over all binary artifact repositories.

Binary artifact repositories

NOTE: By way of definition -- binary artifact repositories (and binary artifact repository managers) are software components designed to store, version-control and manage binary 'artifacts', i.e. the built outputs of software (packages, libraries, container images and the like) that one stage of development produces and another consumes.
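TECHNICAL NOTE (continued): at its simplest, a binary artifact repository pairs content-addressed storage with a name:version index. A toy Python sketch of that core idea -- a drastic simplification of what a product like Artifactory actually does:

```python
import hashlib

class ArtifactRepository:
    """Toy binary artifact repository: blobs stored by SHA-256 digest,
    plus a (name, version) index pointing at those digests."""
    def __init__(self):
        self._blobs, self._index = {}, {}

    def publish(self, name, version, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self._blobs[digest] = data               # content-addressed store
        self._index[(name, version)] = digest    # version -> digest
        return digest

    def fetch(self, name, version) -> bytes:
        return self._blobs[self._index[(name, version)]]

repo = ArtifactRepository()
repo.publish("libpayments", "1.2.0", b"\x7fELF...binary...")
print(repo.fetch("libpayments", "1.2.0"))
```

Content addressing is what makes deduplication and integrity checks cheap -- two identical binaries published under different versions share one stored blob.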

Tough road to cloud scale

JFrog claims that it has "discovered" a common set of issues that tends to bog down software development and DevOps teams as they scale up to thousands of developers and engineers, in multiple teams, leveraging multiple datacentres around the world.

These issues include:

• maintaining a clear real-time inventory of binary artifact repositories;
• managing binary artifact workflows among multiple global teams;
• locking down security, user entitlement, permissions and provisioning policies;
• and ensuring highly reliable storage of and access to artifacts.

With the thousands of binaries that often go into a software release and the explosion in binary artifact types, monitoring and managing each binary repository separately has become a huge challenge.

JFrog says that Mission Control saves time and effort with a unified dashboard-view and centralised control of binary artifact repositories.

JFrog CEO Shlomi Ben Haim says that, "JFrog's vision is to fill a critical need by providing an executive dashboard that gives transparency into their global software development organisations based on the reality of builds, releases, distribution and consumption of software packages."

JFrog Mission Control is a downloadable product, offered free of charge to users of JFrog Artifactory -- a universal artifact repository that manages all binary artifacts regardless of the programming language or technology used to create them.

"It combines high availability, a secure Docker registry, npm repository and support for Maven, Gradle, Nuget, Yum, PyPI and other technologies," said Ben Haim.

He continues, "JFrog's Bintray gives developers and organisations full control over how they store, publish, download, promote and distribute software with advanced features that fully automate the software distribution process."

A DevOps spin test

Next time you look at DevOps news, look for functions like quantifiable tasks metrics, call stack analysis technology or binary artifact repository control... the rest of it might be trying to spin you round.

IBM: data analytics leads us to 'life event predictions'

bridgwatera | No Comments

IBM VP for industry analytics solutions Marc Andrews held a breakout session at the firm's recent Insight 2015 conference and exhibition to talk about the real-world implications, possibilities and applications of data analytics.


How Watson could help

IBM explains how its humongous Watson data analytics technologies could be applied in what it likes to call The Insight Economy.

Take an example in a bank... using more advanced analytics than have ever really been applied before now, we can see how banks might analyse user transactions to see what kind of lifestyle each customer leads, and so be able to:

a) Serve the customer better

b) Predict problematic circumstances better

NOTE: There is obviously a privacy factor here, but we will assume that the customer has agreed to the relevant level of data sharing with the bank in this circumstance.

a) If a particular customer spends a lot of money on wine clubs or holidays, then the bank may be able to channel special offers towards that individual that end up giving them a more rounded banking experience.

b) If a particular customer flies to Las Vegas frequently and gambles a lot, then it is safe to assume that they are a higher risk entity than some other customers -- the bank might like to prepare overdraft offers, present insurance deals and tune other services to the type of spend that that particular user is known for.

... and these lead us to what IBM calls LIFE EVENT PREDICTORS.

You could also call this behaviour-based insight, a term that has already gained popularity.
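A hedged sketch of how behaviour-based insight of the kind described above might work -- the categories, thresholds and offers below are invented for illustration, not IBM's actual model:

```python
def spend_profile(transactions):
    """Aggregate spend by category to build a simple behavioural profile."""
    profile = {}
    for category, amount in transactions:
        profile[category] = profile.get(category, 0) + amount
    return profile

def risk_flags(profile, gambling_threshold=500, travel_threshold=1000):
    """Derive 'life event predictor' style signals from the profile."""
    flags = []
    if profile.get("gambling", 0) > gambling_threshold:
        flags.append("offer_overdraft_review")
    if profile.get("travel", 0) > travel_threshold:
        flags.append("offer_travel_insurance")
    return flags

txns = [("travel", 800), ("gambling", 650), ("wine_club", 120), ("travel", 400)]
profile = spend_profile(txns)
print(risk_flags(profile))  # prints ['offer_overdraft_review', 'offer_travel_insurance']
```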

Pre-defined data model

The benefits go further here, i.e. IBM will offer these analytics services to banks, retailers, industrial machinery specialists and healthcare companies with what represents a pre-defined data model.

This means that firms using the services can see what variables they should be tracking in what IBM presents as the widest possible (well, IBM does have breadth) scope of operations for that particular industry vertical.

Codifying expertise

IBM calls this codifying expertise i.e. firms (banks or any type of business) can see where their data model is the same as most businesses in that particular vertical -- they can also see what kinds of variables they may NOT be tracking but SHOULD be tracking based on the whole data model that IBM offers.
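That gap analysis amounts to a set difference between the variables a firm tracks and an industry reference model. A sketch, with an invented (hypothetical) reference model -- the variable names are illustrative, not IBM's:

```python
# Hypothetical industry reference model -- names invented for the example.
REFERENCE_MODEL = {"retail_banking": {
    "txn_amount", "txn_category", "channel",
    "customer_tenure", "overdraft_usage", "complaint_count"}}

def model_gaps(tracked, vertical):
    """Return which reference-model variables a firm is NOT yet tracking."""
    return sorted(REFERENCE_MODEL[vertical] - set(tracked))

print(model_gaps({"txn_amount", "txn_category", "channel"}, "retail_banking"))
# prints ['complaint_count', 'customer_tenure', 'overdraft_usage']
```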

In terms of formal news related to this idea, IBM used the show to tell us that Cognos Analytics now sports a redesigned user experience based on the same self-service design principles as Watson Analytics.

With the redesign of Cognos Analytics, business users and IT professionals are now able to author and distribute reports and access self-service dashboards about things that matter to their business, like monthly financial reports, weekly sales pipeline trends, daily production yields and hourly inventory levels -- and do so on any device, anywhere.

What Beth said

"IBM helps enterprises across all industries extract new insights from the explosion of available data to drive competitive advantage," said Beth Smith, GM, IBM Analytics Platform.

"With Cognos Analytics, we have reimagined BI, delivering a modern user experience for business users, helping them gain the self-service independence they want without compromising the trusted scale an enterprise platform IT needs. Now our clients can gain the insights to effectively manage and improve the outcomes of their business."

IBM Insight 2015: Day #1 'insights'

bridgwatera | No Comments

Let's move from switchboards to motherboards.

Not a bad tagline/catchline to kick off the IBM Insight 2015 conference; this event is all about big data, analytics and (you guessed it)... insight.


The 'insight economy'

Senior VP for IBM Analytics Bob Picciano talks about the 'insight economy', another catchline that tries to express where his firm is channelling its software application development efforts just now.

This leads us towards IBM's work in what it calls 'cognitive computing' -- the division that drives forward the work of IBM Watson. The push here centres on creating what IBM wants to call an entirely new model for how people and intelligent systems will work together to enhance human cognition and expertise.

Have a Coke and a data analytics smile

During the keynote one Coke spokesperson took the stage... and his job title was Group Director for Data Strategy -- so Coke has a data strategy specialist, which kind of speaks for itself really.

There are now 50 core building blocks of services in Watson with specific APIs to encourage software application developers to start building more 'cognitive applications' today.


What's important to remember about cognitive 'learning' systems is that they never get tired; they keep learning and get better the longer they exist. As these types of applications grow, they build up more experience and can therefore start to provide more informed predictions about the world around them.
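That 'keeps learning' property, in its very simplest form, is incremental estimation -- each new observation refines the model without retraining from scratch. A minimal sketch of the principle (not Watson's actual machinery, obviously):

```python
class OnlineMean:
    """Incremental (online) estimator: each observation nudges the
    running estimate -- the simplest possible 'system that keeps
    learning as more data arrives'."""
    def __init__(self):
        self.n, self.value = 0, 0.0

    def update(self, x):
        self.n += 1
        self.value += (x - self.value) / self.n  # running-mean update
        return self.value

est = OnlineMean()
for reading in [10, 12, 11, 13]:
    est.update(reading)
print(est.value)  # prints 11.5
```

Real cognitive systems use vastly richer models, but the shape is the same: state that improves monotonically in information as observations accumulate.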

IBM is now pushing Watson to progress outwards; it wants to work with its ecosystem partners to assemble new cognitive application combinations.

From wine to cancer cure

IBM got all practical on the audience this Insight (well, after all, why not) and brought out a string of application implementation partner/customer vendors who showcased the use of IBM analytics technologies.

Everything from so-called 'cognitive hotels' (that can fix guest issues much faster) to cancer cure development... through to wine choice applications based on real user preferences (rather than difficult-to-understand sommelier ratings) was on show.


Watson Health

Watson now has a dedicated Watson Health division... and Watson is learning to "see".

Russell Olsen from this division explained that radiologists are typically tasked with looking at up to 100,000 images per day -- so today Watson is working to analyse key imaging data (comparing it with the diagnoses seen in other patients in real-world examples) and then...

... the radiologist or clinician can take the heavy work that has already been done (although Watson can apply some clinical knowledge and reasoning of its own), combine that data with "patient-driven differentials" and provide the best analysis possible for ongoing healthcare.

IBM announced that Watson will gain the ability to "see" by bringing together Watson's advanced image analytics and cognitive capabilities with data and images obtained from Merge Healthcare Incorporated's medical imaging management platform.


Also on the newsfeed we see that IBM plans to acquire Merge, a provider of medical image handling and processing, interoperability and clinical systems designed to advance healthcare quality and efficiency, in an effort to unlock the value of medical images to help physicians make better patient care decisions.

The new dataspeak

Just remember... there's a new language now:

Don't say LIKES, say: "user-declared information".
Don't say CALCULATION, say: "knowledge-based reasoning".
Don't say COMPUTING, say: "leveraging the Watson cognitive engine".
Don't say USER, say: "citizen data scientist".
Don't say ANALYTICS, say: "putting a cognitive lens on traditional business information".


What to expect from IBM Insight 2015

bridgwatera | No Comments

IBM, it turns out, was quite clever.

The company renamed its old Information on Demand conference to IBM Insight just before the whole big data analytics (which leads to 'insight', obviously) thing really started to take hold across the tech industry, about two years back now.

NOTE: Clearly we can trace big data analytics back further than that, but you get the point.


So after IBM Insight 2014, what can we expect from IBM Insight 2015?

Better empowered - awesome!

Well, for one thing, we can certainly expect some glossy well-practiced US West coast stage presenters telling us that they are 'better empowered' by IBM today - awesome!

But after the showboating, there is much to learn here... so much so that IBM refers to its conference schedule as a 'curriculum' for education.

"From business and industry solutions to deep-dive technical sessions, learn new ways to unlock the potential of data and analytics -- regardless of your interest, skill level or business priorities, the curriculum is designed to help you," reads the programme.

Tracking the tracks

Tracks this year include sessions on:

  • advanced analytics
  • data and content management
  • Hadoop & Spark
  • integration, governance and security
  • systems and architecture

TECHNICAL NOTE: Apache Spark is an open source parallel processing framework that enables users to run large-scale data analytics applications across clustered computers.
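TECHNICAL NOTE (continued): the map/reduce style of computation that Spark parallelises can be shown with plain Python primitives -- Spark's own API (along the lines of `textFile(...).flatMap(...).reduceByKey(...)`) would distribute these same steps across a cluster:

```python
from collections import Counter
from functools import reduce

# Spark-style word count sketched with plain Python: each "partition"
# of lines is mapped to local counts, then the counts are merged.
lines = ["big data analytics", "big data at scale"]
mapped = [Counter(line.split()) for line in lines]  # map step, per partition
totals = reduce(lambda a, b: a + b, mapped)         # reduce/merge step
print(totals["big"], totals["data"])  # prints: 2 2
```

On a single machine this is trivial; Spark's value is running the same two-phase shape over datasets too large for any one node.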

The need for speed

You can expect IBM to focus on real time (big) data and the need for speed.

Most likely we will find that Big Blue hammers home the "data is simply moving too fast for traditional approaches" message as it tells us how organisations must take advantage of new best practices of harnessing data.

... and you know there will be plenty of Watson.

For developers... IBM will focus on its Watson Developer Cloud service, which this year already gained additional intelligence in the form of advanced language understanding as well as speech and vision services.

Today we know that IBM Watson is a computer (it lives in Astor Place, New York and there's one in San Francisco too) that is capable of extracting meaning from unstructured text, video, photos and speech.

IBM has recently also developed cognitive APIs to establish software connection points that work with tools for software application developers to code Watson intelligence into new applications.

Fancy a bit of a session?

Sessions, sessions, sessions -- yes, there will be lots of sessions again this year and they are typically broken out as hour-long presentations that stop for a 15-minute Q&A (or Diet Coke-induced 'comfort break') at the end.

Session types this year include the keynotes (obviously) plus presentations from Box, Boeing, The Weather Company and Ron Howard.

(Ed -- Ron Howard? Don't tell me that 'Happy Days' runs on IBM now does it?)

There are also sessions presented in the form of panels, hands-on labs, Engagement Center sessions, Meet-the-Expert sessions, Design Studio sessions and the always welcome EXPO Theater sessions.

IBM Rocks

As well as opening receptions and after-hours roulette for the more hardcore technology hacks among us, IBM rounds out this year's event with a concert featuring Maroon 5 (Imagine Dragons and Train were obviously unavailable, thankfully), so even geeks will be able to move like Jagger by the end of the week.


Editorial disclosure: IBM covered a proportion of Adrian Bridgwater's travel expenses to attend its conference.

Apps are 'always on, always connected' -- so testing must change

bridgwatera | No Comments

As you may have already noticed, the state of the application is now an always-on and always connected thing -- this is the new normal.


Surely then, we need to approach software application development testing differently?

This is the thought process that drives much of what Perfecto Mobile has done with its latest product launches.

User experience blind spots

The application testing, monitoring and analysis company has launched 'Wind Tunnel' -- a product to optimise the process for identifying user experience blind spots.

Wind Tunnel optimises testing for end-user conditions by defining and personifying end-user profiles and by enabling testing across common scenarios such as:

  • degraded network conditions,
  • conflicting apps and
  • call interruptions.
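A sketch of how pre-defined user profiles might drive a test plan of the kind described above -- the profile names and fields here are invented for illustration and are not Perfecto's actual schema:

```python
# Hypothetical end-user condition profiles, Wind Tunnel style.
PROFILES = {
    "commuter":  {"network": "3G",   "background_apps": 5, "interrupts": ["call"]},
    "home_user": {"network": "wifi", "background_apps": 2, "interrupts": []},
}

def plan_runs(test_names, profiles):
    """Expand each test into one run per end-user profile."""
    return [(t, p) for t in test_names for p in profiles]

runs = plan_runs(["checkout", "login"], PROFILES)
print(len(runs), runs[0])  # prints: 4 ('checkout', 'commuter')
```

The point is that the same functional test is exercised under each profile's conditions (network quality, resource contention, interruptions) rather than only under ideal lab conditions.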

"Organisations are shifting their digital strategies to serve the always-connected user," states Raúl Castañón-Martínez, senior analyst for enterprise mobility infrastructure & services at 451 Research.

"Quality is integral to the application lifecycle and continues to grow in importance as digital engagement pushes the industry to think about the entire user experience."

Pre-defined user profiles

Perfecto's Roi Carmel argues that Wind Tunnel transforms Perfecto's Continuous Quality Lab into the first solution to offer pre-defined user profiles that combine a variety of common test scenarios.

"Profiles contain test conditions that, if unaccounted for, can affect apps dramatically, including location changes, varying network quality, usage patterns and resource conflicts with background apps," said the company, in a press statement.

• Early analysis -- correlate mobile events, vitals, network quality and type and various conditions that influence the end-user experience and learn the impact they have on the end user.

• No change to existing test scripts or build processes -- apply real user conditions onto existing automated tests using popular orchestration tools such as Jenkins or TestNG.

Tableau Conference 2015: howdy partners

bridgwatera | No Comments

Whether you like to call it data visualisation or data visualization, data vizable-ness company Tableau has had a busy week.


The firm's Tableau Conference 2015 event has been staged in Las Vegas and the opening keynote session was of particular note.

It's rare to see a two-hour technical presentation with developers on stage showing off every aspect of functionality imaginable and detailing product updates in specific detail, but that's what the company did.

Day two - double happiness?

The company then built upon this with its day-two keynote, presented by TED motivational speaker Daniel Pink.

Pink is good, quite entertaining, but he repeats himself a lot and 'pads out' his rhetoric when speaking -- one imagines he's not the best person in the world to get stuck in an elevator with.

In case you missed the news, Tableau also used this year's event to launch Vizable, a mobile application designed to help users understand data on an iPad.

What the partners said

The partners were busy too: Attivio announced Attivio Data Source Discovery for Tableau.

This is a self-service solution designed to reduce the time typically required to profile, identify and unify data for analysis.

The product promises to transform analyst productivity and yield a far better return on Tableau investments.

Attivio for Tableau is supposed to give end users an e-commerce-like shopping experience for their data, eliminating the principal bottleneck between agile data sources and Tableau.

"Tableau users spend too much time gathering data sources, and not enough time on analysis," said Stephen Baker, CEO of Attivio.

"With Attivio Data Source Discovery, Tableau users can instantly identify the data sets they need and provision them for analysis, eliminating the process bottleneck between enterprise information and data-driven insights."

Will it blend (in Alteryx)..?

Alteryx, Inc., a player in the data blending and analytics space, announced that, through its partnership with Tableau, it will release the ability to directly publish and update Tableau Data Extract (TDE) files to Tableau Server and Tableau Online.

The firm says it is delivering a simpler way for analysts to scale their use of Tableau by automating the creation of TDE files to ensure every user of Tableau visualisations has the most up-to-date data available.

This new capability will be available in Q4 of 2015.

"We know that more and more Tableau users are expanding the value they produce by using Tableau Server or Tableau Online to distribute their visualizations to decision makers," said Paul Ross, vice president of partner marketing at Alteryx.

"The ability to create and update the perfect analytical data set and deploy it directly to where it's needed means that analysts save time and can scale their efforts even further."

TMMData: too much molasses?

TMMData, a provider of flexible data management software, has announced a technology partnership with Tableau.

The pairing will showcase TMMData's advanced data management capacity with Tableau -- TMMData's software automates the otherwise productivity-sapping tasks of finding, standardising and aggregating information from multiple sources.

CEO Chris Walsh says that Tableau users who take advantage of his firm's software to remove barriers to data access, refinement and analysis will maximise their analytics investment and gain competitive advantage.

"In turn, clients using our data platform can easily share key reports with internal stakeholders through Tableau's visualisations," said Walsh.

... and finally

Last but not least, let's mention Informatica.

The firm announced the rollout of a set of new data management offerings purpose-built and bundled together to drive greater productivity and faster time-to-value for all users of Tableau.

The new Informatica offerings provide a turnkey data management solution which includes data integration, data preparation and out-of-the box visual templates for Tableau customers across the organisation.

Additionally, the solution includes self-service tools for end users that are claimed to require zero training. Informatica says it provides enterprise IT with an integration backbone and the data governance tools needed to deploy Tableau across the enterprise on a foundation of trusted data with full data lineage.

"Tableau and Informatica are deepening our strategic partnership by building on the tremendous success experienced with our initial Tableau offering," said Ronen Schwartz, senior vice president and general manager, Informatica Cloud.

"Informatica is expanding our support by bringing every Tableau user in the cloud, desktop and server the ability to explore any data and get answers. We are offering wide access to applications and databases, with point-and-click simple data extraction, for on premise and cloud data through easy visual cleansing and blending. Only with timely and trusted data can Tableau users realise the potential of powerful data visualisation without having to argue whether the data is accurate."

CWDN comment: the partner expo pavilion at Tableau Conference 2015 is quite unusual -- they supply attendees with food and drink all day long, they play music and there are games areas. Oh okay we like it.

Image credit: http://pierredreulle.fr/tag-Lucky-Luke.html

Tableau launches Vizable: a free (and fun) data analytics app for iPad

bridgwatera | No Comments
| More

Tableau has used its keynote sessions at the company's annual user conference to announce Vizable, a mobile application designed to help understand data with an iPad.

The firm's VP of mobile and strategic growth Dave Story showcased the app to what appeared to be an appreciative crowd.


Analytics can be fun

The free app can ingest data from a spreadsheet and then present it for touchscreen usage.

The idea here is said to be "fun and approachable data analysis" for more users -- and Tableau means more 'types' of users here i.e. a democratisation of data analytics through a tool that works on a device that we are used to using for everyday tasks.


The free iPad app makes it possible to explore data by using gestures such as pinching, swiping and dragging, allowing users to sculpt data into visualisations.

A tool for a real need?

Tableau says it has already put Vizable in the hands of a number of businesses and organisations who previously had data and questions, but no easy tool to answer those questions.


"We started with a food truck, so the very nature of our business is mobile," said Roz Edison, owner of Marination, a Hawaiian-Korean fusion cuisine business with one food truck and three restaurants in Seattle. "Since we've started using Vizable, if someone asks me how Marination is doing this week, I can pull out my iPad, start swiping and dragging my cash register data, and boom! It's that easy."

"There is no great app to see and understand a spreadsheet visually, using only a tablet," said Story. "This is an incredibly common scenario. Whether you're a teacher with students' test data, a small business with the day's sales, a cyclist with ride stats, or an executive with the past hour's web site traffic - Vizable brings analytics to more people by engaging them with their data like never before."

The first version works with data in CSV files and Excel (xls or xlsx) files.

Visualisations created in Vizable can be shared with friends and colleagues through email, instant message, or social media.

PRODUCT NOTE: Vizable is currently available worldwide in the Apple iTunes store in English -- it supports data in many international formats today, and will be translated to other languages in the future.

Is it all good news here? Well, yes... but the crowd shouted one word as the Tableau VP left the stage at the end of the presentation:


We can reasonably assume that the firm will port to other operating systems in the fullness of time.


Tableau Conference 2015: notes from the #data15 keynote

bridgwatera | No Comments
| More

This blog is written live at the 8th annual Tableau Conference -- the firm is known for its 'mission' to reinvent the spreadsheet for the tablet age... in terms of operation, Tableau's software takes data held in Excel spreadsheets and converts the information into tactile/touch-based "data visualisations" with Business Intelligence (BI) functionality.


First impressions

Huge simply doesn't cover it; they say that there are 10,000 attendees here and most of them have the #data15 hashtag already memorised for the week.

Co-founder and CEO Christian Chabot kicked off this year's event with all the usual fanfare of a company chief taking the stage in Vegas.

Chabot insists that his firm is dedicated to helping people to understand data -- and the most important word in that sentence is 'people' (well, this is Las Vegas remember).

Data visualisation examples of the kind created by the firm's software are clearly applicable to every industry and vertical (or sub-vertical) on the planet.

Examples referenced included:

• Ebola research
• Heating systems
• Steel production
• Pea farming
• Manta-ray protection in the ocean
• Children's education

What data means in the real world

"Compiling and reflecting on data and being able to understand it is starting to really impact people's lives," said Chabot.


Data is no longer the exclusive realm of statisticians and mathematicians says Tableau.

The problem, if there is one, is that the opportunities to use data are expanding faster than the growth of skills to use it argues the firm - an obvious, but nonetheless valid 'tee up' to position the Tableau product set.

Francois Ajenstat from the Tableau product development team took the stage to explain his vision for how ALL users should be able to use what he calls 'self-service' analytics.

Developers on stage - this section of the keynote saw seven members of the firm's actual programming team present and show a live migration of spreadsheet information from Microsoft Excel into a Tableau visualisation.

Even for a spreadsheet that has been split into two sections, the software is capable of taking the information and converting it.

250 different date formats

Day/month/year, month/day/year, day/month/year without century etc...

Another common problem experienced when 'parsing a string of data' into a visualisation tool is date format: the team say they have found over 250 different date formats in use today - handling these is another key piece of operational intelligence that Tableau performs.
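Tableau's parser is proprietary, but the underlying ambiguity is easy to demonstrate. A minimal Python sketch (illustrative only, not Tableau's implementation) shows why the same string can mean two different dates, and how a tolerant importer tries candidate formats in priority order:

```python
from datetime import datetime

# The same string means different dates under different regional conventions.
raw = "03/04/2015"

us = datetime.strptime(raw, "%m/%d/%Y")  # US reading: 4 March 2015
uk = datetime.strptime(raw, "%d/%m/%Y")  # UK reading: 3 April 2015
print(us.date(), uk.date())  # 2015-03-04 2015-04-03

# A tolerant importer has to try candidate formats in priority order.
def parse_date(value, formats=("%m/%d/%Y", "%d/%m/%Y", "%Y-%m-%d")):
    for fmt in formats:
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognised date format: {value!r}")

print(parse_date("25/12/2015").date())  # 2015-12-25
```

The priority order is the whole game: a real tool would pick it per locale or per column, since any fixed ordering silently misreads some ambiguous values.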


Another key feature just introduced is the ability to integrate cross-database data. If a user is inside one SQL Server database (for example) and needs to connect to a new data source (imagine, say, customer data in one database and product data in another) - the software now has the power to automatically join these kinds of data streams.
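Tableau does this join automatically, but the underlying idea can be sketched with plain SQL. The example below is illustrative only (it uses SQLite's ATTACH mechanism and invented table names, nothing Tableau-specific): customer data lives in one database, order data in another, and one query joins across both.

```python
import sqlite3

# Two independent in-memory databases stand in for separate sources,
# e.g. customer data in one database and order data in another.
crm = sqlite3.connect("file:crmdb?mode=memory&cache=shared", uri=True)
shop = sqlite3.connect("file:shopdb?mode=memory&cache=shared", uri=True)

crm.execute("CREATE TABLE customers (customer_id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Ada"), (2, "Grace")])
crm.commit()

shop.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, product TEXT)")
shop.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(10, 1, "laptop"), (11, 2, "keyboard"), (12, 1, "mouse")])
shop.commit()

# ATTACH makes the second database visible to the first connection,
# so a single query can join across both sources.
crm.execute("ATTACH DATABASE 'file:shopdb?mode=memory&cache=shared' AS shopdb")

rows = crm.execute("""
    SELECT c.name, o.product
    FROM customers AS c
    JOIN shopdb.orders AS o ON o.customer_id = c.customer_id
    ORDER BY o.order_id
""").fetchall()

print(rows)  # [('Ada', 'laptop'), ('Grace', 'keyboard'), ('Ada', 'mouse')]
```

The hard part a product like Tableau takes on is doing this without the user writing the join at all -- inferring the join keys and federating sources that don't share an engine.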

It's a new religion

As a final comment on this opening session, this conference is not what you would expect. If you thought that this would be a generally quite high-level presentation aimed at data visualisation for dummies - then you would be wrong.

The Tableau developer team took a major portion of this keynote and spent a weighty proportion of the time on stage showing off real application usage scenarios.

If you thought that data visualisation was a dry accounts-type subject, then you'd be wrong as well.

The attendee (a non-Tableau employee) sitting immediately to my right in the conference hall spent the entire presentation clenching her fists, whooping and saying "yes, yes, oh yes!" every time a new piece of functionality was shown.

This stuff really gets people going; it's almost as if people aren't ever going to want to use Excel on its own anymore.

As they say at Tableau: viz long and prosper.

51Degrees' CTO: the responsive emperor is naked

bridgwatera | No Comments
| More

This is a guest blog for the Computer Weekly Developer Network by 51Degrees' advisor (and CTO of Wayra and boss of Mobile Monday London) Jo Rabin.

Rabin's company provides open source device detection services for web developers so they can identify the devices that are hitting their websites and send (hopefully) a relevant web experience back to the user regardless of the device they are browsing on.

The below commentary comes on the heels of Google announcing the Accelerated Mobile Pages project recently - which aims to deliver faster web pages for publishers of content.

A naked fairy tale


For some time it has been apparent that like the fairy tale emperor, responsive design lacks some essential items of clothing.

Google recently played the role of the small boy in that fairy tale by making the announcement of Accelerated Mobile Pages.

In Google's announcement we see a refreshing acceptance that there is a problem with the web on mobile. Even more refreshing is that the argument is conducted on a down-to-earth, pragmatic and commercial basis, rather than on an abstract technological or aesthetic basis which ignores the commercial point of an organisation having a web site in the first place.

So web site owners are suffering, and their users are suffering too.

Unhealthy lashings of JavaScript

Responsive Web Design on its own is quite simply not enough of an answer -- and its unhealthy lashings of JavaScript poured over everything leads to sclerosis.

The first step, they say, is to acknowledge that there is a problem; the next step, apparently, is to seek help.

So what help does Google offer?

Well, sensibly, it says it is tackling the problem one step at a time.

Google offers a remedy for primarily static pages that carry advertising. That's great, but the approach it advocates is not startlingly different to advice that has been available from the W3C in the form of Mobile Web Best Practices for many years.

Things were quite different when that document was written, but the basics are still quite sound especially when you realise that those recommendations were written at a time when most web pages were primarily static and responsive design had not become a creed.

So it is not surprising that Google's recommendations for static pages and the historic guidance are reasonably well aligned.

To take a specific example...

In AMP the size of an image is fixed and stated in the HTML. This avoids the browser having to shift pages around as they load, one of the main causes of poor user experience - if you start reading something then suddenly it changes position.

Knowing what size you want an image to be up front requires an understanding of the context in which the image is to be displayed - i.e. is this being shown on a 27 inch desktop monitor, or is it to be displayed on a small hand-held screen? The techniques that allow web sites to determine this kind of information have been around for a while. Businesses and brands that require better than a hit or miss user experience already use device detection as part of their web presence.

Determining the size of an image in advance is just one specific example of what AMP requires and what device detection provides the answer to. Many other aspects of user experience are improved by using this technique which is highly complementary to AMP.
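As a rough illustration of the principle (the device classes, viewport widths and user-agent heuristics below are invented for the example -- this is not 51Degrees' actual detection logic or API), a server could use device detection to emit fixed image dimensions up front:

```python
# Everything here is invented for illustration -- these are not 51Degrees'
# device classes, API names or detection rules.
DEVICE_PROFILES = {
    "desktop": {"viewport_width": 1920},
    "tablet":  {"viewport_width": 768},
    "phone":   {"viewport_width": 360},
}

def classify_device(user_agent: str) -> str:
    """Crude stand-in for a real device-detection service."""
    ua = user_agent.lower()
    if "ipad" in ua or "tablet" in ua:   # iPad UAs also contain "Mobile"
        return "tablet"
    if "mobile" in ua or "iphone" in ua or "android" in ua:
        return "phone"
    return "desktop"

def image_dimensions(user_agent: str, aspect=(16, 9)):
    """Fixed width/height to write into the page markup up front,
    so the browser never reflows the page as the image loads."""
    w_ratio, h_ratio = aspect
    width = DEVICE_PROFILES[classify_device(user_agent)]["viewport_width"]
    return width, width * h_ratio // w_ratio

print(image_dimensions("Mozilla/5.0 (iPhone; CPU iPhone OS 9_0) Mobile"))  # (360, 202)
```

A production service would of course consult a maintained device database rather than a three-branch string match, but the shape is the same: know the display context before the page is sent, and the layout never has to shift.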

Responsive design is by no means dead, but it really is beyond time that its limitations were acknowledged and that the debate moved on to how to improve the real world of the web, improve user experience and help web site owners to improve what is now an essential part of their business.

Kudos to Google for extending the emperor's wardrobe.

What to expect from Tableau Conference 2015

bridgwatera | No Comments
| More

The 8th annual Tableau Conference is staged next week in (fabulous) Las Vegas -- so what should we expect?


Tableau is known for its 'mission' to reinvent the spreadsheet for the tablet age.

The firm's software takes data held in Excel spreadsheets and converts the information into tactile/touch-based "data visualisations" with Business Intelligence (BI) functionality.

The upcoming conference and exhibition is being staged by the firm as an opportunity to connect with 11,000+ other data enthusiasts who celebrate a shared passion for building relationships with data.

Data relationships, for real

According to Tableau, attendees will learn the tools and tactics that help build better "data relationships" for their business.

The Ajenstat state of the nation

"The Tableau Conference has become a fantastic event for developers and data enthusiasts to learn new skills and collaborate with people from all over the world. Developers get to learn directly from their peers across various industries, and usually come away with fresh ideas to bring back to their organizations." said Francois Ajenstat, VP of product management and Tableau Conference.

"The new web data connector that we announced last month has really opened up the possibilities for our development community, and they're exploring new ways to connect to various data sources within their organization. We're really excited to hear, directly from our customers, about how they're using these connections to enhance their data visualization projects."

Looking at the firm's latest release -- we can also reference the following commentary:

"Tableau 9.1 builds on our mission to help people see and understand data, no matter where the data resides and what device they're using," said Chris Stolte, chief development officer and co-founder of Tableau Software.

"We've made significant investments in enterprise features and an entirely new mobile app. We also created a web data connector that helps developers extend Tableau to connect to a limitless number of sources from Facebook to Twitter and Google Sheets. New native connectors to critical data sources such as SAP and Google Cloud SQL are also included."


Mainlining in partner central

Tableau Conference 2015 (or TC15 to those of us in the know) appears to be one of those events where a) the host vendor has lots to talk about and a new product release to showcase... but that communications stream will be quickly followed up by a whole host of partner announcements, so let's list a few previews as they come in.

Southard Jones is vice president of product strategy at cloud BI company Birst -- Jones spoke to the Computer Weekly Developer Network blog in person to provide the following thoughts on next week's event.

"Companies have faced challenges getting insight from their data as often they are locked into older, legacy BI tools that are not as integrated as their vendors might have you believe," said Jones.

"At the conference, Tableau will be going into more detail on the company sees its role in data visualisation and analysis developing... and Birst supports this through our work on data preparation and networking data sources together. This Networked BI approach offers companies a way to make their data richer through context, which Tableau can then help teams visualise; together this approach makes everyone work in smarter ways."

Where it's At (scale)

AtScale, a company that describes itself as an operation to provide business users with self-service BI access to Hadoop, will be talking about developments to its AtScale Intelligence Platform.

You could be hearing more about a technology known as Adaptive Caching, that's all we're saying for now.

Informatica commentaria

After posting this story the Computer Weekly Developer Network also received the following comment from Informatica's senior VP and GM for Informatica Cloud.

Ronen Schwartz gives us his predictions on what to expect at the Tableau conference.

"As usual data will take center stage at the annual Tableau Conference in 2015 with an emphasis on complete, trustworthy data. There will also be a focus on analytics and visualisation tools, which can be very powerful, but depend on good data."

Schwartz adds, "If all of the necessary data sources are not powering the analytics offering and if the data available for analytics doesn't go through transformation, normalization, validation and data quality, then you have an incomplete picture. Business and IT users will continue to demand visual solutions, but they also want better data management tools."

Never one to miss out on a sell and spin as a parting note, Schwartz bids that we watch for Informatica and Tableau to deepen their strategic partnership -- "to meet our joint customers' needs", as they say in PR land.


Editorial disclosure: Tableau has covered a proportion of Adrian Bridgwater's travel costs to attend this year's event.

Pentaho on big data -- George Clooney, margarine divorces & the Cheesecake Factory

bridgwatera | No Comments
| More

Forrester analyst Mike Gualtieri always gives one of the most colourful presentations at the PentahoWorld conference and exhibition every year.

Last year's George Costanza story is linked here.


George Clooney

Gualtieri this year at PentahoWorld 2015 reminded us that the number one (and number two) reason for doing data analytics is to improve customer relationships -- hold that thought in mind for a second.

So imagine George Clooney walks into the Cheesecake Factory restaurant.

Although most of us would be handed the same standard menu that every consumer gets when they enter this franchise, we can imagine that if George Clooney walked in... he would get special treatment based on who he is.

Like any celebrity, his preferences and likes might be reasonably well known to the public in magazines and/or on the Internet.

The point is... big data analytics allows firms to treat EVERYONE as if they were a celebrity.

Although no firm can know everything about each person, we can use analytics to fill in the blanks and start to work in information on weather, location, time and date, popular social trends etc.

The opportunity now exists for big data analytics to create enough knowledge about every consumer that what Forrester's Gualtieri calls hyper-personal, real-time relationships become possible with each of them.

This is where predictive analytics starts to come online... and predictive models are about possibilities, not absolutes - accurate predictive models may not even exist for every question.

Lessons from margarine

But let's remember that correlation does not always imply causation -- for example, the divorce rate in Maine closely tracks the per capita consumption of margarine in the USA -- so two seemingly congruent data sets might follow each other for no logical reason at all.

The lesson is to love big data analytics, but never place blind faith in it without understanding its context.
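The margarine trap is easy to reproduce in miniature. The figures below are made up purely for illustration; the point is only that any two series drifting in the same direction will show a high Pearson correlation with no causal link whatsoever:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up yearly figures: two unrelated series that both drift downward.
divorce_rate = [5.0, 4.7, 4.6, 4.4, 4.3, 4.1]   # per 1,000 people
margarine_kg = [8.2, 7.0, 6.5, 5.3, 5.1, 4.0]   # per capita consumption

r = pearson(divorce_rate, margarine_kg)
print(f"r = {r:.2f}")  # close to 1.0 despite no causal relationship
```

A correlation this strong would pass most dashboard sanity checks, which is exactly why the context question -- is there any mechanism connecting these two things? -- has to be asked by a human.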

Image credit: I can't believe it's not butter

Gee whiz, Apigee is gee'd up on APIs

bridgwatera | No Comments
| More

Gee whiz, if Apigee ain't as busy as a gee'd up API specialist this month.


(Ed - enough cowboy talk, what's the scoop?)

Apigee (pron: Apa-gee) is the firm behind Apigee Insights, an API-based self-service predictive analytics solution that works to understand and predict customer behaviour across so-called "interaction channels" today.

What is an interaction channel?

This is simply a term to denote things such as emails, the web itself, mobile apps, chat and contact centers.

Apigee Insights uses the Apache Hadoop open source framework for processing large sets of structured and unstructured data -- along with specialised data structures and sophisticated machine learning.

As well as a new collaboration with Accenture (channel delivery based, not so much new technologies) the firm also has some developments inside its own stack this month.

The newly released Apigee Sense is new software that is claimed to represent the industry's first intelligent API security product.

Apigee Sense is a data-driven security solution that uses a high volume of API call data and predictive analytics to continually and proactively identify bad "bots" - the automated software programs deployed over the Internet for malicious purposes like identity theft.

According to the firm, Apigee Sense software uses sophisticated machine learning to intelligently improve security as bad bots evolve; it extends the security capabilities of the Apigee Edge API management platform.
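Apigee has not published Sense's internals, so the sketch below is only a toy illustration of the general idea -- derive per-client rate features from call data and flag statistical outliers. The client names, counts and threshold are all invented:

```python
from statistics import mean, stdev

# Invented per-client request counts for one time window -- illustrative only,
# not Apigee's data model or detection algorithm.
requests_per_minute = {
    "client-a": 12, "client-b": 9, "client-c": 14,
    "client-d": 11, "client-e": 10, "scraper-x": 300,
}

def flag_outliers(counts, z_threshold=1.5):
    """Flag clients whose request rate sits far above the population mean."""
    values = list(counts.values())
    mu, sigma = mean(values), stdev(values)
    return [client for client, v in counts.items()
            if sigma and (v - mu) / sigma > z_threshold]

print(flag_outliers(requests_per_minute))  # ['scraper-x']
```

A fixed z-score is where a real product's machine learning would come in: as bots slow down or rotate identities to dodge a static threshold, the model has to keep re-learning what "normal" looks like.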

"APIs deliver the data and information powering our hyper-connected world, so protecting APIs from cyber threats is key for safeguarding data," said Chet Kapoor, Apigee CEO.

"Apigee customers have processed hundreds of billions of API calls through our platform in the past year to deliver rich digital experiences to their users," he continued. "Apigee Sense delivers a new kind of API security that combines that API call volume with predictive analytics to quickly identify patterns for potentially malicious bots and then - most importantly -- it learns and adapts as these bots evolve."

Mainlining on DevOps: Perforce intros Helix GitSwarm

bridgwatera | No Comments
| More

As readers of the Computer Weekly Developer Network blog will know, we don't appreciate firms that bolt the term DevOps onto a tangential function in order to try to spin some 'share of voice' in the media.


No worries of this kind with Perforce Software though... the firm's understanding of developer workflow processes has led it to announce the general availability of Helix GitSwarm, a complete Git ecosystem integrated with Perforce Helix.

Based on the GitLab collaboration suite, GitSwarm is supposed to foster developer productivity through Git repo management, a pull-request workflow, issue tracking and an integrated wiki.

With GitSwarm, Perforce Helix integrates Git workflow used by development teams with mainline development preferred by DevOps teams responsible for releasing products quickly.

GitSwarm developers use "narrow cloning" from a Helix mainline to manage the distribution of intellectual property through Git.

At the same time, mirrored branches of code maintained in the Helix Versioning Engine protect enterprise IP from potential theft and loss that can take place with unmanaged code distribution.

By integrating Git workflow with a centralised workflow, Perforce Helix addresses the limitations of native Git and other Git management solutions in the enterprise.

It meets the scalability required by DevOps practices and supports enterprise-class digital asset management, global distribution, high performance at petabyte scale, quality, security, process management and governance.

"Helix GitSwarm allows developers to collaborate in their preferred workflow," said Sytse Sijbrandij, CEO of GitLab. "Backed by the scalability of the Helix mainline repository and the collaboration features of GitLab, GitSwarm is a powerful Git-powered enterprise platform."

According to Gartner, "Enterprise-grade management of Git that offers important aspects of a DVCS -- good merging, the ability to work offline and good collaboration -- along with the security and central repository of a CVCS, will resolve most remaining concerns about the use of the DVCS model."

"Popular Git repo hosting services are great for small open source projects but fall short when a company needs to scale, protect their IP, or work with more than plain text files," said Christopher Hoover, vice president of product strategy at Perforce. "With the addition of GitSwarm to our Helix platform, we've delivered a single-vendor, best-of-both-worlds solution that understands and respects development teams as well as the business, digital asset and security needs of a large enterprise."

Helix GitSwarm is available free of charge as part of Perforce Helix.

Microsoft developer chief 'Soma' Somasegar departs

bridgwatera | No Comments
| More

Microsoft developer division chief S. Somasegar (call me 'Soma') is to leave his post after 27 happy years of serving the firm.


Renowned and respected Microsoft-tracking journalist Mary Jo Foley has confirmed that the company's cloud and enterprise executive VP Scott Guthrie announced Somasegar's impending departure internally on October 8.

"Most recently, Somasegar has been in charge of Microsoft's developer tools and services, including programming languages and runtimes; the Visual Studio line of products and services; and the .Net Framework. Somasegar also was responsible for the Cloud and Enterprise business' Global Development Centers in China, India, and Israel, and was the executive sponsor for these centers for all of Microsoft," reports Foley.

Soma has led a huge proportion of the development behind Microsoft Visual Studio and the firm's wider software application development tools and platform.

As he now moves on to new challenges, Soma's clarity, deep technical knowledge, genuine embrace of open source, and affable, approachable demeanour will be sorely missed.
