One truly massive Git -- GitLab Enterprise Edition

bridgwatera | No Comments
| More

GitLab has unveiled a new version of GitLab Enterprise Edition (EE) with features specifically sized up for the enterprise world.


How do we use Git?

Open-source GitLab is used for collaboration across more than 100,000 organisations, helping large distributed teams of developers work together and build apps with both accountability and enterprise-grade support.

Since GitLab EE is an on-premises solution for hosted Git repositories, enterprise companies are able to keep their codebases secure, private and customisable.

Private cloud repository reality

QUOTE: "[Over] 27 percent of enterprise business respondents indicated their companies were using private cloud solutions, compared with the 24 percent tapping public cloud options. An additional 17 percent noted firm plans to deploy private cloud, compared with 13 percent planning a public cloud deployment," according to Verizon's recent report, State of the Market: Enterprise Cloud 2016.

"We built GitLab EE ground up for the enterprise to deliver enterprise class support to our customers and give companies unprecedented control," said Sytse Sijbrandij, co-founder and CEO of GitLab.

In the release of GitLab 8.2, the Enterprise Edition added repository mirroring to the existing list of enterprise-focused features.


To increase performance for distributed team members, GitLab ships EE-focused features alongside each monthly GitLab release.

In addition to the EE-only features this month, support for Git LFS was released, which allows companies that require versioning of large files to use GitLab EE.

With over one million downloads and hundreds of thousands of developers using it, GitLab's developer community is found inside firms including NASA, CERN, Alibaba, SpaceX, O'Reilly, IBM and Expedia.

GitLab is available as a free, open source Community Edition, a free SaaS version and (not for free) as a new Enterprise Edition (EE).

Awfully pleased to meet you: survey finds open source needs more formal policies


A new study has suggested that while nearly 80% of firms are making use of open source software, the vast majority of them have no formal policies in place to govern its use.


The survey stems from work carried out by Black Duck Software, a firm focused on solutions for securing and managing open source code.

Gangnam ITAM style

According to the study, less than 42% of organisations maintain an IT Asset Management (ITAM) style inventory of open source components.

"We look forward to analysing the results of the Future of Open Source survey each year as it helps us validate the trends we've seen with customers to help discover open source in a company's code base, identify known security vulnerabilities, and track remediation," said Lou Shipley, CEO, Black Duck Software.

Slightly (arguably) less believable are claims that 50% of respondents to this survey said they were not satisfied with their own capability to understand known security vulnerabilities in open source components.

A surprisingly low 17% said they planned to monitor open source code for security flaws.

Shipley has also added the following comment, "In the results this year, it has become more evident that companies need their management and governance of open source to catch up to their usage. This is critical to reducing potential security, legal, and operational risks while allowing companies to reap the full benefits OSS provides."

Seed-to-growth & soup-to-nuts

Seed-to-growth venture capital firm North Bridge was also involved in the research here.


Red Hat freebases on Linux containers


Red Hat is a busy company with tired arms.

One minute Red Hat is building out its platform and helping Microsoft to 'embrace' its wider offerings, the next it is 'embracing' open standards and ready to 'embrace' meritocracy over rank in terms of its open community ethos... then next, Red Hat 'embraces' containers like never before.

So it comes to pass then that the latest version of Red Hat Enterprise Linux (RHEL) is said to be focused on the development and deployment of Linux container-based applications.

Red Hat Enterprise Linux 7.2

According to the firm's official statement, Red Hat Enterprise Linux 7.2 features many improvements to the underlying container support infrastructure.

"Updates are included for the Docker engine, Kubernetes, Cockpit and the Atomic command. In addition, Red Hat Enterprise Linux Atomic Host 7.2, the latest version of Red Hat's container workload-optimised host platform, is available with most Red Hat Enterprise Linux 7.2 subscriptions."

Also available is the beta of the Red Hat Container Development Kit 2, a collection of images, tools, and documentation to help application developers simplify the creation of container-based applications that are certified for deployment on Red Hat container hosts, including Red Hat Enterprise Linux 7.2, Red Hat Enterprise Linux Atomic Host 7.2 and OpenShift Enterprise 3.

Ops insight on Insight

RHEL 7.2 also includes compatibility with the new Red Hat Insights, an add-on operational analytics offering designed to provide proactive identification of known risks and technical issues.

Microsoft Connect() -- a call to connect on free tooling


Microsoft hosts its 'Build' event for developers, but it also stages its 'Connect()' developer event.

This year's event news centred around updated developer tools and programmes relating to Visual Studio, Azure, Office and Windows.


Exec VP for the firm's cloud & enterprise group Scott Guthrie is on the record saying, "The additions of Visual Studio Code beta, the free Visual Studio Dev Essentials program and .NET Core RC for Linux, OS X and Windows show that Microsoft is now the company working hardest for all developers."

Tooling up

Visual Studio Dev Essentials is a new free program designed to help developers create applications on any device or operating system, with access to popular Microsoft developer services, tools and resources.

This software includes access to Visual Studio Community, Visual Studio Code and new Visual Studio Team Services; priority forums support; Parallels Desktop for Mac; training from Pluralsight, Wintellect and Xamarin.

The company also introduced Visual Studio cloud subscriptions, which offer Visual Studio Professional and Visual Studio Enterprise on a monthly or annual basis.

Microsoft Graph

Perhaps slightly fresher is news of the general availability of the Microsoft Graph.

This is intended to offer developers a consistent way to access data, intelligence and APIs within the Microsoft cloud and with a single authorisation token.

According to a press statement, "With the Microsoft Graph, developers can tap into the collective power of the Microsoft cloud to create smart, people-centric applications that help companies and end users achieve more with contextual insights. Any developer capable of making an HTTP request can call the API from any platform."
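Since, per that statement, any developer capable of making an HTTP request can call the API, a minimal Python sketch of such a request might look like the following (the `/me` path follows Microsoft's public documentation, but the placeholder token is illustrative; a real token would come from an OAuth flow not shown here):

```python
import urllib.request

def graph_request(path: str, token: str) -> urllib.request.Request:
    """Build an authenticated request against the Microsoft Graph.

    The token is assumed to have been obtained via an OAuth flow;
    only request construction is shown here."""
    return urllib.request.Request(
        "https://graph.microsoft.com/v1.0" + path,
        headers={
            "Authorization": "Bearer " + token,  # the single authorisation token
            "Accept": "application/json",
        },
    )

req = graph_request("/me", "EXAMPLE_TOKEN")
print(req.full_url)  # https://graph.microsoft.com/v1.0/me
```

Passing `req` to `urllib.request.urlopen()` would perform the call and, for `/me`, return JSON describing the signed-in user.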


Also now available is Azure Service Fabric in public preview, for developers to build and operate microservice-based applications at scale that fully integrate with Microsoft Azure and Visual Studio.

This preview includes support for .NET development on Windows Server, with Linux support expected in 2016.

Microsoft released Visual Studio Code beta as an open source project, available on GitHub.

NOTE: Visual Studio Code is an 'advanced code editor' and part of the Visual Studio family that runs on Linux, OS X and Windows.

The new beta version includes a new extension model with a gallery of extensions for additional features, themes and language support.

The company also delivered release candidates (RC) of .NET Core 5 and ASP.NET 5 for Linux, Windows and OS X. With this implementation of .NET Core for any operating system, developers can start using it in production environments.

Docker's oyster has security in the shell


Goodness isn't it all about the containers these days?

Actually, scratch that, isn't it all about the container security debate these days?

As recently explained here, a container is a specific place to run an application alongside its own dependencies, configuration files, libraries and the 'runtime environment' that defines its engine power.

Containerizing, yes, it's now a word


Containerizing an application means that it can be moved from one environment (from 'test' to 'deployment' for example) to another.

This process also means that the containerized application can benefit from being abstracted away from its underlying infrastructure... but what security implications arise herein?


Docker hopes to have additional security answers to address current issues inside its open platform for distributed applications.

The firm this week announced new security enhancements designed to safeguard and protect Dockerized distributed applications.

The new development sees what is claimed to be the first "hardware signing" of container images, content auditing through image scanning and vulnerability detection, and granular access control policies with user namespaces.

Why is hardware signing the answer?

Hardware signing and scanning of container images addresses the trust and integrity of application content.

Both are universal considerations in the application lifecycle and are becoming a central focus for organisations with Dockerized distributed applications in production, which accounts for 40 percent of all deployments of Docker.

These new capabilities, in combination with Docker's existing security options, are claimed to ensure the publisher of the content is verified, chain of trust is protected and containerized content is verified via image scanning.
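To make the trust-and-integrity idea concrete, here is a deliberately simplified Python sketch of digest-plus-signature verification. Real Docker Content Trust uses Notary/TUF with public-key (and, with this release, hardware-backed) signing; an HMAC stands in here purely to keep the sketch self-contained:

```python
import hashlib
import hmac

def publish(image_bytes: bytes, signing_key: bytes) -> dict:
    """Publisher side: hash the image content and 'sign' the digest."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    signature = hmac.new(signing_key, digest.encode(), hashlib.sha256).hexdigest()
    return {"digest": digest, "signature": signature}

def verify(image_bytes: bytes, metadata: dict, signing_key: bytes) -> bool:
    """Consumer side: recompute the digest and check the signature
    before running the container."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest != metadata["digest"]:
        return False  # image content was altered in transit
    expected = hmac.new(signing_key, digest.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, metadata["signature"])

key = b"publisher-secret"
image = b"...image layer bytes..."   # stand-in for real image content
meta = publish(image, key)
print(verify(image, meta, key))          # untampered image passes
print(verify(image + b"!", meta, key))   # tampered image fails
```

The same two-sided shape (publisher signs, consumer verifies before use) is what the "chain of trust" language above refers to.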

Solomon speaks

"It has been our goal from the beginning to develop a framework that secures Dockerized distributed applications throughout the entire application lifecycle," said Solomon Hykes, CTO and chief architect of Docker.

"With this latest set of capabilities, we continue to drive our users and ecosystem forward with industry-first innovations and best practices that advance the end-to-end security of distributed applications. Furthermore, we've enabled developers and IT ops to benefit from a more secure environment, without having to learn a new set of commands or to be trained on a deep set of security principles. Docker security works as part of an integrated component without any disruption to developer productivity while providing IT with the appropriate level of security controls."

What's even better than predictive analytics?


Question: what's better than predictive analytics?

Answer: predictive analytics and anomaly detection with pipeline aggregation intelligence, of course... so here's an argument for why.

Elastic, the company behind the open source projects Elasticsearch, Logstash and Kibana, finished its European developer-focused tour this month.


The firm has completed a number of upgrades to its products.

Elasticsearch 2.0

Elasticsearch is the workhorse engine for Elastic's products.

The technology was created by Shay Banon, co-founder and CTO of Elastic, to provide real-time capabilities to store, search and analyse large amounts of structured and unstructured data.

It allows users to ask complex questions and slice and dice the aggregated data with what are claimed to be 'sub-second' response times.

A new 'deepness' through pipeline aggregations

One of the major updates in Elasticsearch 2.0 is the development of pipeline aggregations, which allow users to run aggregations such as derivatives, moving averages and series arithmetic on the results of other aggregations.

This opens up the potential for predictive analytics and anomaly detection with pipeline aggregation intelligence.
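A pipeline aggregation request of this kind can be sketched as a Python dict that would be POSTed to an index's `_search` endpoint; the `derivative` and `moving_avg` aggregation types are from the Elasticsearch 2.0 DSL, while the `date` and `price` field names are invented for the example:

```python
import json

# A monthly date_histogram whose per-bucket sum feeds two pipeline
# aggregations: a derivative (month-on-month change) and a moving average.
query = {
    "size": 0,
    "aggs": {
        "sales_per_month": {
            "date_histogram": {"field": "date", "interval": "month"},
            "aggs": {
                "sales": {"sum": {"field": "price"}},
                # Pipeline aggs operate on the sibling "sales"
                # aggregation's output, not on raw documents.
                "sales_change": {"derivative": {"buckets_path": "sales"}},
                "sales_trend": {"moving_avg": {"buckets_path": "sales"}},
            },
        }
    },
}

body = json.dumps(query)  # this JSON body would be POSTed to /<index>/_search
```

The key point is `buckets_path`: pipeline aggregations consume the output of other aggregations, which is what makes derivative-of-a-sum and similar second-order analytics possible.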

You can read more about pipeline aggregations here.

Also of note in Elastic territory is the release of Marvel 2.0, the company's monitoring solution for Elasticsearch clusters. Previously only available in the paid-for enterprise version of Elastic, this new version of Marvel has been released under a free licence.

Elastic was also showing off recently announced major upgrades to its other products including its Kibana user interface and visualisation tool.

Shay Banon's blog, covering other new features as well as performance and security enhancements in the latest Elasticsearch release, can be found here.

Google TensorFlow -- deep learning neural networked cognitive computing machine learning gone open source


They all share a similar meaning: deep learning -- neural networks -- machine learning -- cognitive computing.


But what are these terms, and what do they actually mean?

Cognitive computing and the ability to create what we have started to call artificial neural networks describe the new breed of computers (such as IBM Watson) that can interpret human 'meaning and intent' out of questions spoken in natural language and also extract meaning from unstructured text, video, photos and speech.

Google TensorFlow

While many progressions in this space are proprietary, Google has released its TensorFlow artificial intelligence software on GitHub under an open-source license.

Google reminds us that it uses a weighty sprinkling of artificial intelligence (AI) behind the scenes in its Gmail and Google search services.

TensorFlow is primarily written in C++ and is already getting lots of traction.

Prior to TensorFlow, we may have more readily associated Google's artificial intelligence and machine learning work with the DistBelief offering.

DistBelief has been used to analyse and automatically identify items contained within videos and photos for example.

What Google says

According to the Google developer blog, "TensorFlow is a highly scalable machine learning system. It can run on a single smartphone or across thousands of computers in datacentres. We use TensorFlow for everything from speech recognition in the Google app, to Smart Reply in Inbox, to search in Google Photos. It allows us to build and train neural nets up to five times faster than our first-generation system, so we can use it to improve our products much more quickly."
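For a feel of the computation-graph model TensorFlow popularised, here is a toy, pure-Python dataflow graph: operations are recorded first and executed later, which is the same deferred-execution idea (TensorFlow itself adds tensors, automatic differentiation and device placement on top; nothing here is its actual API):

```python
class Node:
    """A node in a toy dataflow graph: an operation plus its input nodes."""
    def __init__(self, op, *inputs):
        self.op, self.inputs = op, inputs

    def run(self):
        # Evaluate inputs recursively, then apply this node's op.
        return self.op(*(n.run() for n in self.inputs))

def const(value):
    return Node(lambda: value)

# Build the graph first (nothing is computed yet)...
a, b = const(2.0), const(3.0)
c = Node(lambda x, y: x * y, a, b)   # c = a * b
d = Node(lambda x: x + 1.0, c)       # d = c + 1

# ...then execute it, much as a TensorFlow session would.
print(d.run())  # 7.0
```

Separating graph construction from execution is what lets a system like TensorFlow ship the same graph to a smartphone or to thousands of datacentre machines.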

Hadoop is not the only fruit


It is true, Hadoop is a key focal point for many of us when we talk about big data -- and indeed, open source big data projects.


However, it's important to think outside the Hadoop box for a number of reasons.

Outside the Hadoop loop

By its very nature Hadoop is open source, so many of its developers and other contributors will naturally revel in the openness of the entire open code surface and work on other projects as well... these 'tangential' (many of them substantial) projects are typically complementary to Hadoop.

Where projects in fact compete with Hadoop, that's also a good thing as it keeps the overall drive for efficiency and functional excellence as sharp as it should be.

Why open source is so good

We might suggest that there is a core reason why open source is so well suited to big data: if we accept that Hadoop is hard and that the actual implementation of big data analytics is still in its relative infancy, then we can see how the open customisability of open software could be better suited to big data projects as they now grow.


Looking outside the Hadooposphere, the Enterprise Apps Today website brings together a much needed selection pack cum Obligatory List Article of some of the other open source big data tools out there.

Lumify is an open source data integration, analytics, and visualisation platform built to help you understand the world of data.

Lumify's features include the ability to analyse relationships and automatically discover paths between entities -- it can also overlay data as layers on a map for a geographical view of the data model.
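"Automatically discover paths between entities" is, at heart, graph search. A hedged sketch using breadth-first search over an invented entity graph shows the idea (the entities and links are made up; Lumify's own implementation is not shown here):

```python
from collections import deque

# Invented entity graph of the kind an analytics tool might visualise.
links = {
    "Alice": ["Acme Corp", "Bob"],
    "Bob": ["Flight BA-117"],
    "Acme Corp": ["Zurich Office"],
    "Flight BA-117": ["Zurich Office"],
    "Zurich Office": [],
}

def discover_path(start, goal):
    """Breadth-first search: returns the shortest chain of
    connections between two entities, or None if there is none."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(discover_path("Alice", "Zurich Office"))  # ['Alice', 'Acme Corp', 'Zurich Office']
```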

Talend Open Studio for Big Data provides simple graphical tools and wizards to generate native code that helps you leverage the full power of Hadoop.

HPCC Systems Big Data -- as detailed at the above link, "Is a platform for manipulating, transforming, querying and data warehousing your Big Data and is an alternative to Hadoop. It uses the Thor data refinery, Roxie data query/delivery engine and Enterprise Control Language (ECL) as an alternative to Apache Pig. (ECL is claimed to be 4.45 times faster than Pig on average.)"

You can read Paul Rubens' piece at the above link for more clarification on the other tools available in this space.


Windows 10 IoT gets more Java via Azul


JavaOne may well be behind us, but some of the Java goodness still flows outwards into the newswires.


An (arguably meaty) morsel you may have missed is news that Java runtime solutions company Azul Systems is partnering with Microsoft to provide Java developers with:

• open source development tools,
• device I/O libraries,
• a Java runtime targeting Internet of Things (IoT) applications on Windows 10.

Zulu Embedded for Windows 10 IoT is a Java Development Kit (JDK), Java Virtual Machine (JVM) and a set of device I/O libraries based on OpenJDK that is compliant with the Java 8 SE specification and has been certified by Azul for use with Windows 10 IoT Core.

The software is free to download and use and may be distributed without restriction.

NOTE: Microsoft Windows 10 IoT Core is an edition of the Windows 10 OS designed for low-cost, small-footprint embedded devices such as those based on Raspberry Pi 2 and Minnowboard Max.

Azul and Microsoft's IoT team are partnering to ensure Zulu Embedded meets the ongoing Java development and runtime requirements for Microsoft's IoT initiatives, including continued updates to ensure compatibility with the latest Java updates and security patches as well as support for additional IoT device connectivity, control, and communication.

Microsoft's Windows Internet of Things man Steve Teixeira argues that, "Microsoft and Azul made it easy for those who prefer Java to build premier IoT devices running Windows."

All kinds of robots

Microsoft Windows 10 IoT Core is designed to work with a variety of open source languages as well as Visual Studio -- it is built for powering intelligent connected devices ranging from small form factors, such as gateways or mobile point-of-sale units, to industrial devices, like robots and specialty manufacturing equipment.

Neo Technology: we put the graph database into the data matrix


Here are some things you need to know about Neo Technology.

First, the company chooses great giveaway bottle opener key rings as conference freebies, so be sure to pick one up if you see them.


Secondly, the firm bizarrely uses the same text in the headline and standfirst (sub headline) of its announcements... to save time & money perhaps?

Thirdly, the firm works to produce what we call graph database technology -- a type of database that stores connections (between different elements of the dataset) as 'first class citizens' and makes them available for any 'join-like' navigation operation.
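A minimal sketch of what 'connections as first class citizens' means in practice: each record holds direct references to its neighbours, so a traversal is pointer-chasing rather than a computed join (the people and friendships here are invented):

```python
class Person:
    """A record that stores its connections directly."""
    def __init__(self, name):
        self.name, self.friends = name, []

ann, ben, cat = Person("Ann"), Person("Ben"), Person("Cat")
ann.friends.append(ben)   # Ann -> Ben
ben.friends.append(cat)   # Ben -> Cat

# "Friends of friends" is two hops of pointer-following,
# not a join across tables:
fof = [f2.name for f1 in ann.friends for f2 in f1.friends]
print(fof)  # ['Cat']
```

In Cypher, the query language discussed below, the equivalent traversal might read `MATCH (p:Person {name: 'Ann'})-[:FRIEND]->()-[:FRIEND]->(fof) RETURN fof.name`.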

Did a PR person say 'leading'..?

Finally, Neo Technology is the creator of Neo4j, one of the world's graph databases -- and this month the firm announces the launch of openCypher, an open source project that will make Cypher (a popular graph query language) available to technology providers as a universal language for querying graph data.

The firm insists that creation of openCypher is a pivotal moment in the evolution of the graph data space.

Much like SQL did for relational databases, Cypher promises to accelerate the usage of graph processing and analysis worldwide by making it easier for any data storage, analytics or tooling platform to offer access to graph capabilities using a universal query language.

Is it popular?

Hundreds of thousands of developers and data analysts already use Cypher and the majority of people learning about graph databases do so with Cypher.

Besides having garnered wide enthusiasm from the user community, Cypher is currently supported by numerous tooling providers, providing a strong foundation of existing skills and support. The query language has a robust history and has been well proven in the field, with tens of thousands of deployments.

A quota of quotes

"Graph processing is becoming an indispensable part of the modern big data stack," said Ion Stoica, CEO and founder of Databricks.

"Lots of software systems could be improved by using a graph datastore," said Rebecca Parsons, CTO, ThoughtWorks.

"Much as SQL was a springboard more than 30 years ago to bring relational database technology to the fore, we expect Cypher to have a similar effect on graph database adoption. Companies such as Google, Facebook and LinkedIn have leveraged graph processing to transform their respective industries," said Emil Eifrem, CEO, Neo Technology.

Image: Wikipedia

SUSE enterprise storage 2: revenge of the Ceph


SUSE has announced its Enterprise Storage 2 product.

This is the latest version of its self-managing, self-healing, distributed software-based storage solution for enterprise.


It is the first (and only) Ceph-based solution with heterogeneous operating system support.

What is Ceph?

Ceph is a distributed object store and file system designed to provide performance, reliability and scalability -- it provides access to objects using native language bindings or 'radosgw', a REST interface that's compatible with applications written for S3 and Swift.

The new SUSE release is intended to give users the ability to deploy software-defined storage with less cost.

"Software-defined storage promises to change the economics of enterprise storage infrastructures," said Roger Cox, research vice president, data center convergence, at Gartner.

Commodity off-the-shelf servers

Powered by Ceph, SUSE Enterprise Storage 2 uses commodity off-the-shelf servers and disk drives.

Ralf Flaxa, SUSE vice president of engineering has commented, "With the majority of enterprises planning to adopt software-defined storage in the next few years, it is emerging as the method of choice for companies looking for cost-effective, enterprise-grade storage."

Image: Wikipedia

Chef beefs up for regulatory compliance


There's a whole lot of DevOps puff and fluff out there, but that's okay... because this one is real.


Right across your stack

Chef is a company known for its DevOps automation prowess and this week sees the firm move to release new products that claim to be able to automate change management for the entire application and infrastructure stack.

Chef's workflow automation product Chef Delivery, initially made available as an invitation-only program in April, is now generally available and integrates with the new Chef Compliance offering.

Together, they automate infrastructure, runtime environments, applications and compliance policies.

(Ed -- yeah, that is real DevOps, okay.)

Chef's new compliance capabilities enable users to automate the assessment and remediation of IT infrastructure.

Chef Compliance is built on technology from VulcanoSec, a security software company based in Germany, which Chef recently acquired. Chef is integrating Chef Compliance with Chef Delivery to bring compliance into the DevOps workflow.

"As organisations in all industries rapidly evolve their business models around technology, they are struggling to develop software at velocity, while also meeting complex compliance and security obligations," said Barry Crist, CEO of Chef.

DevOps evangelist, it's now a 'thing'

"DevOps is how we do things at Ooyala. We're building out Chef Delivery to be the foundation for our workflow, so our engineering team has a single pipeline to drive our development," commented Caedman Oakley, DevOps evangelist, Ooyala. "A great example of how we'll use Delivery is as the pipeline for building, testing, and deploying our open source Docker management framework, Atlantis, enabling the team to best collaborate on a core piece of our production environment."


Linus Torvalds 'launches' Linux kernel 4.3


Linus Torvalds has detailed the launch of the Linux 4.3 kernel, a new release with significant security enhancements.


As creator and 'father' of the Linux kernel and project himself, Torvalds used a short note on the LKML.org website to make things official.

"So on the whole, this remains a rather calm release cycle until the very end. And with the release of 4.3, obviously the merge window for 4.4 is open, and let's keep our fingers crossed that that will be an equally calm release," wrote Torvalds.

More code than Mars Rover

As journalist Chris Merriman points out, "This release ships with 20.6 million lines of code, more than it took to get the Curiosity Rover safely to Mars."

The new Linux kernel ships with a) improved graphics support for Nvidia graphics cards and b) support for Intel's Skylake processors.

According to Intel, the redesign for Skylake brings greater CPU and GPU performance and reduced power consumption -- and Skylake uses the same 14 nm manufacturing process as the previous Broadwell generation of chips.

More Linux contributors than ever

According to the Linux Foundation, "What we're learning from this year's data is that there are more developers working on Linux than ever. More than 12,000 individuals have contributed to Linux since 2005 and more than 4,000 contributed in just the last 15 months. Nearly half of these recent developers are first-time contributors, which we think really represents the growing community of people supporting Linux."

NOTE: The current number of code contributors at the time of writing is said to be over 14,000 authors.

According to Linux specialist Michael Larabel, Red Hat leads as the company making the most code contributions, with 5.9% of all commits.

"Intel followed closely behind with 5.32% while the Linux Foundation came in third at 3.14%. Following the Linux Foundation were SUSE, Linaro, Texas Instruments, and Samsung. Ubuntu/Canonical didn't make the top ten list," writes Larabel.

More Internet of Things means more Bluetooth


Devices are being connected to the so-called Internet of Things (IoT) at an increasingly rapid rate -- this we already know to be true.


The rise of 'smart home' technologies and 'wearables' means that the IoT is demanding more connectivity all the time, i.e. these devices need a connection channel... and it is often Bluetooth.

NOTE: Interestingly, the rise of low energy Bluetooth is also starting to deliver wireless connectivity to 'products' such as store-bought goods that we might never have previously considered to have been blessed with connectivity.

SIG power is rising

New developments here include the fact that the Bluetooth Special Interest Group (SIG) has announced the general availability of its Bluetooth Developer Studio, a no-cost software-based development kit that helps developers learn Bluetooth technology quickly and (hopefully) bring products to market faster than ever.

"The Bluetooth Developer Studio arms developers with an all-in-one, cost-efficient tool to turn their ideas into reality. With it they can create products and applications that make our lives easier, better, smarter," said Steve Hegenderfer, director of developer programs, Bluetooth SIG.

Hegenderfer goes on to assert, "With Bluetooth Developer Studio, not only will we see more smart gadgets enter the market, we will see quality products that 'just work' delivering the IoT experience consumers actually want."

What's inside?

The Bluetooth Developer Studio's key benefits and features include a drag-and-drop user interface, sample code, virtual and physical device testing... plus built-in tutorials for faster deployment.

Additionally, the tool makes it easy to share reference designs and use successful implementations created by others.


The Bluetooth Developer Studio Dashboard

Veeam cleans up dirty backup worries on Linux


Veeam Software describes itself as the company that delivers 'availability for the always-on enterprise' -- so much so, it has even trademarked the phrase.


The firm's Veeam Backup for Linux is a free standalone 'agent' (i.e. a piece of software intelligence with a specific role) designed to handle backup and recovery for Linux servers running in public cloud instances.

It also works with the 'few' remaining physical Linux servers running on premises.

Ditch your dirty old back up

Customers can use this tool to back up individual cloud instances and restore them as bright shiny new instances on premises, or back up on premises and restore in the cloud.

The company reminds us that Linux remains the go-to choice for the cloud, with 75 percent of enterprises reporting they use Linux as their primary cloud platform.

Manual intervention hassles

"Backing up and recovering Linux servers is often a complicated and costly process, frequently needing manual intervention or consuming too much of an IT administrator's time," said Doug Hazelman, veep of product strategy at Veeam.

"Moreover, as hybrid cloud increasingly becomes the industry standard and more enterprises look to run more workloads in the public cloud, it is important to ensure that these public cloud server instances are backed up and can be recovered easily and quickly in order to ensure availability and avoid business disruption."

Hazelman insists that the 'Veeam community' has asked his team for an easy to use tool to ensure the availability of their Linux cloud instances and restore them anywhere.

As such, Veeam says it is not only meeting this need but also adding a portfolio of free tools.

Image credit: Unilever

IBM builds Apache Spark into core analytics engine


IBM staged its Insight 2015 conference this week, so naturally we were expecting plenty of announcements.

So... what of open source goodness then?


NOSTALGIA NOTE: In true old-school style, IBM hosted a traditional press conference and produced a pack of nine 'printed' (i.e. on real paper) press releases - it was kind of like a welcome return to the way things used to be.

The firm says it has announced a redesign of more than 15 of its core analytics and commerce solutions with Apache Spark - the open source parallel processing framework that enables users to run large-scale data analytics applications across clustered computers.


It's all about accelerating real-time processing.

IBM also announced the availability of its Spark-as-a-Service offering (known lovingly as IBM Analytics on Apache Spark) on IBM Bluemix following a 13-week Beta programme.

Apache Spark is known for its ability to create algorithms for crunching complex data -- as a piece of software, it boasts in-memory processing that is ideal for 'frequently accessed' information.

IBM says it has been able to simplify the architecture of some of its most widely used software solutions and cloud data services, such as IBM BigInsights, IBM Streams and IBM SPSS.

As an example, IBM reduced the code base of DataWorks (the company's data preparation and data refinement service) by over 87 percent, from 40 million lines of code to 5 million lines of code.
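Those figures check out on a quick sanity calculation:

```python
# IBM's DataWorks claim: 40 million lines of code reduced to 5 million
# is an 87.5 percent reduction, consistent with "over 87 percent".
before, after = 40_000_000, 5_000_000
reduction = (before - after) / before
print(f"{reduction:.1%}")  # 87.5%
```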

DataWorks will now benefit directly from Spark's scalability, distributed programming model and data source connectivity as well as the frequent enhancements delivered to Spark by the project's contributors.

Offered as a service for developers within the broader ecosystem of IBM's managed cloud data services, IBM Analytics for Apache Spark integrates with open source, proprietary and third party tools on the IBM Bluemix cloud platform.

The big promise from IBM is...

... developers will now be able to infuse analytics into their apps in real-time.

"For data scientists and engineers who want to do more with their data, the power and appeal of open source innovation for technologies like Spark is undeniable," said Rob Thomas, VP of product development, IBM Analytics. "IBM is committed to using Spark as the foundation for its industry-leading analytics platform, and by offering a fully managed Spark service on IBM Bluemix, data professionals can access and analyze their data faster than ever before, with significantly reduced complexity."

Since announcing its commitment to the Apache Spark community in June 2015, IBM has made over 60 contributions to the Spark project, including Machine Learning and SQL.


Magnet CEO Chuang: application development is playing catch up

bridgwatera | No Comments
| More

Magnet Systems recently launched Magnet MAX, a modular suite of tools that are supposed to help developers maintain user experiences.

This software also exists to "optimise legacy technology investments" within enterprises.


But what does that phrase really mean in the real world?

Optimising legacy technology could be, for example, combining CRM software with inventory management to create new retail applications.

Magnet argues that the increased value from the combination of multiple point applications will become critical as the number of smart devices per consumer increases and IoT and wearable technology becomes more common.

Developers need tools that can help them deploy applications at a rapid rate, but as seen with the recent XcodeGhost malware in Apple's app store, when developers take too many shortcuts to rush application updates they sacrifice consumer security and stability.

The following commentary is written by Magnet Systems CEO Alfred Chuang.


To date mobile applications for consumers and businesses have largely focused on a single task.

Rare is the mobile app that brings together all of the unique capabilities of mobile to create new business models, new ways of delivering services and information and new user experiences.

Playing catch up

Despite the recent launch of the iPhone 6S and fierce competition between Samsung and Apple to innovate on mobile experiences, application development has not been able to keep up with the pace of hardware and software launches.

Creating a stand-out mobile application becomes increasingly difficult for businesses that want to leverage their internal infrastructure and might have legacy investments including inventory management, CRM information and marketing campaign software.

At Magnet Systems we believe that we are not far from reaching a generation of apps that will create new business models, new products, new services and innovation.

But in order to realise that future, developers need a framework that can help accelerate the development of mobile applications and provide connectivity to enterprise applications, databases and other services that create a personalised user experience that can't be found anywhere else.

The answer from Magnet?

Magnet Software is known for its Magnet Max framework -- Magnet's open source architecture provides APIs, SDKs and tooling for faster application development, a server optimised for mobile applications and deep messaging capabilities for stronger user engagement.

Thomson Reuters raises stakes in financial desktop software

bridgwatera | No Comments
| More

Thomson Reuters has enhanced the open capabilities of its financial desktop software -- Eikon.


App Studio in Eikon is supposed to allow third-party developers to create apps that display as native applications on the Eikon screen.

Eikon itself is a financial analysis desktop and mobile solution that connects trusted content, Reuters news and information on venues, markets and liquidity pools.

Finance is going open

The firm suggests that the financial industry is increasingly turning to open technology standards to spur the innovation and flexibility institutions need to remain competitive in an increasingly complex business landscape.

App Studio is an evolution of Eikon's pre-existing open capabilities, allowing in-house developers and independent software vendors to blend Eikon's leading news, market data and analytical tools with proprietary research and other third-party data.

These apps can then be deployed directly within Eikon, either publicly or to specific individuals or groups.

Format-friendly fuzziness

"Our customers' success increasingly depends on their being able to use the data they want, in the format they require, on the devices they prefer," said Philip Brittan, chief technology officer and global head of platform, Financial & Risk, Thomson Reuters.

"Thomson Reuters is building an open platform where the ideas of the market can thrive. App Studio in Eikon is one of many ways we are drawing on open technologies to eliminate the barriers to efficient, collaborative workflows in the financial industry. It will help financial professionals generate fresh business ideas, respond rapidly to market changes and deliver new tools and services to their clients."

Pentaho: the 'traditional' industries 'get' IoT analytics first

bridgwatera | No Comments
| More

Pentaho has had a busy week -- the firm has had its first week under full public scrutiny as the new Pentaho, a Hitachi Data Systems company, and staged its second annual PentahoWorld customer, partner, user & developer event.

As part of the shenanigans, Pentaho announced that customers including Halliburton Landmark, IMS and KDS are using its platform to "reimagine established industries" (as the PR spin doctors would say) to blend, integrate and orchestrate machine-generated big data and deliver analytics embedded at the point of impact.

"Big data and the Internet of Things are disrupting entire markets, with machine data merging the virtual world with the physical world. We've really only just scratched the surface of how IoT will reshape sectors of the economy," said Quentin Gallivan, CEO of Pentaho.

According to McKinsey Global Institute "The Internet of Things: Mapping the Value Beyond the Hype," the IoT market could have an estimated total economic impact of $3.9 trillion to $11.1 trillion per year in 2025.

With this market opportunity also come IT roadblocks. McKinsey notes that the lack of open standards and agile platforms may slow the adoption process across the enterprise.


"IoT applications can get very complex very quickly due to the extensive breadth and diversity of data sources and analytics involved, as well as the challenge with standards," said Vernon Turner, SVP of Enterprise Systems and IDC Fellow for The Internet of Things.

"For companies and developers looking to unlock the value of IoT, the focus will be on technology vendors that provide an open and agile platform," added Turner.

Halliburton Landmark -- Oil & Gas

"Oil and gas is an old industry with a new take on technology. At Landmark, a Halliburton business line, we've embarked on an enterprise-wide deployment of Pentaho across our multiple industry platform offerings to improve collaboration between oil and gas companies and the broad supply chain," said Kumar Shanmugavel, Product Manager, Halliburton Landmark. "By expanding our advanced analytics capabilities to include monitoring machine sensor data, our deployment has improved pump safety and prevents spills by predicting failure rates, resulting in 60 to 80 percent less cost and 2x to 4x faster development."

Intelligent Mechatronic Systems (IMS) -- Automotive


"As a leader in the connected car industry, IMS is creating revolutionary, award-winning technology that enables drivers to be safer, smarter and greener," said Christopher Dell, Senior Director, Product Development and Management, IMS. "Pentaho enables us to derive greater meaning from the big data collected from our connected car programs, increasing our competitive advantage and enabling us to offer customers the most comprehensive end-to-end connected car solutions on the market. For example, IMS is currently utilizing high performance analytics to drive better outcomes for both insurers and drivers in usage-based insurance programs. We are also looking to leverage this technology to grow our other connected car programs and services, such as road-usage charging and fleet management offerings, as well as expanding to new opportunities in the related IoT market."


Kirchhoff Datensysteme Software (KDS) -- Manufacturing

"Plastic compounding is a complex and highly specialised process. The industry is an order-driven small batch process in which products can be manufactured in different plants; it's therefore essential to identify the optimal production path," said Oliver McKenzie, Managing Director, Kirchhoff Datensysteme Software. "Poly.MIS was built by industry experts on the Pentaho business analytics platform and helps plastic compounders diagnose the causes of poor throughput times and serves as a basis for continuous production path optimisation, laying the foundation for a smart factory."

PentahoWorld 2015: notes from the keynote

bridgwatera | No Comments
| More

How is Pentaho doing under its new uber-parent Hitachi Data Systems (HDS)?

Very well, thank you for asking, said the EMEA chief and the comms lead in a pre-conference informal session prior to this big data analytics driven conference.


From trains to televisions

Hitachi, it appears, has a vested interest in the Internet of Things (and so, therefore, Pentaho's data capabilities) as a company that produces everything from televisions to trains.

All these devices have connectivity these days, so data forms the lifeblood in what Hitachi likes to call 'The Internet of Things that matter'.

(Ed -- but ALL devices need love too right?)

NOTE: The event has a heavy Dev-by-Devs developer track; this is a hands-on symposium with plenty of coding activity - people getting their hands dirty on interface connectivity... you know the kind of thing.

Pentaho CEO Quentin Gallivan

CEO Gallivan took the stage to claim that the "unstructured element of data is doubling every three months"... much of this coming from the Internet of Things, of course.

The growth areas for the Internet of Things (and the big data analytics that will support it) are in key areas such as predictive maintenance for industrial equipment and smart cities.

The difference, from Pentaho's perspective, is that analytics has to start happening INSIDE the application itself so that it can start impacting application behaviour at the point of impact.

More technical sections of this keynote were delivered by Chris Dziekan -- he is big data chief product officer and EVP at Pentaho.

It appears that much of the effort going into working with big data is focused on which data tool mechanics we use...

... auto-modelling (and in-line modelling) in the firm's PDI product can start to build the data model in a more automated fashion. This type of analytical model editing also allows users to engage in the model editing process i.e. a data developer could start to input metadata to help define the schema emerging from a data lake as it comes out of the water.

As we also know, big data configuration can be a hard thing to do. Pentaho has been working with its latest release to help create pathways to built-in testing and troubleshooting.

Talking about the operations of his firm under its new parent Hitachi, Dziekan said "Hitachi allows us to stretch into new places and scale in new ways, without touching the Pentaho agenda."

... this blog will expand and link to other stories from PentahoWorld 2015.
