When IoT connected fridges send out spam

bridgwatera | No Comments
| More

So anyway, Juniper Research predicted last week that the number of "connected appliances" (i.e. Internet-of-Things style devices) in so-called SmartHomes will reach 10 million by 2017.

In the same vein, Società Fabbricato of Italy also predicted that Internet of Things (IoT) devices in homes will reach 77 million by 2017 -- the lesson, arguably, being not to give too much credence to analyst predictions.


What type of devices are we talking about here?

This can range from fridges that order food when it begins to run out, through to lighting and heating that adjusts to people's schedules.

Whatever the figure, Mike Ellis, CEO of open source identity management company ForgeRock, thinks our lives are going to change immeasurably:

"The rise of IoT devices in homes has a dark side though. Our houses are the source of our most private data and, as the number of internet connected devices rise, so does the potential for hackers to exploit them. Preventing this should be the prime priority for businesses."

Fridge spam

"We have already seen the impact a hacked device can have when a fridge sent out almost a million spam emails. Allowing this type of technology in our homes without adequate protection is a recipe for disaster."

Ellis advocates "identity and context-based security" and says that these are the layers that software application development pros need to start building into their project architectures.

ForgeRock's advice for programmers is to ensure that data transferred between Internet-enabled machines is encrypted and authenticated.

"To ensure that the request to access a machine is valid, a number of factors must be checked. Data such as location, time and device must be verified to ensure that requests are warranted," said Ellis.
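Ellis's checklist can be sketched in code. The Python fragment below is purely illustrative -- the device names, locations and time window are invented for the example and are not ForgeRock's API:

```python
from datetime import datetime, timezone

# Hypothetical context-based policy: a request is granted only if the
# device is registered, the location is expected, and the access time
# falls inside an allowed window.
TRUSTED_DEVICES = {"fridge-01", "thermostat-07"}
ALLOWED_LOCATIONS = {"home-network"}

def is_request_valid(device_id, location, when):
    """Return True only when all contextual factors check out."""
    if device_id not in TRUSTED_DEVICES:
        return False
    if location not in ALLOWED_LOCATIONS:
        return False
    # Arbitrary example policy: only allow access between 06:00 and 23:00.
    return 6 <= when.hour < 23

morning = datetime(2014, 4, 1, 9, 30, tzinfo=timezone.utc)
print(is_request_valid("fridge-01", "home-network", morning))      # True
print(is_request_valid("unknown-device", "home-network", morning)) # False
```

The point is that no single factor is trusted on its own -- identity, location and time must all line up before the request is honoured.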

Why Heartbleed did not harm open source


Unless you live in a cave, Victorian style external toilet or Bear Grylls style treehouse in the Outer Hebrides it is safe to say you will have read about the Heartbleed bug.


The Heartbleed bug is an OpenSSL cryptographic library flaw that allows attackers to steal sensitive information from remote servers and devices and is said to have affected nearly two-thirds of websites.

Because of the bug, many say that secure connections can no longer be regarded as trustworthy, since hackers can access and view user IDs and passwords, or worse, the private encryption keys that secure all connections.

CEO of open source middleware company Talend Mike Tuchen says that a lot of the coverage to date has highlighted how small and underfunded the OpenSSL team is, and how this "volunteer" approach to open source development can cause problems.

Reading many of the commentaries, one could easily come to the conclusion that a proprietary approach would be better, he argues.

But did the open approach ultimately actually protect us?

Tuchen believes that, in reality, the open source nature of OpenSSL has provided real benefits in this situation.

"The community scrutiny of the open source code worked," he said.

Talend's Tuchen argues as follows:

After the Snowden disclosures, researchers began focusing on widely used cryptographic components to look for weaknesses. As a result of this public scrutiny, the flaw was discovered independently by two researchers within a month or so of one another.

The cooperative nature of open source worked.

If this stack was closed source, the flaw might have been found by any number of malicious parties but never disclosed.

In addition, all of the public scrutiny of OpenSSL has uncovered other potential issues that would never have been found if it wasn't open source -- for example, a custom buffer allocation approach that bypasses some of the advances in buffer management in the underlying operating systems.

At the end of the day, with a closed source approach this security flaw might never have come to light publicly -- as widespread as the problem has been, the open approach is almost certainly better than the alternative.

Open source carrots, broccoli, celery, kale & coffee pods


Reading over the Easter holidays (as one does) a good book like When Computing Got Personal: A history of the desktop computer [Kindle Edition] by Matt Nicholson, one is reminded that the birth of open source as a wider movement goes back further than the last 25 years of Linux.

Nicholson helps to remind us that the sharing and dissemination of component construction information predates even the Internet, and that IBM went through a series of lawsuits when Thomas Watson was steering the company through its initial development of punch card tabulating machines back in 1914.

Fast forward to 2014 and we understand open source to be largely associated with technology, and predominantly, software technology -- with hardware as a reasonable additional interest.

But what does open source really mean to the world of industry at large?

Look at Nespresso, the company fronted by head coffee-brewing barista George Clooney (Ed: we don't think he actually works for them, OK?), with its colourful "coffee pods" for its no-mess coffee machines.


Quite apart from these things being an (arguably) environmental packaging disaster (although the firm is starting to use recyclable materials), Nespresso has had a virtual monopoly on the production of coffee pods for its own machines up until now.

Nespresso parent company Nestlé has just reached an agreement with France's antitrust authorities to now extend the guarantee on its fancy coffee machines to customers who use pods other than its own, branded ones.

This means that George Clooney will now personally remove the warning on all Nespresso coffee machines that tells customers only to use its own capsules.

But open source goes further...

The University of Wisconsin-Madison has started its Open Source Seed Initiative (OSSI) to help publicise the fact that there are now 29 kinds of plant varieties available under an open source license.

The OSSI group is run by scientists and is currently releasing 29 new varieties of crops (including carrots and broccoli - and also celery, kale, quinoa) under what it has labelled as an "open source pledge" to defend the ability of farmers, gardeners and (possibly most importantly of all) plant breeders to share seeds freely around the industry.


Jack Kloppenburg (left), professor in the Department of Community and Environmental Sociology, Irwin Goldman (center), chair of the Department of Horticulture, and Claire Luby (right), graduate student in the UW's Plant Breeding and Plant Genetics program, fill envelopes with non-patented seeds in the Horticulture office in Moore Hall.

Photo: Bryce Richter

"These vegetables are part of our common cultural heritage, and our goal is to make sure these seeds remain in the public domain for people to use in the future," says UW-Madison horticulture professor and plant breeder Irwin Goldman, who helped write the pledge.

The Open Source Seed Initiative (OSSI) was established in 2011 by public plant breeders, farmers, non-governmental organization staff and sustainable food systems advocates from around the nation concerned about the decreasing availability of plant germplasm -- seeds -- for public plant breeders and farmer-breeders to work with.

Opening Box to open source


Online file sharing and cloud content management company Box is showcasing its now quite well populated open source repository.

As a company Box has always come across as 'open enough' to third party projects, partnerships and promotions -- but now an appreciation for open programming steps forward in more solid terms.


The logically named http://opensource.box.com/ is, as it says on the tin, an open source repository of Box-connected metadata, emerging projects, core content and SDKs for Android, iOS, Windows and Java.

Box says it relies on open source software every day.

"That's why we give back to the open source community whenever possible, by contributing code to outside projects and sharing projects that we've developed internally," says the firm.

Example projects include:

Rain Gauge -- a tool to simplify the process of collecting detailed information from MySQL database servers when specific conditions are triggered.


Flaky -- a plugin for nose that automatically reruns flaky tests. Instead of skipping flaky unit tests for components that aren't 100% reliable, use Flaky to automatically retry them.

Box's Benjamin VanEvery blogged this month to say, "Open source has been a part of the Box technology stack since the company's earliest days. Technologies including Apache, nginx, PHP and their peers have been critical to Box's success and to the technical revolution of web software and platforms as a whole. Today, we are very excited to announce Box Open Source, Box's formalized open source initiative committed to giving back to the community. Our engineering teams have contributed 20 open source projects, all showcased on opensource.box.com."

A proactive approach to open source governance


This is a guest post for the Computer Weekly Open Source Insider blog written by Lacey Thoms, a marketing specialist and blogger at software analysis and code attributes management company Protecode -- Lacey has a Bachelor's Degree in Mass Communications from Carleton University and has written many articles on open source software management.

Omnipresent openness


Open source software has become an omnipresent and major driver of software activities worldwide. Many organisations, from small start-ups to large multinationals, are using open source code to accelerate development and reduce costs.

As open source adoption increases, the processes for managing open source code and its associated license obligations, security vulnerabilities and export content are maturing. The days of manually auditing the code before the product ships are losing out in favour of more proactive, cost effective approaches.

The modern approach to an open source software adoption process is similar to the one used for any other third party software: uncovering all external code used in a project and identifying its license and copyright attributes, as well as any security vulnerabilities or encryption content associated with the code.

Like other quality assurance processes, it is best to start managing open source governance in the early stages of development.

Organisations are beginning to take more proactive steps towards managing open source software licenses, beginning with an established open source policy and a defined workflow process that can reject any packages that violate the policy before the developer is permitted to use the code. From there, organisations follow practices that can detect and flag violations as the code is brought onto a developer's workspace.

Creating an open source policy

The first stage of implementing an open source governance process is to draft an open source policy. The policy regulates the open source governance process and covers topics such as who the stakeholders are within the organisation, and outlines acceptable attributes, such as open source licenses and communities. The open source policy is drafted with input from all the relevant stakeholders in the organisation.

Typically an open source committee consists of representatives from legal, R&D, and product management. An open source policy also includes a workflow for requesting and approving open source packages that can be used in specific projects or within the entire organisation and defines the course of action once an open source policy violation is suspected.

Implementing a pre-approval workflow

A good open source policy puts emphasis on catching open source governance issues at the earliest stage of development, therefore vastly reducing the time and effort involved in remedying them. An important element of any solid open source policy is a package pre-approval process. In essence, this process is a series of actions that allows anyone to request a certain open source package to be used in a project. Through a streamlined workflow process, a licensing person can approve or reject the requests based on the available information about the project, how the package is to be used in the project, and the open source package attributes.

So, what does a package pre-approval workflow entail?

First, developers must submit a request including details such as package's name, a link to the code, and information such as version, authors, and the license cited on the site or specified in the package. Other information such as known open source security vulnerabilities and presence of encryption content in the package will help the compliance examiner streamline the approval process. Another important item accompanying the pre-approval request is a description of how the package is going to be used in the product, including whether or not the code will be modified, redistributed, or if it will only be used internally.

After the request is submitted, an administrator (usually someone from the open source committee) can review the request. Typically, a combination of manual research and automated open source scanning tools are used to confirm and identify licenses, obligations, copyrights, open source security vulnerabilities, and encryption properties of the requested package. At this stage, the licensing person will review license obligations and other properties of the requested package against the organisation's policy, taking into consideration how the developer intends to use the package.

If there are no conflicts with the organisation's open source policy, the administrator can approve the package. Once a software package is approved, it is then logged and made available to the specific product groups or the whole organisation. A record of the approved packages is made available so that developers can readily use these pre-approved components in the future.

Software package pre-approval can be added to existing open source management processes to further improve governance. Organisations that have a process in place that scans code at regular intervals (e.g. daily, weekly or monthly), and also organisations that have a continuous scanning process in place (scanning in real-time as code is brought in by developers), will benefit from a package pre-approval process. Package pre-approval speeds up continuous scanning because code can be approved before it enters the development environment.
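The heart of such a pre-approval check can be sketched in a few lines. The license lists and rules below are hypothetical examples for illustration, not Protecode's product logic:

```python
# Hypothetical policy: licenses the organisation has pre-approved, plus a
# rule that code being redistributed must not carry a copyleft license.
APPROVED_LICENSES = {"MIT", "Apache-2.0", "BSD-3-Clause"}
COPYLEFT_LICENSES = {"GPL-3.0"}

def review_request(package, license_id, redistributed):
    """Return ('approved' | 'rejected', reason) for a package request."""
    if redistributed and license_id in COPYLEFT_LICENSES:
        return ("rejected", f"{package}: copyleft license in redistributed code")
    if license_id in APPROVED_LICENSES:
        return ("approved", f"{package}: {license_id} is on the approved list")
    return ("rejected", f"{package}: {license_id} needs committee review")

print(review_request("requests", "Apache-2.0", redistributed=True)[0])  # approved
print(review_request("readline", "GPL-3.0", redistributed=True)[0])     # rejected
```

Automating this gate at request time is exactly what lets later scanning stages deal with fewer unknown files.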

The result is a lower number of overall files that need to be scanned thus speeding up the overall scanning process.

A proactive approach

To get the most benefit out of open source code, organisations are turning to more sophisticated practices for open source governance. As with other software lifecycle management processes, automated solutions for package pre-approval exist that significantly reduce the time and effort spent on open source governance and increase the accuracy of the results.

Information that these processes detect includes permissions from the owner to use the software, any known deficiencies such as bugs or security vulnerabilities, and any other information pertinent to carrying out the business such as exportability.

A package pre-approval workflow process, when combined with automated open source scanning, is an effective part of managed adoption of open source software, allowing organisations to reduce their development costs and speed up delivery times using quality third-party software. With a proactive approach to open source management, organisations can harness the benefits that open source has to offer, while creating a streamlined process to avoid challenges associated with open source software.

Intel beefs up open source Raspberry Pi challenger and slashes price


Intel has beefed up its open source single-board computer and cut its price in half.

The Minnowboard Max features an open hardware design and is targeted at software application development pros and enthusiasts who want to code for the "deeply embedded" market.

Intel has slashed the Minnowboard price from US$199 to $99 (£60) -- although distributor prices vary and not all have reflected the price reduction at the time of writing.


System-on-Chip (SOC)

Along with the price cut comes a more powerful Atom processor (64-bit Intel Atom E38xx Series SOC), integrated Intel HD graphics and a smaller overall footprint for the machine itself.

Minnowboard Max runs Debian GNU/Linux or Android 4.4 -- it is also Yocto Project compatible.

NOTE: The Yocto Project is an open source collaboration project that provides templates, tools and methods to help create custom Linux-based systems for embedded products regardless of the hardware architecture.

Occasionally touted as a "Raspberry Pi challenger", Intel's Minnowboard Max is arguably better suited to the professional and hobbyist engineering space and is unlikely at this stage to make inroads into Raspberry Pi's popularity inside the education sector -- although that success in itself has been questioned.

The Intel graphics chipset included here comes with open-source drivers so software developer/hackers can really play with a wide range of possible use cases for this machine.

Minnowboard Max has two USB ports (one of them now USB 3.0), a microSD slot and 10/100/1000 Ethernet.

If the Raspberry Pi and its Broadcom SOC with 700MHz ARM processor had won over many of the hackers in this zone recently, then many would argue that Intel has placed Atom right back in the interest zone with these updates.

The software-defined business cometh


Somebody had to say it, so it might as well have been an Application Performance Management (APM) company that did so.

Having just about got our heads around Software Defined Networking (SDN) -- where system management is decoupled from hardware to be given over to the control offered by a software application called a controller -- we now learn that the software-defined business cometh.

APM vendor AppDynamics has been busy attempting to coin this buzz-phrase in the context of its core solution.

The firm's core offering has (cue news item) just been augmented with enhancements for apps built in Java, .NET, PHP and mobile apps; plus new support for Node.js, Scala apps and big data stores based on increasingly popular NoSQL databases.

This software is intended to monitor, manage and analyse complex software environments in real time and in production.

AppDynamics' Spring 2014 Release includes new support for NoSQL Big Data stores including MongoDB and Hadoop, Couchbase and Cassandra through its extensible API framework.


"This goes way beyond monitoring -- it's true application intelligence," said Jyoti Bansal, founder and CEO of AppDynamics

... and that, if you buy the sizzle on this particular sausage, is some of what makes a software-defined business, i.e. a firm that can trace its core operational vital signs down to functions and controls that are dictated (or at least managed) by software.

Has SkySQL MariaDB pulled off the NoSQL + SQL combo challenge?


There is something of a war of words (and code) going on between the NoSQL and SQL database camps.

Some of it is merely flak; both approaches have their benefits.

SQL databases are marked out for their predefined schema (the structure of the database that describes its construction and basic 'integrity constraints'), whereas NoSQL databases are built with dynamic schema for unstructured data.

Going further -- lightweight table-based SQL databases exhibit vertical scalability, whereas your common or garden document (or graph, or wide-column store) based NoSQL databases exhibit horizontal scalability.
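The schema difference described above is easy to demonstrate. Here is a minimal sketch using Python's built-in sqlite3 for the SQL side, with plain dictionaries standing in for schemaless NoSQL documents:

```python
import sqlite3

# SQL side: the schema is declared up front and enforced on every insert.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("INSERT INTO users (id, name) VALUES (1, 'Ada')")

try:
    # Violates the NOT NULL integrity constraint -- the database refuses it.
    conn.execute("INSERT INTO users (id, name) VALUES (2, NULL)")
except sqlite3.IntegrityError as err:
    print("rejected:", err)

# NoSQL-style side: documents in the same collection can take any shape,
# so new fields appear without a schema migration.
documents = [
    {"id": 1, "name": "Ada"},
    {"id": 2, "nickname": "Grace", "tags": ["admin"]},  # different fields, no error
]
print(len(documents))
```

The trade-off is the one the article goes on to describe: the rigid schema buys you integrity guarantees; the dynamic one buys you agility with unstructured data.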

NoSQL has proved popular with 'modern' web centric companies, but less popular in 'conservative' industries such as banking and manufacturing where SQL reigns.

You can guess the next part

In the meantime then, the debate goes on and (you can guess this next part) the NoSQL specialists will tell us they are more agile and the SQL specialists will tell us they are robust and reliable.

With input from engineering teams at Red Hat and Google, SkySQL has just released its latest MariaDB Enterprise product range, based on the new MariaDB 10 code.

This, claims the company, means that the product combines NoSQL with SQL technology together.

MariaDB Enterprise 2 and MariaDB Enterprise Cluster 2 editions expand SkySQL's vision of a high performance SQL platform suited to complex web-scale challenges.

The big trade off

SkySQL CEO Patrik Sallner explains the trade off here:

"SQL databases like MariaDB remain crucial to almost every enterprise because they can reliably convert real-world business transactions into grouped multi-step operations for consistent data manipulation. NoSQL solutions are simple to use and so popular with developers but they lack business critical features, like ensuring data consistency. Until now, enterprises have been forced to select robust SQL databases for some data loads and less mature solutions from NoSQL vendors for others, leading to integration and support issues."

Sallner's firm claims to have combined the best of both approaches.

Data-centric developers want the assurance of features that ensure data consistency at all times with the agility of handling very large, unstructured NoSQL datasets.

Could this be it?

"The availability of an enterprise-grade SQL database platform with NoSQL interoperability is a game changer for developers building serious revenue-generating applications and DBAs that run large, complex data environments," said the company, in a press statement.

NOTE: The announcement of these new commercial products coincides with the release of the open source database server MariaDB 10 -- and SkySQL is the biggest contributor to the MariaDB project, both in terms of resourcing and code.

So does it work?

Red Hat CTO Brian Stevens has said that Red Hat has included MariaDB in its Red Hat Enterprise Linux 7 beta.

SkySQL insists that no other SQL database solution can reliably deal with the latest wave of applications (which support massive numbers of users and volumes of data) across today's mix of mobile apps, gaming and e-commerce platforms.

The case is stated; at length the truth will out.

MarkLogic champions 'semantic context' for NoSQL data insight


NoSQL database platform company MarkLogic is having a successful period of growth and says that MarkLogic release 7 is marked out for its elasticity, tiered storage and semantics capabilities.

Semantic schemantic -- what could that mean?


Company CEO Gary Bloom is bullish about his firm's growth in Asia Pacific and says that MarkLogic Semantics (Ed - CAPS S, it's branded don't ya know?) is now included and works for data insight, as it helps deliver "contextually-relevant" information to users.

A chemical formula for data insight

David Leeming is strategic innovation group solutions manager at the Royal Society of Chemistry.

Leeming explains how semantic context has helped his organisation consume (without severe indigestion) over one million digitised pages of written word and chemical formulae into XML.

After the formulae had been moved to XML, the chemists said they needed to get more out of the XML and that a relational database model would not fit their needs.

"MarkLogic is more than just a NoSQL database, it has an extremely powerful search engine to enable logical associations between different types of content, helping us launch new online journals very quickly and grow our publishing output from 6,000 articles a year to over 30,000 a year today."

Stemming from open source roots, the company now offers a free developer license and "cloud-ready" hourly pricing for Amazon Web Services.

Don't say big data, say big data in motion


What is real time data anyway?

As you will know, in computing terms we talk about real time processing (or perhaps "computer responsiveness") as being that level of compute power and speed such that users perceive that the systems they use are operating at the same speed as human (or indeed machine-based) life events.


Real time data now sits alongside real time information and data-in-motion (as opposed to data-at-rest, such as non-active databases, tape backups etc.) to form the new always-dynamic nature of data today.

Traditionally we have talked about "three states of data", such that we also incorporate data-in-use, but the degree to which we focus on real-time analytics now is arguably eroding the definitions of the past, i.e. almost all data is potentially data-in-use now.

So then, logically, big data needs more data-in-motion tooling right?

This is clearly the mindset exhibited by secure data company Zettaset.

The firm has recently announced the addition of data-in-motion encryption as a new feature in its Zettaset Orchestrator management and security add-on application for Hadoop.

What is data-in-motion encryption?

It is intended to provide organisations with an additional layer of protection for their Hadoop clusters and sensitive data, eliminating access by unauthorised users.

How does data-in-motion encryption work?

Orchestrator data-in-motion encryption ensures that all networking connections to the Orchestrator web-based console are completely secure within a Secure Sockets Layer (SSL) tunnel.

Can we go deeper?

Communication links between all cluster nodes are encrypted and authenticated to eliminate the possibility (claims Zettaset) of unauthorised access to data by anyone within the corporate network or Hadoop cluster.
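In Python terms, the client side of an encrypted, authenticated link of this kind looks something like the sketch below. The console hostname is invented for illustration; the point is that with Python's ssl defaults, certificate verification and hostname checking are both switched on:

```python
import ssl

# A client-side TLS context of the kind that would secure a connection
# to a management console (hostname below is purely illustrative).
context = ssl.create_default_context()

# Certificates are verified and hostnames checked by default, so an
# unauthorised party on the network cannot silently impersonate the server.
print(context.verify_mode == ssl.CERT_REQUIRED)  # True
print(context.check_hostname)                    # True

# Wrapping a socket would then encrypt all data in motion, e.g.:
# with socket.create_connection(("console.example.com", 443)) as sock:
#     with context.wrap_socket(sock, server_hostname="console.example.com") as tls:
#         tls.sendall(b"GET / HTTP/1.0\r\n\r\n")
```

Authentication of both ends plus encryption of the channel is what closes off the "anyone on the corporate network" attack that Zettaset describes.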

Why is big data security on Hadoop an issue?

Gartner's Bhavish Sood says that Hadoop certainly wasn't built with enterprise IT environments in mind, "because there is a shortage of robust security controls in Hadoop."

"Orchestrator data-in-motion encryption for Hadoop represents the next phase of Zettaset's Big Data encryption technology initiative, which began with the release of data-at-rest encryption in late 2013," said Jim Vogt, CEO of Zettaset.

The Java parallelism paradox


Java 8 has been reengineered with the strength of Project Lambda.

Lambda adds 'closures and related features' to the Java language, dedicated to supporting multicore programming.


A Lambda expression is code that a developer can "share" to be executed later, just the once or executed multiple times -- hence, more control exists for parallelism over multicore.

This is important because, as Herb Sutter said in Dr Dobb's Journal back in 2005, we are at a "fundamental turning point in software development" -- this was in his piece entitled The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software.

Up until Java 8, this is what Oracle's Java Tutorial page had to say about parallelism:

Parallel computing involves dividing a problem into sub-problems, solving those problems simultaneously (in parallel, with each sub-problem running in a separate thread), and then combining the results of the solutions to the sub-problems.

So looking back, Java SE provided the fork/join framework, which enables developers to implement parallel computing in applications.

However, with this framework, the programmer must specify how the problems are subdivided (partitioned).
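The manual partitioning that fork/join demands can be illustrated in Python -- a sketch of the divide-solve-combine pattern, not Java's actual API:

```python
from concurrent.futures import ThreadPoolExecutor

def sum_of_squares(chunk):
    """Solve one sub-problem: sum the squares of a slice of the data."""
    return sum(n * n for n in chunk)

data = list(range(1, 101))

# The programmer partitions the problem by hand (as with fork/join),
# runs the sub-problems in separate threads, then combines the results.
chunks = [data[i:i + 25] for i in range(0, len(data), 25)]
with ThreadPoolExecutor(max_workers=4) as pool:
    partial_sums = list(pool.map(sum_of_squares, chunks))

total = sum(partial_sums)
print(total)  # 338350 == 1^2 + 2^2 + ... + 100^2
```

Everything about the split -- the chunk size, the worker count, the combining step -- is on the programmer; this is the ceremony Java 8's approach aims to remove.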

With Java 8, Oracle explains that it is taking some of the dirtier mechanics out of the equation and essentially making the whole route to parallelism easier to get to.

Java SE vice president Georges Saab explained to reporters in Prague this week that he is thinking about parallelism and concurrency and how the Java team now works with developers to exploit the potential here with Lambda.

Saab notes that you could achieve closures in Java previously via anonymous inner classes (the job Java 8's lambdas now do), so this will not be completely alien to developers, who may well have been using parallelism elsewhere as well as the Fork/Join framework tools (JSR166y).

The proposition with Lambda is that it takes some of the mechanics out of this process and makes it more automated.

Going back further, vice president of development for the Java platform Nandini Ramani points out that Java has a history of parallelism: Phasers were introduced with Java 7 (JSR166) and, further still, all the way back to Java 1.4 there was java.lang.Thread, if you want to do the history.

Plus let's also not forget... Lambda (as a working project with builds) has been around since JDK 7.0 so it has had time to bake.

Whichever way you slice it, Java is ready (or more ready at least) for parallelism and the free (developer) lunch is indeed over.

Java 8 grasps parallelism (more tightly) with Lambda Expressions


Oracle's Java coding cognoscenti gathered in the Czech capital of Prague this week to analyse, present and postulate over the arrival of the Java 8 language.

Now, some 20 years after Sun Microsystems first envisioned and built the Java language and platform under the eye of James "father of Java" Gosling, we find ourselves in a postmodern Oracle world of Java development.


Java 8 has been reengineered and the most significant enhancement in this release is Project Lambda Expressions for the Java Programming Language.

To be more complete, the standout features of JDK 8 are Project Lambda, the Nashorn JavaScript engine and a new Date and Time API.

Top of the pile is Lambda, which adds 'closures and related features' to the Java language, dedicated to supporting multicore programming.

Image credit note: this image comes from the very excellent TAKIPI BLOG.

What are Lambda expressions?

A Lambda expression is described as a block of software code that a developer can "pass around" in order for it to be executed later, executed just the once or indeed executed multiple times -- hence, more control exists for parallelism over multicore.
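Java syntax aside, the underlying idea -- a block of code as a value you can pass around and run later -- can be sketched in Python, where functions are first-class values:

```python
# A function is a value: it can be created, handed around, and only
# executed later -- once, or many times.
def make_greeter(name):
    return lambda: f"hello, {name}"

deferred = make_greeter("Java 8")  # nothing runs yet
print(deferred())                  # executed later...
print(deferred())                  # ...and executed again

# Passing behaviour into another function, much as a Java lambda is
# passed to a method that expects a functional interface:
results = list(map(lambda n: n * 2, [1, 2, 3]))
print(results)  # [2, 4, 6]
```

It is this "behaviour as data" quality that lets a runtime decide for itself when, and on how many cores, a piece of work actually executes.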

While some commentators have said that Oracle's take on Lambda for Java is not as polished as it might be (and that standard Java imperative style performs much better), Java 8 should not be denigrated or disparaged just for being new.

Java SE vice president Georges Saab told reporters in Prague that yes, there will be refinements over time and that performance will improve.

Java 8 with Lambda has been widely heralded as the most significant update to the language since the introduction of Generics in Java SE 5.0 way back in 2004.

NOTE: Java Generics are meant to add stability to code by making bugs detectable at compile time. Oracle points out that runtime bugs can be much more problematic; they don't always surface immediately, and when they do, it may be at a point in the program that is far removed from the actual cause of the problem.

The Java 8 release is substantial and spans outward to incorporate the Java Platform Standard Edition 8 (Java SE 8), Java Platform Micro Edition 8 (Java ME 8) and connected releases of Java Embedded products.

So more specifically then, Oracle is also announcing Oracle Java SE Embedded 8, which uses the new Java SE 8 features and is optimised for mid- to high-end embedded systems.

With this release Oracle is aiming to converge Java ME and Java SE to engender a more consistent developer experience and more code re-use across the platforms.

"Java is the global standard for developing and delivering applications everywhere - from small devices to the cloud. The convergence of Java SE 8 and Java ME 8 is intended to provide a consistent development environment that improves developer productivity and application performance, along with the ability to 'right-size' the platform for deployment across an even wider range of use cases," said Nandini Ramani, vice president of development, Java Platform, Oracle.

Ramani spoke to journalists in Prague to explain that as Java now converges, we get to a new and more positive point in terms of being able to program based on device constraints (with mobile and Internet of Things devices in mind) rather than based upon particular functions in the language.

How will developers learn Java 8?

Oracle is also conducting a worldwide tour of Java User Groups (JUGs) and plans to deliver educational events for nearly sixty JUGs in over twenty countries on six continents.

An updated Java SE 8 training curriculum will be available soon from Oracle to help developers transition to Java SE 8 and implement the latest platform enhancements.

Why did we title this piece Java 8 grasps parallelism (more tightly) with Lambda? You may ask...

... this is because Java already has a history steeped in parallelism but that's another story.


DevOps is a real job, it's official

bridgwatera | No Comments
| More

The 'developer' plus 'operations' DevOps role is now an official part of the tech industry's nomenclature.

The number of permanent and officially recognised DevOps Engineer posts in the UK has jumped 347% in the past two years.

This news comes alongside the discovery that 'DevOps Engineer' has jumped 222 places in a list of IT jobs skills/job titles, making it one of the fastest growing skills needed by the industry [Source: IT Jobs Watch].

Open cloud company Rackspace has supplied these "findings" and says that by automating infrastructure, workflows and continuously measuring application performance, DevOps has become an important business process.

NOTE: The top technology skills DevOps Engineers currently need are Linux (79%), Puppet (60%), Chef (47%), Python (44%) and Ruby (42%), according to job advertisements for the position for the six months to March 2014.

Rackspace has a DevOps offering and launched its global DevOps Automation Service in late 2013 from the US -- a UK DevOps Automation Team has also just been launched.

The firm says that the service is designed to help developers automate the process of deploying and scaling hybrid cloud infrastructures for fast-growing applications.


NoSQL or RDBMS: who will win the database war?

bridgwatera | 3 Comments
| More

This is a guest post for Computer Weekly Open Source Insider written by Sandor Klein of EDB, a provider of enterprise-class products and services based on PostgreSQL.


An open (data) battlefield

The NoSQL and relational database camps emerged fighting in 2013, and the once peaceful niche of data management has transformed into an open battlefield.

As businesses look to cope with the challenge of big data, both camps claim to offer the best approach when it comes to managing the data deluge.

Selecting the right database to suit your requirements has become tricky amidst a noisy backdrop of claims and counterclaims from the market's major players on both sides of the divide.

As the battle continues to intensify into 2014, what hope is there for the relational camp when it comes to convincing CIOs that their solutions are more than a match for the rapidly maturing NoSQL solutions that have sauntered into the marketplace? As it happens, plenty.

Noisy Mongo

The likes of CouchDB, Redis and MongoDB have certainly made a lot of noise to attract attention in 2013, but advances made by relational databases such as Postgres have continued apace.

Relational systems have been very good at learning the lessons that the NoSQL systems presented. By tooling up in areas such as JSON for semi-structured web-based documents and fast data lookups through HStore, relational solutions are now in a position to offer precisely the same core capabilities as NoSQL vendors.

This is in addition to meeting standard transactional requirements -- which is not to say there is no role to play for NoSQL systems. Indeed many organisations have found that NoSQL technology is very well-suited to some types of big-data applications, but not others.

What NoSQL is good at

NoSQL's strength lies in crunching vast amounts of data, not in serving real-time decision-making. The key is for organisations to use the right technology for every application.

When a technology triggers an ideological divide as NoSQL has, there is a tendency to take sides. What's important is for organisations to take a more enlightened view, overlooking the hype and making a decision that will ultimately derive maximum value from their database investments.

Microsoft: open source developers not yet fully recognised by Microsoft MVP Programme

bridgwatera | No Comments
| More

Microsoft continues to claw its talons into the open source space this week by announcing that developers who run, manage, or commit to large and highly influential open source projects will now be "formally recognised" by the company's MVP Award Program.


Microsoft code guru Scott Hanselman has posted excerpts from an internal company memo that specifies, "Currently there is a class of developer influencers whose contributions are not yet fully recognized by the Microsoft MVP Program."

NOTE: The Microsoft Most Valuable Professional (MVP) Award recognises exceptional, independent community leaders who share their passion, technical expertise, and real-world knowledge of Microsoft products with others.

Hanselman quotes more from Microsoft's statement on open source and the MVP programme saying that the company now recognises community developers as influencers who run, manage, or commit to large and highly influential open source projects.

"However since they do not participate in what is considered as more traditional ways such as speech engagement, books, online forums or user groups, they are not usually considered as potential MVPs. Often these developers have technical community followings but may not necessarily be on message. As a result, there is a belief amongst some influencers that Microsoft does not support open source software."

But things have changed...

"As we move forward, we will change the MVP guidelines to recognize open source activities with the same weight as other community activities."

Hanselman himself has said that he had envisaged Microsoft creating a new "Open Source MVP" as a formalised badge or label.

"I pushed that for a while but realised quickly that it would create an island of OSS MVPs, and no one group would claim them. Better to push OSS throughout the whole program so everyone shares a home," he said.

Microsoft has rounded out the discussion by saying that the MVP Award Program will recognise open source activities in order to promote further growth and support of the technical communities.

... and specifically, the company states that:

Microsoft will work to fine tune the MVP guidelines to recognise open source activities with the same weight as other community activities.

Can a leopard change its spots?

Is Microsoft really just still saying that the only valuable open source is that which extends Microsoft products in its own view?

Sony open sources games tools framework: woos developers to Playstation land

bridgwatera | No Comments
| More

Sony is making its Authoring Tools Framework product available to programmers for free under the Apache 2.0 open source license.

The Authoring Tools Framework (ATF) will be free to download and use.

Industry commentators have suggested that this may now create an incentive for independent third-party developers to start developing games (or indeed porting existing games) to the PlayStation.

The ATF itself comprises a set of C#-authored software components used to build games.


Sony itself has used the ATF to build tools including the level editor for "The Last of Us", Naughty Dog's action-adventure survival horror video game.

The Sony ATF has been in "continuous development" since 2006.

"We're looking forward to expanding ATF's usage beyond SCE, working more with external PlayStation developers and the larger game development community as a whole," said Sony Computer Entertainment America (SCEA) principal tools programmer Ron Little.

"Now that ATF is open-source, we're excited to see the ways developers use the toolset, which could expand beyond games."

There are many types of components in ATF. ATF's Managed Extensibility Framework (MEF) components can be added to a .NET TypeCatalog with a single line of code, as shown in most of the ATF sample applications, such as \Samples\CircuitEditor\Program.cs.

Other components, like the DOM, are really a collection of related classes.

Mozilla VP: Firefox for Microsoft Metro would be a mistake

bridgwatera | No Comments
| More

Mozilla has canned plans to build a Firefox browser that aligns specifically to Microsoft's so-called 'Modern Design' approach, the design approach formerly known as Metro.

Microsoft Metro (sorry, Modern Design) remains a key component of the Windows 8 optimised-for-touch experience and the user interface presentation has its fans and its opponents.

Firefox VP Johnathan (correct spelling) Nightingale has confirmed that he has asked his engineering leads and release managers to take the Windows Metro version of Firefox "off the trains", as he puts it.

"Off the trains" is Mozilla release jargon: features ride a series of scheduled release trains from Nightly builds through to final release, and Metro Firefox has now been taken off them.

"The team is solid and did good work, but shipping a 1.0 version, given the broader context we see for the Metro platform, would be a mistake," he blogged.

Mozilla's Nightingale says that he doesn't want the team to work on the Microsoft plan because they "need to focus on the projects with the most impact for our mission" today.

"In the months since, as the team built and tested and refined the product, we've been watching Metro's adoption. From what we can see, it's pretty flat. On any given day we have, for instance, millions of people testing pre-release versions of Firefox desktop, but we've never seen more than 1000 active daily users in the Metro environment," said Nightingale.

Windows 8 users who prefer Firefox to Internet Explorer don't need to panic -- the non-Metro (desktop) version of Firefox still works well in Windows 8 environments and will continue to be developed.

The codebase is off the trains and off the rails, but will live on at http://hg.mozilla.org/projects/metro for those who wish to retain an attachment to it.

Image below: Firefox Australis, the next Firefox


March of the penguins: how we migrate from UNIX to Linux

bridgwatera | No Comments
| More

This is a guest post for Computer Weekly Open Source Insider by Adam Leventhal, CTO at Delphix, a company that helps virtualise database infrastructures to accelerate application rollouts and development.

The IT industry is in the midst of a mass platform migration.


Gartner anticipates that, three years from now, 65% of the applications that were running on UNIX in 2012 will have migrated to Linux.

Now... while the courts have given HP-UX a reprieve from Oracle's death sentence, IT organisations know not to put their faith in a product on death row.

Relational opportunity

In the drive to modernise, applications built around relational databases represent the greatest opportunity for return, and the greatest challenge.

Multiple CIOs have told me they planned to modernise all of their databases, but -- constrained by cost and time -- only managed to tackle the 20% that yielded 50% of the value. The remaining 80% remains a major drag to IT; we're kicking the can down the road.

UNIX platforms such as Solaris on SPARC and AIX on Power store data in a different format than Linux on x86 (big-endian versus little-endian).
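That endian gap can be shown in a few lines (the value here is invented for the example): the same four bytes decode to completely different numbers depending on byte order, which is why every multi-byte field in a data block has to be byte-swapped during a SPARC-to-x86 move.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianSketch {
    // Decode the same four bytes under a given byte order.
    static int decode(byte[] bytes, ByteOrder order) {
        return ByteBuffer.wrap(bytes).order(order).getInt();
    }

    public static void main(String[] args) {
        // The value 42 as written by a big-endian platform (SPARC, Power).
        byte[] onDisk = {0x00, 0x00, 0x00, 0x2A};

        // Read back the big-endian way: 42.
        System.out.println(decode(onDisk, ByteOrder.BIG_ENDIAN));

        // Read back the little-endian (Linux on x86) way: 704643072.
        System.out.println(decode(onDisk, ByteOrder.LITTLE_ENDIAN));

        // A conversion tool must swap the bytes of every such field
        // to recover the original value on the new platform.
        System.out.println(Integer.reverseBytes(
                decode(onDisk, ByteOrder.LITTLE_ENDIAN)));  // 42 again
    }
}
```

Multiply that by every integer, timestamp and offset in every data block and the scale of a real database conversion becomes clear.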

Complex and cumbersome conversion

The Oracle database comes with tools to execute a conversion between platforms, but the process is still so complex and cumbersome that few organisations attempt it without help from specialised consulting practices.

Editorial note: It is at this point that Leventhal mentions certain analysts who are beginning to look at technologies like Delphix's own Modernisation Engine -- this product is specifically designed to facilitate and automate the conversion of Oracle databases from UNIX to Linux.


Now, we're moving to a model in which we can use virtualisation technology to provision copies of the original, production Oracle databases on UNIX platforms or automate a conversion process that can be tested and tuned to stamp out converted databases automatically.

How to convert a database

Converting a database typically involves modifying 90% of the data blocks, changing the content to work on the new platform.

Even with advanced storage platforms, converting a 1TB database results in a copy that is itself nearly 1TB.

Data agility tooling brings end-to-end awareness of data and storage, fitting even converted databases into a tiny storage footprint. Removing the storage constraint means that modernisation efforts operate at lower risk, always testing with fresh, full datasets.

Datacentres are populated with lumbering legacy apps; migrations have been slow, whether to consolidated datacentres, hybrid clouds or into retirement.

Economics has increased the pressure to leave legacy platforms behind. Fortunately, advances in data agility and virtualisation have come just in time to transform the datacentre.

About the author...

Adam Leventhal is a co-inventor of DTrace. At Sun Microsystems, he worked in the Solaris kernel group and was a founding engineer in Sun's Fishworks group. He currently sits on the illumos developer council, and has 11 patents filed or pending.

9 ways for programmers to sleep better

bridgwatera | No Comments
| More

Can anyone code proficiently or architect a software masterpiece without enough sleep?

Answer: no, of course not.


This Friday March 14 is the 7th Annual World Sleep Day...

... this is where members of the World Association of Sleep Medicine come together and educate the world on the importance of getting enough sleep.

A total of 21% of adults get less than six hours of sleep each night says the organisation.

Cathy Beggan, founder and CEO of Rise-N-Shine, has come up with nine tried and true ways for software application development professionals to get a good night's sleep.

1. Avoid chocolate, coffee and red wine in the evening hours. These foods have been known to disturb sleep patterns and digestive tracts.

2. Have an early dinner. Make sure dinnertime meals are finished before 7pm or at least three hours before bedtime.

3. Establish a sleep cycle. It's important to notice when the body starts to get tired and to adjust bedtime accordingly.

4. Keep the lights off. Rays of light from nightlights, hall lights, bathroom lights or even TVs can disturb a natural sleep pattern.

5. Relax. Turning the mind off at night is an important component of quality sleep. Yoga and meditation can help turn the mind off after an action-packed day.

6. Exercise in the morning. Exercising at night can release endorphins that can keep the body awake for longer periods of time. Releasing these endorphins in the morning or during the day can help give the body more energy when it's needed most.

7. Have a nighttime routine. Once a more concrete sleep cycle has been established, creating a nightly routine can help prepare the body for rest.

8. Read a book. Watching TV in bed can sometimes stimulate the mind, so choose something that will wear the mind out, like reading.

9. Close your eyes. Sometimes the first step to falling asleep is closing your eyes and allowing yourself to rest.

Rackspace #bigdatabreakfast, you had me at bacon and cloud

bridgwatera | No Comments
| More

Rackspace describes itself as the 'open cloud' company no less.

Open enough then to host a cloud-enriched big data breakfast in London this month.


The #bigdatabreakfast (doesn't your breakfast have a hashtag?) saw representatives from Rackspace itself sit alongside suits from EMC, MongoDB, HortonWorks and DataStax.

This was a morning of occasionally great soundbites and a few over-practiced over-'media trained' howlers.

"Big data is the new currency," said one of the spokespeople. Yeah right.

More thought provoking was Berne Kaponig of EMC who said that big data is the 'avant garde element' of data management.

Rackspace itself had an admirable take on where big data is headed (it looks after a lot of it after all) and alluded to the tipping point that we now find ourselves at...

... that is to say, the firm has said before now that IT skills sets as recently as 18 months ago were not high enough for us to exploit big data to the full.

But now, in 2014, the skills gap is still there "but no longer huge" says Rackspace head of technical product strategy, Toby Owen.

Rackspace's Owen contends that the basic skill sets are there today and that the tooling is "getting better and easier to use" as we now form a new fulcrum between BI and big data itself.

Owen also suggests that big data platforms (as a means of compute processing power) are becoming more straightforward for the end user in operational terms.

The Rackspace man pointed in particular to where we are today with MongoDB: it started back in 2007, grew slowly as skills developed and is now one of the top 10 databases on planet Earth.

EMC commentary was stronger than some others at this event; the company stated that enterprise class data management is about to become more mainstream.

A couple more for you...

EMC mentioned that we are on the verge of bringing latency down one more step i.e. the cloud is getting more "pragmatically real" in terms of implementations today.

Rackspace also said that analytics & integration inside the vertical stack is where the opportunity is.

Bacon sandwiches, cloud and insight then - what's not to like?

Gastronomic disclosure: No HP sauce, Daddies or other 'brown condiment' was provided.

