Opening Box to open source

bridgwatera

Online file sharing and cloud content management company Box is showcasing its now well-populated open source repository.

As a company Box has always come across as 'open enough' to third-party projects, partnerships and promotions -- but now an appreciation for open programming steps forward in more solid terms.


The logically named http://opensource.box.com/ is, as it says on the tin, an open source repository of Box-connected metadata, emerging projects, core content and SDKs for Android, iOS, Windows and Java.

Box says it relies on open source software every day.

"That's why we give back to the open source community whenever possible, by contributing code to outside projects and sharing projects that we've developed internally," says the firm.

Example projects include:

Rain Gauge -- a tool to simplify the process of collecting detailed information from MySQL database servers when specific conditions are triggered.


Flaky -- a plugin for nose that automatically reruns flaky tests. Instead of skipping flaky unit tests for components that aren't 100% reliable, use flaky to automatically retry them.
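For the uninitiated, a minimal sketch of flaky in use, following the decorator API in the project's documentation:

```python
import random

from flaky import flaky

# Rerun this test up to three times; it passes if any single run succeeds.
@flaky(max_runs=3, min_passes=1)
def test_unreliable_component():
    assert random.random() > 0.3  # a deliberately unreliable assertion
```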

Box's Benjamin VanEvery blogged this month to say, "Open source has been a part of the Box technology stack since the company's earliest days. Technologies including Apache, nginx, PHP and their peers have been critical to Box's success and to the technical revolution of web software and platforms as a whole. Today, we are very excited to announce Box Open Source, Box's formalized open source initiative committed to giving back to the community. Our engineering teams have contributed 20 open source projects, all showcased on opensource.box.com."

A proactive approach to open source governance

bridgwatera

This is a guest post for the Computer Weekly Open Source Insider blog written by Lacey Thoms, a marketing specialist and blogger at software analysis and code attributes management company Protecode. Lacey has a Bachelor's degree in Mass Communications from Carleton University and has written many articles on open source software management.

Omnipresent openness


Open source software has become an omnipresent and major driver of software activities worldwide. Many organisations, from small start-ups to large multinationals, are using open source code to accelerate development and reduce costs.

As open source adoption increases, the processes for managing open source code and its associated license obligations, security vulnerabilities and export content are maturing. The days of manually auditing the code before the product ships are giving way to more proactive, cost-effective approaches.

The modern approach to adopting open source software is similar to the one used for any other third-party software: uncover all external code used in a project and identify its license and copyright attributes, as well as any security vulnerabilities or encryption content associated with the code.

Like other quality assurance processes, it is best to start managing open source governance in the early stages of development.

Organisations are beginning to take more proactive steps towards managing open source software licenses, beginning with an established open source policy and a defined workflow process that can reject any packages that violate the policy before the developer is permitted to use the code. From there, organisations follow practices that can detect and flag violations as the code is brought into a developer's workspace.

Creating an open source policy

The first stage of implementing an open source governance process is to draft an open source policy. The policy regulates the governance process, identifies the stakeholders within the organisation, and outlines acceptable attributes, such as open source licenses and communities. The open source policy is drafted with input from all the relevant stakeholders in the organisation.

Typically an open source committee consists of representatives from legal, R&D, and product management. An open source policy also includes a workflow for requesting and approving open source packages that can be used in specific projects or within the entire organisation and defines the course of action once an open source policy violation is suspected.

Implementing a pre-approval workflow

A good open source policy puts emphasis on catching open source governance issues at the earliest stage of development, therefore vastly reducing the time and effort involved in remedying them. An important element of any solid open source policy is a package pre-approval process. In essence, this process is a series of actions that allows anyone to request a certain open source package to be used in a project. Through a streamlined workflow process, a licensing person can approve or reject the requests based on the available information about the project, how the package is to be used in the project, and the open source package attributes.

So, what does a package pre-approval workflow entail?

First, developers must submit a request including details such as the package's name, a link to the code, and information such as version, authors, and the license cited on the site or specified in the package. Other information, such as known open source security vulnerabilities and the presence of encryption content in the package, will help the compliance examiner streamline the approval process. Another important item accompanying the pre-approval request is a description of how the package is going to be used in the product, including whether the code will be modified, redistributed, or used only internally.
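To make that concrete, here is a minimal, purely illustrative policy-check sketch -- the field names and the approved-license list are hypothetical, not a description of any vendor's product:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch: license names and policy rules are illustrative only.
public class PreApproval {

    private static final List<String> APPROVED_FOR_REDISTRIBUTION =
            Arrays.asList("MIT", "BSD-3-Clause", "Apache-2.0");

    // A request carries the package name, its declared license and the intended use.
    static boolean preApprove(String pkg, String license, boolean redistributed) {
        if (!redistributed) {
            return true;  // internal-only use: lower risk, auto-approved in this sketch
        }
        return APPROVED_FOR_REDISTRIBUTION.contains(license);
    }

    public static void main(String[] args) {
        System.out.println(preApprove("jquery", "MIT", true));       // true
        System.out.println(preApprove("readline", "GPL-3.0", true)); // false under this policy
    }
}
```

A real workflow adds the human review described next; the point is that the request carries enough structured information for an automated first pass.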

After the request is submitted, an administrator (usually someone from the open source committee) can review the request. Typically, a combination of manual research and automated open source scanning tools is used to confirm and identify licenses, obligations, copyrights, open source security vulnerabilities, and encryption properties of the requested package. At this stage, the licensing person will review the license obligations and other properties of the requested package against the organisation's policy, taking into consideration how the developer intends to use the package.

If there are no conflicts with the organisation's open source policy, the administrator can approve the package. Once a software package is approved, it is then logged and made available to the specific product groups or the whole organisation. A record of the approved packages is made available so that developers can readily use these pre-approved components in the future.

Software package pre-approval can be added to existing open source management processes to further improve governance. Organisations that scan code at regular intervals (e.g. daily, weekly, monthly), as well as those with a continuous scanning process in place (scanning in real time as code is brought in by developers), will benefit from a package pre-approval process. Package pre-approval speeds up continuous scanning because code can be approved before it enters the development environment.

The result is fewer files that need to be scanned, which speeds up the overall scanning process.

A proactive approach

To get the most benefit out of open source code, organisations are turning to more sophisticated practices for open source governance. As with other software lifecycle management processes, automated solutions for package pre-approval exist that significantly reduce the time and effort spent on open source governance and increase the accuracy of the results.

Information that these processes detect includes permission from the owner to use the software, any known deficiencies such as bugs or security vulnerabilities, and any other information pertinent to the business, such as exportability.

A package pre-approval workflow process, when combined with automated open source scanning, is an effective part of managed adoption of open source software, allowing organisations to reduce their development costs and speed up delivery times using quality third-party software. With a proactive approach to open source management, organisations can harness the benefits that open source has to offer, while creating a streamlined process to avoid challenges associated with open source software.

Intel beefs up open source Raspberry Pi challenger and slashes price

bridgwatera

Intel has beefed up its open source single-board computer and cut its price in half.

The MinnowBoard Max features an open hardware design and is targeted at software application development pros and enthusiasts who want to code for the "deeply embedded" market.

Intel has slashed the MinnowBoard price from US$199 to $99 (£60) -- although distributor prices vary and not all have reflected the price reduction at the time of writing.

[Image: MinnowBoard Max]

System-on-Chip (SoC)

Along with the price cut comes a more powerful Atom processor (64-bit Intel Atom E38xx Series SoC), integrated Intel HD graphics and a smaller overall footprint for the machine itself.

The MinnowBoard Max runs Debian GNU/Linux or Android 4.4 -- it is also Yocto Project-compatible.

NOTE: The Yocto Project is an open source collaboration project that provides templates, tools and methods to help create custom Linux-based systems for embedded products regardless of the hardware architecture.

Occasionally touted as a "Raspberry Pi challenger", Intel's Minnowboard Max is arguably better suited to the professional and hobbyist engineering space and is unlikely at this stage to make inroads into Raspberry Pi's popularity inside the education sector -- although that success in itself has been questioned.

The Intel graphics chipset included here comes with open source drivers, so software developers and hackers can really play with a wide range of possible use cases for this machine.

The MinnowBoard Max has two USB ports (one of them now USB 3.0) and a microSD slot, as well as 10/100/1000 Ethernet.

If the Raspberry Pi and its Broadcom SoC with 700MHz ARM processor have won over many of the hackers in this zone recently, then many would argue that Intel has placed Atom right back in the interest zone with these updates.

The software-defined business cometh

bridgwatera

Somebody had to say it, so it might as well have been an Application Performance Management (APM) company that did so.

Having just about got our heads around Software Defined Networking (SDN) -- where system management is decoupled from hardware to be given over to the control offered by a software application called a controller -- we now learn that the software-defined business cometh.

APM vendor AppDynamics has been busy attempting to coin this buzz-phrase in the context of its core solution.

The firm's core offering has (cue news item) just been augmented with enhancements for apps built in Java, .NET, PHP and mobile; new support for Node.js and Scala apps; and Big Data stores based on increasingly popular NoSQL databases.

This software is intended to monitor, manage and analyse complex software environments in real time and in production.

AppDynamics' Spring 2014 Release includes new support for NoSQL Big Data stores, including MongoDB, Hadoop, Couchbase and Cassandra, through its extensible API framework.


"This goes way beyond monitoring -- it's true application intelligence," said Jyoti Bansal, founder and CEO of AppDynamics

... and that, if you buy the sizzle on this particular sausage, is some of what makes a software-defined business, i.e. a firm that can trace its core operational vital signs down to functions and controls that are dictated (or at least managed) by software.

Has SkySQL MariaDB pulled off the NoSQL + SQL combo challenge?

bridgwatera

There is something of a war of words (and code) going on between the NoSQL and SQL database camps.

Some of it is merely flak; both approaches have their benefits.

SQL databases are marked out for their predefined schema (the structure of the database that describes its construction and basic 'integrity constraints'), whereas NoSQL databases are built with dynamic schema for unstructured data.

Going further -- lightweight table-based SQL databases exhibit vertical scalability, whereas your common or garden document (or graph, or wide-column store) based NoSQL databases exhibit horizontal scalability.
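A generic illustration of that schema difference (nothing MariaDB-specific here):

```sql
-- SQL: the schema is declared up front and every row must conform to it
CREATE TABLE customers (
    id    INT PRIMARY KEY,
    name  VARCHAR(100) NOT NULL,
    email VARCHAR(255)
);
```

In a document store, by contrast, each record carries its own structure and the fields can vary freely from one document to the next:

```json
{ "id": 1, "name": "Ada", "orders": [42, 97], "loyalty_tier": "gold" }
```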

NoSQL has proved popular with 'modern' web centric companies, but less popular in 'conservative' industries such as banking and manufacturing where SQL reigns.

You can guess the next part

In the meantime then, the debate goes on and (you can guess this next part) the NoSQL specialists will tell us they are agile and flexible and the SQL specialists will tell us they are solid and robust.

With input from engineering teams at Red Hat and Google, SkySQL has just released its latest MariaDB Enterprise product range, based on the new MariaDB 10 code.

This, claims the company, means that the product combines NoSQL and SQL technology.

The MariaDB Enterprise 2 and MariaDB Enterprise Cluster 2 editions expand SkySQL's vision of a high-performance SQL platform suited to complex web-scale challenges.

The big trade off

SkySQL CEO Patrik Sallner explains the trade off here:

"SQL databases like MariaDB remain crucial to almost every enterprise because they can reliably convert real-world business transactions into grouped multi-step operations for consistent data manipulation. NoSQL solutions are simple to use and so popular with developers but they lack business critical features, like ensuring data consistency. Until now, enterprises have been forced to select robust SQL databases for some data loads and less mature solutions from NoSQL vendors for others, leading to integration and support issues."

Sallner's firm claims to have combined the best of both approaches.

Data-centric developers want the assurance of features that ensure data consistency at all times with the agility of handling very large, unstructured NoSQL datasets.

Could this be it?

"The availability of an enterprise-grade SQL database platform with NoSQL interoperability is a game charger for developers building serious revenue-generating applications and DBAs that run large, complex data environments," said the company, in a press statememt.

NOTE: The announcement of these new commercial products coincides with the release of the open source database server MariaDB 10 -- and SkySQL is the biggest contributor to the MariaDB project, both in terms of resourcing and code.

So does it work?

Red Hat CTO Brian Stevens has said that Red Hat has included MariaDB in its Red Hat Enterprise Linux 7 beta.

SkySQL insists that no other SQL database solution can reliably deal with the latest wave of applications (which support massive numbers of users and volumes of data) across today's mix of mobile apps, gaming and e-commerce platforms.

The case is stated; at length the truth will out.

MarkLogic champions 'semantic context' for NoSQL data insight

bridgwatera

NoSQL database platform company MarkLogic is having a successful period of growth and says that MarkLogic release 7 is marked out for its elasticity, tiered storage and semantics capabilities.

Semantic schemantic -- what could that mean?


Company CEO Gary Bloom is bullish about his firm's growth in Asia Pacific and says that MarkLogic Semantics (Ed -- capital S, it's branded don't ya know?) is now included and works for data insight, helping deliver "contextually-relevant" information to users.

A chemical formula for data insight

David Leeming is strategic innovation group solutions manager at the Royal Society of Chemistry.

Leeming explains how semantic context has helped his organisation consume (without severe indigestion) over one million digitised pages of written word and chemical formulae into XML.

After the formulae had been moved to XML, the chemists said they needed to get more out of the XML and that a relational database model would not fit their needs.

"MarkLogic is more than just a NoSQL database, it has an extremely powerful search engine to enable logical associations between different types of content, helping us launch new online journals very quickly and grow our publishing output from 6,000 articles a year to over 30,000 a year today."

Stemming from open source roots, the company now offers a free developer license and "cloud-ready" hourly pricing for Amazon Web Services.

Don't say big data, say big data in motion

bridgwatera

What is real time data anyway?

As you will know, in computing terms we talk about real time processing (or perhaps "computer responsiveness") as being that level of compute power and speed such that users PERCEIVE that the systems they use are operating at the same speed as human (or indeed machine-based) life events.


Real time data now sits alongside real time information and data-in-motion (as opposed to data-at-rest, such as non-active databases, tape backups etc.) to form the new always-dynamic nature of data today.

Traditionally we have talked about "three states of data" such that we incorporate data-in-use as well, but the degree to which we focus on real-time analytics now is arguably eroding the definitions of the past, i.e. almost all data is potentially data-in-use now.

So then, logically, big data needs more data-in-motion tooling right?

This is clearly the mindset exhibited by secure data company Zettaset.

The firm has recently announced the addition of data-in-motion encryption as a new feature in its Zettaset Orchestrator management and security add-on application for Hadoop.

What is data-in-motion encryption?

It is intended to provide organisations with an additional layer of protection for their Hadoop clusters and sensitive data, eliminating access by unauthorised users.

How does data-in-motion encryption work?

Orchestrator data-in-motion encryption ensures that all networking connections to the Orchestrator web-based console are completely secure within a Secure Socket Layer (SSL) tunnel.
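In generic terms -- and this is a plain JSSE sketch rather than anything from Zettaset's codebase, with a placeholder host and port -- a client-side SSL connection looks like this:

```java
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;

public class TlsClient {
    public static void main(String[] args) throws Exception {
        SSLSocketFactory factory = (SSLSocketFactory) SSLSocketFactory.getDefault();
        // "node.example.com" and 8443 are placeholders, not Zettaset endpoints
        try (SSLSocket socket = (SSLSocket) factory.createSocket("node.example.com", 8443)) {
            socket.startHandshake();  // authenticate the peer, negotiate the cipher suite
            socket.getOutputStream().write("ping\n".getBytes("UTF-8"));
        }
    }
}
```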

Can we go deeper?

Communication links between all cluster nodes are encrypted and authenticated to eliminate the possibility (claims Zettaset) of unauthorised access to data by anyone within the corporate network or Hadoop cluster.

Why is big data security on Hadoop an issue?

Gartner's Bhavish Sood says that Hadoop certainly wasn't built with enterprise IT environments in mind, because "there is a shortage of robust security controls in Hadoop."

"Orchestrator data-in-motion encryption for Hadoop represents the next phase of Zettaset's Big Data encryption technology initiative, which began with the release of data-at-rest encryption in late 2013," said Jim Vogt, CEO of Zettaset.

The Java parallelism paradox

bridgwatera

Java 8 has been reengineered with the strength of Project Lambda Expressions.

Lambda is the Java project that adds 'closures and related features' dedicated to supporting multicore programming.


A Lambda expression is code that a developer can "share" to be executed later, just the once or executed multiple times -- hence, more control exists for parallelism over multicore.

This is important because, as Herb Sutter wrote in Dr Dobb's Journal back in 2005, we are at a "fundamental turning point in software development" -- this was in his piece entitled The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software.

Up until Java 8, this is what Oracle's Java Tutorial page had to say about parallelism:

Parallel computing involves dividing a problem into sub-problems, solving those problems simultaneously (in parallel, with each sub-problem running in a separate thread), and then combining the results of the solutions to the sub-problems.

So looking back, Java SE provided the fork/join framework, which enables developers to implement parallel computing in applications.

However, with this framework, the programmer must specify how the problems are subdivided (partitioned).
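A minimal fork/join sketch makes that burden plain -- the programmer picks the split threshold and the partition points by hand:

```java
import java.util.Arrays;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Sums an array by manually splitting the work into sub-problems.
public class SumTask extends RecursiveTask<Long> {
    private static final int THRESHOLD = 10_000;
    private final long[] data;
    private final int from, to;

    public SumTask(long[] data, int from, int to) {
        this.data = data; this.from = from; this.to = to;
    }

    @Override
    protected Long compute() {
        if (to - from <= THRESHOLD) {            // small enough: solve directly
            long sum = 0;
            for (int i = from; i < to; i++) sum += data[i];
            return sum;
        }
        int mid = (from + to) >>> 1;             // otherwise: split in two
        SumTask left = new SumTask(data, from, mid);
        left.fork();                             // run the left half asynchronously
        long rightSum = new SumTask(data, mid, to).compute();
        return rightSum + left.join();           // combine the sub-results
    }

    public static void main(String[] args) {
        long[] data = new long[1_000_000];
        Arrays.fill(data, 1L);
        System.out.println(new ForkJoinPool().invoke(new SumTask(data, 0, data.length)));
    }
}
```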

With Java 8, Oracle explains that it is taking some of the dirtier mechanics out of the equation and essentially making the whole route to parallelism easier to get to.
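By way of contrast, a Java 8 parallel stream reduces the same computation to a declaration and leaves the partitioning to the runtime:

```java
import java.util.stream.LongStream;

public class ParallelSum {
    public static void main(String[] args) {
        // Declare *what* to compute; the runtime decides how to split the work
        long sum = LongStream.rangeClosed(1, 1_000_000)
                             .parallel()
                             .sum();
        System.out.println(sum);  // 500000500000
    }
}
```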

Java SE vice president Georges Saab explained to reporters in Prague this week that he is thinking about parallelism and concurrency and how the Java team now works with developers to exploit the potential here with Lambda.

Saab notes that you could previously approximate the closures that Java 8 now provides via anonymous inner classes, so this will not be completely alien to developers, who could no doubt have been using parallelism elsewhere as well, including the fork/join framework tools (JSR 166y).

The proposition with Lambda is that it takes some of the mechanics out of this process and makes it more automated.

Going back further, vice president of development for the Java platform Nandini Ramani points out that Java has a history of parallelism: Phasers were introduced with Java 7 (JSR 166), and it stretches all the way back to java.lang.Thread if you want to do the history.

Plus, let's also not forget... Lambda (as a working project with builds) has been around since JDK 7, so it has had time to bake.

Whichever way you slice it, Java is ready (or more ready at least) for parallelism and the free (developer) lunch is indeed over.

Java 8 grasps parallelism (more tightly) with Lambda Expressions

bridgwatera

Oracle's Java coding cognoscenti gathered in the Czech capital of Prague this week to analyse, present and postulate over the arrival of the Java 8 language.

Now, some 20 years after Sun Microsystems first envisioned and built the Java language and platform under the eye of James "father of Java" Gosling, we find ourselves in a postmodern Oracle world of Java development.

[Image: Duke, the Java mascot]

Java 8 has been reengineered and the most significant enhancement in this release is Project Lambda Expressions for the Java Programming Language.

To be more complete, the standout features of JDK 8 are Project Lambda, the Nashorn JavaScript engine and a new Date and Time API.
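Of those three, the Date and Time API is the quickest to show -- a minimal java.time sketch:

```java
import java.time.LocalDate;
import java.time.Period;

public class DateDemo {
    public static void main(String[] args) {
        LocalDate release = LocalDate.of(2014, 3, 18);  // Java 8's GA date
        Period since = Period.between(release, LocalDate.now());
        System.out.println(since.getYears() + " years, "
                + since.getMonths() + " months since Java 8 shipped");
    }
}
```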

Top of the pile is Lambda, which is a Java function that adds 'closures and related features' dedicated to supporting multicore programming.

Image credit note: this image comes from the very excellent TAKIPI BLOG.

What are Lambda expressions?

A Lambda expression is described as a block of software code that a developer can "pass around" in order for it to be executed later, executed just the once or indeed executed multiple times -- hence, more control exists for parallelism over multicore.
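In code, both halves of that sentence look like this -- a lambda held in a variable and run more than once, then handed to library code for parallel execution:

```java
import java.util.Arrays;
import java.util.List;

public class LambdaDemo {
    public static void main(String[] args) {
        // A block of behaviour held in a variable...
        Runnable task = () ->
                System.out.println("running on " + Thread.currentThread().getName());

        task.run();  // ...executed once now
        task.run();  // ...and again later

        // Handed to library code, the same idea unlocks easy data parallelism
        // (output order is not guaranteed on a parallel stream):
        List<String> names = Arrays.asList("ada", "grace", "barbara");
        names.parallelStream().map(String::toUpperCase).forEach(System.out::println);
    }
}
```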

While some commentators have said that Oracle's take on Lambda for Java is not as polished as it might be (and that standard Java imperative style performs much better), Java 8 should not be denigrated or disparaged just for being new.

Java SE vice president Georges Saab told reporters in Prague that yes, there will be refinements over time and that performance will improve.

Java 8 with Lambda has been widely heralded as the most significant update to the language since the introduction of Generics in Java SE 5.0 way back in 2004.

NOTE: Java Generics are meant to add stability to code by making bugs detectable at compile time. Oracle points out that runtime bugs can be much more problematic; they don't always surface immediately, and when they do, it may be at a point in the program that is far removed from the actual cause of the problem.
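A minimal illustration of that compile-time versus runtime distinction:

```java
import java.util.ArrayList;
import java.util.List;

public class GenericsDemo {
    public static void main(String[] args) {
        List<String> names = new ArrayList<>();
        names.add("duke");
        // names.add(42);                  // rejected at compile time
        String first = names.get(0);       // no cast required
        System.out.println(first);

        List raw = new ArrayList();        // pre-generics raw type
        raw.add(42);
        // String s = (String) raw.get(0); // compiles, then fails at runtime
    }
}
```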

The Java 8 release is substantial and spans outward to incorporate the Java Platform Standard Edition 8 (Java SE 8), Java Platform Micro Edition 8 (Java ME 8) and connected releases of Java Embedded products.

So more specifically then, Oracle is also announcing Oracle Java SE Embedded 8, which uses the new Java SE 8 features and is optimised for mid- to high-end embedded systems.

With this release Oracle is aiming to converge Java ME and Java SE to engender a more consistent developer experience and more code re-use across the platforms.

"Java is the global standard for developing and delivering applications everywhere - from small devices to the cloud. The convergence of Java SE 8 and Java ME 8 is intended to provide a consistent development environment that improves developer productivity and application performance, along with the ability to 'right-size' the platform for deployment across an even wider range of use cases," said Nandini Ramani, vice president of development, Java Platform, Oracle.

Ramani spoke to journalists in Prague to explain that as Java now converges, we get to a new and more positive point in terms of being able to program based on device constraints (with mobile and Internet of Things devices in mind) rather than based upon particular functions in the language.

How will developers learn Java 8?

Oracle is also conducting a worldwide tour of Java User Groups (JUGs) and plans to deliver educational events for nearly sixty JUGs in over twenty countries on six continents.

An updated Java SE 8 training curriculum will be available soon from Oracle to help developers transition to Java SE 8 and implement the latest platform enhancements.

Why did we title this piece Java 8 grasps parallelism (more tightly) with Lambda? You may ask...

... this is because Java already has a history steeped in parallelism but that's another story.


DevOps is a real job, it's official

bridgwatera

The 'developer' and 'operations' DevOps role is now an official part of the tech industry nomenclature.

The number of permanent and officially recognised DevOps Engineer posts in the UK has jumped 347% in the past two years.

This news comes alongside the discovery that 'DevOps Engineer' has jumped 222 places in a list of IT jobs skills/job titles, making it one of the fastest growing skills needed by the industry [Source: IT Jobs Watch].

Open cloud company Rackspace has supplied these "findings" and says that by automating infrastructure, workflows and continuously measuring application performance, DevOps has become an important business process.

NOTE: The top technology skills DevOps Engineers currently need are Linux (79%), Puppet (60%), Chef (47%), Python (44%) and Ruby (42%), according to job advertisements for the position for the six months to March 2014.

Rackspace has a DevOps offering and launched its global DevOps Automation Service in late 2013 from the US -- a UK DevOps Automation Team has also just been launched.

The firm says that the service is designed to help developers automate the process of deploying and scaling hybrid cloud infrastructures for fast-growing applications.


NoSQL or RDBMS: who will win the database war?

bridgwatera

This is a guest post for Computer Weekly Open Source Insider written by Sandor Klein of EDB, a provider of enterprise-class products and services based on PostgreSQL.


An open (data) battlefield

The NoSQL and relational database camps emerged fighting in 2013, and the once peaceful niche of data management has transformed into an open battlefield.

As businesses look to cope with the challenge of big data, both camps claim to offer the best approach when it comes to managing the data deluge.

Selecting the right database to suit your requirements has become tricky amidst a noisy backdrop of claims and counterclaims from the market's major players on both sides of the divide.

As the battle continues to intensify into 2014, what hope is there for the relational camp when it comes to convincing CIOs that their solutions are more than a match for the rapidly maturing NoSQL solutions that have sauntered into the marketplace? As it happens, plenty.

Noisy Mongo

The likes of CouchDB, Redis and MongoDB have certainly made a lot of noise to attract attention in 2013, but advances made by relational databases such as Postgres have continued apace.

Relational systems have been very good at learning the lessons that the NoSQL systems presented. By tooling up in areas such as JSON for semi-structured web-based documents and fast data lookups through HStore, relational solutions are now in a position to offer precisely the same core capabilities as NoSQL vendors.

This is in addition to meeting standard transactional requirements -- which is not to say there is no role to play for NoSQL systems. Indeed many organisations have found that NoSQL technology is very well-suited to some types of big-data applications, but not others.

What NoSQL is good at

NoSQL's strength lies in number-crunching vast amounts of data, not in real-time decision-making. The key is for organisations to use the right technology for every application.

When a technology triggers an ideological divide as NoSQL has, there is a tendency to take sides. What's important is for organisations to take a more enlightened view, overlooking the hype and making a decision that will ultimately derive maximum value from their database investments.

Microsoft: open source developers not yet fully recognised by Microsoft MVP Programme

bridgwatera

Microsoft continues to sink its talons into the open source space this week by announcing that developers who run, manage, or commit to large and highly influential open source projects will now be "formally recognised" by the company's MVP Award Program.


Microsoft code guru Scott Hanselman has posted excerpts from an internal company memo that specifies, "Currently there is a class of developer influencers whose contributions are not yet fully recognized by the Microsoft MVP Program."

NOTE: The Microsoft Most Valuable Professional (MVP) Award recognises exceptional, independent community leaders who share their passion, technical expertise, and real-world knowledge of Microsoft products with others.

Hanselman quotes more from Microsoft's statement on open source and the MVP programme, saying that the company now recognises community developers as influencers who run, manage, or commit to large and highly influential open source projects.

"However since they do not participate in what is considered as more traditional ways such as speech engagement, books, online forums or user groups, they are not usually considered as potential MVPs. Often these developers have technical community followings but may not necessarily be on message.. As a result, there is a belief amongst some influencers that Microsoft does not support open source software."

But things have changed...

"As we move forward, we will change the MVP guidelines to recognize open source activities with the same weight as other community activities."

Hanselman himself has said that he had envisaged Microsoft creating a new "Open Source MVP" as a formalised badge or label.

"I pushed that for a while but realised quickly that it would create an island of OSS MVPs, and no one group would claim them. Better to push OSS throughout the whole program so everyone shares a home," he said.

Microsoft has rounded out the discussion by saying that the MVP Award Program will recognise open source activities in order to promote further growth and support of the technical communities.

... and specifically, the company states that:

Microsoft will work to fine tune the MVP guidelines to recognise open source activities with the same weight as other community activities.

Can a leopard change its spots?

Is Microsoft really just still saying that the only valuable open source is that which extends Microsoft products in its own view?


Sony open sources games tools framework: woos developers to Playstation land

bridgwatera

Sony is making its Authoring Tools Framework product available to programmers for free under the Apache 2.0 open source license.

The Authoring Tools Framework (ATF) will be free to download and use.

Industry commentators have suggested that this may now create an incentive for independent third-party developers to start developing games (or indeed porting existing games) to the PlayStation.

The ATF itself comprises a set of C#-authored software components used to build games.

[Image: The Last of Us box art]

Sony itself has used the ATF to build games tools, including the level editor for Naughty Dog's action-adventure survival horror video game "The Last of Us".

The Sony ATF has been in "continuous development" since 2006.

"We're looking forward to expanding ATF's usage beyond SCE, working more with external PlayStation developers and the larger game development community as a whole," said Sony CEA principal tools programmer Ron Little.

"Now that ATF is open-source, we're excited to see the ways developers use the toolset, which could expand beyond games."

There are many types of components in ATF. ATF Managed Extensibility Framework (MEF) components can be added to a .NET TypeCatalog with a single line of code, as can be seen in most of the ATF sample applications, for example \Samples\CircuitEditor\Program.cs.

Other components, like the DOM, are really a collection of related classes.

Mozilla VP: Firefox for Microsoft Metro would be a mistake

bridgwatera

Mozilla has canned plans to build a Firefox browser that aligns specifically to Microsoft's so-called 'Modern Design' approach, the design approach artist formerly known as Metro.

Microsoft Metro (sorry, Modern Design) remains a key component of the Windows 8 optimised-for-touch experience and the user interface presentation has its fans and its opponents.

Firefox VP Johnathan (correct spelling) Nightingale has confirmed that he has asked his engineering leads and release managers to take the Windows Metro version of Firefox "off the trains", as he puts it.

He means "off the rails", but let's continue.

"The team is solid and did good work, but shipping a 1.0 version, given the broader context we see for the Metro platform, would be a mistake," he blogged.

Mozilla's Nightingale says that he doesn't want the team to work on the Microsoft plan because they "need to focus on the projects with the most impact for our mission" today.

"In the months since, as the team built and tested and refined the product, we've been watching Metro's adoption. From what we can see, it's pretty flat. On any given day we have, for instance, millions of people testing pre-release versions of Firefox desktop, but we've never seen more than 1000 active daily users in the Metro environment," said Nightingale.

Windows 8 users who prefer Firefox to Internet Explorer don't need to panic -- the non-Metro (desktop) version of Firefox still works well in Windows 8 environments and will continue to be developed.

The codebase is off the trains and off the rails, but will live on at http://hg.mozilla.org/projects/metro for those who wish to retain an attachment to it.

[Image: Firefox Australis, the next Firefox]

March of the penguins: how we migrate from UNIX to Linux

bridgwatera

This is a guest post for Computer Weekly Open Source Insider by Adam Leventhal, CTO at Delphix, a company that helps virtualise database infrastructures to accelerate application rollouts and development.

The IT industry is in the midst of a mass platform migration.


Gartner anticipates that, three years from now, 65% of the applications that were running on UNIX in 2012 will have migrated to Linux.

Now... while the courts have given HP-UX a reprieve from Oracle's death sentence, IT organisations know not to put their faith in a product on death row.

Relational opportunity

In the drive to modernise, applications built around relational databases represent the greatest opportunity for return, and the greatest challenge.

Multiple CIOs have told me they planned to modernise all of their databases, but -- constrained by cost and time -- only managed to tackle the 20% that yielded 50% of the value. The other 80% remains a major drag on IT; we're kicking the can down the road.

UNIX platforms such as Solaris on SPARC and AIX on Power store data in a different format than Linux on x86 (big-endian versus little-endian).
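The difference is easy to demonstrate from Java itself, since java.nio can read the same four bytes both ways:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Endianness {
    public static void main(String[] args) {
        byte[] bytes = {0x00, 0x00, 0x00, 0x01};
        int big    = ByteBuffer.wrap(bytes).order(ByteOrder.BIG_ENDIAN).getInt();
        int little = ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN).getInt();
        System.out.println(big);     // 1        (how SPARC or Power reads it)
        System.out.println(little);  // 16777216 (how x86 reads it)
    }
}
```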

Complex and cumbersome conversion

The Oracle database comes with tools to execute a conversion between platforms, but the process is still so complex and cumbersome that few organisations attempt it without help from specialised consulting practices.

Editorial note: It is at this point that Leventhal mentions certain analysts who are beginning to look at technologies like Delphix's own Modernisation Engine -- this product is specifically designed to facilitate and automate the conversion of Oracle databases from UNIX to Linux.


Now, we're moving to a model in which we can use virtualisation technology to provision copies of the original, production Oracle databases on UNIX platforms, or automate a conversion process that can be tested and tuned to stamp out converted databases automatically.

How to convert a database

Converting a database typically involves modifying 90% of the data blocks, changing the content to work on the new platform.

Even with advanced storage platforms, converting a 1TB database would result in a copy that's nearly 1TB.

Data agility brings end-to-end awareness of data and storage, fitting even converted databases into a tiny storage footprint. Removing the storage constraint means that modernisation efforts operate at lower risk, always testing with fresh, full datasets.

Datacentres are populated with lumbering legacy apps; migrations -- to consolidated datacentres, hybrid clouds or into retirement -- have been slow.

Economics has increased the pressure to leave legacy platforms. Fortunately, advances in data agility and virtualisation have come just in time to transform the datacentre.

About the author...

Adam Leventhal is a co-inventor of DTrace. At Sun Microsystems, he worked in the Solaris kernel group and was a founding engineer in Sun's Fishworks group. He currently sits on the illumos developer council, and has 11 patents filed or pending.

9 ways for programmers to sleep better

bridgwatera

Can anyone code proficiently or architect a software masterpiece without enough sleep?

Answer: no, of course not.


This Friday March 14 is the 7th Annual World Sleep Day...

... this is where members of the World Association of Sleep Medicine come together and educate the world on the importance of getting enough sleep.

A total of 21% of adults get less than six hours of sleep each night, says the organisation.

Cathy Beggan, founder and CEO of Rise-N-Shine, has come up with nine tried and true ways for software application development professionals to get a good night's sleep.

1. Avoid chocolate, coffee and red wine in the evening hours. These foods have been known to disturb sleep patterns and digestive tracts.

2. Have an early dinner. Make sure dinnertime meals are finished before 7pm or at least three hours before bedtime.

3. Establish a sleep cycle. It's important to notice when the body starts to get tired and to adjust accordingly.

4. Keep the lights off. Rays of light from nightlights, hall lights, bathroom lights or even TVs can disturb a natural sleep pattern.

5. Relax. Turning the mind off at night is an important component of quality sleep. Yoga and meditation can help turn the mind off after an action-packed day.

6. Exercise in the morning. Exercising at night can release endorphins that can keep the body awake for longer periods of time. Releasing these endorphins in the morning or during the day can help give the body more energy when it's needed most.

7. Have a nighttime routine. Once a more concrete sleep cycle has been established, creating a nightly routine can help prepare the body for rest.

8. Read a book. Watching TV in bed can sometimes stimulate the mind, so choose something that will wear the mind out, like reading.

9. Close your eyes. Sometimes the first step to falling asleep is closing your eyes and allowing yourself to rest.

Rackspace #bigdatabreakfast, you had me at bacon and cloud

bridgwatera

Rackspace describes itself as the 'open cloud' company no less.

Open enough then to host a cloud-enriched big data breakfast in London this month.


The #bigdatabreakfast (doesn't your breakfast have a hashtag?) saw representatives from Rackspace itself sit alongside suits from EMC, MongoDB, Hortonworks and DataStax.

This was a morning of occasionally great soundbites and a few over-practised, over-'media-trained' howlers.

"Big data is the new currency," said one of the spokespeople. Yeah right.

More thought provoking was Berne Kaponig of EMC who said that big data is the 'avant garde element' of data management.

Rackspace itself had an admirable take on where big data is headed (it looks after a lot of it, after all) and alluded to the tipping point that we now find ourselves at...

... that is to say, the firm has said before now that IT skill sets as recently as 18 months ago were not high enough for us to exploit big data to the full.

But now, in 2014, the skills gap is still there "but no longer huge" says Rackspace head of technical product strategy, Toby Owen.

Rackspace's Owen contends that the basic skill sets are there today and that the tooling is "getting better and easier to use" as we now form a new fulcrum between BI and big data itself.

Owen also suggests that big data platforms (as a means of compute processing power) are becoming more straightforward for the end user in operational terms.

The Rackspace man in particular referred to where we are today with MongoDB, i.e. it started back in 2007, grew slowly as skills developed, and now... it is one of the top 10 databases on planet Earth.

EMC commentary was stronger than some others at this event; the company stated that enterprise class data management is about to become more mainstream.

A couple more for you...

EMC mentioned that we are on the verge of bringing latency down one more step i.e. the cloud is getting more "pragmatically real" in terms of implementations today.

Rackspace also said that analytics & integration inside the vertical stack is where the opportunity is.

Bacon sandwiches, cloud and insight then - what's not to like?

Gastronomic disclosure: No HP sauce, Daddies or other 'brown condiment' was provided.


British open source project eyes Le Mans 2015

bridgwatera

A British-originated open source project is aiming to bring in technology expertise and design from anyone who wants to get involved, bucking the "secretive world of Formula 1" that exists today.

The Perrinn team http://perrinn.com/us has its eyes on Le Mans in 2015 and is now working to develop, build and run a sports car.

It is hoped that an interactive website will attract open source developers, racing enthusiasts, students, schoolchildren, fans and automotive engineers.

Participants can get involved with everything from CAD models to livery design and even financial budgets.

The project aims to host so much data about the car and the project itself that it will ultimately be possible to 3D print a model of the car itself.

Yorkshire-based race car designer Nicolas Perrin says his goal is to achieve success in the FIA World Endurance Championship and win Le Mans 24H with myP1 within five years.

The design phase has taken 3 years and is complete. The vision and objectives have not changed since the beginning of the project.

[Image: the Perrinn LMP1 car]

Merkel & Cameron: Don't mention the war at CeBIT

bridgwatera

Don't mention the war.

Angela Merkel mentioned it once, but I think she got away with it.

[Image: Exportmesse 1947]

Joining the German chancellor in the war references at the opening ceremony of the CeBIT 2014 exhibition this year was Christian Democratic Union (CDU) politician Johanna Wanka and, in fact, Wanka mentioned the war once too.

Well, after all, it is the 75th anniversary of the start of World War II, so this is now a time for positive reflection, innovative construction and, above all, peace.

What was perhaps most interesting of all at this opening Anglo-German love-in was the explanation of how and why CeBIT came about.

This explanation was provided by the minister president of Lower Saxony, Stephan Weil.

NOTE: CeBIT is a German language acronym for «Centrum für Büroautomation, Informationstechnologie und Telekommunikation»

At the end of the second world war, a British military government operation set up a "positive constructive force" (to use Weil's own words) to create a trade fair at the Hannover Messe in 1947.

An undamaged factory in Laatzen (just south of Hanover) was used and this later became the full Hannover Fair (or Messe) -- of which CeBIT is now an offshoot.

Merkel and others referred to CeBIT as the "offshoot" and (perhaps more affectionately) as the "daughter" of the Hannover fair.

This spirit of openness (if not open source) still pervades, said the politicians.

British PM David Cameron was also here to announce a new Internet of Things grant from the UK government.

"We need to change the Internet of Things from a slogan to reality," said Cameron.

German chancellor Merkel rounded out this event by referring to history, reminding us that it is 300 years since the Prince of Hanover became King George I of England.

Unity and openness bring connectivity after all, it seems.


Editorial Disclosure: Adrian Bridgwater attended CeBIT as a guest of Software AG.

Undo better than GNU, who knew?

bridgwatera

A Linux debugging tool more efficient than GNU debugger (GDB), really?

Bucking time-honoured marketing best practice stating that a firm shalt not use an OVERT NEGATIVE in an advertising headline, promotional campaign or (saints preserve us) the actual name of the company, it appears Undo Software is doing well in the Linux debugging market.

Well, ok, the term "undo" could be a positive in tech if it is used to correct a mistake, which of course it is in this case with the reversible debugging tool UndoDB from Undo Software.

The company has this month worked with Mentor Graphics Corporation to implement UndoDB and develop Linux code faster.

So what does it do?

UndoDB claims to allow developers to record their program's execution and then rewind their code in real time to find bugs more quickly, saving time and reducing cost.
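For context, GDB itself has shipped a (much slower) flavour of the same idea since version 7.0; the workflow Undo is competing on looks roughly like this:

```
(gdb) break main
(gdb) run
(gdb) record             # start recording execution from here
(gdb) continue           # run forward until the crash or breakpoint
(gdb) reverse-next       # step backwards one source line
(gdb) reverse-continue   # run backwards to the previous breakpoint
```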

Undo better than GNU, who knew?

Mentor Graphics used UndoDB 10 alongside the GNU debugger (GDB) and said that UndoDB makes it easier and faster for its development team to track down problems compared to GDB.

"UndoDB greatly improves our debugging time, making it up to two to three times faster than before. In addition, the learning curve for getting up to speed with UndoDB was very short because of the familiar interface and command structure," said Jean-Marc Talbot, senior director of engineering, AMS, Mentor Graphics.

"UndoDB enables software developers to use the power of reversible debugging on complex, real-world code," said Greg Law, CEO and co-founder, Undo Software.

Law concludes by again claiming that tests prove his firm's tools boast performance several orders of magnitude better than open source solutions, with significantly improved memory consumption.

