One of today's buzzwords is big data. Volumes of information generated are increasing rapidly, driven in part by increased take up of mobile technologies and the growing number and range of machine-to-machine communications as ever more equipment such as industrial sensors is connected to networks. Increasingly, organisations are looking to understand how information, events and behaviours impact the overall goals and objectives of the business. Making use of that information is critical as it provides valuable insights that can be used for improving operational performance and business decision making.
The ability to harness big security data generated by feeds from throughout the network stack, incorporating network systems, hosts, applications and users, combined with internal and external security intelligence, similarly has enormous potential benefits for organisations. These feeds provide essential information and should be continuously monitored in real time to uncover and understand deviations from what is considered to be normal behaviour, revealing threats that more reactive security controls can miss.
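The monitoring idea behind this can be reduced to a simple statistical sketch: establish a baseline from recent activity and flag observations that deviate sharply from it. The rolling window, threshold and event counts below are illustrative assumptions, not a description of any vendor's detection method:

```python
from statistics import mean, stdev

def flag_anomalies(event_counts, window=24, threshold=3.0):
    """Flag observations that deviate sharply from the rolling
    baseline formed by the preceding `window` observations."""
    anomalies = []
    for i in range(window, len(event_counts)):
        baseline = event_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:        # guard against a perfectly flat baseline
            sigma = 1.0
        if abs(event_counts[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# A steady feed of 10 events per hour, with one burst at index 30
counts = [10] * 30 + [80] + [10] * 5
print(flag_anomalies(counts))  # [30]: the burst stands out against the baseline
```

Real security intelligence platforms correlate many feeds at once; the point of the sketch is only that a baseline makes deviations visible.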
To answer these needs, vendors with their roots in the security information and event management technology and log management systems space have been building out their capabilities, developing security intelligence platforms that incorporate such advanced capabilities as big data analytics and event correlation, integrity and change management, archiving and incident response. A new report from Bloor Research discusses these developments and provides pointers as to what organisations should look for in such a security intelligence platform. This link will take you to the report: The value of big data in security.
The threats we face today are no longer smash-and-grab raids, looking for instant gain. Rather, perpetrators are looking to get a deep foothold into the network. They use subterfuge to trick their way into the organisation that is being specifically targeted and, rather than exiting rapidly, they then move laterally through the network, looking for richer pickings by escalating their access rights and lying in wait, often for long periods of time. They aim to remain undetected. The scale of the problem is borne out by this year's data breach investigations report by Verizon Business, which found that just 16% of breaches suffered by respondents were discovered by the victims themselves.
These criminals are well resourced and technologically adept. They use multiple attack techniques and constantly evolve their exploits, testing them against commercially available security controls to ensure that they can evade them. Many of those controls are reactive in nature, only providing countermeasures against threats that have already been identified. That is no longer sufficient for fending off the sophisticated threats that we face today.
What is needed is a new approach--one that is based on trust. Application control and whitelisting technologies provide the advanced weapons needed to counter advanced threats. They can be used to ensure that only trusted applications can be run on the network, blocking all other applications from executing. Thus, they are highly effective at preventing malware infections and data exfiltration, especially when all systems are continuously monitored in real time.
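The core mechanism is simple to sketch. Below is a minimal, purely illustrative default-deny check, assuming a hash-based trust policy (the binary contents and helper names are hypothetical): an executable runs only if its hash appears on the allowlist.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Illustrative allowlist built from known-good binary contents;
# a real deployment would distribute hashes via a managed trust policy.
trusted = {sha256_of(b"approved-app-v1.0"), sha256_of(b"approved-app-v2.0")}

def may_execute(binary: bytes) -> bool:
    """Default-deny: a binary runs only if its hash is on the allowlist."""
    return sha256_of(binary) in trusted

print(may_execute(b"approved-app-v1.0"))  # True
print(may_execute(b"unknown-malware"))    # False: blocked, not merely flagged
```

Unlike a blacklist, this model needs no prior knowledge of a threat: anything not explicitly trusted is simply never executed.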
By allowing only what is known to be good to run, trust is returned to the security equation and an organisation will be in a much stronger position to protect its sensitive information from the risks posed by sophisticated cyberthreats. To learn more about how such technologies are important weapons in an organisation's arsenal, join Bloor Research and Bit9 for a webinar that will take place at 9am EST/2pm GMT/3pm CET, Tuesday 4th December. Click here to register: Enhancing security through a trust-based approach.
The technology landscape is changing fast, bringing much disruptive change that provides organisations with new ways to streamline their businesses, reach out to customers more effectively and keep their workers productive. Among the hottest trends are cloud computing, software as a service, big data and mobile security. Within these sectors, many security technology vendors are showing a great deal of innovation--leading to a flurry of acquisitions and investments recently.
As with most areas of the technology sector, the security technology sector is characterised by a few large players and a range of innovative smaller players. In recent years, those larger players have been acquiring specialists to build out their portfolios. McAfee, which itself was acquired by Intel Corporation in 2010, has made investments in the mobile security space, and in the security information and event management and database security sectors, which will allow it to expand its capabilities in real-time big data analysis. It has also been integrating its Intel and McAfee product lines to allow it, among other things, to enter the identity management space. Symantec has also made a number of acquisitions to expand its capabilities in ediscovery and archiving, authentication and mobile security. IBM introduced its Advanced Threat Protection platform in April 2012, incorporating assets from its acquisition of Q1 Labs and of Internet Security Systems. It has recently brought all of its security activities together into a separate security unit and has announced a focus on cloud services, mobile security and big data.
There has also been a great deal of investment activity among smaller, specialist companies that shows just how dynamic the security technology sector is currently. The following are just a few of the most noteworthy examples seen recently. Barracuda Networks, which offers a broad portfolio of subscription-based content security, data protection and application delivery products, received funding of $130 million in October 2012 and, according to Bloomberg, is eyeing an IPO. Cloud security provider Qualys successfully underwent an IPO, raising a little over $70 million in September 2012 and proving the viability of its model. Another candidate for an IPO is email security and archiving vendor Mimecast, which received $62 million in funding in September 2012. Application control vendor Bit9 received $34.5 million in July 2012 and cloud security vendor Zscaler $38 million in August 2012.
Deals such as these show just how important the security technology sector is and point to the high levels of innovation that are being seen. That innovation is bringing disruptive change--not just in terms of the security technology available to organisations that deploy such technology, but in the overall vendor landscape. There will be more to come.
Among the many challenges that telecommunications providers face is the need to transform and consolidate their businesses. They need to adopt new business models that allow them to move away from commodity services such as voice traffic, where revenues are dwindling, to stave off increased competition from new market entrants, and to take advantage of opportunities in emerging market sectors.
There are several areas of opportunity that will allow telecommunications providers to stand out from the crowd. Given their technology and communications expertise, they are well placed to reposition themselves as managed service providers for their commercial and public sector customers, supplying information and communications services that help customers consolidate highly complex, heterogeneous network infrastructures into efficient and secure networks, incorporating next-generation technologies in conjunction with their technology partners.
Telecommunications providers can also help organisations to tackle the challenges and opportunities that they face in specific vertical markets through the development of new technology products and services designed for specific business needs. For example, in the healthcare sector there are opportunities to aid in the development of telemedicine capabilities, home-based care and health monitoring, and, in the US, to support the development of health information exchanges.
Telcos are also in the position to tap into a number of markets with extremely high growth potential. These include machine-to-machine applications, which rely on communications systems to allow devices such as sensors or meters to communicate directly with each other for greater efficiency, especially in industrial markets and in the critical infrastructure sectors.
Big data presents another opportunity, allowing telecommunication providers' customers to harness the potential of the huge volumes of data that their businesses generate, turning raw data into actionable intelligence that will allow them to achieve improved operational and asset performance, and that will provide revenue-generating opportunities by enabling them to offer more personal, relevant services to their own customers.
For small businesses and consumers, telcos can answer calls from governments worldwide to provide customers with security services, in conjunction with their technology partners, so that they can take an active stand against the security threats that they face, as well as providing other value-added services that allow consumers to take advantage of new digital lifestyles.
In short, there are many opportunities for telecommunications providers to turn the challenges that they face to their advantage, allowing them to leverage the investments they have made in information and communications infrastructures to stave off competition, increase their relevance to customers and identify new revenue-generating opportunities. A white paper that explores these challenges and opportunities in greater detail can be accessed here: Challenges and opportunities in the telecommunications sector.
Trust is essential for building a sustainable business. Security is essential for building trust. To build trust in electronic networks, security needs to be built into a suitable framework, rather than being bolted on in a piecemeal fashion. The only way to make security equate to trust is to build a secure foundation, taking into account the security of the whole system and not merely protecting its individual components by bolted-on security.
However, where once an organisation's network was fairly self-contained, today's world is a highly interconnected one, incorporating cloud services, mobile devices, virtualisation, the consumerisation of the enterprise and highly interactive web applications, all of which have eroded the network perimeter. The challenge is to embrace these new technology developments to enable the business to grow, to support innovation and competitiveness, and to enhance the ability to serve customers.
These developments mean that trust models must now be extended to external networks and services, which changes the way that security needs to be implemented and managed. Bloor Research has published a new paper that looks at these issues in greater detail and describes how organisations can build a new security paradigm based on a foundation of trust. This paper can be accessed here: Security based on trust not fear.
According to the US government, "the strength and vitality of our economy, infrastructure, public safety and national security have been built on the foundation of cyberspace." The McKinsey Global Institute recently published a study that aimed to quantify the impact of the internet on the world economy. It found that the internet has become a significant and essential factor in national economies and in the global economy itself, allowing established industries to be more productive and creating new jobs. Among advanced economies, it found that the internet accounts for around 6% of GDP and is a critical element in economic growth, accounting for 21% of GDP growth in those advanced countries over the past five years.
The internet is also growing rapidly, both in terms of numbers of people connecting to it, including high levels of growth in emerging economies, and in terms of the numbers and types of devices connecting to it. According to Cisco, there were about five billion devices connected to the internet at the end of 2011 and it predicts that number will rise to around 50 billion by 2020. Contributing to that growth is the proliferation of mobile devices including smartphones and tablets, in-vehicle computers, televisions, cameras, sensors, medical devices and smart machines used for supporting the high growth of machine-to-machine applications in a wide range of industries and consumer environments.
Every one of those devices needs an IP address to connect to the internet, but the current prevalent communications protocol for internet traffic, IPv4, has only 4.3 billion IP addresses available and that stock has been exhausted. As of February 2011, the last IPv4 addresses were handed out to regional internet registries. We have known about this problem for a long time and a successor to IPv4--IPv6--has been available for many years, offering a vastly larger address space. In order for the growth of the internet to continue and the benefits of emerging technologies to be realised, the transition to IPv6 is critical.
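The scale of the difference is straightforward arithmetic: IPv4 addresses are 32 bits wide and IPv6 addresses are 128 bits wide, so the two address spaces can be compared directly:

```python
# IPv4 addresses are 32 bits wide; IPv6 addresses are 128 bits wide.
ipv4_total = 2 ** 32    # 4,294,967,296 (~4.3 billion)
ipv6_total = 2 ** 128   # ~3.4 x 10^38

print(f"IPv4 addresses: {ipv4_total:,}")
print(f"IPv6 addresses: {ipv6_total:,}")

# Even Cisco's projected 50 billion connected devices would occupy a
# vanishing fraction of the IPv6 address space:
print(50_000_000_000 / ipv6_total)
```

In other words, IPv6 offers 2^96 times as many addresses as IPv4, which is why exhaustion is not a practical concern for it.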
People have been fear-mongering about the forthcoming address depletion for some years now, but very little has been done in the way of IPv6 adoption. However, according to Axel Pawlik, managing director of Europe's internet registry, RIPE NCC, "We are really running out now. By the end of this year, Europe will not have any IPv4 addresses easily available. If you don't do IPv6 now, you will lose connectivity to the IPv6 network. There are solutions for limited connectivity, but they are hard and costly. It will not be a big global crisis, but growth will slow." Because IPv6 is not backwards compatible with IPv4, customers with IPv6 devices will not be able to reach IPv4-only services, which could lead to lost sales to customers in high-growth emerging economies in particular.
Things are getting better. Penetration is growing and the future is rosy. But it is still not good enough. According to Pawlik, heads will begin to roll now amongst those who have not even started planning for the IPv6 migration. The situation is better than it was last year, and certainly better than five years ago, but some people will only move when they see the evidence of the address depletion by being unable to secure any further IPv4 addresses.
To encourage adoption, World IPv6 Day was held in June 2011, which saw major content providers enabling IPv6 for their primary domains for 24 hours. Tom Coffeen, IP evangelist at technology vendor Infoblox, states that the day was a great success, with no major operational issues encountered, and that the volume of IPv6 traffic doubled during the event. Coffeen also states that significant progress has been made with IPv6 adoption since then, with the percentage of zones under the .com, .net and .org domains offering IPv6 support increasing by 1,900% in the past year, and with a growing number of enterprises beginning or continuing substantial IPv6 adoption initiatives.
However, after the 2011 IPv6 day, many organisations switched off IPv6 support and growth has not been seen at the levels expected. Because of this, many organisations are continuing their efforts to raise awareness of the need for the switch to IPv6. Among the initiatives was a second IPv6 day, held in June 2012--this time entitled the World IPv6 Launch--where organisations were encouraged to enable IPv6 on their networks and to keep it enabled after the day ended. According to Scott Iekel-Johnson, product manager at Arbor Networks, the amount of native IPv6 traffic grew 20% with the World IPv6 Launch and has remained steady since. According to Iekel-Johnson, "This shows that hopefully many of the newly enabled IPv6 services are here to stay--another important milestone on the road to ubiquitous IPv6 adoption."
Fear-mongering apart, the time to act really has come. If you are not even planning for IPv6 now, you may lose out to your more nimble competitors and miss the ability to cash in on the benefits of emerging technologies that require high-speed internet connections. According to Pawlik, time is really running out now and there are literally only weeks left in Europe.
Email is an essential business communication and collaboration tool, and the vast majority of business information is, at some point in its lifecycle, communicated via emails and their attachments. Even though the use of other forms of communication is increasing, email usage continues to outpace them all.
The fact that hackers target emails is not news, but their motivations have changed. They are no longer content with just causing disruption through the damage caused by malware-riddled messages, but now look to steal the sensitive information that many emails contain.
It is therefore essential that email systems are adequately secured. But it is not sufficient to think of email security merely in terms of protection against malware threats. Rather, email security needs to be considered in a wider context that includes protection against outbound threats to prevent data leaking out of an organisation.
Other considerations for email security include mailbox management, continuity, archiving and discovery, and compliance reporting. Continuity is extremely important so that downtime neither impacts productivity nor causes records to be lost. Archiving and discovery are increasing in importance and are ideal candidates for a cloud-delivery model, allowing organisations to securely store and quickly find email records when they are needed. A compliance capability, including blueprints for the main industry standards and government regulations, is also vital for meeting the demands of such mandates that all business records, including those produced electronically, be processed, transmitted and stored in a highly secure manner.
A new report discussing what is required of a holistic email security system and comparing the capabilities of some of the major players in the market is available for download via the following link: http://www.mimecast.com/bloorsecurity.
Buzz phrases of the day include consumerisation of IT and BYOD--bring your own device. The former phrase refers to the use of increasingly powerful and feature-rich devices, be they PCs, smartphones or tablet computers, by consumers. The meteoric rise of the tablet computer embodies this trend. According to comScore, the use of tablets in the US alone took just two years to reach 40 million--compared to seven years for smartphones to reach the same level of adoption. And those end users increasingly want to use their own devices to access both work and leisure applications--the second trend, BYOD--as they are often seen as superior to those issued to them by the organisation.
As a result of trends such as these, the number of devices connecting to corporate networks is expanding rapidly and those devices must be managed to ensure that the organisation is not exposed to security vulnerabilities through their use.
Traditionally, securing endpoints has meant installing software on every device to be protected, which works by scanning programs for signatures, developed by anti-virus vendors, that indicate a program is malicious. However, this method is no longer sufficient. The number of viruses and other malware has grown dramatically, with an average of 73,000 malware samples seen daily in 2011, many of which are variants of known viruses developed to avoid detection. The amount of malware that is aggressively polymorphic--designed to modify itself on each infection--is also growing, a further problem for traditional anti-malware technologies. A system based on signatures alone provides no defence against threats that vary from those seen before.
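Why exact signature matching fails against variants can be shown with a deliberately simplified sketch (the byte patterns here are invented for illustration; real engines match far more sophisticated patterns, but the limitation is the same in kind):

```python
# Toy signature database: simple byte patterns flagged as malicious.
SIGNATURES = [b"\xde\xad\xbe\xef", b"EVIL_PAYLOAD"]

def scan(program: bytes) -> bool:
    """Return True if any known signature appears in the program."""
    return any(sig in program for sig in SIGNATURES)

known_sample = b"header" + b"EVIL_PAYLOAD" + b"footer"
# A polymorphic variant rewrites its payload bytes on each infection,
# so the stored signature no longer matches:
variant = b"header" + b"EV1L_P4YL0AD" + b"footer"

print(scan(known_sample))  # True: matches the stored signature
print(scan(variant))       # False: same behaviour, new bytes, undetected
```

The variant behaves identically to the original yet sails past the scanner, which is exactly the gap that polymorphic malware exploits.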
A further problem is that anti-malware programs are large and tend to get bigger as more signatures are added to their defences. It is well known that they tend to be a drain on computer resources, significantly slowing down computer performance, especially at startup and during scans. Even on corporate-owned devices, many users try to circumvent such controls and many would find it totally unacceptable for an organisation to demand that they deploy such controls on devices that they have purchased themselves.
Clearly a new approach is needed--one that provides better protection by guarding against new threats as well as those for which countermeasures have already been made available--and one that does not hinder the user. This can be achieved by subscribing to endpoint security services based in the cloud, whereby only a small agent is placed on each device and protection is applied in the cloud, before exploits can ever reach the device.
Such services are new and there are a number of elements that must be considered, including the types of controls that are provided over and above signatures, the availability of cloud-based threat intelligence networks for identifying new threats, privacy and data protection controls, protection for devices when not connected to the network, and remediation capabilities should any threat still be able to break through the barriers. Bloor Research will be participating in a webinar at 10am GMT on Wednesday 29th February 2012 that will outline what organisations should look for when choosing such an endpoint security system and the benefits that they can expect. For more information and to register for this webinar, click on the following link: The changing face of endpoint security.
Cloud-based computing is growing faster than the IT sector as a whole. There are plenty of analysts throwing numbers about regarding cloud spending. Here are some from Forrester Research: in 2011, US$40.7 billion was spent on public, private and virtual private cloud IT services and that will expand to US$241 billion by 2020. Of that spend, US$21.2 billion was spent on software as a service, which will expand to US$92.8 billion by 2016--26% of all sales of packaged software applications.
One area that is showing particular growth is the market for email management and archiving. It is estimated that around 60% of business-critical data is transmitted via email, either in the body of the text or as attachments, and that information forms the basis of vital business records. All organisations are subject to regulations of some form or another and many of those regulations demand that business records be maintained. Those regulations vary widely from those applicable to specific industries, such as financial services or pharmaceuticals, to those affecting any organisation, such as employment and data protection regulations.
Being able to retrieve those business records when needed is not only vital for regulatory compliance, but also aids greatly in worker productivity and in responding to internal or external investigations. According to Osterman Research, 66% of IT organisations that it surveyed referred to email or instant message archives or backup tapes to support their organisation's case in litigation in 2010, and 63% were ordered by a court or regulatory body to produce email or instant message records.
However, whilst the need to maintain business records is clear, technology vendor Proofpoint found in a recent survey that just 54% of large organisations in the US had deployed a technology solution for email archiving in 2010, and another survey, by GFI Software, found that the proportion fell to just over one-third among small and medium-sized companies.
Early email archiving technology tended to focus on the needs of specific types of companies, with financial services particularly well served. And, as email volumes continued to grow, many felt that centrally archiving all emails was too complex a challenge. For large organisations in particular, scalability was considered to be an issue, and many others brushed the issue under the carpet. As a result, many organisations continue to rely on users storing emails on their own hard drives or using the email system itself as a storage repository. Neither is a good option, as email records can be hard to find or even lost forever--especially if stored on a piece of equipment that is lost or stolen, a common problem with laptops and other portable media. A further issue is that many employees regularly use their smartphones for sending and receiving emails, and those emails need to be captured for future use as well.
However, there is an alternative available that is suitable for any organisation, no matter its size or the regulatory burden that it faces--subscribing to cloud-based email management and archiving services. Such services take the cost and complexity out of managing email storage and provide ancillary services as well, such as business continuity and security. They are also highly scalable and suited to the demands of the mobile workforce.
According to Orlando Scott-Cowley, product marketing manager at cloud archiving vendor Mimecast, "Email archiving is going through the phases of its lifecycle. On-premise solutions are no longer scalable, have become too complex and don't really solve the email retention or litigation readiness problems that organisations have. Companies, whether regulated or not, are now turning to the cloud for their email archiving needs. Those that chose to deploy on-premise archives all those years ago are now finding they have the added complexity of migrating those solutions to more flexible and scalable cloud offerings. Setting their data free has become a bit of a nightmare, but their current on-premise vendors do not appear to be keen to wake them up from their bad dream."
Jon Pilkington, VP marketing and product management at cloud archiving vendor Sonian agrees, stating "Cloud-powered archiving provides a cost-effective, highly scalable solution for SMEs and enterprises alike. We view the cloud as a transformation service that is challenging the capital-intensive, on-premise models in use today, making email archiving accessible to companies of all sizes and verticals."
Email management and archiving are considered by many organisations to be among the most suitable applications for cloud-based services, as they are relatively uncomplicated and uniform. In December 2010, the US government unveiled its "Cloud first" policy, under which federal agencies must consider the option of using cloud-based services when planning new IT projects. In April 2011, the White House CIO stated that 15 agencies had announced that they intended to move their email management and archiving applications into the cloud. Two agencies--the General Services Administration and the Department of Agriculture--claim to have saved some US$40 million by abandoning in-house email. Building on this, the US government announced in November 2011 that all federal agencies have until May 2012 to report on how they intend to improve the way that they store and manage electronic records, including emails, blog posts and social media activity. The White House, in conjunction with the National Archives and Records Administration, is currently drafting a new records management directive. Using cloud-based services is considered by many to be the best option.
Other governments are following this lead. The UK government has stated that cloud computing should account for half of its IT spend by 2015 and it is hoped that this will reduce its annual IT expenditure of £16 billion by £3.2 billion.
Organisations that follow suit and embrace the cloud for email management and archiving will find that there are many benefits from doing so, not least of which is the peace of mind that business records will be securely preserved and can be easily retrieved as and when necessary. Emails are among the most requested documents as evidence in lawsuits and the courts no longer accept the argument of technical difficulty when dealing with legal issues surrounding email management and archiving. With cloud-based services, the burden and cost are taken out of the hands of the organisation and placed in those of specialists. For a competitive overview of some of the main players, click to download this document: Email archiving best practices.
According to recent research, Microsoft Exchange accounts for 65% of email servers in use in organisations today. Many of these are deployments of Exchange 2003, for which Microsoft no longer offers support, and Exchange 2007. According to the Radicati Group, Exchange 2007 accounts for 44% of all enterprise on-premise deployments. However, with the release of Exchange 2010 and other related products by Microsoft, earlier versions of Exchange are losing market share. Radicati predicts that Exchange 2010 will account for 57% of total Exchange deployments by 2014.
Exchange 2010 offers many improvements over previous versions, such as more flexible, lower-cost deployment, easier access for mobile clients, and the introduction of email archiving, retention and discovery capabilities. These are just some of the reasons why many organisations are looking to upgrade to 2010--but such migrations are not without risks in areas such as data loss and downtime, which affects productivity.
Email archiving specialist Mimecast is holding a seminar on 3rd November in London to explore the issues organisations face in migrating to the latest Exchange version, as well as Microsoft's new Office 365 software-plus-services productivity suite, which expands the options for having services hosted, taking much of the task of administration and management out of the hands of internal IT resources. Nick Caw from Microsoft will be on hand to explore the benefits of these new products and services further.
As a specialist in email management services, Mimecast is offering services to those organisations looking for a pain-free migration to these new services. It will also introduce its add-on services for Microsoft products, which offer a more complete and robust email archiving capability for organisations that need something more advanced, including always-on availability, even in the event of a server outage. Bloor Research will also contribute, looking at the need for email archiving as well as considerations in selecting a vendor, including a look at the capabilities of some of the major players on the market.
The following link will take you to the registration page for the seminar: The great email migration.
Given the amount of business information that is contained in the multitude of emails received and sent every day and the need to preserve business records and correspondence for regulatory compliance and governance purposes, as well as for dealing with litigation requests, the use of email archiving technologies and services is growing fast.
Email archiving is not a standalone capability, but should rather be considered as part of a wider email management solution that provides complementary capabilities in mailbox management, policy enforcement, security, continuity and e-discovery. To ensure that any technology or service selected can cover all of these bases, the capabilities should be tightly integrated as part of one unified system that is built on a common architecture and that provides a single management interface. This will allow for centralised policy enforcement, and will provide management reports that give an audit trail for governance and compliance purposes.
Bloor Research has today published a report that discusses what organisations should look for when evaluating products or services and looks at the delivery options available in terms of hosted services or on-premise deployments, or a hybrid mix of the two. It then provides details of some of the offerings of the major players in this market sector, comparing the strengths and weaknesses of each. The report can be accessed here: Best practices in email archiving.
When the BSIMM project started I was excited but apprehensive - the challenge to produce such a maturity model was enormous. With BSIMM3 we not only see the fruits of a huge amount of detailed work, but the team behind it have proven that they can bring lots of disparate firms with different ideas together under the BSIMM banner. The scientific foundation of BSIMM is its strength - the rigour behind the work is a joy to behold.
Check out the details here: http://bsimm.com
Practice Leader - Security
To reduce complexity, a high proportion of organisations are looking at modernising their data centre infrastructure through consolidation, virtualisation and by leveraging the cloud. In traditional data centres, security controls can be applied to each physical system and systems with different levels of criticality or those that contain the most sensitive data can be physically separated. This is no longer the case for next-generation data centres where virtual resources cannot be compartmentalised in the same way and security controls can no longer be tied to physical resources.
While the chief goals of data centre modernisation projects are to enable the business to accommodate rapidly changing needs while reducing operational complexity and cost, risk and compliance obligations must also be prioritised.
The modern data centre requires an integrated set of security controls that are applied consistently across physical and virtual systems, as well as those residing in the cloud, with federated management and reporting across hybrid environments that may include extensions to private and public clouds. The only way this can be achieved is by building security in at the design phase, during key inflection points as data centres are built out, virtualised or upgraded, and applying it consistently across all systems in a hybrid environment that spans physical and virtual systems, as well as cloud-based computing. This will enable the business by improving its ability to offer dynamic services that are always available, resilient and secure, which will improve its capability to manage risk, apply and enforce consistent security policies, and achieve compliance objectives.
A recent paper discusses these issues in greater detail and sets out the key security controls that organisations should be looking at. The paper can be accessed here: Architecting the security of the next-generation data centre.
IPv6 Day came and went without much fanfare. That is because, according to participants, it worked. True, there were a few problems encountered, but no more than expected and that was one of the main points of the exercise anyway. According to Cisco, the event proved that careful and gradual adoption will be easier than believed. And Arbor Networks reported that the test was enough to tell us that we can handle the transition to IPv6.
So what happens next? One of the benefits seen from the day is that it has persuaded hardware and software vendors to add support for IPv6 into their products, which has been one of the biggest sticking points to date. There are still further challenges to be overcome, including details of running dual stack IPv4 with IPv6 and new security challenges that are unique to IPv6. But now is the time for all organisations to at least be planning for their own transition.
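One practical wrinkle of running dual stack, mentioned above, is that IPv4 clients often reach an IPv6 socket as IPv4-mapped addresses of the form ::ffff:a.b.c.d. A minimal sketch, using Python's standard ipaddress module, shows how such an address can be recognised and unmapped for logging or access-control purposes:

```python
import ipaddress

# On a dual-stack host, an IPv4 client connecting to an IPv6 socket
# typically appears as an IPv4-mapped IPv6 address (::ffff:a.b.c.d).
addr = ipaddress.ip_address("::ffff:192.0.2.10")

print(addr.version)      # 6: the connection arrives via the IPv6 stack
print(addr.ipv4_mapped)  # 192.0.2.10: the underlying IPv4 client address
```

Access-control rules written only against dotted-quad IPv4 addresses will silently miss such clients unless the mapped form is handled, which is exactly the kind of implementation detail that makes transition planning worthwhile.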
IPv6 will allow continued growth of the internet, which has become essential for commerce, communication and social interaction. According to Verisign, internal drivers for adoption include the desire for organisations to be as technologically current and future-proofed as possible, whilst external drivers include the need to keep up with the increasing number of devices requiring IP addresses, ranging from mobile and streaming technologies to smart meters, cars, TVs, game consoles and medical devices, plus a surge in new users from emerging markets who all need IP addresses.
Another push for IPv6 comes from governments worldwide, which are increasingly looking to promote its take-up. In Europe, national governments are undertaking their own initiatives, and efforts are also being made at an EU level. The US government is going even further, as it believes that IPv6 technologies will allow it to pursue policy goals in areas such as healthcare, education and energy. In September 2010, the federal government mandated that all agencies must upgrade external-facing systems to IPv6 by end-2012 and internal applications that communicate with the internet by 2014.
The transition to IPv6 will not happen overnight, but there is finally a great deal happening to spur adoption. Workarounds have been put in place to extend the life of IPv4, but these are just that--temporary workarounds, not a long-term solution. According to Alan Way of Spirent: "The organisation that sticks doggedly to its old IPv4 inheritance won't be cut off from the outside world, it will simply suffer increasingly degraded performance as more and more communications move to IPv6. For financial services and such high speed transactions this would be disastrous. For other businesses, it could still erode their competitive edge."
Taking back control in today's complex threat landscape
Today's security threats are complex and sophisticated and are getting ever harder to defend against. Attackers use multiple methods and vectors to try to bury deep into networks and are increasingly looking for longer term gain, rather than just a one-off theft. Traditional security controls that focus on previously seen attacks are no match for these complex, blended exploits.
Organisations deploy multiple security controls to defend their networks and these still have their place. However, newer technologies have emerged that can improve the chances of defending against the insidious threats seen today--those of application control and change control.
Application control uses whitelisting to ensure that only authorised applications are allowed to run and to prevent those with a malicious payload from executing: if an application is not on the whitelist, it can be automatically blocked. Change control technologies prevent exploitable vulnerabilities from being introduced into networks by controlling the configuration creep that occurs when changes are made, whether intentionally through patching or upgrades, or where misconfigurations have been introduced by mistake. Such controls can do much to ensure that the integrity of the network is kept as intact as possible.
Bloor Research has recently published a report that looks at the role played by these technologies in greater detail. The report can be accessed here upon registration: https://www.bloorresearch.com/research/white-paper/2099/taking-back-control-todays-complex-threat-landscape.html There will also be a webinar on this subject tomorrow, 10th August 2011, at 10am BST. The registration page for this event is here: http://www.brighttalk.com/webcast/288/32519
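The whitelisting approach described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation: an executable is allowed to run only if its SHA-256 digest appears in an approved list; real products add code signing, centralised policy and kernel-level enforcement.

```python
import hashlib
from pathlib import Path

# Hypothetical application whitelist: digests of authorised binaries,
# as would be published by the security team. The entry below is the
# SHA-256 digest of an empty file, used purely as a placeholder.
APPROVED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_allowed(executable: Path) -> bool:
    """Allow execution only if the binary's digest is on the whitelist."""
    digest = hashlib.sha256(executable.read_bytes()).hexdigest()
    return digest in APPROVED_HASHES
```

Because any change to the binary changes its digest, a trojaned or tampered copy of an approved application fails the check automatically, which is what makes whitelisting effective against previously unseen malware.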
The internet protocol (IP) is the primary communications protocol for determining how data packets are routed around the internet and is responsible for the addressing system that ensures traffic is routed to the intended destination. The current version is IPv4 and has worked well for years, running in the background without anyone really worrying about it.
But IPv4 was developed when the internet was a smaller place. Ten years ago, there were slightly over 360 million internet users worldwide; by mid-2010, that had grown to around two billion. However, those numbers do not tell the whole story. Many people use more than one device to connect to the internet, often a mobile device in addition to a PC. As well as this, all manner of devices are becoming internet-enabled--from home appliances to medical equipment, networked cameras to intelligent transport systems, online gaming consoles to cars. It is estimated that there are currently five billion devices connected to the internet and that by 2020 that number will grow to some 50 billion. Each device needs an IP address to identify it on the network and there are simply not enough addresses available with IPv4.
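The back-of-the-envelope arithmetic behind the exhaustion argument is simple: IPv4 uses 32-bit addresses and IPv6 128-bit ones, and the device counts below are the estimates quoted above rather than precise figures.

```python
# Address-space arithmetic: 32-bit IPv4 versus 128-bit IPv6,
# measured against the forecast of 50 billion connected devices.
ipv4_total = 2 ** 32
ipv6_total = 2 ** 128
devices_2020 = 50 * 10 ** 9  # ~50 billion devices estimated by 2020

print(ipv4_total)                  # 4294967296: under 4.3 billion addresses
print(devices_2020 > ipv4_total)   # True: more devices than IPv4 addresses
print(ipv6_total // devices_2020)  # addresses available per device: enormous
```

Even before subtracting reserved and private ranges, IPv4 cannot give every forecast device a unique address, while IPv6 leaves many orders of magnitude of headroom.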
Because of this, IPv6 was developed some years ago, offering a vastly expanded pool of available IP addresses. The transition to IPv6 is not optional as the internet and the number of devices connected to it continue to expand. There are many reasons for switching over to IPv6 beyond the fact that the number of available IP addresses is at exhaustion point: it offers security improvements over IPv4, such as mandated support for IPsec encryption and authentication; auto-configuration for new devices connected to the network; superior connections for mobile devices; and improved peer-to-peer collaboration capabilities. However, it also introduces new security issues that will need to be addressed, including an increased risk of distributed denial of service and buffer overflow attacks.
According to network equipment and services vendors, those security risks can be mitigated. Of more concern are security issues that are not inherent in IPv6 per se, but rather concern the way that it is used and implemented. Misconfigurations are considered to be among the most important security issues since IPv6 is new, is considered to be complex, and there is a lack of implementation and policy guidance, training and available tools.
In an effort to test drive IPv6 implementations, 8th June 2011 has been designated as IPv6 Day by the Internet Society. A wide variety of organisations will participate in IPv6 Day, ranging from web content providers such as Facebook, Yahoo and Google, to service providers and telcos. The purpose of the day is to gather information about how IPv6 functions in a production environment, with a view to accelerating the momentum of its deployment worldwide, and to iron out problems that are already known about, such as IPv6 brokenness, which is primarily related to misconfigured network equipment and faulty firewall settings.
IPv6 Day is not a flag day for worldwide implementation of IPv6, which will probably take a number of years. However, it is an important milestone in terms of uncovering the issues that will be involved in its deployment so that any problems can be solved. The results of IPv6 Day will be reported on in further articles on this blog.
But why is code security so important?
The use of complex software is now part of daily business life. Unfortunately cyber criminals are taking advantage of this to spread malware and to attack systems with the aim of stealing information, money and intellectual property.
Information security specialists have been relatively successful in protecting networks and data systems from these cyber criminals but, to date, computer software has been an Achilles heel, open to attacks that take advantage of bugs and errors in computer code. Once a security bug is found it can be abused by cyber criminals whilst a business, in many cases, remains blissfully unaware that it is under attack.
Computer software must therefore be checked for security related bugs--a process that has historically been very manually intensive and expensive, with limited scalability and needing access to the underlying source code.
It's a software developer's job to write application code that satisfies customer requirements and meets business objectives. This code needs to be functional, usable, reliable and with acceptable performance and supportability. As the modern world relies on software to function, teams of developers must do their best to churn out millions of lines of code under huge pressure to satisfy customer demand.
With looming deadlines and the need to do yet more work, developers in the past had little time to ensure their code was free from bugs or errors that opened security holes in the application. Fortunately, as many applications ran within a client-server network, relatively isolated from the outside world, this approach was normally successful.
Then along came the Internet, the World Wide Web and the subsequent massive growth in handheld devices that exposed what would normally be closed applications to millions of anonymous users. Combine this with the recent rise of organised cyber criminals continuously looking for new ways of committing crime, and the computer security ground rules have been rewritten forever.
Against this background we have seen a huge move towards componentised code, and the reuse of code libraries and functions developed in house, purchased or borrowed from other developers. As customers have looked to slim down their costs, the use of commercial and open source software has grown. Outsourced software development has seen projects sent to the other side of the world to be written by developers the customer has never met, in a country it may never have visited. So not only do developers need to worry about security defects in the code they write, but also in the code they reuse.
This perfect storm raises huge concerns in the minds of information security professionals who are trying to get a grip on the scale and diversity of software entering their organisations.
On the other hand we need to consider the developers. The sheer volume of potential security flaws and new and emerging threats can be overwhelming to a developer under pressure to roll out yet another new feature.
Software development managers and information security professionals need to act now to address the security of the software they write, purchase or co-opt into their solutions.
I recommend this event for both security professionals and developers alike.
Practice Leader - Security
The video demonstrates just how straightforward and achievable GSM cell phone/mobile phone interception can be, given enough time and some smart folks.
Hopefully people will now believe me when I say that voice data protection needs to be seriously considered!
Practice Leader - Security
We found jihadists were compiling packages of information designed to be received on smartphones. They contained copies of videos, songs, speeches and images that followers are encouraged to pass on. Some were using Bluetooth short-range radio technology to anonymously spread information to potential supporters, and there are further implications for mobile phone security following the commoditisation of tools and techniques.
Practice Leader - Security
This year's Counter Terrorism Conference (London, 19th - 20th April 2011) looks set to be one of the largest CT events ever.
We need to keep up with new threats and challenges, and I have been asked to speak at the conference on cell/mobile phone security. My session is called "Cell Phone Hacking - The Terrorist's Latest Playground" and is scheduled for 1100hrs - 1120hrs on Wednesday 20th April. It will be based on research I have been conducting into the jihadist use of mobile phones to spread propaganda against a background of commoditised hacking against the GSM mobile phone network.
In addition to speaking at the conference I will be spending time at the Morrigan Partners stand (P44) discussing the issues that GSM hacking is presenting to businesses and organisations. I will be at the stand from 1430hrs - 1530hrs on Tuesday 19th April and 1200hrs - 1300hrs and 1430hrs - 1530hrs on Wednesday 20th April.
If you are interested in the problem of cell/mobile phone hacking come along and have a chat at these times. I'd be happy to speak about more research I am doing and ways in which you can protect your data, systems and users from such attacks by terrorists and criminals alike.
Practice Leader - Security