Secure web use for all, without walls

Few companies would lay claim to being able to protect all end devices. So in a deperimeterised environment a holistic approach to web access is required, says Paul Simmonds


Browsing the web is a risky pastime. E-mails try to lure people to dodgy sites, and URLs are deliberately misspelt to snare the unsuspecting. Some end-users will inevitably stray to sites that are clearly inappropriate or have been hacked and crammed with malicious code.

IT managers need to provide end-users with a web browsing experience that protects them from inadvertently straying to inappropriate sites and also ensures that, wherever they browse, the data downloaded via their web browser is free from malicious content.

Although organisations work hard to maintain an environment where access to computing devices is governed and controlled by inherently secure protocols, the problem remains of how to control access to inherently untrusted environments such as the web.

In a deperimeterised environment, an untrusted website needs to have its trust level raised so that end-users are presented with browsing that is 100% free of malicious content, closing off this route to PC compromise.

IT departments face three problems with end-users accessing the web. First, they must ensure that the sites end-users browse are in line with the stated (corporate, country, personal or home) policy on web browsing. Second, they have to ensure that what a web server delivers back to the end-user is free from malicious content. The third challenge is to protect all end devices, no matter where or how they are connected.

Most organisations admit to doing the first reasonably well, the second poorly, and the third rarely or never.

Existing approaches to controlling web access involve installing filtering products in a demilitarised zone (DMZ). But such products generally protect only those users inside the corporate intranet. The same level of filtering is rarely available for SME or home users. And where a corporate policy exists for remote end-users, it generally involves either leaving mobile staff unprotected or insisting that all web access requires end-users to run an authenticated VPN tunnel back into the corporate environment.

There are two separate problems to be solved here. First, an IT architecture is needed that allows operation in a deperimeterised environment. Second, user organisations need a way to provision a distributed filtering service.

To ensure users have 100% protection at all times, all web traffic, regardless of where the end device is connected, must be filtered. And crucially, once filtered, all protocols between the filter and the end device must be inherently secure.

Standard URL filtering can use a database look-up of known sites to support corporate-wide blocking of categories such as hate, criminal activities and racism. A wildcard capability to blacklist and whitelist similar domains, such as http://*.my-company.com, is also useful.
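
As an illustration, a minimal wildcard look-up might work as sketched below in Python; the category database and the list names here are hypothetical, not taken from any particular product.

    from fnmatch import fnmatch
    from urllib.parse import urlparse

    CATEGORY_DB = {                          # hypothetical known-site look-up
        "some-hate-site.example": "hate",
        "some-warez-site.example": "criminal",
    }
    BLOCKED_CATEGORIES = {"hate", "criminal", "racism"}
    WHITELIST = ["*.my-company.com"]         # wildcard patterns, as above
    BLACKLIST = ["*.bad-domain.example"]

    def is_allowed(url: str) -> bool:
        host = urlparse(url).hostname or ""
        if any(fnmatch(host, p) for p in WHITELIST):
            return True                      # whitelist wins outright
        if any(fnmatch(host, p) for p in BLACKLIST):
            return False
        return CATEGORY_DB.get(host) not in BLOCKED_CATEGORIES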

Filtering should provide intelligent handling and differentiation of port 80 tunnelling traffic, such as instant messaging, Limewire, Skype/VoIP, video and audio streaming. It is also important to be able to identify proxy sites that deliberately mask their URLs as well as those that accidentally mask or bypass URLs, such as the Google cache.
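
One crude way to differentiate port 80 traffic, sketched here as an illustration only, is to check whether a flow actually begins with an HTTP request line; production filters inspect far more than the first few bytes.

    HTTP_METHODS = (b"GET ", b"POST ", b"HEAD ", b"PUT ", b"DELETE ",
                    b"OPTIONS ", b"CONNECT ", b"TRACE ")

    def classify_port80(first_bytes: bytes) -> str:
        """Rough first-pass triage of a flow seen on port 80."""
        if first_bytes.startswith(HTTP_METHODS):
            return "http"                    # looks like a genuine web request
        if first_bytes[:2] == b"\x16\x03":
            return "tls"                     # TLS handshake: likely tunnelling
        return "unknown"                     # candidate for deeper inspection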

If end-users access the internet via an external service provider, then access points, or POPs, should be globally available with 24x7 support, ensuring that mobile deperimeterised workers have only a short hop to the nearest POP and from there directly to the internet. It is important that global POPs provide global load-balancing and resilience. This external service should offer a secure interface between the local web browser and the service, minimising upgrades or changes on the end device, since these become the responsibility of the service provider.

There is an issue when a remote worker needs to make an initial local connection, typically using a local hotspot or hotel. This issue is compounded by the plethora of web redirection methods for authentication and payment. There is a need for a standard method or protocol for web charging to provide a secure interface that can separately handle the authentication and payment necessary to grant access to the internet via a local connection.

Reporting facilities should be able to present the extent of internet use, not just by individuals, but also aggregated to departments and locations, with suitable safeguards on individual privacy. For instance, there should be facilities to anonymise usage so that service providers and IT managers can see usage (and abuse), but are unable to identify the individuals.
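
A minimal sketch of such anonymisation, assuming a keyed hash whose secret is held by a separate custodian, might look like this (all names are illustrative):

    import hmac, hashlib
    from collections import Counter

    ANON_KEY = b"held-by-a-separate-custodian"    # placeholder secret

    def pseudonym(user_id: str) -> str:
        # Keyed hash: stable per user, but not reversible without the key
        return hmac.new(ANON_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:12]

    def usage_report(entries):
        """entries: iterable of (user_id, department, category) tuples."""
        by_dept, by_user = Counter(), Counter()
        for user, dept, category in entries:
            by_dept[(dept, category)] += 1
            by_user[(pseudonym(user), category)] += 1   # no raw identity kept
        return by_dept, by_user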

The reporting should also offer configurable data retention policies to meet business, industry, regulatory and country-specific requirements, and provide real-time reporting on filtered conditions.

Most of these filtering features are partially available in products today. But in a deperimeterised environment, internet filtering and reporting will need to be distributed: multiple, replicated filtering and reporting instances, with each user connecting to the nearest local service or DMZ. Common filtering rules are applied and reporting is consolidated into a central report interface, irrespective of the actual hardware or the way the internet is accessed.
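
The shape of that model can be sketched as follows, assuming each node applies a shared rule set and ships its log to one central report interface; the structures are illustrative only.

    from collections import Counter

    COMMON_RULES = {"hate", "criminal", "racism"}    # shared blocked categories

    class FilterNode:
        def __init__(self, location: str):
            self.location = location
            self.log = []                    # local log, consolidated upstream

        def handle(self, user: str, category: str) -> bool:
            allowed = category not in COMMON_RULES
            self.log.append({"node": self.location, "user": user,
                             "category": category, "allowed": allowed})
            return allowed

    def consolidate(nodes):
        """Central report: one view, whichever node the user came through."""
        totals = Counter()
        for node in nodes:
            for entry in node.log:
                totals[(entry["category"], entry["allowed"])] += 1
        return totals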

Challenges for the industry remain, such as defining a web page redirection standard that allows a minimal subset of protocol exchange for granting website access by password, token, certificate, pre-authentication or payment card. The industry also needs to agree a standard mechanism for secure proxy connectivity with credentials being passed.

Accelerating the use of inherently secure protocols for proxy connections, with the ability to use those protocols inside or outside the corporation, will let corporates provide a simpler, yet more secure and holistic approach to web access.

Web security basics

  • Bar sites known to contain malicious code
  • Scan files and other non-HTML code for viruses and malicious code
  • Block password-protected files that prevent their contents being scanned (see the sketch after this list)
  • Use heuristic detection tools to ensure all malicious code is detected.
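
As an illustration of the third point, a password-protected zip cannot be content-scanned, so a filter can simply reject it. The sketch below uses only the Python standard library; the encrypted-member test reads bit 0 of each entry's general-purpose flags.

    import io, zipfile

    def zip_is_scannable(data: bytes) -> bool:
        try:
            with zipfile.ZipFile(io.BytesIO(data)) as zf:
                # Bit 0 of flag_bits marks an encrypted (password-protected) member
                return not any(info.flag_bits & 0x1 for info in zf.infolist())
        except zipfile.BadZipFile:
            return False                     # not a valid zip: treat as unscannable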

Paul Simmonds is chief security officer at ICI and a contributor to the Jericho Forum

www.jerichoforum.org

Comment on this article: computer.weekly@rbi.co.uk
