
Security Think Tank: Managing data securely throughout its lifecycle

Managing data in a secure manner is key to ensuring its integrity and therefore its value to the organisation, as well as reducing risk from breaches and misinformation

Although, on the face of it, managing data throughout its lifecycle can be a mammoth job, breaking it down into different elements, as outlined below, makes it less daunting.

Creation or collection

This may involve manual data entry, such as data relating to a new joiner in the organisation; acquiring external data produced outside the organisation; or data generated by devices, such as customer spending habits captured by store loyalty cards.

Storing this data requires having the correct authorisation – consent for personally identifiable information (PII) and permission to store, process and transmit for organisational data.

Identification or classification

Organisations should aim to classify information, potentially with the help of technology such as artificial intelligence (AI) tools. However, it’s important to note that the labels themselves (confidential, private, top secret, for example) do not protect the data – people need a clear understanding of how to handle data based on its ranking.

Classification and management of unstructured data, including Excel files, Word documents and system exports, is always challenging, but classifying it and storing it in the correct place provides peace of mind and a basis for more complex handling rules.
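As a sketch of how such classification might be automated, the following assumes a couple of illustrative labels and regular-expression rules; real tooling uses far richer detection, and the label names and patterns here are assumptions, not any particular product's behaviour:

```python
import re

# Hypothetical classification rules: a label is assigned when any of its
# patterns match the document text. Labels and patterns are illustrative.
RULES = {
    "confidential": [
        re.compile(r"\b\d{16}\b"),                   # possible payment card number
        re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email address (PII)
    ],
    "internal": [
        re.compile(r"\binternal use only\b", re.IGNORECASE),
    ],
}

def classify(text: str) -> str:
    """Return the most restrictive matching label, else 'public'."""
    for label in ("confidential", "internal"):  # checked most restrictive first
        if any(p.search(text) for p in RULES[label]):
            return label
    return "public"
```

The label alone protects nothing, as noted above; its value is that downstream handling rules can key off it consistently.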

Organisations should also maintain a catalogue of all information, clearly defining the handling requirements at each classification level.

This is particularly relevant where there are sensitive information requirements, such as the handling of personal information under the General Data Protection Regulation (GDPR).

Ownership

Each classification or type of data requires a clear owner, whether that is the head of a department or a team, or the individual who submits it. In some cases, this will be determined by legislation (GDPR defines data owners versus processors, for example) and regulation (such as PCI DSS, which establishes who is accountable). Owners ensure data is handled in line with the organisation’s data governance principles, which enable efficient access to, and appropriate use of, the data held while adhering to local laws and regulations.

All individuals coming into contact with the data must be trained so they know where it should be stored, how it should be processed and the correct way of transmitting it.

If possible, all data should have an original data source identified, ideally within an application for easy cataloguing and to capture any changes. This provides a clear master version if copies of the data are made and then subsequently manipulated or changed for business purposes.
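One way to picture such a catalogue is as a simple registry keyed by dataset name, where every entry must carry an owner and a master source before it is accepted. All field, function and dataset names below are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CatalogueEntry:
    """One record in a hypothetical data catalogue; fields are illustrative."""
    name: str
    classification: str   # e.g. "confidential", "internal", "public"
    owner: str            # accountable individual or team
    master_source: str    # system of record, so copies can be traced back
    retention_years: int  # driven by legislation or the audit cycle
    created: date = field(default_factory=date.today)

catalogue: dict[str, CatalogueEntry] = {}

def register(entry: CatalogueEntry) -> None:
    # Refuse entries without a clear owner or master source, enforcing the
    # governance principles described above at the point of cataloguing.
    if not entry.owner or not entry.master_source:
        raise ValueError("every dataset needs an owner and a master source")
    catalogue[entry.name] = entry
```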

Handling

How individuals handle data is one of the most critical steps to keeping it secure. Where it must be stored (on laptops, shared cloud-based storage drives, and so on) needs to be clearly defined, along with how it should be transmitted (never by email, always encrypted, only redacted versions, for example), and the way in which it must be processed (never exported from the master system, only with approval, only using agreed processes, and so on).

Without clear guidelines, a mess of duplicate data can quickly accrue, with no one sure of the original and the values to use. From there, it’s an easy step to people sharing material they shouldn’t, which opens the organisation up to data breaches.

As well as the human element, encryption, as the key control for data storage and transmission, plays a major role in data handling. It comes with its own challenges, but also interesting developments. For example, to be used, data typically has to be decrypted, leaving it vulnerable until it is re-encrypted. Homomorphic encryption, however, by allowing interaction with encrypted data, could remove this point of risk.
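The idea can be illustrated with textbook RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts, so a computation happens without ever decrypting the inputs. This toy (tiny fixed primes, no padding) is wholly insecure and demonstrates only the principle:

```python
# Textbook RSA with small fixed primes -- insecure, for illustration only.
p, q = 61, 53
n = p * q                  # modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (modular inverse, Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
# Multiply the *ciphertexts*; neither a nor b is ever decrypted.
product_cipher = (encrypt(a) * encrypt(b)) % n
assert decrypt(product_cipher) == a * b   # recovers 42
```

Practical homomorphic schemes (such as Paillier or modern fully homomorphic encryption) support richer operations, but the risk-removal argument is the same: the data stays encrypted while it is worked on.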

Protection

There are no hard and fast rules when it comes to how to protect data. To prevent disclosure (accidental or deliberate), each organisation must perform a risk assessment for each type of data that it holds to understand the risks and the potential ways in which it could be copied, exported, or saved without approval. Armed with this knowledge, appropriate action can be taken.

Once the risks and impact of unauthorised access have been established, tools can be used both to protect data and to prevent unwanted uses of it. Tools can monitor for data leaving the organisation (data loss prevention), look for exports from enterprise applications, and monitor user behaviour to detect sharing or data being sent by email (XDR and system log information).
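A minimal sketch of that kind of monitoring might scan an outbound-event log for oversized exports and for sensitive patterns in outgoing email. The event field names, size threshold and detection pattern are all illustrative assumptions, not any vendor's schema:

```python
import re

EXPORT_SIZE_LIMIT = 10_000_000           # bytes; flag exports above ~10 MB
CARD_PATTERN = re.compile(r"\b\d{16}\b")  # crude payment-card indicator

def flag_events(events: list[dict]) -> list[dict]:
    """Return the subset of outbound events worth an analyst's attention."""
    flagged = []
    for ev in events:
        if ev["channel"] == "export" and ev["bytes"] > EXPORT_SIZE_LIMIT:
            flagged.append(ev)   # unusually large export from an application
        elif ev["channel"] == "email" and CARD_PATTERN.search(ev["body"]):
            flagged.append(ev)   # sensitive pattern leaving by email
    return flagged
```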

The most effective way to stop unauthorised use of data is to prevent people accessing it in the first place. Access controls should be applied at all levels of technology – the application itself, where people regularly log on; the databases where the data is stored; and the interfaces that transmit data from one application to another – although in reality it is hard to identify every area.
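The check itself can be simple; what matters is that it is repeated at every layer, so a gap in one does not expose the data. A sketch with illustrative roles and permissions:

```python
# Hypothetical role-to-permission mapping; names are illustrative.
PERMISSIONS = {
    "analyst": {"read"},
    "data_steward": {"read", "export"},
}

def allowed(role: str, action: str) -> bool:
    return action in PERMISSIONS.get(role, set())

def export_dataset(role: str, dataset: str) -> str:
    # The same allowed() check would be repeated at the database and
    # interface layers, not just here in the application.
    if not allowed(role, "export"):
        raise PermissionError(f"{role} may not export {dataset}")
    return f"exported {dataset}"
```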

Regardless of the tools or capabilities deployed, it is essential they are monitored to ensure any alerts flagging anomalies are spotted. Security orchestration, automation and response (SOAR) tools and similar can be used to automatically block unauthorised attempts at exporting or sharing data, but these should be managed to make sure false alarms are minimised and that, ultimately, no breach occurs.

Physical controls should also be considered. Printers are a common route by which highly confidential information becomes widely available, while some organisations use security doors, or other physical barriers, to restrict the flow of information.

Backups must be afforded the same protection as live data. No matter how strong the controls to restrict data access, if a backup file is stored somewhere centrally, or in a disaster recovery centre with weaker controls, overall security is compromised, and risk is introduced.

Archiving and destruction

In the main, data owners will specify the retention lifetime of “their” data, which is often tied to legislation, although the longevity of relevant financial information is often governed by the audit cycle.

Review points should be set for when people leave the organisation or change roles. At this point, their data access should be revoked and a review of any data to which they have access performed, as information which is potentially outdated or confidential may be stored on their devices.

Initially, “old” data might be archived by transferring it outside the active production environment, where it can still be protected with tools such as antivirus software, network security controls and encryption. However, it is impractical to store archived data indefinitely, and the retention schedule should be followed to ensure it is disposed of at the correct time.

Automated data deletion, which can be implemented in most business applications, can be a frightening prospect, but it is the most effective way to purge obsolete information. Backups, copies, information shared with trusted third parties, and anywhere else it has been copied to need to be tracked and destroyed, with strict processes defined, followed and audited to make sure this is taking place as expected.
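A retention sweep of this kind can be sketched as follows. The record fields are illustrative assumptions, and the date arithmetic naively adds years (it would fail for a 29 February creation date), so this is a sketch of the logic rather than production code:

```python
from datetime import date

def expired(created: date, retention_years: int, today: date) -> bool:
    # Naive year addition; leap-day handling deliberately ignored here.
    cutoff = date(created.year + retention_years, created.month, created.day)
    return today >= cutoff

def purge(records: list[dict], today: date) -> tuple[list[dict], list[dict]]:
    """Split records into (kept, to_destroy); destruction should be audited."""
    kept, to_destroy = [], []
    for rec in records:
        if expired(rec["created"], rec["retention_years"], today):
            to_destroy.append(rec)   # past its owner-defined retention period
        else:
            kept.append(rec)
    return kept, to_destroy
```

In practice the destroy list would also drive deletion of backups, copies and third-party-held duplicates, with each destruction logged for audit.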
