The Risks of Data Tampering and How to Prevent It

Most businesses are aware of the risks associated with data theft or exposure. The recent Equifax breach, which compromised the sensitive information of nearly half the U.S. population, is only the most recent in a series of cybercrimes in which massive amounts of data were exfiltrated.

But what if data weren’t stolen but modified? What if, for example, someone tampered with the quality assurance data of a manufacturing plant? Or a bank’s account balances? Or the patient information held by a hospital? How long might it take an organization to discover that its data had been modified? How would the organization recover?

Admiral Michael S. Rogers of the U.S. Navy, who serves as Director of the National Security Agency, Commander of the U.S. Cyber Command and Chief of the Central Security Service, has said that data tampering could become the greatest cybersecurity threat organizations face. Data tampering could be an act of revenge by a disgruntled employee, industrial espionage by a competitor, or the work of hacktivists or a rogue nation state. Whatever the root cause, the prospect of such a security breach is alarming.

One type of data tampering has unfortunately become commonplace — ransomware. In a ransomware attack, cybercriminals encrypt an organization’s data and demand payment of a ransom to obtain the decryption key. According to data from Quick Heal Security Labs, more than 25,000 ransomware infections were reported daily on Windows systems in the third quarter of 2017 alone.

What’s more, many cyberattacks involve some kind of data tampering. Hackers often insert new files that perform some malicious activity, change a configuration file to gain control of a system, or delete or modify system log files to cover their tracks.

Clearly, it’s important that organizations be able to identify successful and unsuccessful attempts to change critical files. But how do you go about it? That’s the function of a security control known as file integrity monitoring (FIM).

Also known as change monitoring, FIM is the process of examining critical files to see if, when and how they change. FIM systems compare the current state of a file to a known-good baseline, typically using a cryptographic algorithm to generate a mathematical value called a checksum. Files may be monitored at predefined intervals, randomly or in real time.
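To make the baseline comparison concrete, here is a minimal sketch in Python. It assumes SHA-256 as the hashing algorithm, and the monitored paths and baseline file name are hypothetical; a production FIM tool would also protect the baseline itself from tampering and raise alerts through a management console.

```python
import hashlib
import json
from pathlib import Path

def checksum(path: Path) -> str:
    """Compute a SHA-256 checksum of a file's contents, reading in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_baseline(paths: list[Path], baseline_file: Path) -> None:
    """Record the known-good checksums of the monitored files."""
    baseline = {str(p): checksum(p) for p in paths}
    baseline_file.write_text(json.dumps(baseline, indent=2))

def check_integrity(baseline_file: Path) -> list[str]:
    """Compare current checksums against the baseline and report changes."""
    baseline = json.loads(baseline_file.read_text())
    findings = []
    for name, expected in baseline.items():
        path = Path(name)
        if not path.exists():
            findings.append(f"{name}: file missing")
        elif checksum(path) != expected:
            findings.append(f"{name}: checksum mismatch")
    return findings

if __name__ == "__main__":
    # Hypothetical critical files to monitor.
    monitored = [Path("/etc/passwd"), Path("/etc/ssh/sshd_config")]
    baseline_path = Path("fim_baseline.json")
    if not baseline_path.exists():
        build_baseline(monitored, baseline_path)
    for finding in check_integrity(baseline_path):
        print("ALERT:", finding)
```

Hashing the files in fixed-size chunks keeps memory use flat even for very large files, though re-hashing everything on each pass is still costly — which is why the scope of monitoring matters.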

Given the large amount of data stored by organizations today, monitoring all files typically isn’t practical. FIM systems can be resource-intensive, particularly when it comes to very large files and those that change constantly. That said, it’s important to monitor any files that a hacker might seek to compromise, or that might cause downtime or data loss if a legitimate user makes an error.

With that in mind, FIM systems generally are used to monitor user identities, privileges and credentials, security settings, operating system and application files, configuration files, and encryption key stores. Monitoring of log files is especially important, and should ensure that only systems and applications write data to logs and that log files are frequently collected and stored in a separate management system. Organizations may also use FIM to monitor files that contain sensitive content such as customer information and trade secrets.
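As a rough illustration of how those categories might be checked at a predefined interval, the sketch below (again Python, SHA-256, hypothetical paths) re-hashes configuration files and key stores on each pass and flags log files that shrink, since legitimate logging only appends. Real products also account for log rotation and forward events to a separate management system, as noted above.

```python
import hashlib
import time
from pathlib import Path

# Hypothetical watch list, grouped by the categories discussed above.
WATCH_LIST = {
    "config": [Path("/etc/ssh/sshd_config"), Path("/etc/sudoers")],
    "keys": [Path("/etc/ssl/private/server.key")],
    "logs": [Path("/var/log/auth.log")],
}
CHECK_INTERVAL_SECONDS = 300  # predefined interval between scans

def sha256(path: Path) -> str:
    """SHA-256 of a file's contents, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_forever() -> None:
    """Periodically re-check watched files and print alerts on changes."""
    known_hashes: dict[str, str] = {}
    known_sizes: dict[str, int] = {}
    while True:
        for category, paths in WATCH_LIST.items():
            for path in paths:
                key = str(path)
                if not path.exists():
                    print(f"ALERT [{category}] {path}: file missing")
                    continue
                current_hash = sha256(path)
                current_size = path.stat().st_size
                if category == "logs":
                    # A shrinking log suggests truncation or tampering.
                    if key in known_sizes and current_size < known_sizes[key]:
                        print(f"ALERT [logs] {path}: log file shrank")
                elif key in known_hashes and current_hash != known_hashes[key]:
                    print(f"ALERT [{category}] {path}: contents changed")
                known_hashes[key] = current_hash
                known_sizes[key] = current_size
        time.sleep(CHECK_INTERVAL_SECONDS)

if __name__ == "__main__":
    scan_forever()
```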

In our next post, we’ll describe what to look for in a FIM solution and how to integrate FIM into your security strategy.

