Gil Dabah
February 7, 2023
Guess what? You can now fully automate and support individual rights with privacy-by-design! This approach builds privacy directly into the entire engineering process from day one. Even better, anyone can achieve it with the right kind of planning and application. If you think this sounds a lot like the concept of security-by-design, you'd be right. But privacy-by-design is a fast-growing domain in its own right.
Alongside its security counterpart, it’s helping more and more companies overcome the growing risks of collecting and storing sensitive customer data, as well as honoring data-related preferences. Let’s compare the two.
Privacy-by-design is a method of planning and implementing a system and architecture that fully supports individual rights and protects people's data. Developers are tasked with taking all privacy considerations into account as part of the development process. Architecture built in this way should include a data inventory, retention policies, minimization policies, consent management support, security mechanisms, etc.
The goal is to maintain ultimate control over data by knowing who the data was collected from, when, and where, and being able to consult this metadata while processing the data itself. Key privacy-by-design principles include data minimization, purpose limitation, retention limits, consent management, and keeping personal data protected throughout its lifecycle.
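To make that concrete, here is a minimal Python sketch (with hypothetical field and function names, not any particular product's API) of how collection metadata can travel with a piece of personal data so that consent and retention can be checked at processing time:

```python
from __future__ import annotations

from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class PersonalRecord:
    """A piece of personal data plus the metadata needed to control it."""
    subject_id: str          # who the data was collected from
    field: str               # e.g. "email"
    value: str
    collected_at: datetime   # when it was collected
    source: str              # where it was collected, e.g. "signup_form"
    purposes: set[str]       # purposes the data subject consented to
    retention: timedelta     # how long we may keep it

def may_process(record: PersonalRecord, purpose: str, now: datetime | None = None) -> bool:
    """Allow processing only if consent covers the purpose and retention hasn't expired."""
    now = now or datetime.now(timezone.utc)
    within_retention = now - record.collected_at <= record.retention
    return purpose in record.purposes and within_retention

record = PersonalRecord(
    subject_id="user-123",
    field="email",
    value="alice@example.com",
    collected_at=datetime(2023, 1, 1, tzinfo=timezone.utc),
    source="signup_form",
    purposes={"billing"},
    retention=timedelta(days=365),
)
print(may_process(record, "marketing"))  # False: no consent for this purpose
print(may_process(record, "billing"))    # True only while the retention window is still open
```

The specifics will differ from system to system; the point is that the "who, when, where, and why" stays attached to the data and is consulted before every use.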
Security-by-design is the planning and implementation of a foundationally secure system. As with privacy-by-design, the responsibility shifts onto developers to bake as many security features as possible into their applications. The main goal of security-by-design is to build a robust architecture that is as immune to implementation bugs as possible. What do we mean by that?
One example is running buggy code in a tight sandbox that cuts off its access to anything that isn't necessary for its functionality. That way, when a threat actor exploits the bug, it loses the escalation path needed to compromise the whole device it runs on. Typically, security-by-design relies on principles such as least privilege, defense in depth, reducing the attack surface, and failing securely.
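As a rough illustration of the least-privilege idea, here is a POSIX-only Python sketch (a hypothetical helper, not a production sandbox; real systems reach for seccomp filters, containers, or separate OS users) that runs an untrusted script in a child process with tight resource limits and a stripped-down environment:

```python
import subprocess
import resource      # POSIX only
import tempfile

def run_untrusted(script_path: str) -> subprocess.CompletedProcess:
    """Run a script in a child process with tight limits and a bare environment."""
    def limit_child():
        # Applied in the child after fork(), before exec(): caps CPU seconds,
        # address space, and the size of any file the child may create.
        resource.setrlimit(resource.RLIMIT_CPU, (2, 2))
        resource.setrlimit(resource.RLIMIT_AS, (256 * 1024 * 1024, 256 * 1024 * 1024))
        resource.setrlimit(resource.RLIMIT_FSIZE, (1024 * 1024, 1024 * 1024))

    return subprocess.run(
        ["python3", script_path],
        preexec_fn=limit_child,
        cwd=tempfile.mkdtemp(),         # empty working directory
        env={"PATH": "/usr/bin:/bin"},  # minimal environment, no secrets inherited
        capture_output=True,
        text=True,
        timeout=5,                      # hard wall-clock limit
    )
```

Even if the script misbehaves, the damage is confined to a short-lived, resource-capped process that never saw the parent's secrets.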
A known pitfall for security newcomers is designing a system they believe is secure while leaving a hidden hole in it, assuming nobody will ever find it. They figure that if they hide parts of the mechanism, nobody will be able to discover them. In practice, they are assuming that because something is obscure (hidden), it is secure. Let's look at a real-life example.
Suppose there's a system call (syscall, an API exposed by the kernel that any application can invoke) that isn't publicly documented at all. For some reason, it allows you to create or overwrite a file anywhere in the file system, including privileged directories. This syscall was never meant to be used by any application other than Microsoft's own Installer system. Then one day a threat actor reverse-engineers the Installer application and finds the syscall. They work out how it behaves and discover that it performs no real security checks to verify that the caller is actually the Installer, so the syscall is not secure. That threat actor can now write malware that calls the syscall directly (remember, it isn't exposed in the public SDK) and gains root access by overwriting a system file that is executed with high privileges.
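To make the lesson concrete, here is a deliberately simplified user-space Python analogy (the function names are hypothetical and have nothing to do with the actual Windows APIs): the first version's only protection is that nobody is supposed to know it exists, while the second performs an explicit authorization check.

```python
import os

PRIVILEGED_DIRS = ("/etc", "/usr", "/bin")

# Anti-pattern: an "internal" API whose only protection is that it's undocumented.
# Anyone who discovers it can overwrite privileged files.
def internal_write_file(path: str, data: bytes) -> None:
    with open(path, "wb") as f:   # no authorization check at all
        f.write(data)

# Better: verify authority explicitly; obscurity is not a security check.
def write_file_checked(path: str, data: bytes, caller_is_privileged: bool) -> None:
    target = os.path.realpath(path)
    if not caller_is_privileged and any(
        target.startswith(d + os.sep) for d in PRIVILEGED_DIRS
    ):
        raise PermissionError(f"caller is not allowed to write to {target}")
    with open(target, "wb") as f:
        f.write(data)
```

In the real story, the fix would be for the kernel to verify the caller's identity or privilege level; the point is that the check has to exist, not that the syscall stays hidden.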
The point is, it's wrong to assume that nobody on the outside will figure out how something you built works. People sit down and try to break systems, and they collect handsome bug bounty awards (in cash) for the security vulnerabilities they find.
Think of it this way: a robust design is one where handing all the information and architecture documents to an adversary (or red team) gives them no advantage whatsoever in breaking your system. Encryption algorithms are a good example: you know the algorithm, and yet you can't break it.
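As a small illustration of that principle, here is a Python sketch using only the standard library: the HMAC-SHA256 algorithm is completely public, yet without the key an adversary cannot forge a valid tag.

```python
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)             # the only secret in the whole system
message = b"transfer $100 to account 42"

# The algorithm (HMAC-SHA256) is public knowledge; only the key is secret.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)   # constant-time comparison

print(verify(key, message, tag))                         # True
print(verify(key, b"transfer $1M to account 666", tag))  # False: can't forge without the key
```

The design stays robust even when the adversary has read every line of it, because the only thing they are missing is the key.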
Both privacy- and security-by-design are dedicated to building foundational data protection, and both are meant to do it from the onset of a product’s design and build process. They also each require and expect developers to share this responsibility to meet the relevant emerging industry standards. Where privacy focuses on protecting data and how to work with data more responsibly, security talks about how to secure the systems around it.
Importantly, security as a domain hardly touches on the issues raised in regulations like GDPR, such as geographical limitations on where data may be stored or the requirement to delete data that is no longer in use after a certain amount of time. Privacy also assigns different levels of prioritization to different types of data and encourages additional measures like pseudonymization (de-identification).
The latter is part of a larger aim to isolate and segregate sensitive data in one place, technically unlinking it from the data subjects. In essence, this scatters the pieces, keeps the most important (or sensitive) ones locked away, and can drastically reduce the impact of a data breach. Privacy- and security-by-design overlap and share many principles. They aren't mutually exclusive and actually complement one another in the mission to protect data. For example, whatever is locked away can only be kept safe with the help of good security-by-design, and privacy-by-design can help security efforts focus on what is most critical to keep safe.
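Here is a toy Python sketch of that segregation idea (an in-memory stand-in, not any particular vault product's API): PII is swapped for opaque tokens, and the real values live only in a separate store with its own access controls.

```python
import secrets

class ToyVault:
    """Stand-in for a separate, tightly protected PII store."""

    def __init__(self):
        self._store: dict[str, str] = {}

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_urlsafe(16)   # random; reveals nothing about the value
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._store[token]                    # only callers with vault access get the real value

vault = ToyVault()

# The application database holds tokens, not PII; a breach of this table
# exposes nothing personal on its own.
order = {
    "order_id": 1001,
    "customer_email": vault.tokenize("alice@example.com"),
    "total": 49.90,
}

print(order)                                      # {'order_id': 1001, 'customer_email': 'tok_...', ...}
print(vault.detokenize(order["customer_email"]))  # alice@example.com (requires vault access)
```

In a real deployment the vault sits behind its own service, authentication, and audit logging, which is exactly where security-by-design takes over.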
We’ve already covered some of this in our Practical Guide to Privacy by Design Architecture. Neither method involves entirely straightforward processes. They must often be retroactively applied to well-entrenched systems and account for constantly moving targets. They also require already-swamped developers to learn these new domains, and without sufficient pressure, related tasks often fall by the wayside.
However, security-by-design enjoys significantly more buy-in and a healthy market of solutions to help. It is far more mature than privacy-by-design in this regard. Meanwhile, buy-in for privacy-by-design remains low, due to lagging awareness of GDPR and its privacy-by-design requirements, despite mounting legal pressure to comply. The market for out-of-the-box solutions is also pretty bleak.
The privacy-by-design space is growing hotter, though. A growing number of solutions are beginning to hit the market to help developers who simply don’t have the time to learn and build privacy-by-design infrastructure from scratch. For example, the Piiano Vault is a piece of pre-built infrastructure that serves as a core building block of privacy-by-design systems. It is capable of implementing most privacy requirements straight out of the box. Interestingly enough, it was designed by a group of security experts looking to augment data protection beyond what security-by-design could offer.
As privacy regulations and policies continue to emerge and develop, it’s a fascinating time to watch how they intersect with security. We predict it’s only a matter of time before privacy-by-design becomes as ubiquitous as security-by-design for the best data protection.
CEO & Co-founder
Gil is a software ninja who loves both building software (and companies) and breaking code. He is renowned for his prowess in security research, including notable exploits of the Microsoft Windows kernel that have earned him unusually high bounty awards. He has written a couple of very successful open-source libraries, and he likes to speak at conferences.