
Data Encryption in Postgres: A Guidebook

Greg Nokes

6 min read

When your company decides it's time to invest in more open source, Postgres is the obvious choice. Managing databases is not new, and you likely already have established practices and requirements for rolling out a new database. One of the big requirements we frequently help new customers with during their Postgres adoption is data encryption. While the question sounds simple, there are a few layers to it that determine the right approach for you. Here we'll walk through the pros and cons of each approach and help you identify the right path for your needs.

Overview of At-Rest Encryption Methods

Let’s start by defining some terms. There are four primary ways to encrypt your data while it is at rest:

OS-Level and Filesystem Encryption

Operating system or disk-level encryption protects entire file systems or disks. This method is application-agnostic and offers encryption with minimal overhead. Think of technologies like LUKS on Linux or FileVault on macOS.

Pros:

  • Transparent to applications and the database
  • Simplifies management by applying encryption to the entire storage layer
  • Offloads encryption and decryption processing to the OS
  • Minimal performance and operational impact
  • Widely understood and implemented technology

Cons:

  • Less granular control over specific databases or tables
  • Backups are not encrypted by default
  • Additional overhead is required to ensure encryption keys are properly managed

Storage Device Encryption

Encryption is directly implemented on storage devices such as hard disk drives or SSDs which automatically encrypt all of the data written to their storage.

Pros:

  • Suitable for environments with hardware security requirements
  • Minimal performance and operational impact
  • Offloads encryption and decryption processing to the hardware layer

Cons:

  • Less granular control over specific databases or tables
  • Additional overhead is required to ensure encryption keys are properly managed

Transparent Data Encryption (TDE)

In the context of Postgres, TDE means offloading encryption and decryption to Postgres itself. TDE encrypts the entire database, its associated backup files, and the transaction log files, using a database encryption key. This process is transparent to applications, meaning they operate without any changes, as the encryption and decryption happen at the database engine level.

This introduces some complexity and performance overhead as the database must handle all encryption and decryption tasks. Postgres does not currently have this capability built in. Generally, TDE in Postgres is accomplished by forking Postgres, applying patches, and re-building the forked TDE-enabled version.

Pros:

  • Encryption at the database level

Cons:

  • The database must handle all encryption and decryption for every disk read and write
  • Moving away from TDE can be complex and requires an expensive dump and restore process
  • Risk of total data loss if keys are not accessible
  • Additional overhead is required to ensure encryption keys are properly managed
  • Functionality is not native to Postgres and currently requires a forked version of the code
  • Data in memory is not encrypted

Application-Level Encryption

Encryption logic is implemented directly within your application code. This method can impact performance and add complexity but offers the most flexibility. You can use tools like pgcrypto alongside your application code to encrypt data at the column level, ensuring that your most sensitive data is stored safely. You can create a "dual key" system, where one key unlocks access to the database, and a second key unlocks access to sensitive data stored in the database. The database never needs to be aware of the second key, as the application uses it to encrypt and decrypt data before sending it to the database.

Pros:

  • Offloads encryption and decryption processing to the application layer
  • Allows fine-grained encryption down to the field level
  • Enables encryption tailored to the application's specific security needs
  • Provides a degree of separation of duties between database administrators and application users
  • Protects sensitive data from anyone with database access but without the application key

Cons:

  • Greatly increases application complexity
  • Additional management overhead is required to ensure encryption keys are properly managed
  • Can hinder database search, sorting, and indexing capabilities
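
As a sketch of the column-level approach described above, here is what pgcrypto usage can look like. This assumes a Postgres server with the pgcrypto extension available; the table, column, and passphrase names are purely illustrative.

```sql
-- Illustrative only: assumes pgcrypto is installed and available.
CREATE EXTENSION IF NOT EXISTS pgcrypto;

CREATE TABLE customer (
    id      bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    name    text NOT NULL,
    ssn_enc bytea  -- stores ciphertext only; the key never lives in the database
);

-- Encrypt on write, with a key supplied by the application:
INSERT INTO customer (name, ssn_enc)
VALUES ('Alice', pgp_sym_encrypt('123-45-6789', 'app-held-secret'));

-- Decrypt on read, again with the application-supplied key:
SELECT name, pgp_sym_decrypt(ssn_enc, 'app-held-secret') AS ssn
FROM customer;
```

Note that passing the key in SQL like this can expose it in server logs or `pg_stat_activity`; a stricter dual-key design encrypts and decrypts in the application itself and only ever sends ciphertext to the database.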

How We Think About Data Encryption

First and foremost, are you encrypting your data in flight? That's just table stakes in our mind. When it comes to at-rest data encryption, you need to think about the options available and your real requirements.
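
As a sketch of what in-flight encryption involves on the server side, TLS can be enabled with settings like the following. This assumes superuser access and that the certificate and key files already exist in the data directory; the filenames are illustrative.

```sql
-- Illustrative only: certificate files must already be in place.
ALTER SYSTEM SET ssl = on;
ALTER SYSTEM SET ssl_cert_file = 'server.crt';
ALTER SYSTEM SET ssl_key_file = 'server.key';
SELECT pg_reload_conf();

-- A matching pg_hba.conf rule would use the hostssl connection type,
-- and clients should connect with sslmode=require or sslmode=verify-full.
```

Pairing a `hostssl` rule with `sslmode=verify-full` on the client side protects against both eavesdropping and server impersonation.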

1. Assess Compliance Requirements

Understanding regulatory frameworks and internal requirements is crucial in deciding which encryption strategy to implement. Your specific requirements should guide your decision-making process and help you reach a decision around which technology is the best fit for your environment.

2. Evaluate Existing Architecture

When selecting an encryption strategy, evaluate the existing architecture and available resources. Consider OS support, hardware resources, and storage devices to ensure compatibility and minimal disruption. Think about any operational burden you may incur.

3. Balance Complexity and Security

Finding the right balance between security and performance is critical. Encrypting highly sensitive data with more intensive methods is justified, but less critical data might require less intensive methods. Testing and understanding your requirements can help identify acceptable trade-offs in complexity.

4. Minimize Management Complexity

Encryption solutions should not overwhelm existing management workflows. Effective key management, version compatibility, and alignment with current security operations are essential to minimize additional management overhead.

5. Combine Strategies for Layered Security

Consider combining encrypted storage with in-database encryption for highly sensitive data. Layered security can provide additional safeguards and may enhance overall data protection.

Finding the Right Solution

For General Implementations or Medium to High Security Environments

Most implementations fall into this category. Consider OS-level or filesystem encryption for ease of deployment and compatibility across multiple applications. You can use tablespaces to place tables with more sensitive data on more secure underlying storage. Design a solution combining filesystem-, disk-, and application-level encryption that addresses your specific needs.
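
The tablespace technique mentioned above can be sketched as follows. This assumes a mount point backed by encrypted storage (for example, a LUKS volume) that is owned by the postgres OS user; the paths and names are illustrative.

```sql
-- Illustrative only: /encrypted/pg_space is assumed to be a mount point
-- backed by encrypted storage and writable by the postgres OS user.
CREATE TABLESPACE secure_space LOCATION '/encrypted/pg_space';

-- Place only the sensitive table on the encrypted volume:
CREATE TABLE patient_record (
    id      bigint PRIMARY KEY,
    details text
) TABLESPACE secure_space;
```

Less sensitive tables can stay on the default tablespace, keeping the performance and management cost of encryption confined to the data that needs it.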

For When Disk Encryption Is Not an Option

If you really require transparent data encryption, first evaluate whether storage-level encryption is sufficient. For workloads with stringent data classification requirements that truly necessitate TDE, undertake a proof of concept with clear goals and measurable outcomes. Test and create playbooks for events like a data restore to an off-site database. Ensure that you have built operational expertise in key management, security, and backup.

Conclusion

Encrypting your data while it is at rest is essential for data security in Postgres environments. By understanding your needs and balancing the various encryption approaches, you can achieve strong data protection without overcomplicating your workflows. With the right strategy, you can ensure security, compliance, and performance, providing a robust solution tailored to each environment's needs. At Crunchy Data we have deep expertise in helping enterprises and agencies navigate these requirements and design secure solutions, while also keeping maintenance and operational complexity low. We have helped countless organizations through these challenges and would be happy to discuss this further with you. Reach out to info@crunchydata.com for more information.

Written by Greg Nokes

May 30, 2024