Vault: Hardware recommendation/requirements for Vault components

UPDATED: March 27, 2017

Using storage that might not have a reliable connection to the server
  • A reliable connection to the storage is essential. If the storage disconnects during an index operation, or otherwise loses writes, data may not be completely written to disk, which can lead to index corruption.
  • Using replicated file systems might lead to unpredictable behavior in some cases. If a server is running off a replica, it might see partial writes from the primary. If the replication system only replicates entire files, small changes in Vault indexes could trigger large copies to the replica.
  • Redirected file systems might cache data at the local node and so might not immediately reflect writes made from a remote node.
Hosting the Vault server or loader in a virtual machine
  • VMs are often provisioned with too little RAM or too few CPU cores.
  • Vault marks I/O on some large files as random access. Under Windows, this keeps a much larger amount of file cache memory in the active rather than the standby state, which effectively reduces the memory available to other applications.
Virus scanners, in particular real-time virus scanners
  • Putting Vault executables on the exception list helps reduce the impact on I/O. This particularly applies to e2ps, since it is typically started very frequently (PostScript data only).
  • Data files (dri/drr/drd/drp, etc.) may also need to be excluded if the scanner attempts to check these too.
  • Real-time virus scanners (active or not, with exclusions or not) can increase memory use in the kernel, which can negatively affect 32-bit Windows.
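As a sketch of the exclusions described above, the following Windows Defender PowerShell commands add a process exclusion for e2ps and extension exclusions for the Vault data files. The install path shown is an assumption, and other scanners have their own equivalent exclusion settings:

```powershell
# Sketch only: Windows Defender exclusions for Vault (run as administrator).
# The install path below is an assumption; adjust to the actual location.
Add-MpPreference -ExclusionProcess "C:\Vault\bin\e2ps.exe"

# Exclude Vault data files from on-access scanning.
Add-MpPreference -ExclusionExtension ".dri", ".drr", ".drd", ".drp"
```

Get-MpPreference can then be used to verify the resulting exclusion lists.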
