Optimal database fullness?


Let's say we have a relational database of arbitrary but finite capacity. The database holds historical event information for an online system that generates new events. The database should hold event information for reporting purposes, and should purge events older than some number of days, n. Given that we have enough historical information to deduce that the rate of event generation is relatively constant, neither increasing nor decreasing over time, is there an optimal percentage of fullness (60%, 70%, 80%, ...) to design the database for? If so, why did you choose that percentage?

It depends.

Well, to be more helpful: you said the rate of event generation is "relatively constant". You need enough margin to deal with inconstancies in that rate, both statistical and emergency. The statistical variation can be measured from your history; emergencies can only be guessed at.
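As a rough sketch of the statistical side, you can take historical daily event counts and size for the retention window at the mean rate plus a few standard deviations. The function and sample numbers below are hypothetical illustrations, not part of the original question:

```python
import statistics

def target_fullness(daily_counts, retention_days, capacity_rows, k=3):
    """Estimate steady-state fullness as a fraction of total capacity.

    daily_counts   -- historical events per day (sample data below is made up)
    retention_days -- events older than this many days are purged
    capacity_rows  -- total rows the database can hold
    k              -- standard deviations of margin above the mean rate
    """
    mean = statistics.mean(daily_counts)
    stdev = statistics.stdev(daily_counts)
    # Worst-case rows held at steady state: retention_days of events
    # arriving at (mean + k * stdev) per day.
    worst_case_rows = retention_days * (mean + k * stdev)
    return worst_case_rows / capacity_rows

# Example: roughly 10,000 events/day with modest day-to-day variation.
counts = [9800, 10200, 10050, 9900, 10100, 9950, 10000]
fullness = target_fullness(counts, retention_days=30, capacity_rows=500_000)
print(f"design steady-state fullness: {fullness:.0%}")
```

If the computed steady-state fullness lands well below your chosen ceiling, the remaining gap is what you have left for emergencies and filesystem headroom.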

The actual amount of space used depends on how the events are stored. On a related note, many filesystems become slow once they exceed a certain degree of fullness; you'll want to include that percentage as part of your total margin. Also, consider things like the granularity of the event purge: how does it happen?
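Purge granularity matters because a single giant delete can hold locks and bloat transaction logs. One common shape is a batched delete; here is a minimal, self-contained sketch using SQLite, where the `events` table and `created_at` column names are assumptions for illustration:

```python
import datetime
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, created_at TEXT)")

# Seed one event per day for the last 60 days (purely illustrative data).
today = datetime.date(2024, 6, 1)
rows = [((today - datetime.timedelta(days=d)).isoformat(),) for d in range(60)]
conn.executemany("INSERT INTO events (created_at) VALUES (?)", rows)

# Retain 30 days; ISO-8601 date strings compare correctly as text.
cutoff = (today - datetime.timedelta(days=30)).isoformat()

# Delete in small batches so each transaction stays short.
while True:
    cur = conn.execute(
        "DELETE FROM events WHERE id IN "
        "(SELECT id FROM events WHERE created_at < ? LIMIT 1000)",
        (cutoff,),
    )
    conn.commit()
    if cur.rowcount == 0:
        break

remaining = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(f"{remaining} events remain after the purge")
```

On a production database you would typically run this from a scheduled job, and the batch size would be tuned to the system's write load.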

Also, consider the consequences of running out of capacity. Does the system crash? How critical is the system, anyway? Can an emergency purge make additional space available? How expensive is capacity, relative to the expense of an outage?
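Those questions can be folded into a simple policy check that runs alongside monitoring. The thresholds below (70% soft, 90% hard) are illustrative assumptions, not recommendations from the answer above:

```python
def purge_action(used_rows, capacity_rows, soft_pct=0.70, hard_pct=0.90):
    """Decide how to react to the current fullness level.

    soft_pct -- above this, run the normal n-day retention purge early
    hard_pct -- above this, take emergency action (purge more aggressively,
                or page an operator) before capacity runs out
    """
    fullness = used_rows / capacity_rows
    if fullness >= hard_pct:
        return "emergency-purge"
    if fullness >= soft_pct:
        return "scheduled-purge"
    return "ok"

print(purge_action(450_000, 500_000))  # well past the hard threshold
```

The gap between the two thresholds is your reaction time: the more expensive an outage is relative to capacity, the wider you would make it.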


