4GB x 120 Can Cover All?
Adam Bosworth has an interesting post about the database features customers want these days: 1) dynamic schema, 2) dynamic partitioning, and 3) modern indexing. Most interesting are the parameters he quotes for tracking 100M items of 4KB each. He figures (only) 120 machines with 4GB of memory each are needed to keep the current day's work entirely in memory. Now that's NOT a lot of machines, and if the estimate holds, we can marvel at how well hardware works for us. On the other hand, each of these machines carries three 400GB hard drives, a memory-to-disk ratio of 1:300. If the disks fill up, only 1/300 of the data fits in RAM, so avoiding disk reads would require a cache hit rate of 1 − 1/300 ≈ 99.67%, which seems too high to me. Then again, keeping everything in memory does sound like a great idea.
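For concreteness, here is a minimal sketch of the back-of-envelope arithmetic, assuming decimal gigabytes and a uniformly accessed dataset; the raw machine count comes out to 100, so Bosworth's 120 presumably includes some headroom.

```python
# Back-of-envelope check of the figures quoted above (decimal GB for simplicity).
GB = 10**9

items = 100_000_000              # 100M items
item_size = 4_000                # 4KB each
ram_per_machine = 4 * GB         # 4GB of memory per machine
disk_per_machine = 3 * 400 * GB  # three 400GB drives per machine

total_data = items * item_size             # 400GB of working data
machines = total_data / ram_per_machine    # 100 machines; 120 leaves headroom
print(f"machines to hold everything in RAM: {machines:.0f}")

ratio = disk_per_machine / ram_per_machine  # disk is 300x the memory
print(f"memory-to-disk ratio: 1:{ratio:.0f}")

# If the disks are full and accesses are spread evenly, only 1/300 of the
# data is resident in RAM, so staying off the disks needs a hit rate of
# 1 - 1/300.
hit_rate = 1 - 1 / ratio
print(f"required cache hit rate: {hit_rate:.2%}")  # 99.67%
```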