The Role of Memory within Web 2.0 Architectures and Deployments
Although I have a basic working knowledge of memory, SSDs, and the like, I am not technical... I have never developed or deployed a system. I was exposed to RAM disks years ago, when their expense limited their use to very small files or database applications. I am looking to "get current" on what role memory plays in current Web 2.0 design and deployments.
How is memory commonly used to remove latency and accelerate performance in typical Web 2.0 architectures? What role can memory play in massive scale-out implementations? Is there such a thing as memory "best practices"? If memory were cheap, would that significantly change the way systems are designed and deployed?
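To make the first question concrete, here is my rough understanding of the caching pattern I keep reading about (a minimal sketch only; in real deployments the cache would be a shared in-memory service such as memcached or Redis, and the query, keys, and timings below are purely illustrative):

```python
import time

cache = {}  # stand-in for a memcached/Redis client, so the sketch runs on its own

def slow_database_lookup(user_id: int) -> dict:
    """Pretend this is a query hitting disk on a relational database."""
    time.sleep(0.05)  # simulate disk/network latency
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id: int) -> dict:
    key = f"user:{user_id}"
    record = cache.get(key)          # 1. try memory first
    if record is None:               # 2. on a miss, fall back to the database
        record = slow_database_lookup(user_id)
        cache[key] = record          # 3. populate the cache for later requests
    return record

if __name__ == "__main__":
    start = time.time()
    get_user(42)                     # first call pays the database cost
    print(f"miss: {time.time() - start:.3f}s")

    start = time.time()
    get_user(42)                     # repeat call is served from memory
    print(f"hit:  {time.time() - start:.3f}s")
```

If this "look in memory first, fall back to disk" idea is roughly what people mean by using memory to remove latency, I'd like to understand how it plays out at large scale.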
What commercial and open source products make heavy use of memory, and what are their benefits and trade-offs?
Can anyone suggest what sources - people, books, papers, products - I might look into to gain a practical understanding of this topic?