Desktop or Data Center Application Delivery
I recently loaded a Windows desktop widget to monitor my CPU and memory utilization. To my surprise, I discovered “thin” web applications are not that thin, and “fat clients” don’t need “The Biggest Loser” to be efficient.
| Site | Min Memory Usage | Max Memory Usage |
|------|------------------|------------------|
| www.harvardpartners.com | 15 MB | 15 MB |
| www.microsoft.com | 38 MB | 58 MB |
| www.oracle.com | 45 MB | 49 MB |
| www.google.com | 35 MB | 38 MB |
| www.yahoo.com | 55 MB | 70 MB |
| Application | Memory Usage |
|-------------|--------------|
| Word (28 pages) | 28 MB |
| PowerPoint (27 slides) | 42 MB |
| Excel (234 rows) | 21 MB |
| Photoshop (1 image) | 80 MB |
| Quicken | 46 MB |
So, which is better and why do we care?
We care because we are Operations and Infrastructure people and, beyond keeping everything running, our charter is to deploy faster, more reliable systems for less money.
The memory a web application consumes on a PC primarily represents graphics-processing overhead. It does not include the memory used on the server to retrieve, process, and transmit the text and graphics for a web page.
PC memory is at least 50% less expensive than server memory. PCs don’t require high availability technology. PCs do not reside in expensive data centers. PC applications do not require network connectivity.
… but PCs and PC applications fail at a far greater rate than servers.
… and failed applications cause end-users and systems support people to waste time.
… time costs money.
What we need are “infrastructure economics” describing the relationship and cost between user, application, PC, network, data center, server, storage, and the cost of supporting everything. We also need to understand the costs of providing high availability, disaster recovery, and documentation.
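The relationships described above can be turned into simple arithmetic. The sketch below is a minimal, hypothetical cost model, not a definitive framework: every dollar figure and failure rate is a placeholder assumption chosen for illustration, to be replaced with your own measured inputs.

```python
# Illustrative "infrastructure economics" sketch: compare the annual per-user
# cost of delivering an application from the desktop vs. the data center.
# ALL input figures below are hypothetical assumptions, not measured costs.

def annual_cost(hardware, facilities, support_hours, support_rate,
                failures_per_year, hours_lost_per_failure, user_hourly_cost):
    """Rough annual per-user cost: amortized hardware + facilities
    + support labor + end-user time lost to failures."""
    downtime_cost = failures_per_year * hours_lost_per_failure * user_hourly_cost
    return hardware + facilities + support_hours * support_rate + downtime_cost

# Desktop delivery: cheaper hardware, no data center, but more failures
# and more hands-on support (all numbers are placeholders).
desktop = annual_cost(hardware=300, facilities=0, support_hours=4,
                      support_rate=75, failures_per_year=6,
                      hours_lost_per_failure=0.5, user_hourly_cost=50)

# Data center delivery: costlier server memory and facilities (including
# high availability and DR), but fewer failures and less desk-side support.
data_center = annual_cost(hardware=150, facilities=200, support_hours=1,
                          support_rate=75, failures_per_year=1,
                          hours_lost_per_failure=0.25, user_hourly_cost=50)

print(f"Desktop delivery:     ${desktop:,.2f} per user per year")
print(f"Data center delivery: ${data_center:,.2f} per user per year")
```

The point is not the placeholder numbers but the structure: once each term is populated with real data for your users, applications, network, and data center, the "which is better" question becomes a calculation rather than a debate.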
Armed with these tools, we make fact-based decisions about which solutions meet requirements, deliver business outcomes, and grow with the business. We turn art into science.