From the September 2019 issue of HealthCare Business News magazine
From an IT investment perspective, improvements in technology have given us much faster networks, much faster processing and huge amounts of storage. Virtualization of the traditional client-server IT model has delivered massive cost savings, and in certain instances new hyperconverged systems can improve performance as well. Cloud computing has given us economies of scale.
But costs will not easily be contained, as the mounting waves of data continue to pound against the IT breakwaters.
Containing IT costs
Costs continue to rise in proportion to demand for the three fundamentals: applications, uptime and speed.
However, there are solutions that can help contain IT costs. Data Center Infrastructure Management (DCIM) software has become an effective tool for analyzing, and then reducing, the overall cost of IT. The U.S. government's Data Center Optimization Initiative, for example, reports having saved nearly $2 billion since 2016.
Other solutions that don’t require new hardware to improve performance and extend the life of existing systems are also available.
What is often overlooked is that processing and analyzing data depend on the overall system's input/output performance. Many large organizations performing data analytics require a computer system to pull information together from multiple, widespread databases through millions of I/O operations. The system's analytic capability hinges on the efficiency of those operations, which in turn depends on the efficiency of the computer's operating environment.
In the Windows environment especially (which runs about 80 percent of the world's computers), I/O performance degrades over time. This degradation, which can lower a system's overall throughput by 50 percent or more, happens in any storage environment and in any data center, whether in the cloud or on premises: Windows falls short of optimum performance because of server inefficiencies in the handoff of data to storage. The problem worsens in a virtualized computing environment, where a multitude of systems all sending I/O up and down the stack to and from storage create tiny, fractured, random I/O, producing a "noisy" environment that slows application performance. Left untreated, it only worsens with time.
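The cost of fractured I/O can be seen in a simple back-of-the-envelope calculation (a sketch for illustration only, not a description of any vendor's product; the workload and block sizes below are assumptions). Because each I/O operation carries fixed per-operation overhead, the same data moved in tiny fragments requires orders of magnitude more operations than data moved in larger, consolidated blocks:

```python
def io_ops_needed(total_bytes: int, block_size: int) -> int:
    """Number of I/O operations to transfer total_bytes at a given block size."""
    return -(-total_bytes // block_size)  # ceiling division

# Hypothetical 10 GB analytics workload
WORKLOAD = 10 * 1024**3

fragmented = io_ops_needed(WORKLOAD, 4 * 1024)    # tiny 4 KB fractured I/O
consolidated = io_ops_needed(WORKLOAD, 1024**2)   # consolidated 1 MB I/O

print(f"4 KB blocks: {fragmented:,} operations")      # 2,621,440 operations
print(f"1 MB blocks: {consolidated:,} operations")    # 10,240 operations
print(f"Reduction factor: {fragmented // consolidated}x")  # 256x
```

The 256-fold difference in operation count is why software that consolidates small, random I/O into larger transfers can raise throughput without any new hardware.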
Even experienced IT professionals mistakenly assume that new hardware will solve these problems. Because data is so essential to running their organizations, they are tempted to throw money at the problem by buying expensive new hardware. Additional hardware can temporarily mask the degradation, but targeted software can improve system throughput by 30 to 50 percent or more. Such software has the advantage of being non-disruptive (no ripping out and replacing of hardware), and it can be transparent to end users, since it is added in the background. A software solution can thus handle more data by eliminating overhead, increasing performance at far lower cost and extending the life of existing systems.
With the tsunami of data threatening IT, solutions like these should be considered in order to contain healthcare IT costs.
James D'Arezzo is CEO of Condusiv Technologies, a global provider of software-only storage performance solutions for virtual and physical server environments.