Anthony Ricigliano – News by Anthony Ricigliano: While it's true that information is king, he's definitely a greedy ruler! As the business world continues to demand the storage of more and more data for longer periods of time, the need for disk space grows larger each year. To compound the issue, the low price of storage means that many software developers no longer feel the need to make their products space efficient, and government regulations seem to add new retention requirements for critical information each year. As business units see the price tag on servers and disk space become more affordable, they can't understand why adding just one more should be a problem. They fail to recognize that the cost of a growing computer room includes more than just the initial purchase price of the storage units.
The Shocking Cost of Maintaining Storage Units
Most non-IT workers would be shocked to find out that the cost of managing each storage unit can run four to ten times the original purchase price. In addition to putting a big dent in the IT budget, ever-increasing storage units lead to server sprawl and steadily declining operating efficiency. The added maintenance can also be disruptive, expensive, and burdensome to the entire enterprise. To solve this problem, system engineers have been working on file virtualization methods to eliminate these issues. Their goal is to reduce storage and server inefficiencies while permitting nearly unlimited growth. Let's take a look at exactly how they intend to accomplish this lofty goal.
Breaking the Tight Connection between Clients, Servers, and Storage
The old strategy of tightly coupling storage space with clients and servers is a big reason that each new storage unit becomes expensive to maintain. When machines from a variety of vendors are added to the network, they may not all integrate seamlessly, creating individual islands of storage to manage. When applications are physically mapped to a specific server for storage, any change, including additions, requires modifications to this complex mapping scheme. In some cases, adding a new device or moving a system to a storage unit with more space requires expensive and annoying downtime. This often leads to under-utilization of the actual storage space, an expensive proposition, because system administrators over-allocate space to minimize the need to take an outage. To break free from this outdated methodology, file virtualization removes the static mapping process, allowing storage resources to move freely between applications as needed without restricting access to the data.
Adding a Layer of Intelligent Design to the Network
File virtualization adds a layer of intelligence to the network to decouple logical data access from the physical retrieval of the actual files. This separates the application and the client from the physical storage devices so that static mapping is no longer needed. With this change, the existing bank of servers can be maintained without disrupting the core system or the users' access to valuable information. After implementing a file virtualization strategy, many IT shops find that they can consolidate storage units and increase their overall utilization. In this way, they may be able to simplify the system configuration by decommissioning older storage devices that are no longer needed, or find that they can go much longer than anticipated without adding additional disk space.
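The decoupling described above boils down to an indirection table that sits between the logical path a client opens and the physical unit that holds the file. The sketch below is purely illustrative; the class and method names (`VirtualNamespace`, `publish`, `resolve`, `migrate`) are hypothetical and do not reflect any vendor's actual API:

```python
# Illustrative sketch of the indirection layer file virtualization adds.
# Clients resolve logical paths through a namespace table instead of
# being statically mapped to a physical server and volume.

class VirtualNamespace:
    def __init__(self):
        # logical path -> (storage unit, physical path)
        self._map = {}

    def publish(self, logical_path, unit, physical_path):
        """Expose a file under a stable, client-visible logical path."""
        self._map[logical_path] = (unit, physical_path)

    def resolve(self, logical_path):
        """Clients only ever see the logical path; the layer finds the data."""
        return self._map[logical_path]

    def migrate(self, logical_path, new_unit, new_physical_path):
        # Data can move to a newer or less-utilized unit without clients
        # changing the path they open -- no remapping outage required.
        self._map[logical_path] = (new_unit, new_physical_path)


ns = VirtualNamespace()
ns.publish("/finance/reports/q3.xlsx", "filer-A", "/vol1/fin/q3.xlsx")
before = ns.resolve("/finance/reports/q3.xlsx")

# Consolidation: move the file to another unit; the logical path is unchanged.
ns.migrate("/finance/reports/q3.xlsx", "filer-B", "/vol7/fin/q3.xlsx")
after = ns.resolve("/finance/reports/q3.xlsx")

print(before)  # ('filer-A', '/vol1/fin/q3.xlsx')
print(after)   # ('filer-B', '/vol7/fin/q3.xlsx')
```

Because the client-facing path never changes, the migration in the last step can happen during business hours, which is exactly what makes decommissioning or consolidating storage units non-disruptive.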
In today's IT world, most shops are finding that a file virtualization system is not only a best practice, it's a must-do to continue operating. IT shops whose budgets were still rising just a short time ago are now watching their available funds shrink each year. With increasing pressure to reduce costs, or at least keep them flat, file virtualization is also a virtual requirement.