In-memory computing can help agencies pick up speed.

A profound shift is underway across government that affects how data is stored, accessed and processed. Federal agencies with mission-critical applications are learning to unshackle themselves from slow, disk-bound databases and embrace the tremendous benefits that come with managing data in-memory.

As agencies deal with aging software applications or seek to launch new public-facing Web portals, ensuring system performance is paramount. Caching data in RAM puts the data closer to the user and the applications that are trying to access it. It’s analogous to a chef who puts his knives and ingredients closer to his prep station rather than in remote pantries. Reducing that distance can make a significant impact on how long it takes the chef to prepare a meal.

For agencies that want to help their applications gain speed, in-memory computing works the same way — and can drastically and cost-effectively boost application performance.

Most traditional applications are built so that every time a user wants information from the system’s vast database, the application or website has to read that data by querying the database. As the number of concurrent users grows and more data is added to the database, those repeated queries create a huge and unnecessary bottleneck. Furthermore, the data that comes back from the database typically must be converted into application objects before the application can use it.

Addressing that choke point is vital to unlocking application speed. Storing data in a layer above the database, known as a cache, makes data access dramatically faster and reduces the number of connections the database must handle. The result is relief from the performance issues plaguing most applications; a minimal sketch of the pattern appears below. Using in-memory data is the road to success for agencies that need to showcase system improvements quickly.
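To make the contrast concrete, here is a minimal cache-aside sketch in Java. The record names, the simulated 50-millisecond database delay and the use of a plain ConcurrentHashMap as the cache are illustrative assumptions rather than any specific product: the first read for a key falls through to the slow store, and every later read is served from RAM.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal cache-aside sketch: check the in-memory cache first,
// fall back to the (slow) database only on a miss.
public class CacheAsideExample {

    // Stand-in for the disk-bound database; the name and data are hypothetical.
    static String queryDatabase(String key) {
        try {
            Thread.sleep(50);               // simulate disk and network latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "record-for-" + key;
    }

    // In-memory cache layered above the database.
    static final Map<String, String> cache = new ConcurrentHashMap<>();

    static String getRecord(String key) {
        // computeIfAbsent hits the database only when the key is not yet cached.
        return cache.computeIfAbsent(key, CacheAsideExample::queryDatabase);
    }

    public static void main(String[] args) {
        long t1 = System.nanoTime();
        getRecord("case-42");               // first read: goes to the database
        long t2 = System.nanoTime();
        getRecord("case-42");               // second read: served from RAM
        long t3 = System.nanoTime();
        System.out.printf("miss: %.1f ms, hit: %.3f ms%n",
                (t2 - t1) / 1e6, (t3 - t2) / 1e6);
    }
}
```

A production cache would also bound its size and expire stale entries, but the read path looks the same: the database is consulted only when the cache cannot answer.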

Although the decline in the cost of RAM is clearly attractive to budget-conscious agencies, it’s not the only benefit that appeals to federal IT leaders. Four key reasons stand out when discussing why in-memory computing is making inroads at agencies:

1. Speed and acceleration to match today’s analytical needs. In-memory data is accessed in microseconds, resulting in immediate, near-real-time access to critical data. Imagine retrieving data nearly 100 times faster than is possible from disk-based storage accessed across a network. No matter where data resides, whether in an application, in the cloud or within a remote sensor, federal agencies will no longer be dogged by slow, inefficient data movement, and users will no longer need to wait for reports built on days-old data. With in-memory computing, federal IT teams can analyze data at a speed that improves its relevancy in decision-making, helping agencies meet ever-shrinking decision windows.

2. Easy to use and easy to add. In-memory computing satisfies the “need it now” demands of users waiting for tabulations and evaluations. There is also no simpler way to store data than in its native format in memory. Most in-memory solutions are no longer database-specific, which makes them easy to add to an agency’s current systems and platforms. No complex APIs, libraries or interfaces are typically required, and there is no overhead added by conversion into a relational or columnar format. That is true even for agencies that have custom-developed solutions based on open-source technology. For instance, Ehcache, the Java-based standard for caching, is available in an enterprise version. That means agencies running commercial software or open-source applications can turn on the power of distributed caching by changing just a few lines of configuration, as the sketch after this item suggests. There is no need to rewrite code or rip up applications.
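As a rough illustration of how little application code is involved, the sketch below uses the Ehcache 2.x API to read and write through a cache defined in an ehcache.xml file on the classpath. The cache name and payload are hypothetical assumptions, and switching that cache from local to distributed operation is a matter of configuration rather than a code change.

```java
import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Element;

// Rough sketch of reading and writing application objects through Ehcache 2.x.
// "caseRecords" and the payload are hypothetical; whether the cache is purely
// local or distributed is decided in ehcache.xml, not in this code.
public class EhcacheSketch {
    public static void main(String[] args) {
        // Loads ehcache.xml from the classpath and creates the configured caches.
        CacheManager manager = CacheManager.newInstance();
        Cache records = manager.getCache("caseRecords");   // hypothetical cache name

        records.put(new Element("case-42", "record payload"));

        Element hit = records.get("case-42");               // served from memory
        if (hit != null) {
            System.out.println(hit.getObjectValue());
        }

        manager.shutdown();
    }
}
```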

3. Cost savings and enhanced storage capabilities. With a precipitous drop in the cost of RAM in the past decade, in-memory computing has become a budget-friendly option for federal agencies. When procurement officials can buy a 96-gigabyte server for less than $5,000, in-memory storage of data makes smart fiscal and technical sense. Terabyte-scale servers are sized to harness, in memory, the torrent of data coming from mobile devices, websites, sensors and other sources. An in-memory store can act as a central point of coordination for aggregation, distribution and instant access to big data at memory speeds.

For agencies that still rely on mainframes, in-memory computing holds even more appeal because a large portion of their overall IT budgets is likely dedicated to keeping those mainframes running. That is due in part to the way mainframes are traditionally licensed: by how many millions of instructions per second (MIPS) they perform, which is essentially a measure of how much of the mainframe’s processing capacity is used. The more you use, the more you pay. Open-data initiatives are already pushing such costs upward, but by using in-memory computing to “move” data off their mainframes, agencies can reduce those costs by nearly 80 percent.

4. Higher throughput with real-time processing. In-memory computing significantly lowers system latency, which leads directly to dramatically higher throughput. Agencies that run high-volume transactions can use in-memory data to boost processing capacity without adding computing power. During real-time processing for some applications — such as fraud detection and network monitoring — delays of seconds, even milliseconds, won’t cut it. Acceptable performance requires real-time data access for ultra-fast processing, superior reactive response and proactive planning.

In-memory computing offers unprecedented opportunities for innovation within government. Agencies can transform how they access, analyze and act on data by building new capabilities that directly benefit the mission and help them achieve their goals faster.

Darryn, Graham. “4 ways in-memory computing can bring agencies up to speed.” FCW, 24 Nov. 2014. Web. 12 Jan. 2016.