In-memory computing can bring agencies up to speed.

A profound shift is underway across government that affects how data is stored, accessed and processed. Federal agencies with mission-critical applications are learning to unshackle themselves from slow, disk-bound databases and embrace the tremendous benefits that come with managing data in-memory.

As agencies deal with aging software applications or seek to launch new public-facing Web portals, ensuring system performance is paramount. Caching data in RAM puts the data closer to the user and the applications that are trying to access it. It’s analogous to a chef who puts his knives and ingredients closer to his prep station rather than in remote pantries. Reducing that distance can make a significant impact on how long it takes the chef to prepare a meal.

For agencies that want to help their applications gain speed, in-memory computing works the same way — and can drastically and cost-effectively boost application performance.

Most traditional applications are constructed so that every time a user wants information, the application or website must query the system’s vast database to read it. As the number of concurrent users grows and more data is added to the database, that constant querying creates a huge and unnecessary bottleneck. Furthermore, the data that comes back from the database typically must be converted into application objects before the application can use it.

Addressing that choke point is vital to unlocking application speed. Storing data in a layer above the database, called a cache, makes data access dramatically faster and reduces the number of connections to the database. The result can be an end to the performance issues plaguing most applications, which makes in-memory data a practical route for agencies that need to showcase system improvements quickly.
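The caching layer described above is often implemented with the cache-aside pattern: the application checks memory first and falls back to the database only on a miss. A minimal sketch (the class and key names are illustrative, not from the article):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Cache-aside sketch: serve reads from RAM, query the (simulated)
// database only when the requested key is not yet cached.
public class CacheAside {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Function<String, String> databaseQuery; // stand-in for a real query

    public CacheAside(Function<String, String> databaseQuery) {
        this.databaseQuery = databaseQuery;
    }

    public String get(String key) {
        // computeIfAbsent reaches the database only on a cache miss;
        // repeated reads of the same key never leave memory.
        return cache.computeIfAbsent(key, databaseQuery);
    }
}
```

Every repeated read of a hot key is then served at memory speed, which is how caching cuts both latency and the number of database connections.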

Although the decline in the cost of RAM is clearly attractive to budget-conscious agencies, it’s not the only benefit that appeals to federal IT leaders. Four key reasons stand out when discussing why in-memory computing is making inroads at agencies:

1. Speed and acceleration — perfect for today’s analytical needs. In-memory data is accessed in microseconds, providing immediate, near-real-time access to critical data. Imagine retrieving data nearly 100 times faster than is possible from disk-based storage accessed across a network. No matter where data resides — whether in an application, in the cloud or within a remote sensor — federal agencies will no longer be dogged by slow, inefficient data movement, and users will no longer need to wait for reports of days-old data. With in-memory computing, federal IT teams can analyze data at a speed that improves its relevance to decision-making, helping agencies meet ever-shrinking decision windows.

2. Easy to use and easy to add. In-memory computing satisfies the “need it now” demands of users waiting for tabulations and evaluations. There is also no simpler way to store data than in its native format in memory. Most in-memory solutions are no longer database-specific, which makes them easy to add to an agency’s current systems and platforms. No complex APIs, libraries or interfaces are typically required, and there’s no overhead added by conversion into a relational or columnar format. That is true even for agencies that have custom-developed solutions based on open-source technology. For instance, Ehcache, the Java-based standard for caching, is available in an enterprise version, which means agencies running commercial software or open-source applications can turn on the power of distributed caching by changing just a few lines of configuration. There’s no need to rewrite code or rip up applications.
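The article doesn’t show what that configuration change looks like, but as a rough sketch, enabling distributed caching in an Ehcache 2.x deployment backed by a Terracotta server array is a matter of a few XML lines. The cache name and server address below are placeholders, not values from the article:

```xml
<ehcache>
  <!-- Point the cache at a Terracotta server array (address is illustrative) -->
  <terracottaConfig url="localhost:9510"/>

  <cache name="agencyRecords"
         maxEntriesLocalHeap="10000"
         timeToLiveSeconds="300">
    <!-- This single element turns a local cache into a distributed one -->
    <terracotta/>
  </cache>
</ehcache>
```

The application code that calls the cache does not change; only the configuration file does, which is the point the article is making.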

3. Cost savings and enhanced storage capabilities. With a precipitous drop in the cost of RAM in the past decade, in-memory computing has become a budget-friendly option for federal agencies. When procurement officials can buy a server with 96 gigabytes of RAM for less than $5,000, in-memory storage of data makes smart fiscal and technical sense. Terabyte servers are sized to harness, in memory, the torrent of data coming from mobile devices, websites, sensors and other sources. An in-memory store can act as a central point of coordination for aggregation, distribution and instant access to big data at memory speeds.

For agencies that still rely on mainframes, in-memory computing holds even more appeal because a large portion of their overall IT budgets is likely dedicated to keeping those mainframes running. That is due in part to the way mainframes are traditionally licensed: by how many millions of instructions per second they perform, which is essentially a measurement of how much of the mainframe is used for processing. The more you use, the more you pay. Open-data initiatives are already pushing such costs upward, but by using in-memory computing to “move” data off their mainframes, agencies can reduce their costs by nearly 80 percent.

4. Higher throughput with real-time processing. In-memory computing significantly lowers system latency, which leads directly to dramatically higher throughput. Agencies that run high-volume transactions can use in-memory data to boost processing capacity without adding computing power. During real-time processing for some applications — such as fraud detection and network monitoring — delays of seconds, even milliseconds, won’t cut it. Acceptable performance requires real-time data access for ultra-fast processing, superior reactive response and proactive planning.

In-memory computing offers unprecedented opportunities for innovation within government. Agencies can transform how they access, analyze and act on data by building new capabilities that directly benefit the mission and help them achieve their goals faster.



Darryn, Graham. “4 ways in-memory computing can bring agencies up to speed – FCW.”

FCW. N.p., 24 Nov. 2014. Web. 12 Jan. 2016.

Continuous Diagnostics and Mitigation Program: A Game Changer.

An effective cybersecurity strategy requires more than a periodic safety check. That’s the thinking behind continuous monitoring, a risk management approach that seeks to keep organizations constantly apprised of their IT security status.

The National Institute of Standards and Technology describes continuous monitoring as providing an ongoing awareness of security threats and vulnerabilities. That approach provides a sharp contrast to what has been the federal norm of annual security reviews and more thorough recertifications every three years.

The rapid proliferation of malware and other cyberattacks encourages a faster monitoring tempo. IT security vendor Kaspersky Lab said in late 2013 that it was detecting 315,000 new malicious files each day, up from 200,000 new files per day the previous year. Panda Security, a security solutions provider, reported earlier this year that 20 percent of the malware that has ever existed was created in 2013.

As the onslaught continues, the federal sector has been taking steps to improve its situational awareness. Indeed, agencies have been following continuous monitoring directives and guidelines for a few years now. The Continuous Diagnostics and Mitigation program, which the Department of Homeland Security manages with support from the General Services Administration, is the government’s latest take on continuous monitoring. CDM provides a more comprehensive approach and makes funding available for agencies to adopt the security practice.

“The [CDM] program reflects the evolution of continuous diagnostic programs over the past 10 years,” a DHS official said.

However, Ron Ross, a NIST fellow, acknowledged that continuous monitoring is difficult given the number of IT systems in the federal sector and agencies’ diverse missions and business functions. “It is a big job to have a good continuous monitoring program so we can give senior leaders the best information that we can possibly give them,” he added.

Why it matters

The Federal Information Security Management Act (FISMA) of 2002 requires agencies to review their information security programs at least annually, and Office of Management and Budget Circular A-130 calls for agencies to review their systems’ security controls at least every three years.

The government’s current security push, however, favors a more dynamic approach. The emphasis on continuous monitoring reflects the realization that nothing stays the same in the IT environment. The threat landscape changes with each new attack vector and malware variety, while agencies’ systems and networks are subject to frequent reconfiguration.

As a result, a security regimen that keeps the IT infrastructure locked down today might not provide adequate protection tomorrow. The moment-to-moment vigilance of continuous monitoring seeks to ensure that an agency’s security controls remain relevant.


Editor’s Note: Ideas inspired by:

Moore, John. “Can CDM change the game? – FCW.”

FCW. N.p., 10 Oct. 2014. Web. 22 Dec. 2015.

The ‘Internet of Things’ may grow into the ‘Internet of Everything’

The term “Internet of Things” (IoT) refers to a network of physical objects, or “things,” embedded with electronics, software, sensors and network connectivity that enable them to collect and exchange data. IoT is an idea poised to change the entire Internet, and it is something that neither the government nor any agency can afford to ignore.

Internet researchers believe IoT is the future of the Internet, so agencies should gear up for the change. This much-hyped idea is not just an alarm bell; it is a signal that the entire market must evolve. IoT is riskier, more challenging and yet more rewarding than any technical arrangement deployed before it. Increasingly connected, sensor-laden and data-driven systems are poised to change everything from national security to office-space management. The catch is that implementing IoT generates far more data, and with it a degree of complexity that many agencies are not yet equipped to handle.

Cisco posits that IoT will generate $4.6 trillion for the public sector before 2025 in value added and costs saved (see Cisco’s report “Internet of Everything: A $4.6 Trillion Public-Sector Opportunity”). And although the General Services Administration (GSA) has not yet come close to those sorts of returns, the agency — which manages nearly 10,000 government-owned buildings around the country — has pioneered IoT building management with its GSALink initiative. Collecting 29 million data points per day from myriad sensors throughout its buildings, GSA is able to monitor everything from light use to humidity, enabling the agency to boost productivity and promote good health by optimizing conditions when workers are present and to save on energy costs when they’re not.

Other big adopters include the intelligence community and the Defense Department. Warfighters can benefit from sensors that improve their tactical awareness, while vitals monitors can help commanders know who’s healthy or injured. Gary Hall, chief technology officer for Federal Defense at Cisco, said, “I do see the Defense Department out in front [of IoT].” Hall added that there is plenty of room for crossover: municipal experiments with smart lighting or parking, for instance, could inform similar adoption on agency campuses or military bases. “I’ve been on a lot of military bases, and the parking situation could certainly be improved,” he quipped.

The term “Internet of Things” refers to the physical elements of a connected network — the “things” — while the term “Internet of Everything” is all-encompassing, including servers, sensors, the data flows between them, the people interpreting the data and even the people talking to other people about the system.

The most important question, however, remains unanswered: Can humans deal with the volume?

The number of connected “things” is expected to balloon from around 16 billion today to 50 billion by 2020, with skyrocketing data generation spurring a need for a 750 percent expansion in data center capacity. Hall pointed to the problem of “big, large data”: both the overall volume of data and the size of individual files have exploded, creating a need for pre-processing by machines rather than people. “Humans can’t deal with the volume of data we’re producing,” he said, and he concluded with a warning to government agencies: “It’s not something they can avoid.”

Editor’s Note: Ideas inspired by:

Noble, Zach. “Are agencies really ready for the Internet of Things? – FCW.”

FCW. N.p., 1 June 2015. Web. 10 Dec. 2015.

Government Login Systems: A Critical Business.

A website is the ultimate tool for keeping up with an ever-growing market. What makes one special? The architecture? The feel? The service? The ability for users to take control at their convenience? Whether it is a personal site or a government site, every website demands security, especially one that handles transactions.

A federal agency’s website is always on the front lines when it comes to delivering services to the public. A majority of Americans go online to seek government services; one survey estimated that 82 percent of U.S. Internet users search for information or complete transactions on government websites, including 40 percent who do so via smartphones. The most important ingredient, then, is security.

What makes a website secure? A login portal that authenticates users. But federal agencies each follow their own protocols, so individuals who want to access government applications and services generally must create a username and password for every agency site they visit, and each agency maintains its own identity management system to authenticate users. This approach invites redundancy: the same user maintains multiple passwords while the government maintains multiple systems for managing credentials. Security suffers as well; weak and stolen passwords rank among the top ways an online system can be compromised.

In response, the federal government has been moving toward an identity management approach that will let people use the same credential to conduct business with multiple agencies, thereby creating a common mechanism for transmitting identity information and introducing stronger authentication. “The usability of secure identity solutions is something that the market has been struggling to improve for years,” said Jeremy Grant, senior executive adviser for identity management at the National Institute of Standards and Technology. “We’ve had no problem developing ‘secure’ identity technologies, but if people don’t use them, then they really don’t offer much security.”

Since the passage of the E-Government Act of 2002, myriad federal services have emerged online. A 2014 Government Accountability Office report noted that agencies operate more than 11,000 websites. As more people make the Web their default choice for government interactions, the need to provide safe access has become even more important. The sharp rise in the use of mobile devices to access federal websites adds another dimension to the security challenge. The White House’s 2012 Digital Government Strategy states that “policies governing identity and credential management may need to be revised to allow the introduction of new solutions that work better in a mobile world.”

In 2009, the White House published a Cyberspace Policy Review that included the need to create a “cybersecurity-based identity management vision and strategy” on a list of 10 action items. That paper led to the launch in 2011 of the National Strategy for Trusted Identities in Cyberspace, which works with private- and public-sector entities to support the development of interoperable identity credentials. That move set the stage for a cloud-based, federated identity management solution.

A NIST-managed National Program Office coordinates NSTIC activities. The office collaborated with the General Services Administration to draft the requirements for the Federal Cloud Credential Exchange and awarded a contract to SecureKey Technologies in 2013 to create the exchange. FCCX was designed to let people use third-party credentials to access federal services online. In addition to improving the user experience, the governmentwide exchange would help agencies sidestep the cost of credentialing the same person numerous times.

FCCX is now known as Connect.gov and falls under the auspices of GSA. The program allows people to use digital credentials provided by government-approved sign-in partners to confirm their identities when requesting access to online government services. When they log in, users consent to share what Connect.gov describes as a “limited set of personally identifiable information.” Connect.gov then serves as the pipeline for transmitting identity information from the sign-in partner to the agency’s online application.

Editor’s Note: Ideas inspired by:

Moore, John. “How many parties does it take to provide a single government login? – FCW.”

FCW. N.p., 22 May 2015. Web. 26 Nov. 2015.

Sabre88 among the 2015 Washington Technology Fast 50

Newark, NJ – Global consulting firm Sabre88, LLC has been named among the Washington Technology Fast 50, which recognizes the 50 fastest-growing small businesses in the government market based on their compound annual growth rate over the past four years. Washington Technology is known as one of the authoritative sources of competitive intelligence for businesses providing contract services to the federal marketplace.

Sabre88 has grown fast enough to seal its spot on Washington Technology’s 2015 Fast 50, ranking 37th overall with $2.6 million in revenue in 2014 and a compound annual growth rate of 63.63 percent.

“It is an honor to be included amongst the Washington Technology Top 50,” said Robert Cottingham, Founder and CEO, Sabre88. “The secret to navigating today’s federal marketplace is to continue to be a step ahead of where you are right now.”

The full Fast 50 rankings, with details on each of these fast-growing companies, are available on Washington Technology’s website.

Highlights of the 2015 Washington Technology Fast 50

  • In total, 21 of the Fast 50 are making a second straight appearance; 29 are newcomers.
  • The designation distribution included nine small-business companies and 16 8(a) companies.
  • 16 of the companies have fewer than 50 employees.
  • 26 of the companies in the Fast 50 are IT service providers.

Company description

Sabre88 is a global consulting firm applying capabilities in financial services, billing support, FOIA, IT Help Desk Support, Data Entry and Document Scanning to government and commercial clients. With more than twenty years of combined personnel experience offering strategic solutions, Sabre88 staff advance the firm’s mission to provide civilian and defense agencies of the government with the tools they need to address emerging challenges. Sabre88 was formed in January 2008 with a mission to serve both civilian and defense agencies of the federal government. The founder, Robert Cottingham, Jr., started the firm in response to the government’s need for innovative small businesses that provide 100 percent customer-focused service.

Small FAR Phrasing Results in Major Impact on Small-Business Contracting Spending

The Federal Acquisition Regulation, established by the heads of several agencies, requires that all large companies bidding on prime contracts specify the percentage of awarded dollars that flows through to small-business subcontractors. Section 52.219-9 of the FAR, the Small Business Subcontracting Plan rules, was drafted to guarantee that small businesses have the maximum practicable opportunity to participate in performing contracts and to help the government meet its goal of awarding 35.9 percent of all subcontracting dollars to smaller companies. Collectively, agencies have failed to hit that mark for the past five years.

Particular phrasing in the FAR complicates the issue. Large companies are required to set aside a percentage of their planned subcontracting dollars for small businesses; however, that pledge must be stated as a percentage of total subcontract dollars, not a percentage of total contract dollars. This subtle but crucial distinction means a large prime contractor can pledge to commit 40 percent of its subcontracting dollars to small business, yet if the company provides services without the use of subcontractors, it still technically meets its small-business obligation. That lack of commitment threatens smaller firms, often seen as crucial engines of job creation in the United States, that rely on subcontracting dollars to continue operating.
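The distinction between the two baselines is easiest to see with numbers. The contract values below are hypothetical, chosen only to illustrate the arithmetic:

```java
// Illustrative only: how the same 40 percent small-business pledge
// plays out under the two baselines described above.
public class SubcontractPledge {
    public static void main(String[] args) {
        double totalContract = 10_000_000.0; // total prime contract dollars
        double subcontracted = 0.0;          // prime performs all work in-house
        double pledge = 0.40;                // 40 percent small-business pledge

        // Current FAR phrasing: percentage of *subcontract* dollars.
        // With no subcontracting, the pledge is 40 percent of nothing.
        double currentRule = pledge * subcontracted;   // $0 reaches small firms

        // Proposed phrasing: percentage of *total contract* dollars.
        double proposedRule = pledge * totalContract;  // $4,000,000 committed

        System.out.printf("Current rule: $%,.0f%nProposed rule: $%,.0f%n",
                currentRule, proposedRule);
    }
}
```

Under the current phrasing the prime fully satisfies its pledge while zero dollars reach small businesses, which is exactly the loophole the policy changes aim to close.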

Several federal departments have begun altering their procurement policies to require prime contractors to clearly state their small-business plans as a percentage of total contract dollars. In addition, the federal contracting community has begun urging officials to revise the regulations, a step that would require action by the Defense Department, the General Services Administration and NASA, the agencies responsible for overseeing and updating the rules.

The push to rework the language has arrived as agencies and policymakers seek ways to reserve more government work for small companies. The House passed the FY 2015 Department of Defense Appropriations Bill, which included measurable changes to long-standing small-business contracting rules. The most notable would streamline some of the bidding requirements for small firms, saving them time and money, and would lift the government’s annual small-business contracting target from 23 percent to 25 percent and its total subcontracting goal from 35.9 percent to 40 percent.

Sabre88, LLC Awarded Nuclear Regulatory Commission Contract in Region III


Newark, New Jersey, April 15, 2013 – Sabre88, LLC has been awarded a contract to provide Administrative Support Services to the U.S. Nuclear Regulatory Commission (NRC) in Region III, headquartered in Lisle, Illinois. Sabre88 is responsible for providing the management, supervision, benefits, employment, termination, oversight and assignment of all personnel needed to perform the requirements set forth in the contract.

Sabre88 will primarily be responsible for general typing using word processing equipment or other automated systems. Sabre88 will be required to produce letters, reports, memoranda and other documents of a technical and non-technical nature, and will ensure that all documents are proofread, grammatically correct and complete. Sabre88 will maintain a tracking system, either manually or by using an existing automated system, for tracking the status of controlled correspondence and action-office items within the assigned Region III office.

Sabre88 will also serve as receptionist for the assigned office. Those duties will include receiving telephone calls and visitors, making appointments for staff, and personally responding to routine inquiries involving non-technical information. Sabre88 will also arrange conferences and meetings for the staff and make the necessary arrangements for conference rooms, audiovisual equipment and other required materials, and will prepare travel itineraries, make arrangements for official travel, and prepare vouchers, trip reports, etc., as required by each traveler.

Sabre88 will also establish, maintain or revise an adequate filing system for the office and dispose of records in accordance with an approved records disposition schedule. Sabre88 will provide other needed support to the office staff, including scheduling appointments, disseminating mail, maintaining office supplies and forms, and copying. Sabre88 will be responsible for monitoring time and attendance for the employees and will perform electronic processing, profiling and filing of documents using the Agencywide Documents Access and Management System (ADAMS).

About Sabre88, LLC: Sabre88 is a global consulting firm applying capabilities in technology, public policy, international affairs, and education to government and commercial clients.



A. Lorraine Jones
Sabre88, LLC.
(973) 321-4886
(973) 833-0286