And the Award Goes to…Women-Owned Businesses!

Women-owned businesses can now count another victory: the federal government has, for the first time, met its goal of awarding five percent of its contracting dollars to businesses owned by women. The government defines a women-owned business as one that is at least 51 percent owned and controlled by women. The five percent goal was set back in 1994 (www.nytimes.com), which means it took the federal government twenty-two years to reach it.

According to the Small Business Administration (SBA), small businesses earned nearly 29 percent, or $90.7 billion, of the government’s contracting dollars during the 2015 fiscal year, which ended on September 30, 2015. Of that $90.7 billion, women-owned businesses captured nearly $18 billion. The government set this goal because companies owned by women tend to be younger and smaller than other businesses. Indeed, analysis by the Department of Commerce shows that women-owned businesses are 21 percent less likely to be awarded government contracts than small businesses that are not women-owned.

The infographic below explains the growth and importance of women-owned businesses in America:

This goal was achieved largely as a result of rules the government implemented in 2011, which required agencies to set aside specific contracts for bids from women-owned businesses only and made those businesses eligible for no-bid contracts. These rules allowed women-owned businesses not only to gain experience but also to build the past-performance record necessary to win other competitive projects. The government was also able to meet its goal thanks to the Small Business Administration, which increased its outreach efforts over the past several years to teach women entrepreneurs about federal procurement opportunities and guide them through the often complex process of preparing bids. Maria Contreras-Sweet, the 24th Administrator of the Small Business Administration, also made the achievement a top priority.

While five percent may seem like a small number, it is a significant achievement for women in the federal marketplace who have long been underrepresented. With this milestone, Washington has shown all Americans that if the government is determined enough, it can produce positive and impactful outcomes.


PaaS (Platform as a Service) Is the Next Step Toward the Cloud

Platform as a service (PaaS) is a category of cloud computing services that provides a platform allowing customers to develop, run, and manage applications without the complexity of building and maintaining the infrastructure typically associated with developing and launching an app. PaaS can be delivered in two ways: as a public cloud service from a provider, where the consumer controls software deployment and configuration settings and the provider supplies the networks, servers, storage, and other services needed to host the consumer’s application; or as software installed in private data centers or on public infrastructure as a service and managed by internal IT departments.

Platform as a service promises significant savings of both time and money. The biggest names in cloud computing all offer PaaS solutions — as do countless providers that specialize in everything from mapping to content management to mobile app development. A few offerings already comply with the Federal Risk and Authorization Management Program, and a 2013 survey of federal IT professionals found that 95 percent believed their agency would benefit from migrating to PaaS.

Part of the reason PaaS remains mysterious to many is a matter of structure and security. Many of the most popular early PaaS solutions, such as Heroku and Engine Yard, were available only in the public cloud, limiting their practical appeal for most federal agencies. Today, however, a wide array of PaaS providers offer private enterprise versions, and Pivotal’s Cloud Foundry and Red Hat’s OpenShift also come in downloadable, open-source versions that can be hosted locally or in a user’s own cloud.

A more significant challenge, however, might be pinning down what qualifies as PaaS. While software as a service (SaaS) is now a familiar concept and the paired pressures of FedRAMP and data center consolidation have put infrastructure as a service (IaaS) on most agencies’ radar, PaaS remains something of the muddle in the middle — more easily defined by what it isn’t than what it is. The National Institute of Standards and Technology has detailed the differences between PaaS and its sibling services, but it boils down to this: In addition to virtualized and easily scalable hardware, PaaS provides a ready-to-use suite of code libraries, change-management tools and other application-building resources that the provider installs and maintains.

Federal Communications Commission CIO David Bray said PaaS lets agencies “ideally begin to build up this library of reusable modules, much like a quilt,” so that functions such as user authentication or map-based data visualization can be built once and then used by many different systems. “Then in the future, if Congress…or the president asks us to do something, it’s not a matter of building a system from scratch.” Some organizations are already putting this into practice. “There are some early adopters scattered throughout government,” Bray said, particularly in the Defense Department and the intelligence community. However, PaaS remains aspirational for many agencies. In the 2013 survey (a Red Hat-sponsored MeriTalk study) that showed overwhelming belief in the benefits of PaaS, just 12 percent of respondents said they were already using it. And although 71 percent said they were at least considering a transition to PaaS, a recent search of FedBizOpps found just one solicitation in the past year that explicitly called for PaaS.

Other IT leaders said the slow embrace likely reflects uncertainty — not about PaaS’ potential benefits but about most agencies’ specific needs and the type of developer skills that will be available. Compared to IaaS, “PaaS has a greater degree of ease and efficiency, but it also comes with a significant loss of freedom,” one agency’s senior developer said. “The needs [can be] so diverse that paying for and committing to a platform as a service doesn’t make a lot of sense right now.”

A year to 18 months down the road, “once things settle down a bit,” the developer added, “that’s when we would commit to PaaS.” And even when an agency is prepared to zero in on a particular platform, there’s still the small matter of payment. With the operation and maintenance of legacy systems consuming 70 percent or more of agency IT budgets, there’s precious little money available to try something new — particularly when a PaaS investment cannot be directly tied to a mission system.

“That’s why we have to make the case to Congress for the initial investment” in PaaS, the FCC’s Bray said. “We need that little bit of breathing room so that we get out of the existing legacy model. Otherwise, the legacy model is just going to get more and more expensive.”


Editor’s Note: Ideas inspired by:


Tony K. Schneider. “Can PaaS carve out its place in the federal cloud?” FCW, 28 Feb. 2016. Web.

Who Will Protect the Government Against Cybercrime?

For the general public, identity theft, online security, and credit card fraud are an ever-increasing concern in the technology age in which we live. But what about the federal government? Most people may think the government is impenetrable to cyber threats, hackers, and security breaches, but this is far from true. In fact, according to the Department of Homeland Security, hackers are now probing the deepest layers of every federal agency. The good news is that the cyber intruders are from the Department of Homeland Security itself. That’s right: DHS is conducting exercises to test vulnerabilities in federal computer systems that contain sensitive data, systems that are also prime targets for genuinely malicious hackers.

The strategy is part of the greater Cybersecurity National Action Plan (CNAP), a plan proposed by President Obama that takes near-term actions and puts in place a long-term strategy to increase cybersecurity awareness and protections, protect privacy, maintain public safety as well as economic and national security, and empower Americans to take more control of their digital security (www.whitehouse.gov).

When we think about how much of our personal information is stored online (e-mail, bank information, online stores, bill payment information, etc.), it’s important to remember that going paperless gives experienced hackers easier access to our digitized information, especially when we operate on unprotected networks or reuse the same weak passwords across accounts. The infographic below explains how hackers think and how little regard they can have when it comes to stealing digital information.

After the 2015 breach of the Office of Personnel Management’s (OPM) systems, in which the information of 21.5 million United States national security professionals and their families was exposed, the Department of Homeland Security devised a civilian agency cybersecurity strategy as part of the Cybersecurity National Action Plan. The agency’s use of the aforementioned authorized hacking comes as federal agencies take stock of the information technology tools and databases that would send the government into disarray if they were compromised (www.nextgov.com).

Although this is not President Obama’s first endeavor to protect the United States against cybercrime, the plan does represent Obama’s last effort to ensure progress in the ongoing push to strengthen the nation’s online security. Since 2010, $73 billion has been spent to protect the nation against organized cybercrime, and the President’s most recent plan requests an additional $19 billion in funding to support cybersecurity activities, including a commission on enhancing national cybersecurity, public service campaigns, and funds for replacing antiquated, insecure government information technology.

Just ten years ago, the idea of spending billions on the nation’s digital safety would have been unthinkable, but in 2016 and beyond, the prospect of building multibillion-dollar information technology systems and security is all too real. According to nextgov.com, designing new and more secure systems is imperative: the latest high-profile hack of a Justice Department computer system leaked the contact information of 9,000 Department of Homeland Security personnel as well as 20,000 Federal Bureau of Investigation employees. That said, there is still much work to be done, as adversaries of the United States are quickly learning that it may be easier to attack the nation’s cyber networks, in both the public and private sectors, than to attack physical targets.

As the President stated on Tuesday, 9 February 2016, “More and more, keeping America safe is not just a matter of more tanks, more aircraft carriers; not just a matter of bolstering our security on the ground. It also requires us to bolster our security online” (www.nextgov.com). Obama’s plan allows the federal government to act on new information now and lays out the conditions necessary for long-term progress in the government’s approach to protecting the cybersecurity of the federal government, the private sector, and Americans’ personal lives (Fact Sheet: Cybersecurity National Action Plan). Thus, President Obama’s cybersecurity plan is designed not only to protect the government and individual citizens, but also to protect the companies that store vast amounts of sensitive data belonging to the American public.


Is Crowd GPS Technology Wooing the World?

Do you have a habit of forgetting things? A propensity for misplacing your car keys, only to hunt for them while you are running late? Well, technology companies have a solution for every little problem these days. When you’ve lost something, another set of eyes can spot clues that your own eyes inadvertently ignore.

In 2013, a crowdfunded project known as the Tile became a smash hit, racking up over $2,500,000 in funding from nearly 50,000 backers. The secret to its success? Simple: the Tile promised to help users locate any object attached to its coin-sized, Bluetooth-connected tag, priced at $20. A similar project, Phone Halo’s TrackR, was built on the same idea, and both products use essentially the same technology: a small, handy device (the TrackR is circular, the Tile square) that you attach to things you’d miss if they went missing. When those things inevitably do go missing, you can use a smartphone app to make the TrackR or Tile beep so you can find them. They also try to find your belongings when they are farther away from you. Both products rely on the concept of “crowdsourcing,” or crowd GPS.

Crowd GPS is based on the idea that if you can’t find something, say, your keys, maybe someone else can — as long as they also happen to be using the TrackR or Tile app (iOS and Android), with Bluetooth turned on and crowd GPS enabled. Your lost keys will give off a unique identifier that can be detected by other people’s apps, sending you GPS data about where they are.
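The crowd GPS mechanism described above can be sketched in a few lines of code. The class and method names below are invented for illustration and do not reflect Tile’s or TrackR’s actual software; the sketch simply assumes every phone running the app reports the identifier of any tag it detects, together with its own GPS fix, to a shared service that the tag’s owner can query.

```python
import time

class CrowdGPSService:
    """Toy in-memory stand-in for the vendor's cloud service (hypothetical)."""

    def __init__(self):
        # tag_id -> (latitude, longitude, unix timestamp of last sighting)
        self._sightings = {}

    def report_sighting(self, tag_id, lat, lon, timestamp=None):
        """Called by any user's app when it detects a tag over Bluetooth."""
        self._sightings[tag_id] = (lat, lon, timestamp or time.time())

    def locate(self, tag_id):
        """Called by the tag's owner; returns the last known sighting, or None."""
        return self._sightings.get(tag_id)

service = CrowdGPSService()
# A stranger's phone passively detects the lost keys' tag and reports it...
service.report_sighting("tile-8f3a", lat=38.8977, lon=-77.0365, timestamp=1456000000)
# ...so the owner's app can retrieve the crowd-sourced location.
print(service.locate("tile-8f3a"))  # -> (38.8977, -77.0365, 1456000000)
```

The key design point, mirrored in the sketch, is that the finder never has to do anything: detection and reporting happen passively in the background of every participating app.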

In addition to using the crowd to learn your lost item’s GPS coordinates, the TrackR app also helps you find things that are close by, and alerts you before you walk away from a spot without bringing your phone or TrackR-labeled device. This works by setting off an alarm on the device when it and your phone are separated by more than 100 feet. Likewise, if you press a tiny button on the TrackR, it can locate your iPhone or Android phone by setting off an alarm on the phone, even if the phone is in silent mode.

So what happens when your Tile or TrackR can’t be located by going back to the last place your app saw it?

Tile calls it the “Community Find” feature. Turns out, every person who keeps the Tile app open on their iOS device becomes a node in a much larger Tile network. For example, were there 5 Tiles at the Starbucks this morning? Your Tile app took note of them. Your cubicle mate left their Tiled keys at their desk during lunch while you worked straight through? Your Tile app knows that too, even if you and your cube mate don’t. The same will be true for your Tiles.

If there’s a killer ingredient to the Tile, this is it: by leveraging the combined tracking power of thousands of Tile users (er, Tilers?), that paltry 150-foot Bluetooth radius is amplified many times over.

While Tile’s ability to notify you of nearby lost items via alarms is helpful, it’s not unique. Tile needs enough people using its app to make its crowd tracking worthwhile, and people won’t likely use the app unless they own a device. The Community Find method relies on people having the Tile app installed, and running, on their iOS devices; if the app is closed, it cannot track the presence of Tiles.

One key factor to remember with both of these networks is that their crowd GPS techniques rely on strong communities of users. That means that people who live in more densely populated areas, like big cities, are more likely to have luck when tapping the crowd for finding lost things. And, of course, the product has to have a lot of people using it in order for the crowd GPS to really work well.


Could Pay Disparities Ever Become an Issue of the Past?

If you have ever been curious about whether your salary is the same as, less than, or more than that of a co-worker who performs the same job duties as you, you are not alone. In fact, pay inequality is still a huge issue in America, with women, especially minority women, experiencing the greatest disparities in wages and benefits. In light of this disparity, the federal government is taking steps to mitigate this long-standing wage gap.

Indeed, on Friday, 29 January 2016, President Barack Obama announced an executive action that would require companies with 100 or more employees to report to the federal government how much they pay their employees, broken down by race, gender, and ethnicity (www.govexec.com). In 2009, President Obama signed into law the Lilly Ledbetter Fair Pay Act, which allows employees to file equal-pay lawsuits for up to six months after a discriminatory paycheck. The President’s latest proposal, the Paycheck Fairness Act, builds on the Lilly Ledbetter Fair Pay Act, and the new pay-data reporting rule is being jointly published by the United States Equal Employment Opportunity Commission (EEOC) and the Department of Labor (www.govexec.com).

In 2016, Caucasian women are paid 77 cents on average for every dollar paid to men, while African-American and Hispanic women earn only 64 cents and 55 cents, respectively. For women, the gender pay gap reduces lifetime earnings and also shrinks pensions, a substantial reason for the impoverishment of women later in life (Gender Pay Gap and the Struggle for Equal Pay). Furthermore, unequal pay based on gender is problematic because women who work full-time, year round in the United States earn $10,876 less per year in median earnings than their male counterparts, an economic imbalance for women and their families.
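The cents-on-the-dollar figures above translate directly into annual dollar shortfalls. The short sketch below works through that arithmetic; note that the $50,000 median male salary is a hypothetical round number chosen for illustration, not a figure from the article.

```python
# Cents earned per dollar a man earns, as cited in the article.
cents_per_dollar = {
    "Caucasian women": 0.77,
    "African-American women": 0.64,
    "Hispanic women": 0.55,
}

# Hypothetical median annual earnings for men (illustrative only).
median_male_earnings = 50_000

for group, ratio in cents_per_dollar.items():
    earnings = median_male_earnings * ratio          # what the group earns
    shortfall = median_male_earnings - earnings      # the annual gap
    print(f"{group}: ${earnings:,.0f} earned, ${shortfall:,.0f} short per year")
```

Compounded over a career, those per-year shortfalls are what drive the lifetime-earnings and pension effects described above.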

On Friday, 5 February 2016, President Obama will call on Congress to pass the Paycheck Fairness Act. If passed, the act will establish an essential foundation for progress toward equal pay and will encourage greater voluntary compliance by employers with existing federal pay laws, including evaluating how they currently pay their employees (www.whitehouse.gov).

Although research shows little progress in closing the pay gap (neither men nor women have seen a meaningful increase in median earnings since 2009), unequal pay in the workplace is unacceptable. If employers truly want to level the playing field, they must be more transparent about pay and compensation. The infographic below explains why the Paycheck Fairness Act is so important for advancing equality in the workplace.

The Paycheck Fairness Act can be seen as a pragmatic measure to ensure that women are paid the same as their male counterparts. The federal government is taking steps in the right direction to eliminate the gender pay gap, and in the meantime, the private sector can do its part by disclosing its own gender pay gaps as a first step toward closing them.

GSA’s Next Move: Making Acquisition Data Public

The General Services Administration is set to open its government-facing Acquisition Gateway to public users in the coming days, allowing contractors and industry acquisition professionals similar access to the aggregated acquisition data that federal acquisition workers now have through the portal.

The gateway is a key to the category management practices that the White House hopes will help the federal government act like large corporations do to get a better handle on the ocean of goods and services it buys every year.

Ten “hallways” offering pricing data on goods and services ranging from IT to office supplies are open on the gateway, said Anne Rung, administrator of the Office of Federal Procurement Policy, in a conference call with reporters following a Jan. 27 online demonstration of the portal. The gateway provides information on pricing, best practices covering acquisition, and models on how to implement those practices, all with an eye to helping federal program officers draft better requirements and federal contracting officers negotiate better contracts.

The portal incorporates user-centric features like “thumbs-up” (or -down) feedback on information; a “solution finder” that lets users enter what they’re looking for and get a refined set of solutions defined by their specific needs; a function that allows federal contracting personnel to “follow” more experienced workers through acquisitions; and the ability to create communities of users with similar challenges.

“A lot of these resources have been created by government,” said Laura Stanton, acting director of strategy management in GSA’s Federal Acquisition Service. Finding them all in one, easily accessible place, however, had been “challenging” before the gateway. GSA has been working on the gateway with federal users since October 2014, and Stanton said federal use has been growing. There are now 5,000 federal acquisition employees enrolled to use the resource. Rung said the goal is to get 10,000 by the end of the year.

In the last few months, Stanton said, GSA has accelerated its efforts to get federal employees to use the portal.

GSA personnel also have been working behind the scenes to manage the information on each hallway, but Rung said GSA will soon name 10 managers responsible for procuring and curating information for all the categories. GSA has not specified an exact date, but John Felleman, the senior innovation specialist overseeing development of the gateway, said the public roll-out will come in a few days. That public access, he said, will allow contractors to see most of what federal workers do while unobtrusively shielding potentially sensitive information.

Public access, Stanton and Felleman said, is an important stage. Commercial contractors eventually will be able to offer insights for the benefit of federal contracting officers, Felleman said — for instance, information on how to write the most effective and efficient statement of work in a procurement.



Mark Rockwell. “GSA inches towards making acquisition data public.” FCW, 28 Jan. 2016. Web.

Your Next Job is Hiding in Plain Sight

When it comes to employment and the laws of supply and demand, job seekers will always hope for a large supply of jobs that equally demands their labor. So, for those who are currently searching for work, it comes as exciting news that there is an abundance of job vacancies around the globe. According to govexec.com, job vacancies in dozens of countries are plentiful. In the United States alone, there were 5.4 million unfilled jobs in 2015, the highest number in 15 years (www.govexec.com). Furthermore, countries like Germany, Canada, India, and the United Kingdom attract more workers than they lose, an indication of a healthy, growing economy.

Although there may be an abundance of jobs worldwide, companies are finding it increasingly difficult to recruit qualified candidates, especially for technology-related positions. This mismatch is unfortunate for employers because it forces them to recruit workers from abroad, where the pool of experienced and talented workers is larger. However, this poses a problem for countries where more citizens are leaving than arriving.

The infographic below explains what qualities job seekers should have in order to be seen as an asset to employers:

Although this infographic is not a definitive list of the qualities employers find desirable, it does highlight some of the most important ones a job seeker should strive to embody. Within the labor market, employers tend to hire people with familiar qualities and abilities, but this lack of candidate diversity can shrink a country’s supply of talent, ultimately stunting growth and prosperity. While the United States and European countries attract more workers than they lose, countries like China, Israel, and Sweden experience just the opposite, because employers there largely fail to capture job seekers’ interest and, with it, their relocation.

Therefore, both employers and employees should seek ways to stand out amongst the crowd in ways that are appealing to the masses, yet specific enough to add value to a particular career sector. If countries wish to flourish economically, they should hold on tightly to valuable employees and encourage employers to find ways to improve their desirability to job seeking citizens who may be looking to find employment abroad.

In-Memory Computing Can Help Agencies Accelerate

A profound shift is underway across government that affects how data is stored, accessed and processed. Federal agencies with mission-critical applications are learning to unshackle themselves from slow, disk-bound databases and embrace the tremendous benefits that come with managing data in-memory.

As agencies deal with aging software applications or seek to launch new public-facing Web portals, ensuring system performance is paramount. Caching data in RAM puts the data closer to the user and the applications that are trying to access it. It’s analogous to a chef who puts his knives and ingredients closer to his prep station rather than in remote pantries. Reducing that distance can make a significant impact on how long it takes the chef to prepare a meal.

For agencies that want to help their applications gain speed, in-memory computing works the same way — and can drastically and cost-effectively boost application performance.

Most traditional applications are constructed so that every time a user wants information from the system’s vast database, the application or website has to read that data by querying the database. The growing number of concurrent users an application might have and the growing amount of data that is likely being added to the database often create a huge and unnecessary bottleneck. Furthermore, the data that comes back from the database typically must be converted into application objects in order for the application to use it.

Addressing that choke point is vital to unlocking application speed. Storing data in a layer above the database, called a cache, makes data access exponentially faster and reduces connections to the database. The result can be an end to the performance issues plaguing many applications. Using in-memory data is the road to success for agencies that need to showcase system improvements quickly.
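The caching pattern described above can be sketched in a few lines. This is a minimal, hypothetical illustration of a read-through cache, not any vendor's product: `slow_db_query` stands in for a disk-bound database call, and the cache sits between the application and that database so the first lookup pays the full cost while repeats are served from RAM.

```python
import time

def slow_db_query(key):
    """Stand-in for a real database call (simulates disk/network latency)."""
    time.sleep(0.05)
    return {"record_id": key, "payload": f"data-for-{key}"}

class ReadThroughCache:
    def __init__(self, loader):
        self._loader = loader
        self._store = {}   # the in-memory layer above the database
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self._store:
            self.hits += 1                         # served from RAM, no DB trip
        else:
            self.misses += 1
            self._store[key] = self._loader(key)   # read through on a miss
        return self._store[key]

cache = ReadThroughCache(slow_db_query)
cache.get(42)   # miss: pays the database cost once
cache.get(42)   # hit: served from memory in microseconds
print(cache.hits, cache.misses)  # -> 1 1
```

The design choice worth noting is that the cache, not the application, decides when to fall back to the database, which is why adding a caching layer typically requires few changes to application code.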

Although the decline in the cost of RAM is clearly attractive to budget-conscious agencies, it’s not the only benefit that appeals to federal IT leaders. Four key reasons stand out when discussing why in-memory computing is making inroads at agencies:

1. Speed and acceleration — perfect for today’s analytical needs. In-memory data is accessed in microseconds, resulting in immediate, near-real-time access to critical data. Imagine retrieving data nearly 100 times faster than is possible from disk-based storage accessed across a network. No matter where data resides — whether in an application, in the cloud or within a remote sensor — federal agencies will no longer be dogged by slow, inefficient data movement, and users will no longer need to wait for a report of days-old data. With in-memory computing, federal IT teams can analyze data at a speed that improves its relevancy in decision-making, helping agencies meet ever-shrinking decision windows.

2. Easy to use and easy to add. In-memory computing satisfies the “need it now” demands of users waiting for tabulations and evaluations. There is also no simpler way to store data than in its native format in memory. Most in-memory solutions are no longer database-specific, which makes it easy to add to an agency’s current system and platform. No complex APIs, libraries or interfaces are typically required, and there’s no overhead added by conversion into a relational or columnar format. That is true even for agencies that have custom-developed solutions based on open-source technology. For instance, the Java-based standard for caching, Ehcache, is available in an enterprise version. That means agencies running commercial software or open-source applications can turn on the power of distributed caching by changing just a few lines of configuration. There’s no need to rewrite code or rip up applications.

3. Cost savings and enhanced storage capabilities. With a precipitous drop in the cost of RAM in the past decade, in-memory computing has become a budget-friendly option for federal agencies. When procurement officials can buy a 96-gigabyte server for less than $5,000, in-memory storage of data makes smart fiscal and technical sense. Terabyte servers are sized to harness, in memory, the torrent of data coming from mobile devices, websites, sensors and other sources. An in-memory store can act as a central point of coordination for aggregation, distribution and instant access to big data at memory speeds.

For agencies that still rely on mainframes, in-memory computing holds even more appeal because a large portion of their overall IT budgets is likely dedicated to keeping those mainframes running. That is due in part to the way mainframes are traditionally licensed: by how many millions of instructions per second they perform, which is essentially a measurement of how much of the mainframe is used for processing. The more you use, the more you pay. Open-data initiatives are already pushing such costs upward, but by using in-memory computing to “move” data off their mainframes, agencies can reduce their costs by nearly 80 percent.

4. Higher throughput with real-time processing. In-memory computing significantly lowers system latency, which leads directly to dramatically higher throughput. Agencies that run high-volume transactions can use in-memory data to boost processing capacity without adding computing power. During real-time processing for some applications — such as fraud detection and network monitoring — delays of seconds, even milliseconds, won’t cut it. Acceptable performance requires real-time data access for ultra-fast processing, superior reactive response and proactive planning.

In-memory computing offers unprecedented opportunities for innovation within government. Agencies can transform how they access, analyze and act on data by building new capabilities that directly benefit the mission and help them achieve their goals faster.



Darryn Graham. “4 ways in-memory computing can bring agencies up to speed.” FCW, 24 Nov. 2014. Web.

Switching the Career Gears to the Millennial Generation

As the American population ages, the workplace is transforming in more ways than one. Those belonging to the baby boomer generation have been working all of their adult lives, but are now either retired or nearing retirement. Among the current generation of working baby boomers, one worry stands out more than most: the idea that retirement will not occur by age 65. Fortunately, baby boomers (those born between 1946 and 1965) need not fret, because according to govexec.com this fear is largely unfounded. Indeed, by age 68, only 16 percent of baby boomers will still be working full-time, while millennials will make up 75 percent of the workforce by 2026 (www.govexec.com).

In preparation for the career world, millennials are obtaining college degrees and rapidly entering the workforce just in time for baby boomers to make their mass exodus. This is good news for both generations: as baby boomers reach retirement, millennials are more than ready to begin building careers and stepping into vacant leadership positions once held by their parents and grandparents.

While millennials are eager to become career women and men, 24 percent of millennials feel that their formal education has not prepared them for leadership roles in the workplace, which presents both a challenge and an opportunity for employers. Although employers will bear the burden of training millennials to become leaders, they may also gain a competitive advantage: companies that establish millennials as leaders should grow faster and more profitably than those that are reluctant to place millennials in leadership positions (www.govexec.com).

As the above infographic shows, the millennial generation is relatively unconventional compared with Generation X and the baby boomers because millennials came of age during a period of major technological innovation and economic fluctuation in a globalizing world. As a result, millennials have adopted different behaviors and have had experiences unlike those of their parents. In the workplace, those behaviors and experiences are expected to produce new leadership styles and work environments. In fact, millennials are expected to cultivate greater employee engagement and retention because they place a high value on mentors, leadership, and learning new things, which should make a real impact on organizational success during this generational career transition (www.govexec.com).

The millennial generation is the biggest in United States history, boasting a population of 92 million young adults (www.goldmansachs.com). If companies want to succeed, it looks as though they will have to put great emphasis on leadership development programs, digital and technological training, employee empowerment, transparency and objectivity in performance criteria, and goal-oriented work.

Baby boomers will not be working forever, and organizations will need millennials who can motivate excellence in others to foster and achieve long-term business success. After all, this is the millennials' world, and we are all just living in it.

Continuous Diagnostics and Mitigation Program: A Game Changer

An effective cybersecurity strategy requires more than a periodic safety check. That’s the thinking behind continuous monitoring, a risk management approach that seeks to keep organizations constantly apprised of their IT security status.

The National Institute of Standards and Technology describes continuous monitoring as providing an ongoing awareness of security threats and vulnerabilities. That approach provides a sharp contrast to what has been the federal norm of annual security reviews and more thorough recertifications every three years.

The rapid proliferation of malware and other cyberattacks encourages a faster monitoring tempo. IT security vendor Kaspersky Lab said in late 2013 that it was detecting 315,000 new malicious files each day, up from 200,000 new files per day the previous year. Panda Security, a security solutions provider, reported earlier this year that 20 percent of the malware that has ever existed was created in 2013.

As the onslaught continues, the federal sector has been taking steps to improve its situational awareness. Indeed, agencies have been following continuous monitoring directives and guidelines for a few years now. The Continuous Diagnostics and Mitigation program, which the Department of Homeland Security manages with support from the General Services Administration, is the government’s latest take on continuous monitoring. CDM provides a more comprehensive approach and makes funding available for agencies to adopt the security practice.

"The [CDM] program reflects the evolution of continuous diagnostic programs over the past 10 years," a DHS official said.

However, Ron Ross, a NIST fellow, acknowledged that continuous monitoring is difficult given the number of IT systems in the federal sector and agencies’ diverse missions and business functions. “It is a big job to have a good continuous monitoring program so we can give senior leaders the best information that we can possibly give them,” he added.

Why it matters

The Federal Information Security Management Act (FISMA) of 2002 requires agencies to review their information security programs at least annually, and Office of Management and Budget Circular A-130 calls for agencies to review their systems’ security controls at least every three years.

The government’s current security push, however, favors a more dynamic approach. The emphasis on continuous monitoring reflects the realization that nothing stays the same in the IT environment. The threat landscape changes with each new attack vector and malware variety, while agencies’ systems and networks are subject to frequent reconfiguration.

As a result, a security regimen that keeps the IT infrastructure locked down today might not provide adequate protection tomorrow. The moment-to-moment vigilance of continuous monitoring seeks to ensure that an agency’s security controls remain relevant.
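The core idea of continuous monitoring, running the same security checks on a short cycle instead of at a scheduled audit, can be sketched in a few lines. This is a hypothetical illustration only: the check functions, asset fields, and findings are invented for the example and do not represent the actual CDM program's tooling.

```python
def run_security_checks(assets):
    """Return (asset, finding) pairs for assets that fail a basic check."""
    findings = []
    for asset in assets:
        if not asset.get("patched", False):
            findings.append((asset["name"], "missing patches"))
        if asset.get("open_ports", []):
            findings.append((asset["name"], "unexpected open ports"))
    return findings

# Under an annual-review model, these checks run once a year, so a new
# misconfiguration can sit unnoticed for months. Under continuous
# monitoring, the same checks run on a short cycle and raise an alert
# almost as soon as an asset drifts out of compliance.
assets = [
    {"name": "web-server-01", "patched": True, "open_ports": []},
    {"name": "db-server-02", "patched": False, "open_ports": [23]},
]

findings = run_security_checks(assets)
for name, issue in findings:
    print(f"ALERT {name}: {issue}")
```

The checks themselves are identical in both models; what continuous monitoring changes is how often they run, which is why it shortens the window between a configuration drifting and someone noticing.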


Editor’s Note: Ideas inspired by:


Moore, John. "Can CDM Change the Game?" FCW. N.p., 10 Oct. 2014. Web. 22 Dec. 2015.