Agencies get another tool for recruiting and hiring student interns, OPM says

Thanks to the Office of Personnel Management, federal agencies will have new tools and resources to help them recruit student interns, and will be able to use strategic hiring flexibilities to bring students into specific temporary positions. In a final rule focused on hiring authority for post-secondary students, OPM specified that eligible students are those pursuing a bachelor's or graduate degree.

The temporary positions will last from one to four years, depending on the term appointment. While in school, students must work at the General Schedule 11 level or below. Those who have obtained their degree and meet the further eligibility requirements (such as working a minimum of 640 hours during their employment) may be appointed to a permanent position.

The Biden administration has noted that agencies struggle to hire entry-level employees under the age of 30 compared with 10 years ago. In essence, this new authority gives students in academic programs the opportunity to get paid while in school, all while contributing the skills they bring to agencies. Freed from normal competitive hiring procedures, agencies can use different methods to recruit diverse, qualified students, for example by advertising programs on their own websites or through third-party platforms, creating flexibility in whom they recruit and how they identify applicants. With this new authority, vacancies, retirements, and budget cuts pose less of a hindrance to federal agencies.

By: D’Andrea Tucker

Source: Agencies get another tool for recruiting and hiring student interns, OPM says | Federal News Network

NSA wants you to be careful where you log in to telework

The National Security Agency (NSA) urges federal workers to be more mindful and careful about logging in to telework from public places, such as a local Starbucks or coffee shop. As teleworking becomes more common, NSA believes federal employees and contractors should prioritize securing their data in public: public Wi-Fi networks are often unsecured, and active Bluetooth signals can expose private information on your devices.

As the United States faces mounting cyber threats, the Cybersecurity and Infrastructure Security Agency (CISA) has launched a public-private partnership to develop defense strategies and incident response plans against ransomware and other threats. To strengthen the cyber workforce, CISA has provided training, certification prep courses, and resources to help federal employees sharpen their cybersecurity skills.

A related bill specifies that the training should cover ethical practices, security risks, and a thorough understanding of the challenges AI technology poses. The bill is being sent to the Senate floor for a vote.

By: D’Andrea Tucker



Labor Day


On September 5, 1882, New York City celebrated the first Labor Day to pay tribute to the social and economic achievements of the American worker. It was not, however, a national holiday at the time, and the second "Labor Day," celebrated in 1884, was proposed by the Central Labor Union and known as a "Workingmen's Holiday."

How it began:

While most national holidays have a known origin, Labor Day's origins are murkier: to this day, no one knows for certain who first proposed a workers' holiday.

The first person thought to have proposed the holiday was Peter J. McGuire, general secretary of the Brotherhood of Carpenters and Joiners and a co-founder of the American Federation of Labor. He was said to be the first to suggest a specific day to honor those "who from rude nature have delved and carved all the grandeur we behold."

However, some research suggests that McGuire was not the first to propose the idea. It is said that in 1882, Matthew Maguire, a machinist for the International Association of Machinists in Paterson, N.J., proposed the holiday while serving as secretary of the Central Labor Union in New York.


The first bill to recognize Labor Day was introduced in New York, but Oregon was the first state to pass such a law, on February 21, 1887. That same year, Colorado, Massachusetts, New Jersey, and New York also enacted legislation creating the Labor Day holiday.

By the end of 1889, Connecticut, Nebraska, and Pennsylvania had enacted the same legislation. By 1894, 23 more states had followed suit in recognizing Labor Day as a holiday, and on June 28, 1894, the holiday was adopted at the federal level, to be celebrated on the first Monday in September each year.


Highest in 17 Years – Share of U.S. Small Businesses Raising Pay Peaked in April 2018

As mentioned in a previous article, "Best Time for Small Business," small businesses are having a hard time finding skilled workers. With supply low and demand high, the pay packages small business owners offer are becoming more and more aggressive.

According to statistics from the National Federation of Independent Business (NFIB), 35% of small businesses reported job openings this season, a significant increase. In response to the tight job market, 33% reported raising compensation, the highest share since November 2000.

“Small businesses are telling us that they’re optimistic, hiring, and willing to raise wages to find the right employees for their businesses,” said NFIB President and CEO Juanita Duggan.

With 53% of small businesses reporting job openings this year, up to 30% said they plan to keep increasing pay in order to attract the most qualified workers.


American Manufacturing: Past and Future

New proposals to revive American manufacturing seem to appear every day from politicians, industry leaders, and engineers. Many of these ideas and suggestions have merit; however, they fail to recognize that manufacturing has been declining for decades and cannot be revived simply by tax cuts or trade policy. The United States was primarily an agricultural economy through the 19th century. The Industrial Revolution of the late 19th century then swept the landscape and transformed America forever. By the 1950s, America stood as the industrial powerhouse of the world, producing more goods than any other nation. Manufacturing stayed strong until the late 1970s and 1980s, when the U.S. lost its edge first to Japan and then to China, and the country has since become a service economy that produces far fewer goods.

America has long been a service-based economy: more Americans have been employed in the service sector than in manufacturing since the turn of the 20th century. Still, it is noteworthy that U.S. manufacturing has declined dramatically in the past few decades. Industrial strongholds such as Detroit and Tennessee fell victim to the migration of American manufacturing. The loss manifested itself most clearly in job losses; according to The Economist, for the first time since the Industrial Revolution, fewer than 10 percent of American workers are employed in manufacturing.

In recent years, however, we have seen a comeback from some manufacturers in America, even Japanese ones. A proposed $1.6 billion manufacturing plant from Toyota and Mazda has set off a bidding war among the states. The plant, still in the solicitation process of finding a home state, is estimated to employ 4,000 workers and bring along a few thousand indirect jobs. It would be a major boost for the American economy and its manufacturing sector.

Author: Faris Souman, Sabre88 LLC

Editor’s note: Original Sources

In-memory computing can bring agencies up to speed

A profound shift is underway across government that affects how data is stored, accessed and processed. Federal agencies with mission-critical applications are learning to unshackle themselves from slow, disk-bound databases and embrace the tremendous benefits that come with managing data in-memory.

As agencies deal with aging software applications or seek to launch new public-facing Web portals, ensuring system performance is paramount. Caching data in RAM puts the data closer to the user and the applications that are trying to access it. It’s analogous to a chef who puts his knives and ingredients closer to his prep station rather than in remote pantries. Reducing that distance can make a significant impact on how long it takes the chef to prepare a meal.

For agencies that want to help their applications gain speed, in-memory computing works the same way — and can drastically and cost-effectively boost application performance.

Most traditional applications are constructed so that every time a user wants information from the system’s vast database, the application or website has to read that data by querying the database. The growing number of concurrent users an application might have and the growing amount of data that is likely being added to the database often create a huge and unnecessary bottleneck. Furthermore, the data that comes back from the database typically must be converted into application objects in order for the application to use it.

Addressing that choke point is vital to unlocking application speed. Storing data in a layer above the database — called a cache — allows data access to become exponentially faster and reduces connections to the database. The result is an end to the performance issues plaguing most applications. Using in-memory data is the road to success for agencies when they need to showcase system improvements quickly.
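The cache-aside idea described above can be sketched in a few lines. This is a minimal illustration, not any agency's implementation; `slow_db_query` is a hypothetical stand-in for a disk-bound database call, and a plain dictionary plays the role of the RAM cache layer.

```python
import time

def slow_db_query(key):
    """Hypothetical stand-in for a disk-bound database read."""
    time.sleep(0.05)  # simulate disk and network latency
    return {"id": key, "value": f"record-{key}"}

cache = {}  # the in-memory layer sitting above the database

def get_record(key):
    """Cache-aside read: check RAM first, fall back to the database."""
    if key in cache:
        return cache[key]          # fast path: served from memory
    record = slow_db_query(key)    # slow path: one trip to the database
    cache[key] = record            # keep it in RAM for the next caller
    return record

# The first read pays the full database cost; repeats come from memory.
get_record(7)
start = time.perf_counter()
get_record(7)                      # served from the cache
elapsed = time.perf_counter() - start
```

Each cache hit is one less connection to the database, which is exactly how the choke point described above gets relieved.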

Although the decline in the cost of RAM is clearly attractive to budget-conscious agencies, it’s not the only benefit that appeals to federal IT leaders. Four key reasons stand out when discussing why in-memory computing is making inroads at agencies:

1. Speed and acceleration — perfect for today’s analytical needs. In-memory data is accessed in microseconds, resulting in immediate, near-real-time access to critical data. Imagine retrieving data nearly 100 times faster than is possible from disk-based storage accessed across a network. No matter where data resides — whether in an application, in the cloud or within a remote sensor — federal agencies will no longer be dogged by a lack of fast and efficient movement, and users will no longer need to wait for a report of days-old data. With in-memory computing, federal IT teams can analyze data at a speed that improves its relevancy in decision-making, helping agencies meet ever-shrinking decision windows.

2. Easy to use and easy to add. In-memory computing satisfies the “need it now” demands of users waiting for tabulations and evaluations. There is also no simpler way to store data than in its native format in memory. Most in-memory solutions are no longer database-specific, which makes it easy to add to an agency’s current system and platform. No complex APIs, libraries or interfaces are typically required, and there’s no overhead added by conversion into a relational or columnar format. That is true even for agencies that have custom-developed solutions based on open-source technology. For instance, the Java-based standard for caching, Ehcache, is available in an enterprise version. That means agencies running commercial software or open-source applications can turn on the power of distributed caching by changing just a few lines of configuration. There’s no need to rewrite code or rip up applications.

3. Cost savings and enhanced storage capabilities. With a precipitous drop in the cost of RAM in the past decade, in-memory computing has become a budget-friendly option for federal agencies. When procurement officials can buy a 96 gigabyte server for less than $5,000, in-memory storage of data makes smart fiscal and technical sense. Terabyte servers are sized to harness, in memory, the torrent of data coming from mobile devices, websites, sensors and other sources. An in-memory store can act as a central point of coordination for aggregation, distribution and instant access to big data at memory speeds.

For agencies that still rely on mainframes, in-memory computing holds even more appeal because a large portion of their overall IT budgets is likely dedicated to keeping those mainframes running. That is due in part to the way mainframes are traditionally licensed: by how many millions of instructions per second they perform, which is essentially a measurement of how much of the mainframe is used for processing. The more you use, the more you pay. Open-data initiatives are already pushing such costs upward, but by using in-memory computing to “move” data off their mainframes, agencies can reduce their costs by nearly 80 percent.

4. Higher throughput with real-time processing. In-memory computing significantly lowers system latency, which leads directly to dramatically higher throughput. Agencies that run high-volume transactions can use in-memory data to boost processing capacity without adding computing power. During real-time processing for some applications — such as fraud detection and network monitoring — delays of seconds, even milliseconds, won’t cut it. Acceptable performance requires real-time data access for ultra-fast processing, superior reactive response and proactive planning.

In-memory computing offers unprecedented opportunities for innovation within government. Agencies can transform how they access, analyze and act on data by building new capabilities that directly benefit the mission and help them achieve their goals faster.



Graham, Darryn. "4 ways in-memory computing can bring agencies up to speed– FCW."

FCW. N.p., 24 Nov 2014. Web. 12 Jan. 2016.

Continuous Diagnostics and Mitigation Program: A Game Changer

An effective cybersecurity strategy requires more than a periodic safety check. That’s the thinking behind continuous monitoring, a risk management approach that seeks to keep organizations constantly apprised of their IT security status.

The National Institute of Standards and Technology describes continuous monitoring as providing an ongoing awareness of security threats and vulnerabilities. That approach provides a sharp contrast to what has been the federal norm of annual security reviews and more thorough recertifications every three years.

The rapid proliferation of malware and other cyberattacks encourages a faster monitoring tempo. IT security vendor Kaspersky Lab said in late 2013 that it was detecting 315,000 new malicious files each day, up from 200,000 new files per day the previous year. Panda Security, a security solutions provider, reported earlier this year that 20 percent of the malware that has ever existed was created in 2013.

As the onslaught continues, the federal sector has been taking steps to improve its situational awareness. Indeed, agencies have been following continuous monitoring directives and guidelines for a few years now. The Continuous Diagnostics and Mitigation program, which the Department of Homeland Security manages with support from the General Services Administration, is the government’s latest take on continuous monitoring. CDM provides a more comprehensive approach and makes funding available for agencies to adopt the security practice.

"The [CDM] program reflects the evolution of continuous diagnostic programs over the past 10 years," a DHS official said.

However, Ron Ross, a NIST fellow, acknowledged that continuous monitoring is difficult given the number of IT systems in the federal sector and agencies’ diverse missions and business functions. “It is a big job to have a good continuous monitoring program so we can give senior leaders the best information that we can possibly give them,” he added.

Why it matters

The Federal Information Security Management Act (FISMA) of 2002 requires agencies to review their information security programs at least annually, and Office of Management and Budget Circular A-130 calls for agencies to review their systems’ security controls at least every three years.

The government’s current security push, however, favors a more dynamic approach. The emphasis on continuous monitoring reflects the realization that nothing stays the same in the IT environment. The threat landscape changes with each new attack vector and malware variety, while agencies’ systems and networks are subject to frequent reconfiguration.

As a result, a security regimen that keeps the IT infrastructure locked down today might not provide adequate protection tomorrow. The moment-to-moment vigilance of continuous monitoring seeks to ensure that an agency’s security controls remain relevant.


Editor’s Note: Ideas inspired from;

Moore, John. "Can CDM change the game?– FCW."

FCW. N.p., 10 Oct 2014. Web. 22 Dec. 2015.

'Internet of Things' may grow into an 'Internet of Everything'

The term "Internet of Things" (IoT) refers to a network of physical objects, or "things," embedded with electronics, software, sensors, and network connectivity that enable them to collect and exchange data. IoT is an idea poised to change the entire internet, and one that neither the government nor its agencies can afford to ignore.

Internet researchers believe IoT is the future of the internet, so shouldn't we gear up for the change? This much-hyped idea is not just an alarm bell but a signal for the entire market to evolve. IoT is riskier, more challenging, and yet more rewarding than the technical arrangements deployed before it. Increasingly connected, sensor-laden and data-driven systems are poised to change everything from national security to office-space management. The catch is that implementing IoT generates far more data, adding complexity that many agencies cannot yet handle.

Cisco posits that IoT will generate $4.6 trillion for the public sector before 2025 in value added and costs saved. And although the General Services Administration (GSA) has not yet come close to those sorts of returns, the agency, which manages nearly 10,000 government-owned buildings around the country, has pioneered IoT building management with its GSALink initiative (read more in the original article, "Internet of Everything: A $4.6 Trillion Public-Sector Opportunity"). Collecting 29 million data points per day from myriad sensors throughout its buildings, GSA can monitor everything from light use to humidity, enabling the agency to boost productivity and promote good health by optimizing conditions when workers are present and to save on energy costs when they're not.

Other big adopters include the intelligence community and the Defense Department. Warfighters can benefit from sensors that improve their tactical awareness, while vitals monitors can help commanders know who's healthy or injured. Gary Hall, chief technology officer for Federal Defense at Cisco, said, "I do see the Defense Department out in front [of IoT]." Hall added that there is plenty of room for crossover. Municipal experiments with smart lighting or parking, for instance, could inform similar adoption on agency campuses or military bases. "I've been on a lot of military bases, and the parking situation could certainly be improved," he quipped.

The term "Internet of Things" refers to the physical elements of a connected network — the "things" — while the term "Internet of Everything" is all-encompassing, including servers, sensors, the data flows between them, people interpreting the data and even people talking to other people about the system.

Now the most important question remains unanswered: can humans deal with the volume?

The number of connected "things" is expected to balloon from around 16 billion today to 50 billion by 2020, with skyrocketing data generation spurring a need for a 750 percent expansion in data center capacity. Hall pointed to the problem of "big, large data": both the overall volume and the size of individual files have exploded, creating a need for pre-processing with machines rather than people. "Humans can't deal with the volume of data we're producing," he stated. The Cisco CTO concluded by warning government agencies: "It's not something they can avoid."
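Hall's point about machine pre-processing can be made concrete with a minimal sketch: raw per-reading sensor data is collapsed into per-sensor summaries before any human looks at it. The sensor names, metrics, and values below are entirely illustrative, not drawn from GSALink or any real system.

```python
from collections import defaultdict
from statistics import mean

# Illustrative raw feed: (sensor_id, metric, reading) tuples, of the kind a
# building-management system might collect millions of per day.
readings = [
    ("hvac-3", "humidity", 41.0),
    ("hvac-3", "humidity", 43.5),
    ("lobby-1", "light", 320.0),
    ("lobby-1", "light", 310.0),
    ("hvac-3", "humidity", 42.1),
]

def summarize(raw):
    """Machine pre-processing: collapse raw readings into per-sensor stats."""
    grouped = defaultdict(list)
    for sensor, metric, value in raw:
        grouped[(sensor, metric)].append(value)
    return {
        key: {"count": len(vals), "mean": round(mean(vals), 1),
              "min": min(vals), "max": max(vals)}
        for key, vals in grouped.items()
    }

summary = summarize(readings)
# A human now reviews a handful of summaries instead of every raw data point.
```

Scaled up, this is the shape of the pre-processing Hall describes: machines reduce the firehose to something a person can act on.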

Editor’s Note: Ideas inspired from;

Noble, Zach. “Are Agencies Really Ready for the Internet of Things?
— FCW.”FCW. N.p., 1 June 2015. Web. 10 Dec. 2015.

Government Login Systems: A Critical Business

A website is the ultimate tool for matching and keeping up with an ever-growing market. What makes one special? The architecture? The feel? The service? The control it gives users? Whether it is a personal website or a government one, every website demands security, especially those involving transactions.

Federal agency websites are on the front lines of delivering services to the public, and a majority of Americans go online to seek government services. The Pew Research Center has estimated that 82% of U.S. internet users search for information or complete a transaction on a government website, including 40% who do so via smartphone. The most important ingredient, then, is security.

What makes a website secure? Start with the login portal that authenticates users. But federal agencies each have their own protocols, so individuals who want to access government applications and services generally must create a username and password for each agency site they visit, and agencies maintain their own identity management systems to authenticate users. This approach invites redundancy: the same user maintains multiple passwords while the government maintains multiple systems for managing credentials. Security suffers as well; weak and stolen passwords rank among the top ways an online system can be compromised.

In response, the federal government has been moving toward an identity management approach that will let people use the same credential to conduct business with multiple agencies, thereby creating a common mechanism for transmitting identity information and introducing stronger authentication. “The usability of secure identity solutions is something that the market has been struggling to improve for years,” said Jeremy Grant, senior executive adviser for identity management at the National Institute of Standards and Technology. “We’ve had no problem developing ‘secure’ identity technologies, but if people don’t use them, then they really don’t offer much security.”
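The federated idea described above can be sketched simply: one credential, issued by a trusted sign-in partner, is verified by any agency application, so no agency needs its own password store. This is a toy illustration only; every name here is hypothetical, and real federations use standards such as SAML or OpenID Connect with public-key signatures, not a shared HMAC key.

```python
import hashlib
import hmac

# Hypothetical shared secret between the sign-in partner and agencies.
BROKER_KEY = b"demo-broker-key"

def issue_credential(user_id: str) -> str:
    """Sign-in partner issues one signed token for the user."""
    sig = hmac.new(BROKER_KEY, user_id.encode(), hashlib.sha256).hexdigest()
    return f"{user_id}:{sig}"

def agency_verify(token: str) -> bool:
    """Any agency app verifies the same token; no per-agency password store."""
    user_id, _, sig = token.partition(":")
    expected = hmac.new(BROKER_KEY, user_id.encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

token = issue_credential("jane.doe")
agency_a_ok = agency_verify(token)   # same credential works at one agency...
agency_b_ok = agency_verify(token)   # ...and at another
tampered_ok = agency_verify("jane.doe:bad-signature")  # forgeries fail
```

The point of the sketch is the shape of the trust relationship: agencies verify a common credential rather than each managing identities themselves.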

Since the passage of the E-Government Act of 2002, myriad federal services have emerged online. A 2014 Government Accountability Office report noted that agencies operate more than 11,000 websites. As more people make the Web their default choice for government interactions, the need to provide safe access has become even more important. The sharp rise in the use of mobile devices to access federal websites adds another dimension to the security challenge. The White House’s 2012 Digital Government Strategy states that “policies governing identity and credential management may need to be revised to allow the introduction of new solutions that work better in a mobile world.”

In 2009, the White House published a Cyberspace Policy Review that included the need to create a “cybersecurity-based identity management vision and strategy” on a list of 10 action items. That paper led to the launch in 2011 of the National Strategy for Trusted Identities in Cyberspace, which works with private- and public-sector entities to support the development of interoperable identity credentials. That move set the stage for a cloud-based, federated identity management solution.

A NIST-managed National Program Office coordinates NSTIC activities. The office collaborated with the General Services Administration to draft the requirements for the Federal Cloud Credential Exchange and awarded a contract to SecureKey Technologies in 2013 to create the exchange. FCCX was designed to let people use third-party credentials to access federal services online. In addition to improving the user experience, the governmentwide exchange would help agencies sidestep the cost of credentialing the same person numerous times.

FCCX now falls under the auspices of GSA. The program allows people to use digital credentials provided by government-approved sign-in partners to confirm their identities when requesting access to online government services. When they log in, users consent to share what the program describes as a "limited set of personally identifiable information." The program then serves as the pipeline for transmitting identity information from the sign-in partner to the agency's online application.

Editor’s Note: Ideas inspired from;

Moore, John. "How many parties does it take to provide a single government login?– FCW."

FCW. N.p., 22 May 2015. Web. 26 Nov. 2015.