Speech Recognition: Difficult then, Common now

Speech recognition is almost as natural as breathing for us, but for a computer it has taken more than half a century to solve this ‘problem’. Previously, the technology’s fundamental drawbacks, such as poor accuracy, sensitivity to noise and over-dependence on training to a particular voice, meant it worked in principle, but not in practice. Accuracy has since improved enormously, often reaching the high nineties in percentage terms, for several reasons: the general increase in the availability of affordable computing power, the advent of the cloud and the vast numbers of people now using the technology. Last year, IBM announced a major milestone in conversational speech recognition by building a system that achieved a 6.9 percent word error rate. Since then, it has continued to push the boundaries of speech recognition, and it has now reached a new industry record of 5.5 percent. These figures are measured on a very difficult speech recognition task: recorded conversations between humans discussing day-to-day topics like “buying a car.” This recorded corpus, known as the “SWITCHBOARD” corpus, has been used for over two decades to benchmark speech recognition systems.
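
For readers unfamiliar with the metric behind those 6.9 and 5.5 percent figures, the short Python sketch below shows how word error rate is conventionally computed: the word-level edit distance (substitutions, insertions and deletions) between what was said and what was recognized, divided by the number of words actually spoken. The example sentences are invented purely for illustration and are not from the SWITCHBOARD corpus.

```python
# Illustrative sketch: word error rate (WER) is the word-level edit distance
# (substitutions + insertions + deletions) divided by the reference length.

def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table for Levenshtein distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

if __name__ == "__main__":
    ref = "i would like to buy a car"
    hyp = "i would like to buy car"   # recognizer dropped one word
    print(f"WER: {word_error_rate(ref, hyp):.1%}")  # one deletion over 7 words, about 14.3%
```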

The 21st century has seen many improvements in this field. In the 2000s, DARPA sponsored two speech recognition programs: Effective Affordable Reusable Speech-to-Text (EARS) in 2002 and Global Autonomous Language Exploitation (GALE). The National Security Agency has made use of a type of speech recognition for keyword spotting since 2006; this technology allows analysts to search through large volumes of recorded conversations and isolate mentions of keywords. Google’s first effort at speech recognition came in 2007, when it released its first product, “GOOG-411”, a telephone-based directory service. Today, Google voice search is supported in over 30 languages, and in 2015 Google’s speech recognition reportedly experienced a dramatic performance jump of 49% through new techniques involving deep learning.

These advancements in speech recognition technology have diversified its applications. It has been implemented by many healthcare and military organizations:

Health care

Medical documentation

In the health care sector, speech recognition is implemented in either the front end or the back end of the medical documentation process. In front-end speech recognition, the provider dictates into a speech recognition engine, the recognized words are displayed as they are spoken, and the dictator is responsible for editing and signing off on the document. In back-end, or deferred, speech recognition, the provider dictates into a digital dictation system; the voice is recognized and a draft document is produced, which is routed along with the original voice file to an editor, who edits and finalizes the draft. Deferred speech recognition is the more widely used approach in the industry today.

Therapeutic use

The use of speech recognition software in conjunction with word processors has shown significant benefits, particularly for short-term-memory re-strengthening in patients with brain AVMs (arteriovenous malformations). Further research needs to be conducted to determine the cognitive benefits for individuals whose AVMs have been treated using radiologic techniques.

Military

High-performance fighter aircraft

Significant progress in the testing and evaluation of speech recognition in fighter aircraft has taken place over the last decade. Of particular note has been the US program in speech recognition for the Advanced Fighter Technology Integration (AFTI)/F-16 aircraft (F-16 VISTA). In this program, speech recognizers have been operated successfully in fighter aircraft, with applications including setting radio frequencies, commanding an autopilot system, setting steer-point coordinates and weapons release parameters, and controlling flight displays.

[Photo caption: Air Force Chief of Staff Gen. T. Michael Moseley announced Lightning II as the F-35 name during a Joint Strike Fighter inauguration ceremony on July 7 at Lockheed Martin Aeronautics Co. in Fort Worth, Texas. The F-35 Lightning II is the next-generation strike fighter, featuring an advanced airframe, autonomic logistics, avionics, propulsion systems, stealth and firepower. U.S. Navy photo/Chief Petty Officer Eric A. Clement]

Speaker-independent systems are also being developed and are under test for the F-35 Lightning II (JSF). This system has produced word accuracy scores in excess of 98%.

Training air traffic controllers

Training for air traffic controllers (ATC) represents an excellent application for speech recognition systems. Currently, many ATC training systems require a person to act as a “pseudo-pilot”, engaging in a voice dialog with the trainee controller that simulates the dialog the controller would have to conduct with pilots in a real ATC situation. Speech recognition techniques can eliminate the need for a person to act as pseudo-pilot, thus reducing training and support personnel. The USAF, USMC, US Army, US Navy and FAA, as well as a number of international ATC training organizations, are currently using ATC simulators with speech recognition from different vendors.

 

Editor’s note: Original Sources


 

How to create Cloud Security

The cloud is not a physical thing; it is a collection of servers designed to connect companies, families, and individuals, and it is growing. A recent survey of over a thousand IT companies, 48% of which have employee populations over 1,000, reported remarkable statistics regarding cloud usage and growth. 95% of the surveyed companies reported cloud usage, a 2% increase since 2015. This use falls into two overarching categories, public and private. Public clouds are accessible to all; you only need a computer and an internet connection. An example is the recently released Adobe Creative Cloud, which has transformed applications such as Adobe Acrobat from boxed software into a monthly subscription service. Public clouds connect us with strangers and acquaintances alike. Private clouds are more internal, such as the servers that allow employees to share and receive documents without having to send an email. Private and public cloud use are both on the rise, at 72% and 89% respectively among the surveyed companies. On average, respondents also stated that 32% of their data is held within public clouds and 43% within private clouds.


The New Wave of Identification is Finally Here

Biometrics has a wide array of potential uses, which is why different agencies and governments are developing the technology to provide more efficient and secure identification processes.

US Military

The United States military is in the early phases of using behavioral biometric software to control how soldiers and other military personnel access technology. What used to be the territory of a Common Access Card (CAC) or passcode is now being individualized, greatly reducing the risk of threat. This behavioral technology goes further than a thumbprint or an iris scan: it analyzes the way a soldier walks, or the way a technician types on a keyboard and uses a mouse. The same systems a person is using are constantly collecting data on them. This continuous proof of presence will deliver a cyber security solution that promptly recognizes breaches, can assist in forensic investigations, and guarantees regulatory compliance.
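
To give a concrete, if simplified, sense of how one behavioral signal such as typing rhythm can be turned into a continuous identity check, here is a minimal Python sketch. It is purely illustrative: the timings, threshold and function names are invented for this example and do not reflect the military’s actual systems.

```python
# Hypothetical keystroke-dynamics check: compare the timing gaps between a
# user's keystrokes against an enrolled baseline profile and flag the session
# if the typing rhythm deviates too much from that baseline.
from statistics import mean

def keystroke_intervals(timestamps_ms):
    """Convert raw key-press timestamps (ms) into inter-key intervals."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def deviation_score(baseline_intervals, session_intervals):
    """Average absolute difference between the session's rhythm and the baseline."""
    n = min(len(baseline_intervals), len(session_intervals))
    return mean(abs(b - s) for b, s in zip(baseline_intervals[:n], session_intervals[:n]))

def is_probable_owner(baseline, session, threshold_ms=40.0):
    """True if the session's typing rhythm is close enough to the enrolled profile."""
    return deviation_score(baseline, session) <= threshold_ms

if __name__ == "__main__":
    enrolled = keystroke_intervals([0, 180, 350, 540, 700, 880])   # enrollment session
    current  = keystroke_intervals([0, 190, 340, 555, 705, 890])   # current session
    print("continuous check passed:", is_probable_owner(enrolled, current))
```

A real deployment would combine many such signals (gait, mouse movement, typing) and far more robust statistical models, but the underlying idea is the same: the system keeps scoring the live behavior against an enrolled profile instead of relying on a one-time credential.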

Airport Security

Biometrics has started to become the long-awaited answer to numerous security dilemmas in the United States. U.S. Customs and Border Protection has begun implementing biometric software at airports nationwide. Following a mandate that has been outstanding for 15 years, the new biometric systems will address the difficulty of verifying the identities of thousands of departing foreign visitors every day. The combination of cloud computing and the latest facial recognition technology will turn a process that takes about two minutes into one that takes only a couple of seconds, keeping pace with the rise in foreign travel into the US.

Border Security

The State Department is also developing biometric technology to enhance coordination on border security and migration activities on the US-Mexico border. According to a January report from the Congressional Research Service, the U.S. and Mexican governments in 2015 approved a $75 million Mérida program to help Mexico “develop an automated biometrics system to help agencies collect and share information on criminals and migrants.” This is perhaps an alternative to the estimated $12 billion it would take to build a border wall, according to Senate leader Mitch McConnell.

Editor’s note: Original Sources ‘FedScoop’, ‘Defense One’


Patrick Tucker. “The Future of Military IT: Gait Biometrics, Software Nets, and Photon Communicators”

Defense One. N.p., 15 June 2017. Web. 18 July 2017.

FedScoop. “Military testing behavioral ID technology that would replace CAC card”

FedScoop. N.p., 6 July 2017. Web. 18 July 2017.

Sabre88 Awarded Contract with GSA’s 8(a) STARS II contract vehicle

Newark, NJ:

Sabre88 Awarded Contract with General Services Administration’s 8(a) STARS II contract vehicle.

Sabre88 is pleased to announce that it has been named as an industry partner on the GSA 8(a) STARS II government-wide acquisition contract (GWAC). STARS II is a small business set-aside contract vehicle that gives government agencies access to unique IT solutions from a varied pool of experienced 8(a) associates. STARS II offers fixed-price, time-and-materials, labor-hour, and blended task order types, providing greater flexibility in procuring IT service solutions to meet federal mission requirements. It also offers a shortened procurement lead time and a $10 billion program ceiling. The scope of the 8(a) STARS GWAC contracts includes all the technology services and associated products required to meet the requesting activity’s requirements derived from the applicable NAICS definition.


As a result of Sabre88’s competitive pricing and technical proficiency, it will have access to four functional areas within the vehicle.

“We are extremely pleased to be named as an industry partner on the STARS II,” said Sabre88 CEO Robert Cottingham.

Sabre88 has been providing the federal government with technical services to an exceptionally high standard and at a competitive rate for years. This STARS II award will allow Sabre88 to continue this standard of excellence and expand the reach of government clientele that can benefit from the resources we provide.

To learn more about how to utilize GSA 8(a) STARS II, click HERE.


How an Evolving Perception of Net Neutrality Will Benefit Small Businesses

An open Internet means users are not limited by guidelines on what to search or when to search; on the contrary, an open Internet, or Net Neutrality, allows free range of the internet. It fosters growth, as developers are able to create on a whim[1]. Net Neutrality also sees consumers pursue faster broadband, since under its regulations, implemented by the Federal Communications Commission (FCC), broadband providers are not allowed to create special “fast lanes” depending on the content[2].

During the previous administration the FCC implemented Open Internet rules, some of which protect free expression and innovation on the internet while also promoting broadband networks. The FCC’s Open Internet rules are grounded in Title II of the Communications Act of 1934 and Section 706 of the Telecommunications Act of 1996[3]. In particular, Section 202 of Title II of the Communications Act, Discrimination and Preferences, makes it “unlawful for any common carrier to make any unjust or unreasonable discrimination in charges, practices, classifications, regulations, facilities, or services… to make or give any undue or unreasonable preference or advantage to any particular person, class of persons, or locality”[4]. This is all done in an attempt to prevent paid prioritization, which would give financially dominant companies priority broadband speed, the so-called fast lanes, and leave smaller companies with slower, un-prioritized lanes[5]. Paid prioritization would let streaming giants Netflix and Hulu block any possible entry, as their broadband would remain at a constant fast speed, allowing one to browse and binge-watch shows without the fear of buffering. On the other hand, smaller streaming services would find themselves in a viewer decline, as their inability to pay for a fast lane would leave their services loading and lagging. Viewers would switch from the small businesses unable to afford a fast lane to the larger businesses able to maintain a fast, reliable web speed.

The FCC’s Open Internet rules took full effect on June 12, 2015, ensuring consumers and businesses fair, fast internet[6]. FCC Chairman Thomas Wheeler described the Open Internet rules as “a referee on the field to protect consumers and innovators online,” later adding, “After a decade of debate these rules finally provide strong safeguards for free expression and innovation on the internet”[7].

To ensure Net Neutrality continues beyond 2015, the Open Internet rules implemented a legal standard for other broadband provider practices, to ensure the rules do not unjustly interfere with small businesses[8].

Now, in 2017, under a new administration and a new FCC chairman, Ajit Pai, the question is again raised: how will we improve our Net Neutrality laws even further? Pai has a history of serving at the FCC; he was previously a Republican commissioner on the Federal Communications Commission, and while his current views on Net Neutrality are not voiced as openly as before, when he stated his opposition to it, a recent statement of his reads, “I look forward to working with the new Administration, my colleagues at the Commission, members of Congress and the American public to bring the benefits of the digital age to all Americans”[9].

Pai comes into the administration not too differently from how former chairman Thomas Wheeler entered. Wheeler was a former lobbyist for large cable and wireless companies as well as the president of the National Cable and Telecommunications Association. Many feared what Wheeler’s views on Net Neutrality would mean. However, he turned out to be very pro-neutrality throughout his term as FCC Chairman.

As the internet evolves, so too will Net Neutrality. The unpopular idea of paid prioritization, fast lanes and big businesses vying for web speed against small businesses will be discussed frequently within the coming four years. The best way to be prepared is to be knowledgeable.


Works Cited

[1] (“Open Internet” 2016)

[2] (Green “How Changes to Net Neutrality Laws Could Affect Small Businesses” 2017)

[3] (“Open Internet” 2016)

[4] (“Communications Act” 1934)

[5] (“Why net neutrality activists are pushing for Title II classification for ISPs” 2014)

[6] (“Open Internet” 2016)

[7] (“Open Internet” 2016)

[8] (Green “How Changes to Net Neutrality Laws Could Affect Small Businesses” 2017)

[9] (Albanesius “Trump Picks Net Neutrality Foe as New FCC Chairman” 2017)


Albanesius, Chloe. “Trump Picks Net Neutrality Foe as New FCC Chairman.” Entrepreneur. N.p., 24 Jan. 2017. Web. 28 Feb. 2017.

Communications Act 1934, 36 §§ Title II-202 (a)-202 (c) (1934). Print.

Green, Keegan. “How Changes to Net Neutrality Laws Could Affect Small Businesses.” Entrepreneur. N.p., 22 Feb. 2017. Web. 28 Feb. 2017.

Telecommunications Act, 119 § Title VII-706 (1996). Print.

Dailydot. “Why net neutrality activists are pushing for Title II classification for ISPs.” The Daily Dot. N.p., 20 May 2014. Web. 28 Feb. 2017.

“Open Internet.” Federal Communications Commission. N.p., 25 Aug. 2016. Web. 28 Feb. 2017.

Protect yourself from new FOIA rules as they open more risks of disclosure

Last summer, Congress passed and President Obama signed into law the FOIA Improvement Act of 2016 (Public Law No. 114-185), which adds to and amends the Freedom of Information Act.

The amendments create a “presumption of openness,” limiting the federal government’s discretionary power to withhold requested information to cases where disclosure would result in “foreseeable harm.”

For those that transact business with or even simply communicate with the government (referred to as “submitters” in FOIA parlance), the FOIA changes mean that submitters such as government contractors and grant recipients must proactively respond when a FOIA request potentially targets confidential and/or proprietary data that has been shared with the government.

Importantly, the 2016 FOIA Improvement Act did not change FOIA Exemption 4, which protects from disclosure “trade secrets and commercial or financial information obtained from a person [that is] privileged or confidential.” Under Exemption 4, the government is prohibited from disclosing trade secrets or other proprietary/confidential information that any submitter has shared with the government.

Unlike with some of the other FOIA exemptions, in their interpretation of Exemption 4, courts have determined that the government lacks any discretion to disclose trade secret or commercial confidential/proprietary information in response to a FOIA request.

The 2016 FOIA Improvement Act was passed to accelerate the FOIA process and to compel government FOIA officials to provide as much information as possible, as quickly as possible, in response to a FOIA request. The act now imposes a penalty (i.e., the waiver of statutory FOIA fees) on an agency that fails to provide a timely FOIA response. The act also requires that the FOIA response segregate exempt information from releasable information in the same document, as an agency can no longer simply refuse to produce any document containing exempt information.

In addition, the Act requires the agency to produce electronic copies of documents/data, which can be instantly disseminated by the requesting party, rather than paper documents, in response to a FOIA request.

Furthermore, the act requires the creation of a federal government FOIA portal that allows the same FOIA request to be simultaneously submitted to multiple agencies. As a result, submitters must be poised to respond immediately as soon as the government provides notice that a FOIA request seeks disclosure of the submitter’s data and/or documents.

As an initial step, whenever any person or entity first shares information/data with the government that it does not want disclosed to any third party, the title page and each subsequent page of the confidential document or data should be plainly marked as containing “confidential and proprietary information which is exempt from disclosure under FOIA.”

Next, when the agency contacts the submitter (as FOIA requires) to tell them that a request seeks the disclosure of their information, the submitter should promptly respond by identifying:

  1. The specific information within each responsive document that is exempt from disclosure.
  2. The particular FOIA exemption (there are nine) that prohibits disclosure (as stated above, Exemption 4 protects trade secrets and confidential/proprietary data).
  3. Why that exemption applies to each identified section of data/information that the submitter seeks to protect.

Also, the submitter (or submitter’s counsel) should attempt to maintain an open dialogue with the assigned agency FOIA official throughout the FOIA process to promptly address and resolve any disagreements about what should and should not be disclosed before the agency takes a final disclosure position, which is often difficult to unwind.

Finally, the submitter must be ready to assert a “reverse FOIA” action to prevent the disclosure of trade secrets or other confidential/proprietary information in the event that the agency disregards the submitter’s exemption recommendations before the agency releases the submitter’s trade secrets and confidential information in response to a FOIA request.

Editor’s note: Original Source ‘Washington Technology’


Doug Proxmire. “New FOIA rules open contractors to more risks of disclosure”

Washington Technology. N.p., Web. 17 February 2017.

Never Let Down Your Computer Virus Awareness

Operating in today’s internet-shrouded atmosphere is getting to be like playing one of those first-person-shooter video games where the most aware succeed and the oblivious become chowder. Everyone is at risk, from the high-profile business to the private user. Even the government and industrial networks of various countries have taken big hits from the array of dangerous computer viruses to hit the internet since its inception.

So, indeed, you are in a sort of wild-west arena when you log on, and an aptitude for recognizing threats has become a staple no business, government or private user can be without. Having top-of-the-line anti-virus software will go a long way towards creating your force field. However, you still have to possess the skill to maneuver around the computer bombs that go off if you “click it”, and some, these days, don’t even require a “click”.

Protecting today’s online atmosphere is nothing short of big business. The hackers will keep on trying, and the anti-virus companies will keep developing revisions of their software to combat them. This threat, it is apparent, will always be out there, and hackers are getting more sophisticated and complex as the clock ticks. While it is unclear whether the powers that be thought in depth about the attacks that could happen, the launch of the internet was definitely the future. The earliest hacks and viruses no doubt originated from an individual with an idea to cause havoc. The practice caught on like wildfire and produced some of the worst viruses in the short history of the internet.

From the early 1990s on, dangerous and damaging viruses have shocked the world. Take the virus “NIMDA”, for instance. In 2001, a week after the 9/11 attacks, this virus affected millions of computers. NIMDA’s main thrust was to slow down Internet traffic, resulting in widespread network shutdowns. Another, in 2006, was a malicious Trojan horse program called “Storm Worm”. Storm Worm suckered users into clicking on a subject line in their email: “230 die as storm batters Europe”. Of course the subject line was a fraud, and users who clicked on the fake link enabled the perpetrator to operate the PC remotely from offsite. The attackers utilized this path to send spam throughout the Internet. It was estimated that Storm Worm affected 10 million PCs.

In 1998, one of the most destructive viruses came into play. The “CIH”, or “Chernobyl”, virus infected Windows 95 and 98 executable files and remained in the machine’s memory, constantly infecting other executables within the machine. It is estimated that the CIH virus caused $250 million worth of destruction. 1999 brought in a macro virus called “Melissa”, a mass-mailer virus that activated when the user clicked on an email link. The email came from a known source, so it would appear legitimate, especially when the subject line was “here is the document you asked for, don’t show anyone else”. The virus would then immediately seek out the first 50 users listed in the user’s Outlook address book and email itself to them. This virus was one of the first spread via email attachment; “Melissa” caused an estimated $300-$600 million in damage.

And it went on. In 2003, the “SQL Slammer”, or “Sapphire”, virus targeted servers by generating random IP addresses and discharging itself at them. This worm affected many businesses, banks and community operations, including significant services provided by Bank of America, Continental Airlines and Seattle’s 911 emergency system, to name a few. Estimated losses were between $950 million and $1.2 billion.

Others, such as the “Code Red” virus in 2001, activated on July 13 of that year. This virus did not require you to open an email attachment; it simply needed an open Internet connection and then presented a web page that said “Hacked by Chinese”. It brought down an estimated 400,000 servers, including the White House web server, and its damaging effect is estimated at a $2.6 billion loss. The “Sobig.F” virus got into machines via an email telling users that they had a security issue; when opened, the intruder sent itself onward and harvested the entire address book. This virus replicated itself to the tune of infecting millions of PCs worldwide. Damages were estimated in the $3-4 billion range.

Perhaps the one to do the most damage was the “MyDoom”, or “Novarg”, virus. On 26 January 2004 this virus circled the globe via email swiftly, and 152 million computers and countless servers went on the blink. It created a huge denial-of-service attack and crippled computing environments, causing damages worldwide estimated at $30 billion.

Recent dangerous viruses include “Poison Ivy”, a remote access Trojan where the perpetrator uses backdoor technology to infect the user’s computer. Once it is installed, the perpetrator has control of everything, including recording audio and video. This virus targets personal information to compromise identities, which have been proven to be bought and sold globally. This includes the harvesting of online banking, shopping, social security number and birth information.

Conficker, which appeared in 2008, is a worm that targeted the theft of financial data. A very complex, difficult-to-stop virus, Conficker prompted the creation of a coalition of experts dedicated to stopping it. It was also called the “superbug”. The fact that this virus got in wherever it wanted and was able to do just about everything left experts stumped. Conficker has been reconfigured several times, and each time its effects are more sophisticated. Incredibly, the perpetrators have designed it to track the efforts taken to eradicate it.

We have a unique responsibility, being “online”. The internet is, at this point, just like any town on the map. There are places to go, and there are places not to go. There are places on the internet that you might have to go to that are laced with lurking hackers just waiting for users to make that fateful “click”. A good part of the battle can be waged here just by being vigilant while you’re doing your financials, the stock market, shopping and all the day-to-day things that technology has given you “one touch” access to.

While an aptitude for recognizing the “baddies” out there is a strong first suit, you’ll need help. The root of your defense lies in making sure you have a good anti-virus program, making sure it is always running, and making sure your virus database is updated frequently. Most anti-virus software has options for making all of these automatic. If you use them, periodically check that they are all running as scheduled. There are viruses that serve as precursors to bigger threats: what they do is literally turn off all your anti-virus mechanisms without you knowing it until it’s too late.

So, be careful out there in your computing. Learn the signs that something is amiss and act on them before taking another click. Once you get to know the common “this doesn’t look right” occurrences, the harder ones to recognize will become more understandable. One tip here for personal users (most businesses will not let users do this anyway): DO NOT download any .EXE (executable) program or file without running it through the scanner. You might just be saving your computer’s life.
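
As a lightweight complement to that advice, and not a replacement for a real antivirus scan, the hypothetical sketch below shows one extra check a careful user can make before running a downloaded executable: compute its SHA-256 hash and compare it with the checksum the publisher posts on its download page. The file name and expected hash in the example are placeholders, not real values.

```python
# Hedged sketch: verify a downloaded executable's SHA-256 checksum against the
# publisher's published value before running it. A mismatch means the file was
# corrupted or tampered with; a match still doesn't replace an antivirus scan.
import hashlib
import sys

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    # Usage: python check_download.py setup.exe <expected_sha256>
    path, expected = sys.argv[1], sys.argv[2].lower()
    actual = sha256_of_file(path)
    if actual == expected:
        print("Checksum matches the publisher's value; now scan it with your antivirus.")
    else:
        print(f"Checksum mismatch ({actual}); do not run this file.")
```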


 

Brian J. Schweikert “Never Let Down Your Computer Virus Awareness”

Sabre88 LLC. N.p., Web. 19 October 2016.

Editor’s note: Original Sources:

http://www.crn.com/news/security/190300322/the-10-most-destructive-pc-viruses-of-all-time.htm
http://listdose.com/top-10-most-dangerous-computer-viruses-ever/
http://www.smithsonianmag.com/science-nature/top-ten-most-destructive-computer-viruses-159542266/?no-ist

 

Secret Behind Artificial Intelligence’s Preposterous Power

Spookily powerful artificial intelligence (AI) systems may work so well because their structure exploits the fundamental laws of the universe, new research suggests.

The new findings may help answer a longstanding mystery about a class of artificial intelligence that employs a strategy called deep learning. These deep learning, or deep neural network, programs are algorithms that have many layers in which lower-level calculations feed into higher ones. Deep neural networks often perform astonishingly well at solving problems as complex as beating the world’s best player of the strategy board game Go or classifying cat photos, yet no one fully understood why.
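
To make the phrase “layers in which lower-level calculations feed into higher ones” concrete, here is a minimal NumPy sketch of a forward pass through a small deep network. The layer sizes and random weights are arbitrary placeholders chosen for illustration; they are not taken from any system described in the research.

```python
# Minimal sketch of the layered structure described above: each layer applies a
# linear map followed by a nonlinearity, and its output becomes the next
# (higher) layer's input. Sizes and weights are arbitrary, purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# A "deep" network: several stacked layers (64 -> 32 -> 16 -> 10).
layer_sizes = [64, 32, 16, 10]
weights = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(layer_sizes, layer_sizes[1:])]
biases  = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    """Feed lower-level activations into successively higher layers."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ W + b)               # hidden layers
    return h @ weights[-1] + biases[-1]   # final layer: raw output scores

if __name__ == "__main__":
    x = rng.standard_normal(64)           # stand-in for an input, e.g. image features
    scores = forward(x)
    print("output scores:", np.round(scores, 3))
```

A “shallow” network, by contrast, would squeeze all of that computation into a single hidden layer; the paper discussed below asks why stacking layers in this hierarchical way works so much better in practice.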

It turns out, one reason may be that they are tapping into the very special properties of the physical world, said Max Tegmark, a physicist at the Massachusetts Institute of Technology (MIT) and a co-author of the new research.

The laws of physics present only this “very special class of problems”, the problems that AI shines at solving. “This tiny fraction of the problems that physics makes us care about and the tiny fraction of problems that neural networks can solve are more or less the same,” Tegmark said.

Last year, AI accomplished a task many people thought impossible: AlphaGo, Google DeepMind’s deep learning AI system, defeated the world’s best Go player after trouncing the European Go champion. The feat stunned the world because the number of potential Go moves exceeds the number of atoms in the universe, and past Go-playing programs performed only as well as a mediocre human player.

But even more astonishing than DeepMind’s utter rout of its opponents was how it accomplished the task.

“The big mystery behind neural networks is why they work so well,” said study co-author Henry Lin, a physicist at Harvard University. “Almost every problem we throw at them, they crack.”

For instance, DeepMind was not explicitly taught Go strategy and was not trained to recognize classic sequences of moves. Instead, it simply “watched” millions of games, and then played many, many more against itself and other players.

Like newborn babies, these deep-learning algorithms start out “clueless,” yet typically outperform other AI algorithms that are given some of the rules of the game in advance.

Another long-held mystery is why these deep networks are so much better than so-called shallow ones, which contain as few as one layer. Deep networks have a hierarchy and look a bit like connections between neurons in the brain, with lower-level data from many neurons feeding into another, “higher” group of neurons, repeated over many layers. In a similar way, deep layers of these neural networks make some calculations and then feed those results to a higher layer of the program, and so on, he said.

To understand why this process works, Tegmark and Lin decided to flip the question on its head.

“Suppose somebody gave you a key. Every lock you try, it seems to open. One might assume that the key has some magic properties. But another possibility is that all the locks are magical. In the case of neural nets, I suspect it’s a bit of both,” Lin said.

One possibility could be that the “real world” problems have special properties because the real world is very special, Tegmark said.

Take one of the biggest neural-network mysteries: These networks often take what seem to be computationally hairy problems, like the Go game, and somehow find solutions using far fewer calculations than expected.

It turns out that the math employed by neural networks is simplified thanks to a few special properties of the universe. The first is that the equations that govern many laws of physics, from quantum mechanics to gravity to special relativity, are essentially simple math problems, Tegmark said. The equations involve variables raised to a low power (for instance, 4 or less).

What’s more, objects in the universe are governed by locality, meaning they are limited by the speed of light. Practically speaking, that means neighboring objects in the universe are more likely to influence each other than things that are far from each other, Tegmark said.

Many things in the universe also obey what’s called a normal or Gaussian distribution. This is the classic “bell curve” that governs everything from traits such as human height to the speed of gas molecules zooming around in the atmosphere.

Finally, symmetry is woven into the fabric of physics. Think of the veiny pattern on a leaf, or the two arms, eyes and ears of the average human. At the galactic scale, if one travels a light-year to the left or right, or waits a year, the laws of physics are the same, Tegmark said.

All of these special traits of the universe mean that the problems facing neural networks are actually special math problems that can be radically simplified.

“If you look at the class of data sets that we actually come across in nature, they’re way simpler than the sort of worst-case scenario you might imagine,” Tegmark said.

There are also problems that would be much tougher for neural networks to crack, including encryption schemes that secure information on the web; such schemes just look like random noise.

“If you feed that into a neural network, it’s going to fail just as badly as I am; it’s not going to find any patterns,” Tegmark said.

While the subatomic laws of nature are simple, the equations describing a bumblebee’s flight are incredibly complicated, whereas those governing gas molecules remain simple, Lin added. It’s not yet clear whether deep learning will perform as well at describing those complicated bumblebee flights as it does at describing gas molecules, he said.

“The point is that some ’emergent’ laws of physics, like those governing an ideal gas, remain quite simple, whereas some become quite complicated. So there is a lot of additional work that needs to be done if one is going to answer in detail why deep learning works so well.” Lin said. “I think the paper raises a lot more questions than it answers!”

 

Editor’s note: Original Source ‘LiveScience’


Tia Ghose. “The Spooky Secret Behind Artificial Intelligence’s Incredible Power”

LiveScience. N.p., Web. 7 October 2016.

Sabre88, LLC Breaks Into Top 15 Among ICIC and Fortune’s Inner City 100 Winners

 

Annual ranking showcases the fastest-growing urban businesses in America

For the third time in as many years, the Initiative for a Competitive Inner City (ICIC) and Fortune have announced that Sabre88 has been selected for their prestigious 2016 Inner City 100 list. This recognition places Sabre88 in an exemplary lineage of nearly 900 fast-growing and innovative inner city businesses.

Sabre88 ranked 14th overall on the list of 100. Sabre88, which provides consulting services to the federal government, reported 2015 revenues of $2.7 million and a five-year growth rate of 731 percent from 2011-2015. “We are delighted to earn a spot on the list of fastest growing inner city businesses. It is a testament to the hard work and dedication of the Sabre88 team serving our government customers each day,” stated CEO Robert Cottingham.

ICIC’s Inner City 100 is an annually compiled and released list featuring high-power, high-potential businesses from around the country with headquarters in inner cities. Each company is selected by ICIC with help from a national network of nominating partners who seek to identify, spotlight, and further enable the named companies’ innovative urban entrepreneurship. Ranked by revenue growth, the esteemed recipients go on to have their names published in Fortune.

The list can be viewed on the Fortune website here.

In addition to announcing the list, company CEOs were invited to gather for a full-day event featuring thought-provoking sessions, insightful leadership advice, and robust networking opportunities. Past winners have reported meeting future multi-million dollar investors as a result of appearing on the Inner City 100 list and attending the accompanying colloquium.

The rankings for each company were announced at the Inner City 100 Conference and Awards Ceremony on Wednesday, September 14, 2016 at the Aloft Hotel in Boston, MA. Before the awards celebration, winners gathered for a full-day business symposium featuring management case studies from Harvard Business School professors and interactive sessions with top CEOs. Keynote speakers at this year’s event included Interim CEO of Staples Shira Goodman, Chairman and CEO of Pinnacle Group and Inner City 100 alumnus Nina Vaca, and Harvard Business School Professor and ICIC Founder and Chairman Michael E. Porter.  Other speakers included Corey Thomas, CEO of Rapid 7, Loren Feldman of Forbes, Lynda Applegate and Amy Edmondson from Harvard Business School,  John Stuart of PTC, Robert Wallace, CEO of Bithenergy, and Brook Colangelo of Houghton Mifflin Harcourt.

“We are extraordinarily proud of these pioneering entrepreneurs who lead the way in economic revitalization in America’s inner cities,” says Steve Grossman, CEO of ICIC, of the list of 100.

The Inner City 100 program recognizes and supports successful inner city business leaders, and celebrates their role in providing innovation and job creation in America’s cities. These companies strengthen local American economies, provide job opportunities for underrepresented communities, and drive forward economic and social development.

Boasting an average five-year growth rate of 458 percent between 2011 and 2015, the 2016 Inner City 100 winners represent a wide span of geography, hailing from 42 cities and 25 states. Collectively, the winners employed 7,324 people in 2015, and on average over a third of their employees live in the same neighborhood as the company.

Highlights of the 2016 Inner City 100 include:

  • Employed 7,324 workers in 2015.
  • Created 4,696 new jobs in the last five years.
  • On average, 34% of employees live in same neighborhood as the company.
  • Average company age is 16 years.
  • Average 2015 revenue is $12.2 million.
  • 34% are women-owned.
  • 37% are minority-owned.
  • 6% of the winners are certified B-Corps.
  • 26 industries represented in the top 100.

# # #

Company description:  Sabre88 is a global consulting firm applying capabilities in financial services, billing support, FOIA, IT Help Desk Support, Data Entry and Document Scanning to government and commercial clients. With more than twenty years of combined personnel experience offering strategic solutions, Sabre88 staff advance the firm’s mission to provide civilian and defense agencies of the government with the necessary tools to address emerging challenges. Sabre88 was formed in January of 2008, with a mission to serve both civilian and defense agencies of the federal government. The founder, Robert Cottingham, Jr., started the firm out of a government need for innovative small businesses which provide a 100% customer focused service.

Inner City 100 Methodology: The Initiative for a Competitive Inner City (ICIC) defines inner cities as core urban areas with higher unemployment and poverty rates and lower median incomes than their surrounding metropolitan statistical areas. Every year, ICIC identifies, ranks, and spotlights the 100 fastest-growing businesses located in America’s inner cities. In 2016, companies were ranked by revenue growth over the five-year period between 2011 and 2015. The list was audited by the independent accounting firm Rucci, Bardaro, and Falzone, PC.

Initiative for a Competitive Inner City (ICIC)

ICIC is a national nonprofit founded in 1994 by Harvard Business School professor Michael E. Porter. ICIC’s mission is to promote economic prosperity in America’s inner cities through private sector investment that leads to jobs, income and wealth creation for local residents. Through its research on inner city economies, ICIC provides businesses, governments and investors with the most comprehensive and actionable information in the field about urban market opportunities. The organization supports urban businesses through the Inner City 100, Inner City Capital Connections and the Goldman Sachs 10,000 Small Businesses programs. Learn more at www.icic.org or @icicorg.

 

FOR IMMEDIATE RELEASE

Contact:

Benjamin Bratton
973-321-4886
bbratton@sabre88.com

Matt Camp, ICIC
(617) 238-3014
mcamp@icic.org