Tuesday, May 16, 2017

Hacked to Death: Yahoo Ignores Repeated Warnings to Get Out of the House

We’ve all seen the classic haunted-house horror movies where, after more than enough warning that something is critically wrong, the lead character and his supporting cast simply won’t get out of the house. Yahoo has played this role over the last five years as the company became the target of repeated cyber attacks while its market cap slid from over $50 billion to less than $5 billion. Meanwhile, its board of directors and C-level management appear to have largely ignored warnings from anyone outside the company’s in-house security resources, all the while failing to educate themselves about the risks, and about the reasons cyber security needs to be a board- and C-level priority. As quick as Yahoo’s trip to the bottom was, for many other firms the worst may be yet to come.

At a high level, this is a symptom of a more fundamental problem that the tech market analyst Gartner has lately been warning F1000 companies about as it relates to cyber security: a failure to innovate. Corporate cultures that suffer from not-invented-here syndrome, and the tendency to look to insiders and to vendors who are cronies of in-house personnel, are just some of the killers of innovation, and sometimes of the entire enterprise. Harvard’s Professor Clay Christensen, author and former consultant to Apple’s Steve Jobs, is fond of saying that most large organizations strip the disruptive innovation out of good ideas before they can even get started, in part because they lack a clear process to capture and implement those ideas. Failure to innovate is how IBM created Microsoft, Yahoo created Google, and Reuters created Bloomberg (all unwittingly), and the list goes on. Even the federal government has recognized that when it identifies an innovative solution to a significant problem, there is often no process to procure it.

The real horror is for Yahoo’s investors and clients, who know this is no movie. Customers are already adding digital asset security to their shopping criteria when choosing suppliers, and a future is coming in which the SEC will require disclosure of such risks because they directly impact shareholder value. Add to this the fact that global hacking is now a $2 trillion annual business (more lucrative than drug dealing and not nearly as dangerous), and it’s clear that senior management and boards need to heed the warnings and get out of the house before it’s too late.

Thursday, April 28, 2016

Taking Aspirin for a Headache Caused by a Brain Aneurysm? The Most Popular Cyber Security Solutions Are Just That.

Billions of dollars are spent each year on cyber security solutions that simply don’t work. Even the newest and most popular solutions providers fail to deliver genuine cyber security because their solutions focus on symptoms rather than the real problem. Nearly all of them have failed to correctly identify the fundamental problem, which lies in a legacy computing architecture that did not anticipate the Internet. As a result, implementing these solutions is not unlike taking aspirin to get rid of a headache caused by a brain aneurysm.

In the case of cyber security, the aneurysm is embedded malware and unknown threats that simply can’t be stopped by the most popular solutions in the market today. That includes McAfee, Symantec, FireEye, Palo Alto Networks and virtually all legacy solutions providers. The problem for corporate executives and their boards is exacerbated when, after spending potentially millions of dollars on these flawed solutions, IT managers are put in the difficult position of having to defend the expense or explain why they must spend even more.
The truth is that nearly all of the most popular cyber security solutions are completely ineffective against embedded and foreign-language malware. Remarkably, these same providers readily admit that over 90% of computers are already compromised with exactly the type of malware their solutions can do almost nothing to stop. As a result, the annual RSA Conference of security solutions providers in San Francisco has become like a cosmetics convention, offering help and hope but few genuine solutions.
The good news is that genuine cyber security is being achieved by a core group of Silicon Valley technology experts with new, patented technologies that industry and government experts are quickly recognizing as the way forward. These solutions secure endpoints by isolating files and applications in secure containers, using processes that make cyber threats irrelevant. So if your solutions provider does not secure computing endpoints with containerization, isolation and kernel-level, built-in secure processes, your headache may be the least of your problems.
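To make the isolation idea concrete, here is a minimal sketch, in Python, of one way an endpoint could open an untrusted file inside a disposable, network-less container so that any embedded malware has nothing to reach and nowhere to persist. The doc-viewer image and its command are hypothetical placeholders; this illustrates the general approach, not any particular vendor’s patented implementation.

```python
"""Open an untrusted file in a throwaway, isolated container (sketch)."""
import subprocess
import sys

def open_in_sandbox(path: str) -> int:
    """Launch a disposable container that can read the file but cannot
    write to the host or touch the network."""
    cmd = [
        "docker", "run",
        "--rm",                        # destroy the container on exit
        "--network", "none",           # no network: malware cannot phone home
        "--read-only",                 # immutable root filesystem
        "-v", f"{path}:/work/doc:ro",  # mount the suspect file read-only
        "doc-viewer:latest",           # hypothetical viewer image
        "view", "/work/doc",           # hypothetical viewer command
    ]
    return subprocess.run(cmd).returncode

if __name__ == "__main__":
    sys.exit(open_in_sandbox(sys.argv[1]))
```

Notice that nothing here tries to decide whether the file is malicious; the sandbox simply ensures that it doesn’t matter.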

Wednesday, May 28, 2014

Would You Spend $100 to Save $1 Billion? If Not, You Could Be the Next Target.

Estimates are that the cost of a relatively simple hack on retail giant Target this past December has reached over $1 billion to date, not to mention the ongoing legal exposure and the damage to its high-value brand. The ripple effect of more than one hundred million credit card numbers being stolen and quickly sold to a global black market of ready buyers is staggering, and will ultimately affect millions of consumers as well as thousands of banks and retailers worldwide.

Remarkably, Target was one of the largest retail customers of Wall Street darling FireEye, which failed to identify and thwart the hack. Despite a marketing budget of more than $40 million last year, FireEye’s advertising no doubt rings hollow with senior management at Target, particularly the CIO and CEO, who were terminated in the wake of the hack. This is just another example of an internal IT group that, lacking the needed cyber security domain knowledge, put the fate of its company in some well-marketed but flawed legacy cyber security solutions. It has now been proven that these legacy solutions simply cannot deliver genuine cyber security. What is particularly poignant is that for as little as $100 per credit card terminal, Target could have secured these critical entry points to the company’s high-value digital information assets.
There are thousands of companies similarly deluded by well-meaning IT marketers, as well as by their own generally competent IT managers. In fact, most IT managers will admit that cyber security is a horse of a different color, far removed from the typical challenges they regularly face. In general, IT managers understand the vastness of the problem but have been led to believe by legacy providers that there is no “silver bullet” solution. Like any good fallacy, this folklore has some truth in it. There is no single solution to the problem of cyber security; however, there are several components that, when fully integrated and complemented by tools that incorporate IT management’s intimate knowledge of their own business, can push the effectiveness of cyber security to between 99.99% and 100%. The best part is that it’s not expensive.

I’ll be writing about these very real “silver bullets” and how to properly deploy them in my next article on the subject, sometime in June. Until then, if you can’t wait, email me and I’ll lay out the seven steps for you. (Ed@vir2us.com)

Thursday, June 27, 2013

Snowden Classified Data Theft Incident Was Avoidable

The Snowden incident, in which a government intelligence worker was able to easily copy and disseminate large amounts of highly classified data, highlights one of the fundamental problems of legacy cyber security and the thinking behind it. As with many complex technology problems, people who lack the domain knowledge required to identify solutions tend to focus on the symptom, at least in part to cover up the fact that the knowledge is lacking. Unfortunately, next-generation cyber security technology, which the government is trying to adopt and implement, is a solution that few people in government understand. The federal government is not alone in its slowness to implement next-generation cyber security, however. Banks and oil, gas, water and power utilities are similarly vulnerable when it comes to protecting digital assets and critical infrastructure.
The Snowden incident could easily have been avoided with some next-generation digital asset protection. Snowden’s ability to simply copy terabytes of classified data was possible, at least in part, because of a reliance on obsolete technologies, security strategies and processes. The government (NSA) has for some time focused on high-grade cryptography to protect data, and in this area commercial firms have tended to follow the government’s lead. However, the advent of the Internet and global networks changed the game significantly with respect to protecting data.
 
The government tends to use encryption as an all-or-nothing proposition, encrypting hard drives on computers or databases at the file level. The problem with this approach is that once a user has entered the access credentials, the entire file or drive is completely exposed. Using triplex authentication in conjunction with folder- and record-level encryption solves the problem. In that environment, Snowden would still have been able to do his job, even bringing large amounts of data and data files together, but all of the data would have remained encrypted except when he was viewing query results or a limited number of individual records. He could never have copied entire files, at least not without triplex-authentication notification and the approval of a higher-up, and not without the copied files remaining encrypted at the record level. Even if he had gotten approval to copy the data to an external storage medium or the cloud, the files would never have been divorced from the triplex-authentication access required to view or query the data. Additional protections are available that would have destroyed the encryption lock if authentication failed even once, since the files would have been tagged as copies outside their home domain.
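For readers who want to see the shape of such a scheme, here is a minimal sketch of record-level encryption in Python using the widely available cryptography package. Every record gets its own key, so passing an access check exposes exactly one record, never the whole store. Since the mechanics of triplex authentication are not spelled out above, it is stubbed here as three independent checks, purely for illustration.

```python
"""Record-level encryption sketch: bulk copies yield only ciphertext."""
from cryptography.fernet import Fernet  # pip install cryptography

class RecordStore:
    def __init__(self):
        self._records = {}  # record_id -> ciphertext
        self._keys = {}     # record_id -> per-record key (a key service in practice)

    def put(self, record_id: str, plaintext: bytes) -> None:
        key = Fernet.generate_key()  # one key per record, never shared
        self._keys[record_id] = key
        self._records[record_id] = Fernet(key).encrypt(plaintext)

    def query(self, record_id: str, credentials: tuple) -> bytes:
        if not self._triplex_ok(credentials):
            raise PermissionError("triplex authentication failed")
        # Only the single requested record is ever decrypted; copying
        # self._records wholesale yields nothing but ciphertext.
        return Fernet(self._keys[record_id]).decrypt(self._records[record_id])

    @staticmethod
    def _triplex_ok(credentials: tuple) -> bool:
        # Stub for three independent factors (e.g., password, token,
        # out-of-band approval). Hypothetical, for illustration only.
        return len(credentials) == 3 and all(credentials)

store = RecordStore()
store.put("doc-17", b"classified record")
print(store.query("doc-17", ("password", "token", "approval")))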
There are other considerations and failures the government says may have occurred in this incident, but most of these revolve around manual processes, policies and procedures that are reliable only if they are part of closed-loop processes, and even then they rely on timely communication. Finally, the federal government continues to operate with legacy cyber security that provides little or no security once access is achieved. The President recently issued an executive order to address the issue, and I recommend that firms consider doing the same.

Thursday, May 30, 2013

Why Many Companies Are Failing to Achieve Genuine Cyber Security

There are several key reasons many companies are failing to implement genuine cyber security. Cyber security was an afterthought of a computer industry that did not envision or plan for the connected world we live in today. Nearly all cyber security solutions in the market today fail to follow the eight time-tested principles of security, relying instead on a post-attack ability to identify and list known threats after the damage has been done. Nearly all solutions available today were not built in; instead they sit on top of the OS and rely on it for their functionality. Another major reason for this failure is that senior managers are looking to IT professionals to solve a problem that is less about IT than about process and mathematics, and few IT professionals are process engineers or mathematicians.

Next-generation cyber security will be built into applications and computing environments to create inherently secure processes that do not need to identify threats, but rather handle processing in a way that makes such threats irrelevant. Many still don’t realize that the computing platform architectures we rely on today are more than thirty years old and reaching the end of their lifecycles. They were not designed with the Internet in mind, nor did they anticipate the secure computing problems such an environment would produce.
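A minimal sketch of the difference, assuming a simple allow-list stands in for a built-in secure process: instead of a deny-list that tries to recognize every known threat after the fact, the endpoint refuses to run anything that has not already been cryptographically verified, so unknown code never executes and identifying it becomes unnecessary. The manifest of approved hashes below is hypothetical.

```python
"""Allow-list execution sketch: unknown code never runs at all."""
import hashlib
import subprocess
import sys

# Hypothetical manifest: SHA-256 digests of the only binaries this
# endpoint is ever permitted to execute.
APPROVED_SHA256 = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def run_if_approved(path: str) -> int:
    digest = hashlib.sha256(open(path, "rb").read()).hexdigest()
    if digest not in APPROVED_SHA256:
        # No signature database, no threat intelligence: anything
        # unrecognized simply never executes.
        raise PermissionError(f"{path} is not on the allow-list")
    return subprocess.run([path]).returncode

if __name__ == "__main__":
    sys.exit(run_if_approved(sys.argv[1]))
```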

Wednesday, May 29, 2013

Is Your Company in Denial about Denial-of-Service Attacks?

Denial-of-service attacks are a direct assault on your company’s online revenue stream. These attacks are fairly easy for hackers to pull off, and your company should not simply be hoping it won’t be targeted. Denial-of-service attacks are not limited to a few high-profile companies; every company with significant online revenue is at risk, and the attacks are costing firms billions of dollars. The bad news is that legacy cyber security firms have no genuine solution, in part because most of them lack the deeper domain knowledge required to problem-solve and innovate in this space.
  
Back in the very early 1990s, when the Internet was still new, some of the big ISPs like UUNet were positioning themselves to be acquired by big telecom operators (UUNet, for example, was acquired by WorldCom). I remember a discussion at a network planning session where I noted to UUNet executives that the Internet lacked the identifiers that governed telecom networks, and that these would be easy to add at that early stage of development. The response was, well, I don’t recall precisely what it was, but it went something like, “we don’t need no stinking identifiers.” Their attitude was understandable at the time. Demand for access and bandwidth was already growing at a mesmerizing rate, and all they could think about was how to feed the beast.

The design I had suggested at that time would have identified every user who hopped onto the Internet, along with their location, point of access and so on. Also like telecom networks, it would have assigned each user a class of service (COS) that determined what they were or were not allowed to do. If for any reason users managed to get on the network without this independent-channel authentication (something that was very difficult to do), they were assigned a default class of service that allowed them to do almost nothing, as the sketch below illustrates.
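Here is a minimal sketch of that fail-closed default, with hypothetical tier names and permissions: clients authenticated over the independent channel get a real class of service, and everything else falls through to a default that permits almost nothing.

```python
"""Class-of-service sketch: unauthenticated traffic fails closed."""
from enum import Enum

class COS(Enum):
    FULL = frozenset({"query", "publish", "admin"})
    BASIC = frozenset({"query"})
    DEFAULT = frozenset()  # unauthenticated: allowed to do almost nothing

# Populated via an independent authentication channel, not in-band.
AUTHENTICATED_CLIENTS = {"client-a": COS.FULL, "client-b": COS.BASIC}

def authorize(client_id: str, action: str) -> bool:
    # Unknown clients fall through to the default class of service.
    cos = AUTHENTICATED_CLIENTS.get(client_id, COS.DEFAULT)
    return action in cos.value

print(authorize("client-a", "publish"))  # True
print(authorize("stranger", "query"))    # False: default COS denies it
```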
I recently resurrected this design with my engineering group to create a denial-of-service solution that Vir2us will offer this fall (2013). I’ve added some cool features and tools we didn’t have back when processors were slower, storage and memory were not such low-cost commodities, and we lacked cloud-based speed and scalability. There is some complexity here to be sure, and we’ve created new IP with these innovations that we expect to license to others, but we know the approach works because we implemented its older brother in hundreds of early private and public digital networks.

Just how does all this stop denial-of-service attacks? It’s really quite elegant, and it will also solve some other annoying problems with the Internet’s architecture. A denial-of-service attack is like too many people asking you questions all at the same moment rather than in succession: at some point you simply can’t respond quickly enough and everything stops. Now imagine that only the people you pre-selected were allowed to ask you questions, and that you and they were speaking and hearing in a language known only to you and that select group. You simply wouldn’t hear requests made in other languages and therefore would feel no necessity to respond. There’s a little more to this, of course, but you get the idea. You can get notice of the beta release by subscribing to this blog.
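For the technically curious, here is a minimal sketch of the “language known only to the group” idea: a small UDP service that answers only datagrams carrying a valid keyed hash (HMAC) computed with a pre-shared secret, and silently drops everything else, so an attacker’s flood never earns a response. The key and port are hypothetical, and a real deployment would also need per-client keys and replay protection; this is not Vir2us’s design, just an illustration of the principle.

```python
"""HMAC-gated UDP service sketch: unauthenticated traffic is never answered."""
import hashlib
import hmac
import socket

SHARED_KEY = b"pre-shared-group-secret"  # hypothetical; distributed out of band
TAG_LEN = 32                             # SHA-256 digest length

def valid(message: bytes) -> bool:
    """Check that the message starts with a correct HMAC tag for its payload."""
    tag, payload = message[:TAG_LEN], message[TAG_LEN:]
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9999))
while True:
    data, addr = sock.recvfrom(4096)
    if not valid(data):
        continue  # wrong "language": ignore it, never respond
    sock.sendto(b"ack: " + data[TAG_LEN:], addr)
```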