What you don’t know about SSH can hurt you.

SSH is a powerful access protocol developed some 20 years ago by Tatu Ylonen of Finland.  The protocol’s primary function is to provide trusted access and to encrypt communication in transit, preventing man-in-the-middle attacks.  Once a connection is established, SSH effectively creates an encrypted tunnel that facilitates secure communication between two points.  Since its development, the protocol has grown so popular that SSH now comes pre-installed on every Unix, Linux, mainframe, and Mac system, as well as most network devices.

The Responsibility Gap

Because SSH comes pre-installed on servers and devices, most organizations have no group or individual responsible for monitoring SSH activity.  In fact, most businesses make the leap that SSH = Encryption and Encryption = Security.  In this day and age, who doesn’t want more encryption and security?  But the premise that encryption alone negates the need for vigilance and oversight of SSH use is dangerously flawed.  Here is why: SSH does encrypt communication, but the protocol is better represented by a more accurate equation, SSH = Access.  SSH access comes in two variants: 1. interactive (human to machine) and 2. non-interactive (machine to machine).  Access to critical resources and data needs to be managed, monitored, and controlled, which is why closing the SSH responsibility gap should be a Tier 1 priority for any enterprise.

Knowing the Risks

SSH authentication is built on key pairs consisting of a private and a public key.  To understand how these keys function, an analogy helps: a public key is like a lock on a door, whereas a private key is like the physical key you keep in your pocket.  Presenting the private key that matches a public key grants access and establishes an encrypted connection.

  • Keys are self-provisioned – How comfortable would you be allowing any employee or consultant to grant themselves access to critical applications? (A small inventory sketch follows this list.)
  • SSH keys don’t expire – A key pair created 20 years ago still works today.
  • SSH encryption bypasses security controls – Those security tools you spent millions of dollars on don’t work on SSH-encrypted traffic, effectively creating a security blind spot.
  • SSH tunneling – Just what the name implies: it enables traffic to traverse routers and avoid being blocked.
  • SSH keys are passed around – Keys are often copied and shared, preventing you from knowing who did what, and when.
  • Root-level access – SSH can provide root (command-level) access to systems and data.
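
To make the scale of the problem concrete, here is a minimal sketch of the kind of key inventory an audit might start with.  It is illustrative only – it assumes a single Linux host, default key locations under /home, and enough privilege to read them – and is not one of the commercial tools mentioned later:

```python
import base64
import hashlib
from pathlib import Path


def fingerprint(b64_blob: str) -> str:
    """OpenSSH-style SHA256 fingerprint of a base64-encoded public key blob."""
    digest = hashlib.sha256(base64.b64decode(b64_blob)).digest()
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")


def inventory(home_root: str = "/home") -> None:
    """Print every authorized public key per local account on this host."""
    for auth_file in Path(home_root).glob("*/.ssh/authorized_keys"):
        user = auth_file.parts[-3]
        for raw in auth_file.read_text().splitlines():
            line = raw.strip()
            if not line or line.startswith("#"):
                continue
            fields = line.split()
            # Entry format: [options] key-type base64-blob [comment]
            try:
                i = next(n for n, f in enumerate(fields)
                         if f.startswith(("ssh-", "ecdsa-")))
                key_type, blob = fields[i], fields[i + 1]
                comment = " ".join(fields[i + 2:]) or "(no comment)"
                print(f"{user}: {key_type} {fingerprint(blob)} {comment}")
            except (StopIteration, IndexError, ValueError):
                print(f"{user}: could not parse entry: {line[:40]}")


if __name__ == "__main__":
    inventory()
```

Even this toy script usually surfaces keys nobody remembers provisioning, which is exactly the responsibility gap described above.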

In short, in the wrong hands SSH presents the ultimate doomsday scenario for any business: it gives bad actors the ability to do all sorts of nefarious things, beyond detection, inside the security blind spot that SSH creates.  Fortunately, SSH Communications Security, the inventors of the protocol, have developed commercial solutions to help you mitigate these risks.


Who should be responsible for SSH?

In my job at SSH, I meet with IT executives at many large businesses and government agencies.  Aside from their initial surprise that there is an actual company behind SSH, one question comes up most often: which functional group within IT should own SSH?  The reason this is such a struggle is that, unlike other IT investments, OpenSSH comes pre-installed on servers, networking, and storage gear.  By default it’s just there to be used, and administrators and application developers use it extensively.

Some background: SSH is a protocol used by both system administrators and application owners to communicate securely, control machines, and facilitate secure file transfers.  SSH works remarkably well, and the encryption is extremely effective at preventing man-in-the-middle eavesdropping attacks.  However, in the wrong hands, that same encryption can be leveraged to circumvent security controls.  SSH use requires the creation of matching public and private key pairs for authentication.  The public key, used primarily for automation and sometimes by system administrators for single sign-on, is placed on the target machine; the private key is either placed on a connecting server (for machine-to-machine use) or given to a user to facilitate human-to-machine interaction.
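
For the non-interactive, machine-to-machine case, the pattern typically looks like the sketch below.  It uses the open-source Paramiko library purely for illustration (it is not an SSH Communications Security product), and the host name, service account, and key path are hypothetical:

```python
import paramiko

# Hypothetical values for illustration only.
HOST = "backup01.example.com"
SERVICE_ACCOUNT = "svc_backup"
PRIVATE_KEY = "/etc/app/keys/svc_backup_ed25519"   # private key held by the connecting machine

client = paramiko.SSHClient()
client.load_system_host_keys()                                # trust hosts already in known_hosts
client.set_missing_host_key_policy(paramiko.RejectPolicy())   # refuse unknown hosts

# The matching public key must already be in the service account's
# ~/.ssh/authorized_keys on HOST for this connection to succeed.
client.connect(HOST, username=SERVICE_ACCOUNT, key_filename=PRIVATE_KEY)

stdin, stdout, stderr = client.exec_command("tar -czf /var/backups/app.tgz /opt/app/data")
print(stdout.read().decode(), stderr.read().decode())
client.close()
```

No human is in the loop here, which is precisely why ownership of these keys matters: the automation keeps running whether or not anyone is watching it.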

There is a notion that the PKI management team should also manage SSH access.  Some vendors add to this belief by claiming that managing SSH keys is similar to managing certificates.  However, comparing certificates to SSH keys is more akin to comparing apples with coconuts: both provide authentication, but the similarities end there.  Unlike certificates, SSH keys are easily copied, easily shared, and by default never expire.  Moreover, unlike certificates, SSH is used extensively for machine-to-machine interaction.  For all these reasons, we do not believe SSH aligns with the PKI function and would advise against assigning responsibility for SSH to this group.

Another group often considered is the cryptography team.  It’s easy to see why: SSH provides encryption, so it seems reasonable that the encryption team should own it.  While that’s true as far as it goes, SSH also enables remote interactive command and control of machines, extending well beyond the purview of cryptography.  Instead, it’s our view that the most logical group to own SSH is the identity and access management team.  Why?  Because SSH keys = access.  Granting, monitoring, and revoking access to resources via SSH should therefore adhere to the same process and rigor (if not more) used to grant system access to employees, contractors, partners, and suppliers.

The inherent challenge with SSH is that, unlike identity and access management for humans, there typically is no on-boarding and off-boarding of SSH key access, which is something we strongly advocate.  Given all the complexities of providing safe and secure access via SSH, we feel three things are needed: 1. well-defined policies and procedures related to SSH; 2. training and education for key employees; and 3. continuous system monitoring and enterprise software to enforce the issuance, monitoring, and revocation of SSH access.

In conclusion, the SSH protocol is a vital technology used extensively throughout every business and government agency the world over, which makes protecting and managing SSH access a Tier 1 security concern.  In fact, if you investigate the core of most major breaches that (a) exfiltrated large amounts of data undetected, (b) installed software or disabled systems, or (c) persisted undetected over an extended period, it’s likely that SSH was leveraged by the attackers.

SSH Communications Security, the inventor of the Secure Shell protocol, has developed several thoughtful, purpose-built solutions that address the unique security and compliance requirements of SSH and complement your existing security investments.  To learn more, or to schedule an SSH security risk assessment, please feel free to contact me by filling out the contact form on this website.


Video: Beware the Invisible Man Using SSH


A theory on the 1 billion account hack – and what you should do to avoid being Yahoo’d

Yahoo has been making a lot of news lately, and not for good reasons. Marissa Mayer’s failed attempt to turn the company around resulted in the sale of the company to Verizon for $4.8 billion, a deal now placed in jeopardy by Yahoo’s inability to protect and secure its users’ data.

In September of this year, Yahoo announced that information pertaining to 500 million user e-mail accounts had been stolen, dating back to 2014. It took the company two years to discover and report the loss.

In the immediate aftermath of that announcement, some US senators demanded to know what was learned from the first breach and called for hearings on the matter. But as the news faded from the media spotlight, so too did the pressure to understand just what happened and who knew what, when. Fast forward to December, with Yahoo announcing a second data breach that eclipses the first, both in sheer scale (over 1 billion user accounts) and in elapsed time: the breach occurred in 2013.

Horrifically, it’s been reported that the stolen information is being sold to hacker groups on the dark web, fetching upwards of $300,000. The information was turned over to US authorities by an unnamed third party; upon further investigation the authorities deemed it credible and formally notified Yahoo. Unfortunately, the downstream effects of this stolen information will be felt for years as hackers seek to exploit the data for financial gain. Therefore, I urge every Yahoo account holder reading this article to do two things right now: 1. change your password and security questions; 2. use multi-factor authentication on every web-based account you have – bank, credit card, Amazon, etc. Do yourself a huge favor and do that right now.

Back to the important question: just how was someone able to steal user account information for over 1 billion accounts completely undetected? My guess, without any inside knowledge, is that it has all the markings of exploited SSH key access, which is precisely how Snowden, and later Martin, stole troves of data from the NSA, and how the government of North Korea stole every bit of sensitive information it could get its hands on from Sony.

SSH creates an encrypted channel that can’t be monitored. That’s what makes SSH so powerful and effective when used for good. Used for bad, however, those same capabilities render your security tools – SIEM, DLP, and the like – ineffective. Which is why we (as the inventors of SSH) strongly advocate that customers implement stronger controls over the entire SSH lifecycle, and recommend the immediate remediation of any SSH key access issues within your own company.

If you’d like to learn more about what can and should be done about SSH in your company, please feel free to contact me – I’m happy to help, or at least point you in the right direction.


May the Brute Force NOT Be With You

I recently met with a customer that is using usernames and passwords instead of keys to control SSH access.  For the past several months I’ve been so engrossed in solving SSH key management issues that I was somewhat taken aback by the approach.  After further discussion with some experts on the subject, I’ve come to understand just how dangerous it is.  Here is what I discovered:

SSH keys are the gold standard for SSH access.  Keys are long and complex, far more so than any username and password could be.  Keys can be created for different sets of users and different levels of access, and no secret value is ever sent to the server, so SSH keys are not prone to man-in-the-middle attacks.  Modern SSH keys are also large enough that brute-forcing them is computationally infeasible.  SSH keys definitely have their own security challenges, but there are solutions to mitigate those risks.

Passwords, on the other hand, are subject to the human element: forgotten passwords, password reuse, and simple passwords that are easily guessed or brute-forced.  Passwords are also transmitted to the server, which leaves them susceptible to man-in-the-middle attacks.
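
To put the gap in rough numbers, here is a back-of-the-envelope comparison.  The assumptions are mine and purely illustrative: a 10-character password drawn from 72 printable symbols versus the 256-bit secret of a modern Ed25519 key, guessed offline at a billion attempts per second:

```python
import math

# Rough entropy of a 10-character password drawn from 72 printable symbols.
# Real passwords (dictionary words, reuse) are usually far weaker than this.
password_bits = 10 * math.log2(72)                  # ~61.7 bits

# A modern Ed25519 SSH key has a 256-bit secret (~128-bit security level).
key_bits = 256

guesses_per_second = 1e9                            # assumed offline guessing rate
seconds_per_year = 3600 * 24 * 365

password_years = 2 ** password_bits / guesses_per_second / seconds_per_year
print(f"Password: {password_bits:.1f} bits, ~{password_years:,.0f} years to exhaust the space")
print(f"Key:      {key_bits} bits, brute force is not a realistic attack path")
```

And that comparison is generous to the password: attackers rarely need to exhaust the space, because people reuse passwords and pick guessable ones.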

Okay, after hearing all of this I asked what turned out to be a very naive question: how likely is it that someone could crack an SSH password?  The response was, “It can be child’s play – just Google it.”  I did, and I had the same reaction the sheriff in Jaws did when looking at clippings of previous shark attacks: quick, everyone get out of the water!

If your company is taking this approach to managing SSH access, I strongly advise you to make changing it a top priority.  I didn’t come to this conclusion without first doing the research.  Having read the articles, watched the videos, investigated the software, and spoken to experts on the topic, I can confidently conclude that relying on usernames and passwords alone to control SSH access is an extremely risky and dangerous proposition.  At a minimum, it would be wise to assume that all usernames and passwords are compromised and to further restrict access through multi-factor authentication (MFA).  I’m a fan of SecureAuth, but many identity and access management vendors provide that capability.  One thing to remember is that enforcement is key: you can’t simply apply MFA at the jumphost (that won’t solve the problem); MFA has to be applied to every server.  If this is going on unabated in your company, we should talk about devising a more secure and comprehensive approach to SSH security.
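
As a quick self-check, here is a minimal sketch that reads the stock OpenSSH server configuration and flags settings worth reviewing.  It is deliberately simple – it assumes the default /etc/ssh/sshd_config path and ignores Include and Match blocks – so treat it as a starting point rather than a compliance tool:

```python
from pathlib import Path

# Settings worth confirming on *every* server, not just the jumphost.
WANTED = {
    "passwordauthentication": "no",            # force key-based (or key + MFA) logins
    "permitrootlogin": "prohibit-password",
    "permitemptypasswords": "no",
}


def audit(config_path: str = "/etc/ssh/sshd_config") -> None:
    found: dict[str, str] = {}
    for raw in Path(config_path).read_text().splitlines():
        line = raw.split("#", 1)[0].strip()    # drop comments and whitespace
        if not line:
            continue
        parts = line.split(None, 1)
        # sshd uses the first occurrence of an option, so keep the first one seen.
        if len(parts) == 2 and parts[0].lower() in WANTED and parts[0].lower() not in found:
            found[parts[0].lower()] = parts[1].strip().lower()

    for option, expected in WANTED.items():
        actual = found.get(option, "(default / not set)")
        status = "OK" if actual == expected else "REVIEW"
        print(f"{status:6} {option} = {actual} (want {expected})")


if __name__ == "__main__":
    audit()
```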

May the brute force NOT be with you…


This could be heaven, or this could be hell.


In late February, California’s attorney general Kamala Harris released a breach report that you can find here.  The report says that companies conducting business in the state of California are required to use “reasonable security procedures and practices…to protect personal information from unauthorized access, destruction, use, modification, or disclosure.”

The reasonable security procedures she’s referring to are essentially the SANS Top 20 security controls.  However, Ms. Harris expanded on the 20 controls, emphasizing that consumers should have the option to use multi-factor authentication for system access and that strong encryption should be applied to all customer data.  In the event of a data breach, she went on to say, the breached business should provide fraud alert services to affected parties.

Even if you don’t conduct business in California, I believe it’s just a matter of time before other states follow California’s example and establish similar requirements, so your business should start planning accordingly.  I know, I know – add it to the list.  If you were looking for an irrefutable reason to justify additional security-related project funding, this report is heaven-sent.  However, if your budget can’t grow, re-prioritizing security initiatives will likely be hell.

What a nice surprise (what a nice surprise).  Bring your alibis…


Outside-In To Win

With very few exceptions, most established businesses operate from the inside out.  The business focuses on what’s core (manufacturing, processing, etc.) and projects that focus out to the world.  What I find fascinating is that most businesses don’t start out that way; complacency and the mindset of “we’ve always done it this way” drive the behavior.  The further one moves from the core of the business, the closer one gets to what really drives it, namely customers.  Yet in my experience, most established businesses inevitably fall victim to this inside-out rather than outside-in perspective.  The risk is that it leaves these “established businesses” vulnerable to competition, while newer, disruptive businesses that are hyper-focused on improving the user experience along this outer edge are forever changing customer perspectives and expectations.

Netflix is a great example of “outside in” behavior.  While established television, cable, and movie-rental providers were focused inside out, Netflix was hyper-focused on serving the outer edge of the user experience.  That mindset drove Netflix to give customers what they wanted: on-demand entertainment, wherever and whenever they wanted it.  The strategy helped propel Netflix from a nascent media provider into an entertainment powerhouse that the established providers are still struggling to contend with.

However, I would submit that an “outside in” approach doesn’t require a groundbreaking idea; it can be something as simple as removing friction from a customer process.  For example, I recently signed up with a financial service provider over the telephone.  Overall, the process was fairly painless and positive, until the representative told me he would have to mail me several forms to sign.  Although the representative helped me pre-fill the paperwork, mailing documents for review and signature introduced what felt like an unnecessary speed bump in an otherwise smooth process.  To add insult to injury, the paperwork took four days to arrive, and it then sat on my desk unopened for about a week and a half.  It wasn’t until I got a reminder call from the company that I finally completed the documents and sent them back – not because I had changed my mind about the service, but because I got busy with work and the paperwork became a lower priority.  The truth is, if I hadn’t received that reminder call, there’s a good chance the envelope would still be sitting on my desk unopened.

If the financial service provider had taken a greater “outside in” perspective, they likely would have used DocuSign to let me complete the entire sign-up process during the initial call with the representative.  That would have saved weeks of time, removed frustration, and left me with a very favorable view of the business.  Instead, my perception is that they are a great company, with great people, that is difficult to do business with.  That may not be a fair assessment, but having been exposed to DocuSign when I purchased my home, that excellent customer-focused experience, rightly or wrongly, set a new standard for modern document handling.

I submit that established businesses should pay far more attention to their customer experience.  Examine all the clicks, telephone menu punches, call transfers, and paperwork involved in the customer process and see what can be eliminated, either through process re-engineering or applied technology.  Take a kaizen (continuous improvement) approach and customer satisfaction will soar, and as a result the business will grow.  Isn’t that worth the effort?


What do the Microwave, 3D Printing and Splunk all have in common?


Q. What do the microwave, 3D printing, and Splunk all have in common?

A. They are all technologies that introduced a fundamental paradigm shift from the conventional way of doing things. Before the invention of the microwave oven, if I had told you that I could place food in a box, turn it on, and cook the food in a fraction of the normal time without making the box hot, you would have looked at me like I was crazy. Prior to the microwave, your reference point would have been a gas or electric range, which uses a heating element (fire or heated coils) to heat the oven.  You place your food into a pre-heated oven, and the ambient heat in turn heats your food.

If you happen to be old enough to remember the mind-blowing experience of witnessing a microwave in use for the first time, you may have thought it was a magic trick.  Personally, it took me a little while to wrap my head around the technology.  No pan, no pre-heating, no metal – just put the food on a plate, place the plate in the microwave, set the time, hit start, and pow: out comes hot food. Most shockingly, when you opened the door to retrieve your food, the inside of the microwave (apart from the plate with your food) was cool to the touch – remarkable!  The need to heat food has existed forever, but this new approach to heating and defrosting it was fast, efficient, and radically different.

3D printing is equally remarkable. Until the invention of the 3D printer, creating a product prototype or a one-off mechanical part was a manual, time-consuming, and often expensive process. If you needed to create or recreate something, you would have to hand-craft a model or prototype, create a mold, and have the part fabricated in a special facility. The introduction of the 3D printer changed all that.  Now anyone with a CAD/CAM program and a 3D printer can spin up mechanical designs in just a few hours, resulting in enormous savings of manpower and time.

Splunk is like the microwave and the 3D printer in that it introduces a whole new way of accessing and leveraging machine- and human-generated data that has heretofore been impossible. Human-generated data is simple enough to understand: it’s all the data we create by filling in information manually.  But what is machine data? Here’s the short answer: any electronic device with some intelligence built into it generates machine data.  That includes everything from the automobile you drive and the elevators you ride, to the laptop you type on and the cell phone you talk on – in short, almost everything creates machine data.

For the most part, this machine data goes unused and is often discarded, which is a huge mistake. Here is why.  Imagine your automobile engine is running erratically.  You bring it to a mechanic, who takes it for a drive, listens to the sound, guesses what the problem might be, replaces several parts, and sends you on your way. On the drive home, you discover the engine is still running erratically.  You are out hundreds of dollars, you’ve spent hours bringing the car in, and the problem still isn’t fixed.

Now run the same example with data: you bring your car to the mechanic, the technician connects a handheld computer to the automobile’s on-board interface, and reads the machine data (error codes) from the engine.  The handheld computer interprets those codes and points the technician to the malfunctioning part; the repair is made, and you drive home with the problem resolved.  No guesswork, no unnecessary repairs, and the problem is fixed properly the first time.  That’s the power of leveraging machine data.

Now imagine you could collect this machine data from all over your company, from every device (servers, applications, databases, hardware devices, etc.) and, like the automobile example, quickly make sense of it in real time. How many hours of wasted activity chasing down technical problems could be eliminated? Better yet, imagine you could break down informational silos and ask questions of this amassed data to help you manage your business more efficiently. Imagine you could easily organize and visualize all of this information in meaningful ways, with the visualizations updating in real time. Imagine you could set alerts so that when conditions you’ve defined are met, notifications are sent or programs are triggered to take immediate corrective action. Now imagine a world where finding information no longer takes days, weeks, or months, but only seconds or minutes. Actually, you can stop imagining: just like the microwave oven and the 3D printer, Splunk actually exists.
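
To be clear, the sketch below is not Splunk; it is a toy Python illustration of the underlying idea – follow a stream of machine-generated log lines and fire an alert when an error threshold is crossed within a time window. The log path and threshold are invented for the example:

```python
import re
import time
from collections import deque

LOG_FILE = "/var/log/app/events.log"   # hypothetical machine-data source
WINDOW_SECONDS = 60
THRESHOLD = 25                          # errors per window before we alert


def alert(count: int) -> None:
    # In a real system this would page someone or trigger a remediation script.
    print(f"ALERT: {count} errors in the last {WINDOW_SECONDS}s")


def follow(path: str):
    """Yield new lines appended to a file, like `tail -f`."""
    with open(path) as fh:
        fh.seek(0, 2)                   # jump to the end of the file
        while True:
            line = fh.readline()
            if not line:
                time.sleep(0.5)
                continue
            yield line


def monitor() -> None:
    error_times: deque[float] = deque()
    pattern = re.compile(r"\b(ERROR|FATAL)\b")
    for line in follow(LOG_FILE):
        if not pattern.search(line):
            continue
        now = time.time()
        error_times.append(now)
        while error_times and now - error_times[0] > WINDOW_SECONDS:
            error_times.popleft()       # keep only events inside the window
        if len(error_times) >= THRESHOLD:
            alert(len(error_times))
            error_times.clear()         # avoid repeated alerts for the same burst


if __name__ == "__main__":
    monitor()
```

Multiply that by every log, metric, and event source in the company, add search, correlation, and visualization on top, and you have a sense of what a machine data platform does at scale.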


The inherent risk of a fixed focal point security posture


There are inherent limitations to relying on traditional Security Information and Event Management systems (SIEMs) that are often overlooked and that every organization should be aware of: 1) a SIEM’s fixed focal point, and 2) its dependence on structured data sources.

Maintaining a fixed focal point (monitoring just a subset of data) only encourages nefarious opportunists to find vulnerabilities outside that narrow field of vision. Any experienced security professional will tell you that all data is security relevant, yet traditional SIEMs limit their field of vision to a fixed focal point. To understand why this matters, consider an example outside of information technology that’s easier to follow. Imagine for a moment that there is a string of home break-ins happening in your neighborhood.  To safeguard your property you take precautionary measures: you consult a security professional, who recommends deadbolts on the front and back doors, reinforced locks on the first-floor windows, and cameras and alarm sensors above the doors and windows. With all this complete, you rest easier, feeling far more secure.   This is what a traditional SIEM does: it takes known vulnerability points and monitors them.

Building on this example, imagine that a bad actor comes along, intent on breaking into your home.  She cases the house, spots the cameras, and decides that the first-floor windows and doors pose too much risk of detection. After studying the house for a while, she finds and exploits a blind spot in your defenses: using a coat hanger, she gains access through the garage in just six seconds without tripping any alarm. How can this be?  Your security professional didn’t view a closed garage door as a vulnerability point, so no cameras or security measures were installed there.  As a result, your home has been breached and no alarms were triggered, because the breach occurred outside your monitored field of vision.  This scenario illustrates the inherent limitation of defining the problem around anticipated vulnerabilities: determined, inventive criminals will find ways in that haven’t been considered. That is the inherent problem with traditional SIEMs; they are designed to look only at known threats and vulnerabilities, so they do little to alert you to unanticipated ones.

The dependence on structured data sources creates another serious security limitation. Traditional SIEMs store information in a relational database. The limitation of this approach is that, in order to get information from different sources into the database, users first need to define a structure for that information and then forcibly make the data adhere to it. Imposing this structure often leads to relevant security information being left out in the process.
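
A toy example makes the loss concrete. Suppose an event arrives carrying a field the ingestion schema never anticipated; the event and field names below are invented for illustration:

```python
import json

# A raw "machine data" event as it arrived, including a field nobody anticipated.
raw_event = {
    "time": "2016-03-01T02:14:07Z",
    "src_ip": "203.0.113.42",
    "dst_ip": "10.0.0.5",
    "action": "allow",
    "ssh_tunnel_detected": True,        # the interesting clue
}

# Schema-on-write: the SIEM's relational table only has these columns.
SCHEMA_COLUMNS = ("time", "src_ip", "dst_ip", "action")


def store_structured(event: dict) -> dict:
    """Force the event into the fixed schema; anything else is silently dropped."""
    return {col: event.get(col) for col in SCHEMA_COLUMNS}


def store_raw(event: dict) -> str:
    """Keep the event in full fidelity; fields can be extracted at search time."""
    return json.dumps(event)


print("SIEM row:  ", store_structured(raw_event))   # the clue is gone
print("Raw event: ", store_raw(raw_event))           # the clue is still searchable
```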

To illustrate why this is an issue, imagine that detectives are trained to look only for fingerprints when analyzing a crime scene. Their investigations ignore anything that isn’t a fingerprint: they search for full prints, partial prints, and, if they are really advanced, maybe hand and foot prints. But in the course of their investigation they completely ignore blood, hair, saliva, and other DNA evidence. Now, just how effective would such a detective be if the criminal wore gloves and shoes? I think everyone would agree: not very. Yet that is exactly what happens when you limit the types of data captured by force-fitting different data types into a standard database schema – running everything through a schema effectively removes a lot of relevant information that could be of great help in an investigation.

Since all data is security relevant, security professionals must be able to collect information from all data sources in its full fidelity to be truly effective. Because a traditional SIEM strips out that ability, it follows that no business should rely solely on a traditional SIEM for security – make sense?

Instead, what is needed is a more fluid approach to security, one that captures information from multiple sources, evaluates all known exploits, and lets you correlate different information to uncover new potential exploits before a reportable data breach occurs. Splunk’s real-time machine data platform is extremely well suited to that task.


What is the difference between Business Intelligence and Operational Intelligence?

[Image: a car’s rear-view mirror and front windshield]

The differences between operational intelligence (OI) and business intelligence (BI) can be confusing. The name “business intelligence” alone sounds like nirvana – show of hands, who doesn’t want their business to be intelligent? In truth, both names are fairly ambiguous, so let’s turn to Google’s definitions to shed some light on their meaning:

Business intelligence, or BI, is an umbrella term that refers to a variety of software applications used to analyze an organization’s raw data. BI as a discipline is made up of several related activities, including data mining, online analytical processing, querying and reporting.

Operational intelligence (OI) is a category of real-time, dynamic business analytics that delivers visibility and insight into data, streaming events, and business operations.

These definitions are helpful, but I think the picture above illustrates the difference quite clearly.  Business intelligence comes after the fact, which is illustrated by the rear-view mirror of the car: it’s helpful to think of BI as a view of where you’ve been, or what has happened in the past.  Yes, you can store information in a data mart or data warehouse, and you can “mine that data,” but that doesn’t change the fact that the information you are analyzing occurred sometime in the past.

Operational intelligence, on the other hand, is represented by the front windshield, showing what’s happening right now, in real time. If you spot a large pothole in the distance, OI alerts you to it and lets you make a course correction before you ruin your alignment; BI only tells you that you drove through a pothole, as your car wobbles down the road from the damage.

Most businesses have the potential to leverage operational intelligence for competitive gain, but many are still stuck in the past with traditional BI tools. If you want to really crank up your business, I say it’s time to get real-time and discover what the paradigm shift to OI can do for you.

