Yesterday, we sponsored the Argyle Webinar “Building a Proactive Framework to Securing the Business”. Our CEO and co-founder Guy Bejerano moderated a great set of industry panelists who shared best practices.
Guy kicked things off with a brief thought-leadership perspective on being proactive and thinking like a hacker. He described some of the defining characteristics of hackers – persistent, creative, goal-oriented – and how our security mindset needs to counter them, for example by validating risks more continuously and breaking the kill chain.
Guy also discussed some proactive, offensive techniques for getting ahead of attackers today, such as hiring ethical hackers and crowdsourcing the problem via bug bounty programs. The challenge, of course, is the dependency on specialized people amid the industry’s ongoing cybersecurity talent shortage, as well as the limitations of the current approach – a periodic snapshot of risks focused on external attacks.
Guy advocated a move toward automation (i.e., automated security validation or attack simulation) as a way to deliver continuous validation of risks. The benefit is that such a solution can be rolled out without impacting users or the environment, and is available to all enterprises. The key requirements for an automated solution are the ability to validate risks in real production environments, across the entire kill chain, using real hacker breach methods. The value of automation lies not just in executing breach methods more efficiently than humans can, but in the ability to tackle both internal and external attacks and to try out “what if” scenarios.
Next, we jumped into the panel discussion. The panel discussed how the economics of the cybersecurity landscape are now skewed toward attackers: large amounts of money can be made from successful attacks, and well-funded criminal enterprises are developing cutting-edge tools and malware.
There were several key tips that the panel advocated toward building a more proactive framework:
In cybersecurity, one size doesn’t fit all, so it’s important to create a security architecture that works for your organization. To start, you need to understand how your organization makes money. The type of business you run dictates what your high-value targets are, and this in turn tells you which assets are of most interest to attackers. The core security architecture needs to focus on protecting these assets. Aligning with business objectives allows you to inject a red team approach and predict the next layer of attack, rather than just layering tools on top of tools.
In a dynamic organization, data is the only constant. By tracking it, you can identify abnormal patterns of behavior that can be an early indication of issues. In one example, a panelist described monitoring load across their data centers and distribution centers over time. By analyzing this traffic, they can tell when patterns diverge from normal and preemptively pinpoint security and network issues.
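The baselining idea the panelist described can be sketched in a few lines: learn a metric’s normal range from recent history and flag readings that diverge from it. This is a minimal illustration, not the panelist’s actual tooling – the function name, window size, and threshold are all illustrative assumptions.

```python
# Minimal sketch: flag samples that deviate sharply from a rolling baseline.
from statistics import mean, stdev

def find_anomalies(samples, window=24, threshold=3.0):
    """Return indices of samples more than `threshold` standard
    deviations from the mean of the preceding `window` samples."""
    anomalies = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(samples[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Hypothetical hourly load readings: steady traffic, then a sudden spike.
load = [100 + (i % 5) for i in range(48)] + [500]
print(find_anomalies(load))  # the spike at index 48 is flagged
```

In practice, a real deployment would baseline per time-of-day and per segment rather than using a single rolling window, but the principle – divergence from learned normal as an early warning – is the same.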
The panel talked about the importance of information sharing. Data from a SIEM or other security detection technologies becomes far more valuable when you compare it with peer groups and formal or informal industry groups. Of course, information sharing raises a variety of privacy issues, and these need to be addressed before organizations can take full advantage of peer knowledge.
Our panelists discussed how many organizations confuse penetration testing with red team exercises. Red team exercises focus on business-related breach scenarios and are an effective way to test an organization’s security processes. The most important aspect of red team exercises is the ability to “drill” responses, i.e., building muscle memory so security teams know how to respond to an event.
In summary, this was a great webinar with lots of good insights. If you haven’t yet implemented offensive security techniques or looked at automation to stay a step ahead of attackers, now is a good time to start.