While watching TV today, two completely different shows made me think of information security issues and our shortcomings when it comes to data. The first was related to national security: a guest on Campbell Brown on CNN was discussing the successes and failures of our national intelligence agencies. The guest stated we were gathering 1.something billion communications (think email and phone calls) per month and that we don’t have the ability to analyze all that information. After getting my fill of the news I switched over to a cable channel playing Fletch, the 1985 comedy starring Chevy Chase as a reporter following a story. In one scene Fletch is performing some research on his subject. He and his assistant are shown using a microfiche machine to look through old news articles. For those too young to remember, see http://en.wikipedia.org/wiki/Microfiche. He then goes to the hospital to find medical records on his subject, and on the desk in the records room, replete with a bumper sticker that says “I heart my computer”, is a lone computer in a room full of shelves of files. He accesses the green screen, pulls the needed information, and the story moves on. What made me think of security is that prior to this past decade we suffered, as Fletch did, from a lack of readily searchable and available security information (think logs). Today we suffer just as the intelligence agencies do in that we have more information than we know what to do with (think of the poor guy who has to read the logs). We have gone from “I can’t find it because I don’t have the information” to, well, “I can’t find it because there is too much information”.
As an example, SIEM systems today work to some extent. While they all claim to work perfectly with all types of sources, in reality the claims range somewhere from worthless to somewhat helpful. Again, just like our issue today with too much information, we often miss attacks or warning signs due to a lack of correlation of the right kinds of information. These systems take in log files and spit back out alerts fired off via email. This made me think of an issue that has long bothered me: why can’t these systems take data and make it actionable through visual representations instead of an email? Why can’t we take all of the traffic logs from our egress devices (e.g., firewalls, web proxies) and represent the data in a visual pattern? Maybe as spheres of different sizes placed on a map that represent sources and destinations, or port and protocol combinations. While we’re at it, why can’t we represent ALL security data visually? To highlight the difference between visual and non-visual representation, think of a doctor reading an x-ray film. While it may be beneficial for you to hear it described as “your ulna, which is next to the radius and connects your hand to the upper arm via the humerus, has a hairline fracture approximately 3 inches above the wrist bone”, wouldn’t it be better if he showed you the x-ray, pointed to the site, and said “right there, your bone is broken”?
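To make the “spheres of different sizes” idea a little more concrete, here is a minimal sketch that aggregates egress firewall log entries by destination and port and plots each combination as a bubble sized by connection volume. It is not a geographic map, just a simple bubble chart, and the log format (a CSV with src_ip, dst_ip, and dst_port columns) is a hypothetical stand-in, not any particular vendor’s export.

```python
# A rough sketch: turn egress firewall logs into a bubble chart where bubble
# size reflects how often a (destination, port) pair appears. The CSV columns
# used here are assumptions for illustration only.
import csv
from collections import Counter

import matplotlib.pyplot as plt

counts = Counter()
with open("egress_firewall_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Count how often each (destination, port) pair shows up.
        counts[(row["dst_ip"], int(row["dst_port"]))] += 1

pairs = list(counts.items())                      # [((dst_ip, dst_port), count), ...]
xs = list(range(len(pairs)))                      # one slot per destination/port pair
ys = [port for (_, port), _ in pairs]             # destination port
sizes = [n * 20 for _, n in pairs]                # bubble area grows with connection count

plt.scatter(xs, ys, s=sizes, alpha=0.5)
plt.xticks(xs, [ip for (ip, _), _ in pairs], rotation=90)
plt.ylabel("Destination port")
plt.title("Egress traffic by destination and port")
plt.tight_layout()
plt.savefig("egress_bubbles.png")
```

Even something this crude makes an odd destination or an unexpected port stand out in a way a stack of emailed alerts never will.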
One other unexplored area, which is understandable given that we rarely know how to analyze the data we do have, is access recertification. In case you’re not familiar, recertification is the process of certifying that an account, or individual, needs a certain level of access within an application or system. For example, a clerk may need the ability to cut checks up to a certain dollar amount, while anything over that amount requires the next level of “supervisor” access. While we have systems that facilitate this process by providing detail on current access levels and a nice web interface, none of these systems use actual log data to help a “certifier” make a decision. Wouldn’t it be interesting if a manager could certify a person’s level of access while also knowing which levels of access or systems that individual has actually used in, say, the past month or year? It would be easier to understand what levels of access individuals need if we could simply point to the analysis of the logs and say “that user has access to this system but has not logged into it in the past year”. Then, let’s make that a visual representation…
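As a sketch of the log-analysis half of that idea, the snippet below joins an entitlement export against authentication logs and flags any access that has gone unused for a year. The file names and columns (entitlements.csv with user/system, auth_log.csv with user/system/timestamp) are hypothetical placeholders for whatever an IAM tool and a log platform would actually export.

```python
# A minimal sketch: for each entitlement on record, find the user's last
# authentication to that system and flag anything unused in the past year.
# Input files and field names are illustrative assumptions.
import csv
from datetime import datetime, timedelta

last_seen = {}
with open("auth_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        key = (row["user"], row["system"])
        ts = datetime.fromisoformat(row["timestamp"])
        if key not in last_seen or ts > last_seen[key]:
            last_seen[key] = ts  # keep the most recent login per user/system

cutoff = datetime.now() - timedelta(days=365)
with open("entitlements.csv", newline="") as f:
    for row in csv.DictReader(f):
        key = (row["user"], row["system"])
        ts = last_seen.get(key)
        if ts is None:
            print(f"{row['user']} has access to {row['system']} but no recorded logins")
        elif ts < cutoff:
            print(f"{row['user']} last used {row['system']} on {ts:%Y-%m-%d}; flag for review")
```

Hand a certifier that list, or better yet a chart built from it, and the recertification decision starts to rest on evidence instead of guesswork.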
Just a thought.
The well-known practice and art of providing controls at multiple layers of the infrastructure is nothing new to those involved in the protection of information. As a general rule in defense in depth, one must assume that at least one layer of the security architecture will fail. The follow-up question is, then what? How well will the systems and data be protected if we lose one layer of control? The current threat environment must also be taken into account when answering that question. Organizations have risks because of the way in which they operate IT, the industry they are in, and the business model they follow. A small manufacturing firm has a different threat and risk profile than a large multi-national bank. But I ask the question: given the advances in technology, is your CIO keeping up? Does he or she understand that security controls are layered with some overlap for a specific purpose? Do they understand the specific risks in their own organization and why the controls were chosen in the first place?
Take for example the use of antivirus. Almost all organizations use AV in some way to protect their systems from the latest malware, virus, and worm threats. But new malware threats seem to be growing at an alarming rate, growth that can be directly attributed to the maturing cybercrime organizations run out of Russia, Hong Kong, and Eastern Europe and their ever-increasing return on investment (they seem to be run more like businesses than chop shops). Symantec alone counted a 200+% increase in newly detected malware threats from 2007 to 2008. Can your AV keep up? Or better yet, how effective has it been over the last 6 months?
If you said you had an additional layer of security you need to apply to protect your systems, would your CIO understand the difference between AV and policy-based controls that rely on monitoring how system processes and applications interact with the operating system itself? If not, they likely fall into the category of answering all questions with “but we use AV”. I’d question how many of those answering this way understand how effective (or ineffective) their current AV is. If they don’t understand the answer to that question, they will likely make the fatal mistake of relying on an outdated technology (AV) that is ineffective at stopping the new and growing malware threat, while rejecting new technology that they don’t understand.
As a side note, infections today seem to come from users browsing the web. And this doesn’t have to be browsing to “suspect” or non-business sites as many mainstream web sites have been compromised over the past few years. The security breach that is self-inflicted is always the hardest to live with as a security professional…all the while knowing you could have done something to stop it if only your CIO could keep up with current security threats and technology.
No, this isn’t an article about Amsterdam. So I was cleaning out an old hard drive and came across a paper I wrote a while back on Microsoft password hash passing attacks using gsecdump and msvctl. While the paper is a bit dated you’ll get the idea of how these tools work and see the results of trying this out in a test environment. The paper is here.
I promised myself I would post more often to the site…but it looks like that didn’t work out so well. Since I have some time while I’m rebuilding my test lab at home, I thought I’d quickly write something about the state of information security. I’ve had some time recently to attend conferences and CISO roundtables where various topics were discussed. Between those discussions and some recent strategy project work, I’ve noticed the same emerging theme…IT IS ALL ABOUT THE DATA.
Starting with the most recent event, I attended the local FBI InfraGard meeting in Chicago, where one of the presentations discussed a new Senate bill (SB 1490) being introduced that would preempt the state laws around security controls and breach notification. While the bill apparently needs some work, I found the requirements around the “minimum” security controls that need to be in place most interesting. While I think the requirements are quite vague, and wording such as “A business entity shall implement a comprehensive personal data privacy and security program that includes administrative, technical, and physical safeguards” doesn’t exactly tell us anything we didn’t already know, the bill heads in the right direction. Paraphrasing, it states that we as a business must design a system to protect the DATA based on risk.
The reason I bring this topic up is that we, as information security professionals, often get caught up in protecting the network layer as opposed to the data layer. Why? Because the data layer is messy, hard to control, and contains structured data (pulled from applications) and unstructured data (think random file shares). Who owns it may be a mystery and even if we identify an owner sometimes we don’t know how sensitive or critical the data may be (i.e. lack of data classification).
So if I posed the question “what are the most mature controls of your information security program?”, what would you say? I almost can’t stop myself from answering firewalls, AV, IPS, and vulnerability management. Very rarely can an organization answer data loss prevention, data classification, encryption controls around sensitive data, or even incident response or forensics. Why? Because it is messy.
Going back to meetings, the meeting prior was a CISO round table event at Washington University in St. Louis. The topic was unstructured data and how various organizations were dealing with the issue. The overriding theme, of course…it is messy. Unstructured data lives in file shares, both ones you know about and ones you do not, and contains everything from benign information to your organization’s most critical information assets. One of the presentations was from a vendor, Varonis, which has a tool to identify and detail who has access to unstructured data. While I can’t speak for the tool, the concept is interesting and shows that some organizations get it, but need some technology to assist their data classification and access control needs. Again, the topic of data is messy. We’d like to talk about the cool new SIEM appliance we just put in, or how our IPS is blocking the advances of organized cybercrime…not so much when it comes to the actual data we are charged with protecting.
Please don’t read too much into my statements and think I’m against the commodity security controls at the network layer. It would be foolish to ignore these controls as they do serve a purpose in protecting our systems and networks. However, if you haven’t started, then I’d urge you to examine the data you’re attempting to protect and see if you can answer these simple questions: Where is your most sensitive data? Who has what level of access to that data? And the last time it was accessed, what did the person do with it?
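For the first of those questions, even a crude first pass beats not looking at all. Below is a rough sketch that walks a file share and flags files containing patterns that look like SSNs or payment card numbers. The share path is a placeholder, and a real data discovery tool would do far more (checksum validation, binary and database formats, and tying findings back to access rights).

```python
# A rough first pass at "where is your most sensitive data": walk a share and
# flag files with SSN-like or card-number-like patterns. Path and patterns are
# illustrative; expect false positives.
import os
import re

PATTERNS = {
    "possible SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "possible card number": re.compile(r"\b(?:\d[ -]?){15,16}\b"),
}

SHARE = r"\\fileserver\departments"  # placeholder share path

for dirpath, _dirnames, filenames in os.walk(SHARE):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            with open(path, "r", encoding="utf-8", errors="ignore") as f:
                text = f.read(1_000_000)  # only sample the first ~1 MB of each file
        except OSError:
            continue  # skip files we cannot read
        for label, pattern in PATTERNS.items():
            if pattern.search(text):
                print(f"{path}: {label}")
                break
```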
As hard drives become larger in storage capacity, the areal density (the number of bits manufacturers can stuff on the physical platter) needs to increase as well, which introduces more noise when reading data.
A consortium of hard drive manufacturers (IDEMA) is now pushing to increase the sector size from 512 bytes to 4096 bytes. This change means faster seek times and larger capacity. The downside translates to more slack space, but with larger hard drive capacity does it really matter? Probably not to the consumer, but it’ll be interesting to see how much more time this will add to forensics work.
Link to IDEMA’s whitepaper for more detail.
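To put a number on the slack-space point: a file rarely fills its final sector, so the leftover bytes in that last sector go unused (and stay interesting to a forensic examiner). A quick worked example, using an arbitrary illustrative file size:

```python
# Slack space is whatever is left over in the last sector allocated to a file.
# Larger sectors mean more potential slack per file. The 10,300-byte file size
# is just an illustrative value.
def slack(file_size: int, sector_size: int) -> int:
    """Bytes left over in the final sector allocated to the file."""
    remainder = file_size % sector_size
    return 0 if remainder == 0 else sector_size - remainder

file_size = 10_300  # bytes
for sector in (512, 4096):
    print(f"{file_size}-byte file, {sector}-byte sectors: {slack(file_size, sector)} bytes of slack")

# 512-byte sectors:  10,300 % 512  = 60    -> 452 bytes of slack
# 4096-byte sectors: 10,300 % 4096 = 2,108 -> 1,988 bytes of slack
```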
No, this isn’t a post about using a Bic pen to pick a Kryptonite lock off of a bike. I ran into a problem in that I had an older Kryptonite lock stuck on my bike for the past 3 years. No key, bike shop was no help, Bic pen doesn’t work. My main issue is I paid extra for an aluminum-frame Trek in order to save on weight, and here was this 5 lb. lock stuck, banging around in the carrier, adding unnecessary weight to my ride.
First, do not try to use a standard hacksaw blade. Second, if you’re thinking about bolt cutters you’ll need a pair with 4′ handles and a 300 lb. guy to snap through the metal. I found a simple solution, assuming you have 20 minutes, 1 hand cramp, and about 200 calories to spare.
You can cut through the lock using a standard hacksaw, but you’ll need to swap the blade for a carbide grit rod saw. I used a 10″ Stanley blade from Home Depot for $4.48, as seen below.
20 minutes later your lock should look like this:
I know this may seem obvious, but you’ll probably want to cut the lock off from the “lock” side near the base, as it is easier to grip while you hack away at the metal. As an alternative, a Dremel tool and a diamond cutting disc will also do the trick. Given my lack of a corded Dremel, and the fact that a single disc was $21 (with no guarantee one would do the job), I went the manual route.
In the last quarter of IS433 – Security Management at DePaul I posed the following questions. While I’m still reading through the research papers I thought I’d post these to the site.
1. There appears to be a set of security controls that have become commodities, controls an organization cannot do without these days. Some examples may be SPAM or web content filtering, antivirus, authentication technologies, and remote access controls (VPN, SSL VPN, etc.). What do companies see as the full profile of commoditized security technologies (those that do not offer a competitive advantage) versus those that set them apart from other organizations? Are there security technologies or controls that do create a competitive advantage, and if so, find an organization that has one and profile how they have used this advantage. (Possible books: The New School of Information Security by Shostack http://www.amazon.com/New-School-Information-Security/dp/0321502787 ).
2. Related to point 1 above, how has this changed the focus of security groups (or security blueprint) in the past few years? Do security groups or functions focus more on risk management and closer alignment with the business, or do they focus more on making sure the commoditized controls above function correctly? What is the difference between risk management and security?
3. Why are IT auditors so annoying? (That may be a rhetorical question). Has SOX, HIPAA, industry regulations, etc. actually helped focus security functions or organizations or simply created distractions? I’d be looking for research in this area and actual numbers if possible.
4. How are security groups, or are they at all, using metrics to show value to an organization? If so, how do they differ from IT metrics? Is IT using metrics to show organizational efficiency and cost savings? If no one seems to be using metrics, why might this be the case? Too difficult to track, not a priority of IT management? If security groups are using metrics, which metrics seem to hold the most value? (Possible books: Security Metrics: Replacing Fear, Uncertainty, and Doubt http://www.securitymanagement.com/article/security-metrics-replacing-fear-uncertainty-and-doubt, or Beyond Fear by Bruce Schneier http://www.schneier.com/book-beyondfear.html).
5. What are the effects of Web 2.0 or social media and user-generated content on information security? Has social media helped or hurt security? Are there safe ways to allow social media, or should we even allow social media as an outlet for users of an organization from a security perspective? This one is very broad and can be taken in many directions…
6. Visualization of security events (Possible book: Applied Security Visualization http://www.amazon.com/Applied-Security-Visualization-Raffael-Marty/dp/0321510100/ref=sr_1_1?ie=UTF8&s=books&qid=1241707763&sr=1-1 or the Edward Tufte visualization books if you dare). Can you generate real data and try some of the techniques presented in the book?
7. Cloud computing and the current state with regards to information security. Challenges? Any major legal issues related to compliance? Do you trust the current providers?
Kind of a misleading title, but as I sit in a Caribou Coffee with 6 other people who are all using their laptops to watch movies, surf, or do work, it strikes me that everyone outside the corporate walls seems to be using a Mac. Of those 6 people I see 4 MacBook Pros, 1 MacBook, and 1 lonely HP Netbook. It could be that I’m on a college campus and that the Mac is the current “in” laptop to have based on Apple’s genius marketing campaign.
The other side of that campaign is based on the misconception that OS X is more secure than Windows. In 2007 OS X had 243 total software flaws that required patching versus just 44 for XP and Vista combined (OK, mainly XP since no one is actually using Vista). Also, the current release, OS X 10.5.7, fixes nearly 70 security flaws. One thing to keep in mind if you’re looking purely at the numbers is that OS X may need many security patches where Windows needs a single one. As anyone on a Linux box who has run the yum -y update command recently can tell you, each component of the system requires an individual update or patch. Since OS X is built on many of these open source components, it is no wonder the numbers seem to be in Windows’ favor.
If we can assume that OS X is as flawed as, if not more than, Windows, then why aren’t we seeing a barrage of attacks against OS X? I think the right question is whether this is even a viable platform to attack. If the motive of current attacks is money in the form of credit cards, bank accounts, identities, etc., then we can speculate as to why not. Ignoring the fact that OS X holds a low share of the market, if most Mac users are college students, would it even make sense to compromise a system to access a credit card with a $500 limit…one the student only has because they filled out an application to get a free $2 tee shirt? Or is it the fact that Windows is so easily compromised that it makes no sense to go after Macs? I’m going with the latter for now.
While I wrote this a while ago, I’m glad I held off on posting it. Since that time a study was conducted at the University of Virginia of incoming freshmen and which OS they picked for their laptop. It seems that Apple is starting to take a larger share of the higher ed. market...well, at least at Virginia. The issue around the total number of vulnerabilities in OS X was also covered today in an IBM ISS meeting I attended to discuss the findings from the X-Force 2008 Security Study.
In thinking through this question and working towards an answer I found myself flip-flopping on which position I was going to take. Just for background purposes, the main driver for this question came from my reading of The Big Switch by Carr. In this book, and another by the same author, Does IT Matter?, Carr argues that Information Technology has become so common across companies and industries that it rarely creates a competitive advantage. While I found myself disagreeing at first (mainly because all IT people want to think they are special), I was eventually persuaded by his argument. One thing that helped sway me was my consulting background and the fact that I had an opportunity to see the inside of so many different IT organizations. In thinking back to these companies I started to think of how each did almost the same thing on almost the same platforms. An example is email and the fact that all organizations were providing the same service to their employees using the same software. It didn’t matter if the organization under review was a large healthcare provider, a bank, or an energy company. The Big Switch, which is actually supposed to be more about cloud computing, brings in the idea that these IT “commodities” can be outsourced to someone who can do the same thing faster and cheaper than your IT staff can do it in-house.
So, if information security is an extension of IT, then I would ask whether it will also be viewed as commonplace and therefore subject to the same forces of consolidation. Does information security matter? I made the mistake early on, working at a privately held company not subject to many regulations or security requirements, of thinking that the direction of the program could take any route necessary to meet the end goals. I didn’t view our organization as having any of the tedious compliance requirements. But then we started to see our clients (mainly healthcare, banks, and energy companies) sending questionnaires asking how “well” we were handling security and what controls we had implemented. Their concern was very simple: they have certain security controls in place to protect their data while it is in their possession…how do we intend to protect their data while it is in OUR possession? My days of not worrying about compliance were quickly coming to an end. The reality is that we need to benchmark our controls against our clients’ requirements and not against other law firms.
To date I am unaware whether these questionnaires have actually caused us to gain clients, keep existing clients, or lose clients. One reason I believe I don’t see a change quite yet is that we are at the “questionnaire” stage. I would assume that as we move into the “questionnaire followed by assessment” stage, information security may be able to provide a competitive advantage…obviously in conjunction with reputation and bill rates. One other open item will be how quickly we move from competitive advantage to doing the same as our IT friends. At what point will security be uniformly applied across companies such that it no longer offers a competitive advantage, and more importantly, when, if ever, will security be outsourced?
My prediction is that it will take some time, as security (good security) is lacking at too many organizations. Regulations and other industry standards (e.g., PCI DSS) are forcing companies to enhance security; however, I don’t think we have reached the stage of standard security controls being applied across companies. For now it appears to have a neutral effect, at least for law firms, being neither an advantage nor a disadvantage. Then again, we could be one big law firm breach away from that position shifting very quickly in favor of advantage.
Agree or disagree?
When looking for web graphing apps that could take multiple series of data and plot them on a graph suitable for embedding in a web page, I came across a simple, and free, Flash graphing application worth mentioning. Although my journey started with AJAX, I soon found the Flash-based graphs amCharts provides were more than sufficient. You can embed these into a website, or a security metrics site, while retaining control of the data. Data is formatted in XML (giving more control over the graph) or in CSV format. The simple-to-use settings files are also in XML, giving you a lot of control over the graph fonts, colors, and forms.
In fact, the most difficult part of using amCharts graphs is going to be pulling data in from the various data sources and updating the XML data files. While this is a work in progress, I thought it would be worth sharing this and some quick examples below. I should also mention that PHP scripts exist to pull data from a database and into the correct format. For more information see the amCharts site.
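As a sketch of that “pull data and update the XML file” step, here is a small script that queries a metrics table and writes the results out as an XML data file for the chart to load. The database schema and the XML element names are illustrative assumptions only; the exact format amCharts expects is documented on the amCharts site, so adjust the element names to match.

```python
# A minimal sketch: query a metrics table and regenerate the chart's XML data
# file. Table name, columns, and XML element names are illustrative.
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect("security_metrics.db")  # placeholder database
rows = conn.execute(
    "SELECT month, blocked_events FROM egress_blocks ORDER BY month"
).fetchall()

chart = ET.Element("chart")
series = ET.SubElement(chart, "series")   # category axis labels
graph = ET.SubElement(chart, "graph")     # one data series
for i, (month, blocked) in enumerate(rows):
    ET.SubElement(series, "value", xid=str(i)).text = str(month)
    ET.SubElement(graph, "value", xid=str(i)).text = str(blocked)

ET.ElementTree(chart).write("amchart_data.xml", encoding="utf-8", xml_declaration=True)
```

Run that on a schedule (cron, Task Scheduler) and the embedded graph stays current without anyone touching the XML by hand.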
The first chart below is from the amCharts site. Click and drag on a portion of the graph you would like to highlight and try out the scroll bar at the top of the graph. This is helpful for graphs that show trends, yet have granular underlying data that may be hidden. The second pie graph is my gift to Art…and fairly accurate I might add.
[Embedded amCharts examples: a zoomable line-style chart with a scroll bar, followed by a pie chart (Flash required)]
Another mention should go out to a fairly new site called Timetric. Timetric allows you to graph data by entering values manually, uploading a CSV or XLS file, or pulling data through an API. I did try this out and had some issues formatting and uploading the data to produce graphs. It should also be obvious that these graphs are stored on a third-party server…something to keep in mind should you plan on using this for any data deemed sensitive. Here is a quick example from the site.