
Does “Research” Terminology Reduce Adoption Rates?

What is your reaction to this tweet?

In the drive to “do something,” many applaud this as a reasonable step. I think it actually might harm our efforts and slow our progress.

Words matter.

Does the use of the term “research” reduce adoption rates compared to using the term QA or QC?

What is wrong with the term security research? Why might QA or QC be a better selling point?

Consider how businesses handle “research” versus quality assurance/control. In most cases, businesses have a budget for quality work. They recognize the importance of producing to the level of quality expected in the marketplace.

The role of QA/QC is one of trust: partnering together to produce a better product, protecting the company while growing the bottom line.

Research is a confusing concept. It harkens back to grade school papers, college projects, or huge corporate investments. And in the corporate world, research is tightly controlled and fraught with failure. The hope is that a small amount of success makes up the difference.

Research is about the future. Quality is about the current state.

Confusing the opportunity: security research

Security research is not well understood. Not even within the “research” community – bug bounty programs refer to their testers as “researchers”, “bounty hunters”, etc. Combining two expensive, confusing terms creates additional barriers and hurdles.

Where does it fall within the budget? Is it a security item or an application item?

Does this make security testing or research bad? No. It highlights the fact that when working with an organization, perception matters.

When you approach an organization about security testing and approval, are they more apt to go with something that sounds familiar, whose value they understand, and that fits their model, or with an option that is often used interchangeably with “hacker” and whose value they don’t really understand? You hear all the time how different groups need to speak the language of their consumer. While I am not a fan of the idea of all these different languages, I do think that using terminology that is familiar to the consumer provides a better connection and opportunity.

In this case, you are selling testing services. These are QA/QC services to offset the internal testing they are doing, while adding a specific focus on a limited classification of bugs. Would changing our terminology change the adoption rate?

I would love to hear others’ opinions on how the choice of terminology affects adoption rates.

Hacking Cars: Taken Seriously?

Watch an ad for a new vehicle and you are bound to see how connected it is to our lives. Gone are the days when your vehicle was just a stand-alone product. Now we are seeing cars with internet connectivity. We are moving past simple satellite radio or GPS systems and becoming connected to a lot of data. Security folks have been talking about vehicle security for a while now, and a few researchers have been focusing on showing just how serious the security issues in these vehicles are.

Today, Wired released a story, “Hackers Remotely Kill a Jeep on the Highway – With Me In It” (http://www.wired.com/2015/07/hackers-remotely-kill-jeep-highway/), describing how a Jeep was remotely controlled from a laptop 10 miles away. For the full details, check out the article at that link. Once the story hit the airwaves, it received lots of attention, both good and bad.

Let’s start with the positive side of what was shown. It is possible to remotely breach a vehicle’s systems and then control many of its functions. These functions include the radio, wipers, temperature controls, transmission, and brakes, to name a few. It is a concern that this can be done without authorization. I certainly do not want my vehicle to be taken over while I am driving it, making it unsafe for myself and my family. The highlight: security is important for vehicles as they become more reliant on software and internet connectivity.

Rumor is that there is a patch for the vehicle to fix this issue. The issue we now have to address is how to efficiently and effectively get these patches onto the vehicles. At this point, bringing the vehicle into a dealership to have the software updated is the only real option.

The negative reception is where it gets interesting. They decided to do this experiment on a highway with other vehicles around, traveling at the speed limit (70 MPH). At one point the driver explains how he can’t see because the windshield wipers are going and the fluid is spraying. At another point, they cut out the transmission and the vehicle slows way down on a stretch of highway with no breakdown lane. That is a brief and probably insufficient summary; the point is that a lot of people are upset.

This type of testing in a public place puts the other drivers on that highway at risk. It is not much different than the plane hacking bonanza a few months ago (http://www.cnn.com/2015/05/17/us/fbi-hacker-flight-computer-systems/) that caused a huge backlash. It is one thing to look for security issues that may help make things safer, but it is critical that the testing of these theories is done in a controlled environment, not one that puts people at risk. They don’t test vehicle crash ratings on the highway; they do it in a secluded area where safety is a priority.

If you are going to research security issues, no matter what they are, it is critical to think about this type of thing before you just jump on in. While I understand that this type of stunt hacking is great for advertising an upcoming talk at your local hacker conference, it is not acceptable when it directly puts other people at risk. You want to hack a plane? Get an airline to get you into a hangar in a controlled environment. The other option: buy a plane to test on yourself. But don’t do it on a plane full of passengers at 30,000 feet. In this case, the researchers went out and acquired the vehicle and researched it in their own facilities. The issue arose when they did their testing on a highway and not on a closed course. Security research is walking a fine line, and it will take putting our best foot forward to push it in a positive direction. If all people see is the stunt hacking, they will lose sight of the real issue at hand and just see these stunts as reckless. It will have the opposite effect of the end goal: increasing security awareness and the security of these devices and products.

If you are in the market for a new vehicle, don’t be afraid to ask questions about the security of the vehicle’s communication systems. The more we dig as consumers, the more aware the manufacturers will be. At some point, promoting security as a feature will be critical to beating out the competition, ultimately forcing everyone to get on board. Be smart and stay safe.

Hacking Airplanes… Let’s Think About This

Recent news about airplane security, and whether or not someone took control of an airplane during a flight, is scattered across the web. There are lots of opinions on whether the in-flight entertainment systems and the airplane control systems are connected. I haven’t tested an airplane system, so I can’t say for sure, and it may be different depending on the type of plane. One glaring issue here is that we don’t know, and there are a lot of people who don’t know either while acting as if they do. Is airplane security a concern? Of course it is; what security isn’t a concern? What is the right approach to having it tested?

United Airlines recently announced a bug bounty program. For those that may not know, a bug bounty program is set up by a company to recognize or reward security testers for identifying security bugs in its applications. Some of the big names like Google, Facebook, and Twitter have been doing this for a while now. While not something everyone is prepared for, it can help identify some of the security bugs in your applications, although many of these flaws should be identified internally by developers and QA before release to production. Any average person can participate in most bug bounties, no skills required (we won’t dig into that for this piece).

What seems to be interesting with the United program, at least from what we see on Twitter, is that there is some concern that the airplane and in-flight systems are out of scope. This means that while you can test United’s external applications, they are NOT giving permission for anyone to test the airline systems during a flight. Airline security has been propelled into the spotlight recently with stories like “GAO: Newer aircraft vulnerable to hacking” and Chris Roberts tweeting about it on a plane and then getting questioned by authorities for hours upon arrival.

Does United have it right by banning hacking on the plane? But what about the children, you say? First off, without permission, you shouldn’t be security testing something that isn’t yours. I know there is lots of debate around this topic, but let’s just get the permission thing out of the way. I understand: if the systems are not safe, then the issue should be addressed. Many will tell you that the only way to know if it is safe is to have any Joe Blow out there firing away at it, and that if telling the airline about it doesn’t get them to fix it, then doing something a bit more rash is needed “for your safety.” Be prepared: when it comes to public disclosure of unpatched flaws with working exploits, “YOU” are the collateral damage.

Let’s get real here for just a moment and realize that things that happen on computers DO have real consequences. Messing around on a website that exposes sensitive information is bad enough, but to think that allowing anyone to attempt hacking a plane to look for security vulnerabilities at 30,000 feet is a good idea is just ludicrous. You are directly and immediately putting the lives of everyone on that plane at risk. Maybe you should take a vote to see who is OK with you attempting this. After events such as 9/11, I don’t think you want to announce you are hacking the plane… you may find yourself duct-taped to a chair and bruised up a bit for the remainder of the diverted flight.

In the professional world of security, when we want to test the security of something like this, we seek out the vendor and get a contract that outlines what testing will be done. Obviously this requires the vendor to agree to the contract and the testing. In this scenario, the testing would most likely be done in an airplane in a hangar at the airport, with no other passengers on board, not at 30,000 feet. If you are unable to get the vendor to commit to a contract for testing, then hopefully making people aware of the potential issue and the risks they assume by using that vendor could be enough to force the vendor into it. In our market, when people stop using a service, the vendor starts listening.

In the case of United, and hopefully any other airline that decides to open a bug bounty, I think they are making a good decision in not opening up a bounty on the airline systems. Of course these systems are critical, especially since they keep the plane safe in the air, but we need to make intelligent decisions about how things get tested. This decision by United does not seem to be an attempt to silence “researchers” about potential security vulnerabilities in the airplane. This is a move to keep people safe during a flight. We have ways to test, as mentioned earlier: under contract, in a controlled environment. We don’t have to do it in the air with other passengers. It is also a smart decision not to open a bug bounty on those systems, because with critical systems like this you want to ensure that only trained experts are assessing them: someone who understands the fragility of the environment, the way it works, and the things that shouldn’t be done. If you start letting John in 34C, who just learned what Metasploit is, fire exploits at a system ad hoc, you are asking for a world of hurt.

If you really want to test the security of an airplane and its flight controls, pony up and buy a plane to do the testing. We see this with the guys that are testing the security of cars. They get funded or pay out of their own pockets to get a vehicle whose security they can test. Look at some of what they have done; it doesn’t always go as planned. They are not hopping on a city bus and hacking it. They are not hopping on a train and attempting to hack it. They are doing their best to create a controlled environment where they can test safely.

Everything has security issues. There will never be a time when we don’t have some security issue still around in a system. We should be glad that, due to recent events, the airlines have not banned electronic devices on airplanes… yet. If we keep making decisions that put people at risk with this type of “research,” we will probably really learn what “chilling security research” means.

QA and Security Pt. 1: What QA Are You Talking About?

There has been a lot of chatter recently regarding QA and the role it plays in security testing. Hashing this out over Twitter with 140 characters per post just makes things more confusing. While I think that Twitter helps start some of the most important discussions, there is a need for other mediums to expand on the topic. In this series of posts, I want to take the opportunity to bring up some of these topics and hopefully expand on them a little bit. We all have our opinions and we will certainly not all agree on everything; however, these are conversations we need to have.

What do you think of when you hear Quality Assurance (QA) mentioned in a conversation? Is it just a process that documents or processes go through to make sure they are sound? Do you think of that group of people that test applications to make sure they function as they should? It is important for everyone in the discussion to understand the context in which the term QA is being used to ensure everyone is on the same page.

As mentioned above, there are a few different things we think of when we hear the term QA. The one that comes to mind most for me is the team that tests applications or software to ensure they are working properly. Most likely I go this route because I come from a lengthy development background and that is the QA I mostly dealt with. This QA team is responsible for identifying bugs in the system, documenting them, and getting them back to the developers to resolve. This team is engaged after software is written (well, during the iterative development process) and before it is released to the production environment. Even this group can be very different between companies, which I will talk about in future posts in this series.
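To make that distinction a bit more concrete, here is a minimal sketch in Python. The function and the tests are hypothetical, made up purely for illustration and not taken from any real project. It shows the kind of check a functional QA test makes versus the kind of abuse case a security-focused test looks for; the security test is written to fail against this intentionally flawed function, which is exactly the kind of gap this series will explore.

# Hypothetical example: functional QA checks vs. a security-focused check.
# Run with: python -m unittest <this file name>
import unittest


def build_greeting(username: str) -> str:
    """Hypothetical app code: builds an HTML greeting for a profile page."""
    # The username is inserted into HTML without encoding. A functional test
    # will not notice this, but a security-minded test should.
    return f"<p>Welcome back, {username}!</p>"


class FunctionalQATests(unittest.TestCase):
    """What a QA team typically verifies: the feature works as specified."""

    def test_greeting_contains_username(self):
        self.assertIn("alice", build_greeting("alice"))

    def test_greeting_is_wrapped_in_paragraph(self):
        self.assertTrue(build_greeting("alice").startswith("<p>"))


class SecurityFocusedTests(unittest.TestCase):
    """The extra class of bugs a security tester adds: abuse cases."""

    def test_username_is_not_treated_as_markup(self):
        # A malicious username should not survive as executable markup.
        # This test fails against the flawed function above, on purpose.
        rendered = build_greeting("<script>alert(1)</script>")
        self.assertNotIn("<script>", rendered)


if __name__ == "__main__":
    unittest.main()

Both functional tests pass while the security test fails, even though all three exercise the same piece of code. That is the gap between “does it work?” and “can it be abused?” that the rest of this series digs into.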

Other people, when they hear about QA, think of a different group (most likely) that is responsible for reviewing documents and procedures to ensure accuracy. One example of this is sending requirements or design documents through a QA process to ensure they are correct.

Of course there are other types of QA, but those are the two we will focus on throughout this series. If you have other examples, please feel free to send them my way on Twitter (@jardinesoftware) or at james@jardinesoftware.com. The more we distinguish the context we are discussing, the more we can get done by cutting through the confusion. Defining our context is a critical first step.

In the rest of this series I will dive into the different types of QA and how they relate to security. What role can they play? What should they be able to do, and what should they not be able to find? Can QA find security flaws? Follow me on this short journey to see what we find out.