April 24, 2006

Clearing Up Some Misconceptions - Part I

A few days ago Atlanta's Hartsfield-Jackson International Airport was evacuated due to a computer glitch in TSA software.

Slashdot had an interesting comment thread on the story. I say interesting because I am surprised (and pleased) that there is so much ignorance in the general public regarding TSA security procedures. This is very good for security. However, there are problems with such rampant ignorance; it opens TSA up to far too much ridicule and scorn and causes misconceptions about what we, as TSA agents, actually do - misconceptions that always seem to be very negative and condescending.

So I am going to address some of the issues raised in the Slashdot comments. However, since much of the actual information is sensitive and is available only on a "need to know" basis, I will be very general in how I address this. Hopefully, though, this will help clear some misconceptions up.

There were four pages of comments on Slashdot, so this will take multiple posts to cover it all.

This incident raises the possibility of tampering with the software to either:

1. purposely display an image of a dangerous item where none exists, inciting a scare like the one witnessed Wednesday, disrupting thousands of lives and paralyzing a major terminal, or

2. display an image of an innocuous item instead of the actual image of the luggage containing a dangerous item, allowing terrorists to smuggle said items onto aircraft.

Actually, number one is exactly what is being done. However, this does not incite a scare, except in this instance, when the program failed to identify the image as a false one. I think number two would be next to impossible.

That's insane. Images to test their alertness, sure, but images of bombs? That's just plain crazy. All you're doing is desensitising them...

It's not crazy at all. We are looking for bombs. If we never see a bomb on the x-ray, we probably won't recognize one if one ever actually appears. This is valuable training. In a way it does desensitize us, I suppose, but that's a good thing. It helps prepare us for being in the moment when it may be real, much like military and police training.

...if screeners know this kind of thing is going to happen every so often and they see something suspicious, they may become a bit jaded after a while and assume it's a test, even if the indication doesn't appear.

Jaded is not really the proper word, but upon seeing a suspicious image, particularly of a bomb or gun, I'm sure almost all of us assume it's a test. That's far more likely than the alternative. However, if the indication doesn't appear, I guarantee none of us would assume it's a test. The software simply does not fail; that is why this evacuation happened when it did fail, for the first time that I'm aware of. Certainly there was no assumption of a test in this instance, and the TSA at the checkpoint apparently did everything exactly as they should.

How frequent are these "tests" given?

Without being too specific, it's random and can be adjusted. In my experience I have ranged between about 3 and 15 a day, depending on which checkpoint I work at.

What are the chances that they coincide with an actual suspicious device, which the screener would then assume was part of the "test" which happened to occur simultaneously?

The chances are fairly remote, in that we don't see a lot of real prohibited items other than lighters, so the odds of a real threat and a test image occurring in the same bag are slim. More to the point, the scenario in the latter half of this comment can't occur. The software sees to it that we don't confuse a test image with any actual items in the bag: we are reminded (like we would ever forget) with every test image to check the bag again for any real threats.
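To make the flow concrete, here is a toy sketch in Python of the kind of logic involved. To be clear: every name and number in it is my own invention for illustration. It is emphatically not the actual software, whose workings are need-to-know.

```python
import random

# Toy model of a threat-image-projection (TIP) style test cycle.
# Every name and number here is invented for illustration; this is
# NOT the actual TSA software, whose details are need-to-know.

TEST_PROBABILITY = 0.01  # hypothetical per-bag test rate, adjustable
THREAT_LIBRARY = ["wired mass", "gun outline", "blade cluster"]  # stand-ins

def operator_reviews(items):
    """Stand-in for the human screener: flag anything threat-like."""
    return any(item in THREAT_LIBRARY for item in items)

def scan_bag(real_items):
    """Run one bag through, possibly injecting a projected test threat."""
    is_test = random.random() < TEST_PROBABILITY
    shown = real_items + [random.choice(THREAT_LIBRARY)] if is_test else real_items

    flagged = operator_reviews(shown)

    if is_test:
        # The projected threat vanishes before the bag exits the tunnel;
        # the operator is told it was a test and must re-check the real image.
        print("TEST IMAGE - re-examine the actual bag")
        flagged = operator_reviews(real_items)

    return flagged  # True -> pull the bag for a physical search

print(scan_bag(["laptop", "shoes"]))
```

The point of the final re-check step is exactly what I said above: a test image never excuses you from looking at the real bag.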

Shouldn't said software be tested to something just shy of infinity?

I don't know how thoroughly it was tested, but any software is subject to bugs. This is the first time I've ever heard of the software doing this.

Isn't there an alternate verification process that doesn't involve computers?

Yes, and it was carried out in this case: we check the suspect baggage and call law enforcement to the checkpoint, and they determine whether to call a bomb disposal unit.

It makes it very easy to hold "suspicious" looking people with no evidence whatsoever. "Well, the machine showed a picture that looked like a bomb."

Need to borrow some tinfoil?

I wonder if this software uses the same "random" number generator that ensures only suspicious-looking people get "randomly" screened. That way suspicious-looking people are chosen to "randomly" have imaginary bombs in their luggage.

Do you really think only suspicious-looking people get "randomly" screened? We're almost always accused of the exact opposite, you know, screening grandma and baby but not Mohammed. I don't know how computer software with no visual capability could determine someone was suspicious enough to put an imaginary bomb in their bag, nor what it would accomplish. We don't treat people with imaginary bombs in their bags any differently than those with nothing in their bags at all. More tinfoil?

This incident raises the possibility of tampering with the software to either

3. Display "This is a test" right after Mr. Terrorist's luggage containing dangerous items has passed through the X-Ray machine.

Not how the software works. All fake images are removed from the bag before it comes out of the x-ray. If your "dangerous items" didn't disappear, it's not a test.

If there are severe consequences for the operator if they miss one of the test images, then I doubt they'll be desensitized. On the other hand, if there's no consequence for being a slacker, you'll see a group of operators huddling around the display laughing at the "fake" bomb image while a terrorist walks right on through.

Missing one image does not carry severe consequences; this is training, after all. I'm new to the checkpoint side of operations, actually, so I'm not fully versed in this yet. I know that on some x-rays the tests can be reviewed, but I'm not sure if that's universal across the different machines. However, we have similar computer-based tests away from the checkpoint as well, and those, I know, can be monitored. I have been told that poor enough results can lead to remedial training. And we are recertified every year; if you fail to pass recertification on the x-ray, you will likely be fired. No one would laugh at a "fake" bomb. We see too many of them for them to be funny. However, that dog I saw in the x-ray - that was kinda funny.

Yes, you would think in "system to positively identify bombs" the flowchart box labeled "automatically and without further inquiry disregard positive image of bomb" would raise a few eyebrows.

Again, not how the software works. In a test situation the positive image of a bomb would disappear; then you would disregard the image. If the image doesn't disappear, you have a situation.

More importantly: After enough false alarms, the screeners will more likely not react should a real bomb appear. "Oh well, surely just another software fault, just like the three we've had earlier this week. We'd better not scare our passengers again..."

This would be a legitimate worry if this were a common problem. However, like I've said, this is the first I've ever heard of this software glitch occurring.

...never underestimate the apathetic state of the government-hired drone. I know they say they're picking the best people for these jobs, but in my many recent trips I've discovered a "lack of urgency" in some screeners and an "I'm in charge" attitude in a few too many for my liking.

Pejorative aside, I have found that most TSA officers are not apathetic; quite the contrary. Yes, like in any group of people, some are apathetic or lack urgency, but it is not the norm. Also, many screeners are ex-military, ex-law enforcement, or current reserve/national guard, which could account for the "I'm in charge" attitude. And to be frank (since being Brian isn't always enough), when it comes to airport/airline security, we are in charge.

I assume they "cut in" these tests on the conveyor belt, meaning you see n+1 suitcases instead of n real ones. So if you see two suspicious devices and one "this is a test" message, you'll know that message doesn't cover both of them.

Not how the software works.

The TSA screeners' raises are based on how many hits/misses they get.

That is only just now becoming the case, and it is only one small factor in what goes into a raise.

Why put in images of bombs and such? Someone who isn't a screener eyeballing that would blow a gasket if they saw it.

TSA tries very hard to make it difficult for the public to see the x-ray monitors. While in many cases it's not possible to prevent a line of sight, we make sure you keep moving so that you can't just stare at the monitor for any extended time. Also, without being trained in what to look for, and given how far from the monitor you would be, the odds of you picking out anything in an image are slim. At best, you may be able to recognize a simple gun image, but I doubt you would recognize anything else.

How about pictures of assorted dildos/vibrators? No, I'm serious. That'll catch your eye, male or female. Or a very carefully and perfectly laid out bra or panty?

Most people don't put these items in their carry-on, but in the event they did, do you really want us to be looking for adult novelty items rather than things that can kill you? As for bras and panties, x-rays go right through clothing; we would never see them.

This is training, you WANT people to see these things. You WANT them to have experience reacting to stuff they think is real. How do you expect them to identify bombs in suitcases if they've never seen examples, especially in real-world situations? Watching films in a classroom is nice and all, but not real enough.

This is one of the most intelligent comments in this thread. Sadly, it has very little competition.

The TSA funds fundamental research in sustaining human performance in search tasks to ensure that these baggage screeners are performing well.

One thing that has been found is that the human brain cannot keep searching efficiently for something that never appears; you just tend to zone out. We're not robots after all, and searching day in and day out for a one-in-a-million event that may not occur for months or years is not a task we're equipped to do.

Giving the visual system periodic targets keeps it frosty. So some kind of periodic fake bomb is necessary.

Now you can do this in two ways: with real fake bombs, or images of bombs. One of these options is going to cost about 100 times as much to implement as the other, and at the end of the day, if properly implemented, both will serve the same purpose. It all comes down to how much security we can get for our dollar, and paying actors to play dress-up terrorists and slip fake bombs through the baggage system is hugely inefficient compared to a software solution.

I don't know the accuracy of the cost comparisons, but TSA does both! Excellent comment.
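The "periodic targets keep the visual system frosty" idea is easy to illustrate with a toy simulation. The model and every number below are made up purely to show the shape of the effect; they are not drawn from Wolfe's papers or from any TSA data.

```python
import random

# Toy simulation of the "low-prevalence effect": a searcher's hit rate
# sags when targets almost never appear. All numbers are invented for
# illustration; none are drawn from Wolfe's actual data.

def simulate(bags, real_rate, tip_rate, seed=0):
    rng = random.Random(seed)
    hits = targets = 0
    attention = 1.0
    for _ in range(bags):
        target = rng.random() < real_rate or rng.random() < tip_rate
        if target:
            targets += 1
            if rng.random() < attention:
                hits += 1
            attention = 1.0  # spotting a target re-engages the searcher
        else:
            attention = max(0.5, attention - 0.001)  # vigilance slowly decays
    return hits / targets

print("no test images:  ", simulate(100_000, real_rate=0.001, tip_rate=0.0))
print("with test images:", simulate(100_000, real_rate=0.001, tip_rate=0.01))
```

Run it and the hit rate without test images sags toward the vigilance floor, while the injected images keep the searcher near full attention. Crude, but that is the basic argument for projecting test images in the first place.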

You're saying that in order to train someone to react to an extreme situation you have to constantly bombard them with false examples of that situation?

It's one thing to learn to identify a bomb on an X-ray machine. It's quite another to have them randomly flash the image through when you're actually doing the work, and then a "just kidding" message.

Hell that's like always training with live ammo.

First, it's not a constant bombardment. Second, while bombs have common components, they can be assembled in a variety of ways, and the components can vary greatly in appearance. I know what components make up a bomb, yet I still miss some of the test images. Why? Because those components don't always look the same. Bombs are limited in appearance only by the imagination. Not to mention, some bomb components are common items that we see in the majority of normal bags.

A test is much different from a joke or prank.

Actually, it's more like sometimes training with live ammo, which the military does. It would be worse to never train with live ammo. When the real bullets start flying, you want to be as prepared as possible.

While the baggage screeners might not know when random tests are run, their supervisors damn well should. If baggage inspection is a real time operation it'd be tragic if a "test" image with a fake bomb appeared over baggage with a real bomb. While the screeners are in the dark as to when the tests are run, the security system itself should clearly know when the tests are run.

The supervisors don't know either. These tests are random, computer-generated tests. They happen all day long and are not small in number. There is no need or any practical way for the supervisor to know. As I stated before, the fake bomb on a real bomb scenario is accounted for and is not a problem.

Hey, here's an idea. Cut some metal words out of old scrap metal and make the phrase "This is a test" and put it inside your luggage. I wonder what kinds of things you could get through the screening system.

This idea comes up many times in the comments using metal or lead. First of all, lead (and the metal, if it were dense enough) would be impenetrable to x-rays. Since we could not see what was under the words, we would check the bag. Second, even if you used thin metal and I could see through it, the words would make me look at the bag especially closely. Even if I didn't see anything, I would still probably alert my supervisor, who I'm sure would have many questions to ask about why you did that. DHS might have questions too. Third, the software doesn't work like that, so we would know it's not the kind of test we are discussing.
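To put a rough number on why lead is a non-starter: x-ray transmission through a material follows the Beer-Lambert law, I = I0 * exp(-mu * x). Here is a back-of-the-envelope calculation using a ballpark textbook attenuation value for lead, which is an approximation of mine and not the spec of any screening machine:

```python
import math

# Back-of-the-envelope Beer-Lambert attenuation: I = I0 * exp(-mu * x).
# mu ~ 60 per cm is a rough textbook value for lead at ~100 keV, not a
# spec for any particular screening machine.

MU_LEAD_PER_CM = 60.0

for thickness_mm in (0.5, 1.0, 2.0):
    transmitted = math.exp(-MU_LEAD_PER_CM * thickness_mm / 10)
    print(f"{thickness_mm} mm of lead passes {transmitted:.2%} of the beam")
```

Even half a millimeter of lead passes only a few percent of the beam, and a full millimeter is effectively opaque. An opaque region we can't resolve means the bag gets checked, every time.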

How do we really know a bomb did not get on a plane?

Let's see, no plane exploded? Mind you, this comment was posted two days after the event in question.

Guess we will have to wait a few days to see if one goes down.

How many flights have you been on that lasted a few days?

they've got devices coming out for cars and trucks that test driver awareness far more subtly than just popping up a test picture at random... the software actually monitors the driver's eye movements and other parameters... so there shouldn't be anything stopping them from doing something similar for this x-ray scanner application...

The software you speak of is to make sure drivers keep their eyes on the road. Just because I'm looking at the screen doesn't mean I'm seeing or paying attention to it. The tests in question require you to pay attention to what's on the screen.

Then again, perhaps it would be better to dump the human out of the loop altogether and rely on AI to determine if an item of luggage warrants further attention... but these days it's still cheaper to use people to do it and pay them peanuts at the same time...

Interestingly, I've been moved over from the baggage side of things to the passenger side because this is exactly what happened. My airport has implemented an in-line baggage system that greatly reduces the need for baggage screeners. The baggage side has software that does help to determine if an item of luggage needs further attention. Of course, the human element is still needed to inspect that luggage. I believe they are in the process of designing checkpoint x-rays that are more similar to the ones used for checked bags that would help determine if luggage requires more attention, thus supplementing the human element. It probably is cheaper to use people for this, but like I said, they spent the money on the baggage side, costing me my much-preferred job there. And, I wish I made more money, but I hardly get paid peanuts.

"What brainiac thought this one up?"

Jeremy Wolfe, possibly the world's foremost expert on human performance in visual search tasks, did.

You can read about his research on his publications page here.

http://search.bwh.harvard.edu/recent_publications.htm

Definitely a link worth checking out. I have read a couple of articles Wolfe has written and find them very interesting. At some point, they'll merit a more in-depth blog post.

To be continued...

Posted by Brian at 11:39 PM | Comments (2) | TrackBack

July 14, 2005

TSA Log #1

CONSTANT VIGILANCE!

Last Sunday, a traveler at the airport who worked for the FAA noticed an unattended bag at curbside and alerted TSA to its presence. Excellent! A short time later, I learned one of the Skycaps had seen the shuttle bus driver set it there (presumably mistaking it for the luggage of one of the people getting off) and drive away, leaving it behind. He did not inform us. Not good! Anyway, I watched the bag while one of my co-workers called upstairs to our supervisor, who contacted DPS.

Now, what troubled me was the behavior of passersby in the 20 minutes or so it took DPS to respond. Only twice did anyone seem to take overt notice of the bag and look to tell someone, and one of those times it was fellow TSA employees. I can understand no one taking note when there were vehicles or people next to or near the bag. It could easily have belonged to those people. But there were several times when nobody was near that bag, when I was the closest one to it, at least 20 feet away. And invariably cars would pull up and park right next to it, or people would walk right by it, without ever seeming to notice it, even though it was so conspicuously alone.

Keep in mind, this was just three days after the attacks in London!

Folks, this is not good! Please, please, be more vigilant. That innocuous bag may just contain a bomb. Especially be wary in transportation facilities and around large groups of people. And if you see something suspicious, please report it.

Posted by Brian at 12:37 AM | TrackBack

June 27, 2005

TSA Thoughts

I have hesitated to mention that I work for the Transportation Security Administration (TSA) for a variety of reasons, but too many things happen not to blog about them. So from now on I will begin blogging random thoughts I have about the job as events warrant.

In the meantime, if you have any questions regarding the TSA, please feel free to ask.

Posted by Brian at 11:10 PM | Comments (3) | TrackBack