http://www.washingtonpost.com/wp-dyn...-2003Sep8.html
IMHO, I trust the system more than the screeners today. As of now, it's too discretionary. At least it's going to attempt to bring some sort of reason to the delay.
In theory, it would make it a little less likely for someone to be held just for having dark skin, or just for being young (as many between 18-25 are held and searched extensively for drug trafficking, especially coming from Europe).
When you look at the overall picture, it should change perspectives a bit.
Although I wonder where that percentage of those flagged yellow comes from, since I see the following percentages of our population:
black 12.9%
other 4% (2000)
-----------------
16.9% will be Yellow
Still, I trust the system more than the individuals. Screeners who make these decisions don't get thorough background checks for any sort of bigotry or bias. I saw one while going on vacation this summer who was pulling all the old people to the side, each getting a good second check. Guess that screener got caught behind an elderly driver on the way to the airport.
Discretion is a bit silly, especially when it's based on nothing more than personal prejudice.
At least now, we are moving towards using a person's background and information.
I've said all along that the next 9/11-like attack will be done by someone who doesn't appear to be Middle Eastern... and I still stand by that belief. There were many "Johnny Taliban"-like guys unaccounted for in the region (to be fair, same in any part of the world)... it would be the easiest way to pull it off. A middle-aged white guy would be somewhat likely to slip right by without a second look.
Perhaps with such a system in place, it would be a little tougher for the next attack to take place.
IMHO, background info is the best detection method we have at the moment. It's better than just deciding based on the look of someone's face. It's the pretty boys who always end up being serial rapists.
Perhaps the TSA can implement a Bayes filter to enhance the quality? Perhaps borrow SpamAssassin's implementation, as it works very well:
http://cvs.sourceforge.net/cgi-bin/v...assin/Bayes.pm
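For the curious, here's a minimal sketch of the kind of per-token Bayesian scoring that SpamAssassin's Bayes.pm does, applied to hypothetical passenger attributes instead of email words. All the token names and counts below are invented for illustration — a real system would train on historical flagged/cleared records, and SpamAssassin itself uses a fancier chi-square combining step than the simple Graham-style formula shown here:

```python
# Sketch of naive-Bayes scoring in the spirit of SpamAssassin's Bayes.pm.
# Hypothetical training counts: how often each token appeared in records
# that were ultimately "flagged" vs "cleared". All numbers are made up.
from math import prod

flagged_counts = {"one-way": 40, "cash-ticket": 30, "no-bags": 25}
cleared_counts = {"one-way": 10, "cash-ticket": 20, "no-bags": 15}
n_flagged, n_cleared = 100, 900

def token_prob(token):
    """Estimate P(flagged | token), smoothed so unseen tokens sit near 0.5."""
    f = flagged_counts.get(token, 0) / n_flagged
    c = cleared_counts.get(token, 0) / n_cleared
    return (f + 0.01) / (f + c + 0.02)

def score(tokens):
    """Combine the per-token probabilities with Graham's naive-Bayes formula:
    p1*p2*...*pn / (p1*...*pn + (1-p1)*...*(1-pn))."""
    ps = [token_prob(t) for t in tokens]
    num = prod(ps)
    return num / (num + prod(1 - p for p in ps))

# A passenger matching several "flagged" tokens scores near 1;
# a passenger with only unknown tokens scores a neutral 0.5.
print(score(["one-way", "cash-ticket"]))
print(score(["frequent-flyer"]))
```

The smoothing constants (0.01 / 0.02) just keep rare tokens from swinging the score to exactly 0 or 1, which is the same idea (if not the same numbers) as Bayes.pm's handling of low-count tokens.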
Seriously, I see this generating less bigoted waste. At least now the accusation is based on something worthwhile. These methods seem a bit more accurate than personal judgment (which is essentially based on nothing).
So I'm all for it. At least now an Indian family going on vacation from New York to LA isn't as likely to have to wait 7 hours while they are questioned, anal-probed, and processed because they have dark skin.
Some of this stuff was getting a bit ridiculous.
I trust computer science more than someone who isn't very educated and performs their job out of pure bigotry about race, gender, age, clothing, whatever.
I'm also under the impression that it's not a Microsoft product, and will be maintained, updated, and improved (including Bayes, if they could get it working) for as long as the product is in use, not just during its "life cycle".