This would make users more knowledgeable, censorware that doesn't behave properly would have to improve to compete, and we'd help create an environment in which PICS tags on member pages are no longer needed!
Could Mensa be held accountable for incorrect actions taken by faulty censorware? I'm sure some would try, but if a child were to view an inappropriate page because the *censorware* did an illogical thing... Well, it's something to think about.
Searching for bad words was a good idea when there were no rating systems and no rated pages, but now it's outdated. Except maybe as an enhancement: if a page fails the badword test, block it even if it does claim to be a good guy via its ratings. (Of course, we all know the problems associated with badword processing.)
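Roughly, and only as a sketch (picsRating(), ratingThreshold, and containsBadWords() are hypothetical helpers, and picsRating() is assumed to return NULL for a page that carries no PICS label), the rating check still gates access, and a failed badword test can only veto a page, never enable one:

#include <stddef.h>

extern const int *picsRating(const char *url);   /* hypothetical: NULL if the page carries no PICS label */
extern int containsBadWords(const char *url);    /* hypothetical: nonzero if the badword scan fails */
extern int ratingThreshold;

int acceptPage(const char *url)
{
    const int *rating = picsRating(url);
    if (rating == NULL || *rating >= ratingThreshold)
        return 0;    /* unrated, or rated too strong: block it */
    if (containsBadWords(url))
        return 0;    /* rated acceptable, but the badword test still vetoes it */
    return 1;        /* carries a good rating and passes the badword test: allow it */
}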
If your goal is to protect the children, then you've got to assume that a page is guilty until proven innocent. Would you let somebody not known to be bad take your child away for a trip to the mall? No! That "somebody" must be known to be good!
I've described a mechanism that would work pretty well. Combine that with severe penalties for falsely enabling inappropriate pages, and you've got a winner.
But I wouldn't. It's not good enough. For example, how is the BlackList generated? People complained. Why did they complain? Because somebody saw something they shouldn't have. Ergo, the software failed. If I buy censorware, it's because I don't want my kids seeing that stuff, and I don't care what the disclaimer says, they better not see that stuff! Particularly when it's so simple to do it right. If you try this instead, they won't see that stuff...
/* accept a page only if it carries a PICS rating, and only if that rating is under the threshold */
accept = (picsRating(url) != NULL) && (picsRating(url) < ratingThreshold);
Much cleaner, and faster, and the software is not going to fail. The PICS tag might lie, but that's not the software's fault.
I disagree. What would you think of Air Traffic Control software that was *designed* to *usually* function properly? And whenever it fails (BOOM!) we just add that misbehavior to the List Of Things To Not Do.
But would you accept somebody else's test procedures when you *know* that they have failed in the past? And that instead of fixing their test procedures, they merely added those badnames to the BlackList?
That's a problem, and a big one, for the vendor. But it's irrelevant to good program design.
I'll say it again. The customer buys censorware to prevent the children from seeing bad stuff. The only way for screening software to guarantee this is to show only stuff known to be good. Simple, no?
The current climate is the vendor's problem.
Did you complain to the vendor? Did you suggest to her usual sites that they rate their pages? If so, why haven't they? Did you ask for your money back?
I said...
...to which Mr. A replied...
...to which I reply...
They can't. Because they do not make good software!
Really, it comes down to this. They could have done it right, but the software would have been difficult to sell because, initially, there would be so few sites with access enabled. Or they could flip their logic over, throw in some badword processing and a blacklist table, and they'd have a faulty product, but one which could function even without many sites enabled. That was a marketing decision, not an engineering decision. And for business purposes (short term, quick return on investment) a wise one. But it's still a poor design, and it's not what the customer wants.
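For contrast, the flipped logic looks roughly like this (onBlackList() and containsBadWords() are again hypothetical helpers). Every page is presumed innocent unless one of the two tests happens to catch it, so anything the tests miss sails right through:

extern int onBlackList(const char *url);        /* hypothetical: nonzero if somebody already complained about it */
extern int containsBadWords(const char *url);   /* hypothetical: nonzero if the badword scan fails */

int acceptPage(const char *url)
{
    if (onBlackList(url))
        return 0;    /* caught, after the fact, by a complaint */
    if (containsBadWords(url))
        return 0;    /* caught by the heuristic scan */
    return 1;        /* everything else sails through, including bad pages both tests missed */
}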
Honestly, if you could choose between screening software that *usually* blocks the bad stuff and screening software that *always* blocks the bad stuff, *and they both always let the good stuff through*, which would you choose? The latter, naturally. That's what you really want.
You object because the software that's available now can't satisfy the *always let the good stuff through* criterion, and that's because all good pages would have to be rated, and they're not. That's another challenge for the vendor, who then needs to convince people to rate their good pages. As time goes on, more and more pages would be rated, and the software would become more attractive. A better product and a long-term strategy.
1) I'm a programmer... it's an uncontrollable urge.
2) Because the Right Thing is for a PICS tag to be an Enabling Mechanism. If the software does the Right Thing, then one doesn't need to rate a page except to enable access; and if the software does the Wrong Thing, then it is the fault of the software, not Mensa, and we have no liability when faulty software fails to block things that should be blocked. Thus, the PICS requirement is unnecessary.