Posts Tagged ‘patternicity’

Patternicity, a term coined by Michael Shermer, a writer for Scientific American, is the tendency to find meaningful patterns in meaningless noise. When I read his article on patternicity, I immediately related it to the challenges we face with information access.

Patternicity deals with false positives, and search tools present a comparable problem: too many results that may or may not be what we are looking for. Human patternicity is meant to err on the side of caution because, as Shermer points out, “the cost of believing that the rustle in the grass is a dangerous predator when it is just the wind is relatively low compared with the opposite. Thus, there would have been a beneficial selection for believing that most patterns are real.”

Digital patternicity is also meant to err on the side of caution, because the cost of believing that a keyword match reflects your intent is relatively low compared with missing something relevant. Returning a false positive is therefore better than returning a false negative.
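The tradeoff can be sketched in code. This is a minimal, hypothetical example (the documents, query, and matching rules are invented for illustration): a lenient matcher that errs toward false positives, like the search tools described here, versus a strict matcher that risks false negatives.

```python
# Hypothetical toy corpus: doc 1 is the only truly relevant result.
docs = {
    1: "predator seen moving in the tall grass",
    2: "wind rustling the grass at dusk",
    3: "stock market predators and hostile takeovers",
}

def lenient_match(query, text):
    # Match if ANY query word appears -- errs toward false positives,
    # so a relevant document is almost never missed.
    return any(word in text for word in query.split())

def strict_match(query, text):
    # Match only if EVERY query word appears -- fewer false positives,
    # but a relevant document phrased differently would be missed.
    return all(word in text for word in query.split())

query = "predator grass"
lenient_hits = {i for i, t in docs.items() if lenient_match(query, t)}
strict_hits = {i for i, t in docs.items() if strict_match(query, t)}

print(sorted(lenient_hits))  # [1, 2, 3] -- two false positives
print(sorted(strict_hits))   # [1] -- no false positives
```

The lenient matcher returns everything that might be relevant, at the cost of noise; the strict matcher returns only clean hits, at the cost of potentially missing what you wanted.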

The problem with both human and digital patternicity is that the underlying algorithms are limited and have stopped evolving because they have not needed to improve. Human beings are highly successful and do not require more sophisticated methods for reducing false positives. Likewise, search companies like Google are very successful and have built huge businesses in spite of the number of false positives they return.

However, increasingly within the world of business, where information equates to revenue, competitive advantage, and market growth, false positives carry a heavy price, and a shift in the evolution of digital patternicity must occur. There will always be a place for acceptable false positives in the mass market, but when you reach specialization, when the stakes become too high, when survival is at risk, then evolution adapts aggressively.
