Last week the New York Times featured a company known as Social Intelligence that, at least at first blush, seems to take the gathering of online personal information about prospective and current employees to an entirely new level. It conducts deep searches of the Internet for what it claims are examples of both positive and negative information about a current or prospective employee. It then provides a report to your current or prospective employer. “Negative examples,” it says, may be evidence of “racist remarks or activities, sexually explicit photos or videos, and illegal activity such as drug use.” “Positive examples,” it says, may consist of “charitable or volunteer efforts, participation in industry blogs, and external recognition.” Check it out at Social Intelligence HR.
It flatly claims that “federal and state protected class information is redacted from the reports we provide. Employers are only exposed to information that is job relevant and may legally be considered in the hiring process.” In other words, it says it removes information about race, gender, religion, disability, national origin, color and the like. But that seems hard to do. What if an employee donates money and time to the NAACP, Florida A & M University, and the African Methodist Episcopal Church? Or suppose an employee received awards from the Anti-Defamation League and the American Jewish Committee? While not all supporters or award-winners of these organizations are African-American or Jewish, might an employer draw that conclusion, rightly or wrongly? How will the company scrub its automated searches and reports of these kinds of protected-class identifiers?
In other words, doesn’t nearly every piece of information provide some clue to our age, our race, our gender – everything that Social Intelligence says it won’t include? What happened to the good old resume and application?
More importantly, what does it consider a “protected class”? That term includes much more than just race, gender and similar personal characteristics. Whistleblowers, for example, are a protected class, and there are dozens of state and federal whistleblower laws. How will the company ensure (without an army of lawyers familiar with every state and federal whistleblower law) that something in a report won’t finger an employee as a whistleblower? What about workers’ compensation claimants? Or employees who’ve taken leave to care for a cancer-stricken family member under the FMLA? If I post on a blog for families of cancer survivors, that could implicate a range of protected-class issues under the Americans With Disabilities Act, Family & Medical Leave Act, OSHA, and dozens and dozens of state laws.
The extraordinarily sophisticated use of data-mining techniques should put us all on high alert. Increases in processing power, hard drive space, and the connectivity between personal computers and the Internet have transformed unprecedented quantities of random digital data into usable intelligence on everyone. Data mining is widely used in military and law enforcement circles for profiling. It has now arrived in the workplace.
So, beware. And think ahead – far ahead. Computer analysis is exceedingly sophisticated now, and it will become more so. Assume that if you’re posting content online – anywhere – data-mining techniques will catch it and be able to link it to you. Think about not only your Facebook or LinkedIn profile, but also the online comment you left at a friend’s WordPress blog, the online letter to the editor of the Wyoming Tribune-Eagle, or a password-protected website. The Internet eliminates geographic and cost barriers to data collection. A data-mining company in New York can search for your activities in any state and any country with equal ease. Your names, your email addresses, your nicknames, your photos and the content of your postings may all identify you. In the future, other unique information, such as your IP address, might suddenly link you publicly to postings all over the web. Even the structure of written or spoken words – so-called linguistic fingerprints – might link your unique way of writing and talking to you, even in the complete absence of personally identifiable information. Facial recognition software used to be Star Wars stuff – now Facebook uses it. Google allows limited image searching, but has held back the release of true facial recognition searching out of privacy fears. That’s coming too. Today Google announced that it bought PittPatt, a facial recognition software company. Check it out at PittPatt. What might it mean to you if a company like Social Intelligence could assemble a report identifying websites containing photos of you?
Have you ever posted anything you don’t want linked to you?
Categories: Age Discrimination, Disability Discrimination, Discrimination, Ethnicity, National Origin & Color Discrimination, Gender Discrimination, Pregnancy Discrimination, Race Discrimination, Retaliation, Uncategorized