You Can Spot A Narcissist From This Facial Feature, According To New Study

Aside from suspiciously white teeth and an ungodly number of selfies on their phone, is there a way to spot a narcissist? According to a new study, look no further than their eyebrows.

New research from the University of Toronto, published in the Journal of Personality, has suggested that people with “distinctive eyebrows” are more likely to display narcissistic personality traits.

The word “narcissist” comes from the ancient Greek story of young Narcissus, who fell in love with his own image reflected in a pool of water. In general, people with strong narcissistic personality traits score very highly on the self-loving spectrum. They often make good first impressions, appearing to be likable and charming, but they also exhibit self-centered and selfish behavior, often with a grandiose view of their own abilities or appearance.

For reasons that remain unclear, they also tend to have distinctive eyebrows.

The team of researchers came to this conclusion by photographing almost 40 undergrad students with neutral expressions. They then got the students to carry out a psychological test known as the Narcissistic Personality Inventory to test the strength of any narcissistic traits. The researchers showed the photographs to other participants and asked them to guess how narcissistic each person was based on how they looked.

Their initial results showed that participants relied particularly on eyebrows when estimating the students’ levels of narcissism. Eyebrow thickness and a high density of hair, in particular, were the features most likely to serve as an accurate gauge of narcissism.

They expanded on this by measuring how perceptions of narcissism changed when narcissists’ and non-narcissists’ eyebrows were swapped between faces. Participants rated narcissists’ faces as less narcissistic when they donned non-narcissists’ eyebrows, and vice versa.

They concluded that this shows “distinctive eyebrows reveal narcissists’ personality to others,” as well as strongly influencing whether people view you as narcissistic.

So, why could this be the case? The researchers didn’t look for a mechanism to explain this link, but they note eyebrows are highly important for social functions and nonverbal communication, so we have an especially acute sense for them. Furthermore, the eyebrow can be used as a microcosm of a person’s wider appearance and identity.

Narcissists seek to be admired, so they maintain a high level of grooming. “Individuals reporting high levels of narcissism tend to wear more fashionable, stylish, and expensive clothing; have a neater, more organized appearance; and look more attractive,” the study authors write.

Eyebrows are also very important for facial recognition and mate selection (in both females and males), and a pair of meticulously well-kept eyebrows suggests the owner knows this.



Intentions, not moods: Intriguing study suggests that we use our facial expressions to get our way, not to display our feelings


(Natural News)
The reason behind that smile may not be that the person is happy. Instead, the person may be using the smile to get their way. A new study suggests that humans use facial expressions to get their way, not to show their feelings.

Researchers at the University of California, Santa Barbara and De Montfort University aimed to better understand human facial displays and to restore continuity with modern views of animal communication. The study also challenges the earlier, widely held belief that facial expressions exist to display people’s emotions.

Researcher Alan Fridlund, an associate professor in the Department of Psychology and Brain Sciences at UC Santa Barbara, said that facial expressions mainly come from intentions and not feelings.

“Our faces are not about us, but about where we want a social interaction to go,” Fridlund, who is also a social and clinical psychologist, said. “For example, the ‘cry’ face is usually considered an expression of sadness, but we use that face to solicit succor, whether that means reassurance, words of comfort or just a hug.”

In the past years, biologists re-assessed the way animals communicate and started to perceive them as sophisticated communicators and negotiators. Fridlund said that humans’ facial expressions serve the same ends. (Related: Monkeys discovered to use complex grammatical structures in their language.)

“There is no doubt that what we do with our facial displays is different than what nonhumans do,” Fridlund said. “But our displays function in many of the same ways. They act as social tools in behavioral negotiation.”


The study also drew on the research of Carlos Crivelli, co-author of the study and a lecturer at De Montfort University in Leicester, England. Crivelli’s research focused on how indigenous Trobriand Islanders in Papua New Guinea perceive emotion and use facial expressions. In this work, he and his team discovered that for the Trobrianders, the supposedly universal face of fear actually functions as a threat display, used to frighten others into submission.

Fridlund added that researchers in the 1960s had preconceived notions that specific expressions equated to particular emotions, and as a result their studies were designed to confirm those beliefs. Many newer studies, by contrast, have analyzed the associations between facial expressions and emotions and found little evidence of a relationship between the two.

For example, an “angry” face does not necessarily mean the person is angry. Instead, the person may be feeling frustrated, hurt, or constipated. Either way, the “angry” face is meant to intimidate or signal retaliation against whomever it is directed at, regardless of the feelings behind it.

The researchers concluded that facial expressions are used as social tools for social influence. They are not about emotions, but about “changing the behavior of those around us.” The findings of the study were published in the journal Trends in Cognitive Sciences.

Facial expressions and social interaction

A study published in the journal Philosophical Transactions of the Royal Society B: Biological Sciences found that facial expressions influence social interaction. The study focused on how a behavior can develop into a sophisticated communication system. Take the facial display of fear: it has direct behavioral advantages for the person making it, since widening the eyes enlarges the visual field and increases the likelihood of detecting signs of danger. The expression also serves as public information, which observers can interpret as a signal to be alert. In the next step, the actor gains control over sending a signal that was previously conveyed unintentionally. Eventually, both actor and receiver become conscious of the exchange of signals, which can then be used for intentional communication. At this stage, a true communication system begins to develop.


Bezos the SPY master: Amazon now powering facial recognition surveillance technology for police


(Natural News)
It’s time for Americans to stop what they’re doing, have a look around, and ask themselves the following question: Are we still living in the United States of America, or are we living in some kind of science fiction novel where the government knows more about us than we’d like them to know? At times, the difference between the two isn’t immediately obvious.

Take Amazon, for example, which has recently decided to market a powerful face recognition tool called “Rekognition” to police. This technology would help law enforcement identify and track individuals at their convenience, even if those individuals are not involved in crimes. As you probably can imagine, this has many constitutionalists and privacy advocates extremely concerned.

Even though Rekognition officially launched in late 2016, it is still unclear how many law enforcement agencies have actually purchased the tool and are using it on the job. Since launch, Amazon has added features that allow Rekognition to identify individuals in videos and follow their movements almost instantaneously.

Earlier this month, a number of privacy groups, including the American Civil Liberties Union, urged Amazon to stop selling this face recognition technology to government agencies, arguing that they could use Rekognition to “easily build a system to automate the identification and tracking of anyone.” These privacy advocates also noted that such technology could have an even harsher impact on minorities, illegal immigrants, and political protesters, who they say are already arrested at disproportionate rates.


“People should be free to walk down the street without being watched by the government,” the groups wrote on Tuesday in an official letter to Amazon. “Facial recognition in American communities threatens this freedom.” (Related: Here is a list of all the ways in which the federal government is spying on you.)

How much privacy do we still have?

While it goes without saying that our police and law enforcement agencies should be equipped with the tools they need to better serve our communities, there is a very fine line between technology that keeps us safe and technology that infringes upon our rights. As it stands right now, surveillance technology seems to be going to the extreme, and as a result, personal information is being collected without our knowledge or consent.

Back in January, the Waking Times reported on new technology being developed by researchers at the Massachusetts Institute of Technology that is capable of accurately reading a person’s concealed emotions from a distance. That means that regardless of whether you feel happy, sad or angry, and regardless of how well you are trying to conceal it, this new device will be able to see through even the best poker face and read you like a book. (Related: Yes, your smart TV is really spying on you and collecting your personal information.)

According to the researchers, their device, which is officially called the “EQ-Radio,” is accurate 87 percent of the time at detecting individuals’ emotions, and what’s even more frightening is that it does not even need to be directly linked to a person’s pulse and body to operate. Rather, the EQ-Radio works by sending WiFi signals that bounce off a person to read things like their heart rates and other relevant information that helps it to determine how the person is feeling.

For obvious reasons, and like the Rekognition device, there is a lot of potential for abuse through the use of EQ-Radio. Companies could set them up outside of their shops, for example, in order to gather information on how customers feel both entering and exiting their store. If we’ve reached a point in our nation’s history where even our emotions are no longer private, then can we really continue calling ourselves a constitutional republic?


New York Schools To Install Facial Recognition Tech Used By Police and Military

Amid the highly publicized spate of mass shootings in America, one public school district is set to impose unprecedented police state tactics to ensure “security” and safety for students.

Starting next school year, schools in New York’s Lockport district will be equipped not only with bulletproof glass and surveillance cameras but also with facial recognition technology used by police forces and military units.

“We always have to be on our guard. We can’t let our guard down,” Lockport Superintendent Michelle T. Bradley said. “That’s the world that we’re living in. Times have changed. For the Board of Education and the Lockport City School District, this is the No. 1 priority: school security.”

“When it comes to safety and security, we want to have the best possible,” Depew Superintendent Jeffrey R. Rabey said. “From what I’ve seen, there’s no other like it.” The Depew school system is also working to obtain the same Aegis system, supplied by the Canadian company SN Technologies. It is part of the school district’s $2.75 million push for security, which includes 300 digital cameras.

According to Tony Olivo, an Orchard Park security consultant who helped develop the system, Lockport schools will be the first in the world to use the facial recognition technology, though other schools in the U.S. do employ other kinds of facial recognition. He noted that “Scotland Yard, Interpol, the Paris police and the French Ministry of Defense” already use it.

Olivo also said Lockport schools were used in test videos the Aegis software developers used as they created the technology. The district has been eyeing the technology since the 2012 Sandy Hook shooting and is paying for it with funds secured with New York state’s Smart Schools Bond Act of 2014.

Though school officials are enthusiastic about the new additions, facial recognition technology has a questionable track record.

Documents recently obtained by Wired magazine showed that at a sporting event where South Wales police used facial recognition, “2,297 [matches] turned out to be false positives and 173 were correctly identified – 92 per cent of matches were incorrect.”

Olivo claims the Aegis system has improved technology and simply needs numerous cameras to be strategically placed around the school. K.C. Flynn, a partner in SN Technology, says “We have absolute confidence in our product. It does work, and we have no concerns whatsoever.”

Nevertheless, the system does not use X-ray technology, nor does it detect metal, concealed weapons, or explosives, though Olivo claims the video software may be able to identify some weapons. Reports note it “alert[s] officials if someone whose photo has been programmed into the system — a registered sex offender, wanted criminal, non-custodial parent, expelled student or disgruntled former employee — comes into range of one of the 300 high-resolution digital cameras.”

At least one parent is skeptical. Jim Schulz told the outlet he thinks the system will only save a few seconds at best and will be unable to stop a perpetrator with an AR-15. Adding that keeping doors locked and using a visitor check-in system would be just as effective, he said: “The only alarm you’re going to need, if you need any, is the people screaming.”

Lockport technology director Robert LiPuma acknowledged that Aegis cannot stop shootings on its own:

“It takes a human response, still, to respond to that. There’s no security system or piece of technology that’s going to prevent something from happening. It’s just giving us more information and alerting us to issues.”

That information has civil liberties advocates concerned. Though students’ photos will not be uploaded to the system “unless there is a reason,” the system can then track that student’s movements around the school, tracing who they interacted with and where they went during school hours over the previous 60 days.

“Tracking every move of students and teachers is not the best way to make them feel safe at school and can expose them to new risks, especially for students of color who are already over-policed in the classroom,” said John Curr, who chairs the New York Civil Liberties Union’s Buffalo chapter. Facial recognition technology is most accurate on white faces and less accurate at correctly identifying minorities.

Curr also commented on the dangerous precedent of extreme surveillance the Lockport school district is now setting:

“This plan sets a dangerous precedent for constant surveillance of young people and risks exposing data collected about students and educators to misuse by outsiders or law enforcement.”

Further, Olivo has been accused of a conflict of interest because his firm, Corporate Screening and Investigative Group, is one of SN Technology’s partner firms. He and school district officials deny any wrongdoing, arguing that Olivo does not receive a commission and that SN does not have a direct contract with the district; rather, SN is a subcontractor of a local firm.

Regardless of the alleged conflict of interest — and the questionable effectiveness of the invasive technology — the adoption of the Aegis system appears to be just the latest in the trend of policing schools. In an age of overwhelming fear about potential school shootings — and despite the advice of experts — educational institutions increasingly resemble prisons, and students are increasingly treated like potential criminals.





South Wales Police Facial Recognition System Generated 92% False Positives at Soccer Match


May 7th, 2018

Via: Engadget:

Ask critics of police face recognition why they’re so skeptical and they’ll likely cite unreliability as one factor. What if the technology flags an innocent person? Unfortunately, that caution appears to have been warranted to some degree. South Wales Police are facing a backlash after they released data showing that their face recognition trial at the 2017 Champions League final misidentified thousands of people as potential criminals. Out of 2,470 initial matches, 2,297 were false positives — about 92 percent.





Police defend facial recognition technology that wrongly identified 2,000 people as potential criminals

A police force has defended its use of facial recognition technology after it was revealed that 2,000 people at the 2017 Champions League final in Cardiff were wrongly identified by the software as potential criminals.

South Wales Police began trialling the technology in June last year in a bid to catch more criminals, using cameras to scan faces in a crowd and compare them against a database of custody images.

As 170,000 people descended on the Welsh capital for the game between Real Madrid and Juventus, 2,470 potential matches were identified.

However, according to data on the force’s website, 2,297 – 92% – were found to be “false positives”.
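The headline figure is straightforward arithmetic on the two numbers the force published. A quick sketch of the calculation, using only the figures quoted in the article:

```python
# False-positive share among the matches flagged at the 2017 Champions League final.
# Figures as released by South Wales Police (per the article).
total_matches = 2470      # faces the system flagged as potential criminals
false_positives = 2297    # flagged faces that turned out to be wrong matches

true_positives = total_matches - false_positives
false_positive_share = false_positives / total_matches

print(true_positives)                      # 173 correct identifications
print(int(false_positive_share * 100))     # 92 (truncated percent, the reported figure)
```

Note this 92% is the error rate *among matches the system flagged*, not the rate across all 170,000 faces scanned; with a watch list that small relative to the crowd, even a fairly accurate system will produce mostly false positives among its alerts.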

South Wales Police admitted that “no facial recognition system is 100% accurate”, but said the technology had led to more than 450 arrests since its introduction.

It also said no-one had been arrested after an incorrect match.

A spokesman for the force said: “Over 2,000 positive matches have been made using our ‘identify’ facial recognition technology with over 450 arrests.

“Successful convictions so far include six years in prison for robbery and four-and-a-half years imprisonment for burglary. The technology has also helped identify vulnerable people in times of crisis.

“Technical issues are common to all face recognition systems, which means false positives will be an issue as the technology develops.


Public to Be Scanned in Real Time as Police Body Cameras May Soon Get Facial Recognition


New innovations in technology are allowing police officers to contribute to a growing database by implementing facial recognition software in Breathalyzer tests and body cameras.

The largest maker of body cameras in the United States, Axon, announced last week that it has purchased two artificial intelligence companies and it is creating an ethics board for the purpose of preparing to use the technology with its current products.

Despite acknowledging the “bias and misuse” that will likely take place with such a system, the company’s founder, Rick Smith, argues that the technology’s benefits cannot be ignored. “I don’t think it’s an optimal solution, the world we’re in today, that catching dangerous people should just be left up to random chance, or expecting police officers to remember who they’re looking for,” he told The Washington Post. “It would be both naive and counterproductive to say law enforcement shouldn’t have these new technologies. They’re going to, and I think they’re going to need them. We can’t have police in the 2020s policing with technologies from the 1990s.”

Both China and the UK — two major police states — already deploy cameras that use facial recognition in public.

In a letter to Axon, 42 groups, including the ACLU, the NAACP, and the National Urban League, pointed out the police state dangers associated with such a program, calling it “categorically unethical to deploy.”

Axon has a responsibility to ensure that its present and future products, including AI-based products, don’t drive unfair or unethical outcomes or amplify racial inequities in policing. Axon acknowledges this responsibility—the company states that it “fully recognize[s] the complexities and sensitivities around technology in law enforcement, and [is] committed to getting it right.”

Certain products are categorically unethical to deploy. Chief among these is real-time face recognition analysis of live video captured by body-worn cameras. Axon must not offer or enable this feature. Real-time face recognition would chill the constitutional freedoms of speech and association, especially at political protests….Real-time face recognition could also prime officers to perceive individuals as more dangerous than they really are and to use more force than the situation requires. No policy or safeguard can mitigate these risks sufficiently well for real-time face recognition ever to be marketable.

But Axon is not alone in its police state profiteering. For the prison industrial complex to be fed a constant stream of inmates, allowing prison corporations, county jails, and police departments to profit from arrests, the charging of impaired drivers must be streamlined.

One company believes it has solved the problems that exist with subjective field sobriety tests and faulty roadside drug tests. Breathalytics has developed a kiosk that serves three purposes. It scans a person’s fingerprint, scans their face for biometrics data, and tests their breath for the presence of alcohol, and presumably, drugs. It also records the entire kiosk visit on video.

The company calls the kiosk an example of “effortless alcohol screening” and says its kiosk helps facilitate the “alcohol monitoring industry.” Yes, that’s right. Monitoring drug rehab attendees, parolees, work/release, and roadside participants is now being called an “industry.” If for one second you ever wondered what was meant by the term “prison industrial complex,” Breathalytics is a part of that self-professed “industry,” consisting of any business connected to the imprisoning of individuals.

Breathalytics is offering its kiosk to police departments, courts, and jails right now as a free trial. The temptation may be too great for some law enforcement agencies to resist. But what are the ethical implications for a sobriety kiosk? Let’s take a closer look at the issues.

Breathalytics claims the kiosk calibrates itself and is nearly foolproof. But who verifies that claim? According to Massprivatei, the source code for the kiosk is proprietary and cannot be independently audited.

Blood alcohol content (BAC) readings are an imperfect proxy for impairment, especially near the United States’ legal limit of .08. Some states are unhappy that people are allowed to drive after drinking with a .08 BAC, and many are pushing to lower the limit, which could mean that anyone caught after drinking one bottle of beer or one glass of wine could be subject to the full ramifications of the law following a DUI arrest.

But more disturbing may be the data control implications of a kiosk that records a person’s biometrics, fingerprints, and breath. The Breathalytics machine can also collect DNA if saliva is produced when blowing through the machine. Who ultimately collects the data, stores it, and then keeps it from being used for more nefarious purposes, such as cataloging Americans’ DNA?

Do Americans have a God-given right to their privacy? Are we allowed to keep our DNA, our fingerprints, and our biometrics to ourselves? It seems citizens have lost that right. Already, in any international airport in the United States, millions of travelers are having their bodies scanned, supposedly in an effort to sniff for explosives.

Remember the time Americans were walking around with explosives in their pockets? No? No wonder: it never happened! The U.S. government used the 9/11 attacks as an excuse to take away Americans’ right to travel freely without having their privacy infringed upon, creating the Transportation Security Administration and implementing full-body scanning of individuals who want nothing more than to travel.

Now, it seems, the same type of TSA-style spy devices are making their way to the streets with police officers who are encouraged to adopt a breathalyzer kiosk. The machines could easily be installed in the roadside travel trailers police use in DUI checkpoints to charge people with oftentimes frivolous DUI charges that ultimately enrich the police department, the prosecutor, and the industries which benefit from random drug testing and the imprisoning of people in its complex.

If police departments adopt the alcohol kiosks they will, in essence, allow a private company to possess nearly all identifying information a person possesses. What could possibly go wrong?



Facebook Is Violating Your Privacy via Facial Recognition Technology

On April 6, a coalition of consumer privacy organizations led by the Electronic Privacy Information Center filed a complaint with the Federal Trade Commission, accusing Facebook of violating individual’s privacy via the company’s facial recognition practices. The complaint focuses on changes to Facebook’s policy which went into effect in early 2018, namely the ability to scan user photos for biometric facial matches without consent.

The organizations say that Facebook is deceptively selling the facial recognition technology to users by encouraging them to identify people in photographs. “This unwanted, unnecessary, and dangerous identification of individuals undermines user privacy, ignores the explicit preferences of Facebook users, and is contrary to law in several state and many parts of the world,” the complaint states.

The coalition also claims Facebook’s policy violates the 2011 Consent Order with the Commission, calling the scanning of faces without consent “unlawful.” The organizations are calling on the FTC to reopen a 2009 investigation of Facebook in light of recent revelations that Cambridge Analytica accessed millions of Facebook users’ private data. The Electronic Privacy Information Center has called on the FTC to investigate Facebook’s facial recognition practices since 2011.

“Facebook should suspend further deployment of facial recognition pending the outcome of the FTC investigation,” EPIC President Marc Rotenberg said.

Other organizations participating in the complaint against Facebook include The Campaign for a Commercial Free Childhood, The Center for Digital Democracy, The Constitutional Alliance, Consumer Action, The Consumer Federation of America, Consumer Watchdog, The Cyber Privacy Project, Defending Rights & Dissent, The Government Accountability Project, The Privacy Rights Clearinghouse, Patient Privacy Rights, The Southern Poverty Law Center, and The U.S. Public Interest Research Group.

This is not the first time Facebook has been under fire for their facial recognition technology. As far back as 2015, The Anti Media reported on a lawsuit involving a man who, despite not having a Facebook account, was fighting to get his “faceprint” from the company. The complaint was filed by Frederick William Gullen of Illinois. Gullen’s complaint stated:

Facebook is actively collecting, storing, and using — without providing notice, obtaining informed written consent or publishing data retention policies — the biometrics of its users and unwitting non-users … Specifically, Facebook has created, collected and stored over a billion ‘face templates’ (or ‘face prints’) — highly detailed geometric maps of the face — from over a billion individuals, millions of whom reside in the State of Illinois.

Although no federal law exists to govern the commercial use and collection of biometrics, Illinois and Texas have passed laws designed to protect the public. Illinois’ Biometric Information Privacy Act made it illegal to collect and store faceprints without obtaining informed written consent. The law also made it illegal for companies to sell, lease, or otherwise profit from a customer’s biometric information. Lawsuits filed against Facebook allege the company is violating BIPA because it makes faceprints without written consent.

With Facebook CEO Mark Zuckerberg facing pressure from the U.S. Congress and the public, the company may shift toward more privacy-oriented practices. For the moment, however, users should be aware that their words and faces are effectively owned by Facebook and whomever else the company decides to share the data with.

