We trust the state with liberty and security
There may have been polling on these issues before, but I don't recall such a comprehensive UK report as this one, which follows a YouGov poll.
A couple of passages:
Quote:
suggests a majority of British voters are neither overly moved nor concerned by the surveillance question, and tend to err on the more hawkish side of debate. (See end of post for methodology.)
In a list of online issues including cybercrime, cyber attacks, surveillance, trolling, propaganda and fake news, only 21% of respondents listed UK government surveillance of its own citizens among their main concerns, compared with 66% citing cybercrime, 46% citing cyber attacks and 45% citing access to inappropriate content by children.
(Later) levels of public trust in key institutions of the establishment seem relatively high, with clear majorities saying they trust judges and senior police officers to act in the country’s best interests, and trust the police and intelligence services to behave responsibly with information obtained from online surveillance.
In short, where Britain stands on surveillance could be more about where it sits on a scale of institutional, rather than political, trust.
Link:https://rusi.org/commentary/security...r-surveillance
Facial recognition technology in real life fails
A puzzling academic article, 'DNA techniques could transform facial recognition technology', added here because it opens with some facts and then states there is an answer: genomics.
It opens with:
Quote:
When police in London recently trialled a new facial recognition system, they made a worrying and embarrassing mistake. At the Notting Hill Carnival, the technology made roughly 35 false matches between known suspects and members of the crowd, with one person “erroneously” arrested. Camera-based visual surveillance systems were supposed to deliver a safer and more secure society. But despite decades of development, they are generally not able to handle real-life situations. During the 2011 London riots, for example, facial recognition software contributed to just one arrest out of the 4,962 that took place.
Link:https://theconversation.com/dna-tech...omment_1434283
In Your Face: China’s all-seeing state
A BBC World Service report from Guiyang, a Chinese city in the south-west, with a population of over 4m and noted for its investment in big data and computing. Background and map:https://en.wikipedia.org/wiki/Guiyang
The BBC report (5 min podcast) states:
Quote:
China has been building what it calls "the world's biggest camera surveillance network". Across the country, 170 million CCTV cameras are already in place and an estimated 400 million new ones will be installed in the next three years. Many of the cameras are fitted with artificial intelligence, including facial recognition technology. The BBC's John Sudworth has been given rare access to one of the new hi-tech police control rooms.
It is a rather curious piece, complete with interviews - all reassuring, bar one - and aimed at a general audience.
Link:www.bbc.co.uk/news/av/world-asia-china-42248056/in-your-face-china-s-all-seeing-state
Weapons Of Mass Surveillance
I missed this BBC Arabic Service report (55 min podcast) from July 2017, but it fits well here. Their summary:
Quote:
Middle Eastern governments are using high tech mass surveillance tools to monitor their citizens. Western companies, including Britain's largest weapons manufacturer, BAE, are among those selling surveillance technology to these governments.
The trade is attracting criticism from human rights organisations who question whether a British company should be selling such equipment, much of it classified, to repressive regimes in the Arab world.
Link:http://www.bbc.co.uk/news/av/world-m...s-surveillance
Palantir Knows Everything About You
The IT company Palantir arouses controversy, and this latest article is no exception. I had not realised that, commercially, it is less than successful.
Link:https://www.bloomberg.com/features/2...?terminal=true
Facial recognition tech used by UK police is making a ton of mistakes
A good article on the problems with this new technology, although I recall several reports of deployments over ten years ago. Perhaps it is only now being rolled out to "ordinary" police work? A 90% failure rate at one event, and in London at least one false arrest. Read on.
Link:http://www.wired.co.uk/article/face-...-hill-carnival
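A 90% failure rate is less surprising once you consider the base-rate problem: genuine suspects are a tiny fraction of any crowd, so even a fairly accurate matcher produces mostly false alarms. A minimal sketch of the arithmetic, with purely illustrative numbers (the article does not give the underlying rates):

```python
# Hedged illustration of the base-rate problem in crowd face-matching.
# All figures below are hypothetical, not taken from the article.

def match_outcomes(crowd_size, n_suspects, tpr, fpr):
    """Expected true and false matches when scanning a crowd against a watchlist.

    tpr: true-positive rate (chance a real suspect is flagged)
    fpr: false-positive rate (chance an innocent person is flagged)
    """
    true_matches = n_suspects * tpr                   # suspects correctly flagged
    false_matches = (crowd_size - n_suspects) * fpr   # innocents wrongly flagged
    return true_matches, false_matches

# Assume a 100,000-person crowd, 10 genuine suspects, a 90%-sensitive matcher,
# and a false-positive rate of just 0.05% -- still mostly false alarms.
tp, fp = match_outcomes(100_000, 10, 0.90, 0.0005)
precision = tp / (tp + fp)
print(f"true matches ~{tp:.0f}, false matches ~{fp:.0f}, precision ~{precision:.0%}")
# -> roughly 9 true matches against 50 false ones: about 85% of alerts are wrong.
```

Even with those generous assumptions, most alerts point at innocent people, which is broadly consistent with the failure rates the article reports.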
More data and surveillance are transforming justice systems
A typically long article from 'The Economist' on policing and the supposed joys of more data and surveillance. It cites many different systems, nearly always in an American context, and ends by calling for a wider public debate - one which, here in the UK at least, has yet to emerge.
Link:https://www.economist.com/technology...-05-02/justice
Welcome to the age of surveillance capitalism
A 'long read' article, the full title of which is: 'The goal is to automate us': welcome to the age of surveillance capitalism. The book under discussion is The Age of Surveillance Capitalism.
The article serves as an introduction and includes a ten-question Q&A with the author, Shoshana Zuboff of Harvard Business School. The book is a large tome (600 pages), so it may take a while to become a best seller.
Here is a helpful passage that explains the concept:
Quote:
The name Zuboff has given to the new variant is “surveillance capitalism”. It works by providing free services that billions of people cheerfully use, enabling the providers of those services to monitor the behaviour of those users in astonishing detail – often without their explicit consent. “Surveillance capitalism,” she writes, “unilaterally claims human experience as free raw material for translation into behavioural data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioural futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behaviour.”
How about this from the author herself:
Quote:
This antidemocratic and anti-egalitarian juggernaut is best described as a market-driven coup from above: an overthrow of the people concealed as the technological Trojan horse of digital technology. On the strength of its annexation of human experience, this coup achieves exclusive concentrations of knowledge and power that sustain privileged influence over the division of learning in society.
Link:https://www.theguardian.com/technolo...oogle-facebook
Automated Facial Recognition Technology: a UK review
The use of Automated Facial Recognition (AFR) technology has become a public issue here, sometimes with references to China's use of it and the implications for liberty, human rights and so on. By an odd quirk of media reporting, judging by Twitter traffic, attention has focused on a BBC TV film clip showing an incident in London where the police used AFR and a man covered his face; the police stopped him, he was abusive, and he was given an on-the-spot fine.
Link to the film clip via a campaign group:https://bigbrotherwatch.org.uk/all-m...ring-his-face/
My Twitter feed a few days ago provided this police-commissioned academic report; the full title is: 'Evaluating the Use of Automated Facial Recognition Technology in Major Policing Operations'.
Two key passages IMHO:
Quote:
The study found that while AFR can enable police to identify persons of interest and suspects where they would probably not otherwise have been able to do so, considerable investment and changes to police operating procedures are required to generate consistent results.
The report suggests that it is more helpful to think of AFR in policing as 'Assisted Facial Recognition' rather than a fully 'Automated Facial Recognition' system. ‘Automated’ implies that the identification process is conducted solely by an algorithm, when in fact, the system serves as a decision-support tool to assist human operators in making identifications. Ultimately, decisions about whether a person of interest and an image match are made by police operators. It is also deployed in uncontrolled environments, and so is impacted by external factors including lighting, weather and crowd flows.
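The report's 'assisted, not automated' distinction can be sketched as a two-stage pipeline: the algorithm only scores faces and flags candidates, while a human operator makes every final identification. A minimal illustration - all names, scores and the threshold here are hypothetical, not from the report:

```python
# Sketch of "assisted" facial recognition as a decision-support pipeline:
# the algorithm proposes, the human operator disposes.

def flag_candidates(scores, threshold=0.8):
    """Algorithm stage: flag watchlist entries whose similarity score
    meets a threshold. This is only a shortlist, not an identification."""
    return [name for name, score in scores.items() if score >= threshold]

def operator_decision(candidate, confirmed_by_operator):
    """Human stage: an identification stands only if the operator,
    comparing the live face with the watchlist image, agrees."""
    return candidate if confirmed_by_operator else None

# Similarity scores for one camera frame against a watchlist (illustrative).
scores = {"person_A": 0.91, "person_B": 0.62, "person_C": 0.84}
candidates = flag_candidates(scores)   # person_A and person_C pass the threshold
# The operator reviews each flagged face and may reject the algorithm's match,
# e.g. because poor lighting or crowd movement degraded the camera image.
identifications = [operator_decision(c, confirmed_by_operator=(c == "person_A"))
                   for c in candidates]
print(identifications)  # only operator-confirmed matches count
```

The design point, as the report frames it, is that the threshold stage merely narrows the field; accountability for the match rests with the police operator, not the algorithm.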
I have not read the full report; that caveat aside, read on: https://crimeandsecurity.org/feed/afr