The development of neural networks leads our world to a hitherto unknown level of totalitarian control over society

The development of neural networks is leading our world to a hitherto unknown level of totalitarian control over society. Surveillance of people using artificial intelligence (AI) technologies promises to be one of the hottest topics of 2019. In January, at the Davos forum, billionaire George Soros criticized China, the country that has advanced furthest in deploying a national CCTV network. According to Soros, the Chinese regime seeks to subjugate the lives of half a billion citizens by "methods unprecedented in history." At the same forum, Microsoft CEO Satya Nadella called on states to pay attention to the problem of face recognition and to regulate this area through legislation.

Over the past few years, identifying a user by facial image has become ubiquitous: we "show our face" to smartphones, social networks, cars and web cameras. The technology is being adopted by big business and by intelligence agencies. With cameras everywhere, recognizing you on every corner and documenting your life minute by minute, never before could the totalitarian dream of total control over the individual be realized so fully.

For now, such systems make errors or cover only limited areas, but within a few years people really will be under constant watch. This applies to Russia as well: this year "smart" surveillance will be launched in Moscow, and later, as part of the "smart cities" program, the technology will reach the regions. "Profile" tried to figure out what the new reality threatens us with, with the help of experts from high-tech companies.

The technique of vision

A thousand images labeled "dog" are loaded into the computer's database, then another thousand labeled "cat". Once the artificial intelligence (that is, a specific program, an algorithm) has processed them and grasped the differences, it must, with a certain probability, distinguish a "dog" from a "cat" in the next sample. The operation is repeated many times, incorrect answers are flagged, and the accuracy of the results improves. The program has no idea what a "cat" or a "dog" actually is, let alone the taxonomy of the animal kingdom. That, in simplified form, is the methodology of deep learning, which underpins one of the key technologies of our age: computer vision.
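
Purely as an illustration of the training loop just described, a minimal sketch in Python with the PyTorch library might look like this; the folder layout (data/train/cat, data/train/dog), the choice of network and the hyperparameters are assumptions made for the example, not a description of any real surveillance system.

```python
# A toy illustration of supervised image classification ("cat" vs "dog"),
# assuming images are sorted into data/train/cat and data/train/dog.
import torch
import torch.nn as nn
from torchvision import datasets, transforms, models
from torch.utils.data import DataLoader

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# ImageFolder assigns a label to each image from its folder name.
train_set = datasets.ImageFolder("data/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# A small untrained network with a new 2-class output layer.
model = models.resnet18()
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Repeat many times: predict, flag wrong answers via the loss, adjust weights.
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```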

Face recognition is one of the main applications of computer vision. The program focuses on characteristic features of a person's appearance, noting about 100 control points: the distance between the eyes, the size of the eyebrows, nose and chin. The combination of these parameters yields a unique portrait.
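
To illustrate how matching by such facial parameters can work in practice, here is a sketch using the open-source face_recognition library; the file names are hypothetical, and the library compares 128-number descriptors rather than the roughly 100 landmark measurements mentioned above, but the principle of comparing vectors of parameters is the same.

```python
# A sketch of matching two photos by comparing facial descriptors,
# using the open-source face_recognition library (dlib-based).
# File names are hypothetical placeholders.
import face_recognition

known = face_recognition.load_image_file("passport_photo.jpg")
probe = face_recognition.load_image_file("camera_frame.jpg")

known_enc = face_recognition.face_encodings(known)[0]
probe_encs = face_recognition.face_encodings(probe)

for enc in probe_encs:
    # Euclidean distance between descriptors; smaller means more similar.
    distance = face_recognition.face_distance([known_enc], enc)[0]
    print("match" if distance < 0.6 else "no match", round(distance, 3))
```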

Broadly speaking, face recognition is just one method of biometric identification, and hardly a novelty in itself. But unlike fingerprinting or a retinal scan, it can be performed at a distance and does not require the subject's consent. It can be used not only to identify a person but also to gauge emotions, reactions to external stimuli and the sincerity of behaviour, almost to "look into the soul".

And if the face is poorly visible or disguised, AI can identify a person by gait, analyzing the position of the body at each phase of a step (gait analysis). In short, there is no hiding.

All-seeing dragon

In the early 2010s the United States was considered the undisputed leader in AI research, but in recent years the situation has begun to shift in China's favor. A popular view holds that the catalyst was a computer's success at the game of go: at the beginning of 2017 the AlphaGo program defeated the best Chinese grandmaster. If to the West this was merely the conquest of another milestone, in China, where go enjoys an almost sacred status, the news caused a sensation.

In the summer of 2017 Chinese President Xi Jinping announced his intention to build the country's economy of the future on artificial intelligence and big data. According to the plan, the Chinese AI market should reach $22 billion by 2020, $60 billion by 2025 and $150 billion by 2030.

High-tech companies were promised broad support measures such as subsidies and government contracts. The trump card of Chinese AI developers, denied to their competitors in other countries, is the huge pool of data on which to train algorithms: the Chinese authorities willingly share it with researchers.

No wonder "smart" startups in China are growing by leaps and bounds. One of them, SenseTime from Hong Kong, was established in October 2014 as a face-recognition laboratory at a university. In four years the company has acquired more than 400 customers, attracted $3 billion in investment and become the world's most highly valued AI startup. Its closest competitors are Face++, DeepGlint and the startup Watrix, with its expertise in gait recognition, as well as the national search engine Baidu, which has opened its own deep learning institute.

The market, in turn, has been supported by equipment manufacturers. Facial recognition and various face-activated features have become a signature of Chinese smartphones. And the company Hikvision Digital Technology has flooded the world with cheap CCTV cameras: according to IHS Markit, its share of the global market for video surveillance devices rose from 8% in 2012 to 21% in 2018. In the US, Hikvision cameras have been installed even in police stations and on military bases, which greatly worries politicians.

China itself began installing cameras en masse in 2005, and by 2015 they covered every street in Beijing. At first the video was reviewed by police officers (in Beijing this occupied 4,300 employees), which made the data collection inefficient: it was good only for investigating incidents after the fact. But in 2015 the authorities announced plans to connect the cameras to automatic face recognition.

In 2017 there were 176 million "smart" cameras in China, and by 2020 their number is expected to reach 626 million. By then the program called "Watchful Eye" (SkyNet in the Western press) is planned to cover public places in every city in the country and to identify a citizen within three seconds with 88% accuracy.

The faces of Chinese citizens are recognized not only on the streets; the country is practically obsessed with the new technology. The Alipay payment service has tied user accounts to their photos, so a selfie is now all that is needed to transfer money. In KFC restaurants you can pay for an order by showing yourself to the camera of the payment terminal.

At some cash machines you can "present" your face instead of a card and the system will dispense bills. Since 2017 airports have practiced check-in without documents: the passenger's appearance serves as the boarding pass. In residential complexes the face works as a pass to the grounds or to individual buildings; the "recognizer" can even be built into the front door.

Vigilant control extends to Chinese teenagers as well. Playing truant is not easy: chips sewn into school uniforms are read by cameras at the school entrance. Surveillance cameras in classrooms recognize the pupils' moods and look for those who are distracted or bored, who then have points deducted for classroom work. Nor can the evening be whiled away in video games: the largest producer, Tencent, uses gadget cameras to identify players. The system allows children under 12 to play no more than an hour a day and teenagers no more than two, after which the game disconnects.

Face control for the bad guys

The Chinese authorities argue that the public benefit of face recognition is invaluable, and the "exploits" of AI in policing are regularly reported. In September 2017, thanks to cameras, police located 25 wanted citizens at a beer festival, and in April 2018 they found a suspect in a crowd of 60,000 at a pop concert.

Last year police began issuing portable "recognizer" glasses. Wearing them, a patrol officer is unlikely to miss an offender: the glasses scan passers-by in view and find a match in the database within a couple of minutes, provided at least 70% of the face is visible. It was reported that in the first week of use the glasses helped apprehend dozens of criminals.

A separate front of AI work has formed around the authorities' desire to discipline traffic. Cameras are mounted at intersections alongside screens that instantly display the details of violators who cross the street against a red light. The police later track them down and offer one of three punishments: a fine of about $3, attending lectures on the rules for crossing the road, or helping a traffic controller manage the flow of vehicles.

Most painful for the jaywalkers, though, is the public shaming on the big screen. As a result, where the equipment has been installed, the number of violations has fallen tenfold. The authorities' conclusion: the roughly $15,000 cost per intersection pays off.

The Chinese believe recognition algorithms can do more: not just catch criminals, but prevent crimes from happening. The AI startup CloudWalk from Guangzhou has developed a system that analyzes how suspicious behavior in public places is: if a person is acting noisily or strangely (for example, wandering aimlessly around a station instead of boarding a train and leaving), the police receive a signal to pay attention to them. Also recorded as suspects are those who visit gun shops, clubs with gaming machines and other "hot spots".

Doesn't this bring to mind the crime prediction system from Minority Report? Or the Thought Police from "1984", punishing thoughtcrime?..

Digital dictatorship

George Soros is not the first to draw attention to how closely Chinese reality resembles dystopian fiction. Concern about the consequences of such far-reaching control over society has been voiced from various platforms: the Chinese order has been called a "digital dictatorship" and "a modern form of political engineering".

Adding fuel to the fire is the social credit system Beijing has been developing since 2014. It relies heavily on video surveillance but is not limited to it: what is tracked is not only a person's movements but any online and offline activity. The timeliness of tax payments, bank statements, purchase history, likes on the internet and in social networks: everything is put to use.

Artificial intelligence collates this heterogeneous information and produces an overall "rating" for the citizen. Failed to pay a fine on time or were rude on the internet, rode without a ticket or smoked in public transport, criticized the party or simply stood next to an undesirable person? You lose points. Donated to charity, gave blood voluntarily, respect your elders? The rating grows.
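
Purely to illustrate the kind of point arithmetic being described, a toy model might look like the sketch below; the events, weights and starting score are invented for the example and have nothing to do with the real Chinese rules.

```python
# An invented toy model of a "social rating": start from a base score
# and add or subtract points for logged events. All numbers here are
# illustrative assumptions, not the actual Chinese scoring rubric.
EVENT_WEIGHTS = {
    "fine_unpaid": -50,
    "rode_without_ticket": -20,
    "smoked_in_transport": -30,
    "charity_donation": 15,
    "blood_donation": 25,
}

def social_rating(events, base_score=1000):
    score = base_score
    for event in events:
        score += EVENT_WEIGHTS.get(event, 0)  # unknown events are ignored
    return score

print(social_rating(["rode_without_ticket", "blood_donation"]))  # 1005
```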

The number of accumulated points directly affects quality of life. High scorers receive discounts, preferential loans and other incentives from the authorities. Low scorers find it harder to rent an apartment or get a job, are refused service in popular establishments and cannot register on dating sites. An ominous shadow also falls on a loser's family and friends: their ratings suffer too, which encourages people to avoid contact with the "untouchables".

"Social credit deals not with criminals but with ordinary, if not always the most responsible, citizens," says Dmitry Lukovkin, a deep learning expert at the company Tsifra. "It is designed to reinforce a set of behavioral patterns, to 'optimize' people. Whether the authorities have the right to do this, and on what basis the target behavior is set, are separate questions. But in China nobody asks."

Beijing plans to make the system nationwide by 2020, after which a "harmonious socialist society" is to prevail. For now it operates in selected cities. The "harmony" arriving there received intense media attention in the spring of 2018, when it emerged that the authorities had barred low-rated Chinese from "elite" forms of transport: 11 million of their applications for air tickets and 4.25 million for high-speed train tickets had been rejected.

The most radical example of social credit in action is the Xinjiang Uyghur Autonomous Region in western China, where two-thirds of the population are Muslim Uighurs. Judging by how members of this ethnic group are treated, they are assigned an "unreliable" rating a priori. Some are sent by the authorities to re-education camps, officially called vocational training centres. In September 2018 the human rights organization Human Rights Watch issued a report on the oppression of the Uighurs, suggesting that Beijing is rehearsing on them a "tightening of the screws" on a national scale.

"What has been set up in Xinjiang resembles an open-air prison," says Leonid Kovacic, a sinologist and expert on cybersecurity technologies. "Admittedly, terrorist attacks in the region have died down, whereas before there were many. In any case, most Chinese are not particularly worried about what is happening in the west of the country, and the social credit system is little discussed. Only occasionally do semi-official media present it in a positive light, saying that one ought to live properly, and now there is a bonus for doing so. The system is believed to help raise public morality, a Chinese version of 'spiritual bonds'. Whether it will operate at full strength by 2020 is unclear: each region keeps its account of good and bad deeds in its own way, and these still have to be unified."

The camera looks at the world

Beijing is not averse to sharing its surveillance know-how with other countries. The emphasis is on Latin America, where China is steadily increasing its economic influence. Ecuador actively imitates its Asian patron: surveillance of citizens has been set up in almost all provinces, and any actions on their mobile phones are recorded. Branches of the company CEIEC, Beijing's agent in the field of "intelligent" censorship, also operate in Brazil, Peru, Bolivia and Venezuela.

"Many are eyeing Chinese technologies, but not everyone manages to get results from them," says Kovacic. "The Venezuelan regime, as we can see, was not helped by them to avoid mass protests. In Ecuador, by contrast, crime fell by almost 30% during the first year of surveillance."

In the West the "champion" of facial recognition is the United States, where IT corporations are the locomotive of progress. Facebook has been developing this niche since 2010, when the Tag Suggestions option gave users the ability to tag themselves and friends in photos. The company has gradually assembled one of the world's largest catalogues of faces and now, according to rumor, keeps it under lock and key, away from commercial partners.

In 2016 Microsoft introduced the Azure Media Face Detector, capable of recognizing a face in a tiny section of an image just 24x24 pixels in size; in high-definition pictures the service can find up to 64 faces. It soon had a competitor, Amazon's Rekognition system, which identifies up to a hundred people in a single frame.

Last year Apple introduced the Face ID recognition system in its newest model, the iPhone X. In 2019 a similar scanner is expected to appear in Google's Android operating system.

The US intelligence services do not disdain computer vision technology either. In 2012 the FBI launched the $1 billion Trapwire system in several states, with much the same goals: to identify a person, determine their emotional state and predict their intentions. And the Pentagon, together with the Defense Advanced Research Projects Agency (DARPA), has developed the ARGUS-IS aerial surveillance system: super-cameras mounted on drones, capable of producing an image with a resolution of 1.8 billion pixels.

There are reports of such systems' successes. In the summer of 2018, when face recognition was adopted at 14 US airports, the first impostor, trying to enter the country on someone else's passport, was caught by the cameras on the third day.

Overall, however, the US lags behind China: the country has only 50 million smart cameras, and its machine vision market is worth $2.9 billion and growing at 0.7% per year (against $6.4 billion and 12.4% per year in China; IHS Markit data). According to estimates by the Association for the Advancement of Artificial Intelligence, China will overtake the US in AI publications any day now. In 2012 the ratio of scientific papers from the two countries was 41% to 10% in favor of the United States; in 2017 it was 34% to 23%. In articles on deep learning specifically, the Chinese moved into first place back in 2013.

In the rest of the world, facial recognition software is used sporadically rather than systematically. In the European Union the most advanced project is iBorderCtrl, launched in November 2018: under this programme AI algorithms have been installed at border checkpoints in Latvia, Hungary and Greece, with whose help Brussels hopes to curb the flow of illegal migrants. Japan is preparing a big premiere for 2020, when NEC's NeoFace system will be tasked with recognizing guests of the Tokyo Olympics.

The resistance of values

Sometimes surveillance systems are held back by criticism from public figures scrupulous about data privacy. In the US a few years ago the idea took hold that collecting "faceprints" was unlawful. In the summer of 2012 Senator Al Franken issued a well-known report arguing that even analyzing a photograph is an invasion of privacy, since unique parameters of a person's appearance are extracted from it.

In 2018 a scandal broke out over the Rekognition system. It emerged that Amazon was renting it out to Palantir, a big data company working for the government, as well as to the US immigration service and to Florida police, who were planning to blanket the city of Orlando with cameras in the Chinese manner.

In May the American Civil Liberties Union (ACLU) and 70 human rights organizations issued an open letter to Amazon chief Jeff Bezos demanding an end to the cooperation with the authorities. "People have the right to walk around town without being watched," the letter stated. In June, Amazon's own employees voiced a collective protest against the "spyware" uses of Rekognition. The management's justification did not look very convincing: if it turns out that someone is using Rekognition "irresponsibly", access to the service will be blocked.

This example served as a lesson for Google: in December the company announced that it does not plan to sell its facial recognition developments. AI must be used "carefully", so that it "does not lead to abuse" and "does not contradict our values", said Google vice president Kent Walker.

But this does not satisfy everyone. A number of organizations with telling names like Big Brother Watch have announced their intention to fight "surveillance capitalism" to the bitter end. Their argument: we are already watched closely as it is, with every click logged on the internet. But street cameras go too far, because people cannot hide or encrypt their faces. Therefore a total ban on recognition must be achieved.

"These stories show how the prospects for deploying AI depend on social factors: the nature of the political regime, local ethics," says Georgy, co-owner of the telecommunications company Vertical. "In societies with a strong democratic tradition, society will resist surveillance technologies, and not without success."

Work on the bugs

Apart from ideological objections, the spread of "smart" surveillance is held back by purely technical problems: the algorithms are often far from perfect. Last year, for example, the Rekognition program mistook 28 members of the US Congress for criminals, putting its creators in an awkward position. Around the same time, cameras with an AI module were trialled in London; it turned out the program was wrong in 90% of cases, so full deployment was out of the question.

In China, where testing quickly gave way to deployment, innocent civilians have had to pay for the computer's errors. In November 2018 an artificial intelligence in Ningbo fined a woman named Dong Mingzhu for crossing on a red light. But she was not at the scene: a bus with her photo on an advertising banner had driven through the intersection.

"Today the probability of a false alarm in a surveillance system averages about one in a million," says Dmitry Lukovkin of Tsifra. "But with hundreds of millions of cameras, that will produce hundreds of false identifications per day. The risk is especially high when such cameras are built into autonomous security or combat systems."
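
A rough back-of-envelope check of that arithmetic is sketched below; the number of recognition attempts per camera per day is a deliberately conservative assumption made for illustration.

```python
# Back-of-envelope: how a one-in-a-million false-alarm rate scales.
false_alarm_rate = 1e-6          # one false trigger per million checks
cameras = 600_000_000            # order of magnitude cited for China by 2020
checks_per_camera_per_day = 1    # illustrative, deliberately conservative

expected_false_hits = false_alarm_rate * cameras * checks_per_camera_per_day
print(expected_false_hits)       # 600.0 false identifications per day
```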

Of course, accuracy improves over time. In 2015, for example, Facebook and Instagram could not tell a black person from an ape. An investigation at the MIT Media Lab showed that the cause of such errors is a skewed training sample: AI is trained mostly on white men, so it perceives women and black people worse. Developers have set about eliminating this "discrimination", and by 2018 the Rekognition code was already showing modest progress on the faces of African Americans. But a 31% error rate is still a lot.

How many years it will take to perfect the algorithms, experts are at a loss to say. And will the day even come when AI systems run like a Swiss watch? The American scientist Vinton Cerf, considered one of the founders of the world wide web, believes that people cannot control the "avalanche" of technology: billions of "smart" devices keep going on sale, a critical mass of errors will gradually accumulate, and a global collapse will follow.

"At this stage it is essential that any investigation using recognition technologies actively involves a human who will critically evaluate the information received and make the final decision," concludes Denis Anikin, IT director of Mail.Ru Group.

To lose face

Another threat is deliberate theft of the data accumulated by video surveillance systems. According to estimates by the South China Morning Post, the database of facial samples of Chinese citizens "weighs" 90 terabytes; in other words, the complete "face directory" of the world's most populous country could be carried off on a handful of hard drives.

For China the issue is all the more pressing because the data for social credit is collected from a great variety of sources, and so the risk of leaks is higher. Moreover, using the same neural networks, an attacker can use stolen pictures to cast the victim in any light. Last year deepfake videos began to spread across the net: clips generated by artificial intelligence using the likenesses of real people. These are no amateur photoshop jobs: the fake clips are frighteningly realistic, and their "heroes" would find it hard to prove they were not involved in the misdeeds depicted.

"In Chinese cities where social credit is running, there have already been cases where residents' accounts were hijacked and 'wrong' purchases were made in their name, dragging people's ratings down," says Kovacic. "In the end a discussion began about how to improve the system's information security, but there is no strategy yet and no concrete measures have been worked out. Beijing hopes to solve all problems with administrative muscle."

According to experts, face recognition has become a fact of life, and national governments will not be able to watch it from the sidelines. In December 2018 Microsoft president Brad Smith spoke of the need to develop regulations for the technology as soon as possible. In January he was backed by the company's CEO, Satya Nadella, who admitted that self-regulation by IT companies is not enough on such a sensitive issue.

A relevant law may be adopted in the United States this year: one draft has been presented by the Federal Trade Commission and another by experts at Georgetown University's Center on Privacy & Technology. According to American media, the law would both define the powers of state bodies in the field of recognition and provide safeguards for cases of fatal errors.

And we are on our way

In Russia the vanguard of "spy" network deployment is Moscow. Since last year facial recognition has been working in the Moscow metro. In 2019 it is planned to equip cameras across all districts of the city with the technology, which will cost the Moscow budget 7.5 billion rubles, plus 6 billion for outfitting data centers.

According to the Moscow police, AI surveillance has already helped prevent 27 murders, 77 cases of grievous bodily harm, 165 armed robberies and more than 300 muggings. The city's plans for the future include identifying people at railway stations: passengers with valid passes will be able to walk onto the platform without a ticket, as the system "reads" the person and determines that the fare has been paid.

Face recognition is also the business of domestic IT startups. The most successful of them is NtechLab, a regular winner of international competitions for identification algorithms. In 2015 the company launched the popular FindFace service, which finds a person's profile on the social network VKontakte from a photo (it is now also used to search for criminals). In March 2018 a stake in NtechLab was bought by the state corporation Rostec, which is hatching plans to develop "smart cities" in Russia based on artificial intelligence.

According to "Profile's" interlocutors, our country has no factor restraining the security services comparable in power to the democratic institutions of the West. But neither does it have the administrative coherence and resources of China. That means a special law on recognition is not needed for now: the field is regulated by Federal Law 152 "On Personal Data", and it is only necessary to monitor its implementation.

"There is no shortage of algorithms in Russia, but with databases there has been a hitch," explains Svetlana Belova, project manager at IDX. "The Unified Biometric System was launched in 2018, but it has never really taken off: Russians are in no hurry to hand over their biometrics. And if data is hard to come by, there is no 'food' for neural networks." According to the expert, one option is to combine the private databases built up mainly by banks, but they are not eager to hand over their clients' data to outsiders. Until this conflict of interest is overcome, Russians can sleep peacefully: surveillance cameras will hardly be used for anything beyond police investigations.

Leonid Kovacic believes the Chinese social credit system has no prospects in Russia in principle. "Moral evaluation of actions, a complex social hierarchy, life in plain sight: these things are deeply characteristic of China," the expert says. "Social credit has a direct prototype in the ancient tradition of collective responsibility with its system of penalties and rewards. For Russian culture this is alien. Our citizens prefer not to draw attention to themselves."
