Security Weekly 42: Spy Columns And Hacking Of Hotel Keys
Okay, Google, once, I hear, reception? On April 12, the video blogger Mitchell arranges a live experiment on YouTube to answer a simple question: if you have any software installed on your computer from Google (for example, the Chrome browser), does this mean that Google, as a native land, always listens to you through a microphone? The author of the video first shows a tablet with the name of the goods (toys for dogs), about which he later speaks out loud for a couple of minutes. After that, having opened a couple of popular sites instantly comes across advertisements of pet stores in large quantities.
From this beautiful experiment, you can make a lot of far-reaching conclusions. For example, we now live in a certain world of the Wild West, in terms of the unbridled surveillance of users on the network, where we are profiled in all respects – from utterances to selfies and patterns of the accelerometer built into the phone. Perhaps, it is, but on April 24 the same videoblogger puts out not just a refutation, but rather an attempt to show that not everything is so unambiguous. As is usually the case on the Internet, the first video with a draft on the fan looks more than two million times, the second video is gaining only 160,000 views.
- Indeed, the experiment turned out to be not the most scientific one. First, the author of the video, after seeing the first relevant link, immediately clicked on it. All the rest of the advertising on the theme of dog toys could no longer be shown: it is obvious that after the click, targeting will show you similar advertisements to the finish. Secondly, perhaps the error itself was the fact of live broadcast: if a person speaks into a microphone, consciously transferring sound to Google servers, it is clear that on the other side will listen. They also allowed him!
A slightly more scientific study was conducted by experts from Checkmarx (news, a full study is available after registering with them on the site). They found a vulnerability in the voice control system Alexa, which is used, in particular, in wireless speakers Amazon Echo. Echo allows you to install third-party applications – more precisely, in the new terminology of the marvelous miraculous world they are called either skill or skills. The researchers managed to write the application moderately malicious, it exploits a typical scenario: the user turns to the column, the column recognizes (in a clever cloud with intelligent intelligence) a query and provides an answer. But it was found the ability to listen to the user just like that. Not for long, since the time for “listening to the request” is limited: the standard API implements the script “I did not understand anything” and asks the owner – what did he mean?
This restriction also managed to be circumvented, replacing the “re-asking” for silence. That is, it turns out about such a scenario: the owner of the column asks to turn on the toaster, the Jupiter column, the owner spits and turns on the toaster, the column continues to listen. And with the help of standard methods of voice recognition, it transmits the decoding of all that was said to the attacker. It did not turn out only to turn off the backlight of the column, which is activated only during listening: some pale in the process of espionage is still present.
- There is an error in the logic of the system, which the researchers were able to use. The hole, according to Amazon, was closed, and henceforth it is planned to detect and eradicate “empty” requests to the user, as well as to determine suspiciously long listening sessions. In other words, Alexa and Amazon Echo still have default mechanisms, which sometimes allow the microphone to be turned off.
In this story, as well as in the unscientific experiment of Youtuber, you can see a threat different from the expected “bad” scenario: when a big vendor eavesdrops and pays for you. Most people should proceed from the fact that a big vendor personally does not care about them. Each of us is just one of the billions of user IDs, a couple of megabytes on a server rack. Here the potential problem is different: the infrastructure of a large vendor can be vulnerable to the actions of third parties who use standard tools to switch the flow of private information to themselves. In the vast infrastructure of Google or Amazon even catch such activity will not be easy. And then we are waiting for the distribution of machine-learning systems, which generally do not understand how they work, is there security at the level of three rules of robotics, or what? I say a brave new world.
F-Secure specialists learned how to clone hotel electronic keys
It all started with the fact that F-Secure researchers stole a laptop from the hotel room. On the door there were no signs of hacking, so the hotel after the investigation refused to compensate for anything – suddenly they themselves lost it? Researchers became frustrated, and after spending a couple of (or ten) years on the study of typical keys from the hotel, they found a way to enter any hotel room without leaving traces.
- From a technical point of view, F-Secure’s report on the work performed is extremely unconfined. The developer of software for managing locks is called Vision VingCard, and the manufacturer of the locks is Assa Abloy. The problem was allegedly detected (and closed) at the software level, there was no need to change the locks, but the amount of fog in the original blog post hints at the possibility of alternative scenarios. If so, it is understandable why the technical details of the research are disclosed a little less than in any way.
Generally not a cognitive video, in which suddenly there is some Russian hotel.
Nevertheless, the attack scenario is fairly simple (provided that there is a doped RFID reader and knowledge of how everything works). We need any key card from a particular hotel, even an expired and lost one, even five years ago, will do. Information from the card is read, magic occurs, and the card is written back with a master code that can open almost any door in the same hotel. A useful story in order to remind a simple truth: do not leave valuable things in the hotel room. Even if it’s pathos five stars.