
‘We’re making a system that’s fatal to us if it’s misused’ ‘Meduza’ talks government relations and transparency with Kaspersky Lab’s Vice President for Public Affairs

Source: Meduza

One might expect the cybersecurity company Kaspersky Lab to focus primarily on hackers and viruses, but in the last two years, the company has had to defend the safety of its own products. It all started in 2017 when the U.S. government prohibited all of its institutions from using Kaspersky Lab’s antivirus service out of concern that the company might be cooperating with Russian intelligence services. A year later, in the fall of 2018, Kaspersky opened a “Transparency Center” in Switzerland that offers experts the chance to examine the source code behind its products firsthand. A second center will open in Madrid in the summer of 2019, and by the end of the year, all data the company receives from European users will be processed directly in Europe. Meduza’s Deputy Chief Editor Sultan Suleimanov spoke with Kaspersky’s Vice President for Public Affairs Anton Shingarev, whose portfolio includes government relations, to ask how helpful the new centers will be in restoring users’ and governments’ trust in the company.

Note: This conversation took place in March of 2019. At the time, plans for a Transparency Center in Madrid had not yet been announced.

Sultan Suleimanov: Why are you opening these centers and spreading the word about them? Do you think they will help you regain trust?

Anton Shingarev: They’re already helping. In November of 2018, we opened a Transparency Center and a data processing center in Switzerland. At the beginning of this year, we started collecting initial reactions from regulators, business partners, customers, and politicians. We went to a couple of conferences and chatted with people on the sidelines. People are saying that it’s a good thing, that we’re moving in the right direction.

Regulators in France and Germany and the prime minister of Belgium have all announced publicly that Kaspersky Lab is a fine company that they are not planning to ban.

And this was after you opened the Transparency Center?

Yes. This was in December or sometime close to that. But we understand that it's not enough; it won't solve everything. We have to be realists here, and I understand perfectly well that there will always be questions about trust. We used to be here (gestures), where we said, "Trust us, we're good." But that doesn't work anymore. There's no presumption of innocence, and people won't believe you by default. We're moving over here (gestures again), where we're saying trust us because we have a data processing center in Switzerland, we have a Transparency Center, we have an auditing company auditing our code, and so on.

We’re saying believe us because we can be believed. But no matter what steps we take, no matter how far we push toward the facts, there will always be some lingering question of trust. In some countries, like the U.S., for example, where the politics run hotter and there are more emotions floating around, it’s harder for us. In European countries, where people tend to look to the facts more often, it’s easier. Things are difficult in the Baltics too because there’s a difficult history; the issue is emotionally politicized there as well. So we’re moving. It’s helping, but it’s a lengthy process.

I’ve read that you are opening up your code and giving certain “trusted” individuals access to it. Have there already been people who have looked at something? Regulators, maybe, or corporate customers?

Here, it’s important to separate companies from government agencies. We’ve already had a string of requests from companies, and we bring them in and show them what they need. Not the code, actually, they don’t have any interest in the code. They don’t even have the technical skills to understand what’s going on there. They come in to talk to us, to have us explain what kind of product they’re dealing with and how it works.

Where regulators are concerned, we’ve only had one request. It’s a Western European country, but I can’t say yet which one because we’re still in negotiations about what’s going to happen and when and where. But we do have one request, and that analysis will be conducted.

Anton Shingarev gives a presentation on Kaspersky Lab’s transparency initiatives
Kaspersky Lab

There’s an interesting phenomenon we’ve come across. Regulators tell us, “The transparency center is really important to us, but we won’t go there ourselves. We won’t go there because we already know everything’s okay, but the center is important because we have that possibility.”

We put all our binaries [modules of compiled code] that are sold to users up there, and we put up all our updates too. And the regulators can run retrospective checks on any code at any time. Essentially, what’s important to them is the fact that the possibility is open to them, but so far, they’re in no hurry to actually check anything.

How do you guarantee that the code, these binaries that you display in these centers, is the same code that goes out or has already gone out to users?

So you have compilation happening: a build container puts it all together. I’m not a techie myself, but I’ll explain how I understand it on my own level. When you’ve bought our product in the store and installed it, there’s a file on there with our compiled code. We take the source code, bring it to Switzerland, compile it there, and then you can compare: here’s the binary you bought in a box at the store, and here’s the one we just compiled.

I’ll say up front that there will be differences because there are always differences when you compile code. That’s why one of our experts will be sitting nearby, ready to explain why those differences arose.
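
For illustration only, here is a minimal sketch of the kind of comparison Shingarev describes: hash the shipped binary against one rebuilt from the deposited source, and if the two are not bit-for-bit identical, measure how far apart they are. This is not Kaspersky’s actual tooling; the file names are placeholders.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def count_differing_bytes(path_a: str, path_b: str) -> int:
    """Count byte positions where two files differ, plus any length difference."""
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        a, b = fa.read(), fb.read()
    shared = min(len(a), len(b))
    diffs = sum(1 for i in range(shared) if a[i] != b[i])
    return diffs + abs(len(a) - len(b))

if __name__ == "__main__":
    shipped = "product_from_store.bin"   # placeholder: the binary as sold to users
    rebuilt = "compiled_in_center.bin"   # placeholder: the binary rebuilt from source
    if sha256_of(shipped) == sha256_of(rebuilt):
        print("Builds are bit-for-bit identical.")
    else:
        n = count_differing_bytes(shipped, rebuilt)
        print(f"Builds differ at {n} byte positions; each divergence")
        print("(timestamps, embedded paths, etc.) would need an expert's explanation.")
```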

When I was reading people’s reactions to the news about your Transparency Center on Reddit, I saw an argument that went something like this: the code might all be fine, but the antivirus that’s already installed on users’ computers is connected to the cloud, and a command can come through that cloud that says, “send me these documents in an encrypted format.”

Yes, that is theoretically possible: an update can come through the cloud that changes the functionality of the product, and theoretically, it can do something like that. That’s why we put every update in our repository as well and make it available for testing.

The problem here is that making a system that can test and verify everything on the fly isn’t possible yet. Even the regulators understand that. So we’re making a system that’s fatal to us if it’s misused. If some kind of insider decides to do something like that, it will be discovered sooner or later.
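
As a hypothetical illustration of such a retrospective check, the sketch below verifies that an update received on a user’s machine matches an entry deposited in a public repository. The manifest format and file names are assumptions, not Kaspersky’s actual scheme.

```python
import hashlib
import json

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def update_is_accounted_for(update_path: str, manifest_path: str) -> bool:
    """Check whether an update's digest appears in the deposited manifest."""
    with open(manifest_path, encoding="utf-8") as f:
        # Assumed manifest shape: {"updates": [{"id": "...", "sha256": "..."}]}
        manifest = json.load(f)
    known_digests = {entry["sha256"] for entry in manifest["updates"]}
    return sha256_of(update_path) in known_digests

if __name__ == "__main__":
    if update_is_accounted_for("received_update.pkg", "repository_manifest.json"):
        print("Update matches an entry in the public repository.")
    else:
        print("Update is NOT in the repository: grounds for investigation.")
```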

Speaking of insiders, I also read that you are offering $100,000 for finding bugs in Kaspersky products. I was thinking that $100,000 is probably a large enough sum for someone to try and get a job at Kaspersky Lab, put in a bit of malware, and then make a big deal of discovering it. What would you do if someone tried to pull that off?

This is the first time I’ve heard this idea. That would all be pretty dubious from a legal point of view. I’m not sure it would be legal, but let’s just imagine. We’re currently under audit. I can’t say which firm is doing it, but one of the “big four” [Deloitte Touche Tohmatsu, PricewaterhouseCoopers, Ernst & Young, or KPMG] is auditing our engineering practice: how our code is written, who checks it, who has access, who checks the checkers, who tests it, and so on. After we pass that, as I am sure we will, we’ll have a certificate testifying that our practices correspond with the strictest industry standards. When a third party does an audit, that’s the biggest guarantee.

So you’ll just prove that this scheme would be impossible even in theory?

In theory, of course, everything is possible. But we have all the means necessary to minimize that risk. I don’t want to give false promises: in this industry, everything is possible in theory. We’re talking about risks and their minimization. Our whole initiative isn’t intended to eliminate risks entirely because that’s impossible. It’s intended to minimize risks to a level that’s acceptable to regulators and to our business partners. We’re realists.

A regulator in a certain European country told us, “If you want your software installed in our nuclear power plants, you need to do this, that, and this other thing.” In the end, they said we had to pick up and move: hire a local manager, local developers. We understand all this, and we tried to find a compromise: we’re not prepared to move the entire company, so you can forget about the nuclear plants. But what can we do to widen that window of possibility?

And what is this Transparency Center in Switzerland? In that photo with Eugene Kaspersky, it looked like a data center next to some kind of administrative building.

There are two entities there. There’s a data center, and next to it, there’s the Transparency Center. They are two different things that are not to be conflated. The data center has racks, servers, there’s nothing all that amazing over there.

The aforementioned photo of Eugene Kaspersky

There’s not much of anything special in the Transparency Center either. It’s just a separate, isolated room with heightened security measures and security cameras inside. You can get in using a pass we issue after taking down your passport info. Cameras and telephones aren’t allowed in.

Inside, there’s a computer with almost all of its ports sealed. To start working, you have to enter a special username — there are only two or so per company. You turn on the computer and log in. A VPN connection activates between you and the code repository, and you can sit there and look through it.

And a company specialist sits next to you? Or you do it all alone?

Yes, a specialist is there with you. It’s not so much for security purposes — we don’t let just anybody in there, and there are cameras. That person is there to help because anyone who sees 20 million lines of code for the very first time has a very hard time understanding what’s going on and what’s going where. So you need expert support.

If I request access, will you make sure I am who I say I am?

We run checks using publicly available sources. But we’ve made it clear from the beginning that every visit is public. We retain the right to publish each visitor’s first and last name, when they visited, and why; which modules they requested and what they checked. And we check passports. We don’t have the ability to run a full background check, but we take measures to minimize risk.

On top of opening a Transparency Center, you’re moving some of your processing to Europe. Why?

I think that’s even more important. After we started getting these accusations, we stopped running around in circles in a panic, and we knew we had to do something. What did we do? We decided that it would be wrong for us to take it upon ourselves to think everything up, and it would be right to go talk to the people who have been openly doubting us. We went to regulators in Western Europe and asked them honestly: what are you worried about? What are your concerns? What can we do?

And we were able to put our finger on two broad areas. The first risk is backdoors. Intentional, unintentional, insider-made, outsider-made, it doesn’t matter. That, we already talked about — that’s the Transparency Center.

Second is the risk that intelligence services could get their hands on the information we collect about viruses. Even though that risk is hypothetical, it makes them nervous. They’re nervous about SORM [Russia’s lawful surveillance system, which requires telecommunications providers to give security services access to their networks] even though we’ve said many times that we’re not subject to SORM because we’re not a telecommunications provider. We’re software developers, and the traffic our software sends is encrypted. But it’s difficult to explain that to them. That’s why we decided to move our malware data processing centers to Switzerland. It’s a process, and it’s not all that fast — you can’t just press a button and rent new facilities, unfortunately. You have to rewrite the code, rewrite the product. The process will be over by the last quarter of 2019, and from that moment forward, all our information about malware files gathered from European users, including in Latvia, Lithuania, and Ukraine — Europe in the broader sense — will be collected, stored, and processed there.

But if someone doesn’t trust you and they think you want to cooperate with the Russian government, then how will locating these centers in Switzerland calm them down? After all, all that data can still ‘fly over’ to Russia and land on some government comrade’s server.

That would be visible. We’re looking for a third-party organization now that will monitor that kind of thing.

Let’s imagine a scenario that’s completely out there: we’ve collected some mass of data in Switzerland, and all of a sudden, we have to send it to Russia. There will be a big spike in traffic, and it will be noticeable.
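
To make that idea concrete, here is a minimal, hypothetical sketch of how a third-party monitor might flag such a spike: compare each hour’s outbound volume to a rolling baseline. The window size, threshold, and sample data are illustrative assumptions, not a description of any actual monitoring arrangement.

```python
from statistics import mean, stdev

def flag_spikes(hourly_bytes, window=24, z_threshold=4.0):
    """Return indices of hours whose outbound volume exceeds the rolling
    baseline by more than z_threshold standard deviations."""
    flagged = []
    for i in range(window, len(hourly_bytes)):
        baseline = hourly_bytes[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (hourly_bytes[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

if __name__ == "__main__":
    # Roughly 100 GB/hour of ordinary telemetry (in MB, with normal jitter),
    # then one hour containing a sudden bulk transfer.
    traffic = [100_000 + (i % 7) * 1_500 for i in range(48)] + [950_000] + [101_000] * 5
    print("suspicious hours:", flag_spikes(traffic))  # expected: [48]
```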

Again, we’re talking about probability. We can’t get rid of that risk entirely. Well, we can, but to do that, we would have to move to Switzerland. Then the Swiss would be happy, but everybody else wouldn’t be. So either we open 189 Kaspersky Labs or we think up something else. Our logic is that our development will stay in Russia because Russian programmers are very strong and do very high-quality work without being too expensive. In terms of the cost to quality ratio, they’re right at the top. But we’ll distribute the infrastructure itself.

There’s this trend in the world toward so-called balkanization, fragmentation. The laws that are currently being passed in Russia are entirely aligned with that trend. People usually point to China first, but the trend is completely obvious: building borders, putting distance between yourself and others. And that’s where Europe is going too.


Tell me about the States. Have you put a stop to your activity there entirely because there’s just no sense in it anymore, or are you waiting for something to change?

Not entirely. Even now, despite the drop, America is a huge, extremely important market for us, and it’s growing especially fast in digital — online sales are rising. We would never leave there in any case whatsoever. But what’s definitely frozen now is our connection with government institutions. Three or four years ago, we were helping the FBI and the Department of Homeland Security investigate crimes that had connections to Russian hackers. Now we’re nowhere close to that, which was their decision, not ours.

And have they been telling you through some kind of side channels that things have gotten harder for them or not?

They have. Your average officer at the middle management level who is responsible for investigating and catching criminals wants to work with us. But the people who are making decisions under political influence aren’t letting it happen. It’s a classic case, in my view, of “poking out your own eye so that your mother-in-law will have to have a one-eyed son-in-law,” as they say. They’re shooting themselves in the foot. This is unambiguously bad for American business and for American citizens because we know Russian threats and Russian hackers better than anybody. It’s just because we’ve worked in Russia for so long — in some way or another, we defend practically every bank in the Russian banking system, and we know almost everything about Russian cybercrime. And they’re eliminating their own access to this enormous mass of data that might allow them to defend themselves. I think it’s a very big mistake.

Sultan Suleimanov

Translation by Hilah Kohen
