Milestone Systems leaders talk about the need for a better ethical framework to guide the digital age.
Advancements in video technology can be used to prevent bullying in schools, provide faster emergency care, and flag compulsive gamblers when they try to enter a casino.
But facial recognition can also be used for less ethical purposes. In China, facial recognition technology is used to give citizens social scores, which can diminish if they engage in dissenting behavior. In the United States, facial recognition used for policing has been criticized for discriminatory racial bias.
The issue of data collection and surveillance is a major concern at Milestone Systems, a Danish video technology company that has its US headquarters in Portland. Milestone provides network video recorders and open platform IP video management software and has over 150,000 customer installations in city surveillance, education, finance, government, healthcare, hospitality, manufacturing, retail and transportation industries.
Thomas Jensen, the CEO of Milestone Systems since 2020, and Tim Palmquist, Milestone vice president of Americas, tell Oregon Business how and why prioritizing ethical use of video recording technology is the video management software company’s priority as it expands in Oregon.
This interview has been edited for length and clarity.
Why have ethical practices become important in tech companies? What does it mean for you to be a responsible technology company?
Thomas Jensen: We see gender, ethnic and religious discrimination all being driven by video technology. We have initiatives at Milestone that support how governments and societies are implementing video and collecting data in a non-discriminatory way, and we are looking at how we develop our software, how we allow our partners to sell our software, and very importantly, how we expect our customers to use our software.
That's all wrapped in an umbrella we call responsible technology. It’s something we've been doing for a long time, but the industry has been very focused on safety and security technology, without really looking beyond those use cases to ones that can help businesses succeed beyond building or perimeter security.
It’s one of the things leading companies are stepping up and taking a stance on, and they will actually be willing to accept more legislation and regulation for the greater good of society. That's one of the things we want to be sure we watch and drive forward.
Do you guys do business with the Chinese market? Or other countries that use facial-recognition technology to monitor citizens?
TJ: We are not physically present in the Chinese market, and that is by design. Similarly, we withdrew all operations in Russia and Belarus at the start of the war. We were one of the first companies in Denmark to do so.
We do have a lot of American customers who have their manufacturing facilities in China and we want them to get access to responsible technology as well. So we are not limiting their ability to use it in China, but we're not going in and building things together with Chinese players.
We further strengthened that by incorporating clauses in our end-user license agreements to limit our exposure in that field. For that reason, we are naturally not actively collaborating with Chinese players.
Is there any way to build safeguards into video technology to ensure it is used ethically?
TJ: That would be like asking Ford Motor Company to stop people from speeding on the freeway.
What we are doing is building mechanisms to stop misuse if it happens, and building educational practices around how we sell our technology: how we inform our customers about protecting themselves if foreign governments try to gain access to their technology, and how we expect them to act with our products.
We were among a small group of technology companies that took the initiative on the Copenhagen Letter, which lays out how we expect technology companies to act and deploy technology for the greater good of society and mankind.
We initiated a whistleblower setup in particular to support us on that quest, and we have very rigorous screening methods when we do get cases.
Police reform advocates have suggested use of facial recognition by the police force is racially biased and unethical. How do you decide what use of video technology is unethical?
TJ: Right now we are looking to tie our company closer to the United Nations principles for good corporate governance and for ethical standards in technology development.
We are currently evaluating all the various technology areas in which our video software can be used, whether it's artificial intelligence or facial recognition. We are rigorously mapping the advantages to societies of utilizing the technology, as well as the disadvantages we can see in those societies. Then we are building principles on that basis.
We operate in the US, EU and the UK, and you have to have a dialogue within a society about how we can make meaningful regulations and get more legislation in the area of how technology is being used, rather than limiting the access to the technology itself.
Facial recognition is a great example of this. Europe is currently considering making live facial recognition illegal unless you have a warrant. That would be a huge inhibitor for proactive terrorism prevention, which I would find concerning as a citizen in Europe. So what we would do is invite a dialogue on how governments are allowed to store the data. So, if Tim and I walked down Broadway and we're being mapped out against the FBI’s most wanted list and we do not stick out, then the City of Portland should not be allowed to store the data of our whereabouts at that point in time.
That would still give society a way to prevent a potential crime from happening.
Tim Palmquist: There's a practicality element here as well. There are a lot of ways people can envision that technology might be misused and discriminate against people in society. I heard someone say the other day, “Is Milestone software being used to identify people crossing state lines?” But currently there's no infrastructure to support that. So that might be a concern in someone's mind, but there's no practical element to their concern.
We need to balance our concerns against the practical limitations of what is real in our society as well. At Milestone, we care about nondiscrimination, we care about responsible technology, and we're doing everything we can to create those guardrails around our technology.
Facial recognition technology could be used on women who travel to Oregon from Idaho to have abortions in order to prosecute them later. Would you consider that to be a breach of human rights and your ethical standards?
TJ: This is exactly the sort of problem I was alluding to. I would consider that to be a huge invasion of privacy according to international standards.
I believe we can yield a lot of advantages by utilizing facial recognition for proactive terror prevention, but the worry I have is where the authorities are mapping and storing data about us as individuals when we're doing something.
And that's the reason why we're inviting more legislation on this technology rather than less. And we at Milestone feel we carry a responsibility to society to support that approach.
Unfortunately, many technology leaders are so profit-driven that they would prefer no legislation whatsoever protecting citizens, rather than pushing it forward. What we're defining right now is how we can strengthen our principles in case they are violated. Currently, we can drop customers if they're in breach of human rights, but going forward we want more specific legislation on these various elements from a privacy and human rights perspective.
Do you find that to be an inhibitor on your business? Does having high ethical standards hurt you versus a company that might have fewer restrictions written into the contract?
TJ: If you take a purely short-term perspective, then yes, we are walking away from business opportunities. But we have the firm understanding and expectation that in the midterm, ethical standards are going to be a huge advantage for us, because citizens, whether in Portland or back in Denmark, are sick and tired of technology companies abusing their data and monetizing things they have not opted in to.
I think we will see many more companies and governments looking toward the companies that do have a responsible mindset and high ethical standards for partnerships in the future.
This election cycle, there has been debate about how the state of Oregon is for businesses. Do you see Oregon as a difficult place to do business for your company?
TJ: There are not many places on earth right now where it's not difficult to do business. We’re operating under very unusual circumstances, just coming out of a two-year-long lockdown, and now we see inflation. We also see generational shifts in the workforce. I haven't spoken to a single country where there's not a shortage of nurses these days.
I think, naturally, in a smaller political environment, everybody is focused on their own area, but with a global perspective, I think it's difficult to do business everywhere.
TP: I 100% agree with Thomas's comment that it's difficult to do business anywhere. But as far as the Portland metro area goes, being located in the suburbs as we are, we find the business conditions to be just fine.
We’re hiring quite a few people. And we're having really good success finding available talent in the market that is local here. If we were located in downtown Portland, I think I might have answered your question a bit differently.