According to Paulsson, the world is seeing a growing momentum towards computing at the ‘edge’ of the network.
"More of the devices that are connected to the network require, or would benefit from, the ability to analyse received data, make a decision and take appropriate action. Autonomous vehicles are an obvious example. Whether communicating with the external environment or detecting risks through sensors, decisions must be processed in a split second. It is the same with video surveillance. If we are to move towards the proactive rather than the reactive, more processing of data and analysis needs to take place within the camera itself", he said.
Next year will see more processing power in dedicated devices designed for specific applications, which is essential as the industry moves towards higher levels of edge computing.
"Connected devices will need increased computing power, and be designed for the purpose from the ground up with a security-first mindset. The concept of embedded AI in the form of machine and deep learning computation will also be more prevalent moving forward", he said.
There will be a move towards trusted edge technology as personal privacy continues to be debated around the world, Paulsson thinks.
"While technologies such as dynamic anonymisation and masking can be used at the edge to protect privacy, attitudes and regulation are inconsistent across regions and countries. The need to navigate the international legal framework will be ongoing for companies in the surveillance sector. Many organisations are still failing to undertake even the most basic firmware upgrades, yet with more processing and analysis of data taking place in the device itself, cybersecurity will become ever more critical", Paulsson said.
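To make the idea of masking at the edge concrete, here is a minimal, illustrative sketch of one common technique: pixelating a sensitive region of a frame before it leaves the device, so identifying detail is destroyed on-camera rather than in the cloud. This is not Paulsson's or any vendor's actual implementation; the frame, region coordinates and block size are all hypothetical.

```python
import numpy as np

def pixelate_region(frame, top, left, height, width, block=8):
    """Coarsen one region of a frame by averaging fixed-size blocks,
    so identifying detail (e.g. a face) is removed before the frame
    is transmitted or stored."""
    region = frame[top:top + height, left:left + width].astype(float)
    for y in range(0, height, block):
        for x in range(0, width, block):
            # Replace each block with its mean value.
            region[y:y + block, x:x + block] = region[y:y + block, x:x + block].mean()
    frame[top:top + height, left:left + width] = region.astype(frame.dtype)
    return frame

# Hypothetical 64x64 greyscale frame with a sensitive region at (16, 16).
frame = np.arange(64 * 64, dtype=np.uint8).reshape(64, 64)
masked = pixelate_region(frame.copy(), top=16, left=16, height=32, width=32)

# Each 8x8 block inside the masked region is now a single flat value.
assert len(np.unique(masked[16:24, 16:24])) == 1
```

In a real camera the region would come from an on-device detector (faces, licence plates), and the masking would run on every frame before any data leaves the device.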
He thinks that attitudes towards appropriate use cases for technology, and the regulations around them, differ around the world.
"Facial recognition might be seen as harmless and even desirable. However, when used for monitoring citizens and social credit systems, it is regarded as much more sinister and unwanted. The technology is the same, but the use case is vastly different. Regulations are struggling to keep pace with advances in technology. It’s a dynamic landscape that the industry will need to navigate, and where business ethics will continue to come under intense scrutiny", Paulsson said.
Given these regulatory complexities and the concerns over privacy and cybersecurity, the industry is seeing a move away from the open internet of the past two decades, Paulsson said.
"While public cloud services will remain part of how we transfer, analyse and store data, hybrid and private clouds are growing in use. Openness and data sharing were regarded as being essential for AI and machine learning, yet pre-trained network models can now be tailored for specific applications with a relatively small amount of data. For instance, we’ve been involved in a recent project where a traffic monitoring model trained with only 1,000 photo examples reduced false alarms in accident detection by 95 percent."
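The approach Paulsson describes, tailoring a pre-trained model with a small task-specific dataset, is commonly implemented as transfer learning: the large pre-trained network is frozen and only a lightweight task-specific head is trained. The sketch below illustrates that pattern in miniature, using a fixed random projection as a stand-in for a frozen backbone and a logistic-regression head trained on roughly 1,000 examples; the sizes, data and numbers are illustrative, not taken from the project mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained backbone: a fixed (frozen) projection that
# maps raw inputs to feature vectors. In a real system this would be a
# network trained on a large generic dataset.
W_backbone = rng.normal(size=(32, 8))

def features(x):
    # Frozen feature extractor: W_backbone is never updated below.
    return np.tanh(x @ W_backbone)

# Small task-specific dataset (~1,000 labelled examples).
X = rng.normal(size=(1000, 32))
true_w = rng.normal(size=8)
y = (features(X) @ true_w > 0).astype(float)

# Train only the lightweight linear head on top of the frozen features.
w = np.zeros(8)
feats = features(X)
for _ in range(500):
    p = 1 / (1 + np.exp(-(feats @ w)))          # sigmoid prediction
    w -= 0.1 * feats.T @ (p - y) / len(y)       # logistic-loss gradient step

accuracy = (((feats @ w) > 0) == (y > 0.5)).mean()
```

Because only the small head is trained, a few hundred to a few thousand labelled examples can be enough, which is exactly why broad data sharing is no longer a prerequisite for useful models.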