One of the biggest benefits of a networked solution is the flexibility to locate resources where they can deliver optimal impact without a need for bringing everything back to a central location. While this was necessary with older technologies, today’s infrastructure makes system design and deployment simpler for integrators and installers, and more cost-effective for end users. For engineers who aren’t using edge-based options, it could be smart to reconsider system design.
When the security sector relied on analogue technology, one of the biggest restrictions was that devices required dedicated cabling. The inability of data to share infrastructure meant each individual link required a dedicated cable, and each point where data needed to be shared required some sort of switch or interface. Often the easiest thing to do was bring all the data back to one central point and manage it there.
The security industry did not adopt a centralised approach because it was the best option, nor because it was cost-effective. If anything, it was an approach fraught with difficulty, often increasing costs while wasting time and energy in the field.
The most obvious issue with a centralised approach is that it creates a single point of failure by design. It also necessitates more cabling, with excessive runs becoming the norm. Interestingly, end users never equated increased costs with lengthy cable runs: customers saw value in the hardware, not the infrastructure, which often made it difficult for integrators and installers to justify high installation costs.
The centralised approach also created limitations in terms of performance. With all the site’s data being managed in one place, it was often inevitable that capacities would be insufficient. This led to a whole range of ways of ‘managing’ the situation, which invariably meant data being thrown away.
For many years the centralised approach was – and for some installations still is – the predominant model when designing security systems. In the past, technology simply didn’t allow any other approach to be achieved for a realistic cost. Not only did every device require a dedicated cable, but many centralised devices supported a limited number of inputs.
A great example of this comes when you consider how video surveillance systems used to archive data. Video feeds from multiple cameras would be brought back to a centralised point, where banks of recorders would archive the footage. The recorders could only handle a single input, so management units, typically switchers or multiplexers, converted the multiple video feeds into one single stream.
This limited topology forced the design of surveillance systems to feature a centralised recording system.
This mindset was so firmly entrenched in the security industry that when networked devices first appeared, security manufacturers focused on producing centralised units. On a network this made even less sense, because bandwidth resources need to be carefully managed. The business case for an edge-based approach was stronger than ever, but despite this some manufacturers clung to the centralised model.
Moving to the edge
The introduction of network-based security platforms has allowed systems to utilise ever more flexible topologies, delivering economies with regard to cabling and overall design. This has become increasingly important as the data collected by security systems is growing all the time, to the point where bandwidth consumption is something integrators and installers increasingly need to address.
The collection, analysis and use of ‘big data’ is a reality in today’s smarter security solutions. The information being gathered not only has value for enhancing the protection of businesses and organisations; it also has value for a host of other purposes. These include site management, business efficiencies, process control, HR, health and safety, marketing and more.
As the focus on data mining increases, the additional value of what can be achieved often underpins the return on investment many end users seek. It is therefore imperative that modern security systems are capable of allowing the data to be interrogated for a whole host of reasons, and the best place to do this is at the edge.
Taking a step back for a moment, many integrators and installers will have first looked at edge-based technologies in regard to video storage. It was the first business use case widely introduced into the market, and it still makes a lot of sense. Of course, edge solutions are not all the same, and it is important not to undervalue the principle simply because some edge options are low key.
For example, recording video onto SD cards at the camera is an edge option, but equally AI-enabled analytics and object classification can take place at the edge. Increasingly, the brand-name IT-driven data management and analytics providers are moving their processing to the edge, because it is more efficient, more secure and simply makes good sense. Why ship large amounts of data around a network when the processing and analysis can happen at source?
The constantly increasing demand for data, including video and audio streams, transactional information, metadata, status reports, etc., creates a significant network load and places a greater emphasis on the resilience of the supporting infrastructure. As a result, it makes sense to ensure much of the ‘heavy lifting’ in terms of data processing and analytics algorithm management is performed at the edge of the network. This localises the processing as data is managed either at the capture device or in a peripheral unit, rather than transmitting large amounts of traffic to a central server which handles all management tasks.
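As a rough illustration of the difference this localisation makes, the sketch below compares the traffic a single camera might generate under each model: a centralised design ships every frame to the server, while an edge design analyses frames locally and forwards only compact event metadata. All figures, names and the event format are illustrative assumptions, not drawn from any particular product.

```python
import json

# Illustrative assumptions (not vendor figures)
FRAME_SIZE_BYTES = 250_000   # average size of one compressed video frame
FPS = 25                     # frames per second
SECONDS = 60                 # one minute of footage


def centralised_traffic(frames: int) -> int:
    """Centralised model: every frame crosses the network to a central server."""
    return frames * FRAME_SIZE_BYTES


def edge_traffic(events: int) -> int:
    """Edge model: frames are analysed at the camera; only event metadata is sent."""
    # Hypothetical metadata record for one detected event
    event = {"type": "motion", "camera": "cam-01", "timestamp": 0}
    return events * len(json.dumps(event).encode("utf-8"))


frames = FPS * SECONDS
central = centralised_traffic(frames)
edge = edge_traffic(events=3)  # assume three events occur in the minute

print(f"centralised: {central:,} bytes; edge: {edge:,} bytes")
```

Even with generous assumptions in favour of the central server, the edge model transmits orders of magnitude less data for the same minute of activity, which is precisely the ‘heavy lifting at the edge’ argument made above.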
As security data plays a greater role in site and process management for businesses and organisations, it becomes vital that continuity is ensured. Any failure or weakness in the infrastructure can result in performance issues and the loss of continuity, which in turn will impact the total cost of ownership and return on investment.
A distinction needs to be made between the core role of the system and the peripheral benefits. While it is imperative the system offers a high level of security and protection for the business or organisation, the customer’s perceived return on investment will come from the additional benefits and the added value on offer. If that is compromised, their thinking could revert to the ‘grudge purchase’ mentality.
While issues with the peripheral benefits might not compromise the security protection on offer, the added value elements are the ones the user interacts with every day. As a result, issues with realising those benefits can lead to the system not meeting their expectations.
The increased use of artificial intelligence (AI) in security-based solutions, along with the combined processing power of CPUs and GPUs, means the load on networks and data processing in general will increase, making effective infrastructure management a more critical issue. As users look to exploit metadata-rich security information and combine it with data from other sources, big data will enable ever smarter implementations to be created. However, because security data is generated in real time, around the clock, from an ever-increasing number of devices, the sheer volume of data being captured, transmitted, processed and analysed can be staggering.
If a centralised approach is taken, the infrastructure will need significant computational resources, storage capacity and communications redundancy. This will add to the complexity of systems, as well as the capital investment and cost of ownership. The use of unsuitable hardware and infrastructure will lead to data loss, latency, slow or incomplete processing cycles or – in a worst-case scenario – failure of the entire system.
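A back-of-envelope calculation shows why those central resources escalate so quickly. The figures below (camera count, per-stream bitrate, retention period) are purely illustrative assumptions used to sketch the aggregate load a central server would have to absorb:

```python
# Illustrative sizing of a centralised recording server.
# All figures are assumptions for the sake of the example.
CAMERAS = 200
MBPS_PER_STREAM = 4        # assumed bitrate of one compressed video stream
RETENTION_DAYS = 30
SECONDS_PER_DAY = 86_400

# Aggregate inbound bandwidth the central point must sustain, continuously
aggregate_mbps = CAMERAS * MBPS_PER_STREAM

# Storage needed to retain everything: Mbit/s -> MB/s, across the retention window
storage_tb = aggregate_mbps / 8 * SECONDS_PER_DAY * RETENTION_DAYS / 1_000_000

print(f"aggregate inbound: {aggregate_mbps} Mbit/s")
print(f"storage for {RETENTION_DAYS} days: {storage_tb:.1f} TB")
```

On these assumptions the central point must sustain 800 Mbit/s around the clock and retain roughly 260 TB, before any redundancy is added; processing at the edge means only events and metadata need to traverse the network, and neither figure is borne by a single location.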
The solution to managing the burgeoning data flow is the use of edge-based processing and analysis. With a correctly designed infrastructure, this approach helps to reduce the overall network load by implementing data processing at the source of the security information.
As the processing happens at the edge, the need for data transfer is reduced, freeing up bandwidth and enhancing overall system efficiency. This approach not only reduces costs but also enhances cybersecurity as data is not moved around the network for centralised processing to take place.
Edge-based management also reduces the risk of data loss, and simplifies compliance with important data management policies, as usage is kept to a local level.
Integrators and installers delivering advanced analytics systems and smart solutions cannot allow bandwidth and data transfer issues to jeopardise the functionality and resilience of the security system. Equally, if the system is to meet the expectations of customers, it should be able to deliver the additional smart benefits they have been sold.
An edge-based approach saves money, reduces network load, offers a more stable solution and reduces the potential impact of system failures. In short, edge-based systems represent best practice!