The second part of the Dataveillance trilogy explores how Dataveillance has influenced key developments within Synectics’ flagship command and control solution – Synergy 3.
“With a powerful rules engine at its heart, the opportunity was there to explore data integration and management on a different level ‒ to deliver complete situational awareness.”
Neil Waudby, Software Development Director, Synectics
Welcome to the second installment of our three-part Interview Room special on Dataveillance.
Last time, we heard from Martyn Rowe, Head of Client Delivery, who talked about the concept of Dataveillance and its origins as a tool to combat fraud in the gaming sector.
In this Interview Room we hear from Software Development Director Neil Waudby, who talks to us about turning that initial concept into a reality, and about how Dataveillance has influenced key developments within Synectics’ flagship command and control solution – Synergy 3.
How influential has Dataveillance been in the development of Synergy?
Very. Our ambition with Synergy was, and always has been, to push the boundaries of surveillance and go beyond traditional limitations. From the outset the platform was engineered to extract relevant information from integrated third-party systems, pair it with video data – whether analog or digital ‒ and present it in a meaningful way to surveillance operatives. Essentially we made sure Synergy’s open architecture design facilitated video and data integration to elevate basic video recording to an enterprise-wide video management platform.
But it was our work around Dataveillance that opened our eyes to how that core principle could evolve further. With a powerful rules engine at its heart, the opportunity was there to explore data integration and management on a different level ‒ to deliver complete situational awareness.
How did you start the development process and what are some things you learned along the way?
The starting point was a lot of math! I won’t go into detail here, but we’d set ourselves a big challenge ‒ creating an engine that would allow data from different sources to be paired, searched and acted upon in such a way that alarms could be set and generated if all, some or none of the specified rules criteria were met.
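To make the all/some/none idea concrete, here is a minimal, hypothetical sketch of that kind of rule evaluation ‒ not Synectics’ actual engine, just an illustration of how an alarm can be raised when all, any, or none of a set of criteria match an event. The event fields and criteria are invented for the example.

```python
# Illustrative sketch only -- not the Synergy 3 rules engine.
# Raises an "alarm" when ALL, ANY (some), or NONE of a set of
# rule criteria hold for a given event.
from typing import Callable, Dict, List

Criterion = Callable[[Dict], bool]

def evaluate(event: Dict, criteria: List[Criterion], mode: str) -> bool:
    """Return True (alarm) under the chosen matching mode."""
    results = [c(event) for c in criteria]
    if mode == "all":
        return all(results)
    if mode == "any":
        return any(results)
    if mode == "none":
        return not any(results)
    raise ValueError(f"unknown mode: {mode}")

# Hypothetical criteria for a till-transaction event
criteria = [
    lambda e: e["amount"] > 500,  # high-value transaction
    lambda e: e["hour"] < 6,      # outside normal trading hours
]

event = {"amount": 750, "hour": 3}
alarm = evaluate(event, criteria, "all")  # both criteria met, so True
```

The point of the sketch is the `mode` switch: the same criteria can drive very different alarm policies without rewriting the rules themselves.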
Joining abstract data is a complex task, and we had a lot of development obstacles to overcome in order to make this solution ‘real world ready’. For example, any software based on rules usually deals in absolutes and exact matches. But of course, we learned that’s not always how data works in reality, especially not in busy, high-pressure environments that rely on multiple systems.
Time stamping is a good example of data that doesn’t always cooperate. You will often find small differences in time stamps from system to system ‒ between, say, point-of-sale tills and access control solutions. We needed our engine to allow for variations like this ‒ for ‘imperfect syncing’ and what we call ‘fuzzy joins’ between datasets. Without accommodating this type of scenario, we would be limiting the data interrogation potential.
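The ‘fuzzy join’ idea can be sketched in a few lines. This is an assumption-laden illustration rather than the product’s implementation: it pairs records from two systems whose clocks drift slightly by matching timestamps within a tolerance window instead of demanding exact equality. The data and field names are invented.

```python
# Illustrative sketch of a "fuzzy join" on timestamps: match records
# whose times fall within +/- tolerance, rather than exactly.
from datetime import datetime, timedelta

def fuzzy_join(left, right, tolerance=timedelta(seconds=5)):
    """Pair each left record with right records whose 'ts' timestamps
    fall within the tolerance window."""
    pairs = []
    for l in left:
        for r in right:
            if abs(l["ts"] - r["ts"]) <= tolerance:
                pairs.append((l, r))
    return pairs

# Hypothetical data: a till transaction and an access-control swipe
# recorded by systems whose clocks disagree by three seconds
tills = [{"ts": datetime(2024, 1, 1, 12, 0, 0), "till": 4}]
doors = [{"ts": datetime(2024, 1, 1, 12, 0, 3), "door": "stockroom"}]

matched = fuzzy_join(tills, doors)  # one pair, despite the clock drift
```

With an exact-match join the three-second drift would have produced no match at all ‒ which is precisely the ‘real world readiness’ problem described above.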
“We had developed Synergy 3 to become a data enabler ‒ a route for users, regardless of sector, to achieve greater levels of situational awareness through integrating and interrogating information from multiple sources to flag up risks based on rules.”
What have been the key milestones since the initial development period?
It’s a constantly evolving process, but there have been two crucial milestones since those early days: the integration of workflows and simplification work around the GUI. Both were critical to the development and launch of Synergy 3.
We had developed Synergy 3 to become a data enabler ‒ a route for users, regardless of sector, to achieve greater levels of situational awareness through integrating and interrogating information from multiple sources to flag up risks based on rules. The obvious development progression was to harness this capability to guide users through their next logical steps, i.e., to associate the risks and threats identified with appropriate, standardized response protocols. Our objective was for Synergy 3 to facilitate informed, consistent decision making in order to help ‘close the incident loop’. So that’s what we did with our development of workflows.
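One way to picture the workflow idea described above ‒ purely as a hypothetical sketch, with invented alarm types and response steps ‒ is a mapping from a flagged risk to an ordered sequence of response steps, with the incident ‘closed’ only once every step is completed.

```python
# Hypothetical sketch of rule-triggered workflows: each alarm type maps
# to a standardized, ordered response protocol the operator walks through.
RESPONSE_PROTOCOLS = {
    "unauthorized_access": [
        "Verify alarm on live video",
        "Notify on-site security team",
        "Log incident and attach footage",
    ],
}

def run_workflow(alarm_type):
    """Walk through the protocol for an alarm; report whether the
    incident loop was closed (all steps completed)."""
    steps = RESPONSE_PROTOCOLS.get(alarm_type, [])
    completed = []
    for step in steps:
        # in a real UI, each step would await operator confirmation
        completed.append(step)
    return {
        "alarm": alarm_type,
        "steps_completed": completed,
        "closed": bool(steps) and len(completed) == len(steps),
    }

result = run_workflow("unauthorized_access")  # all three steps, closed
```

The design point is that the response sequence lives in data, not in the operator’s head ‒ which is what makes the decision making consistent across shifts and sites.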
I won’t go into detail here, as I know David Aindow, our Product and Technology Director, has covered the benefits of workflows previously, but we knew that integrating this functionality within Synergy 3 was a huge step forward and would really help unlock the potential of Dataveillance. It was a hard process, but we were in a great position to push ourselves thanks to the type and size of customers we work with.
One in particular ‒ a large financial institution ‒ gave us an ideal opportunity to show what was possible when it asked us to develop an integrated surveillance solution for its network of data centers.
They needed a solution that would support supply chain/visitor ID and area access verification at different stages in the visitor journey, using a series of automated guidance protocols ‒ i.e., workflows ‒ triggered by conditions based on live on-site data from security, safety, and operations systems, as well as off-site database access.
The scale and complexity of the project have been hugely influential in how we have developed the workflow capabilities in Synergy 3.
You mentioned the importance of GUI simplification?
Yes. For any powerful digital tool to succeed you need two things to happen ‒ initial innovation and customer adoption. We knew that the key to achieving adoption would be simplification: a way to generate Dataveillance rules simply and interactively from datasets and then use those rules to trigger alarm events under specific conditions.
Front-end layout customization (with the capacity to automatically reconfigure based on alarm and event triggers), color coding, automated field and filter generators, touch screen map-based navigation ‒ they might seem like minor details, but cumulatively they streamline the user experience. In time-pressured environments, particularly where live incident management is a priority, this makes a huge difference. The easier and more intuitive a solution is, the more likely people are to use it and explore its potential.
What’s the next big thing for Dataveillance and Synergy 3?
Sticking with the theme of usability, GIS mapping capabilities are going to be increasingly important ‒ a topic our Vice President of Global Gaming, John Katnic, is going to touch on in the last of this three-part series on Dataveillance.
From our perspective ‒ based on feedback from the sectors we’ve been working with closely ‒ we are also looking at built-in ‘plug and play’ Dataveillance rules for the most common scenarios and settings our customers are encountering. It’s a very exciting time.
Thank you, Neil. In our next and final installment of this special Interview Room series, we talk to John Katnic, Vice President of Global Gaming, about what the future holds for Dataveillance.