The 6 Must Have Features of an Award Winning Edge
July 8, 2020
Author: Aaron Allsbrook
This week ClearBlade won the Edge Computing Company of the Year award from Compass Intelligence. This huge honor comes at a fantastic time in the Edge ecosystem, as many industry eyes have been opened to the possibilities for computing outside of the datacenter. As many organizations work hard to create great Edge-based solutions, it can be a daunting challenge to assemble the right team of skills to deliver them.
Like everyone else, the ClearBlade Edge is built by a team that has broad sets of skills and capabilities. These are the skillsets we have put into place in order to build the leading edge in 2020.
1. Edge Device Management
Many edge offerings treat Edge computing as mostly a devops problem, sold as Edge Device Management. Azure Edge, IBM Open Horizon, Rigado, and others have made solutions that make it easy to move applications to certain locations. If you have already built a smart monitor app and want to get it running on a gateway, those devops-focused Edges can move the application to where you want it. Most of these Edge Device Management products look like simple wrappers around container technologies, as containers have already reduced friction by adding some abstraction over the underlying operating system.
The ClearBlade Edge team decided that to run anywhere, on any processor of any size, they needed the ability to run on bare metal or as a container, while still letting the edge move sidecar applications across the architecture. It’s a tough devops challenge combined with a nasty content management challenge to keep the right executables running the right way on an intermittently connected instance of compute. ClearBlade has invested heavily in the devops capability to make what is a huge network of distributed computers simple to maintain.
2. Cross and Static Compilation
The ability to run across any hardware has been the secret sauce for ClearBlade in industries like rail, mining, and oil and gas. These are places with existing brownfield devices just waiting to be used. The capability of running across different processor architectures and Linux distributions still requires a “hero engineer” who can cross-compile and statically compile our Edge for any environment. If you go to the ClearBlade release page at https://github.com/ClearBlade/Edge/releases you will find not a mish-mash of code for someone else to deal with, but everything neatly packaged for wherever you want it to go, with no need to add new Linux packages or dependencies. Of course, this same ability allows us to drop our binary into Docker at https://registry.hub.docker.com/r/clearblade/edge for those who have the luxury of running containers.
Without cross and static compilation, the edge would be cumbersome in each environment, demanding that Python be installed or Java be updated. Without developers skilled in cross compilation, edge computing would be much harder.
3. Protocol Integration
While we love the idea that one day there will be a single IoT protocol that everyone uses, at ClearBlade we don’t see it ever being achieved. There are preferred protocols like MQTT available today, but the SNMP, MODBUS, and OPC of yesterday are still around. Some protocols, such as AMQP and HTTP, have been in the enterprise for years, while others, like LWM2M and NB-IoT, are emerging and may be the next big standard. It’s all but guaranteed that the future holds not one but hundreds of protocols an edge team must be ready to engage and integrate with.
At ClearBlade, Edge engineers are constantly assigned protocols to study, learn, master, and use to send data bidirectionally. The result is not only an edge product that can communicate with the entire IoT ecosystem, but also engineers seeded with new patterns that can be used for efficiency and optimization.
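As a sketch of what protocol integration means in practice, the hypothetical `Reading` struct below normalizes two very different payloads (a big-endian Modbus-style holding register and an MQTT JSON body) into one common form. The register layout, the 0.1 scaling factor, and the field names are all assumptions for illustration:

```go
// Normalizing heterogeneous protocol payloads into one internal shape.
package main

import (
	"encoding/binary"
	"encoding/json"
	"fmt"
)

// Reading is a hypothetical normalized form that both payloads map into.
type Reading struct {
	Sensor string
	Value  float64
}

// fromModbus interprets a 16-bit big-endian holding-register value scaled
// by 0.1, a common (but here assumed) encoding for temperature registers.
func fromModbus(sensor string, reg []byte) Reading {
	raw := binary.BigEndian.Uint16(reg)
	return Reading{Sensor: sensor, Value: float64(raw) / 10}
}

// fromMQTTJSON parses a JSON payload such as an MQTT message body.
func fromMQTTJSON(payload []byte) (Reading, error) {
	var r Reading
	err := json.Unmarshal(payload, &r)
	return r, err
}

func main() {
	// 0x02EE = 750 raw, scaled to 75.0.
	m := fromModbus("boiler-temp", []byte{0x02, 0xEE})
	j, _ := fromMQTTJSON([]byte(`{"Sensor":"boiler-temp","Value":75.0}`))
	fmt.Println(m.Value == j.Value) // prints "true": both normalize to 75.0
}
```

Each new protocol then only needs one adapter into `Reading`; everything downstream of the adapter stays protocol-agnostic.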
4. High Performance Stream Processing
Much of what happens at the edge needs to happen efficiently and fast. The devices and machines streaming their data into an edge are not slowing down their operations to ensure the edge can keep up. Instead, it becomes critical that the Edge development team have high-performance stream processing skills as a common base. Many cloud IoT offerings have relied on hosted open source tools like Spark or Kafka to make the capability available, but these solutions were not built IoT-native (or, in some cases, even cloud-native); they scale poorly and have no hope of running on constrained computers in the field.
Stream processing at this scale was once mostly found in financial trading algorithms, where a few milliseconds could impact millions of dollars. Now these design patterns have made their way into the factory and onto the locomotive, where instead of supercomputers to crunch the information there are hardened devices like Raspberry Pis. Developers working in this environment must understand that every memory allocation, every cache miss, and every authorization check can have a huge impact on an operational factory trying to meet its daily production goals.
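A small sketch of that allocation discipline: a fixed-window rolling mean that allocates once up front and never again on the hot path, the kind of structure that keeps a constrained device from falling behind a fast sensor stream. The window size and types are illustrative:

```go
// Zero-allocation hot path: a fixed-size ring buffer computing a rolling mean.
package main

import "fmt"

// RollingMean keeps a fixed window of samples. After construction it never
// allocates, which matters on a device like a Raspberry Pi ingesting a
// fast stream.
type RollingMean struct {
	buf  []float64
	sum  float64
	next int
	full bool
}

func NewRollingMean(n int) *RollingMean {
	return &RollingMean{buf: make([]float64, n)} // the only allocation
}

// Add pushes a sample and returns the current mean: overwrite the oldest
// slot, adjust the running sum, no heap work and no rescan of the window.
func (r *RollingMean) Add(v float64) float64 {
	r.sum += v - r.buf[r.next]
	r.buf[r.next] = v
	r.next++
	if r.next == len(r.buf) {
		r.next = 0
		r.full = true
	}
	n := r.next
	if r.full {
		n = len(r.buf)
	}
	return r.sum / float64(n)
}

func main() {
	rm := NewRollingMean(3)
	rm.Add(1)
	rm.Add(2)
	fmt.Println(rm.Add(3)) // prints "2": mean of [1 2 3]
	fmt.Println(rm.Add(9)) // window is now [9 2 3], mean ≈ 4.67
}
```

The same overwrite-and-adjust pattern generalizes to rolling min/max, rate limiting, and windowed anomaly checks without ever pausing the stream.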
5. Data Synchronization
Data that is streamed and processed at the edge must eventually make its way to a data center for the business to analyze and understand. This data is not information that gets copied over like a glacier and put on ice, but data that flows back and forth like hot water. Railroad crossing event data synchronizes to the cloud every minute, while activation failures are synced in real time. Action information produced by back-office systems must flow back to the edge every half hour, while new models to predict failure must move on demand.
This data transfer requirement creates a challenge, first, of efficiently defining rules to securely move data from location to location. Next, it creates tremendous complexity at a lower level: reconciliation. The challenge of keeping a single source of truth in two locations has existed for many years. At the edge it is made more difficult by the possibility that internal clocks are set to different times and that the edge computer may not be connected all the time.
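One hedged way to sidestep the clock problem is to reconcile on per-record monotonic version counters rather than wall-clock timestamps, so skewed clocks cannot reorder writes. The `Record` shape and the last-writer-wins `merge` below are illustrative, not ClearBlade's sync protocol:

```go
// Clock-skew-safe reconciliation: highest version counter wins per key.
package main

import "fmt"

// Record is a hypothetical synced row. Because edge and cloud clocks may
// disagree, reconciliation uses a monotonic version counter incremented by
// whichever side writes, never a wall-clock timestamp.
type Record struct {
	Key     string
	Value   string
	Version uint64
}

// merge reconciles two replicas: for each key, the higher version wins.
// Ties keep the local copy; a real system needs a deterministic tiebreaker
// (e.g. a replica ID) so both sides converge to the same answer.
func merge(local, remote map[string]Record) map[string]Record {
	out := make(map[string]Record, len(local))
	for k, v := range local {
		out[k] = v
	}
	for k, r := range remote {
		if cur, ok := out[k]; !ok || r.Version > cur.Version {
			out[k] = r
		}
	}
	return out
}

func main() {
	edge := map[string]Record{
		"crossing-42": {"crossing-42", "gate-down", 7},
	}
	cloud := map[string]Record{
		"crossing-42": {"crossing-42", "gate-up", 5}, // stale write
	}
	fmt.Println(merge(edge, cloud)["crossing-42"].Value) // prints "gate-down"
}
```

Because `merge` ignores wall time entirely, an edge box with a drifted clock still reconciles correctly after reconnecting.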
ClearBlade puts heavy emphasis on Edge engineers who are intimately familiar with this challenge and who solve it in a way that makes it easy for end users to put policies in place that get data when and where they need it.
6. Device Security
In a world where every system is a potential victim of cyber attack, it is critically important to dedicate a portion of the edge engineering team to security. These security principles run the breadth of best practices: authenticating and authorizing actors in the system, encrypting data in motion and at rest, and establishing trust between hardware, binaries, and management applications. Engineers must understand how network security is architected, ensuring that edge connections are opened outbound to the server.
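As one concrete instance of establishing trust between a device and a server, the sketch below signs payloads with HMAC-SHA256 over a shared secret so the receiving side can verify provenance and integrity. The key handling is deliberately simplified and is an assumption, not ClearBlade's security design; a production key belongs in a secure element or at least an encrypted store:

```go
// Payload authentication with HMAC-SHA256 and constant-time verification.
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// sign produces an HMAC-SHA256 tag over a message with a shared secret,
// letting a server check that a payload came from a provisioned device.
func sign(key, msg []byte) string {
	mac := hmac.New(sha256.New, key)
	mac.Write(msg)
	return hex.EncodeToString(mac.Sum(nil))
}

// verify recomputes the tag and compares with hmac.Equal, which runs in
// constant time to avoid leaking information through timing.
func verify(key, msg []byte, tag string) bool {
	expected := sign(key, msg)
	return hmac.Equal([]byte(expected), []byte(tag))
}

func main() {
	key := []byte("device-shared-secret") // assumed pre-provisioned key
	msg := []byte(`{"event":"activation-failure"}`)
	tag := sign(key, msg)
	fmt.Println(verify(key, msg, tag))                // prints "true"
	fmt.Println(verify(key, []byte("tampered"), tag)) // prints "false"
}
```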
Just as importantly, ClearBlade edge engineers have a passion for finding each other’s vulnerabilities, no matter how small or obscure.
At ClearBlade we have been working for many years to build the best edge platform on the market today. We have done it the hard way – not by simply wiring together open source components or attempting to snap together cloud-service Lego bricks. Instead, the edge has been built from the ground up with the core skills to achieve the most critical elements of edge solutions: running anywhere, communicating with anything, and doing it faster than everyone else. We are confident the ClearBlade Edge is the best and most complete edge technology on the market today, and we welcome the opportunity to be compared head-to-head against anybody.
Thank you again to Compass Intelligence for the recognition and thank you to the talented engineering team that has poured so much time and thought into the ClearBlade Edge Platform.