Since Internet of Things technology started to gain mainstream traction, multiple platforms, solutions and strategies have been developed. At the moment there are more than 450 ‘platforms’ commercially available. Yet, realistically speaking, most of these have been designed for a very specific function, on outdated technology, and mostly down a vertical application path.
Similarly, gateway players have developed powerful gateway technology, much of which does little more than generically aggregate data to the cloud.
Why? Well, historically, technology companies argued that the best way to quickly create commercial value was to develop a strong vertically integrated application encompassing an ecosystem of partners.
The quickest way to show value was to focus on a vertical and go after it. We have a different view.
The true power and differentiator in IoT.nxt resides in our full IoT stack capability encompassing the edge and the cloud.
From the outset, our thinking has been to develop tech that creates horizontal interoperability between multiple systems and platforms in a technology-agnostic manner.
In 2002, it was all about the cloud. Amazon Web Services launched that year and, when OPC Unified Architecture was released in 2006, enabling secure communication between devices, data sources and applications, adoption of IoT began to rise. The early adopters developed their projects with the cloud in mind. The thinking was a simple connected mindset: billions of sensors would be deployed, and supercomputers could easily be spun up at low cost in the cloud to process all of this valuable Big Data… how could they go wrong?
During the dot-com era, people ran around with amazing ideas that they thought would take over the world once mass adoption took place. This was followed by an implosion that saw a huge number of concepts, ideas and investments disappear. A similar trend is developing in the adoption of IoT, and in digitalisation in general.
Part of the demise was that these companies were too early on the adoption curve: they either ran out of cash, were unable to build what they said they could, or watched newer, sexier, more agile technology carry competitors closer to adoption. The dot-com bust was a rationalisation and a reality check for companies and their investors, with fortunes made and lost in the hype. Timing is key in Big Tech. If you’re too early, you are potentially busy developing a concept that will not only age quickly but give competitors plenty to learn from and piggyback on, allowing them to develop better tech that is more relevant and value driven. Often a cool idea is exactly that – a cool idea; without real substance it doesn’t achieve wide commercial adoption. Commercial viability ultimately rests on a product’s ability to produce ‘real value’, whether quantitative or qualitative.
And then there are the guys who make it.
Amazon, Alibaba, Google. They were unprofitable for a number of years before they started to bear fruit, simply because they played the long game. They saw past the hype and created products of real value. They made sure they would be relevant in future economies.
The importance of timing
Timing is everything, and tech is hard to time
We entered this market at the perfect time. Two years in, our solution is strong and businesses at enterprise level are rallying to adopt Big Data technology. They’re embracing VR, AR, AI, and cognitive and algorithmic machine learning technologies as they become a reality.
As irrelevant solutions are weeded out, the IoT.nxt approach to the problem of IoT is making us a major contender, cementing our position in the market.
If we look at the solutions currently available, we understand better than most that these ‘platforms’ have all been built in the cloud. Five years ago everything was in the cloud, so it is unsurprising that the cloud still dominates IT discussions.
Anyone who has, up until this point, embarked on an IoT initiative has probably:
- built a solution that resides in the cloud;
- leveraged the power of the cloud and its ability to centralise processing power in the supercomputers that exist there; and
- adopted a top-down approach, with the cloud as the central power behind the application.
The competitive landscape
Looking at the IoT industry and where the ‘competition’ and ‘incumbents’ are in the current IoT cycle, it is evident that IoT development is in a perfect bubble that I believe is not far from rationalisation. I think it will be less severe than 2000, as investors have been more calculated, but there will certainly be a correction in the not-so-distant future. Driving my belief is that you need this type of event for eminence to be created. People need to start understanding where the true value lies. The companies that can lock into the IoT business value proposition and convert it into investor value will survive and gain eminence. There are a number of great technologies and concepts available, but only the ones able to truly unlock value will remain.
What sets us apart
The IoT.nxt approach has been somewhat different, defying the norm and, to date, it is my firm belief that ours is the only company with this unique approach. Addressing the problems of interconnectivity from the bottom up, our solution acknowledges the power of the cloud and Big Data, but also recognises that this power is greatly diminished, or even nullified, if the edge layer is not correctly managed.
Our definition of interoperability and data orchestration is, at times, diluted by platform players claiming to provide the same. They don’t.
The general platform interoperability discussion speaks to cloud interoperability. This is a hugely complex play that causes massive headaches for some of the most influential players as they try to fathom how to seamlessly integrate multiple platforms. APIs are the talk of the day, touted as the solution to this dilemma, but they are simply not sustainable or practical. On a whiteboard it might look great to have several platforms integrated via APIs and then plugged into some ESB via microservices, but I challenge you to construct all of that while taking into consideration the small part all of these players initially deemed unnecessary – the edge.
This methodology is hugely reliant on smart sensor technology that has the ability to push data into the cloud. There’s a heavy reliance on networks and, as a result, ‘platforms’ are struggling to grapple with edge technology, all the while hopeful that a 5G, no, 20G network will resolve this problem.
At almost all the international conferences we have attended in the last 24 months, the major discussion has been Big Data and smart sensors, so most of the more mature platforms have been designed around the premise of receiving data directly from the sensor. The problem now is how to talk back to the sensor or machine and, more importantly, how to do this across platforms. An even bigger issue creeping to the forefront of discussions is ecosystems in which near real-time data feeds are crucial.
Yet still, the focus is on the cloud, and understandably so, especially if you have invested millions in a technology that is reliant on the cloud. We do not share that view.
For some time now we’ve been saying that the edge is eating the cloud.
We’re not implying that the cloud will lose relevance. What we’re saying is that a true IoT ecosystem will become less and less reliant on the cloud and, in fact, that ecosystem design will rely heavily on edge capabilities.
A natural oversight, but a crucial detail destined to form an integral part of this industry’s ability to commercialise in the near future: the IoT industry is inhibited by an inability to create interconnectivity and interoperability at the edge.
Retrofit, decrease the barrier to entry and sweat the assets
Correctly designed and engineered, edge technology enables edge interoperability and, more importantly, the ability to retrofit into legacy systems. Legacy systems have, to a large extent, been disregarded, with current players relying on the ‘rip and replace’ mentality that has governed and, to a degree, plagued the IT industry since the beginning, befuddling brands that have become household names.
This winner-takes-all mentality is not congruent with the ideation of a connected world and certainly does not embrace the concept of true scalability. Having to rip out and replace existing technology and infrastructure on your journey towards digitalisation introduces a huge amount of additional complexity, disruption and cost, all of which makes it a difficult sale to the business, contributing to the slow adoption rate of the 4th Industrial Revolution.
So whilst the ‘big dogs’ are all trying to figure out how to develop and ensure technology lock-in to secure future revenue, they are contributing to the mixed message being sent to the market, diluting the value of IoT technology as a tool for unlocking real business value.
Value is a simple exercise for any business leader: look at expenditure, then ROI. Satisfied? Great. Here’s the next question – is it relevant to my business?
All data is not the answer.
When we enter into discussions with big companies, the issue of legacy investments in technology at the edge comes up without fail. Remember that everyone is selling some type of cloud platform that is going to ‘change the business’, but that cloud engine is reliant on edge data, i.e. devices, sensors, machines, protocols, PLCs, SCADAs, CCTV, access control systems – the list goes on and on. Clients start negotiating with each vendor and realise that, much like a 1,000-piece holiday puzzle with one missing piece, a single gap can ruin the picture and make the whole exercise seem futile. It’s the same with many of the algorithms and predictive applications – the true power of these platforms lies in their ability to provide companies with insights. For this they are 100% dependent on having the correct, filtered, aggregated, curated, secure, real-time data from the edge, and they need all the pieces of the data puzzle to build the Big Data picture.
In every environment, on every piece of the puzzle, there is information that is critical to the task at hand, and other information that isn’t needed in real time – things like whether a device needs to be serviced in a week’s time, or whether stock will be depleted by the end of the month. Now consider a sensor with a fixed normal range that records exceptions only, rather than all data all the time: as a basic statistic, you can reduce the amount of data passed in real-time monitoring environments by around 60% to 90%.
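The exception-reporting idea above can be sketched in a few lines. This is a minimal illustration, not IoT.nxt’s actual implementation: a filter with an assumed fixed normal range and a deadband that only forwards readings that leave the range or move by more than the deadband since the last forwarded value.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeadbandFilter:
    """Forward a reading only when it falls outside the normal range
    or drifts more than `deadband` from the last forwarded value."""
    low: float        # bottom of the normal range (assumed, per sensor)
    high: float       # top of the normal range
    deadband: float   # minimum change worth reporting
    _last_sent: Optional[float] = None

    def should_forward(self, value: float) -> bool:
        out_of_range = value < self.low or value > self.high
        changed = (
            self._last_sent is None
            or abs(value - self._last_sent) > self.deadband
        )
        if out_of_range or changed:
            self._last_sent = value
            return True
        return False

# A steady, in-range signal generates almost no upstream traffic.
f = DeadbandFilter(low=10.0, high=30.0, deadband=0.5)
readings = [20.0, 20.1, 20.2, 20.1, 35.0, 20.0]
forwarded = [r for r in readings if f.should_forward(r)]
# Only the first reading, the out-of-range spike, and the recovery
# are forwarded: [20.0, 35.0, 20.0]
```

Six raw readings collapse to three forwarded ones here; on a long, stable signal the reduction approaches the 60% to 90% figure quoted above.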
We are throwing away the rule book.
While the rest of the industry scrambles to figure out how to showcase the exponential value of IoT whilst also attempting to lock clients in to their technology stack, we’re taking the IoT rule book and throwing it out of the window.
We don’t care what technology our clients have now, or what technology they will have in five years’ time. We don’t talk about vendors; we talk protocols. We’re driving our clients to get to Big Data quicker, using what they have, thanks to our trademarked Raptor.
Raptor technology is the missing link in most of the discussions around digitalisation. A normalised, edge layer of physical and virtual intelligence that can be retrofitted, deployed and connected seamlessly into an ecosystem of existing technologies and things, radically reducing the cost and time of having to develop multiple edge integrations into disparate cloud applications.
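As a rough illustration of what such a normalised edge layer does (the adapter names, payload fields and schema here are hypothetical, not the actual Raptor design), one adapter per legacy protocol can translate disparate payloads into a single common reading format before anything reaches the cloud:

```python
from datetime import datetime, timezone
from typing import Any, Callable, Dict

def normalised(device_id: str, metric: str, value: float) -> Dict[str, Any]:
    """Common schema every edge reading is normalised into (illustrative)."""
    return {
        "device_id": device_id,
        "metric": metric,
        "value": value,
        "ts": datetime.now(timezone.utc).isoformat(),
    }

# One small adapter per protocol; the cloud only ever sees one format,
# so no per-vendor integration is needed upstream.
ADAPTERS: Dict[str, Callable[[dict], Dict[str, Any]]] = {
    "modbus": lambda raw: normalised(raw["unit"], raw["register"], raw["val"]),
    "mqtt":   lambda raw: normalised(raw["client"], raw["topic"], raw["payload"]),
}

def ingest(protocol: str, raw: dict) -> Dict[str, Any]:
    """Route a raw payload through the adapter for its protocol."""
    return ADAPTERS[protocol](raw)

reading = ingest("modbus", {"unit": "pump-7", "register": "temp_c", "val": 61.3})
```

Adding support for another legacy device then means writing one more adapter at the edge, rather than another point-to-point integration into each cloud application.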
The IoT.nxt Power-play
Being able to retrofit onto all deployed devices, whether analog or IP based, has huge benefits. It:
- reduces disruption to business processes;
- reduces the cost of implementation;
- reduces the cost of training;
- reduces the cost and impact of enterprise-wide change management;
- reduces vulnerability and cyber risk, because there is less technology disparity at the edge;
- reduces data moving across the network, which in turn reduces network cost and congestion;
- reduces processing required at the cloud platform level, as the data has already been curated at the edge;
- reduces the cost of maintaining edge-integrated gateways;
- presents a smaller attack surface at the edge, as the gateways are rationalised; and
- simplifies real-time subsystem integration.
All of this allows us to better leverage the power of our cloud platform, as we can now understand the upstream and downstream effects of an event-triggered occurrence and effect dynamic, seamless recalibration and interoperability across ALL edge-connected devices.
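The event-triggered recalibration across edge-connected devices described above can be sketched as a tiny publish/subscribe bus at the gateway. This is a hypothetical illustration of the pattern, not the actual product; the event and device names are invented.

```python
from collections import defaultdict
from typing import Callable, DefaultDict, Dict, List

class EdgeBus:
    """Minimal event bus: an event from one subsystem triggers
    recalibration handlers on every subscribed edge device."""

    def __init__(self) -> None:
        self._subs: DefaultDict[str, List[Callable[[Dict], None]]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable[[Dict], None]) -> None:
        self._subs[event].append(handler)

    def publish(self, event: str, payload: Dict) -> None:
        # Fan the event out to all interested devices at the edge,
        # without a round trip to the cloud.
        for handler in self._subs[event]:
            handler(payload)

bus = EdgeBus()
actions: List[str] = []

# Two downstream devices react to the same upstream event.
bus.subscribe("temp.high", lambda p: actions.append(f"fan-1 speed -> {p['target']}"))
bus.subscribe("temp.high", lambda p: actions.append("hvac-2 setpoint lowered"))

# One event-triggered occurrence recalibrates both devices.
bus.publish("temp.high", {"target": "max"})
```

Because the fan-out happens at the gateway, the subsystems recalibrate in near real time even if the cloud link is slow or down; the cloud still receives the curated event for the big picture.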
We ensure that all the pieces of the puzzle are in the box, and ready to be pieced together to create the big picture.
Normalisation of data at the edge gateway layer forms the foundation for rapid digitalisation and digital transformation. The disruption that everyone talks about is vested in an organisation’s ability to continue its business while iteratively and rapidly starting to address the core issues within that business through digitalisation. This leads to more visibility on a real-time basis, allowing dynamic recalibration back into the business ecosystem to achieve optimised levels of production and efficiency that bring about change and new ways of doing the same thing, better.
Peer-to-peer intelligence and learning will further drive this thinking – Raptor thinking – making us even more relevant as the necessity to drive edge analytics and decision making in critical business environments nullifies the cloud. Are you with me?
If you control the edge, you unlock the cloud – a bottom-up approach.