
CES 2018 Tech Trends: Decision Making on the Edge

In last week’s post, CES 2018 Takeaways: Focus on Four Product Trends, Chris Murray and I wrote about the products we saw at CES and mentioned that we were particularly impressed by the explosion in the number of robots that can act independently.

In this post I’ll explore the technology trends we saw at CES 2018 that are behind that intelligent behavior. Three primary developments made this year’s bots cannier than last year’s: neural nets, the migration of automotive sensor technology, and cross-platform communication.

1. Edgy Thinking with Edge Computing

Neural nets are computing systems loosely modeled on the human nervous system that learn decision rules for complex problems by observing examples and the outcomes they produce. They enable devices to make computationally intensive decisions quickly and locally, instead of having to reach back to larger computing resources in the cloud.

This way, cloud resources are used only as needed and available, and more of the work moves to the connected device itself. Also known as edge computing, this approach makes it possible for a drone to decide to fly left or right to avoid an obstacle, or to work out what ‘remain on station’ means in a changing landscape, literally on the fly.
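To make that concrete, here’s a minimal Python sketch of what an on-device decision might look like, using TensorFlow Lite for inference. The model file, input shape, and action labels are placeholders we’ve assumed for illustration; the point is that the whole decision loop runs locally, with no round trip to the cloud.

```python
# Minimal sketch: on-device obstacle avoidance with a pre-trained neural
# net, run locally via TensorFlow Lite. The model file, input size, and
# action labels below are hypothetical placeholders.
import numpy as np
import tensorflow as tf

ACTIONS = ["fly_left", "fly_right", "hold_position"]  # assumed model outputs

interpreter = tf.lite.Interpreter(model_path="obstacle_net.tflite")  # assumed model file
interpreter.allocate_tensors()
input_index = interpreter.get_input_details()[0]["index"]
output_index = interpreter.get_output_details()[0]["index"]

def decide(camera_frame: np.ndarray) -> str:
    """Run one inference on the device and return the chosen maneuver.

    No cloud round trip: the decision still happens if connectivity drops.
    """
    # We assume the model takes a single float32 image with a batch dimension.
    frame = camera_frame.astype(np.float32)[np.newaxis, ...]
    interpreter.set_tensor(input_index, frame)
    interpreter.invoke()
    scores = interpreter.get_tensor(output_index)[0]
    return ACTIONS[int(np.argmax(scores))]
```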


Neural net software has been available for a long time, but it has required too much memory and power to run on mobile devices. Only recently have mid-range microcontrollers had the horsepower and resources to realistically run it. Some of the first devices to adopt the technology were autonomous vehicles, which by definition need to make very complex decisions independently.

But since NVIDIA launched its AI and neural net computing platforms last year, applications have been sprouting up in other fields such as industrial controls, consumer drones, security, video analytics, and home automation. Most processor suppliers we spoke with are now offering and showcasing neural net development resources and libraries, which they didn’t have last year. Pictured above is a demo at CES 2018 of NXP Semiconductors’ Edge Compute Experience.

2. Sensors From Automotive Driving Us Forward

The second major trend we saw was sensors and technologies that debuted in the automotive sector migrating across the broader landscape of devices. Collision avoidance, for example, is now showing up in drones, logistics robots, and home robots, helping an unmanned forklift navigate safely across a warehouse or a hospitality bot work its way through a crowded lobby.


Although it sounds simple, collision avoidance is not a trivial problem: it requires the integration of several different types of sensors, such as radar and lidar to ‘see’ objects, magnetometers for direction, accelerometers and gyroscopes for motion, and time-of-flight and Doppler measurements for distance and speed. Adding neural net processing to this kind of sensor fusion makes it work far more effectively, and carrying that sort of processing power around in a drone or small robot is new.
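To give a flavor of what sensor fusion involves, here’s a deliberately simple Python sketch of one classic building block: a complementary filter that blends a gyroscope (accurate over short intervals, but drifts) with an accelerometer (noisy, but drift-free) into a single pitch estimate. The sample rate and blend factor are assumptions for illustration; a real drone or robot fuses many more inputs, radar, lidar, and time-of-flight among them, often with a learned model on top.

```python
import math

# Minimal sensor-fusion sketch: a complementary filter blending a
# gyroscope (good short-term, but drifts) with an accelerometer
# (noisy, but drift-free) into one pitch estimate.

DT = 0.01      # assumed 100 Hz sensor loop
ALPHA = 0.98   # trust the gyro for fast motion, the accelerometer long-term

class PitchEstimator:
    def __init__(self) -> None:
        self.pitch_deg = 0.0

    def update(self, gyro_rate_dps: float, accel_x_g: float, accel_z_g: float) -> float:
        # Integrate the gyro rate for the short-term estimate.
        gyro_pitch = self.pitch_deg + gyro_rate_dps * DT
        # Derive an absolute (but noisy) pitch from the gravity direction.
        accel_pitch = math.degrees(math.atan2(accel_x_g, accel_z_g))
        # Blend: high-pass the gyro, low-pass the accelerometer.
        self.pitch_deg = ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch
        return self.pitch_deg
```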

One thing we should mention: no one came right out and told us that their system uses neural nets for sensor fusion and decision making. But the complex behaviors on display from the autonomous vehicles, hospitality robots, logistics drones, and other applications at CES are most likely achieved with this kind of development methodology.

[Image: NuViz helmet-mounted head-up display at CES 2018]

We also noticed augmented reality (AR) taking hold in the auto industry. Head-up displays projected onto windshields to help navigate, provide status, or point out potential hazards were everywhere. NuViz’s helmet-mounted head-up display (HUD) for motorcycle riders (pictured above) is a good example that’s available today, as are the Vuzix Blade augmented reality smart glasses, and WayRay’s in-car AR head-up display is projected to come to Honda vehicles soon.

[Image: HiMirror Mini at CES 2018]

Producing radar and lidar sensors at automotive volumes drives their cost down and helps create new applications. We saw AR/VR with good 3D scanning migrate outside the auto industry this year, particularly into robotics, fashion, sport, and gaming. For example, the HiMirror Mini (pictured above), which won a CES 2018 Innovation Award, lets you try on makeup virtually. Kodak’s Full Body 3D Scanner with Twindom and Artec’s rapid prototyping solutions, such as its handheld Eva 3D scanner, show where some of these applications are heading.

3. Growing Cross-Platform Communication in IoT

The third trend we see growing is cross-platform communication in IoT. It has been a long time coming, and it’s still one of the major barriers to a truly functional Internet of Things. Until Alexa can talk to Google Home, industrial sensors can talk to common controllers, and generic device-integration applications exist, consumers will have to pick a side. And no one wants to be locked into a single ecosystem, however much each player in the field wants overarching control.


But the signs were encouraging. Unsurprisingly, the auto industry is ahead here: there were standards for mesh wireless communication between vehicles, complete with data protocols, and new high-speed automotive Ethernet standards to carry the huge amounts of data that all this onboard technology generates. Sensors across the board have improved in performance and come down in price.
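As a rough illustration of the kind of message a vehicle-to-vehicle mesh exchanges, here’s a hypothetical Python sketch that packs a vehicle’s position, speed, and heading into a compact payload and broadcasts it over UDP. The field layout, port, and update rate are invented for this example and don’t follow any particular automotive standard.

```python
import socket
import struct

# Hypothetical sketch of a periodic vehicle status broadcast: who I am,
# where I am, how fast, and which way I'm heading. The field layout,
# port, and address are invented for illustration only.

BROADCAST_ADDR = ("255.255.255.255", 47808)   # assumed port
MSG_FORMAT = "!I d d f f"  # vehicle_id, latitude, longitude, speed_mps, heading_deg

def broadcast_status(vehicle_id: int, lat: float, lon: float,
                     speed_mps: float, heading_deg: float) -> None:
    payload = struct.pack(MSG_FORMAT, vehicle_id, lat, lon, speed_mps, heading_deg)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(payload, BROADCAST_ADDR)

# A real system would send something like this several times a second:
# broadcast_status(42, 36.1699, -115.1398, 13.4, 270.0)
```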

A few providers are starting to offer a communications meta-service across multiple ecosystems. There are now applications that bridge Apple HomeKit, Alexa, and Google Home. They aren’t mainstream yet, but they offer a jumping-off point toward a broader industry, integrating standards and providing more choice, and more applications, for IoT devices to a wider audience. IFTTT is an example of a company focused on this sort of integration.
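As a sketch of what such a meta-service does, here’s a hypothetical Python bridge that takes one generic command and routes it to whichever ecosystem a device actually lives on. The device registry and handler stubs are invented for illustration; real bridges like IFTTT work through each platform’s official APIs and account linking.

```python
# Hypothetical cross-ecosystem "meta-service": one generic command fans
# out to whichever platform a device belongs to. Registry contents and
# handler bodies are invented for illustration.
from typing import Callable, Dict, Tuple

def send_to_homekit(device_id: str, on: bool) -> None:
    print(f"[HomeKit] set {device_id} power={'on' if on else 'off'}")

def send_to_alexa(device_id: str, on: bool) -> None:
    print(f"[Alexa] set {device_id} power={'on' if on else 'off'}")

def send_to_google_home(device_id: str, on: bool) -> None:
    print(f"[Google Home] set {device_id} power={'on' if on else 'off'}")

HANDLERS: Dict[str, Callable[[str, bool], None]] = {
    "homekit": send_to_homekit,
    "alexa": send_to_alexa,
    "google_home": send_to_google_home,
}

# Assumed device registry: generic name -> (ecosystem, platform-specific id)
REGISTRY: Dict[str, Tuple[str, str]] = {
    "living_room_light": ("homekit", "light.1234"),
    "porch_light": ("google_home", "devices/abcd"),
}

def set_power(device_name: str, on: bool) -> None:
    ecosystem, device_id = REGISTRY[device_name]
    HANDLERS[ecosystem](device_id, on)

set_power("living_room_light", True)
```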

What It All Means About What’s Next

From what we saw at CES, we’re looking forward to seeing neural nets and sensor integration add resilience and nuance to the IoT. If the cloud goes down, a neural net-equipped device won’t freeze up or go silent the way Alexa does when the Wi-Fi drops. Utility drones with intelligent decision making on board might be able to crawl through underground infrastructure and make repairs without needing a radio signal.


Cross-platform communication is still in its infancy, but we’re curious to see how it shakes out. The communications standard and integration platform that wins might not come from a household name. Remember that Google entered a market full of established search engines and portals as a small newcomer and only later became the dominant player. We could see a similar dynamic here: a small player arriving with a communication meta-service and eventually becoming the dominant provider in the industry.

Catch up on our trendspotting posts from CES.