Written by Raffi Jabrayan, Exyn Technologies
As any operator on any job site can tell you, dust gets into everything. It breaks down parts, messes with sensors, and is ever-present even after you’ve tried everything in your power to get rid of it. And now, as 3D mapping drones make their way onto construction sites and other industrial environments, you can bet their propellers will kick up quite a bit of dust too, causing problems for onboard navigational sensors.

Or – at least it used to.
Robots equipped with a Nexys navigate autonomously using an online SLAM algorithm (you can read all about how SLAM works here). Essentially, if too many dust particles are picked up by the LiDAR sensor, SLAM will interpret the cloud as a solid object, causing the robot to become ‘stuck’ in it and waste battery cycles – or worse, crash.
For years this has been a major hurdle for SLAM-powered navigational platforms. Researchers have been working to help these platforms ‘see’ dust in real time, understand what it is, and navigate around it. However, when you tune the LiDAR data aggressively enough to filter out the dust, the robot then has a hard time seeing thin wires and smaller obstacles, which is potentially catastrophic.
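To make that tradeoff concrete, here’s a minimal sketch (in Python, not Exyn’s implementation) of the kind of naive per-point filtering described above. The threshold value, and the assumption that dust shows up as weak, low-intensity returns, are illustrative only.

```python
import numpy as np

def naive_dust_filter(points: np.ndarray, intensity: np.ndarray,
                      min_intensity: float = 10.0) -> np.ndarray:
    """Drop LiDAR returns whose intensity falls below a fixed threshold.

    Illustrative assumption: dust produces weak, diffuse returns, so low
    intensity is used here as a crude stand-in for 'probably dust'.
    """
    return points[intensity >= min_intensity]

# The catch: thin wires, cables, and mesh also tend to produce weak returns.
# Tighten the threshold enough to erase a dust cloud and you will often erase
# those small but very real obstacles from the map as well, which is exactly
# the failure mode described above.
```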
Exyn Technologies has developed a proprietary machine learning algorithm that can accurately detect dust clouds in real time and adjust automatically to ensure the area is mapped safely and accurately. And this Autonomy Level 4B capability (read more about the standardized levels of autonomy here) is now baked into every autonomy-enabled Nexys for underground UAV operators.
What is a dust filter for autonomous navigation?
You can think of a dust filter as a software version of polarized sunglasses you might wear during a road trip.
When you drive toward the setting sun, glare can obscure the fine details of the road in front of you. Road signs and even other cars can partially disappear into it. A dust cloud has a similar effect on how SLAM interprets LiDAR data.
In a car, you would put on a pair of polarized sunglasses and suddenly those obscured details would become visible, giving you a clear image of the road ahead.
A dust filter for a LiDAR system is similar, but instead of a physical lens like a pair of sunglasses, it’s a software algorithm that removes the ‘glare’ by enabling ExynAI – the package of proprietary SLAM algorithms that power Nexys’ autonomy and mapping – to sort millions of LiDAR data points into real objects and mere dust particles.
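In software terms, those ‘sunglasses’ sit between the raw LiDAR stream and the mapping pipeline. Here’s a rough, purely illustrative sketch of that data flow; looks_like_dust is a placeholder for the classification step, not Exyn’s API.

```python
import numpy as np

def looks_like_dust(points: np.ndarray) -> np.ndarray:
    """Placeholder for the per-point dust classification step."""
    return np.zeros(len(points), dtype=bool)   # stub: flags nothing as dust

def process_scan(points: np.ndarray) -> np.ndarray:
    """Split each incoming scan into dust and real geometry, and hand only
    the real geometry to SLAM and mapping: the software equivalent of
    putting the sunglasses on before you look at the road."""
    is_dust = looks_like_dust(points)
    return points[~is_dust]   # what the mapping pipeline actually sees
```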
However, not all dust filters are created equal, and the new ExynAI dust filter has a few tricks up its sleeve that allow you to conduct digital mapping in more challenging environments.
Advanced online dust filtering with Nexys
Exyn understood that a real-time dust filtering algorithm couldn’t consume a ton of computational resources – otherwise it would bog down autonomous missions. It needed to work quickly, let the robot navigate confidently, and, most importantly, not affect the accuracy of the point cloud once captured and processed.
To get started, ‘Exyneers’ gathered years of previous flights in dusty environments and fed that data into a machine learning pipeline to teach ExynAI what dust looks like. This historical flight data was used to train a model that can detect dust in real time.
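Exyn hasn’t published the details of that model, but as a rough illustration of the idea, training a per-point dust classifier on labeled flight data could look something like the sketch below. The synthetic data, the feature choices, and the use of a small random forest are all assumptions made for this example, not Exyn’s actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-in for labeled data mined from past flights in dusty environments.
# Each row is one LiDAR return described by simple per-point features
# (e.g. range, intensity, local neighbor count). Labels mark returns that
# were identified offline as dust. All of this is synthetic for the sketch.
n = 5000
features = rng.normal(size=(n, 3))
labels = (features[:, 1] < -0.5).astype(int)   # pretend weak intensity means dust

# A deliberately small, fast model: the classifier has to run onboard in
# real time, so capacity is traded away for speed.
model = RandomForestClassifier(n_estimators=25, max_depth=8, n_jobs=-1)
model.fit(features, labels)

# At inference time, each incoming scan is featurized the same way and
# scored, giving a per-point "probably dust" probability.
dust_probability = model.predict_proba(features[:100])[:, 1]
print(dust_probability[:5])
```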
The resulting model is efficient enough to run online, allowing real-time dust filtering during autonomous exploration of mines and other hazardous environments that would pose significant safety and accuracy issues for less capable dust filtering systems.
How the ExynAI dust filter works
The ExynAI dust filter works in three stages to make it ultra-efficient while also providing the accuracy needed for autonomous exploration in difficult environments. (A simplified code sketch of the whole pipeline follows the three stages below.)
- Stage one: As ExynAI ingests LiDAR data during flight, it broadly flags points that might be dust based on simple metrics. This is a fairly standard approach among other navigational systems, and researchers can tune this stage to better detect dust in novel environments, but on its own it still filters out too much for the robot to navigate safely.
At this point the robot knows what a dust particle might look like, but it doesn’t yet know what the shape of a dust cloud looks like. And that’s where stage two comes in.
- Stage two: You can think of these stages like sifting sand through a screen. Stage one was broad and pulled in a lot of potential dust particles; now, in stage two, the real-time data is fed through a machine learning model developed by Exyn that shows the robot what the “shape” of dust looks like.
This is the crucial step, because without this “shape” ExynAI would filter out thin wires or other small features, which could cause a crash or leave the robot stuck. This has been a historic hurdle for LiDAR-based SLAM platforms operating autonomously in dusty environments.
- Stage three: The last stage is what lets the robot do all of this immense calculation in the middle of an autonomous flight. While in flight, ExynAI interprets point cloud data through an occupancy mapping pipeline, which converts groups of points into 3D boxes, or voxels (like Minecraft bricks), to define obstacles and help the robot determine a safe flight corridor.
This adds further robustness to Nexys’ autonomy, because the occupancy pipeline runs thousands of times per second to determine each voxel classification. That helps eliminate spurious dust misdetections, which can cause a robot to get stuck in place.
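Putting the three stages together, a heavily simplified version of such a pipeline might be structured like the sketch below. The thresholds, voxel size, toy ‘shape’ model, and function names are all assumptions made for illustration; none of this is Exyn’s implementation.

```python
import numpy as np

VOXEL_SIZE = 0.25      # metres per voxel edge (illustrative)
OCCUPIED_HITS = 3      # returns needed before a voxel counts as solid (illustrative)

def stage_one_candidates(points, intensity, max_intensity=15.0, max_range=20.0):
    """Stage one: broad, cheap screening. Flag returns that *might* be dust
    using simple per-point metrics (here, weak intensity at close range)."""
    ranges = np.linalg.norm(points, axis=1)
    return (intensity < max_intensity) & (ranges < max_range)

def stage_two_is_dust(points, candidates, model):
    """Stage two: only the stage-one candidates go to a learned model that
    looks at the local 'shape' of the returns, so thin wires and other small
    real features can survive the filter."""
    is_dust = np.zeros(len(points), dtype=bool)
    if candidates.any():
        is_dust[candidates] = model.predict(points[candidates]).astype(bool)
    return is_dust

def stage_three_occupancy(points, is_dust):
    """Stage three: the surviving returns feed an occupancy map. Points are
    binned into voxels, and a voxel only counts as an obstacle after enough
    hits, which suppresses stray dust points that slipped through."""
    solid = points[~is_dust]
    voxels = np.floor(solid / VOXEL_SIZE).astype(int)
    keys, counts = np.unique(voxels, axis=0, return_counts=True)
    return {tuple(k) for k, c in zip(keys, counts) if c >= OCCUPIED_HITS}

class ToyShapeModel:
    """Placeholder for the learned stage-two classifier. A candidate is called
    dust when it has almost no close neighbors, a crude stand-in for the
    diffuse 'shape' of a dust cloud; a real model would learn richer features."""
    def predict(self, pts):
        dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        neighbors = (dists < 0.3).sum(axis=1) - 1   # exclude the point itself
        return (neighbors < 2).astype(int)          # sparse and isolated -> dust

# Toy scan: a solid wall of strong returns plus a loose puff of weak returns.
rng = np.random.default_rng(1)
wall = np.column_stack([np.full(200, 5.0), rng.uniform(-2, 2, 200), rng.uniform(0, 2, 200)])
puff = rng.normal([2.0, 0.0, 1.0], 0.6, size=(40, 3))
points = np.vstack([wall, puff])
intensity = np.concatenate([rng.uniform(30, 80, 200), rng.uniform(1, 10, 40)])

candidates = stage_one_candidates(points, intensity)
is_dust = stage_two_is_dust(points, candidates, ToyShapeModel())
occupied = stage_three_occupancy(points, is_dust)
print(f"{int(is_dust.sum())} returns filtered as dust, {len(occupied)} voxels kept as obstacles")
```

The shape of the sketch is the point: a cheap broad screen first, a smarter second look at only the suspicious points, and a voxel-level vote that mops up whatever slips through.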
While testing this filter in the field and with Exyn customers, the team saw operators grow increasingly confident sending Nexys on missions deeper into underground cavities, well beyond visual line of sight, knowing the robot would return safely. This level of confidence in autonomous robots in rugged environments will help drive further adoption once surveying professionals experience the reliability and robustness firsthand.

Experience Autonomy Level 4B
So, if you’re tired of dust determining what works and what doesn’t on your job site, the new advanced filtering included in Nexys autonomy could be the solution you’ve been waiting for.
Contact us today to learn more about the Nexys portable mapping and modular autonomy ecosystem and book a personalized demo for yourself and your team.

About Raffi Jabrayan
Raffi Jabrayan is the Vice President, Commercial Sales and Business Development for Exyn Technologies. He oversees the expansion of the business internationally in the mining and construction sectors, as well as penetration into other industries. A large part of his role at Exyn is to help miners leverage the data produced by Exyn’s autonomous aerial robots to streamline underground inspections, enhance operational efficiency, and reduce risk.
Prior to joining Exyn, Raffi managed digital and technology innovation projects for Dundee Precious Metals and was intimately involved with operationalizing new technologies into Dundee’s workflow. Raffi oversaw the scouting, due diligence, implementation, and post integration assessment of Dundee’s digital and technology projects.
Raffi is a seasoned mining professional with practical experience at both the plant and corporate level in various capacities and has completed the Digital Business Strategy Program at MIT Sloan as well as Driving Strategic Impact from Columbia Business School.

About Exyn Technologies
Exyn Technologies is pioneering multi-platform robotic autonomy for complex, GPS-denied environments. For the first time, industries like mining, logistics, and construction can benefit from a single, integrated solution to capture critical and time-sensitive data in a safer, more affordable, and more efficient way. Exyn is powered by a team of experts in autonomous systems, robotics, and industrial engineering, and has drawn talent from Penn’s world-renowned GRASP Laboratory as well as other storied research institutions. The company is VC-backed and privately held, with headquarters in Philadelphia.
For more information, please visit www.exyn.com or contact us through our website.