
Zebra Matrox Design Assistant X Software


Zebra Matrox Design Assistant X Software Model Overview
- Solve machine vision applications efficiently by constructing flowcharts instead of writing program code
- Choose the best platform for the job within a hardware-independent environment that supports Zebra smart cameras and vision controllers and third-party PCs with CoaXPress®, GigE Vision®, or USB3 Vision® cameras
- Tackle machine vision applications with utmost confidence using field-proven tools for analyzing, classifying, locating, measuring, reading, and verifying
- Leverage deep learning for visual inspection through image classification and segmentation tools
- Use a single program for creating both the application logic and operator interface
- Work with multiple cameras all within the same project or per project running concurrently and independently from one another, platform permitting
- Interface to Zebra AltiZ and third-party 3D sensors to visualize, process, and analyze depth maps and point clouds
- Rely on a common underlying vision library for the same results with a Zebra smart camera, vision system, or third-party computer
- Maximize productivity with instant feedback on image analysis and processing operations
- Receive immediate, pertinent assistance through an integrated contextual guide
- Communicate actions and results to other automation and enterprise equipment via discrete Zebra I/Os, RS-232, and Ethernet (TCP/IP, CC-Link IE Field Basic, EtherNet/IP™, Modbus®, OPC UA, and PROFINET®), as well as native robot interfaces
- Test communication with a programmable logic controller (PLC) using the built-in PLC interface emulator
- Maintain control and independence through the ability to create custom flowchart steps
- Increase productivity and reduce development costs with Vision Academy online and on-premises training
- Protect against inappropriate changes with the Project Change Validator tool
Flowchart-Based Vision Software
Aurora Design Assistant is an integrated development environment (IDE) for Microsoft Windows in which vision applications are created by constructing an intuitive flowchart instead of writing traditional program code. In addition to building a flowchart, the IDE enables users to design a graphical web-based operator interface for the application.
Application Design
Flowchart and operator interface design are done within the Aurora Design Assistant IDE hosted on a computer running 64-bit Windows. A flowchart is put together using a step-by-step approach, where each step is taken from an existing toolbox and is configured interactively. Inputs for a subsequent step—which can be images, 3D data, or alphanumeric results—are easily linked to the outputs of a previous step. Decision-making is performed using a flow-control step, where the logical expression is described interactively. Results from analysis and processing steps are immediately displayed to permit the quick tuning of parameters. A contextual guide provides assistance for every step in the flowchart. Flowchart legibility is maintained by grouping steps into sub-flowcharts. A recipes facility enables a group of analysis and processing steps to have different configurations for neatly handling variations of objects or features of interest within the same flowchart.
In addition to flowchart design, Aurora Design Assistant enables the creation of a custom, web-based operator interface to the application through an integrated HTML visual editor. Users alter an existing template using a choice of annotations (graphics and text), inputs (edit boxes, control buttons, and image markers), and outputs (original or derived results, status indicators, and charts).
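The step-by-step logic a flowchart expresses can be sketched in ordinary code. The Python sketch below is purely illustrative (the product builds this graphically, and none of these function names come from Aurora Design Assistant): each step consumes the output of a previous step, and a flow-control step branches on a logical expression.

```python
# Hypothetical sketch of flowchart logic, not the product's API.

def load_image():
    # Stand-in for a LoadImage/Camera step: a tiny binary "image".
    return [[0, 1, 1, 1, 0]]

def measure_width(image):
    # Stand-in for a Measurement step: width of the bright run.
    return sum(image[0])

def flow_control(width, low=2, high=4):
    # Stand-in for a flow-control step: a logical expression,
    # described interactively in the IDE, picks the next branch.
    return "pass_branch" if low <= width <= high else "fail_branch"

image = load_image()          # step 1
width = measure_width(image)  # step 2, input linked to step 1's output
branch = flow_control(width)  # step 3, decides which branch runs next
```

Linking outputs to inputs this way is exactly what the IDE does when a subsequent step's input is wired to a previous step's result.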
Deep Neural Networks for Classification and Segmentation
Aurora Design Assistant includes classification steps for automatically categorizing image content using machine learning. These steps make use of deep learning technology—specifically the convolutional neural network (CNN) and variants—in two distinct approaches. The first or global approach—implemented by the CNNClassIndex step—assigns images or image regions to pre-established classes. It lends itself to identification tasks where the goal is to distinguish between similar-looking objects, including those with slight imperfections. The results for each image or image region consist of the most likely class and a score for each class. The second or segmentation approach—implemented by the CNNClassMap step—generates maps indicating the pre-established class and score for all image pixels. It is appropriate for detection tasks where the objective is to determine the incidence, location, and extent of flaws or features. These features can then be further analyzed and measured using traditional tools like the BlobAnalysis step. These classification steps are particularly well suited for analyzing images of highly textured, naturally varying, and acceptably deformed goods in complex and varying scenes.
Users can opt to train a deep neural network on their own—using the included Aurora Imaging Library CoPilot application—or commission Zebra to do so using previously collected images that are both adequate in number and representative of the expected application conditions. Different types of training are supported, such as transfer learning and fine-tuning, all starting from one of the supplied pre-defined deep neural network architectures. Aurora Imaging Library CoPilot provides what is needed for building the required training dataset—including the labeling of images and augmenting the dataset with synthesized images—as well as monitoring and analyzing the training process. Training is accomplished using an NVIDIA GPU or x64-based CPU, while inference is performed on a CPU in a Zebra vision controller, smart camera, or third-party computer, avoiding the need for specialized GPU hardware.
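The two output forms described above—per-image class scores versus per-pixel class and score maps—can be illustrated with a small sketch. This is plain Python with made-up scores; it does not call the actual CNNClassIndex or CNNClassMap steps.

```python
import math
import random

# Hypothetical raw scores for one image, one per pre-established
# class (invented values, not real product output).
class_names = ["good", "scratched", "dented"]
logits = [2.0, 0.5, -1.0]

# Global approach (cf. CNNClassIndex): a normalized score per class
# plus the single most likely class for the whole image.
exps = [math.exp(x) for x in logits]
scores = [e / sum(exps) for e in exps]
best = class_names[scores.index(max(scores))]

# Segmentation approach (cf. CNNClassMap): per-pixel class scores
# reduce to a class map and a score map covering every pixel.
random.seed(0)
h, w = 4, 4
pixel_scores = [[[random.random() for _ in class_names]
                 for _ in range(w)] for _ in range(h)]
class_map = [[ps.index(max(ps)) for ps in row] for row in pixel_scores]
score_map = [[max(ps) for ps in row] for row in pixel_scores]
```

The class map is what a step like BlobAnalysis would then consume to measure the location and extent of each detected feature.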
3D Data Display in the Operator View
Aurora Design Assistant interfaces to a range of 3D cameras and sensors. These include Zebra AltiZ 3D profile sensors, as well as third-party 3D cameras and sensors through their SDK or the interface standard they support. The types of 3D data acquired include profiles, depth maps, and point clouds. Point-cloud surfaces can be meshed and filled as required by subsequent operations. The software includes an SDK to allow users to create interfaces to 3D cameras and sensors on their own. Profiles and depth maps are visualized as charts and greyscale or color-coded images, respectively. Point clouds and depth maps can be viewed in the 3D display, as-is or with a color map, available in both the IDE and the Operator View. In the IDE, users also have the option of showing the point clouds as a mesh with or without filling.
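As a rough illustration of the greyscale depth-map display described above, the sketch below (plain Python with assumed values, not the product's display code) rescales depth values into an 8-bit image; a color-coded view applies the same kind of per-pixel mapping to a palette instead of a gray level.

```python
# Hypothetical depth map in millimeters; a real one would come from
# a 3D sensor such as an AltiZ profile sensor.
depth_mm = [
    [10.0, 12.5, 15.0],
    [10.0, 20.0, 25.0],
    [10.0, 10.0, 30.0],
]

lo = min(min(row) for row in depth_mm)
hi = max(max(row) for row in depth_mm)

# Rescale each depth to 0..255: near points render dark, far points
# bright, producing the greyscale view of the depth map.
gray = [[round(255 * (d - lo) / (hi - lo)) for d in row]
        for row in depth_mm]
```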
Custom Step SDK
Users can extend the capabilities of Aurora Design Assistant by way of the included Custom Step software development kit (SDK). The SDK, in combination with Microsoft Visual Studio, enables the creation of custom flowchart steps using the C# programming language. These steps can implement proprietary analysis and processing, as well as proprietary communication protocols. The SDK comes with numerous project samples to accelerate development.
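Conceptually, a custom step bundles configurable inputs, a processing body, and outputs that later steps can link to. The sketch below shows that general shape in Python for readability; the actual SDK is C#-based and its real class and interface names differ, so every name here is hypothetical.

```python
# Hypothetical shape of a custom flowchart step; the real Custom
# Step SDK is C#-based and uses different class/interface names.

class ThresholdCountStep:
    """Counts pixels above a configurable threshold."""

    def __init__(self, threshold=128):
        self.threshold = threshold  # interactively configured input
        self.count = None           # output linkable by later steps

    def run(self, image):
        # Processing body: proprietary analysis would go here.
        self.count = sum(p > self.threshold
                         for row in image for p in row)
        return self.count

step = ThresholdCountStep(threshold=100)
result = step.run([[0, 200], [150, 50]])
```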
Project Change Validator
The Project Change Validator tool protects deployed projects against inappropriate changes.
Why a Flowchart?
The flowchart is a universally accessible, recognized, and understood method of describing the sequence of operations in a process.
New in This Release
• Updated Aurora Imaging Library CoPilot companion application for simplified deep learning training
• Support for IEEE 1588 Precision Time Protocol (PTP) timestamps for GigE Vision acquisition
• Custom step SDK
• Deployment options
• Project templates
• Custom flowchart steps
Markets and Applications
• Automotive
• Electronics
• Food and Beverage
• Life Sciences
• Logistics and Distribution
• Packaging
• Pharmaceutical
• Robotics
• Semiconductor and Solar
• Security
• Transportation
• Utilities
• Vision-Guided Robotics
Standard Deployment
• Aurora Design Assistant runtime software license key. The user must supply a lock code generated using the license management utility (MILConfig’s Licensing tab) or the smart camera portal (Licensing tab on the Settings page). This unique lock code identifies the target system and package(s) to license. The correspondence between the below packages and flowchart steps comes from the Project License Information dialog accessible from the Platform menu in Aurora Design Assistant.
• Aurora Design Assistant Image Analysis package. Enables BeadInspection, BlobAnalysis, Camera (with calibration), CameraFocus, EdgeLocator, Measurement, ImageCorrection, IntensityChecker, ImageProcessing (also see DAXRT…Q…), LoadImage (with calibration), Remap, and Mask steps. Also required for AlignPlane, Crop3D, ExtractProfile, and Volume3D steps.