Xilinx® Vitis™ AI is an integrated development environment for accelerating AI inference on Xilinx platforms. The toolchain provides optimized IP, tools, libraries, pre-trained models, and resources such as example designs and tutorials that guide the user through the development process. It is designed for high efficiency and ease of use, unlocking the full potential of AI acceleration on Xilinx SoCs and Alveo™ Data Center accelerator cards.
Vitis AI Key Components
Vitis AI is composed of the following key components:
- DPUs - Configurable computation engines optimized for convolutional neural networks; efficient, scalable IP cores that can be customized to meet the needs of many different applications and devices.
- Model Zoo - A comprehensive set of pre-trained and pre-optimized models ready to deploy on Xilinx devices.
- Model Inspector - A tool and methodology with which developers can verify that a model's architecture is supported.
- Optimizer - An optional, commercially licensed tool that can prune a model by up to 90%.
- Quantizer - A powerful quantizer that supports model quantization, calibration, and fine-tuning.
- Compiler - Compiles the quantized model for execution on the target DPU accelerator.
- Runtime (VART) - An inference runtime for embedded applications.
- Profiler - Performs an in-depth analysis of the efficiency and utilization of AI inference implementations on the DPU.
- Library - Offers high-level C++ APIs for AI applications in embedded and data-center use cases.
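To make the quantization step above concrete, here is a minimal sketch of symmetric per-tensor int8 quantization, the general technique a post-training quantizer applies. This is an illustration of the concept only, not the Vitis AI Quantizer's actual API or numerics (the real tool also performs calibration over a dataset and optional fine-tuning).

```python
# Illustrative sketch of symmetric int8 quantization -- NOT the Vitis AI
# Quantizer API; function names here are hypothetical.

def quantize_int8(weights):
    """Map float weights to int8 using a single per-tensor scale."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0 if max_abs else 1.0
    # Clamp to the int8 range after rounding.
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values."""
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.031, 1.27]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)  # close to the original weights
```

The largest-magnitude weight maps to ±127, and all other weights are rounded onto the same integer grid; the reconstruction error per weight is bounded by half the scale.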
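Similarly, the Optimizer's pruning step can be illustrated with simple magnitude pruning: zeroing out the smallest-magnitude weights until a target sparsity is reached. This sketch conveys the idea only; the actual Optimizer uses structured (channel) pruning with retraining rather than this element-wise scheme, and the function below is hypothetical.

```python
# Illustrative magnitude pruning -- NOT the Vitis AI Optimizer flow.

def prune_by_magnitude(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of weights."""
    n_prune = int(len(weights) * sparsity)
    # Indices of the n_prune smallest-magnitude weights.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    drop = set(order[:n_prune])
    return [0.0 if i in drop else w for i, w in enumerate(weights)]

w = [0.9, -0.01, 0.4, 0.002, -0.7, 0.05, 0.3, -0.08, 0.6, 0.02]
pruned = prune_by_magnitude(w, 0.9)  # keep only the largest 10% of weights
```

Pruning at 90% sparsity leaves one weight in ten nonzero, which is the sense in which a model can be "pruned by up to 90%"; in practice, accuracy is preserved by retraining after each pruning step.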