Accelerate your transition from GPU-based Edge AI to FPGA-based Edge AI!
DMP's AI processor IP "ZIA DV720" is integrated into the Xilinx Zynq MPSoC FPGA. FP16 floating-point arithmetic is supported as standard, so models trained on PCs or cloud servers can be used without retraining (see the illustrative sketch below). The module also maintains high inference processing performance, making it well suited to AI systems that demand high reliability, such as autonomous driving and robotics. We provide strong support for customers, from initial system evaluation through application development, with the Edge AI starter development kit "ZIA C3 Kit" (Xilinx version).

【Features】
■ "ZIA C3 SoM" equipped with the DV720 AI processor
■ Development environment (SDK/Tool) that makes AI application development straightforward
■ Specifications suited to industrial equipment and long-term stable supply
■ Ability to keep pace with the latest AI technologies

*For more details, please download the PDF or feel free to contact us.
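Why FP16 support avoids retraining can be shown with a short, generic numerical sketch. This is not the DMP SDK or DV720 toolchain workflow; the tiny two-layer NumPy network, its random weights, and the function names below are purely illustrative assumptions. Weights trained in FP32 are simply cast to FP16 at deployment time, and the FP16 inference result stays close to the FP32 reference.

```python
# Illustrative only: a stand-in FP32-trained network cast to FP16 for inference,
# showing why native FP16 support lets trained models run without retraining.
# This does NOT use the DMP SDK; network shape, weights, and names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0)

def infer(x, w1, b1, w2, b2):
    # Two-layer fully connected network: ReLU hidden layer + linear output.
    return relu(x @ w1 + b1) @ w2 + b2

# Stand-in for weights learned in FP32 on a PC or cloud server.
w1 = (rng.standard_normal((128, 64)) * 0.1).astype(np.float32)
b1 = np.zeros(64, dtype=np.float32)
w2 = (rng.standard_normal((64, 10)) * 0.1).astype(np.float32)
b2 = np.zeros(10, dtype=np.float32)

x = rng.standard_normal((1, 128)).astype(np.float32)

# FP32 reference inference.
y_fp32 = infer(x, w1, b1, w2, b2)

# "Deployment" step: cast the same weights and input to FP16 (no retraining).
y_fp16 = infer(x.astype(np.float16),
               w1.astype(np.float16), b1.astype(np.float16),
               w2.astype(np.float16), b2.astype(np.float16))

print("max abs difference:", np.abs(y_fp32 - y_fp16.astype(np.float32)).max())
```

For well-conditioned inference workloads the printed difference is small, which is the property that lets an FP16 accelerator reuse FP32-trained weights directly.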
Basic information
【Kit Contents】
■ ZIA C3 SoM
■ Carrier Board
■ SDK/Tool (GitHub: https://github.com/DigitalMediaProfessionals)
■ User Manual
■ Startup Guide
■ AC Adapter

*For more details, please download the PDF or feel free to contact us.
Applications / Case examples
For more details, please download the PDF or feel free to contact us.
Company information
Under the slogan "Visualize the Future," our company has developed and licensed proprietary GPU IP cores and related software for embedded devices, and has also run its own SoC business. As a result, cumulative shipments of products equipped with our GPU IP cores, including game consoles, cameras, and printers, have exceeded 100 million units. Today we focus not only on edge AI, which requires real-time inference over large volumes of data in fields such as autonomous driving and factory automation, but also on cloud AI, where learning capabilities that improve inference accuracy at the edge are key. Since our founding in 2002, as one of the world's leading GPU companies, we have leveraged the expertise in miniaturization, low power consumption, and high performance cultivated for embedded GPUs to provide highly competitive edge AI inference processor IP, module products equipped with it, and software products and cloud services that integrate DMP's AI and image processing technologies, all through our proprietary AI platform, the "ZIA series."