Learn About the New Block Floating Point Arithmetic Unit for Processing AI/ML Workloads in Speedster7t FPGA

A New Block Floating Point Arithmetic Unit for AI/ML Workloads

Block floating point (BFP) is a hybrid of floating-point and fixed-point arithmetic in which a block of values shares a single common exponent, so each value stores only a fixed-point mantissa. Learn about the only FPGA with machine learning processors (MLPs) that delivers native BFP support, offering higher performance and lower power consumption than traditional FPGA DSP blocks.
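To make the shared-exponent idea concrete, here is a minimal Python sketch of BFP quantization. This is an illustration only, not the Speedster7t MLP implementation; the function names and the 8-bit mantissa width are assumptions chosen for clarity.

```python
import math

def to_bfp(block, mantissa_bits=8):
    """Quantize a block of floats to block floating point: all values
    share one exponent, sized to the block's largest magnitude; each
    value keeps only a signed fixed-point mantissa."""
    # Shared exponent: large enough to represent the biggest value.
    max_mag = max(abs(x) for x in block)
    shared_exp = math.frexp(max_mag)[1] if max_mag else 0
    # Scale so mantissas fit in [-2^(m-1), 2^(m-1) - 1].
    scale = 2 ** (mantissa_bits - 1 - shared_exp)
    lo, hi = -2 ** (mantissa_bits - 1), 2 ** (mantissa_bits - 1) - 1
    mantissas = [max(lo, min(hi, round(x * scale))) for x in block]
    return shared_exp, mantissas

def from_bfp(shared_exp, mantissas, mantissa_bits=8):
    """Reconstruct approximate floats from a BFP block."""
    scale = 2.0 ** (shared_exp - (mantissa_bits - 1))
    return [m * scale for m in mantissas]

# Example: four activations quantized against one shared exponent.
block = [0.75, -0.125, 0.5, 0.03125]
exp, mants = to_bfp(block)
approx = from_bfp(exp, mants)
```

Because the exponent is stored once per block rather than per value, the per-value arithmetic reduces to cheap fixed-point multiply-accumulates, which is what makes BFP attractive for AI/ML datapaths.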

  • Learn what block floating point is and why it's being used for AI/ML applications
  • Understand how the new MLP in Speedster7t FPGAs is optimized for BFP and AI/ML workloads
  • Get 8x the performance at similar power consumption using BFP vs. traditional floating-point and fixed-point arithmetic

Register to View Webinar
