Vision Transformers Change the AI Acceleration Rules
Transformers were first introduced by the team at Google Brain in 2017 in their paper "Attention Is All You Need". Since their introduction, transformers have inspired a flurry of investment and research that has produced some of the most impactful model architectures and AI products to date, including ChatGPT, whose name stands for Chat Generative Pre-trained Transformer.
Transformers are also being employed for vision applications as Vision Transformers (ViTs). This new class of models was empirically shown to be a viable alternative to more traditional Convolutional Neural Networks (CNNs) in the paper "An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale", published by the team at Google Brain in 2021.
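To make the paper's central idea concrete, here is a minimal, illustrative sketch in plain NumPy (function and parameter names are assumptions, not the paper's reference code) of how a ViT treats an image as a sequence of 16x16 "words": the image is cut into non-overlapping patches, each patch is flattened and linearly projected, and the resulting token sequence is what a standard transformer encoder consumes.

```python
import numpy as np

# Illustrative sketch of the ViT "image as 16x16 words" idea.
def patchify(image, patch_size=16):
    """Split an HxWxC image into non-overlapping, flattened patches."""
    h, w, c = image.shape
    patches = image.reshape(h // patch_size, patch_size,
                            w // patch_size, patch_size, c)
    patches = patches.transpose(0, 2, 1, 3, 4)
    return patches.reshape(-1, patch_size * patch_size * c)

# A 224x224 RGB image becomes a sequence of 196 "tokens" of length 768,
# which a transformer encoder can consume after a linear patch embedding.
image = np.random.rand(224, 224, 3).astype(np.float32)
tokens = patchify(image)                                        # (196, 768)
embedding = tokens @ np.random.rand(768, 768).astype(np.float32)  # (196, 768)
print(tokens.shape, embedding.shape)
```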
Vision transformers are of a size and scale that is approachable for SoC designers targeting the high-performance edge AI market. There's just one problem: vision transformers are not CNNs, and many of the assumptions made by the designers of the first-generation Neural Processing Units (NPUs) and AI hardware accelerators found in today's SoCs do not translate well to this new class of models.
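To see why the operator mix matters, the rough sketch below (an assumption-laden illustration, not the article's analysis or any vendor's NPU code) writes a single ViT encoder block with plain matrix operations. The workload is dominated by LayerNorm, large matrix multiplies, softmax, and GELU activations rather than the 3x3 convolutions and ReLUs that CNN-centric accelerators were built around.

```python
import numpy as np

# Single ViT encoder block, spelled out to show the operator mix an NPU
# must handle: LayerNorm, large matmuls, softmax, and GELU.
def layer_norm(x, eps=1e-6):
    mu, var = x.mean(-1, keepdims=True), x.var(-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

def gelu(x):
    # tanh approximation of GELU
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def vit_block(x, wq, wk, wv, wo, w1, w2):
    # Single-head self-attention over the patch tokens (pre-LayerNorm).
    h = layer_norm(x)
    q, k, v = h @ wq, h @ wk, h @ wv
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1])) @ v
    x = x + attn @ wo
    # Feed-forward (MLP) sub-block with GELU.
    x = x + gelu(layer_norm(x) @ w1) @ w2
    return x

d = 768
tokens = np.random.rand(196, d).astype(np.float32)   # e.g. from patchify above
wq, wk, wv, wo = (np.random.rand(d, d).astype(np.float32) for _ in range(4))
w1 = np.random.rand(d, 4 * d).astype(np.float32)
w2 = np.random.rand(4 * d, d).astype(np.float32)
out = vit_block(tokens, wq, wk, wv, wo, w1, w2)       # (196, 768)
```

Nothing in this block is a convolution: the data-movement and compute patterns (token-wise normalization, attention's quadratic token-to-token matmul, softmax) are exactly the operations that CNN-era accelerator assumptions tend to miss.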
What makes Vision Transformers so special?