Flow Computing says that its PPU can help anyone looking to maximize performance, reduce infrastructure costs, and meet sustainability goals.
The company’s website says that the “PPU architecture will finally make locally-hosted AI a reality.”
But the question remains: what is a PPU? Flow Computing describes it as “an IP block that integrates tightly with the CPU on the same silicon. It is designed to be highly configurable to specific requirements of numerous use cases.”
The CPU modifications needed to integrate a PPU are minimal, and the PPU itself can be customized extensively, with 4, 16, 64, or 256 cores, depending on the need.
In PPUs, the latency of memory references is hidden by executing other threads while memory is being accessed. No coherency problems arise, since no caches are placed in front of the network.
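That latency-hiding principle can be illustrated in software. The sketch below is a conceptual analogy, not Flow Computing's implementation: each "memory reference" stalls for a fixed delay, and issuing many of them concurrently lets the stalls overlap rather than add up, which is the same idea the PPU applies in hardware by switching to ready threads during a memory access. The delay value and function names are illustrative assumptions.

```python
# Conceptual analogy only -- not the PPU's actual design. Shows how
# overlapping long-latency "memory references" across threads hides
# latency compared with issuing them one at a time.
import threading
import time

LATENCY = 0.05  # simulated delay per memory reference (assumed value)

def memory_reference(results, i):
    time.sleep(LATENCY)   # stand-in for a long-latency memory access
    results[i] = i * i    # "use" the loaded value

def run_serial(n):
    # One reference at a time: total time grows as n * LATENCY.
    results = [None] * n
    start = time.perf_counter()
    for i in range(n):
        memory_reference(results, i)
    return time.perf_counter() - start, results

def run_overlapped(n):
    # Issue all references at once; while one thread waits on "memory",
    # the others proceed, so the stalls overlap instead of accumulating.
    results = [None] * n
    threads = [threading.Thread(target=memory_reference, args=(results, i))
               for i in range(n)]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start, results

if __name__ == "__main__":
    serial_t, _ = run_serial(8)
    overlap_t, _ = run_overlapped(8)
    print(f"serial: {serial_t:.2f}s  overlapped: {overlap_t:.2f}s")
```

Running this shows the overlapped version finishing in roughly the time of a single stall rather than eight, which is the payoff the PPU's multithreaded execution aims for in hardware.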
.........