HPC & Quantum
HPC Guru (Twitter)

RT @ProjectPhysX: How much floating-point precision do you need for lattice Boltzmann #CFD? #FP64 is overkill in most cases. #FP32/#FP16 mixed-precision works with almost equal accuracy at ¼ the memory demand and is 4x-10x faster on #GPU.
🧵1/7
Big @SFB1357 #PhD paper👉 https://www.researchgate.net/publication/362275548_Accuracy_and_performance_of_the_lattice_Boltzmann_method_with_64-bit_32-bit_and_customized_16-bit_number_formats https://twitter.com/ProjectPhysX/status/1552225695044190212/photo/1
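Channel note: the trick the thread describes is to keep the lattice populations in 16-bit storage while doing all arithmetic in FP32. Below is a minimal CUDA sketch of that pattern for a D2Q9 BGK collision step, assuming plain IEEE FP16; the paper's actual kernels are OpenCL and also evaluate customized 16-bit formats, and all names here are illustrative, not from that code.

```
#include <cuda_fp16.h>

// Sketch: D2Q9 BGK collision with FP16 storage, FP32 arithmetic.
// Populations f[] live in memory as __half (1/4 the footprint of FP64),
// but every arithmetic step runs in FP32 after conversion.
__global__ void collide_fp16_storage(__half* f, int n_cells, float omega) {
    int cell = blockIdx.x * blockDim.x + threadIdx.x;
    if (cell >= n_cells) return;

    // D2Q9 lattice constants: weights and discrete velocities, in FP32.
    const float w[9] = {4.f/9, 1.f/9, 1.f/9, 1.f/9, 1.f/9,
                        1.f/36, 1.f/36, 1.f/36, 1.f/36};
    const int cx[9] = {0, 1, 0, -1, 0, 1, -1, -1, 1};
    const int cy[9] = {0, 0, 1, 0, -1, 1, 1, -1, -1};

    // Load: convert FP16 -> FP32 once per population (SoA layout).
    float fi[9];
    for (int q = 0; q < 9; ++q)
        fi[q] = __half2float(f[(size_t)q * n_cells + cell]);

    // Moments in FP32: density and velocity.
    float rho = 0.f, ux = 0.f, uy = 0.f;
    for (int q = 0; q < 9; ++q) {
        rho += fi[q];
        ux  += fi[q] * cx[q];
        uy  += fi[q] * cy[q];
    }
    ux /= rho; uy /= rho;

    // BGK relaxation toward equilibrium, all in FP32.
    float usq = ux*ux + uy*uy;
    for (int q = 0; q < 9; ++q) {
        float cu  = cx[q]*ux + cy[q]*uy;
        float feq = w[q] * rho * (1.f + 3.f*cu + 4.5f*cu*cu - 1.5f*usq);
        fi[q] += omega * (feq - fi[q]);
        // Store: convert FP32 -> FP16 once per population.
        f[(size_t)q * n_cells + cell] = __float2half(fi[q]);
    }
}
```

LBM is memory-bandwidth-bound, so cutting storage from 8 to 2 bytes per population already accounts for roughly the 4x; the upper end of the quoted 4x-10x comes from consumer GPUs having far lower FP64 than FP32 arithmetic throughput.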
HPC Guru (Twitter)

If confirmed, that’s ~100x the cost of the largest supercomputers used for traditional #HPC (simulation, etc.)

Drop the 6 from #FP64, FP4 is where all the 💰 is 😜

#AI #GenAI @OpenAI @Microsoft https://twitter.com/spectatorindex/status/1773773660198969573#m