In 2020, NVIDIA released the A100 with 54 billion transistors, up to 80GB of HBM memory, and roughly 2 TB/s of memory bandwidth. It was VERY BIG and VERY FAST™.
In 2022, the H100 arrived with 80 billion transistors, the first Transformer Engine (tuned for the neural network architecture behind ChatGPT and other large language models), and substantial performance gains over the A100 on those workloads.