Google Trained a Trillion-Parameter AI Language Model

An anonymous reader quotes a report from VentureBeat: Google researchers developed and benchmarked techniques they claim enabled them to train a language model containing more than a trillion parameters. They say their 1.6-trillion-parameter model, which appears to be the largest of its kind to date, achieved up to a 4x speedup over the previously largest Google-developed language model (T5-XXL). As…
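The work the report describes is Google's Switch Transformer, a sparsely activated mixture-of-experts model: each token is routed to a single "expert" feed-forward block, so the parameter count grows with the number of experts while per-token compute stays roughly flat. The snippet below is a minimal NumPy sketch of that top-1 routing idea, not the paper's implementation; the layer sizes, initialization, and probability scaling are illustrative assumptions.

```python
# Minimal sketch of "switch"-style top-1 expert routing (illustrative only).
# Parameters scale with the number of experts, but each token only
# passes through one expert, keeping per-token compute roughly constant.
import numpy as np

rng = np.random.default_rng(0)
d_model, d_ff, num_experts = 64, 256, 8       # toy sizes, chosen for illustration

# Router: produces one score per expert for each token.
router_w = rng.normal(size=(d_model, num_experts)) * 0.02

# Experts: independent feed-forward blocks; only one is applied per token.
experts = [
    (rng.normal(size=(d_model, d_ff)) * 0.02, rng.normal(size=(d_ff, d_model)) * 0.02)
    for _ in range(num_experts)
]

def switch_layer(x):
    """x: (num_tokens, d_model). Route each token to its top-1 expert."""
    logits = x @ router_w                                  # (tokens, experts)
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)             # softmax over experts
    chosen = probs.argmax(axis=-1)                         # top-1 expert per token
    out = np.zeros_like(x)
    for e, (w_in, w_out) in enumerate(experts):
        mask = chosen == e
        if mask.any():
            h = np.maximum(x[mask] @ w_in, 0)              # ReLU feed-forward
            # Weight by the router probability so routing stays differentiable
            out[mask] = (h @ w_out) * probs[mask, e:e + 1]
    return out

tokens = rng.normal(size=(16, d_model))
print(switch_layer(tokens).shape)                          # (16, 64)
```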

Researchers Achieve AI Breakthrough Using Light To Perform Computations

“Researchers have achieved a breakthrough in the development of artificial intelligence by using light instead of electricity to perform computations,” reports the Independent. “The new approach significantly improves both the speed and efficiency of machine learning neural networks…”
A paper describing the research, published this week in the scientific journal Applied Physics Reviews, reveals that the researchers' photon-based tensor processing unit (TPU) was…

Nvidia CEO Says Google Is the Company’s Only Customer Building Its Own Silicon At Scale

An anonymous reader quotes a report from CNBC: Nvidia’s CEO, Jensen Huang, has reason to be concerned about other chipmakers, like AMD. But he’s not worried about Nvidia’s own big customers turning into competitors. Amazon, Facebook, Google and Tesla are among the companies that buy Nvidia’s graphics cards and have kicked off chip-development projects. “There’s really one I know of that…