Even before ChatGPT, the growing use of AI and ML led to significant growth in data centers housing IT infrastructure for storage and data processing. Enterprises have already established data center facilities around the U.S.

AI and machine learning inference in general, and large language models such as ChatGPT specifically, are becoming a heavy burden on data centers, especially as massive AI models are being used more and more in our everyday tasks. The more extensive and data-intensive computation becomes, the more memory and energy we need. Each transmission of data between memory and processors consumes energy, and the higher the bandwidth of data per unit time, the higher the power consumption. Moreover, increased power consumption leads to increased operational costs for data centers.

"Data centers are especially sensitive to power consumption," Dr. Messica explains. "Steadily increasing workloads of AI and machine learning (ML) tasks are focused on power consumption efficiency in tandem with increased computational power."

"According to the International Energy Agency," says Messica, "data centers and transmission networks account for almost 3% of global electricity use, and this adversely affects CO2 emissions. That pressure slows the expansion pace of data centers."

Data centers are seemingly critical for computational tasks, but they come at a heavy price. A power saving of even a minute fraction of a percent is a blessing for data centers. Furthermore, saving chip area enables packing more transistors onto the same silicon, so more computational power is delivered and computation efficiency improves.

NeoLogic's founders do not push back against that premise. They invented a new way of microprocessor operation to deliver a solution that goes beyond Moore's Law.

Messica: "We're applying a new approach to digital circuit design that relies on complexity reduction of digital circuits by modifying their topology to deliver circuits that do not exist within the current technology paradigm, such as logic gates with a high number of inputs, exceeding the existing four-input limitation, and the like."

"These circuits allow chip engineers to design more power-efficient and more compact processors than can be achieved with the state-of-the-art technology, namely CMOS," Leshem adds.

In advanced technology nodes, such as 16 nanometers down to 3 nanometers, every bit of energy saved, even half a percent, makes a difference. NeoLogic's chip design technology delivers up to a double-digit percentage improvement in processors' power consumption or area. This is of significant importance and value to any company that designs a microprocessor, and it can lead not only to reduced chip cost and power dissipation but also to a longer life span, as well as fringe benefits such as ESG contribution, lower raw-material consumption, and less waste at the foundry.
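The four-input limitation Messica refers to can be made concrete with a small counting sketch (the function below is illustrative, not NeoLogic's actual tooling): a wide logic function that a hypothetical many-input gate could evaluate in one stage must instead be built as a tree of fan-in-limited standard cells, and every extra gate in that tree adds area, wiring, and switching energy.

```python
import math

def tree_gate_count(n_inputs: int, fan_in: int = 4) -> int:
    """Count the fan-in-limited gates needed to combine n_inputs signals
    in a tree, e.g. a wide AND built from conventional 4-input ANDs."""
    gates = 0
    remaining = n_inputs
    while remaining > 1:
        level = math.ceil(remaining / fan_in)  # gates at this tree level
        gates += level
        remaining = level
    return gates

# A 16-input AND: a tree of 4-input gates needs 4 first-level gates
# plus 1 final gate, versus a single (hypothetical) 16-input gate.
print(tree_gate_count(16))  # → 5
print(tree_gate_count(64))  # → 21
```

The gap widens with input count: 64 inputs cost 21 conventional gates across three logic levels, which is the kind of overhead a single wide gate would avoid.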