Revision as of 02:52, 13 August 2023
Specialized computer hardware, such as Lisp machines, neuromorphic engineering, event cameras, and physical neural networks, is often used to execute artificial intelligence (AI) programs faster and with less energy. As of 2023, the market for AI hardware is dominated by GPUs.
Lisp machines
Main article: Lisp machine
Lisp machines were developed in the late 1970s and early 1980s to make artificial intelligence programs written in the programming language Lisp run faster.
Dataflow architecture
Main article: Dataflow architecture
Dataflow architecture processors used for AI serve various purposes, with varied implementations such as the polymorphic dataflow Convolution Engine by Kinara (formerly Deep Vision), structure-driven dataflow by Hailo, and dataflow scheduling by Cerebras.
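The common idea behind these designs is the dataflow execution model: an operation fires as soon as all of its operands are available, rather than in a fixed program order. The following sketch illustrates that model in miniature; the graph, node names, and scheduler are invented for this example and do not reflect any vendor's actual implementation.

```python
# Minimal illustration of dataflow execution: a node "fires" once all
# of its inputs have values, and its result may in turn make downstream
# nodes ready. This is a toy software model, not real accelerator code.
from collections import deque

def run_dataflow(graph, inputs):
    """graph: name -> (fn, [operand names]); inputs: name -> value."""
    values = dict(inputs)
    pending = set(graph)
    # Seed the ready queue with nodes whose operands are already present.
    ready = deque(n for n, (fn, deps) in graph.items()
                  if all(d in values for d in deps))
    while ready:
        node = ready.popleft()
        if node not in pending:
            continue  # already fired
        pending.discard(node)
        fn, deps = graph[node]
        values[node] = fn(*(values[d] for d in deps))
        # A completed node may unblock nodes waiting on its result.
        for n in pending:
            _, ds = graph[n]
            if all(d in values for d in ds):
                ready.append(n)
    return values

# A tiny convolution-like pipeline: multiply, then a ReLU activation.
graph = {
    "mac": (lambda x, w: x * w, ["x", "w"]),
    "act": (lambda s: max(0, s), ["mac"]),
}
out = run_dataflow(graph, {"x": 3, "w": -2})
```

Here "act" cannot fire until "mac" produces its result; on hardware, many independent nodes would fire in parallel instead of one at a time.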
Component hardware
AI accelerators
Main article: AI accelerator
Since the 2010s, advances in computer hardware have led to more efficient methods for training deep neural networks that contain many layers of non-linear hidden units and a very large output layer. By 2019, graphics processing units (GPUs), often with AI-specific enhancements, had displaced central processing units (CPUs) as the dominant means to train large-scale commercial cloud AI. OpenAI estimated the hardware compute used in the largest deep learning projects from AlexNet (2012) to AlphaZero (2017), and found a 300,000-fold increase in the amount of compute needed, with a doubling-time trend of 3.4 months.
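The two figures above are consistent with each other, as a back-of-the-envelope check shows. The 63-month span used below is an assumption (AlexNet appeared in late 2012, AlphaZero in late 2017); OpenAI's own 3.4-month figure comes from a fit over many projects, so the result here only needs to land close to it.

```python
# Sanity-check the reported compute growth: a 300,000-fold increase
# over an assumed ~63-month span between AlexNet and AlphaZero.
import math

growth = 300_000                        # reported increase in training compute
doublings = math.log2(growth)           # ~18.2 doublings of compute
span_months = 63                        # assumed late-2012 to late-2017 span
implied_doubling = span_months / doublings

# The implied doubling period comes out near the reported 3.4 months,
# to within the rounding of the assumed dates.
print(f"{doublings:.1f} doublings, one every {implied_doubling:.1f} months")
```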
Sources
- "Nvidia: The chip maker that became an AI superpower". BBC News. 25 May 2023. Retrieved 18 June 2023.
- Maxfield, Max (24 December 2020). "Say Hello to Deep Vision's Polymorphic Dataflow Architecture". Electronic Engineering Journal. Techfocus media.
- "Kinara (formerly Deep Vision)". Kinara. 2022. Retrieved 11 December 2022.
- "Hailo". Hailo. Retrieved 11 December 2022.
- Lie, Sean (29 August 2022). Cerebras Architecture Deep Dive: First Look Inside the HW/SW Co-Design for Deep Learning. Cerebras (Report).
- Research, AI (23 October 2015). "Deep Neural Networks for Acoustic Modeling in Speech Recognition". AIresearch.com. Retrieved 23 October 2015.
- Kobielus, James (27 November 2019). "GPUs Continue to Dominate the AI Accelerator Market for Now". InformationWeek. Retrieved 11 June 2020.
- Tiernan, Ray (2019). "AI is changing the entire nature of compute". ZDNet. Retrieved 11 June 2020.
- "AI and Compute". OpenAI. 16 May 2018. Retrieved 11 June 2020.