
ICT Today March/April 19





Therefore, forward-thinking ICT professionals no longer view AI merely as algorithms and software supporting smart products and services, but as a vital functional component of today's and tomorrow's ICT networks and applications.

CLEARING THE CONFUSION BETWEEN AI, COGNITIVE COMPUTING, AND AUGMENTED INTELLIGENCE

Amid the commercial media hype for AI, true AI is suffering from a case of mistaken identity. The article "Artificial Intelligence Has Become Meaningless" argues that AI "has been hijacked by companies wanting to make their software algorithms sound smarter than they really are. Chat bots are often classed as AIs, for example, when they are mostly glorified phone trees, or else clever, automated Mad Lib-type programming." 3 Upon close inspection, for example, Facebook's so-called AI that detects suicidal thoughts posted to its platform is little more than a pattern-matching filter that flags posts for human community managers, while Google's so-called Perspective machine learning algorithm for identifying toxic online comments can be easily fooled by simple typos. 4

Further blurring the definition of AI is the confusion between cognitive computing systems and AI systems. Both are needed in supercomputing and big data, but each has a unique purpose, task, and goal. Essentially, cognitive computing is designed to assist human decision-making, while AI is designed to replace it. Cognitive computing solves problems the way people do: by analyzing, reasoning, and remembering. AI, by contrast, can autonomously make decisions in isolation, without human intervention, using algorithms and processes that are not necessarily human-like in cognition; it does not mimic human thought in decision-making as cognitive computing does.
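The article's point that such systems are often "glorified" pattern matching can be illustrated with a toy sketch. The patterns, function name, and example posts below are entirely hypothetical and are not any vendor's actual system; the sketch only shows why a keyword filter both looks like content moderation and fails on a one-character typo:

```python
import re

# Hypothetical keyword patterns standing in for a "toxicity detector"
# that is really just pattern matching, not machine intelligence.
FLAGGED_PATTERNS = [
    re.compile(r"\bidiot\b", re.IGNORECASE),
    re.compile(r"\bstupid\b", re.IGNORECASE),
]

def flag_for_review(post: str) -> bool:
    """Return True if the post matches any flagged keyword pattern."""
    return any(p.search(post) for p in FLAGGED_PATTERNS)

# The filter catches the exact word...
print(flag_for_review("You are an idiot"))   # True
# ...but a one-character typo slips straight past it.
print(flag_for_review("You are an idi0t"))   # False
```

Matched posts would simply be routed to human reviewers, which is consistent with the description above of filters that flag posts for human community managers rather than deciding anything themselves.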
The ability of AI to make decisions without human intervention is paramount, for example, for the many unmanned micro data centers that will be installed as edge computing evolves, as well as for IIoT, smart cities, and 5G. The idea that AI is designed to replace human thought, coupled with the now infamous tweet of controversial tech leader Elon Musk, who proclaimed that AI …

Artificial Intelligence (AI), a very old technology dating to before the 1920s and with 40-year-old roots in ICT, is central to making Klaus' prediction a reality. So much so that Intel has invested well over $1 billion in businesses advancing the field of AI, has updated its total available market estimate for all AI silicon (i.e., servers, accelerators, memory, networking, storage) to $200 billion by 2022, and has launched an aggressive strategy to compete with NVIDIA in AI, with AMD in data center graphics processing units, and with the many other AI players vying for market dominance. 2 Moreover, recent research forecasts largely agree that global artificial intelligence use will grow at a compound annual growth rate (CAGR) of 50 to 63 percent over the next three years.

Though AI has subtly influenced ICT throughout the years, it has remained a subject far removed from commonplace ICT discussions; it was a technology outside the realm of Layer 1 connectivity, best left to IT, data processors, coders, and mathematical wizards. However, as the ICT industry transitions into the IoT world of network convergence and IP device-driven global connectivity, ICT designers and installers should become familiar with AI's general capabilities, misconceptions, trends, and most lucrative uses in order to recommend, design, and build the most innovative and efficient networks for their enterprise customers.

Each and every decade from the 1920s to the present day promised that an autonomous vehicle would soon be ready for commercialization.
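As a back-of-the-envelope check on what a 50 to 63 percent CAGR implies, the sketch below compounds a hypothetical base market index of 100 forward three years; at those rates the market roughly triples to quadruples:

```python
def project(value: float, cagr: float, years: int) -> float:
    """Compound a starting value forward at a constant annual growth rate."""
    return value * (1 + cagr) ** years

base = 100.0                     # hypothetical base-year market index
low = project(base, 0.50, 3)     # 100 * 1.50^3 = 337.5
high = project(base, 0.63, 3)    # 100 * 1.63^3 ≈ 433.1
print(round(low, 1), round(high, 1))
```

The figures are illustrative only; the forecasts cited above give the growth rate, not a dollar baseline.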


