Sep 25, 2020
The term “artificial intelligence” (AI), referring to the use of computer systems to perform tasks that normally require human understanding, has been around for nearly 60 years. But only recently has AI appeared to be on the brink of revolutionizing industries as diverse as health care, law, journalism, aerospace, and manufacturing, with the potential to profoundly affect how people live, work, and play.
A number of forces have converged to bring AI into its own. Increased processing power makes it possible for computers to execute complex tasks at speeds once unimaginable, and at a cost that has fallen rapidly. The ramp-up in cloud computing and the outsourcing of data storage, both of which have come down significantly in price, have allowed companies to develop and use AI applications. Ubiquitous mobility and bandwidth make it possible for workers to access applications from even the most remote locations. Finally, our increasingly sophisticated understanding of how the human brain works, and our ability to embed brain-like elements into computers, have engendered such capabilities as voice and pattern recognition, natural language processing, and machine learning.
Within the next three to five years, we expect an exponential increase in the number of commercial AI-based applications. A Deloitte study titled Cognitive technologies: The real opportunities for business, published earlier this year, concluded that AI applications fall into three broad categories: