As with much of the technology industry, 2018 was a year of reckoning for artificial intelligence. As AI systems were integrated into more products and services, the technology's drawbacks became more apparent. Researchers, companies, and the public increasingly grappled with AI's limitations and negative consequences, asking important questions: how is this technology being used, and in whose interest?
This reckoning was most visible in a parade of negative headlines about algorithmic systems. The year brought the first deaths caused by self-driving cars; the Cambridge Analytica scandal; accusations that Facebook contributed to genocide in Myanmar; the revelation that Google was helping the Pentagon build drone surveillance tools; and ethical questions about tech giants' AI assistants. The AI Now research institute described 2018 as a year of "cascading scandals" for the industry, which is an accurate, if depressing, summary.
But these headlines need not be read as purely negative. After all, a scandal is better than harm that goes unnoticed, and controversy can, in theory, push us to improve.
Take facial recognition. It was one of the fastest-growing technologies of 2018, with successes such as Chinese police identifying a suspect at a music concert and broadcasters using the technology to identify guests at the royal wedding, but also serious problems, including bias, false positives, and other life-altering mistakes. Police forces around the world began deploying facial recognition in the wild despite study after study showing serious shortcomings, and the technology's authoritarian potential became painfully clear in China, where it is one of many tools used to oppress the Uyghur minority.
All this makes for unpleasant reading, but as a result of these controversies, companies began building tools to combat bias, and large technology firms such as Microsoft are now openly calling for facial recognition to be regulated. Read in a positive light, more controversy means more scrutiny, and ultimately more solutions.
And despite the cascade of scandals, 2018 also saw dozens, even hundreds, of encouraging and positive applications of machine learning and artificial intelligence. There were small victories everywhere: in astronomy, where machine learning discovered new craters on the Moon and overlooked exoplanets; in basic scientific research, such as using AI to design stronger metals and plastics; and in health care, where AI systems repeatedly detected diseases faster and more accurately than humans. New tools, such as plug-and-play machine learning services from Google and Amazon, and accessible training courses from organizations like Fast.ai, put artificial intelligence into more hands, and the results were largely useful and often inspiring.
These successes do not outweigh the big failures, but together they show that AI is a complex field. It does not move in a single moral direction; like all technology, it is shaped by the many actors who use it to pursue a range of ends.
If one lesson stands out from the year as a whole, it is this: AI is not magic. It is not a two-letter spell that can be invoked to summon venture capital and institutional trust at will, nor is it magic dust that can be sprinkled on products and institutions for instant improvement. Artificial intelligence is a process, something that must be studied, thought through and, if all goes well, understood. In other words, this reckoning may last a long time.
Verge 2018 Report Card: AI
Final grade: B
- AI tools are becoming more accessible
- Countless applications found across a range of fields
- A world-changing technology that is only just beginning to evolve
- Potential to expand surveillance and aid authoritarian states
- Large technology companies and governments deploy AI systems first and ask questions later
- It will end in tears (maybe)