The Big in Japan column titled "Humans face singular concern over AI jobs" in the June 28 edition contains an error that needs to be corrected.

The article defines "singularity" as the moment "when artificially intelligent machines become smarter than we are." The correct definition of "singularity" is when machines gain the ability to design improvements to themselves.

Machines are already "smarter" than humans by almost every measure, but that intelligence is based on programs written by humans. In other words, a human had to teach the machine to be smart. Once a program can start improving itself, it can develop independently of human intervention. More romantically, singularity might be described as "critical thinking," "free will" or "evolution."

Whatever you call it, it is literally impossible for humans to predict where this development will lead. This is the true fear of singularity. The appearance of robots in the workforce, which the article ties to singularity, is a completely separate issue.

As the misconceptions in the article make apparent, the real cause for concern is that technology is developing faster than our ability to comprehend, or even articulate, all of its consequences.

Ramsey Lundock

Mitaka, Tokyo

The opinions expressed in this letter to the editor are the writer's own and do not necessarily reflect the policies of The Japan Times.