Artificial intelligence may have a will of its own: Hawking's last book is full of worries about the future

  Science and Technology Daily reporter Liu Xia

  On October 16th, the last book by the late physicist Stephen Hawking, Brief Answers to the Big Questions, was published. It collects his thinking on the biggest questions facing science and society, including "Will humanity survive on Earth forever?" (probably not) and "Is time travel possible?" (the possibility still cannot be ruled out). It also contains his final predictions on the most serious threats facing the Earth, a "superhuman" population, whether there is intelligent life in outer space, and how to develop space colonies. These predictions are full of Hawking's deep worries about the future of mankind.

  According to a report on the 15th by the US financial news website Quartz, Hawking writes in the book that artificial intelligence may one day form a will of its own, a will that conflicts with that of us humans, and that a "superhuman" population using genetic engineering to surpass its peers will come to dominate, which may destroy mankind.

  Human evolution has no limits.

  Quartz reported on the 16th that throughout the book, Hawking is pessimistic about humanity's future on Earth. Political instability, climate change and the possibility of nuclear violence all make long-term human development on Earth difficult to sustain.

  Hawking believes the number one threat facing the Earth is an asteroid collision like the one that led to the extinction of the dinosaurs. He wrote: "However, we cannot defend ourselves against it."

  A more immediate threat is climate change. "Rising ocean temperatures will melt the ice caps and release large amounts of carbon dioxide. The double effect may make our climate similar to that of Venus, with temperatures reaching 250°C."

  Hawking believes that nuclear fusion power generation will give us clean energy, with no pollution and no contribution to global warming.

  In the chapter "How do we shape the future?", Hawking disagrees with the idea that human beings are at the peak of evolution. In his view, human evolution and human endeavor have no limits.

  He believes humanity has two choices for the future: first, to explore other planets that humans could inhabit; he advocates space colonization in more than one chapter, whether of the Moon, Mars or planets around other stars. Second, to actively use artificial intelligence to improve our world.

  Artificial intelligence will form a will of its own.

  Hawking also emphasized the importance of regulating artificial intelligence. He pointed out that "in the future, artificial intelligence may form a will of its own, one that conflicts with ours," and that a possible arms race in autonomous weapons should be stopped. If such weapons were to fail the way the stock market did in the 2010 flash crash, the consequences would be unimaginable.

  He wrote in the book: "For humanity, the emergence of super-intelligent artificial intelligence will be either a blessing or a curse; it must be one or the other. The real danger of artificial intelligence lies not in malice but in capability. A super-intelligent AI will be extremely good at achieving its goals, and if those goals are not aligned with ours, we are in trouble."

  He urges policy makers, the technology industry and the general public to seriously study the ethical implications of artificial intelligence.

  A "superhuman" population will dominate.

  According to a report in the British Sunday Times on the 15th, Hawking's biggest worry is that the rich will soon be able to edit their own and their children's DNA to improve memory and immunity to disease.

  Quartz reported that Hawking believed that at some point in the next 1,000 years, nuclear war or environmental disaster will "seriously destroy the Earth." By then, "our talented race may have found a way to break free of the shackles of the Earth and therefore survive the disaster." Other species on Earth, however, may not be able to.

  Those who successfully escape the Earth will probably be the new "superhumans", who use gene-editing technologies such as CRISPR (commonly known as "gene scissors") to surpass others. Scientists can use such techniques to repair harmful genes and add other genes.

  Hawking also said that even if politicians try to ban the practice by law, these people will improve their memory, disease resistance and life expectancy regardless of legal constraints on genetic engineering, which will put the rest of the world in crisis.

  In the excerpt published in the Sunday Times on the 15th, Hawking said: "I am sure that within this century people will find ways to modify both intelligence and talent. Politicians may pass laws prohibiting human genetic engineering, but some people will surely be unable to resist improving human characteristics such as memory, disease resistance and longevity."

  He pointed out: "Once such superhumans appear, those who have not been genetically improved will face major political problems. They will be unable to compete and may become dispensable, or even die out. Instead there will be a race of self-designing humans that keeps improving itself at an accelerating pace. If this race can continually redesign itself, it is likely to spread out and colonize other planets and stars."

  Hawking acknowledges that there are various theories about why intelligent life has not been discovered, or has not visited the Earth. He is cautiously optimistic about this, but his preferred explanation is that human beings have "overlooked" forms of intelligent life in outer space.

  (Science and Technology Daily, Beijing, October 17th)