The Pentagon wants Silicon Valley’s help on A.I.

But those relations have soured in recent years, at least with the rank and file of some better-known companies. In 2013, documents leaked by the former defense contractor Edward J. Snowden revealed the breadth of spying on Americans by intelligence services, including monitoring the users of several large internet companies.

Photo: Robert O. Work, right, at a 2014 news conference led by Chuck Hagel, the defense secretary at the time. Mr. Work, who was the deputy secretary of defense, said of the global race for A.I. technology: “This is a Sputnik moment.” Credit: Chip Somodevilla/Getty Images

Two years ago, that antagonism grew worse after the F.B.I. demanded that Apple create special software to help it gain access to a locked iPhone that had belonged to a gunman involved in a mass shooting in San Bernardino, Calif.

“In the wake of Edward Snowden, there has been a lot of concern over what it would mean for Silicon Valley companies to work with the national security community,” said Gregory Allen, an adjunct fellow with the Center for a New American Security. “These companies are, understandably, very cautious about these relationships.”

The Pentagon needs help on A.I. from Silicon Valley because that’s where the talent is. The tech industry’s biggest companies have been hoarding A.I. expertise, sometimes offering multimillion-dollar pay packages that the government could never expect to match.

Mr. Work was the driving force behind the creation of Project Maven, the Defense Department’s sweeping effort to embrace artificial intelligence. His new task force will include Terah Lyons, the executive director of the Partnership on AI, an industry group that includes many of Silicon Valley’s biggest companies.

Mr. Work will lead the 18-member task force with Andrew Moore, the dean of computer science at Carnegie Mellon University. Mr. Moore has warned that too much of the country’s computer science talent is going to work at America’s largest internet companies.

With tech companies gobbling up all that talent, who will train the next generation of A.I. experts? Who will lead government efforts?

“Even if the U.S. does have the best A.I. companies, it is not clear they are going to be involved in national security in a substantive way,” Mr. Allen said.

Google illustrates the challenges that big internet companies face in working more closely with the Pentagon. Google’s former executive chairman, Eric Schmidt, who is still a member of the board of directors of its parent company, Alphabet, also leads the Defense Innovation Board, a federal advisory committee that recommends closer collaboration with industry on A.I. technologies.

Last week, two news outlets revealed that the Defense Department had been working with Google in developing A.I. technology that can analyze aerial footage captured by flying drones. The effort was part of Project Maven, led by Mr. Work. Some employees were angered that the company was contributing to military work.

Google runs two of the best A.I. research labs in the world: Google Brain in California and DeepMind in London.

Top researchers inside both Google A.I. labs have expressed concern over the use of A.I. by the military. When Google acquired DeepMind, the company agreed to set up an internal board that would help ensure that the lab’s technology was used in an ethical way. And one of the lab’s founders, Demis Hassabis, has explicitly said its A.I. would not be used for military purposes.

Google acknowledged in a statement that the military use of A.I. “raises valid concerns” and said it was working on policies around the use of its so-called machine learning technologies.

Among A.I. researchers and other technologists, there is widespread fear that today’s machine learning techniques could put too much power in dangerous hands. A recent report from prominent labs and think tanks in both the United States and Britain detailed the risks, including issues with weapons and surveillance equipment.

Google said it was working with the Defense Department to build technology for “non-offensive uses only.” And Mr. Work said the government explored many technologies that did not involve “lethal force.” But it is unclear where Google and other top internet companies will draw the line.

“This is a conversation we have to have,” Mr. Work said.
