Google Grants $1.2 Million for Natural Language Research

Google has recently awarded grants totaling $1.2 million to researchers working on natural language understanding as it relates to Google’s concept of the Knowledge Graph. Google had earlier shortlisted research topics for the awards, ranging from semantic parsing to statistical models of life stories, as well as novel compositional inference and representation approaches for modeling relations and events in the Knowledge Graph.

In an announcement on its research blog, the tech giant explained the importance of understanding natural language, which is integral to its Knowledge Graph technology. In the post, the company states: “Understanding natural language is at the core of Google’s work to help people get the information they need as quickly and easily as possible. At Google we work hard to advance the state of the art in natural language processing, to improve the understanding of fundamental principles, and to solve the algorithmic and engineering challenges to make these technologies part of everyday life. Language is inherently productive; an infinite number of meaningful new expressions can be formed by combining the meaning of their components systematically. The logical next step is the semantic modeling of structured meaningful expressions — in other words, “what is said” about entities. We envision that knowledge graphs will support the next leap forward in language understanding towards scalable compositional analyses, by providing a universe of entities, facts and relations upon which semantic composition operations can be designed and implemented.”

The recipients have been awarded grants from Google worth a total of $1.2 million, according to company sources.

Source: Google