
Neurosymbolic AI at ACS 2020

Today I presented a long paper, “Neurosymbolic AI for Situated Language Understanding,” at the Advances in Cognitive Systems conference, held virtually and hosted by the Palo Alto Research Center.

This was the third of three papers submitted as a postdoc and presented as a professor, and I’m very proud of this one. It’s a detailed yet concise summary of effectively the last five years of work at Brandeis developing situated grounding and embodied AI under the Communicating With Computers program, and it nicely lays out most of my graduate and postdoctoral career.

The work discussed in this paper forms the foundation of the research in the SIGNAL Lab, and I hope neurosymbolic AI is just getting started; it offers a number of opportunities for groundbreaking research that are barely on the horizon for the AI and cognitive systems communities. You can find the paper here, the slides here, and a video of the talk here.

(X-posted on nikhilkrishnaswamy.com)
