
Neurosymbolic AI at ACS 2020

Today I presented a long paper, “Neurosymbolic AI for Situated Language Understanding,” at the Advances in Cognitive Systems conference, held virtually and hosted by the Palo Alto Research Center.

This was the third of three papers submitted as a postdoc and presented as a professor, and I’m very proud of this one: it’s a detailed yet concise summary of effectively the last five years of work at Brandeis developing situated grounding and embodied AI under the Communicating With Computers program, and it nicely lays out most of my graduate and postdoctoral career.

The work discussed in this paper forms the foundation of the work in the SIGNAL Lab, and I hope that neurosymbolic AI is just getting started; it offers a number of opportunities for groundbreaking research that are barely on the horizon for the AI and cognitive systems communities. You can find the paper here, the slides here, and a video of the talk here.

(X-posted on nikhilkrishnaswamy.com)
