@dileeplearning
Are you skeptical about successor representations? Want to know how our new model can learn cognitive maps, context-specific representations, do transitive inference, and flexible hierarchical planning? #tweeprint...(1) @vicariousai @swaroopgj @rvrikhye biorxiv.org/content/10.110… twitter.com/vicariousai/st…
Dileep George
@dileeplearning
Dec 7
As @yael_niv pointed out in her recent article, learning context-specific representations from aliased observations is a challenge. Our agent can learn the layout of a room from severely aliased random-walk sequences, with only 4 unique observations in the whole room! pic.twitter.com/JvVgTffjPb
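To make the severity of the aliasing concrete, here is a toy sketch (my own illustration, not the paper's code or its room layout): a random walk in a grid room where every cell emits one of only 4 symbols.

```python
import random

def random_walk_observations(width, height, n_steps, seed=0):
    """Random walk in a grid room where each cell emits one of only
    4 symbols (a 2x2 repeating pattern), so many distinct locations
    are aliased to the same observation."""
    rng = random.Random(seed)
    x, y = rng.randrange(width), rng.randrange(height)
    obs, actions = [], []
    for _ in range(n_steps):
        obs.append(2 * (x % 2) + (y % 2))   # only 4 possible symbols
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        if 0 <= x + dx < width and 0 <= y + dy < height:
            x, y = x + dx, y + dy           # move succeeded
            actions.append((dx, dy))
        else:
            actions.append((0, 0))          # bumped a wall, stayed put
    return obs, actions

obs, actions = random_walk_observations(6, 8, 5000)
# 48 distinct cells, but the sequence contains at most 4 distinct symbols
```

Recovering the 6x8 layout from a sequence like `obs` (plus the actions) is exactly the disambiguation problem described here.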
And it works even when the room is empty, with no unique observations in the center of the room. The observations are now severely aliased and correlated, but it still recovers the map of the room. pic.twitter.com/nl1hilBD0R
A hard problem of transitive inference: two aliased rooms with an overlapping portion, and the agent walks in only one room at a time. No problem: our model learns a coherent global map from these disjoint episodes! pic.twitter.com/XiL9HdlENA
Observations in the rooms are aliased, but there is more. Did you notice that a portion of the first room looks exactly like the overlapping patch, making this an even harder problem? Recovering all the relative positions required a lot of transitive stitching, and the model pulled it off!
Note that the model doesn't make any assumptions about 2D/3D space or Euclidean geometry or anything like that. It is purely relational, and these maps are learned purely from sequential random-walk observations.
Remember the 'splitter cells', where place cells encode the paths taken rather than locations? These emerge when rats follow stereotyped paths rather than random walks. The same happens in our model. Lap cells also emerge when rats run laps along the same loop. pic.twitter.com/TT0xxg9qzY
By transferring learned structural knowledge, the agent can take shortcuts in a new room, including navigating around obstacles, without having seen the whole room. pic.twitter.com/fMp2enEzvm
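Once a sparse transition graph has been learned, finding a shortcut is ordinary graph search. A minimal sketch over a made-up toy graph (the states and edges are illustrative, not taken from the model):

```python
from collections import deque

def plan(graph, start, goal):
    """Breadth-first search over a learned transition graph:
    a shortcut is just the shortest path between two states."""
    frontier, parent = deque([start]), {start: None}
    while frontier:
        s = frontier.popleft()
        if s == goal:                       # reconstruct the path
            path = []
            while s is not None:
                path.append(s)
                s = parent[s]
            return path[::-1]
        for nxt in graph.get(s, ()):
            if nxt not in parent:
                parent[nxt] = s
                frontier.append(nxt)
    return None                             # goal unreachable

# Toy graph: state 0 reaches 2 directly via 1, or by a detour via 3 -> 4.
g = {0: [1, 3], 1: [2], 3: [4], 4: [2]}
print(plan(g, 0, 2))  # -> [0, 1, 2]
```

Obstacles correspond to edges that are simply absent from the learned graph, so the same search routes around them for free.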
How does it work? By learning variable-order sequences! The core idea is very simple: split aliased states and adapt the copies to different contexts. This representation has many good properties compared to suffix trees or RNNs. pic.twitter.com/bfZR2giyTs
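The splitting idea can be sketched minimally (sizes and names here are mine, purely illustrative): give every observation symbol several "clones", fix the emission by construction, and let learned transitions decide which clone, i.e. which context, the sequence is in.

```python
import numpy as np

n_obs, n_clones = 4, 8          # 4 aliased symbols, 8 clones each
n_states = n_obs * n_clones     # hidden state space

def symbol_of(state):
    """Emission is fixed by construction: every clone of symbol o
    emits o deterministically, so only transitions are learned."""
    return state // n_clones

# A (here random, i.e. untrained) transition matrix: different clones
# of the same symbol can have different successors, which is what
# lets one observation mean different things in different contexts.
rng = np.random.default_rng(0)
T = rng.random((n_states, n_states))
T /= T.sum(axis=1, keepdims=True)   # make each row a distribution

assert symbol_of(0) == symbol_of(1) == 0   # two clones, same symbol
```

Compared with suffix trees or RNNs, the context a clone encodes is not a fixed-length history: it is whatever the learned transitions make it.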
The model can be formulated as a highly structured over-complete HMM, and trained with EM. Training results in a directed graph that is very sparse and approximates the latent generative process behind the observed sequences.
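As a sketch of what the clone structure buys during training (my simplification, not the paper's implementation): with deterministic emissions, the forward pass of EM reduces to propagating a message through the transition matrix and masking it to the observed symbol's clones.

```python
import numpy as np

def forward_loglik(obs, T, n_clones):
    """Forward filtering for a clone-structured HMM. Clone c of
    symbol o emits o with probability 1, so the 'emission step' is
    just restricting the message to the observed symbol's clones.
    Returns the log-likelihood of obs[1:] given obs[0]."""
    n_states = T.shape[0]
    alpha = np.zeros(n_states)
    alpha[obs[0] * n_clones:(obs[0] + 1) * n_clones] = 1.0 / n_clones
    loglik = 0.0
    for o in obs[1:]:
        msg = alpha @ T                              # one transition step
        alpha = np.zeros(n_states)
        block = slice(o * n_clones, (o + 1) * n_clones)
        alpha[block] = msg[block]                    # emission mask
        z = alpha.sum()
        loglik += np.log(z)
        alpha /= z                                   # normalize
    return loglik
```

A full EM iteration would pair this with the matching backward pass and re-estimate T from expected transition counts; the sparse directed graph emerges as most entries of T shrink toward zero.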
When the model is exposed to multiple mazes, it splits them apart properly, and the responses remap when the agent switches from one maze to the next. Rate remapping can be explained by uncertainty in the observations. pic.twitter.com/MSvy0GuiFx
When the world has latent hierarchy, our model can recover it, which is then used for efficient planning. pic.twitter.com/8CVeYbysie
There is more to be done. We think replay during learning and inference can also be explained. The model has strong connections to our earlier work on schema networks: vicarious.com/2017/08/07/gen…
I want to point out a few review papers that got us started and had a great influence. The Viewpoints interview is excellent. "What is a cognitive map?" -- we learned it from @behrenstimb, @neuro_kim et al., and @NealWMorton. Buzsaki & Tingley on the importance of sequence learning. pic.twitter.com/6CWPZj4vV0