
Context Attentive Document Ranking and Query Suggestion

Wasi Uddin Ahmad, University of California, Los Angeles

Kai-Wei Chang, University of California, Los Angeles

Hongning Wang, University of Virginia

https://github.com/wasiahmad/context_attentive_ir (Code will be released soon!)

Search Logs Provide Rich Context to Understand Users’ Search Tasks

CS@UCLA: Context Attentive Ranking and Suggestion

5/29/2012 14:06:04  coney island cincinnati
5/29/2012 14:11:49  sas
5/29/2012 14:12:01  sas shoes

5/30/2012 12:12:04  exit #72 and 275 lodging
5/30/2012 12:25:19  6pm.com
5/30/2012 12:49:21  coupon for 6pm

5/31/2012 19:40:38  motel 6 locations
5/31/2012 19:45:04  hotels near coney island

Task: an atomic information need that may result in one or more queries.

Clicks Further Enrich Context to Understand Users’ Search Tasks


(Figure: a task with the queries "low cost decoration", "decorate bedroom on a limited budget", and "furniture for decoration", shown with clicked and skipped documents, e.g., "decoration options for home, patio, outdoor etc. low cost bedroom decorations.", "cheap furniture for bedroom. buying used furniture is an alternative route.", and "used and cheap futon, dressers, cupboards.")

Guides to reformulate query

Previous clicks help to rank documents

Summary of clicked documents

Our Proposal

• Attend on previous queries/clicks to perform retrieval tasks
• Learning to utilize context in multiple retrieval activities

(Figure: given the in-task query sequence q0, q2, q3, q5, q6, the model ranks candidate documents D2, D3, D5, D6 and predicts the next query q7.)

A Context Attentive Solution

• A two-level hierarchical structure
  • Task context embedding
  • Session-query and session-click encoders
  • Context attentive representations
• Multi-task Learning
  • Document Ranking
  • Query Suggestion


(Architecture figure: a Query Encoder, a Session-Query Encoder, and a Session-Click Encoder, each with inner attention, encode the in-task queries (e.g., "it masters ny", "software engineer ny", "it position ny healthcare 2018") and clicked documents (e.g., "master's degree", "hospital jobs"). The lower level builds the task context embedding; the upper level applies session-level query and click attention to produce a context vector and context-attentive representations, which feed the Ranker (maxout neurons) and the query Recommender (projection + softmax).)
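The inner-attention pooling used throughout the encoders can be sketched in a few lines of numpy. The weight shapes and names below are illustrative assumptions, not the paper's exact parameterization:

```python
import numpy as np

def inner_attention(H, W, v):
    """Pool encoder states H (T x d) into a single vector.

    Illustrative inner attention: score each state with v . tanh(W h_t),
    softmax the scores, and return the weighted sum of the states.
    W (d x d) and v (d,) stand in for trainable parameters.
    """
    scores = np.tanh(H @ W.T) @ v          # (T,) one score per state
    scores -= scores.max()                 # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()
    return weights @ H                     # (d,) attended summary

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))                # 5 states, hidden size 8
pooled = inner_attention(H, rng.normal(size=(8, 8)), rng.normal(size=8))
```

In the full model, the session-level query and click attentions follow the same pattern but condition the scores on the current query; this sketch omits that conditioning.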




Multi-task Learning Objective

• Optimized via Regularized Multi-task Learning


Loss terms: negative log-likelihood + regularization

(Figure: within a task containing queries such as "low cost decoration", "furniture for decoration", and "decorate bedroom on a limited budget", Task 1 ranks documents for the current query and Task 2 predicts the next query.)
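The two loss terms can be sketched as follows. The candidate-softmax ranking loss, the lam weight, and the parameter names are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def joint_loss(rank_scores, clicked_idx, next_query_logprobs, params, lam=1e-4):
    """Regularized multi-task objective: ranking NLL + suggestion NLL + L2.

    rank_scores: scores for the candidate documents of the current query.
    clicked_idx: index of the clicked (relevant) candidate.
    next_query_logprobs: log-probability of each ground-truth next-query token.
    """
    s = rank_scores - rank_scores.max()                     # stable softmax
    rank_nll = -(s[clicked_idx] - np.log(np.exp(s).sum()))  # Task 1: ranking
    sugg_nll = -np.sum(next_query_logprobs)                 # Task 2: suggestion
    reg = lam * sum(np.sum(p ** 2) for p in params)         # regularization
    return rank_nll + sugg_nll + reg

loss = joint_loss(np.array([2.0, 0.5, -1.0]), 0,
                  np.log(np.array([0.7, 0.9, 0.6])),
                  params=[np.ones(4)])
```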

Experiments


Data Source

• AOL search log: a 13-week search log of ~650k users
  • Background set: 5 weeks
  • Training set: 6 weeks
  • Validation and test set: 2 weeks

• Aggregation of candidate documents for ranking
  • Candidates are sampled from the top 1000 documents retrieved by BM25 from a pool of 1M documents
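BM25 candidate gathering can be sketched with a toy scorer; the corpus, tokenization, and the k1/b defaults below are illustrative (the experiments use a 1M-document pool):

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.2, b=0.75):
    """Score each tokenized document against the query with Okapi BM25."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    df = Counter(t for d in docs for t in set(d))  # document frequency
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query:
            if t not in tf:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            norm = tf[t] + k1 * (1 - b + b * len(d) / avgdl)
            s += idf * tf[t] * (k1 + 1) / norm
        scores.append(s)
    return scores

docs = [["hotels", "near", "coney", "island"],
        ["cheap", "furniture", "for", "bedroom"],
        ["coney", "island", "cincinnati"]]
scores = bm25_scores(["coney", "island"], docs)
top = max(range(len(docs)), key=scores.__getitem__)  # best candidate index
```

Ranking candidates would then be sampled from the top-scored documents.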

• Task Segmentation
  • Averaging the query term vectors → query embedding
  • Consecutive queries* with cosine similarity > 0.5

* within a 30-minute interval
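The segmentation heuristic above can be sketched directly; the toy term vectors stand in for pretrained embeddings, and the function/variable names are assumptions:

```python
import numpy as np

def segment_tasks(queries, times, embed, threshold=0.5, max_gap_mins=30):
    """Group consecutive queries into tasks.

    queries: query strings; times: minutes since session start;
    embed: maps a query to a vector (here: mean of its term vectors).
    A query joins the previous task only if it arrives within max_gap_mins
    and its embedding's cosine similarity to the previous query exceeds
    the threshold; otherwise it starts a new task.
    """
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    tasks = [[queries[0]]]
    for i in range(1, len(queries)):
        same_task = (times[i] - times[i - 1] <= max_gap_mins and
                     cosine(embed(queries[i - 1]), embed(queries[i])) > threshold)
        if same_task:
            tasks[-1].append(queries[i])
        else:
            tasks.append([queries[i]])
    return tasks

# Toy 2-d term vectors standing in for pretrained embeddings (assumption)
vocab = {"coney": np.array([1.0, 0.0]), "island": np.array([0.9, 0.1]),
         "hotels": np.array([0.8, 0.3]), "sas": np.array([0.0, 1.0]),
         "shoes": np.array([0.1, 0.9])}
embed = lambda q: np.mean([vocab[t] for t in q.split()], axis=0)

tasks = segment_tasks(["coney island", "hotels coney island", "sas shoes"],
                      [0, 5, 12], embed)
```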

Data Statistics

• Only the document titles are utilized in the experiments


Evaluation Metrics and Baselines

• Document Ranking
  • Metrics: MAP, MRR, NDCG@k (k = 1, 3, 10)
  • Baselines: BM25, QL, FixInt, DSSM, DUET, MNSRF, etc.

• Query Suggestion
  • Metrics: MRR, F1, BLEU-k (k = 1, 2, 3, 4)
  • Baselines: HRED-qs, Seq2seq+Attn, MNSRF, etc.

MRR assesses discrimination ability: ranking a list of candidate queries that might follow a given query.


F1 and BLEU-k assess generation ability: they measure the overlap between the generated query term sequence and the ground-truth sequence.
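Both metric families can be sketched directly on toy inputs (BLEU omitted for brevity; the inputs are illustrative):

```python
from collections import Counter

def mean_reciprocal_rank(rankings, relevant):
    """MRR: mean of 1/rank of the first relevant item in each ranked list."""
    total = 0.0
    for ranking, rel in zip(rankings, relevant):
        for rank, item in enumerate(ranking, start=1):
            if item in rel:
                total += 1.0 / rank
                break
    return total / len(rankings)

def token_f1(predicted, reference):
    """F1 over query terms: harmonic mean of term precision and recall."""
    overlap = sum((Counter(predicted) & Counter(reference)).values())
    if overlap == 0:
        return 0.0
    p = overlap / len(predicted)
    r = overlap / len(reference)
    return 2 * p * r / (p + r)

mrr = mean_reciprocal_rank([["d1", "d2", "d3"], ["d4", "d5"]],
                           [{"d2"}, {"d4"}])
f1 = token_f1("cheap hotels near coney island".split(),
              "hotels near coney island".split())
```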


Evaluation on Document Ranking

− Vocabulary gap limits the performance
− Do not utilize any search context information


+ Jointly learns ranking and suggestion
− Utilizes query history but no click history

+ Jointly learns ranking and suggestion
+ Utilizes both query and click history

Evaluation on Query Suggestion


Do not utilize any context


Do not utilize click history


+ Utilizes both in-task query and click history

Ablation Study


• Modeling search context is beneficial
• Joint learning of related retrieval tasks results in improvements

∗ indicates that the attention in the query recommender


Performance drops


Performance increases

Effect of Context Modeling


Hypothesis – longer tasks are intrinsically more difficult.

Short tasks (with 2 queries)
Medium tasks (with 3–4 queries)
Long tasks (with 5+ queries)


• Context information helps more on short/medium tasks
• Longer tasks are intrinsically more difficult

Sample Complexity


In terms of #parameters, CARS > MNSRF > M-Match Tensor


Case Analysis

How do in-task previous queries and clicks impact predicting the next query and its clicks?

Conclusion & Future Work

• A task-based approach to learning search context
  • Exploiting users' on-task search query and click sequences
  • Jointly optimized on two companion retrieval tasks

• Future work
  • Modeling across-task relatedness, e.g., users' long-term search interest
  • Applying to any scenario where a user sequentially interacts with a system


Thank You! Q&A


In-task Context: a richer way to understand users’ search intent

https://github.com/wasiahmad/context_attentive_ir (Code will be released soon!)
