Entity Linking (prodigy training)

Hi Team,

I'm trying to replicate the nel_emerson project on my own dataset. The data is already pre-labeled, and I generated the corpus like this:

import spacy
from spacy.matcher import PhraseMatcher
from spacy.tokens import Span

# `data`, `skills`, `filterSkillsByConfidence` and `nlp` are defined elsewhere
docs = []

for obj in data:
    doc = nlp(obj['text'])

    # Look up the skills annotated for this listing and keep the confident ones
    s = skills[obj['meta']['listingId']]
    labels = filterSkillsByConfidence(s)

    # Match the skill names in the text, case-insensitively
    matcher = PhraseMatcher(nlp.vocab, attr="LOWER")
    matcher.add("SKILL", [nlp(cls['value']) for cls in labels])
    matches = matcher(doc)

    entities = []
    for match_id, start, end in matches:
        span = Span(doc, start, end, label='SKILL')
        match = next(filter(lambda x: x['value'] == span.text, s), None)
        if match:
            skill = match['skills'][0]
            span.kb_id_ = skill['id']
            entities.append(span)

    doc.ents = spacy.util.filter_spans(entities)

    docs.append(doc)

Then I divide it into training and test sets:

from spacy.tokens import DocBin

train_docs = DocBin()
test_docs = DocBin()

# Hold out the last 20% of the docs for evaluation
test_index = int(len(data) * 0.2)

for index in range(0, len(docs) - test_index):
    train_docs.add(docs[index])

for index in range(len(docs) - test_index, len(docs)):
    test_docs.add(docs[index])

print(len(train_docs), len(test_docs))

train_docs.to_disk('corpus/train.spacy')
test_docs.to_disk('corpus/test.spacy')

When I try to run training:
python -m spacy train configs/nel.cfg --output training --paths.train corpus/train.spacy --paths.dev corpus/test.spacy --paths.kb tmp/kb --paths.base_nlp tmp/model -c scripts/custom_functions.py

I get this warning, and the accuracy stays at a constant 0.17:

/Users/fed/Library/Caches/pypoetry/virtualenvs/nel-riFBMyAx-py3.9/lib/python3.9/site-packages/spacy/pipeline/entity_linker.py:276: UserWarning: [W093] Could not find any data to train the Entity Linker on. Is your input data correctly formatted?
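
For reference, a quick way to rule out a serialization problem is to re-load the corpus and count the entities that actually carry a KB ID - W093 means the entity linker found none to train on. A minimal sketch, using the paths from this thread:

import spacy
from spacy.tokens import DocBin

nlp = spacy.load('tmp/model')
doc_bin = DocBin().from_disk('corpus/train.spacy')

# Count entities whose KB ID survived serialization; an empty string means no link
n_linked = sum(
    1
    for doc in doc_bin.get_docs(nlp.vocab)
    for ent in doc.ents
    if ent.kb_id_
)
print(f"Entities with a KB ID: {n_linked}")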

Any ideas what it could be?

Also, the KB was created like this:

import os
import spacy
from spacy.kb import KnowledgeBase

kb_loc = 'tmp/kb'
nlp_dir = 'tmp/model'

# vectors_model holds the name of the base model with word vectors
nlp = spacy.load(vectors_model, exclude=["parser", "tagger", "lemmatizer"])
nlp.add_pipe("sentencizer", first=True)

kb = KnowledgeBase(vocab=nlp.vocab, entity_vector_length=300)

for skill in skills:
    # Encode each entity with the vector of its description, or of its name as a fallback
    desc_doc = nlp(skill['description']) if skill['description'] is not None else nlp(skill['name'])
    desc_enc = desc_doc.vector
    kb.add_entity(entity=skill['id'], entity_vector=desc_enc, freq=342)
    kb.add_alias(alias=skill['name'], entities=[skill['id']], probabilities=[1])

print(f"Entities in the KB: {len(kb.get_entity_strings())}")
print(f"Aliases in the KB: {kb.get_alias_strings()}")
print()

kb.to_disk(kb_loc)
if not os.path.exists(nlp_dir):
    os.mkdir(nlp_dir)
nlp.to_disk(nlp_dir)
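
To verify that a mention can actually be resolved, you can query the KB's candidates for an alias right after building it. A small sketch - "Python" just stands in for one of my skill names:

# Each candidate carries the linked entity ID and the prior probability
for candidate in kb.get_alias_candidates("Python"):
    print(candidate.entity_, candidate.prior_prob)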

I have removed the EntityRuler here; I'm not sure what role it was playing :frowning: probably that's where I made my mistake.

I would love to get any help here.

My assumption is that I need to write the patterns (the EntityRuler ones) into the KB as aliases, so that any matched span (labeled as SKILL) can be mapped back to a KB entity - and if one alias maps to multiple entities, the probability values should be split between them?
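
If so, an ambiguous alias would look something like this - a sketch, where the entity IDs and priors are made-up examples; spaCy just requires that the probabilities for one alias sum to at most 1.0:

# One surface form pointing at two possible KB entities (made-up IDs)
kb.add_alias(
    alias="Java",
    entities=["Q_java_language", "Q_java_island"],
    probabilities=[0.8, 0.2],  # prior probabilities, summing to <= 1.0
)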


Ok, I think this change (Update NEL prodigy script (#40) · explosion/projects@9399cc1 · GitHub) was done specifically to fix ORG detection instead of PERSON, so I don't really need that EntityRuler.

============================ Data file validation ============================
✔ Pipeline can be initialized with data
✔ Corpus is loadable

=============================== Training stats ===============================
Language: en
Training pipeline: sentencizer, ner, entity_linker
Frozen components: sentencizer, ner
235 training docs
58 evaluation docs
✔ No overlap between training and evaluation data
⚠ Low number of examples to train a new pipeline (235)

============================== Vocab & Vectors ==============================
ℹ 155932 total word(s) in the data (9299 unique)
ℹ 20000 vectors (684830 unique keys, 300 dimensions)
⚠ 16028 words in training data without vectors (10%)

========================== Named Entity Recognition ==========================
ℹ 19 label(s)
0 missing value(s) (tokens with '-' label)
⚠ Some model labels are not present in the train data. The model
performance may be degraded for these labels after training: 'QUANTITY', 'TIME',
'GPE', 'ORG', 'LOC', 'PERSON', 'PRODUCT', 'LANGUAGE', 'PERCENT', 'ORDINAL',
'LAW', 'FAC', 'NORP', 'WORK_OF_ART', 'DATE', 'EVENT', 'MONEY', 'CARDINAL'.
✔ Good amount of examples for all labels
✔ Examples without occurrences available for all labels
✔ No entities consisting of or starting/ending with whitespace

================================== Summary ==================================
✔ 6 checks passed
⚠ 3 warnings

Instead of giving multiple spans with a kb_id for a large chunk of text (I found that assumption here: projects/create_corpus.py at e34a56ad5f22ef91a096e08b54481b69da657682 · explosion/projects · GitHub),
I created sentences with a single-span object and recreated the corpus (see the sketch below).
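
A sketch of that reshaping, assuming the docs list from my first post - one new doc per linked entity, containing just its sentence:

single_span_docs = []
for doc in docs:
    for ent in doc.ents:
        sent = ent.sent  # needs sentence boundaries (e.g. from the sentencizer)
        new_doc = sent.as_doc()
        # Re-create the entity relative to the new doc's token offsets
        span = Span(new_doc, ent.start - sent.start, ent.end - sent.start,
                    label=ent.label_, kb_id=ent.kb_id_)
        new_doc.ents = [span]
        single_span_docs.append(new_doc)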
This way I don't get any warnings during training, but the training results look really bad:

============================= Training pipeline =============================
ℹ Pipeline: ['sentencizer', 'ner', 'entity_linker']
ℹ Frozen components: ['sentencizer', 'ner']
ℹ Initial learn rate: 0.001
E    #       LOSS ENTIT...  SENTS_F  SENTS_P  SENTS_R  ENTS_F  ENTS_P  ENTS_R  NEL_MICRO_F  NEL_MICRO_R  NEL_MICRO_P  SCORE 
---  ------  -------------  -------  -------  -------  ------  ------  ------  -----------  -----------  -----------  ------
  0       0           0.99   100.00   100.00   100.00    0.00    0.00    0.00        40.00        25.00       100.00    0.46
  0     200          49.97   100.00   100.00   100.00    0.00    0.00    0.00        40.00        25.00       100.00    0.46
  0     400          28.01   100.00   100.00   100.00    0.00    0.00    0.00        40.00        25.00       100.00    0.46
  0     600          32.74   100.00   100.00   100.00    0.00    0.00    0.00        40.00        25.00       100.00    0.46

I think I found the issue: Knowledge Base aliases are mandatory and case-sensitive.
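
In other words: the PhraseMatcher above matches with attr="LOWER", but the candidate lookup is an exact string match against the KB aliases, so the matched span text has to exist in the KB verbatim. One workaround is to register the case variants as aliases too - a sketch, reusing the skills structure from above:

# Register case variants so lowercased matches still resolve to candidates
for skill in skills:
    for alias in {skill['name'], skill['name'].lower()}:
        if not kb.contains_alias(alias):
            kb.add_alias(alias=alias, entities=[skill['id']], probabilities=[1.0])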

Hi Fedya, apologies for the late follow-up!

What the comment at https://github.com/explosion/projects/blob/e34a56ad5f22ef91a096e08b54481b69da657682/tutorials/nel_emerson/scripts/create_corpus.py#L22 refers to is not that we annotate the full sentence with one KB ID, but rather that at that point we create an instance with just a single span in it - i.e. exactly what you did when you said:

I created sentences with a single-span object and recreated the corpus.

I just wanted to clarify that point.

But other than that, it looks like you were able to resolve your issue? Is training working well now, or do you have any remaining issues?

Looks like I had those 2 issues that were causing the unexpected results, so if anybody arrives here, please check that you do the following (a rough validation sketch follows the list):

  • train on a single span at a time, each annotated with its kb-id

  • your KB MUST contain aliases that match the span text exactly (case-sensitive)
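
Roughly how one could check both points against the serialized corpus and KB - a sketch, assuming the file paths used earlier in the thread:

import spacy
from spacy.kb import KnowledgeBase
from spacy.tokens import DocBin

nlp = spacy.load('tmp/model')
kb = KnowledgeBase(vocab=nlp.vocab, entity_vector_length=300)
kb.from_disk('tmp/kb')

doc_bin = DocBin().from_disk('corpus/train.spacy')
for doc in doc_bin.get_docs(nlp.vocab):
    linked = [ent for ent in doc.ents if ent.kb_id_]
    # One linked span per training doc
    assert len(linked) == 1, f"expected 1 linked span, got {len(linked)}"
    for ent in linked:
        # The span text must exist in the KB as an alias, verbatim
        assert kb.contains_alias(ent.text), f"no alias for {ent.text!r}"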

Anyway, thanks @SofieVL for the support!


@XBeg9 - how did you validate both that there was a single span per training instance with a kb-id, and that the aliases in your KB were an exact, case-sensitive match? Here's an example doc from my corpus. I suspect I have multiple spans per doc as well.

{'doc_annotation': {'cats': {}, 'entities': ['O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-NIL', 'I-NIL', 'I-NIL', 'L-NIL', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-NIL', 'L-NIL', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'U-NIL', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-NIL', 'L-NIL', 'O', 'O'], 'spans': {}, 'links': {(145, 184): {'Q7122548': 1.0}, (365, 380): {'Q58731': 1.0}, (1109, 1114): {'Q2': 1.0}, (1769, 1790): {'Q2269': 1.0}}}, 'token_annotation': {'ORTH': ['NOAA', 'says', 'Earth', "'s", 'oceans', 'becoming', 'more', 'acidic', ' ', '.', ' ', 'According', 'to', 'a', 'study', 'performed', 'by', 'the', 'National', 'Oceanic', 'and', 'Atmospheric', 'Administration', "'s", '(', 'NOAA', ')', 'Pacific', 'Marine', 'Environmental', 'Laboratory', ',', 'the', 'level', 'of', 'acid', 'in', 'the', 'world', "'s", 'oceans', 'is', 'rising', ',', 'decades', 'before', 'scientists', 'expected', 'the', 'levels', 'to', 'rise', '.', ' ', 'The', 'study', 'was', 'performed', 'on', 'the', 'coastal', 'waters', 'of', 'the', 'Pacific', 'Ocean', 'from', 'Baja', 'California', ',', 'Mexico', 'to', 'Vancouver', ',', 'British', 'Columbia', ',', 'where', 'tests', 'showed', 'that', 'acid', 'levels', 'in', 'some', 'areas', 'near', 'the', 'edge', 'of', 'the', 'Continental', 'Shelf', 'were', 'high', 'enough', 'to', 'corrode', 'the', 'shells', 'of', 'some', 'sea', 'creatures', 'as', 'well', 'as', 'some', 'corals', '.', 'Some', 'areas', 'showed', 'excessive', 'levels', 'of', 'acid', 'less', 'than', 'four', 'miles', 'off', 'the', 'northern', 'California', 'coastline', 'in', 'the', 'United', 'States', '.', ' ', '"', 'What', 'we', 'found', '...', 'was', 'truly', 'astonishing', '.', 'This', 'means', 'ocean', 'acidification', 'may', 'be', 'seriously', 'impacting', 'marine', 'life', 'on', 'the', 'continental', 'shelf', 'right', 'now', '.', 'The', 'models', 'suggested', 'they', 'would', "n't", 'be', 'corrosive', 'at', 'the', 'surface', 'until', 'sometime', 'during', 'the', 'second', 'half', 'of', 'this', 'century', ',', '"', 'said', 'Richard', 'A.', 'Feely', ',', 'an', 'oceanographer', 'from', 
'the', 'NOAA', '.', ' ', 'The', 'natural', 'processes', 'of', 'the', 'seas', 'and', 'oceans', 'constantly', 'clean', 'the', 'Earth', "'s", 'air', ',', 'absorbing', '1/3', 'to', '1/2', 'of', 'the', 'carbon', 'dioxide', 'generated', 'by', 'humans', '.', 'As', 'the', 'oceans', 'absorb', 'more', 'of', 'the', 'gas', ',', 'the', 'water', 'becomes', 'more', 'acidic', ',', 'reducing', 'the', 'amount', 'of', 'carbonate', 'which', 'shellfish', 'such', 'as', 'clams', 'and', 'oysters', 'use', 'to', 'form', 'their', 'shells', ',', 'and', 'increasing', 'the', 'levels', 'of', 'carbonic', 'acid', '.', 'Although', 'levels', 'are', 'high', ',', 'they', 'are', 'not', 'yet', 'high', 'enough', 'to', 'threaten', 'humans', 'directly', '.', ' ', '"', 'Scientists', 'have', 'also', 'seen', 'a', 'reduced', 'ability', 'of', 'marine', 'algae', 'and', 'free', '-', 'floating', 'plants', 'and', 'animals', 'to', 'produce', 'protective', 'carbonate', 'shells', ',', '"', 'added', 'Feely', '.', ' ', 'Feely', 'noted', 'that', ',', 'according', 'to', 'the', 'study', ',', 'the', 'oceans', 'and', 'seas', 'have', 'absorbed', 'more', 'than', '525', 'billion', 'tons', 'of', 'carbon', 'dioxide', 'since', 'the', 'Industrial', 'Revolution', 'began', '.'], 'SPACY': [True, True, False, True, True, True, True, True, False, True, False, True, True, True, True, True, True, True, True, True, True, True, False, True, False, False, True, True, True, True, False, True, True, True, True, True, True, True, False, True, True, True, False, True, True, True, True, True, True, True, True, False, True, False, True, True, True, True, True, True, True, True, True, True, True, True, True, True, False, True, True, True, False, True, True, False, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, False, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, False, True, False, False, True, True, True, True, True, True, False, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, False, True, True, True, True, True, False, True, True, True, True, True, True, True, True, True, True, True, True, True, True, False, False, True, True, True, True, False, True, True, True, True, True, False, True, False, True, True, True, True, True, True, True, True, True, True, True, False, True, False, True, True, True, True, True, True, True, True, True, True, True, False, True, True, True, True, True, True, True, True, False, True, True, True, True, True, False, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, False, True, True, True, True, True, True, True, False, True, True, True, True, False, True, True, True, True, True, True, True, True, True, True, False, True, False, False, True, True, True, True, True, True, True, True, True, True, True, False, False, True, True, True, True, True, True, True, True, False, False, True, True, False, True, False, True, True, False, True, True, True, True, False, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, True, False, False], 'TAG': ['', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 
'', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', ''], 'LEMMA': ['', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', ''], 'POS': ['', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 
'', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', ''], 'MORPH': ['', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', ''], 'HEAD': [1, 1, 4, 2, 5, 1, 7, 5, 7, 1, 1, 42, 11, 14, 12, 14, 15, 19, 19, 30, 19, 22, 19, 22, 25, 30, 25, 30, 30, 30, 16, 30, 33, 42, 33, 34, 33, 38, 40, 38, 36, 42, 42, 42, 47, 47, 47, 42, 49, 51, 51, 47, 42, 52, 55, 57, 57, 57, 57, 61, 61, 58, 61, 65, 65, 62, 65, 68, 66, 68, 68, 70, 71, 72, 75, 72, 72, 79, 79, 72, 93, 82, 93, 82, 85, 83, 85, 88, 86, 88, 92, 92, 89, 79, 93, 94, 97, 94, 99, 97, 99, 103, 103, 100, 106, 106, 99, 108, 99, 57, 111, 112, 112, 114, 112, 114, 115, 119, 119, 120, 121, 112, 125, 124, 125, 121, 125, 129, 129, 126, 112, 130, 137, 135, 135, 137, 137, 137, 139, 137, 137, 142, 142, 144, 148, 148, 148, 148, 142, 150, 148, 148, 154, 154, 151, 156, 148, 142, 159, 160, 180, 164, 164, 164, 160, 164, 165, 168, 166, 164, 169, 169, 174, 174, 171, 174, 177, 175, 180, 180, 180, 183, 183, 180, 183, 186, 183, 186, 189, 187, 180, 190, 194, 194, 201, 194, 197, 195, 197, 197, 201, 201, 203, 205, 203, 201, 201, 201, 210, 210, 207, 210, 214, 214, 211, 214, 215, 216, 201, 222, 221, 222, 230, 222, 223, 226, 224, 230, 229, 230, 230, 232, 230, 230, 230, 236, 234, 236, 237, 240, 236, 242, 240, 242, 243, 243, 240, 248, 246, 250, 248, 246, 246, 234, 255, 253, 255, 258, 256, 230, 262, 262, 266, 262, 266, 266, 266, 266, 266, 266, 269, 272, 269, 272, 272, 266, 275, 281, 281, 281, 281, 302, 284, 284, 281, 284, 287, 285, 287, 291, 291, 292, 287, 292, 292, 296, 284, 299, 299, 296, 302, 302, 302, 302, 302, 304, 307, 307, 320, 320, 320, 310, 313, 311, 320, 316, 320, 316, 316, 320, 307, 324, 324, 324, 325, 320, 325, 328, 326, 333, 332, 332, 333, 320, 307], 'DEP': ['nsubj', 'ROOT', 'poss', 'case', 'nsubj', 'ccomp', 'advmod', 'acomp', 'dep', 'punct', 'punct', 'prep', 'prep', 'det', 'pobj', 'acl', 'agent', 'det', 'compound', 'poss', 'cc', 'compound', 'conj', 'case', 'punct', 'nmod', 'punct', 'compound', 'compound', 'compound', 'pobj', 'punct', 'det', 'nsubj', 'prep', 'pobj', 'prep', 'det', 'poss', 'case', 'pobj', 'aux', 'ROOT', 'punct', 
'npadvmod', 'mark', 'nsubj', 'advcl', 'det', 'nsubj', 'aux', 'ccomp', 'punct', 'dep', 'det', 'nsubjpass', 'auxpass', 'ROOT', 'prep', 'det', 'amod', 'pobj', 'prep', 'det', 'compound', 'pobj', 'prep', 'compound', 'pobj', 'punct', 'appos', 'prep', 'pobj', 'punct', 'compound', 'appos', 'punct', 'advmod', 'nsubj', 'relcl', 'mark', 'compound', 'nsubj', 'prep', 'det', 'pobj', 'prep', 'det', 'pobj', 'prep', 'det', 'compound', 'pobj', 'ccomp', 'acomp', 'advmod', 'aux', 'xcomp', 'det', 'dobj', 'prep', 'det', 'compound', 'pobj', 'advmod', 'advmod', 'cc', 'det', 'conj', 'punct', 'det', 'nsubj', 'ROOT', 'amod', 'dobj', 'prep', 'pobj', 'amod', 'quantmod', 'nummod', 'npadvmod', 'prep', 'det', 'amod', 'compound', 'pobj', 'prep', 'det', 'compound', 'pobj', 'punct', 'dep', 'punct', 'dobj', 'nsubj', 'csubj', 'punct', 'ROOT', 'advmod', 'acomp', 'punct', 'nsubj', 'ROOT', 'compound', 'nsubj', 'aux', 'aux', 'advmod', 'ccomp', 'amod', 'dobj', 'prep', 'det', 'amod', 'pobj', 'advmod', 'advmod', 'punct', 'det', 'nsubj', 'ccomp', 'nsubj', 'aux', 'neg', 'ccomp', 'acomp', 'prep', 'det', 'pobj', 'prep', 'pcomp', 'prep', 'det', 'amod', 'pobj', 'prep', 'det', 'pobj', 'punct', 'punct', 'ROOT', 'compound', 'compound', 'nsubj', 'punct', 'det', 'appos', 'prep', 'det', 'pobj', 'punct', 'dep', 'det', 'amod', 'nsubj', 'prep', 'det', 'pobj', 'cc', 'conj', 'advmod', 'ROOT', 'det', 'poss', 'case', 'dobj', 'punct', 'advcl', 'quantmod', 'quantmod', 'dobj', 'prep', 'det', 'compound', 'pobj', 'acl', 'agent', 'pobj', 'punct', 'mark', 'det', 'nsubj', 'advcl', 'dobj', 'prep', 'det', 'pobj', 'punct', 'det', 'nsubj', 'ROOT', 'advmod', 'acomp', 'punct', 'advcl', 'det', 'dobj', 'prep', 'pobj', 'nsubj', 'relcl', 'amod', 'prep', 'pobj', 'cc', 'conj', 'conj', 'aux', 'xcomp', 'poss', 'dobj', 'punct', 'cc', 'conj', 'det', 'dobj', 'prep', 'amod', 'pobj', 'punct', 'mark', 'nsubj', 'advcl', 'acomp', 'punct', 'nsubj', 'ROOT', 'neg', 'advmod', 'acomp', 'advmod', 'aux', 'xcomp', 'dobj', 'advmod', 'punct', 'dep', 'punct', 'nsubj', 'aux', 'advmod', 'ccomp', 'det', 'amod', 'dobj', 'prep', 'amod', 'pobj', 'cc', 'advmod', 'punct', 'amod', 'conj', 'cc', 'conj', 'aux', 'acl', 'amod', 'compound', 'dobj', 'punct', 'punct', 'ROOT', 'advmod', 'punct', 'dep', 'advmod', 'ROOT', 'mark', 'punct', 'prep', 'prep', 'det', 'pobj', 'punct', 'det', 'nsubj', 'cc', 'conj', 'aux', 'ccomp', 'amod', 'quantmod', 'compound', 'nummod', 'dobj', 'prep', 'compound', 'pobj', 'mark', 'det', 'compound', 'nsubj', 'advcl', 'punct'], 'SENT_START': [1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}}