Reduce size of tokenizer tables
author    John Koleszar <jkoleszar@google.com>
          Thu, 16 Sep 2010 14:00:04 +0000 (10:00 -0400)
committer John Koleszar <jkoleszar@google.com>
          Thu, 16 Sep 2010 14:00:04 +0000 (10:00 -0400)
commit    147b125b1596df9bd0c8141b9d09229ab65d3e0f
tree      2ae7cde4017c1b966b91917527c027ca16b1677a
parent    746439ef6c1dd2fedbe0c24ddb76d40cb9d26357
Reduce size of tokenizer tables

This patch reduces the size of the global tables maintained by the
tokenizer from 80k-96k to 16k. See issue #177.

Change-Id: If0275d5f28389af11ac83c5d929d1157cde90fbe
vp8/encoder/tokenize.c
vp8/encoder/tokenize.h