[Groonga-commit] groonga/groonga at 5dde8bd [master] mecab: fix wrong grn_tokenizer_query usage


Kouhei Sutou null+****@clear*****
Mon Aug 13 09:45:05 JST 2018


Kouhei Sutou	2018-08-13 09:45:05 +0900 (Mon, 13 Aug 2018)

  New Revision: 5dde8bda13e821a0658d1ce05cd24ca611d088c2
  https://github.com/groonga/groonga/commit/5dde8bda13e821a0658d1ce05cd24ca611d088c2

  Message:
    mecab: fix wrong grn_tokenizer_query usage

  Modified files:
    plugins/tokenizers/mecab.c

  Modified: plugins/tokenizers/mecab.c (+2 -2)
===================================================================
--- plugins/tokenizers/mecab.c    2018-08-13 09:44:55 +0900 (a384bf257)
+++ plugins/tokenizers/mecab.c    2018-08-13 09:45:05 +0900 (0da6ecdb3)
@@ -485,7 +485,7 @@ mecab_init(grn_ctx *ctx, grn_tokenizer_query *query)
                               &normalized_string_length,
                               NULL);
     GRN_TEXT_INIT(&(tokenizer->buf), 0);
-    if (query->have_tokenized_delimiter) {
+    if (grn_tokenizer_query_have_tokenized_delimiter(ctx, query)) {
       tokenizer->next = normalized_string;
       tokenizer->end = tokenizer->next + normalized_string_length;
     } else if (normalized_string_length == 0) {
@@ -553,7 +553,7 @@ mecab_next(grn_ctx *ctx,
   grn_mecab_tokenizer *tokenizer = user_data;
   grn_encoding encoding = tokenizer->query->encoding;
 
-  if (tokenizer->query->have_tokenized_delimiter) {
+  if (grn_tokenizer_query_have_tokenized_delimiter(ctx, tokenizer->query)) {
     grn_tokenizer_token tokenizer_token;
     grn_tokenizer_token_init(ctx, &tokenizer_token);
     /* TODO: Need grn_token version. */
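
This change replaces direct reads of the grn_tokenizer_query struct field
have_tokenized_delimiter with the public accessor
grn_tokenizer_query_have_tokenized_delimiter(), so the mecab tokenizer no
longer depends on the struct's internal layout. Below is a minimal sketch of
the accessor-based pattern, not the actual mecab.c code;
my_tokenizer_choose_mode() is a hypothetical helper used only for
illustration, and the include path assumes the usual Groonga plugin headers.

#include <groonga/tokenizer.h>

static void
my_tokenizer_choose_mode(grn_ctx *ctx, grn_tokenizer_query *query)
{
  /* Before this commit the flag was read directly as
     query->have_tokenized_delimiter; the accessor below is the
     supported way to query it, keeping the struct opaque. */
  if (grn_tokenizer_query_have_tokenized_delimiter(ctx, query)) {
    /* The input already carries tokenized-delimiter markers, so the
       tokenizer can split on them without calling MeCab. */
  } else {
    /* No pre-tokenized delimiters: pass the normalized string to
       MeCab as usual. */
  }
}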
-------------- next part --------------
An HTML attachment was scrubbed...
URL: https://lists.osdn.me/mailman/archives/groonga-commit/attachments/20180813/dd3c6787/attachment.htm 


